Script for testing all files on a drive?
On 2011/12/23 01:07, linux guy wrote:
I am having an issue with my laptop. I've found some corrupt files on
an SSD drive. I need to know exactly how many are corrupt and which
ones they are.
I've run various block check and surface scan tools and they all come
up with zero errors.
I found the corrupt files when I was attempting to back up every file
on the drive with a simple cp command.
I would now like to run a script that checks every file on the drive
and puts the name and path of every corrupt file into a text file.
What is the easiest way to do this?
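One minimal approach, assuming the corruption shows up as read errors (the way cp apparently hit it), is to read every file end-to-end and log any path that fails. This is just a sketch; the function and path names are illustrative:

```shell
#!/bin/bash
# scan_files DIR LOGFILE
# Read every regular file under DIR; any file that cannot be read
# end-to-end (e.g. EIO from a bad sector) is appended to LOGFILE.
scan_files() {
    local dir=$1 log=$2
    : > "$log"                          # start with an empty log
    find "$dir" -type f -print0 |
    while IFS= read -r -d '' f; do
        # cat exits non-zero if the kernel reports a read error
        if ! cat -- "$f" > /dev/null 2>/dev/null; then
            printf '%s\n' "$f" >> "$log"
        fi
    done
}

# e.g.: scan_files /mnt/ssd corrupt-files.txt
```

Note this only catches files the drive refuses to read; silent bit rot (sectors that read fine but contain wrong data) will not show up without a known-good copy or checksum to compare against.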
If I understand how SSDs work, when they detect a sector going bad they
map it out and map in a new sector from the spare pool, just like a regular
drive. If it can, the drive reads the data from the old sector and puts its
best shot into the new one. The disk also uses the spare pool to minimize
repeated writes to the same block by spreading them around.
I gotta ask: just how did "cp" detect the bad sectors? I'm curious. I
suspect you'll tell me read errors. The surface scan might have led to
automatic repairs. Heck, I'd have expected the "cp" to do the same.
If you have good copies of the files you can use diff or cmp against the
backups. Otherwise the only way to really find out whether the files are
bad is to try to use them.
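The cmp-against-backups idea can be sketched like this, assuming the backup tree mirrors the source tree's relative paths (function name and arguments are made up for illustration):

```shell
#!/bin/bash
# compare_trees SRC BACKUP LOGFILE
# cmp each regular file under SRC against the same relative path under
# BACKUP; files that differ, or that cmp cannot read, go into LOGFILE.
compare_trees() {
    local src=$1 backup=$2 log=$3
    : > "$log"
    (cd "$src" && find . -type f -print0) |
    while IFS= read -r -d '' rel; do
        # cmp -s: exit 0 if identical, 1 if different, 2 on trouble
        # (missing backup, read error) -- log anything non-zero
        if ! cmp -s -- "$src/$rel" "$backup/$rel"; then
            printf '%s\n' "$src/$rel" >> "$log"
        fi
    done
}

# e.g.: compare_trees /mnt/ssd /mnt/backup mismatches.txt
```

Unlike a plain read test, this also catches silently corrupted files whose contents no longer match the backup.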