
Advanced portable hard drive question



#1 harddrivewd (Member, 7 posts)

Posted 28 February 2013 - 02:16 PM

I'd like to know: if I don't use "Safely Remove Hardware" in Windows and just unplug the USB cable from the computer while no files are being transferred or copied, will it cause any problems or corrupt the files already saved on the hard disk? Will the MD5 checksum of any arbitrary file on the disk be altered? Thanks

#2 dietrc70 (Member, 106 posts)

Posted 28 February 2013 - 10:35 PM

If all the files are closed and the buffers are flushed, then no, removing the disk isn't a problem. However, if you don't unmount (i.e. "safely remove") the disk first, it's hard to be sure that no files are still open. If the "quick removal" option is set for the drive in Device Manager and you close whatever program you were using first, you can probably remove the drive without any problem.

NTFS doesn't verify the checksums of files, although more modern filesystems like ZFS do. Files on magnetic storage can (rarely) go bad over time, due to a phenomenon called "bit rot." If file integrity is especially important, then I suggest using archive software (RAR, for example) with recovery and checksum verification features enabled.

#3 harddrivewd (Member, 7 posts)

Posted 01 March 2013 - 03:02 AM


Hello, thanks for the answers but I have a further question to ask.

Yes, I have quite a few files whose integrity is particularly important; I don't want even a single bit or string of data to be altered (as verified by MD5 or other checksums). But sorry, I don't understand your last statement about using archive software with recovery and checksum verification features enabled. RAR? Do you mean WinRAR? Isn't that compression software only?

#4 FastMHz (Member, 405 posts)

Posted 01 March 2013 - 03:23 PM

RAR + PAR2 files (see QuickPar) provide excellent bad bit/byte recovery. You can also make RAR files without compression, i.e. file storage only.
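For example (a minimal sketch, not necessarily FastMHz's exact workflow): both steps can be scripted, assuming the command-line rar and par2 (par2cmdline) tools are installed and on the PATH. The file and archive names are hypothetical.

```python
# Sketch: store-only RAR archive plus PAR2 recovery data.
# Assumes the command-line `rar` and `par2` (par2cmdline) tools are installed
# and on the PATH; "important_video.mp4" is a hypothetical example file.
import subprocess

# -m0 = store without compression; -rr adds a recovery record to the archive
subprocess.run(["rar", "a", "-m0", "-rr", "archive.rar", "important_video.mp4"],
               check=True)

# -r10 = create PAR2 recovery blocks with 10% redundancy for the archive
subprocess.run(["par2", "create", "-r10", "archive.par2", "archive.rar"],
               check=True)

# Later, check or repair the archive with:
#   par2 verify archive.par2
#   par2 repair archive.par2
```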

Production: Vishera 8350/32gb RAM/Dual SSD/VelociRaptor/Radeon 7750
Gaming: Phenom II 955/16gb RAM/SSD/VelociRaptor/Radeon 7950
Retro: K6-2 550/256mb RAM/160gb HDD/CompactFlash/3DFX/ATI AIW Pro/SB16/DB50XG
http://www.fastmhz.com

#5 harddrivewd (Member, 7 posts)

Posted 04 March 2013 - 03:20 AM


RAR? Do you mean WinRAR? How can I make a file-storage archive without compression? I don't need compression; I have plenty of space to spare. I only need complete integrity of the stored files.

By the way, another question: is it sufficient to trust MD5 checksums? I don't know about PAR2 and RAR, but my current method is to generate a list of MD5 checksums for each freshly obtained file, keep the file containing the MD5 values, and compare the values after some years to see if anything has changed. Is that sufficient? If not, I don't mind learning about PAR2 or RAR if they are really a better alternative.
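That kind of manifest is easy to generate. Here is a minimal sketch using Python's standard hashlib; the folder path is a hypothetical example:

```python
# Sketch: write an MD5 manifest for every file under a folder.
import hashlib
from pathlib import Path

def md5_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large files don't fill memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

root = Path("D:/important_files")        # hypothetical folder to protect
with open("manifest.md5", "w") as out:
    for p in sorted(root.rglob("*")):
        if p.is_file():
            out.write(f"{md5_of(p)}  {p.relative_to(root)}\n")
```

The two-space "hash  path" format matches what common md5sum-style tools expect, so the manifest stays usable outside this script.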

Thanks

Edited by harddrivewd, 04 March 2013 - 04:04 AM.

#6 continuum (Mod, 3,574 posts)

Posted 04 March 2013 - 07:00 PM

If you just need complete integrity, use QuickPAR (PAR2).

#7 harddrivewd (Member, 7 posts)

Posted 05 March 2013 - 01:29 AM


Hello, I am reading a PAR2 tutorial, but there is something I don't understand. Creating recovery files may ensure the original file stays correct and doesn't corrupt over time, but who can ensure that the generated PAR2 files themselves won't corrupt with time? Thanks

#8 dietrc70 (Member, 106 posts)

Posted 05 March 2013 - 01:56 AM

Maybe someone can give you specific probabilities, but the chance of the checksum files, the recovery files, and the original file all corrupting in a way that would be undetectable is so remote that it can safely be regarded as "impossible." You'd be far, far more likely to experience a disk hardware failure. For this reason, you should naturally have files this important backed up in multiple locations, one of which should be online with a backup service (e.g. CrashPlan).
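As a toy illustration of why (hypothetical numbers, assuming copies fail independently): if one copy has a 1-in-10,000 chance of corrupting in a given year, then:

```python
# Toy illustration with a hypothetical, assumed-independent failure rate.
p = 1e-4                          # assumed chance one copy corrupts in a year
print(f"one copy:     {p:.0e}")       # 1e-04
print(f"two copies:   {p ** 2:.0e}")  # 1e-08
print(f"three copies: {p ** 3:.0e}")  # 1e-12
```

Every independent copy multiplies the odds of total loss by another factor of p, which is why multiple locations matter more than any single tool.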

BTW, while looking into this I noticed that QuickPar hasn't been in development since 2004. The currently maintained successor is MultiPar, which is free.

You could also look into proprietary backup software, which usually has very sophisticated verification and recovery features.

Edited by dietrc70, 05 March 2013 - 01:58 AM.

#9 harddrivewd (Member, 7 posts)

Posted 05 March 2013 - 05:12 AM


Hello, thanks for your answers; that's close to what I want.

I've just read a PAR2 tutorial, and I also noticed that there has been no development for some time; it also seems a bit complicated to me.

Now I've thought of another strategy: generate MD5 checksum files for all the important files that I want to keep flawless for decades, and then regularly (monthly or annually) compare all of the important files against the small checksum files I generated at the time of creation. I'd like to know if that's sufficient to ensure the files stay perfect.
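That re-check pass is also easy to script. A minimal sketch, assuming a "hash  relative-path" manifest like the one generated earlier in the thread (the paths are hypothetical):

```python
# Sketch: re-verify files against a previously written MD5 manifest.
import hashlib
from pathlib import Path

def md5_of(path: Path) -> str:
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

root = Path("D:/important_files")         # hypothetical folder being checked
with open("manifest.md5") as manifest:
    for line in manifest:
        expected, rel = line.rstrip("\n").split("  ", 1)
        if md5_of(root / rel) != expected:
            print(f"MISMATCH: {rel}")
```

Note that a checksum pass only detects corruption; it cannot repair anything, which is what the PAR2 recovery data suggested above adds.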


Last but not least, I really doubt whether there is such a thing as "data rot" (or have I misunderstood it?).

I've kept a few important video files since 2005. Over the years the files were shifted from one storage device to another many times, and some drives even went bad and were opened by an unofficial, unverified local repair guy to get the files out. I think the files passed through a lot of unreliable and problematic drives over the years, but when I now compare their MD5 checksums against a freshly obtained copy from the remote server, they have the same MD5 identity. How come not even one little string of data changed after all those years across so many problem sectors?


Thanks all

#10 dietrc70 (Member, 106 posts)

Posted 06 March 2013 - 05:56 AM

The built-in error correction in hard drives has been very good for a long time. Most people don't even know what "data rot" is and have never had a problem with it. Spontaneous bit rot is most likely with CD-RWs, magnetic tapes, and flash media. It's rarer on hard drives, but still possible. RAID arrays are also susceptible to undetected corruption if there is a system crash and the cache isn't properly flushed.

I suggest keeping checksums for the files and having multiple backups in different locations. In the extremely unlikely event that one file in one location is corrupted, you will still have the others to compare against and verify.


