3268: OK to defrag drive with image file on it?

26 replies [Last post]
Stephen Whitesell
Offline
Beginner
Joined: 2009-08-24
Posts: 4

The external USB drive I am using for my True Image (.tib) files is in need of defragmenting.  Is it safe to do so, or would defragging a .tib file cause problems if I need to do a restore from it?  I create complete image files (not incrementals) and alternate between Image 1 (M-W-F at 2:00 am) and Image 2 (T-T-S at 2:00 am).  I am using V9.0, although that probably isn't relevant to this question.

Colin B
Colin B's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 9248

There is a risk, albeit small, that defragmenting an image could cause corruption, but probably no more so than when you defrag a drive normally.

Defragging an image drive will often take longer than defragging a normal uncompressed disk.

Since you only make full images, a better solution might be to defrag your system drive before making the image.

Assuming that the drive holding the images is used only for this purpose, you shouldn't be getting much fragmentation, as the drive is not being read from and written to the way a system drive is.

__________________

Windows 10 Pro 1607:14393 + Windows 7 Ultimate 64bit SP1 + Server 2008 R2, SBS2011. Testing Environment VirtualBox VM. TI2016:5518 Acronis Backup Advanced:43994, DD12:3223, vmProtect 7.0:5173, SnapDeploy 4 On a clear disk you can seek forever

Stephen Whitesell
Offline
Beginner
Joined: 2009-08-24
Posts: 4

Thanks for your reply.  I'm glad to hear the risk in defragging an image file is no greater than for any other type of file.  I guess I could temporarily copy the two image files to another hard drive, defrag the USB drive, and then copy them back.  At any rate, one will be replaced tonight and the second one tomorrow night.

My system drive is actually in good shape as I defrag it periodically.  I was really surprised to see the USB drive I am using for the image files (and for archiving other files) was more than 50% fragmented.  I need to read the archive files once in a while, but I don't make changes to them, and I only make additions about once a year.  I suppose the alternate-day erasing and writing of a new image file must be contributing to the fragmentation as the image file size changes.

Dmitry
Dmitry's picture
Offline
Frequent Poster
Joined: 2009-04-17
Posts: 953

Hello Stephen,

Thank you for using Acronis products.

Also note that an incremental backup saves all the changes that have been made since the latest incremental or full image was created. Defragmenting the files on your hard drive counts as such changes, so it will result in the creation of a very large incremental backup.

__________________

Dmitry Nikolaev

[[http://www.acronis.com/support | Acronis Customer Central]] | [[http://www.acronis.com/ | Acronis Backup Software]]
For more answers to your questions, try our [[http://kb.acronis.com | Knowledge base]]

Scott Hieber
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-17
Posts: 3056

You might want to consider that a really big file like a backup image, even if it's in a few dozen pieces, is not going to be read appreciably faster if it is defragged -- you might save a few milliseconds per every twenty or so minutes of operation.  In exchange for that small gain when you restore from the backup file, you get the long process of defragging the large file and all the extra wear and tear on the hard drive head positioner.

It's a good idea in many cases to just exclude tibs from defrag operations -- most aftermarket defraggers will let you exclude file types.

__________________

Be part of the discussion; Post your objective product views on Amazon and other sites. Reading works. Get the User Guide at http://www.acronis.com/en-us/support/documentation/

Stephen Whitesell
Offline
Beginner
Joined: 2009-08-24
Posts: 4

Scott - Good point about just excluding the .tib files when I defrag.  Thanks!

Darren
Offline
Beginner
Joined: 2009-08-22
Posts: 15

If you exclude the tib files then those files won't get reorganised, so your drive will remain over 50% fragmented. Try this little test: analyse your drive and note the degree of fragmentation, then move a tib file to another HD and re-run the analysis. I bet the fragmentation will be lower.


Please see http://en.wikipedia.org/wiki/Defragmentation#Defragmenting_and_optimising

Just my observation.

Darren

Mark Wharton
Mark Wharton's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 2066

Darren:

Your observation is correct, but think about what's accomplished by defragmenting a large .tib file. Even if the file has 50 fragments, each fragment may add a few milliseconds of additional seek time to a process that takes many minutes. You would never notice the difference between, say, a restore operation that takes 30 minutes from a non-fragmented .tib file and 30 minutes plus 150 milliseconds for a .tib file with 50 fragments. Defragmenting large .tib files is a complete waste of time.
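
The arithmetic above can be sketched in a few lines. This is a back-of-envelope estimate, not a measurement; the 3 ms per-fragment penalty is an assumed figure standing in for "a few milliseconds":

```python
# Back-of-envelope: extra seek time a fragmented .tib adds to a restore.
# The 3 ms per-fragment penalty is an assumption ("a few milliseconds").

fragments = 50
seek_penalty_s = 0.003       # assumed extra seek per fragment
restore_s = 30 * 60          # a 30-minute restore

extra_s = fragments * seek_penalty_s
print(f"extra time: {extra_s * 1000:.0f} ms")             # 150 ms
print(f"fraction of restore: {extra_s / restore_s:.6f}")  # ~0.000083
```

So the fragmentation penalty is under a hundredth of a percent of the restore time.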

__________________

Acronis TI 9, TI 10, TI 2011, TI 2013, TI 2014, TI2015, TI 2016, DD 10, DD 11, DD12 user. Amateur Radio K0LO

Darren
Offline
Beginner
Joined: 2009-08-22
Posts: 15

Sure, but the point I'm trying to make is that Stephen's USB HD only contains .tib files, so what's the point in excluding them from the defrag process?

Stephen Whitesell wrote:

The external USB drive I am using for my True Image (.tib) files is in need of defragmenting. 

In effect he might as well not run the process.

Darren

Scott Hieber
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-17
Posts: 3056

Absolutely correct, almost -- the question then would be, why defrag at all if there is nothing but tibs on the disk? If all he has on the disk are tibs, then defragging will put a lot of wear and tear on the drive with virtually no appreciable benefit when backing up or restoring. Tibs are files that are usually used, on average, less than twice: once when created, and then never again except on those rare occasions when you actually use one for a restore.

The benefits of defragging are terribly overrated, especially by defragger vendors. If you look at the average access time for your hard disk, that's roughly the amount of time that's added each time you have to go from one fragment of a file to another. Suppose your average access time is 9 ms and your backup file is 100 GB and in an incredible 3,200 fragments -- after going through all the math, allowing for latency and such, restoring this file will take about 15 seconds longer than it would if it weren't fragmented at all. A tib in that many fragments would need to be on a very fragmented disk and be very large, and would probably take about 30-90 minutes to restore in any event. So what's 15 seconds on an hour-long task, in exchange for the hours one might have spent before that on repeated defragging to avoid frags on the disk?
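
Those figures can be roughly checked. A sketch, assuming Scott's numbers (9 ms average access, 3,200 fragments); the "half a full seek on average" adjustment is my assumption, since many fragments sit near each other on the platter:

```python
# Rough bounds on the seek overhead for a 100 GB tib in 3,200 fragments
# on a disk with a 9 ms average access time.

fragments = 3200
avg_access_s = 0.009

worst_case_s = (fragments - 1) * avg_access_s   # every boundary = a full seek
likely_s = worst_case_s / 2                     # assumed: nearby fragments cost less

print(f"worst case: {worst_case_s:.1f} s")   # ~28.8 s
print(f"likely:     {likely_s:.1f} s")       # ~14.4 s
```

Either way, the penalty is seconds on a task measured in tens of minutes.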

Even if your hard disk over years of use became impossibly fragmented beyond belief, it'd be easier to just copy off some select tibs for history's sake and then delete everything and start over. Probably before you reached that point in terms of fragmentation, your backup disk would be at the end of its useful life anyway, and you could just copy the whole disk of tibs to a new disk.

Darren wrote:

Sure, but the point I'm trying to make is that Stephen's USB HD only contains .tib files, so what's the point in excluding them from the defrag process?

Stephen Whitesell wrote:

The external USB drive I am using for my True Image (.tib) files is in need of defragmenting. 

In effect he might as well not run the process.

Darren

__________________

Be part of the discussion; Post your objective product views on Amazon and other sites. Reading works. Get the User Guide at http://www.acronis.com/en-us/support/documentation/

Stephen Whitesell
Offline
Beginner
Joined: 2009-08-24
Posts: 4

Thanks for all of your comments.  I actually had two types of files on the external USB drive: the .tib files and an "archive" of several important old files that I need to keep.  The archive was created by simply copying them to the drive in an organized folder structure, not by some "archiving" program.  When I saw the drive was over 50% fragmented, I didn't know how it got that way (and frankly still don't), so I thought it would be worthwhile to do a one-time defrag.  Although I might access the archive every week or so to read some of the older files, I only add to it about once per year.  Since the .tib files are "disk images", I wanted to check with the experts that defragmenting a disk with an "image" file on it wouldn't mess up the "image" and make it unusable should I need it.  I went ahead and defragged the drive based on the early response I received.  I have since replaced both copies of the image files through my nightly backup process (described in my original post).

Scott Hieber
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-17
Posts: 3056

Suppose you have a 60 GB drive and you put three tibs on it that are 10 GB each. Suppose that the first is split into 2 frags on either side of the metadata, the second is split in two on either side of the third (it's filling in some space from a previously deleted file), and the third one is in 2 frags on either side of the directory listings.  You'd have half your drive space fragmented, although only three files are fragmented and each is in only 2 frags. With very large files, it's not hard for large amounts of space to be fragmented, which just means that statistic isn't very meaningful in terms of how scattered your files are.
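
The scenario above can be put in numbers. A minimal sketch of why "percent of space fragmented" looks alarming while the actual seek cost is tiny:

```python
# "Percent of space fragmented" vs. actual seek cost for the scenario above:
# a 60 GB drive holding three 10 GB tibs, each in just two fragments.

drive_gb = 60
files = [{"size_gb": 10, "frags": 2}] * 3

space_in_fragmented_files = sum(f["size_gb"] for f in files if f["frags"] > 1)
pct = 100 * space_in_fragmented_files / drive_gb
extra_seeks = sum(f["frags"] - 1 for f in files)

print(f"{pct:.0f}% of drive space is in fragmented files")      # 50%
print(f"extra seeks needed to read everything: {extra_seeks}")  # 3
```

Half the drive is "fragmented", yet reading every file costs only three extra head movements.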

__________________

Be part of the discussion; Post your objective product views on Amazon and other sites. Reading works. Get the User Guide at http://www.acronis.com/en-us/support/documentation/

Verndog
Offline
Beginner
Joined: 2009-11-13
Posts: 12

Another way of looking at this problem: if you put a 100 GB image file onto a 200 GB drive that was defragmented previously, the drive can immediately show up as 50% fragmented, because chances are that image file (which is 50% of the drive) will be stored in at least 2 and probably more pieces.

The question I have is: how fragmented can you let your external backup drive get before it starts having problems, since it's widely known and accepted that heavily fragmented disks have much higher failure rates?

Who defrags their backup drives with large image files, and who doesn't? I'm trying to decide whether I should just leave them alone, and there are good arguments on both sides here.

__________________

True Image Home 2010 build 6053 w Plus Pack
Win 7 x32
Intel Core 2 Duo E8400 3.0ghz 4.0g ram

Mark Wharton
Mark Wharton's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 2066

I think the important consideration here is whether the speed increase gained by defragmenting the disk is significant or not. And the deciding factor is how often the disk is accessed.

Without question it is a good idea to defragment the operating system partition since it contains many thousands of files that are not accessed sequentially, so saving a few milliseconds per file access is worthwhile, and especially noticeable when booting Windows.

But I maintain that it is a complete waste of time to defragment a disk containing backup images. How often are these images accessed? Probably rarely if ever, and only to restore. If a restore operation takes 20 minutes, and if defragmenting the drive can speed up that 20-minute process by ten or twenty milliseconds, then what's to gain?

__________________

Acronis TI 9, TI 10, TI 2011, TI 2013, TI 2014, TI2015, TI 2016, DD 10, DD 11, DD12 user. Amateur Radio K0LO

Verndog
Offline
Beginner
Joined: 2009-11-13
Posts: 12

K0LO wrote:
....But I maintain that it is a complete waste of time to defragment a disk containing backup images. How often are these images accessed? Probably rarely if ever, and only to restore. If a restore operation takes 20 minutes, and if defragmenting the drive can speed up that 20-minute process by ten or twenty milliseconds, then what's to gain?

I think you are ignoring the fact that a heavily fragmented disk can and will eventually fail. It's more than just speed at risk here. If I thought this were only a speed issue I would agree with your time analysis. The problem is that this is a disk health issue as well, and your opinion ignores that fact.

__________________

True Image Home 2010 build 6053 w Plus Pack
Win 7 x32
Intel Core 2 Duo E8400 3.0ghz 4.0g ram

oracledba
Offline
Regular Poster
MVP
Joined: 2009-09-16
Posts: 240

You don't want the cure to be worse than the disease.

If you ever do a defrag and you're sitting there watching it, waiting for it to finish, then you are by definition maximizing the negative performance impact of fragmentation on your disk. So if you're going to defragment, then for gosh sakes do it off hours while you're sleeping or otherwise away from the PC -- that is, if you buy into the concept that a defrag will be unattended, automated, and possibly off hours.

The question then becomes: to what benefit? If you use a product like Diskeeper, which defrags the C:\ drive continually, 24x7, in the background, then be prepared for daily 700 MB incremental files. You'll never suffer from fragmentation, but you'll pay in terms of extra backup storage needed.

I have found that a weekly or monthly defrag scheduled a few hours before a full image yields almost the same performance benefit as defragging continually/daily, but at minimal cost (storage or runtime performance). I have Diskeeper and have used it both ways.
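
That trade-off can be sketched roughly. The ~700 MB/day figure is the observation above; the weekly-full schedule is an assumption I've added for illustration:

```python
# Storage cost of continuous background defrag feeding daily incrementals,
# vs. defragging just before the full image (which resets the chain).
# 700 MB/day is the observed figure above; the weekly full is assumed.

daily_incr_mb = 700          # incremental size inflated by continuous defrag
incrementals_per_cycle = 6   # assumed: one full image weekly, incrementals between

extra_mb_per_week = daily_incr_mb * incrementals_per_cycle
extra_gb_per_year = extra_mb_per_week * 52 / 1024

print(f"extra backup storage per week: {extra_mb_per_week} MB")  # 4200 MB
print(f"roughly per year: {extra_gb_per_year:.0f} GB")           # ~213 GB
```

Under those assumptions, continuous defrag costs on the order of hundreds of gigabytes of backup storage per year, which is the "pay in extra storage" point above.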

I have no numbers to back up defragging the target device (or not). I have defragged my target device before, but I'm not sure what that did for me, as all my backups are unattended/off-hours. I really never look to see how long a backup takes; I get an email when one fails, and I also have my backups send me email if the backup was a new full image. But other than the expected 48 "new full image taken" emails a year and the occasional "incr failure" email, I never hear from my backups. All backups simply work day in, day out without any effort or thought. All that matters is that they are done by the time I wake up -- it matters not to me if it took 30 minutes or 3 hours.

Colin B
Colin B's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 9248

I would advise against defragging an image file for the simple reason that the defragging program is rearranging blocks of data, not just the entries in the file system tables. Therefore the possibility (however remote) is greater that a corrupted byte may occur, which would then render the image file unrestorable.

If the image drive is only used for writing images to, and you are not constantly deleting files etc. on it, then the amount of fragmentation that drive will suffer will be far less than that of an active drive.

__________________

Windows 10 Pro 1607:14393 + Windows 7 Ultimate 64bit SP1 + Server 2008 R2, SBS2011. Testing Environment VirtualBox VM. TI2016:5518 Acronis Backup Advanced:43994, DD12:3223, vmProtect 7.0:5173, SnapDeploy 4 On a clear disk you can seek forever

Mark Wharton
Mark Wharton's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 2066

Verndog wrote:
I think you are ignoring the fact that a heavily fragmanted disk can and will eventually fail. It's more then just speed at risk here. If I thought this were only a speed issue I would agree with your time analogy. Problem is this is a disk health issue as well, and your opinion ignors that fact.
If your argument is that a heavily fragmented disk will fail because its head positioning mechanism is being exercised heavily, then what do you think happens when you are defragmenting? Doesn't the head positioning mechanism get exercised to the extreme? So using this line of reasoning, couldn't one argue that frequent defragmentation of a disk will eventually wear it out?

There is a happy medium to be achieved here between performance and time saving. I agree with oracledba - a thorough defragmentation of the Windows system disk is only needed about once a month. I came to this conclusion after several years' use of PerfectDisk. Once thoroughly defragmented, the performance of the disk is optimal, and the degradation after a week is negligible. Monthly is good enough.

I still stand behind my claim that it is a complete waste of time to defragment a disk containing .tib files. By your reasoning, you will save wear and tear on the disk positioning mechanism by NOT defragmenting it, you will lessen the risk of corrupting a perfectly good backup image, and you will save the time needed to do the defragmentation. I can't think of a single argument in favor of defragging a bunch of tib files.

__________________

Acronis TI 9, TI 10, TI 2011, TI 2013, TI 2014, TI2015, TI 2016, DD 10, DD 11, DD12 user. Amateur Radio K0LO

Verndog
Offline
Beginner
Joined: 2009-11-13
Posts: 12

K0LO wrote:

Verndog wrote:
I think you are ignoring the fact that a heavily fragmented disk can and will eventually fail. It's more than just speed at risk here. If I thought this were only a speed issue I would agree with your time analysis. The problem is that this is a disk health issue as well, and your opinion ignores that fact.
If your argument is that a heavily fragmented disk will fail because its head positioning mechanism is being exercised heavily, then what do you think happens when you are defragmenting? Doesn't the head positioning mechanism get exercised to the extreme? So using this line of reasoning, couldn't one argue that frequent defragmentation of a disk will eventually wear it out?

There is a happy medium to be achieved here between performance and time saving. I agree with oracledba - a thorough defragmentation of the Windows system disk is only needed about once a month. I came to this conclusion after several years' use of PerfectDisk. Once thoroughly defragmented, the performance of the disk is optimal, and the degradation after a week is negligible. Monthly is good enough.

I still stand behind my claim that it is a complete waste of time to defragment a disk containing .tib files. By your reasoning, you will save wear and tear on the disk positioning mechanism by NOT defragmenting it, you will lessen the risk of corrupting a perfectly good backup image, and you will save the time needed to do the defragmentation. I can't think of a single argument in favor of defragging a bunch of tib files.

I'm not concerned about wearing out the drive. I'm concerned about corrupted data from heavily fragmented disks -- exactly what happens when you delete and add large files to a drive over and over, daily.

Testing shows this can occur, and it makes sense at some point, be it once per month, year, or whatever; defragmentation has other benefits to disk health besides access speed within the NTFS system.

Here is some evidence of this happening from testing done...

"C. FILE CORRUPTION AND DATA LOSS
File corruption and data loss are both immediately traceable to fragmentation.
In recent tests on Windows 2000 and Windows XP, a specially designed utility
was utilized to fragment an NTFS volume. Even though the test drive was
only 40 percent full, the files themselves were fragmented resulting in the
automatic creation of additional MFT records. When attempting to move one
contiguous 72 MB file onto that disk, the result was the corruption of
everything on the disk.
Why would this occur? The presence of excessive file fragments on a disk
makes it more difficult for the operating system to function efficiently. When a
file is added, large-scale data corruption can result.
This message, for example, is not uncommon:
Windows NT could not start because the following file is missing
or corrupt:
\System32\Ntoskrnl.exe. Please re-install a copy
of the above file.
This form of corruption/data loss led to the inability to boot up. Why?
According to Microsoft KB article Q224526 some key files needed for booting
the operating system were situated beyond cylinder 1023 on the volume. But,
given the CHS (cylinder/head/sector) setup on the machine, the boot
sequence could only see the first 7.68 GB of the volume during the initial boot
phase. The needed system file was situated beyond where the INT 13 BIOS
interface could find it."

__________________

True Image Home 2010 build 6053 w Plus Pack
Win 7 x32
Intel Core 2 Duo E8400 3.0ghz 4.0g ram

Mark Wharton
Mark Wharton's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 2066

Could you provide a reference to the article that you're quoting? Its conclusions sound extremely dubious to me, but I'd like to read it myself.

For example, your last paragraph refers to extremely outdated technology (the 7.68 GB barrier on hard disks) that is not an issue any more.

__________________

Acronis TI 9, TI 10, TI 2011, TI 2013, TI 2014, TI2015, TI 2016, DD 10, DD 11, DD12 user. Amateur Radio K0LO

Verndog
Offline
Beginner
Joined: 2009-11-13
Posts: 12

K0LO wrote:
Could you provide a reference to the article that you're quoting? Its conclusions sound extremely dubious to me, but I'd like to read it myself. For example, your last paragraph refers to extremely outdated technology (the 7.68 GB barrier on hard disks) that is not an issue any more.

Here you go Mark. I have researched both sides of this after having 2 external drives crash over the last 5 years that were used solely for backup of large files. They were used very little otherwise, and were NOT defragged.

__________________

True Image Home 2010 build 6053 w Plus Pack
Win 7 x32
Intel Core 2 Duo E8400 3.0ghz 4.0g ram

Mark Wharton
Mark Wharton's picture
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-15
Posts: 2066

Verndog:

Thank you for the article reference. It was written by Executive Software, now known as Diskeeper, in 2002. As you might imagine, Diskeeper has a vested interest in promoting and selling defragmentation software, so you might keep that in mind when reading the article.

I did read the article and have a couple of observations, but I don't have any particular expertise in defragmentation, so take my observations as coming from an interested outsider. First, the desire to oversell the merits of defragmentation comes across strongly in the article. They seize upon tiny issues and blow them up way out of proportion, IMHO. Second, many of the issues discussed are outdated. 2002 was an eternity ago in computer time and there have been several generational changes in both disk drive hardware and PC software since then.

Nonetheless, Diskeeper and Raxco (makers of PerfectDisk) are both respectable companies and their products are in widespread use. I was a big fan of PerfectDisk for many years but now consider the defragmentation software that is built into Windows 7 to be perfectly adequate.

In any event, thank you for the interesting discussion. Remember that PC stands for Personal Computer, and we are all free to make our own personal decisions on how we organize, maintain, and use these wonderful pieces of technology.

__________________

Acronis TI 9, TI 10, TI 2011, TI 2013, TI 2014, TI2015, TI 2016, DD 10, DD 11, DD12 user. Amateur Radio K0LO

Verndog
Offline
Beginner
Joined: 2009-11-13
Posts: 12

I think we are all here trying to reduce the chances of lost data. If leaving a piece of the puzzle out means failure somewhere along the way, then I want to avoid that when possible.

I don't profess to be an expert on defrag, or even on hard drives; just someone who in years past did suffer from data loss that I set out not to invite back into my experience. I tend to use the common-sense approach to many things. And learning that files can be fragmented into upwards of 95,000 pieces, to me, means increased potential for loss.

I found this section quite interesting....

"...100 pieces per file may be a conservative estimate, however. A study by
American Business Research conducted on 100 companies revealed that 56
percent of NT/Windows 2000 workstations had files fragmented into between
1050 and 8162 pieces. One in four reported finding files with as many as
10,000 to 51,222 fragments. For servers, an even greater degree of
fragmentation exists. Half of the respondents discovered 2000 to 10,000
fragments and another 33 percent had files fragmented into 10,333 to 95,000
pieces."

However... I would agree that it's in their best interest to put fear into potential customers, and I weigh that into the mix. Personally I find the freeware Auslogics Disk Defrag program to work excellently, so they haven't convinced me entirely... Thank you for the responses as well. For now, I'll defrag C: weekly, watch the external drives... and ponder this issue further.

http://download.cnet.com/Auslogics-Disk-Defrag/3000-2094_4-10567503.html

__________________

True Image Home 2010 build 6053 w Plus Pack
Win 7 x32
Intel Core 2 Duo E8400 3.0ghz 4.0g ram

Scott Hieber
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-17
Posts: 3056

Do fragmented drives fail more often or sooner?

The only increase in activity or movement or wear due to fragmented files is head movement -- you would need a heck of a lot of fragmentation for the difference to matter. Otoh, defragging involves lots of head movement. Arguably, defragging really large backup files in particular will put more wear and tear on your drive than if you just left those files alone -- this is especially so since backup files are rarely used, except being read and written, of course, by the defragging itself, if you do a lot of it. ;)

If you need to defrag a drive, it would be prudent to exclude large backup files from the defrag process, unless you don't mind the extra wear and tear.

__________________

Be part of the discussion; Post your objective product views on Amazon and other sites. Reading works. Get the User Guide at http://www.acronis.com/en-us/support/documentation/

GoneToPlaid
Offline
Regular Poster
Joined: 2009-11-14
Posts: 128

Hi everyone,

I specifically use VoptXP for defragging all of my drives. The author of the program now has a new version for Win7. The author used to work for Microsoft, where he wrote the defrag utility that was included in Windows 9x and updated in XP. Anyway, I've used VoptXP for years on numerous machines and have never encountered any problems. That's my two cents. :-)

Scott Hieber
Offline
Acronis MVP Volunteer
MVP
Joined: 2009-08-17
Posts: 3056

lol. I doubt MS would have let him use anything remotely like the code he developed while in MS's employ, unless it was pretty generic code. They've got pretty good lawyers over there at MS, I hear ;).

Otoh, MS has long been thought to have, shall we say, adapted most of its defrag code from the version of Diskeeper it used to use.

__________________

Be part of the discussion; Post your objective product views on Amazon and other sites. Reading works. Get the User Guide at http://www.acronis.com/en-us/support/documentation/

oracledba
Offline
Regular Poster
MVP
Joined: 2009-09-16
Posts: 240

chain2gen may be of assistance with regard to scheduling/performing defrags of one's hard drive.
http://forum.acronis.com/forum/5940#comment-13500