I have been looking for some time now to find my ultimate defragger, as Windows unfortunately still treats all your files as equal. UltimateDefrag is good, but it lacks some ease of use, which is the reason why I have not bought it yet.
I believe in their theory that the outer tracks are faster than the inner tracks, and that data should be organized so that unused old data is moved to the inner tracks and often-used data is consolidated on the outer tracks. However, setting up which data is used often and which is not seems somewhat tedious and difficult. Maybe I do not completely understand how to set up UltimateDefrag the way I would want it (in order):

On the fast outer tracks:
1) Windows boot-up files first, on the fastest part of the disk
=> only the files Windows needs -- not the whole slew of files in the Windows folder, the goal being that Windows will boot up faster.
2) Services files
=> all files of the various services that get started (most likely these overlap with the Windows Boot-up files)
3) 'Program Files' of the programs that run on startup (Messenger, Anti-virus, SQL Service Manager, Gmail Notifier)
=> these are usually all files that get loaded when the user logs in
4) Windows shut-down files
=> I really dislike waiting on Windows XP while it takes several disk-grinding minutes to shut down
5) 'Program Files' of the x most used programs
=> a typical user has a set of applications that they start after logging on, once all the disk-grinding startup programs have loaded (in my case Outlook, Visual Studio, IE, and an RSS reader)
6) Recently used data files
=> the data files could be analyzed to determine their growth rate, allowing the defragmenter to 'pad' some empty space after them and avoid instant fragmentation when the files are written to again.
(Outlook ost file, my project files, most of the 'My Documents' files, SQL Server database files)
7) Files used randomly by programs
=> a good example here is IE temporary files. These get written to all the time, so read/write activity is high. Other such files are application settings and log files that are constantly read and written.
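The seven-category ordering above could be expressed as a simple scoring scheme. A minimal sketch in Python (the category names, ranks, and example files are my own assumptions for illustration, not anything UltimateDefrag actually exposes):

```python
# Hypothetical priority ranks for outer-track placement.
# Lower rank = closer to the fast outer tracks.
PLACEMENT_RANK = {
    "boot": 1,              # files Windows needs to boot
    "service": 2,           # files of services started at boot
    "startup_program": 3,   # programs run at user logon
    "shutdown": 4,          # files touched during shutdown
    "frequent_program": 5,  # the user's most-used applications
    "recent_data": 6,       # recently used data files (padded for growth)
    "random_access": 7,     # temp files, settings, logs
}

def order_outer_to_inner(files):
    """Sort (name, category) pairs from outermost to innermost placement."""
    return sorted(files, key=lambda f: PLACEMENT_RANK[f[1]])

files = [
    ("outlook.exe", "frequent_program"),
    ("ntoskrnl.exe", "boot"),
    ("index.dat", "random_access"),
    ("services.exe", "service"),
]
print([name for name, _ in order_outer_to_inner(files)])
# → ['ntoskrnl.exe', 'services.exe', 'outlook.exe', 'index.dat']
```

The hard part, of course, is not the sorting but deciding which category each file belongs to, which is exactly the tedious setup step I complained about.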
On the slower inner tracks:
I'd indeed like to have the unused files there, most likely ordered from never used (compressed and consolidated on the innermost tracks, in an attempt to use the minimal amount of space) to last used, but not recently: say, not used/accessed for at least ?? days.
All the rest of the data lives in the middle tracks, again being aware of files that change often and could use some padding to prevent instant fragmentation. Another special group is the 'large' files that get written to, such as Virtual PC images and SQL Server database files. We want to avoid fragmentation of these files and still get good performance from them.
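The padding idea can be sketched concretely: watch a file's size over a few intervals and reserve room for its expected growth. A hypothetical Python sketch (the sampling scheme, the padding factor of two intervals, and the 4 KiB cluster size are my own assumptions):

```python
def padding_clusters(size_samples, cluster_size=4096, factor=2.0):
    """Estimate how many extra clusters to reserve after a file,
    based on its observed growth across equally spaced size samples.
    size_samples: file sizes in bytes, oldest first."""
    if len(size_samples) < 2:
        return 0  # no history, no padding
    growth = max(0, size_samples[-1] - size_samples[0])
    avg_growth_per_interval = growth / (len(size_samples) - 1)
    # Reserve `factor` intervals' worth of growth, in whole clusters.
    pad_bytes = int(avg_growth_per_interval * factor)
    return -(-pad_bytes // cluster_size)  # ceiling division

# A file growing ~1 MB per interval gets room for two intervals' growth:
samples = [10_000_000, 11_000_000, 12_000_000]
print(padding_clusters(samples))
# → 489
```

A shrinking or static file gets zero padding, so the scheme degrades gracefully to plain consolidation.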
Now I have been looking at writing my own defragger in C#, using the Defrag API C# wrappers from Jeffrey Wall's weblog, the knowledge gained from Inside Windows NT Disk Defragmenting, and a look at the source of several Sysinternals tools such as NTFSInfo. But I doubt I'll ever find the time to start this experiment.
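If I ever do start, the core loop would be: ask NTFS for a file's fragment map, pick a target region, and issue one move request per out-of-place fragment (FSCTL_GET_RETRIEVAL_POINTERS and FSCTL_MOVE_FILE under the hood, which is what those wrappers expose). The planning logic itself is simple enough to sketch in Python as pure logic, ignoring the real question of whether the target clusters are free (a hypothetical sketch, not the actual API):

```python
def plan_moves(extents, target_lcn):
    """Given a file's fragments as (start_lcn, cluster_count) pairs,
    plan the moves that make the file contiguous at target_lcn.
    Each planned move mirrors one FSCTL_MOVE_FILE request:
    (file_vcn, new_lcn, cluster_count). Assumes the target
    region is free, which a real defragger must verify first."""
    moves = []
    vcn = 0            # virtual cluster number within the file
    lcn = target_lcn   # where the next fragment should land on disk
    for start, count in extents:
        if start != lcn:  # fragment is not already in place
            moves.append((vcn, lcn, count))
        vcn += count
        lcn += count
    return moves

# A file in three fragments, to be consolidated at LCN 5000:
extents = [(1200, 16), (9000, 8), (5016, 4)]
print(plan_moves(extents, 5000))
# → [(0, 5000, 16), (16, 5016, 8), (24, 5024, 4)]
```

A fragment that already sits at its target position generates no move, so re-running the planner on a consolidated file yields an empty plan.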
Last revised: 19 Jan, 2012 08:47 PM