Phantasm66
Posts: 4,909 +8
I just finished reading this great article on partitioning strategies here: http://partition.radified.com/
I recommend that everyone read it; it makes a good number of points, shares some tricks and tips, and offers some fairly sensible recommendations for a partitioning structure.
This guide shares insights on the topic of hard drive partitioning: strategies gleaned from plenty of practical experience. The dual premise behind this guide is that:
With hard drive capacities growing so large, intelligent partitioning becomes more of an issue.
It's better to partition intelligently the first time than to have to go back and repartition later.
When discussing hard drive partitions, it helps to have a graphical representation in mind. The image located at the top of this page comes from the program Partition Magic. There you see three FAT32 partitions on a single 45GB hard drive: one Primary partition (labeled drive D), and two Logical DOS drives (labeled I and J) inside a single Extended partition (light-blue outline). The yellow areas contain data; the white areas are empty.
You can verify this for yourself by partitioning your drive as I described. But a much quicker method is to use the hard drive benchmarking utility HD Tach, which contains a setting called Advanced Size Check (ASC). If you put a checkmark in the ASC box, HD Tach will benchmark your *entire* drive. But if you remove the check, HD Tach will *truncate* the test at the first 8GB.
In other words, HD Tach will only benchmark the first 8 gigs. Notice how your *reported* access times improve (get lower) when HD Tach truncates the test at the first 8 gigs. This is because the read/write heads don't have to travel all the way to the far end of the drive (the inner tracks).
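You can get a feel for why truncating the test lowers access times with a quick back-of-the-envelope simulation. This is just a toy model of random head travel over a linear span, not HD Tach's actual methodology:

```python
import random

def mean_seek_span_gb(limit_gb, samples=100_000, seed=1):
    """Average distance (in GB of linear drive span) between two random
    accesses confined to the first `limit_gb` of the drive. A toy model
    of head travel -- not how HD Tach actually measures access time."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        a, b = rng.uniform(0, limit_gb), rng.uniform(0, limit_gb)
        total += abs(a - b)
    return total / samples

# Accesses spread over a full 80GB span average roughly 10x more
# travel than accesses confined to the first 8GB.
full = mean_seek_span_gb(80)
truncated = mean_seek_span_gb(8)
```

Since average travel scales linearly with the span the heads must cover, confining activity to the first 8 gigs of an 80-gig drive cuts the average travel distance by a factor of ten.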
The 'truncating' effect is more dramatic on a drive with a larger capacity. For example, 8 gigs is roughly 40% of a 20-gig drive, but only 10% of an 80-gig drive. In other words, you can limit your drive's travel (seeks) to the fastest 10% of an 80-gig drive by creating an 8-gig partition and storing only your operating system & applications there. It's common knowledge that the same amount of data will 'feel' more responsive on a *larger* drive than on a smaller one, even though both drives may have *identical* manufacturer's performance specs.
This is because the data on the larger drive is confined to a smaller portion of the platter, which gives the larger drive a lower *effective* seek time.
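The arithmetic behind those percentages is simple enough to check yourself (the function name here is just for illustration):

```python
def outer_fraction(partition_gb, drive_gb):
    """Fraction of a drive's capacity covered by a partition placed at
    the start of the disk (the fast outer tracks)."""
    return partition_gb / drive_gb

# The article's numbers: an 8-gig OS partition spans...
print(f"{outer_fraction(8, 20):.0%} of a 20-gig drive")   # 40%
print(f"{outer_fraction(8, 80):.0%} of an 80-gig drive")  # 10%
```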
A drive with multiple partitions allows you to defrag only those partitions that actually need defragging. This saves wear and tear on your drive, and may even help keep it from failing prematurely.
It can take literally hours to defrag 60GB worth of fragmented data, not to mention 120 or 160 gigs. This means you have to plan your defrags much more carefully than you would with a drive divided into multiple smaller partitions. A small partition can be defragged in the time it takes to grab a cup of coffee, or take a .. uh, I mean, go to the bathroom. =)
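If you assume defrag time scales roughly linearly with the amount of data (a simplification; fragmentation level and file count matter too), a quick estimate shows the gap. The 3-hours-per-60GB baseline below is an illustrative guess, not a measured figure:

```python
def defrag_hours(data_gb, baseline_gb=60, baseline_hours=3.0):
    """Back-of-the-envelope defrag time, scaled linearly from an
    assumed baseline (3 hours for 60GB is an illustrative guess,
    not a measured value)."""
    return data_gb / baseline_gb * baseline_hours

# A full 160-gig drive vs. a small 8-gig OS partition:
print(defrag_hours(160))       # hours for 160GB of fragmented data
print(defrag_hours(8) * 60)    # minutes for an 8GB partition
```

Under that assumption, the big drive is an all-evening job while the small OS partition finishes in well under half an hour.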
source: http://partition.radified.com/