Apple changes how Mac OS X reports drive capacity in Mac OS X 10.6 Snow Leopard

Apple has posted a KnowledgeBase article (TS2419) that explains the changes the company has made in how Mac OS X reports drive capacity in Mac OS X 10.6 Snow Leopard versus earlier versions of Mac OS X:

Storage drive capacity in Mac OS X v10.0 through 10.5:

Storage drive manufacturers measure storage drive capacity using the decimal system (base 10), so 1 gigabyte (GB) is calculated as exactly 1,000,000,000 bytes. The capacity of the storage drive in your Mac, iPod, iPhone and other Apple hardware is measured using the decimal system. We set this out on our product packaging and on our website through the statement “1 GB = 1 billion bytes.”

Operating systems, including the operating system on your Mac, iPod, iPhone, or other electronic devices, use the binary system (base 2) of measurement. In binary, 1 GB is calculated as 1,073,741,824 bytes. This difference in how the decimal and binary numeral systems measure a GB is what causes a 4 GB storage drive to appear as 3.7 GB when detailed by an operating system, even though the storage drive still has 4 billion bytes, as reported. You will see this difference if you look at how your computer summarizes the capacity of the computer’s storage drive or of your iPod’s or iPhone’s storage drive when the device is connected to your computer. You will also see this difference in the “About” menu on your iPod or iPhone. The important point to understand is that the available storage capacity is the same no matter which system is used. Nothing is missing.
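The arithmetic is easy to check. Here is a minimal Python sketch of the two conventions (ours, not Apple's):

    # One "gigabyte" under each convention.
    DECIMAL_GB = 1000 ** 3   # 1,000,000,000 bytes (what the drive label means)
    BINARY_GB = 1024 ** 3    # 1,073,741,824 bytes (what Mac OS X 10.0-10.5 counted)

    drive_bytes = 4 * DECIMAL_GB     # the "4 GB" drive from the example above

    print(drive_bytes / DECIMAL_GB)  # 4.0  -- what the packaging says
    print(drive_bytes / BINARY_GB)   # ~3.7 -- what the operating system showed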

The storage drive in your Apple product, like all storage drives, uses some capacity for formatting, so actual capacity available for applications will be less. In addition, other factors, such as pre-installed systems or other software and media, will also use part of the available storage capacity on the drive.

Storage drive capacity in Mac OS X v10.6 and later:

In Mac OS X v10.6 Snow Leopard, storage capacity is displayed per the product specifications (base 10), so a 200 GB drive shows a capacity of 200 GB (for example, if you select the hard drive’s icon and choose Get Info from the Finder’s File menu, then look at the Capacity line). This means that if you upgrade from an earlier version of Mac OS X, your drive may show more capacity than it did under the earlier version.
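The same sketch, applied to that 200 GB example (again ours; it assumes the drive label means exactly 200 billion bytes, per the statement above):

    # The same drive, summarized before and after the change.
    drive_bytes = 200 * 1000 ** 3              # 200 GB per the label

    leopard_gb = drive_bytes / 1024 ** 3       # base 2, Mac OS X 10.0-10.5
    snow_leopard_gb = drive_bytes / 1000 ** 3  # base 10, Mac OS X 10.6

    print(round(leopard_gb, 1))       # 186.3 -- the old, smaller-looking figure
    print(round(snow_leopard_gb, 1))  # 200.0 -- what Get Info now reports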

The storage drive in your Apple product, like all storage drives, uses some capacity for formatting, so actual storage available for applications will be less. In addition, other factors, such as pre-installed systems or other software and media, will also use part of the available storage capacity on the drive.

Source: Apple Inc.

MacDailyNews Note: While the measurement has changed, so has the code. Snow Leopard is indeed smaller than Leopard regardless of the measurement system you choose.

[Thanks to MacDailyNews Reader “Shawn J.” for the heads up.]

64 Comments

  1. I have a 500GB HDD here on my iMac and in Disk Utility it shows as 500.11GB WDC WD500, whereas it used to show as 480 something or so.

    My “available” in Leopard said 233.44GB just before upgrading. Now with Snow Leopard it’s showing 260GB available. Someone can do the calculation if they want (a quick sketch follows below).

    Fun fun.
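Taking the commenter up on that calculation, a rough sketch; it assumes Leopard’s figures were binary GB and Snow Leopard’s are decimal:

    GIB = 1024 ** 3   # a binary GB, what Leopard counted
    GB = 1000 ** 3    # a decimal GB, what Snow Leopard counts

    # Leopard's 233.44 GB available, restated in decimal GB:
    print(233.44 * GIB / GB)   # ~250.7

    # The drive itself: 500.11 decimal GB restated in binary GB:
    print(500.11 * GB / GIB)   # ~465.8

So roughly 250.7 of the 260GB now showing as available is just the relabeling; the remaining 9GB or so looks like space actually freed by the Snow Leopard install.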

  2. It seems like this was always going to be a contentious action to take, but the reality is that Giga and Mega are words used in Base 10, not Base 2. The convention has always been wrong. It isn’t a marketing scam; it’s a semantics issue. Most people will probably welcome this change. Now all we have to do is get the industry as a whole to move over to the correct line of thinking. Processor and HDD manufacturers have always had it right. Gibibyte and Mebibyte are the correct terms for Base 2. Here’s a good article on it: http://www.macworld.co.uk/mac/news/index.cfm?newsid=27034&pagtype=allchandate
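For reference, a small sketch listing the SI decimal units beside the IEC binary units the comment refers to:

    # SI decimal units vs. the IEC binary units (kibi, mebi, gibi).
    units = {
        "kB (kilobyte)": 1000,       "KiB (kibibyte)": 1024,
        "MB (megabyte)": 1000 ** 2,  "MiB (mebibyte)": 1024 ** 2,
        "GB (gigabyte)": 1000 ** 3,  "GiB (gibibyte)": 1024 ** 3,
    }
    for name, size in units.items():
        print(f"{name:>15} = {size:>13,} bytes")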

  3. And to respond, off-topic, to Ken1w:

    In the 18 years that I have lived in the States, I have not been able to figure out how dividing by fractions can be easier than metric. Which drill bit is bigger: 5/16″, 9/32″ or 3/8″? And exactly how big is it? I can always easily tell 6.7, 6.9 and 7.1mm apart, and I could pick each one out of a line-up just by looking at it. Let’s not forget 12 inches in a foot and 3 feet in a yard (and the nice, round numbers of 1,760 yards, or 5,280 feet, in a mile). If my apartment building has a wall that’s 27m long and I were to convert it into imperial measurements, it would be (approximately) 30 yards, or 90 feet, or (I need a calculator for this) 1,080 inches. Or it would be 27m, 270dm, 2,700cm, 27,000mm, etc. Just writing this is giving me a massive headache…

  4. @ Predrag

    The only reason a 250 GB drive is labeled as such is that, for PURELY marketing purposes, a GB has been defined as 1,000,000,000 Bytes in accordance with the decimal, or metric (BASE10), system of measurement: it makes the drives seem to have a higher capacity than they really do. In this case, giga = 1000³. This is NOT how computers read data. Computers operate on the binary (BASE2) system. By the binary system, giga = 1024³. Here’s Wikipedia’s page on the matter… read up.
    http://en.wikipedia.org/wiki/Gigabyte

  5. And to quote myself:

    To anyone but an übergeek, this (Base2 division) makes absolutely no sense.

    Again, you’re wrong. The main (if not only) reason a 250GB drive is labeled as such is so that ordinary people (unlike you and me) can infer that it has 250GB, or 250,000MB, or 250,000,000kB (etc). To label it any differently would cause massive confusion among ordinary people (again, unlike you and me).

    Sometimes, one has to just give it up and go with what the prevailing system is. It wouldn’t be the first time.

  6. @ Predrag

    Why would ordinary people need or want to infer what 250GB breaks down to in MB or KB? The industry should just start labeling the capacities according to how computers are “supposed” to read them, and be done with it. The unwashed masses aren’t going to break it down, on paper or in their heads. As it is, there still may be mass confusion when someone downloads a file that is supposed to be one size, but now OS X says it’s something else.

  7. @Predrag, @topshot

    After giving it some thought, I have to agree that the IEC’s effort to promote “Gibibyte” and so on is a step in the right direction to avoid confusion. By the same token, though, the whole industry really needs to switch over to it and stop using Giga for base-2 quantities completely. Failing to do so just makes things more confusing.

  8. The hell with numbers! The fuel gauge in my car isn’t marked in gallons or liters. It has a needle that registers from ‘Empty’ to ‘Full’. That’s all we friggin’ need.

  9. …”To anyone but an übergeek, this makes absolutely no sense.”

    I’m only a regular geek, and I like binary.

    “…Using Base10 may have been a marketing scam long ago (the Base10 number is higher than Base2 for the same capacity),…”

    It’s still a marketing scam. The average person can figger out that a 500GB drive is twice as big as a 250, whatever base is used.

    It really makes no difference to me, but using 1K=1,024 is proper for computers. The only reason the change to decimal bothers me so much is that it means the marketing people won. Kind of like how MS is run by a salesman.

  10. My only concern — small, though it may be — is that it is new to me.

    How big is a 4.7GB DVD in this new math?
    What padding factor do I use when I have to copy a bunch of files to/from Windows or Linux NFS shares? (A rough sketch follows below.)

    I just have to adjust, that’s all. Wish this were a preference — OR that I could see the new unit of measure in parens so I could adjust over time.
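A rough sketch of both answers; it assumes the 4.7GB on a DVD label is decimal (it is) and that Windows still reports sizes in base 2:

    GB, GIB = 1000 ** 3, 1024 ** 3

    # A "4.7 GB" DVD is labeled in decimal bytes, so in the new math it is
    # simply 4.7 GB; the old base-2 display made it look smaller:
    print(4.7 * GB / GIB)   # ~4.38

    # Padding factor against systems that still count in base 2: one
    # binary GB is about 7.4% larger than a decimal GB.
    print(GIB / GB)         # 1.073741824

So padding estimates by about 7.5% is a safe rule of thumb.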

  11. The correct unit abbreviated with “GB” is metric: 10^9 = 1000^3, giga. In the past, Mac OS X reported in gibibytes (2^30 = 1024^3 bytes), properly abbreviated “GiB”, but it falsely used the abbreviation “GB”. The problem here is that no one besides IT geeks and physicists knew what a GiB was.

    All Apple has done (once again as the first manufacturer?) is to use the proper units and follow international standards.

    As a physicist, I can only applaud this decision.

    Bravo, Apple!

  12. @ Predrag

    RE: Metric system

    The metric system may be better for people who just want a consistent system that is easy to remember, convert, and calculate with a calculator. Traditional measurements that evolved over time are better suited for people who actually need to measure things in the course of practicing their professions. One inch is a useful size (not too small, not too big); one centimeter is less useful, and one decimeter is useless. One foot is also a useful size; there is no metric equivalent. Twelve inches in a foot is useful, because you can divide 12 by 2, 3, 4, and 6 and get a whole number. You can only divide 10 by 2 and 5 (and dividing by 5 is seldom done as a measuring act). And dividing 10 by 3 (dividing by 3 is often done) results in something that you cannot even represent precisely with decimals.

    Cooking in the U.S. is done using units like the tablespoon and cup. Metric cooking recipes are often in milliliters and grams; those measuring cups with milliliter markings curiously tend to be just about the size of a traditional “cup” (I wonder why?); and I would go crazy if I had to measure simple things with a scale (by weight in grams) instead of by teaspoon or tablespoon (what are the metric equivalents?).

    Bottom line: metric is consistent in concept but arbitrary in how its units suit actual uses. Traditional measurements are the way they are because people whose livelihoods depended on them, over centuries of actual use, found those units to be the most useful.

  13. @ken1w:

    Bullshit.

    You’re only right for household uses. Outside of those, metric quickly becomes far more useful, easier, and better.

    There’s a reason why physicists, astronomers, and the world’s best engineers use the metric system.

  14. @elgarak
    Apple was not using GB incorrectly. By JEDEC’s long-standing standards, a Gigabyte is defined as 1024³ bytes. Any other definition of Gigabyte ignores the fact that computers operate on the binary system. The “Gibibyte” term is a relatively recent creation by others to address the mass confusion over what Gigabyte means. See my previously linked Wikipedia article for further explanation.

  15. Do you people even KNOW what Fahrenheit used as a scale? If you did, you probably wouldn’t be posting nonsense about such a bizarre metric…

    Celsius is the best for everyday use.
    Kelvin is the “right” one because its zero point is the absolute zero.

    THE END.

  16. @Fat Bastard:

    I’m pretty sure that the SI definition of the Giga prefix (10^9) predates the JEDEC definition… which, incidentally, is applied far less consistently than the alternative that has been adopted by pretty much every other standards-granting body, since the prefixes are used in the metric sense for data-transfer speeds, for instance…

  17. @Lukeskymac

    Fahrenheit used human beings as the scale. Normal body temperature was to be 100, and zero was to be around the freezing point of a salt-water brine. The numbers are slightly off (e.g. body temperature is 98.6), but close.

  18. @elgarak
    If JEDEC’s definition of Giga is technically incorrect, then re-using it in accordance with the metric system is incorrect as well, which is what Apple is now doing. Computers do not operate using the metric system. That’s like saying “I walked for five gallons today.”

  19. And to continue the off-topic debate…

    ken1w‘s answer is totally arbitrary. Measuring utensils (cups, spoons, vials and other vessels) come in all kinds of sizes: some are 10ml (around 1/3 fl oz), some are 50ml (about 1.7 fl oz), 100ml (about 3.4 fl oz), 200ml, 250ml (around 1 cup), 333ml (1/3 litre), 0.5l, 1l, 5l (about a gallon and a quarter), 10l, etc. They all have nice, round numbers, easily multiplied by 2, 5 or 10 (and in some cases, 3 and 4, where practical).

    My mother, as well as my grandmothers, cooked all her life using the metric system (in a country where everything is made from scratch). When she arrived in the US, she was just baffled by the myriad of measurements that have very little correlation between them. You simply have to memorise them all if you want to survive in the US: teaspoon, tablespoon, fluid ounce (not to be confused with the dry ounce), cup, pint, quart, gallon; then weight (ounce, pound); not to mention how many of each go into the others. Length is no less awkward (inches, feet, yards, miles, and the arbitrary relations between them), not to mention the unnatural sizes of any and all of them (at least for those of us who grew up on metric).

    The point is, when you use the metric system, you don’t need to memorise anything. Everything is consistent, divisible by 10 and interchangeable (1 litre of water weighs 1kg and takes up 1 cubic decimetre).

    As for the subject at hand, Fat Bastard is still wrong. Ordinary people very often need to figure out how many kilobytes and megabytes they can squeeze onto the hard drive. Their digital cameras shoot pictures that are about 300kB-1MB; their MP3 music is around 4MB per song; their movies are around 2GB each. So, how many of those 300kB files can fit into 1MB? 3? More than 3? Is that 1MB 1,000kB or 1,024kB? (A quick sketch follows below.)

    This stuff is definitely not just for geeks, and we normal folk strongly prefer dividing by 10 to dividing by 2 to the power of 10 (1,024).
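For what it’s worth, a quick sketch of that last question:

    # How many 300 kB photos fit in "1 MB"? It depends whose MB it is.
    file_kb = 300

    decimal_mb_in_kb = 1000   # base 10, Snow Leopard's MB
    binary_mb_in_kb = 1024    # base 2, the old-style MB

    print(divmod(decimal_mb_in_kb, file_kb))  # (3, 100): 3 files, 100 kB left
    print(divmod(binary_mb_in_kb, file_kb))   # (3, 124): 3 files, 124 kB left

Either way the answer is 3; only the leftover space differs.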

  20. The commentary on this site is getting more and more bizarre by the day. Thanks for the free entertainment folks!

    @ Predrag
    Stop trying to be logical — you’re trying to have a battle of the wits with unarmed opponents! LOL

  21. My first reaction was “how far Apple has strayed from the days of 12-inch monitors.” Back around 1990, when everyone else falsely advertised 13″ monitors, Apple alone claimed 12″. The dumb public couldn’t be bothered to check that they were the same viewable size, and the so-called 13″ ones had a whole inch hidden by the bezel.

    Naturally, after Apple gave in and started advertising CRT sizes like everyone else, lawyers got involved and rather than force everyone to list viewable size, they added fine print instead to cover their asses. Not an issue anymore since LCDs have taken over.

    But, after the comments by Predrag and others, I’ve kind of swung around: it was the computer storage giants of the day (maybe circa the 1960s?) who first messed up by applying base-10 terms, e.g. kilo and mega, to base-2 units. Back then those were HUGE amounts of data, and nobody outside the computer storage field cared what they were called.

    Fast forward several decades. As an engineer who specialized in computers, I’m forced to agree with the IEC: it’s the computer techs/geeks who need to change terminology, and use KiB, MiB, GiB, etc. when they are specifically referring to base-2 numbers.

    Usually tradition ensures the continued use of what was originally a small mistake, but for once, the mainstream is taking a tech term and transforming its meaning for the better (an example of a bad re-definition: “hackers”).
