I've searched the forums, but after going through a few pages I figured I'd just ask.
I just purchased a 16 GB microSDHC and transferred all my stuff from my 8 GB card. With the 8 GB, the Storm shows 7.9 GB, which I can deal with, but the 16 GB shows 14.9 GB. Where did the 1.1 GB go?
Basically it's because manufacturers sell 1 GB as 1,000,000,000 bytes, when in binary terms a gigabyte is actually 1,073,741,824 bytes, so a 16 GB card shows as (16 × 1,000,000,000) / 1,073,741,824 ≈ 14.9 GB.
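The arithmetic above can be sanity-checked with a quick snippet (plain Python, nothing card-specific):

```python
# The card is sold as 16 "decimal" gigabytes (16 x 10^9 bytes),
# but the Storm reports capacity in binary gigabytes (2^30 bytes each).
advertised_bytes = 16 * 1_000_000_000  # what the package says
binary_gb = 1_073_741_824              # 2**30

shown = advertised_bytes / binary_gb
print(f"{shown:.1f} GB")  # -> 14.9 GB
```

Same total number of bytes, just divided by a bigger unit.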
What he said. Lol.
Posted from my CrackBerry at wapforums.crackberry.com
lol that's a lot smarter sounding than my explanation
And I've always wondered what the he|| was going on here...
The 1 GB = 1 billion bytes thing is sleazy manufacturer notation.
Technically that is not true. Giga has always meant 1 billion bytes, long before computers were around. Only when the first computers came out did they erroneously select 1,024 to be 1 kilobyte (because it was close to 1,000).
Labeling a memory card 16 GB, as holding 16 billion bytes, is correct. The 14.9 should be shown by the phone, or any computer, as 14.9 GiB (gibibytes). Don't believe me? Do a search for IEEE and Gibi.
Science has always used base-10 math (i.e., K = 1,000, M = 1,000,000, G = 1,000,000,000). Computers use base-2 math, thus 1 KiB = 1,024 bytes, 1 MiB = 1,048,576 bytes, etc.
There is no missing "space," as someone put it. It's all there; just different base mathematics is used.
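The two prefix systems described above can be laid out side by side; a small sketch using the values from the definitions:

```python
# SI (base-10) prefixes next to their IEC (base-2) counterparts.
si  = [("kB", 10**3), ("MB", 10**6), ("GB", 10**9)]
iec = [("KiB", 2**10), ("MiB", 2**20), ("GiB", 2**30)]

for (s_name, s_val), (b_name, b_val) in zip(si, iec):
    print(f"1 {s_name} = {s_val:>13,} bytes | "
          f"1 {b_name} = {b_val:>13,} bytes | ratio {b_val / s_val:.3f}")
```

The gap widens with each prefix: about 2.4% at kilo, 4.9% at mega, 7.4% at giga, which is exactly why a "16 GB" card reads as 14.9.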
Thanks Tony. I've been a lurker since I got my Storm back in April, but I just couldn't resist replying to this discussion. GB vs. GiB is a pet peeve of mine, since I deal with that stuff all the time at work!
16 billion bytes ≈ 14.9 GB the way the phone counts, because there are 1,024 bytes in one kilobyte (and 1,024 of those per megabyte, and so on); there's part of your "loss." Also, formatting uses a small portion as well.
Technically that is not true. Giga has always meant 1 billion bytes, long before computers were around.
Since you're arguing semantics: I think you meant the prefix giga existed before computers. The notion of a "byte" did not exist "long before computers were around."
Nevertheless, I stand by my previous statement. It's to the benefit of hard drive manufacturers to use SI base-10 notation because it lets them advertise a seemingly larger and simpler (lots of nice 0's) number in their marketing. It was their practice long before the IEC defined Gibi, which is why I called it sleazy.
Apple switched to base 10 in Snow Leopard, but until Microsoft, RIM, and most everyone else in the computer software world switches, you'll still have the confusion the OP experienced.
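To make the difference concrete, here's a small, purely illustrative formatter (a hypothetical helper, not any real OS API) showing how the same card reads under each convention:

```python
def human(n_bytes: int, binary: bool) -> str:
    """Format a byte count the way a base-2 OS (Windows, the Storm) or a
    base-10 OS (Snow Leopard and later) would display it."""
    step = 1024 if binary else 1000
    units = ["B", "KiB", "MiB", "GiB"] if binary else ["B", "kB", "MB", "GB"]
    value = float(n_bytes)
    for unit in units:
        if value < step or unit == units[-1]:
            return f"{value:.1f} {unit}"
        value /= step

card = 16 * 10**9  # 16 billion bytes, as advertised
print(human(card, binary=True))   # -> 14.9 GiB (what the Storm shows)
print(human(card, binary=False))  # -> 16.0 GB  (what the package says)
```

Same card, same bytes; only the divisor differs.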
Ahhhh, yes, you are correct on how long Giga has been around... My bad!!!!
Mostly, the drive specs on packaging are driven by the marketing folks, who have to pass them by legal, which enforces that scientific giga = 1,000,000,000.
I agree that manufacturers/software vendors need to either update their software to use base-10 math, or show the capacity using the GiB/TiB prefixes.
What about my 8 GB only showing 1.8 after formatting? It's an 8 GB microSDHC SanDisk; I put it in brand new, formatted it, and it shows 1.8 like it's a 2 GB card.
Lurch, what device do you have? I'm mobile and can't see. Format it in a PC using a card reader first and see if it still says 1.8 GB.