What's up with PS3's greedy HDD space consumption?

Yes, when it comes to hard drives the manufacturers are correct. A 320GB HDD really is 320GB. It's just that the software is wrong, because it reports GB when it actually means GiB.
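A quick Python sketch of the mismatch (the function name is mine, purely for illustration):

```python
# A drive sold as "320 GB" counts decimal bytes; most software divides
# by 1024^3 but still prints "GB", hence the apparent shortfall.
def reported_gib(marketed_gb: float) -> float:
    total_bytes = marketed_gb * 10**9  # manufacturer: decimal gigabytes
    return total_bytes / 2**30         # software: binary gigabytes (GiB)

print(f"{reported_gib(320):.2f}")  # 298.02
```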
 
There is no such thing as a "real" GB. IEC and IEEE use SI-style decimal prefix definitions, which are clearly more consumer-friendly, and so do HDD manufacturers.
This is recent and non-standard. Way back when, the decimal prefixes were used for the binary values. A kilobyte meant 1024 bytes, and a megabyte meant 1024x1024 bytes. HDDs went rogue when they started using the decimal interpretation. If you buy a gigabyte of RAM now, or a flash card of gigabytes, you get the binary equivalent, despite the technically decimal prefix. And that's what everyone has been used to for as long as computers have been around, so I can see the confusion. Until the binary prefixes are properly used (and IMO they sound too stupid and awkward to become popular), the decimals will be expected to be binary. And I see no problem with that. The only application of the binary prefixes is in talking about computer components, so no-one's going to get confused by a little prefix overloading. We're not going to have the first interstellar flight crash into Venus because someone wrote kilometres meaning 1024 metres!
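To put numbers on the gap between the two readings (purely illustrative Python, no real API involved):

```python
# Decimal (SI) vs binary values of the same prefixes. The discrepancy
# compounds with each step up, which is why it's most visible at GB scale.
KB,  MB,  GB  = 10**3, 10**6, 10**9   # SI / HDD-marketing meaning
KiB, MiB, GiB = 2**10, 2**20, 2**30   # traditional "computer" meaning

for name, dec, binary in [("kilo", KB, KiB), ("mega", MB, MiB), ("giga", GB, GiB)]:
    print(f"{name}: binary is {binary / dec:.3f}x the decimal value")
# kilo: 1.024x, mega: 1.049x, giga: 1.074x
```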

As for the PS3, the thread explains it all. The decimal value is reduced to the binary quantity, and then a percentage is reserved for the OS. Despite my first fears of the 40 GB getting filled, it's clear there's ample space for ordinary downloads etc.
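As a rough model of what the PS3 ends up showing (the 5% OS reservation below is an assumed placeholder, not Sony's actual figure):

```python
# Decimal capacity read in binary units, minus an assumed OS reservation.
def usable_gib(marketed_gb: float, os_reserve: float = 0.05) -> float:
    binary_gib = marketed_gb * 10**9 / 2**30  # 40 GB -> ~37.25 GiB
    return binary_gib * (1 - os_reserve)

print(f"{usable_gib(40):.1f} GiB free")  # ~35.4 with the assumed 5% reserve
```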
 
This is recent
The funny binary prefixes were defined 9-10 years ago. The usage of SI decimal prefixes for bytes predates that. The discussion itself is probably older than me.

and non-standard.
That's what I said.
Way back when, the decimal prefixes were used for the binary values. A kilobyte meant 1024 bytes, and a megabyte meant 1024x1024 bytes. HDDs went rogue when they started using the decimal interpretation. If you buy a gigabyte of RAM now, or a flash card of gigabytes, you get the binary equivalent, despite the technically decimal prefix.
Yes, memory stuck with the old binary values under an SI mask, because its capacity is almost always a power of two.
And that's what everyone has been used to for as long as computers have been around, so I can see the confusion. Until the binary prefixes are properly used (and IMO they sound too stupid and awkward to become popular), the decimals will be expected to be binary.
And I see no problem with that. The only application of the binary prefixes is in talking about computer components, so no-one's going to get confused by a little prefix overloading. We're not going to have the first interstellar flight crash into Venus because someone wrote kilometres meaning 1024 metres!
Let's hope so.

We, especially the consumers, don't really need binary prefixes any more; whether the funny (-bi-) ones become popular or not is not really important. SI decimal will, and should, rule eventually.
 
We, especially the consumers, don't really need binary prefixes any more; whether the funny (-bi-) ones become popular or not is not really important. SI decimal will, and should, rule eventually.

Why should SI decimal rule? I don't see any point in it, other than consumer convenience. As long as computers run binary code and storage sizes are powers of two, I don't see why we should be using decimal.
 
Why should SI decimal rule? I don't see any point in it, other than consumer convenience. As long as computers run binary code and storage sizes are powers of two, I don't see why we should be using decimal.

Prefixes are stolen from SI.
This is for humans, not computers (which don't care either way).
It's easier for computers to convert to decimal prefixes than for humans to convert to/from binary prefixes.
Computers already do decimal conversion anyway; the "but it's only a right shift" excuse is ridiculous at best, as the sketch below shows.
Have I mentioned SI is actually a standard system, and in use everywhere else?
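For what it's worth, neither conversion is hard for the machine; a minimal sketch (illustrative numbers only, nothing PS3-specific) contrasting the two:

```python
size_bytes = 1_000_000_000_000  # a drive sold as 1 TB

# Binary prefix: a right shift by 30 bits is integer division by 2**30.
gib = size_bytes >> 30    # 931 GiB (truncated)
# Decimal prefix: ordinary integer division by 10**9.
gb = size_bytes // 10**9  # 1000 GB

print(gib, gb)
```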
 
Yes, memory stuck with the old binary values under an SI mask, because its capacity is almost always a power of two.
Everything in computing is powers of two! There's no particular need to transition to a new protocol when the old protocol worked without any confusion whatsoever. The easiest switch to make would be HDD manufacturers going back to one megabyte etc. being the binary form as they used to. The fact is the SI system has introduced confusion that never existed! From a language POV, I don't think a new prefix was at all needed. The binary definition of kilo, mega, giga is only applied to base-two quantities like the byte, so we don't need new words, only a rule that when applied to base-two concepts, the prefixes take their base-two equivalents. Etymologically the definitions can be described as a corruption of the Greek 'gigas', rather than an exact application of the prefix, which is what it was always meant to be even if no standards agency recorded it as such.
 
Prefixes are stolen from SI.
This is for humans, not computers (which don't care either way).
It's easier for computers to convert to decimal prefixes than for humans to convert to/from binary prefixes.

Why do we as humans need to work with decimal prefixes if the binary form is just as well suited?

I don't see the problem for us humans if someone buys a hard drive that is quoted as having 465GB (500,000,000,000 bytes) instead of 500. 1 byte will always be 8 bits, not 10, either way.
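And the 465 figure checks out; a one-line sanity check:

```python
print(500_000_000_000 / 2**30)  # 465.66..., displayed by the OS as "465 GB"
```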

Then again, maybe I'm just too old-school to understand this.
 
Everything in computing is powers of two!
You wish.
There's no particular need to transition to a new protocol when the old protocol worked without any confusion whatsoever. The easiest switch to make would be HDD manufacturers going back to one megabyte etc. being the binary form as they used to. The fact is the SI system has introduced confusion that never existed!
Are you serious? SI defined prefixes long ago, probably more than 200 years ago or something.
From a language POV, I don't think a new prefix was at all needed. The binary definition of kilo, mega, giga is only applied to base-two quantities like the byte, so we don't need new words, only a rule that when applied to base-two concepts, the prefixes take their base-two equivalents.
It's not exactly equivalent though, if you know what I mean.
Etymologically the definitions can be described as a corruption of the Greek 'gigas', rather than an exact application of the prefix, which is what it was always meant to be even if no standards agency recorded it as such.
Interesting take :)
I don't really blame whoever first used Giga for 2^30, nor do I think they need a defense.

Anyway, seriously I don't see what we are discussing.
The fact of the matter is that the current binary prefix convention stuck before computers were even remotely popular; it is inconsistent with SI, and there is no real reason to use it besides the fact that it's a popular convention.

I'm not on a crusade to change that, but it will be changed. :)

Phil, ignoring the consumer who has no real idea about binary prefixes, even those of us who know what they are still use them as approximate decimal prefixes. How many people who aren't autistic actually do the binary conversion on a daily basis?
 
Are you serious? SI defined prefixes long ago, probably more than 200 years ago or something.
Not for computers ;) I'm a big fan of SI units and the way they've simplified scientific calculation - decimal is Good! I just think the computing change wasn't necessary.

The fact of the matter is that the current binary prefix convention stuck before computers were even remotely popular; it is inconsistent with SI, and there is no real reason to use it besides the fact that it's a popular convention.
Which is precisely why we should stick with it, IMO! The average consumer doesn't need to know the difference - as long as they know all numbers are measured equally, they know the bigger numbers are better. The problem was some HDD manufacturers decided to ignore convention and use an unconventional interpretation of the words to present their capacities in a better light. This is where a standard measure was needed, but sadly the standard didn't go with the long-used common language, but with a new one. Now it's a case of trying to convince everyone to swap the word 'woman' for 'man' because that's truer to the origins of the word.
Anyway, seriously I don't see what we are discussing.
Yeah, we're not really talking anything console. This is all terribly OT :mrgreen:
 