*spin-off* How should we classify resolution

I repeat, no one (including myself, even though I sometimes do) should use kilobyte = 1024 nowadays; it's like talking in inches for length (outside the US). Btw, inches as in ~2.54 cm... ah fuck it, let's just call it 2 and a half cm :D

Why? The standards committee that invented that distinction only applies it to hard disks. There is a separate committee for semiconductor memory, and they use the traditional "kilo" and "mega" for binary numbers. A kilobyte of RAM now is the same as a kilobyte of RAM in 1996. The change only happened in one facet of the tech industry, and it is also the sector that would see the largest inflation from using decimal terminology, because of the scale of drive sizes.
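To see how much the two conventions disagree in practice, here is a minimal sketch (Python, function names made up for illustration) dividing the same byte count by 1000 vs 1024:

```python
# Hypothetical sketch: the same number of bytes reads differently
# depending on whether you divide by 1000 (SI) or 1024 (binary).

def si_kilobytes(n_bytes: int) -> float:
    """SI / drive-label convention: 1 kB = 1000 bytes."""
    return n_bytes / 1000

def binary_kilobytes(n_bytes: int) -> float:
    """Traditional memory (JEDEC) convention: 1 KB = 1024 bytes."""
    return n_bytes / 1024

n = 65_536  # 64 KiB of RAM
print(si_kilobytes(n))      # 65.536 "kilobytes" on a drive label
print(binary_kilobytes(n))  # 64.0 "kilobytes" to a memory engineer
```

Same bytes, two different "kilobyte" counts, which is the whole argument in a nutshell.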

You can't expect regular people to conform to a standard that doesn't have industry wide acceptance.
 
There are squillions of words that have evolved over time to not follow their exact meanings, but we use them without confusion.
Yes like Kilobyte :devilish:, here is the wiki page
https://en.wikipedia.org/wiki/Kilobyte
The kilobyte is a multiple of the unit byte for digital information. The International System of Units (SI) defines the prefix kilo as 1000 (10^3); therefore one kilobyte is 1000 bytes.

read this (the international standard guys)
http://www.iec.ch/si/binary.htm
Years ago, at a time when computer capacities barely matched the few tens of kilobytes required by this single web “page”, computer engineers noticed that the binary 2^10 (1024) was very nearly equal to the decimal 10^3 (1000) and, purely as a matter of convenience, they began referring to 1024 bytes as a kilobyte. It was, after all, only a 2,4 % difference and all the professionals generally knew what they were talking about among themselves.


Despite its inaccuracy and the inappropriate use of the decimal SI prefix "kilo", the term was also easy for salesmen and shops to use, and it caught on with the public.


As time has passed, kilobytes have grown into megabytes, then gigabytes and now terabytes. The problem is that, at the SI tera-scale (10^12), the discrepancy with the binary equivalent (2^40) is not the 2,4 % at kilo-scale but rather approaching 10 %. At exa-scale (10^18 and 2^60), it is nearer 20 %. It is just mathematics that dictates that the bigger the number of bytes, the bigger the difference, so that the inaccuracies – for engineers, marketing staff and public alike – are set to grow more and more significant.


Similar confusions arose between the computing and the telecommunications sectors of the IT world, where data transmission rates have grown enormously over the past few years. Network designers have generally used megabits per second (Mbit/s) to mean 1 048 576 bit/s, while telecommunications engineers have traditionally used the same term to mean 1 000 000 bit/s. Even the usually stated bandwidth of a PCI bus, 133,3 MB/s based on it being four bytes wide and running at 33,3 MHz, is inaccurate because the M in MHz means 1 000 000 while the M in MB means 1 048 576.


Mathematics dictate that the disparities resulting from mixed and incorrect use of decimal prefixes will become increasingly significant as capacities and data rates continue to grow. In IEC 80000-13:2008, all branches of the IT industry have a tool with which to iron out this inconsistency. It eliminates confusion by setting out the prefixes and symbols for the binary, as opposed to decimal, multiples that most often apply in these fields.
Time will tell if comfortable reading will prevail over technical accuracy.
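The drift the IEC text describes is easy to check yourself; a quick sketch (assuming Python, prefix list chosen for illustration) computing how far each binary multiple sits above its SI counterpart:

```python
# Check the IEC's numbers: how much larger the binary multiple (2^10k)
# is than the SI decimal one (10^3k) at each prefix scale.

prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa"]
for i, name in enumerate(prefixes, start=1):
    decimal = 10 ** (3 * i)   # SI: kilo = 10^3, mega = 10^6, ...
    binary = 2 ** (10 * i)    # binary: Ki = 2^10, Mi = 2^20, ...
    drift = (binary - decimal) / decimal * 100
    print(f"{name:>4}: {drift:5.1f} % larger")
```

The gap starts at 2.4 % for kilo and climbs to about 10 % at tera and over 15 % at exa, which is why the inconsistency matters more every year.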
 
Yes like Kilobyte :devilish:, here is the wiki page
https://en.wikipedia.org/wiki/Kilobyte


read this (the international standard guys)
http://www.iec.ch/si/binary.htm
JEDEC is also a standard, is it not? They say 1024.

Regardless, you are only proving the case that it was universally accepted that when you were using kilo, mega, etc. prefixes with regard to things that were binary (bytes and bits), you were using the binary equivalent. And only recently was new vocabulary introduced that is only accepted by one facet of the industry (hard drive manufacturers). Everyone else uses the binary standard that has existed since the beginning. It's not that everyone was confused and using the wrong terminology; it's that a sector of the market that would benefit from inflating numbers invented new language to change the definition of an already defined term. Exactly what Shift has been saying the whole time.
 
JEDEC is also a standard, is it not? They say 1024.
The National Institute of Standards and Technology says 1000; in fact, they require kilobyte = 1000, and 1024 must be written as kibibyte. As rocket scientists know, mixing your measurement systems can lead to trouble. :devilish:
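This NIST convention is also why a drive advertised as "1 TB" shows up smaller in an OS that divides by 1024. A minimal sketch of that arithmetic (the constant name is made up for illustration):

```python
# Why a "1 TB" drive reads as ~931 "GB" in an OS that divides by 1024:
# the label uses SI (10^12 bytes), the OS reports binary multiples.

ADVERTISED = 1_000_000_000_000  # 1 TB as the drive maker means it

gib = ADVERTISED / 2**30  # gibibytes (what NIST says to call GiB)
tib = ADVERTISED / 2**40  # tebibytes

print(f"{gib:.1f} GiB")   # 931.3 GiB
print(f"{tib:.3f} TiB")   # 0.909 TiB
```

Both numbers are "correct"; they just use different prefixes, which is exactly the confusion kibibyte/tebibyte were introduced to remove.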
And only recently was new vocabulary introduced that is only accepted by one facet of the industry (hard drive manufacturers)
Recently? The very first hard disk, the IBM 305, had 5 MB:
http://web.archive.org/web/20110726102519/http://www.disktrend.com/5decades2.htm

and when I say 5 MB I mean 5,000,000 characters, i.e. it's not a recent development.

Also, you say only hard disk manufacturers? What about network speed, etc.?
 
And to really drive the point home: JEDEC also says 'we done fucked up' by using the term kilo as 1024, though they do say they will keep using the symbol K as 1024.
 
JEDEC is also a standard, is it not? They say 1024.
More reference and professional standards bodies have adopted decimal interpretations than have not, but the situation is ludicrous.
 