> On Sat, Dec 22, 2001 at 09:57:11AM +1300, Chris Wedgwood wrote:
> > I don't want to get dragged into this silly debate; the point I was
> > making is that we already have considerable inconsistency and choosing
> > STANDARDS BASED tokens might not be a bad thing.
> They're defacto standards that have been in use for well over a decade.
And it's been wrong for all of that decade, because the rest of the world
- *all* of it - has been using a different convention for quite a bit longer.
> Hmmm, all of the advertising, computer media and electrical engineering
> related material I've read recently seems to be using GB. In fact, there
> was one article about the whole issue that found the computer
> industry to be remarkably consistent in using terms like "10GB" with no
> space between the number and the measuring unit. Oh wait, sorry, that's
> not formally approved by any standards bodies.
Unfortunately, while the typesetting may be consistent, the *meaning*
isn't. You cannot dump 10GB of RAM to a 10GB hard disk: the RAM's "GB"
means 2^30 bytes, the disk's means 10^9, so the image won't fit.
> Many standards bodies are examples of confusopolies.
Well, this particular "de facto standard" certainly is a confusopoly.