So for 99.9% of all cases my program will do much, much more work than
is actually needed.
I may still save the data in a database, or send it over the network,
so should I implement 1024-bit signed integers in all of that code too?
And what happens when we do crypto and 1024 bits is not enough?
I think the "use rediculously large datatypes" solution is a poor one,
as it can never cover all cases in the future, and it will impose a large
overhead on existing and new applications.
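
To make the overhead concrete, here is a rough userspace sketch (not
any existing kernel interface; the u1024 type, limb layout and
function names are purely illustrative) comparing a native 64-bit add
with the limb-by-limb loop a 1024-bit type would need on a 64-bit CPU:

    /* Illustrative only: a hypothetical 1024-bit integer as 16 x 64-bit
     * limbs, versus a native 64-bit value. */
    #include <stdint.h>
    #include <stdio.h>

    #define LIMBS (1024 / 64)	/* 16 limbs per value */

    typedef struct { uint64_t limb[LIMBS]; } u1024;	/* hypothetical */

    /* Native add: a single instruction on a 64-bit machine. */
    static uint64_t add64(uint64_t a, uint64_t b)
    {
    	return a + b;
    }

    /* 1024-bit add: a loop over all 16 limbs with carry propagation,
     * plus 16x the memory traffic whenever a value is copied, stored
     * in a database record, or pushed over the wire. */
    static void add1024(u1024 *r, const u1024 *a, const u1024 *b)
    {
    	unsigned int carry = 0;
    	int i;

    	for (i = 0; i < LIMBS; i++) {
    		uint64_t s = a->limb[i] + carry;

    		carry = (s < carry);
    		s += b->limb[i];
    		carry += (s < b->limb[i]);
    		r->limb[i] = s;
    	}
    }

    int main(void)
    {
    	u1024 a = { .limb = { 1 } }, b = { .limb = { 2 } }, r;

    	printf("native:   %llu\n", (unsigned long long)add64(1, 2));
    	add1024(&r, &a, &b);
    	printf("1024-bit: %llu (low limb)\n",
    	       (unsigned long long)r.limb[0]);
    	return 0;
    }

Every arithmetic operation, copy, and serialization pays that 16x cost
even when the values would have fit comfortably in 64 bits.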
--
................................................................
: jakob@unthought.net : And I see the elder races, :
:.........................: putrid forms of man :
: Jakob Østergaard : See him rise and claim the earth, :
: OZ9ABN : his downfall is at hand. :
:.........................:............{Konkhra}...............: