> IEEE 754 is like democracy, in the sense that it is the worst, except for all the others.
I can't see what would be worse. The entire raison d'être of computers is to give accurate results. Introducing an inherently inaccurate math system to computers cuts against the whole reason they exist! Literally any other math solution seems like it would be better, so long as it produces accurate results.
That "so long as it produces accurate results" is doing a lot of work. IEEE 754 does very well in terms of error versus representation size.
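To put a rough number on that, here's a minimal Python sketch (assuming CPython, whose float is an IEEE 754 double): 64 bits buy a 53-bit significand, so any normal value is stored with a relative error of at most 2**-53, roughly 15-17 significant decimal digits.

```python
import sys
from decimal import Decimal

# A double has a 53-bit significand, so the relative representation error
# is bounded by epsilon/2 = 2**-53 ~= 1.1e-16 (roughly 16 decimal digits).
print(sys.float_info.mant_dig)   # 53
print(sys.float_info.epsilon)    # 2.220446049250313e-16

# 0.1 is not exactly representable in binary, but the stored value is
# within half an ulp of the true 0.1 -- a relative error around 5.6e-17.
stored = Decimal(0.1)            # exact decimal expansion of the stored double
print(stored)                    # 0.1000000000000000055511151231257827...
print(abs(stored - Decimal("0.1")) / Decimal("0.1") < Decimal(2) ** -53)  # True
```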
What system gives accurate results? I don't know of any number system in use that 1) represents numbers with a fixed size, 2) can represent 1/n accurately for reasonable integers n, and 3) can do exponents accurately.
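For what it's worth, the usual alternatives each give up one of those three. A short Python illustration (fractions and decimal are standard-library modules; the specific values are just examples I picked):

```python
from fractions import Fraction
from decimal import Decimal

# 1) Exact rationals represent 1/n perfectly, but they are not fixed-size:
#    the numerator and denominator keep growing as you compute.
total = Fraction(0)
for n in range(1, 50):
    total += Fraction(1, n)
print(total)                    # numerator/denominator with ~20+ digits each

# 2) Fixed-size decimal fixes 1/10 but still has to round 1/3:
print(Decimal(1) / Decimal(3))  # 0.3333333333333333333333333333

# 3) Exponents like 2**0.5 are irrational, so no finite-size representation
#    can hold them exactly; every fixed-size system must round somewhere.
print(2 ** 0.5)                 # 1.4142135623730951 (rounded)
```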
Electronic computers were created to be faster and cheaper than a pool of human computers (who may have had slide rules or mechanical adding machines). Human computers were basically doing decimal floating point with limited precision.
It's ideal for engineering calculations, which are a common use of computers. There, nobody cares whether 1 - 1 comes out to exactly 0, because you could never have measured those values exactly in the first place. Single precision is good enough for just about any real-world measurement or result, while double precision is good for intermediate results: it won't lose accuracy that would be visible in the single-precision input/output, as long as you're not using a numerically unstable algorithm.
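A small Python sketch of both halves of that claim (the float32 round-trip goes through the standard struct module, and the "numerically unstable algorithm" here is just the classic 1 - cos(x) cancellation, chosen as an example):

```python
import math
import struct

# Rounding through a 32-bit float keeps about 7 significant digits,
# which already exceeds the precision of most physical measurements.
g = 9.80665                                   # standard gravity, m/s^2
g32 = struct.unpack("f", struct.pack("f", g))[0]
print(g, g32)                                 # agree to ~7 significant digits

# But an unstable formulation throws accuracy away even in double precision:
# 1 - cos(x) cancels catastrophically for small x, while the algebraically
# equivalent 2*sin(x/2)**2 does not.
x = 1e-8
print(1 - math.cos(x))                        # 0.0 -- all digits lost
print(2 * math.sin(x / 2) ** 2)               # ~5e-17, i.e. x**2/2
```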