Natural 4 floating-point accuracy

Has the improvement in the accuracy of Natural’s floating-point calculations caused anyone concern?

A customer’s financial application uses floating point to gain extra decimal digits in its calculations. We foresee complications when long-time clients see slight negative changes (pennies) in their transaction amounts. They may not readily accept the argument that the previous amounts were incorrect. Of course, we don’t expect an outcry from clients who see a positive change.
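
For what it’s worth, the class of effect is easy to reproduce outside Natural. The small C program below uses IEEE 754 binary floating point rather than anything Natural-specific, so it is purely an illustration, and 0.615 is just a value I picked, not anything from the customer’s data. It shows the same decimal amount rounding to a different number of cents depending only on the precision it was stored in:

#include <stdio.h>

int main(void)
{
    /* The same decimal amount, 0.615, stored at two precisions.
     * With IEEE 754 binary formats, the nearest double lies just
     * below 0.615, while the nearest float lies just above it.   */
    double as_double = 0.615;
    double as_float  = 0.615f;   /* float literal, widened for printing */

    printf("double: %.17f  rounds to %.2f\n", as_double, as_double);
    printf("float : %.17f  rounds to %.2f\n", as_float,  as_float);

    /* On a typical platform with a correctly rounding printf, the
     * double prints as 0.61 and the float as 0.62: a one-cent
     * difference caused only by how precisely 0.615 was stored.   */
    return 0;
}

If SAG’s change amounts to more accurate conversion or arithmetic, penny differences of exactly this kind are what I would expect to surface.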

I haven’t seen documentation that explains in detail how SAG’s improvements were made, so I have no way to predict which, or how many, of our clients will be affected. Preliminary tests indicate that the percentage will be quite small. Any ideas?

We are considering several options, such as replacing all floating-point fields and calculations. Most of the options would require extensive regression testing by our already-overworked users and would burden our service-desk staff with a wave of customer calls demanding explanations of the differences. I don’t expect SAG to be receptive to a request for a zap or compiler option that reverts to the Natural 3 floating-point algorithms (if that’s even possible), but are there any users out there who would support such a request?
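
To make the first of those options concrete, here is a rough C sketch of what replacing floating point with exact arithmetic might look like; in Natural terms the equivalent would presumably be packed or numeric fields rather than floating-point format, and the names and the basis-point rate here are invented purely for the example:

#include <stdio.h>

/* Amounts carried as whole cents in 64-bit integers; addition and
 * subtraction are then exact, so totals cannot drift between
 * compiler or runtime releases.                                   */
typedef long long cents_t;

/* Apply a percentage rate given in basis points (1/100 of a percent),
 * rounding half up; positive amounts assumed.                      */
static cents_t apply_rate_bp(cents_t amount, long long basis_points)
{
    return (amount * basis_points + 5000) / 10000;
}

int main(void)
{
    cents_t balance  = 123456;                        /* $1,234.56 */
    cents_t interest = apply_rate_bp(balance, 375);   /* 3.75%     */

    printf("interest: %lld.%02lld\n", interest / 100, interest % 100);
    return 0;
}

Converting every field and every COMPUTE along those lines is exactly the kind of change that drives the regression-testing burden I mentioned above.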

(also posted to SAG-L)