https://medium.com/@bellmar/is-cobol-holding-you-hostage-with-math-5498c0eb428b
Face it: nobody likes fractions, not even computers.
When we talk about COBOL, the first question on everyone's mind is always: why are we still using it in so many critical places? Banks are still running COBOL. Close to 7% of the GDP is dependent on COBOL in the form of payments from the Centers for Medicare & Medicaid Services. The IRS famously still uses COBOL. Airlines still use COBOL (Adam Fletcher dropped my favorite fun fact on this topic in his Systems We Love talk: the reservation number on your ticket used to be just a pointer). Lots of critical infrastructure, in both the private and public sectors, still runs on COBOL.
Why?
The traditional answer is deeply cynical. Organizations are lazy, incompetent, stupid. They are cheap: unwilling to invest the money needed upfront to rewrite the whole system in something modern. Overall we assume that the reason so much of civil society runs on COBOL is a combination of inertia and shortsightedness. And certainly there is some truth to that. Rewriting a mass of spaghetti code is no small task. It is expensive. It is difficult. And if the existing software seems to be working fine, there may be little incentive to invest in the project.
But back when I was working with the IRS the old COBOL developers used to tell me: "We tried to rewrite the code in Java and Java couldn't do the calculations right."
[Ed note: The referenced article is extremely readable, clearly explaining the differences between floating-point and fixed-point math and providing a worked example of the tradeoffs.]
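The floating-point vs fixed-point distinction the article covers can be seen in a few lines of Java. This is a hypothetical sketch, not anything from the IRS codebase: binary `double` arithmetic cannot represent 0.1 exactly, so repeated addition drifts, while `BigDecimal` does exact decimal arithmetic in the spirit of COBOL's fixed-point `PIC` fields.

```java
import java.math.BigDecimal;

public class CobolMath {
    public static void main(String[] args) {
        // Binary floating point: 0.1 has no exact base-2 representation,
        // so ten additions of 0.1 do NOT sum to 1.0.
        double d = 0.0;
        for (int i = 0; i < 10; i++) d += 0.1;
        System.out.println(d); // prints 0.9999999999999999

        // Decimal fixed-point style arithmetic: BigDecimal tracks exact
        // decimal digits, much like a COBOL PIC 9V9 field would.
        BigDecimal b = BigDecimal.ZERO;
        for (int i = 0; i < 10; i++) b = b.add(new BigDecimal("0.1"));
        System.out.println(b); // prints 1.0
    }
}
```

For money, a one-cent drift per calculation compounded across millions of payments is exactly the kind of error the old COBOL developers were talking about.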
(Score: 2) by Muad'Dave on Tuesday September 17 2019, @04:34PM (1 child)
> The *machines* were not record oriented.
Early filesystems, at least those on the Interdata/Perkin Elmer/Concurrent machines I used to work on, were record-oriented. You could declare a file as 'indexed' with a fixed max record length like 80 or 132 characters, or as 'contiguous' where the record length was fixed at 256 bytes (one block size; used for executables, etc.). There was no concept of a line terminator.
Obviously the disk drive itself was just storing 256-byte blocks.
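A hypothetical sketch of what reading such a file looks like: with no line terminators, record N simply lives at byte offset N times the record length, and trailing padding is stripped by convention. The 80-byte record length and space padding here are assumptions for illustration.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class FixedRecords {
    static final int RECLEN = 80; // assumed fixed record length, no '\n' between records

    public static void main(String[] args) throws IOException {
        // Simulate a file holding two 80-byte, space-padded records back to back.
        byte[] file = new byte[2 * RECLEN];
        Arrays.fill(file, (byte) ' ');
        System.arraycopy("HELLO".getBytes(StandardCharsets.US_ASCII), 0, file, 0, 5);
        System.arraycopy("WORLD".getBytes(StandardCharsets.US_ASCII), 0, file, RECLEN, 5);

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(file));
        byte[] rec = new byte[RECLEN];
        while (in.available() >= RECLEN) {
            in.readFully(rec); // each record is located purely by offset
            System.out.println(new String(rec, StandardCharsets.US_ASCII).stripTrailing());
        }
    }
}
```

The point is that "a line" is a property of the file's declared record length, not of any terminator byte in the data.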
(Score: 0) by Anonymous Coward on Wednesday September 18 2019, @04:28AM
Huh. Interesting. So this was effectively an abstraction over the physical blocks, to have an OS-level controllable virtual block size? I.e., there was no record-size-centric design at the hardware level?