
posted by martyb on Monday September 16 2019, @01:48PM   Printer-friendly
from the COBOL-is-often-fractionally-better dept.

https://medium.com/@bellmar/is-cobol-holding-you-hostage-with-math-5498c0eb428b

Face it: nobody likes fractions, not even computers.

When we talk about COBOL, the first question on everyone's mind is always: why are we still using it in so many critical places? Banks are still running COBOL; close to 7% of GDP depends on COBOL in the form of payments from the Centers for Medicare & Medicaid Services; the IRS famously still uses COBOL; airlines still use COBOL (Adam Fletcher dropped my favorite fun fact on this topic in his Systems We Love talk: the reservation number on your ticket used to be just a pointer); lots of critical infrastructure in both the private and public sector still runs on COBOL.

Why?

The traditional answer is deeply cynical. Organizations are lazy, incompetent, stupid. They are cheap: unwilling to invest the money needed upfront to rewrite the whole system in something modern. Overall we assume that the reason so much of civil society runs on COBOL is a combination of inertia and shortsightedness. And certainly there is a little truth there. Rewriting a mass of spaghetti code is no small task. It is expensive. It is difficult. And if the existing software seems to be working fine there might be little incentive to invest in the project.

But back when I was working with the IRS the old COBOL developers used to tell me: "We tried to rewrite the code in Java and Java couldn't do the calculations right."

[Ed note: The referenced article is extremely readable and clearly explains the differences between floating-point and fixed-point math, as well as providing an example and explanation that clearly shows the tradeoffs.]
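
[Ed note 2: As a minimal Java sketch in the spirit of the article's comparison -- our own illustration, not the article's example -- binary floating point drifts when summing money, while fixed-point decimal arithmetic does not:

    import java.math.BigDecimal;

    public class MoneyDrift {
        public static void main(String[] args) {
            // Sum ten cents a million times in binary floating point:
            double d = 0.0;
            for (int i = 0; i < 1_000_000; i++) d += 0.10;
            System.out.println(d);  // roughly 100000.0000013 -- not 100000.0

            // The same sum in fixed-point decimal stays exact:
            BigDecimal b = BigDecimal.ZERO;
            BigDecimal dime = new BigDecimal("0.10");
            for (int i = 0; i < 1_000_000; i++) b = b.add(dime);
            System.out.println(b);  // 100000.00, exactly
        }
    }
]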


Original Submission

  • (Score: 4, Insightful) by Anonymous Coward on Monday September 16 2019, @02:04PM (13 children)

    by Anonymous Coward on Monday September 16 2019, @02:04PM (#894600)

    It's obsolete! Need to upgrade, need to upgrade!

    (It still can toast ( though it's not 'certified' for gluten free mode bread which came out in a later rev)

    • (Score: 2, Insightful) by Anonymous Coward on Monday September 16 2019, @02:23PM (4 children)

      by Anonymous Coward on Monday September 16 2019, @02:23PM (#894602)
      • (Score: 2) by kazzie on Monday September 16 2019, @05:27PM

        by kazzie (5309) Subscriber Badge on Monday September 16 2019, @05:27PM (#894703)

        I was just about to post that myself. It's well worth a watch.

      • (Score: 5, Insightful) by DeathMonkey on Monday September 16 2019, @05:38PM (1 child)

        by DeathMonkey (1380) on Monday September 16 2019, @05:38PM (#894712) Journal

        God I hate argumentum ad Youtubum.

        Maybe use your words, too. Or at least tell us what you are linking to.

        • (Score: 3, Informative) by Common Joe on Tuesday September 17 2019, @08:50AM

          by Common Joe (33) <{common.joe.0101} {at} {gmail.com}> on Tuesday September 17 2019, @08:50AM (#895077) Journal

          Fully agree, but I watched the video, so I can summarize for everyone.

          In short, the 1948 toaster design cooks toast by measuring how well cooked the toast is. There is no timer like modern toasters have. It also cooks in near-absolute silence; it does not come up with a pop like a modern-day toaster. There is no lever to push the bread down. It goes down automatically and it comes up automatically. No computers; just heating elements, levers, bi-metal thermostats, and lots of cleverness. There are some gotchas, like ungrounded wiring and an outside that gets really hot. He explains in detail how the toaster works and how improvements could be made to bring it up to modern-day options and safety.

          And I can personally vouch for its authenticity. My grandfather had one of these and I used it. I wouldn't get as excited about it as this guy does, but he lays out some really good points. And as far as how silent it is? I barely heard anything except the expansion of the metal.

      • (Score: 0) by Anonymous Coward on Monday September 16 2019, @09:12PM

        by Anonymous Coward on Monday September 16 2019, @09:12PM (#894818)
    • (Score: 4, Insightful) by fustakrakich on Monday September 16 2019, @03:09PM

      by fustakrakich (6150) on Monday September 16 2019, @03:09PM (#894627) Journal

      Exactly. We use it because it works. Our airliners are 60 year old designs also. Cars? Ancient contraptions!

      Cobol is old, but is still superior. Just like the B-52

      --
      Politics and criminals are the same thing..
    • (Score: 5, Funny) by ikanreed on Monday September 16 2019, @03:40PM (4 children)

      by ikanreed (3164) Subscriber Badge on Monday September 16 2019, @03:40PM (#894647) Journal

      How can you live without a real time alert on your smartphone when your toast is done?

      • (Score: 2, Funny) by Anonymous Coward on Monday September 16 2019, @05:10PM (2 children)

        by Anonymous Coward on Monday September 16 2019, @05:10PM (#894688)

        When the toast is done already is too late.
         
        For only $1.99 you can subscribe to the toast ready prealert which includes a countdown (may show ads and purchases in app).
        FAQ:
        Q:Your toaster app doesn't work and my $199 toaster won't even start!
        R:Please reboot your toaster and grant all 58 requested permissions.

        • (Score: 4, Funny) by ikanreed on Monday September 16 2019, @06:00PM (1 child)

          by ikanreed (3164) Subscriber Badge on Monday September 16 2019, @06:00PM (#894731) Journal

          Note, please do not use grocery store bread with your Toestr™. DRM-enabled bread-pods are available for a mere $16.99 from our website.

          • (Score: 2) by PartTimeZombie on Monday September 16 2019, @08:38PM

            by PartTimeZombie (4827) on Monday September 16 2019, @08:38PM (#894801)

            +1 Funny, but also +1 Insightful or something, because this sounds like a business model.

      • (Score: 0) by Anonymous Coward on Tuesday September 17 2019, @04:16AM

        by Anonymous Coward on Tuesday September 17 2019, @04:16AM (#894984)

        COBOL can do that via CICS these days.

    • (Score: 2) by Thexalon on Monday September 16 2019, @05:01PM (1 child)

      by Thexalon (636) on Monday September 16 2019, @05:01PM (#894683)

      ) (I was feeling some existential dread due to an unmatched paren above.)

      OK, more seriously, a lot of damage has been done by assuming "new" = "improved".

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2, Touché) by nitehawk214 on Monday September 16 2019, @05:13PM

        by nitehawk214 (1304) on Monday September 16 2019, @05:13PM (#894692)

        He programs in COBOL, not lisp.

        --
        "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
  • (Score: 2) by PiMuNu on Monday September 16 2019, @02:32PM (28 children)

    by PiMuNu (3823) on Monday September 16 2019, @02:32PM (#894604)

    thanks

    • (Score: 2) by FatPhil on Monday September 16 2019, @04:16PM (27 children)

      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @04:16PM (#894665) Homepage
      Meh. The author started out by comparing COBOL with Java, which proved to me that she knew very little about floating point arithmetic, or at least how dangerous Java is for floating point arithmetic. Either that, or she did know that, and deliberately chose one of the worst languages possible for the comparison, which is little more than setting light to a straw man. Friends don't let friends do Java.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 0) by Anonymous Coward on Monday September 16 2019, @04:56PM (2 children)

        by Anonymous Coward on Monday September 16 2019, @04:56PM (#894679)

        Friends don't let friends do Java.

        But especially don't let friends who can't figure out how to program in Java use Java.

        • (Score: 3, Insightful) by DannyB on Monday September 16 2019, @06:19PM (1 child)

          by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:19PM (#894744) Journal

          It's pretty amazing how Java has come to be one of the top languages for years running now. Somebody must know something that the rest of the industry doesn't know. Must be nice to be so smart and unappreciated.

          --
          People today are educated enough to repeat what they are taught but not to question what they are taught.
          • (Score: 0) by Anonymous Coward on Tuesday September 17 2019, @01:50AM

            by Anonymous Coward on Tuesday September 17 2019, @01:50AM (#894948)

            You mean the same industry that uses mass surveillance, creates web 2.0 abominations, and is responsible for disservices like Facebook? That wise, wise industry which is known for its high-quality code, good decision-making, and long-term thinking? The very same industry that needs to be nuked from orbit? Yeah, it would take a real genius to be smarter than the ridiculous MBA losers who have almost complete control.

      • (Score: 2) by PiMuNu on Monday September 16 2019, @05:02PM (1 child)

        by PiMuNu (3823) on Monday September 16 2019, @05:02PM (#894685)

        Fair enough. I don't know much about Java; I replaced Java in my head with C. I am a scientific programmer who deals with floating-point precision issues routinely; I had never really thought about alternatives to floating-point arithmetic, so I found it interesting.

        • (Score: 2) by kazzie on Monday September 16 2019, @05:30PM

          by kazzie (5309) Subscriber Badge on Monday September 16 2019, @05:30PM (#894704)

          By coincidence I was looking at issues of fixed point and floating point at work today, when contemplating square root algorithms for an FPGA. As a result I found the article both interesting and timely.

      • (Score: 1) by nitehawk214 on Monday September 16 2019, @05:16PM (16 children)

        by nitehawk214 (1304) on Monday September 16 2019, @05:16PM (#894696)

        You do realize that C has floating point datatypes that suffer the same flaws, right? And that no precise calculations should use them, right?

        --
        "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
        • (Score: 2) by FatPhil on Monday September 16 2019, @05:52PM (13 children)

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @05:52PM (#894727) Homepage
          C's floating points *absolutely DO NOT* have the same flaws as Java.

          Have you ever read any Kahan?

          So if the penalty fees were 300*1.27^(17/365), how much *exactly* do I owe?
          No, exactly, not that approximation.
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
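
          (A quick sketch of the parent's point in Java, for what it's worth: the true value of 300*1.27^(17/365) is irrational, so no radix -- binary, decimal, or otherwise -- can hold it exactly, and every answer is an approximation.)

              public class Penalty {
                  public static void main(String[] args) {
                      // A fractional exponent makes the true value irrational, so
                      // double (or any fixed-precision type) can only approximate it.
                      double owed = 300.0 * Math.pow(1.27, 17.0 / 365.0);
                      System.out.printf("%.12f%n", owed); // an approximation, never *exactly* what is owed
                  }
              }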
          • (Score: 2) by DannyB on Monday September 16 2019, @06:17PM (3 children)

            by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:17PM (#894743) Journal

            C's floating points *absolutely DO NOT* have the same flaws as Java.

            Unrelated to financial calculations: please elaborate on that. I tend to think of standardized FP implementations as giving the same results.

            I kind of thought that the whole point of the IEEE standards was binary-interchangeable formats and reproducible results. Something sadly lacking long ago, when every language, indeed every implementation of every compiler, had its own weird FP implementation.

            Thanks.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2) by FatPhil on Tuesday September 17 2019, @07:47AM (2 children)

              by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @07:47AM (#895062) Homepage
              See my link to Kahan's JAVAHurt.pdf elsethread.
              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
              • (Score: 2) by DannyB on Tuesday September 17 2019, @02:12PM (1 child)

                by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @02:12PM (#895145) Journal

                Can you kindly point me to that link please?

                --
                People today are educated enough to repeat what they are taught but not to question what they are taught.
                • (Score: 2) by DannyB on Tuesday September 17 2019, @02:12PM

                  by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @02:12PM (#895146) Journal

                  Nevermind, I believe I found it.

                  --
                  People today are educated enough to repeat what they are taught but not to question what they are taught.
          • (Score: 2) by DannyB on Monday September 16 2019, @06:44PM (8 children)

            by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:44PM (#894757) Journal

            Have you ever read any Kahan?

            Obviously not. Can you elaborate on what Kahan is?

            Googling for Java vs C floating point, I find that Java FP depends on the platform implementation and a runtime option. Based on these, Java may do intermediate arithmetic on 32-bit floats using higher precision before converting back to 32-bit floats. If C / C++ don't do this, that could account for some discrepancy.

            I don't find a whole lot of relevant anything that pops up in Google. So I'm sincerely interested in what Java gets wrong in floating point.

            I do notice a number of examples of values that are not exactly representable in floating point. The famous 0.1 is but one of many examples. In fact, I suspect there are a huge number of exact decimal numbers not exactly representable in binary, and vice versa.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2, Informative) by Anonymous Coward on Monday September 16 2019, @07:25PM (1 child)

              by Anonymous Coward on Monday September 16 2019, @07:25PM (#894772)

              Not vice versa. Any fraction that can be represented in binary can be represented in decimal.

              The numbers that have terminating representation in binary are those that can be expressed as the sum of fractions where the denominator is a power of two. e.g. 0.140625 is the sum of 1/8 and 1/64. Or 5/16 + 9/128 + 23/256 = 0.47265625.

              The numbers that have terminating representation in decimal on the other hand are those that can be expressed as the sum of fractions where the denominator is a power of two times a power of five (the factors of ten). So there are a lot of denominators that are available in base ten but not base two.

              Of course rational numbers are rational, and irrational numbers are irrational, regardless (or irregardless ;) ) of the radix used. The numbers that don't have terminating representations have repeating representations instead. You can see this in the simple fractions that have repeating representations in decimal: they're the ones whose denominator has a prime factor other than 2 and 5. 1/3, 1/6, 1/7, 1/9...
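
              (A quick Java check of the above, added for illustration: the BigDecimal(double) constructor exposes the exact value a double actually stores.)

                  import java.math.BigDecimal;

                  public class Terminating {
                      public static void main(String[] args) {
                          // 0.140625 = 1/8 + 1/64: power-of-two denominator, exact in binary.
                          System.out.println(new BigDecimal(0.140625));
                          // prints: 0.140625

                          // 0.1 = 1/10: the factor of 5 in the denominator ruins it for binary.
                          System.out.println(new BigDecimal(0.1));
                          // prints: 0.1000000000000000055511151231257827021181583404541015625
                      }
                  }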

              • (Score: 3, Informative) by DannyB on Monday September 16 2019, @08:02PM

                by DannyB (5839) Subscriber Badge on Monday September 16 2019, @08:02PM (#894786) Journal

                Thank you. That makes sense.

                Terminating binary denominator must be power of 2.

                Terminating decimal denominator must be power of 2 times power of 5.

                So there are a lot of denominators that are available in base ten but not base two.

                This is an excellent point with which to answer another reply that took me to task for trotting out the oft-used example of 0.1 not having an exact binary representation.

                --
                People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 3, Informative) by FatPhil on Tuesday September 17 2019, @07:20AM (5 children)

              by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @07:20AM (#895054) Homepage

              Kahan is one of the guys who wrote the IEEE754 floating point standards. Hasn't been a research academic for a long time, but over the decades has published a whole bunch of reports on what can go wrong with FP, including at the language-implementation level, such as How JAVA's Floating-Point Hurts Everyone Everywhere [berkeley.edu] (PDF file) whose title should be self-explanatory.

              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
              • (Score: 2) by DannyB on Tuesday September 17 2019, @02:23PM (4 children)

                by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @02:23PM (#895151) Journal

                That is interesting.

                It is from 1998, over twenty years ago. Java has changed a lot -- but I don't know how much in relation to this topic.

                There are a few rebuttals I could make -- but, I don't use floating point much in Java. But I've never had a problem with it.

                Since I mostly work with BigDecimal (which is really an unlimited precision integer underneath), I don't have the inexactness of floating point approximations of money values.

                --
                People today are educated enough to repeat what they are taught but not to question what they are taught.
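
                (For the curious, BigDecimal's public API confirms the "integer underneath" description -- a two-line check:)

                    import java.math.BigDecimal;

                    public class UnderTheHood {
                        public static void main(String[] args) {
                            BigDecimal price = new BigDecimal("19.99");
                            // A BigDecimal is an unscaled integer plus a base-10 scale: 1999 * 10^-2.
                            System.out.println(price.unscaledValue()); // 1999
                            System.out.println(price.scale());         // 2
                        }
                    }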
                • (Score: 2) by FatPhil on Tuesday September 17 2019, @02:51PM (3 children)

                  by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @02:51PM (#895160) Homepage
                  The complaints are mostly from an HPC perspective: big simulations and modelling, with lots of feedback, where seemingly trivial inaccuracies can have large cascading consequences. Finance is abacus work compared to that.

                  Having a specific type for currency values is ideal (as long as you can trust the implementer of that black box). It can help with dimensional correctness too, if you have strict typing.
                  --
                  Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
                  • (Score: 3, Interesting) by DannyB on Tuesday September 17 2019, @03:32PM (2 children)

                    by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @03:32PM (#895191) Journal

                    Microsoft, amusingly, has a type for currency. It is basically your standard two's-complement 64-bit integer with an implied decimal point four places from the right. So it counts in hundredths of a cent, or ten-thousandths of a dollar -- or whatever unit of currency is in use.
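
                    For concreteness, a minimal sketch of that scaled-integer idea (an illustration only, not Microsoft's actual implementation):

                        public final class Currency4 {
                            // A 64-bit integer counting ten-thousandths of a currency unit,
                            // i.e. an implied decimal point four places from the right.
                            private final long units;

                            private Currency4(long units) { this.units = units; }

                            public static Currency4 of(long dollars, int cents) {
                                return new Currency4(dollars * 10_000L + cents * 100L);
                            }

                            public Currency4 plus(Currency4 other) {
                                return new Currency4(units + other.units);
                            }

                            @Override public String toString() {
                                // Naive formatting; negative amounts would need more care.
                                return units / 10_000 + "." + String.format("%04d", Math.abs(units % 10_000));
                            }

                            public static void main(String[] args) {
                                System.out.println(of(19, 99).plus(of(0, 2))); // 20.0100
                            }
                        }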

                    I would like to see more languages that can attach dimensional correctness to scalar values, which in turn can be carried into more complex types such as rational values, complex numbers, etc. It would be a compile time error to have incorrect dimensions.

                    But when I muse about higher level and more abstract languages, I am often taken to task for it because programming should always be only about working at the bits and bytes and cpu cycles.

                    --
                    People today are educated enough to repeat what they are taught but not to question what they are taught.
                    • (Score: 2) by FatPhil on Wednesday September 18 2019, @08:19AM (1 child)

                      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Wednesday September 18 2019, @08:19AM (#895536) Homepage
                      Never knew about the MS centi-cents - interesting. That has the side effect of still forcing the programmer to decide what to do with the centi-cents they don't want to propagate to the outside world (invoice/etc.). I don't like the implication that every single step of a financial calculation will be performed in this type, because the errors would accumulate at a terrifying rate compared with how quickly they would if you were performing the calculations in FP, unless everything you are dealing with is multi-billion. I fear this might be a half-baked idea. Inside the black box, always use the best tool for the job. There really need to be two currency types - currency-for-userspace and currency-for-calculations - where the user should never see or use the latter.
                      --
                      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
                      • (Score: 2) by DannyB on Wednesday September 18 2019, @01:58PM

                        by DannyB (5839) Subscriber Badge on Wednesday September 18 2019, @01:58PM (#895640) Journal

                        Microsoft has had its currency type since at least the 1990's. It is in all their languages, as far as I know, but started in VB, I think. At the time it probably seemed like a good idea. And it probably stopped many programmers from using floating point instead.

                        --
                        People today are educated enough to repeat what they are taught but not to question what they are taught.
        • (Score: 3, Insightful) by iWantToKeepAnon on Tuesday September 17 2019, @02:36PM (1 child)

          by iWantToKeepAnon (686) on Tuesday September 17 2019, @02:36PM (#895156) Homepage Journal

          "Don't you ever miss the days when you used to be nostalgic?" -Loiosh

          Nostalgia isn't what it used to be. :/

          --
          "Happy families are all alike; every unhappy family is unhappy in its own way." -- Anna Karenina by Leo Tolstoy
          • (Score: 1) by nitehawk214 on Wednesday September 18 2019, @02:27PM

            by nitehawk214 (1304) on Wednesday September 18 2019, @02:27PM (#895657)

            "Happy families are all alike; every unhappy family is unhappy in its own way." -- Anna Karenina by Leo Tolstoy

            Wtf is a happy family? Something that only happens in books? :/

            --
            "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
      • (Score: 5, Informative) by sjames on Monday September 16 2019, @07:22PM (1 child)

        by sjames (2882) on Monday September 16 2019, @07:22PM (#894771) Journal

        The author made a fair point that the industry frequently bills Java as COBOL's natural successor, so it is perfectly fair to compare the two to see why that may not be such a good idea and why it hasn't been all that successful so far.

        TL;DR: The choice of Java for comparison was not made in a vacuum. No word of spherical cows.

        • (Score: 2) by FatPhil on Tuesday September 17 2019, @07:13AM

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @07:13AM (#895049) Homepage
          Fair point, +1. I've mostly ignored Java, as I do more scientific programming than business programming, which emphasises your point - Java clearly has that "business language" aura about it.
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 2, Insightful) by Anonymous Coward on Monday September 16 2019, @07:36PM

        by Anonymous Coward on Monday September 16 2019, @07:36PM (#894773)

        I recall my manager's manager pushing towards Java, and then watched two competent software engineers struggle for 8 months with making Java work with simple numbers. The language has been a pile of _insert_favorite_expletive_ since day one and has never improved. I cut my teeth on COBOL, and despite being overdue for a place at the palliative care center, it can at least do simple arithmetic. New and Shiny is not always Better.
        Do systems need to get updated? Yes. But choose good languages, OSes, tools. Too many people fall for some flashy presentation. That's how we have arrived at the mess the industry has become today.

      • (Score: 1) by cyberthanasis on Wednesday September 18 2019, @07:42AM (1 child)

        by cyberthanasis (5212) on Wednesday September 18 2019, @07:42AM (#895531)

        Not at all. She says that Java is what is usually used to replace Cobol. And she actually gives examples of numerical instability with floats and fixed-point reals.
        And she says that Java does have a fixed-point library, but Cobol has fixed point built in and is a compiled language. Therefore it is better when you need millions of transactions per second.

        • (Score: 2) by FatPhil on Wednesday September 18 2019, @08:40AM

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Wednesday September 18 2019, @08:40AM (#895540) Homepage
          You seem to be reading too much, or too little, into "compiled", "library", and "builtin".
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 5, Insightful) by DannyB on Monday September 16 2019, @02:36PM (10 children)

    by DannyB (5839) Subscriber Badge on Monday September 16 2019, @02:36PM (#894605) Journal

    Back in my college daze. I was still using BASIC. Rebelling against FORTRAN. Saw no possible use for COBOL. I had a side (work study) job in the computer center actually writing code. Was friends with the head of the computer center who was seriously into COBOL.

    He told me that COBOL would be around at the turn of the century. (We didn't call it turn of millennium back then, but ah, hindsight...) I couldn't believe him. Just could not. But he explained why he believed this. The sheer economic value of COBOL code that existed, even in 1980, was so vast that it could never be replaced in only twenty years. Too many systems deeply depended on COBOL.

    Of course, I was still a youngster and he had a few more years of wisdom. Of course, he was right.

    I see the same thing now in Java. And for the same reason. Too much Java code exists to be replaced any time soon. And especially in very large, very conservative organizations. Including especially banks and financial institutions. Places where you don't have hipsters coming up with new framework of the week that everything must be rewritten in . . . just because. Java has been extremely conservative about backward source and binary compatibility. Really old source and binaries will run on current runtimes -- and that's hard to do, just ask Python.

    Why COBOL? I didn't see it for a long time, but eventually did. As I looked back on computer history, I came to realize (which I also partly realized in college) that mainframes are seriously RECORD-oriented machines. Perfect for business. Whereas UNIX-like systems, or even microcomputer OSes that give you byte-oriented file read/write, are more desirable for all kinds of uses, including records. COBOL is highly record-oriented. And text-oriented. PICTURE this: you can read in a card, specify the character positions of various fields and their parsing formats, and have them loaded into variables on each iteration. All files are "stacks of punched cards". Even modern files are text, with each 80-column line being a record. Or wider records, now that we're not worried about backward compatibility with punched cards. But a 1980 COBOL program will just work. So while I now understand WHY the language is attractive, what niche it serves, and the economic reasons it is still around . . .

    I would never want to use COBOL myself. Just as some would never want to use Java. Or other languages. Hey if it's not for your problem domain, don't use it. But don't knock it if it is in widespread use. There is probably a reason for that. Warts and all.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
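
    (A rough Java rendering of the card-record idea above, with an invented layout; COBOL would declare the same thing with PICTURE clauses, e.g. 05 AMOUNT PIC 9(5)V99 for seven digits with an implied decimal point:)

        public class CardRecord {
            public static void main(String[] args) {
                // One 33-column "card": account number, name, amount in implied cents.
                String card = "123456SMITH, J.           0004299";
                String acct = card.substring(0, 6);                   // cols 1-6
                String name = card.substring(6, 26).trim();           // cols 7-26
                long cents  = Long.parseLong(card.substring(26, 33)); // cols 27-33
                System.out.printf("acct %s, %s, $%d.%02d%n",
                                  acct, name, cents / 100, cents % 100);
                // acct 123456, SMITH, J., $42.99
            }
        }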
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @02:55PM

      by Anonymous Coward on Monday September 16 2019, @02:55PM (#894617)

      Finally someone gets it!!!

      COBOL
      RPG (assembler in drag)
      IBM ASM

      ALL are powerful languages for bulk processing. All from the days when the 1400 machines were running the world. COBOL was the “first” write once, run on anything. RPG was a direct replacement for the 1403 back-a-lack boards and the 4k of core that ran the I/O cycle and set flags. IBM ASM moved 16 kB in a single “click” tick.

      ACH transfers are card decks. Amounts are not 2.31e+10. Real processing takes real machines. IBM understood business. The rest are sciences that make data-gathering devices.

      Kids today?

    • (Score: 3, Insightful) by bussdriver on Monday September 16 2019, @03:55PM (2 children)

      by bussdriver (6876) Subscriber Badge on Monday September 16 2019, @03:55PM (#894652)

      If it ain't broke don't fix it!!! That needs to be tattooed onto every nerd's hands.

      If you can't manage the COBOL, then get out of the career; you are not competent enough. Bind to new languages if you must. The insanity and utter failure of software is that we are constantly rewriting everything when the real problems are BUGS. It takes decades of man-hours to perfect something, and even then the astronomical level of branching involved can be (and is) hiding unnoticed bugs. An experienced, wise old programmer will tell you that whatever new design you invent will have its own pitfalls, and those will grow with time (and with other people's opinions). Now, some things do become obsolete, like the horse and buggy... so starting over might actually make sense; but re-inventing the wheel is foolishness (no, adding rubber to a wheel shouldn't count as reinvention).

      The older the code, the more it's been tested and debugged, the more valuable it is. You do not redo it unless absolutely necessary. The goal of next-gen programming needs to be write-once software that will last (or at least components of it will) forever and be efficient to produce. COBOL might be a pain in the ass, but it clearly succeeded in filtering out moron programmers (unlike Java) and it is lasting forever. Rust maybe could be another COBOL/C... its goal is efficient long-term write-once... but again, only for new code; secure, mature C / COBOL does not need rewriting... because the work that Rust helps with was already done the hard way. Tweaking old code to fit new needs or removing bugs from it is just fine. It's unlikely you will ever save $$$ porting it unless you have perfect translation software... and the support costs of shifting are justified by the gains of the new tools.

      People talking of porting COBOL are like a carpenter replacing an old nail gun with a bluetooth-enabled hammer! A specialty tool used in its niche will probably beat a hip new generic tool.

      • (Score: 5, Funny) by DannyB on Monday September 16 2019, @04:29PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @04:29PM (#894670) Journal

        Hipsters: If it ain't broke, fix it 'till it is!

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
      • (Score: 1, Informative) by Anonymous Coward on Monday September 16 2019, @06:06PM

        by Anonymous Coward on Monday September 16 2019, @06:06PM (#894736)

        It has no way not to be: if it were written the way code is written today, the olden days' hardware would have been totally unable to run it at all.

    • (Score: 2) by Thexalon on Monday September 16 2019, @05:14PM (1 child)

      by Thexalon (636) on Monday September 16 2019, @05:14PM (#894694)

      Or other languages. Hey if it's not for your problem domain, don't use it. But don't knock it if it is in widespread use. There is probably a reason for that. Warts and all.

      Counterpoint: Sometimes the platform the thing needs to run on forces you to use a particular language no matter how much it sucks. For example, web browsers only universally understand Javascript, so web developers code Javascript not because they want to but because it's the only choice available. Or writing an application in a machine's machine code because there aren't any compilers / interpreters yet for that CPU and somebody demanded that you write software for it, not because it's a great language.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2) by DannyB on Monday September 16 2019, @06:46PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:46PM (#894759) Journal

        In the 70's it was obvious that some machines were 'biased' towards certain languages. Or modes of thinking.

        Developers then, as now - or all humans, actually - take the path of least resistance.

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @07:40PM (2 children)

      by Anonymous Coward on Monday September 16 2019, @07:40PM (#894774)

      The *machines* were not record oriented. The *language* was record oriented. The machines only seem record oriented because they are viewed through the prison of an incredibly limited language. (I originally wrote prism, but it autocorrected and as so often happens, the new word is better).

      Lots of business processes seem to be record oriented... Right up until something happens that doesn't fit in the record. In the 50s it was normal for humans to write little notes in the margins or whatever... Computers cannot do this. Limitations in the language directly cause limitations in the process. Why do you suppose the IRS, the phone company and so many other COBOL users are so notoriously inflexible? They can only do what the computer lets them do.

      COBOL is one of the best examples of "the worst thing that can possibly work." That is not a strength. Inability to get rid of it doesn't make it good.

      • (Score: 2) by Muad'Dave on Tuesday September 17 2019, @04:34PM (1 child)

        by Muad'Dave (1413) on Tuesday September 17 2019, @04:34PM (#895245)

        > The *machines* were not record oriented.

        Early filesystems, at least those on the Interdata/Perkin Elmer/Concurrent machines I used to work on, were record-oriented. You could declare a file as 'indexed' with a fixed max record length like 80 or 132 characters, or as 'contiguous' where the record length was fixed at 256 bytes (one blocksize - used for executables, etc). There was no concept of a line terminator.

        Obviously the disk drive itself was just storing 256 byte blocks.

        • (Score: 0) by Anonymous Coward on Wednesday September 18 2019, @04:28AM

          by Anonymous Coward on Wednesday September 18 2019, @04:28AM (#895492)

          Huh. Interesting. So this was effectively an abstraction over the physical blocks, to have an OS-level controllable virtual block size? I.e. there was no record-size-centric design at the hardware level?

    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @10:35PM

      by Anonymous Coward on Monday September 16 2019, @10:35PM (#894848)

      Most COBOL code sticks because of regulations. That data you have? By law it must match what it did last year. So you re-wrote it. Does it match? *Reaaaaallly* match? I mean all several trillion records you have. Does it match? Does it roll up the same? OR leave it the fuck alone and the regulators are fine. No fines. Java will stick around for the same reasons. Oracle smells the blood in the water :)

  • (Score: 5, Insightful) by jimtheowl on Monday September 16 2019, @03:05PM

    by jimtheowl (5929) on Monday September 16 2019, @03:05PM (#894623)
    Why spend so much time talking about COBOL instead of putting in the effort to just learn it?

    https://en.wikipedia.org/wiki/GnuCOBOL [wikipedia.org]

    A tool-chest has all kinds of screwdrivers in it.
  • (Score: 3, Insightful) by KilroySmith on Monday September 16 2019, @03:06PM (8 children)

    by KilroySmith (2113) on Monday September 16 2019, @03:06PM (#894624)

    Cobol and Fortran are holding us hostage because of a complete and total failure of the discipline of Software Engineering.

    After 70 years of existence, Software Engineering is still completely incapable of specifying and implementing large, complex systems. We have no language to capture the specifications and behavior of a COBOL system, so our ability to translate its 50 years of evolved behavior into a document capable of being used to architect and design a new system that meets the needs of the organization is nearly nil. We can capture individual function specifications, but writing Java implementations of every COBOL function and trying to tie them all together doesn't leave you with a more understandable, faster, more reliable system.

    If Software Engineering had evolved to the point it should be at, we would have a modern Air Traffic Control system, and banks that updated in real time and not in batch at night.

    • (Score: 2) by bussdriver on Monday September 16 2019, @04:07PM (4 children)

      by bussdriver (6876) Subscriber Badge on Monday September 16 2019, @04:07PM (#894659)

      Software Engineering isn't the problem. It's HUMANS. Our inherent flaws and limitations are what has held back the evolution of software.

      We can't specify complex systems in detail much better today than back then. Agile is kind of an admission that we can't handle complex software by planning. Sure, you can use everything we have plus the next 20 years of progress, and you'll still end up with an edited specification that changes while it is being implemented and runs into design dead-ends requiring workaround kludges, etc. It's like you need to implement a working 1.0 before you realize all the mistakes, then design a rewrite that isn't as nearsighted... because hindsight is so good and we never seem to foresee well enough... It may have less to do with human limitations and flaws/politics than with accurately predicting the future in a way that lets the ideal optimal path be chosen (for a set of problems that are not yet completely understood.)

      Think of English: well-written things are revised and edited even by the best authors. Software is the same, but it's far more intricate... Changing the phrasing of one sentence doesn't alter a whole book so that it fails to communicate its meaning; software is not so forgiving.

      • (Score: 1, Insightful) by Anonymous Coward on Monday September 16 2019, @04:42PM (2 children)

        by Anonymous Coward on Monday September 16 2019, @04:42PM (#894673)

        It's a good point, but one of the other areas where COBOL became relevant was a drive to reduce or eliminate programming jargon and instead approach processing using ordinary language tools. COBOL (and FORTRAN) were implementations trying to put the tools of programming in the hands not of programmers but of subject-matter experts. When dealing with the complexity of a business or scientific problem, instead of requiring intermediary developers, or developers who must master business or science concepts, why not design the system such that end users can understand the concepts well enough to either program it themselves or do some level of debugging on it? Placing programming in end users' hands these days amounts to "learn the intricacies, syntax, and quirks of language ______." Not that COBOL and FORTRAN don't have these also, but there was an intention to lessen that burden and give the scientists and businesspeople of that era direct access to understanding a codebase. At the tradeoff of some degree of flexibility and elegance.

        • (Score: 2) by bussdriver on Monday September 16 2019, @05:13PM

          by bussdriver (6876) Subscriber Badge on Monday September 16 2019, @05:13PM (#894693)

          Anybody can program. Not everybody can do it well. Something people forget: besides that, your IQ is like a power meter, and no matter how big yours is, it goes down the harder you concentrate on tasks. A great programmer who sucks at some science is going to use up their IQ on the problem, while an expert is going to use their IQ on the programming. It really depends on the situation which one is going to do better. Complex languages like C++ that provide way too much bloat of options (or Perl, with insane linguistic power) might entertain a pro programmer but also lower their IQ thinking about the details. For systems work, those details ARE the work, so it's not as horrible... but they work when the pro knows when to ignore all the feature bloat and work with the proper subset (and future Perl 6 gurus will make the language conform to the problem at hand and then use an even more niche subset. Will people notice the powerful potential? Probably not...)

        • (Score: 0) by Anonymous Coward on Monday September 16 2019, @07:54PM

          by Anonymous Coward on Monday September 16 2019, @07:54PM (#894782)

          And SQL. But the notion of some non-technical business specialist writing their own queries in SQL would be laughable today. These languages all date from an era when it was commonly believed that the hard part of programming was the syntax. And it's not a surprise; many non-programmers today still think that. But that's as absurd as the notion that the hard part of being a lawyer is the Latin. We now know that not only is syntax not the hard part, but these languages are the opposite of helpful: it's important to keep code concise, and to make the language efficiently represent the problem. "English-syntax" languages make both of those things worse.

          In hindsight, this is obvious. Natural languages are for talking to humans, and computers do not work like humans. But in the 1950s, nobody realized just how different they are.

          Nobody blames Henry Ford for not including seat belts and power steering, but that doesn't mean there's any reason to drive a Model T today except for historical interest.

      • (Score: 2) by kazzie on Monday September 16 2019, @05:36PM

        by kazzie (5309) Subscriber Badge on Monday September 16 2019, @05:36PM (#894709)

        Software Engineering isn't the problem. It's HUMANS. Our inherent flaws and limitations are what has held back the evolution of software.

        Are you writing that on behalf of Bot?

    • (Score: 3, Insightful) by DannyB on Monday September 16 2019, @04:38PM

      by DannyB (5839) Subscriber Badge on Monday September 16 2019, @04:38PM (#894672) Journal

      Software Engineering is still completely incapable of specifying and implementing large, complex systems.

      Maybe computers and software allow humans to create systems of such complexity that they cannot be completely specified in the way that, say, a large building can be.

      Most other human activity involves lots of repetition. Many floors in a skyscraper? Lots of repetition or at least similarity. In software, repetition and similarity are factored into common functions, or collections (libraries) of such functions. Yet even with that factoring, every major software project involves a lot of complexity that is unique to the project. Even with UI frameworks. UI input validation frameworks. Database patterns and common practices. A lot has been made systematic -- yet each major project brings tons of uniqueness that is difficult to completely specify.

      Software also is created for many one-off tasks by people who do not specialize in software and do not have the best practices. They just need it to work -- for them. And then share it with ten thousand colleagues.

      --
      People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 3, Insightful) by Thexalon on Monday September 16 2019, @05:39PM

      by Thexalon (636) on Monday September 16 2019, @05:39PM (#894715)

      Required reading: Fred Brooks - No Silver Bullet [unc.edu]

      Summary: Some of the complexity of software is due to software design and development making things harder than they need to be. A lot of it, though, is because the problem that it's trying to solve is complicated, and there's just no getting around that.

      For example, a major user of COBOL historically has been the IRS. Have you read the entire US tax code? No, of course not, because it's a big mess that's so complicated and changing so quickly that even the IRS has a tough time keeping up. In what universe would you expect the code to address all those nooks and crannies to be simple and clear?

      When I'm trying to explain the work of coding to non-coders, I tend to focus on the fact that the real skill of coding is anticipating and responding appropriately to everything that can possibly go wrong with something before it happens. For example, good code takes into account the fact that the power could be cut to the computer at any time, meaning that the program can stop unexpectedly on any instruction, and needs to usefully recover from that.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @06:43PM

      by Anonymous Coward on Monday September 16 2019, @06:43PM (#894756)

      Cobol and Fortran are holding us hostage because of a complete and total failure of the discipline of Software Engineering.

      Software Engineering is still completely incapable of specifying and implementing large, complex systems

      Regarding specifying stuff, "Customers/users don't really know what they want" is not a Software Engineering problem but is a common major problem.

      We have no language to capture the specifications and behavior of a COBOL system, so our ability to translate it's 50 years of evolved behavior into a document capable of being used to architect and design a new system

      Copying an existing complicated building from scratch so that it works EXACTLY the same (bugs, squeaks, defects and all) isn't that easy either. Is that a fault of Civil Engineering?
      Copying an existing complicated building from scratch MINUS the "unwanted" bugs and "defects" while keeping some "wanted bugs" is harder (especially if the Client and end users haven't fully figured out what is unwanted and what is wanted). Is that a fault of Civil Engineering?

      See also:
      https://soylentnews.org/comments.pl?noupdate=1&sid=10481&cid=260038#commentwrap [soylentnews.org]
      https://soylentnews.org/comments.pl?noupdate=1&sid=1603&cid=38521#commentwrap [soylentnews.org]

  • (Score: 4, Touché) by Anonymous Coward on Monday September 16 2019, @03:09PM (5 children)

    by Anonymous Coward on Monday September 16 2019, @03:09PM (#894626)

    Why do we still use stone buildings built centuries ago? COBOL was designed by disciplined people for use by normal people. I observe more incompetence, laziness, stupidity, and singularities in all the post-modern languages created by undisciplined freaks and drug abusers.

    • (Score: 2) by ikanreed on Monday September 16 2019, @03:46PM (3 children)

      by ikanreed (3164) Subscriber Badge on Monday September 16 2019, @03:46PM (#894650) Journal

      Nothing says "robust" like a tangled mess of GOTO.

      • (Score: 3, Funny) by FatPhil on Monday September 16 2019, @03:53PM (2 children)

        by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @03:53PM (#894651) Homepage

        You want some COMEFROM [wikipedia.org] instructions too, if you please.

        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
        • (Score: 3, Funny) by Anonymous Coward on Monday September 16 2019, @04:50PM (1 child)

          by Anonymous Coward on Monday September 16 2019, @04:50PM (#894677)

          I always wanted a language that had both GET and BACK commands. That way I could design a program that could GET(LONER) and it would return JOJO and GET(PLACE) and return TUCSON ARIZONA and GET(THING) and return CALIFORNIA GRASS. Then I could send the program BACK to where it ONCE BELONGED.

          • (Score: 2) by janrinok on Monday September 16 2019, @05:35PM

            by janrinok (52) Subscriber Badge on Monday September 16 2019, @05:35PM (#894708) Journal
            Made me smile - thank you.
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @07:59PM

      by Anonymous Coward on Monday September 16 2019, @07:59PM (#894784)

      Perhaps you are not aware of the actual history of COBOL. It was designed by non-experts, with multiple incompatible "standards" modified according to political whims, littered with incomplete and incompatible extensions with widely varying support, and then modified again by computer vendors trying to create incompatibilities in order to achieve lock-in.

      Of everything you could say about this language, "disciplined" would be among the last things.

  • (Score: 2) by epitaxial on Monday September 16 2019, @03:17PM (2 children)

    by epitaxial (3165) on Monday September 16 2019, @03:17PM (#894632)

    It's the "not invented here" ideology that is ruining modern software. Something is old, therefore it cannot be any good.

    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @04:07PM (1 child)

      by Anonymous Coward on Monday September 16 2019, @04:07PM (#894658)

        Something is old, therefore it cannot be any good.

      Well I am old and not very good, so they may have a point.

      However, if it took 50 years to debug the code, replacing it might not be the best strategy.

      • (Score: 2) by DannyB on Monday September 16 2019, @06:14PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:14PM (#894742) Journal

        I'm old too.

        When you get old enough, everything becomes a joke.

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 3, Interesting) by jelizondo on Monday September 16 2019, @03:28PM (13 children)

    by jelizondo (653) Subscriber Badge on Monday September 16 2019, @03:28PM (#894641) Journal

    Very surprised this is still a problem. Some thirty years ago (maybe a tad more) I was involved in designing an accounting system that had to deal with very large numbers (10^12) and found it gave slightly wrong answers due to rounding.

    I remember adding up the output of the system with a calculator and coming up with the correct answer, but the friggin’ system couldn’t. Finally we decided to use Binary Coded Decimals, as the author suggests in TFA, and voilà! the answers matched.

    The paper High Performance Computing: are we just getting wrong answers faster? [nd.edu] [.PDF warning] referenced in the TFA gives some great examples of how floating-point can’t get some answers right, regardless of the precision used. The following example is on pages 2-3:

    f = 1.172603...

    when using double precision, the result is
    f = 1.1726039400531...

    and when using extended precision, the result is
    f = 1.172603940053178...

    The fact that the answer does not change with increasing precision is often taken as confirmation that the correct answer has been obtained. However, the correct answer is, in fact:
    f = -0.827396059946...

    So, I am not afraid of our robot overlords, they can’t even get the sign right on an equation!
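
    (The paper's example appears to be Rump's classic polynomial; here is a hedged Java re-run. Hedged, because naive IEEE-754 doubles go wrong in yet another way than the hardware the paper used did -- which, if anything, strengthens the point:)

        public class RumpExample {
            public static void main(String[] args) {
                double a = 77617.0, b = 33096.0;
                double a2 = a * a, b2 = b * b;
                double b4 = b2 * b2, b6 = b4 * b2, b8 = b4 * b4;
                // Naive evaluation: the huge polynomial terms should cancel almost
                // completely, but catastrophic cancellation destroys the result.
                double f = 333.75 * b6 + a2 * (11 * a2 * b2 - b6 - 121 * b4 - 2)
                         + 5.5 * b8 + a / (2 * b);
                System.out.println("naive double: " + f);                 // wildly wrong
                System.out.println("true value  : " + (a / (2 * b) - 2)); // -0.827396...
            }
        }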

    • (Score: 2) by FatPhil on Monday September 16 2019, @04:02PM (11 children)

      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @04:02PM (#894654) Homepage
      What does that calculation yield when performed in fixed-width BCD of no greater precision than extended precision IEEE-854 floating point?

      Because at the moment, you've compared standard FP with absolutely no alternative at all apart from infinite precision, which, even though in this case it can be easily faked using arbitrary precision, is still a completely different beast: a complete apple-to-bunch-of-bananas comparison.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 5, Insightful) by DannyB on Monday September 16 2019, @05:33PM (10 children)

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @05:33PM (#894706) Journal

        Don't use floating point for financial calculations.

        Floating point is an approximation overall. Subsets of representable numbers have exact representation. Such as integers, or 0.25. But ten cents 0.10 only has an approximate representation because it is a repeating binary value.

        For money, you want exact values. You don't want to know that approximately this much got deposited to your bank account.

        Floating point is fantastic -- for certain uses. Integers and other financial types are also fantastic for their uses. Just like programming languages have specific use cases that they are good at.

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
        • (Score: 2) by FatPhil on Monday September 16 2019, @05:48PM (9 children)

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @05:48PM (#894722) Homepage
          Now go off and read some Kahan.

          The *only* example that decimal-heads *ever* come up with is "0.1 isn't exact in binary". Such statements are stupid, and they identify the decimal-heads as stupid decimal-heads. I can think of an infinitude of example numbers that aren't exact in decimal FP. Therefore my example is infinitely more meaningful than yours.

          Decimal FP is only good if the *only* thing you ever do is add, subtract, or multiply by decimally-trivial ratios. It's no better than binary FP if you ever need to take a ratio, or a logarithm, or an exponent, or other things that people who deal with finance are likely to do. Decimal FP is provably less accurate than binary FP - and the proof's in Kahan, now go and read it.
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
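
          (Java's own decimal type illustrates the ratio point, for what it's worth:)

              import java.math.BigDecimal;
              import java.math.RoundingMode;

              public class DecimalNoPanacea {
                  public static void main(String[] args) {
                      BigDecimal one = BigDecimal.ONE;
                      BigDecimal three = new BigDecimal("3");
                      try {
                          one.divide(three); // 1/3 has no terminating decimal expansion...
                      } catch (ArithmeticException e) {
                          System.out.println(e.getMessage()); // "Non-terminating decimal expansion..."
                      }
                      // ...so you must pick a scale and a rounding mode, i.e. approximate:
                      System.out.println(one.divide(three, 20, RoundingMode.HALF_EVEN));
                      // 0.33333333333333333333
                  }
              }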
          • (Score: 3, Insightful) by DannyB on Monday September 16 2019, @06:11PM (6 children)

            by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:11PM (#894739) Journal

            Throughout this topic I have never advocated decimal FP. I advocate integer data types for money. In Java, BigDecimal is often used -- but that is a facade for a BigInteger with a base 10 exponent in order to insert the right decimal point once the BigInteger is converted to base 10 digits.

            There are (rarely) calculations done with money that involve floating point. But that is the exception and not the rule. Floating Point is an approximation. Integers are exact.

            Money has always been an integer data type since before recorded history. The age of computers did not change that.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2) by FatPhil on Tuesday September 17 2019, @07:57AM (5 children)

              by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @07:57AM (#895064) Homepage
              > Throughout this topic I have never advocated decimal FP. I advocate integer data types for money.

              Then statements like your "[in binary FP] But ten cents 0.10 only has an approximate representation" are a red herring or deliberately disingenuous, because you can't represent 0.10 as an integer either - the thing you advocate is as useless as the thing you were criticising.

              And before you spin up your seemingly imbalanced rotors, any "but I scale by 2 decimal places" can be countered with a "but I can scale by a factor of 100".
              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
              • (Score: 2) by DannyB on Tuesday September 17 2019, @02:10PM (4 children)

                by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @02:10PM (#895144) Journal

                To clarify, since I must have been unclear.

                Don't use FP for money. FP is a type for approximate values, not exact values.

                Money has always been integers since long before computers existed. Use an integer type. (Aside: BCD can be thought of as an integer type with an implied decimal point somewhere.) Integer arithmetic, and I mean two's complement, is exact. Even division has an exact quotient and remainder.

                As an example of why not to use floating point for money, I point out the value of 0.10 which perfectly illustrates that there is no exact FP representation for ten cents. There are many values that have no exact FP representation.

                Back in the 1970s (some would say the 1960s) everyone figured out that FP is inappropriate for money.

                Floating point is excellent for certain uses, just as most tools have an ideal use. Floating point is the wrong tool for the job of representing money.

                Parsing a string like "5.00" into an integer is an exact process. Converting an integer 500 into a string like "5.00" with a decimal point inserted two places from the right, is an exact process. At no point do any approximations exist.
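
                A minimal Java sketch of that round trip (naive: assumes a non-negative amount with exactly two decimal places):

                long cents = Long.parseLong("5.00".replace(".", ""));          // 500 -- exact
                String s = String.format("%d.%02d", cents / 100, cents % 100); // "5.00" -- exact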

                I hope that is more clear.

                --
                People today are educated enough to repeat what they are taught but not to question what they are taught.
                • (Score: 2) by FatPhil on Tuesday September 17 2019, @03:13PM (3 children)

                  by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @03:13PM (#895173) Homepage
                  Agree with pretty much all of that.

                  However, it is possible to use binary FP correctly for money - the rider is that:
                  If your use of FP does not permit you to exactly represent all the values you want it to represent - you're doing it wrong.

                  And that's the issue with 1.0 := $1 binary FP.

                  If, however, you can exactly represent all the values you want to represent (e.g. 100.0 = $1 FP, and you only care about results in cents), then there's an isomorphism between that representation and your integer representation. They do the job equally well; one can be turned into the other, and back again, with no loss of information. Both represent integer multiples of an atomic value.

                  The calculations you perform on these values will almost always want sub-atom-sized (i.e. fractional, in the FP context) guards on them, which FP gives you automatically, but you need to know exactly when to change from as-close-to-exact-as-possible intermediate values back into the must-be-an-integer-multiple-of-the-atom representation (e.g. by suitably round()ing an FP value).

                  Not needing to flip between integer and FP makes FP easier. Any integer representation must flip into FP in order for the calculations to be performed on it, and flip back out of FP as it leaves the black box - but why bother with that overhead if FP can do it with a NOP and a round()? The alternative is to write your own FP-avoiding numeric library, which if you have the skill to do, you have the skill to use FP correctly anyway.
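
                  A sketch of that discipline in Java, with the cent as the atom (the balance and rate are made-up figures for illustration):

                  double balanceCents = 1_000_000.0;                  // $10,000.00; integers up to 2^53 are exact in a double
                  double interestCents = balanceCents * 0.05 / 365.0; // inexact intermediate -- the guard digits absorb it
                  balanceCents += Math.round(interestCents);          // snap back to an integer number of cents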
                  --
                  Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
                  • (Score: 2) by DannyB on Tuesday September 17 2019, @03:34PM (2 children)

                    by DannyB (5839) Subscriber Badge on Tuesday September 17 2019, @03:34PM (#895192) Journal

                    If one is to use a floating point type to hold exact integers, why not just use an integer type and then never have any doubt?

                    --
                    People today are educated enough to repeat what they are taught but not to question what they are taught.
                    • (Score: 2) by FatPhil on Wednesday September 18 2019, @08:00AM (1 child)

                      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Wednesday September 18 2019, @08:00AM (#895535) Homepage
                      I know I do go on a bit, but that was clearly stated above - the overhead of the conversion into and out of the form you actually want to perform the calculations on. With FP it's a NOP and a round(). Why perform conversions you don't need?
                      --
                      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
                      • (Score: 2) by DannyB on Wednesday September 18 2019, @01:54PM

                        by DannyB (5839) Subscriber Badge on Wednesday September 18 2019, @01:54PM (#895638) Journal

                        For simple arithmetic, round should never be an allowed operation. That is the virtue of integers. It exactly models what currency has always been. Floating point is fantastic if I'm plotting a course to Pluto, doing statistical calculations, doing 3D rendering, and many, many other applications.

                        I won't go on about it, but IMO, floating point has no place when representing currency. (Possibly useful for certain intermediate calculations, but then the result goes back to a currency value.) Integer arithmetic (+, -, *, /) is exact. (Division has an exact quotient and remainder.) Converting to / from string representation is exact. The rest of the industry figured this out in the 1970s. Possibly even the 1960s.

                        Use floating point for money if you want to. But it probably won't happen on any real commercial type of project. It's the wrong tool for the job.

                        I use Java, as do many others, for the type of work I do, because it is the right tool. (I didn't say perfect. But its combination of attributes is great.) And developer productivity is the major thing to optimize for now. Complain about it if you must, but it is deeply entrenched. As is COBOL -- and I dislike COBOL, but I understand WHY it is.

                        OTOH, other people use C and C++ for certain things which they are exceedingly good at.

                        Use the right tool for the right job.

                        --
                        People today are educated enough to repeat what they are taught but not to question what they are taught.
          • (Score: 2) by DannyB on Monday September 16 2019, @06:13PM

            by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:13PM (#894741) Journal

            One more thing. Using FP for money is the wrong tool for the job, very much like:

            * using a hammer to drive wood screws into wood
            * using C to write accounting software or application software

            There are many software tools, and some of them are better for specific uses than other tools.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
          • (Score: 1, Informative) by Anonymous Coward on Monday September 16 2019, @06:52PM

            by Anonymous Coward on Monday September 16 2019, @06:52PM (#894763)

            ^^^ Proof that a little knowledge leads to a bigger idiot.

            Decimal FP is only good if the *only* thing you ever do is add, subtract, or multiply by decimally-trivial ratios.

            You should reread the OP's post instead of telling him to read Kahan. Decimally-trivial ratios are what most financial systems in the world use. And thus the OP's point remains valid.

    • (Score: 4, Insightful) by DannyB on Monday September 16 2019, @04:57PM

      by DannyB (5839) Subscriber Badge on Monday September 16 2019, @04:57PM (#894680) Journal

      You can use wide enough integers instead of BCD. Are 64 bit integers big enough? Integers are EXACT. Just as exact as BCD. Use an integer to represent US cents, for example. When displaying, add in the decimal point. When parsing, remove the decimal point. It's just cents as an integer. In 64 bits you can represent the US National Debt expressed in Argentine Pesos without overflow.

      DO NOT, EVER use floating point for money calculations. Floating point is for scientific calculations. NOT REPEAT NOT for accounting.

      BCD really is just an integer format, in base ten, with a decimal point implied somewhere. For efficiency, just use a 64 bit or 128 bit integer.
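
      As a rough headroom check in Java (the debt figure is an order-of-magnitude assumption, not a sourced number):

      long maxCents = Long.MAX_VALUE;             // 9,223,372,036,854,775,807
      long usDebtCents = 2_200_000_000_000_000L;  // ~$22 trillion, in cents
      System.out.println(maxCents / usDebtCents); // 4192 -- thousands of times to spare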

      --
      People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 3, Insightful) by bradley13 on Monday September 16 2019, @04:01PM (10 children)

    by bradley13 (3053) on Monday September 16 2019, @04:01PM (#894653) Homepage Journal

    On the one hand, there's nothing wrong with the idea of "never touch a running system". Cobol works, and by now the compilers and libraries have been thoroughly proven.

    On the other hand, claiming other programming languages cannot do fixed-point math is dumb. And anyway, that's not really the point. TFA seems not to emphasize the real point: you should not use binary floating-point representations for decimal numbers. They mention programs written for the IRS: if anyone uses floating-point numbers for financial calculations, they are screwed before they start. You *must* use decimal or exact integer arithmetic, and that leaves you a choice of approaches.

    There are three clean solutions:

    - Shift your decimals so that you can do integer math, and shift back for the final result. In other words, implement (or use) a fixed-point library. If Cobol has some oddities in rounding/truncation, you can implement those as needed. This would allow you to use binary integer types, as long as you stay below the maximum values. (See the sketch after this list.)

    - Use a library that correctly implements decimal arithmetic (like Java's BigDecimal). TFA mentions this, but doesn't really bring the point home. This is the cleaner solution, but may deliver results slightly different from existing Cobol code.

    - If the performance of libraries like BigDecimal is a problem (unlikely, but possible) then implement Cobol-style math in a new library. If Cobol compilers can do it, so can libraries in other languages. The operations are well-understood (um, because the Cobol compilers have already implemented them), so this is entirely possible.
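
    A minimal fixed-point sketch of the first option (Java; the price and tax rate are illustrative assumptions):

    long priceCents = 1999;     // $19.99
    long rateBasisPoints = 825; // 8.25%, scaled by 10^4
    long taxCents = (priceCents * rateBasisPoints + 5_000) / 10_000; // add half the divisor to round half up -> 165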

    "We tried to rewrite the code in Java and Java couldn't do the calculations right."

    Color me unimpressed. They apparently weren't very good programmers, or at least they did not understand Java.

    - - - - -

    P.S. For anyone who doubts the inaccuracy of using binary arithmetic for decimal values, consider the following program:

    float A = 1000000;        // exactly representable as a float
    float B = (A + 1) / 1000; // 1000.001 has no exact float representation
    float C = A / 1000;       // 1000.0 is exact
    float D = B - C;          // should be 0.001; the rounding error surfaces here
    float E = 1000000 * D;
    float F = E - 1000;       // should be exactly 0
    System.out.println("F should be 0.0, but is actually " + F);

    Result:

    F should be 0.0, but is actually -23.4375
    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by FatPhil on Monday September 16 2019, @04:13PM (5 children)

      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @04:13PM (#894662) Homepage
      Now use decimal, and replace those 1000s with 1001, 1000000s with 1002001s, and report back.

      You see, if you want to do anything apart from multiply and divide by factors of powers of 10, you "are screwed before [you] start".

      No difference from binary. Read some Kahan (or at least /What Every Computer Scientist Should Know About Floating-Point Arithmetic/) before bloviating so.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 5, Touché) by bradley13 on Monday September 16 2019, @04:36PM (4 children)

        by bradley13 (3053) on Monday September 16 2019, @04:36PM (#894671) Homepage Journal

        The point is this: When you are doing financial calculations, you are using powers of 10. Because that's how currencies work.

        --
        Everyone is somebody else's weirdo.
        • (Score: 4, Insightful) by FatPhil on Monday September 16 2019, @05:58PM (2 children)

          by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Monday September 16 2019, @05:58PM (#894729) Homepage
          False. You're thinking of noddy stuff like ledger accounting, where the input numbers have been passed a priori through a filter that has simplified them to be trivial and exactly handleable in decimal. This is only a subset of financial computations. If you can't imagine exponentiation to a non-integer power ever happening in the world of finance, then *put down the calculator, and walk away, you're not safe in charge of such a thing*. Or become president of the USA.
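
          One everyday example, as a Java sketch (the rate is an assumption):

          // converting an annual rate to an effective daily rate takes a fractional exponent;
          // the result has no exact decimal representation at any scale
          double annualRate = 0.06;
          double dailyRate = Math.pow(1.0 + annualRate, 1.0 / 365.0) - 1.0;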
          --
          Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
          • (Score: 2) by DannyB on Monday September 16 2019, @06:03PM (1 child)

            by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:03PM (#894733) Journal

            I have written calculations like you describe in financial software. That calculation is a black box. It takes inputs, does an inexact calculation to great precision, and then the results are fixed into some form of currency values that are the output of that black box.

            Everyone who does the same calculations (different languages, implementations, etc) will get the same results. But outside that black box, you're back to some kind of currency representation of money.

            Values of money have always been integer data types since before recorded history. The age of computers did not change that.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2) by FatPhil on Tuesday September 17 2019, @07:05AM

              by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday September 17 2019, @07:05AM (#895044) Homepage
              > Values of money have always been integer data types since before recorded history.

              I'd change that to "Values of money have always been values representing integer multiples of an atomic unit of value since before recorded history." as we've usually chosen a fixed point representation. If the base unit of currency was the cent, rather than the dollar, then many of the issues people bring up would simply evaporate. This is because decimal-heads love saying stupid things like "you can't represent $0.01 exactly in binary", which is patently false if you're representing cents. If you encounter side effects like "1 dollar (100)" * "1 dollar (100)" = "100 dollars (10,000)", then you are almost certainly doing something wrong, as there's no such thing as a square dollar.

              But you seem to get it - firstly you need an exact representation for every value that you wish to represent (can be fixed point, can be floating point suitably scaled, can be decimal FP, can be integer). Binary FP based on dollars is not such a beast - no one's denying that. Once you've got that, then operate on those values using whatever is most accurate. And given that you want to avoid overflow at all costs, that's almost certainly FP. And as I've mentioned elsewhere, binary provably does better than decimal.

              The problem is when people fail to perform that first decision-making step correctly. Some languages protect idiot programmers against this better than others. I personally think it's better to just not hire idiot programmers, as they'll make a mistake elsewhere.
              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
        • (Score: 3, Informative) by DannyB on Monday September 16 2019, @05:59PM

          by DannyB (5839) Subscriber Badge on Monday September 16 2019, @05:59PM (#894730) Journal

          It is powers of 10, true. But it is also true that currencies are ALWAYS integers. Never approximations like floating point.

          Once you recognize it is integers, you can use integer types for much greater efficiency. Integer arithmetic is always EXACT. Even division has an exact quotient and remainder. EXACT. No fuzziness, ambiguity or imprecision. All currency values (within range) are exactly representable as integers. Conversion to and from base 10 for printing and parsing is also EXACT. So for five dollars you would use an integer of 500.
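
          For instance, in Java:

          long total = 1001, people = 3;
          long each = total / people; // 333 -- exact quotient
          long left = total % people; // 2 -- exact remainder; 333 * 3 + 2 == 1001, always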

          --
          People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 2) by Dr Spin on Monday September 16 2019, @04:13PM (1 child)

      by Dr Spin (5239) on Monday September 16 2019, @04:13PM (#894663)

      I think 64 bit integers are still large enough to handle the current US debt as cents.

      However, if Trump remains in power, this may not be so for much longer. In which case, you could always use double precision.

      OTOH, if you are using floats for money, you should probably get a head transplant.

      --
      Warning: Opening your mouth may invalidate your brain!
      • (Score: 2) by DannyB on Monday September 16 2019, @04:59PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @04:59PM (#894682) Journal

        64 bit integers can handle the US National Debt as expressed in Argentine Pesos, which is a larger magnitude number.

        As you say, NEVER use floats for money. I thought everyone figured that out in the 1970s. Why, in the 21st century, does nobody seem to know that?

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 2) by sjames on Monday September 16 2019, @11:04PM

      by sjames (2882) on Monday September 16 2019, @11:04PM (#894868) Journal

      - Use a library that correctly implements decimal arithmetic (like Java's BigDecimal). TFA mentions this, but doesn't really bring the point home. This is the cleaner solution, but may deliver results slightly different from existing Cobol code.

      That could be a big problem during validation (nobody is going to put a program in charge of that much money without some serious validation).

    • (Score: 1) by dumain on Thursday September 19 2019, @08:25AM

      by dumain (8558) on Thursday September 19 2019, @08:25AM (#896017) Homepage

      Guile Scheme has exact rationals as a built-in type, which should be able to adequately represent both fixed point and BCD. Guile can also be interfaced to GnuCOBOL, so the transition to the smugly superior programming language could be managed in accord with modern buzzwords (Agile! DevOps!).

  • (Score: 5, Insightful) by DannyB on Monday September 16 2019, @04:48PM (6 children)

    by DannyB (5839) Subscriber Badge on Monday September 16 2019, @04:48PM (#894676) Journal

    I've got it myself (e.g., with Perl). I've seen it here on SN.

    Here is a true thing: If there were one perfect programming language, everyone would be using it already.

    (Or at least migrating to it.)

    If there is a language you don't like, but it is among the top languages in use and top languages for high paying job offers, then there is probably a reason for that. Even if you are too dumb uninformed to realize why these languages have such high level placement.

    Different languages have a better fit for different porpoises.

    Would you write a device driver in PHP? Yet PHP seems to work great for creating small one-off pages or sites -- which then morph into Facebook. Which is one of my problems with dynamic languages. I love the quick one-off-ness of Lisp or Python. But I wouldn't write a gigantic system like the one I maintain at work in such a language. An earlier generation of software that I work on was in a "dynamic / duck typed" language (specific to Microsoft), and as the code base grew I came to understand why such languages are inappropriate for this, despite how appealing they are for small quick projects or one-off tasks.

    Clearly C and C++ have a place, but not in the software I write. But people who should know better, here on SN, just don't get it.

    If you're going to criticize a language because it has certain warts or artifacts of evolution, then I would point out the thing about throwing rocks and glass houses.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @05:39PM (1 child)

      by Anonymous Coward on Monday September 16 2019, @05:39PM (#894716)

      Well, sort of. Sure, there are some languages that are bad for certain purposes but good for others. COBOL is bad for all purposes. The only reason to use it is because you have to. That was always true: even in the 60s, the only reason it got any traction was because the Department of Defense mandated it (and it very much feels like a language designed to implement the processes of the 1960s military).

      I'm old enough to remember the 80s, and everyone hated COBOL then, too. It has twice the verbosity of Java, half the convenience of assembly, twice as many land mines as Perl, and half the power of BASIC. Its type system is designed to cause the maximum number of errors while also getting in the way whenever possible. The few features it has to make things easier instead end up forcing you to do things wrong.

      Aside from joke languages [wikipedia.org], it's very close to the worst possible language.

      • (Score: 2) by DannyB on Monday September 16 2019, @05:56PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @05:56PM (#894728) Journal

        I remember the 70s.

        I agree with everything you said about COBOL.

        I hated COBOL. I think I mentioned it somewhere in this topic that I hated COBOL. (look for it) But looking at history I came to understand why it is. It was the path of least resistance.

        Many things in life aren't perfect, but we live with them. How many people use Microsoft Windows? Internet Explorer?

        So COBOL was used. Tons of software written in it. Now here we are. It is too big to simply replace overnight. (BTW, my other post pointed out the prediction that COBOL would still be here at the turn of the century; I disagreed with it and was proven wrong.)

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 2) by DeathMonkey on Monday September 16 2019, @05:41PM (2 children)

      by DeathMonkey (1380) on Monday September 16 2019, @05:41PM (#894717) Journal

      You should post this in the Python thread too!

      • (Score: 2) by DannyB on Monday September 16 2019, @05:47PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @05:47PM (#894721) Journal

        I posted something about Java in the Python thread. And I got one example of that kind of bigotry. Conflating various things together -- where I could create counter-examples for C, such as blaming C for there being wxWindows, Gtk and Qt.

        The amusing thing is one of the very first things I said was Java has warts.

        People who hate are never going to see the legitimacy of things they hate. Even if they should know better.

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
      • (Score: 2) by DannyB on Monday September 16 2019, @06:29PM

        by DannyB (5839) Subscriber Badge on Monday September 16 2019, @06:29PM (#894748) Journal

        I suddenly realized we're talking about different Python threads. :-)

        --
        People today are educated enough to repeat what they are taught but not to question what they are taught.
    • (Score: 0) by Anonymous Coward on Monday September 16 2019, @05:51PM

      by Anonymous Coward on Monday September 16 2019, @05:51PM (#894726)

      Very true, and it reminds me of one of my favorite web comics about programming languages [sandraandwoo.com].
