
posted by LaminatorX on Friday May 09 2014, @09:18AM   Printer-friendly
from the Get-Off-My-Extremely-Efficient-Lawn dept.

Ars technica looks at Fortran, and some new number crunching languages in Scientific computing's future: Can any coding language top a 1950s behemoth?

This state of affairs seems paradoxical. Why, in a temple of modernity employing research instruments at the bleeding edge of technology, does a language from the very earliest days of the electronic computer continue to dominate? When Fortran was created, our ancestors were required to enter their programs by punching holes in cardboard rectangles: one statement per card, with a tall stack of these constituting the code. There was no vim or emacs. If you made a typo, you had to punch a new card and give the stack to the computer operator again. Your output came to you on a heavy pile of paper. The computers themselves, about as powerful as today's smartphones, were giant installations that required entire buildings.

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by legont on Friday May 09 2014, @09:40AM

    by legont (4179) on Friday May 09 2014, @09:40AM (#41178)

    You have a handy razor blade to cut new holes and a bunch of paper squares to cover wrong ones.

    Back to FORTRAN, it's simple and efficient - hard to beat, same as AK47.

    --
    "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
    • (Score: 2, Insightful) by Wootery on Friday May 09 2014, @11:41AM

      by Wootery (2341) on Friday May 09 2014, @11:41AM (#41206)

      simple and efficient - hard to beat, same as AK47

      Indeed.

      They've both shown themselves to be effective problem-solvers despite being entirely devoid of pointer arithmetic.

      (You were right to use same as AK47 rather than same as the AK47 - it forces the use of a Russian accent.)

      • (Score: 4, Insightful) by hubie on Friday May 09 2014, @01:38PM

        by hubie (1068) Subscriber Badge on Friday May 09 2014, @01:38PM (#41238) Journal

        With a COMMON block, you don't need pointer arithmetic.

        • (Score: 3, Informative) by Anonymous Coward on Friday May 09 2014, @02:47PM

          by Anonymous Coward on Friday May 09 2014, @02:47PM (#41269)

          Ultimately, COMMON blocks were just shared global variables. Indeed, in some respects they are cleaner than true global variables, since every function has to explicitly declare which of the common blocks it uses. Yes, there's the lack of type checking, but as soon as you have several files, C is not much better in that respect (the preprocessor helps here, but then, preprocessing FORTRAN code isn't exactly unheard of either, although it was never part of the standard).

          Note that there is no such thing as a dangling COMMON block, or dereferencing a NULL COMMON block. There's no COMMON BLOCK arithmetic. A COMMON block does not change over time (the values of its variables may change, of course). And given the simple syntax of pre-90 FORTRAN, a program that checks that all your COMMON blocks are consistent should not be too hard to write either. OTOH, testing whether a program has no pointer issues is equivalent to the halting problem.

          Note that I'm not advocating the use of COMMON blocks in modern software; today we have superior constructs also in Fortran. But COMMON blocks were by far not as complex and error prone as pointers are.
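          For anyone who thinks in C, the closest analogue of a COMMON block is a shared file-scope struct that every routine using it must name explicitly. A hedged sketch, with invented names (nothing here comes from a real program):

          ```c
          /* Rough C analogue of a FORTRAN COMMON block: one shared storage
           * area, and every function that touches it declares it by name.
           * All names (mycommon, irec, temp1) are illustrative only. */
          #include <stdio.h>

          struct mycommon { int irec; double temp1; };
          struct mycommon mycommon;        /* one definition, like COMMON /MYCOMMON/ */

          void writer(void) {
              mycommon.irec  = 42;         /* fill in the shared block */
              mycommon.temp1 = 3.5;
          }

          void reader(void) {
              /* any routine naming the block sees the same storage */
              printf("%d %.1f\n", mycommon.irec, mycommon.temp1);
          }

          int main(void) {
              writer();
              reader();                    /* prints "42 3.5" */
              return 0;
          }
          ```

          The parallel to the AC's point: there is no arithmetic on `mycommon`, it never dangles, and it never changes identity at runtime; only the values inside it change.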

          • (Score: 2) by hubie on Saturday May 10 2014, @02:37AM

            by hubie (1068) Subscriber Badge on Saturday May 10 2014, @02:37AM (#41458) Journal

            Not only are they shared global variables; what made them immensely useful to me was that they are also contiguous memory. Because of this, it was trivial to do type conversion on the fly when reading in a record. If one had a 1000-byte record, one could read it in one subroutine

            SUBROUTINE READDATA(ILUN)
            COMMON /MYCOMMON/ IDUMMY(1000)
            INTEGER IDUMMY
            READ(UNIT=ILUN) IDUMMY
            END

            And in the subroutine where you handle the data

            SUBROUTINE DATAPROC( )
            COMMON /MYCOMMON/ IREC, TSTAMP, TEMP1, X(100), ...
            INTEGER IREC, TSTAMP
            REAL TEMP1, X
            . . .
            END

            The variables were filled in and ready to use. Like with all languages, there were ways to get yourself in trouble if you weren't careful.
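            The nearest C equivalent of that trick is a union over a raw byte buffer: read the record into the byte view, then pick the fields out of the typed view. A hedged sketch with invented field names; note that the padding, alignment, and endianness pitfalls are exactly the "ways to get yourself in trouble" mentioned above:

            ```c
            /* C analogue of the contiguous-COMMON trick: one storage area,
             * two views. Reading a member other than the last one written is
             * the classic type-punning idiom (sanctioned by C99 TC3, but
             * layout-dependent). Field names are invented for illustration. */
            #include <assert.h>
            #include <stdint.h>
            #include <string.h>

            union record {
                unsigned char raw[16];                     /* the "IDUMMY" byte view   */
                struct { int32_t irec, tstamp; double temp1; } f;  /* the typed view */
            };

            int main(void) {
                union record r;
                /* simulate reading a 16-byte binary record into the raw view */
                int32_t irec = 7, tstamp = 1000;
                double  temp1 = 2.5;
                memcpy(r.raw,     &irec,   4);
                memcpy(r.raw + 4, &tstamp, 4);
                memcpy(r.raw + 8, &temp1,  8);
                /* the typed view is now filled in, no conversion code needed */
                assert(r.f.irec == 7 && r.f.tstamp == 1000 && r.f.temp1 == 2.5);
                return 0;
            }
            ```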

  • (Score: 3, Funny) by Jaruzel on Friday May 09 2014, @09:57AM

    by Jaruzel (812) on Friday May 09 2014, @09:57AM (#41181) Homepage Journal

    If you made a typo...

    Oh the irony, the summary title has a typo in it! (Unless of course that's deliberate; Is SN now running Fortran?)

    -Jar

    --
    This is my opinion, there are many others, but this one is mine.
    • (Score: 1, Funny) by Anonymous Coward on Friday May 09 2014, @10:32AM

      by Anonymous Coward on Friday May 09 2014, @10:32AM (#41188)

      Real programmers can run FORTRAN on any web framework!

  • (Score: 5, Informative) by pTamok on Friday May 09 2014, @10:05AM

    by pTamok (3042) on Friday May 09 2014, @10:05AM (#41182)

    It is alluded to in the article, but one of the reasons FORTRAN is so popular is the existence of extensively tested suites of numerical-algorithm libraries. A great deal of work has gone into their implementations so that the results are correct, stable, and generated as fast as the state of the art allows. One example is the NAG library

    http://www.nag.co.uk/numeric/fl/FLdescription.asp [nag.co.uk]

    But there are others: most of the physical sciences will have FORTRAN libraries and entire applications tailored for their use, so you do not have to re-invent the wheel every time you need to do some processing. An example of this is MOPAC [ http://en.wikipedia.org/wiki/MOPAC [wikipedia.org] ] used in computational chemistry.

    So there is a huge, robust, and well-tested installed base of FORTRAN programs, and you need some pretty compelling features in a new language to overcome the sheer inertia of what already exists and is proven to work well.

  • (Score: 0) by Anonymous Coward on Friday May 09 2014, @10:17AM

    by Anonymous Coward on Friday May 09 2014, @10:17AM (#41183)

    Is it true that those computers were as powerful as smartphones?

    What were the specs: cores, ram, clockspeed ...

    • (Score: 2) by c0lo on Friday May 09 2014, @10:32AM

      by c0lo (156) on Friday May 09 2014, @10:32AM (#41187) Journal
      Here you have an example [wikipedia.org] of a state-of-the-art computing device from 1965:

      The system came with four memory sizes: E (32 KiB), F (64 KiB), G (128 KiB), and H (256 KiB), with an access time of 1 µs, which put it closer to the Model 65 (0.75 µs) than the Model 50 (2.0 µs). Storage protection was an optional feature

      So, 256 KiB of memory with a 1 MHz clock (no need to go faster, you wouldn't be able to read the memory any faster anyway), and it looked approximately like this [wikipedia.org].
      And this is [wikipedia.org] what the "flash memory stick" of that time looked like - it was indeed "portable"... provided you were wearing a van as a coat.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0
      • (Score: 2) by egcagrac0 on Friday May 09 2014, @01:28PM

        by egcagrac0 (2705) on Friday May 09 2014, @01:28PM (#41234)

        Ahh, the good old days, when "hard disk" was not the same as "hard drive"...

    • (Score: 3, Informative) by choose another one on Friday May 09 2014, @11:20AM

      by choose another one (515) Subscriber Badge on Friday May 09 2014, @11:20AM (#41199)

      Is it true that those computers were as powerful as smartphones?

      I doubt it. I reckon a modern low-end smartphone is about on a par with a 10-year-old reasonable-spec desktop PC - all except for persistent storage, where the PC probably had more in spinning rust than you get on an SD card these days, but not by much. Go back another 10 years or so and 16 MB of RAM was a lot, disk was hundreds of MB, and CPUs ran at tens of MHz. By the early 80s you are back to tens of KB and a 1 MHz CPU on the desktop.

      If we're charitable and take the biggest stuff at the very end of the 1950s, then you are looking at something like the IBM 7000 series with 32K words of RAM and 100 KFLOPS.

      My first Fortran was late 80s on VAXen, which probably topped out at 5 MHz and 16 MB RAM - shared between a bunch of people and charged back to your dept. by the CPU-second. Unless you were one of the incredibly lucky / demi-gods who got their own VAXstation. Still nowhere near the power of a modern feature phone, let alone a smartphone.

      Is it true? No - out by a few orders of magnitude, or a few decades.

  • (Score: 5, Interesting) by c0lo on Friday May 09 2014, @10:17AM

    by c0lo (156) on Friday May 09 2014, @10:17AM (#41184) Journal

    All memory allocation was static - yes, all the necessary memory was determined at compile time. No structures (just arrays), no pointers - this means absolutely trivial memory-addressing arithmetic.
    Want more? Fortran 77 did not allow recursion (introduced only in the F90 spec) - thus juggling parameters and return addresses on the stack was also relatively minimal and known after compilation.
    Even more? If you didn't have enough memory, tough luck... there was no memory swapping/paging as such - you'd need to spit your precrunched data onto tapes and reload it later, which, understandably, you would hate to do. After looking at MIX [wikipedia.org], why do you think there are so many pages dedicated to computing algorithm complexity and memory footprint in TAoCP - measure zillions of times and cut once
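    In C terms, that FORTRAN 77 memory model is as if every array were a fixed-size static at file scope: no malloc, no recursion, and the whole memory map known before the program runs. A toy sketch (sizes invented):

    ```c
    /* Everything sized at compile time, as in FORTRAN 77. The compiler
     * knows the complete memory layout before the program ever runs. */
    #include <stdio.h>

    #define N 100
    static double work[N];          /* fixed at compile time, like DIMENSION WORK(100) */

    int main(void) {
        for (int i = 0; i < N; i++)
            work[i] = (double)i * i;   /* trivial addressing: base + i * element size */
        printf("%.0f\n", work[N - 1]); /* prints "9801" */
        return 0;
    }
    ```

    With no dynamic allocation and no recursion, the address of every datum is a constant known at link time, which is exactly why the addressing arithmetic is trivial.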

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0
  • (Score: 1, Informative) by Anonymous Coward on Friday May 09 2014, @10:22AM

    by Anonymous Coward on Friday May 09 2014, @10:22AM (#41185)

    It may not be common knowledge that Fortran has undergone quite substantial evolution (it did stagnate between 1977 and 1990, which probably caused much of its image as an outdated language). Modern Fortran differs more from 1950s FORTRAN than C++11 does from K&R C.

    • (Score: 2, Interesting) by pasky on Friday May 09 2014, @10:59AM

      by pasky (1050) on Friday May 09 2014, @10:59AM (#41193)

      That's *nice*, but having "1990" in the name of the language doesn't mean it's actually well supported. If you look at e.g. https://github.com/scipy/scipy/issues/2829 [github.com] you see that scipy has trouble migrating from a 37-year-old Fortran language standard to a 24-year-old one, because upgrading from an 8-year-old gcc version to a newer one would interfere with library linking on a 12-year-old Windows version.

      It's a special kind of hell.

      • (Score: 2, Interesting) by choose another one on Friday May 09 2014, @12:32PM

        by choose another one (515) Subscriber Badge on Friday May 09 2014, @12:32PM (#41215)

        Your problem is not with FORTRAN versions but with trying to use Python, and probably nasty GUI stuff. If you stick to being a "real programmer" and write everything in FORTRAN you wouldn't have that problem (but then you'd use '66 to ensure that you "compile DO loops like God meant them to be").

        If you really have to do that "visualization" stuff, put your output numbers in a text file and visualize in your head (that's what it's for). If there are too many numbers for the screen, then print them out, pin the top of the printout to the top of the wall and drape downwards and across the floor from there; repeat with extra printouts to the right if > 132 columns are required. You do have a 132-col line printer & fan-fold paper, right (I mean, how do you visualise your data otherwise)?

        If you really have to, calculate your pixels for your visualization / GUI (shudder), in FORTRAN and drop to assembler to blit them to screen. Or use a FORTRAN visualization library that creates Gif or Pdf - but we're heading away from the one true path and towards heresy there...

        [ mostly, but not all, with tongue in cheek and obviously ref http://en.wikipedia.org/wiki/Real_Programmers_Don't_Use_Pascal [wikipedia.org] ]

      • (Score: 0) by Anonymous Coward on Friday May 09 2014, @12:32PM

        by Anonymous Coward on Friday May 09 2014, @12:32PM (#41216)

        High performance calculations don't run on Windows anyway. Do you know a single Windows supercomputer?

        • (Score: 0) by Anonymous Coward on Friday May 09 2014, @08:13PM

          by Anonymous Coward on Friday May 09 2014, @08:13PM (#41377)

          Do you know a single Windows supercomputer?

          You can see that if you squint really hard. [wikimedia.org]
          Among the fastest 500 recorded, the number fluctuates between two [tomsitpro.com] and three. [google.com]

          I wish I had bookmarked the page:
          A new system had been built and they were benchmarking it.
          For about an hour the world's fastest system ran Windoze.
          After they installed Linux on that system, they had a new mark for world's fastest.

          -- gewg_

  • (Score: 3, Insightful) by bradley13 on Friday May 09 2014, @11:08AM

    by bradley13 (3053) Subscriber Badge on Friday May 09 2014, @11:08AM (#41196) Homepage Journal

    One or more of these (see comment title) is missing from almost every "modern" programming language. They are "more powerful" languages, meaning that they add layers of complexity that hinder fast numerical processing. Object-oriented languages are inefficient, and anyway OO is the wrong paradigm for number-crunching applications. Dynamic data structures and garbage collection make computation time unpredictable. Fancy features (like Java's variable-sized array rows) make for complex semantics and inefficient code.

    Fortran is simple, its behavior is totally predictable, and the language is designed to support compilation to highly optimized and efficient machine code.

    --
    Everyone is somebody else's weirdo.
    • (Score: 2, Interesting) by RaffArundel on Friday May 09 2014, @02:15PM

      by RaffArundel (3108) on Friday May 09 2014, @02:15PM (#41250) Homepage

      When I was working on my aerospace engineering degree, Fortran, specifically Fortran 90, was the language you learned. I graduated in 2002, and vaguely remember another version being available, but as far as I know they still taught 90, from books based on 77. The professors said if you didn't know Fortran you wouldn't get a job in the field.

      However, when we were doing embedded stuff, it was in C. That was probably because the simulation, development and deployment tools supported it.

    • (Score: 1) by Max Hyre on Friday May 09 2014, @02:24PM

      by Max Hyre (3427) <maxhyreNO@SPAMyahoo.com> on Friday May 09 2014, @02:24PM (#41258)
      One major aspect of that predictability is locality of reference: it's a lot easier to ensure that the maximum amount of your data fits into level-n cache in FORTRAN. These days, that really matters.
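      A toy sketch of why locality matters, in C for illustration (Fortran stores arrays column-major, C row-major; the principle is the same either way): walking a 2-D array in its memory order is cache-friendly, while the strided walk touches the same data far less efficiently.

      ```c
      /* Locality of reference: both functions compute the same sum over the
       * same data, but sum_in_order walks memory contiguously while
       * sum_strided jumps N elements at a time, defeating the cache.
       * (In C the contiguous order is row-major; in Fortran, column-major.) */
      #include <stdio.h>

      #define N 512
      static double a[N][N];

      double sum_in_order(void) {       /* contiguous walk: cache-friendly */
          double s = 0.0;
          for (int i = 0; i < N; i++)
              for (int j = 0; j < N; j++)
                  s += a[i][j];
          return s;
      }

      double sum_strided(void) {        /* stride-N walk: same result, worse cache use */
          double s = 0.0;
          for (int j = 0; j < N; j++)
              for (int i = 0; i < N; i++)
                  s += a[i][j];
          return s;
      }

      int main(void) {
          for (int i = 0; i < N; i++)
              for (int j = 0; j < N; j++)
                  a[i][j] = 1.0;
          printf("%.0f %.0f\n", sum_in_order(), sum_strided());  /* both 262144 */
          return 0;
      }
      ```

      FORTRAN's static, contiguous arrays make it easy for compilers (and programmers) to keep the inner loop on the contiguous index.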
    • (Score: 3, Interesting) by Dr Ippy on Friday May 09 2014, @05:15PM

      by Dr Ippy (3973) on Friday May 09 2014, @05:15PM (#41320)

      I started out with Fortran (1970s) but later (1985) switched to C, as Fortran was no longer "flavour of the month" (as we used to say in those days).

      As a scientific / engineering programmer, just about all my programs have had this structure:

      1. Read input data from file(s)
      2. Crunch data
      3. Write output data to file(s)

      Sometimes this is inside a loop, until the input data runs out.

      With this form of program, things like objects, functional programming and recursion leave me cold. They're just not necessary for what I do. Essentially my style is imperative programming with subroutines.

      My language of choice for the past 15 years or so has been Perl, usually ignoring its kludged-on object orientation. Perl is simple (though it can be difficult to read if not well written and well commented -- but that's true for a lot of languages), predictable and efficient.

      I've rarely used libraries. Who reinvents the wheel understands the wheel.

      Yes, I'm old fashioned. That's what comes of being born before Fortran. ;-)

      --
      This signature intentionally left blank.
  • (Score: 2) by jimshatt on Friday May 09 2014, @11:12AM

    by jimshatt (978) on Friday May 09 2014, @11:12AM (#41197) Journal
    The contenders are FORTRAN, Clojure, Julia and Haskell. Haskell and Clojure are Functional Programming languages, and Julia is not an imperative language either. A lot of languages deserve to be taken into account as well (e.g. Python and C++ (mentioned)). So, weird comparison.
  • (Score: 3, Interesting) by mendax on Friday May 09 2014, @12:20PM

    by mendax (2840) on Friday May 09 2014, @12:20PM (#41213)
    The following quote has been attributed to Seymour Cray [wikipedia.org], C.A.R. Hoare [wikipedia.org], and John Backus [wikipedia.org], the latter being the father of Fortran; whoever said it, it has held true:

    "I don't know what the programming language of the year 2000 will look like, but I know it will be called FORTRAN."

    Based upon what Fortran looks like now, that seems to have been an accurate prediction.

    --
    It's really quite a simple choice: Life, Death, or Los Angeles.
    • (Score: 0) by Anonymous Coward on Friday May 09 2014, @12:34PM

      by Anonymous Coward on Friday May 09 2014, @12:34PM (#41217)

      Not entirely accurate, because the versions since 1990 are spelled "Fortran", not "FORTRAN".

  • (Score: 1) by gidds on Friday May 09 2014, @01:20PM

    by gidds (589) on Friday May 09 2014, @01:20PM (#41232)

    I know people who would love to use a more modern language, but are running formidably-sized numerical programs on big iron.

    They say that after all its years of being heavily optimised, FORTRAN is simply faster for that than anything else going.

    --
    [sig redacted]
  • (Score: 1) by Oligonicella on Friday May 09 2014, @01:35PM

    by Oligonicella (4169) on Friday May 09 2014, @01:35PM (#41237)

    What a lame rhetorical device.

    • (Score: 2) by Blackmoore on Friday May 09 2014, @02:14PM

      by Blackmoore (57) on Friday May 09 2014, @02:14PM (#41249) Journal

      I had originally closed with "I'll be over here yelling at clouds"

    • (Score: 2) by Bot on Friday May 09 2014, @04:06PM

      by Bot (3902) on Friday May 09 2014, @04:06PM (#41291) Journal

      Potentially inaccurate too.
      My database hasn't got a definitive answer to the question: "Do geeks mate at all?"

      --
      Account abandoned.
  • (Score: 0) by Anonymous Coward on Friday May 09 2014, @06:18PM

    by Anonymous Coward on Friday May 09 2014, @06:18PM (#41343)

    I love the lamenting of how outdated this silly language is, only to use vim and emacs as examples of how far we've come. :) HA!

    • (Score: 2) by Foobar Bazbot on Saturday May 10 2014, @12:11AM

      by Foobar Bazbot (37) on Saturday May 10 2014, @12:11AM (#41438) Journal

      My understanding is that vim and emacs are rather meant as examples of "ancient" tech, to show that FORTRAN was "prehistoric" in comparison.

      But in that vein, I'd like to complain about referring to this new-fangled vim. FORTRAN is older than vi itself.