
posted by Fnord666 on Tuesday December 04 2018, @03:01PM
from the portents-of-future-ecma-script dept.

The Enterprisers Project writes about how demand for several very specific, established skills, including COBOL, is increasing as Baby Boomers retire and take their knowledge with them. Part of the skills gap between the old guard and the new is familiarity with workflows and business processes.

Baby Boomers are retiring and taking with them the skills to run legacy technologies upon which organizations still (amazingly) rely – from AS/400 wrangling to COBOL development. That leaves many CIOs in a tight spot, trying to fill roles that not only require specialized knowledge no longer being taught but that most IT professionals agree also have limited long-term prospects. "Specific skill sets associated with mainframes, DB2 and Oracle, for example, are complex and require years of training, and can be challenging to find in young talent," says Graig Paglieri, president of Randstad Technologies.

Apparently, COBOL is still in use at 9 percent of businesses, mainly in finance and government, and so demand for COBOL skills is gradually growing. For anyone interested in picking it up, plus one or more of the other legacy technologies, on top of something newer and trendier, there should be an opportunity to clean up before the last of these jobs moves to India.

Earlier on SN:
Jean Sammet, Co-Designer of a Pioneering Computer Language, Dies at 89 (2017)
Banks Should Let Ancient Programming Language COBOL Die (2017)
Honesty in Employment Ads (2016)
3 Open Source Projects for Modern COBOL Development (2015)


  • (Score: 3, Insightful) by The Mighty Buzzard on Tuesday December 04 2018, @09:54PM (13 children)

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday December 04 2018, @09:54PM (#769781) Homepage Journal

    If you're relying on GC, I do not want to work with you. GC = "I don't understand how to manage memory for shit. Why can't The Computer do this for me?"

    --
    My rights don't end where your fear begins.
  • (Score: 4, Insightful) by DannyB on Tuesday December 04 2018, @10:42PM

    by DannyB (5839) Subscriber Badge on Tuesday December 04 2018, @10:42PM (#769822) Journal

    If you're relying on GC, I do not want to work with you. GC = "I don't understand how to manage memory for shit. Why can't The Computer do this for me?"

    There are several arguments here really.

    If you don't like GC, that's fine. There's nothing wrong with that. There are entire problem domains where that is in fact the proper view to have.

    We could all just write in C. Or assembly language. Or hex codes even. Yes, really. We could. Even flip front panel switches.

    So why don't we? Productivity. Because really what we're dancing around here is money.

    If I need an extra 64 GB of RAM and more CPU cores but can beat my C++ competitor to market by six months, my boss won't even blink when I ask, and he'll laugh all the way to the bank. I am not optimizing for bytes and CPU cycles. I'm optimizing for dollars. That's probably why so many banks and financial institutions use Java.

    Why do these high level languages exist, and why does GC exist?

    "A programming language is low level when its programs require attention to the irrelevant."
    - Alan J. Perlis.

    I can manage memory just fine, actually. I did that for years and years with Pascal in the 80's, and C and C++ in the 90's. But it is irrelevant. Managing memory doesn't help me solve the actual problems that my code is designed to solve. It is unnecessary bookkeeping.

    We have these machines now called computers. They can take care of the unnecessary bookkeeping for us. Just like dishwashers can wash dishes for us.

    My final argument is this. For the programming problems I routinely solve, it would probably be fairly easy to manage memory. Memory is allocated and released in fairly simple patterns.

    Sometimes programs are very complex and do not have simple patterns of allocation and release. You begin to notice this in Lisp programs. GC was invented for a reason.

    There is another famous quote that I won't bother to look up, but the gist is this: any sufficiently complex C++ program contains a bug-ridden, poorly specified, suboptimal, ad-hoc garbage collector.

    Why can't The Computer do this for me?

    I know how to take square roots by hand (a child of the 70's), but I use a pocket calculator.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 2) by c0lo on Wednesday December 05 2018, @07:33AM (11 children)

    by c0lo (156) Subscriber Badge on Wednesday December 05 2018, @07:33AM (#770000) Journal

    If you're relying on GC, I do not want to work with you

    Either you fixed S/N unwillingly or you used a Perl implementation with explicit memory management?

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 2) by The Mighty Buzzard on Thursday December 06 2018, @12:07AM (10 children)

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday December 06 2018, @12:07AM (#770345) Homepage Journal

      I take it you've never done any mod_perl programming? You absolutely do memory management in it, though not of the typical variety. The vast majority of our code gets interpreted at runtime and then given the boot as soon as it's finished running. The same with data. Very few things are going to be worth keeping in memory given the stateless nature of web programming, so we don't for the most part.

      --
      My rights don't end where your fear begins.
      • (Score: 2) by c0lo on Thursday December 06 2018, @12:42AM (9 children)

        by c0lo (156) Subscriber Badge on Thursday December 06 2018, @12:42AM (#770375) Journal

        You absolutely do memory management in it, though not of the typical variety. The vast majority of our code gets interpreted at runtime and then given the boot as soon as it's finished running. The same with data.

        Riiight!
        I suppose you could call it "explicit memory management" for some values of "explicit memory management".

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2) by The Mighty Buzzard on Thursday December 06 2018, @04:31AM (8 children)

          by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday December 06 2018, @04:31AM (#770485) Homepage Journal

          I'd call it extremely inefficient and ill-advised, but then I think CGI should be written in a compiled language.

          --
          My rights don't end where your fear begins.
          • (Score: 2) by DannyB on Thursday December 06 2018, @02:40PM (5 children)

            by DannyB (5839) Subscriber Badge on Thursday December 06 2018, @02:40PM (#770652) Journal

            I have to pinch my nose while I say this, but there is some merit to the idea that a CGI could be written in a compiled language and NOT do any memory management. Just write the code as if you had GC. Simply do not deallocate anything. Just let go of pointers when you're done with them.

            The idea here is that the CGI does some processing, and then its process is destroyed, making any kind of memory management unnecessary. Why spend cycles cleaning up when the process is just going to exit almost immediately? I would call it an effective hack. Efficient, to be sure. Saving developer time, to be sure. Therefore economical, and that's what it's all about. Yet the geek in me screams that there is something bad about this. A minimal sketch of the idea is below.
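
            A minimal C sketch of that hack, assuming a toy CGI with nothing real behind it (the program and its output are made up for illustration):

            #include <stdio.h>
            #include <stdlib.h>

            /* A no-free CGI sketch: allocate as if a GC existed, never call
             * free(), and let process exit hand everything back to the OS. */
            int main(void) {
                char *greeting = malloc(64);          /* never freed */
                if (!greeting) return 1;
                snprintf(greeting, 64, "Hello from a no-free CGI");

                printf("Content-Type: text/plain\r\n\r\n");
                printf("%s\n", greeting);

                /* No cleanup: exiting releases every page at once, which is
                 * cheaper than walking data structures to free them piecemeal. */
                return 0;
            }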

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2) by The Mighty Buzzard on Thursday December 06 2018, @02:54PM (4 children)

              by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday December 06 2018, @02:54PM (#770662) Homepage Journal

              I hate to say this, but it depends largely on scope. If you're not taking up significant amounts of memory, and not taking any up for significant amounts of time, it's fairly pointless either to explicitly free said memory or to implement a GC. It's not an approach you'd want to take with an OS or any program with a runtime longer than minutes at the most, but freeing ALL of the memory a process is using by way of exiting is actually better than freeing just some of it (unless process creation/reaping overhead becomes an issue).

              --
              My rights don't end where your fear begins.
              • (Score: 2) by DannyB on Thursday December 06 2018, @03:09PM (3 children)

                by DannyB (5839) Subscriber Badge on Thursday December 06 2018, @03:09PM (#770668) Journal

                freeing ALL of the memory a process is using by way of exiting is actually better than freeing just some of it

                Yep. That's the crux of the point I wanted to make.

                Not a bad idea if process launching is cheap and efficient. See my related post on how Java (and .NET) take a very different approach where everything is native code, hot in memory, and directly wired into the web server's dispatch. No launching processes.

                GC threads are always running on other CPU cores. The deallocation of memory does not consume CPU cycles in the stream of processing the HTTP request; it happens at a completely different time and on a different CPU core.
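
                For instance, with HotSpot's G1 collector the GC's own thread pools are sized separately from the application's threads (the flag names are real; the values and mywebapp.jar are only illustrative):

                # ConcGCThreads: threads that mark garbage concurrently with the app.
                # ParallelGCThreads: threads used inside stop-the-world pauses.
                java -XX:+UseG1GC -XX:ConcGCThreads=2 -XX:ParallelGCThreads=8 -jar mywebapp.jar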

                --
                People today are educated enough to repeat what they are taught but not to question what they are taught.
                • (Score: 2) by The Mighty Buzzard on Thursday December 06 2018, @03:23PM (2 children)

                  by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday December 06 2018, @03:23PM (#770678) Homepage Journal

                  mod_perl does something similar but allows you to decide what code and data are permanently resident and what is loaded at need. I don't particularly care for it, but I'm not in the mood to rewrite the entire SN codebase either.
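
                  Roughly, the split looks like this in mod_perl 2 (a hedged sketch; MyApp::Handler and the paths are hypothetical, and SN's actual stack differs). Code preloaded at server start stays resident in every Apache child; everything else is loaded at need:

                  # httpd.conf: code pulled in here is compiled once at server
                  # start and stays resident in every Apache child process.
                  PerlRequire /etc/httpd/startup.pl
                  <Location /app>
                      SetHandler perl-script
                      PerlResponseHandler MyApp::Handler
                  </Location>

                  # MyApp/Handler.pm: a resident handler; no process is launched
                  # and nothing is re-parsed per request.
                  package MyApp::Handler;
                  use strict;
                  use Apache2::RequestRec ();
                  use Apache2::RequestIO ();
                  use Apache2::Const -compile => qw(OK);

                  sub handler {
                      my $r = shift;                   # the Apache request object
                      $r->content_type('text/plain');
                      $r->print("handled by resident code\n");
                      return Apache2::Const::OK;
                  }
                  1;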

                  Rust has really freaking spoiled me on the memory allocation/deallocation front lately. Pretty much every memory allocation has an explicit or implicit lifetime (unless you go way out of your way to make this not so) and is freed immediately upon the end of that lifetime. It's not even a runtime-enforced thing; you literally cannot compile something that tries a use-after-free.
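
                  A tiny sketch of that compile-time guarantee (standard borrow-checker behavior, nothing project-specific):

                  fn main() {
                      let s = String::from("heap allocation");
                      drop(s); // `s`'s lifetime ends here; the memory is freed now

                      // Uncommenting the next line is a compile error, not a crash:
                      // error[E0382]: borrow of moved value: `s`
                      // println!("{}", s);

                      let t = String::from("freed at end of scope");
                      println!("{}", t);
                  } // `t` is dropped here automatically, with no GC thread involved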

                  --
                  My rights don't end where your fear begins.
                  • (Score: 2) by DannyB on Thursday December 06 2018, @07:30PM (1 child)

                    by DannyB (5839) Subscriber Badge on Thursday December 06 2018, @07:30PM (#770796) Journal

                    Even with the concepts that Rust brings to help manage the "ownership" of an allocation, there are programs that don't fit well.

                    I won't say it is impossible, just difficult enough that you probably won't see any. Try building a CAS (computer algebra system), or a theorem prover, or a minimax-type game AI without a GC.

                    Especially try to build any type of logic programming system (e.g., Prolog, miniKanren, etc.), or an implementation of Haskell (pervasive lazy evaluation), without GC; I doubt it could be done.

                    While Rust would seem fantastic for boot loaders, OS kernels, device drivers, microcontrollers, etc., there are a lot of programs at the other, high-level end of the spectrum where GC is perfectly appropriate. That includes commercial applications like a web server, where the GC cost is on a different thread and not inline on the same thread as the processing of the request. Servicing an HTTP request does not need to spend cycles doing any memory management. Even if the GC is less efficient, just throw another few CPU cores at it; they're cheap by the dozen. If the economic case is to process requests as fast as possible, then don't spend cycles of request processing on memory management.

                    But GC is not for everything or every use. Still, I find it interesting how pervasive it has become since the Lisp days: Visual Basic. Visual FoxPro. Perl. Python. JavaScript, which also means Node.js. Java. C# (actually, any language on the JVM or .NET runtime). New languages like Go. And many more I have not listed.

                    --
                    People today are educated enough to repeat what they are taught but not to question what they are taught.
                    • (Score: 2) by The Mighty Buzzard on Thursday December 06 2018, @10:17PM

                      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Thursday December 06 2018, @10:17PM (#770894) Homepage Journal

                      Honestly, if you're writing something in Perl that GC is even an issue in, you've utterly failed at picking the proper tool for the job IMO. Yes, you technically can write a script in Perl whose lifetime isn't measured in (fractional) seconds but why would you*? Perl's merits are that it is a fucking badass at mangling text. This shines like a motherfucker at scripts slightly more complicated than you want to write in Bash. It's even passable at marking up web pages if you don't do something silly like let the whole bloody script+interpreter+data stay in memory. Writing a video game or web browser in it would be bloody stupid though.

                      * I ask myself this every time I'm reminded of our IPN daemon for the site billing system. I have no idea why I wrote it in Perl except that possibly it was simple enough that it wasn't worth the mental page fault.

                      --
                      My rights don't end where your fear begins.
          • (Score: 2) by DannyB on Thursday December 06 2018, @02:51PM (1 child)

            by DannyB (5839) Subscriber Badge on Thursday December 06 2018, @02:51PM (#770660) Journal

            This is sort of a companion post to what I just wrote.

            A Java web app takes a very different approach than CGI.

            The Java source is compiled to bytecode. The Java runtime then compiles this into native code for the type of OS and processor in use. Then another compiler comes along later and compiles that same code again, spending a lot more time to produce much more efficient code, very specific to the processor you have, including the exact instruction set extensions it supports.

            All this code is hot in memory and all wired up to the web server it is installed into. When an HTTP request comes along, it is immediately directed to executable code that processes it. There are no interpreted "html"-type pages. (Think of "php" or "jsp", where you embed code within an html-style page.) Those JSP pages (that you think of as html with embedded code) are turned inside out and literally compiled into code that contains "print statements" emitting the static html fragments directly into the byte stream going out to the browser.
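
            A hand-written sketch of the idea, assuming a page like <h1>Hello, <%= name %></h1> (this is not actual generated output; real containers emit something messier):

            import java.io.IOException;
            import java.io.PrintWriter;
            import javax.servlet.http.HttpServlet;
            import javax.servlet.http.HttpServletRequest;
            import javax.servlet.http.HttpServletResponse;

            // The JSP "turned inside out": static HTML becomes print statements.
            public class HelloJspServlet extends HttpServlet {
                @Override
                protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                        throws IOException {
                    String name = req.getParameter("name"); // the <%= name %> expression
                    resp.setContentType("text/html");
                    PrintWriter out = resp.getWriter();
                    out.write("<h1>Hello, ");  // static fragment emitted directly
                    out.print(name);           // dynamic value
                    out.write("</h1>");        // static fragment
                }
            }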

            An HTTP request does not even launch a process. It is merely dispatched to an already-allocated, waiting thread of executable code that directly processes the HTTP request. Nothing is interpreted. No bytecodes. No files on disk are touched to process a request. No OS processes are launched.

            This is why it is so fast, at the cost of memory. But memory and cycles are cheap; dirt cheap. And developer time is expensive and getting more expensive every day. And talented developers are hard to find, a limited resource. The economics say not to optimize for CPU cycles and bytes but to optimize for dollars. And ultimately we are all doing our work to make money. The economics trump everything else. If a developer doesn't understand the basic economics of what they are doing, they will probably never become a senior developer.

            I'm not trying to 'sell' you on Java. Just explaining the serious merit behind why it is so heavily used when money is at stake.

            --
            People today are educated enough to repeat what they are taught but not to question what they are taught.
            • (Score: 2) by DannyB on Thursday December 06 2018, @02:56PM

              by DannyB (5839) Subscriber Badge on Thursday December 06 2018, @02:56PM (#770664) Journal

              The fact that the bytecode is compiled twice, by two different compilers, is why a Java web app seems to "warm up". It takes a few minutes before it becomes super fast.
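
              One way to watch that warm-up on HotSpot (mywebapp.jar is a stand-in name): each line logged is a method being JIT-compiled, and entries keep appearing at higher optimization tiers for a while after startup.

              java -XX:+PrintCompilation -jar mywebapp.jar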

              --
              People today are educated enough to repeat what they are taught but not to question what they are taught.