
SoylentNews is people

posted by martyb on Wednesday March 11 2020, @10:00PM   Printer-friendly
from the what-do-YOU-think dept.

Ilya Dudkin at Skywell Software has a story

Top 7 Dying Programming Languages to Avoid Studying in 2019–2020.

Each language gets a paragraph's treatment as to why he thinks these languages are dead or dying. Those languages are:

  • Visual Basic
  • Objective-C
  • Perl
  • COBOL
  • CoffeeScript
  • Scala
  • Lisp

Do you agree with his assessment? Are there any other language(s) you would add to the list?


Original Submission

 
  • (Score: 3, Interesting) by JoeMerchant on Wednesday March 11 2020, @10:17PM (53 children)

    by JoeMerchant (3937) on Wednesday March 11 2020, @10:17PM (#969881)

    Visual Basic - I predict it will never really die, due to the load of spreadsheets used by people who have absolutely no understanding of the magic that makes their reports work every quarter...

    Objective-C - About time is all I can say - ultra-transparent lock-in efforts by Apple, Inc. notwithstanding, what did Objective-C ever bring to the party?

    Perl - hmmmm.... unlike VB, most people who use Perl have some modicum of understanding of how it works and can probably translate the legacy code to the latest hotness with relatively little trouble. Still, lots of legacy out there.

    COBOL - like VB, in spades. If COBOL were going to die, it would have done so 20 years ago or more.

    CoffeeScript - never heard of it, won't miss it. Has anyone here heard of SquirrelScript? I actually got a job at one of the three commercial enterprises to ever use it - though almost everyone there shunned it, except for the "Rock Star" (TM).

    Scala - out of my sphere of experience...

    Lisp - last time I used Lisp (I think) was a Turtle Graphics exercise out of a magazine in the 1980s...

    --
    🌻🌻 [google.com]
  • (Score: 2) by Osamabobama on Wednesday March 11 2020, @10:25PM (10 children)

    by Osamabobama (5842) on Wednesday March 11 2020, @10:25PM (#969888)

    I remember almost 30 years ago when I had to write FORTRAN for my engineering curriculum, there was talk of the dying language COBOL over at the business school. The only context where I've heard it mentioned since then is as a punchline for jokes about obsolete computers. I had assumed it was already dead...

    --
    Appended to the end of comments you post. Max: 120 chars.
    • (Score: 2) by JoeMerchant on Wednesday March 11 2020, @10:29PM (8 children)

      by JoeMerchant (3937) on Wednesday March 11 2020, @10:29PM (#969891)

      I believe a scary volume of financial transactions pass through COBOL code every day, still.

      I think the year was 1983 when my Fortran teacher said "COBOL is dead, but will live forever due to the amount of installed code" - our local college had just removed the paper card punch terminals and replaced them with CRT terminals, but the system still had a 77-character-per-line maximum and also ran batches of cards for people who had established themselves as punchcard users.

      --
      🌻🌻 [google.com]
      • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @01:54AM

        by Anonymous Coward on Thursday March 12 2020, @01:54AM (#970002)

        And all EFT systems. The format there is still punch cards too, though they talk about batch files. :)

      • (Score: 1, Informative) by Anonymous Coward on Thursday March 12 2020, @02:13AM (1 child)

        by Anonymous Coward on Thursday March 12 2020, @02:13AM (#970020)

        According to my friend who gets the mid six figures coding COBOL in the financial industry, their LoCs are going up.

        • (Score: 0) by Anonymous Coward on Saturday March 14 2020, @06:23AM

          by Anonymous Coward on Saturday March 14 2020, @06:23AM (#971063)

          COBOL code that is 30+ years old still needs to be maintained. No one has a solution.

          Mid six figures? So $400,000? That's high-end contractor rates. Possible, and it does happen.
          In Australia they still employ permanent staff at $70K to $120K to maintain COBOL systems.

          Have a trawl through https://www.apsjobs.gov.au [apsjobs.gov.au] if you are interested.

          Find the departments and teams and contact them directly. Services Australia, the ATO and several other departments still have COBOL systems. Also check the banks: Commbank, StGeorge, Westpac and ANZ specifically.

          They also need systems support people for these ancient systems. Mainframe ops, etc etc.

      • (Score: 1, Insightful) by Anonymous Coward on Thursday March 12 2020, @08:20AM

        by Anonymous Coward on Thursday March 12 2020, @08:20AM (#970151)

        You believe, and I know. I used to administer these systems.
        SAP / ABAP / Java tried to take over but failed.
        There is no true successor for COBOL.
        Feel free to invent one.

      • (Score: 4, Informative) by epitaxial on Thursday March 12 2020, @06:38PM (1 child)

        by epitaxial (3165) on Thursday March 12 2020, @06:38PM (#970336)

        Why would it be scary? Does the age of the language have something to do with its usefulness? The fact that nothing better has replaced it in a half century should speak for itself.

        • (Score: 2) by JoeMerchant on Thursday March 12 2020, @07:41PM

          by JoeMerchant (3937) on Thursday March 12 2020, @07:41PM (#970351)

          What's scary is that it is approaching "dead language" status. Thankfully, it's relatively simple and the basics can be learned quickly, but the larger structures and practices used are becoming a lost art. And, yet, millions of dollars per day flow through its pipes, like the Detroit water system in 2001 poisoning us in ways we are not yet aware of.

          --
          🌻🌻 [google.com]
      • (Score: 2) by hendrikboom on Saturday March 14 2020, @02:14PM (1 child)

        by hendrikboom (1125) Subscriber Badge on Saturday March 14 2020, @02:14PM (#971199) Homepage Journal

        Wasn't it a 72-character limit instead of 77? The remaining 8 characters were set aside for line numbers, so you could mechanically sort the deck back into order after dropping it down the stairs by accident.

        • (Score: 2) by JoeMerchant on Saturday March 14 2020, @04:47PM

          by JoeMerchant (3937) on Saturday March 14 2020, @04:47PM (#971231)

          That sounds familiar, though our CRT-based systems may have gotten a 5-character bonus since there was no longer any danger of accidental shuffling.

          --
          🌻🌻 [google.com]
    • (Score: 4, Insightful) by Dr Spin on Thursday March 12 2020, @06:34AM

      by Dr Spin (5239) on Thursday March 12 2020, @06:34AM (#970124)

      COBOL is Undead!

      If there were any more of it, it would be a zombie apocalypse.

      --
      Warning: Opening your mouth may invalidate your brain!
  • (Score: 5, Interesting) by Unixnut on Wednesday March 11 2020, @10:37PM (19 children)

    by Unixnut (5779) on Wednesday March 11 2020, @10:37PM (#969900)

    I don't know. I used Perl(5) on and off historically, mostly to deal with existing legacy stuff others had written, so I knew it on a basic level, but never put the effort in to understand it. Spent most of my time in Python (and C occasionally).

    Funny thing is, after Python 3 was released and I found it a frustrating and irritating modification of what was originally a good language (IMO), one that made even basic tasks harder, I started looking more into Perl, actually writing software in it and actively learning more about it... and the thing is, I like it.

    In many ways, it does things no other language can do as easily. It is an excellent scripting language in the original sense of the term, and it is now my go-to for small to medium automation, data munging and general scripting tasks. I don't think it is dying, I think it is just fading more into the background. It was originally developed as a systems language, for sysadmins to glue together and build environments and machines. Within that scope it does the job brilliantly, and I suspect that deep in the bowels of pretty much every system "popular languages" touch in some way, there is some Perl getting the work done.

    It is not a language I would write a massive object-oriented project in. For large-scale software projects I still use Python 3, as the annoyances they added to the language fade a bit when you are building a proper piece of software rather than a single-file script or automation tool, but Perl has displaced Python in most other places for me at least, making it very much alive and kicking.

    • (Score: 2) by JoeMerchant on Wednesday March 11 2020, @11:14PM (13 children)

      by JoeMerchant (3937) on Wednesday March 11 2020, @11:14PM (#969918)

      I have found that most of the obscure scripting languages (like SquirrelScript) were basically "doing Python" before Python got popular, and they were a bit ahead of Python in this area or that for a while, but the snake has pretty well squeezed the scripting world to where it is the weapon of choice, if you need that sort of thing.

      I recently ported a (Qt) C++ app to Python (PyQt) - and I was pretty happy with the initial results: calling cookie-cutter library functions, Python was only about 4% slower than C++. But when I got to the meat of what I really wanted to do (pixel-by-pixel manipulations in LOTS of images), Python just fell on its face, taking over 2000x longer to process than C++.
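
      That difference in miniature, as a hypothetical sketch (assuming numpy and an 8-bit grayscale image; the names are made up for illustration): the interpreted per-pixel loop is what falls on its face, while the vectorized form pushes the same loop down into numpy's compiled code.

        import numpy as np

        img = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # stand-in for one frame

        # Pure-Python per-pixel threshold: one interpreted loop iteration per pixel.
        def threshold_slow(image, cutoff=128):
            out = np.empty_like(image)
            for y in range(image.shape[0]):
                for x in range(image.shape[1]):
                    out[y, x] = 255 if image[y, x] > cutoff else 0
            return out

        # Vectorized equivalent: the loop runs inside numpy's compiled C code.
        def threshold_fast(image, cutoff=128):
            return np.where(image > cutoff, 255, 0).astype(image.dtype)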

      --
      🌻🌻 [google.com]
      • (Score: 5, Insightful) by Unixnut on Thursday March 12 2020, @12:09AM (12 children)

        by Unixnut (5779) on Thursday March 12 2020, @12:09AM (#969939)

        There is no one language "to rule them all". They all have pros and cons, which is why I tend to try to use whichever one I have in my toolbox that best fits the use case.

        I would write image manipulation in C for performance, even if the rest of the program is Python (it is nice that you can drop into C from Python when you need to, although it's a bit clunky). Likewise, I would not write a quick script to parse out and batch-rename some files in C.
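
        For what it's worth, the clunky-but-workable way of dropping into C looks roughly like this with the standard ctypes module (a sketch only; the library name, C function and build command are hypothetical):

          import ctypes

          # Hypothetical shared library built from a small C file, e.g.
          #   cc -O2 -shared -fPIC -o libimgops.so imgops.c
          # exposing: void threshold(unsigned char *buf, size_t n, unsigned char cutoff);
          lib = ctypes.CDLL("./libimgops.so")
          lib.threshold.argtypes = (ctypes.POINTER(ctypes.c_ubyte), ctypes.c_size_t, ctypes.c_ubyte)
          lib.threshold.restype = None

          def threshold(pixels: bytearray, cutoff: int = 128) -> None:
              """Run the per-pixel loop in C, modifying the buffer in place."""
              buf = (ctypes.c_ubyte * len(pixels)).from_buffer(pixels)  # zero-copy view of the bytearray
              lib.threshold(ctypes.cast(buf, ctypes.POINTER(ctypes.c_ubyte)), len(pixels), cutoff)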

        Generally, what I have done when writing image processing systems is first write the whole thing in Python. Even if the compute-intensive bits are 2000x slower, you have a proof of concept that produces the correct result, and a reference program to compare optimisations (or C rewrites) against.

        I switch between Bash/Perl/Python/C depending on needs, and (so far) those are the only ones I have needed in my toolbox, between them they cover everything I need to do well enough to get the job done in a sensible way.

        Python's strength is in its libraries, especially things like its ZeroMQ bindings (PyZMQ), which I use to split tasks across machines (sometimes that is faster than the programming effort needed to rewrite in C for more performance), numpy, and the fact that it is used more and more in data science/modelling and statistical analysis.
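
        The split-across-machines pattern is about as simple as it sounds - roughly this, as a sketch (assumes pyzmq is installed; the hostname, port and job payload are hypothetical): one PUSH socket fans work out to however many PULL workers connect to it.

          import zmq

          def distribute(jobs, port=5557):
              """Run on the coordinating machine: fan jobs out to connected workers."""
              ctx = zmq.Context()
              sender = ctx.socket(zmq.PUSH)
              sender.bind(f"tcp://*:{port}")
              for job in jobs:
                  sender.send_json(job)      # round-robined across connected PULL sockets

          def work(host="jobhost", port=5557):
              """Run on each worker machine: pull jobs and process them."""
              ctx = zmq.Context()
              receiver = ctx.socket(zmq.PULL)
              receiver.connect(f"tcp://{host}:{port}")   # "jobhost" is a made-up hostname
              while True:
                  job = receiver.recv_json()
                  print("processing", job)   # stand-in for the real number crunching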

        Saying that, I use Perl to pre-process the (generally messy) data sources into one consistent format for the Python analysis to do its thing, and in some cases the data is initially fetched with a bash wrapper around curl. A lovely thing about Unix is that you can easily string together programs on the command line, allowing me to use the right tool, written in the right language, for each stage of the work.

        • (Score: 3, Funny) by Arik on Thursday March 12 2020, @01:37AM (10 children)

          by Arik (4543) on Thursday March 12 2020, @01:37AM (#969990) Journal
          I'm sorry, that was generally a good post, but:

          "There is no one language "to rule them all"."

          Yes. Yes there is. It's binary*.

          Anything else you do has to be machine-translated, in one way or another, into binary. Therefore that is the veritable one language to rule them all.

          *It's generally easier to think of it as hex, but they map 1:1 unless you're on exotic hardware.
          --
          If laughter is the best medicine, who are the best doctors?
          • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @05:09AM

            by Anonymous Coward on Thursday March 12 2020, @05:09AM (#970112)

            Wrong! Specs are no less important than machine code implementing them, and their native tongue is not binary.

          • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @07:32AM (2 children)

            by Anonymous Coward on Thursday March 12 2020, @07:32AM (#970142)

            I'm so old, I remember why most of the RFCs specify data transfer in octets instead of bytes or another unit: because bytes were different sizes on different machines. And "nybble" has a "y" in it, heretics!

            • (Score: 2) by turgid on Thursday March 12 2020, @10:18AM (1 child)

              by turgid (4318) Subscriber Badge on Thursday March 12 2020, @10:18AM (#970174) Journal
              • (Score: 0) by Anonymous Coward on Friday March 13 2020, @06:31AM

                by Anonymous Coward on Friday March 13 2020, @06:31AM (#970575)

                That comment divides all Soylentils into two categories. 1) Those who think you are old because they know what that comment is referencing. 2) Those who think you are old because you had a stroke when writing that comment.

          • (Score: 2) by maxwell demon on Thursday March 12 2020, @08:31AM (4 children)

            by maxwell demon (1608) on Thursday March 12 2020, @08:31AM (#970156) Journal

            Binary is not a language. Maybe you are thinking of machine language? But then, there is not one machine language; the x86 machine language is different from the ARM machine language, for example. So still no language to rule them all, just a family of languages.

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 2) by Arik on Thursday March 12 2020, @09:18AM (3 children)

              by Arik (4543) on Thursday March 12 2020, @09:18AM (#970165) Journal
              Fundamentally it's the language of mathematics. Everything is a number. Yes, there are dialects of machine language, but they're all remarkably similar. If you're writing in anything else, it has to be translated into machine language before it can run.
              --
              If laughter is the best medicine, who are the best doctors?
              • (Score: 2) by quietus on Thursday March 12 2020, @10:32AM (1 child)

                by quietus (6328) on Thursday March 12 2020, @10:32AM (#970175) Journal
                Everything is an approximation of a number. Now go get a copy of Introduction to Applied Numerical Analysis (Richard W. Hamming, Dover Publications, 1989).
                • (Score: 2) by Arik on Friday March 13 2020, @08:58PM

                  by Arik (4543) on Friday March 13 2020, @08:58PM (#970861) Journal
                  No, in the natural world you could make that argument, but not inside a computer.

                  It's 1 or it's 0, or at most the third option is there's an error and we back up and try again.
                  --
                  If laughter is the best medicine, who are the best doctors?
              • (Score: 2) by JoeMerchant on Thursday March 12 2020, @12:56PM

                by JoeMerchant (3937) on Thursday March 12 2020, @12:56PM (#970199)

                Binary is more the phonemes of computer control - and it doesn't cover analog, or trinary, or other systems.

                Yes, there are dialects of machine language, but they're all remarkably similar.

                That's not even close to what I would call true... 6502/6809 are somewhat related, but they're pretty far from 8088, which itself bears almost no practical resemblance to modern Core iX machine code. The only thing they all have in common is some degree of simplicity / closeness to the hardware, a lack of abstraction.

                If you're writing in anything else, it has to be translated into machine language before it can run.

                This is also true of human spoken / written language, the difference being: we don't understand how wetware machine language works.

                --
                🌻🌻 [google.com]
          • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @01:41PM

            by Anonymous Coward on Thursday March 12 2020, @01:41PM (#970208)

            "Binary" isn't a language.

        • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @06:11PM

          by Anonymous Coward on Thursday March 12 2020, @06:11PM (#970326)

          I tend to try to use whichever one I have in my toolbox that best fits the use case

          The thing is, the tool that best fits the use case is more often than not the tool you understand the best. So "language A is better for case X than language B" really just comes down to "I know how to do this in language A, but I don't know how to do it in language B."
          At an academic level an expert in the field may be able to recommend one language over another. But in the real world, it rarely if ever matters.

    • (Score: 3, Insightful) by Anonymous Coward on Thursday March 12 2020, @02:41AM (1 child)

      by Anonymous Coward on Thursday March 12 2020, @02:41AM (#970034)

      You can't go a week without somebody predicting perl's doom. But there is no other language that will let you do the crazy shit perl will let you do. Not that you should do it. But you can.

      The story behind perl hate is almost universally this:

      Boss ask tech to fix. Tech get tired of fixing, he slap a perl one liner on it. Boss notices tech no longer contemplating hari-kari. Boss tell tech to fix other. Tech extends perl. Five years later, one liner is company wide ERP system that saves million of shinys.

      Boss tells tech to make big tittied naked women fall from the sky. Tech finally says: "Yeah, that needs a ground up rewrite with a proper development cycle". Boss wonder why tech no longer love him? Cry's quietly in a corner, then become angry. "Fucking tech! Do what I want!". Tech says: "No. It doesn't work that way." Boss fires tech. Boss's boss asks to fix something. Boss cannot. Boss no program. Boss is sad. Boss reads article that say: "Python so easy!". Boss read 5 lines of Python and think that programming. Boss hires tech, says "python gud!". Tech says, "Yeah, a whole team needed for that.". Boss pays ten times as much to do it in Python. Boss tell everyone: "perl is for assholes! See! Me did it in python!" Boss fires four out of five programmers, and locks the last one in a cage with bread and water to maintain code. Boss gets promotion and new company Wooly Mammoth.

      If you like it dirty, perl. If you want it to scale, perl. If you want it efficient, perl. If you want the programmer to be an easily replaceable commodity: python.

      • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @04:21AM

        by Anonymous Coward on Thursday March 12 2020, @04:21AM (#970090)

        With that being a rare commodity, the bitchfest from those afflicted by sour grapes never ceases and never will.

    • (Score: 2) by edIII on Thursday March 12 2020, @11:00PM (2 children)

      by edIII (791) on Thursday March 12 2020, @11:00PM (#970436)

      Perl isn't fading into the background so much as it simply is in the background, the way the BSDs are. It seems to me that Perl is a very big thing in OpenBSD. That's why I laughed when Joe said "a lot of legacy code". Perl is on the front line for a great many people, including myself.

      I'll admit, I don't know what this Perl 6 BS is actually about, but Perl 5 is perfectly adequate for modern tasks. There are some pretty nice enterprise frameworks based on Perl, like Catalyst, and there is Mojolicious (which I took to be Catalyst Light).

      There are a lot of people using Perl for modern use cases with varying complexity and requirements. Not on legacy equipment, but on cutting-edge embedded systems. To say it's a dying language is amusing.

      --
      Technically, lunchtime is at any moment. It's just a wave function.
      • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @11:30PM

        by Anonymous Coward on Thursday March 12 2020, @11:30PM (#970454)

        Yup, at Intel in the hardware engineering groups, Perl is the most commonly used scripting language, with Tcl being second since all the EDA tools have Tcl parsers.

      • (Score: 1) by sorpigal on Friday March 13 2020, @12:15AM

        by sorpigal (6061) on Friday March 13 2020, @12:15AM (#970473)

        First, Perl6 is now called Raku (renamed last year to hopefully end the confusion). Consider it to be a language riffing on Perl sensibilities, but for the modern age and the future. If there's a programming paradigm or language feature that has come out of academia in the last 30 years and is useful to anyone, Raku implements it--but it doesn't just throw everything into a big pile and say "there you go." Instead, the designers seem to have looked at all of the different features out there and broken them down into primitives that they then used to build up to every feature you could possibly imagine. The net effect of this is that everything works with everything else, and if you dig down far enough, much like the root system of a vast interconnected mushroom, everything is connected. The worst and best part of Raku is when you're trying to figure out how to do something, take a trip to the documentation, and wind up staring into the abyss of depths that are available for you to plumb. I expect that this will, with time and experience, happen less often (but not, alas, for me as yet).

        Perl5, or (now) just Perl, meanwhile has continued to evolve--and its evolution is accelerating. The primary reason, I think, for the loss of Perl popularity was that at a critical time in the explosion of the web (late 1999 to maybe about 2006) there was nearly zero work on the language. This is now dramatically better and improvements are being made regularly again, sometimes borrowing bits and pieces of the good ideas from Raku. If you haven't looked at Damian Conway's Dios.pm or his Regexp::Debugger, for example, you're missing out on modern Perl at its best.

        Perl may not get the press any more, apart from unfortunate inclusions in articles like this, but it's an exciting time for the language... and for Raku, too.

  • (Score: 3, Insightful) by zocalo on Wednesday March 11 2020, @10:41PM (2 children)

    by zocalo (302) on Wednesday March 11 2020, @10:41PM (#969902)
    Are VB and VBA really the same thing though? VBA definitely isn't going anywhere anytime soon, but standalone VB is almost certainly done.
    Python seems to have supplanted Perl for much of the backend/admin scripting of Linux etc., so possibly, although there's still a LOT of Perl in the traditional LAMP stacks that is going to be around for a long time, not to mention extremely popular tools like SpamAssassin that use it. Have to disagree with TFA on that one.
    In two minds on COBOL - yes, there was a brief resurgence for updating legacy code for Y2K and a good deal of that code is probably still chugging along, but surely, *surely*, we're not going to have another do-over in 2038?

    Similar situation on the rest; I knew CoffeeScript had something to do with Java, but thought it was probably another Apple thing (mixed up with Cocoa, I suspect), and that Scala has been losing traction for a while. I did some recursive coding in Lisp at university but found Prolog much easier to work with and only used Lisp when I had to. But then again, I'm not an Emacs fan, and I think that uses Lisp for scripting, in which case I suspect it's going to linger around a LOT longer than the author of TFA might expect.
    --
    UNIX? They're not even circumcised! Savages!
    • (Score: 2) by meustrus on Thursday March 12 2020, @04:09AM (1 child)

      by meustrus (4961) on Thursday March 12 2020, @04:09AM (#970089)

      CoffeeScript has to do with JavaScript. I remember it existing over 10 years ago, well before JavaScript really exploded and started evolving in a more standardized manner. These days, just use ECMAScript 2018 or whatever the latest version is, or TypeScript if you don't hate yourself.

      Scala is within my domain, and I can confidently say don't use it for production code. It can be fun to mess around in if you like type system logic puzzles, but its underlying premise - mix pure functions with side-effecting, exception-throwing code in whatever way works for you - is fatally flawed. Maybe Dotty (Scala 3) will be better, but it's been coming for 8+ years now and it still isn't giving up this ludicrous idea that you can safely ignore runtime exceptions even when accepting first-class functions that have no meaningful way to prevent runtime exceptions. Also, most people don't like type system logic puzzles.

      --
      If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
      • (Score: 2) by zocalo on Thursday March 12 2020, @08:52AM

        by zocalo (302) on Thursday March 12 2020, @08:52AM (#970159)
        Yep, I looked it up before posting - that's what made me think I'd probably confused it with Apple's Cocoa - bit of a "duh!" moment; Java is a kind of coffee, and after that the "Script" part should have been obvious really. I suspect I probably had a better idea what CoffeeScript was 10 years ago, but it's not something I've ever used, so things have definitely got fuzzy since then; I'm going to blame it on age.

        I've read a few articles on Scala, amongst lots of other niche/obscure languages, since they feature in a monthly magazine I subscribe to, and while it sounds like it would absolutely appeal to certain types of coder, I'm definitely not one of those any more. I was pretty good at low-level coding in assembly and C back in the day - including all those ugly hacks that would have people running for the hills screaming in terror these days - e.g. I'd have no problem with self-modifying assembly to squeeze all the code into a few KB or less - but these days my coding is limited mostly to hacking out a bit of script to fix some problem or automate some repetitive task.

        And probably still sending people running for the hills, screaming in terror... :)
        --
        UNIX? They're not even circumcised! Savages!
  • (Score: 2) by HiThere on Wednesday March 11 2020, @11:02PM (7 children)

    by HiThere (866) Subscriber Badge on Wednesday March 11 2020, @11:02PM (#969914) Journal

    Sorry, but Objective-C was not developed by Apple and is not a property of Apple. Objective-C was an excellent language with lousy documentation. If you didn't want to use the Apple fork, you basically needed to know someone who already knew the language in order to learn it. It had its problems, but they weren't serious... except for the lousy documentation.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 2) by JoeMerchant on Wednesday March 11 2020, @11:21PM (2 children)

      by JoeMerchant (3937) on Wednesday March 11 2020, @11:21PM (#969921)

      Sorry, but Objective C was not developed by and not a property of Apple.

      Nothing to be sorry about - Apple neither developed nor owned the Linux stack that OS-X was based on, either. The only application I ever found for Objective-C in the "Real World" was to play in Apple's walled garden; even if the soil was Linux and the flora coded in Objective-C, it was clearly Apple's space - and like any language, it's the API of the libraries that you're learning more than the language. In 2006 it didn't take much business acumen to choose Qt/C++ over Carbon/Cocoa/Objective-C as the place to put our development efforts, and that's a decision I've never regretted - even if it did piss off one petulant little fanboi that I worked for. I certainly didn't want to hitch my wagon to his train.

      --
      🌻🌻 [google.com]
      • (Score: 5, Touché) by Thexalon on Wednesday March 11 2020, @11:42PM (1 child)

        by Thexalon (636) on Wednesday March 11 2020, @11:42PM (#969927)

        Nothing to be sorry about - neither did Apple develop nor own the Linux stack that OS-X was based on.

        I think you misspelled "BSD".

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 2) by DannyB on Thursday March 12 2020, @02:57AM (3 children)

      by DannyB (5839) Subscriber Badge on Thursday March 12 2020, @02:57AM (#970049) Journal

      The issue with Objective-C is not the language or who created it, but rather the rise of Swift replacing it in Apple products. Thus the majority of Objective-C use may disappear. Or not. Who knows.

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 2) by HiThere on Thursday March 12 2020, @04:27PM (2 children)

        by HiThere (866) Subscriber Badge on Thursday March 12 2020, @04:27PM (#970276) Journal

        If you think of languages as a popularity contest, then OK. If you think of them as a useful toolkit, then that's almost irrelevant. (Not quite, as libraries, etc., tend to get developed for popular languages.) I tend to think of them as toolkits, and I've been profoundly disgusted recently because the toolkits aren't getting properly filled.

        It was actually easier to do development of GUI programs a couple of decades ago. The advantage of modern languages is that they have the opportunity to handle Unicode well. Few do. There's no reason for 16-bit Unicode except to communicate with system calls that demand it; everything else should be either UTF-8 or UTF-32 (which is actually 24 bits). Go, Vala, D, and Python handle it well. Perhaps Ruby. C and C++ are atrocious. Java is locked into a standard that was stupid when they developed it: no unsigned integers, no compile-time constants, and 16-bit Unicode. (OK, at the time they issued the first standard, 16-bit Unicode was an improvement. They didn't, however, change.)
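
        The 16-bit problem in one small illustration (plain Python 3, nothing assumed beyond a string with one character outside the Basic Multilingual Plane): 16-bit units are variable-length anyway once surrogate pairs show up, so you get the drawbacks of UTF-8 without its compactness, and the drawbacks of UTF-32 without its one-unit-per-code-point simplicity.

          s = "naïve 🐍"                          # 7 code points, one of them outside the BMP

          print(len(s))                           # 7 code points
          print(len(s.encode("utf-8")))           # 11 bytes: ASCII is 1 byte each, ï is 2, 🐍 is 4
          print(len(s.encode("utf-16-le")) // 2)  # 8 16-bit units: 🐍 needs a surrogate pair
          print(len(s.encode("utf-32-le")) // 4)  # 7 32-bit units: exactly one per code point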

        Then there's concurrent execution on multi-processors. Here LISP should have led the pack, as it already had a purely functional subset. But they didn't. Even Racket Scheme, which had the best development of the Lisp derivatives prior to Clojure, never came up with a decent implementation. And Clojure decided to be "purely functional" and not allow any variables to be changed, which makes it a bad choice in lots of situations. Even Erlang handled that better, having local hash tables that were changeable, and built-in databases that were globally changeable. The right idea, but the correct answer would be that strictly local variables were changeable, and only arguments to function calls were immutable. Go's channels are a reasonable intermediate step, but Go doesn't support desktop graphics in any reasonable sense of the term. It still currently looks like the best answer for a local GUI program that also has to handle concurrency and Unicode... which is an atrocious statement of the current state of affairs. (The GUI has to be exported to C routines... cairo, gtk, etc. on my system.) I did seriously consider C++/Qt, but garbage collection problems decided the issue in favor of Go. Qt has better graphics, but its Unicode system is based around 16-bit Unicode, not at all what I want. And the garbage collection/allocation gets too complicated between threads.

        So OK, a couple of decades ago I wouldn't have worried about multi-processor systems, and 16-bit unicode didn't look so bad. But it looks bad today.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 2) by DannyB on Thursday March 12 2020, @05:39PM (1 child)

          by DannyB (5839) Subscriber Badge on Thursday March 12 2020, @05:39PM (#970308) Journal

          Languages are not a popularity contest. However, the popularity of languages is a factor in career planning.

          I would have loved for Lisp to have much bigger prominence among languages. Being purely functional is not strictly a requirement to do parallel programming well; I would argue that immutability is at least as important. I enjoyed Clojure for hobby playing for a few years. The pure functional style and immutability did take some getting used to. I could still fall back on an iterative, non-functional style with recur. Only recently have I started playing with Racket (and also wxMaxima), and I find I miss some things from Clojure. I haven't tried Erlang yet. I might.

          Clojure does have mutable versions of, at least, arrays. In my playing, I wrote a prime sieve function that took two arguments describing a range of integers to sieve over. Say from 1 trillion to 1 trillion + 1 million. That would find all the primes in a 1-million-long range starting at 1 trillion and return them as an immutable list. The local array was mutable to make the sieve easy to implement. That was when I realized that some local mutability can be useful as long as the mutability never extends outside of a single-threaded function.
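
          The same shape in Python, as a rough sketch (not the original Clojure; the function name is made up): a locally mutable byte array drives a segmented sieve, and only an immutable tuple ever escapes the function.

            def primes_in_range(lo, hi):
                """Return every prime p with lo <= p < hi as an immutable tuple."""
                # Base primes up to sqrt(hi), via an ordinary sieve (mutable, but strictly local).
                limit = int(hi ** 0.5) + 1
                base = bytearray([1]) * (limit + 1)
                base[0:2] = b"\x00\x00"
                for p in range(2, int(limit ** 0.5) + 1):
                    if base[p]:
                        base[p * p :: p] = bytearray(len(base[p * p :: p]))
                base_primes = [p for p in range(2, limit + 1) if base[p]]

                # Sieve just the [lo, hi) window; again mutable, again local.
                window = bytearray([1]) * (hi - lo)
                for p in base_primes:
                    start = max(p * p, ((lo + p - 1) // p) * p)  # first multiple of p in the window
                    window[start - lo :: p] = bytearray(len(window[start - lo :: p]))
                return tuple(lo + i for i in range(hi - lo) if window[i] and lo + i > 1)

            # primes_in_range(10**12, 10**12 + 10**6) mirrors the example above.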

          In the opposite style from Java with its "final" modifier, I would prefer all local variables to be final (i.e. unchangeable) by default, with the programmer having to specify changeability for the variables that need it. My Java IDE automatically marks all possible variables as final (a customizable setting), and I have been astonished over the last decade at just how few variables actually need to be variable.

          I haven't tried Go channels nor Clojure's core.async.

          I've built GUIs in a number of different ways over the decades. Back in the day, classic Mac OS was difficult compared to today. I find Java Swing to be remarkably good overall, but not without its warts; it's just that the positive experiences outweigh the drawbacks. And being pure Java, it's all GC. The UI has pluggable look and feel. It's cross-platform portable. There are extensions to use desktop-specific features of various platforms. If I do another GUI project I think I'll try JavaFX.

          If I ever were to try a portable GUI in any of Golang, Python, D or something else, should I use Qt or wxWidgets or Gtk? I think I lean towards either wxWidgets or Gtk. But in my day job, I build web apps.

          --
          The lower I set my standards the more accomplishments I have.
          • (Score: 2) by HiThere on Thursday March 12 2020, @05:51PM

            by HiThere (866) Subscriber Badge on Thursday March 12 2020, @05:51PM (#970316) Journal

            With C++, pick wxWidgets or Qt. Qt has a better appearance in my opinion, but wxWidgets is easier to use. In C I'd grit my teeth and pick GTK. From Go, GTK is the only reasonable choice.

            And note that it depends a LOT on what you're doing. Some things are easy in one toolkit and hard in another, like selecting a line of text from a displayed block of text (none of them make that easy, unless you identify a line as "something that ends with a carriage return" rather than "the stuff displayed on one line of this variably sized text control"). GTK uses markers to say when you're highlighting a word, wxWidgets allows you to use HTML markup, and Qt assumes you're going to be using HTML markup.

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 2) by arslan on Thursday March 12 2020, @01:49AM (5 children)

    by arslan (3462) on Thursday March 12 2020, @01:49AM (#969998)

    You can write JavaScript now instead of VBA for Excel. It's not as pervasive yet, but I imagine the next generation, which is growing up with the internet and more familiar with JS, will likely make it happen. Excel ain't going anywhere, unfortunately.

    • (Score: 1) by Ethanol-fueled on Thursday March 12 2020, @02:28AM (3 children)

      by Ethanol-fueled (2792) on Thursday March 12 2020, @02:28AM (#970025) Homepage

      VB is here to stay in industry because of all the legacy code and unwillingness of middle managers to waste pennies from their bonuses.

      And that's not even counting the insane number of VBA macros in important documents. Even new and expensive toys like rate tables with 4 decimal points of a degree accuracy use certain dialects of VB.

      • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @04:50AM (1 child)

        by Anonymous Coward on Thursday March 12 2020, @04:50AM (#970108)

        Units problem?
        > rate tables with 4 decimal points of a degree accuracy
        Wouldn't that be:
          rate tables with 4 decimal points of a degree/second accuracy

        Years ago I wanted to run a check cal (not precision) on a small rate gyro. I used a high-quality turntable; 33 rpm (198 deg/sec) was right in the middle of the rate range I needed. Record a couple of revs, then lift the gyro off the turntable before the wire wound up too much.

      • (Score: 2) by HiThere on Thursday March 12 2020, @05:54PM

        by HiThere (866) Subscriber Badge on Thursday March 12 2020, @05:54PM (#970318) Journal

        And if they use VBA I doubt their accuracy. This is unfair, because it's based on Access Basic of a couple of decades ago, but that stupid language would occasionally give different answers for the same input, and I haven't trusted a Microsoft Basic since then.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 0) by Anonymous Coward on Thursday March 12 2020, @08:27AM

      by Anonymous Coward on Thursday March 12 2020, @08:27AM (#970153)

      omfg lul what?

  • (Score: 2) by driverless on Thursday March 12 2020, @08:02AM

    by driverless (4770) on Thursday March 12 2020, @08:02AM (#970146)

    Why isn't MUMPS on that list? MUMPS should actually have been added the minute it was released.

  • (Score: 4, Informative) by TheRaven on Thursday March 12 2020, @03:26PM (1 child)

    by TheRaven (270) on Thursday March 12 2020, @03:26PM (#970252) Journal

    Objective-C - About time is all I can say - ultra-transparent lock-in efforts by Apple, Inc. notwithstanding, what did Objective-C ever bring to the party?

    I am the author / maintainer of the GNUstep Objective-C runtime (used in a bunch of Android apps, in WinObjC, and a few other places, including some quite surprising ones, such as some widely deployed SS7 routing stuff). I also maintain Objective-C support for non-Apple platforms, and have written a couple of books about the language, so I'll have a go at this one:

    Objective-C is a great language for mid-90s computers. It provides late-binding and makes it easy to create stable ABIs for libraries. The original goal of Objective-C was as a language for packaging C libraries for use by high-level languages. It's still pretty good at that. The rich reflection means that it's trivial to write generic Objective-C bridges for scripting languages. It was a compromise between Smalltalk and C. C-level performance where you need it, but Smalltalk levels of abstraction (and, sadly, performance) for high-level abstractions.

    Objective-C++ improved this a little bit by filling in the gap. C arrays are fast, Objective-C arrays are very slow but safe, std::vector can be safe and fast. With ARC, you can store Objective-C objects in C++ containers without having to think about memory management. For single-threaded programs, it's a pretty nice world.

    Objective-C is not a great language for the 2020s. ARC is not memory safe in the presence of concurrent mutation. As with std::shared_ptr, it is safe to concurrently update the refcounts of the same object, but not safe to update the same pointer from two threads (e.g. a global or a field in an object reachable from two threads). It doesn't have a concept of deep immutability, so there's no good way of creating a data structure that you can statically guarantee is safe to share between threads. It doesn't have a concept of move semantics or linear types, so there's no good way of guaranteeing that mutable data is passed (not shared) between threads. It has a single global namespace for the entire program, so there is no way of building programs in it that have any notion of sandboxing other than as two separate processes that communicate via mechanisms that are opaque to the compiler. It does not guarantee memory safety, which means that it can't benefit from a load of the optimisations available to higher-level languages, but also can't benefit from a load of the optimisations available to lower-level languages.

    The problem with the Apple ecosystem is that Swift inherits all of these limitations and brings nothing new other than different syntax. It does not address any of the fundamental limitations of Objective-C and inherits a PDP-11-like abstract machine, in a world where modern computers look nothing like PDP-11s.

    --
    sudo mod me up
    • (Score: 2) by bussdriver on Friday March 13 2020, @06:46AM

      by bussdriver (6876) Subscriber Badge on Friday March 13 2020, @06:46AM (#970579)

      Objective-C needed a team evolving it over time, like C++ and others had. COBOL doesn't look like it did MANY decades ago; even it evolved a bit. Objective-C should be more like C++ in that respect, with changes to fit today's needs, even if some breakage happens. I read a history book on the development of C++ and didn't agree with many of the design choices, which seemed to be exactly where Objective-C went the other direction.

      Swift I don't know, other than I heard it was an evolution of Objective-C with a bunch of syntax games that were probably not needed. It seemed to me from a glance that they were trying to get some kind of hybrid between compiled and scripted languages. A worthy goal to attempt, as we do so much more in scripting languages today; it sure would be nice if scripting could compile into something much faster while still being easy to develop quickly. I won't look into Swift until it's more free from Apple's grip.

      Objective-C I dabbled in and loved, but the lack of backers outside of Apple discouraged me from investing more effort. Had I known about GNUstep 20 years ago, I might have gone into it much more.

  • (Score: 2) by bussdriver on Friday March 13 2020, @06:28AM (1 child)

    by bussdriver (6876) Subscriber Badge on Friday March 13 2020, @06:28AM (#970574)

    Objective-C is open source, but it's rarely used and that is the problem. It is nearly as old as C++, but it made many better design decisions. I have done both, and to me Objective-C is clearly better than C++. Nowadays you can mix the two languages and use the best of both worlds, which is cool stuff.

    I wish Objective-C had taken off instead of C++, but ultimately it's the people controlling the language that decide its long-term success. Adding flexible OOP to C without obsessing over speed was the way to go, but over-application of OOP to everything was the fad, so crazy measures to keep up speed seemed more important than they were. You could get that extra speed when you needed it by just using C.

    Apple seems to be the primary driver and they are migrating away. It's a shame, because if a C guru and an OOP guru came up with a solution together, it would look a lot like Objective-C and not C++. Not that some C++ concepts weren't a great help to the compiler; it had many great ideas.

    CoffeeScript was clever; it relates to Objective-C. I'd think it should be doing better, since its use case is a popular one today and it's mature and well designed for the niche some people want to go into with other fad tools right now.

    • (Score: 2) by JoeMerchant on Friday March 13 2020, @01:03PM

      by JoeMerchant (3937) on Friday March 13 2020, @01:03PM (#970672)

      I'm not knocking the academic/technical/aesthetic merits of Objective-C, just its real-world applicability. Betamax was clearly better than VHS, Blu-ray is better than DVD, but what's it worth if the content in the format is lacking, or difficult to access? Like in spoken languages, Esperanto may have all kinds of academic/technical/aesthetic merits over whatever other languages you care to compare it with, but how useful is it anywhere other than at a special gathering of Esperanto speakers, and how useful are such gatherings? Lack of APIs, lack of code base, lack of "speakers" or available developer experience, even lack of Stack Overflow and similar problem-solving resources on Google, are much more important than any particular merits/demerits of a language. If you have a "need for speed" there are almost always ways to get that in any language - though for scripting languages like Python the way to get that speed is to leave the language altogether and build your speed-critical modules in something else, like C/C++, so I have a hard time getting on board with the "Python is great" crowd when such a critical part of the language is learning how to stitch together disparate libraries (often built in languages you don't understand) with the inevitable bag of snakes that is configuration management of the complete product.

      In 1996 we converted an application that we had been developing (more like organically growing) in C since 1990. The core of the application was a set of data processing blocks (filters, feature extractors, etc.) that layered on top of one another, generally about 6 layers deep from the source data to what was presented to the user as graphs. The C++ translation did indeed fully embrace object-oriented design, and the running speed hit was about 50% vs. the C implementation. However, this was ~1996 - it took about a year to port 90% of the old app from C to C++, and during that year new PCs gained at least 2x speed-per-dollar performance, so the "new stuff" really didn't run slower than the "old stuff" in absolute terms. The thing of value that did accelerate greatly was speed of development. With the object-oriented design, a new idea could be translated into running code about 4x faster than with the C version, on average. This may have been more attributable to the fact that the C++ app was "architected" with 6 years of experience, whereas the C app grew organically, like a coral reef, as each new idea layered on top of the existing code, which started growing from a tiny seed that had no idea what it would become. The real motivation for using C++ wasn't the app's core "value add" unique IP; the real motivation was that the Windowing UI API was all written in C++. To do that exercise all over again (and waste yet another man-year on re-development) we _might_ have gained back the 50% speed lost during the migration to C++ by keeping our compute core in straight C, or... it's possible that it was just the early C++ compilers that didn't optimize as well as the more mature C compilers. Back when we started, the C++ compilers were straight up too buggy to use, which is what drove us to implement in C in the first place.

      --
      🌻🌻 [google.com]