
posted by hubie on Thursday July 07 2022, @09:53PM   Printer-friendly
from the agile-SNOBOL-FTW dept.

Over at ACM.org, Doug Meil posits that programming languages are often designed with certain tasks or workloads in mind, and in that sense most languages differ less in what they make possible, and more in terms of what they make easy:

I had the opportunity to visit the Computer History Museum in Mountain View, CA, a few years ago. It's a terrific museum, and among the many exhibits is a wall-size graph of the evolution of programming languages. This graph is so big that anyone who has ever written "Hello World" in anything has the urge to stick their nose against the wall and search section by section to try to find their favorite languages. I certainly did. The next instinct is to trace the "influenced" edges of the graph with their index finger backwards in time. Or forwards, depending on how old the languages happen to be.

[...] There is so much that can be taken for granted in computing today. Back in the early days everything was expensive and limited: storage, memory, and processing power. People had to walk uphill and against the wind, both ways, just to get to the computer lab, and then stay up all night to get computer time. One thing that was easier during that time was that the programming language namespace was greenfield, and initial ones from the 1950's and 1960's had the luxury of being named precisely for the thing they did: FORTRAN (Formula Translator), COBOL (Common Business Oriented Language), BASIC (Beginner's All-purpose Symbolic Instruction Code), ALGOL (Algorithmic Language), LISP (List Processor). Most people probably haven't heard of SNOBOL (String Oriented and Symbolic Language, 1962), but one doesn't need many guesses to determine what it was trying to do. Had object-oriented programming concepts been more fully understood during that time, it's possible we would be coding in something like "OBJOL" —an unambiguously named object-oriented language, at least by naming patterns of the era.

It's worth noting and admiring the audacity of PL/I (1964), which was aiming to be that "one good programming language." The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's.

The author goes on to reason that new languages are mostly created for control and fortune, citing Microsoft's C#, their answer to Java, as an example of a middleware language they could control.

Related:
Non-Programmers are Building More of the World's Software
Twist: MIT's New Programming Language for Quantum Computing
10 Most(ly dead) Influential Programming Languages


Original Submission

Related Stories

10 Most(ly dead) Influential Programming Languages 39 comments

10 Most(ly dead) Influential Programming Languages:

The other day I read 20 most significant programming languages in history, a "preposterous table I just made up." He certainly got preposterous right: he lists Go as "most significant" but not ALGOL, Smalltalk, or ML. He also leaves off Pascal because it's "mostly dead". Preposterous! That defeats the whole point of what "significant in history" means.

So let's talk about some "mostly dead" languages and why they matter so much.

Disclaimer: Yeah not all of these are dead and not all of these are forgotten. Like most people have heard of Smalltalk, right? Also there's probably like a billion mistakes in this, because when you're doing a survey of 60 years of computing history you're gonna get some things wrong. Feel free to yell at me if you see anything!

Disclaimer 2: Yeah I know some of these are "first to invent" and others are "first to popularize". History is complicated!

<no-sarcasm>
If there were one perfect language we would all be using it already.
</no-sarcasm>

Recently:
(2020-03-11) Top 7 Dying Programming Languages to Avoid Studying in 2019-2020


Original Submission

Twist: MIT’s New Programming Language for Quantum Computing 4 comments

Twist: MIT's New Programming Language for Quantum Computing:

Quantum computing. Unlike traditional computers that use bits, quantum computers use qubits to encode information as zeros or ones, or both at the same time. Coupled with a cocktail of forces from quantum physics, these refrigerator-sized machines can process a whole lot of information — but they're far from flawless. Just like our regular computers, we need to have the right programming languages to properly compute on quantum computers.

Programming quantum computers requires awareness of something called "entanglement," a computational multiplier for qubits of sorts, which translates to a lot of power. When two qubits are entangled, actions on one qubit can change the value of the other, even when they are physically separated, giving rise to Einstein's characterization of "spooky action at a distance." But that potency is equal parts a source of weakness. When programming, discarding one qubit without being mindful of its entanglement with another qubit can destroy the data stored in the other, jeopardizing the correctness of the program.

Scientists from MIT's Computer Science and Artificial Intelligence (CSAIL) aimed to do some unraveling by creating their own programming language for quantum computing called Twist. Twist can describe and verify which pieces of data are entangled in a quantum program, through a language a classical programmer can understand. The language uses a concept called purity, which enforces the absence of entanglement and results in more intuitive programs, with ideally fewer bugs. For example, a programmer can use Twist to say that the temporary data generated as garbage by a program is not entangled with the program's answer, making it safe to throw away.

Non-Programmers are Building More of the World's Software 62 comments

Nonprogrammers are building more of the world's software: A computer scientist explains 'no-code':

Traditional computer programming has a steep learning curve that requires learning a programming language, for example C/C++, Java or Python, just to build a simple application such as a calculator or Tic-tac-toe game. Programming also requires substantial debugging skills, which easily frustrates new learners. The study time, effort and experience needed often stop nonprogrammers from making software from scratch.

No-code is a way to program websites, mobile apps and games without using codes or scripts, or sets of commands. People readily learn from visual cues, which led to the development of "what you see is what you get" (WYSIWYG) document and multimedia editors as early as the 1970s. WYSIWYG editors allow you to work in a document as it appears in finished form. The concept was extended to software development in the 1990s.

There are many no-code development platforms that allow both programmers and nonprogrammers to create software through drag-and-drop graphical user interfaces instead of traditional line-by-line coding. For example, a user can drag a label and drop it to a website. The no-code platform will show how the label looks and create the corresponding HTML code. No-code development platforms generally offer templates or modules that allow anyone to build apps.

[...] There are many current no-code website-building platforms such as Bubble, Wix, WordPress and GoogleSites that overcome the shortcomings of the early no-code website builders. Bubble allows users to design the interface by defining a workflow. A workflow is a series of actions triggered by an event. For instance, when a user clicks on the save button (the event), the current game status is saved to a file (the series of actions).

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by looorg on Thursday July 07 2022, @10:31PM (38 children)

    by looorg (578) on Thursday July 07 2022, @10:31PM (#1258775)

    Why? Everyone thinks they can invent a better language, one that fixes that ONE (or more) annoying thing that all (or some of) the others do, or one that includes the best parts of umpteen other languages into one super-meta-language. It's a self-sustaining problem that will never be solved. Somehow the more new languages there are, the more problems there are and the more shit they appear to become. There is always another one just around the corner that promises to fix all and be the one. There can be only one.

    It's either that or it's so the ransomware people in that other post somewhat below this one can update their malware to the new shiny language that increases malware performance ...

    • (Score: 3, Insightful) by gringer on Thursday July 07 2022, @10:41PM (1 child)

      by gringer (962) on Thursday July 07 2022, @10:41PM (#1258776)

      Relevant XKCD:

      https://xkcd.com/927/ [xkcd.com]

      --
      Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
      • (Score: 2) by Freeman on Thursday July 07 2022, @10:45PM

        by Freeman (732) Subscriber Badge on Thursday July 07 2022, @10:45PM (#1258777) Journal

        Yep, that's it in a nutshell.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 5, Insightful) by JoeMerchant on Friday July 08 2022, @01:20AM (35 children)

      by JoeMerchant (3937) on Friday July 08 2022, @01:20AM (#1258802)

      Or, you can just fix the problems with an updated API on top of C++...

      Need garbage collection? There's a library for that.

      Need type-free variables? Try a variant class.

      and on and on and on and on...

      Or we can all start watching our indentation so we can call it Python (or just use a style Nazi on C++). https://bitbucket.org/verateam/vera/wiki/Introduction [bitbucket.org]

      Or we can force ourselves into functional paradigms and call it Erlang (or just use functional paradigms in C++). https://docs.microsoft.com/en-us/archive/msdn-magazine/2012/august/c-functional-style-programming-in-c [microsoft.com]

      In the end, most new languages are themselves coded in C++... the only thing that makes C++ "too hard to use safely" is programmers in too much of a hurry to enforce safe practices in their coding, and there are more and more static analysis tools every day that will warn you about all kinds of unsafe practices - even some perfectly safe ones, if you practice them correctly.

      --
      Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
      • (Score: 2) by RamiK on Friday July 08 2022, @10:44AM (21 children)

        by RamiK (1813) on Friday July 08 2022, @10:44AM (#1258857)

        The purpose of programming languages is to simplify assembly/machine code via abstractions. C++ provides so many incompatible abstractions with shitty syntax and backwards grammar that it fails to simplify much of anything. And when all of that meshes across different libraries, it all just breaks spectacularly when it comes time to actually understanding what the hell is going on.

        In the end, most new languages are themselves coded in C++

        The fact that you have so many competent specialists across different problem-spaces all looking at this and that and concluding that the issue is better off handled with a different language goes to show you there really isn't a single legitimate use-case where C++ can do the job well enough that isn't entirely isolated and self-contained.

        --
        compiling...
        • (Score: 3, Interesting) by JoeMerchant on Friday July 08 2022, @06:35PM (20 children)

          by JoeMerchant (3937) on Friday July 08 2022, @06:35PM (#1258942)

          You sound like you think the majority of professionally employed programmers are competent. I find the opposite to be true, which is why all these training wheels and sandboxes have been provided for them to work in.

          No matter how "idiot proof" you make a development environment, there's always a more clever idiot that will come along and break things.

          gcc provides a layer of abstraction from machine language to a reasonably human readable language. Boost or Qt or your library of choice provides another layer of proven implementations of common constructs and tasks. There are plenty of "drag and drop codeless" software construction systems which are essentially apps written on libraries, and they have their place.

          All I see python providing is a way to stitch together libraries (mostly written in C or C++) in such a way as to recreate version management hell writ even larger than Microsoft ever managed to with dll hell.

          Go Rust Ruby and their faddy friends? There is a reason they come and go.

          --
          Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
          • (Score: 3, Insightful) by RamiK on Saturday July 09 2022, @05:33PM (19 children)

            by RamiK (1813) on Saturday July 09 2022, @05:33PM (#1259208)

            all these training wheels and sandboxes...gcc provides a layer of abstraction from machine language to a reasonably human readable language.

            Python is abstracting C++/C. C is abstracting assembly. Assembly abstracts machine code. Machine code abstracts microcode. Microcode abstracts logic gates. Logic gates abstract transistors. Transistors abstract electrical differential equations...

            Your "readable" is another specialist's training wheels. There's no clear line being crossed here. You can build a computer out of relay logic. You can do analog computing that out-performs anything done with transistors. And you can take the analog all the way down to quantum computing since that's basically what it all is.

            All I see python providing is a way to stitch together libraries

            This is precisely what I see C vs. C++ as: Just a (particularly bad) way of stitching together C code.

            Mind you, I think python 3 has gone too much multi-paradigm and is showing serious C++-like problems nowadays. However, despite its faults it's still pragmatic to use in many cases at least compared to Perl or Ruby.

            Go Rust Ruby and their faddy friends? There is a reason they come and go.

            I'll give you Ruby, mostly because Rails is coming and going all the time... But Rust has only now been accepted into the Linux kernel as a development language, while Go has similarly only recently (1.18 generics and 1.19 memory model) started to aim at growth beyond its well-carved back-end niche.

            Regardless, there's nothing wrong with languages coming and going. Like spoken languages, it's only natural for languages to reflect their population's migration from different domains. The problem begins when you're trying to stay backwards compatible... That leaves you talking pig Latin and writing Chinese script.

            --
            compiling...
            • (Score: 3, Insightful) by JoeMerchant on Sunday July 10 2022, @08:49PM (18 children)

              by JoeMerchant (3937) on Sunday July 10 2022, @08:49PM (#1259568)

              I don't view auto destruction upon scope exit as training wheels, but garbage collection is, and that is because scope exit is a precisely defined behavior whereas gc is a squishy "just don't worry your pretty little head about the details" implementation.

              Abstraction that results in predictable results because similar things are done the same way every time is simply good practice. Variables that guess their type for you? Training wheels.

              The thing I like least about Python is that it doesn't add any useful abstraction as compared with including a library, but when you ask it to do something like an RGB i8 to HSL f8 translation, it is dog slow (~100x slower than C++), forcing you to jump into C or Fortran or C++ to get something that simple done unless you can find a reliable implementation out there. But stitching together so much simple stuff is A) a significant management problem and B) a serious security issue.

              --
              Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
              • (Score: 2) by RamiK on Monday July 11 2022, @07:58AM (17 children)

                by RamiK (1813) on Monday July 11 2022, @07:58AM (#1259697)

                I don't view auto destruction upon scope exit as training wheels, but garbage collection is, and that is because scope exit is a precisely defined behavior whereas gc is a squishy "just don't worry your pretty little head about the details" implementation.

                Garbage collection is to the heap as automatic variables are to the stack. Sometimes it shares the scope rules. Sometimes it has its own rules. Either way, the lifetime of variables is always clearly defined in every language's specifications.

                Variables that guess their type for you? Training wheels.

                C does implicit type conversions all the time: https://www.scaler.com/topics/c/implicit-type-conversion-in-c/ [scaler.com]

                C++ templates and generics take it a step further.

                The thing I like least about Python is that it doesn't add any useful abstraction as compared with including a library...

                Nothing adds anything to C++ since C++ has and abstracts around everything. The problem is that doing everything, everywhere all at once with so much ass-broken syntax means you end up with unjustifiable cognitive overhead that comes down to unsustainable productivity and manpower costs. Literally every python, ruby or whatever popular code-base is an example of a project that won the time-to-market competition against C++. Even in the low-level, you find micropython winning over not just C++, but even C and assembly.

                And just when you think C++'s existing library ecosystem is enough to compensate for its slow development times, you see projects like Redox OS gradually redoing everything from scratch, design included, and still outpacing any other similar development effort. And you don't have to look so far into the newer languages: the Arcan guy is writing his own window manager, game engine, and pretty much a whole graphical user-land from scratch with C and Lua.

                Add it all up and it becomes clear C++ just isn't worth it at most scales. In fact, C++ is often used specifically to keep the cost-of-entry so high that the competition won't bother. That's what screwed up Mozilla: They tried competing against Chromium using the same high-cost tool set. So, naturally, they can't keep up.

                --
                compiling...
                • (Score: 2) by JoeMerchant on Monday July 11 2022, @01:58PM (16 children)

                  by JoeMerchant (3937) on Monday July 11 2022, @01:58PM (#1259777)

                  >with so much ass-broken syntax means you end up with unjustifiable cognitive overhead

                  Chinese speakers say this (and quite justifiably) about English. English speakers are mostly too ignorant of Chinese to have a complaint like this. English speakers complain about how the French do their numbers, Parisian French complain about - well - everything.

                  Point? It took me 5+ years of (solo, unguided, unmentored) daily professional practice with C++ to learn the parts of the ass-broken syntax that I use frequently. That probably could have been cut down to a couple of years if I focused 20% of my time on learning the language instead of just using it to get stuff done. Could have been a year or less with a good mentor who spent 20% of our mutual time doing code reviews of what I, and they, had written in the other 80% of our time - there's a best practice that I have never ever seen implemented in the real world, entirely due to the perceived and actual values of management.

                  >C++ is often used specifically to keep the cost-of-entry so high that the competition won't bother.

                  More than twice, I was hired to untangle a bunch of academic code written in user-unfriendly combinations of: Matlab, Fortran, Python, Squirrel Script (yes, it's actually a thing...), ancient C, of course Java, and various other junk, and translate it into a user friendly and maintainable package. Being a Qt/C++ speaker, that's what I recommended, and implemented in, and reduced the various toolchain + language towers of Babel to a single language, single API code base that ported to all kinds of target systems in a single click to launch (or auto-launch on power-up) user experience. The "powerful abstraction" Matlab code was hiding an unnecessary nesting of a loop which, when untangled, sped up execution of the C++ code 100x - just translating Matlab to C++ was good for another factor of about 2, but the real gain was in our multi-programmer code review of the guilty module which the profiler pointed out. C++ has pretty much all those fancy toys easily available, like highly developed runtime profilers, which tend to have spotty availability and less robust development in newer languages.

                  That's where I really balk at flavor-of-the-day languages: in the toolchains' setup and maintenance. You can literally spend days getting the tools figured out before even starting on your code, and if you need to update to version x.y every few months, that adds to the cost. Qt/C++ has some toolchain setup overhead, particularly in the Microsoft-influenced domains, but in general it's one of the quickest and easiest environments I have used for setup from scratch. Of course, I'm biased, because it's what I've been practicing for 16 years now, but during those 16 years I've watched so many man-months sunk into other toolchain setup and maintenance efforts that it only reinforces that aspect of my prejudice.

                  --
                  Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                  • (Score: 2) by RamiK on Monday July 11 2022, @10:19PM (15 children)

                    by RamiK (1813) on Monday July 11 2022, @10:19PM (#1259935)

                    Chinese speakers say this (and quite justifiably) about English. English speakers are mostly too ignorant of Chinese to have a complaint like this.

                    My Chinese isn't even HSK 1 - I've basically learned just enough pinyin and some grammar to pull off wo bu hui shuo hanyu in a pinch and more or less catch basic keywords - but from what I've seen and heard, Chinese grammar is a delight compared to English.

                    academic code written in user-unfriendly combinations

                    A rewrite of interpreted academic code into a compiled language is expected to yield 10-100x performance improvements regardless of whether it's C++, C or whatever. Python specifically is officially quoted as being 10-100x slower than C exactly because that's simply the unavoidable cost of dynamic typing and garbage collection. The thing is, the question isn't how pure interpreted dynamic code performs, but what would have happened if you'd written the performance-hungry loops in C libraries and bound everything else in python, go or lisp for all I care, so it won't take 5 years of training and 15 years of practice to be able to maintain and develop the code going forward?

                    That's where I really balk at flavor of the day languages: in the toolchains' setup and maintenance.

                    Toolchain setup and maintenance is a big part of the overall picture but it doesn't explain Java, C# or even, albeit after many years of dedicated focused work, Go's lack of adoption within the C++ ranks.

                    Being a Qt/C++ speaker

                    Qt is a good example of everything that's wrong with C++: Instead of having a few widgets that can be used with different rendering engines and what not, the Qt devs just went full retard and implemented their own memory management, types and main loop. Admittedly, the GTK and EFL devs screwed it up as well in much the same way... And I suppose at least with C++ you can override the methods so it's not quite as awful as GTK and EFL ended up being... But really, even if you claim to like C++, Qt isn't really C++ anymore. It's a dialect of C++. A horrible, mutated dialect running on some fubar garbage collected OS...

                    --
                    compiling...
                    • (Score: 2) by JoeMerchant on Monday July 11 2022, @10:54PM (14 children)

                      by JoeMerchant (3937) on Monday July 11 2022, @10:54PM (#1259957)

                      >what would have happened if you'd written the performance-hungry loops in C libraries and bound everything else in python, go or lisp for all I care

                      So, your suggestion is to learn just enough C or C++ (or Fortran would do the job too) to implement the stuff that Python or whatever is too hobbled to handle, then hybridize that with the training wheels language so we have the opportunity to experience the worst of both worlds plus the grafting of multiple languages into a single project?

                      I am actually a fan of message broker systems, AMQP, MQTT, DDS, whatever. Build a system with whatever (single) environment makes sense for the task at hand, be that embedded microcontrollers, or C++ or whatever and let them coordinate through the (well established and debugged) message broker. Due to the ability to have tiny microcontrollers and heavy lift operating systems in a single device, it makes a lot of sense to me.

                      Qt has gone full retard in many ways, but it is a robust enough environment that I can not only shill/plug the company line: code once run anywhere, but also step out to straight C++ or other included libraries when the Qt answers are lacking, or otherwise undesirable. All without having to graft multiple languages and tool chains into a single executable.

                      --
                      Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                      • (Score: 2) by RamiK on Tuesday July 12 2022, @05:50AM (13 children)

                        by RamiK (1813) on Tuesday July 12 2022, @05:50AM (#1260049)

                        So, your suggestion is to learn just enough C or C++ (or Fortran would do the job too) to implement the stuff that Python or whatever is too hobbled to handle, then hybridize that with the training wheels language so we have the opportunity to experience the worst of both worlds plus the grafting of multiple languages into a single project?

                        I'm not sure what you're complaining about. Between templates and C itself, C++ is already trilingual. The problem is the tool-chains. But, between how Go embeds C and Lua is embedded in C, you obviously have better choices than python when you want to have C++ expressiveness and performance without C++ syntax so just do that.

                        And it's not my solution. Again, it's what's already being done in most, if not all, large successful C++ projects. Less so with C, since you can still write large-scale C code at the kernel and library problem space without deteriorating everything C touches into an unsafe language. However, I can't say the same about C++. Most C++ libraries assume method overrides, so binding to them creates issues. It's not always the case (protocol and file-type C++ libraries typically avoid this, so there's no issue with using them...). But it's the exception that proves the rule.

                        I am actually a fan of message broker systems, AMQP, MQTT, DDS, whatever. Build a system with whatever (single) environment makes sense for the task at hand, be that embedded microcontrollers, or C++ or whatever and let them coordinate through the (well established and debugged) message broker.

                        Careful there. Generalizing communication protocols and IPC as "message broker" might work on paper but context switching and most data sharing makes IPC slow (well, on general purpose machines at least...) so that generalized design only really applies to networked machines and specific services/daemons. You don't want to end up with Plan 9...

                        it is a robust enough environment that I can not only shill/plug the company line

                        As previously mentioned, the company line often trends towards "the tools we know and that keep the competition away are good enough". It clearly doesn't reflect the market, seeing how Qt somehow managed to lose market share to dog-slow abominations like Electron.

                        ...it is a robust enough environment...

                        The company line is the market and the market has decided in favor of Electron so it might be best not to refer to the company line going forward.

                        --
                        compiling...
                        • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @10:35AM (12 children)

                          by JoeMerchant (3937) on Tuesday July 12 2022, @10:35AM (#1260098)

                          Qt marketing has been a schizophrenic shit show for the past sixteen years. The fact that the product survived a Microsoft acquisition and still remains useful is a testament to the strength of open source licensing, yet even today aggressive rent seeking by Qt marketing continues to scare away long time large and small users.

                          As for the tech, they lost me when they introduced quick/JavaScript, the timing and horrible state of uselessness when it was rolled out suggest to me that the whole thing is/was a Microsoft plot to kill Qt market share.

                          Still, it is a (the?) great cross-platform desktop environment and is very widely used in embedded devices with touch displays. Pretty much anytime you encounter a product that runs on Linux and Windows and OS X, Qt will be behind the curtain.

                          As for having a choice to avoid Python, I don't find that to be the case in AI/ML circles, and the effort there seems to be 90% in the data gathering, 9% in the tool chain setup and less than 1% in the coding. Ironic that they spend so much time waiting for each learning cycle to run and have such an optimization hostile environment. Sure, there is plenty of pre baked hardware acceleration, but how many Python coders does it take to tweak a hardware acceleration module?

                          --
                          Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                          • (Score: 2) by RamiK on Tuesday July 12 2022, @05:39PM (11 children)

                            by RamiK (1813) on Tuesday July 12 2022, @05:39PM (#1260216)

                            they lost me when they introduced quick/JavaScript...widely used in embedded devices with touch displays

                            You know, half the apps on my phone are webkit front-ends and they launch instantaneously. I know it's because webkit is constantly cached in RAM for about ~100-350MB worth (platform and model dependent)... But, the thing is, RAM is cheap, abundant and a prerequisite for 4k anyhow, so you knew you'd get plenty of it even 10 years ago. So, what were they supposed to do as the market shifted away? Downsize? They had to deliver the product the clients wanted, and the clients wanted .JS since C++ devs were too expensive and time-consuming to train.

                             Sure, there is plenty of pre-baked hardware acceleration

                             It's all hardware accelerated. Each and every function. I've used some of those libraries for dataset clustering (nothing fancy, basically a few dozen lines of sklearn and numpy for DBSCAN) and it felt like writing GL shaders: the pipeline is so stupidly obvious that you'd actually have to go out of your way and forcibly use the Python standard library and data types, instead of the relevant libraries' functions and data types, to screw up. And mind you, numpy actually forces explicit casting to do unoptimized operations, so you're really going to have to make a conscious effort to fuck things up here.
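                             A minimal sketch of the kind of pipeline being described here, assuming numpy is available; this is just a toy stand-in for DBSCAN's neighborhood query, with made-up points and eps, not the actual sklearn call:

```python
import numpy as np

# Toy stand-in for DBSCAN's neighborhood query: all pairwise distances
# computed in one broadcasted expression, no Python-level loops anywhere.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
diff = pts[:, None, :] - pts[None, :, :]      # shape (n, n, 2)
dist = np.sqrt((diff ** 2).sum(axis=-1))      # (n, n) distance matrix

eps = 0.5
neighbors = (dist <= eps).sum(axis=1) - 1     # don't count the point itself
core = neighbors >= 1                         # points with at least one neighbor

print(core.tolist())  # the two nearby points qualify, the far one doesn't
```

                             The whole computation stays inside numpy's vectorized (and potentially accelerated) code paths; dropping back to Python lists and loops here would be exactly the deliberate effort described above.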

                            how many Python coders does it take to tweak a hardware acceleration module?

                             Are you paying per individual or per working and training hour? Industrial scaling costs are what drive automation, so I don't see why you'd measure in number of coders instead of man-hours. C++ devs are scarce and costly. Python devs and faster hardware are cheaper and more readily available. The point is that there's a middle way: have the low-level parts written in actual low-level C and the high-level parts written in actual high-level Python/Go/flavor-of-the-month. It's the most cost-effective approach, as the market has repeatedly shown.
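                             For illustration, a bare-bones version of that C-below/Python-above split using only the standard library's ctypes; libm's sqrt stands in for "your own low-level C library" here, and the library lookup assumes a typical Linux/glibc environment:

```python
import ctypes
import ctypes.util

# Load the system C math library. In the real-world version this would be
# your own compiled shared object holding the performance-critical code.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature explicitly so ctypes marshals doubles correctly.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

root = libm.sqrt(9.0)  # the C sqrt, driven from high-level Python
```

                             The expensive, low-level work lives in compiled C; the cheap-to-write orchestration lives in Python, which is the whole cost argument in miniature.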

                             Besides, don't forget that a very well funded, modern, all-C++ desktop OS is falling behind the antiquated 70s-mainframe knockoff that C-based Linux really is. So, there's a fundamental fallacy behind C++ purism that isn't easy to reconcile with reality unless you squint real hard to avoid seeing the entirety of your software stack.

                            --
                            compiling...
                            • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @07:12PM (10 children)

                              by JoeMerchant (3937) on Tuesday July 12 2022, @07:12PM (#1260253)

                              >C++ devs are scarce and costly. Python devs and faster hardware is cheaper and more readily available.

                               I agree about the abundance of RAM and fast hardware: I regularly read entire, fairly large files into a QString and toss 'em around in RAM before writing them back out, something that would have been considered "amateurish and unscalable" back in the 8-bit days.

                               However, there's a bit of a fallacy about Python devs being cheaper, and I think it's driven in part by miserly management. Follow me here: in _generic big city_ C++ devs earn $150K/yr while starting Python devs make $50K. Sounds like you can afford 3:1 Python:C++ devs, and I know far too many managers who think that way.

                               But... the first year of work at most jobs is about 80% productive at best, often much worse, so turnover makes your employees costlier, and those low-cost devs will turn over faster for lots of reasons. Then there's the overhead cost that doesn't scale with salary, particularly with "back in office" management demands: something on the order of $50K per developer per year when you're at a place that does real HR and more than one level of management. So, to retain those Python devs for more than 9 months, you'll be giving them raises to $75K fairly quickly, putting your "loaded" costs at $200K/head for C++ vs. $125K/head for Python.

                               Also, don't forget your mythical-man-month cost of communication: the more devs you have, the more time they have to spend talking to each other to be effective, and all these "cheap and readily available" Python programmers will need more of both communication and mentoring time, particularly considering that people who whine about C++ being "too hard" are likely to need much more hands-on mentoring and guidance than those who can figure stuff out for themselves, given some time and a broadband connection.

                               I walked into a shop that had about 6 junior devs, with a heavy Python preference but some C/C++ and Matlab going on too. One of the junior devs was every bit my equal in productivity and ability, though he didn't know how to push back on management when they were being obtuse. The other 5 didn't add up to the two of us, not even close. Some had compensating talents, like communication with the academic community and grant sources, but all those "cheap" programmers were costing far more than two of me.

                               I started near ground zero at another startup and hired in two programmers with limited C experience to meet the requirement "build a GUI app on OS X". That's where I started using Qt, and the three of us learned enough to be productive in it in the space of a few months, and had our first decent-looking translation of the Fortran/Matlab mess we were handed within about 4 months from "go." Python wouldn't have gotten us any closer to the OS X "native look and feel" goal at that time, and I don't think the learning curve for my two hires was particularly steep for Qt: one had some OpenGL experience, the other could at least follow examples in C. I should mention that pickings were really slim in that job market; we had to turn away about 8 interviewees who literally couldn't program their way out of a paper bag given sample code that cut an opening from top to bottom. Hiring them to program in Python would have been just as pointless as teaching them Qt.

                               The real cost of using developers who have trouble with "hard languages" is that they have all kinds of other challenges too... they put 4 levels of nested loops where 3 will do the job, and your execution time is 100x what it should be. It doesn't matter how great the syntax of your language is: unless you have a library that calculates a value histogram of a volume ready to hand, they will be writing their own loops to get it done. Hardware acceleration is easily overwhelmed by bad implementations.

                               Re: C vs C++, I have no great loyalty to C++ and objects. One really great aspect of the torturous syntax of C++ is that C "just works" anywhere you drop it into a C++ program. C++ objects are really good analogies for windowed GUI widgets, and they're pretty good for containers like strings, lists of strings, hash tables, etc., but they definitely got overwrought in the late 1990s into things that had no business displacing a simple struct.

                              --
                              Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                              • (Score: 2) by RamiK on Tuesday July 12 2022, @09:24PM (9 children)

                                by RamiK (1813) on Tuesday July 12 2022, @09:24PM (#1260298)

                                However, there's a bit of a fallacy about Python devs being cheaper...

                                 If Python were a new language, I'd say you might have a point. But many big companies have been hiring Python developers for a decade now, and they aren't in any way looking to replace Python with C++.

                                 Besides, don't be so naive as to believe your cost analysis is even remotely close to what your boss has in mind when making those hiring decisions. Big companies have whole teams of statisticians and HR writing 40-page cost/risk-analysis reports for every project that weigh in on everything at resolutions you wouldn't believe. E.g., there are reports that look into different school districts' bus ride times to measure potential performance implications, cut against marital status, age, and gender.

                                So, when they look at the C++ hiring market and decide to diversify into python, they're not doing it by mistake.

                                I walked into a shop that had about 6 junior devs...

                                 That's true for all entry positions, C++ included. The reason you're not seeing it is that there are so few young people going into C++ these days.

                                they put 4 levels of nested loop in a place where 3 will do the job, and your execution time is 100x what it should be

                                 But how many good developers does a team need to correct the mistakes of bad ones? Certainly they don't all need to be superstars. And we agree they can't all be terrible at their jobs... So, already, the premise is mixed skill levels. And from there it's pretty obvious you'd want to leverage the tools for the job, having the less skilled work with the training wheels on while the more skilled work at their end. I.e., the Golang approach.

                                really good analogies for Windowed GUI widgets, they're pretty good for containers like strings, lists of strings, hash tables

                                 Quick observation: note how everything you've listed, plus parallelism, readily falls into dataflow. If only we had a sane, domain-specific dataflow language that didn't try to replace C but complemented it in those very specific use cases, it would have yielded more agreeable results than C++'s OO.

                                --
                                compiling...
                                • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @09:46PM (8 children)

                                  by JoeMerchant (3937) on Tuesday July 12 2022, @09:46PM (#1260311)

                                  >when they look at the C++ hiring market and decide to diversify into python, they're not doing it by mistake.

                                   Of course that depends on the sophistication of the management group. Credit Suisse First Boston did all kinds of interesting due diligence when acquiring a new spin of a tech that we developed over the course of about 10 man-years, with about 6 man-years in the research and development of the analysis software component. On the hardware side, they flew us up to DEKA to have Mr. Kamen (Segway inventor, among other more significant but less well known things) himself give them a read on the electronics side of things and whether or not it really did what we had been selling it for over the previous 20 years.

                                   For the software, they basically opened the newspaper and found 10x as many ads for the Microsoft API of the moment as they did for the (then quite superior) Borland environment. So, at the drop of a hat, they hired a team of 4, which grew to 8, to recode the application in the MS API, with an initial projected completion of 3 months growing to 12 before they got it done. They easily paid more for those programmers than they did for the rest of the acquisition, simply because the MS API had more ads in the paper.

                                   P.S. If you're involved with investment bankers at any time, you should know that CSFB incorporated the new entity in Delaware, then proceeded to issue debt from their own bank to the company, which they had bought with 80% stock. When they had issued enough debt to make the net value of the organization $0, they reissued the stock, giving all the investors checks for $0.01 in exchange for their shares; some of them had invested over $1M of their own money to obtain those shares about 5 years earlier. Me? I only had stock given as bonuses, nominally worth about $80K at one point; I also received a check from CSFB for $0.01. Debt takes precedence over equity, and in Delaware you can pull shit like that and screw the equity holders legally.

                                   End of the day: they put a lot more effort into controlling the legal framework the acquisition deal happened under than they did language/API selection.

                                  >OO

                                   Don't forget, OO as a mainstream practice dates to the 1980s (the ideas themselves are older still). Think about the hardware that was available in 1985. OO has been widely abused since and earned some of its bad reputation, but just because you have a buggy whip on your electric sports car doesn't mean you have to pull it out and use it.

                                  --
                                  Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                                  • (Score: 2) by RamiK on Wednesday July 13 2022, @07:40AM (7 children)

                                    by RamiK (1813) on Wednesday July 13 2022, @07:40AM (#1260421)

                                    So, drop of a hat, they hired a team of 4 - which grew to 8 - to recode the application in the MS API with an initial projected completion of 3 months, growing to 12 before they got it done. They easily paid more for those programmers than they did for the rest of the acquisition, simply because the MS API had more ads in the paper.

                                     I've heard a similar Microsoft vs. Borland account where the decision fell in favor of Microsoft because they were offering free on-site support, where Borland weren't even willing to put a rough quota down on paper. Also note that while engineering considers having all your software come from the same vendor putting all your eggs in one basket, for acquisitions it means having better leverage, since the bigger you are as a client, the better the deals and treatment in general. So, from management's point of view, outspending a single development effort is often worth it to deepen a service deal. And note how it's yet another side to that "C++ is often chosen to lock out competition" thing: a lot of what we think of as pure technical decisions comes down to the job market, hiring options, and third-party corporate connections.

                                    P.S. if you're involved with investment bankers at any time...

                                     Yeah, that's pretty typical of the east coast. The legal and finance frameworks in New York and Texas are downright hostile to startups.

                                     Anyhow, this ties well into my point: C++ isn't simply just a (bad) programming language. It's a specific supply chain of HR, tooling, and compiler/OS providers that is not only unjustified on technical grounds, but also involves some dubious business practices. I.e., it's always support package deals here, vendor lock-in there... And for most businesses, especially the small-to-medium ones, it's not only a bad technical choice; entering into that particular ecosystem is downright hazardous. And when you think about it like that, the whole Nokia-Microsoft and Qt/Trolltech situation stops being one bad anecdote. It's simply the nature of heavyweight tools: if you depend on a big language and a big OS, you're going to need to be able to deal with big companies. So, unless you are a big company, that's not a bed you want to get into.

                                    --
                                    compiling...
                                    • (Score: 3, Insightful) by JoeMerchant on Wednesday July 13 2022, @11:34AM (6 children)

                                      by JoeMerchant (3937) on Wednesday July 13 2022, @11:34AM (#1260453)

                                      Thank you for sharing your opinions. We agree on most of the facts, but from the perspective of small startups to medium sized dev teams in larger corporations, I arrive at different conclusions.

                                      --
                                      Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                                      • (Score: 4, Insightful) by RamiK on Wednesday July 13 2022, @12:14PM (5 children)

                                        by RamiK (1813) on Wednesday July 13 2022, @12:14PM (#1260468)

                                        Yeah I can't argue with that.

                                        --
                                        compiling...
                                        • (Score: 2) by JoeMerchant on Friday July 15 2022, @02:06PM (4 children)

                                          by JoeMerchant (3937) on Friday July 15 2022, @02:06PM (#1261072)

                                          And if you really want to program in Lisp, it isn't all that far away: https://github.com/Robert-van-Engelen/tinylisp [github.com]

                                          --
                                          Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                                          • (Score: 2) by RamiK on Friday July 15 2022, @07:06PM (3 children)

                                            by RamiK (1813) on Friday July 15 2022, @07:06PM (#1261119)

                                            Meanwhile in Texas: https://www.youtube.com/watch?v=DXvJ8duZqdA [youtube.com]

                                            --
                                            compiling...
                                            • (Score: 2) by JoeMerchant on Friday July 15 2022, @07:38PM (2 children)

                                              by JoeMerchant (3937) on Friday July 15 2022, @07:38PM (#1261122)

                                              Cool. When I was in school my programmable calculator had the one true language: BASIC. Graphing calculators weren't a thing yet.

                                              --
                                              Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                                              • (Score: 2) by RamiK on Saturday July 16 2022, @08:37AM (1 child)

                                                by RamiK (1813) on Saturday July 16 2022, @08:37AM (#1261255)

                                                 Did it have an alphanumeric display or a dot matrix? If it's running BASIC, it has indirect addressing and conditional branching, so you only need a high-res display to draw graphs...

                                                --
                                                compiling...
                                                • (Score: 2) by JoeMerchant on Saturday July 16 2022, @01:27PM

                                                  by JoeMerchant (3937) on Saturday July 16 2022, @01:27PM (#1261282)

                                                   It was dot matrix, but the resolution was something like 128x8 and I believe it was only character-addressable from the software layer.

                                                  --
                                                  Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
      • (Score: 2) by DannyB on Friday July 08 2022, @02:15PM (12 children)

        by DannyB (5839) Subscriber Badge on Friday July 08 2022, @02:15PM (#1258897) Journal

        If you're still writing in C++ with high level libraries, that is not fixing the problem.

        What is missing is that C++ is still a table saw without any guards. (Or perhaps a few more guards than a C table saw has.)

        Some basic things should simply not be possible.
        1. disposing of something twice
        2. never disposing of it
        3. using it after disposing of it (a dangling pointer or reference somewhere)
        These three things alone have cost, I would dare say, billions of dollars in bugs.

        It should not be possible to cut off your fingers.

        Some languages allow you to reason at a much higher level of abstraction. Lisp is but one good example. What you are talking about is a poorly specified, bug-ridden, unsafe, non-standard implementation of Lisp. Sure, I could write a program in that. But a standard Lisp implementation has already been worked on by experts; it is as efficient as it is capable of being. So what does a set of libraries on top of C++ offer that a good Lisp implementation doesn't? Other experts have labored over optimizing that implementation.

        These other languages exist for a reason. People can learn these languages that are suited for the problem domain, and get hired for knowing them.

        Would it even be possible to use C++ and libraries to have a type system such as the one in the Julia language?

        --
        How often should I have my memory checked? I used to know but...
        • (Score: 3, Insightful) by JoeMerchant on Friday July 08 2022, @07:18PM (11 children)

          by JoeMerchant (3937) on Friday July 08 2022, @07:18PM (#1258949)

          First: I only cut about 2mm off of my thumb tip before investing in the SawStop. I still used cheap table saws after the tip healed over, but the SawStop is much better for serious work, and the safety factor does provide confidence when doing a lot of cutting.

          If you want guards like double disposal protection, there are "smart pointers" for that. No matter the language it's hard for a compiler or interpreter to know when you really want to keep something in memory or not. I do think that auto destruction when leaving scope is a very good thing that C++ does, eliminating the need for explicit free calls on most allocations. For use after disposal protection, smart pointers take you most of the way there, and I often start a signal handler with nullptr checks on all of the smart pointers in the function. If you want to go nuts with it, you can overload the pointer dereference operator to always do a null check before access and throw an exception. I won't let you try-catch and throw in my projects, but the language certainly supports it.

          >It should not be possible to cut off your fingers.

          It is ALWAYS possible to cut off your fingers: in door frames, belt-and-pulley machinery, fan blades, etc. You can add guards like SawStop, but there will be cases (like cutting wet wood) where the guards have to be disabled or the tool just won't do its job. Even if the entire constructed world were padded rooms with safe doors, people could still bite their own fingers (and tongue).

          >Some languages allow you to reason at a much higher level of abstraction.

          Like the container classes that have been ubiquitous for 20 years in C++? You don't need a new language to express higher levels of abstraction in compact forms.

          >a poorly specified, bug ridden, unsafe, non-standard implementation of Lisp.

          Lisp was first specified in 1958, and it has its place, but it does not seem to have ever been a popular choice of language; maybe there are good reasons for that?

          >Would it even be possible to use C++ and libraries to have a Type system such as in the Julia language?

          It is possible to write "mixed language" projects in pretty much any combination of languages you choose; possible, but usually a bad idea IMO. It would be possible to write something that pre-compiles C++ code with such a type system, if you can specify how you expect it to work without internal conflicts or ambiguity. Many flavor-of-the-month languages do things like that.

          If Lisp (or Clojure or whatever) is "better" for what you are doing, then go for it. Lately I have been doing a lot of C++ that makes system calls like a bash script would, but being in C++ gives me much better access to the system message broker and several other standard libraries than bash ever could, and if bash is ever better than C++ I can call a bash script from C++ rather than a series of single commands. Being Qt/C++ means it's easy to put a GUI on these system-calling programs, providing status dashboards, sorted log streams, and a tabbed interface with push buttons or other GUI elements as appropriate to control things. The main power of C++ in this situation is that virtually everything I am asked to work with is relatively easily accessible from C++, without the version-control hell of something like Python.

          --
          Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
          • (Score: 2, Insightful) by anubi on Saturday July 09 2022, @09:34AM (10 children)

            by anubi (2828) on Saturday July 09 2022, @09:34AM (#1259052) Journal

            Thanks for that post.

            I am another C++ fan, and will gladly code another library and define new objects as needed.

            Yes, C++ lets me do anything. Including shoot myself. I am careful with it, and it allows me to write concise, clean, fast code. But I've had my learning curve of "what's going on here?" just as anyone else has. After a while, I have developed my own little toolsets and ways of doing things that work for me.

            Your post is a refreshing read for me, giving me more ideas for countering arguments about why I avoid some languages that I consider to compile to bloated code, where I can't verify that what I need done is all it's doing.

            --
            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
            • (Score: 2) by DannyB on Saturday July 09 2022, @04:23PM (9 children)

              by DannyB (5839) Subscriber Badge on Saturday July 09 2022, @04:23PM (#1259184) Journal

              You are optimizing for cpu cycles and bytes.

              I am optimizing for dollars.

              Different problems, different solutions.

              --
              How often should I have my memory checked? I used to know but...
              • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @10:56AM (8 children)

                by JoeMerchant (3937) on Tuesday July 12 2022, @10:56AM (#1260101)

                If you are optimizing for dollars by choosing tools that ill-trained, low-cost developers can use relatively more safely due to their constraints, I would say you are doing it wrong. One good developer is more productive than five cheap ones. Better to spend your low-budget headcount on test and quality and procedure mavens. Software isn't a sawmill; mistakes on work in progress are only costly if they are released to customers. Best to force your genius developers to explain their work to newbie testers who are backed up by management with the power to hold releases as long as it takes to get them right.

                --
                Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                • (Score: 2) by DannyB on Tuesday July 12 2022, @01:56PM (7 children)

                  by DannyB (5839) Subscriber Badge on Tuesday July 12 2022, @01:56PM (#1260144) Journal

                  Good developers can make simple mistakes. It happens. Sometimes it takes a lot of time to find them. Even given your very obvious advice of explaining code (that, gee, nobody else would have thought of), mistakes still happen. It is better to eliminate the possibility of making some of the most obvious and worst blunders, especially ones that are purely mechanical and have nothing to do with the problem domain. Such as double-freeing a pointer.

                  The language should fit the problem domain. When you talk to accountants they don't ever seem to mention bits, bytes, cpu cycles, pointers, etc. Your programming language should not force you to consider details irrelevant to the problem.

                  Even in a low level language like C++ you end up building abstractions upon abstractions. Like auto pointers, just for example. Other languages are merely abstractions that shield you from the raw sharp edges.

                  Garbage collection is also a huge advantage for most everyday programmers. Most modern languages have GC, especially those of the last twenty years. GC eliminated the three biggest sources of bugs (and vulnerabilities), which have cost billions of dollars. Here on SN I've explained before how GC is actually a performance (latency) advantage for a big, busy commerce server doing business.


                  An HTTP transaction happens. All of the code that services that request does not see one single CPU cycle of memory management code. The malloc is extremely fast and effectively assumes infinite memory, so long as the GC is doing its job properly. This reduces latency because those CPU cycles for memory management happen on other CPU cores, later on. Now, the money earned by servicing this HTTP request has to pay for the GC, and it does. But the GC happens later, on other CPU cores not servicing HTTP requests. This keeps the latency lower for the money-earning threads. None of this reference counting nonsense.
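                  The defer-the-cost idea can be sketched even in Python, with the caveat that Python's collector is a cycle detector layered on refcounting, not the concurrent server GC described above; this just shows allocation staying cheap on the hot path while the cleanup bill is paid later:

```python
import gc

gc.disable()  # hot path: allocate freely, defer collection entirely
for _ in range(1000):
    a, b = [], []
    a.append(b)          # build a reference cycle, which refcounting
    b.append(a)          # alone can never reclaim

collected = gc.collect() # later, off the hot path: pay the cleanup cost
gc.enable()
print(collected)         # number of unreachable objects found
```

                  In the real server case the analogous collection work runs concurrently on other cores, which is the whole latency argument.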

                  We analyze everything in terms of money. Not cpu cycles and bytes. This is where you and I differ. You don't have the money focus.

                  Java workloads can sometimes have terabytes of memory. The 16 TB heap limitation was finally lifted, and now the only limit is in Linux itself, at 128 TB. I think the Java runtime is still limited to only 768 cores. The thing is, we can throw servers, memory, cores, whatever is needed at the problem to make money and save time. Developer time is expensive; you cannot argue that point away. Machine resources are cheap.

                  You must be right and everyone else is wrong. How did Java get to be the #1 language for over 15 years straight? Even today it stays in the top three. What is wrong with all these people? How do they not have your wisdom to understand their problems and how they should solve them?

                  I'll say it again, but it will fall on deaf ears:

                  If there were one perfect programming language that was ideal for all possible uses, we would all be using it already.

                  Do you not think that in my business we all understand the problems and how best to solve them? And that this has been going on for decades? We explore alternatives and choose the things that work best for our problems.

                  --
                  How often should I have my memory checked? I used to know but...
                  • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @02:39PM (6 children)

                    by JoeMerchant (3937) on Tuesday July 12 2022, @02:39PM (#1260152)

                    >Good developers can make simple mistakes. It happens.

                    Absolutely, which is why independent test is key to quality.

                    >It is better to eliminate the possibility of making some of the most obvious and worst blunders.

                    It is better to make an obvious blunder than a subtle one. Also, the frequency of automatically or otherwise flagged obvious blunders is a good indicator of how many subtle problems you can expect from the same source.

                    >GC is actually a performance (latency) advantage for a big busy commerce server doing business.

                    GC has its place, but it should be an easy choice to make, not a default difficult to change.

                    >How did Java get to be the top #1 language for over 15 years straight?

                     Sloppy web developers and other quick-fix junkies. And it's good for that. For tiny projects, Java, Python, and friends are great quick-fix tools for replicating common things. Want to replicate that cool AI project? Just open your Jupyter notebook and copy-paste what the previous guy did; just don't expect to be able to dig in deep for changes, like processing your images in HSL instead of RGB, without a 10x jump in the learning curve. Were it all coded in C++ to start with, that would be a 5-minute change to the source.
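                     For reference, the per-pixel conversion itself is trivial in either language; here's a one-pixel sketch with Python's stdlib colorsys (which names the colorspace HLS and orders the components accordingly). The hard part being described is threading a change like this through someone else's notebook, not the math:

```python
import colorsys

# One pure-red pixel, channels normalized to the 0..1 range.
r, g, b = 1.0, 0.0, 0.0

# RGB -> hue/lightness/saturation (colorsys uses the HLS ordering).
h, l, s = colorsys.rgb_to_hls(r, g, b)
# pure red: hue 0.0, lightness 0.5, saturation 1.0
```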

                    >We explore alternatives and choose things that work best for our problems.

                    Kudos, most businesses suffer from horrendous inertia, Not Invented Here syndrome, susceptibility to vendor lock-in and incentives, and a general apathy of management to herd the cats in development.

                     Anything can be coded in any (Turing complete) language.

                     The tendency I have seen for the training wheel languages is for little projects to get started, show a blinking light or hello world in record time, generate enthusiasm as they customize to do one or two domain specific things, then get mired in horrendous maintenance overhead when somebody says something like: "Let's go back and make all the buttons perform closed loop communication verifications after activation." Rather than capability scaling with code supporting arrays or matrices of functions that get easier to add as you grow, each new widget or feature seems to take a bit more effort than the last.

                     Of course you can do great things in any language; trac is a pretty impressive python project that I have used daily for 15+ years. But most python projects I have seen developed in-house reach a point where they just stop growing due to the effort required for expansion, and the suggestion to start over - usually in another language - starts coming up more and more often as they grow.

                    --
                    Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                    • (Score: 2) by DannyB on Tuesday July 12 2022, @05:40PM (5 children)

                      by DannyB (5839) Subscriber Badge on Tuesday July 12 2022, @05:40PM (#1260219) Journal

                      I'm glad you know everyone's business better than they do. And you know the one perfect language they should all be using.

                      GC has its place. That place is in most modern languages. There are still languages that do not have GC for uses where GC is wholly inappropriate.

                      Sloppy web developers and other quick-fix junkies. And it's good for that. If you want tiny projects, Java, Python and friends are great quick fix tools to replicate common things.

                      This is the one that tells me you really don't know what you're talking about. Maybe you are thinking of Javascript not Java?

                       Java is for huge gigantic projects. Millions of lines of code. Written by many developers. Maintained for years, even decades, by many different people who have no contact with the original authors. Java does very well for this use. Unlike the Python 2 vs 3 transition, you can take the very oldest compiled binary or source code and run it on the latest Java versions. Java has very carefully maintained backward compatibility. I think you do not understand the sheer amount of Java code in existence. I'm not saying you should like Java. Just realize that it really does have its place and is not some kind of mistake just because it is not suitable for YOUR purposes.

                      Python is at the opposite end. Great for small projects because it is interactive and dynamic. Fantastic for prototyping. Even for good sized programs. But not for an enterprise application that I just described. Refactoring is a big problem with dynamically typed languages.

                      I like Python. I like Lisp (various lisps). Java. Other languages. I use the right tool for the right job. Not a one size fits all.

                      I take the view that any successful language must be doing something right for someone.

                      Anything can be coded in any (Turing complete) language.

                      That actually proves my point. We could, and you could write everything in assembly language. Do away with C and all higher level languages and mandate only assembly language worldwide for all programming. Period. No exceptions.

                      Yes, it could actually be done. But at what cost?

                       There is a reason Java exists. There is a reason it is successful. Java is not the right language for all possible uses.

                      --
                      How often should I have my memory checked? I used to know but...
                      • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @07:21PM (4 children)

                        by JoeMerchant (3937) on Tuesday July 12 2022, @07:21PM (#1260255)

                        >And you know the one perfect language they should all be using.

                        There is no one perfect language. GC has its place, but the languages that use it tend to become unmaintainable byzantine disasters when you put them in the hands of people who "need" GC and ask them to do a large, long lived project with it.

                        >I think you do not understand the sheer amount of Java code in existence.

                        I have encountered plenty of it. I have also encountered a great number of management types who thought that Java was their answer for high performance cross platform solutions, and could never quite grasp why it would be unacceptably slow for any of our applications. I even encountered a couple of "Java prototypes" where management steamrollered the crew into "doing it in Java" and had to translate it out to something faster to make it anywhere near competitive in the marketplace. For my purposes, Java usually isn't the answer. I'm sure there are plenty of places where it makes sense, but I also suspect there are many places where it was a bad choice that got implemented anyway because nobody who knew better stood up to explain the issues.

                        --
                        Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                        • (Score: 2) by DannyB on Tuesday July 12 2022, @08:45PM (3 children)

                          by DannyB (5839) Subscriber Badge on Tuesday July 12 2022, @08:45PM (#1260280) Journal

                           Java is not slow. It is quite fast actually. You use commercial web sites that are Java without knowing it. All of the hot Java bytecode gets compiled down to native code. Unlike, say, PHP, Perl, or Python, which are interpreted, when Java services an HTTP request, it is pure native compiled machine code that services that thread.

                           When a Java program starts up, the java bytecode is interpreted (slow). As soon as any function (at startup, that means every function) is using a disproportionate amount of cpu, that function is quickly compiled into poorly optimized machine code by the C1 compiler and put on a list to be recompiled soon by the C2 compiler. When the C2 compiler comes around, it spends a lot of time and effort producing highly optimized code, better than an ahead-of-time compiler (such as a C compiler) is capable of.

                          Why better? Because the C2 compiler in Java has the WHOLE PROGRAM, not just part of it. When you compile in C, the compiler is only compiling part of the program. The linker may see the machine code of all the component parts which it links together, but it does not globally optimize them in any deep way.

                          Java's C2 compiler can aggressively inline code and it does. It optimizes for speed over size. You can always buy more memory but you can't buy back time. With a compiled and linked program in C, a function in my precompiled library cannot be inlined into your code where you call my function. Your code and my code are independent development efforts. The C2 compiler has the entire program, in java bytecode form, to work with. It can rewrite function parameter calling conventions. It can recognize when certain methods in an object do not need a vtable entry because they are the only possible call target. (Something C++ cannot do)

                          Because of all of this, a Java program seems to start up slowly and then "warm up" and run fast. Thus, Java is not a good solution for writing a replacement for the 'ls' command. But it is fantastic for a very large program running on giant servers where the program runs for a very long time without being interrupted or restarted.
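                           To make the warm-up concrete, here is a toy sketch (illustrative only, not a real benchmark; use JMH for real measurements, and the class and method names here are made up):

```java
// Toy illustration of JIT warm-up: the same method is timed in
// batches. Early batches run interpreted / C1-compiled code; once
// the method is hot, C2 recompiles it and later batches speed up.
// Naive nanoTime() loops like this are NOT a rigorous benchmark.
public class WarmupDemo {
    // Small hot method the JIT will tier-compile once it gets hot.
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        for (int batch = 0; batch < 5; batch++) {
            long start = System.nanoTime();
            long result = 0;
            for (int i = 0; i < 10_000; i++) {
                result = sumOfSquares(1_000);
            }
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.println("batch " + batch + ": " + micros + " us, result=" + result);
        }
    }
}
```

                           On HotSpot, running it with java -XX:+PrintCompilation WarmupDemo shows the C1 and C2 tiers kicking in as sumOfSquares gets hot; the later batches typically run far faster than the first.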

                          I almost never hear anyone complain about Java unless they really don't know much about Java.

                          My journal article: It's fashionable to hate Java [soylentnews.org]

                          That article explains a lot, and some SN readers were surprised.

                          and had to translate it out to something faster to make it anywhere near competitive in the marketplace.

                           It is funny that the marketplace for enterprise applications already decided this two decades ago.

                          Ask yourself, why does Red Hat do research on the latest state of the art Java garbage collectors? Why are they spending money on this? See Red Hat's Shenandoah GC for Java. This can work with 16 Terabyte memory heaps with a max GC pause time of 1 millisecond. See for yourself. Yeah, Java is slow.

                          Or Oracle's ZGC garbage collector which is also state of the art and has similar specs to the previous paragraph.

                          Or why does IBM build its own Java runtime called J9?

                          And why oh why does Microsoft, yes Microsoft contribute to Java development? Here's why: because their biggest mega customers all use Java extensively.

                          If you're using Java for dinky programs that run in less than 32 GB of memory you're doing it wrong.

                          --
                          How often should I have my memory checked? I used to know but...
                          • (Score: 2) by JoeMerchant on Tuesday July 12 2022, @09:29PM (2 children)

                            by JoeMerchant (3937) on Tuesday July 12 2022, @09:29PM (#1260304)

                            >I almost never hear anyone complain about Java unless they really don't know much about Java.

                            I had to actively start defending against Java being used on desktop PCs for heavy-lift analysis software 20 years ago. The primary Java cheerleaders had no idea of what they were talking about. At the time, bytecode interpretation in dedicated silicon was one of the "visions of the near future" which, to my knowledge, never materialized.

                             Since then, Java (not Javascript) web-based applications have occasionally haunted me, forcing end-runs around modern browser security improvements. Small-time stuff, 2-3 man-months to implement, that would have been better implemented in something else to achieve their "low maintenance cross platform" objectives.

                            No doubt the large scale web-apps have benefited from advances in JIT compilation, but if you tell me that much of Facebook runs on Java and owes its elegant user experience to the power of Java I'm going to have to laugh you off the board.

                             I wouldn't advocate C++ for implementing a big web-based app, although right now I'm listening to an http-interfaced mp3 player I coded up last year; it's pretty good for simple-ish stuff.

                            >why does Red Hat do research on the latest state of the art Java garbage collectors? Why are they spending money on this?

                            Because the customers want it. Doesn't make it "the best" answer for all problems. If you're at scale with 1000+ coders in a single playground, I see the need for sandboxes, garbage collection, etc. I play more on teams of 50 or less, often times over the past 30 years on teams of 5 or fewer engineers. We have always made self-contained widgets, not web-facing million+ user monsters. The widgets tend to have 5 to 100 man-years in their development, and serve a handful of users at a time, often just one.

                            >If you're using Java for dinky programs that run in less than 32 GB of memory you're doing it wrong.

                            Agreed. Our latest box only has 16GB of RAM.

                            --
                            Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
                            • (Score: 2) by DannyB on Wednesday July 13 2022, @07:17PM (1 child)

                              by DannyB (5839) Subscriber Badge on Wednesday July 13 2022, @07:17PM (#1260577) Journal

                               It sounds like you were influenced by Java long, long ago and are unaware of what modern Java is like. Java has come a very long way. Java's C2 compiler is one of the most sophisticated compilers there is, and so are some of its modern GCs.

                              It was a mistake for browsers to ever support:
                              * Java applets
                              * Flash
                              * ActiveX
                              * Silverlight

                              Java's C2 compiler can rewrite functions. For instance, produce two versions with two parameter lists and calling conventions. It can do this because it has the WHOLE program to work with at runtime.

                              --
                              How often should I have my memory checked? I used to know but...
                              • (Score: 2) by JoeMerchant on Wednesday July 13 2022, @08:19PM

                                by JoeMerchant (3937) on Wednesday July 13 2022, @08:19PM (#1260596)

                                >Java's C2 compiler can rewrite functions. For instance, produce two versions with two parameter lists and calling conventions. It can do this because it has the WHOLE program to work with at runtime.

                                That's nice. Right now, I'm fighting with gcc over optimizing a "whole module" at one time. We have a system backbone of ~3000 properties with message broker getters, setters, default values, value changed signals, etc. and the biggest of our property using applications is causing the gcc linker-optimizer to balloon in optimization time such that the total system build is taking 14 minutes with 2900 properties, 15 minutes with 2925 properties, 17 minutes with 2950 properties, 20 minutes with 2975 properties, 24 minutes with 3000 properties, etc. The basic problem is: all the properties are written to a single module and when gcc is optimizing this "whole module" at one time, it is scaling at something like O(n^3) optimization time, and we're getting to the point where that is starting to hurt.
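                                 For what it's worth, one hedged mitigation when gcc's link-time optimizer chokes on a single giant generated module is to let LTO split the work into parallel partitions (a sketch; props_gen.cpp and main.o are stand-ins for the generated properties module and the rest of the app, and whether this helps depends on how the module is currently built):

```shell
# Sketch: parallelize and partition gcc's link-time optimization so no
# single partition has to optimize all ~3000 properties at once.
# -flto=auto               run LTO jobs in parallel (GCC 10+)
# -flto-partition=balanced split the program into similar-sized chunks
g++ -O2 -flto=auto -c props_gen.cpp -o props_gen.o
g++ -O2 -flto=auto -flto-partition=balanced props_gen.o main.o -o app
```

                                 The other obvious (if uglier) lever is splitting the generated properties across several source files, so no single translation unit ever sees all 3000 of them.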

                                 I believe we're coming down to horses for courses. If you're writing a Facebook competitor, Java may be your horse.

                                 I live in a world of 5-50 developers writing apps that are used by 1-5 users at a time running on a single machine of the Core i7-6xxx class. We've got some STM-32 accessory boards playing in the system, and might network out for 2% of the total functionality of the system; otherwise we're focused on delivering the best user experience to our local user(s) on ~10K copies of this mostly self-contained system running in 40+ countries around the world.

                                 We sell the system for $10K-$40K at a build cost of around $8K, but that's not the point. The point is to sell $200 disposable "razor blades" (that we make for $50 net per copy after $5M development costs) that the system enables our users to benefit from during their $2-5K operations. Our best users do ~1500 operations a year, for a profit of $225K annually per device to us on the "blades" alone. So, we're not Facebook, but just this one device (of about 6 that we make) nets around $700M per year - not all users are "best" users, some only do 10 operations per year per device. Our software dev team for this device might run a total headcount cost of around 30 (coders, testers, quality, etc.), at maybe $220K per head (not that we take home anywhere near that), so we're only costing about 1% of gross revenue - not a bad margin, and not an insignificant business model.

                                 Over the last 30+ years, I have worked for a dozen similar companies with similar products, and Java is just not our horse, though we are starting to let the Javascript camel's nose under the tent flap, in the name of diversity and accommodating the available hiring pool, which includes our latest programmer who took a maternity leave last month.

                                --
                                Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
  • (Score: 2) by Freeman on Thursday July 07 2022, @10:48PM (1 child)

    by Freeman (732) Subscriber Badge on Thursday July 07 2022, @10:48PM (#1258778) Journal

    The author goes on to reason that new languages are mostly created for control and fortune, citing Microsoft's C# as an example of their answer to Java for a middleware language they could control.

    That's probably the best explanation. Whether it's Microsoft trying to control X or people just trying to get out from Microsoft/Apple/BIGCORPNAMEHERE's thumb.

    --
    Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 2) by Freeman on Thursday July 07 2022, @10:52PM

      by Freeman (732) Subscriber Badge on Thursday July 07 2022, @10:52PM (#1258779) Journal

      Okay, I just more or less agreed with two different explanations. Brain is smooshed at end of day.

      They are both plausible explanations though. Likely as with anything, not just one thing is responsible for the creation of so many different, yet similar things.

      --
      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
  • (Score: 5, Funny) by oumuamua on Thursday July 07 2022, @11:41PM (1 child)

    by oumuamua (8401) on Thursday July 07 2022, @11:41PM (#1258783)

    Needs to be mentioned, if for no other reason, for having the funniest promotion video ever: https://www.youtube.com/watch?v=rRbY3TMUcgQ [youtube.com]

    • (Score: 0) by Anonymous Coward on Friday July 08 2022, @12:29AM

      by Anonymous Coward on Friday July 08 2022, @12:29AM (#1258790)

      Hi Joe.

  • (Score: 3, Funny) by Samantha Wright on Thursday July 07 2022, @11:53PM (3 children)

    by Samantha Wright (4062) on Thursday July 07 2022, @11:53PM (#1258784)

    As a PLT enthusiast, this article is so annoying that I'm going to go create a new programming language. Right now.

    • (Score: 4, Informative) by hendrikboom on Friday July 08 2022, @12:21AM (2 children)

      by hendrikboom (1125) on Friday July 08 2022, @12:21AM (#1258787) Homepage Journal

      PLT. Do you mean PLT Scheme, which has been renamed and is now called Racket?

      For those not in the know, it's a language whose ultimate ancestor is LISP, and has been extended to make language extensions easy. So you can build new notations on top of the existing language, and then the new stuff can be completely compatible with the old.

      You can carry this out to an extreme degree if you choose. One of the Racket family of languages is Algol 60 [racket-lang.org]

      Another is Scribble [racket-lang.org] -- a library and a new syntax for Scheme that makes it into a text mark-up language.

  • (Score: 2, Funny) by Mojibake Tengu on Friday July 08 2022, @12:19AM (14 children)

    by Mojibake Tengu (8598) on Friday July 08 2022, @12:19AM (#1258785) Journal

    Why Are There So Many Programming Languages?

    Because clowns under IQ160 are unable to learn and use macro assembler effectively.

    Why would you want more than machine language?
    https://blog.deta.sh/posts/assembly/ [blog.deta.sh]

     It is a recorded fact that John von Neumann wrote programs in machine code for ENIAC directly from his head, with an ink pen, onto paper forms that were passed to the typist girls for punching.
     No pencil, no bugs.
     And his quotes about compilers for programming languages being useless tools for the mentally retarded are legendary.

    This civilization is degenerate.

    --
    The edge of 太玄 cannot be defined, for it is beyond every aspect of design
    • (Score: 5, Insightful) by Kell on Friday July 08 2022, @01:24AM (2 children)

      by Kell (292) on Friday July 08 2022, @01:24AM (#1258803)

      That's cool and all, but part of the power of technology is that it makes its utility available to those who are not god-geniuses. A tool that can only be used competently by the elite is far less valuable than a tool that can be used competently by an average person. Unless you have an army of von Neumanns available (perhaps produced by some sort of von Neumann machine?) then ultimately you must compromise if you want to be productive. It's not degenerate: it's merely pragmatic.

      --
      Scientists ask questions. Engineers solve problems.
      • (Score: 3, Interesting) by krishnoid on Friday July 08 2022, @03:26AM

        by krishnoid (1156) on Friday July 08 2022, @03:26AM (#1258811)

        And yet with tools like compilers and static analyzers and filesystems and keyboards, we still have trouble following rigorous discipline [joelonsoftware.com] when using those tools. Probably reflects more on the human condition than competency, really.

      • (Score: 2) by JoeMerchant on Sunday July 10 2022, @09:24PM

        by JoeMerchant (3937) on Sunday July 10 2022, @09:24PM (#1259579)

        Not everyone should be designing bridges and skyscrapers, no matter how easy to use you make the design tools.

        --
        Україна досі не є частиною Росії Слава Україні🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
    • (Score: 3, Funny) by Reziac on Friday July 08 2022, @02:15AM

      by Reziac (2489) on Friday July 08 2022, @02:15AM (#1258806) Homepage

      REAL programmers use COPY CON PROGRAM.ZIP

      And here I thought that was a joke!

      --
      And there is no Alkibiades to come back and save us from ourselves.
    • (Score: 5, Interesting) by DannyB on Friday July 08 2022, @03:31AM

      by DannyB (5839) Subscriber Badge on Friday July 08 2022, @03:31AM (#1258812) Journal

      If I write my program in Java and my competitor writes in a macro assembler, I will beat my competitor to market by a year. My manager and I will laugh all the way to the bank.

      Inefficient runtime you say? Heck, we'll just throw an extra 64 GB of memory and another 4 processors in the production server and call it a day.

      It's cheap. (for the money it will make)

      As mentioned in TFA, once upon a time every cpu cycle and byte mattered because they were very expensive, but developers were cheap.

      Now cpu cycles and bytes are dirt cheap and developers are very expensive. For one month's pay and benefits for a single developer, I could add tons of bytes and cpu cycles to a machine.

      The world has changed.

      Now all that said, I remember reading about Apple's macro assembler back in the day. I was amazed. I never used it, but I understood how I could have conditional logic that could cause macros to expand in different ways. And parameterized ways.

      --
      How often should I have my memory checked? I used to know but...
    • (Score: 2) by Mykl on Friday July 08 2022, @03:54AM (1 child)

      by Mykl (1112) on Friday July 08 2022, @03:54AM (#1258816)

      It is a recorded fact John von Neumann wrote programs in machine code for ENIAC into paper forms passed for punching to typists girls directly from his head, with an ink pen

      He was also a hit at parties.

      • (Score: 1, Interesting) by Anonymous Coward on Friday July 08 2022, @12:11PM

        by Anonymous Coward on Friday July 08 2022, @12:11PM (#1258868)

        Actually, he was well known for hosting quite a lot of parties and was very much the social butterfly. He seemed to have an insecurity about being seen not as a "regular guy," which is one reason that he wanted everyone to call him Johnny.

    • (Score: 3, Insightful) by tangomargarine on Friday July 08 2022, @08:26AM (3 children)

      by tangomargarine (667) on Friday July 08 2022, @08:26AM (#1258834)

      Why would you want more than machine language?

      Readability.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by Opportunist on Friday July 08 2022, @10:55AM (2 children)

        by Opportunist (5545) on Friday July 08 2022, @10:55AM (#1258861)

        I can read asm code all right, what seems to be the problem?

        • (Score: 3, Insightful) by DannyB on Friday July 08 2022, @02:03PM

          by DannyB (5839) Subscriber Badge on Friday July 08 2022, @02:03PM (#1258891) Journal

          Some programs deal with subject matter at a much higher level of abstraction. Your programming language should not force you to focus on the irrelevant details.

           A theorem prover or computer algebra system written in Lisp could be written in assembler. But then it is non-portable, hard to maintain, and difficult (at best) to reason about.

          --
          How often should I have my memory checked? I used to know but...
        • (Score: 3, Insightful) by tangomargarine on Friday July 08 2022, @06:21PM

          by tangomargarine (667) on Friday July 08 2022, @06:21PM (#1258939)

          It's not a question of whether *you* can read your code; it's whether whoever follows you can understand your code later. The same reason why sometimes it's better to hire an ordinary average programmer than a genius code wizard who nobody else can keep up with.

          --
          "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 3, Insightful) by Opportunist on Friday July 08 2022, @10:29AM

      by Opportunist (5545) on Friday July 08 2022, @10:29AM (#1258855)

      True, but do you want to spend your life writing code just 'cause you and me are the only ones who can do it? I'd gladly pass on that menial task to the less gifted while I deal with real problems.

    • (Score: 2) by DannyB on Friday July 08 2022, @02:01PM

      by DannyB (5839) Subscriber Badge on Friday July 08 2022, @02:01PM (#1258890) Journal

      Why would you want more than machine language?

      Development time.

      Development cost. (yes, this is a real thing! Believe it or not!)

      Maintainability.

      Easy refactoring tools.

      Multi platform. I want to take my compiled binary and run it on different operating systems, different types of processors -- and have it become machine code at the very last possible moment. This is a HUGE advantage. I took a Mandelbrot viewer program I wrote in Java in 2004 and years later ran it (the binary!) on a Raspberry Pi which didn't exist when I wrote the program.

      Optimization. Once you have the complete program to optimize, it is possible to do optimizations that are not possible in an ahead of time compiler. For example, the global optimizer could realize that a certain function does not need a vtable entry and could make the calling convention more efficient -- throughout the entire program wherever this function is called. Aggressive inlining of code is possible that an ahead of time compiler and linker cannot do. Once you have the WHOLE program, you can inline some function from one library into where it is called in some other library, in machine code form. The optimizer could rewrite a single function into two variations of that function with slightly different parameter lists so it is called more efficiently depending on how it is used. A class member function that never references the class itself (eg, a 'static' function) could be turned into a static function at runtime -- but this affects every single place in the global program where that function may be called from. This cannot be done in an ahead of time compiler and linker.
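       To illustrate the vtable point with a sketch (hypothetical names; this shows what class hierarchy analysis in a JIT enables, not a transcript of what any particular JVM emits):

```java
// Sketch of a call site a JIT can devirtualize: Shape.area() is
// "virtual", but if Circle is the only loaded implementation, class
// hierarchy analysis lets the JIT call and inline Circle.area()
// directly inside the loop. A separately compiled AOT library must
// keep the indirect vtable call, because another implementation
// could always be linked in later.
interface Shape {
    double area();
}

final class Circle implements Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

public class DevirtDemo {
    // Hot loop: a naive compiler emits one dispatched call per
    // element; a whole-program JIT can turn it into inlined math.
    static double totalArea(Shape[] shapes) {
        double total = 0;
        for (Shape s : shapes) {
            total += s.area();
        }
        return total;
    }

    public static void main(String[] args) {
        Shape[] shapes = new Shape[1000];
        for (int i = 0; i < shapes.length; i++) {
            shapes[i] = new Circle(1.0);
        }
        System.out.println("total area = " + totalArea(shapes));
    }
}
```

       If a second Shape implementation is loaded later, the JIT deoptimizes and falls back to a dispatched call, which is exactly the after-the-fact bookkeeping an ahead-of-time compiler and linker cannot do.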

      Maybe a human can do some optimizations better than a machine. But that is debatable. There are now experiments in using ML (machine lernin') to optimize code in LLVM.

       Finally, that brilliantly written, beautifully crafted macro assembler code -- which is a true work of art -- probably becomes obsolete in five years and is scrapped. Today's microcontrollers are more powerful than the mainframes of yore.

      --
      How often should I have my memory checked? I used to know but...
    • (Score: 0) by Anonymous Coward on Friday July 08 2022, @05:50PM

      by Anonymous Coward on Friday July 08 2022, @05:50PM (#1258936)
      If you're one of those who are writing truly novel programs where most of the code etc has to be written from scratch, then pick a language for all the code that you will write.

      But otherwise pick a language for all the code you WON'T have to write (and document, support, etc).

      If you disagree, please go write an OS and web browser in machine language from scratch and then use it to reply to my comment.
  • (Score: 3, Insightful) by DannyB on Friday July 08 2022, @03:32AM

    by DannyB (5839) Subscriber Badge on Friday July 08 2022, @03:32AM (#1258813) Journal

    If there were one perfect programming language for all purposes, we would all be using it already.

     I snicker at people who are prejudiced against any successful language.

    I have my own favorites. But I recognize that any language that is in widespread use, for a long time, and successful, must be doing something right for someone. It checks all the right boxes for someone.

    --
    How often should I have my memory checked? I used to know but...
  • (Score: 2) by Snotnose on Friday July 08 2022, @03:44AM

    by Snotnose (1623) on Friday July 08 2022, @03:44AM (#1258815)

    Back in the 80s there was a monthly magazine called Computer Languages. Every issue had at least 1 article for a new language. If memory serves, later on it got to be pretty C/C++ specific, but it was always interesting to read about a new language.

    --
    I just passed a drug test. My dealer has some explaining to do.
  • (Score: 1, Insightful) by Anonymous Coward on Friday July 08 2022, @06:17AM

    by Anonymous Coward on Friday July 08 2022, @06:17AM (#1258828)

    I've written my own optimizing compiler (in C) for a homebrew language to fill a specific gap in my pipeline that nothing else would've been suitable for, and I'd imagine that at least *some* of the language surplus out there (originally) stemmed from similar happenstances.

  • (Score: 1) by jman on Friday July 08 2022, @10:46AM

    by jman (6085) Subscriber Badge on Friday July 08 2022, @10:46AM (#1258858) Homepage

    I wonder what would happen if this were applied to Human languages. Would we discover that High German was all about control, and Navajo about expression of beauty?

  • (Score: 2) by srobert on Friday July 08 2022, @04:58PM

    by srobert (4803) on Friday July 08 2022, @04:58PM (#1258918)

    Because God felt threatened by mankind building a tower to heaven and achieving godlike powers, so he confounded their language so that .... wait, what? Oh PROGRAMMING languages, nevermind.

  • (Score: 2) by darkfeline on Friday July 08 2022, @09:03PM

    by darkfeline (1030) on Friday July 08 2022, @09:03PM (#1258980) Homepage

    Because it's easy to make a new language.

    There is never a best anything, whether that be language or culture or book or game or species or plant or... What we end up with is something that is "good enough". So it comes down to how easy it is to make a new thing that is better for me compared to the existing thing that is good enough.

    Why aren't there thousands of different OSes? Because writing an OS is hard, much harder than using the "good enough" OSes we have now.

    Making a new language is easy; any decent programmer could do it in a day. So the threshold for "not good enough" can be as low as "I don't like this bit of syntax".

    --
    Join the SDF Public Access UNIX System today!