Over at ACM.org, Doug Meil posits that programming languages are often designed with certain tasks or workloads in mind, and in that sense most languages differ less in what they make possible, and more in terms of what they make easy:
I had the opportunity to visit the Computer History Museum in Mountain View, CA, a few years ago. It's a terrific museum, and among the many exhibits is a wall-size graph of the evolution of programming languages. This graph is so big that anyone who has ever written "Hello World" in anything has the urge to stick their nose against the wall and search section by section to try to find their favorite languages. I certainly did. The next instinct is to trace the "influenced" edges of the graph with their index finger backwards in time. Or forwards, depending on how old the languages happen to be.
[...] There is so much that can be taken for granted in computing today. Back in the early days everything was expensive and limited: storage, memory, and processing power. People had to walk uphill and against the wind, both ways, just to get to the computer lab, and then stay up all night to get computer time. One thing that was easier during that time was that the programming language namespace was greenfield, and initial ones from the 1950's and 1960's had the luxury of being named precisely for the thing they did: FORTRAN (Formula Translator), COBOL (Common Business Oriented Language), BASIC (Beginner's All-purpose Symbolic Instruction Code), ALGOL (Algorithmic Language), LISP (List Processor). Most people probably haven't heard of SNOBOL (String Oriented and Symbolic Language, 1962), but one doesn't need many guesses to determine what it was trying to do. Had object-oriented programming concepts been more fully understood during that time, it's possible we would be coding in something like "OBJOL" —an unambiguously named object-oriented language, at least by naming patterns of the era.
It's worth noting and admiring the audacity of PL/I (1964), which was aiming to be that "one good programming language." The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's.
The author goes on to reason that new languages are mostly created for control and fortune, citing Microsoft's C# as an example: an answer to Java that gave Microsoft a middleware language it could control.
Related:
Non-Programmers are Building More of the World's Software
Twist: MIT's New Programming Language for Quantum Computing
10 Most(ly dead) Influential Programming Languages
(Score: 4, Insightful) by looorg on Thursday July 07 2022, @10:31PM (38 children)
Why? Everyone thinks they can invent a better language, one that fixes that ONE (or more) annoying thing that all (or some of) the others have, or one that includes the best parts of umpteen other languages in one super-meta-language. It's a self-sustaining problem that will never be solved. Somehow, the more new languages there are, the more problems there are, and the more shit they appear to become. There is always another one just around the corner that promises to fix all and be the one. There can be only one.
It's either that or it's so the ransomware people in that other post somewhat below this one can update their malware to the new shiny language that increases malware performance ...
(Score: 3, Insightful) by gringer on Thursday July 07 2022, @10:41PM (1 child)
Relevant XKCD:
https://xkcd.com/927/ [xkcd.com]
Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
(Score: 2) by Freeman on Thursday July 07 2022, @10:45PM
Yep, that's it in a nutshell.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 5, Insightful) by JoeMerchant on Friday July 08 2022, @01:20AM (35 children)
Or, you can just fix the problems with an updated API on top of C++...
Need garbage collection? There's a library for that.
Need type-free variables? Try a variant class.
and on and on and on and on...
Or we can all start watching our indentation so we can call it Python (or just use a style Nazi on C++). https://bitbucket.org/verateam/vera/wiki/Introduction [bitbucket.org]
Or we can force ourselves into functional paradigms and call it Erlang (or just use functional paradigms in C++). https://docs.microsoft.com/en-us/archive/msdn-magazine/2012/august/c-functional-style-programming-in-c [microsoft.com]
In the end, most new languages are themselves coded in C++... the only thing that makes C++ "too hard to use safely" is programmers in too much of a hurry to enforce safe practices in their coding, and there are more and more static analysis tools every day that will warn you about all kinds of unsafe practices - even some perfectly safe ones, if you practice them correctly.
Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
(Score: 2) by RamiK on Friday July 08 2022, @10:44AM (21 children)
The purpose of programming languages is to simplify assembly/machine code via abstractions. C++ provides so many incompatible abstractions with shitty syntax and backwards grammar that it fails to simplify much of anything. And when all of that meshes across different libraries, it all just breaks spectacularly when it comes time to actually understanding what the hell is going on.
The fact that you have so many competent specialists across different problem-spaces all looking at this and that and concluding that the issue is better off handled with a different language goes to show you there really isn't a single legitimate use-case where C++ can do the job well enough that isn't entirely isolated and self-contained.
compiling...
(Score: 3, Interesting) by JoeMerchant on Friday July 08 2022, @06:35PM (20 children)
You sound like you think the majority of professionally employed programmers are competent. I find the opposite to be true, which is why all these training wheels and sandboxes have been provided for them to work in.
No matter how "idiot proof" you make a development environment, there's always a more clever idiot that will come along and break things.
gcc provides a layer of abstraction from machine language to a reasonably human readable language. Boost or Qt or your library of choice provides another layer of proven implementations of common constructs and tasks. There are plenty of "drag and drop codeless" software construction systems which are essentially apps written on libraries, and they have their place.
All I see Python providing is a way to stitch together libraries (mostly written in C or C++) in such a way as to recreate version-management hell, writ even larger than Microsoft ever managed with DLL hell.
Go, Rust, Ruby, and their faddy friends? There's a reason they come and go.
(Score: 3, Insightful) by RamiK on Saturday July 09 2022, @05:33PM (19 children)
Python is abstracting C++/C. C is abstracting assembly. Assembly abstracts machine code. Machine code abstracts microcode. Microcode abstracts logic gates. Logic gates abstract transistors. Transistors abstract electrical differential equations...
Your "readable" is another specialist's training wheels. There's no clear line being crossed here. You can build a computer out of relay logic. You can do analog computing that outperforms anything done with transistors. And you can take analog all the way down to quantum computing, since that's basically what it all is.
That's precisely how I see C++ vs. C: just a (particularly bad) way of stitching together C code.
Mind you, I think Python 3 has gone too multi-paradigm and is showing serious C++-like problems nowadays. However, despite its faults, it's still pragmatic to use in many cases, at least compared to Perl or Ruby.
I'll give you Ruby, mostly because Rails is coming and going all the time... But Rust has only now been accepted into the Linux kernel as a development language, while Go has similarly only recently (1.18 generics and 1.19 memory model) started to aim at growth beyond its well-carved back-end niche.
Regardless, there's nothing wrong with languages coming and going. Like spoken languages, it's only natural for languages to reflect their population's migration from different domains. The problem begins when you're trying to stay backwards compatible... That leaves you talking pig Latin and writing Chinese script.
(Score: 3, Insightful) by JoeMerchant on Sunday July 10 2022, @08:49PM (18 children)
I don't view auto destruction upon scope exit as training wheels, but garbage collection is, and that's because scope exit is a precisely defined behavior, whereas GC is a squishy "just don't worry your pretty little head about the details" implementation.
Abstraction that results in predictable results because similar things are done the same way every time is simply good practice. Variables that guess their type for you? Training wheels.
The thing I like least about Python is that it doesn't add any useful abstraction compared with including a library, but when you ask it to do something like an RGB i8 to HSL f8 translation, it is dog slow (~100x slower than C++), forcing you to jump into C or Fortran or C++ to get something that simple done, unless you can find a reliable implementation out there. And stitching together so much simple stuff is A) a significant management problem and B) a serious security issue.
(Score: 2) by RamiK on Monday July 11 2022, @07:58AM (17 children)
Garbage collection is to the heap as automatic variables are to the stack. Sometimes it shares the scope rules. Sometimes it has its own rules. Either way, the lifetime of variables is always clearly defined in every language's specifications.
C does implicit type conversions all the time: https://www.scaler.com/topics/c/implicit-type-conversion-in-c/ [scaler.com]
C++ templates and generics take it a step further.
Nothing adds anything to C++, since C++ already has, and abstracts over, everything. The problem is that doing everything, everywhere, all at once with so much ass-broken syntax leaves you with unjustifiable cognitive overhead, which comes down to unsustainable productivity and manpower costs. Literally every Python, Ruby, or whatever popular code-base is an example of a project that won the time-to-market competition against C++. Even at the low level, you find MicroPython winning over not just C++, but even C and assembly.
And just when you think C++'s existing library ecosystem is enough to compensate for its slow development times, you see projects like Redox OS gradually redoing everything from scratch, design included, and still outpacing any other similar development effort. And you don't have to look so far into the newer languages: the Arcan guy is writing his own window manager, game engine, and pretty much a whole graphical user-land from scratch with C and Lua.
Add it all up and it becomes clear C++ just isn't worth it at most scales. In fact, C++ is often used specifically to keep the cost of entry so high that the competition won't bother. That's what screwed up Mozilla: they tried competing against Chromium using the same high-cost tool set, so, naturally, they couldn't keep up.
(Score: 2) by JoeMerchant on Monday July 11 2022, @01:58PM (16 children)
>with so much ass-broken syntax means you end up with unjustifiable cognitive overhead
Chinese speakers say this (and quite justifiably) about English. English speakers are mostly too ignorant of Chinese to have a complaint like this. English speakers complain about how the French do their numbers, Parisian French complain about - well - everything.
Point? It took me 5+ years of (solo, unguided, unmentored) daily professional practice with C++ to learn the parts of the ass-broken syntax that I use frequently. That probably could have been cut down to a couple of years if I focused 20% of my time on learning the language instead of just using it to get stuff done. Could have been a year or less with a good mentor who spent 20% of our mutual time doing code reviews of what I, and they, had written in the other 80% of our time - there's a best practice that I have never ever seen implemented in the real world, entirely due to the perceived and actual values of management.
>C++ is often used specifically to keep the cost-of-entry so high that the competition won't bother.
More than twice, I was hired to untangle a bunch of academic code written in user-unfriendly combinations of: Matlab, Fortran, Python, Squirrel Script (yes, it's actually a thing...), ancient C, of course Java, and various other junk, and translate it into a user-friendly and maintainable package. Being a Qt/C++ speaker, that's what I recommended, and implemented in, and reduced the various toolchain + language towers of Babel to a single-language, single-API code base that ported to all kinds of target systems in a single-click-to-launch (or auto-launch on power-up) user experience. The "powerful abstraction" Matlab code was hiding an unnecessary nesting of a loop which, when untangled, sped up execution of the C++ code 100x - just translating Matlab to C++ was good for another factor of about 2, but the real gain was in our multi-programmer code review of the guilty module which the profiler pointed out. C++ has pretty much all those fancy toys easily available, like highly developed runtime profilers, which tend to have spotty availability and less robust development in newer languages.
That's where I really balk at flavor-of-the-day languages: in the toolchains' setup and maintenance. You can literally spend days getting the tools figured out before even starting on your code, and if you need to update to version x.y every few months, that adds to the cost. Qt/C++ has some toolchain setup overhead, particularly in the Microsoft-influenced domains, but in general it's one of the quickest and easiest environments I have used for setup from scratch. Of course, I'm biased, because it's what I've been practicing for 16 years now, but during those 16 years I've watched so many man-months sunk into other toolchain setup and maintenance efforts it only reinforces that aspect of my prejudice.
(Score: 2) by RamiK on Monday July 11 2022, @10:19PM (15 children)
My Chinese isn't even HSK 1 - I've basically learned just enough pinyin and some grammar to pull off wo bu hui shuo hanyu in a pinch and more or less catch basic keywords - but from what I've seen and heard, Chinese grammar is a delight compared to English.
A rewrite of interpreted academic code into a compiled language is expected to yield 10-100x performance improvements regardless of whether it's C++, C, or whatever. Python specifically is officially quoted as being 10-100x slower than C exactly because that's the unavoidable cost of dynamic typing and garbage collection. The thing is, the question isn't how pure interpreted dynamic code performs, but what would have happened if you'd written the performance-hungry loops in C libraries and bound everything else in Python, Go, or Lisp for all I care, so it won't take 5 years of training and 15 years of practice to be able to maintain and develop the code going forward?
Toolchain setup and maintenance is a big part of the overall picture but it doesn't explain Java, C# or even, albeit after many years of dedicated focused work, Go's lack of adoption within the C++ ranks.
Qt is a good example of everything that's wrong with C++: instead of having a few widgets that can be used with different rendering engines and whatnot, the Qt devs just went full retard and implemented their own memory management, types, and main loop. Admittedly, the GTK and EFL devs screwed it up as well in much the same way... And I suppose at least with C++ you can override the methods, so it's not quite as awful as GTK and EFL ended up being... But really, even if you claim to like C++, Qt isn't really C++ anymore. It's a dialect of C++. A horrible, mutated dialect running on some fubar garbage-collected OS...
(Score: 2) by JoeMerchant on Monday July 11 2022, @10:54PM (14 children)
>what would have happened if you'd written the performance-hungry loops in C libraries and bound everything else in Python, Go, or Lisp for all I care
So, your suggestion is to learn just enough C or C++ (or Fortran would do the job too) to implement the stuff that Python or whatever is too hobbled to handle, then hybridize that with the training wheels language so we have the opportunity to experience the worst of both worlds plus the grafting of multiple languages into a single project?
I am actually a fan of message broker systems, AMQP, MQTT, DDS, whatever. Build a system with whatever (single) environment makes sense for the task at hand, be that embedded microcontrollers, or C++ or whatever and let them coordinate through the (well established and debugged) message broker. Due to the ability to have tiny microcontrollers and heavy lift operating systems in a single device, it makes a lot of sense to me.
Qt has gone full retard in many ways, but it is a robust enough environment that I can not only shill/plug the company line: code once run anywhere, but also step out to straight C++ or other included libraries when the Qt answers are lacking, or otherwise undesirable. All without having to graft multiple languages and tool chains into a single executable.
(Score: 2) by RamiK on Tuesday July 12 2022, @05:50AM (13 children)
I'm not sure what you're complaining about. Between templates and C itself, C++ is already trilingual. The problem is the tool-chains. But, between how Go embeds C and how Lua is embedded in C, you obviously have better choices than Python when you want C++ expressiveness and performance without C++ syntax, so just do that.
And it's not my solution. Again, it's what's already being done in most, if not all, large successful C++ projects. Less so with C, since you can still write large-scale C code in the kernel and library problem space without everything C touches deteriorating into an unsafe language. However, I can't say the same about C++. Most C++ libraries assume method overrides, so binding to them creates issues. It's not always the case (protocol and file-type C++ libraries typically avoid this, so there's no issue with using them...). But it's the exception that proves the rule.
Careful there. Generalizing communication protocols and IPC as "message brokers" might work on paper, but context switching and most data sharing make IPC slow (well, on general-purpose machines at least...), so that generalized design only really applies to networked machines and specific services/daemons. You don't want to end up with Plan 9...
As previously mentioned, the company line often trends towards "the tools we know and that keep the competition away are good enough". It clearly doesn't reflect the market, seeing how Qt somehow managed to lose market share to dog-slow abominations like Electron.
The company line is the market and the market has decided in favor of Electron so it might be best not to refer to the company line going forward.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @10:35AM (12 children)
Qt marketing has been a schizophrenic shit show for the past sixteen years. The fact that the product survived a Microsoft acquisition and still remains useful is a testament to the strength of open source licensing, yet even today aggressive rent seeking by Qt marketing continues to scare away long time large and small users.
As for the tech, they lost me when they introduced Quick/JavaScript; the timing and horrible state of uselessness when it was rolled out suggest to me that the whole thing is/was a Microsoft plot to kill Qt market share.
Still, it is a (the?) great cross-platform desktop environment and is very widely used in embedded devices with touch displays. Pretty much any time you encounter a product that runs on Linux and Windows and OS X, Qt will be behind the curtain.
As for having a choice to avoid Python, I don't find that to be the case in AI/ML circles, and the effort there seems to be 90% in the data gathering, 9% in the tool chain setup and less than 1% in the coding. Ironic that they spend so much time waiting for each learning cycle to run and have such an optimization hostile environment. Sure, there is plenty of pre baked hardware acceleration, but how many Python coders does it take to tweak a hardware acceleration module?
(Score: 2) by RamiK on Tuesday July 12 2022, @05:39PM (11 children)
You know, half the apps on my phone are webkit front-ends, and they launch instantaneously. I know it's because webkit is constantly cached in RAM, to the tune of ~100-350MB (platform and model dependent)... But, the thing is, RAM is cheap, abundant, and a prerequisite for 4k anyhow, so you knew you'd get plenty of it even 10 years ago. So, what were they supposed to do as the market shifted away? Downsize? They had to deliver the product the clients wanted, and the clients wanted .JS, since C++ devs were too expensive and time-consuming to train.
It's all hardware accelerated. Each and every function. I've used some of those libraries for dataset clustering (nothing fancy. basically a few dozen lines of sklearn and numpy for DBSCAN) and it felt like writing gl shaders whereby the pipeline was so stupid obvious that you'd actually have to go out of your way and forcibly use the python standard library and data types instead of relevant libraries' functions and data types to screw up. And mind you, numpy actually forces explicit casting to do un-optimized operations so you're really going to have to make a conscious effort to fuck things up here.
Are you paying per individual or per working and training hour? Industrial scaling cost is what drives automation, so I don't see why you'd measure in number of coders instead of man-hours. C++ devs are scarce and costly. Python devs and faster hardware are cheaper and more readily available. The point is that there's a middle way: have the low-level written in actual low-level C and the high-level written in actual high-level Python/Go/flavor-of-the-month. It's the most cost-effective approach, as the market has repeatedly shown.
Besides, don't forget that a very well funded, modern, all-C++ desktop OS is falling behind the antiquated 70s-mainframe knockoff that C Linux really is. So, there's a fundamental fallacy behind C++ puritanism that isn't easy to reconcile with reality unless you squint real hard to avoid the entirety of your software stack.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @07:12PM (10 children)
>C++ devs are scarce and costly. Python devs and faster hardware is cheaper and more readily available.
I agree about the abundance of RAM and fast hardware - I regularly read entire, fairly large, files into a QString and toss 'em around in RAM before writing them back out, something that would have been considered "amateurish and unscalable" back in 8 bit days.
However, there's a bit of a fallacy about Python devs being cheaper, and I think it's driven in part by miserly management. Follow me here: in _generic big city_, C++ devs earn $150K/yr while starting Python devs make $50K. Sounds like you can afford 3:1 Python:C++ devs, and I know far too many managers who think that way.

But... the first year of work at most jobs is about 80% productive, at best, often much worse, so turnover makes your employees costlier, and those low-cost devs will turn over faster for lots of reasons. Then there's the overhead cost that doesn't scale with salary, particularly with "back in office" management demands: something on the order of $50K per developer per year when you're at a place that does real HR and more than one level of management. So, to retain those Python devs for more than 9 months, you'll be giving them raises to $75K fairly quickly, putting your "loaded" costs at $200K/head for C++ vs. $125K/head for Python.

Also, don't forget your mythical man-month cost of communication: the more devs you have, the more time they have to spend talking to each other to be effective, and all these "cheap and readily available" Python programmers will be needing more of both communication and mentoring time, particularly considering that people who whine about C++ being "too hard" are likely to need much more hands-on mentoring and guidance vs. those who can figure stuff out for themselves, given some time and a broadband connection.
I walked into a shop that had about 6 junior devs, heavy Python preference but some C/C++ and Matlab going on too. One of the junior devs was every bit my equal in productivity and ability, though he didn't know how to push back on management when they were being obtuse. The other 5 didn't add up to the two of us, not even close. Some had compensating other talents, like communication with the academic community and grant sources, but all those "cheap" programmers were costing far more than two of me.
I started near ground zero at another startup and hired in two programmers with limited C experience to meet the requirement: "Build a GUI app on OS-X" - that's where I started using Qt, and the three of us learned enough to be productive in it in the space of a few months, and had our first decent looking translation of the Fortran/Matlab mess we were handed within about 4 months from "go." Python wouldn't have gotten us any closer to the OS-X "native look and feel" goal at that time, and I don't think the learning curve for my two hires was particularly steep for Qt - one had some OpenGL experience, the other could at least follow examples in C. Should mention: pickings were really slim in that job market, had to turn away about 8 interviewees who literally couldn't program their way out of a paper bag given sample code that cut an opening from top to bottom. Hiring them to program in Python would have been just as pointless as teaching them Qt.
The real cost of using developers who have trouble with "hard languages" is that they have all kinds of other challenges too... they put 4 levels of nested loop in a place where 3 will do the job, and your execution time is 100x what it should be, and it doesn't matter how great the syntax of your language is, unless you have a library that calculates a value histogram of a volume ready to hand, they will be writing their own loops to get it done. Hardware acceleration is easily overwhelmed by bad implementations.
Re: C vs C++, I have no great loyalty to C++ and objects. One really great aspect of the torturous syntax of C++ is that C "just works" anywhere you drop it into a C++ program. C++ and objects are really good analogies for windowed GUI widgets, and they're pretty good for containers like strings, lists of strings, hash tables, etc., but they definitely got overwrought in the late 1990s, pushed into things where they had no business displacing a simple struct.
(Score: 2) by RamiK on Tuesday July 12 2022, @09:24PM (9 children)
If Python were a new language, I'd say you might have a point. But many big companies have been hiring Python developers for a decade now, and they aren't in any way looking to replace Python with C++.
Besides, don't be so naive as to believe your cost analysis is even remotely close to what your boss has in mind when making those hiring decisions. Big companies have whole teams of statisticians and HR writing 40-page cost/risk-analysis reports for every project that weigh in on everything at resolutions you wouldn't believe. e.g. There are reports that look into different school districts' bus ride times to measure potential performance implications cut against marital status, age, and gender.
So, when they look at the C++ hiring market and decide to diversify into python, they're not doing it by mistake.
That's true for all entry positions, C++ included. The reason you're not seeing it is that so very few young people go into C++ these days.
But how many good developers does a team need to correct the mistakes of bad ones? Certainly they don't all need to be superstars. And we agree they can't all be terrible at their jobs... So, already, the premise is mixed skill levels. And from there it's pretty obvious you'd want to leverage the tools for the job by having the less skilled work with the training wheels on while the more skilled work on their end. i.e. The Golang approach.
Quick observation: note how everything you've listed, plus parallelism, readily falls into dataflow. So if we only had a sane, domain-specific dataflow language that didn't try to replace C but to complement it in those very specific use cases, it would have yielded more agreeable results than C++'s OO.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @09:46PM (8 children)
>when they look at the C++ hiring market and decide to diversify into python, they're not doing it by mistake.
Of course that depends on the sophistication of the management group. Credit Suisse First Boston did all kinds of interesting due diligence when acquiring a new spin of a tech that we developed over the course of about 10 man-years, with about 6 man-years in the research and development of the analysis software component. On the hardware side, they flew us up to DEKA to have Mr. Kamen (Segway inventor, among other more significant, less well-known things) himself give them a read on the electronics side of things and whether or not it really did what we had been selling it for for the previous 20 years. For the software, they basically opened the newspaper and found 10x as many ads for the Microsoft API of the moment as they did for the (then quite superior) Borland environment. So, at the drop of a hat, they hired a team of 4, which grew to 8, to recode the application in the MS API with an initial projected completion of 3 months, growing to 12 before they got it done. They easily paid more for those programmers than they did for the rest of the acquisition, simply because the MS API had more ads in the paper.

P.S. If you're involved with investment bankers at any time, you should know that CSFB incorporated the new entity in Delaware, then proceeded to issue debt from their own bank to the company which they had bought with 80% stock; then, when they had issued enough debt to make the net value of the organization $0, they reissued the stock, giving all the investors checks for $0.01 in exchange for their shares, some of whom had invested over $1M of their own money about 5 years earlier. Me? I only had stock given as bonuses, nominally worth about $80K at one point; I also received a check from CSFB for $0.01. Debt takes precedence over equity, and in Delaware you can pull shit like that and screw the equity holders legally.
End of the day: they put a lot more effort into controlling the legal framework the acquisition deal happened under than they did language / API selection.
>OO
Don't forget, OO as a concept comes from the 1980s. Think about the hardware that was available in 1985. OO has been widely abused since, and earned some of its bad reputation - just because you have a buggy whip on your electric sports car doesn't mean you have to pull it out and use it.
Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
(Score: 2) by RamiK on Wednesday July 13 2022, @07:40AM (7 children)
I've heard a similar Microsoft vs. Borland account where the decision fell in favor of Microsoft since they were offering free on-site support, while Borland weren't even willing to put a rough quota down on paper. Also note that while engineering considers having all your software come from one vendor to be putting all your eggs in one basket, for acquisitions it means better leverage: the bigger you are as a client, the better the deals and treatment in general. So, from management's point of view, out-spending a single development effort is often worth it to deepen a service deal. And note how it's yet another side of that "C++ is often chosen to lock out competition" thing: a lot of what we think of as purely technical decisions comes down to the job market, hiring options, and third-party corporate connections.
Yeah, that's pretty typical of the east coast. The legal and finance frameworks in New York and Texas are downright hostile to startups.
Anyhow, this ties well into my point: C++ isn't just a (bad) programming language. It's a specific supply chain of HR, tooling, and compiler/OS providers that is not only unjustified on technical considerations, but also involves some dubious business practices - it's always support package deals here, vendor lock-in there... And for most businesses, especially small-to-medium ones, it's not only a bad technical choice; entering that particular ecosystem is downright hazardous. When you think about it like that, the whole Nokia-Microsoft and Qt/Trolltech situation stops being one bad anecdote. It's simply the nature of heavyweight tools: if you depend on a big language and a big OS, you're going to need to be able to deal with big companies. So, unless you are a big company, that's not a bed you want to get into.
compiling...
(Score: 3, Insightful) by JoeMerchant on Wednesday July 13 2022, @11:34AM (6 children)
Thank you for sharing your opinions. We agree on most of the facts, but from the perspective of small startups to medium sized dev teams in larger corporations, I arrive at different conclusions.
(Score: 4, Insightful) by RamiK on Wednesday July 13 2022, @12:14PM (5 children)
Yeah I can't argue with that.
(Score: 2) by JoeMerchant on Friday July 15 2022, @02:06PM (4 children)
And if you really want to program in Lisp, it isn't all that far away: https://github.com/Robert-van-Engelen/tinylisp [github.com]
(Score: 2) by RamiK on Friday July 15 2022, @07:06PM (3 children)
Meanwhile in Texas: https://www.youtube.com/watch?v=DXvJ8duZqdA [youtube.com]
(Score: 2) by JoeMerchant on Friday July 15 2022, @07:38PM (2 children)
Cool. When I was in school my programmable calculator had the one true language: BASIC. Graphing calculators weren't a thing yet.
(Score: 2) by RamiK on Saturday July 16 2022, @08:37AM (1 child)
Did it have an alphanumeric display or a dot matrix one? If it's running BASIC, it has indirect addressing and conditional branching, so you only need the high-res display to draw graphs...
(Score: 2) by JoeMerchant on Saturday July 16 2022, @01:27PM
It was dot matrix, but the resolution was something like 128x8, and I believe it was only character-addressable from the software layer.
(Score: 2) by DannyB on Friday July 08 2022, @02:15PM (12 children)
If you're still writing in C++ with high level libraries, that is not fixing the problem.
What is missing is that C++ is still a table saw without any guards (or perhaps a few more guards than the C table saw has).
Some basic things should simply not be possible.
1. disposing of something twice
2. not disposing of it
3. using it after disposing of it (a dangling pointer/reference somewhere)
These three things alone have cost, I would dare say, billions of dollars in bugs.
It should not be possible to cut off your fingers.
Some languages allow you to reason at a much higher level of abstraction; Lisp is but one good example. What you are describing amounts to a poorly specified, bug-ridden, unsafe, non-standard implementation of Lisp. Sure, I could write a program in that. But a standard Lisp implementation has already been labored over by experts, and it is as efficient as it is capable of being. So what does a set of libraries on top of C++ offer over a good Lisp implementation?
These other languages exist for a reason. People can learn these languages that are suited for the problem domain, and get hired for knowing them.
Would it even be possible to use C++ and libraries to get a type system such as the one in the Julia language?
If you think a fertilized egg is a child but an immigrant child is not, please don't pretend your concerns are religious
(Score: 3, Insightful) by JoeMerchant on Friday July 08 2022, @07:18PM (11 children)
First: I only cut about 2mm off of my thumb tip before investing in the SawStop. I still used cheap table saws after the tip healed over, but the SawStop is much better for serious work, and the safety factor does provide confidence when doing a lot of cutting.
If you want guards like double disposal protection, there are "smart pointers" for that. No matter the language it's hard for a compiler or interpreter to know when you really want to keep something in memory or not. I do think that auto destruction when leaving scope is a very good thing that C++ does, eliminating the need for explicit free calls on most allocations. For use after disposal protection, smart pointers take you most of the way there, and I often start a signal handler with nullptr checks on all of the smart pointers in the function. If you want to go nuts with it, you can overload the pointer dereference operator to always do a null check before access and throw an exception. I won't let you try-catch and throw in my projects, but the language certainly supports it.
>It should not be possible to cut off your fingers.
It is ALWAYS possible to cut off your fingers: in door frames, belt-and-pulley machinery, fan blades, etc. You can put on guards like SawStop, but there will be cases (like cutting wet wood) where the guards have to be disabled or the tool just won't do its job. Even if the entire constructed world were padded rooms with safe doors, people could still bite their own fingers (and tongue).
>Some languages allow you to reason at a much higher level of abstraction.
Like the container classes that have been ubiquitous for 20 years in C++? You don't need a new language to express higher levels of abstraction in compact forms.
>a poorly specified, bug ridden, unsafe, non-standard implementation of Lisp.
Lisp was first specified in 1958, and it has its place, but it does not seem to have ever been a popular choice of language, maybe there are good reasons for that?
>Would it even be possible to use C++ and libraries to have a Type system such as in the Julia language?
It is possible to write "mixed language" projects in pretty much any combination of languages you choose - possible, but usually a bad idea IMO. It would be possible to do something that pre-compiles C++ code with a type system, if you can specify how you expect that to work without internal conflicts or ambiguity. Many flavor-of-the-month languages do things like that.
If Lisp (or Clojure or whatever) is "better" for what you are doing, then go for it. Lately I have been doing a lot of C++ that makes system calls like a bash script would, but being in C++ gives me much better access to the system message broker and several other standard libraries than bash ever could, and if bash is ever better than C++ I can call a bash script from C++ rather than a series of single commands. Being Qt/C++ means it's easy to put a GUI on these system-calling programs, providing status dashboards, sorted log streams, and a tabbed interface with push buttons or other GUI elements as appropriate to control things. The main power of C++ in this situation is that virtually everything I am asked to work with is relatively easily accessible from C++, without the version control hell of something like Python.
(Score: 2, Insightful) by anubi on Saturday July 09 2022, @09:34AM (10 children)
Thanks for that post.
I am another C++ fan, and will gladly code another library and define new objects as needed.
Yes, C++ lets me do anything - including shoot myself. I am careful with it, and it allows me to write concise, clean, fast code. But I've had my "what's going on here?" learning curve just as anyone else has. After a while, I have developed my own little toolsets and ways of doing things that work for me.
Your post is a refreshing read for me, giving me more ideas for countering arguments about why I avoid some languages - ones I consider to compile to bloated code, where I can't verify that what I need done is all it's doing.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 2) by DannyB on Saturday July 09 2022, @04:23PM (9 children)
You are optimizing for cpu cycles and bytes.
I am optimizing for dollars.
Different problems, different solutions.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @10:56AM (8 children)
If you are optimizing for dollars by choosing tools that ill-trained, low-cost developers can use relatively safely due to their constraints, I would say you are doing it wrong. One good developer is more productive than five cheap ones. Better to spend your low-budget headcount on test, quality, and procedure mavens. Software isn't a sawmill; mistakes on work in progress are only costly if they are released to customers. Best to force your genius developers to explain their work to newbie testers who are backed by management with the power to hold releases as long as it takes to get them right.
(Score: 2) by DannyB on Tuesday July 12 2022, @01:56PM (7 children)
Good developers can make simple mistakes. It happens. Sometimes it takes a lot of time to find them. Even given your very obvious advice of explaining code (which, gee, nobody else would have thought of), mistakes still happen. It is better to eliminate the possibility of making some of the most obvious and worst blunders - especially ones that are purely mechanical and have nothing to do with the problem domain, such as double-freeing a pointer.
The language should fit the problem domain. When you talk to accountants they don't ever seem to mention bits, bytes, cpu cycles, pointers, etc. Your programming language should not force you to consider details irrelevant to the problem.
Even in a low level language like C++ you end up building abstractions upon abstractions. Like auto pointers, just for example. Other languages are merely abstractions that shield you from the raw sharp edges.
Garbage collection is also a huge advantage for most everyday programmers. Most modern languages have GC, especially those of the last twenty years. GC eliminated the three biggest sources of bugs (and vulnerabilities), which have cost billions of dollars. Here on SN I've explained before how GC is actually a performance (latency) advantage for a big, busy commerce server doing business.
You must be right and everyone else is wrong. How did Java get to be the #1 language for over 15 years straight? Even today it stays in the top three. What is wrong with all these people? How do they not have your wisdom to understand their problems and how they should solve them?
I'll say it again, but it will fall on deaf ears:
If there were one perfect programming language that was ideal for all possible uses, we would all be using it already.
Do you not think that in my business we all understand the problems and how to best solve them? And that this has been going on for decades? We explore alternatives and choose things that work best for our problems.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @02:39PM (6 children)
>Good developers can make simple mistakes. It happens.
Absolutely, which is why independent test is key to quality.
>It is better to eliminate the possibility of making some of the most obvious and worst blunders.
It is better to make an obvious blunder than a subtle one. Also, the frequency of automatically or otherwise flagged obvious blunders is a good indicator of how many subtle problems you can expect from the same source.
>GC is actually a performance (latency) advantage for a big busy commerce server doing business.
GC has its place, but it should be an easy choice to make, not a default difficult to change.
>How did Java get to be the top #1 language for over 15 years straight?
Sloppy web developers and other quick-fix junkies. And it's good for that. If you want tiny projects, Java, Python and friends are great quick fix tools to replicate common things. Want to replicate that cool AI project? Just open your jupyter notebook and copy-paste what the previous guy did, just don't expect to be able to dig in deep for changes like processing your images in HSL instead of RGB without a 10x jump up in the learning curve. Were it all coded in C++ to start with, that would be a 5 minute change to the source.
>We explore alternatives and choose things that work best for our problems.
Kudos, most businesses suffer from horrendous inertia, Not Invented Here syndrome, susceptibility to vendor lock-in and incentives, and a general apathy of management to herd the cats in development.
Anything can be coded in any (Turing complete) language. The tendency I have seen with the training-wheel languages is for little projects to get started, show a blinking light or hello world in record time, and generate enthusiasm as they customize to do one or two domain-specific things - then get mired in horrendous maintenance overhead when somebody says something like: "Let's go back and make all the buttons perform closed-loop communication verification after activation." Rather than capability scaling - with code supporting arrays or matrices of functions that get easier to add as you grow - each new widget or feature seems to take a bit more effort than the last.
Of course you can do great things in any language; trac is a pretty impressive Python project that I have used daily for 15+ years. But most Python projects I have seen developed in-house reach a point where they just stop growing due to the effort required for expansion, and the suggestion to start over - usually in another language - comes up more and more often as they grow.
(Score: 2) by DannyB on Tuesday July 12 2022, @05:40PM (5 children)
I'm glad you know everyone's business better than they do. And you know the one perfect language they should all be using.
GC has its place. That place is in most modern languages. There are still languages that do not have GC for uses where GC is wholly inappropriate.
This is the one that tells me you really don't know what you're talking about. Maybe you are thinking of Javascript not Java?
Java is for huge, gigantic projects. Millions of lines of code. Written by many developers. Maintained for years, even decades, by many different people who have no contact with the original authors. Java does very well for this use. Unlike Python 2 vs 3, you can take the very oldest compiled binary or source code and run it on the latest Java versions. Java has very carefully maintained backward compatibility. I think you do not understand the sheer amount of Java code in existence. I'm not saying you should like Java. Just realize that it really does have its place and is not some kind of mistake just because it is not suitable for YOUR purposes.
Python is at the opposite end. Great for small projects because it is interactive and dynamic. Fantastic for prototyping. Even for good sized programs. But not for an enterprise application that I just described. Refactoring is a big problem with dynamically typed languages.
I like Python. I like Lisp (various lisps). Java. Other languages. I use the right tool for the right job. Not a one size fits all.
I take the view that any successful language must be doing something right for someone.
That actually proves my point. We could all write everything in assembly language - do away with C and all higher level languages and mandate only assembly language worldwide for all programming. Period. No exceptions.
Yes, it could actually be done. But at what cost?
There is a reason Java exists. There is a reason it is successful. Java is not the right language for all possible uses.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @07:21PM (4 children)
>And you know the one perfect language they should all be using.
There is no one perfect language. GC has its place, but the languages that use it tend to become unmaintainable byzantine disasters when you put them in the hands of people who "need" GC and ask them to do a large, long lived project with it.
>I think you do not understand the sheer amount of Java code in existence.
I have encountered plenty of it. I have also encountered a great number of management types who thought that Java was their answer for high performance cross platform solutions, and could never quite grasp why it would be unacceptably slow for any of our applications. I even encountered a couple of "Java prototypes" where management steamrollered the crew into "doing it in Java" and had to translate it out to something faster to make it anywhere near competitive in the marketplace. For my purposes, Java usually isn't the answer. I'm sure there are plenty of places where it makes sense, but I also suspect there are many places where it was a bad choice that got implemented anyway because nobody who knew better stood up to explain the issues.
(Score: 2) by DannyB on Tuesday July 12 2022, @08:45PM (3 children)
Java is not slow. It is quite fast, actually. You use commercial web sites that are Java without knowing it. All of the Java bytecode is compiled down to native code. Unlike, say, PHP, Perl, or Python, which are interpreted, when Java services an HTTP request, it is pure native compiled machine code that services that thread.
When a Java program starts up, the Java bytecode is interpreted (slow). As soon as any (at this point, every) function is using a disproportionate amount of CPU, that function is immediately and rapidly compiled into poorly optimized machine code by the C1 compiler, and is put on a list to be recompiled again soon by the C2 compiler. When the C2 compiler comes around, it spends a lot of time and effort producing highly optimized code - better than an ahead-of-time compiler (such as a C compiler) is capable of.
Why better? Because the C2 compiler in Java has the WHOLE PROGRAM, not just part of it. When you compile in C, the compiler is only compiling part of the program. The linker may see the machine code of all the component parts which it links together, but it does not globally optimize them in any deep way.
Java's C2 compiler can aggressively inline code, and it does. It optimizes for speed over size: you can always buy more memory, but you can't buy back time. With a compiled and linked program in C, a function in my precompiled library cannot be inlined into your code where you call my function - your code and my code are independent development efforts. The C2 compiler has the entire program, in Java bytecode form, to work with. It can rewrite function parameter calling conventions. It can recognize when certain methods of an object do not need a vtable entry because they are the only possible call target (something C++ generally cannot do across separately compiled libraries).
Because of all of this, a Java program seems to start up slowly and then "warm up" and run fast. Thus, Java is not a good solution for writing a replacement for the 'ls' command. But it is fantastic for a very large program running on giant servers where the program runs for a very long time without being interrupted or restarted.
I almost never hear anyone complain about Java unless they really don't know much about Java.
My journal article: It's fashionable to hate Java [soylentnews.org]
That article explains a lot, and some SN readers were surprised.
It is funny that the marketplace for enterprise applications already decided this two decades ago.
Ask yourself: why does Red Hat do research on the latest state-of-the-art Java garbage collectors? Why are they spending money on this? See Red Hat's Shenandoah GC for Java, which can work with 16-terabyte memory heaps with a max GC pause time of 1 millisecond. See for yourself. Yeah, Java is slow.
Or Oracle's ZGC garbage collector which is also state of the art and has similar specs to the previous paragraph.
Or why does IBM build its own Java runtime called J9?
And why, oh why, does Microsoft - yes, Microsoft - contribute to Java development? Here's why: because their biggest mega-customers all use Java extensively.
If you're using Java for dinky programs that run in less than 32 GB of memory you're doing it wrong.
(Score: 2) by JoeMerchant on Tuesday July 12 2022, @09:29PM (2 children)
>I almost never hear anyone complain about Java unless they really don't know much about Java.
I had to actively start defending against Java being used on desktop PCs for heavy-lift analysis software 20 years ago. The primary Java cheerleaders had no idea of what they were talking about. At the time, bytecode interpretation in dedicated silicon was one of the "visions of the near future" which, to my knowledge, never materialized.
Since then, Java (not Javascript) web-based applications have occasionally haunted me, forcing end-runs around modern browser security improvements. Small-time stuff, 2-3 man-months to implement, that would have been better implemented in something else to achieve its "low maintenance, cross platform" objectives.
No doubt the large scale web-apps have benefited from advances in JIT compilation, but if you tell me that much of Facebook runs on Java and owes its elegant user experience to the power of Java I'm going to have to laugh you off the board.
I wouldn't advocate C++ for implementing a big web-based app, although I'm listening right now to an HTTP-interfaced MP3 player I coded up last year - it is pretty good for simple-ish stuff.
>why does Red Hat do research on the latest state of the art Java garbage collectors? Why are they spending money on this?
Because the customers want it. That doesn't make it "the best" answer for all problems. If you're at scale with 1000+ coders in a single playground, I see the need for sandboxes, garbage collection, etc. I play more on teams of 50 or fewer, oftentimes over the past 30 years on teams of 5 or fewer engineers. We have always made self-contained widgets, not web-facing million-plus-user monsters. The widgets tend to have 5 to 100 man-years in their development, and serve a handful of users at a time - often just one.
>If you're using Java for dinky programs that run in less than 32 GB of memory you're doing it wrong.
Agreed. Our latest box only has 16GB of RAM.
(Score: 2) by DannyB on Wednesday July 13 2022, @07:17PM (1 child)
It sounds like you have been influenced by Java long long ago and are unaware of what modern Java is like. Java has come a very long way. Java's C2 compiler is one of the most sophisticated compilers there is. Similarly some of its modern GC's.
It was a mistake for browsers to ever support:
* Java applets
* Flash
* ActiveX
* Silverlight
Java's C2 compiler can rewrite functions. For instance, produce two versions with two parameter lists and calling conventions. It can do this because it has the WHOLE program to work with at runtime.
(Score: 2) by JoeMerchant on Wednesday July 13 2022, @08:19PM
>Java's C2 compiler can rewrite functions. For instance, produce two versions with two parameter lists and calling conventions. It can do this because it has the WHOLE program to work with at runtime.
That's nice. Right now, I'm fighting with gcc over optimizing a "whole module" at one time. We have a system backbone of ~3000 properties with message broker getters, setters, default values, value-changed signals, etc., and the biggest of our property-using applications is causing the gcc linker/optimizer's time to balloon: the total system build takes 14 minutes with 2900 properties, 15 minutes with 2925, 17 minutes with 2950, 20 minutes with 2975, 24 minutes with 3000, and so on. The basic problem is that all the properties are written to a single module, and when gcc optimizes this "whole module" at one time, optimization time scales at something like O(n^3) - and we're getting to the point where that is starting to hurt.
I believe we're coming down to horses for courses. If you're writing a Facebook competitor, Java may be your horse. I live in a world of 5-50 developers writing apps that are used by 1-5 users at a time running on a single machine of the Core i7-6xxx class.
We've got some STM-32 accessory boards playing in the system, and might network out for 2% of the total functionality of the system, otherwise we're focused on delivering the best user experience to our local user(s) on ~10K copies of this mostly self-contained system running in 40+ countries around the world.
We sell the system for $10K-$40K at a build cost of around $8K, but that's not the point, the point is to sell $200 disposable "razor blades" (that we make for $50 net per copy after $5M development costs) that the system enables our users to benefit from during their $2-5K operations. Our best users do ~1500 operations a year, for a profit of $225K annually per device to us on the "blades" alone. So, we're not Facebook, but just this one device (of about 6 that we make) nets around $700M per year - not all users are "best" users, some only do 10 operations per year per device. Our software dev team for this device might run a total headcount cost of around 30 (coders, testers, quality, etc.), at maybe $220K per head (not that we take home anywhere near that), so we're only costing about 1% of gross revenue - not a bad margin, and not an insignificant business model.
Over the last 30+ years, I have worked for a dozen similar companies with similar products, and Java is just not our horse, though we are starting to let the Javascript camel's nose under the tent flap, in the name of diversity and accommodating the available hiring pool which includes our latest programmer who took a maternity leave last month.