In which ESR pontificates on the future while reflecting on the past.
I was thinking a couple of days ago about the new wave of systems languages now challenging C for its place at the top of the systems-programming heap – Go and Rust, in particular. I reached a startling realization – I have 35 years of experience in C. I write C code pretty much every week, but I can no longer remember when I last started a new project in C!
...
I started to program just a few years before the explosive spread of C swamped assembler and pretty much every other compiled language out of mainstream existence. I'd put that transition between about 1982 and 1985. Before that, there were multiple compiled languages vying for a working programmer's attention, with no clear leader among them; after, most of the minor ones were simply wiped out. The majors (FORTRAN, Pascal, COBOL) were either confined to legacy code, retreated to single-platform fortresses, or simply ran on inertia under increasing pressure from C around the edges of their domains.

Then it stayed that way for nearly thirty years. Yes, there was motion in applications programming; Java, Perl, Python, and various less successful contenders. Early on these affected what I did very little, in large part because their runtime overhead was too high for practicality on the hardware of the time. Then, of course, there was the lock-in effect of C's success; to link to any of the vast mass of pre-existing C you had to write new code in C (several scripting languages tried to break that barrier, but only Python would have significant success at it).
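For those who haven't lived through that lock-in: the barrier break ESR credits to Python is its foreign-function story. A minimal sketch using the standard-library ctypes module (one modern route; Python's early wins here came from hand-written extension modules), calling into pre-existing C without writing a line of new C, assuming a Unix-like system where libc can be located:

    # Call an existing C function (libc's strlen) from Python with no
    # new C code written. Assumes a Unix-like system with a loadable libc.
    import ctypes
    import ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declare strlen()'s C signature so ctypes marshals arguments correctly.
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    print(libc.strlen(b"hello"))  # prints 5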
This is one to RTFA rather than summarize. Don't worry, this isn't just ESR writing about how great ESR is.
(Score: 2) by RamiK on Saturday November 11 2017, @01:36PM
By "capabilities" I was referring to capability-based addressing: https://en.wikipedia.org/wiki/Capability-based_addressing [wikipedia.org]
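If you've never run into the idea: a capability bundles a base address, bounds, and permissions into an unforgeable token, and every memory access has to present one. A toy Python model — purely illustrative, since real capability machines enforce this in hardware, and all the names here are made up:

    # Toy model of capability-based addressing: every load/store must
    # present a capability, and the "hardware" check rejects anything
    # outside its bounds or permissions. Illustrative only.
    from dataclasses import dataclass

    MEMORY = bytearray(1024)  # stand-in for physical memory

    @dataclass(frozen=True)
    class Capability:
        base: int
        length: int
        perms: frozenset  # subset of {"r", "w"}

    def load(cap, offset):
        if "r" not in cap.perms or not 0 <= offset < cap.length:
            raise PermissionError("capability violation")
        return MEMORY[cap.base + offset]

    def store(cap, offset, value):
        if "w" not in cap.perms or not 0 <= offset < cap.length:
            raise PermissionError("capability violation")
        MEMORY[cap.base + offset] = value

    buf = Capability(base=64, length=16, perms=frozenset({"r", "w"}))
    store(buf, 0, 42)
    print(load(buf, 0))  # 42
    try:
        load(buf, 16)    # one byte past the bounds
    except PermissionError as e:
        print(e)         # capability violation

Out-of-bounds or unauthorized accesses fault instead of silently corrupting memory, which is the whole point.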
ARM going with multi-stage branch prediction and Jazelle enabled Android's Java app development. Programming practices are quite different when you're not dealing with pointers and have a GC.
Nvidia and AMD switching from VLIWs to RISC brought GPU compute to HPC, which is a big segment of the server market. It's a very different workflow, with profiling and unit testing taking priority.
Virtualization on servers was/is huge. Whole business models and software markets came and went (or stayed) over it. It's not "just" by any means when you have distro developers bisecting commits and running a system clone on qemu to find which user-land change halted boot to X (a sketch of that kind of test script follows).
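Concretely, that workflow means handing git bisect a boot test it can run unattended. A sketch of what such a script might look like — the image path, timeout, and console marker below are all hypothetical:

    #!/usr/bin/env python3
    # Hypothetical test script for `git bisect run ./boot_test.py`:
    # boot a cloned system image under qemu, then report via exit code
    # whether this commit's userland still reaches X.
    import subprocess
    import sys

    IMAGE = "clone.qcow2"                # hypothetical system clone
    MARKER = b"reached display-manager"  # hypothetical serial-console marker

    cmd = ["qemu-system-x86_64", "-m", "2G", "-snapshot", "-nographic",
           "-drive", f"file={IMAGE},format=qcow2"]
    try:
        # qemu won't exit on its own; kill it after five minutes and
        # inspect whatever the guest printed on the serial console.
        proc = subprocess.run(cmd, capture_output=True, timeout=300)
        console = proc.stdout or b""
    except subprocess.TimeoutExpired as e:
        console = e.stdout or b""

    # git bisect run treats exit 0 as "good" and 1 as "bad".
    sys.exit(0 if MARKER in console else 1)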
DRM moved game development from general-purpose hardware to consoles. Well, the Cell architecture came and went, so it's not quite the last decade... But the way game engines became the target, rather than each game developer making their own engine even for AAA games, is a huge change in practice that was only possible because the hardware forced it.
You can't even say desktops went unaffected: they clearly lost market share to everything from smartphones to tablets to streaming boxes over power usage, to the point that Microsoft is sweating and Intel, having failed with Atom, has been switching to a value-adding scheme ( https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/baumann-hotos17.pdf [microsoft.com] ) as it rapidly loses market share.
Does it matter how many? The point is that there are gradual changes reshaping practices all the time. More unit testing. More scripting to glue C/C++ together instead of writing new C. More GUIs in browsers instead of native toolkits, which also means more server-client designs even for locally run apps... And a lot fewer pointers.
compiling...