
posted by Fnord666 on Friday November 10 2017, @06:23PM
from the C,-C-Rust,-C-Rust-Go,-Go-Rust-Go! dept.

In which ESR pontificates on the future while reflecting on the past.

I was thinking a couple of days ago about the new wave of systems languages now challenging C for its place at the top of the systems-programming heap – Go and Rust, in particular. I reached a startling realization – I have 35 years of experience in C. I write C code pretty much every week, but I can no longer remember when I last started a new project in C!
...
I started to program just a few years before the explosive spread of C swamped assembler and pretty much every other compiled language out of mainstream existence. I'd put that transition between about 1982 and 1985. Before that, there were multiple compiled languages vying for a working programmer's attention, with no clear leader among them; after, most of the minor ones were simply wiped out. The majors (FORTRAN, Pascal, COBOL) were either confined to legacy code, retreated to single-platform fortresses, or simply ran on inertia under increasing pressure from C around the edges of their domains.

Then it stayed that way for nearly thirty years. Yes, there was motion in applications programming; Java, Perl, Python, and various less successful contenders. Early on these affected what I did very little, in large part because their runtime overhead was too high for practicality on the hardware of the time. Then, of course, there was the lock-in effect of C's success; to link to any of the vast mass of pre-existing C you had to write new code in C (several scripting languages tried to break that barrier, but only Python would have significant success at it).

This is one where you should RTFA rather than rely on the summary. Don't worry, this isn't just ESR writing about how great ESR is.


Original Submission

 
  • (Score: 5, Insightful) by KiloByte on Friday November 10 2017, @07:08PM (18 children)

    by KiloByte (375) on Friday November 10 2017, @07:08PM (#595281)

    So we get the death of C announced every year or so by some tech news site. But please tell me, what language is used to write the kernel, most of those fancy new programming languages' interpreters, and most serious code? Yeah, code that is used to directly earn money (business logic, shit webpages, etc.) tends to be written in a higher-level language, but all the foundation is in C.

    Remember the big hype of Go or Swift? They're moribund now, with Rust being hyped up this year (for extra irony, Rust predates Swift; its popularity spike merely happened later). Then in a couple of years there'll be something else. Meanwhile, C is going strong.

    The last project I started was in C++, purely out of laziness: I wanted the STL for a single data structure. The other guy immediately set about replacing that with proper C, so there'll be no entrenched C++isms later.

    Unless you're writing a webpage or some accounts receivable poo, there are only two kinds of languages: prototyping/glue and system/library. For the latter, there's no real choice other than C.

    --
    Ceterum censeo systemd esse delendam.
  • (Score: 2) by RS3 on Friday November 10 2017, @07:24PM (7 children)

    by RS3 (6367) on Friday November 10 2017, @07:24PM (#595287)

    At least one other guy agrees with you and me: http://harmful.cat-v.org/software/c++/linus [cat-v.org]

    Recently I was helping a friend with an Arduino project he dreamed up. I hadn't touched Arduino before, but I have done a fair bit of assembler here and there on an assortment of microprocessors. I was annoyed that the code was C++. He, like me, thinks of a processor as doing actions (verbs) applied to things (data, ports, etc.). I tried to explain the object model concept to him, but it was difficult because I was trying to sell something I don't believe in, and it fundamentally wasn't making sense to either of us. I had to explain that C++ is being used because so many programmers are (only) doing OOP. It would have been nice if they had given us a C environment, or some other procedural one. I'm sure they exist, I just haven't bothered to look.
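    For what it's worth, such environments do exist: avr-gcc builds plain C for the same chips, with no C++ involved. Here's a minimal blink in pure C - a sketch assuming an Uno-class ATmega328P, whose onboard LED sits on PB5 (Arduino's "pin 13"):

```c
/* blink.c -- LED blink in pure C for an ATmega328P, no Arduino core. */
#define F_CPU 16000000UL        /* assume a 16 MHz clock; the delay macros need it */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(DDB5);          /* make PB5 an output */
    for (;;) {
        PORTB ^= _BV(PORTB5);   /* toggle the LED */
        _delay_ms(500);
    }
}
```

    Build it with avr-gcc -mmcu=atmega328p -Os -o blink.elf blink.c and flash with avrdude. Verbs applied to things, and not an object anywhere.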

    • (Score: 2, Insightful) by Ethanol-fueled on Friday November 10 2017, @08:21PM (4 children)

      by Ethanol-fueled (2792) on Friday November 10 2017, @08:21PM (#595321) Homepage

      C++ is used for Arduino because Arduino is for babies and assembler makes babies cry.

      99% of Arduino code is essentially dumbed-down C/C++ anyway. When was the last time you saw any pointers in any hobbyist-written Arduino code?

      • (Score: 4, Funny) by RS3 on Friday November 10 2017, @08:28PM

        by RS3 (6367) on Friday November 10 2017, @08:28PM (#595329)

        C++ is used for Arduino because Arduino is for babies and assembler makes babies cry.

        Very funny

        When was the last time you saw any pointers in any hobbyist-written Arduino code?

        You mean intentional ones?

      • (Score: 2) by forkazoo on Saturday November 11 2017, @12:53AM (2 children)

        by forkazoo (2561) on Saturday November 11 2017, @12:53AM (#595423)

        When was the last time you saw any pointers in any hobbyist-written Arduino code?

        If a microcontroller environment is extremely memory constrained, it may be impractical to have a sensible malloc implementation and a normal free store. It may not be useful to do anything that requires bare pointers in that kind of environment, especially if you just need to blink an LED.
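        A quick sketch of the usual alternative on such parts (sizes and names here are hypothetical): everything is statically sized at build time, so there's no malloc and nothing that needs a bare pointer.

```c
/* Typical no-heap MCU pattern: a fixed, statically allocated pool.
   MAX_READINGS is a made-up budget; on a real part it's set by the RAM. */
#include <stdint.h>

#define MAX_READINGS 16

static uint16_t readings[MAX_READINGS];  /* storage fixed at link time */
static uint8_t  count;                   /* slots used so far */

/* Returns 0 on success, -1 once the fixed pool is full. */
int log_reading(uint16_t value)
{
    if (count >= MAX_READINGS)
        return -1;
    readings[count++] = value;
    return 0;
}
```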

        • (Score: 1) by Ethanol-fueled on Saturday November 11 2017, @01:04AM

          by Ethanol-fueled (2792) on Saturday November 11 2017, @01:04AM (#595427) Homepage

          Bare pointers in baby's first Arduino code are most often associated with timers and other mechanisms for having non-blocking clumps of logic. They are most certainly a good idea, but they're abstracted away in libraries that babies don't bother to read.
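          For the curious, here's a minimal, host-runnable sketch of the pattern those libraries hide (the names are made up, and fake_clock_ms stands in for Arduino's millis()):

```c
/* A software timer that fires a callback through a bare function
   pointer when its interval elapses, polled from the main loop
   instead of blocking in delay(). */
#include <stdint.h>
#include <stdio.h>

typedef void (*timer_cb)(void);

struct soft_timer {
    uint32_t interval_ms;
    uint32_t last_fire_ms;
    timer_cb fire;              /* the bare pointer in question */
};

static uint32_t fake_clock_ms;  /* stand-in for Arduino's millis() */

static void soft_timer_poll(struct soft_timer *t)
{
    if (fake_clock_ms - t->last_fire_ms >= t->interval_ms) {
        t->last_fire_ms = fake_clock_ms;
        t->fire();              /* fire and return; nothing blocks */
    }
}

static void blink(void) { puts("toggle LED"); }

int main(void)
{
    struct soft_timer led = { .interval_ms = 500, .fire = blink };
    for (fake_clock_ms = 0; fake_clock_ms < 2000; fake_clock_ms++)
        soft_timer_poll(&led);  /* the loop stays free for other work */
    return 0;
}
```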

          As far as Arduino hobbyists go, you can tell an O.G. Nigga from a baby because the O.G. Niggas use long instead of int.

        • (Score: 2) by crafoo on Saturday November 11 2017, @01:51AM

          by crafoo (6639) on Saturday November 11 2017, @01:51AM (#595440)

          Indirect addressing on a micro-controller? You think this is uncommon? I've changed my mind. No place is safe from the javashit and python cancer.

    • (Score: 1, Touché) by Anonymous Coward on Friday November 10 2017, @09:03PM (1 child)

      by Anonymous Coward on Friday November 10 2017, @09:03PM (#595346)

      C++ (and Python) are perfectly able to do procedural programming without those silly objects.

      The nice thing is, when objects start to make sense for your code, you have those as well.

      • (Score: 2) by HiThere on Saturday November 11 2017, @02:00AM

        by HiThere (866) Subscriber Badge on Saturday November 11 2017, @02:00AM (#595442) Journal

        Sorry, Python's better than Java about it, but you can't do much in Python without invoking objects. Almost all the built-in libraries are quite heavily object oriented.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 4, Informative) by darkfeline on Friday November 10 2017, @07:50PM (2 children)

    by darkfeline (1030) on Friday November 10 2017, @07:50PM (#595300) Homepage

    Remember the big hype of Go or Swift? They're moribund now

    You're very misinformed.

    Go is used a lot. Hell, it's been listed at 15-20 on TIOBE and GitHub for years now. Just because it's not used as a systems language doesn't mean it's dead. By that logic, all programming languages except C are dead, which is simply a stupid conclusion.

    I don't know why you mentioned Swift, but considering that it's a language exclusively for the Apple ecosystem (yeah, in theory you can compile it for Linux, but no one does that), the only language you can really compare it against is Objective-C, and against that it is doing pretty well.

    there are only two kinds of languages

    There are only two kinds of people: those that incorrectly oversimplify things into two categories, and those who realize that reality is a bit more complex than that.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 4, Insightful) by aristarchus on Friday November 10 2017, @08:22PM

      by aristarchus (2645) on Friday November 10 2017, @08:22PM (#595322) Journal

      There are only two kinds of people: those that incorrectly oversimplify things into two categories, and those who realize that reality is a bit more complex than that.

      Wrong! And needlessly complex and obfuscatory. Those that incorrectly oversimplify things, and those who correctly oversimplify things, these are the correct two kinds of people. Those who claim that reality is a bit more complex belong to the former.

    • (Score: 2) by KiloByte on Friday November 10 2017, @09:04PM

      by KiloByte (375) on Friday November 10 2017, @09:04PM (#595348)

      Note that I explicitly excluded business logic, websites and so on. It depends on what type of program you're writing. Heck, most of my coding time is spent doing Perl! Use the right tool for the job.

      there are only two kinds of languages

      There are only two kinds of people: those that incorrectly oversimplify things into two categories, and those who realize that reality is a bit more complex than that.

      And those who quote out of context to ignore the third kind I mentioned. But yeah, include the obligatory joke about off-by-one errors. :)

      --
      Ceterum censeo systemd esse delendam.
  • (Score: 5, Informative) by Thexalon on Friday November 10 2017, @07:51PM

    by Thexalon (636) on Friday November 10 2017, @07:51PM (#595302)

    Also, to write those higher-level things properly, sometimes you need to be able to dive into C. A few examples:

    - I worked on some fairly high-level business code at a Fortune 1000 company for nearly 5 years. Most of it was in Python, which worked just fine. But we had to interface with a 3rd party that didn't have a Python library but did have a C library. So we took advantage of Python's C interface capabilities [python.org], wrote a wrapper around that C library in C, and were in business. And yes, sometimes we had to go mucking around in that C wrapper as the 3rd party's C library evolved. (A minimal sketch of this pattern follows the list.)

    - I worked on a PHP-based website for a while, and one of the key behind-the-scenes processes had a bug where it was periodically seg-faulting; eventually the server would run out of actively running interpreter processes. With nothing being reported at the PHP level, I had to go digging with strace and reading interpreter code until we located what the interpreter was doing that caused the problem (specifically, pointers to non-scalar default arguments that were modified over the course of a function were persisting between calls to that function, so all of a sudden we'd get pointers off to nowhere in particular).

    - Occasionally when compiling system software from C source, I've come across situations where I needed to create a patch to make the thing work with an unusual distro setup, which I couldn't have done had I not been familiar with C (and yes, I send the patch upstream when it makes sense).
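    As promised above, a minimal sketch of the Python-wrapper pattern from the first example, using CPython's documented extension API (Python 3 shown; the module name and the wrapped function are hypothetical stand-ins, not the actual vendor library):

```c
/* vendorwrap.c -- toy CPython extension wrapping a C function. */
#include <Python.h>

/* Pretend this comes from the 3rd party's C library. */
static int vendor_add(int a, int b) { return a + b; }

static PyObject *py_vendor_add(PyObject *self, PyObject *args)
{
    int a, b;
    if (!PyArg_ParseTuple(args, "ii", &a, &b))
        return NULL;                        /* propagate the TypeError */
    return PyLong_FromLong(vendor_add(a, b));
}

static PyMethodDef methods[] = {
    {"add", py_vendor_add, METH_VARARGS, "Add two ints via the C library."},
    {NULL, NULL, 0, NULL}                   /* sentinel */
};

static struct PyModuleDef module = {
    PyModuleDef_HEAD_INIT, "vendorwrap", NULL, -1, methods
};

PyMODINIT_FUNC PyInit_vendorwrap(void)
{
    return PyModule_Create(&module);
}
```

    Build it as a shared object (e.g. gcc -shared -fPIC $(python3-config --includes) vendorwrap.c -o vendorwrap.so), and then "import vendorwrap; vendorwrap.add(2, 3)" works from Python. When the vendor's library changes, this shim is the only thing that needs touching.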

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 2) by choose another one on Friday November 10 2017, @10:42PM (5 children)

    by choose another one (515) Subscriber Badge on Friday November 10 2017, @10:42PM (#595386)

    So we get the death of C announced every year or so by some tech news site. But please tell me, what language is used to write the kernel, most of those fancy new programming languages' interpreters, and most serious code? Yeah, code that is used to directly earn money (business logic, shit webpages, etc.) tends to be written in a higher-level language, but all the foundation is in C.

    I think the problem is that ESR is measuring life and death by the creation of entirely new projects - realistically, how many _new_ major kernels or system library projects are being written in C these days? I can't think of any, except possibly systemd, which probably doesn't count because it shouldn't ever be written in anything, or something like that... The real question is: does it matter? I'd say no. It is merely a sign of the maturity of what we have got; if you want a kernel you use an existing one, and the same goes for system libraries.

    What ESR is saying is like saying concrete is dead because no one is creating or researching new types of concrete foundation - everyone just picks a concrete foundation type that suits the building and uses that. The reality is that there is still a heck of a lot of concrete being used in foundations, and a heck of a lot of work available in doing it, but it may not be particularly interesting for some people because it is (largely) a _solved problem_. The same goes for programming foundations.

    Is concrete dead? No.
    Is C dead? Ditto.

    • (Score: 2) by Grishnakh on Friday November 10 2017, @11:27PM (4 children)

      by Grishnakh (2831) on Friday November 10 2017, @11:27PM (#595406)

      Your point there is good, but I think you're incorrect about concrete. We wouldn't know much about it here, because there are likely no civil engineers on this site, but there's a good amount of research still being done on improving concrete.

      If I were to try to think of other examples of things that are basically solved problems, with nothing new going on beyond reimplementations of what's already known, I'd guess:
      1) laser printers (those haven't changed at all in 10-15 years now, except the engines having more memory and faster CPUs so they rasterize pages faster);
      2) automobile suspensions (these haven't changed significantly in 15-20 years, and in some ways have gotten a little worse/cheaper);
      3) automobile brakes (20 years; some high-end cars have carbon brakes, but race cars had those a couple decades ago); and
      4) speakers (they're all just paper/plastic cones and voice coils like 50-100 years ago, frequently with plastic or metal dome tweeters; some different technologies like electrostatic speakers were explored a couple decades ago, but they never caught on).

      Of course, there are also technologies that have gotten significantly worse over time: 1) computer keyboards (they continue to get worse and worse, with the latest being "island keyboards" on laptops), and 2) desktop computer UIs (they peaked around 2005-2009, and have gotten horrifically bad since 2010).

      • (Score: 2) by crafoo on Saturday November 11 2017, @01:55AM (1 child)

        by crafoo (6639) on Saturday November 11 2017, @01:55AM (#595441)

        I'm an optimist. I think the UI thing will turn. Just like a wheel, all that was old is new again. I'm looking forward to the invention of the menu bar and boxes around buttons.

        • (Score: 2) by Grishnakh on Saturday November 11 2017, @02:40AM

          by Grishnakh (2831) on Saturday November 11 2017, @02:40AM (#595456)

          It'll take at least a decade; we have to wait for a new crop of young people to rise up and displace the current 25-30yo idiots who are pushing this shit.

          I'm looking forward to the "invention" of 3D-looking buttons.

      • (Score: 2) by choose another one on Saturday November 11 2017, @02:09PM (1 child)

        by choose another one (515) Subscriber Badge on Saturday November 11 2017, @02:09PM (#595579)

        Your point there is good, but I think you're incorrect about concrete.

        I know it's not a perfect analogy, but I still think it is a good one. Sure, there is work still being done on improving concrete itself, but it is tinkering. If you take, say, skyscrapers, the basic foundation design is concrete piles and a concrete mat/raft - Chicago has been building on concrete piles for over a century, and some sets of piles have even been reused for new buildings. Sears/Willis (1970) sits on a bunch of concrete piles and a mat; so does the Burj Khalifa (c. 2004), so does the BT Tower in London (50 years old), and so does the Shard (2009) on the other side of the river. Skyscraper foundations were a problem before 1900 (big problems in Chicago), but for at least the last 50 years - solved problem.

        Automobile brakes? Yes, buuuttt... hybrids and EVs have brought re-gen braking, an addition rather than replacement but nonetheless arguably a major change. The others - yep.

        Computer keyboards getting worse - nah. My kids may argue over which colour cherry keyswitches they want, but the IBM Model M will always win simply because it is so much heavier and therefore shuts kids up so much faster when you hit them over the head with it :-) What is getting worse is the standard of keyboard provided with modern consumer computer kit - but that is just because most modern consumers don't actually need or want (to pay for) a decent keyboard, so it's been value-engineered out.

        As to UI, yes it's gone backwards, and not just on the desktop. I don't think we will see 3D buttons on desktop again, because real live buttons no longer look or work that way. I can think of stacks of examples, too many to list, but I think we must have reached the tipping point where touch screen UIs are cheaper to design and build than physical moving buttons, because touchscreen-for-the-sake-of-it UI is all around us. UI discovery has moved from "what does this button do" to "where the **** do I prod the featureless bit of glass/plastic to get something to happen". UI design will now move on through gesture control to voice control, because we have to fix the mess we've made of physical UI somehow.

        Voice control will get messed up too once we start making stuff too "smart" - refer to Douglas Adams' predictions from decades ago for this; he was actually a great observer (and predictor) of technology and UI. I give it maybe ten years before Alexa has a mode that analyses your tone of voice and will only open the door if you ask it nicely; it'll be a fair few more years before opening the door requires an argument and a threat to reprogram the computer with an axe (or counting, or maybe quoting C code at it!), but it'll happen.

        • (Score: 2) by Grishnakh on Monday November 13 2017, @01:31AM

          by Grishnakh (2831) on Monday November 13 2017, @01:31AM (#596031)

          Automobile brakes? Yes, buuuttt... hybrids and EVs have brought re-gen braking, an addition rather than replacement but nonetheless arguably a major change.

          I disagree; it's an addition like you said, not a change at all to the actual mechanics of the friction brakes. They're the same; they're just not actuated quite the same. This is splitting hairs perhaps.

          Computer keyboards getting worse - nah. My kids may argue over which colour cherry keyswitches they want, but the IBM Model M will always win simply because it is so much heavier

          The Model M is still made, but only by one tiny company for enthusiasts (and even there, it's generally said not to be quite as good as the originals). Back in the old days, those keyboards were ubiquitous and standard, and others were somewhat similar even if not quite as good (such as the old Dell Quietkey).

          What is getting worse is the standard of keyboard provided with modern consumer computer kit

          And see, this is the problem: you can only use a dedicated keyboard with a desktop computer, with a docking station, or by plugging it into a USB port on your laptop. You won't be taking it with you, and these days most PC users have laptops, not desktops, so we're pretty much stuck with whatever craptastic keyboard is built in. For a while it wasn't so bad: the Thinkpad keyboards were generally considered the very best, and the Dell Latitude keyboards a close runner-up, while keyboards on cheaper laptops were generally crap. Not any more; the keyboards on both Thinkpads and Latitudes have gone down the tubes with the adoption of the "island" keyboard scheme.

          I don't think we will see 3D buttons on desktop again, because real live buttons no longer look or work that way.

          We still have real live buttons on some things, but they are fading. Even in cars we still have knobs and buttons; a few shitty brands have tried to eliminate them, but there's no agreement across the industry on this at all. Touchscreens are dangerous in cars if they're meant to control major functions while driving (they're fine for displaying info and for accessing rarely-used settings).