
Journal by turgid

I've written this code at work and it's not pretty. As usual, it was done in a hurry with a Grumpy Boss Man shouting and making Basil Fawlty appear calm and collected. It also uses code from a third party, unsuitable for our hardware but nonetheless required for integration and testing.

This is code for an embedded system, written by Windows people in C++ using a cross-platform GUI toolkit. It includes a GUI, but it also uses the toolkit's message-passing infrastructure for inter-thread communication. Yes, it's multi-threaded.

Grumpy Boss Man wouldn't let us put this GUI toolkit on our system, even just to get this code up and running, so I had to re-implement select parts of said toolkit myself to get the useful bits of the supplier's code working, fortunately with the GUI thrown out.

Working all hours, with my fingers on fire, my brain melting and all sorts of things, I replaced the TCP/IP socket functionality and the thread classes (in a very cheap, Scottish, minimalist, parsimonious way).
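
For flavour, here's a minimal sketch of the sort of parsimonious thread wrapper I mean. The names are invented, not the toolkit's; it's just enough API, sitting on top of pthreads, for supplier-style code to build and run without the real toolkit:

    #include <pthread.h>

    // Hypothetical stand-in for the toolkit's thread class: the supplier's
    // code subclasses it and overrides run(), same shape as before, but
    // underneath it's nothing more than pthread_create()/pthread_join().
    class Thread {
    public:
        virtual ~Thread() {}
        int start() { return pthread_create(&tid_, nullptr, &Thread::entry, this); }
        void wait() { pthread_join(tid_, nullptr); }
    protected:
        virtual void run() = 0;          // supplier code goes here
    private:
        static void *entry(void *self) {
            static_cast<Thread *>(self)->run();
            return nullptr;
        }
        pthread_t tid_{};
    };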

Lo and behold, it ran!

Now this system contains Secret Sauce(TM) that I'm not allowed to see because IP and all that. So I have a target build with a secret binary module provided by the suppliers. I have my own little stub module implementing its API, which I wrote so I could do a host (x86-64) build. Grumpy Boss Man never quite understood why anyone would want to run the code on the host as well as the target (Aarch64).

It's quite simple: expediency. I can compile, link and execute the code in a couple of seconds on the host. I have rigged up a little automated test harness, in addition to my unit tests, which runs the application and sends messages to it, and waits for and checks the replies. I can run it through various test scenarios just by typing make. Remember, this is asynchronous multi-threaded code with TCP/IP sockets. Every time I compile I get free tests. The same tests can be run on the target too (I've done it).
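
None of the real harness belongs here, but a minimal sketch of the idea, with invented port and message values, looks something like this: a tiny client that connects to the application under test, sends a request, and fails the build if the reply is wrong or missing. A make rule builds and runs it after every compile.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>
    #include <cstring>

    // Hypothetical harness client: one test scenario, driven by make.
    int main() {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return 1; }

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5555);                 // invented test port
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

        if (connect(fd, reinterpret_cast<sockaddr *>(&addr), sizeof addr) < 0) {
            perror("connect");
            return 1;
        }

        const char request[] = "PING\n";               // invented message
        write(fd, request, sizeof request - 1);

        char reply[64] = {0};
        ssize_t n = read(fd, reply, sizeof reply - 1); // wait for the answer
        close(fd);

        // A non-zero exit status makes the make run fail the scenario.
        return (n > 0 && std::strcmp(reply, "PONG\n") == 0) ? 0 : 1;
    }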

The second reason is that compiling and running (testing) on a different architecture shakes out certain bugs. Ideally, it would be on an architecture with a different endianness and a different OS but the world is becoming more homogeneous these days. Unless there's a SPARC box about, if it's x86-64 or Aarch64, it's going to be Little Endian.
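
As an illustration of the kind of bug a big-endian box catches instantly (values invented): lifting a multi-byte field straight out of a network buffer instead of assembling it byte by byte.

    #include <cstdint>
    #include <cstdio>
    #include <cstring>

    int main() {
        const uint8_t wire[4] = {0x00, 0x00, 0x01, 0x02}; // big-endian 258

        uint32_t naive;
        std::memcpy(&naive, wire, 4);   // takes host byte order: wrong on
                                        // little-endian machines (0x02010000)

        uint32_t portable = (uint32_t(wire[0]) << 24) | (uint32_t(wire[1]) << 16)
                          | (uint32_t(wire[2]) << 8)  |  uint32_t(wire[3]);

        std::printf("naive=%u portable=%u\n", naive, portable);
        return 0;
    }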

However, x86 is CISC and ARM is RISC and we all know that CISC and RISC processors treat memory differently. Now here comes the fun part.

My host (x86-64) builds/tests were fine. So were my target (Aarch64) builds, and they ran fine when I put them on the target and ran my tests there.

Our suppliers produced a new version of their Secret Sauce that needed some reconfiguration inside my code. My code (actually their example code, but a bit modified) had a couple of arrays holding certain configuration data, and these became twice as large and held more constants.

All the compiles worked. My host regression tests passed. Putting the target binary on the hardware and running it resulted in a crash. It was a nice crash in that my pthread_create() failed with an error code and I printed a nice error message and the rest of the program kept going.

As I said earlier, I had been re-implementing parts of this C++ library at breakneck pace, so I was thinking about memory corruption, and that perhaps I'd made some mistakes in one of the C++ constructors for the thread class.

I instrumented the code six ways to Sunday and came to the conclusion that there was stack corruption somewhere because all the right addresses for the thread main routine and arguments were getting set in the object instances but when pthread_create() was getting called it was returning a nasty error.

Then I remembered the mighty Valgrind. So I installed it.

After about half an hour I had the answer to the problem. I had forgotten to initialise the attributes for the thread (pthread_attr_init()) and to initialise a mutex for a shared buffer (pthread_mutex_init()).
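
Not the code from work, obviously, but a minimal sketch of the same bug and the two-line fix, with invented names. With the init calls missing, Valgrind's memcheck reports something along the lines of a conditional jump or syscall parameter depending on uninitialised values.

    // g++ -pthread sketch.cpp
    #include <pthread.h>
    #include <cstdio>

    struct Shared {
        pthread_mutex_t lock;   // protects a shared buffer
    };

    static void *thread_main(void *arg) {
        Shared *s = static_cast<Shared *>(arg);
        pthread_mutex_lock(&s->lock);
        std::puts("worker running");
        pthread_mutex_unlock(&s->lock);
        return nullptr;
    }

    int main() {
        Shared shared;          // stack object: contents are garbage
        pthread_attr_t attr;    // likewise garbage until initialised

        // The two calls I had forgotten. Without them the program only
        // "worked" on x86-64 because the stack garbage happened to look
        // valid; on Aarch64 pthread_create() returned an error instead.
        pthread_attr_init(&attr);
        pthread_mutex_init(&shared.lock, nullptr);

        pthread_t tid;
        int err = pthread_create(&tid, &attr, thread_main, &shared);
        if (err != 0) {
            std::fprintf(stderr, "pthread_create failed: %d\n", err);
            return 1;
        }
        pthread_join(tid, nullptr);
        pthread_attr_destroy(&attr);
        pthread_mutex_destroy(&shared.lock);
        return 0;
    }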

It just so happened that on x86-64, due to the layout of memory and the random contents of that memory, the program was running correctly. On Aarch64 it was falling over in a smouldering pile.

The moral of the story is: (1) don't write code on your own, get someone to review it; (2) don't write code in a hurry, even when there's a Grumpy Boss Man; (3) compile and test on at least two different architectures; (4) use Valgrind; and (5) I hate C++.

The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1, Interesting) by Anonymous Coward on Saturday May 11 2024, @04:10PM (17 children)

    by Anonymous Coward on Saturday May 11 2024, @04:10PM (#1356553)

    so pay a grant to add a gcc option "init with 0"

  • (Score: 3, Insightful) by RamiK on Saturday May 11 2024, @06:11PM (16 children)

    by RamiK (1813) on Saturday May 11 2024, @06:11PM (#1356565)

    That's what -ftrivial-auto-var-init=zero was meant for, and there are still attempts at getting it adopted: https://serge-sans-paille.github.io/pythran-stories/trivial-auto-var-init-experiments.html [github.io]

    However, since overcoming the performance costs requires adding a language feature to mark variables not to be zeroed, meaning backwards incompatibility, it will never get adopted. After all, if we're willing to break backwards compatibility, why not just design a new language without all the legacy crap? Which, of course, is what the various new system languages are trying to do.
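
    For the curious, a tiny sketch of what the flag does, assuming GCC 12+ or a recent Clang (the opt-out shown is a vendor attribute, not the proper language feature I mean above):

        // demo.cpp - build with: g++ -ftrivial-auto-var-init=zero demo.cpp
        #include <cstdio>

        int main() {
            int x;  // indeterminate normally; guaranteed zero under the flag

            // vendor escape hatch for hot paths that shouldn't pay for zeroing
            char buf[4096] __attribute__((uninitialized));
            (void)buf;

            std::printf("x=%d\n", x);  // UB without the flag; prints 0 with it
            return 0;
        }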

    --
    compiling...
    • (Score: 2) by turgid on Saturday May 11 2024, @09:49PM (15 children)

      by turgid (4318) Subscriber Badge on Saturday May 11 2024, @09:49PM (#1356586) Journal

      Every so often I have a look at these new programming languages and I'm always a little surprised. They all have some good ideas, but they all seem to be going in different directions, and the good ideas never seem to coalesce in any one language. Conversely, some of them seem intent on repeating some mistakes from decades past. I always end up thinking about Paul Graham's essays, and everything does seem to tend towards LISP.

      • (Score: 3, Insightful) by RamiK on Sunday May 12 2024, @05:23AM (14 children)

        by RamiK (1813) on Sunday May 12 2024, @05:23AM (#1356624)

        They all have some good ideas, but they all seem to be going in different directions, and the good ideas never seem to coalesce in any one language.

        If the only tool you have is a hammer, you tend to see every problem as a nail. If the only language you have is C/C++... Point is, that whole line of thinking is flawed. Some paradigms are the ideal solutions to some problems but don't mix with other paradigms that are the ideal solutions to other problems.

        To be clear, this isn't a new trend. People don't code GUIs in Verilog, nor do we write kernels in Python. We're simply misusing the term "general purpose" to signify "systems programming". And even within that space there are niches...

        --
        compiling...
        • (Score: 2) by turgid on Sunday May 12 2024, @01:12PM (13 children)

          by turgid (4318) Subscriber Badge on Sunday May 12 2024, @01:12PM (#1356652) Journal

          What I was trying to say is that these new languages contain maybe one or two good ideas, but they're too narrowly focused. Rust is OK, but it's VHS to D's Betamax, as far as I can tell. Nim is sloppy. The syntax is way too permissive. The way things are going, I doubt I will have time to really learn these new things which are coming along, which is a real shame. I'm so busy. I must be getting old. I fear I have settled into a niche. When I was younger I used to have the time and energy to reinvent myself every few years by trying new things.

          • (Score: 2) by RamiK on Sunday May 12 2024, @05:23PM (12 children)

            by RamiK (1813) on Sunday May 12 2024, @05:23PM (#1356672)

            Rust is OK, but it's VHS to D's Betamax, as far as I can tell.

            Not at all. Dlang's Ownership/Borrowing system [dlang.org] and safeD aren't the default (in fact, live functions are still an experimental feature almost 5 years after introduction) and its entire code base and standard libraries weren't made using it. Essentially it's mirroring the issue with 0-init: The whole point about Rust is that safety is the default rather than the exception.

            Nim is sloppy. The syntax is way too permissive.

            I don't particularly like Nim's syntax myself and only worked through the language's basics before putting it aside so I can't hold that against you.

            The way things are going, I doubt I will have time to really learn these new things which are coming along, which is a real shame. I'm so busy. I must be getting old. I fear I have settled into a niche. When I was younger I used to have the time and energy to reinvent myself every few years by trying new things.

            It's not your age. It's pretty normal for C/C++ devs (and honestly, anyone really) to find Rust difficult to pick up: https://stevedonovan.github.io/rust-gentle-intro/pain-points.html [github.io]

            As for dlang, I'm guessing you approached it hoping for something more C++-like only to find, well, something C++-like. That is, D has quite a bit of baggage and no added value aside from clean syntax so one tends to wonder why not just stick to C++ if you already know the syntax... This is where Odin and Zig come in: They don't chase after every single paradigm and feature like C++ and D. Instead, they narrow down on specific problems and make sure they have good syntax for just those features without going into the various pitfalls C/C++ ended up with.

            But regardless, the approach here shouldn't be "I want to learn a new language" but "I have a hobby project I want to do with this new language". After all, there's just not much for you to learn from reading about any new language once you work with C++ for a few years. What you need is hands on experience. In the end, you learn new languages by using them. Not reading about them.

            --
            compiling...
            • (Score: 2) by turgid on Sunday May 12 2024, @05:29PM (11 children)

              by turgid (4318) Subscriber Badge on Sunday May 12 2024, @05:29PM (#1356673) Journal

              Yes, all my hobby projects are in C.

              • (Score: 3, Interesting) by RamiK on Sunday May 12 2024, @07:47PM (10 children)

                by RamiK (1813) on Sunday May 12 2024, @07:47PM (#1356695)

                The C3 dev made a blog post a while back you'd appreciate: https://c3.handmade.network/blog/p/8486-the_case_against_a_c_alternative [c3.handmade.network]

                --
                compiling...
                • (Score: 2) by turgid on Monday May 13 2024, @07:14PM (9 children)

                  by turgid (4318) Subscriber Badge on Monday May 13 2024, @07:14PM (#1356832) Journal

                  My perspective comes from where I started, and that was with a very limited BASIC on 8-bit micros. Then I learned a bit of FORTH and Z80 assembler, followed by C (K&R) and Modula-2, 8086, and then a bit of Pascal and C++. I've looked at many other languages, and written a bit of Perl and Ruby. I've pottered about in Python and LISP. Daily I write the odd bit of bash.

                  The languages I prefer are simpler and more consistent. Big languages scare me, and they frustrate me, because getting anywhere near good at them requires a lot of commitment (time), and the problems I work on never allow that. For example, if I wanted to be good at C++ I would have had to be doing that every day for 20 years. That was never going to happen. For me, a language needs to be flexible. I think the great Alan Kay once described LISP as more of a building material than a programming language.

                  I suppose my ultimate hobby project will be to devise my own programming language, just for me personally, that fits the way my own mind works. Like my grandfather's boat, I fear it will never make it to launch.

                  • (Score: 3, Insightful) by RamiK on Monday May 13 2024, @11:28PM (8 children)

                    by RamiK (1813) on Monday May 13 2024, @11:28PM (#1356862)

                    For example, if I wanted to be good at C++ I would have had to be doing that every day for 20 years. That was never going to happen.

                    It's why I'll always go back to Go: I can write a small toy project in golang over a weekend, put it aside for months if not years and then jump back to updating it or writing a different project within hours. Basically, it's almost as easy as Python but with performance.

                    For me, a language needs to be flexible. I think the great Alan Kay once described LISP as more of a building material than a programming language.

                    LISP is a legitimate tool to explore different language features and can be used effectively instead of Lua in various places... But, IMHO, the notation is really too flexible to be used in a general programming language. Like, Script-Fu in GIMP is delightful and LilyPond gets it right for the most part... But elisp in emacs, common lisp in lem and guile in guix are just too heavyweight to pick up for odd jobs, while being yet-another-big-language if approached as standalone languages.

                    I suppose my ultimate hobby project will be to devise my own programming language, just for me personally, that fits the way my own mind works.

                    For the sake of inspiration to a C dev that's eyeing Lisp, it's not an active project but here's a lisp that did away with the GC in favor of a borrow checker: https://github.com/carp-lang/Carp [github.com]

                    --
                    compiling...
                    • (Score: 2) by turgid on Tuesday May 14 2024, @07:45PM (2 children)

                      by turgid (4318) Subscriber Badge on Tuesday May 14 2024, @07:45PM (#1356951) Journal

                      That's very interesting, thanks. I'll have to give it a try. I remember something called Chicken Scheme, which is a Scheme-to-C compiler. I don't suppose it has anything as fancy as a borrow checker.

                      I did some Java for a few years, and I was pleased at how much cleaner it was than C++, but also surprised at how relatively primitive it was given its design goals. I would have expected its multithreading to be far more sophisticated (simpler for the programmer) and its libraries to be better in general. Also, the language seemed needlessly wordy. When I thought I was going to be living in the JVM, I got a book on Scala. Then Scala jumped the shark.

                      Then I ended up going back to C. Then I looked into Clojure and D. Now I hear of Kotlin and Rust.

                      And here I am, writing good old C and trying hard still to minimise the amount of C++ I have to do.

                      • (Score: 1, Insightful) by Anonymous Coward on Wednesday May 15 2024, @02:35AM

                        by Anonymous Coward on Wednesday May 15 2024, @02:35AM (#1356986)

                        Chicken uses a generational garbage collector. The very ingenious part of their design is that they implemented the nursery on top of the C stack. Once the stack is full, the garbage collector runs as part of the stack rebuilding process. Live objects are copied to the second-generation heap. Then the entire stack is popped down to the base continuation and the flattened stack is reconstructed. This saves all sorts of complexity and makes the garbage collector quite fast compared to alternative designs with a separate nursery.
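
                        A toy sketch of that control flow (nothing like Chicken's real code; a depth counter stands in for the real stack-pointer check, and one long stands in for the live object graph):

                            #include <csetjmp>
                            #include <cstdio>

                            static std::jmp_buf trampoline;  // the base continuation
                            static long evacuated;           // the "copied live set"

                            static void step(long n, int depth) {
                                if (depth > 1000) {               // nursery (stack) full
                                    evacuated = n;                // copy live data out
                                    std::longjmp(trampoline, 1);  // pop the whole stack
                                }
                                step(n + 1, depth + 1);           // calls never return
                            }

                            int main() {
                                if (setjmp(trampoline))
                                    std::printf("minor GC, resuming at n=%ld\n", evacuated);
                                if (evacuated < 10000)
                                    step(evacuated, 0);           // (re)enter computation
                                return 0;
                            }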

                      • (Score: 3, Interesting) by RamiK on Wednesday May 15 2024, @03:46PM

                        by RamiK (1813) on Wednesday May 15 2024, @03:46PM (#1357051)

                        And here I am, writing good old C and trying hard still to minimise the amount of C++ I have to do.

                        Then I'll add a final good word about Odin and leave things as is: Although I doubt it will ever get wide adoption at this point in time*, Odin is probably the best C++ replacement for C programmers since it eliminates undefined behavior without doing away with manual memory management, meaning you get to use all your C and C++ patterns as is: https://odin-lang.org/docs/overview/ [odin-lang.org]

                        * Go has ~550k github repos, Rust has ~450k, Zig has ~6k, Nim has ~3.5k and Odin has ~900.

                        --
                        compiling...
                    • (Score: 1, Interesting) by Anonymous Coward on Wednesday May 15 2024, @05:26AM (4 children)

                      by Anonymous Coward on Wednesday May 15 2024, @05:26AM (#1357005)

                      I don't quite support your assertion. Part of the issue is that you are looking at your experience and then judging others based on it. Of course programs you wrote in Go are easy for you to understand later. They are written in a language you have a good amount of experience in, using constructs you are familiar with, in a model that matches doing a task you are familiar with, in a style you are familiar with. Meanwhile, people with different experience may have drastically different determinations. Someone more used to the declarative languages, for example, might prefer to use a LISP-like language over something prototype-based such as Lua because that already matches how they think. Even the variation of style or constructs used within the same language can make a huge difference. I know my functional-like Python code style drives some people up the wall (especially when they see something completely out of place in the functional world, such as when I nest higher-order functions and generators over 35 levels deep). But that is something everybody does, confusing their perspective for everyone else's too.

                      • (Score: 3, Insightful) by RamiK on Wednesday May 15 2024, @03:28PM (3 children)

                        by RamiK (1813) on Wednesday May 15 2024, @03:28PM (#1357050)

                        Of course programs you wrote in Go are easy for you to understand later. They are written in a language you have a good amount of experience in, using constructs you are familiar with, in a model that matches doing a task you are familiar with, in a style you are familiar with.

                        Go is objectively easy to read and understand. Writing it can be challenging for people coming from very expressive languages focused on entirely different paradigms, where they built their entire pattern toolbox on paradigms Go simply doesn't support. e.g. Heavy OOP users miss the classes, while the lack of generics was a meme-worthy common complaint from newbies coming from Java who just didn't know how to approach various data structures in any way but custom generics. However, readability and edit-ability wise, no one disputes Go is easy.

                        I know my functional-like Python code style drives some people up the wall...such as when I nest higher-order functions and generators over 35 levels deep

                        Function calls in Python were very costly (up until the interpreter specialisation and JIT work in 3.11-3.13, especially), so while the language officially supports the full paradigm, if only to check a box, beyond the basics of lambda, map, filter and reduce, functional programming isn't taught to Python developers and is explicitly rejected in code reviews as "bad style", even when the reviewers themselves don't know why that's the case. e.g. Switch to 3.10 and try to profile everything from function calls to monads and functors (or, if lazy, run your test code with "python -m dis foobar.py") and behold the horror of having little to no optimisations done.

                        The current push to deprecate old language features, on the tail wind of the GIL removal and the new JIT, is probably going to make some "room" in the language curricula for additional functional content that will surely make the ML people rejoice... But I wouldn't be surprised if we see a repeat of Go's generics, where the community, as a whole, chooses to adopt simplicity as a code style and rejects the additional functional features as hard to read.

                        After all, there's a reason Haskell isn't popular and Rust isn't being widely adopted despite being just as capable as anything, really.

                        --
                        compiling...
                        • (Score: 0) by Anonymous Coward on Thursday May 16 2024, @01:13AM (2 children)

                          by Anonymous Coward on Thursday May 16 2024, @01:13AM (#1357136)

                          Again, Go is easy to read for you. People with your experience will find it easy. People with other experience will not. You literally said as much in your own comment, at least when it comes to writing, but then you elevate your subjective experience of reading to objective truth about readability because it feels right. I know different people who think R, Perl, Java, Ada, Haskell, and COBOL are the easiest languages to read. But each of those languages is notorious among certain groups for creating unintelligible code.

                          Also, in Python different types of function calls in CPython have different costs. And one thing many people don't consider is that many things in CPython are function calls (sometimes quite complex ones) under the hood. It is perfectly possible to write functional code that drastically outperforms its iterative equivalent when run in CPython. But regardless, I have zero problems using functional code. The simplicity you speak of is the highly condensed function calls with zero intermediate variables. For many Java people, the verbosity is simplicity, since it tells you exactly what is going on without hiding it. Same thing with Go's "simple" generics complicating code if you end up in a situation where you can't express the "simple" logic of what you are doing. There is much more keeping Haskell and Rust from being adopted than just the ease or difficulty you have in reading them. And part of the difficulty you have in reading them is because they require knowing a language you aren't experienced in, using constructs you are not familiar with, in a model that doesn't match your model, doing a task you may not be familiar with, in a style you don't use.

                          • (Score: 2) by turgid on Thursday May 16 2024, @09:38PM (1 child)

                            by turgid (4318) Subscriber Badge on Thursday May 16 2024, @09:38PM (#1357281) Journal

                            In days of yore, before FOSS was a thing, there used to be programming languages, often designed and specified semi-formally, and their implementations. If you were unlucky, you'd have to pay for the compiler or interpreter. If it was FORTH, you probably got a listing out of a book or magazine for a nucleus and then rolled your own.

                            A language was considered distinct from its implementation. Even today with C and C++ (and Objective-C) we have a choice of compilers and they're all independent, even Java had alternative compilers, virtual machines and class libraries.

                            Things like Perl, Python and Ruby came along and they were a bit different. There was a de-facto standard defined by a particular implementation and the standard kept changing gradually with new releases of the interpreter/environment.

                            I have written a handful of lines of Perl, Python and Ruby many years ago. Perl is very difficult. It's so easy to shoot yourself in the foot with it. Python is nice and clean to read. Ruby was a bit more involved, but I liked it. They were all criticised for being "slow." That's the nature of interpreted languages. They're not for things that need to be fast. They're really good for doing relatively complex coding with less human time.

                            Python has become very popular but there are a number of problems with it. It's being used where it shouldn't be. There are all sorts of hacks to try to make it work in certain situations. There is some sort of Python-to-C compiler. When you look at it, you'll see it only copes with a subset of the language. That subset is often not quite big enough to cover the features you need. Then there is the issue of floats and doubles. Python only does doubles. So there are questions you have to ask yourself. Do you write the Python to the subset that can be compiled? If so, you find that you are quite restricted. So why not just use C (or, spit, C++)? Or pony up for a MATLAB licence, write it in MATLAB, and have that translate it to tens of thousands of lines of incomprehensible C?

                            What about the dependencies? What libraries do you need? What versions? You're going to need one of those pipenv or poetry things.

                            See, the thing is, as time goes on, things are getting more and more complex and brittle. That's not how it should be. When I was learning to code, the idea was that you built yourself up libraries of subroutines to accomplish your goals. As time went on, you'd tease out APIs which abstracted away the complexity under well-tested, well-understood code.

                            These days it seems like you have to have umpteen different languages, runtimes, libraries, protocols, paradigms and goodness knows what else in a single, relatively simple application. And the hacks to make them all work together...

                            And underneath it all is the C ABI and the standard library.

                            • (Score: 1, Insightful) by Anonymous Coward on Friday May 17 2024, @03:01AM

                              by Anonymous Coward on Friday May 17 2024, @03:01AM (#1357317)

                              I hate to say it, but I think you are missing a bit of history. Many languages started out with their reference implementations and then grew into a specification later. Even C started out that way, with C behavior being whatever Bell and later Unix did. A lot of languages and their features are like that, but most people have forgotten about the times before they were specified in a way more independent of their reference implementation.

                              But to your other point. The layering you speak of is just the cycle repeating itself, exacerbated by the large expansion of people in the field. Eventually things will stabilize again like they did the previous times. And then it will probably happen again. Such is the way of things. But once the field gets over this painful adolescence, not only will things be more stable, but we will also benefit from its maturity.