

posted by janrinok on Friday September 23 2022, @11:35PM   Printer-friendly
from the leaks-are-for-kids dept.

Arthur T Knackerbracket has processed the following story:

Mark Russinovich, the chief technology officer (CTO) of Microsoft Azure, says developers should avoid using the C or C++ programming languages in new projects and instead use Rust, because of security and reliability concerns.

Rust, which hit version 1.0 in 2015 and was born at Mozilla, is now being used within the Android Open Source Project (AOSP), at Meta, at Amazon Web Services, at Microsoft for parts of Windows and Azure, in the Linux kernel, and in many other places.

Engineers value its "memory safety guarantees", which reduce the need to manually manage a program's memory and, in turn, cut the risk of memory-related security flaws burdening big projects written in "memory unsafe" C or C++, which includes Chrome, Android, the Linux kernel, and Windows. 
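The guarantee in question can be sketched with a toy example (illustrative only, not from the article): holding a reference into a growable vector while mutating it is exactly the pattern that produces dangling pointers in C++, and Rust's borrow checker rejects it at compile time.

```rust
// A sketch of the class of bug Rust's ownership rules rule out: keeping a
// reference into a Vec while pushing to it, which in C++ could leave a
// dangling pointer after the vector reallocates.
fn sum_of_first_and_last(data: &[i32]) -> i32 {
    data[0] + data[data.len() - 1]
}

fn main() {
    let mut data = vec![1, 2, 3];
    let first = &data[0];          // shared borrow of `data` begins
    // data.push(4);               // compile error: cannot borrow `data` as
    //                             // mutable while `first` is live, since a
    //                             // push may reallocate and dangle `first`
    println!("first = {}", first); // shared borrow ends after its last use
    data.push(4);                  // mutation is allowed again here
    println!("sum = {}", sum_of_first_and_last(&data));
}
```

The commented-out line is the whole point: the equivalent C++ compiles cleanly and fails, if at all, only at run time.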

Microsoft drove home this point in 2019 after revealing 70% of its patches in the past 12 years were fixes for memory safety bugs due largely to Windows being written mostly in C and C++. Google's Chrome team weighed in with its own findings in 2020, revealing that 70% of all serious security bugs in the Chrome codebase were memory management and safety bugs. It's written mostly in C++.     

[...] "Unless something odd happens, it [Rust] will make it into 6.1," wrote Linux creator Linus Torvalds, seemingly ending a long-running debate over Rust becoming a second language to C for the Linux kernel.

The Azure CTO's only qualifier is that Rust is preferable over C and C++ for new projects that require a non-garbage-collected (GC) language; GC engines handle memory management automatically. Google's Go is a garbage-collected language, while Rust is not. AWS engineers favor Rust over Go because of the efficiency it offers without GC.
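The distinction can be made concrete with a small sketch (a hypothetical example, not from the article): in Rust, reclamation happens at a point the programmer can see in the code, rather than whenever a collector decides to run.

```rust
use std::rc::Rc;

// Without a GC, Rust frees memory at statically known points: a value is
// reclaimed when its last owner goes away. The reference count updates the
// moment a clone is dropped -- no collector decides later.
fn counts_before_and_after_drop() -> (usize, usize) {
    let shared = Rc::new(String::from("payload"));
    let clone = Rc::clone(&shared);
    let before = Rc::strong_count(&shared); // 2: `shared` and `clone`
    drop(clone);                            // deterministic release point
    let after = Rc::strong_count(&shared);  // 1, observable immediately
    (before, after)
}

fn main() {
    let (before, after) = counts_before_and_after_drop();
    println!("refs before drop: {}, after: {}", before, after);
}
```

In a GC language the second count could stay stale until the next collection; here the bookkeeping is synchronous with the `drop`.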

"Speaking of languages, it's time to halt starting any new projects in C/C++ and use Rust for those scenarios where a non-GC language is required. For the sake of security and reliability, the industry should declare those languages as deprecated," Russinovich wrote.

Rust is a promising replacement for C and C++, particularly for systems-level programming, infrastructure projects, embedded software development, and more – but not everywhere and not in all projects.  

[...] Rust shouldn't be viewed as a silver bullet for all the bad habits developers practice when coding in C or C++. 

Bob Rudis, a cybersecurity researcher for GreyNoise Intelligence who was formerly with Rapid7, noted that developers can carry the same bad security habits over to Rust.

"As others have said, you can write 'safely' in C or C++, but it's much harder, no matter what dialect you use, than it is in Rust. Mind you, you can still foul up security in Rust, but it does avoid a lot of old memory problems."
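One way to see the limit of those guarantees is a logic-level bug that safe Rust compiles without complaint (the handler and paths below are hypothetical, purely for illustration): memory safety says nothing about, say, path traversal.

```rust
use std::path::{Path, PathBuf};

// Safe Rust happily compiles a classic path-traversal hole: this naive join
// never touches memory unsafely, yet escapes its document root whenever the
// user-supplied path contains "../" components.
fn resolve(doc_root: &Path, user_path: &str) -> PathBuf {
    doc_root.join(user_path) // no normalization, no containment check
}

fn main() {
    let root = Path::new("/var/www");
    // Attacker-controlled input walks right out of the root.
    println!("{}", resolve(root, "../../etc/passwd").display());
}
```

The borrow checker has no opinion here; input validation remains the programmer's job in any language.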


Original Submission

 
This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by bart9h on Saturday September 24 2022, @03:18PM (20 children)

    by bart9h (767) on Saturday September 24 2022, @03:18PM (#1273396)

    "Stop using C and C++ FOR NEW PROJECTS" is very different from completely eradicating C and C++.

  • (Score: 2) by JoeMerchant on Saturday September 24 2022, @05:51PM (18 children)

    by JoeMerchant (3937) on Saturday September 24 2022, @05:51PM (#1273443)

    I don't know about the rest of the world, but I am most often paid to do new things, things that haven't been digested into libraries for multiple higher level languages yet. I mean, sure I leverage lots and lots of existing stuff, but it seems that 5-10% of what I am doing just isn't available in library calls yet.

    What this means is that when I use a higher-level language like Python or whatever, I inevitably end up having to figure out how to write certain modules of my code in C++ or C, or suffer 100:1 slowdowns inherent in the so-called superior language. Yes, it's higher level, but the only reason the whole thing isn't 100x slower than C is because it is calling C and C++ modules for everything that would slow it down.

    When you are writing in Python, you are actually calling a collection of C and C++ submodules, I assume Rust is much the same.

    --
    🌻🌻 [google.com]
    • (Score: 3, Insightful) by sgleysti on Saturday September 24 2022, @08:52PM (14 children)

      by sgleysti (56) Subscriber Badge on Saturday September 24 2022, @08:52PM (#1273476)

      I'm pretty sure Rust compiles to machine language.

      At my last job, I had to take some researcher's MATLAB code for processing a certain kind of data and make it run in a robust and automated fashion within certain time constraints in GNU Octave. After reorganizing the code a bit and making it run in Octave, I spent some quality time with the profiler and ended up rewriting several large, slow functions in C++ against Octave's API. I should note that MATLAB has a just-in-time compiler, while Octave does not.

      To this day, I think that interpreted languages are wasteful for anything that runs often. They're ok for prototyping, running quick calculations, and otherwise playing around. But if your code needs to run, why not compile to machine language?

      • (Score: 3, Insightful) by JoeMerchant on Saturday September 24 2022, @09:41PM (12 children)

        by JoeMerchant (3937) on Saturday September 24 2022, @09:41PM (#1273479)

        Yes, makes sense that the Linux kernel would use a compiled language, so it _could_ handle that 5-10% of my code that needs to be custom, what remains to be seen is if Microsoft's auto translation of "all the things" really covers my library calls that constitute 90-95% of my stuff.

        I led a team that did a Matlab to C++ translation, and after some profiler time we found an extra (unnecessary) level in a nested loop that resulted in about 100x speedup of the code... Language doesn't help much when your algorithms are intentionally wasting time...

        • (Score: 2) by turgid on Sunday September 25 2022, @09:24AM (11 children)

          by turgid (4318) Subscriber Badge on Sunday September 25 2022, @09:24AM (#1273538) Journal

          Kernel code has to be compiled for performance and timing reasons, and also to have complete access to all the hardware. An interpreter implies all sorts of overhead. You could have an interpreter in the kernel, but underneath it there would have to be some basic functionality in machine code to make things like paging and virtual memory work, to implement the scheduling and device drivers and so on.

          Typically, in kernel code, there are all sorts of constraints that you don't have in user-land coding. For example, in the days of 32-bit CPUs (and many embedded systems use 32-bit ARMs these days) when memory pages were 4k bytes, you were very limited in the amount of stack each function or each thread could have. Basically, it had to fit in one 4k page. The Linux kernel build system checks your code for this and spits out an error if you exceed the bounds.

          You could write an interpreter for something that would fit in there. It would have to be something simple like FORTH or maybe a very tiny BASIC like the 8-bit micros from 40 years ago.

          The philosophy of the kernel is that it should be small and simple and provide the bare minimum. All the complicated stuff should be in user land, where processes are isolated from each other, memory is protected and virtual, things can be run without root privileges, stacks and heaps can be megabytes in size, the OS cleans up after your process when it terminates, etc.

          • (Score: 3, Insightful) by JoeMerchant on Sunday September 25 2022, @12:35PM (10 children)

            by JoeMerchant (3937) on Sunday September 25 2022, @12:35PM (#1273548)

            You are also describing my impression of Python in ML/AI applications: it has to have compiled modules (C, Fortran, whatever) underneath, or you will be waiting for years instead of days for your advanced models to train.

            • (Score: 2) by turgid on Sunday September 25 2022, @12:46PM (9 children)

              by turgid (4318) Subscriber Badge on Sunday September 25 2022, @12:46PM (#1273552) Journal

              Correct. Interpreters add a significant overhead. Try writing one yourself to see how much. I once wrote a very simple interpreter, an "emulator" for an imaginary CPU whose instruction set I made up myself. I wrote it in C. It ran at about 8 MIPS on a 1.67GHz superscalar machine.
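The overhead lives in the dispatch loop. A toy version of such an emulator might look like the sketch below in Rust (the instruction set is invented here for illustration): every operation pays for a fetch and a `match` before any real work happens, which is exactly where the interpreter's per-instruction cost comes from.

```rust
// A toy interpreter for an imaginary stack machine. The fetch/decode/execute
// loop in `run` is the overhead an interpreter pays on every instruction.
#[derive(Clone, Copy)]
enum Op {
    Push(i64), // push a constant onto the stack
    Add,       // pop two values, push their sum
    Mul,       // pop two values, push their product
}

fn run(program: &[Op]) -> i64 {
    let mut stack: Vec<i64> = Vec::new();
    for op in program {
        match op {
            Op::Push(n) => stack.push(*n),
            Op::Add => {
                let (b, a) = (stack.pop().unwrap(), stack.pop().unwrap());
                stack.push(a + b);
            }
            Op::Mul => {
                let (b, a) = (stack.pop().unwrap(), stack.pop().unwrap());
                stack.push(a * b);
            }
        }
    }
    stack.pop().unwrap()
}

fn main() {
    // Computes (2 + 3) * 4.
    let program = [Op::Push(2), Op::Push(3), Op::Add, Op::Push(4), Op::Mul];
    println!("{}", run(&program));
}
```

Compiled code would reduce that whole program to a handful of machine instructions; the interpreter instead runs the loop body five times.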

              • (Score: 2) by JoeMerchant on Sunday September 25 2022, @03:13PM (8 children)

                by JoeMerchant (3937) on Sunday September 25 2022, @03:13PM (#1273563)

                Oh, c'mon, I remember the Java sales pitches of the late '90s: it will compile to byte-code. We will have byte-code interpreters in silicon. Even today, aren't they pitching JIT byte code compilation and optimization as the most efficient possible solution (for a very particular problem space...)?

                • (Score: 2) by turgid on Sunday September 25 2022, @04:26PM (7 children)

                  by turgid (4318) Subscriber Badge on Sunday September 25 2022, @04:26PM (#1273569) Journal

                  A bytecode interpreter in silicon is called a CPU :-) To be fair, there are apparently some corner cases that a JIT can do better than a standard compiler through feedback from the running program. Or at least, there were such claims some years ago. I meant to write a re-compiler for my imaginary CPU to translate the machine code into C source, and then see how much faster it is. Hey, there's an idea for a little hobby project!

                  • (Score: 2) by JoeMerchant on Sunday September 25 2022, @05:15PM (6 children)

                    by JoeMerchant (3937) on Sunday September 25 2022, @05:15PM (#1273575)

                    >re-compiler for my imaginary CPU to translate the machine code into C source, and then see how much faster it is. Hey. there's an idea for a little hobby project!

                    A carefully selected University might just give you a PhD for that, unless something comes up that they need you to do instead to secure some grant....

                    • (Score: 2) by turgid on Sunday September 25 2022, @05:44PM (5 children)

                      by turgid (4318) Subscriber Badge on Sunday September 25 2022, @05:44PM (#1273580) Journal

                      Really? I can think of a completely trivial solution that ticks the box. It might not be optimal... But I think a certain Fabrice Bellard of mega-super-intelligence and qemu fame might already have invented it and perfected it many years ago.

                      • (Score: 3, Funny) by JoeMerchant on Sunday September 25 2022, @06:21PM (4 children)

                        by JoeMerchant (3937) on Sunday September 25 2022, @06:21PM (#1273588)

                        Could be, I disengaged from the theory long ago...

                        Now, when you say "imaginary CPU" that's where you become PhD eligible, since you can always imagine a CPU that hasn't existed before, and therefore your PhD will be original :-P

                        • (Score: 2) by turgid on Sunday September 25 2022, @06:41PM (3 children)

                          by turgid (4318) Subscriber Badge on Sunday September 25 2022, @06:41PM (#1273598) Journal

                          It was an ungodly mix of SPARC and ARM, FYI.

                          • (Score: 3, Interesting) by JoeMerchant on Tuesday September 27 2022, @01:58AM (2 children)

                            by JoeMerchant (3937) on Tuesday September 27 2022, @01:58AM (#1273801)

                            For my Master's thesis in 1989 I essentially built the processor from the PS3: a common Motorola CPU for a central controller with an array of 8 DSPs for the "heavy lifting."

                            • (Score: 2) by turgid on Tuesday September 27 2022, @11:49AM (1 child)

                              by turgid (4318) Subscriber Badge on Tuesday September 27 2022, @11:49AM (#1273861) Journal
                              • (Score: 3, Funny) by JoeMerchant on Tuesday September 27 2022, @02:37PM

                                by JoeMerchant (3937) on Tuesday September 27 2022, @02:37PM (#1273873)

                                When they announced the architecture, some 15+ years later, I was like: "Well, it was obvious to me, what took everyone else so long?!!"

                                Side note: although I bought the parts and started breadboarding the actual thing, people from my advisor to other professors and other MS/PhD candidates kept saying "you know, you don't have to actually build one to get your degree...." I eventually listened to them after about a semester and a half.

      • (Score: 0) by Anonymous Coward on Sunday September 25 2022, @08:06PM

        by Anonymous Coward on Sunday September 25 2022, @08:06PM (#1273616)

        At my last job, I had to take some researcher's MATLAB code for processing a certain kind of data and make it run in robust and automated fashion within certain time constraints in GNU Octave. After reorganizing the code a bit and making it run in Octave, I spent some quality time with the profiler and ended up writing several large, slow functions in C++ against Octave's API. I should note that MATLAB has a just in time compiler, while Octave does not.

        A large part of the skill of leveraging high level languages such as MATLAB is to jack-knife your problem into things they are supremely fast at. In MATLAB, you're a fool if you're not using it as a wrapper to the hand-optimized assembly math libraries. It's a fantastic tool and I only resent that it costs so much, but it's a tool you have to learn. Think of it as a musical instrument like a piano and using it to record a track a single line at a time. Wrong. Play all the lines at the same time. Right.

    • (Score: 2) by bart9h on Saturday September 24 2022, @09:50PM (2 children)

      by bart9h (767) on Saturday September 24 2022, @09:50PM (#1273481)

      No, Rust is not a higher-level language. It's a systems language, compiled to native machine code. It's pretty much at the same level as C or C++.

      • (Score: 2) by JoeMerchant on Saturday September 24 2022, @11:48PM (1 child)

        by JoeMerchant (3937) on Saturday September 24 2022, @11:48PM (#1273489)

        True enough, though all those machine-translated C libraries may not behave exactly the same once the translated C is compiled as Rust...

        • (Score: 2) by bart9h on Sunday September 25 2022, @05:24PM

          by bart9h (767) on Sunday September 25 2022, @05:24PM (#1273576)

          Actually, Rust code can use C libraries with ease.

          The libraries are not translated from C to Rust. Either you re-implement it in Rust, or, more commonly, you use the C library directly.
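As a sketch of how direct that is: a Rust program can declare a C function and call it with no translation step at all. Here we bind `abs` from the platform's C library, which is already linked into Rust programs on common platforms.

```rust
use std::os::raw::c_int;

// Declare a symbol from the platform's C library. Nothing is generated or
// translated; the block just describes the existing C function's signature.
extern "C" {
    fn abs(n: c_int) -> c_int;
}

// The call itself is `unsafe` because the compiler cannot verify the C
// side's contract; a thin safe wrapper is the usual idiom.
fn c_abs(n: i32) -> i32 {
    unsafe { abs(n) }
}

fn main() {
    println!("abs(-5) via libc = {}", c_abs(-5));
}
```

Larger bindings are typically generated with tools like bindgen, but the mechanism underneath is exactly this.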

  • (Score: 2) by DannyB on Monday September 26 2022, @03:36PM

    by DannyB (5839) Subscriber Badge on Monday September 26 2022, @03:36PM (#1273723) Journal

    "Stop using C and C++ FOR NEW PROJECTS" is very different from completely eradicating C and C++.

    I am reminded of a conversation between Londo and Mr. Morden . . .

    Londo: "why don't you eliminate the entire Narn homeworld while you're at it?"

    Morden: "One thing at a time ambassador, one thing at a time"

    The look on Londo's face shows that he suddenly realizes what he has gotten himself into.

    Maybe just wait for Rust++

    --
    The lower I set my standards the more accomplishments I have.