
posted by janrinok on Friday September 23 2022, @11:35PM   Printer-friendly
from the leaks-are-for-kids dept.

Arthur T Knackerbracket has processed the following story:

Mark Russinovich, the chief technology officer (CTO) of Microsoft Azure, says developers should avoid using the C or C++ programming languages in new projects and instead use Rust, because of security and reliability concerns.

Rust, which hit version 1.0 in 2015 and was born at Mozilla, is now being used within the Android Open Source Project (AOSP), at Meta, at Amazon Web Services, at Microsoft for parts of Windows and Azure, in the Linux kernel, and in many other places.

Engineers value its "memory safety guarantees", which reduce the need to manually manage a program's memory and, in turn, cut the risk of memory-related security flaws burdening big projects written in "memory unsafe" C or C++, which include Chrome, Android, the Linux kernel, and Windows.
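
That guarantee is concrete enough to show in a few lines. Below is a minimal sketch (not from the story): the commented-out block is a use-after-scope bug of the kind C or C++ will happily compile, and which rustc rejects at compile time; the version underneath hands ownership out of the inner scope instead and compiles cleanly.

```rust
// Hypothetical illustration of a "memory safety guarantee" in practice.
fn main() {
    // The borrow checker rejects this: the reference would outlive the value.
    //
    // let dangling: &String;
    // {
    //     let s = String::from("temporary");
    //     dangling = &s;   // error[E0597]: `s` does not live long enough
    // }                    // `s` is dropped here while still borrowed
    // println!("{dangling}");

    // Moving ownership out of the scope is accepted: the value stays valid.
    let owned = {
        let s = String::from("temporary");
        s
    };
    println!("{owned}");
}
```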

Microsoft drove home this point in 2019 after revealing 70% of its patches in the past 12 years were fixes for memory safety bugs due largely to Windows being written mostly in C and C++. Google's Chrome team weighed in with its own findings in 2020, revealing that 70% of all serious security bugs in the Chrome codebase were memory management and safety bugs. It's written mostly in C++.     

"Unless something odd happens, it [Rust] will make it into 6.1," wrote Torvalds, seemingly ending a long-running debate over Rust becoming a second language to C for the Linux kernel. 

The Azure CTO's only qualifier about using Rust is that it is preferable over C and C++ for new projects that require a non-garbage-collected (GC) language. GC engines handle memory management. Google's Go is a garbage-collected language, while the Rust project emphasizes that Rust is not. AWS engineers prefer Rust over Go because of the efficiencies it offers without GC.
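
A minimal sketch of the non-GC point, using a toy `Buffer` type invented for illustration: Rust frees resources deterministically when their owner goes out of scope (via `Drop`), rather than leaving reclamation to a background garbage collector as Go does.

```rust
// Toy type standing in for any resource; invented for this example.
struct Buffer(&'static str);

impl Drop for Buffer {
    fn drop(&mut self) {
        // Runs at a statically known point, not whenever a collector decides to.
        println!("freed {} at the end of its scope", self.0);
    }
}

fn main() {
    let _outer = Buffer("outer");
    {
        let _inner = Buffer("inner");
        println!("leaving inner scope");
    } // `_inner` is freed exactly here
    println!("leaving outer scope");
} // `_outer` is freed exactly here
```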

"Speaking of languages, it's time to halt starting any new projects in C/C++ and use Rust for those scenarios where a non-GC language is required. For the sake of security and reliability. the industry should declare those languages as deprecated," Russinovich wrote. 

Rust is a promising replacement for C and C++, particularly for systems-level programming, infrastructure projects, embedded software development, and more – but not everywhere and not in all projects.  

[...] Rust shouldn't be viewed as a silver bullet for all the bad habits developers practice when coding in C or C++. 

Bob Rudis, a cybersecurity researcher for GreyNoise Intelligence who was formerly with Rapid7, noted that developers can carry the same bad security habits across to Rust.

"As others have said, you can write "safely" in C or C++, but it's much harder, no matter what dialect you use than it is in Rust. Mind you, you can still foul up security in Rust, but it does avoid a lot of old memory problems."


Original Submission

 
This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 3, Insightful) by JoeMerchant on Saturday September 24 2022, @09:41PM (12 children)

    by JoeMerchant (3937) on Saturday September 24 2022, @09:41PM (#1273479)

    Yes, it makes sense that the Linux kernel would use a compiled language, so it _could_ handle the 5-10% of my code that needs to be custom; what remains to be seen is whether Microsoft's auto-translation of "all the things" really covers the library calls that constitute 90-95% of my stuff.

    I led a team that did a Matlab to C++ translation, and after some profiler time we found an extra (unnecessary) level in a nested loop; removing it resulted in about a 100x speedup of the code... Language doesn't help much when your algorithms are intentionally wasting time...
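
    A minimal sketch of that kind of profiler find (invented numbers, Rust only for brevity): an extra, unnecessary loop level turns an O(n^2) pass into O(n^3); removing it changes nothing about the result and everything about the runtime.

    ```rust
    // Both functions compute the same sum of pairwise products.
    fn sum_pairs_slow(data: &[f64]) -> f64 {
        let mut total = 0.0;
        for i in 0..data.len() {
            for j in 0..data.len() {
                // Unnecessary third level: redoes the same work data.len() times.
                for _ in 0..data.len() {
                    total += data[i] * data[j] / data.len() as f64;
                }
            }
        }
        total
    }

    fn sum_pairs_fast(data: &[f64]) -> f64 {
        let mut total = 0.0;
        for i in 0..data.len() {
            for j in 0..data.len() {
                total += data[i] * data[j];
            }
        }
        total
    }

    fn main() {
        let data = vec![1.0; 200];
        // Same answer (up to rounding), roughly 200x less work for the fast version.
        println!("{} {}", sum_pairs_slow(&data), sum_pairs_fast(&data));
    }
    ```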

  • (Score: 2) by turgid on Sunday September 25 2022, @09:24AM (11 children)

    by turgid (4318) Subscriber Badge on Sunday September 25 2022, @09:24AM (#1273538) Journal

    Kernel code has to be compiled for performance and timing reasons, and also to have complete access to all the hardware. An interpreter implies all sorts of overhead. You could have an interpreter in the kernel, but underneath it there would have to be some basic functionality in machine code to make things like paging and virtual memory work, to implement the scheduling and device drivers and so on.

    Typically, in kernel code, there are all sorts of constraints that you don't have in user-land coding. For example, in the days of 32-bit CPUs (and many embedded systems use 32-bit ARMs these days) when memory pages were 4k bytes, you were very limited in the amount of stack each function or each thread could have. Basically, it had to fit in one 4k page. The Linux kernel build system checks your code for this and spits out an error if you exceed the bounds.

    You could write an interpreter for something that would fit in there. It would have to be something simple like FORTH or maybe a very tiny BASIC like the 8-bit micros from 40 years ago. The philosophy of the kernel is that it should be small and simple and provide the bare minimum. All the complicated stuff should be in user land, where processes are isolated from each other, memory is protected and virtual, things can be run without root privileges, stacks and heaps can be megabytes in size, the OS cleans up after your process when it terminates etc.
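
    A minimal, user-space sketch of the stack-budget constraint described above (the 4 KiB figure comes from the comment; the code is only an illustration): one oversized local array blows the whole budget, so large buffers have to live somewhere other than the stack.

    ```rust
    fn frame_too_big_for_a_4k_stack() -> u8 {
        // An 8 KiB local buffer: already twice an assumed 4 KiB stack budget.
        let buffer = [0u8; 8192];
        buffer[0]
    }

    fn frame_that_fits() -> u8 {
        // The same 8 KiB lives on the heap; the stack frame holds only a
        // pointer, a length, and a capacity.
        let buffer = vec![0u8; 8192];
        buffer[0]
    }

    fn main() {
        println!("{} {}", frame_too_big_for_a_4k_stack(), frame_that_fits());
    }
    ```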

    • (Score: 3, Insightful) by JoeMerchant on Sunday September 25 2022, @12:35PM (10 children)

      by JoeMerchant (3937) on Sunday September 25 2022, @12:35PM (#1273548)

      You are also describing my impression of Python in ML/AI applications: it has to have compiled modules (C, Fortran, whatever) underneath, or you will be waiting for years instead of days for your advanced models to train.

      • (Score: 2) by turgid on Sunday September 25 2022, @12:46PM (9 children)

        by turgid (4318) Subscriber Badge on Sunday September 25 2022, @12:46PM (#1273552) Journal

        Correct. Interpreters add a significant overhead. Try writing one yourself to see how much. I once wrote a very simple interpreter, an "emulator" for an imaginary CPU whose instruction set I made up myself. I wrote it in C. It ran at about 8 MIPS on a 1.67GHz superscalar machine.
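
        A minimal sketch of where that overhead comes from (toy instruction set invented for the example): every guest instruction costs a fetch, a decode, and a dispatch branch on the host CPU, which is why even a simple emulator runs orders of magnitude slower than the hardware underneath it.

        ```rust
        // Three-instruction stack machine, invented for illustration.
        enum Op {
            Push(i64),
            Add,
            Halt,
        }

        fn run(program: &[Op]) -> i64 {
            let mut stack = Vec::new();
            let mut pc = 0;
            loop {
                // Fetch + decode + dispatch: several host instructions (and a
                // possible branch mispredict) for every guest instruction.
                match &program[pc] {
                    Op::Push(v) => stack.push(*v),
                    Op::Add => {
                        let b = stack.pop().unwrap();
                        let a = stack.pop().unwrap();
                        stack.push(a + b);
                    }
                    Op::Halt => return stack.pop().unwrap(),
                }
                pc += 1;
            }
        }

        fn main() {
            // Computes 2 + 3 at the price of many host instructions per guest op.
            println!("{}", run(&[Op::Push(2), Op::Push(3), Op::Add, Op::Halt]));
        }
        ```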

        • (Score: 2) by JoeMerchant on Sunday September 25 2022, @03:13PM (8 children)

          by JoeMerchant (3937) on Sunday September 25 2022, @03:13PM (#1273563)

          Oh, c'mon, I remember the Java sales pitches of the late '90s: it will compile to byte-code. We will have byte-code interpreters in silicon. Even today, aren't they pitching JIT byte code compilation and optimization as the most efficient possible solution (for a very particular problem space...)?

          • (Score: 2) by turgid on Sunday September 25 2022, @04:26PM (7 children)

            by turgid (4318) Subscriber Badge on Sunday September 25 2022, @04:26PM (#1273569) Journal

            A bytecode interpreter in silicon is called a CPU :-) To be fair, there are apparently some corner cases that a JIT can do better than a standard compiler through feedback from the running program. Or at least, there were claims some years ago. I meant to write a re-compiler for my imaginary CPU to translate the machine code into C source, and then see how much faster it is. Hey, there's an idea for a little hobby project!
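
            A minimal sketch of that recompiler idea, reusing the same toy instruction set as the interpreter sketch above (all of it invented for illustration): instead of dispatching each guest op at runtime, emit equivalent C source once and let a C compiler optimize it, leaving no dispatch loop behind.

            ```rust
            enum Op {
                Push(i64),
                Add,
                Halt,
            }

            // Translate a toy program into C source text.
            fn emit_c(program: &[Op]) -> String {
                let mut out = String::from("#include <stdio.h>\nint main(void){long s[64];int sp=0;\n");
                for op in program {
                    match op {
                        Op::Push(v) => out.push_str(&format!("s[sp++]={};\n", v)),
                        Op::Add => out.push_str("sp--; s[sp-1]+=s[sp];\n"),
                        Op::Halt => out.push_str("printf(\"%ld\\n\", s[sp-1]); return 0;\n"),
                    }
                }
                out.push_str("}\n");
                out
            }

            fn main() {
                // Prints a C program that computes 2 + 3 with no runtime dispatch loop.
                println!("{}", emit_c(&[Op::Push(2), Op::Push(3), Op::Add, Op::Halt]));
            }
            ```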

            • (Score: 2) by JoeMerchant on Sunday September 25 2022, @05:15PM (6 children)

              by JoeMerchant (3937) on Sunday September 25 2022, @05:15PM (#1273575)

              >re-compiler for my imaginary CPU to translate the machine code into C source, and then see how much faster it is. Hey. there's an idea for a little hobby project!

              A carefully selected University might just give you a PhD for that, unless something comes up that they need you to do instead to secure some grant....

              • (Score: 2) by turgid on Sunday September 25 2022, @05:44PM (5 children)

                by turgid (4318) Subscriber Badge on Sunday September 25 2022, @05:44PM (#1273580) Journal

                Really? I can think of a completely trivial solution that ticks the box. It might not be optimal... But I think a certain Fabrice Bellard of mega-super-intelligence and qemu fame might already have invented it and perfected it many years ago.

                • (Score: 3, Funny) by JoeMerchant on Sunday September 25 2022, @06:21PM (4 children)

                  by JoeMerchant (3937) on Sunday September 25 2022, @06:21PM (#1273588)

                  Could be, I disengaged from the theory long ago...

                  Now, when you say "imaginary CPU" that's where you become PhD eligible, since you can always imagine a CPU that hasn't existed before, and therefore your PhD will be original :-P

                  • (Score: 2) by turgid on Sunday September 25 2022, @06:41PM (3 children)

                    by turgid (4318) Subscriber Badge on Sunday September 25 2022, @06:41PM (#1273598) Journal

                    It was an ungodly mix of SPARC and ARM, FYI.

                    • (Score: 3, Interesting) by JoeMerchant on Tuesday September 27 2022, @01:58AM (2 children)

                      by JoeMerchant (3937) on Tuesday September 27 2022, @01:58AM (#1273801)

                      For my Master's thesis in 1989 I essentially built the processor from the PS3: a common Motorola CPU as the central controller with an array of 8 DSPs for the "heavy lifting."

                      • (Score: 2) by turgid on Tuesday September 27 2022, @11:49AM (1 child)

                        by turgid (4318) Subscriber Badge on Tuesday September 27 2022, @11:49AM (#1273861) Journal
                        • (Score: 3, Funny) by JoeMerchant on Tuesday September 27 2022, @02:37PM

                          by JoeMerchant (3937) on Tuesday September 27 2022, @02:37PM (#1273873)

                          When they announced the architecture, some 15+ years later, I was like: "Well, it was obvious to me, what took everyone else so long?!!"

                          Side note: although I bought the parts and started breadboarding the actual thing, people from my advisor to other professors and other MS/PhD candidates kept saying "you know, you don't have to actually build one to get your degree...." I eventually listened to them after about a semester and a half.
