
SoylentNews is people

posted by janrinok on Friday September 23 2022, @11:35PM   Printer-friendly
from the leaks-are-for-kids dept.

Arthur T Knackerbracket has processed the following story:

Mark Russinovich, the chief technology officer (CTO) of Microsoft Azure, says developers should avoid using the C or C++ programming languages in new projects and instead use Rust, because of security and reliability concerns.

Rust, which hit version 1.0 in 2015 and was born at Mozilla, is now being used within the Android Open Source Project (AOSP), at Meta, at Amazon Web Services, at Microsoft for parts of Windows and Azure, in the Linux kernel, and in many other places. 

Engineers value its "memory safety guarantees", which reduce the need to manually manage a program's memory and, in turn, cut the risk of memory-related security flaws burdening big projects written in "memory unsafe" C or C++, which includes Chrome, Android, the Linux kernel, and Windows. 
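The "memory safety guarantees" mentioned above can be made concrete with a minimal sketch (an invented example, not from the article): the borrow checker rejects at compile time the kind of aliasing bug that in C or C++ would compile cleanly and become a silent use-after-invalidation.

```rust
// Sketch: a borrowed slice into a String, with the mutation that
// would invalidate it rejected at compile time.

fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let owned = String::from("hello world");
    let w = first_word(&owned); // `w` borrows from `owned`
    // owned.clear();           // rejected by the compiler: cannot
    //                          // mutate `owned` while `w` still
    //                          // borrows it; the C++ equivalent
    //                          // would compile and dangle
    println!("{}", w);
}
```

The equivalent C++ (clearing a `std::string` while holding a `string_view` into it) compiles without complaint; here the bug is a build failure instead of a CVE.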

Microsoft drove home this point in 2019 after revealing 70% of its patches in the past 12 years were fixes for memory safety bugs due largely to Windows being written mostly in C and C++. Google's Chrome team weighed in with its own findings in 2020, revealing that 70% of all serious security bugs in the Chrome codebase were memory management and safety bugs. It's written mostly in C++.     

"Unless something odd happens, it [Rust] will make it into 6.1," wrote Torvalds, seemingly ending a long-running debate over Rust becoming a second language to C for the Linux kernel. 

The Azure CTO's only qualifier is that Rust is preferable over C and C++ for new projects that require a non-garbage-collected (GC) language. A GC runtime handles memory management on the program's behalf. Google's Go is garbage-collected, while Rust is not. AWS engineers prefer Rust over Go because of the efficiency it offers without GC.
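What "non-GC" means in practice can be sketched as follows (an invented example, not from the article): Rust frees resources at points fixed at compile time, at end of scope, so there is no collector thread and no GC pause. The logging here only exists to make the drop points observable.

```rust
use std::cell::RefCell;

thread_local! {
    // records the order in which resources are freed
    static DROPS: RefCell<Vec<&'static str>> = RefCell::new(Vec::new());
}

struct Resource(&'static str);

impl Drop for Resource {
    fn drop(&mut self) {
        DROPS.with(|d| d.borrow_mut().push(self.0));
    }
}

fn drop_order() -> Vec<&'static str> {
    {
        let _outer = Resource("outer");
        let _inner = Resource("inner");
    } // both freed right here, in reverse declaration order
    DROPS.with(|d| d.borrow().clone())
}

fn main() {
    assert_eq!(drop_order(), vec!["inner", "outer"]);
}
```

This deterministic destruction (the same RAII idea C++ uses) is why Rust competes with C and C++ in domains where a GC pause is unacceptable.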

"Speaking of languages, it's time to halt starting any new projects in C/C++ and use Rust for those scenarios where a non-GC language is required. For the sake of security and reliability, the industry should declare those languages as deprecated," Russinovich wrote. 

Rust is a promising replacement for C and C++, particularly for systems-level programming, infrastructure projects, embedded software development, and more – but not everywhere and not in all projects.  

[...] Rust shouldn't be viewed as a silver bullet for all the bad habits developers practice when coding in C or C++. 

Bob Rudis, a cybersecurity researcher for GreyNoise Intelligence, who was formerly with Rapid7, noted developers can carry across the same bad security habits to Rust.

"As others have said, you can write 'safely' in C or C++, but no matter what dialect you use, it's much harder than it is in Rust. Mind you, you can still foul up security in Rust, but it does avoid a lot of old memory problems."


Original Submission

 
This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 4, Funny) by MIRV888 on Saturday September 24 2022, @05:00AM (5 children)

    by MIRV888 (11376) on Saturday September 24 2022, @05:00AM (#1273328)

    I don't know jack about programming, but if it's anything like hardware there are new iterations constantly. Wouldn't this necessitate eventually moving to a new coding language because of new features the advancing hardware provides?
    I just know C / C++ are long in the tooth age wise. I don't know if this equates to being dated in capabilities.
    Thanks

  • (Score: 5, Informative) by coolgopher on Saturday September 24 2022, @05:41AM

    by coolgopher (1157) on Saturday September 24 2022, @05:41AM (#1273337)

    Generally speaking, no, you don't need a new language. Hardware advances/advantages are taken care of by the compilers. Modern compilers typically comprise a few distinct parts. You have a frontend which translates a particular programming language into what's known as an Abstract Syntax Tree. Various optimisers then reshape this for speed (or size, if that's the stated goal), possibly with some knowledge of the target architecture. The processed syntax tree is then handed to a hardware-architecture specific code generator backend, which may in turn further tweak the syntax tree to make the most of the available hardware. In this model, a new human/machine language needs only a new frontend implementation, after which point it can tap into the full optimisation/target support, while hardware advances often only need support added in the relevant backend. This is obviously a very generalised overview.
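The pipeline described above can be shown with a toy version (all names invented for illustration): a "frontend" syntax tree, an optimiser pass that constant-folds it, and a trivial "backend" that walks the same tree. Real compilers are vastly more elaborate, but the separation of stages is the same.

```rust
#[derive(Debug, PartialEq)]
enum Expr {
    Num(i64),
    Add(Box<Expr>, Box<Expr>),
}

// Optimiser pass: fold Add(Num, Num) into Num, bottom-up.
fn fold(e: Expr) -> Expr {
    match e {
        Expr::Add(a, b) => match (fold(*a), fold(*b)) {
            (Expr::Num(x), Expr::Num(y)) => Expr::Num(x + y),
            (a, b) => Expr::Add(Box::new(a), Box::new(b)),
        },
        n => n,
    }
}

// "Backend": here it just evaluates the tree; a real one would emit
// target-specific machine code from the same structure.
fn eval(e: &Expr) -> i64 {
    match e {
        Expr::Num(n) => *n,
        Expr::Add(a, b) => eval(a) + eval(b),
    }
}

fn main() {
    let ast = Expr::Add(
        Box::new(Expr::Num(1)),
        Box::new(Expr::Add(Box::new(Expr::Num(2)), Box::new(Expr::Num(3)))),
    );
    let optimised = fold(ast);
    assert_eq!(optimised, Expr::Num(6)); // whole tree folded away
    assert_eq!(eval(&optimised), 6);
}
```

A new source language only needs a new frontend producing this tree; a new CPU only needs a new backend consuming it.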

    In short, only the compiler implementers end up having to worry about hardware advancements. Thankfully. Because modern hardware has gotten stupidly complex. And I have a huge amount of respect and appreciation for the people who spend their time working on the open source compiler toolchains.

    All that said, there are indeed times when new languages are needed. For example, the massively parallel architecture of GPUs led to the development of CUDA in order to better take advantage of that architecture, since it was such a radical departure from your typical CPU. And of course, development for quantum computers looks utterly different from classical computing (and there's a lot of maturing to be done in that area).

  • (Score: 4, Insightful) by maxwell demon on Saturday September 24 2022, @05:41AM (2 children)

    by maxwell demon (1608) on Saturday September 24 2022, @05:41AM (#1273338) Journal

    There are new iterations of C and C++ constantly. Also, unlike hardware, languages usually get new capabilities through libraries. A new language means loss of all the libraries that haven't yet been ported. To mitigate that problem, many languages include ways to call C code.
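The "call C code" escape hatch mentioned above looks like this in Rust (a small sketch): declare the C function's signature in an `extern "C"` block, and the linker resolves it against libc, so the existing C ecosystem stays usable.

```rust
use std::ffi::CString;
use std::os::raw::c_char;

extern "C" {
    // size_t strlen(const char *s); C's size_t maps to Rust's usize
    fn strlen(s: *const c_char) -> usize;
}

fn c_strlen(s: &str) -> usize {
    let c = CString::new(s).expect("no interior NUL bytes");
    // unsafe because the compiler cannot verify the C side
    unsafe { strlen(c.as_ptr()) }
}

fn main() {
    assert_eq!(c_strlen("hello"), 5);
    assert_eq!(c_strlen(""), 0);
}
```

The `unsafe` block marks exactly where Rust's guarantees stop and the C library's contract takes over.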

    --
    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 3, Insightful) by sgleysti on Saturday September 24 2022, @04:45PM

      by sgleysti (56) Subscriber Badge on Saturday September 24 2022, @04:45PM (#1273423)

      To mitigate that problem, many languages include ways to call C code.

      And FORTRAN code.

    • (Score: 3, Informative) by turgid on Saturday September 24 2022, @05:44PM

      by turgid (4318) Subscriber Badge on Saturday September 24 2022, @05:44PM (#1273437) Journal

      There's another subtlety, though. On a Unix/Linux system, the Application Binary Interface (ABI), the way programs call each other, is defined by the output of the C compiler. Because Unix was written in C (the first portable OS written in a high-level language), the way C calls functions is the way you call the OS, and hence other libraries on the system. Any language that can generate code speaking this ABI is therefore compatible. By default, C++ does not. It has all sorts of hackery underneath. In general, interpreted languages don't either, for obvious reasons. Pascal and its family of languages don't by default (they put arguments on the stack in the opposite order).

      On a Unix or Linux system, the convention is for libraries to use the C ABI. That's what comes out of the compiler used to compile the system, after all. The problem comes when you have a load of libraries written in C++ (which is quite common). Because of C++'s name mangling and ABI quirks, you can't call those libraries from a program that isn't itself written in C++, and compiled with the same compiler at that, unless the programmers have provided an interface to the library using the C calling convention. That's a load more work in C++. It means wrapping each API call in an `extern "C" {}` declaration with a C function call...
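Worth noting that Rust faces exactly the constraint described above: its native ABI is not stable either, so anything meant to be called from other languages is exported through the C calling convention. This is Rust's spelling of the `extern "C" {}` wrapper idea (a minimal sketch):

```rust
// `extern "C"` selects the C calling convention; `no_mangle` keeps
// the plain symbol name `add_u32` instead of a mangled one, so C,
// or anything else speaking the C ABI, can link against it.
#[no_mangle]
pub extern "C" fn add_u32(a: u32, b: u32) -> u32 {
    a.wrapping_add(b)
}

fn main() {
    // still callable from Rust itself, of course
    assert_eq!(add_u32(2, 3), 5);
}
```

Built as a library, this exports a symbol usable from C with the declaration `uint32_t add_u32(uint32_t, uint32_t);`.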

      C++ was supposed to take over the world. Fortunately it didn't.

  • (Score: 2) by sgleysti on Saturday September 24 2022, @04:59PM

    by sgleysti (56) Subscriber Badge on Saturday September 24 2022, @04:59PM (#1273425)

    Perhaps the biggest new hardware features (for various definitions of new) are multiprocessing/parallelism and vector units. Some older languages don't handle parallelism well, or need libraries or extensions to do so. Support for vector units in modern processors is usually handled by upgrades to the compilers. The compilers might add intrinsics to aid in vectorizing code, although I'm pretty sure a linear algebra package that I use has explicit assembly code for various processors and architectures to make sure that critical routines use the vector units.
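The point above about compilers handling vector units can be illustrated with a sketch: a plain loop like this is routinely auto-vectorised into SIMD instructions by an optimising build, and the source doesn't change when a new vector ISA appears, only the compiler's backend does.

```rust
// A dot product written as an ordinary iterator chain; optimising
// compilers typically lower this to SIMD multiply-accumulate
// instructions for whatever vector unit the target has.
fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b.iter()).map(|(x, y)| x * y).sum()
}

fn main() {
    let a = [1.0_f32, 2.0, 3.0, 4.0];
    let b = [5.0_f32, 6.0, 7.0, 8.0];
    assert_eq!(dot(&a, &b), 70.0); // 5 + 12 + 21 + 32
}
```

The intrinsics and hand-written assembly mentioned above only come in when the auto-vectoriser isn't good enough for a critical routine.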

    Graphics processors are so different from CPUs that they require special languages and/or programming techniques.

    And then FPGAs are usually "programmed" in a hardware description language because they're a more different type of hardware still.