
posted by janrinok on Wednesday June 19 2024, @06:10PM   Printer-friendly
from the curl-and-libcurl dept.

cURL (established 1998) is one of the most widely used pieces of software in the world, especially if the development library libcurl is included. As has become tradition, the founder and lead developer Daniel Stenberg has published a detailed analysis of the annual cURL survey.

Ten quick things to take away

If you are in too much of a hurry to read it all, here are ten facts this year's survey revealed:

  • 96.4% of the users run it on Linux (*)
  • 98.6% use it for HTTPS (*)
  • 98% of Windows users run curl on 64-bit x86 (*)
  • 99% of the users who know which TLS library they use run curl with OpenSSL (*)
  • curl users run it on Android more than on FreeBSD
  • Windows 10 is the most used Windows version for curl use
  • More than 100 different command line options are a favorite of at least one user
  • 83.1% rate our "security handling" 5 out of 5
  • 22.8% of users would like to see it offer recursive HTML download
  • 14.1% of users have used curl for 18 or more years

(*) = not exclusively; these questions allowed respondents to select multiple answers, so the totals add up to more than 100%

cURL and libcurl are Free and Open Source Software, licensed under conditions similar to the MIT License.

Previously:
(2024) The I in LLM Stands for Intelligence
(2023) "cURL", the URL Code That Can, Marks 25 Years of Transfers
(2021) Half of Curl's Security Vulnerabilities Due to C Mistakes
(2020) curl up 2020 and Other Conferences Go Online Only
(2018) Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again
(2018) Reducing Year 2038 Problems in curl
(2016) Your wget (and curl) is Broken and Should DIE, GitHubbers Tell Microsoft



Related Stories

Your wget (and curl) is Broken and Should DIE, GitHubbers Tell Microsoft 53 comments

Well, that didn't take long: within a week of applause for Microsoft's decision to open-source PowerShell, a comment-war has broken out over curl and wget.

For those not familiar with these commands: they're open source command line tools for fetching Internet content without a browser. Apart from obvious applications like downloading whole sites (for example, as backup), they also work under the hood of a lot of other toolsets (one example the author is familiar with: GIS tools use curl and/or wget to fetch maps from Web services).

For some reason, Microsoft's team decided to put aliases for curl and wget in Windows PowerShell – but, as the thread that started the row points out, those aliases don't deliver curl and wget functionality.

The pull request says the aliases should be spiked: "They block use of the commonly used command line tools without providing even an attempt to offer the same functionality. They serve no purpose for PowerShell users but cause confusion and problems to existing curl and wget users."

http://www.theregister.co.uk/2016/08/23/your_wget_is_broken_and_should_die_powershellers_tell_microsoft/

-- submitted from IRC



Reducing Year 2038 Problems in curl 25 comments

curl is a command-line utility and library for transferring data identified by URLs. It is now year-2038 safe even on 32-bit systems. Daniel Stenberg, the original author of curl, has overseen a year-2038 fix for 32-bit systems. Without specific modifications, 32-bit systems cannot handle dates beyond 03:14:07 UTC on 19 January 2038. After that second, the 32-bit time counter overflows and wraps around, throwing the apparent date back decades: to December 1901 with a signed counter, or to the start of the UNIX epoch (00:00:00 UTC on 1 January 1970) with an unsigned one. Given the pervasiveness of 32-bit embedded systems and their long service lives, this is a serious problem, and good (essential) to have fixed decades in advance. The OpenBSD project was the first major software project to take steps to avoid potential disaster from 32-bit time, and awareness has since started to spread to other key software projects such as curl.
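
The failure mode fits in a few lines of C. A minimal sketch follows; the wraparound is simulated with a cast through unsigned, since plain signed overflow is undefined behavior, and the printed values assume the usual two's-complement platform:

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* A 32-bit signed time counter runs out at 2147483647 seconds
       after the epoch, i.e. 03:14:07 UTC on 19 January 2038. */
    int32_t t = INT32_MAX;
    printf("last representable second: %" PRId32 "\n", t);

    /* One second later it wraps around (the cast through unsigned
       keeps the arithmetic well-defined in C)... */
    t = (int32_t)((uint32_t)t + 1u);

    /* ...to -2147483648, which systems interpret as a date in
       December 1901, roughly 68 years before the epoch. */
    printf("one second later:          %" PRId32 "\n", t);
    return 0;
}
```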



Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again 63 comments

Daniel Stenberg, author of the ubiquitous URL fetcher cURL and the libcurl multiprotocol file transfer library, and recipient of the 2017 Polheim Prize, has been blocked again from attending a US-based conference. Daniel has written a post in his blog about his two-year odyssey through the byzantine US bureaucracy to try to get permission to attend a work-related conference in California. He has been in the US nine times previously but despite pages of paperwork, hundreds of dollars in fees, and personal visits to the embassy, no dice. As a result the conference will have to move outside the US and probably Canada too if it wants to stay open to the world's top talent.

Earlier on SN:
US Visa Applications May Soon Require Five Years of Social Media Info (2018)
Reducing Year 2038 Problems in curl (2018)
cURL turns Seventeen Today (2015)



curl up 2020 and Other Conferences Go Online Only 6 comments

The 2020 edition of curl up has gone to an online-only format this year and will not involve a physical meetup. Many other upcoming conferences have already announced either a complete cancellation or a similar move to an online-only edition for 2020.

curl up 2020 will still take place, and at the same date as planned (May 9-10), but we will change the event to a pure online and video-heavy occasion. This way we can of course also even [more easily] welcome audience and participants from even [further] away who previously would have had a hard time to participate.

Which other relevant conferences, expositions, trade shows, or similar events have been moved to online only for this year?



Half of Curl's Security Vulnerabilities Due to C Mistakes 83 comments

curl developer Daniel Stenberg has gone through his project's security problems and calculated that 51 out of curl's 98 security vulnerabilities have been C mistakes. The total number of bugs in the database is about 6.6k, meaning that not quite 1.5% have been security flaws.

Let me also already now say that if you check out the curl security section, you will find very detailed descriptions of all vulnerabilities. Using those, you can draw your own conclusions and also easily write your own blog posts on this topic!

This post is not meant as a discussion around how we can rewrite C code into other languages to avoid these problems. This is an introspection of the C related vulnerabilities in curl. curl will not be rewritten but will continue to support backends written in other languages.

It seems hard to draw hard or definite conclusions based on the CVEs and C mistakes in curl's history due to the relatively small amount of data to analyze. I'm not convinced this is enough data to actually spot real trends; it might be mostly random coincidence.

After the stats and methodology, he goes into more detail about the nature of the 51 bugs and the areas in the program (and library) where they occur. In general, the problems sort into buffer overreads, buffer overflows, use-after-frees, double frees, and NULL mistakes.

Previously:
(2020) curl up 2020 and Other Conferences Go Online Only
(2019) Google to Reimplement Curl in Libcrurl
(2018) Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again
(2018) Twenty Years of cURL on March 20, 2018
(2018) Reducing Year 2038 Problems in curl
(2017) Eric Raymond: "The long goodbye to C"



"cURL", the URL Code That Can, Marks 25 Years of Transfers 6 comments

Utility began as a personal project, found its way into billions of devices:

Daniel Stenberg has observed the 25th anniversary of the curl open source project with the publication of curl 8.0.0, the 215th release of the command line tool, and a modest tele-celebration.

The name curl, originally rendered as "cURL" to emphasize its function, stands for "Client for URLs" or "Client URL Request Library" or its recursive form, "curl URL Request Library."

It's a command line tool and library for transferring data with URLs. Once installed on a device with command line access, curl can be used, through a text command, to send or fetch data to and from a server using a variety of network protocols.
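
As an illustration of the library half of that description, here is roughly the smallest complete libcurl program, as a sketch with a placeholder URL; by default libcurl writes whatever it fetches to stdout:

```c
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    CURLcode res = CURLE_FAILED_INIT;

    if (curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/"); /* placeholder URL */
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);          /* follow redirects */
        res = curl_easy_perform(curl);                               /* do the transfer */
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```

Build it with something like cc fetch.c -lcurl; the curl command line tool itself is essentially a very large set of options layered over this same library.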

Any developer who is serious about writing code that interacts over a network has probably used curl, or does so regularly. Presently, billions of devices rely on curl – cars, mobile phones, set top boxes, routers, and other such items use it internally for data transfer.

"The curl project started out very humbly as a small renamed URL transfer tool that almost nobody knew about for the first few years," said Stenberg in a blog post. "It scratched a personal itch of mine."

The first version of curl debuted on March 20, 1998 as version 4.0. It had 2,200 lines of code and had been adapted from projects known as httpget and urlget. As Stenberg explained, curl 4.0 supported just three protocols, HTTP, GOPHER and FTP, and 24 command line options. Version 8.0.0 can handle 28 protocols and 249 command line options.

"The first release of curl was not that special event since I had been shipping httpget and urlget releases for over a year already, so while this was a new name it was also 'just another release' as I had done many times already," he wrote.

The I in LLM Stands for Intelligence 28 comments

Daniel Stenberg of cURL fame has written about the impact that fake, LLM-generated bug reports have on his project, cURL. The main problem with LLM-generated bug reports is that they tend to be bunk while looking close enough to real bug reports to waste a lot of developer time that could have been spent triaging and addressing real bugs.

A security report can take away a developer from fixing a really annoying bug, because a security issue is always more important than other bugs. If the report turned out to be crap, we did not improve security and we missed out time on fixing bugs or developing a new feature. Not to mention how it drains you of energy having to deal with rubbish.

Often wannabe security consultants will take the output of an LLM and modify it with their own language, thus intentionally or unintentionally obscuring some of the telltale warning signs of LLM-generated bunk.

Previously:
(2023) "cURL", the URL Code That Can, Marks 25 Years of Transfers
(2021) Half of Curl's Security Vulnerabilities Due to C Mistakes
(2020) curl up 2020 and Other Conferences Go Online Only
(2018) Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again



  • (Score: 3, Troll) by pkrasimirov on Wednesday June 19 2024, @08:49PM (4 children)

    by pkrasimirov (3358) Subscriber Badge on Wednesday June 19 2024, @08:49PM (#1361078)

    > recursive HTML download
    How about first making a single download, but reliably? Like download me a file, but don't if it is already downloaded, but do download if there is a newer version on the server or the local one does not match (incomplete download or tampered). And yes, that includes follow redirects etc.
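
    Much of that wish is already within reach of existing curl options. On the command line the --time-cond (-z) option does the freshness check, and a minimal libcurl sketch looks like the following (the local file name and URL are hypothetical; checksum/tamper verification would still need application-side code):

```c
#include <stdio.h>
#include <curl/curl.h>
#include <sys/stat.h>   /* POSIX stat(), for the local file's mtime */

int main(void)
{
    /* Timestamp of the (hypothetical) previously downloaded copy */
    struct stat st;
    long local_mtime = 0;
    if (stat("file.dat", &st) == 0)
        local_mtime = (long)st.st_mtime;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/file.dat");
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);     /* follow redirects */
    curl_easy_setopt(curl, CURLOPT_TIMECONDITION, (long)CURL_TIMECOND_IFMODSINCE);
    curl_easy_setopt(curl, CURLOPT_TIMEVALUE, local_mtime); /* If-Modified-Since */

    CURLcode res = curl_easy_perform(curl); /* body goes to stdout by default */

    long unmet = 0;
    curl_easy_getinfo(curl, CURLINFO_CONDITION_UNMET, &unmet);
    if (unmet)
        fprintf(stderr, "local copy is up to date, nothing fetched\n");

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```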

    • (Score: 4, Touché) by Unixnut on Wednesday June 19 2024, @09:22PM (3 children)

      by Unixnut (5779) on Wednesday June 19 2024, @09:22PM (#1361083)

      Why reinvent the wheel when wget does all that (including recursive fetch)?

      Curl has its uses, as does wget, and they are both good at what they need to do. I see no reason for them to adopt each others features, as that just ends up being duplicated effort.

      • (Score: 4, Interesting) by bzipitidoo on Wednesday June 19 2024, @11:51PM (2 children)

        by bzipitidoo (4388) on Wednesday June 19 2024, @11:51PM (#1361103) Journal

        Is there much of a difference between curl and wget? I use wget more, but a long time ago it had a minor bug that caused it to quit without fetching anything if some parameter was null. For those cases I used curl, until I took a look at the wget source code and found it was real easy to fix the problem.

        • (Score: 4, Informative) by Unixnut on Thursday June 20 2024, @09:21AM

          by Unixnut (5779) on Thursday June 20 2024, @09:21AM (#1361152)

          To me curl seems geared more for use in scripts and/or as a library. For example, by default curl will output what it fetches to stdout. wget on the other hand is more for end-user/human interaction, with a goal towards mirroring/spidering websites.

          You can see this in the fact that wget will download the data into a file and name it according to the upstream URL by default, which is more human friendly. It also does recursive downloads/mirroring/spidering, and has a lot of tricks like URL rewriting so that your local mirror's links all work like they do on the source.

          Basically if I need to do a one shot HTTP request (either as a script or in a library) I use curl (e.g. if you are interacting with a web API). If I want to download a web page/image/file, or mirror a website, I use wget.

          You can beat each of the tools into doing what the other does, but why bother? I find them complementary as it is.

        • (Score: 3, Informative) by SomeRandomGeek on Friday June 21 2024, @12:20AM

          by SomeRandomGeek (856) on Friday June 21 2024, @12:20AM (#1361288)

          WGET is a good tool for downloading files using HTTP(s).
          cURL is a good tool for invoking RESTful APIs, which often use weird HTTP(s) features, like specifying a lot of headers and passing input parameters as an HTTP PUT.
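
           As a concrete, hedged sketch of that kind of call in C with libcurl (the endpoint, extra header, and JSON payload are all made up for illustration; on the command line the same request is typically spelled with -X PUT, -H, and -d):

```c
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* The "weird HTTP(s) features": custom headers and a PUT body */
    struct curl_slist *hdrs = NULL;
    hdrs = curl_slist_append(hdrs, "Content-Type: application/json");
    hdrs = curl_slist_append(hdrs, "X-Api-Key: example"); /* hypothetical header */

    curl_easy_setopt(curl, CURLOPT_URL, "https://api.example.com/items/42");
    curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "PUT");  /* method override */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "{\"name\":\"widget\"}");

    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(hdrs);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```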

  • (Score: 0, Troll) by Anonymous Coward on Wednesday June 19 2024, @09:45PM (12 children)

    by Anonymous Coward on Wednesday June 19 2024, @09:45PM (#1361088)

    I can't take any developer seriously who talks about security, but makes excuses about using c. Just admit you don't care about security that much. You don't owe anyone a rewrite, but stop trying to justify your shitty old language.

    • (Score: 5, Insightful) by JoeMerchant on Wednesday June 19 2024, @10:14PM (6 children)

      by JoeMerchant (3937) on Wednesday June 19 2024, @10:14PM (#1361096)

      > any developer seriously who talks about security, but makes excuses about using c

      Any language can be used to code insecure apps.

      Any application can be coded securely in C, or assembly, or Forth, or Rust, or whatever language you choose.

      Thinking that "I am taking security seriously" because you are using the latest indeterminate garbage collected flavor of the week is the real joke.

      (Openly acknowledged: the average programmer will make fewer memory management mistakes when using memory management training wheels.)

      --
      🌻🌻🌻 [google.com]
      • (Score: 3, Interesting) by DannyB on Thursday June 20 2024, @02:15PM (5 children)

        by DannyB (5839) Subscriber Badge on Thursday June 20 2024, @02:15PM (#1361186) Journal

        <no-sarcasm>
        Leonardo could have created beautiful art using human excrement, but he knew better.
        </no-sarcasm>

        Billions of dollars could have been saved if C didn't have null terminated strings. Pascal strings had a length byte at the beginning and an allocated size that could only be fixed at compile time (not runtime, oops). C could have had the foresight not to seriously limit string lengths by using only a single byte as the length. Strings in C should have been designed with a 128 bit integer string length prefix.
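
        A minimal sketch of the length-prefixed alternative, using a hypothetical struct with a size_t prefix rather than a literal 128-bit one (error handling omitted for brevity):

```c
#include <stdlib.h>
#include <string.h>
#include <stdio.h>

/* Hypothetical length-prefixed string: the length is stored,
   not discovered by scanning for a NUL terminator. */
struct lpstring {
    size_t len;
    char  *data;
};

static struct lpstring lps_from(const char *s)
{
    struct lpstring r;
    r.len  = strlen(s);
    r.data = malloc(r.len);   /* NULL check omitted for brevity */
    memcpy(r.data, s, r.len);
    return r;
}

int main(void)
{
    struct lpstring s = lps_from("hello");
    /* Length is O(1), and embedded NUL bytes are representable. */
    printf("%zu %.*s\n", s.len, (int)s.len, s.data);
    free(s.data);
    return 0;
}
```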

        the average programmer will make fewer memory management mistakes when using memory management training wheels

        <no-sarcasm>
        That is just an outright uncalled for insult. I would point out that many skilled people use GC every day. It is not training wheels as if using GC is some kind of precursor to graduating to languages without GC. People use GC because it saves DOLLARS. Development and maintenance dollars. It is easier to avoid bugs in a language that simply cannot express certain kinds of bugs, such as double-free, forgetting to free, or referencing after freeing.

        If there were one perfect language for all uses, we would all already be using it.
        </no-sarcasm>

        --
        Stop asking "How stupid can you be?" Some people apparently take it as a challenge.
        • (Score: 3, Interesting) by JoeMerchant on Thursday June 20 2024, @03:44PM (4 children)

          by JoeMerchant (3937) on Thursday June 20 2024, @03:44PM (#1361199)

          >Billions of dollars could have been saved if C didn't have null terminated strings.

          Which is why, since 2006, I have been using C++ / Qt / QString - it's just better. If I were ever forced into "straight C" for a long term project that did lots of string manipulation, I would be coding something similar into a library to call.

          >It is not training wheels as if using GC is some kind of precursor to graduating to languages without GC.

          Fair enough. However, I did ride a bike with training wheels for several months before "graduating" to two wheels without the "helpers." The feeling I get with GC in Python on Raspberry Pi Pico / Thonny is very analogous to training wheels - sure, if you don't lean on them too hard they can keep you from falling over, but they're also constantly in the way. Wondering why this routine completes on time most of the time but once in a while takes too long, let's trace through and see what could be happening... oh, it's that "magic memory manager" taking my cycles again - just as much fun as coding for Windows 3.11, where you never knew how long it would be before your code reawakened from arbitrarily induced slumber.

          >double-free, forgetting to free, or referencing after freeing.

          Smart pointers heard you, over a decade ago, and they're pretty good about all that.

          >If there were one perfect language for all uses, we would all already be using it.

          No language will ever be optimal for all uses. My preference is for languages that do what you tell them to, without high variability in how they go about doing it - particularly memory managers which can, and do, error out for mysterious causes.

          --
          🌻🌻🌻 [google.com]
          • (Score: 2) by DannyB on Monday June 24 2024, @01:58PM (3 children)

            by DannyB (5839) Subscriber Badge on Monday June 24 2024, @01:58PM (#1361815) Journal

            Smart pointers are a great idea. An idea that was obvious to me back in the day. However they do not handle circular references.

            The reason there can be no perfect language is that various language design goals can be at odds with one another. C is great for programming close to the hardware: you can see every clock cycle, because every single operation, such as a pointer dereference, is clearly visible in the source code. It's great for microcontroller firmware, boot loaders, device drivers and maybe even OS kernels. I don't think of it as so great for applications, but that is just me. At the opposite end are languages which are after high level abstraction (not training wheels) as a design goal. The programmer doesn't know and isn't interested in the size of a machine word. There is no maximum limit on the value of an integer, in principle. The programmer doesn't waste brain cycles on memory management. There could be no perfect language that would satisfy both the high level abstraction and the low level access without being poor at both - the jack of all trades and the master of none.

            I think you have not seen modern GC, and Python is a very poor sample: a language so dynamic that it is already slow in execution, because you can't be sure exactly what a given statement means. And that statement is even more true of JavaScript.

            Java has state of the art GC. ZGC is now the standard in the latest Java. This can handle heaps with many Terabytes (TB) of RAM with maximum GC pause times of 1 ms on a slow day.

            It is good not to have language bigotry. Any language that is popular is popular for some good reason: obviously a large community of people have found it useful for their commercial purposes. I don't disrespect popular languages, because they must be doing something right for somebody, and there must be a reason those people's work relies on them.

            --
            Stop asking "How stupid can you be?" Some people apparently take it as a challenge.
            • (Score: 2) by JoeMerchant on Monday June 24 2024, @04:21PM (2 children)

              by JoeMerchant (3937) on Monday June 24 2024, @04:21PM (#1361836)

              >The programmer doesn't know and isn't interested in the size of a machine word.

              I have sunk too many hours (literally months worth of hours) into fixing issues that arose because the programmer wasn't interested in the size of a machine word. I don't "int", I "int32_t" or "int64_t", because it matters, not all the time, but often enough that taking that extra moment to choose an option and specify it will avoid many many more moments in the future questioning which way it went and what are the possible ramifications. It matters at the application level far too often to ignore it.
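
              The point is easy to make concrete (nothing curl-specific; the "int" width printed depends on the platform, which is exactly the problem):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* "int" is only guaranteed to be at least 16 bits wide;
       the fixed-width types say exactly what you get. */
    printf("int:     %zu bits\n", sizeof(int) * 8);
    printf("int32_t: %zu bits\n", sizeof(int32_t) * 8);
    printf("int64_t: %zu bits\n", sizeof(int64_t) * 8);

    /* This sum needs more than 32 bits; spelling out int64_t
       makes that decision visible instead of platform-dependent. */
    int32_t a = 2000000000;
    int64_t sum = (int64_t)a + a;
    printf("sum:     %lld\n", (long long)sum);
    return 0;
}
```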

              >There is no maximum limit on the value of an integer, in principle.

              Until computers are so powerful that we can all afford to run big-int arithmetic all the time, there is a practical limit. Even running bigint and bigfloat, do you want the convenience of floating point, or the precision of rational ratios? Choices matter. Most people are good with float. If float were implemented in the same base the numbers are specified in (usually decimal), float would be an even more universally applicable choice, but even on today's "most advanced" hardware we typically have floats implemented in binary, leading to the need to choose float vs int as is more appropriate to the application.
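
              The binary-float point in miniature - the classic example, assuming the usual IEEE 754 doubles:

```c
#include <stdio.h>

int main(void)
{
    /* 0.1 and 0.2 have no exact binary representation,
       so the decimal-looking sum drifts: */
    double a = 0.1 + 0.2;
    printf("%.17g\n", a);            /* 0.30000000000000004 */
    printf("equal: %d\n", a == 0.3); /* 0 - the comparison fails */
    return 0;
}
```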

              >The programmer doesn't waste brain cycles on memory management.

              This is great, until it's not. Once it's not, you tend to pay back that technical debt with interest.

              >modern GC. And Python is a very poor sampling.

              True enough. I gave up running around after C compilers checking their assembly output around 1995, when they finally started doing a better job than me - at least the good ones did.

              As for GC... memory is cheap, like unfathomably cheap, and when I allocate something one-time-only in a program, I don't give a damn if it's "leaked" and not recaptured until the program exits. Memory management starts to matter when the code allocates a lot of things, destroys those things, and then allocates more of them, such that if the destruction isn't done completely we actually will start to consume significant quantities of system memory. It matters much much more if something gets over-eager and "frees" something that something else will reference in the future.

              Maybe Java does all that better than I can. I will confess to some language bigotry when it comes to Java - things that have historically called themselves Java have earned the "poop" emoji as an icon in my mind, and the "latest Java" you speak of has, historically, been a larger and larger pile of poo from my perspective as the years rolled on. I'm not in an environment that benefits from the miracles of modern Java; we're still dealing with legacy code from 15+ years ago that's wrapped up in security problems so intractable that it just needs a recode in something less problem prone.

              >Obviously a large community of people have found it useful to their commercial purposes.

              The same can be said for the Facebook platform. I don't disrespect the social value of Facebook, but from a technical / usability standpoint, it gets three giant animated poop emojis.

              --
              🌻🌻🌻 [google.com]
              • (Score: 2) by DannyB on Tuesday June 25 2024, @01:58PM (1 child)

                by DannyB (5839) Subscriber Badge on Tuesday June 25 2024, @01:58PM (#1361929) Journal

                I have sunk too many hours (literally months worth of hours) into fixing issues that arose because the programmer wasn't interested in the size of a machine word.

                In your case, the programmer SHOULD have been concerned with the size of a machine word, but was not.

                What I said seemed to go right over your head.

                I am talking about a case where the programmer and the problem he is working on have NOTHING to do with the size of machine words. For example, a computer algebra system (CAS). I need Integers, yes. But they have unlimited precision (in principle). Complex term rewriting systems are best written using GC. GC works and it is used for good reasons. I understand that you are hyper focused on an entirely different world of problems where C and manual memory management are the best solution. You seem absolutely blind, even willfully blind, to the obvious fact that large successful systems that are different, with differing goals and solutions, are popular FOR GOOD REASONS: because they work best for the problem at hand. They are the most economical solutions. One example of a popular CAS: Mathematica aka Wolfram Alpha. Or Maxima. Or nowadays, advanced pocket calculators which have CAS capabilities.

                Also: in all but a few cases: if you're optimizing for cpu cycles and bytes, you're doing it wrong. You should be optimizing for dollars. In some cases these two optimizations are the same thing. Especially in the 1970s. But much less often today.

                I am not trying to convince you to use very high level languages and adopt GC. I am trying to convince you that these things are used by vast numbers of talented people because THEY WORK and are often OPTIMAL solutions, especially in terms of money. I think you are blinded by language prejudice and are unable to see that.

                There are plenty of languages that I don't like. (eg, Perl) But I recognize that the popularity of it must be because some large number of people find it to have some valuable utility, despite my feelings about it.

                All Lisp family languages are another example of GC, abstraction away from machine features such as word length, etc.

                --
                Stop asking "How stupid can you be?" Some people apparently take it as a challenge.
                • (Score: 2) by JoeMerchant on Tuesday June 25 2024, @03:32PM

                  by JoeMerchant (3937) on Tuesday June 25 2024, @03:32PM (#1361954)

                  > large successful systems that are different, with differing goals and solutions, are popular FOR GOOD REASONS. Because they work best for the problem at hand.

                  Know that I agree with you in the majority of real-world cases. However, there is a very non-trivial set of real world cases where those GOOD REASONS amount to: the middle manager was getting kickbacks from the vendor, the majority of the team is too lazy to learn something different from what they are familiar with, corporate policies make it ridiculously onerous to change tool choices, we all get paid whether the system is buggy or not. In these and so many similar cases, they "work best" because the people involved in the implementation (which can be a MUCH larger set of people than the programmers using the language) have preferences completely unrelated to reliable, proper, efficient building and maintenance of the system.

                  The real world is too full of "the system is down, you'll have to try again later" to say that the best practices using the best tools implemented by the best people is happening everywhere, all the time.

                  I was recently pumping gas, two gallons in the power to the pump glitched, the touch display spent 3 minutes rebooting - I stuck around to ensure that my CC wasn't going to be used to fill the next customer's tank. During the reboot the screen prompted me: Press to enter setup... I decided to let that go by. Finally, after 3 minutes, the screen got around to informing me that the pump was out of order, indefinitely - but you can be damn sure that my CC was charged for the two gallons I had pumped... so I guess that system was working well, from the owners' perspective. Having met a software shop that does part of that gas pump interface stack, I can assure you: the vast majority of the people involved in the implementation barely have a clue what they are doing, and the people making the decisions (the ones selling the interface systems and employing their programmers at $30-40K per year) are absolutely clueless what a Garbage Collector might be in their system or why it might matter. Across town, the same can be said for a medical practice management software company who turns over their programmer staff even faster than their sales and marketing who have a mean term of employment less than two years.

                  > if you're optimizing for cpu cycles and bytes, you're doing it wrong. You should be optimizing for dollars. In some cases these two optimizations are the same thing.

                  About bytes, yes - as I said: memory is unfathomably cheap and getting cheaper. About CPU cycles, that depends - in 2006 we were building a system that needed "real-time" performance, meaning 5 minutes or less to run some pretty heavy computations. When I was hired, the proposed approach was a massively parallel array of networked MacPros - 50 or more - to achieve that 5 minute target. In the course of development we spotted a bottleneck, reworked the code to play nice with the CPU cache, and achieved a 100x speedup - the process now took less than 5 minutes on a dual core laptop. That's dollars, and that's also the last time I made such a dramatic improvement, but... even today our team uses an in-house developed (by me) tool in the build process that was taking 20-25 minutes to run, until I did an optimization cycle that brought execution time down to 2 minutes (since creeping back up to 3-4 as the project continues to grow...) That 20 minutes 2 or 3 times a day per active developer is definitely dollars, particularly when you factor in the marketing perspective about time-to-market market capture ratios, etc.

                  > recognize that the popularity of it must be because some large number of people find it to have some valuable utility

                  bash scripts have plenty of valuable utility, doesn't change the fact that they are a horrible language.

                  --
                  🌻🌻🌻 [google.com]
    • (Score: 4, Informative) by Anonymous Coward on Wednesday June 19 2024, @10:22PM (2 children)

      by Anonymous Coward on Wednesday June 19 2024, @10:22PM (#1361099)

      I can't take any developer seriously who talks about security, but makes excuses about using c. Just admit you don't care about security that much. You don't owe anyone a rewrite, but stop trying to justify your shitty old language.

      I'll start caring about your shitty new language if you can show me even one single program written in it which can reasonably make anything even remotely close to this claim (from the article):

      We have records of curl running on 101 different operating systems and 28 different CPU architectures

      Let's take rust. There is essentially one compiler for this language. This compiler has (in their own words [rust-lang.org]) "guaranteed to work" support for a grand total of three different operating systems on three different CPU architectures, it does not support all of the combinations, and the "guaranteed to work" 32-bit x86 architecture support doesn't actually work on the vast majority of actual 32-bit x86 CPUs so it pretty much doesn't count.

      If you include the tier 2 targets that brings the total number of CPU architectures up to 9 and the total number of operating systems up to 6, again not all combinations are supported and the 32-bit x86 port is still useless.

      If you include the tier 3 targets you finally get a usable x86 port, a couple more CPU architectures, a couple more operating systems, and quite a few more combinations, but normal people can't use any of these because there are no bootstrap binaries provided.

      • (Score: 0) by Anonymous Coward on Thursday June 20 2024, @07:04PM (1 child)

        by Anonymous Coward on Thursday June 20 2024, @07:04PM (#1361239)

        If portability is more important than security for you, then that's fine. I didn't even mention portability. I'm just tired of devs who don't want to learn a new lang, gaslighting people about how mem-safe langs don't matter for security.

        • (Score: 0) by Anonymous Coward on Friday June 21 2024, @06:30AM

          by Anonymous Coward on Friday June 21 2024, @06:30AM (#1361329)

          It's not really about portability (although for writing portable programs there is no contest, C is king).

          It's about the fact that even if you wanted to use rust, most of the time you can't because there's simply no rust compiler for whatever system you're trying to program.

          The people pushing rust as a replacement for C have apparently no interest in solving this problem. So unless that changes, they will fail to replace C.

    • (Score: 1) by shrewdsheep on Thursday June 20 2024, @07:34AM (1 child)

      by shrewdsheep (5215) on Thursday June 20 2024, @07:34AM (#1361140)

      Care to have a "civil" argument with Linus Torvalds?

      • (Score: 1, Interesting) by Anonymous Coward on Thursday June 20 2024, @07:01PM

        by Anonymous Coward on Thursday June 20 2024, @07:01PM (#1361236)

        Linus is a smart guy, but I keep seeing kernel memory bugs in the CVE emails. At least he is more honest and doesn't really pretend to care about security specifically. They are all just bugs to him, and that's fair enough.

  • (Score: 1, Funny) by Anonymous Coward on Thursday June 20 2024, @12:58AM (6 children)

    by Anonymous Coward on Thursday June 20 2024, @12:58AM (#1361108)

    curl users run it on Android more than on FreeBSD

    Are there that few FreeBSD users nowadays? Or do fewer of them respond to such surveys?

    • (Score: 4, Insightful) by tekk on Thursday June 20 2024, @02:36AM (4 children)

      by tekk (5704) Subscriber Badge on Thursday June 20 2024, @02:36AM (#1361113)

      It's not too surprising really. I believe that Android is the most used OS on the planet. Even the tiny percentage of its billions of users who are going to install termux and run curl on their Android is going to be bigger than the thousands (tens of thousands, maybe? I recall a stat someone threw around recently estimating OpenBSD's user base at three digits, so I don't imagine FreeBSD even has the installation base of some Linux distros) of intentional FreeBSD users.

      Of course FreeBSD does have 6 figures of users in that the PS4 and PS5 run FreeBSD with tweaks, but I don't think most of them are running curl :)

    • (Score: 2) by Unixnut on Thursday June 20 2024, @09:29AM

      by Unixnut (5779) on Thursday June 20 2024, @09:29AM (#1361153)

      I sure did not know the survey was a thing, but whether FreeBSD has a low number of users does not concern me much. I mean, the more popular Linux became, the worse it got. Windows is still the number one OS popularity-wise, and it is even worse than Linux.

      If the price of popularity is a ruination of a system, then I would rather the system be unpopular and well engineered. I need a system to work, and work well. That allows me to use it to achieve my goals. As far as I am concerned the BSD's can hang on quietly in the background doing what they do best.

      As for Android users running curl more than FreeBSD users, that is not surprising. I would be very surprised if 80+% of Android apps did not make use of curl to interface with their cloud/web APIs, not to mention Android itself using curl for its OS web interactions. That means that every single Android phone is also a "curl user". By that metric curl may well be more used on Android/Linux than on GNU/Linux, let alone other OSes.
