cURL turns Seventeen Today
posted by LaminatorX on Friday March 20 2015, @11:41PM   Printer-friendly
from the Olympic-cURLing dept.

Daniel Stenberg lets the world know that cURL, the little command-line utility that lets you download stuff off the internet via HTTP along with a bunch of other protocols, has turned 17 today (March 20). Considering that it is also available to all of us for use in our programs as a nifty little library called 'libcurl', and that PHP, the most common web development language, depends on libcurl for handling HTTP requests, we can be happy that cURL exists. I personally cannot count the number of times it has saved me and the machines I administer.
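
For the uninitiated, the simplest use is a one-liner; a quick sketch with a placeholder URL and filename:

# fetch a file and save it under its remote name
curl -O http://example.org/file.tar.gz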

Related Stories

Twenty Years of cURL on March 20, 2018 16 comments

On March 20th, 2018, Daniel Stenberg notes twenty years of his flexible, multi-protocol, text-based utility, curl. It is a very common client-side file-transfer utility. The associated development library, libcurl, is a couple of years younger.

curl itself and components from libcurl are found nearly everywhere these days. Due to such widespread use, it is hard to be precise with usage numbers, but conservative estimates suggest billions of people are using it every day, though mostly under the hood, several layers down inside devices they own. It is the Internet transfer utility of choice for thousands of software applications. It is found in cars, television sets, routers, printers, audio equipment, mobile phones, tablets, set-top boxes, and media players, for starters.

A detailed, free-of-charge ebook, Everything curl, covers basically everything there is to know about curl, libcurl, and the associated project.

Earlier on SN:

Reducing Year 2038 Problems in curl
cURL turns Seventeen Today

Original Submission

Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again 63 comments

Daniel Stenberg, author of the ubiquitous URL fetcher cURL and the libcurl multiprotocol file-transfer library, and recipient of the 2017 Polhem Prize, has been blocked again from attending a US-based conference. Daniel has written a post on his blog about his two-year odyssey through the byzantine US bureaucracy trying to get permission to attend a work-related conference in California. He has been in the US nine times previously, but despite pages of paperwork, hundreds of dollars in fees, and personal visits to the embassy, no dice. As a result, the conference will have to move outside the US, and probably outside Canada too, if it wants to stay open to the world's top talent.

Earlier on SN:
US Visa Applications May Soon Require Five Years of Social Media Info (2018)
Reducing Year 2038 Problems in curl (2018)
cURL turns Seventeen Today (2015)


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Funny) by wantkitteh on Saturday March 21 2015, @12:00AM

    by wantkitteh (3362) on Saturday March 21 2015, @12:00AM (#160621) Homepage Journal

    As you're now 17, we expect to see you hiding out the back of your birthday party with a furtive beer and sneaking out after dark with the cute gURL next door.

  • (Score: -1, Redundant) by Anonymous Coward on Saturday March 21 2015, @12:24AM

    by Anonymous Coward on Saturday March 21 2015, @12:24AM (#160627)
    Now it's grown up, will it come with a no-nonsense CLI program like its older sibling wget?

    wget http://example.org/file.ext

    is all I want to know.

    --
    Something badly wrong with ecode parsing and URLs in plain-text and HTML modes!
    • (Score: 4, Informative) by wonkey_monkey on Saturday March 21 2015, @12:35AM

      by wonkey_monkey (279) on Saturday March 21 2015, @12:35AM (#160632) Homepage

      Now it's grown up, will it come with a no-nonsense CLI program like its older sibling wget?

      curl http://example.org/file.ext >file.ext

       

      Sorry if that's not "no-nonsense" enough for you.

      --
      systemd is Roko's Basilisk
      • (Score: 4, Informative) by hemocyanin on Saturday March 21 2015, @12:57AM

        by hemocyanin (186) on Saturday March 21 2015, @12:57AM (#160635) Journal

        also -o or -O

        Name the file whatever you want:
        curl http://example.org/longfilename.ext -o file.ext

        Use the remote file name:
        curl http://example.org/longfilename.ext -O

      • (Score: 0) by Anonymous Coward on Saturday March 21 2015, @01:23AM

        by Anonymous Coward on Saturday March 21 2015, @01:23AM (#160643)

        So how is it better than wget?

        • (Score: 4, Informative) by hemocyanin on Saturday March 21 2015, @01:36AM

          by hemocyanin (186) on Saturday March 21 2015, @01:36AM (#160646) Journal

          curl and wget are different, but they also share a lot of functionality. Where curl really stands out is that you don't need to go through the intermediary of a file to process data -- the default behavior is to dump the text of a page into your terminal, and you can then pipe it through whatever you want to manipulate it from there. It's a great tool for scraping sites for info, saving to file only what you want.
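
          For instance, a quick sketch (placeholder URL): pull the links out of a page in one pipeline, never touching disk:

          # -s silences the progress meter so only the page body goes down the pipe
          curl -s http://example.org/ | grep -o 'href="[^"]*"'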

          • (Score: 2) by hemocyanin on Saturday March 21 2015, @01:37AM

            by hemocyanin (186) on Saturday March 21 2015, @01:37AM (#160647) Journal

            Lame self-reply -- I believe wget has better mirroring capabilities than curl if you want to mirror a site.

            So you see, overlap with different competencies. You should learn both.
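
            For instance, a typical mirroring invocation is sketched below (placeholder URL; the flags are wget's standard mirroring options):

            # recurse with timestamping, fetch page images/CSS, rewrite links for local browsing
            wget --mirror --page-requisites --convert-links --no-parent http://example.org/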

            • (Score: 2) by tynin on Saturday March 21 2015, @02:09AM

              by tynin (2013) on Saturday March 21 2015, @02:09AM (#160657) Journal

              I think we both agree this would be better if it were framed as a vi vs. emacs debate. Come on man, the flames don't spread themselves!

              Besides, wget is the original... curl didn't show up until a whole year later. Anything you can do with curl, I can do with wget better.

              • (Score: 2) by hemocyanin on Saturday March 21 2015, @04:02PM

                by hemocyanin (186) on Saturday March 21 2015, @04:02PM (#160798) Journal

                wget the fuck out of here before i curl you out.

                OK -- not so great, my heart isn't in this flame war.

          • (Score: 2) by Marand on Saturday March 21 2015, @05:53AM

            by Marand (1081) on Saturday March 21 2015, @05:53AM (#160703) Journal

            You basically covered it there. They have similarities, but curl's more useful inside scripts or command chains with pipes, and wget's one of the most useful/flexible mirroring tools.

            Another nice option is aria2c. It doesn't do recursive downloading like wget, but it's fast, has support for things wget doesn't (metalinks, bittorrent), and has some interesting options like file splitting, settings for multiple connections per server, and some other things. I first found it a while back when I needed metalink [wikipedia.org] support for something* and neither curl nor wget had it (curl has it now, I think), and ended up liking it.

            Now I mostly only use wget for mirroring and use aria2c or curl for everything else.

            * I don't even remember what I needed it for, now; I vaguely remember it being something on an open source project's page or discussion board, and being annoyed that they had to use some obscure link format that I (correctly) predicted I wouldn't see again for years.
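
            To illustrate the multi-connection options mentioned above, a sketch (placeholder URL and filenames):

            # split the download into 4 pieces, with up to 4 connections to the server
            aria2c -s4 -x4 http://example.org/big.iso

            # it also speaks bittorrent, which neither wget nor curl does
            aria2c file.torrent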

        • (Score: 2) by wonkey_monkey on Saturday March 21 2015, @08:46AM

          by wonkey_monkey (279) on Saturday March 21 2015, @08:46AM (#160727) Homepage

          I didn't say it was.

          --
          systemd is Roko's Basilisk
      • (Score: -1, Spam) by Anonymous Coward on Saturday March 21 2015, @01:50AM

        by Anonymous Coward on Saturday March 21 2015, @01:50AM (#160651)

        yo mama is so fat she invades foreign countries from her living room

        • (Score: -1, Spam) by Anonymous Coward on Saturday March 21 2015, @02:10AM

          by Anonymous Coward on Saturday March 21 2015, @02:10AM (#160658)

          ...is the butt

          • (Score: -1, Spam) by Anonymous Coward on Saturday March 21 2015, @02:22AM

            by Anonymous Coward on Saturday March 21 2015, @02:22AM (#160660)

            ciri: tama lama ding dong ping pong

    • (Score: 2) by CirclesInSand on Saturday March 21 2015, @01:26AM

      by CirclesInSand (2899) on Saturday March 21 2015, @01:26AM (#160644)

      Yeah, I thought all the cool people used wget. What's the difference between that and cURL?

      • (Score: 0) by Anonymous Coward on Saturday March 21 2015, @01:54AM

        by Anonymous Coward on Saturday March 21 2015, @01:54AM (#160652)

        real men use fsockopen

        /me cracks open the popcorn :p

        • (Score: 0) by Anonymous Coward on Saturday March 21 2015, @02:01AM

          by Anonymous Coward on Saturday March 21 2015, @02:01AM (#160654)

          Can I be hip with netcat?
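
          For reference, a raw HTTP GET is only one line (placeholder host; just a sketch):

          # netcat just moves bytes, so you type the protocol yourself
          printf 'GET / HTTP/1.1\r\nHost: example.org\r\nConnection: close\r\n\r\n' | nc example.org 80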

  • (Score: 0, Troll) by Anonymous Coward on Saturday March 21 2015, @02:29AM

    by Anonymous Coward on Saturday March 21 2015, @02:29AM (#160663)

    I see several "how is this different from wget?" comments and fairly tepid replies.

    The biggest difference is that curl can be used to POST things too.
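
    For example, a minimal POST, sketched with a placeholder URL and form field:

    # -d sends an application/x-www-form-urlencoded POST body
    curl -d 'name=value' http://example.org/form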

    • (Score: 2) by http on Saturday March 21 2015, @03:16AM

      by http (1920) on Saturday March 21 2015, @03:16AM (#160682)

      wget also has post capability.
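
      For example (placeholder URL and field; a sketch of the equivalent):

      # --post-data likewise sends an urlencoded POST body
      wget --post-data='name=value' http://example.org/form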

      --
      I browse at -1 when I have mod points. It's unsettling.
  • (Score: 2) by Subsentient on Saturday March 21 2015, @05:18AM

    by Subsentient (1111) on Saturday March 21 2015, @05:18AM (#160698) Homepage Journal

    I've used libcurl for some important stuff, and I still do for my bot aqu4. libcurl is wonderful.

    --
    "It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
  • (Score: 2) by darkfeline on Saturday March 21 2015, @07:02PM

    by darkfeline (1030) on Saturday March 21 2015, @07:02PM (#160837) Homepage

    I tend to stick with Python and BeautifulSoup for my web scraping purposes, so I don't really use curl. Sometimes I use wget for fetching sources and other files from the command line.

    I can see how curl might be useful combined with, say, Perl, but you really should use a dedicated HTML parser instead of REs to scrape pages.

    I see that curl also supports protocols like Kerberos and IMAP, which does sound useful, but I haven't needed it.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 2) by Yog-Yogguth on Sunday April 05 2015, @12:13PM

      by Yog-Yogguth (1862) Subscriber Badge on Sunday April 05 2015, @12:13PM (#166638) Journal

      Thanks for mentioning Beautiful Soup [crummy.com]; I hadn't heard of it before!

      If anyone is looking for a Python challenge, they could try using it with rules of their own making (stripping out tags based on tag content? stripping away non-text? adding required features? detecting structure/layout from heading tags, i.e. <H1> etc.? detecting and translating smileys/emotes/emoticons?) aimed at whipping non-ADA-compliant web pages into faked ADA compliance before piping the result to a screen reader or other assistive technology. Maybe someone will get inspired :)

      Even bigger challenge: parsing through JavaScript to see if there's anything sensible there, or functionality that can somehow be duplicated into a link, or if it's just fluff to be discarded.

      [Ideas? Check. Time and knowledge? Sorry, I haven't got enough (not yet at least, and likely never).]

      --
      Bite harder Ouroboros, bite! tails.boum.org/ linux USB CD secure desktop IRC *crypt tor (not endorsements (XKeyScore))