Ubuntu users will see a few changes to their command line tools with the launch of Ubuntu 25.10 in October. The wget utility for downloading files is being replaced by wcurl, which offers most of the same basic functionality. It's FOSS reports: "Ubuntu Server 25.10 will no longer include wget by default, switching to wcurl instead."
Fresh installations will see this change when 25.10 releases in October. wget has been the standard command-line download tool on Linux systems for years. Most server administrators and scripts rely on its straightforward syntax for file downloads. On the other hand, "wcurl is a simple curl wrapper that lets you download files without remembering curl parameters, using curl under the hood with sane defaults."
The report goes on to note that another GNU utility, the screen command, will be dropped in favour of tmux.
(Score: 4, Interesting) by canopic jug on Friday August 08, @02:49PM (22 children)
The article is very short and does not cover any reason for the changes. cURL is a lot smaller than wget, and both are great web clients. I guess the wcurl wrapper makes up for wget having been the standard all these decades, so the net effect is probably null for end users. But since Canonical is not the least bit worried about bloat any more, and seems to actively encourage it, one wonders what the real motive for switching from wget to cURL is.
However, the move from screen to tmux is a good one. I used screen for ages, but pivoted to tmux once I got around to trying it out. It's that much better from my perspective. Apparently the code is much cleaner under the hood, too. Unfortunately, or maybe fortunately, despite weighing in at twice the size, tmux lacks the serial-line capability that screen has.
What's next, moving to Zsh from Bash? If so, why? The rationale can be quite important.
Money is not free speech. Elections should not be auctions.
(Score: 3, Interesting) by Anonymous Coward on Friday August 08, @03:07PM
Interesting call; curl is under a derivative of the MIT license.
Bash -> zsh: same thing/reason as macOS -- bash is GPL-licensed (GPLv3). Mac includes bash 3 (GPLv2) and zsh is the default.
Some things wget does that curl does not: recursive downloading of pages, and downloading multiple files at a time (curl will output both links to one file).
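For example, something like this (the URL is just a placeholder) mirrors a page tree recursively, which curl and wcurl have no equivalent for:
$ wget -r -np https://example.com/docs/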
(Score: 5, Interesting) by Anonymous Coward on Friday August 08, @03:16PM (3 children)
They won't tell you the real reason, because the real reason is the pushover license of wcurl (as opposed to wget, which is licensed under the GNU GPL). The replacement of screen with tmux is the same.
Ubuntu wants to make money off proprietary software while at the same time getting well-meaning volunteers to do their work for free. Copyleft licenses are an obstacle to this so Ubuntu has been gradually purging GPL-licensed software from the distribution for many years now.
The GPL is specifically designed to limit this abuse but unfortunately the big tech companies have been very successful at convincing volunteers to abandon their rights with pushover licenses in exchange for nothing whatsoever.
(Score: 3, Interesting) by GloomMower on Friday August 08, @03:25PM (2 children)
So what is it that they don't like about it? I thought Apple did it because they didn't like the patent clause. For Ubuntu, is it the same?
(Score: 5, Informative) by Anonymous Coward on Friday August 08, @04:17PM (1 child)
The GNU GPL does not permit proprietary modifications. Canonical wants to leech value off the backs of volunteer free software developers while at the same time they would very much prefer that their users do not receive those same freedoms [gnu.org]. Convincing volunteers to use pushover licenses enables this. The GPL is designed to ensure all users get the four freedoms, not just proprietary software companies.
(Score: 4, Insightful) by Anonymous Coward on Friday August 08, @10:02PM
Fuck Ubuntu then.
(Score: 3, Interesting) by VLM on Friday August 08, @03:28PM (1 child)
I don't speak for them, but AFAIK it's the usual mixture of lame and boring.
There's always the usual entryist nonsense, change for the sake of change to appear relevant. For example, in the old days entire classes of security problems were discovered; obviously wget had its share, and they got patched. So wget is "less secure" than curl because curl is newer and never got patched: either because curl was written long after the patched security bugs in wget, or because curl is still full of zero-day exploits that are not public yet. Either way you were better off with wget at least until 2020 or so. Now both are old enough to be "pretty much equivalent", so you're no longer safer using wget, or you don't lose THAT much by using curl.
There are also dumb ideas like "we're gonna replace https as a file transfer system for downloading binaries and stuff by using curl to talk raw to a REST API doing a GET", which technically is about the same thing. No, just no.
On the practical real-world side, the way to embed wget is a shell callout in your program: you pray the command-line API hasn't changed (it hasn't in eons), funnel stdout to a string, and pray they haven't F-d up filesystem security WRT the user you ran the shell command as. On the other hand, libcurl is a thing and it "works": if your programming language talks to libcurl (which isn't hard) you now have native curl in your app, complete with all the static or dynamic library linking etc. I have to admit it's pretty nice when it works, and it usually works. Also curl can do some WEIRD and unexpected but cool things, like speak LDAP natively. I don't mean using LDAP auth on a server or something, I mean whacking an LDAP server with your own custom query. And it supports a lot of other protocols (does IMAP too, IIRC). Historically wget did http/https "better", with recursion and mirroring and restarting interrupted xfers and stuff, but curl did more protocols.
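For the curious, an LDAP query from the command line looks something like this (the server and base DN are made up, and your curl needs to be built with LDAP support):
$ curl "ldap://ldap.example.com/dc=example,dc=com?cn,mail?sub?(objectClass=person)"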
I guess in 2025 you'd say curl is a weaker but wider-ranging generic CLI protocol tool, whereas wget is/was better solely at http/https. In practice 99.9% of people use either one ONLY to download files, so they don't care that one is "deeper" and the other is "wider".
Note that if you read the ACTUAL announcement carefully, they're not getting rid of wget; it's still a first-class package officially maintained at the highest priority, it's just not installed by default anymore. So it's not a big deal: I need to add ONE line of code to Ansible to install the wget package on future Ubuntu systems so any existing old scripts keep working. I know I have some rando scripts that use wget. grep -R wget * | wc -l reports I have 79 lines in my Ansible system that use wget. Switching to curl isn't going to gain me anything other than wasted time, although I'll probably do it sometime. Who knows how many scripts where the code is not in Ansible are using wget, probably a lot, and it's not worth messing stuff up to find out.
(Score: 4, Informative) by Anonymous Coward on Friday August 08, @06:49PM
The number of known and patched vulnerabilities isn't necessarily a reliable indicator of security and code quality. In some respects, it's really more of a function of bug reporting and how well the software is audited for security.
I think this change is stupid, and I see a bit of hypocrisy from Canonical here. They seemed to be concerned with the size of their distribution, arguing that this removes redundant tools from the default install. Of course, for a great many people, ubuntu-advantage-tools is useless to them. I don't like the ads they display in the MOTD, for example, and promptly disable them when I install Ubuntu. But you can't really uninstall them without uninstalling other packages and breaking stuff. Asshole design by Canonical, for sure. They argue this isn't a problem, and that ubuntu-advantage-tools and the related packages aren't large enough to be a problem. So why are commonly-used but potentially redundant tools like wget and screen problems?
I don't use Ubuntu Server, but I have a number of things I immediately do on any fresh install. I disable the automatic install of updates while they're being phased to try to keep my systems more stable. I prefer to be in control of when updates are installed (I do this daily) instead of being automatic, so "sudo dpkg-reconfigure unattended-upgrades" is one of the first things I do. Then there's the obligatory "sudo apt-get remove systemd-oomd". Perhaps things have gotten better, but I don't trust it since systemd-oomd was way too aggressive at killing processes when it initially became part of the Ubuntu desktop default install. Ubuntu doesn't automatically install screen for desktop installs, so "sudo apt-get install screen" is part of those initial commands. Although I don't like Ubuntu's hypocrisy and user-hostile decisions, I'm not that bothered since I can still just do "sudo apt-get install wget screen" and get those tools back.
As a desktop user, snaps piss me off far more. For example, I can control things like the phasing of updates through apt-get and when those updates are applied. Snaps are phased, but I don't know of any way to control when those phased updates are delivered to different systems. There also doesn't seem to be a great way to just disable automatic snap updates permanently, so I can just be in control with "sudo snap refresh". I'm way less bothered by two tools being removed from the default install, where I can still install them with apt-get, than I am by the abomination that is snaps. If they made them less user-hostile, I'd have no problem with the approach. But snaps in their current form are horrible. I'm in favor of picking battles, and I see this as a much bigger one.
(Score: 5, Interesting) by Anonymous Coward on Friday August 08, @03:30PM (2 children)
Sibling comment calls it. It is about Canonical doing the anti-GPL Apple thing.
But, to your comment on the superiority of tmux, it depends. Tmux can do things that screen cannot. But screen can do things that tmux cannot, e.g., act as a serial terminal. Personally, I use both. Ditto for wget and curl: wget when I just want to download a file, and curl when I want to look at headers or interact with an API.
I think this is just further evidence that Canonical has lost its way. But I don't expect Mark Shuttleworth to care what I think.
Despite drama over systemd (justified, IMO), Debian is an excellent distro that has been around almost as long as Slackware (they are the two oldest distros still active, and both are community supported -- i.e., no shitty corporation engaging in rapid enshittification). And Debian will be familiar enough to Ubuntu users, as it is what Ubuntu is derived from.
(Score: 4, Disagree) by RedGreen on Friday August 08, @11:00PM
"Despite drama over systemd (justified, IMO), Debian is an excellent distro that has been around almost as long as Slackware (they are the two oldest distros still active, and both are community supported-- i.e., no shitty corporation engaging in rapid enshitification."
No, they are doing the slow-boil method of enshittification, tiny change by tiny change, supporting the parasite corporate agenda all the way. The adoption of the systemd garbage proves it: the damn thing is/was supposed to be an init system, yet it continually spreads its tentacles into all aspects of the system. For all their supposed support of the *nix way of doing things (do one thing and do it well), the GNU/Linux distribution of Debian continues to support that abomination through all of its mission creep, no questions asked at all. So tell that fucking lie elsewhere; they are in it full on by their continuing support and enabling of that absolute trash.
"I modded down, down, down, and the flames went higher." -- Sven Olsen
(Score: 4, Informative) by SDRefugee on Saturday August 09, @05:00PM
And if you don't like systemd, there's always Devuan. All the up-to-date goodness of Debian WITHOUT the idiocy of systemd.
America should be proud of Edward Snowden, the hero, whether they know it or not..
(Score: 5, Interesting) by VLM on Friday August 08, @03:33PM (2 children)
That's actually pretty funny, as Ubuntu is downstream of Debian, and Debian already moved from bash as the main shell to some inferior shell, intentionally making initscripts harder to write, to push the meme that only systemd can save us from hard-to-write initscripts. Initscripts are easier to write and use than systemd, unless you intentionally sabotage them, in which case... a self-inflicted injury, LOL.
Devuan and Alpine are the OS of choice for sysadmins anyway; Debian and Ubuntu and Redhat are corporate distros for corporate people, and I'm not a megacorp so I have no interest in them anymore, other than for software that's tied to specific OS versions.
(Score: 4, Informative) by krishnoid on Friday August 08, @04:17PM
From the Debian website [debian.org]:
Since busybox sh on Linux and Windows is also a stripped-down version, it may also improve portability a tiny bit.
(Score: 2) by JoeMerchant on Friday August 08, @05:19PM
Ubuntu is effectively hooking our mega-corp by offering "extended security patch support" for a price. Just the fact that it can last longer in the field, regardless of cost, makes it more attractive than an alternate distro which doesn't stand up and say "you're supported until ... whenever."
Actually, that move effectively sabotaged all Linux out of the next round of our product; it will be Win11-based this time, and 8 or so years from now, having suffered the trials of a Win11-based system, we may (or may not) come crawling back to Linux again.
🌻🌻🌻 [google.com]
(Score: 4, Interesting) by krishnoid on Friday August 08, @03:59PM (4 children)
I've heard that screen is difficult to understand, maintain, and extend, and tmux's source code is very carefully laid out.
In particular, for two non-trivial programs that do very similar things, it always struck me that the pair could be a teaching example of the "trade" part of software engineering, rather than just the "correctness" that's taught in school. I haven't dug into the code myself, though.
(Score: 3, Informative) by rufty on Friday August 08, @07:10PM (3 children)
The only thing I use screen for is talking to serial devices: $ screen /dev/ttyGPS0 9600
Can I do that with tmux?
(Score: 3, Interesting) by krishnoid on Friday August 08, @11:13PM
Nope and they won't implement it [github.com], ostensibly.
(Score: 1, Interesting) by Anonymous Coward on Sunday August 10, @12:17AM (1 child)
I'm curious: why can't you run tmux and then run cu or similar inside tmux?
(Score: 2) by canopic jug on Sunday August 10, @04:02AM
Well, it has been years since I tried, but back when I last tested it, cu(1) on GNU/Linux didn't actually work. However, now that you bring it up, I see that there is a more recent port of OpenBSD's version of cu(1). Checking again, it seems that both versions do the job, so screen(1) is no longer needed here.
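For the record, taking rufty's device from upthread, something along these lines should do it (untested here):
$ tmux new-session 'cu -l /dev/ttyGPS0 -s 9600'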
Money is not free speech. Elections should not be auctions.
(Score: 3, Interesting) by epitaxial on Friday August 08, @04:29PM
Not Invented Here Syndrome.
The reason why traceroute, nslookup, route, and ifconfig were replaced by something different.
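Roughly: ifconfig and route gave way to ip from iproute2, nslookup to dig or host, and traceroute to tracepath; the old tools mostly still exist, they just live in packages like net-tools that aren't installed by default any more. For example (example.com is a placeholder):
$ ip addr show
$ ip route show
$ dig example.com
$ tracepath example.com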
(Score: 3, Interesting) by JoeMerchant on Friday August 08, @05:16PM
All I have to relate is about standard curl, which introduced breaking changes from curl 7 to curl 8, and a "simple" OS update (for security reasons) broke a pile of our existing code.
Never been similarly burned by wget...
🌻🌻🌻 [google.com]
(Score: 2) by driverless on Saturday August 09, @03:09AM (1 child)
I think curl has evolved into being more of a Swiss-army-chainsaw than wget, so it'll fetch stuff under conditions where wget will bail. And since they do more or less the same thing, they went with the more featureful option.
(Score: 2) by https on Sunday August 10, @05:33AM
Stenberg is quite clear [github.com] that there are some truly significant differences between wget and curl. E.g., with wget, you can resume partial downloads after interruption. Curl can't.
Offended and laughing about it.
(Score: 2) by bzipitidoo on Friday August 08, @09:21PM
Any word on preferred terminal emulators? xterm is the original one, now very old, with very messy code. I found it interesting that the GNOME and KDE projects made their own terminals, Konsole and GNOME Terminal. Why? Weren't existing terminals good enough?
The new terminals on the block are kitty and alacritty. Alacritty was created in part to make vim in tmux more performant.
(Score: 5, Interesting) by ShovelOperator1 on Saturday August 09, @03:05PM
I see curl and wget as two totally different tools for two totally different tasks.
Curl is for a lot of protocols, for downloading rather small chunks of data relative to the bandwidth, and it is useful especially in scripting applications. Want to just get something and process it? curl is here. Manually add headers any way you like? Works from the command line. Alter the protocol version itself? Yes, there are switches for it. Additionally, by default many builds of curl like to output progress to stderr while writing the downloaded file to stdout, and this, when handled correctly, results in a nice progress meter inside your script.
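Something along these lines shows the idea (the API URL is made up): a custom header, a forced protocol version, the body redirected from stdout to a file, and the progress meter left on stderr:
$ curl -H 'Accept: application/json' --http1.1 https://api.example.com/status > status.json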
However, wget is not for that. Wget will download large files, automatically supporting time-outs and resuming as configured. Wget will download recursively, spidering through links according to regular expressions specified in parameters. When links are hard-coded to a domain, it will correct them for you. It exposes the most-used options of a given protocol to be altered with parameters, but uses sane defaults for the rest. And, contrary to curl, it supports failing gracefully when the connection is unstable. This last one is nothing when it comes to downloading 10 kB of JSON from some API, but critical if wget is pulling a 10 GB file over a poor wifi or proxy link. On the other hand, contrary to curl, wget seems to reliably support only HTTP(S) and FTP.
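For illustration (the URLs are made up): the first command keeps retrying and resuming a big download over a flaky link, the second mirrors only the PDFs two levels deep under a directory:
$ wget -c -t 0 --timeout=30 https://example.com/big.iso
$ wget -r -np -l 2 --accept-regex '.*\.pdf$' https://example.com/docs/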
So these are programs for two different tasks. Replacing one with the other is like replacing a spreadsheet with a text editor plus a table template: generally possible, but in practice you will have to do all the math using the system's calculator. Do they think we should write our own batch/recursive downloaders?
Or maybe it's a political decision because, for example, Canonical wants to be in some crawler business?