
posted by janrinok on Saturday October 11, @08:58PM   Printer-friendly

From Cory Doctorow's blog:

Like you, I'm sick to the back teeth of talking about AI. Like you, I keep getting dragged into discussions of AI. Unlike you, I spent the summer writing a book about why I'm sick of writing about AI, which Farrar, Straus and Giroux will publish in 2026.

A week ago, I turned that book into a speech, which I delivered as the annual Nordlander Memorial Lecture at Cornell, where I'm an AD White Professor-at-Large. This was my first-ever speech about AI and I wasn't sure how it would go over, but thankfully, it went great and sparked a lively Q&A. One of those questions came from a young man who said something like "So, you're saying a third of the stock market is tied up in seven AI companies that have no way to become profitable and that this is a bubble that's going to burst and take the whole economy with it?"

I said, "Yes, that's right."

He said, "OK, but what can we do about that?"

So I re-iterated the book's thesis: that the AI bubble is driven by monopolists who've conquered their markets and have no more growth potential, who are desperate to convince investors that they can continue to grow by moving into some other sector, e.g. "pivot to video," crypto, blockchain, NFTs, AI, and now "super-intelligence." Further: the topline growth that AI companies are selling comes from replacing most workers with AI, and re-tasking the surviving workers as AI babysitters ("humans in the loop"), which won't work. Finally: AI cannot do your job, but an AI salesman can 100% convince your boss to fire you and replace you with an AI that can't do your job, and when the bubble bursts, the money-hemorrhaging "foundation models" will be shut off and we'll lose the AI that can't do your job, and you will be long gone, retrained or retired or "discouraged" and out of the labor market, and no one will do your job. AI is the asbestos we are shoveling into the walls of our society and our descendants will be digging it out for generations:

The only thing (I said) that we can do about this is to puncture the AI bubble as soon as possible, to halt this before it progresses any further and to head off the accumulation of social and economic debt. To do that, we have to take aim at the material basis for the AI bubble (creating a growth story by claiming that defective AI can do your job).

"OK," the young man said, "but what can we do about the crash?" He was clearly very worried.

"I don't think there's anything we can do about that. I think it's already locked in. I mean, maybe if we had a different government, they'd fund a jobs guarantee to pull us out of it, but I don't think Trump'll do that, so –"

[...] I firmly believe the (economic) AI apocalypse is coming. These companies are not profitable. They can't be profitable. They keep the lights on by soaking up hundreds of billions of dollars in other people's money and then lighting it on fire. Eventually those other people are going to want to see a return on their investment, and when they don't get it, they will halt the flow of billions of dollars. Anything that can't go on forever eventually stops.

[...] The data-center buildout has genuinely absurd finances – there are data-center companies that are collateralizing their loans by staking their giant Nvidia GPUs as collateral. This is wild: there's pretty much nothing (apart from fresh-caught fish) that loses its value faster than silicon chips. That goes triple for GPUs used in AI data-centers, where it's normal for tens of thousands of chips to burn out over a single, 54-day training run.

That barely scratches the surface of the funny accounting in the AI bubble. Microsoft "invests" in Openai by giving the company free access to its servers. Openai reports this as a ten billion dollar investment, then redeems these "tokens" at Microsoft's data-centers. Microsoft then books this as ten billion in revenue.

That's par for the course in AI, where it's normal for Nvidia to "invest" tens of billions in a data-center company, which then spends that investment buying Nvidia chips. The same chunk of money is being energetically passed back and forth between these closely related companies, all of which claim it as investment, as an asset, or as revenue (or all three).

[...] Industry darlings like Coreweave (a middleman that rents out data-centers) are sitting on massive piles of debt, secured by short-term deals with tech companies that run out long before the debts can be repaid. If they can't find a bunch of new clients in a couple short years, they will default and collapse.

[...] Plan for a future where you can buy GPUs for ten cents on the dollar, where there's a buyer's market for hiring skilled applied statisticians, and where there's a ton of extremely promising open source models that have barely been optimized and have vast potential for improvement.

[...] The most important thing about AI isn't its technical capabilities or limitations. The most important thing is the investor story and the ensuing mania that has teed up an economic catastrophe that will harm hundreds of millions or even billions of people. AI isn't going to wake up, become superintelligent and turn you into paperclips – but rich people with AI investor psychosis are almost certainly going to make you much, much poorer.


Original Submission

posted by janrinok on Saturday October 11, @04:13PM   Printer-friendly
from the stand-up-to-bullies dept.

Last week, U.S. Education Secretary Linda McMahon issued the latest attack on academia, the "Compact for Academic Excellence in Higher Education," which was addressed to a small group of well-known US universities. If you missed it, there is a description at https://en.wikipedia.org/wiki/Compact_for_Academic_Excellence_in_Higher_Education

Today (10/10/2025), MIT became the first of the group to reject the offer. Here is the letter from MIT's president: https://orgchart.mit.edu/letters/regarding-compact
It's not long and worth a read; here is the punch line:

In our view, America's leadership in science and innovation depends on independent thinking and open competition for excellence. In that free marketplace of ideas, the people of MIT gladly compete with the very best, without preferences. Therefore, with respect, we cannot support the proposed approach to addressing the issues facing higher education.

And here is one of the bullet points from the MIT president's letter:

MIT opens its doors to the most talented students regardless of their family's finances. Admissions are need-blind. Incoming undergraduates whose families earn less than $200,000 a year pay no tuition. Nearly 88% of our last graduating class left MIT with no debt for their education. We make a wealth of free courses and low-cost certificates available to any American with an internet connection. Of the undergraduate degrees we award, 94% are in STEM fields. And in service to the nation, we cap enrollment of international undergraduates at roughly 10%.


Original Submission

posted by janrinok on Saturday October 11, @11:28AM   Printer-friendly

Baseload power is functionally extinct:

Much has been made of the notion that "renewables can't supply baseload power". This line suggests we need to replace Australia's ageing coal fleet with new coal or nuclear. The fact of the matter is that, already, "baseload" is an outdated concept and baseload generators face extinction.

Traditional utility grid management suggests there are three types of load: baseload, shoulder, and peak. Baseload is underlying 24/7 energy demand. Peak load is regular, but short-lived periods of high demand and shoulder loads are what lie in between. Under this model, system planning is straightforward – assign different types of energy generation to the different loads according to the price and qualitative characteristics.

[Figure: Traditional, simple dispatch of generation technologies according to cost and flexibility]

Historically in Australia, coal has supplied most baseload demand since it is relatively cheap and very slow to ramp its output up or down. In some countries, baseload is met with nuclear, since it is even less flexible than coal, but only two countries generate more than 50% of their energy from nuclear.

With the roles of different generators clearly delineated, power planners' jobs are much easier in this idealised system than today's grid.

In a system with lots of solar, prices fall dramatically around midday because solar has no fuel cost. Because much of Australia's solar is on rooftops, grid demand also falls. For those hours, baseload generators must either operate at a loss or shut down: continuing to generate produces more energy than the grid requires, at very low or negative prices. This is not a conscious choice; it is the structure of the market, in which the cheapest bid is dispatched first.
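To make that merit-order point concrete, here is a minimal Python sketch (my own illustration; the bids, prices and demand are invented, not figures from the article) of cheapest-first dispatch: once near-zero-cost solar and wind cover midday demand, the baseload bid simply isn't scheduled.

    # Hypothetical merit-order dispatch: cheapest bids are scheduled first.
    # All names, prices and capacities below are invented for illustration.
    def dispatch(bids, demand_mw):
        schedule, remaining = [], demand_mw
        for name, price, capacity in sorted(bids, key=lambda b: b[1]):
            if remaining <= 0:
                break
            take = min(capacity, remaining)
            schedule.append((name, take, price))
            remaining -= take
        clearing_price = schedule[-1][2] if schedule else None
        return schedule, clearing_price

    midday_bids = [
        ("solar", 0, 6000),           # near-zero fuel cost
        ("wind", 5, 2000),
        ("coal baseload", 40, 5000),
        ("gas peaker", 120, 1500),
    ]
    schedule, price = dispatch(midday_bids, demand_mw=7000)
    print(schedule)   # solar and wind cover all 7000 MW; coal is not dispatched
    print(price)      # clearing price is set by the marginal (wind) bid: $5/MWh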

In practice, most baseload generators are simply not capable of ramping up and down fast enough – they must bear loss-making prices in the middle of the day and try to make it up with high prices at peak periods. Moreover, this daily up/down ramp (called "load-following") brings efficiency losses and extra maintenance costs.

[Figure: The situation in modern Australia – because baseload generators cannot be turned off, cheap solar is curtailed in the middle of the day.]

As solar increases, this dynamic makes baseload generators impractical and unprofitable. Already, this is the situation in South Australia – in the last week of Winter 2024, SA ran on more than 100% net renewables. SA is instantaneously meeting 100% of demand from solar alone most days. It is no surprise that SA's last coal-fired power plant shut nearly a decade ago, in 2016, after years of being operated only seasonally.

The rest of Australia has not yet caught up to SA and Tasmania in terms of renewables and there is still a case for coal in the national energy market. However, the trend in solar uptake is abundantly clear and there will be no economic case for coal in just a few short years' time anywhere in Australia.

Excess energy in the middle of the day is useless if no-one wants to use it or if they want to use it overnight; this is where firming is required. When variable renewables are paired with enough storage or back-up power, it is called "firm". For a utility grid, this means large amounts of storage such as batteries and pumped hydro energy storage, as well as flexible generation such as hydro and possibly open cycle gas turbines.

In our transitioning grid, baseload generators run at a loss during the day while storage soaks up cheap solar to sell at peak times. This is called energy arbitrage – buying low and selling high – and it is extremely profitable. It is tempting to think this arrangement could continue, but it cannot. As more batteries come online, the economics of baseload generators get worse.
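As a rough illustration of why that arbitrage is attractive, here is a back-of-the-envelope Python sketch; the battery size, prices and round-trip efficiency are assumptions for the example, not figures from the article.

    # Hypothetical battery arbitrage: charge on cheap midday solar, discharge at
    # the evening peak. All numbers are assumed for illustration.
    capacity_mwh = 100            # usable battery capacity
    round_trip_efficiency = 0.9   # fraction of stored energy recovered
    midday_price = 5              # $/MWh while solar floods the market
    evening_price = 150           # $/MWh at the evening peak

    cost = capacity_mwh * midday_price
    revenue = capacity_mwh * round_trip_efficiency * evening_price
    profit = revenue - cost
    print(f"Profit per daily cycle: ${profit:,.0f}")   # $13,000 on these numbers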

We are set for a storage surge as:

  • utility batteries come online,
  • electric vehicles integrate with the grid,
  • Albanese offers household battery subsidies, and
  • battery prices continue to plummet.

In this future, midday energy is still practically free because storage cannot consume it all, and peak power prices are reduced because of battery arbitrage. Without profitable peak power prices, the economics of baseload generation are well and truly dead.

Power-hungry data centres have been meeting planning roadblocks because they consume more power than local infrastructure can handle. Rather than waiting for third parties to build out infrastructure, big tech companies want to take matters into their own hands. The possibility of big tech companies commissioning or commandeering nuclear reactors to supply new data centres with 24/7 power has created a media buzz.

It is unlikely that a self-reliant data centre would look to 100% renewables. This is not because renewables are unreliable, it is because firming renewables is easier at larger scales – wide geography helps to smooth out locally variable weather. Although nuclear is the most expensive option, big tech has cash to burn. The bigger hurdle to new nuclear is a 10-year-plus build timeline.

But whether or not data centres adopt nuclear is irrelevant for civil electricity because utility electricity grids are not data centres. If big tech builds nuclear to power data centres, it neither proves nor disproves that that technology is a good option for the whole grid.

Peter Dutton, if he succeeds in the upcoming election, faces an uphill battle to enact his nuclear energy policy. Not only must he overturn federal and state bans on nuclear power, he also has to figure out how the plants would make money. If Dutton were to build a nuclear plant, it would require a forever-subsidy to compete in the market.

The industry is aware of this. Daniel Westerman, chief executive of the market operator AEMO, was recently quoted as saying: "Australia's operational paradigm is no longer 'baseload and peaking'." AEMO has said competition from renewables is a key reason why coal has been retiring faster than announced.

The market is aware, and the industry is aware: baseload is not merely endangered, it is already functionally extinct. If the Coalition do build a nuclear power plant, Australian taxpayers will be the proud owners of an unprofitable, uncompetitive, expensive and unsellable liability.


Original Submission

posted by janrinok on Saturday October 11, @06:42AM   Printer-friendly
from the living-history dept.

David C Brock interviewed Ken Thompson for the Computer History Museum. It's a long interview, available as a video with a written transcript. The video is just over 4.5 hours long. The transcript weighs in at 64 pages as a downloadable PDF locked behind a CPU- and RAM-chewing web app.

This is an oral history interview with Ken Thompson, created in partnership by the Association for Computing Machinery and the Computer History Museum, in connection with his A.M. Turing Award in 1983. The interview begins with Thompson's family background and youth, detailing the hobbies he pursued intently from electronics and radio projects, to music, cars, and chess. He describes his experience at the University of California, Berkeley, and his deepening engagement with computers and computer programming there.

The interview then moves to his recruitment to the Bell Telephone Laboratories, and his experience of the Multics project. Thompson next describes his development of Unix and, with Dennis Ritchie, the programming language C. He describes the development of Unix and the Unix community at Bell Labs, and then details his work using Unix for the Number 5 Electronic Switching System. Thompson details his Turing Award lecture, the work on compromised compilers that led to it, and his views on computer security.

Next, he details his career in computer chess and work he did for Bell Labs artist Lillian Schwartz. Thompson describes his work on the Plan 9 operating system at Bell Labs with Rob Pike, and his efforts to create a digital music archive. He then details his post Bell Labs career at Entrisphere and then Google, including his role in Google Books and the creation of the Go programming language.

Previously:
(2025) Why Bell Labs Worked
(2022) Unix History: A Mighty Origin Story
(2019) Vintage Computer Federation East 2019 -- Brian Kernighan Interviews Ken Thompson


Original Submission

posted by janrinok on Saturday October 11, @02:01AM   Printer-friendly

From the Trenches

An interesting article about software quality over the years - by Denis Stetskov

The Apple Calculator leaked 32GB of RAM.

Not used. Not allocated. Leaked. A basic calculator app is haemorrhaging more memory than most computers had a decade ago.

Twenty years ago, this would have triggered emergency patches and post-mortems. Today, it's just another bug report in the queue.

We've normalized software catastrophes to the point where a Calculator leaking 32GB of RAM barely makes the news. This isn't about AI. The quality crisis started years before ChatGPT existed. AI just weaponized existing incompetence.

The Numbers Nobody Wants to Discuss:

I've been tracking software quality metrics for three years. The degradation isn't gradual—it's exponential.

Memory consumption has lost all meaning:

  • VS Code: 96GB memory leaks through SSH connections
  • Microsoft Teams: 100% CPU usage on 32GB machines
  • Chrome: 16GB consumption for 50 tabs is now "normal"
  • Discord: 32GB RAM usage within 60 seconds of screen sharing
  • Spotify: 79GB memory consumption on macOS

These aren't feature requirements. They're memory leaks that nobody bothered to fix.
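For readers who want to sanity-check claims like these on their own machine, here is a minimal Python sketch using the third-party psutil package; the process name is just an example, and a steadily climbing resident set across samples is only a hint of a leak, not proof.

    # Minimal memory-watch sketch (assumes: pip install psutil).
    import time
    import psutil

    def watch_memory(process_name, samples=5, interval=2.0):
        for proc in psutil.process_iter(["name"]):
            name = proc.info["name"] or ""
            if process_name.lower() in name.lower():
                for _ in range(samples):
                    rss_gb = proc.memory_info().rss / 1024**3
                    print(f"{name} (pid {proc.pid}): {rss_gb:.2f} GB resident")
                    time.sleep(interval)
                return
        print(f"No running process matching {process_name!r} found")

    watch_memory("Calculator")   # a steadily rising RSS across samples hints at a leak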

This isn't sustainable. Physics doesn't negotiate. Energy is finite. Hardware has limits.

The companies that survive won't be those that can outspend the crisis. They'll be those that remember how to engineer.

We're living through the greatest software quality crisis in computing history. A Calculator leaks 32GB of RAM. AI assistants delete production databases. Companies spend $364 billion to avoid fixing fundamental problems.


Original Submission

posted by janrinok on Friday October 10, @09:14PM   Printer-friendly
from the messy-wires dept.

Qualcomm on Tuesday said it has acquired Arduino, an Italian open-source electronics firm that makes hardware and software for developing prototypes of robots and other electronic gadgets, Reuters reports at https://www.reuters.com/world/asia-pacific/qualcomm-buys-open-source-electronics-firm-arduino-2025-10-07/.

Arduino's own announcement can be found at https://blog.arduino.cc/2025/10/07/a-new-chapter-for-arduino-with-qualcomm-uno-q-and-you/.

Alongside news that might confuse those who could not imagine "Arduino" itself as a tangible sales item, Arduino introduced a new model in the Uno form factor that combines a Qualcomm Dragonwing QRB2210 to run Linux, an STM32U585 microcontroller for hardware interfacing, and a new high-density connector on the bottom side. It is priced at $44 in the Arduino store.

Reception of the news in various channels seems mixed; many doubt that Qualcomm, given its history, would be a good steward for an ecosystem like Arduino.

The new Arduino Uno Q moves squarely into Raspberry Pi territory, where the Pi 5 currently sells for around $55 with mostly comparable features, at least if the RP2040-like features in the RP1 I/O controller are counted in.


Original Submission

posted by jelizondo on Friday October 10, @04:31PM   Printer-friendly
from the trading-climate-abatement-for-microplastics-infiltration dept.

Turning dissolved carbon dioxide from seawater to biodegradable plastic is an especially powerful way to clean up the ocean:

Not-so-fun fact: our oceans hold 150 times more carbon dioxide than the Earth's atmosphere. Adding to that causes ocean acidification, which can disrupt marine food chains and reduce biodiversity.

Addressing this could not only help restore balance to underwater ecosystems, but also take advantage of an opportunity to sustainably use this stored CO2 for a variety of purposes – including producing the industrial chemicals needed to make plastic.

The first step towards this, called Direct Ocean Capture (DOC) – removing dissolved carbon directly from seawater – happens through electrochemical processes. While there are a bunch of companies working on this, it hasn't yet been applied extensively at scale, and the cost-benefit doesn't look great at the moment (it's estimated that removing 1 ton of CO2 from the ocean could cost at least US$373, according to Climate Interventions).

Scientists from the Chinese Academy of Sciences and the University of Electronic Science and Technology of China – both in Shenzhen, China – have devised a DOC method which involves converting the captured CO2 into biodegradable plastic precursors. This approach is also described as operating at 70% efficiency, while consuming a relatively small amount of energy (3 kWh per kg of CO2), and working out to an impressive $230 per ton of CO2.
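A quick back-of-the-envelope check of that energy figure shows why the claimed cost per ton is plausible; the electricity price below is my own assumption, not a number from the paper.

    # Rough cost check: 3 kWh per kg of CO2, at an assumed electricity price.
    energy_per_kg_kwh = 3.0          # reported energy consumption
    kg_per_ton = 1000
    assumed_price_per_kwh = 0.05     # $/kWh, an assumed industrial rate

    energy_per_ton = energy_per_kg_kwh * kg_per_ton             # 3000 kWh/ton
    electricity_cost = energy_per_ton * assumed_price_per_kwh   # $150/ton
    print(f"{energy_per_ton:.0f} kWh/ton, about ${electricity_cost:.0f} of electricity per ton")
    # At this assumed rate, electricity alone is ~$150 of the reported $230/ton total.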

What's also worth noting is the use of modified marine bacteria for the last step. Here's a breakdown of the process, described in a paper appearing in Nature Catalysis:

First, electricity is used in a special reactor to acidify natural seawater. This converts the invisible, dissolved carbon into pure gas, which is collected. The system then restores the water's natural chemistry before returning it to the ocean.

Next, the captured CO2 gas is fed into a second reactor containing a bismuth-based catalyst to yield a concentrated, pure liquid called formic acid. Formic acid is a critical intermediate because it is an energy-rich food source for microbes.

Engineered marine microbes, specifically Vibrio natriegens, are fed the pure formic acid as their sole source of carbon. The microbes metabolize the formic acid and efficiently produce succinic acid, which is then used directly as the essential precursor to synthesize biodegradable plastics, such as polybutylene succinate (PBS).

That's a pretty good start. The researchers note there's room for optimization to boost yields and integrate this system into industrial processes. It could also be altered to produce chemicals for use in fuels, drugs, and foods.

It also remains to be seen how quickly the team can commercialize this DOC method, because it may have formidable competition. For example, Netherlands-based Brineworks says it will get to under $200/ton by 2030 with its electrolysis-based solution. The next couple of years will be worth watching in this fascinating niche of decarbonization.

Journal Reference: Li, C., Guo, M., Yang, B. et al. Efficient and scalable upcycling of oceanic carbon sources into bioplastic monomers. Nat Catal (2025). https://doi.org/10.1038/s41929-025-01416-4


Original Submission

posted by jelizondo on Friday October 10, @11:47AM   Printer-friendly
from the better-late-than-never-news dept.

The transistor was patented 75 years ago today:

75 years ago, the three Bell Labs scientists behind the invention of the transistor would, at last, have the U.S. Patent in their hands. This insignificant-looking semiconductor device with three electrodes sparked the third industrial revolution. Moreover, it ushered in the age of silicon and software, which still dominates business and human society to this day.

The first working transistor was demonstrated in 1947, but it wasn't until October 3, 1950, that the patent was secured by John Bardeen, Walter Brattain, and William Shockley. The patent was issued for a "three-electrode circuit element utilizing semiconductor materials." It would take several more years before the significant impacts transistors would have on business and society were realized.

Transistors replaced the bulky, fragile and power-hungry valves that stubbornly remain present in some guitar amplifiers, audiophile sound systems, and studio gear, where their 'organic' sound profile is sometimes preferred. We also still see valves in some military, scientific, and microwave/RF applications, where transistors might be susceptible to radiation or other interference. There are other niche use cases.

Beyond miniaturization, transistors would deliver dramatic boosts in computational speed, energy efficiency, and reliability. Moreover, they became the foundation for integrated circuits and processors, where billions of transistors can operate reliably in a much smaller footprint than that taken up by a single valve. Processors featuring a trillion transistors are now on the horizon.

For PC enthusiasts, probably the best-known piece of transistor lore comes from Intel co-founder Gordon Moore. Of course, we are talking about Moore's Law, which was an observation by the pioneering American engineer. Moore's most famous prediction was that "the number of transistors on an integrated circuit will double every two years with minimal rise in cost." (The law was revised from one year to two in 1975.)
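For the arithmetic behind that prediction, here is a tiny Python sketch of the doubling curve; the 1971 starting count is only a rough, illustrative figure for an early microprocessor, not a claim about any specific chip.

    # Moore's Law as simple exponential doubling (every two years).
    def transistors(start_count, start_year, year, doubling_years=2):
        return start_count * 2 ** ((year - start_year) / doubling_years)

    start = 2300   # roughly the scale of an early-1970s microprocessor
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(year, f"{transistors(start, 1971, year):,.0f}")
    # Fifty years of doubling every two years is a factor of 2**25, about 33 million.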

Obviously, prior to 1965, when Moore's Law was set out, the startling advance in transistor technology indicated that such an extrapolation would be reasonable. Even now, certain semiconductor companies, engineers, and commentators reckon that Moore's Law is still alive and well.

Whatever the case, it can't be denied that since the patenting of the transistor, we have seen incredible miniaturization and advances in computing and software, expanding the possibilities of minds and machines. The current tech universe is actually buzzing with firms that reckon they can make machines with minds - artificial intelligence.


Original Submission

posted by janrinok on Friday October 10, @11:11AM   Printer-friendly
from the party-in-stockholm-whoooo! dept.

Venezuelan opposition leader María Corina Machado has been awarded this year's Nobel Peace Prize

Somewhat better than the Ig Nobels are the actual Nobel Prizes. Winners started to be announced this week. So far Medicine and Physics have been announced, with the others to be revealed over the following days as I write this.

Physics: "for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit"
Medicine: "for their discoveries concerning peripheral immune tolerance"

Chemistry, Literature, Peace, and Economic Sciences: still to be announced at the time of writing.

https://www.nobelprize.org/all-nobel-prizes-2025/


Original Submission

posted by jelizondo on Friday October 10, @07:03AM   Printer-friendly

StatCounter reports that Windows 7 has gained almost 10% market share in the last month, just as Windows 10 support is coming to an end. It's clear people aren't ready to switch to Windows 11.

Someone must be wishing really hard, as according to StatCounter, Windows 7 is gaining market share in the year 2025, five years after support for it officially ended. As of this week, Windows 7 is in use on 9.61% of Windows PCs within StatCounter's pool of data, up from the 3.59% it had just a month ago.

For years, Windows 7 has hovered around 2% market share on StatCounter. After mainstream support ended, the last few holdouts very quickly made the move to Windows 10, but with support for Windows 10 ending now just two weeks away, it looks like many are giving Microsoft's best version of Windows another try.

Of course, StatCounter isn't an entirely accurate measure when it comes to actual usage numbers, but it can give us a rough idea about how the market is trending, and it seems people are not happy with the idea of upgrading to Windows 11 from Windows 10. Windows 7's sudden market share gain is likely a blip, but interesting nonetheless.

Taking a closer look at StatCounter, it appears Windows 11 market share stalled in the last month, maintaining around 48% share. Windows 10 continued to drop, as expected, and is now on just 40% of PCs. While I wouldn't be surprised if some people had experimented with going back to Windows 7 recently, I highly doubt it's a number as high as 9.61%.

[...] Windows 11 failing to gain any market share in the final month before Windows 10's end of support is frankly shocking, and if the numbers are accurate, it should be setting alarm bells off for Microsoft internally. It's clear that much of the market has rejected Windows 11; whether because of its high system requirements or its insistence on AI features, people aren't moving to it.

In recent months, it seems Windows' reputation has fallen off a cliff. With enshittification slowly moving in, a lack of innovative new features and experiences that aren't tied to AI, and monthly updates that consistently introduce unnecessary changes and issues, people are getting tired of Microsoft's antics.


Original Submission

posted by jelizondo on Friday October 10, @02:21AM   Printer-friendly

Are VPNs Under Attack? An Anti-Censorship Group Speaks Out:

"Leave VPNs alone." That's the plea from anti-online censorship and surveillance group Fight for the Future, which designated Sept. 25 as a VPN Day of Action to press lawmakers not to ban virtual private networks. A week later, there's still an opportunity to get involved and raise awareness.

The group of activists, artists, engineers and technologists is asking people to sign an open letter encouraging politicians to preserve the existence of VPNs and "defend privacy and to access knowledge and information online." Virtual private networks encrypt internet connections and can hide your physical location.

Joining the action on Thursday were the VPN Trust Initiative -- which includes NordVPN, Surfshark and ExpressVPN -- and the VPN Guild, which includes Amnezia VPN.

The open letter refers to recent "age-verification" laws propelling legislative moves to ban or restrict VPN usage. Such measures would lead to increased online surveillance and censorship, which "has a huge chilling effect on our freedoms, particularly the freedoms of traditionally marginalized people," the letter notes.

Lia Holland, Fight for the Future's campaigns and communication director, said VPNs are vital for "people living under authoritarian regimes" to avoid censorship and surveillance, and have become an essential tool in exercising basic human rights.

Half of all US states have passed age-verification laws requiring internet users to prove their age with government-issued IDs, credit card checks and other methods. The laws have spurred consumers to sign up for VPNs to avoid giving out sensitive information, with one recent VPN sign-up spike in the UK.

Michigan is considering a bill banning adult content online and VPNs. If it becomes law, Michigan would be the first US state to ban VPNs. Many countries, including China, India and Iran, already ban or heavily restrict VPNs.

"Amid a moral panic, ignorant 'save-the-children' politicians are getting very close to kicking the hornet's nest of millions of people who know how important VPNs are," Holland said.

Banning VPNs would be "difficult," according to attorney Mario Trujillo of the Electronic Frontier Foundation, an international digital rights group.

Trujillo told CNET that VPNs are best for routing your network connection through a different network. "They can be used to help avoid censorship, but they are also used by employees in every sector to connect to their company's network," he said. "That is a practical reality that would make any ban difficult."

Trujillo added that the US lags behind the rest of the world in privacy regulation, and that lawmakers should focus more on privacy than VPN bans.

Fight for the Future identifies local lawmakers and provides templates for contacting them. This information is on the same page as the open letter.


Original Submission

posted by jelizondo on Thursday October 09, @09:43PM   Printer-friendly

The ship wasn't designed to withstand the powerful ice compression forces—and Shackleton knew it:

In 1915, intrepid British explorer Sir Ernest Shackleton and his crew were stranded for months in the Antarctic after their ship, Endurance, was trapped by pack ice, eventually sinking into the freezing depths of the Weddell Sea. Miraculously, the entire crew survived. The prevailing popular narrative surrounding the famous voyage features two key assumptions: that Endurance was the strongest polar ship of its time, and that the ship ultimately sank after ice tore away the rudder.

However, a fresh analysis reveals that Endurance would have sunk even with an intact rudder; it was crushed by the cumulative compressive forces of the Antarctic pack ice, with no single cause for the sinking. Furthermore, the ship wasn't designed to withstand those forces, and Shackleton was likely well aware of that fact, according to a new paper published in the journal Polar Record. Yet he chose to embark on the risky voyage anyway.

Author Jukka Tuhkuri of Aalto University is a polar explorer and one of the leading researchers on ice worldwide. He was among the scientists on the Endurance22 mission that discovered the Endurance shipwreck in 2022, documented in a 2024 National Geographic documentary. The ship was in pristine condition partly because of the lack of wood-eating microbes in those waters. In fact, the Endurance22 expedition's exploration director, Mensun Bound, told The New York Times at the time that the shipwreck was the finest example he's ever seen; Endurance was "in a brilliant state of preservation."

[...] Once the wreck had been found, the team recorded as much as they could with high-resolution cameras and other instruments. Vasarhelyi, particularly, noted the technical challenge of deploying a remote digital 4K camera with lighting at 9,800 feet underwater, and the first deployment at that depth of photogrammetric and laser technology. This resulted in a millimeter-scale digital reconstruction of the entire shipwreck to enable close study of the finer details.

It was shortly after the Endurance22 mission found the shipwreck that Tuhkuri realized that there had never been a thorough structural analysis conducted of the vessel to confirm the popular narrative. Was Endurance truly the strongest polar ship of that time, and was a broken rudder the actual cause of the sinking? He set about conducting his own investigation to find out, analyzing Shackleton's diaries and personal correspondence, as well as the diaries and correspondence of several Endurance crew members.

[...] Endurance was originally named Polaris; Shackleton renamed it when he purchased the ship in 1914 for his doomed expedition. Per Tuhkuri, the ship had a lower (tween) deck, a main deck, and a short bridge deck above them which stopped at the machine room in order to make space for the steam engine and boiler. There were no beams in the machine room area, nor any reinforcing diagonal beams, which weakened this significant part of the ship's hull.

[...] Based on his analysis, Tuhkuri concluded that the rudder wasn't the sole or primary reason for the ship's sinking. "Endurance would have sunk even if it did not have a rudder at all," Tuhkuri wrote; it was crushed by the ice, with no single reason for its eventual sinking. Shackleton himself described the process as ice floes "simply annihilating the ship."

Perhaps the most surprising finding is that Shackleton knew of Endurance's structural shortcomings even before undertaking the voyage. Per Tuhkuri, the devastating effects of compressive ice on ships were known to shipbuilders in the early 1900s. An early Swedish expedition was forced to abandon its ship Antarctic in February 1903 when it became trapped in the ice. Things progressed much as they later would with Endurance: the ice lifted Antarctic up so that the ship heeled over, with ice-crushed sides, buckling beams, broken planking, and a damaged rudder and stern post. The final sinking occurred when an advancing ice floe ripped off the keel.

Shackleton knew of Antarctic's fate and had even been involved in the rescue operation. He also helped Wilhelm Filchner make final preparations for Filchner's 1911-1913 polar expedition with a ship named Deutschland; he even advised his colleague to strengthen the ship's hull by adding diagonal beams, the better to withstand the Weddell Sea ice. Filchner did so and as a result, Deutschland survived eight months of being trapped in compressive ice, until the ship was finally able to break free and sail home. (It took a torpedo attack in 1917 to sink the good ship Deutschland.)

The same shipyard that modified Deutschland had also just signed a contract to build Endurance (then called Polaris). So both Shackleton and the ship builders knew how destructive compressive ice could be and how to bolster a ship against it. Yet Endurance was not outfitted with diagonal beams to strengthen its hull. And knowing this, Shackleton bought Endurance anyway for his 1914-1915 voyage. In a 1914 letter to his wife, he even compared the strength of its construction unfavorably with that of the Nimrod, the ship he used for his 1907-1909 expedition. So Shackleton had to know he was taking a big risk.

"Even simple structural analysis shows that the ship was not designed for the compressive pack ice conditions that eventually sank it," said Tuhkuri. "The danger of moving ice and compressive loads—and how to design a ship for such conditions—was well understood before the ship sailed south. So we really have to wonder why Shackleton chose a vessel that was not strengthened for compressive ice. We can speculate about financial pressures or time constraints but the truth is we may never know. At least we now have more concrete findings to flesh out the stories."

Both TFA and the open-access journal reference are very interesting reads.

Journal Reference: Polar Record, 2025. DOI: 10.1017/S0032247425100090


Original Submission

posted by jelizondo on Thursday October 09, @04:55PM   Printer-friendly

https://phys.org/news/2025-09-forensic-recovers-fingerprints-ammunition-casings.html

A pioneering new test that can recover fingerprints from ammunition casings, once thought nearly impossible, has been developed by two Irish scientists.

Dr. Eithne Dempsey and her recent Ph.D. student Dr. Colm McKeever, of the Department of Chemistry at Ireland's Maynooth University, have developed a unique electrochemical method which can visualize fingerprints on brass casings, even after they have been exposed to the high-temperature conditions experienced during gunfire. The study is published in the journal Forensic Chemistry.

For decades, investigators have struggled to recover fingerprints from weapons because any biological trace is usually destroyed by the high temperatures, friction and gas released after a gun is fired. As a result, criminals often abandon their weapons or casings at crime scenes, confident that they leave no fingerprint evidence behind.

"The Holy Grail in forensic investigation has always been retrieving prints from fired ammunition casings," said Dr. Dempsey. "Traditionally, the intense heat of firing destroys any biological residue. However, our technique has been able to reveal fingerprint ridges that would otherwise remain imperceptible."

The team found they could coat brass casings with a thin layer of specialized materials to make hidden fingerprint ridges visible. Unlike existing methods that need dangerous chemicals or high-powered equipment, the new process uses readily available non-toxic polymers and minimal amounts of energy to quickly reveal prints from seemingly blank surfaces.

It works by placing the brass casing of interest in an electrochemical cell containing specific chemical substances. When a small voltage is applied, chemicals in the solution are attracted to the surface, coating the spaces between fingerprint ridges and creating a clear, high contrast image of the print. The fingerprint appears within seconds as if by magic!

"Using the burnt material that remains on the surface of the casing as a stencil, we can deposit specific materials in between the gaps, allowing for the visualization," said Dr. McKeever.

Tests showed that this technique also worked on samples aged up to 16 months, demonstrating remarkable durability.

The research has significant implications for criminal investigations, where the current assumption is that firing a gun eliminates fingerprint residues on casings.

"Currently, the best case of forensic analysis of ammunition casings is to match it to the gun that fired it," said Dr. McKeever. "But we hope a method like this could match it back to the actual person who loaded the gun."

The team focused specifically on brass ammunition casings, a material that has traditionally been resistant to fingerprint detection and is the most common casing material used globally.

The researchers believe that the test for fingerprints on brass they have developed could be adapted for other metallic surfaces, expanding its range of potential forensic applications, from firearm-related crimes to arson.

This technique uses a device called a potentiostat, which controls voltage and can be as portable as a mobile phone, making it possible to create a compact forensic testing kit.

"With this method, we have turned the ammunition casing into an electrode, allowing us to drive chemical reactions at the surface of the casing," said Dr. McKeever.

While promising, the new technology faces rigorous testing and validation before it could potentially be adopted by law enforcement agencies worldwide.

More information: Colm McKeever et al, Electrodeposition of redox materials with potential for enhanced visualisation of latent finger-marks on brass substrates and ammunition casings, Forensic Chemistry (2025). DOI: 10.1016/j.forc.2025.100663


Original Submission

posted by janrinok on Thursday October 09, @12:13PM   Printer-friendly

Guess how much of Britain's direct transatlantic data capacity runs through two cables in Bude?:

Feature The first transatlantic cable, laid in 1858, delivered a little over 700 messages before promptly dying a few weeks later. 167 years on, the undersea cables connecting the UK to the outside world process £220 billion in daily financial transactions. Now, the UK Parliament's Joint Committee on National Security Strategy (JCNSS) has told the government that it has to do a better job of protecting them.

The Committee's report, released on September 19, calls the government "too timid" in its approach to protecting the cables that snake from the UK to various destinations around the world. It warns that "security vulnerabilities abound" in the UK's undersea cable infrastructure, when even a simple anchor-drag can cause major damage.

There are 64 cables connecting the UK to the outside world, according to the report, carrying most of the country's internet traffic. Satellites can't shoulder the data volumes involved, are too expensive, and only account for around 5 percent of traffic globally.

These cables are invaluable to the UK economy, but they're also difficult to protect. They are heavily shielded in the shallow sea close to their landing points, because accidental damage from fishing operations and other vessels is common: on average, around 200 cables suffer faults each year. But further out, the shielding is less robust. Instead, the companies that lay the cables rely on the depth of the sea to do its job (you'll be pleased to hear that sharks don't generally munch on them).

The report praises a strong cable infrastructure and admits that, in some areas at least, there is enough redundancy to handle disruptions. For example, it notes that 75 percent of UK transatlantic traffic routes through two cables that come ashore in Bude, Cornwall. That seems like quite the vulnerability, but the report acknowledges that there is plenty of infrastructure to route around them if anything happened. There is "no imminent threat to the UK's national connectivity," it soothes.

But it simultaneously cautions against adopting what it describes as "business-as-usual" views in the industry. The government "focuses too much on having 'lots of cables' and pays insufficient attention to the system's actual ability to absorb unexpected shocks," it frets. It warns that "the impacts on connectivity would be much more serious," if onward connections to Europe suffered as part of a coordinated attack.

"While our national connectivity does not face immediate danger, we must prepare for the possibility that our cables can be threatened in the event of a security crisis," it says.

Who is the most likely to mount such an attack, if anyone? Russia seems front and center, according to experts. It has reportedly been studying the topic for years. Keir Giles, director at The Centre for International Cyber Conflict and senior consulting fellow of the Russia and Eurasia Programme at Chatham House, argues that Russia has a long history of information warfare that stepped up after it annexed Crimea in 2014.

"The thinking part of the Russian military suddenly decided 'actually, this information isolation is the way to go, because it appears to win wars for us without having to fight them'," Giles says, adding that this approach is often combined with choke holds on land-based information sources. Cutting off the population in the target area from any source of information other than what the Russian troops feed them achieves results at low cost.

In a 2021 paper he co-wrote for the NATO Cooperative Cyber Defence Centre of Excellence, he pointed to the Glavnoye upravleniye glubokovodnykh issledovaniy (Main Directorate for Deep-Water Research, or GUGI), a secretive Russian agency responsible for analyzing undersea cables for intelligence or disruption. According to the JCNSS report, this organization operates the Losharik, a titanium-hulled submarine capable of targeting cables at extreme depth.

You don't need a fancy submarine to snag a cable, as long as you're prepared to do it in plain sight closer to the coast. The JCNSS report points to several incidents around the UK and the Baltics. November last year saw two incidents. In the first, Chinese-flagged cargo vessel Yi Peng 3 dragged its anchor for 300km and cut two cables between Sweden and Lithuania. That same month, the UK and Irish navies shadowed Yantar, a Russian research ship loitering around UK cable infrastructure in the Irish Sea.

The following month saw Cook Islands-flagged ship Eagle S damage one power cable and three data cables linking Finland and Estonia. This May, unaffiliated vessel Jaguar approached an undersea cable off Estonia and was escorted out of the country's waters.

The real problem with brute-force physical damage from vessels is that it's difficult to prove that it's intentional. On one hand, it's perfect for an aggressor's plausible deniability, and could also be a way to test the boundaries of what NATO is willing to tolerate. On the other, it could really be nothing.

"Attribution of sabotage to critical undersea infrastructure is difficult to prove, a situation significantly complicated by the prevalence of under-regulated and illegal shipping activities, sometimes referred to as the shadow fleet," a spokesperson for NATO told us.

"I'd push back on an assertion of a coordinated campaign," says Alan Mauldin, research director at analyst company TeleGeography, which examines undersea cable infrastructure warns. He questions assumptions that the Baltic cable damage was anything other than a SNAFU.

The Washington Post also reported comment from officials on both sides of the Atlantic that the Baltic anchor-dragging was probably accidental. Giles scoffs at that. "Somebody had been working very hard to persuade countries across Europe that this sudden spate of cables being broken in the Baltic Sea, one after another, was all an accident, and they were trying to say that it's possible for ships to drag their anchors without noticing," he says.

One would hope that international governance frameworks could help. The UN Convention on the Law of the Sea [PDF] has a provision against messing with undersea cables, but many states haven't enacted the agreement. In any case, plausible deniability makes things more difficult.

"The main challenge in making meaningful governance reforms to secure submarine cables is figuring out what these could be. Making fishing or anchoring accidents illegal would be disproportionate," says Anniki Mikelsaar, doctoral researcher at Oxford University's Oxford Internet Institute. "As there might be some regulatory friction, regional frameworks could be a meaningful avenue to increase submarine cable security."

The difficulty in pinning down intent hasn't stopped NATO from stepping in. In January it launched Baltic Sentry, an initiative to protect undersea infrastructure in the region. That effort includes frigates, patrol aircraft, and naval drones to keep an eye on what happens both above and below the waves.

Regardless of whether vessels are doing this deliberately or by accident, we have to be prepared for it, especially as cable installation shows no sign of slowing. Increasing bandwidth needs will boost global cable kilometers by 48 percent between now and 2040, says TeleGeography, adding that annual repairs will increase 36 percent over the same period.

"Many cable maintenance ships are reaching the end of their design life cycle, so more investment into upgrading the fleets is needed. This is important to make repairs faster," says Mikelsaar.

There are 62 vessels capable of cable maintenance today, and TeleGeography predicts that'll be enough for the next 15 years. However, it takes time to build these vessels and train the operators, meaning that we'll need to start delivering new vessels soon.

The problem for the UK, says the JCNSS, is that it doesn't own any of that repair capacity. It can take a long time to travel to a cable and repair it, and ships can only work on one at a time. The Committee advises that the UK acquire sovereign repair capacity, prescribing a repair ship by 2030.

"This could be leased to industry on favorable terms during peacetime and made available for Government use in a crisis," it says, adding that the Navy should establish a set of reservists that will be trained and ready to operate the vessel.

Sir Chris Bryant MP, the Minister for Data Protection and Telecoms, told the Committee that it was being apocalyptic and "over-egging the pudding" by examining the possibility of a co-ordinated attack. "We disagree," the Committee said in the report, arguing that the security situation in the next decade is uncertain.

"Focusing on fishing accidents and low-level sabotage is no longer good enough," the report adds. "The UK faces a strategic vulnerability in the event of hostilities. Publicly signaling tougher defensive preparations is vital, and may reduce the likelihood of adversaries mounting a sabotage effort in the first place."

To that end, it has made a battery of recommendations. These include building the risk of a coordinated campaign against undersea infrastructure into its risk scenarios, and protecting the stations - often in remote coastal locations - where the cables come onto land.

The report also recommends that the Department for Science, Innovation and Technology (DSIT) ensures all lead departments have detailed sector-by-sector technical impact studies addressing widespread cable outages.

"Government works around the clock to ensure our subsea cable infrastructure is resilient and can withstand hostile and non-hostile threats," DSIT told El Reg, adding that when breaks happen, the UK has some of the fastest cable repair times in the world, and there's usually no noticeable disruption."

"Working with NATO and Joint Expeditionary Force allies, we're also ensuring hostile actors cannot operate undetected near UK or NATO waters," it added. "We're deploying new technologies, coordinating patrols, and leading initiatives like Nordic Warden alongside NATO's Baltic Sentry mission to track and counter undersea threats."

Nevertheless, some seem worried. Vili Lehdonvirta, head of the Digital Economic Security Lab (DIESL) and professor of Technology Policy at Aalto University, has noticed increased interest from governments and private sector organizations alike in how much their daily operations depend on overseas connectivity. He says that this likely plays into increased calls for digital sovereignty.

"The rapid increase in data localization laws around the world is partly explained by this desire for increased resilience," he says. "But situating data and workloads physically close as opposed to where it is economically efficient to run them (eg. because of cheaper electricity) comes with an economic cost."

So the good news is that we know exactly how vulnerable our undersea cables are. The bad news is that so does everyone else with a dodgy cargo ship and a good poker face. Sleep tight.


Original Submission

posted by janrinok on Thursday October 09, @07:25AM   Printer-friendly

https://www.abortretry.fail/p/the-qnx-operating-system

Gordon Bell and Dan Dodge were finishing their time at the University of Waterloo in Ontario in 1979. In pursuit of their master's degrees, they'd worked on a system called Thoth in their real-time operating systems course. Thoth was interesting not only for having been real-time and having featured synchronous message passing, but also for originally having been written in the B programming language. It was then rewritten in the UW-native Eh language (fitting for a Canadian university), and then finally rewritten in Zed. It is this last, Zed-written, version of Thoth to which Bell and Dodge would have been exposed. Having always been written in a high-level language, the system was portable, and programs were the same regardless of the underlying hardware. Both by convention and by design, Thoth strongly encouraged programs to be structured as networks of communicating processes. As the final project for the RTOS course, students were expected to implement a real-time system of their own. This experience was likely pivotal to their next adventure.
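For readers unfamiliar with the term, synchronous message passing means the sender blocks until the receiver has taken the message and answered it (the send/receive/reply rendezvous QNX later made famous). Here is a minimal Python sketch of that pattern, my own illustration rather than Thoth or QNX code:

    # Toy rendezvous-style message passing: send() blocks until the server replies.
    import threading, queue

    class Channel:
        def __init__(self):
            self._requests = queue.Queue()

        def send(self, msg):                  # client side: blocks until reply arrives
            reply_box = queue.Queue(maxsize=1)
            self._requests.put((msg, reply_box))
            return reply_box.get()

        def receive(self):                    # server side: blocks for a request
            return self._requests.get()

        def reply(self, reply_box, answer):   # unblocks the waiting sender
            reply_box.put(answer)

    def server(ch):
        msg, reply_box = ch.receive()
        ch.reply(reply_box, msg.upper())

    ch = Channel()
    threading.Thread(target=server, args=(ch,), daemon=True).start()
    print(ch.send("hello"))   # prints "HELLO" only after the server has replied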

A very deep and excellent dive into the world/history of QNX:


Original Submission