

posted by janrinok on Friday February 14, @09:25PM   Printer-friendly
from the code-bucket dept.

Surprise surprise, we've done it again. We've demonstrated an ability to compromise significantly sensitive networks, including governments, militaries, space agencies, cyber security companies, supply chains, software development systems and environments, and more:

Arguably armed still with a somewhat inhibited ability to perceive risk and seemingly no fear, in November 2024, we decided to prove out the scenario of a significant Internet-wide supply chain attack caused by abandoned infrastructure. This time however, we dropped our obsession with expired domains, and instead shifted our focus to Amazon's S3 buckets.

It's important to note that although we focused on Amazon's S3 for this endeavour, this research challenge, approach and theme is cloud-provider agnostic and applicable to any managed storage solution. Amazon's S3 just happened to be the first storage solution we thought of, and we're certain this same challenge would apply to any customer/organization usage of any storage solution provided by any cloud provider.

The TL;DR is that this time, we ended up discovering ~150 Amazon S3 buckets that had previously been used across commercial and open source software products, governments, and infrastructure deployment/update pipelines - and then abandoned.

Naturally, we registered them, just to see what would happen - "how many people are really trying to request software updates from S3 buckets that appear to have been abandoned months or even years ago?", we naively thought to ourselves.

[...] These S3 buckets received more than 8 million HTTP requests over a 2 month period for all sorts of things -

  • Software updates,
  • Pre-compiled (unsigned!) Windows, Linux and macOS binaries,
  • Virtual machine images (?!),
  • JavaScript files,
  • CloudFormation templates,
  • SSLVPN server configurations,
  • and more.

The article goes on to describe where the requests came from and provides some details on getting the word to the right companies and what actions they took. Originally spotted on Schneier on Security.
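
The defensive half of this technique is easy to reproduce: before trusting a bucket name baked into an installer, build script, or update URL, check whether the bucket still exists. The sketch below is a minimal illustration, not watchTowr's tooling; the bucket names are hypothetical, and it relies only on S3 returning 404 for unregistered bucket names and 403 for buckets that exist but deny access.

```python
import urllib.error
import urllib.request

def bucket_status(bucket: str) -> str:
    """Probe an S3 bucket's virtual-hosted URL and classify the response.

    404 -> no such bucket: anyone could register the name and serve content;
    403 -> the bucket exists but is not publicly readable;
    200 -> the bucket exists and is publicly listable.
    """
    req = urllib.request.Request(f"https://{bucket}.s3.amazonaws.com/", method="HEAD")
    try:
        urllib.request.urlopen(req, timeout=10)
        return "exists (publicly readable)"
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return "unregistered - a potential takeover target"
        if err.code == 403:
            return "exists (access denied)"
        return f"unexpected HTTP {err.code}"

# Hypothetical bucket names lifted from an old build script:
for name in ("example-legacy-updates", "example-installer-artifacts"):
    print(f"{name}: {bucket_status(name)}")
```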


Original Submission

posted by janrinok on Friday February 14, @04:17PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

A federal judge appointed by a Republican president has castigated Trump administration officials — and ordered them to immediately restore public health websites that they abruptly shut down.

The lawsuit against the website removal, brought by a group of physicians known as Doctors for America, concerns sites operated by the Department of Health and Human Services, the Centers for Disease Control and Prevention and the Food and Drug Administration. Doctors for America says the scrubbing of the sites makes it more difficult for them to treat patients.

U.S. District Judge John Bates, appointed by George W. Bush, agreed. He ordered the government to restore the pages by the end of the day of his ruling (Tuesday, Feb. 11).

The TL;DR of Bates' ruling? It "was done without any public rationale, recourse or ability to challenge the decisions, despite laws and regulations that typically require them," as Politico summarized.

Sites removed by Trump officials concern HIV care, plus information on contraception drugs and student health. In their lawsuit filed against the Office for Personnel Management, HHS, CDC, and the FDA, Doctors for America says the removal of websites offering them guidance on these subjects is creating confusion, which eats up time that is better spent treating patients.

Justice Department attorneys defended the government's decision to remove the sites, saying doctors could still access the information by using the Wayback Machine, which archives offline websites. But that didn't fly with the judge.

"The Wayback Machine does not capture every webpage, and there is no information to suggest that is has archived each removed webpage," Bates wrote. "Additionally, pages archived on the Wayback Machine do not appear on search engines. In other words, a particular archived webpage is only viewable to a provider if the provider knows that the Wayback Machine exists and had recorded the pre-removal URL of the requested webpage."


Original Submission

posted by janrinok on Friday February 14, @11:34AM   Printer-friendly

https://phys.org/news/2025-02-earth-core-solid-previously-thought.html

The surface of the Earth's inner core may be changing, as shown by a new study by USC scientists that detected structural changes near the planet's center, published in Nature Geoscience.

Changes in the inner core have long been a topic of debate among scientists. However, most research has focused on assessing its rotation. John Vidale, Dean's Professor of Earth Sciences at the USC Dornsife College of Letters, Arts and Sciences and principal investigator of the study, said the researchers "didn't set out to define the physical nature of the inner core."

"What we ended up discovering is evidence that the near surface of Earth's inner core undergoes structural change," Vidale said. The finding sheds light on the role topographical activity plays in rotational changes in the inner core that have minutely altered the length of a day and may relate to the ongoing slowing of the inner core.

Located 3,000 miles below the Earth's surface, the inner core is anchored by gravity within the molten liquid outer core. Until now, the inner core was widely thought of as a solid sphere.

The original aim of the USC scientists was to further chart the slowing of the inner core. "But as I was analyzing multiple decades' worth of seismograms, one dataset of seismic waves curiously stood out from the rest," Vidale said. "Later on, I'd realize I was staring at evidence the inner core is not solid."

The study utilized seismic waveform data—including 121 repeating earthquakes from 42 locations near Antarctica's South Sandwich Islands that occurred between 1991 and 2024—to give a glimpse of what takes place in the inner core.

As the researchers analyzed the waveforms from receiver-array stations located near Fairbanks, Alaska, and Yellowknife, Canada, one dataset of seismic waves from the latter station included uncharacteristic properties the team had never seen before.

"At first the dataset confounded me," Vidale said. It wasn't until his research team improved the resolution technique did it become clear the seismic waveforms represented additional physical activity of the inner core.

The physical activity is best explained as temporal changes in the shape of the inner core. The new study indicates that the near surface of the inner core may undergo viscous deformation, changing its shape and shifting at the inner core's shallow boundary.

The clearest cause of the structural change is interaction between the inner and outer core. "The molten outer core is widely known to be turbulent, but its turbulence had not been observed to disrupt its neighbor the inner core on a human timescale," Vidale said. "What we're observing in this study for the first time is likely the outer core disturbing the inner core."

Vidale said the discovery opens a door to reveal previously hidden dynamics deep within Earth's core, and may lead to a better understanding of Earth's thermal and magnetic fields.

More information: John Vidale, Annual-scale variability in both the rotation rate and near surface of Earth's inner core, Nature Geoscience (2025).
DOI: 10.1038/s41561-025-01642-2. www.nature.com/articles/s41561-025-01642-2


Original Submission

posted by hubie on Friday February 14, @06:50AM   Printer-friendly

Critics accuse the company of wielding outsized private influence on public policing

Hackers leaked thousands of files from Lexipol, a Texas-based company that develops policy manuals, training bulletins, and consulting services for first responders.

The manuals, which are crafted by Lexipol's team of public sector attorneys, practitioners, and subject-matter experts, are customized to align with the specific needs and local legal requirements of agencies across the country.

But the firm also faces criticism for its blanket approach to police policies and pushback on reforms.

The data, a sample of which was given to the Daily Dot by a group referring to itself as "the puppygirl hacker polycule," includes approximately 8,543 files related to training, procedural, and policy manuals, as well as customer records that contain names, usernames, agency names, hashed passwords, physical addresses, email addresses, and phone numbers.

[...] The full dataset was provided by the hackers to DDoSecrets, the non-profit journalist and data leak hosting collective, which notes that "Lexipol retains copyright over all manuals which it creates despite the public nature of its work."

"There is little transparency on how decisions are made to draft their policies," the non-profit said, "which have an oversized influence on policing in the United States."

Some departments proactively publish their policy manuals online, while others keep them hidden from public view. One of the leaked manuals seen by the Daily Dot from the Orville Police Department in Ohio, for example, was not available online. Yet a nearly identical manual from Ohio's Beachwood Police Department can be found on the city's website.

The manuals cover matters ranging from the use of force and non-lethal alternatives to rules surrounding confidential informants and high-speed chases.

Given Lexipol's status as a private company, the widespread adoption of such manuals has led to concerns over its influence on public policing policies. The centralization, critics argue, could result in standardized policies that do not accurately represent the needs or values of local communities.

As noted by the Texas Law Review, "although there are other private, nonprofit, and government entities that draft police policies, Lexipol is now a dominant force in police policymaking across the country."

[...] Founded by two former police officers-turned-lawyers in 2003, Lexipol has increased its customer base significantly over the years. The company has also caught the attention of civil liberties groups that have accused Lexipol of helping violent officers evade justice by crafting policies that provide broad discretion in use-of-force situations.

The company has been accused of discriminatory profiling as well. In 2017, the American Civil Liberties Union (ACLU) sent a letter to Lexipol demanding that it "eliminate illegal and unclear directives that can lead to racial profiling and harassment of immigrants."

"The policies include guidelines that are unconstitutional and otherwise illegal, and can lead to improper detentions and erroneous arrests," the ACLU said at the time, highlighting directives Lexipol issued cops that indicated they had more leeway to arrest immigrants than the law allowed.

- https://ddosecrets.com/article/lexipolleaks
- https://www.beachwoodohio.com/622/Police-Manual-Core-Policies
- https://texaslawreview.org/lexipol/
- https://www.repository.law.indiana.edu/ilj/vol97/iss1/1/
- https://www.motherjones.com/criminal-justice/2020/08/lexipol-police-policy-company/
- https://www.aclu.org/press-releases/faulty-lexipol-policies-expose-police-departments-costly-lawsuits-aclu-wa-and-nwirp


Original Submission

posted by hubie on Friday February 14, @02:04AM   Printer-friendly
from the boogity-boogity-boogity-let's-stay-in-line-drivers dept.

NASCAR's first points race of 2025 is the Daytona 500, which is on February 16. The Daytona 500 is NASCAR's most prestigious race, with a unique style of racing known as pack racing. This is characterized by many cars running at very high speeds in large packs, and often massive wrecks referred to as the "Big One". However, many drivers and fans have been critical of rules changes in recent years leading to racing at NASCAR's biggest oval tracks that they describe as boring.

Bobby Allison's 210 mph crash at the 1987 Winston 500 forever changed how NASCAR races at superspeedways. Allison's car became airborne, severely damaged the catch fence along the frontstretch at Talladega, and almost flew into the stands. NASCAR decided the speeds had become too fast at their two largest and highest-banked ovals, Daytona and Talladega, and implemented restrictor plates at those tracks starting in 1988. Restrictor plates reduce the air intake into the engine, reducing both horsepower and speeds.

Although drafting had always been powerful at superspeedways, the changes caused the cars to race in large packs, often with 20 or 30 cars within a couple of seconds of each other. Although NASCAR says that this is necessary to prevent the worst wrecks, pack racing often leads to large multi-car wrecks. Drivers also complain that winning restrictor plate races is influenced too heavily by luck, though this is disputed.

When a car drives forward, it displaces the air with its nose, creating high pressure at the front of the car and low pressure behind the car in its turbulent wake. The combination of high pressure in front and low pressure behind creates a rearward-pointing pressure gradient force (PGF), which is drag and slows the car down. If another car rides in the wake of the lead car, it experiences lower pressure on its nose, reducing the drag and allowing the car to go faster. However, if the trailing car puts its nose right behind the rear bumper of the lead car, it increases the pressure behind the lead car, reducing the lead car's drag as well. When the cars are in close proximity, both leading and trailing cars benefit from the draft. When NASCAR reduced the horsepower at superspeedways, the draft became particularly powerful, and the fastest way around the track was now in a group of cars driving bumper-to-bumper.

The result is often a large pack of cars, two or three wide, driving around the track at full throttle with speeds around 190 mph. One of the best ways to pass in pack racing is for a car to back up to the bumper of the car behind it, get pushed forward to increase its speed, and then get out of line to try to move forward. For this strategy to work, either that car has to get back in the draft soon before drag slows it down too much, or it needs other cars to also get out of line and start a new line. The result is a style of racing that leads to cars making aggressive moves at high speeds, and it can be spectacular to watch. However, in recent years and especially since the introduction of NASCAR's next-gen Cup Series car in 2022, the racing at superspeedways has been criticized as boring.

Although it's difficult to find detailed historical engine specs, for much of the restrictor plate era, cars might have 750 horsepower at most tracks but be limited to 450 horsepower at superspeedways. More recently, NASCAR has been increasing the power at superspeedways while adding more aerodynamic drag to slow the cars down. However, this means the drag is more severe when a car gets out of line, and a single car will drop back quickly. This makes it much more difficult for cars to pass without multiple cars getting out of line at once.
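
The trade-off can be seen with back-of-the-envelope numbers. At top speed, engine power roughly balances aerodynamic drag power, so speed scales with the cube root of power divided by drag area. The drag-area and power figures below are illustrative assumptions, not NASCAR specifications; the point is that adding power while also adding drag can leave pack speeds roughly unchanged, while the drag penalty for pulling out of line gets worse.

```python
# Rough terminal-speed model: engine power ~ drag power at top speed, so
#   0.5 * rho * CdA * v**3 = P   =>   v = (2 * P / (rho * CdA)) ** (1/3)
# All figures below are illustrative assumptions, not official specs.

RHO = 1.2          # air density, kg/m^3
HP_TO_W = 745.7    # watts per horsepower
MPS_TO_MPH = 2.237

def top_speed_mph(horsepower: float, drag_area_m2: float) -> float:
    power_w = horsepower * HP_TO_W
    v_mps = (2 * power_w / (RHO * drag_area_m2)) ** (1 / 3)
    return v_mps * MPS_TO_MPH

print(f"{top_speed_mph(450, 0.90):.0f} mph")  # restrictor-plate era: low power, moderate drag
print(f"{top_speed_mph(600, 1.20):.0f} mph")  # more power plus more drag: about the same speed
print(f"{top_speed_mph(600, 1.00):.0f} mph")  # same package tucked in the draft (lower effective drag area)
```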

Driver Denny Hamlin has also said that higher drag in the next-gen car leads to poorer fuel mileage, which in turn leads to slower speeds to conserve fuel and less passing. Instead of making aggressive moves to pass, cars tend to ride around in line for much of the race, leading to a style of racing that many describe as boring. Suggestions to improve the racing include reducing drag, lowering horsepower in the engines, and either adjusting the lengths of race stages or eliminating stage racing altogether at superspeedways.


Original Submission

posted by hubie on Thursday February 13, @09:15PM   Printer-friendly

https://phys.org/news/2025-02-money-distance-theory.html

Two of the most commonly accepted theories for the origin of money are the commodity theory and the chartalist theory. Both have drawbacks, but in recent years, the chartalist theory has gained much traction.

A recent study by archaeologist Dr. Mikael Fauvelle, published in the Journal of Archaeological Method and Theory, proposes that a third theory, examining external factors, may better explain the origin of money in pre-state societies.

Traditionally, two theories on the origin of money exist. The first is the commodity theory, which proposes that money was developed to facilitate internal barter between community members. The premise is that trading one good for another good that you desire is inefficient and unreliable, as you cannot guarantee that the trading partner has the goods you want or that they want the goods you are offering. However, money mitigates this problem.

This theory has recently come under scrutiny as ethnographic and historical studies show that pure barter systems are rare and that most traditional societies use exchange networks based on trust and delayed reciprocity.

Meanwhile, chartalist theory focuses on the role of money as a unit of account, arguing that money was imposed by the state to facilitate taxation, tribute collection, and financing of wars. However, this theory falls flat when looking at pre-state societies that did not tax or had no tribute to collect.

These two theories are often presented as opposing each other. However, not only are they not necessarily mutually exclusive, but both tend to treat money as having the same definition in ancient societies as it has today, namely as a medium of exchange, a unit of account, a standard of value, and a store of value.

Dr. Fauvelle provides evidence that supports a third theory, the so-called "Trade Money Theory." The theory proposes that it was not internal barter problems that money was used to solve but rather long-distance external exchange networks that could not rely on familiar, trust-based relationships of delayed reciprocity.

To support this theory, Dr. Fauvelle examines the money systems of two pre-state societies. "I focused on shell beads in Western North America and Bronze Money in Europe as these are two well-documented case studies with considerable evidence for widespread trade and monetary economies predating the development of ancient states."

Journal Reference: Mikael Fauvelle, The Trade Theory of Money: External Exchange and the Origins of Money, Journal of Archaeological Method and Theory (2025). DOI: 10.1007/s10816-025-09694-9


Original Submission

posted by hubie on Thursday February 13, @04:29PM   Printer-friendly
from the git-clone-enshittify-??-profit! dept.

WikiTok cures boredom in spare moments with wholesome swipe-up Wikipedia article discovery:

On Wednesday, a New York-based app developer named Isaac Gemal debuted a new site called WikiTok, where users can vertically swipe through an endless stream of Wikipedia article stubs in a manner similar to the interface for video-sharing app TikTok.

It's a neat way to stumble upon interesting information randomly, learn new things, and spend spare moments of boredom without reaching for an algorithmically addictive social media app. To be fair, WikiTok is addictive in its own way, but without an invasive algorithm tracking you and pushing you toward lowest-common-denominator content. It's also thrilling because you never know what's going to pop up next.

WikiTok, which works through mobile and desktop browsers, feeds visitors a random list of Wikipedia articles—culled from the Wikipedia API—into a vertically scrolling interface. Despite the name that hearkens to TikTok, there are currently no videos involved. Each entry is accompanied by an image pulled from the corresponding article. If you see something you like, you can tap "Read More," and the full Wikipedia page on the topic will open in your browser.
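
WikiTok's code is on GitHub, but the basic loop is easy to sketch against the public Wikimedia REST API, whose /page/random/summary endpoint returns a random article's title, extract, and canonical URL. The snippet below is an illustrative text-only feed, not WikiTok's actual implementation (which is a React/TypeScript web app), and the User-Agent string is a placeholder.

```python
import json
import urllib.request

# Wikimedia's REST API endpoint for a random article summary.
RANDOM_SUMMARY_URL = "https://en.wikipedia.org/api/rest_v1/page/random/summary"

def random_article() -> dict:
    req = urllib.request.Request(
        RANDOM_SUMMARY_URL,
        headers={"User-Agent": "wikitok-style-demo/0.1 (contact: example@example.org)"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# A bare-bones text "feed": press Enter to swipe to the next article.
while True:
    article = random_article()
    print(f"\n{article['title']}\n{article.get('extract', '')}")
    print(article["content_urls"]["desktop"]["page"])  # the "Read More" link
    if input("[Enter] for next, q to quit: ").strip().lower() == "q":
        break
```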

For now, the feed is truly random, and Gemal is resisting calls to tailor the stream of articles to each user's expressed interests.

"I have had plenty of people message me and even make issues on my GitHub asking for some insane crazy WikiTok algorithm," Gemal told Ars. "And I had to put my foot down and say something along the lines that we're already ruled by ruthless, opaque algorithms in our everyday life; why can't we just have one little corner in the world without them?"

[...] Gemal posted the code for WikiTok on GitHub, so anyone can modify or contribute to the project. Right now, the web app supports 14 languages, article previews, and article sharing on both desktop and mobile browsers. New features may arrive as contributors add them. It's based on a tech stack that includes React 18, TypeScript, Tailwind CSS, and Vite.

And so far, he is sticking to his vision of a free way to enjoy Wikipedia without being tracked and targeted. "I have no grand plans for some sort of insane monetized hyper-calculating TikTok algorithm," Gemal told us. "It is anti-algorithmic, if anything."


Original Submission

posted by hubie on Thursday February 13, @11:43AM   Printer-friendly
from the damage-not-wind-speeds dept.

Tornado strength is rated from 0 (weakest) to 5 (strongest) on the Enhanced Fujita (EF) scale, with roughly 2% of tornadoes being rated EF4 or EF5. The EF scale replaced the older Fujita scale to provide much more fine-grained detail in determining a tornado's rating. The EF5 rating corresponds to estimated peak winds of 200+ mph. However, it is purely a damage scale, from which the peak winds in the tornado are later estimated. Although meteorologists often discuss the wind speeds in tornadoes, measured wind speeds are never a factor in rating tornadoes.

This distinction was made apparent on April 26, 1991 when the Andover, Kansas tornado was rated F5 while the Red Rock, Oklahoma tornado was rated F4 despite likely being the stronger tornado. A mobile radar from the University of Oklahoma measured 270+ mph winds in the Red Rock tornado, well into the F5 range, and the strongest tornado winds that had ever been measured to date. However, because the Red Rock tornado remained over mostly rural areas unlike the Andover tornado, there was little opportunity for it to do severe enough damage to be rated F5. This distinction remains true with the EF scale, where the 2013 El Reno, Oklahoma tornado was originally rated EF5 on the basis of mobile radar observations, then downgraded to EF3 based on the lack of EF4 or EF5 damage in damage surveys.

A new article in the Bulletin of the American Meteorological Society discusses the current "drought" in EF5 tornadoes, with that rating being most recently assigned to the 2013 Moore, Oklahoma tornado that happened just 11 days before the 2013 El Reno tornado. The lack of EF5 tornadoes for over 11 years has raised questions of why, and if the EF5 rating is essentially obsolete.

The journal paper argues that the probability of going 11 years without a single EF5 tornado is only about 0.3%, so it's very unlikely that there have truly been zero EF5-strength tornadoes during that period. Instead, the gap is probably due to stricter application of the EF scale standards, and several tornadoes during that period were estimated to have peak winds of 190+ mph. If those tornadoes were reclassified as EF5, the record would be statistically consistent with the previous climatology of EF5 tornadoes. The authors note that some previous EF5 ratings, such as the 2011 Joplin, Missouri tornado, were based on damage indicators that were not part of the EF scale specifications.
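
The statistics behind that claim are a straightforward Poisson calculation: if EF5 tornadoes occur independently at some long-run average rate, the probability of seeing none at all in 11 years is e^(-11λ). The rate used below is an assumed illustrative value, not a figure taken from the paper, chosen to show how an 11-year gap becomes a sub-1% event.

```python
import math

def p_no_events(rate_per_year: float, years: float) -> float:
    """Poisson probability of observing zero events over the given window."""
    return math.exp(-rate_per_year * years)

assumed_ef5_rate = 0.53   # assumed long-run EF5 rate per year, for illustration only
print(f"{p_no_events(assumed_ef5_rate, 11):.2%}")   # ~0.29%, in line with the paper's ~0.3%
```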

One of the biggest reasons for not assigning an EF5 rating is the presence of areas with limited damage very close to near-total devastation. However, the strongest tornadoes are generally multi-vortex tornadoes, in which the strongest winds are found within small vortices embedded in a broader tornadic circulation. This could explain the proximity of extreme damage to areas with much less damage. Damage severity also depends on how long structures are exposed to extreme winds; the 1997 Jarrell, Texas tornado, rated F5, did more severe damage in part because it moved slowly and exposed buildings to tornadic winds for longer than usual. All of this raises the question of whether the EF5 rating is obsolete as the EF scale is currently applied, and whether it's time to again revise how meteorologists rate tornado strength.


Original Submission

posted by hubie on Thursday February 13, @06:57AM   Printer-friendly

https://spectrum.ieee.org/aluminum-battery

This sustainable, solid-state electrolyte design outlives lithium-ion batteries

Electric vehicles (EVs) and green energy sources rely heavily on batteries to store electricity. Currently, more than 75 percent of the world's energy storage depends on batteries that contain lithium, an expensive mineral that's subject to volatile pricing. Lithium-ion (Li-ion) batteries themselves can be volatile, too, because they use a flammable electrolyte that can catch fire when overcharged.

Now, a group of scientists based in Beijing believes that aluminum offers a better solution. Aluminum is the third-most abundant mineral in the Earth's crust and costs about one-quarter as much as lithium. And if built right, aluminum-based batteries may offer longer life expectancy and a safer, more sustainable design than their volatile counterparts. Led by scientists from the Beijing Institute of Technology and the University of Science and Technology Beijing, the group has found a way to stabilize aluminum batteries that can last far longer.

Aluminum-ion (Al-ion) batteries have been the subject of research for years. But previous attempts have generally used ionic liquid electrolytes, which can lead to anode corrosion, especially in humid conditions. Other researchers have used gel polymer electrolytes, halfway between liquid and solid-state alternatives, but these tend to have low conductivity. This team of researchers took a different approach and added a pinch of salt—namely, an inert aluminum fluoride salt—to a liquid electrolyte containing aluminum ions, creating a solid-state electrolyte.

Well, more than a pinch of salt, really. The salt has a porous 3D structure, which allows it to act like a rigid sponge that absorbs and stabilizes the liquid, yet still allows the ions to move more freely. This increases conductivity of the material, and the result is a solid composite material that cannot leak. The researchers also coated the electrodes with a thin layer of material that helps prevent crystals of aluminum from forming, which would degrade battery performance over time.

"Our research shows that a stable, recyclable solid-state electrolyte can improve aluminum-ion batteries by solving issues like corrosion, safety, and long-cycle life, making them a potential alternative to lithium-based batteries," says Shuqiang Jiao, a professor of electrochemical engineering at the University of Science and Technology Beijing.


Original Submission

posted by janrinok on Thursday February 13, @02:12AM   Printer-friendly
from the what-happened-to-learn-to-code? dept.

AI increases unemployment rates in US IT sector:

The increasing use of artificial intelligence (AI) has continued to have a negative impact on the information technology (IT) job market in the US, with unemployment rates rising in this vital sector.

According to the US newspaper The Wall Street Journal (WSJ), the unemployment rate in the US IT sector rose from 3.9% in December 2024 to 5.7% in January 2025 as a result of increasing reliance on automation and AI technologies. The paper notes that the number of unemployed IT workers rose from 98,000 in December 2024 to 152,000 in January 2025.

According to economic experts, labor market data, and specialized reports, job losses in the technology sector can be attributed in part to the impact of AI, as the emergence of generative AI has led to huge amounts of spending by giant technology companies on AI infrastructure instead of new jobs in the IT field, the newspaper added.

The WSJ said that "jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration."

"As they start looking at AI, theyre also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return," the WSJ added, indicating that companies are betting that AI will bring economic benefits to companies, whether in terms of improving efficiency or reducing costs.

"Increased corporate investment in AI has shown early signs of leading to future cuts in hiring, a concept some tech leaders are starting to call "cost avoidance." Rather than hiring new workers for tasks that can be more easily automated, some businesses are letting AI take on that work and reaping potential savings," WSJ said.

According to experts, the latest IT jobs numbers come as unemployment among white-collar workers remains at its highest levels since 2020.

"What weve really seen, especially in the last year or so, is a bifurcation in opportunities, where white-collar knowledge worker type jobs have had far less employer demand than jobs that are more in-person, skilled labor jobs," WSJ added.



Original Submission

IT Unemployment Rises to 5.7% as AI Hits Tech Jobs:

The unemployment rate in the information technology sector rose from 3.9% in December to 5.7% in January, well above last month's overall jobless rate of 4%, in the latest sign of how automation and the increasing use of artificial intelligence are having a negative impact on the tech labor market.

The number of unemployed IT workers rose from 98,000 in December to 152,000 last month, according to a report from consulting firm Janco Associates based on data from the U.S. Department of Labor.

Job losses in tech can be attributed in part to the influence of AI, according to Victor Janulaitis, chief executive of Janco Associates. The emergence of generative AI has produced massive amounts of spending by tech giants on AI infrastructure, but not necessarily new jobs in IT.

"Jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration," Janulaitis said. "As they start looking at AI, they're also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return."

[...] Another reason for January's tech job losses was that companies began implementing some intended spending cuts for this year, Janulaitis said, and many slashed budgets based on what the economy looked like during fiscal planning last year.

Layoffs have also continued at some large tech companies. Last month, Meta Platforms said it would cut 5% of its workforce in performance-based job cuts in the U.S., and on Wednesday enterprise software giant Workday said it would cut about 8.5% of its workforce.

Alternate link


Original Submission

posted by janrinok on Wednesday February 12, @09:28PM   Printer-friendly
from the closing-in-on-the-"AI"-bubble-bursting dept.

The Beeb decided to test some LLMs to see how well they could summarize the news (https://www.bbc.com/news/articles/c0m17d8827ko). Turns out the answer is "not very well".

In the study, the BBC asked ChatGPT, Copilot, Gemini and Perplexity to summarise 100 news stories and rated each answer. It got journalists who were relevant experts in the subject of the article to rate the quality of answers from the AI assistants. It found 51% of all AI answers to questions about the news were judged to have significant issues of some form. Additionally, 19% of AI answers which cited BBC content introduced factual errors, such as incorrect factual statements, numbers and dates.

[...] In her blog, Ms Turness said the BBC was seeking to "open up a new conversation with AI tech providers" so we can "work together in partnership to find solutions".

She called on the tech companies to "pull back" their AI news summaries, as Apple did after complaints from the BBC that Apple Intelligence was misrepresenting news stories.

Some examples of inaccuracies found by the BBC included:

  • Gemini incorrectly said the NHS did not recommend vaping as an aid to quit smoking
  • ChatGPT and Copilot said Rishi Sunak and Nicola Sturgeon were still in office even after they had left
  • Perplexity misquoted BBC News in a story about the Middle East, saying Iran initially showed "restraint" and described Israel's actions as "aggressive"

In general, Microsoft's Copilot and Google's Gemini had more significant issues than OpenAI's ChatGPT and Perplexity, which counts Jeff Bezos as one of its investors. Normally, the BBC blocks its content from AI chatbots, but it opened its website up for the duration of the tests in December 2024. The report said that as well as containing factual inaccuracies, the chatbots "struggled to differentiate between opinion and fact, editorialised, and often failed to include essential context."

Normally I'd add a snide remark, but I don't think I need to this time...


Original Submission

posted by janrinok on Wednesday February 12, @04:43PM   Printer-friendly
from the attractive-endeavor dept.

Arthur T Knackerbracket has processed the following story:

In mid-January, a top United States materials company announced that it had started to manufacture rare earth magnets. It was important news—there are no large U.S. makers of the neodymium magnets that underpin huge and vitally important commercial and defense industries, including electric vehicles. But it created barely a ripple during a particularly loud and stormy time in U.S. trade relations.

The press release, from MP Materials, was light on details. The company disclosed that it had started producing the magnets, called neodymium-iron-boron (NdFeB), on a “trial” basis and that the factory would begin gradually ramping up production before the end of this year. According to MP’s spokesman, Matt Sloustcher, the facility will have an initial capacity of 1,000 tonnes per annum, and has the infrastructure in place to scale up to 2,000 to 3,000 tonnes per year. The release also said that the facility, in Fort Worth, Texas, would supply magnets to General Motors and other U.S. manufacturers.

The Texas facility, which MP Materials has named Independence, is not the only major rare-earth-magnet project in the U.S. Most notably, Vacuumschmelze GmbH, a magnet maker based in Hanau, Germany, has begun constructing a plant in South Carolina through a North American subsidiary, e-VAC Magnetics. To build the US $500 million factory, the company secured $335 million in outside funds, including at least $100 million from the U.S. government. (E-VAC, too, has touted a supply agreement with General Motors for its future magnets.)

In another intriguing U.S. rare-earth magnet project, Noveon Magnetics, in San Marcos, Texas, is currently producing what it claims are “commercial quantities” of NdFeB magnets. However, the company is not making the magnets in the standard way, starting with metal alloys, but rather in a unique process based on recycling the materials from discarded magnets. USA Rare Earth announced on 8 January that it had manufactured a small amount of NdFeB magnets at a plant in Stillwater, Oklahoma.

Yet another company, Quadrant Magnetics, announced in January, 2022, that it would begin construction on a $100 million NdFeB magnet factory in Louisville, Kentucky. However, 11 months later, U.S. federal agents arrested three of the company’s top executives, charging them with passing off Chinese-made magnets as locally produced and giving confidential U.S. military data to Chinese agencies.

The multiple US neodymium-magnet projects are noteworthy but even collectively they won’t make a noticeable dent in China’s dominance. “Let me give you a reality check,” says Steve Constantinides, an IEEE member and magnet-industry consultant based in Honeoye, N.Y. “The total production of neo magnets was somewhere between 220 and 240 thousand tonnes in 2024,” he says, adding that 85 percent of the total, at least, was produced in China. And “the 15 percent that was not made in China was made in Japan, primarily, or in Vietnam.” (Other estimates put China’s share of the neodymium magnet market as high as 90 percent.)

But look at the figures from a different angle, suggests MP Materials’s Sloustcher. “The U.S. imports just 7,000 tonnes of NdFeB magnets per year,” he points out. “So in total, these [U.S.] facilities can supplant a significant percentage of U.S. imports, help re-start an industry, and scale as the production of motors and other magnet-dependent industries” returns to the United States, he argues.

And yet, it’s hard not to be a little awed by China’s supremacy. The country has some 300 manufacturers of rare-earth permanent magnets, according to Constantinides. The largest of these, JL MAG Rare-Earth Co. Ltd., in Ganzhou, produced at least 25,000 tonnes of neodymium magnets last year, Constantinides figures. (The company recently announced that it was building another facility, to begin operating in 2026, that it says will bring its installed capacity to 60,000 tonnes a year.)

That 25,000 tonnes figure is comparable to the combined output of all of the rare-earth magnet makers that aren’t in China. The $500-million e-VAC plant being built in South Carolina, for example, is reportedly designed to produce around 1,500 tonnes a year.

But even those numbers do not fully convey China’s dominance of permanent magnet manufacturing. Wherever a factory is, making neodymium magnets requires supplies of rare-earth metal, and that nearly always leads straight back to China. “Even though they only produce, say, 85 percent of the magnets, they are producing 97 percent of the metal” in the world, says Constantinides. “So the magnet manufacturers in Japan and Europe are highly dependent on the rare-earth metal coming from China.”

And there, at least, MP Materials may have an interesting edge. Hardly any firms, even in China, do what MP is attempting: produce finished magnets starting with ore that the company mines itself. Even large companies typically perform just one or at most two of the four major steps along the path to making a rare-earth magnet: mining the ore, refining the ore into rare-earth oxides, reducing the oxides to metals, and then, finally, using the metals to make magnets. Each step is an enormous undertaking requiring entirely different equipment, processes, knowledge, and skill sets.

“The one advantage they get from [doing it all] is that they get better insights into how different markets are actually growing,” says Stan Trout, a magnet industry consultant in Denver, Colorado. “Getting the timing right on any expansion is important,” Trout adds. “And so MP should be getting that information as well as anybody, with the different plants that they have, because they interact with the market in several different ways and can really see what demand is like in real time, rather than as some projection in a forecast.”

Still, it’s going to be an uphill climb. “There are a lot of both hard and soft subsidies in the supply chain in China,” says John Ormerod, an industry consultant based in Knoxville, Tenn. “It’s going to be difficult for a US manufacturer to compete with the current price levels of Chinese-made magnets,” he concludes.

And it’s not going to get better any time soon. China’s rare-earth magnet makers are only using about 60 percent of their production capacity, according to both Constantinides and Ormerod—and yet they are continuing to build new plants. “There’s going to be roughly 500,000 tonnes of capacity by the end of this year,” says Ormerod, citing figures gathered by Singapore-based analyst Thomas Kruemmer. “The demand is only about 50 percent of that.”

The upshot, all of the analysts agree, will be downward price pressure on rare earth magnets in the near future, at least. At the same time, the U.S. Department of Defense has made it a requirement that rare-earth magnets for its systems must be produced entirely, starting with ore, in “friendly” countries—which does not include China. “The DoD will need to pay a premium over cheaper imported magnets to establish a price floor enabling domestic U.S. producers to successfully and continuously supply the DoD,” says Constantinides.


Original Submission

posted by janrinok on Wednesday February 12, @11:59AM   Printer-friendly

In a 1985 paper, the computer scientist Andrew Yao, who would go on to win the A.M. Turing Award, asserted that among hash tables with a specific set of properties, the best way to find an individual element or an empty spot is to just go through potential spots randomly, an approach known as uniform probing. He also stated that, in the worst-case scenario, where you're searching for the last remaining open spot, you can never do better than x, where x describes how full the table is (a table that is (1 - 1/x) full). For 40 years, most computer scientists assumed that Yao's conjecture was true.
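
To see why x is the natural yardstick: in a table that is (1 - 1/x) full, each uniform probe lands on an empty slot with probability about 1/x, so an insertion needs roughly x probes on average, and the worst case grows with x as well. The simulation below is a simplified illustration of that baseline (probes are drawn with replacement, a close approximation for large tables), not the construction from the new paper.

```python
import random

def avg_probes_to_insert(n: int, x: int, trials: int = 300) -> float:
    """Average number of uniform probes needed to find an empty slot in a
    table of n slots that is (1 - 1/x) full. Probes are drawn uniformly with
    replacement, which closely approximates a random probe sequence when n
    is large."""
    empty = n // x
    table = [True] * (n - empty) + [False] * empty   # True = occupied slot
    total = 0
    for _ in range(trials):
        probes = 1
        while table[random.randrange(n)]:            # keep probing until an empty slot
            probes += 1
        total += probes
    return total / trials

for x in (2, 10, 100):
    print(f"x = {x:>3}: ~{avg_probes_to_insert(100_000, x):6.1f} probes (theory: {x})")
```

The averages track x closely; the new non-uniform scheme described below beats this baseline, bringing the worst case down to about (log x)^2.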

Krapivin, an undergraduate, was not held back by the conventional wisdom for the simple reason that he was unaware of it. "I did this without knowing about Yao's conjecture," he said. His explorations with tiny pointers led to a new kind of hash table — one that did not rely on uniform probing. And for this new hash table, the time required for worst-case queries and insertions is proportional to (log x)^2 — far faster than x. This result directly contradicted Yao's conjecture.

[...] "It's not just that they disproved [Yao's conjecture], they also found the best possible answer to his question," said Sepehr Assadi of the University of Waterloo.

[...] In addition to refuting Yao's conjecture, the new paper also contains what many consider an even more astonishing result. It pertains to a related, though slightly different, situation: In 1985, Yao looked not only at the worst-case times for queries, but also at the average time taken across all possible queries. He proved that hash tables with certain properties — including those that are labeled "greedy," which means that new elements must be placed in the first available spot — could never achieve an average time better than log x.

[...] They showed that it did not by providing a counterexample, a non-greedy hash table with an average query time that's much, much better than log x. In fact, it doesn't depend on x at all. "You get a number," Farach-Colton said, "something that is just a constant and doesn't depend on how full the hash table is." The fact that you can achieve a constant average query time, regardless of the hash table's fullness, was wholly unexpected — even to the authors themselves.

(https://www.quantamagazine.org/undergraduate-upends-a-40-year-old-data-science-conjecture-20250210/)


Original Submission

posted by janrinok on Wednesday February 12, @07:12AM   Printer-friendly

Handful of users claim new Nvidia GPUs are melting power cables again:

Here we (maybe) go again: a handful of early adopters of Nvidia's new GeForce RTX 5090 graphics card are reporting that their power cables are melting (so far, there's at least one report on YouTube and one on Reddit, as reported by The Verge). This recalls a similar situation from early in the RTX 4090's life cycle, when power connectors were melting and even catching fire, damaging the GPUs and power supplies.

After much investigation and many guesses from Nvidia and other testers, the 4090's power connector issues ended up being blamed on what was essentially user error; the 12VHPWR connectors were not being inserted all the way into the socket on the GPU or were being bent in a way that created stress on the connection, which caused the connectors to run hot and eventually burst into flames.

The PCI-SIG, the standards body responsible for the design of the new connector, claimed that the design of the 12VHPWR connector itself was sound and that any problems with it should be attributed to the manufacturers implementing the standard. Partly in response to the 4090 issues, the 12VHPWR connector was replaced by an updated standard called 12V-2x6, which uses the same cables and is pin-compatible with 12VHPWR, but which tweaked the connector to ensure that power is only actually delivered if the connectors are firmly seated. The RTX 50-series cards use the 12V-2x6 connector.

The 12VHPWR and 12V-2x6 connectors are both designed to solve a real problem: delivering hundreds of watts of power to high-end GPUs over a single cable rather than trying to fit multiple 8-pin power connectors onto these GPUs. In theory, swapping two to four 8-pin connectors for a single 12V-2x6 or 12VHPWR connector cuts down on the amount of board space OEMs must reserve for these connectors in their designs and the number of cables that users have to snake through the inside of their gaming PCs.
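
The arithmetic behind that trade-off is simple: current equals power divided by voltage, shared across the connector's 12 V contacts. The sketch below uses assumed wattages for illustration (600 W over a single 12V-2x6 cable with six 12 V pins, 150 W over a conventional 8-pin plug with three 12 V pins); the point is that the newer connector runs each pin much closer to its limit, so anything that concentrates current on fewer pins, such as a partially seated connector, heats up quickly.

```python
# Back-of-the-envelope pin loading: I = P / V, shared across the 12 V contacts.
# Wattages and pin counts below are illustrative assumptions, not measurements.

def amps_per_pin(watts: float, volts: float = 12.0, power_pins: int = 6) -> float:
    """Current per 12 V contact, assuming the load is shared evenly."""
    return watts / volts / power_pins

print(f"{amps_per_pin(600):.1f} A/pin - 600 W over a 12V-2x6 cable, all six pins sharing evenly")
print(f"{amps_per_pin(600, power_pins=4):.1f} A/pin - same load if only four pins make good contact")
print(f"{amps_per_pin(150, power_pins=3):.1f} A/pin - one conventional 150 W 8-pin plug")
```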

But while Nvidia, Intel, AMD, Qualcomm, Arm, and other companies are all PCI-SIG members and all had a hand in the design of the new standards, Nvidia is the only GPU company to use the 12VHPWR and 12V-2x6 connectors in most of its GPUs. AMD and Intel have continued to use the 8-pin power connector, and even some of Nvidia's partners have stuck with 8-pin connectors for lower-end, lower-power cards like the RTX 4060 and 4070 series.

Both of the reported 5090 incidents involved third-party cables, one from custom PC part manufacturer MODDIY and one included with an FSP power supply, rather than the first-party 8-pin adapter that Nvidia supplies with GeForce GPUs. It's much too early to say whether these cables (or Nvidia, or the design of the connector, or the affected users) caused the problem or whether this was just a coincidence.


Original Submission

posted by janrinok on Wednesday February 12, @01:29AM   Printer-friendly

Boeing has informed its employees of uncertainty in future SLS contracts:

The primary contractor for the Space Launch System rocket, Boeing, is preparing for the possibility that NASA cancels the long-running program.

On Friday, with less than an hour's notice, David Dutcher, Boeing's vice president and program manager for the SLS rocket, scheduled an all-hands meeting for the approximately 800 employees working on the program. The apparently scripted meeting lasted just six minutes, and Dutcher didn't take questions.

During his remarks, Dutcher said Boeing's contracts for the rocket could end in March and that the company was preparing for layoffs in case the contracts with the space agency were not renewed. "Cold and scripted" is how one person described Dutcher's demeanor.

The aerospace company, which is the primary contractor for the rocket's large core stage, issued the notifications as part of the Worker Adjustment and Retraining Notification (or WARN) Act, which requires US employers with 100 or more full-time employees to provide a 60-day notice in advance of mass layoffs or plant closings.

"To align with revisions to the Artemis program and cost expectations, today we informed our Space Launch Systems team of the potential for approximately 400 fewer positions by April 2025," a Boeing spokesperson told Ars. "This will require 60-day notices of involuntary layoff be issued to impacted employees in coming weeks, in accordance with the Worker Adjustment and Retraining Notification Act. We are working with our customer and seeking opportunities to redeploy employees across our company to minimize job losses and retain our talented teammates."

The timing of Friday's hastily called meeting aligns with the anticipated release of President Trump's budget proposal for fiscal-year 2026. This may not be an entire plan but rather a "skinny" budget that lays out a wish list of spending requests for Congress and some basic economic projections. Congress does not have to act on Trump's budget priorities.

Multiple sources said there has been a healthy debate within the White House and senior leadership at NASA, including acting administrator Janet Petro, about the future of the SLS rocket and the Artemis Moon program. Some commercial space advocates have been pressing hard to cancel the rocket outright. Petro has been urging the White House to allow NASA to fly the Artemis II and Artemis III missions using the initial version of the SLS rocket before the program is canceled.

Critics of the large and expensive rocket—a single launch costs in excess of $2 billion, exclusive of any payloads or the cost of ground systems—say NASA should cut its losses. Keeping the SLS rocket program around for the first lunar landing would actually bog down progress, these critics say, because large contractors such as Boeing would be incentivized to slow down work and drag out funding with their cost-plus contracts for as long as possible.

On Saturday, a day after this story was published, NASA released a statement saying the SLS rocket remains an "essential component" of the Artemis campaign. "NASA and its industry partners continuously work together to evaluate and align budget, resources, contractor performance, and schedules to execute mission requirements efficiently, safely, and successfully in support of NASA's Moon to Mars goals and objectives," a spokesperson said. "NASA defers to its industry contractors for more information regarding their workforces."

Friday's all-hands meeting indicates that Boeing executives believe there is at least the possibility that the Trump White House will propose ending the SLS rocket as part of its budget proposal in March.

The US Congress, in concert with senior leaders at NASA, directed the space agency to develop the SLS rocket in 2011. Built to a significant degree from components of the space shuttle, including its main engines and side-mounted boosters, the SLS rocket was initially supposed to launch by the end of 2016. It did not make its debut flight until the end of 2022.

NASA has spent approximately $3 billion a year developing the rocket and its ground systems over the program's lifetime. While the agency has handed out guaranteed contracts to Boeing, Northrop Grumman, Aerojet, and other contractors, the government's rocket-building enterprise has been superseded by private industry. SpaceX has developed two heavy-lift rockets in the last decade, and Blue Origin just launched its own, the New Glenn booster. Each of these rockets is at least partially reusable and flies at less than one-tenth the cost of the SLS rocket.

[...]


Original Submission