Several sites are noticing a joke (for now) petition for Denmark to take pesky California off the US' hands:
Have you ever looked at a map and thought, "You know what Denmark needs? More sunshine, palm trees, and roller skates." Well, we have a once-in-a-lifetime opportunity to make that dream a reality.
Let's buy California from Donald Trump!
Yes, you heard that right.
California could be ours, and we need your help to make it happen.
See also English language articles like "Danes offer to buy California to spite Trump's Greenland aims: 'We'll bring hygge to Hollywood'" at The Guardian and "Petition for Denmark to buy California for $1 trillion surpasses 200,000 signatures" at CBS, among others.
I think we need a "Humor" topic. Can you better this?
https://phys.org/news/2025-02-flies-play-carousel.html
In a recent study, scientists at Leipzig University have for the first time demonstrated play-like behavior in flies. They found that fruit flies (Drosophila melanogaster) voluntarily and repeatedly visited a carousel.
"Until now, play-like behavior has mainly been described in vertebrates," says Professor Wolf Huetteroth, who led the study at the Institute of Biology at Leipzig University and recently moved to Northumbria University in Newcastle, England, as an associate professor. He and his colleagues have just published their findings in the journal Current Biology.
The play-like behavior of the flies described by the researchers, involving voluntary passive movements such as swinging, bobbing, sliding or turning, has now been demonstrated in insects for the first time. "This could help us to find out how we humans also develop efficient self-awareness of our bodies," explains Huetteroth.
In collaboration with Northumbria University, the researchers conducted a detailed analysis of how the flies interacted with the carousel. While many flies avoided the carousel, others visited it repeatedly and for long periods. When two carousels rotated alternately, the flies even actively followed the stimulation.
The scientists placed a total of 190 individual flies in a carousel arena, a glass dome about one centimeter high, and then filmed them for 3 to 14 days. The positions of the flies in the recordings were then automatically recognized and tracked using special software. Only a fraction of the data generated was included in the study.
More information: Tilman Triphan et al, Play-like behavior exhibited by the vinegar fly Drosophila melanogaster, Current Biology (2025).
DOI: 10.1016/j.cub.2025.01.025
The doge.gov website that was spun up to track Elon Musk's cuts to the federal government is insecure and pulls from a database that can be edited by anyone, according to two separate people who found the vulnerability and shared it with 404 Media. One coder added at least two database entries that are visible on the live site and say "this is a joke of a .gov site" and "THESE 'EXPERTS' LEFT THEIR DATABASE OPEN -roro."
Doge.gov was hastily deployed after Elon Musk told reporters Tuesday that his Department of Government Efficiency is "trying to be as transparent as possible. In fact, our actions—we post our actions to the DOGE handle on X, and to the DOGE website." At the time, doge.gov was essentially a blank webpage. It was built out further Wednesday and Thursday, and now shows a mirror of the @DOGE X account posts, as well as various stats about the U.S. government's federal workforce.
Arthur T Knackerbracket has processed the following story:
Following an RTX 5090 melting incident just a few days ago, YouTuber Der8auer (AKA Roman Hartung) contacted the affected party and managed to acquire the damaged graphics card, power cable, and even the PSU for investigation. While the user was absolutely sure no user error was involved, many blamed the use of a custom cable from MODDIY, rather than the official Nvidia adapter, for the plastic-melting failure.
Upon further analysis, Der8auer revealed a critically damaged wire, noting that its condition was far worse than the others'. Roman then put his own RTX 5090 FE to the test, only to find that of the six 12V wires, one drew over 22A of current, breaching safety limits, with temperatures north of 150 degrees Celsius.
Back when the news broke, the Reddit OP revealed that they were using a third-party 16-pin cable from MODDIY instead of the official one included in the GPU box. This predictably led to some backlash, with many blaming the supposedly inferior quality of the cable as the root cause of the damage. Der8auer believes the criticism is unjustified, citing his positive experiences with the cable company and its reputation in the DIY community.
In any case, the damage had already been done, and it was now time to investigate the crime scene. Roman captured high-quality microscopic shots of both ends of the melted cable, the GPU's connector, and even the damaged PSU. While horrifying, the damage is fairly standard considering this is not the first such case we've seen. As noted above, significant damage to one particular wire prompted further investigation. Roman paired his latest custom-water-block-equipped RTX 5090 FE with Corsair's AX1600i PSU, making sure to double-check that the GPU connector was seated properly. The GPU was put through its paces in FurMark, where it was seen drawing around 570W of power.
Just 45 seconds into the test, two of the six 12V wires shot up to nearly 60 degrees Celsius. On the PSU end, Roman witnessed a hotspot of almost 130 degrees Celsius, spiking to over 150 degrees Celsius after just four minutes. With the help of a current clamp, he found one 12V wire carrying over 22 Amperes of current, equivalent to 264W of power. For context, the 12VHPWR and 12V-2x6 standards allow a maximum of 9.5 Amperes through a single pin. The reported current readings for the remaining five wires were 2A (24W), 5A (60W), 11A (132W), 8A (96W), and 3A (36W), with a moderate margin of error, as it's hard to get precise measurements across all wires concurrently.
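The arithmetic behind those figures is simple Ohm's-law bookkeeping. Here is a minimal sketch that recomputes the per-wire power from the reported clamp readings, assuming a nominal 12V rail (the exact rail voltage under load is an assumption):

```python
# Recompute per-wire power from the reported clamp readings.
# Assumes a nominal 12V rail; actual voltage under load will differ slightly.
RAIL_VOLTAGE = 12.0   # volts (assumed nominal)
PIN_LIMIT_A = 9.5     # per-pin maximum under the 12VHPWR/12V-2x6 spec

currents_a = [22.0, 2.0, 5.0, 11.0, 8.0, 3.0]  # amps, one per 12V wire

for i, amps in enumerate(currents_a, start=1):
    watts = RAIL_VOLTAGE * amps  # P = V * I
    flag = "OVER SPEC" if amps > PIN_LIMIT_A else "ok"
    print(f"wire {i}: {amps:5.1f} A -> {watts:6.1f} W [{flag}]")

total_a = sum(currents_a)
print(f"total: {total_a:.1f} A -> {RAIL_VOLTAGE * total_a:.1f} W")
# ~51 A and ~612 W total, against the ~570 W FurMark draw -- consistent
# within the stated margin of measurement error.
```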
In short, uneven current distribution leads to dangerously high temperatures, which can potentially burn or melt the cable and damage connected components. In isolation, this incident could have been swept under the rug as a one-off; however, Roman's near one-to-one recreation of the problem suggests there's something else at play here.
Previously: Handful of Users Claim New Nvidia GPUs Are Melting Power Cables Again
Arthur T Knackerbracket has processed the following story:
This week is the AI Action Summit in Paris, and the European Union is using it as an opportunity to take a deep dive into the growing sector. The bloc has announced it's putting €200 billion ($206 billion) toward AI development. This figure includes €20 billion ($20.6 billion) for AI gigafactories that process and train large models.
European Commission President Ursula von der Leyen announced the plan, called InvestAI, at the AI Action Summit on Tuesday. She pushed the position that Europe isn't late to the competition against China and the US. "The frontier is constantly moving, leadership is still up for grabs, and behind the frontier is the whole world of AI adoption," von der Leyen stated. "Bringing AI to industry-specific applications and harnessing its power for productivity and for people, and this is where Europe can truly lead the race."
The news follows France's announcement that private investments are funneling €109 billion ($112.5 billion) into its AI ecosystem. The country is also committing a gigawatt of nuclear power to an AI computing project led by FluidStack, which will use Nvidia-made chips.
January was a big month for AI growth in the US and China. In the US, OpenAI and SoftBank announced a $500 billion partnership called Stargate to create AI infrastructure. Then Chinese AI assistant DeepSeek exploded onto the global stage, with the company claiming it offers the same quality as its competitors but cost a lot less to build.
Surprise surprise, we've done it again. We've demonstrated an ability to compromise significantly sensitive networks, including governments, militaries, space agencies, cyber security companies, supply chains, software development systems and environments, and more:
Arguably armed still with a somewhat inhibited ability to perceive risk and seemingly no fear, in November 2024, we decided to prove out the scenario of a significant Internet-wide supply chain attack caused by abandoned infrastructure. This time however, we dropped our obsession with expired domains, and instead shifted our focus to Amazon's S3 buckets.
It's important to note that although we focused on Amazon's S3 for this endeavour, this research challenge, approach and theme is cloud-provider agnostic and applicable to any managed storage solution. Amazon's S3 just happened to be the first storage solution we thought of, and we're certain this same challenge would apply to any customer/organization usage of any storage solution provided by any cloud provider.
The TL;DR is that this time, we ended up discovering ~150 Amazon S3 buckets that had previously been used across commercial and open source software products, governments, and infrastructure deployment/update pipelines - and then abandoned.
Naturally, we registered them, just to see what would happen - "how many people are really trying to request software updates from S3 buckets that appear to have been abandoned months or even years ago?", we naively thought to ourselves.
[...] These S3 buckets received more than 8 million HTTP requests over a 2 month period for all sorts of things -
- Software updates,
- Pre-compiled (unsigned!) Windows, Linux and macOS binaries,
- Virtual machine images (?!),
- JavaScript files,
- CloudFormation templates,
- SSLVPN server configurations,
- and more.
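Checking whether a bucket name referenced by old software is still claimed is easy to sketch. The snippet below is a minimal illustration of the idea, not the researchers' actual tooling, and the bucket names are hypothetical; it probes anonymously with boto3's head_bucket call:

```python
# Probe whether an S3 bucket still exists. A 404 means the name is unclaimed
# (and could be re-registered by anyone); a 403 means it exists but is private.
# Bucket names below are hypothetical examples.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError

s3 = boto3.client("s3", region_name="us-east-1",
                  config=Config(signature_version=UNSIGNED))

def bucket_status(name: str) -> str:
    try:
        s3.head_bucket(Bucket=name)
        return "exists (publicly accessible)"
    except ClientError as err:
        code = err.response["Error"]["Code"]
        if code == "404":
            return "does not exist -- name is claimable"
        if code == "403":
            return "exists (access denied)"
        return f"unexpected error ({code})"

for name in ["example-update-feed", "example-firmware-bucket"]:  # hypothetical
    print(name, "->", bucket_status(name))
```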
The article goes on to describe where the requests came from and provides some details on getting the word to the right companies and what actions they took. Originally spotted on Schneier on Security.
Arthur T Knackerbracket has processed the following story:
A federal judge appointed by a Republican president has castigated Trump administration officials and ordered them to immediately restore public health websites that they abruptly shut down.
The lawsuit against the website removal, brought by a group of physicians known as Doctors for America, concerns sites operated by the Department of Health and Human Services, the Centers for Disease Control and Prevention and the Food and Drug Administration. Doctors for America says the scrubbing of the sites makes it more difficult for them to treat patients.
U.S. District Judge John Bates, appointed by George W. Bush, agreed. He ordered the government to restore the pages by the end of the day of his ruling (Tuesday, Feb. 11).
The TL;DR of Bates' ruling? It "was done without any public rationale, recourse or ability to challenge the decisions, despite laws and regulations that typically require them," as Politico summarized.
Sites removed by Trump officials concern HIV care, plus information on contraception drugs and student health. In their lawsuit filed against the Office for Personnel Management, HHS, CDC, and the FDA, Doctors for America says the removal of websites offering them guidance on these subjects is creating confusion, which eats up time that is better spent treating patients.
Justice Department attorneys defended the government's decision to remove the sites, saying doctors could still access the information by using the Wayback Machine, which archives offline websites. But that didn't fly with the judge.
"The Wayback Machine does not capture every webpage, and there is no information to suggest that is has archived each removed webpage," Bates wrote. "Additionally, pages archived on the Wayback Machine do not appear on search engines. In other words, a particular archived webpage is only viewable to a provider if the provider knows that the Wayback Machine exists and had recorded the pre-removal URL of the requested webpage."
https://phys.org/news/2025-02-earth-core-solid-previously-thought.html
The surface of the Earth's inner core may be changing, as shown by a new study by USC scientists that detected structural changes near the planet's center, published in Nature Geoscience.
Changes in the inner core have long been a topic of debate among scientists. However, most research has focused on assessing its rotation. John Vidale, Dean's Professor of Earth Sciences at the USC Dornsife College of Letters, Arts and Sciences and principal investigator of the study, said the researchers "didn't set out to define the physical nature of the inner core."
"What we ended up discovering is evidence that the near surface of Earth's inner core undergoes structural change," Vidale said. The finding sheds light on the role topographical activity plays in rotational changes in the inner core that have minutely altered the length of a day and may relate to the ongoing slowing of the inner core.
Located 3,000 miles below the Earth's surface, the inner core is anchored by gravity within the molten liquid outer core. Until now, the inner core was widely thought of as a solid sphere.
The original aim of the USC scientists was to further chart the slowing of the inner core. "But as I was analyzing multiple decades' worth of seismograms, one dataset of seismic waves curiously stood out from the rest," Vidale said. "Later on, I'd realize I was staring at evidence the inner core is not solid."
The study utilized seismic waveform data—including 121 repeating earthquakes from 42 locations near Antarctica's South Sandwich Islands that occurred between 1991 and 2024—to give a glimpse of what takes place in the inner core.
As the researchers analyzed the waveforms from receiver-array stations located near Fairbanks, Alaska, and Yellowknife, Canada, one dataset of seismic waves from the latter station included uncharacteristic properties the team had never seen before.
"At first the dataset confounded me," Vidale said. It wasn't until his research team improved the resolution technique did it become clear the seismic waveforms represented additional physical activity of the inner core.
The physical activity is best explained as temporal changes in the shape of the inner core. The new study indicates that the near surface of the inner core may undergo viscous deformation, changing its shape and shifting at the inner core's shallow boundary.
The clearest cause of the structural change is interaction between the inner and outer core. "The molten outer core is widely known to be turbulent, but its turbulence had not been observed to disrupt its neighbor the inner core on a human timescale," Vidale said. "What we're observing in this study for the first time is likely the outer core disturbing the inner core."
Vidale said the discovery opens a door to reveal previously hidden dynamics deep within Earth's core, and may lead to better understanding of Earth's thermal and magnetic field.
More information: John Vidale, Annual-scale variability in both the rotation rate and near surface of Earth's inner core, Nature Geoscience (2025).
DOI: 10.1038/s41561-025-01642-2. www.nature.com/articles/s41561-025-01642-2
Critics accuse the company of wielding outsized private influence on public policing
Hackers leaked thousands of files from Lexipol, a Texas-based company that develops policy manuals, training bulletins, and consulting services for first responders.
The manuals, which are crafted by Lexipol's team of public sector attorneys, practitioners, and subject-matter experts, are customized to align with the specific needs and local legal requirements of agencies across the country.
But the firm also faces criticism for its blanket approach to police policies and pushback on reforms.
The data, a sample of which was given to the Daily Dot by a group referring to itself as "the puppygirl hacker polycule," includes approximately 8,543 files related to training, procedural, and policy manuals, as well as customer records that contain names, usernames, agency names, hashed passwords, physical addresses, email addresses, and phone numbers.
[...] The full dataset was provided by the hackers to DDoSecrets, the non-profit journalist and data leak hosting collective, which notes that "Lexipol retains copyright over all manuals which it creates despite the public nature of its work."
"There is little transparency on how decisions are made to draft their policies," the non-profit said, "which have an oversized influence on policing in the United States."
Some departments proactively publish their policy manuals online, while others keep them hidden from public view. One of the leaked manuals seen by the Daily Dot, from the Orrville Police Department in Ohio, for example, was not available online. Yet a nearly identical manual from Ohio's Beachwood Police Department can be found on the city's website.
The manuals cover matters ranging from the use of force and non-lethal alternatives to rules surrounding confidential informants and high-speed chases.
Given Lexipol's status as a private company, the widespread adoption of such manuals has led to concerns over its influence on public policing policies. The centralization, critics argue, could result in standardized policies that do not accurately represent the needs or values of local communities.
As noted by the Texas Law Review, "although there are other private, nonprofit, and government entities that draft police policies, Lexipol is now a dominant force in police policymaking across the country."
[...] Founded by two former police officers-turned-lawyers in 2003, Lexipol has increased its customer base significantly over the years. The company has also caught the attention of civil liberties groups that have accused Lexipol of helping violent officers evade justice by crafting policies that provide broad discretion in use-of-force situations.
The company has been accused of discriminatory profiling as well. In 2017, the American Civil Liberties Union (ACLU) sent a letter to Lexipol demanding that it "eliminate illegal and unclear directives that can lead to racial profiling and harassment of immigrants."
"The policies include guidelines that are unconstitutional and otherwise illegal, and can lead to improper detentions and erroneous arrests," the ACLU said at the time, highlighting directives Lexipol issued cops that indicated they had more leeway to arrest immigrants than the law allowed.
- https://ddosecrets.com/article/lexipolleaks
- https://www.beachwoodohio.com/622/Police-Manual-Core-Policies
- https://texaslawreview.org/lexipol/
- https://www.repository.law.indiana.edu/ilj/vol97/iss1/1/
- https://www.motherjones.com/criminal-justice/2020/08/lexipol-police-policy-company/
- https://www.aclu.org/press-releases/faulty-lexipol-policies-expose-police-departments-costly-lawsuits-aclu-wa-and-nwirp
NASCAR's first points race of 2025 is the Daytona 500, which is on February 16. The Daytona 500 is NASCAR's most prestigious race, featuring a unique style of racing known as pack racing. This is characterized by many cars running at very high speeds in large packs, often producing massive wrecks referred to as the "Big One". However, many drivers and fans have been critical of rules changes in recent years, arguing they have made racing at NASCAR's biggest oval tracks boring.
Bobby Allison's 210 mph crash at the 1987 Winston 500 forever changed how NASCAR races at superspeedways. Allison's car became airborne, severely damaged the catch fence along the frontstretch at Talladega, and almost flew into the stands. NASCAR decided the speeds had become too fast at their two largest and highest-banked ovals, Daytona and Talladega, and implemented restrictor plates at those tracks starting in 1988. Restrictor plates reduce the air intake into the engine, reducing both horsepower and speeds.
Although drafting had always been powerful at superspeedways, the changes caused the cars to race in large packs, often with 20 or 30 cars within a couple of seconds of each other. Although NASCAR says this is necessary to prevent the worst wrecks, pack racing often leads to large multi-car wrecks. Drivers also complain that winning restrictor plate races is influenced too heavily by luck, though this is disputed.
When a car drives forward, it displaces the air with its nose, creating high pressure at the front of the car and low pressure behind the car in its turbulent wake. The combination of high pressure in front and low pressure behind creates a rearward-pointing pressure gradient force (PGF), which is drag, and it slows the car down. If another car rides in the wake of the lead car, it experiences lower pressure on its nose, reducing its drag and allowing it to go faster. However, if the trailing car puts its nose right behind the rear bumper of the lead car, it increases the pressure behind the lead car, reducing the lead car's drag as well. When the cars are in close proximity, both the leading and trailing cars benefit from the draft. When NASCAR reduced the horsepower at superspeedways, the draft became particularly powerful, and the fastest way around the track was now in a group of cars driving bumper-to-bumper.
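To put rough numbers on that pressure argument, the standard drag equation F_d = 0.5 * rho * v^2 * Cd * A shows how much force is at stake at pack-racing speeds. The sketch below uses illustrative values; the drag coefficients and frontal area are assumptions for the example, not published NASCAR figures:

```python
# Illustrative drag comparison at ~190 mph using F_d = 0.5 * rho * v^2 * Cd * A.
# Cd and frontal area values are assumed for the example only.
RHO = 1.225        # air density, kg/m^3
V = 85.0           # ~190 mph, in m/s
AREA = 2.0         # frontal area, m^2 (assumed)
CD_ALONE = 0.40    # drag coefficient running alone (assumed)
CD_DRAFT = 0.28    # effective Cd tucked into the draft (assumed ~30% lower)

def drag_force(cd: float) -> float:
    return 0.5 * RHO * V**2 * cd * AREA  # newtons

alone = drag_force(CD_ALONE)
drafting = drag_force(CD_DRAFT)
print(f"alone:    {alone:6.0f} N")
print(f"drafting: {drafting:6.0f} N ({100 * (1 - drafting / alone):.0f}% less drag)")
# Drag power is P = F * v, so the force saved in the draft translates
# directly into the speed advantage of a bumper-to-bumper line.
```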
The result is often a large pack of cars, two or three wide, driving around the track at full throttle with speeds around 190 mph. One of the best ways to pass in pack racing is for a car to back up to the bumper of the car behind it, get pushed forward to increase its speed, and then get out of line to try to move forward. For this strategy to work, either that car has to get back in the draft soon before drag slows it down too much, or it needs other cars to also get out of line and start a new line. The result is a style of racing that leads to cars making aggressive moves at high speeds, and it can be spectacular to watch. However, in recent years and especially since the introduction of NASCAR's next-gen Cup Series car in 2022, the racing at superspeedways has been criticized as boring.
Although it's difficult to find detailed historical engine specs, for much of the restrictor plate era, cars might have 750 horsepower at most tracks but be limited to 450 horsepower at superspeedways. More recently, NASCAR has been increasing the power at superspeedways while adding more aerodynamic drag to slow the cars down. However, this means the drag is more severe when a car gets out of line, and a single car will drop back quickly. This makes it much more difficult for cars to pass without multiple cars getting out of line at once.
Driver Denny Hamlin also said that higher drag in the next-gen car leads to poorer fuel mileage, leading to slower speeds to conserve fuel, and less passing. Instead of making aggressive moves to pass, cars tend to ride around in line for much of the race leading to a style of racing that many describe as boring. Suggestions to improve the racing include reducing drag, lowering horsepower in the engines, and either adjusting the lengths of race stages or eliminating stage racing altogether at superspeedways.
https://phys.org/news/2025-02-money-distance-theory.html
Two of the most commonly accepted theories for the origin of money are the commodity theory and the chartalist theory. Both have drawbacks, but in recent years, the chartalist theory has gained much traction.
A recent study by archaeologist Dr. Mikael Fauvelle, published in the Journal of Archaeological Method and Theory, proposes that a third theory, examining external factors, may better explain the origin of money in pre-state societies.
Traditionally, two theories on the origin of money exist. The first is the commodity theory, which proposes that money was developed to facilitate internal barter between community members. The premise is that trading one good for another good that you desire is inefficient and unreliable, as you cannot guarantee that the trading partner has the goods you want or that they want the goods you are offering. However, money mitigates this problem.
This theory has recently come under scrutiny as ethnographic and historical studies show that pure barter systems are rare and that most traditional societies use exchange networks based on trust and delayed reciprocity.
Meanwhile, chartalist theory focuses on the role of money as a unit of account, arguing that money was imposed by the state to facilitate taxation, tribute collection, and financing of wars. However, this theory falls flat when looking at pre-state societies that did not tax or had no tribute to collect.
These two theories are often presented as opposing each other. However, not only are they not necessarily mutually exclusive, but both tend to view money as having the same definition in ancient societies as it has today: a medium of exchange, a unit of account, a standard of value, and a store of value.
Dr. Fauvelle provides evidence that supports a third theory, the so-called "Trade Money Theory." The theory proposes that it was not internal barter problems that money was used to solve but rather long-distance external exchange networks that could not rely on familiar, trust-based relationships of delayed reciprocity.
To support this theory, Dr. Fauvelle examines the money systems of two pre-state societies. "I focused on shell beads in Western North America and Bronze Money in Europe as these are two well-documented case studies with considerable evidence for widespread trade and monetary economies predating the development of ancient states."
Journal Reference: Mikael Fauvelle, The Trade Theory of Money: External Exchange and the Origins of Money, Journal of Archaeological Method and Theory (2025). DOI: 10.1007/s10816-025-09694-9
WikiTok cures boredom in spare moments with wholesome swipe-up Wikipedia article discovery:
On Wednesday, a New York-based app developer named Isaac Gemal debuted a new site called WikiTok, where users can vertically swipe through an endless stream of Wikipedia article stubs in a manner similar to the interface for video-sharing app TikTok.
It's a neat way to stumble upon interesting information randomly, learn new things, and spend spare moments of boredom without reaching for an algorithmically addictive social media app. To be fair, WikiTok is addictive in its own way, but without an invasive algorithm tracking you and pushing you toward lowest-common-denominator content. It's also thrilling because you never know what's going to pop up next.
WikiTok, which works through mobile and desktop browsers, feeds visitors a random list of Wikipedia articles—culled from the Wikipedia API—into a vertically scrolling interface. Despite the name that hearkens to TikTok, there are currently no videos involved. Each entry is accompanied by an image pulled from the corresponding article. If you see something you like, you can tap "Read More," and the full Wikipedia page on the topic will open in your browser.
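Wikipedia's REST API makes this kind of feed simple to build: it has a random-summary endpoint that returns a title, a text extract, and a thumbnail in one call. The snippet below is a generic illustration of the approach, not WikiTok's actual code:

```python
# Fetch random Wikipedia article summaries, the raw material for a
# WikiTok-style feed. Illustrative only; WikiTok itself is React/TypeScript.
import json
import urllib.request

ENDPOINT = "https://en.wikipedia.org/api/rest_v1/page/random/summary"

def random_article() -> dict:
    req = urllib.request.Request(ENDPOINT, headers={"User-Agent": "feed-demo/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

for _ in range(3):  # three "swipes"
    page = random_article()
    thumb = page.get("thumbnail", {}).get("source", "(no image)")
    print(page["title"], "|", thumb)
    print(page.get("extract", "")[:120])
```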
For now, the feed is truly random, and Gemal is resisting calls to automatically tailor the stream of articles to each user's interests.
"I have had plenty of people message me and even make issues on my GitHub asking for some insane crazy WikiTok algorithm," Gemal told Ars. "And I had to put my foot down and say something along the lines that we're already ruled by ruthless, opaque algorithms in our everyday life; why can't we just have one little corner in the world without them?"
[...] Gemal posted the code for WikiTok on GitHub, so anyone can modify or contribute to the project. Right now, the web app supports 14 languages, article previews, and article sharing on both desktop and mobile browsers. New features may arrive as contributors add them. It's based on a tech stack that includes React 18, TypeScript, Tailwind CSS, and Vite.
And so far, he is sticking to his vision of a free way to enjoy Wikipedia without being tracked and targeted. "I have no grand plans for some sort of insane monetized hyper-calculating TikTok algorithm," Gemal told us. "It is anti-algorithmic, if anything."
Tornado strength is rated from 0 (weakest) to 5 (strongest) on the Enhanced Fujita (EF) scale, with roughly 2% of tornadoes being rated EF4 or EF5. The EF scale replaced the older Fujita scale to provide much more fine-grained detail in determining a tornado's rating. The EF5 rating corresponds to estimated peak winds of 200+ mph. However, it is purely a damage scale, from which the peak winds in the tornado are later estimated. Although meteorologists often discuss the wind speeds in tornadoes, measured wind speeds are never a factor in rating tornadoes.
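For reference, the National Weather Service publishes estimated peak-wind bands for each EF rating; the sketch below encodes them, with the caveat the scale's definition imposes: surveys rate damage first, and wind speed is then estimated from the rating, never the reverse.

```python
# EF-scale estimated peak wind bands (3-second gusts, mph) per the
# National Weather Service. Ratings are assigned from damage surveys;
# these winds are the *estimates* that follow from a rating.
EF_BANDS = [
    (0, 65, 85),
    (1, 86, 110),
    (2, 111, 135),
    (3, 136, 165),
    (4, 166, 200),
    (5, 201, float("inf")),
]

def ef_band_for_wind(mph: float) -> int | None:
    """EF band a wind estimate falls in (None if below EF0)."""
    for rating, low, high in EF_BANDS:
        if low <= mph <= high:
            return rating
    return None

# The 270+ mph measured at Red Rock in 1991 falls in the top band,
# yet that tornado was rated F4, because ratings follow damage.
print(ef_band_for_wind(270))  # -> 5
```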
This distinction was made apparent on April 26, 1991 when the Andover, Kansas tornado was rated F5 while the Red Rock, Oklahoma tornado was rated F4 despite likely being the stronger tornado. A mobile radar from the University of Oklahoma measured 270+ mph winds in the Red Rock tornado, well into the F5 range, and the strongest tornado winds that had ever been measured to date. However, because the Red Rock tornado remained over mostly rural areas unlike the Andover tornado, there was little opportunity for it to do severe enough damage to be rated F5. This distinction remains true with the EF scale, where the 2013 El Reno, Oklahoma tornado was originally rated EF5 on the basis of mobile radar observations, then downgraded to EF3 based on the lack of EF4 or EF5 damage in damage surveys.
A new article in the Bulletin of the American Meteorological Society discusses the current "drought" in EF5 tornadoes, with that rating being most recently assigned to the 2013 Moore, Oklahoma tornado that happened just 11 days before the 2013 El Reno tornado. The lack of EF5 tornadoes for over 11 years has raised questions of why, and if the EF5 rating is essentially obsolete.
The journal paper argues that the probability of an 11-year stretch with no EF5 tornadoes is roughly 0.3%, so it's very unlikely that there have truly been zero EF5-strength tornadoes during that period. Instead, it's probable that the drought is due to stricter application of the EF scale standards; several tornadoes during that period were estimated to have peak winds of 190+ mph. If those tornadoes were reclassified as EF5, it would be statistically consistent with the previous climatology of EF5 tornadoes. The authors note that some of the previous EF5 ratings, such as that of the 2011 Joplin, Missouri tornado, were based on damage indicators that were not part of the EF scale specifications.
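The 0.3% figure follows from treating EF5 tornadoes as a Poisson process. A sketch of the reasoning, assuming a long-run rate of about 0.53 EF5s per year (a rate chosen here to reproduce the quoted figure, not a number taken from the paper):

```python
# If EF5 tornadoes arrive as a Poisson process at ~0.53 per year (an
# assumed rate for illustration), P(zero in 11 years) = e^(-lambda * t).
import math

RATE_PER_YEAR = 0.53  # assumed long-run EF5 rate
GAP_YEARS = 11

p_zero = math.exp(-RATE_PER_YEAR * GAP_YEARS)
print(f"P(no EF5 for {GAP_YEARS} years) = {p_zero:.4f}")  # ~0.003, i.e. ~0.3%
```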
One of the biggest reasons for not assigning an EF5 rating is the presence of areas with limited damage very close to near-total devastation. However, the strongest tornadoes are generally multi-vortex tornadoes, where the strongest winds are found within small vortices embedded in a broader tornadic circulation. This could explain the proximity of extreme damage to areas with much less damage. Damage severity also depends on how long structures are exposed to extreme winds: the 1997 Jarrell, Texas tornado, for example, was rated F5, and its damage was made more severe by the tornado's slow movement, which exposed buildings to extreme winds for longer than usual. This raises the question of whether the EF5 rating is obsolete as the EF scale is currently applied, and whether it's time to again revise how meteorologists rate tornado strength.
https://spectrum.ieee.org/aluminum-battery
This sustainable, solid-state electrolyte design outlives lithium-ion batteries
Electric vehicles (EVs) and green energy sources rely heavily on batteries to store electricity. Currently, more than 75 percent of the world's energy storage depends on batteries that contain lithium, an expensive mineral that's subject to volatile pricing. Lithium-ion (Li-ion) batteries themselves can be volatile, too, because they use a flammable electrolyte that can catch fire when overcharged.
Now, a group of scientists based in Beijing believes that aluminum offers a better solution. Aluminum is the third-most abundant element in the Earth's crust and costs about one-quarter as much as lithium. And if built right, aluminum-based batteries may offer longer life expectancy and a safer, more sustainable design than their volatile counterparts. Led by scientists from the Beijing Institute of Technology and the University of Science and Technology Beijing, the group has found a way to stabilize aluminum batteries so that they last far longer.
Aluminum-ion (Al-ion) batteries have been the subject of research for years. But previous attempts have generally used ionic liquid electrolytes, which can lead to anode corrosion, especially in humid conditions. Other researchers have used gel polymer electrolytes, halfway between liquid and solid-state alternatives, but these tend to have low conductivity. This team of researchers took a different approach and added a pinch of salt—namely, an inert aluminum fluoride salt—to a liquid electrolyte containing aluminum ions, creating a solid-state electrolyte.
Well, more than a pinch of salt, really. The salt has a porous 3D structure, which allows it to act like a rigid sponge that absorbs and stabilizes the liquid, yet still allows the ions to move more freely. This increases conductivity of the material, and the result is a solid composite material that cannot leak. The researchers also coated the electrodes with a thin layer of material that helps prevent crystals of aluminum from forming, which would degrade battery performance over time.
"Our research shows that a stable, recyclable solid-state electrolyte can improve aluminum-ion batteries by solving issues like corrosion, safety, and long-cycle life, making them a potential alternative to lithium-based batteries," says Shuqiang Jiao, a professor of electrochemical engineering at the University of Science and Technology Beijing.
AI increases unemployment rates in US IT sector:
The increasing use of artificial intelligence (AI) has continued to have a negative impact on the information technology (IT) job market in the US, with unemployment rising in this vital sector.
According to the US newspaper The Wall Street Journal (WSJ), the unemployment rate in the US IT sector rose from 3.9% in December 2024 to 5.7% in January 2025 as a result of increasing reliance on automation and AI technologies. The number of unemployed IT workers rose from 98,000 in December 2024 to 152,000 in January 2025.
According to economic experts, labor market data, and specialized reports, job losses in the technology sector can be attributed in part to the impact of AI: the emergence of generative AI has led giant technology companies to spend huge amounts on AI infrastructure rather than on new IT jobs, the newspaper added.
The WSJ said that "jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration."
"As they start looking at AI, theyre also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return," the WSJ added, indicating that companies are betting that AI will bring economic benefits to companies, whether in terms of improving efficiency or reducing costs.
"Increased corporate investment in AI has shown early signs of leading to future cuts in hiring, a concept some tech leaders are starting to call "cost avoidance." Rather than hiring new workers for tasks that can be more easily automated, some businesses are letting AI take on that work and reaping potential savings," WSJ said.
According to experts, the latest IT jobs numbers come as unemployment among white-collar workers remains at its highest levels since 2020.
"What weve really seen, especially in the last year or so, is a bifurcation in opportunities, where white-collar knowledge worker type jobs have had far less employer demand than jobs that are more in-person, skilled labor jobs," WSJ added.
See also:
IT Unemployment Rises to 5.7% as AI Hits Tech Jobs:
The unemployment rate in the information technology sector rose from 3.9% in December to 5.7% in January, well above last month's overall jobless rate of 4%, in the latest sign of how automation and the increasing use of artificial intelligence are having a negative impact on the tech labor market.
The number of unemployed IT workers rose from 98,000 in December to 152,000 last month, according to a report from consulting firm Janco Associates based on data from the U.S. Department of Labor.
Job losses in tech can be attributed in part to the influence of AI, according to Victor Janulaitis, chief executive of Janco Associates. The emergence of generative AI has produced massive amounts of spending by tech giants on AI infrastructure, but not necessarily new jobs in IT.
"Jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration," Janulaitis said. "As they start looking at AI, they're also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return."
[...] Another reason for January's tech job losses was that companies began implementing some intended spending cuts for this year, Janulaitis said, and many slashed budgets based on what the economy looked like during fiscal planning last year.
Layoffs have also continued at some large tech companies. Last month, Meta Platforms said it would cut 5% of its workforce in performance-based job cuts in the U.S., and on Wednesday enterprise software giant Workday said it would cut about 8.5% of its workforce.