https://phys.org/news/2025-02-money-distance-theory.html
Two of the most commonly accepted theories for the origin of money are the commodity theory and the chartalist theory. Both have drawbacks, but in recent years, the chartalist theory has gained much traction.
A recent study by archaeologist Dr. Mikael Fauvelle, published in the Journal of Archaeological Method and Theory, proposes that a third theory, examining external factors, may better explain the origin of money in pre-state societies.
Traditionally, two theories on the origin of money exist. The first is the commodity theory, which proposes that money was developed to facilitate internal barter between community members. The premise is that trading one good for another good that you desire is inefficient and unreliable, as you cannot guarantee that the trading partner has the goods you want or that they want the goods you are offering. However, money mitigates this problem.
This theory has recently come under scrutiny as ethnographic and historical studies show that pure barter systems are rare and that most traditional societies use exchange networks based on trust and delayed reciprocity.
Meanwhile, chartalist theory focuses on the role of money as a unit of account, arguing that money was imposed by the state to facilitate taxation, tribute collection, and the financing of wars. However, this theory falls flat when looking at pre-state societies that levied no taxes and collected no tribute.
These two theories are often presented as opposing each other. Yet not only can they coexist; both also tend to assume that money had the same definition in ancient societies as it has today, namely as a medium of exchange, a unit of account, a standard of value, and a store of value.
Dr. Fauvelle provides evidence that supports a third theory, the so-called "Trade Money Theory." The theory proposes that it was not internal barter problems that money was used to solve but rather long-distance external exchange networks that could not rely on familiar, trust-based relationships of delayed reciprocity.
To support this theory, Dr. Fauvelle examines the money systems of two pre-state societies. "I focused on shell beads in Western North America and Bronze Money in Europe as these are two well-documented case studies with considerable evidence for widespread trade and monetary economies predating the development of ancient states."
Journal Reference: Mikael Fauvelle, The Trade Theory of Money: External Exchange and the Origins of Money, Journal of Archaeological Method and Theory (2025). DOI: 10.1007/s10816-025-09694-9
WikiTok cures boredom in spare moments with wholesome swipe-up Wikipedia article discovery:
On Wednesday, a New York-based app developer named Isaac Gemal debuted a new site called WikiTok, where users can vertically swipe through an endless stream of Wikipedia article stubs in a manner similar to the interface for video-sharing app TikTok.
It's a neat way to stumble upon interesting information randomly, learn new things, and spend spare moments of boredom without reaching for an algorithmically addictive social media app. To be fair, WikiTok is addictive in its own way, but without an invasive algorithm tracking you and pushing you toward lowest-common-denominator content. It's also thrilling because you never know what's going to pop up next.
WikiTok, which works through mobile and desktop browsers, feeds visitors a random list of Wikipedia articles—culled from the Wikipedia API—into a vertically scrolling interface. Despite the name that hearkens to TikTok, there are currently no videos involved. Each entry is accompanied by an image pulled from the corresponding article. If you see something you like, you can tap "Read More," and the full Wikipedia page on the topic will open in your browser.
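Under the hood, the concept is simple enough to approximate. Below is a minimal Python sketch (WikiTok itself is a React/TypeScript app, so this is an illustration rather than its actual code) using Wikipedia's public REST API, whose random-summary endpoint returns a title, extract, thumbnail, and article URL:

    import requests

    # Wikipedia's public REST API endpoint for a random article summary.
    # WikiTok's own implementation differs; this is only an approximation.
    RANDOM_SUMMARY_URL = "https://en.wikipedia.org/api/rest_v1/page/random/summary"

    def fetch_random_article() -> dict:
        """Fetch one random article stub: title, extract, image, and URL."""
        resp = requests.get(RANDOM_SUMMARY_URL, timeout=10)
        resp.raise_for_status()
        data = resp.json()
        return {
            "title": data["title"],
            "extract": data["extract"],
            # Not every article carries a thumbnail image.
            "image": data.get("thumbnail", {}).get("source"),
            "url": data["content_urls"]["desktop"]["page"],
        }

    if __name__ == "__main__":
        # Emulate a few "swipes" of the feed.
        for _ in range(3):
            article = fetch_random_article()
            print(article["title"], "->", article["url"])

A real feed would prefetch a queue of these in the background so the next swipe feels instant.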
For now, the feed is truly random, and Gemal is resisting calls to tailor the stream of articles to what each user expresses interest in.
"I have had plenty of people message me and even make issues on my GitHub asking for some insane crazy WikiTok algorithm," Gemal told Ars. "And I had to put my foot down and say something along the lines that we're already ruled by ruthless, opaque algorithms in our everyday life; why can't we just have one little corner in the world without them?"
[...] Gemal posted the code for WikiTok on GitHub, so anyone can modify or contribute to the project. Right now, the web app supports 14 languages, article previews, and article sharing on both desktop and mobile browsers. New features may arrive as contributors add them. It's based on a tech stack that includes React 18, TypeScript, Tailwind CSS, and Vite.
And so far, he is sticking to his vision of a free way to enjoy Wikipedia without being tracked and targeted. "I have no grand plans for some sort of insane monetized hyper-calculating TikTok algorithm," Gemal told us. "It is anti-algorithmic, if anything."
Tornado strength is rated from 0 (weakest) to 5 (strongest) on the Enhanced Fujita (EF) scale, with roughly 2% of tornadoes being rated EF4 or EF5. The EF scale replaced the older Fujita scale to provide much more fine-grained detail in determining a tornado's rating. The EF5 rating corresponds to estimated peak winds of 200+ mph. However, it is purely a damage scale, from which the peak winds in the tornado are later estimated. Although meteorologists often discuss the wind speeds in tornadoes, measured wind speeds are never a factor in rating tornadoes.
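As a rough illustration of how estimated winds map onto ratings, here is a Python sketch using the published EF-scale wind ranges (estimated three-second gusts, in mph); keep in mind that surveyors work backward from damage indicators to these wind estimates, never from direct measurements:

    # Published EF-scale lower bounds (3-second gust estimates, mph).
    # Ratings come from damage surveys; the winds are inferred, not measured.
    EF_THRESHOLDS = [(5, 201), (4, 166), (3, 136), (2, 111), (1, 86), (0, 65)]

    def ef_rating(estimated_wind_mph: float):
        """Map an estimated peak wind to an EF rating; None if below EF0."""
        for rating, lower_bound in EF_THRESHOLDS:
            if estimated_wind_mph >= lower_bound:
                return rating
        return None

    print(ef_rating(210))  # 5 -- but only if damage indicators support that estimate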
This distinction was made apparent on April 26, 1991, when the Andover, Kansas, tornado was rated F5 while the Red Rock, Oklahoma, tornado was rated F4 despite likely being the stronger tornado. A mobile radar from the University of Oklahoma measured 270+ mph winds in the Red Rock tornado, well into the F5 range and the strongest tornado winds measured up to that date. However, because the Red Rock tornado remained over mostly rural areas, unlike the Andover tornado, there was little opportunity for it to do severe enough damage to be rated F5. This distinction remains true with the EF scale: the 2013 El Reno, Oklahoma, tornado was originally rated EF5 on the basis of mobile radar observations, then downgraded to EF3 based on the lack of EF4 or EF5 damage in damage surveys.
A new article in the Bulletin of the American Meteorological Society discusses the current "drought" in EF5 tornadoes, a rating most recently assigned to the 2013 Moore, Oklahoma, tornado, which happened just 11 days before the 2013 El Reno tornado. The lack of EF5 tornadoes for over 11 years has raised the question of why, and whether the EF5 rating is essentially obsolete.
The journal paper argues that the probability of an 11-year gap in EF5 tornadoes is roughly 0.3%, so it is very unlikely that there have truly been zero EF5-strength tornadoes during that period. Instead, the gap is probably due to stricter application of the EF scale standards: several tornadoes during that period were estimated to have peak winds of 190+ mph. If those tornadoes were reclassified as EF5, the record would be statistically consistent with the previous climatology of EF5 tornadoes. The authors note that some previous EF5 ratings, such as that of the 2011 Joplin, Missouri, tornado, were based on damage indicators that were not part of the EF scale specifications.
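To see where a figure like 0.3% can come from, here is a back-of-the-envelope sketch that assumes EF5 tornadoes arrive as a simple Poisson process; the annual rates below are illustrative assumptions, not numbers from the paper:

    import math

    # Toy model: EF5 tornadoes as a Poisson process with a fixed annual rate.
    def prob_no_ef5(rate_per_year: float, years: float) -> float:
        """P(zero events in the window) under a Poisson model: e^(-rate * years)."""
        return math.exp(-rate_per_year * years)

    # Historical rates on the order of half an EF5 per year, over an 11-year gap:
    for rate in (0.4, 0.5, 0.53):
        print(f"rate={rate}/yr -> P(no EF5 in 11 yr) = {prob_no_ef5(rate, 11):.1%}")
    # A rate of about 0.53/yr reproduces a gap probability of roughly 0.3%.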
One of the biggest reasons for not assigning an EF5 rating is the presence of areas with limited damage very close to near-total devastation. However, the strongest tornadoes are generally multi-vortex tornadoes, in which the most intense winds are found within small vortices embedded in a broader tornadic circulation; this could explain the proximity of extreme damage to areas with much less damage. Damage severity also depends on how long structures are exposed to extreme winds. The 1997 Jarrell, Texas, tornado, rated F5, is an example: its damage was more severe because the tornado moved slowly, exposing buildings to its winds for longer than usual. All of this raises the question of whether the EF5 rating is obsolete as the EF scale is currently applied, and whether it is time to again revise how meteorologists rate tornado strength.
https://spectrum.ieee.org/aluminum-battery
This sustainable, solid-state electrolyte design outlives lithium-ion batteries
Electric vehicles (EVs) and green energy sources rely heavily on batteries to store electricity. Currently, more than 75 percent of the world's energy storage depends on batteries that contain lithium, an expensive mineral that's subject to volatile pricing. Lithium-ion (Li-ion) batteries themselves can be volatile, too, because they use a flammable electrolyte that can catch fire when overcharged.
Now, a group of scientists based in Beijing believes that aluminum offers a better solution. Aluminum is the third-most abundant element in the Earth's crust and costs about one-quarter as much as lithium. And if built right, aluminum-based batteries may offer longer life expectancy and a safer, more sustainable design than their volatile counterparts. Led by scientists from the Beijing Institute of Technology and the University of Science and Technology Beijing, the group has found a way to stabilize aluminum batteries so that they last far longer.
Aluminum-ion (Al-ion) batteries have been the subject of research for years. But previous attempts have generally used ionic liquid electrolytes, which can lead to anode corrosion, especially in humid conditions. Other researchers have used gel polymer electrolytes, halfway between liquid and solid-state alternatives, but these tend to have low conductivity. This team of researchers took a different approach and added a pinch of salt—namely, an inert aluminum fluoride salt—to a liquid electrolyte containing aluminum ions, creating a solid-state electrolyte.
Well, more than a pinch of salt, really. The salt has a porous 3D structure, which allows it to act like a rigid sponge: it absorbs and stabilizes the liquid while still allowing the ions to move freely. This increases the conductivity of the material, and the result is a solid composite that cannot leak. The researchers also coated the electrodes with a thin layer of material that helps prevent aluminum crystals from forming, which would otherwise degrade battery performance over time.
"Our research shows that a stable, recyclable solid-state electrolyte can improve aluminum-ion batteries by solving issues like corrosion, safety, and long-cycle life, making them a potential alternative to lithium-based batteries," says Shuqiang Jiao, a professor of electrochemical engineering at the University of Science and Technology Beijing.
AI increases unemployment rates in US IT sector:
The increasing use of artificial intelligence (AI) has continued to have a negative impact on the information technology (IT) job market in the US, with unemployment rising in this vital sector.
According to the US newspaper the Wall Street Journal (WSJ), the unemployment rate in the US IT sector rose from 3.9% in December 2024 to 5.7% in January 2025 as a result of the increasing reliance on automation and the use of AI technologies. The paper noted that the number of unemployed IT workers rose from 98,000 in December 2024 to 152,000 in January 2025.
According to economic experts, labor market data, and specialized reports, job losses in the technology sector can be attributed in part to the impact of AI: the emergence of generative AI has prompted giant technology companies to pour spending into AI infrastructure rather than into new jobs in the IT field, the newspaper added.
The WSJ said that "jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration."
"As they start looking at AI, theyre also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return," the WSJ added, indicating that companies are betting that AI will bring economic benefits to companies, whether in terms of improving efficiency or reducing costs.
"Increased corporate investment in AI has shown early signs of leading to future cuts in hiring, a concept some tech leaders are starting to call "cost avoidance." Rather than hiring new workers for tasks that can be more easily automated, some businesses are letting AI take on that work and reaping potential savings," WSJ said.
According to experts, the latest IT jobs numbers come as unemployment among white-collar workers remains at its highest levels since 2020.
"What weve really seen, especially in the last year or so, is a bifurcation in opportunities, where white-collar knowledge worker type jobs have had far less employer demand than jobs that are more in-person, skilled labor jobs," WSJ added.
See also:
IT Unemployment Rises to 5.7% as AI Hits Tech Jobs:
The unemployment rate in the information technology sector rose from 3.9% in December to 5.7% in January, well above last month's overall jobless rate of 4%, in the latest sign of how automation and the increasing use of artificial intelligence are having a negative impact on the tech labor market.
The number of unemployed IT workers rose from 98,000 in December to 152,000 last month, according to a report from consulting firm Janco Associates based on data from the U.S. Department of Labor.
Job losses in tech can be attributed in part to the influence of AI, according to Victor Janulaitis, chief executive of Janco Associates. The emergence of generative AI has produced massive amounts of spending by tech giants on AI infrastructure, but not necessarily new jobs in IT.
"Jobs are being eliminated within the IT function which are routine and mundane, such as reporting, clerical administration," Janulaitis said. "As they start looking at AI, they're also looking at reducing the number of programmers, systems designers, hoping that AI is going to be able to provide them some value and have a good rate of return."
[...] Another reason for January's tech job losses was that companies began implementing some intended spending cuts for this year, Janulaitis said, and many slashed budgets based on what the economy looked like during fiscal planning last year.
Layoffs have also continued at some large tech companies. Last month, Meta Platforms said it would cut 5% of its workforce in performance-based job cuts in the U.S., and on Wednesday enterprise software giant Workday said it would cut about 8.5% of its workforce.
The Beeb decided to test some LLMs to see how well they could summarize the news (https://www.bbc.com/news/articles/c0m17d8827ko). Turns out the answer is "not very well".
In the study, the BBC asked ChatGPT, Copilot, Gemini and Perplexity to summarise 100 news stories, and journalists who were relevant experts in the subject of each article rated the quality of the answers. The study found that 51% of all AI answers to questions about the news were judged to have significant issues of some form. Additionally, 19% of AI answers that cited BBC content introduced factual errors, such as incorrect factual statements, numbers and dates.
[...] In her blog, Ms Turness said the BBC was seeking to "open up a new conversation with AI tech providers" so we can "work together in partnership to find solutions".
She called on the tech companies to "pull back" their AI news summaries, as Apple did after complaints from the BBC that Apple Intelligence was misrepresenting news stories.
Some examples of inaccuracies found by the BBC included:
- Gemini incorrectly said the NHS did not recommend vaping as an aid to quit smoking
- ChatGPT and Copilot said Rishi Sunak and Nicola Sturgeon were still in office even after they had left
- Perplexity misquoted BBC News in a story about the Middle East, saying Iran initially showed "restraint" and described Israel's actions as "aggressive"
In general, Microsoft's Copilot and Google's Gemini had more significant issues than OpenAI's ChatGPT and Perplexity, which counts Jeff Bezos as one of its investors. Normally, the BBC blocks its content from AI chatbots, but it opened its website up for the duration of the tests in December 2024. The report said that as well as containing factual inaccuracies, the chatbots "struggled to differentiate between opinion and fact, editorialised, and often failed to include essential context."
Normally I'd add a snide remark, but I don't think I need to this time...
Arthur T Knackerbracket has processed the following story:
In mid-January, a top United States materials company announced that it had started to manufacture rare earth magnets. It was important news—there are no large U.S. makers of the neodymium magnets that underpin huge and vitally important commercial and defense industries, including electric vehicles. But it created barely a ripple during a particularly loud and stormy time in U.S. trade relations.
The press release, from MP Materials, was light on details. The company disclosed that it had started producing the magnets, called neodymium-iron-boron (NdFeB), on a “trial” basis and that the factory would begin gradually ramping up production before the end of this year. According to MP’s spokesman, Matt Sloustcher, the facility will have an initial capacity of 1,000 tonnes per annum, and has the infrastructure in place to scale up to 2,000 to 3,000 tonnes per year. The release also said that the facility, in Fort Worth, Texas, would supply magnets to General Motors and other U.S. manufacturers.
The Texas facility, which MP Materials has named Independence, is not the only major rare-earth-magnet project in the U.S. Most notably, Vacuumschmelze GmbH, a magnet maker based in Hanau, Germany, has begun constructing a plant in South Carolina through a North American subsidiary, e-VAC Magnetics. To build the US $500 million factory, the company secured $335 million in outside funds, including at least $100 million from the U.S. government. (E-VAC, too, has touted a supply agreement with General Motors for its future magnets.)
In another intriguing U.S. rare-earth magnet project, Noveon Magnetics, in San Marcos, Texas, is currently producing what it claims are “commercial quantities” of NdFeB magnets. However, the company is not making the magnets in the standard way, starting with metal alloys, but rather in a unique process based on recycling the materials from discarded magnets. USA Rare Earth announced on 8 January that it had manufactured a small amount of NdFeB magnets at a plant in Stillwater, Oklahoma.
Yet another company, Quadrant Magnetics, announced in January, 2022, that it would begin construction on a $100 million NdFeB magnet factory in Louisville, Kentucky. However, 11 months later, U.S. federal agents arrested three of the company’s top executives, charging them with passing off Chinese-made magnets as locally produced and giving confidential U.S. military data to Chinese agencies.
The multiple US neodymium-magnet projects are noteworthy but even collectively they won’t make a noticeable dent in China’s dominance. “Let me give you a reality check,” says Steve Constantinides, an IEEE member and magnet-industry consultant based in Honeoye, N.Y. “The total production of neo magnets was somewhere between 220 and 240 thousand tonnes in 2024,” he says, adding that 85 percent of the total, at least, was produced in China. And “the 15 percent that was not made in China was made in Japan, primarily, or in Vietnam.” (Other estimates put China’s share of the neodymium magnet market as high as 90 percent.)
But look at the figures from a different angle, suggests MP Materials’s Sloustcher. “The U.S. imports just 7,000 tonnes of NdFeB magnets per year,” he points out. “So in total, these [U.S.] facilities can supplant a significant percentage of U.S. imports, help re-start an industry, and scale as the production of motors and other magnet-dependent industries” returns to the United States, he argues.
And yet, it’s hard not to be a little awed by China’s supremacy. The country has some 300 manufacturers of rare-earth permanent magnets, according to Constantinides. The largest of these, JL MAG Rare-Earth Co. Ltd., in Ganzhou, produced at least 25,000 tonnes of neodymium magnets last year, Constantinides figures. (The company recently announced that it was building another facility, to begin operating in 2026, that it says will bring its installed capacity to 60,000 tonnes a year.)
That 25,000 tonnes figure is comparable to the combined output of all of the rare-earth magnet makers that aren’t in China. The $500-million e-VAC plant being built in South Carolina, for example, is reportedly designed to produce around 1,500 tonnes a year.
But even those numbers do not fully convey China's dominance of permanent magnet manufacturing. Wherever a factory is, making neodymium magnets requires supplies of rare-earth metal, and that nearly always leads straight back to China. "Even though they only produce, say, 85 percent of the magnets, they are producing 97 percent of the metal" in the world, says Constantinides. "So the magnet manufacturers in Japan and Europe are highly dependent on the rare-earth metal coming from China."
And there, at least, MP Materials may have an interesting edge. Hardly any firms, even in China, do what MP is attempting: produce finished magnets starting with ore that the company mines itself. Even large companies typically perform just one or at most two of the four major steps along the path to making a rare-earth magnet: mining the ore, refining the ore into rare-earth oxides, reducing the oxides to metals, and then, finally, using the metals to make magnets. Each step is an enormous undertaking requiring entirely different equipment, processes, knowledge, and skill sets.
“The one advantage they get from [doing it all] is that they get better insights into how different markets are actually growing,” says Stan Trout, a magnet industry consultant in Denver, Colorado. “Getting the timing right on any expansion is important,” Trout adds. “And so MP should be getting that information as well as anybody, with the different plants that they have, because they interact with the market in several different ways and can really see what demand is like in real time, rather than as some projection in a forecast.”
Still, it’s going to be an uphill climb. “There are a lot of both hard and soft subsidies in the supply chain in China,” says John Ormerod, an industry consultant based in Knoxville, Tenn. “It’s going to be difficult for a US manufacturer to compete with the current price levels of Chinese-made magnets,” he concludes.
And it’s not going to get better any time soon. China’s rare-earth magnet makers are only using about 60 percent of their production capacity, according to both Constantinides and Ormerod—and yet they are continuing to build new plants. “There’s going to be roughly 500,000 tonnes of capacity by the end of this year,” says Ormerod, citing figures gathered by Singapore-based analyst Thomas Kruemmer. “The demand is only about 50 percent of that.”
The upshot, all of the analysts agree, will be downward price pressure on rare earth magnets in the near future, at least. At the same time, the U.S. Department of Defense has made it a requirement that rare-earth magnets for its systems must be produced entirely, starting with ore, in “friendly” countries—which does not include China. “The DoD will need to pay a premium over cheaper imported magnets to establish a price floor enabling domestic U.S. producers to successfully and continuously supply the DoD,” says Constantinides.
In a 1985 paper, the computer scientist Andrew Yao, who would go on to win the A.M. Turing Award, asserted that among hash tables with a specific set of properties, the best way to find an individual element or an empty spot is to just go through potential spots randomly — an approach known as uniform probing. He also stated that, in the worst-case scenario, where you're searching for the last remaining open spot in a table that is (1 − 1/x) full, you can never do better than x. For 40 years, most computer scientists assumed that Yao's conjecture was true.
Krapivin was not held back by the conventional wisdom for the simple reason that he was unaware of it. "I did this without knowing about Yao's conjecture," he said. His explorations with tiny pointers led to a new kind of hash table — one that did not rely on uniform probing. And for this new hash table, the time required for worst-case queries and insertions is proportional to (log x)² — far faster than x. This result directly contradicted Yao's conjecture.
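To make the quantities concrete: in a table that is (1 − 1/x) full, a uniformly random probe lands on an empty slot with probability 1/x, so uniform probing needs about x probes in expectation for the final insertions. The Python sketch below simulates that classical baseline only; it is not Krapivin's construction, which the paper builds from tiny pointers:

    import random

    def uniform_probe_insert(table: list, key) -> int:
        """Insert via uniform probing: try random slots until one is empty.
        Returns the number of probes used."""
        probes = 0
        while True:
            probes += 1
            slot = random.randrange(len(table))  # pick a slot uniformly at random
            if table[slot] is None:
                table[slot] = key
                return probes

    # Fill a table to (1 - 1/x) full, then measure the cost of further inserts.
    n, x = 100_000, 100
    table = [None] * n
    for i in range(n - n // x):  # leave a 1/x fraction of slots empty
        uniform_probe_insert(table, i)
    costs = [uniform_probe_insert(table, f"k{i}") for i in range(100)]
    print(sum(costs) / len(costs))  # averages near x = 100 probes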
[...] "It's not just that they disproved [Yao's conjecture], they also found the best possible answer to his question," said Sepehr Assadi of the University of Waterloo.
[...] In addition to refuting Yao's conjecture, the new paper also contains what many consider an even more astonishing result. It pertains to a related, though slightly different, situation: In 1985, Yao looked not only at the worst-case times for queries, but also at the average time taken across all possible queries. He proved that hash tables with certain properties — including those that are labeled "greedy," which means that new elements must be placed in the first available spot — could never achieve an average time better than log x.
[...] They showed that it did not by providing a counterexample, a non-greedy hash table with an average query time that's much, much better than log x. In fact, it doesn't depend on x at all. "You get a number," Farach-Colton said, "something that is just a constant and doesn't depend on how full the hash table is." The fact that you can achieve a constant average query time, regardless of the hash table's fullness, was wholly unexpected — even to the authors themselves.
( https://www.quantamagazine.org/undergraduate-upends-a-40-year-old-data-science-conjecture-20250210/ )
Handful of users claim new Nvidia GPUs are melting power cables again:
Here we (maybe) go again: a handful of early adopters of Nvidia's new GeForce RTX 5090 graphics card are reporting that their power cables are melting (so far, there's at least one report on YouTube and one on Reddit, as reported by The Verge). This recalls a similar situation from early in the RTX 4090's life cycle, when power connectors were melting and even catching fire, damaging GPUs and power supplies.
After much investigation and many guesses from Nvidia and other testers, the 4090's power connector issues ended up being blamed on what was essentially user error; the 12VHPWR connectors were not being inserted all the way into the socket on the GPU or were being bent in a way that created stress on the connection, which caused the connectors to run hot and eventually burst into flames.
The PCI-SIG, the standards body responsible for the design of the new connector, claimed that the design of the 12VHPWR connector itself was sound and that any problems with it should be attributed to the manufacturers implementing the standard. Partly in response to the 4090 issues, the 12VHPWR connector was replaced by an updated standard called 12V-2x6, which uses the same cables and is pin-compatible with 12VHPWR, but which tweaked the connector to ensure that power is only actually delivered if the connectors are firmly seated. The RTX 50-series cards use the 12V-2x6 connector.
The 12VHPWR and 12V-2x6 connectors are both designed to solve a real problem: delivering hundreds of watts of power to high-end GPUs over a single cable rather than trying to fit multiple 8-pin power connectors onto these GPUs. In theory, swapping two to four 8-pin connectors for a single 12V-2x6 or 12VHPWR connector cuts down on the amount of board space OEMs must reserve for these connectors in their designs and the number of cables that users have to snake through the inside of their gaming PCs.
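The arithmetic behind that trade-off is simple. Here is a rough Python sketch using commonly cited ratings (150 W per 8-pin PCIe connector, 600 W maximum for 12VHPWR/12V-2x6 across six 12 V supply pins); the 575 W card below is a stand-in figure, not a number from the article:

    # Commonly cited connector ratings (nominal; not from the article itself).
    EIGHT_PIN_WATTS = 150   # PCIe 8-pin power connector
    HPWR_WATTS = 600        # 12VHPWR / 12V-2x6 maximum
    VOLTS = 12
    SUPPLY_PINS = 6         # the connector's six 12 V supply pins

    gpu_draw_watts = 575    # stand-in for a flagship card's power draw
    eight_pin_needed = -(-gpu_draw_watts // EIGHT_PIN_WATTS)  # ceiling division
    print(eight_pin_needed)                  # 4 separate 8-pin connectors

    total_amps = HPWR_WATTS / VOLTS          # 50 A through one connector at full load
    print(total_amps / SUPPLY_PINS)          # ~8.3 A per supply pin

Those per-pin currents leave little headroom, which is why a partially seated connector that concentrates the load on fewer pins can run dangerously hot.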
But while Nvidia, Intel, AMD, Qualcomm, Arm, and other companies are all PCI-SIG members and all had a hand in the design of the new standards, Nvidia is the only GPU company to use the 12VHPWR and 12V-2x6 connectors in most of its GPUs. AMD and Intel have continued to use the 8-pin power connector, and even some of Nvidia's partners have stuck with 8-pin connectors for lower-end, lower-power cards like the RTX 4060 and 4070 series.
Both of the reported 5090 incidents involved third-party cables, one from custom PC part manufacturer MODDIY and one included with an FSP power supply, rather than the first-party 8-pin adapter that Nvidia supplies with GeForce GPUs. It's much too early to say whether these cables (or Nvidia, or the design of the connector, or the affected users) caused the problem or whether this was just a coincidence.
Boeing has informed its employees of uncertainty in future SLS contracts:
The primary contractor for the Space Launch System rocket, Boeing, is preparing for the possibility that NASA cancels the long-running program.
On Friday, with less than an hour's notice, David Dutcher, Boeing's vice president and program manager for the SLS rocket, scheduled an all-hands meeting for the approximately 800 employees working on the program. The apparently scripted meeting lasted just six minutes, and Dutcher didn't take questions.
During his remarks, Dutcher said Boeing's contracts for the rocket could end in March and that the company was preparing for layoffs in case the contracts with the space agency were not renewed. "Cold and scripted" is how one person described Dutcher's demeanor.
The aerospace company, which is the primary contractor for the rocket's large core stage, issued the notifications as part of the Worker Adjustment and Retraining Notification (or WARN) Act, which requires US employers with 100 or more full-time employees to provide a 60-day notice in advance of mass layoffs or plant closings.
"To align with revisions to the Artemis program and cost expectations, today we informed our Space Launch Systems team of the potential for approximately 400 fewer positions by April 2025," a Boeing spokesperson told Ars. "This will require 60-day notices of involuntary layoff be issued to impacted employees in coming weeks, in accordance with the Worker Adjustment and Retraining Notification Act. We are working with our customer and seeking opportunities to redeploy employees across our company to minimize job losses and retain our talented teammates."
The timing of Friday's hastily called meeting aligns with the anticipated release of President Trump's budget proposal for fiscal-year 2026. This may not be an entire plan but rather a "skinny" budget that lays out a wish list of spending requests for Congress and some basic economic projections. Congress does not have to act on Trump's budget priorities.
Multiple sources said there has been a healthy debate within the White House and senior leadership at NASA, including acting administrator Janet Petro, about the future of the SLS rocket and the Artemis Moon program. Some commercial space advocates have been pressing hard to cancel the rocket outright. Petro has been urging the White House to allow NASA to fly the Artemis II and Artemis III missions using the initial version of the SLS rocket before the program is canceled.
Critics of the large and expensive rocket—a single launch costs in excess of $2 billion, exclusive of any payloads or the cost of ground systems—say NASA should cut its losses. Keeping the SLS rocket program around for the first lunar landing would actually bog down progress, these critics say, because large contractors such as Boeing would be incentivized to slow down work and drag out funding with their cost-plus contracts for as long as possible.
On Saturday, a day after this story was published, NASA released a statement saying the SLS rocket remains an "essential component" of the Artemis campaign. "NASA and its industry partners continuously work together to evaluate and align budget, resources, contractor performance, and schedules to execute mission requirements efficiently, safely, and successfully in support of NASA's Moon to Mars goals and objectives," a spokesperson said. "NASA defers to its industry contractors for more information regarding their workforces."
Friday's all-hands meeting indicates that Boeing executives believe there is at least the possibility that the Trump White House will propose ending the SLS rocket as part of its budget proposal in March.
The US Congress, in concert with senior leaders at NASA, directed the space agency to develop the SLS rocket in 2011. Built to a significant degree from components of the space shuttle, including its main engines and side-mounted boosters, the SLS rocket was initially supposed to launch by the end of 2016. It did not make its debut flight until the end of 2022.
NASA has spent approximately $3 billion a year developing the rocket and its ground systems over the program's lifetime. While handing out guaranteed contracts to Boeing, Northrop Grumman, Aerojet, and other contractors, the government's rocket-building enterprise has been superseded by private industry. SpaceX has developed two heavy-lift rockets in the last decade, and Blue Origin just launched its own, the New Glenn booster. Each of these rockets is at least partially reusable and flies at less than one-tenth the cost of the SLS rocket.
[...]
Arthur T Knackerbracket has processed the following story:
It's certain that recently discovered asteroid 2024 YR4 will swing close to Earth in 2032. The chances of an impact remain low — but with relatively limited observations so far, the odds are in flux.
On Jan. 31, the impact probability was 1.4 percent. As of Feb. 7, NASA reports it's 2.3 percent, which also means a 97.7 percent chance of missing our humble blue world. But don't be surprised if that number climbs higher: it's normal for the impact odds to increase before falling or disappearing completely.
[...] Asteroid 2024 YR4 — spotted by a telescope from the NASA-funded Asteroid Terrestrial-impact Last Alert System — has been deemed an object worthy of close monitoring because of its size. "Currently, no other known large asteroids have an impact probability above 1 percent," the space agency explained. It's between 130 and 300 feet wide, enough to be dubbed a "city-killer" asteroid — if it indeed hit a city. (For reference, the asteroid that hit Arizona 50,000 years ago and created the 600-foot-deep "Meteor Crater" was 100 to 170 feet, or 30 to 50 meters, across. "A similar-size impact event today could destroy a city the size of Kansas City," David Kring, an impact cratering expert at the Lunar and Planetary Institute, explained in a NASA blog.)
Telescopes will refine the asteroid's orbit around the solar system over the coming months, until it travels too far away to observe (it will return again in 2028). This added information will likely, though temporarily, boost its Earth impact odds. That's because the asteroid's risk corridor, or area of uncertainty around Earth, will shrink as astronomers better define its orbit. As long as Earth remains in that estimated hazard area — like a catcher's mitt awaiting a high-speed baseball — its relative odds of getting hit increase as the range of uncertainty shrinks.
"Earth is taking up a bigger percentage of that uncertain area," Betts explained. "So the impact percentage goes up."
Yet space is vast. And at the same time the area of uncertainty is shrinking, more observations reveal and shift where exactly this zone of uncertainty is. The shrinking area typically moves off of Earth, meaning our planet is no longer in that potential impact area. This happened with the asteroid Apophis — a 1,100-foot-wide behemoth that once had a small chance of impacts in both 2029 and 2036. But more precise telescope observations moved Apophis' range of trajectory off of Earth. The impact probability then plummeted.
"It dropped to zero," Betts said.
"It’s a funny thing about homing in on an asteroid and calculating its path, future position, and probability of impacting Earth – it will often appear risky during initial observations, get riskier, and then suddenly become entirely safe," the European Space Agency noted.
Related:
Huge 'God of Chaos' Asteroid to Pass Near Earth in 2029
Asteroid Shock: NASA Preparing for 'Colossal God of Chaos' Rock to Arrive in Next 10 Years
Feared Apophis Impact Ruled Out – Asteroid Will Pass Close Enough to Earth to See with Naked Eye
[Rescheduled to keep it visible--JR]
For most people the holiday season is over. There are a few who have their winter holidays booked as we still have a few months of the skiing season to go yet, but I don't think that this affects any of our staff! So I offer a belated 'Happy New Year' and wish you all the very best for 2025.
Slightly less than 12 months ago I asked for volunteers to serve on the Board of SoylentNews, and fortunately some people stepped forward and took on the 3 key roles (Chairman, Treasurer and Secretary). They have each contributed to the setting up of the site and getting us where we are today. However, they will soon be wanting to stand down from their current posts. The concept of the site is that governance is provided by the community and that posts should be rotated occasionally. We are again seeking volunteers to assume one of the current positions. The roles are important (indeed the most important posts on the site, because without them there can be no site), but I don't think that they are particularly arduous. They are not roles that require a daily or even a weekly input. They maintain an overview of the site and they have an independent decision-making role in future site operations.
Volunteers for the posts should remember that they must be prepared to sign the site's legal business documents and therefore cannot maintain perfect anonymity. On the site we have published no information about them other than their nicknames and user IDs. Nevertheless, somewhere in the masses of paperwork and records that the US demands and maintains, their names and contact details are recorded.
If you wish to volunteer for a post then you should have an account in good standing, i.e. not banned or created within the last few months, and with a reasonable level of karma. Please volunteer either here in the comments or directly via email to admin@soylentnews.org. Having volunteered for a post previously does not preclude you from volunteering again. If you have questions regarding a role then please raise them here. If appropriate, I will ask the person currently in that post to reply so that you get the information direct from 'the horse's mouth'.
Once we have a volunteer or volunteers for a post we will hold an election for the community to approve and select a person for a post. This will be done openly and everyone with an active account created before the date and time that this Meta is published will have a single vote. The reason for this restriction is to prevent a mass of new accounts attempting to unfairly influence the outcome of the vote. The current Board will make the final decision on who is chosen.
In December I published the new proposed documentation covering Policy, the Board, and other documents. Over the last few days the Wiki has been offline so I will repeat the links here:
It is 6 weeks since those documents were posted and I have included changes that have been proposed. There will be a vote to adopt them in the coming days.
Dale, our Treasurer, is preparing a financial statement for our annual return to the IRS. We currently have $968.61. There was no money transferred from the previous site but we have received several donations and subscriptions. This is a healthy figure because our servers and data connections have also been donated by generous community members and so our outgoings are significantly less than they were previously. The spreadsheet can be found here. (Note that different pages can be selected using the tabs at the bottom of the display). Subscriptions and donations may be made using Paypal, Stripe or direct bank transfer. My grateful thanks go to Dale for his work on behalf of the site.
kolie discovered a significant security hole in the Rehash software which, perhaps surprisingly, had been present since the original site began, and possibly from before that. Fortunately, there is no evidence that it was ever exploited, and it has been fixed. Some parts of the code have never been worked on; they were assumed to be secure when we first forked the software, but they appear never to have been fully tested.
Additionally, with the new software that is currently being written it will be possible for community members to have enhanced access via the site API. Some API functions will respond differently depending on the security level of the person accessing the site using it. As you can imagine this will require careful testing.
Some of you will have seen the software that will be used to remove spamming and doxxing information from public view. When the software is complete it will also give community members improved visibility into why some comments are removed from view, improving staff accountability to the community at large. Another benefit is that nothing is now actually removed from the database; it is only removed from display, and can therefore be restored should that be desired.
The Perl code, despite being very dated, is perhaps surprisingly well structured. It is possible that the Rehash code can be rewritten in a more popular and more supportable language function by function. This is a long term plan but it does appear to be a realistic one.
We have received a proposal that community members should have a better way of making suggestions for changes to improve the site's function and use, rather than the current method of making a bug report. It is still only a proposal and we need to spend some time investigating the possibilities. Using the Wiki has been proposed, and provided that we can securely protect the rest of the Wiki while granting community members access to the Suggestions page, it seems a good idea. The problem is that the Wiki is known to be vulnerable to external attacks and abuse unless a lot of additional software (and management) is employed. Leaving it open to ACs (and thus the whole world) will clearly not be possible, so the approach has a known limitation.
It is a fact that if this discussion were to be open to ACs on the front pages it would quickly become a focus for Spam from a very small group of people. Therefore, the contents of this Meta will be reproduced as a journal belonging to "AC Friendly" [https://soylentnews.org/~AC+Friendly/journal/] and ACs will be welcome to comment there. Valid points of discussion will be copied across to the front page story under the username of "AC Friendly". If an AC wishes to respond to a specific comment then please link to that comment in the first line of your own comment. Spam in that journal will be treated appropriately.
Arthur T Knackerbracket has processed the following story:
[...] The research team, led by Professor Tobin Filleter, has engineered nanomaterials that offer unprecedented strength, weight, and customizability. These materials are composed of tiny building blocks, or repeating units, measuring just a few hundred nanometers – so small that over 100 lined up would barely match the thickness of a human hair.
The researchers used a multi-objective Bayesian optimization machine learning algorithm to predict optimal geometries for enhancing stress distribution and improving the strength-to-weight ratio of nano-architected designs. The algorithm only needed 400 data points, whereas others might need 20,000 or more, allowing the researchers to work with a smaller, high-quality data set. The Canadian team collaborated with Professor Seunghwa Ryu and PhD student Jinwook Yeo at the Korean Advanced Institute of Science & Technology for this step of the process.
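The team's actual multi-objective pipeline isn't described in enough detail here to reproduce, but the general shape of Bayesian optimization is easy to sketch: fit a probabilistic surrogate to the designs scored so far, then pick the next candidate where the surrogate's optimistic estimate is highest. A minimal single-objective Python sketch with scikit-learn, using a toy function as a stand-in for the expensive simulation:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(x: np.ndarray) -> float:
        """Toy stand-in for an expensive simulation scoring a lattice geometry."""
        return float(-(x[0] - 0.3) ** 2 - (x[1] - 0.7) ** 2)

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(8, 2))             # small initial design
    y = np.array([objective(x) for x in X])

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(40):                            # a tiny evaluation budget
        gp.fit(X, y)
        candidates = rng.uniform(0, 1, size=(256, 2))
        mu, sigma = gp.predict(candidates, return_std=True)
        ucb = mu + 1.96 * sigma                    # upper-confidence-bound rule
        x_next = candidates[int(np.argmax(ucb))]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))

    print("best parameters:", X[int(np.argmax(y))], "score:", y.max())

The point of the surrogate is sample efficiency: each iteration spends one expensive evaluation where the model expects the most improvement, which is how a few hundred data points can stand in for tens of thousands.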
This experiment was the first time scientists have applied machine learning to optimize nano-architected materials. According to Peter Serles, the lead author of the project's paper published in Advanced Materials, the team was shocked by the improvements. The algorithm didn't just replicate successful geometries from the training data; it learned from which changes to the shapes worked and which didn't, enabling it to predict entirely new lattice geometries.
The team used a two-photon polymerization 3D printer to create prototypes for experimental validation, building optimized carbon nanolattices at the micro- and nano-scale. The team's optimized nanolattices more than doubled the strength of existing designs, withstanding a stress of 2.03 megapascals for every kilogram per cubic meter of density – about five times the specific strength of titanium.
Arthur T Knackerbracket has processed the following story:
A robotic spacecraft has beamed home crisp videos and snapshots of Earth eclipsing the moon.
Though lunar eclipses generally aren't that unusual — stargazers can watch Earth's shadow obscuring the moon a few times a year — this was different.
Firefly Aerospace's Blue Ghost lander, a private spacecraft hired by NASA to take experiments to the moon, got a rare front-row seat of the spectacle in space. The phenomenon occurred when the blue marble came between the moon and the spacecraft.
Blue Ghost, named after an exotic species of firefly, captured footage of the eclipse while flying laps around Earth as it gears up for its first attempt at a lunar touchdown. Almost two weeks ago, the spacecraft witnessed another majestic moment when Earth eclipsed the sun.
Intel has already received $2.2B in federal grants for chip production:
Semiconductor giant Intel Corporation has already received $2.2 billion in federal grants from the U.S. Department of Commerce through the U.S. CHIPS and Science Act, the company shared during its Thursday earnings call.
Dave Zinsner, Intel's co-interim CEO, executive vice president, and CFO, said the Silicon Valley-based company received the first tranche of $1.1 billion in federal grants at the end of 2024 and an additional $1.1 billion in January 2025.
These grants are based on reaching certain milestones, Zinsner added. Another $5.66 billion has yet to be disbursed.
The company was awarded a total of $7.86 billion in federal grants in November to build semiconductors in the U.S. as part of the Department of Commerce's CHIPS and Science Act program. While a sizable sum, this total was less than the original $8.5 billion estimate.