The Future Of Nuclear Reactors Is Making Its Way To The US:
America has a checkered history with nuclear power. Stemming from the Three Mile Island accident in 1979, safety fears surrounding nuclear energy have curtailed the country's development of commercial nuclear reactors; from 1977 until 2013, there were no new construction starts for nuclear power stations. Yet, the country remains the world's largest producer of nuclear power, generating close to 30% of the world's total nuclear output. Nuclear is also the third-largest source of electricity generation in the U.S., producing 18% of America's electricity; only natural gas and coal add more power to the grid. However, the slump in investment in nuclear power may be coming to an end.
Building on President Trump's four executive orders to revitalize the sector, Chris Wright, the U.S. Secretary of Energy, recently announced a "pathway" to streamline the development and deployment of advanced nuclear reactors. Speaking at the International Atomic Energy Agency's (IAEA) General Conference, Secretary Wright cited the growing demand for affordable power and the rise of high-power demand industries like AI as driving forces behind the strategy change. He said, "We established an expedited pathway to approve advanced reactors, set standards to evaluate new construction licenses within 18 months." The goal is to deploy Small Modular Reactors (SMRs) as part of President Trump's plan to add 300 gigawatts of nuclear capacity to the grid by 2050.
The key to that future lies in the aforementioned SMRs, a new generation of reactors that are designed to be smaller, safer, and faster to build.
For those of us who associate nuclear power stations with behemoths like Chernobyl or Japan's Kashiwazaki-Kariwa plant, the new generation being planned by the U.S. might come as a surprise. Rather than being large, static plants, the U.S. Government sees the future of nuclear energy as being smaller in scale. The President's executive order details the need for the U.S. to develop advanced Generation III+ reactors. These include small modular reactors (SMRs) and microreactors. The executive order also notes that these should be developed in both stationary and mobile formats to build greater resilience into critical electrical infrastructure.
One of the cornerstones of the executive order is the use of SMRs. As the name suggests, these are small reactors with a power capacity of up to 300 megawatts per module. Because of their small size and scalability, these can be installed in places where traditional reactors are unsuitable. The modular aspect of the design also means they can be pre-built at a factory and quickly installed on site. SMRs can also be quickly — and relatively easily — installed in rural areas with limited electrical infrastructure.
Microreactors are an SMR subclass: smaller reactors that typically generate a maximum of 10 megawatts. Microreactors have many of the same advantages as larger SMRs. Additionally, they are a cost-effective solution for isolated areas and can be used for backup power or as a replacement for diesel generators. Incidentally, the U.S. Army is developing a microreactor.
For the U.S., the renewed focus on nuclear power isn't just about clean and reliable energy sources — it's also about jobs and security. In employment terms, the nuclear industry already employs close to 500,000 workers. Additionally, these are well-paid jobs, with salaries around 50% higher than comparable jobs in other energy generation sectors. However, SMR development is still a work in progress, and the technology is seen as critical for the future of the industry. President Trump further signaled America's commitment when he announced a $900 million package, split across two tiers, to support SMR development. The majority of the funding is intended to support new commercial projects; the remainder is to be used to smooth out deployment obstacles such as design and supply chain issues.
Security is also a driving force behind the re-emergence of the U.S. nuclear power sector. Historically, both the U.S. and Europe were central in developing international safeguards designed to prevent nuclear proliferation, an influence that has waned in recent years. With the advanced technology being developed and the moves by the U.S. Government to support and encourage the sector, the aim is to restore U.S. influence across global energy markets.
Despite nuclear fusion records continuing to be broken, it's still considered a technology for the future. In the meantime, SMRs may just be the bridge that keeps our lights on as we move away from fossil fuels.
https://phys.org/news/2025-09-magic-mushrooms-unique-biochemical-paths.html
A German-Austrian team led by Friedrich Schiller University Jena and Leibniz-HKI has been able to biochemically demonstrate for the first time that different types of mushrooms produce the same mind-altering active substance, psilocybin, in different ways.
Both Psilocybe mushrooms and fiber cap mushrooms of the genus Inocybe produce this substance, but use completely different enzymes and reaction sequences for this process. The results are published in Angewandte Chemie International Edition.
"This concerns the biosynthesis of a molecule that has a very long history with humans," explains Prof. Dirk Hoffmeister, head of the research group Pharmaceutical Microbiology at Friedrich Schiller University Jena and the Leibniz Institute for Natural Product Research and Infection Biology (Leibniz-HKI).
"We are referring to psilocybin, a substance found in so-called 'magic mushrooms,' which our body converts into psilocin—a compound that can profoundly alter consciousness. However, psilocybin not only triggers psychedelic experiences, but is also considered a promising active compound in the treatment of therapy-resistant depression," says Hoffmeister.
The study, which was conducted within the Cluster of Excellence "Balance of the Microverse," shows for the first time that fungi have developed the ability to produce psilocybin at least twice independently of each other. While Psilocybe species use a known enzyme toolkit for this purpose, fiber cap mushrooms employ a completely different biochemical arsenal—and yet arrive at the same molecule.
This finding is considered an example of convergent evolution: Different species have independently developed a similar trait, but the magic mushrooms have gone their own way in doing so.
Tim Schäfer, lead author of the study and doctoral researcher in Hoffmeister's team, explains, "It was like looking at two different workshops, but both ultimately delivering the same product. In the fiber caps, we found a unique set of enzymes that have nothing to do with those found in Psilocybe mushrooms. Nevertheless, they all catalyze the steps necessary to form psilocybin."
The researchers analyzed the enzymes in the laboratory. Protein models created by Innsbruck chemist Bernhard Rupp confirmed that the sequence of reactions differs significantly from that known in Psilocybe.
"Here, nature has actually invented the same active compound twice," says Schäfer.
However, why two such different groups of fungi produce the same active compound remains unclear. "The real answer is that we don't know," emphasizes Hoffmeister. "Nature does nothing without reason. So there must be an advantage to both fiber cap mushrooms in the forest and Psilocybe species on manure or wood mulch producing this molecule—we just don't know what it is yet."
"One possible reason could be that psilocybin is intended to deter predators. Even the smallest injuries cause Psilocybe mushrooms to turn blue through a chemical chain reaction, revealing the breakdown products of psilocybin. Perhaps the molecule is a type of chemical defense mechanism," says Hoffmeister.
Although it is still unclear why different fungi ultimately produce the same molecule, the discovery nevertheless has practical implications.
"Now that we know about additional enzymes, we have more tools in our toolbox for the biotechnological production of psilocybin," explains Hoffmeister.
Schäfer is also looking ahead, stating, "We hope that our results will contribute to the future production of psilocybin for pharmaceuticals in bioreactors without the need for complex chemical syntheses."
At the Leibniz-HKI in Jena, Hoffmeister's team is working closely with the Bio Pilot Plant, which is developing processes for producing natural products such as psilocybin on an industry-like scale.
At the same time, the study provides exciting insights into the diversity of chemical strategies used by fungi and their interactions with their environment.
More information: Dissimilar Reactions and Enzymes for Psilocybin Biosynthesis in Inocybe and Psilocybe Mushrooms, Angewandte Chemie International Edition (2025). DOI: 10.1002/anie.202512017
https://phys.org/news/2025-09-ganges-river-drying-unprecedented.html
The Ganges River is in crisis. This lifeline for around 600 million people in India and neighboring countries is experiencing its worst drying period in 1,300 years. Using a combination of historical data, paleoclimate records and hydrological models, researchers from IIT Gandhinagar and the University of Arizona discovered that human activity is the main cause. They also found that the current drying is more severe than any recorded drought in the river's history.
In their study, published in the Proceedings of the National Academy of Sciences, researchers first reconstructed the river's flow for the last 1,300 years (700 to 2012 C.E.) by analyzing tree rings from the Monsoon Asia Drought Atlas (MADA) dataset. They then used statistical and hydrological models to combine this tree-ring data with modern records into a continuous timeline of the river's flow. To ensure its accuracy, they double-checked it against documented historical droughts and famines.
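As an aside for the technically inclined, the general recipe for this kind of proxy reconstruction can be sketched in a few lines: calibrate the proxy against gauged flow where the two overlap, then hindcast the pre-instrumental centuries. The sketch below is a minimal illustration with fabricated numbers and a simple linear fit, not the study's actual (far more sophisticated) pipeline.

```python
# Minimal sketch of proxy-based streamflow reconstruction, assuming a
# simple linear calibration. All data here are fabricated; the study's
# statistical machinery is considerably more sophisticated.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(700, 2013)             # 700-2012 C.E., as in the study
proxy = rng.normal(size=years.size)      # stand-in tree-ring drought index
gauged = years >= 1913                   # hypothetical instrumental era

# Fabricated "observed" annual flow for the gauged era.
flow_obs = 1200 + 350 * proxy[gauged] + rng.normal(0, 80, gauged.sum())

# Calibrate: fit observed flow against the proxy where both overlap...
slope, intercept = np.polyfit(proxy[gauged], flow_obs, 1)

# ...then hindcast the full 1,300-year record from the proxy alone.
flow_recon = intercept + slope * proxy

# The headline comparisons are then between 30-year windows, e.g. the
# most recent window versus the driest window in the reconstruction.
print(f"mean flow, last 30 reconstructed years: {flow_recon[years >= 1983].mean():.0f}")
```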
The scientists found that the recent drying of the Ganges River from 1991 to 2020 is 76% worse than the previous worst recorded drought, which occurred during the 16th century. Not only is the river drier overall, but droughts are now more frequent and last longer. The main reason, according to the researchers, is human activity. While some natural climate patterns are at play, the primary driver is the weakening of the summer monsoon.
This weakening is linked to human-driven factors such as the warming of the Indian Ocean and air pollution from anthropogenic aerosols. These are liquid droplets and fine solid particles that come from factories, vehicles and power plants, among other sources, and they can suppress rainfall. The scientists also found that most climate models failed to spot the severe drying trend.
"The recent drying is well beyond the realm of last millennium climate variability, and most global climate models fail to capture it," the authors wrote in their paper. "Our findings underscore the urgent need to examine the interactions among the factors that control summer monsoon precipitation, including large-scale climate variability and anthropogenic forcings."
The researchers suggest two main courses of action. Given the mismatch between climate models and what they actually found, they are calling for better modeling to account for the regional impacts of human activity.
And because the Ganges is a vital source of water for drinking, agricultural production, industrial use and wildlife, the team also recommends implementing new adaptive water management strategies to mitigate potential water scarcity.
More information: Dipesh Singh Chuphal et al, Recent drying of the Ganga River is unprecedented in the last 1,300 years, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424613122
Invest in a major customer, then get money from a major customer?
Nvidia plans to invest up to $100 billion in OpenAI, marking an unprecedented financial and strategic alignment between the leading AI hardware provider and one of the best-known developers of artificial intelligence models. However, the deal raises major antitrust concerns among legal experts and policymakers over potential market imbalance, since the investment could disadvantage competitors of both companies, reports Reuters.
The planned investment raises questions about how the cash infusion could affect Nvidia's other customers and the overall AI hardware market. Nvidia already commands the lion's share of the market for AI training and inference hardware, and virtually all AI companies depend on access to its Hopper and Blackwell GPUs to scale their models. Legal experts note that Nvidia's investment may create incentives to prioritize OpenAI over others, potentially offering better terms or faster access to a limited supply of leading-edge GPUs, such as the upcoming Rubin.
In response, a representative for Nvidia stated that the company's commitment to all of its clients remains unchanged. The spokesperson emphasized that having a financial interest in any one partner would not affect how the company serves others, assuring that every customer will continue to receive the same level of attention and service.
Additionally, if OpenAI favors Nvidia hardware over that of rivals such as AMD, or even over its own processor developed in collaboration with Broadcom, Nvidia gains an unfair advantage. OpenAI is believed to have ordered $10 billion worth of custom-built AI processors from Broadcom and is unlikely to leave them undeployed, but with Nvidia supplying tens of billions of dollars' worth of hardware, the AI company will likely continue to do most of its work on Nvidia silicon rather than on competing processors.
OpenAI currently operates as a non-profit but is pursuing a transition to a for-profit public benefit corporation. This structural change is meant to facilitate investment while maintaining oversight by the original non-profit entity. The arrangement with Nvidia does not provide governance rights — only financial participation — and may depend on regulatory approvals in states like Delaware and California, where OpenAI is registered.
[...] U.S. regulators have previously flagged the risk of major technology firms leveraging their existing dominance to control emerging AI markets. Officials from the Department of Justice have emphasized the importance of averting exclusionary practices in the AI supply chain, including restricted access to processors and compute infrastructure.
The potential effects extend beyond hardware. Oracle recently disclosed that it had signed large-scale cloud contracts with OpenAI and other clients, boosting its valuation. With Nvidia's investment potentially strengthening OpenAI's financial position, Oracle's revenue projections may appear more credible, something that will address investor concerns about OpenAI's ability to fund such commitments, according to Reuters.
The plague of cookie consent alerts, banners, and pop-ups that have added a sliver of sandpaper to web surfing since 2009 might be eradicated in December. The European Commission (EC) intends to revise a law called the e-Privacy Directive, reports Politico. Specifically, new guidelines from the European Data Protection Board (EDPB) aim to eliminate manipulative consent banners and reduce consent fatigue.
Cookies are a necessary part of the World Wide Web, one that seasoned surfers will have first become aware of while troubleshooting – fixing issues by clearing cookies and so on. However, after the e-Privacy Directive came into force in the late noughties, cookies soon became a source of persistent irritation. The directive required website operators to get consent from visitors unless the cookies were strictly necessary.
Now, in 2025, if you reset your browser or buy a new computer or device, you'll face days of cookie clickspamageddon before returning to smooth surfing on your familiar sites. We know there are browser extensions designed to suppress cookie pop-ups, but they can bring their own privacy trade-offs and compatibility wrinkles.
Politico shares a quote from Peter Craddock, a data lawyer with Keller and Heckman, which highlights the problem with the current state of cookie consent regulations. "Too much consent basically kills consent," remarked Craddock. "People are used to giving consent for everything, so they might stop reading things in as much detail, and if consent is the default for everything, it's no longer perceived in the same way by users."
[...] In practice, some of the changes to look forward to include the hinted-at requirement for an extremely clear 'reject all' button, as prominent as any 'accept all' option, on all sites. Allowing browser-level consent preferences might be the biggest time saver of all, though. We'll see how browser makers tune and allow for granular control here.
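A browser-level mechanism would not be entirely without precedent: the Global Privacy Control header (Sec-GPC) already lets a browser announce an opt-out preference with every request. As a rough sketch of the idea (whatever the EDPB ultimately standardizes may look quite different, and the consent categories below are invented), a site could derive a default consent state server-side like this:

```python
# Sketch: derive a default consent state from a browser-level signal,
# modeled on the existing Global Privacy Control header (Sec-GPC).
# The consent categories and the policy mapping are hypothetical.
def consent_from_headers(headers: dict[str, str]) -> dict[str, bool]:
    """Return per-category consent defaults based on request headers."""
    opted_out = headers.get("Sec-GPC") == "1"
    return {
        "strictly_necessary": True,   # exempt from consent under ePrivacy
        "analytics": not opted_out,   # hypothetical site policy
        "advertising": not opted_out,
    }

print(consent_from_headers({"Sec-GPC": "1"}))
# {'strictly_necessary': True, 'analytics': False, 'advertising': False}
```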
Supporting the future of the open web: Cloudflare is sponsoring Ladybird and Omarchy
At Cloudflare, we believe that helping build a better Internet means encouraging a healthy ecosystem of options for how people can connect safely and quickly to the resources they need. [....] sometimes that means we support and partner with fantastic open teams taking big bets on the next generation of tools.
To that end, today we are excited to announce our support of two independent, open source projects: Ladybird, an ambitious project to build a completely independent browser from the ground up, and Omarchy, an opinionated Arch Linux setup for developers.
[....]
Ladybird, a new and independent browser
[....] While the openness of how browsers work has led to an explosive growth of services on the Internet, browsers themselves have consolidated to a tiny handful of viable options. There's a high probability you're reading this on a Chromium-based browser, like Google's Chrome, along with about 65% of users on the Internet. However, that consolidation has also scared off new entrants in the space. If all browsers ship on the same operating systems, powered by the same underlying technology, we lose out on potential privacy, security and performance innovations that could benefit developers and everyday Internet users.
This is where Ladybird comes in: it's not Chromium based – everything is built from scratch. The Ladybird project has two main components: LibWeb, a brand-new rendering engine, and LibJS, a brand-new JavaScript engine with its own parser, interpreter, and bytecode execution engine.
Building an engine that can correctly and securely render the modern web is a monumental task that requires deep technical expertise and navigating decades of specifications governed by standards bodies like the W3C and WHATWG. And because Ladybird implements these standards directly, it also stress-tests them in practice. Along the way, the project has found, reported, and sometimes fixed countless issues in the specifications themselves, contributions that strengthen the entire web platform for developers, browser vendors, and anyone who may attempt to build a browser in the future.
[...rest dele.........cough...cough..]
First came the Navigator. Then came the Explorer. Then came the Konqueror. When the Konqueror was in its seventh year, it begat Webkit.
How billions of hacked mosquitoes and a vaccine could beat the deadly dengue virus:
Last month, a parade of vehicles wound its way through three cities in Brazil, releasing clouds of mosquitoes into the air. The insects all carry a secret weapon — a bacterium called Wolbachia that lowers the odds that the mosquitoes can transmit the dreaded dengue virus to humans.
These infected mosquitoes are the latest weapon in Brazil's fight against dengue, which infects millions of people in the country each year and can be fatal. A biofactory that opened in the town of Curitiba in July can produce 100 million mosquito eggs per week — making it the largest such facility in the world. The company that runs it, Wolbito do Brasil, aims to protect about 14 million Brazilians per year through its Wolbachia-infected mosquitoes.
That will come as welcome news for the Brazilian health officials battling the rapidly growing threat of dengue. In 2024, the country experienced its worst outbreak yet, with 6.6 million probable cases and more than 6,300 related deaths. This year's outbreak, although less severe, is also one of the largest on record, with 1.6 million probable cases so far. And the problem is spreading. Argentina, Colombia and Peru also experienced record-breaking outbreaks in 2024 and have seen a sustained increase in cases in recent years. Across Latin America and the Caribbean, deaths from dengue last year totalled more than 8,400 and the global figure reached more than 12,000 — the highest ever recorded for this disease.
As outbreaks grow larger and the crisis becomes more urgent, the Wolbachia method isn't Brazil's only bet. A locally produced dengue vaccine is now awaiting approval by the country's drug-regulatory agency, and its health ministry expects to start administering tens of millions of doses by next year.
These twin advances offer some hope to other countries — in the region and beyond. Driven by forces such as climate change, mosquito adaptation, globalized trade and movements of people, dengue is becoming a crisis worldwide, with an estimated 3.9 billion people at risk of infection. As Brazil rolls out its armies of infected mosquitoes and a vaccine in the coming year, the rest of the world will be watching closely.
Currently, there is one main dengue vaccine in use around the world: Qdenga, licensed by the Japanese pharmaceutical company Takeda. The vaccine has been approved in many countries, including Brazil, which was the first nation to include it in its public-health system.
However, Qdenga's roll-out in Brazil is limited. The country bought nine million doses of the two-dose vaccine this year: enough to vaccinate 4.5 million of its population of more than 210 million. So far, Qdenga has been administered to children between the ages of 10 and 14, who, along with older people, are among the groups most likely to end up in hospital after contracting dengue. Its safety and efficacy have not yet been tested in adults aged over 60.
The main reasons for such a limited roll-out in Brazil are availability and cost. Even though Brazil secured Qdenga from Takeda at one of the cheapest prices in the world — around US$19 per dose — the cost is still high compared with other vaccines. And even in the most optimistic scenario, the maximum number of doses Takeda could provide by 2028 is 50 million — enough to vaccinate 25 million people. What's more, for people who have not had dengue before, clinical trials did not show Qdenga to be effective against all four variants — or serotypes — of the dengue virus.
Brazil is trying to address all of those limitations with its one-dose vaccine candidate, developed at the Butantan Institute, a public biomedical research centre in São Paulo. "Having local production capacity gives us independence on decisions — how many doses we need, and at what speed to vaccinate," says Esper Kallás, Butantan's director. "You can practise prices that are more suitable and absorbable by a public-health system such as Brazil's."
Butantan is also optimistic that its vaccine will be effective against all four forms of dengue. Severe disease usually occurs when a person is infected by a different serotype to their first infection. That means that a successful vaccine needs to generate antibodies for all four serotypes without triggering severe reactions, which makes it a difficult vaccine to develop. "It was indeed a challenge, as each serotype behaves differently," says Neuza Frazatti Gallina, manager of the viral vaccine development laboratory at Butantan.
The vaccine's development began at the US National Institutes of Health in the late 1990s, where scientists transformed dengue viruses they had isolated from patients into weakened vaccine strains that could trigger the production of protective antibodies without causing disease. In 2009, Butantan extended that research by working to solve the challenges of combining the four strains into a vaccine.
After testing 30 formulations, Butantan arrived at one that proved highly effective in preventing infections, according to the preliminary results of a phase III trial involving more than 16,000 volunteers in Brazil. The study reported that two years after vaccinations, the formulation was 89% effective in preventing infections in people who had previously been infected with dengue, and 74% effective in those with no previous exposure [1].
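For context, figures like those follow the standard definition of vaccine efficacy: one minus the relative risk of infection in the vaccinated group versus the placebo group. The sketch below just shows that arithmetic with invented case counts; the trial's real numbers are in the NEJM paper referenced below.

```python
# Vaccine efficacy = 1 - relative risk (vaccinated vs. placebo).
# The case counts below are invented purely to illustrate the formula.
def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    relative_risk = (cases_vax / n_vax) / (cases_placebo / n_placebo)
    return 1.0 - relative_risk

# Hypothetical: 26 cases among 10,000 vaccinees vs. 100 among 10,000 controls.
print(f"VE = {vaccine_efficacy(26, 10_000, 100, 10_000):.0%}")  # VE = 74%
```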
"It was a well-designed trial," says Annelies Wilder-Smith, who is team lead for vaccine development at the World Health Organization (WHO). But she says one limitation of the trial is that it was conducted in a single country, and therefore runs a risk that all four serotypes were not circulating at the time.
In fact, serotypes 3 and 4 were not prevalent during the data-collection period of the clinical trial, although they are now circulating in Brazil. Butantan researchers suggest that the vaccine will be effective against serotypes 3 and 4, pointing to data from a phase II trial [2] in 300 adults that showed participants produced neutralizing antibodies to each of the serotypes. That study evaluated safety and immunological response in the short term, rather than looking at the vaccine's long-term efficacy in preventing infections. The full results of the Brazilian phase III trial — which will provide data on long-term effectiveness — are not yet public and are undergoing peer review.
The vaccine is already moving through the country's regulatory process. And although there's still no certainty about when Anvisa, Brazil's regulatory agency, will approve the vaccine, the government is counting on it. In February, President Luiz Inácio Lula da Silva announced that, starting in 2026, the Ministry of Health would be buying 60 million doses annually.
To meet that demand, Butantan is now producing the vaccine at its São Paulo facility. On its lush campus, an entire building is dedicated to churning out doses.
Regarding the vaccine's approval, "We are very confident," says Kallás. "We also anticipate that there is a very prominent need to have this product in the arms of people. So we hit the road running and started producing vaccines late last year."
Although Butantan's production efforts will focus initially on meeting Brazil's need for millions of doses, Kallás expects that the vaccine could reach other countries. Butantan has been discussing with its development partner — the pharmaceutical giant Merck — and the Pan American Health Organization (PAHO) how to make the vaccine accessible to other countries. The logical first step, he says, would be to roll it out through PAHO to Latin America and the Caribbean, and then to other regions.
In the meantime, Merck is developing a potential vaccine for Asia with an almost identical formulation, which builds on the knowledge that Butantan has developed. In a statement, the drug firm said that Butantan is "sharing clinical data and other learnings". In June, Merck started enrolling participants for its own phase III trial. "All the data, experiences and insights they have collected with the Butantan vaccine will be helpful," says Wilder-Smith.
While Butantan awaits news about the vaccine's approval, the Wolbachia method to control dengue is gaining momentum. The World Mosquito Program (WMP) — a non-profit group of companies owned by Monash University in Melbourne, Australia, where the strategy was developed — has operations in 14 countries, including Vietnam, Indonesia, Mexico and Colombia, but Brazil leads the way in terms of the scale of its expansion.
The method's arrival in the Americas is tied to Brazilian researcher Luciano Moreira, now the chief executive of Wolbito do Brasil. Wolbachia is naturally present in around 50% of insects, but not in the mosquito species Aedes aegypti, which is the main transmitter of dengue and many other viruses.
Journal Reference:
Live, Attenuated, Tetravalent Butantan–Dengue Vaccine in Children and Adults, New England Journal of Medicine (DOI: 10.1056/NEJMoa2301790)
Safety and immunogenicity of the tetravalent, live-attenuated dengue vaccine Butantan-DV in adults in Brazil: a two-step, double-blind, randomised placebo-controlled phase 2 trial (DOI: )
A Wolbachia Symbiont in Aedes aegypti Limits Infection with Dengue, Chikungunya, and Plasmodium, Cell (DOI: 10.1016/j.cell.2009.11.042)
Sofia B. Pinto, Thais I. S. Riback, Gabriel Sylvestre, et al. Effectiveness of Wolbachia-infected mosquito deployments in reducing the incidence of dengue and other Aedes-borne diseases in Niterói, Brazil: A quasi-experimental study, PLOS Neglected Tropical Diseases (DOI: 10.1371/journal.pntd.0009556)
Katherine L Anders, Gabriel Sylvestre Ribeiro, Renato da Silva Lopes, et al. Long-term durability and public health impact of city-wide wMel Wolbachia mosquito releases in Niterói, Brazil during a dengue epidemic surge [$], medRxiv (DOI: 10.1101/2025.04.06.25325319)
Efficacy of Wolbachia-Infected Mosquito Deployments for the Control of Dengue, New England Journal of Medicine (DOI: 10.1056/NEJMoa2030243)
The Guardian has a very interesting article about human-computer interaction and its implications beyond gaming:
Five years ago, on the verge of the first Covid lockdown, I wrote an article asking what seemed to be an extremely niche question: why do some people invert their controls when playing 3D games?
I thought a few hardcore gamers would be interested in the question. Instead, more than one million people read the article, and the ensuing debate caught the attention of Dr Jennifer Corbett (quoted in the original piece) and Dr Jaap Munneke, then based at the Visual Perception and Attention Lab at Brunel University London.
At the time, the two were conducting research into vision science and cognitive neuroscience, but when the country locked down, they were no longer able to test volunteers in their laboratory. The question of controller inversion provided the perfect opportunity to study the neuroscience of human-computer interactions using remote subjects. They put out a call for gamers willing to help research the reasons behind controller inversion and received many hundreds of replies.
And it wasn't just gamers who were interested. "Machinists, equipment operators, pilots, designers, surgeons – people from so many different backgrounds reached out," says Corbett. "Because there were so many different answers, we realised we had a lot of scientific literature to review to design the best possible study. Readers' responses turned this study into the first of its kind to try to figure out what actually are those factors that shape how users configure their controllers. Personal experiences, favourite games, different genres, age, consoles, which way you scroll with a mouse ... all of these things could potentially be involved."
This month the duo published their findings in a paper entitled "Why axis inversion? Optimising interactions between users, interfaces, and visual displays in 3D environments". And the reason why some people invert their controls? It's complicated.
The process started with participants completing a survey about their backgrounds and gaming experiences. "Many people told us that playing a flight simulator, using a certain type of console, or the first game they played were the reasons they preferred to invert or not," says Corbett. "Many also said they switched preferences over time. We added a whole new section to the study based on all this feedback."
What they discovered through the cognitive testing was that a lot of assumptions being made around controller preferences were wrong. "None of the reasons people gave us [for inverting controls] had anything to do with whether they actually inverted," says Corbett. "It turns out the most predictive out of all the factors we measured was how quickly gamers could mentally rotate things and overcome the Simon effect. The faster they were, the less likely they were to invert. People who said they sometimes inverted were by far the slowest on these tasks." So does this mean non-inverters are better gamers? No, says Corbett. "Though they tended to be faster, they didn't get the correct answer more than inverters who were actually slightly more accurate."
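For readers unfamiliar with it, the Simon effect is the slowdown that occurs when an irrelevant stimulus location conflicts with the required response side; overcoming it quickly is a measure of resolving that interference. The toy scoring sketch below illustrates the measure with fabricated trials; the study's actual test battery was more involved.

```python
# Toy scoring of a Simon-effect task: responses are faster when the
# (irrelevant) stimulus side matches the response side than when it
# conflicts. Trials and timings below are fabricated for illustration.
from statistics import mean

trials = [
    # (stimulus_side, response_side, reaction_time_ms)
    ("left", "left", 412), ("right", "right", 398),   # congruent
    ("left", "right", 470), ("right", "left", 455),   # incongruent
]

congruent = [rt for stim, resp, rt in trials if stim == resp]
incongruent = [rt for stim, resp, rt in trials if stim != resp]

# A smaller Simon cost means less interference from irrelevant location;
# per the study, faster performance here predicted non-inverted preferences.
print(f"Simon cost: {mean(incongruent) - mean(congruent):.1f} ms")  # 57.5 ms
```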
In short, gamers think they are an inverter or a non-inverter because of how they were first exposed to game controls. Someone who played a lot of flight sims in the 1980s may have unconsciously taught themselves to invert and now considers that their innate preference; alternatively, a gamer who grew up in the 2000s, when non-inverted controls became prevalent, may think they are naturally a non-inverter. However, cognitive tests suggest otherwise. It's much more likely that you invert or don't invert because of how your brain perceives objects in 3D space.
Consequently, Corbett says that it may improve you as a gamer to try the controller setup you are currently not using. "Non-inverters should give inversion a try – and inverters should give non-inversion another shot," she says. "You might even want to force yourself to stick with it for a few hours. People have learned one way. That doesn't mean they won't learn another way even better. A good example is being left-handed. Until the mid-20th century, left-handed children were forced to write with their right hand, causing some people to have lifelong handwriting difficulties and learning problems. Many older adults still don't realise they're naturally left-handed and could write/draw much better if they switched back."
Through this research, Corbett and Munneke have established that there are complex and often unconscious cognitive processes involved in how individuals use controllers, and that these may have important ramifications for not just game hardware but for any human-computer interfaces, from aircraft controls to surgical devices. They were able to design a framework for assessing how to best configure controls for any given individual and have now made that available via their research paper.
"This work opened our eyes to the huge potential that optimising inversion settings has for advancing human-machine teaming," says Corbett. "So many technologies are pairing humans with AI and other machines to augment what we can do alone. Understanding how a given individual best performs with a certain setup (controller configuration, screen placement, whether they are trying to hit a target or avoid an obstacle) can allow for much smoother interactions between humans and machines in lots of scenarios from partnering with an AI player to defeat a boss, to preventing damage to delicate internal tissue while performing a complicated laparoscopic surgery."
So what started as an idle, slightly nerdy question has now become a published cognitive research paper. One scientific publication has already cited it, and interview requests are pouring in from podcasts and YouTubers. As for my takeaway? "The most surprising finding for gamers [who don't invert] is that they might perform better if they practised with an inverted control scheme," says Corbett. "Maybe not, but given our findings, it's definitely worth a shot because it could dramatically improve competitive game play!"
Additional Journal link: Why axis inversion? Optimizing interactions between users, interfaces, and visual displays in 3D environments
'A Black Hole of Energy Use': Meta's Massive AI Data Center Is Stressing Out a Louisiana Community:
A massive data center for Meta's AI will likely lead to rate hikes for Louisiana customers, but Meta wants to keep the details under wraps.
Holly Ridge is a rural community bisected by US Highway 80, gridded with farmland, with a big creek—it is literally named Big Creek—running through it. It is home to rice and grain mills and an elementary school and a few houses. Soon, it will also be home to Meta's massive, 4 million square foot AI data center hosting thousands of perpetually humming servers that require billions of watts of energy to power. And that energy-guzzling infrastructure will be partially paid for by Louisiana residents.
The plan is part of what Meta CEO Mark Zuckerberg said would be "a defining year for AI." On Threads, Zuckerberg boasted that his company was "building a 2GW+ datacenter that is so large it would cover a significant part of Manhattan," posting a map of Manhattan with the data center's footprint overlaid. Zuckerberg went on to say that over the coming years, AI "will drive our core products and business, unlock historic innovation, and extend American technology leadership. Let's go build! 💪"
What Zuckerberg did not mention is that "Let's go build" refers not only to the massive data center but also to three new Meta-subsidized gas power plants and a transmission line to fuel it, serviced by Entergy Louisiana, the region's energy monopoly.
Key details about Meta's investments with the data center remain vague, and Meta's contracts with Entergy are largely cloaked from public scrutiny. But what is known is the $10 billion data center has been positioned as an enormous economic boon for the area—one that politicians bent over backward to facilitate—and Meta said it will invest $200 million into "local roads and water infrastructure."
A January report from NOLA.com said that the state had rewritten zoning laws, promised to change a law so that it no longer had to put state property up for public bidding, and rewritten what was supposed to be a tax incentive for broadband internet meant to bridge the digital divide so that it applied only to data centers, all with the goal of luring in Meta.
But Entergy Louisiana's residential customers, who live in one of the poorest regions of the state, will see their utility bills increase to pay for Meta's energy infrastructure, according to Entergy's application. Entergy estimates that amount will be small and will only cover a transmission line, but advocates for energy affordability say the costs could balloon depending on whether Meta agrees to finish paying for its three gas plants 15 years from now. The short-term rate increases will be debated in a public hearing before state regulators that has not yet been scheduled.
The Alliance for Affordable Energy called it a "black hole of energy use," and said "to give perspective on how much electricity the Meta project will use: Meta's energy needs are roughly 2.3x the power needs of Orleans Parish ... it's like building the power impact of a large city overnight in the middle of nowhere."
By 2030, Entergy's electricity prices are projected to increase 90 percent from where they were in 2018, although the company attributes much of that to hurricane damage to its infrastructure. The state already has a high energy cost burden, in part because of storm damage to infrastructure and balmy heat, made worse by climate change, that drives air conditioner use. The state's homes largely are not energy efficient, with many porous older buildings that don't retain heat in the winter or stay cool in the summer.
"You don't just have high utility bills, you also have high repair costs, you have high insurance premiums, and it all contributes to housing insecurity," said Andreanecia Morris, a member of Housing Louisiana, which is opposed to Entergy's gas plant application. She believes Meta's data center will make it worse. And Louisiana residents have reasons to distrust Entergy when it comes to passing off costs of new infrastructure: in 2018, the company's New Orleans subsidiary was caught paying actors to testify on behalf of a new gas plant. "The fees for the gas plant have all been borne by the people of New Orleans," Morris said.
In its application to build new gas plants and in public testimony, Entergy says the cost of Meta's data center to customers will be minimal and has even suggested Meta's presence will make their bills go down. But Meta's commitments are temporary, many of Meta's assurances are not binding, and crucial details about its deal with Entergy are shielded from public view, a structural issue with state energy regulators across the country.
[Editor's Note - The source is far too long to include here. I recommend reading the original source (in the link) for more of the details.--JR]
When cancer targets the young:
Cancer is usually a curse of time. In the United States, the vast majority of cancer diagnoses are in people over age 50. Our bodies' cells accumulate DNA damage over time, and older immune systems are not as good at making repairs. At the same time, decades of interaction with sunlight, tobacco products, alcohol, carcinogenic chemicals and other risk factors also take their toll.
But in recent years, cancer has been increasingly attacking younger adults. Global incidence rates of several types of cancer are rising in people in their 20s, 30s and 40s, many with no family history of the disease. Scientists don't know why diagnoses are soaring in people under age 50, and they are racing to find out. But as freelance journalist Fred Schwaller reports in this issue, identifying how risk factors like diet or environmental exposures could be at fault is notoriously difficult because there are so many potential influences at play.
For one, cancers in young adults may advance much more quickly than they do in older people, belying the assumption that healthy young bodies would excel at eradicating malignant cells.
What's more, cancer screening recommendations in many countries aren't currently designed to detect the disease in younger people. Young adult patients often say that doctors dismissed their repeated concerns that something wasn't right, telling them they were "too young to have cancer." And that can lead to delayed diagnosis and treatment.
[...] Harsh treatments like radiation and chemotherapy can damage immature egg cells and cells that make sperm, making it impossible for some people who had cancer in childhood to have biological children. Teenage and adult patients may be able to freeze eggs or sperm, but children who haven't gone through puberty don't have those options. Senior writer Meghan Rosen reports on emerging research intended to help make that possible, including a conversation with the first childhood cancer survivor to have testicular stem cells transplanted back into his body.
Parents of children with cancer are increasingly considering these options for both boys and girls. And while scientists say the work is still in its infancy, they hope more childhood cancer survivors will one day have the option to thrive as parents.
Porsche AG on Friday dialled[sic] back plans for its electric vehicle rollout due to weaker demand, pressure in key market China and higher U.S. tariffs, causing the luxury sportscar maker and its parent Volkswagen to slash their 2025 profit outlooks:
The move highlights the challenges for one of the most well-known car brands, which has been squeezed by its two most important markets - China and the United States - over price declines and trade barriers.
Volkswagen, Europe's top carmaker, said it would take a 5.1 billion euro ($6 billion) hit from the far-reaching product overhaul, which delays some EV models in favour of hybrids and combustion engine cars, at its 75.4%-owned subsidiary.
The changes are a major shift for the Stuttgart-based maker of the iconic 911 model, and are expected to hit Porsche's operating profit by up to 1.8 billion euros this year, it said.
[...] Porsche said it would delay the launch of certain all-electric vehicles, adding that the new SUV above the Cayenne model would initially not be offered as an all-electric vehicle, but with combustion-engine and hybrid models.
Also at ZeroHedge.
Previously: Porsche's New Cayenne Will Charge Itself Like No Other EV
This ancient people lived in the Sahara when it was a much more welcoming environment:
Between 14,800 and 5,500 years ago, during what is known as the African Humid Period, the desert known for being one of the driest places on Earth actually had enough water to support a way of life. Back then, it was a savannah that early human populations settled in to take advantage of the favorable farming conditions. Among them was a mysterious people who lived in what is now southwestern Libya and should have been genetically Sub-Saharan—except, upon a modern analysis, their genes didn't reflect that.
Led by archaeogeneticist Nada Salem from the Max Planck Institute for Evolutionary Anthropology, a team of researchers analyzed the genes of two 7,000-year-old naturally preserved mummies of Neolithic female herders from the Takarkori rock shelter. Though genetic material does not preserve well in arid climates, which is why much about ancient human populations in the Sahara remains a mystery, there was enough fragmented DNA to give insights into their past.
"The majority of Takarkori individuals' ancestry stems from a previously unknown North African genetic lineage that diverged from sub-Saharan African lineages around the same time as present-day humans outside Africa and remained isolated throughout most of its existence," they said in a study recently published in Nature.
The Takarkori individuals are actually close relatives of 15,000-year-old foragers from Taforalt Cave in Morocco. Both lineages are about the same genetic distance from Sub-Saharan groups that existed during that period, which suggests that there was not much gene flow between Sub-Saharan and Northern Africa at the time. The Taforalt people have half the Neanderthal DNA of non-Africans, while the Takarkori carry only a tenth as much. What is strange is that they still have more Neanderthal DNA than other sub-Saharan peoples who were around at the time.
[...] The reason the Takarkori stayed isolated probably has to do with the diversity of environments in the Green Sahara. These ranged from lakes and wetlands to woodlands to grasslands, savannas and even mountains. Such differences in habitats were barriers to interaction between human populations.
Journal Reference: Salem, N., van de Loosdrecht, M.S., Sümer, A.P. et al. Ancient DNA from the Green Sahara reveals ancestral North African lineage. Nature 641, 144–150 (2025). https://doi.org/10.1038/s41586-025-08793-7
Deaths from flesh-eating bacteria are on the rise. Who is at risk?:
Deaths from "flesh-eating" bacteria are on the rise across the southeastern coasts of the U.S. At least five people in Florida, four in Louisiana and one in the Outer Banks have died this year from infections that can cause necrotizing wounds.
The culprit, the bacteria Vibrio vulnificus, thrives in warm seawater. Florida has seen 16 cases this year, according to the state's health department. Seventeen cases have been reported in Louisiana — more than previous years' annual averages. North Carolina has seen seven cases this year so far, the state Department of Health and Human Services confirmed to NBC News. And Mississippi has had three cases so far this year, the state's health department says.
Initial deaths from the infection in Florida were reported in counties spread around the state's extensive coastline, from Bay County in the Panhandle and Hillsborough County, where Tampa is, on the Gulf Coast, to Broward County in Southeastern Florida and St. Johns County just south of Jacksonville.
The bacteria can get into the body through open wounds in the skin and cause the surrounding tissue to die, a condition known as necrotizing fasciitis, or flesh-eating disease, according to the Centers for Disease Control and Prevention. People can also get Vibrio vulnificus from eating contaminated foods, particularly raw oysters. It's unclear how the people in Florida were infected.
About 1 in 5 people with a Vibrio vulnificus infection die, according to the CDC.
Antarpreet Jutla, an engineering professor at the University of Florida, said that infections are still rare but "something is off this year." Still, he said there are too many unknowns to be certain what's causing the rise in infections at this time.
"This is certainly not normal, that's one thing," Jutla said. "We haven't had that many cases early on in the summer for a very long time."
Jutla said Vibrio vulnificus infections tend to increase after hurricanes. Last year, Florida saw a total of 82 cases, which may have been exacerbated by the "extremely active" hurricane season. The bacteria can linger in hurricane floodwaters.
"Something happened this year that triggered the pathogens a little bit more than before," he said.
Hurricane season this year is still expected to be above normal as the U.S. enters its peak period, the National Oceanic and Atmospheric Administration reported Thursday.
Jutla's research group is investigating why there are high concentrations of plankton and chlorophyll — indicators for vibrio — across Florida's panhandle. He calls it a "concern."
Vibrio vulnificus is one of over 200 species of Vibrio bacteria, said Rita Colwell, a professor emerita of microbiology at the University of Maryland.
The majority of Vibrio infections aren't harmful to humans, Jutla said. Some only affect other animals.
But Vibrio bacteria do cause about 80,000 infections in people each year, according to the Cleveland Clinic. Most of those cases are gastrointestinal. Only a small handful — 100 to 200 cases — are due to Vibrio vulnificus. Other Vibrio species, including Vibrio parahaemolyticus and Vibrio alginolyticus, are often the cause of those stomach illnesses. Another type of Vibrio, Vibrio cholerae, causes the diarrheal disease cholera.
Because Vibrio bacteria prefer warm water, they are typically found along the southeastern shores of the U.S., but they are also found on the West Coast. As ocean temperatures warm, more cases have been found farther north in recent years, Jutla said, including some in New York, Connecticut and Maryland.
Vibrio bacteria can creep into open wounds after time spent in salty or brackish water, said Dr. Norman Beatty, an infectious disease doctor at University of Florida Health. Most cases he's seen have been associated with spending extended time in the water, but he says that even a brief exposure could be the "only thing needed."
Visible signs of an infection can start in just a few hours, Beatty said, and include redness, swelling and "bull's-eye" blisters. The site will also be painful. If infection progresses, it can get into the bloodstream and cause sepsis, which can be deadly. Symptoms of sepsis include fever, chills and dangerously low blood pressure, according to the CDC.
People with liver cirrhosis, weakened immune systems and those over 65 are most at risk for infection, Jutla said.
Vibrio vulnificus infections can be treated with antibiotics.
Beatty said he recommends covering up any open wounds before going into the ocean. Even a waterproof Band-Aid does the job, he said.
If people think they have an infection, they should seek care immediately, Beatty said. Delaying can be the difference between a milder infection and severe complications.
"A delay in presenting to health care is truly the likely reason why most people have a more serious outcome than others," he said. "People who present within the same day with signs and symptoms of early infection, who receive antibiotics, can do well and can avoid a lot of these serious complications."
We risk a deluge of AI-written 'science' pushing corporate interests – here's what to do about it:
Back in the 2000s, the American pharmaceutical firm Wyeth was sued by thousands of women who had developed breast cancer after taking its hormone replacement drugs. Court filings revealed the role of "dozens of ghostwritten reviews and commentaries published in medical journals and supplements being used to promote unproven benefits and downplay harms" related to the drugs.
Wyeth, which was taken over by Pfizer in 2009, had paid a medical communications firm to produce these articles, which were published under the bylines of leading doctors in the field (with their consent). Any medical professionals reading these articles and relying on them for prescription advice would have had no idea that Wyeth was behind them.
The pharmaceutical company insisted that everything written was scientifically accurate and – shockingly – that paying ghostwriters for such services was common in the industry. Pfizer ended up paying out more than US$1 billion (£744 million) in damages over the harms from the drugs.
The articles in question are an excellent example of "resmearch" – bullshit science in the service of corporate interests. While the overwhelming majority of researchers are motivated to uncover the truth and check their findings robustly, resmearch is unconcerned with truth – it seeks only to persuade.
[...] The public health literature is already seeing a slew of papers that draw on datasets optimised for use with AI to report single-factor results. Single-factor studies link a single factor to some health outcome, such as a link between eating eggs and developing dementia.
These studies lend themselves to specious results. When datasets span thousands of people and hundreds of pieces of information about them, researchers will inevitably find misleading correlations that occur by chance.
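That multiple-comparisons trap is easy to demonstrate: feed enough unrelated variables into a significance test against an outcome and some will clear p < 0.05 by chance alone. A minimal simulation (the dataset shape is chosen arbitrarily for illustration):

```python
# Pure-noise demonstration of spurious single-factor "findings": with no
# real effects anywhere, ~5% of factors still pass p < 0.05 by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_people, n_factors = 5_000, 300           # e.g. diet items, habits, exposures
outcome = rng.normal(size=n_people)        # some health measure, pure noise
factors = rng.normal(size=(n_people, n_factors))

false_hits = sum(
    stats.pearsonr(factors[:, j], outcome)[1] < 0.05   # [1] is the p-value
    for j in range(n_factors)
)
print(f"{false_hits} of {n_factors} null factors 'significant' at p < 0.05")
# Expect roughly 15 spurious "links" from 300 truly unrelated factors.
```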
A search of leading academic databases Scopus and Pubmed showed that an average of four single-factor studies were published per year between 2014 and 2021. In the first ten months of 2024 alone, a whopping 190 were published.
These weren't necessarily motivated by corporate interests – some could, for example, be the result of academics looking to publish more material to boost their career prospects. The point is more that with AI facilitating these kinds of studies, they become an added temptation for businesses looking to promote products.
[...] One issue is that research does not always go through peer review prior to informing policy. In 2021, for example, US Supreme Court justice Samuel Alito, in an opinion [PDF] on the right to carry a gun, cited a briefing paper [PDF] by a Georgetown academic that presented survey data on gun use.
The academic and gun survey were funded by the Constitutional Defence Fund, which the New York Times describes as a "pro-gun nonprofit".
Since the survey data are not publicly available and the academic has refused to answer questions about this, it is impossible to know whether his results are resmearch. Still, lawyers have referenced his paper in cases across the US to defend gun interests.
One obvious lesson is that anyone relying on research should be wary of any that has not passed peer review. A less obvious lesson is that we will need to reform peer review as well. There has been much discussion in recent years about the explosion in published research and the extent to which reviewers do their jobs properly.
[...] In general, the current system seems ill-equipped to cope with the deluge of papers that AI will precipitate. Reviewers need to invest time, effort and scrupulous attention checking preregistrations, specification curve analyses, data, code and so on.
This requires a peer-review mechanism that rewards reviewers [PDF] for the quality of their reviews.
Public trust in science remains high worldwide. That is good for society because the scientific method is an impartial judge that promotes what is true and meaningful over what is popular or profitable.
Yet AI threatens to take us further from that ideal than ever. If science is to maintain its credibility, we urgently need to incentivise meaningful peer review.
Something Extremely Strange Is Happening at the Event Horizon of This Supermassive Black Hole:
In 2019, scientists unveiled the first-ever images of a black hole, M87*. Those observations kickstarted a wave of new investigations into how black holes work, how they grow, and how they change. And now, after a few upgrades, the Event Horizon Telescope network is back with another bombshell centered on M87*—finding tantalizing hints of previously unknown physics at the event horizon of the black hole itself.
In a series of images taken by the EHT between 2017 and 2021, scientists observed a completely unexpected reversal in the black hole's magnetic fields—in other words, its polarization flipped. They also detected strange jets blasting out of M87*. The observations provide researchers with their most detailed view yet of the black hole and, by extension, of the extreme conditions surrounding it. The findings are set to be detailed in an upcoming Astronomy & Astrophysics paper.
"These results show how the EHT is evolving into a fully fledged scientific observatory, capable not only of delivering unprecedented images but also of building a progressive and coherent understanding of black hole physics," said Mariafelicia De Laurentis, study co-author and an astronomer at the University of Naples Federico II in Italy, in a release.
M87* is a supermassive black hole that sits at the center of the galaxy M87, which is located about 55 million light-years away from Earth. This behemoth is estimated to be more than six billion times the mass of our Sun. Such a gigantic black hole should exert huge gravitational influence on any matter nearby, as seen in the ring of bright, orange plasma in the image.
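To put that mass in perspective, a back-of-envelope calculation (ours, not the EHT's) of the Schwarzschild radius r_s = 2GM/c^2 for a roughly 6.5-billion-solar-mass black hole, a commonly quoted estimate consistent with the article's "more than six billion," gives a horizon scale of order a hundred times the Earth-Sun distance:

```python
# Back-of-envelope Schwarzschild radius for an M87*-class black hole.
# The 6.5e9 solar-mass figure is a commonly quoted estimate (assumption).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # Earth-Sun distance, m

M = 6.5e9 * M_SUN
r_s = 2 * G * M / c**2
print(f"r_s = {r_s:.2e} m, about {r_s / AU:.0f} AU")  # ~1.9e13 m, ~128 AU
```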
What caught astronomers by surprise, however, were stark shifts in the direction of the plasma spiral around M87*, technically known as its polarization pattern. It suggests that the area around M87* is an "evolving, turbulent environment where magnetic fields play a vital role in governing how matter falls into the black hole and how energy is launched outward," the researchers explained.
"What's remarkable is that while the ring size has remained consistent over the years-confirming the black hole's shadow predicted by Einstein's theory-the polarization pattern changes significantly," said Paul Tiede, study co-lead author and an astronomer at the Center for Astrophysics at Harvard & Smithsonian.
"This tells us that the magnetized plasma swirling near the event horizon is far from static; it's dynamic and complex, pushing our theoretical models to the limit," he added.
The observations suggest the polarization pattern at M87* flipped direction in 2017, before spiraling the other way in 2021.
"It challenges our models and shows there's much we still don't understand near the event horizon," said Jongho Park, another co-author of the paper and an astronomer at Kyunghee University in South Korea.
Black hole physics is, well, a bit of a black hole, with myriad unanswered questions and mysteries still to be solved. Any hint we can get helps advance the science: supermassive black holes like M87* are essential to how galaxies form stars, and they help distribute seeds of energy throughout the universe.
In particular, the powerful jets emitted by such large black holes are a "unique laboratory" for astrophysicists studying gamma rays or high-energy neutrinos, the researchers said, offering a rich array of information about the role of black holes in cosmic evolution.