A scientist's unconventional project illustrates many challenges in developing new vaccines:
Chris Buck stands barefoot in his kitchen holding a glass bottle of unfiltered Lithuanian farmhouse ale. He swirls the bottle gently to stir up a fingerbreadth blanket of yeast and pours the turbulent beer into a glass mug.
Buck raises the mug and sips. "Cloudy beer. Delightful!"
He has just consumed what may be the world's first vaccine delivered in a beer. It could be the first small sip toward making vaccines more palatable and accessible to people around the world. Or it could fuel concerns about the safety and effectiveness of vaccines. Or the idea may go nowhere. No matter the outcome, the story of Buck's unconventional approach illustrates the legal, ethical, moral, scientific and social challenges involved in developing potentially life-saving vaccines.
Buck isn't just a home brewer dabbling in drug-making. He is a virologist at the National Cancer Institute in Bethesda, Md., where he studies polyomaviruses, which have been linked to various cancers and to serious health problems for people with weakened immune systems. He discovered four of the 13 polyomaviruses known to infect humans.
The vaccine beer experiment grew out of research Buck and colleagues have been doing to develop a traditional vaccine against polyomavirus. But Buck's experimental sips of vaccine beer are unsanctioned by his employer. A research ethics committee at the National Institutes of Health told Buck he couldn't experiment on himself by drinking the beer.
Buck says the committee has the right to determine what he can and can't do at work but can't govern what he does in his private life. So today he is Chef Gusteau, the founder and sole employee of Gusteau Research Corporation, a nonprofit organization Buck established so he could make and drink his vaccine beer as a private citizen. His company's name was inspired by the chef in the film Ratatouille, Auguste Gusteau, whose motto is "Anyone can cook."
Buck's body made antibodies against several types of the virus after drinking the beer and he suffered no ill effects, he and his brother Andrew Buck reported December 17 at the data sharing platform Zenodo.org, along with colleagues from NIH and Vilnius University in Lithuania. Andrew and other family members have also consumed the beer with no ill effects, he says. The Buck brothers posted a method for making vaccine beer December 17 at Zenodo.org. Chris Buck announced both publications in his blog Viruses Must Die on the online publishing platform Substack, but neither has been peer-reviewed by other scientists.
[...] Buck's unconventional approach has also sparked concerns among other experts about the safety and efficacy of the largely untested vaccine beer. While he has promising data in mice that the vaccine works, he has so far reported human antibody results only from his own sips of the brew. Normally, vaccines are tested in much larger groups of people to see how well they work and whether they trigger any unanticipated side effects. This is especially important for polyomavirus vaccines, because one of the desired uses is to protect people who are about to get organ transplants. The immune-suppressing drugs these patients must take can leave them vulnerable to harm from polyomaviruses.
Michael Imperiale, a virologist and emeritus professor at the University of Michigan Medical School in Ann Arbor, first saw Buck present his idea at a scientific conference in Italy in June. The beer approach disturbed him. "We can't draw conclusions based on testing this on two people," he says, referring to Buck and his brother. It's also not clear which possible side effects Buck was monitoring for. Vaccines for vulnerable transplant patients should go through rigorous safety and efficacy testing, he says. "I raised a concern with him that I didn't think it was a good idea to be sidestepping that process."
Other critics warn that Buck's unconventional approach could fuel antivaccine sentiments. Arthur Caplan, who until recently headed medical ethics at the New York University Grossman School of Medicine, is skeptical that a vaccine beer will ever make it beyond Buck's kitchen.
"This is maybe the worst imaginable time to roll out something that you put on a Substack about how to get vaccinated," he says. Many people won't be interested because of antivaccine rhetoric. Beer companies may fear that having a vaccine beer on the market could sully the integrity of their brands. And Buck faces potential backlash from "a national administration that is entirely hostile to vaccines," Caplan says. "This is not the place for do-it-yourself."
But the project does have supporters who say it could instead calm vaccine fears by allowing everyday people to control the process. Other researchers are on the fence, believing that an oral vaccine against polyomavirus is a good idea but questioning whether Buck is going about introducing such a vaccine correctly.
[...] Buck says his self-experiment illustrates that a person can be safely immunized against BK polyomaviruses through drinking beer. But even though Buck produced antibodies, there is no guarantee others will. And right now, people who drink the vaccine beer won't know whether they produce antibodies or if any antibodies they do produce will be sufficient to protect them from developing cancer or other serious health problems later.
Other scientists familiar with Buck and his yeast project also have conflicting opinions about how it might influence public trust and acceptance of vaccines.
If something were to go wrong when a person tried to replicate Buck's beer experiment, Imperiale worries about "the harm that it could do to our ability to administer vaccines that have been tested, tried and true, and just the more general faith that the public has in us scientists. Right now, the scientific community has to think about everything it does and answer the question, 'Is what we're doing going to cause more distrust amongst the public?'"
That's especially true now that health officials in the Trump administration are slashing funding for vaccine research, undermining confidence in vaccines and limiting access to them. A recent poll by the Pew Research Center found that a majority of Americans are still confident that childhood vaccines are highly effective at preventing illness. But there has been an erosion of trust in the safety of those vaccines, particularly among Republicans.
[...] Buck feels a moral imperative to move forward with his self-experiments and to make polyomavirus vaccine beer available to everyone who wants it. "This is the most important work of my whole career," he says. "It's important enough to risk my career over." What he's doing in his home lab is consistent with his day job, he adds. "At the NIH in my contract it says my job is to generate and disseminate scientific knowledge," he says. "This is my only job, to make knowledge and put it out there and try to sell it to the public."
He doesn't see himself as a maverick. "I'm not a radical who's trying to subvert the system. I'm obeying the system, and I'm using the only thing that is left available to me."
OSnews brings us the news that HP-UX reached its end of life on December 31st:
It's 31 December 2025 today, the last day of the year, but it also happens to mark the end of support for the last and final version of one of my favourite operating systems: HP-UX. Today is the day HPE puts the final nail in the coffin of their long-running UNIX operating system, marking the end of another vestige of the heyday of the commercial UNIX variants, a reign ended by cheap x86 hardware and the increasing popularisation of Linux.
HP-UX' versioning is a bit of a convoluted mess for those not in the know, but the versions that matter are all part of the HP-UX 11i family. HP-UX 11i v1 and v2 (also known as 11.11 and 11.23, respectively) have been out of support for exactly a decade now, while HP-UX 11i v3 (also known as 11.31) is the version whose support ends today. To further complicate matters, like 11i v2, HP-UX 11i v3 supports two hardware platforms: HP 9000 (PA-RISC) and HP Integrity (Intel Itanium). Support for the HP-UX 11i v3 variant for HP 9000 ended exactly four years ago, and today marks the end of support for HP-UX 11i v3 for HP Integrity.
And that's all she wrote.
HP-UX 11i v1 was the last PA-RISC version of the operating system to officially support workstations, with 11i v2 only supporting Itanium workstations. There are some rumblings online that 11i v2 will still work just fine on PA-RISC workstations, but I have not yet tried this out. My c8000 also has a ton of other random software on it, of course, and only yesterday I discovered that the most recent release of sudo configures, compiles, and installs from source just fine on it. Sadly, a ton of other modern open source code does not run on it, considering the slightly outdated toolchain on HP-UX and few people willing and/or able to add special workarounds for such an obscure platform.
Over the past few years, I've been trying to get into contact with HPE about the state of HP-UX' patches, software, and drivers, which are slowly but surely disappearing from the web. A decent chunk is archived on various websites, but a lot of it isn't, which is a real shame. Most patches from 2009 onwards are unavailable, various software packages and programs for HP-UX are lost to time, HP-UX installation discs and ISOs later than 2006-2009 are not available anywhere, and everything that is available is only available via non-sanctioned means, if you know what I mean. Sadly, I never managed to get into contact with anyone at HPE, and my concerns about HP-UX preservation seem to have fallen on deaf ears. With the end-of-life date now here, I'm deeply concerned even more will go missing, and the odds of making the already missing stuff available are only decreasing.
I've come to accept that very few people seem to hold any love for or special attachment to HP-UX, and that very few people care as much about its preservation as I do. HP-UX doesn't carry the movie star status of IRIX, nor the benefits of being available as both open source and on commodity hardware as Solaris, so far fewer people have any experience with it or have developed a fondness for it. HP-UX didn't star in a Steven Spielberg blockbuster, it didn't leave behind influential technologies like ZFS. Despite being supported up until today, it's mostly forgotten – and not even HPE itself seems to care.
And that makes me sad.
When you raise your glasses tonight to mark the end of 2025 and welcome the new year, spare a thought for the UNIX everyone forgot still exists. I know I will.
Did you work with HP-UX? What did you think of it? How does it compare to more modern OSes? More widely, can we still learn things from older software, and are they worth archiving as historical items?
The FBI says Americans lost at least $333 million to Bitcoin ATM scams in 2025, as the cryptocurrency has continued to gain popularity for use in fraudulent transactions. The law enforcement agency told CNBC that this is a "clear and constant rise" that is "not slowing down." Reported losses to crypto ATM scams first broke $100 million in 2023, hitting $114 million; that figure then more than doubled the following year to $247 million. While 2025's increase was not as steep a jump, the scams are still costing private citizens an enormous amount of money, with most scammers targeting older victims.
The authorities are acting against cryptocurrency ATM providers, saying that they're "pocketing hundreds of thousands of dollars in undisclosed fees on the backs of scam victims." The U.S. Attorney General even sued Athena Bitcoin, with the lawsuit pointing out that 93% of the transactions on its ATMs "are the product of outright fraud," with victims having a median age of 71 years. In its defense, Athena told ABC News that it has "strong safeguards against fraud, including transparent instructions, prominent warnings, and customer education." An Athena rep also said, "Just as a bank isn't held responsible if someone willingly sends funds to someone else, Athena does not control users' decisions."
Earlier this year, we saw one local government take things into its own hands, using a power tool to recover almost $32,000 that a victim deposited into a Bitcoin Depot ATM. The Sheriff's office was able to do this after securing a warrant, but the company said that it will seek damages, especially as each machine costs around $14,000. Furthermore, the victim will not be able to get the recovered money immediately, as it will have to go through the legal system before the scammed amount is returned.
The U.S. isn't the only place that is seeing a growing number of crypto ATM scam cases — Australian authorities also said that most crypto ATM users are either scam victims or money mules who were forced to deposit cash into these machines. Cryptocurrency does, of course, have some advantages and legitimate uses. But because it's still fairly new, many don't understand how it works and often assume that it's just like any other bank. And with crypto ATMs becoming more ubiquitous in the U.S., it's also making it much easier for scammers to extort and steal money from their unsuspecting victims.
The Guardian has an article about the forthcoming upgrade of the Large Hadron Collider at CERN in Switzerland, which will be overseen by a new CERN Director General, Mark Thomson.
The LHC is famous for its use in discovering the Higgs boson, a fundamental particle whose existence was predicted in the 1960s as the means by which some other particles gain mass.
The latest upgrade, the high-luminosity LHC, will begin in June and take approximately five years. The superconducting magnets will be upgraded to increase the luminosity of the proton beams being collided and the detectors are also being upgraded.
It is hoped that the improved performance of the LHC will allow it to explore the interactions of Higgs bosons.
If the upgrade works, the LHC will make more precise measurements of particles and their interactions, which could find cracks in today's theories that become the foundations for tomorrow's. One remaining mystery surrounds the Higgs boson. Elementary particles gain their masses from the Higgs, but why the masses vary as they do is anyone's guess. It is not even clear how Higgs bosons interact with one another. "We could see something completely unexpected," Thomson says.
CERN also has plans to replace the LHC with a larger and more powerful collider called the Future Circular Collider, which will require a new 91km circular tunnel (compared with the LHC's 27km). There is no certainty as to what new science might be discovered with the FCC, and there are challenges obtaining sufficient funding. However, there are several fundamental questions to be explored by the new machine such as: what is the dark matter that clumps around galaxies; what is the dark energy that pushes the universe apart; why is gravity so weak; and why did matter win out over antimatter when the universe formed?
Ozempic is changing the foods Americans buy:
When Americans begin taking appetite-suppressing drugs like Ozempic and Wegovy, the changes extend well beyond the bathroom scale. According to new research, the medications are associated with meaningful reductions in how much households spend on food, both at the grocery store and at restaurants.
The study, published Dec. 18 in the Journal of Marketing Research, links survey data on GLP-1 receptor agonist use – a class of drugs originally developed for diabetes and now widely prescribed for weight loss – with detailed transaction records from tens of thousands of U.S. households. The result is one of the most comprehensive looks yet at how GLP-1 adoption is associated with changes in everyday food purchasing in the real world.
The headline finding is striking: Within six months of starting a GLP-1 medication, households reduce grocery spending by an average of 5.3%. Among higher-income households, the drop is even steeper, at more than 8%. Spending at fast-food restaurants, coffee shops and other limited-service eateries falls by about 8%.
Among households who continue using the medication, lower food spending persists at least a year, though the magnitude of the reduction becomes smaller over time, say co-authors, assistant professor Sylvia Hristakeva and professor Jura Liaukonyte, both in the Charles H. Dyson School of Applied Economics and Management in the Cornell SC Johnson College of Business.
"The data show clear changes in food spending following adoption," Hristakeva said. "After discontinuation, the effects become smaller and harder to distinguish from pre-adoption spending patterns."
[...] The reductions were not evenly distributed across the grocery store.
Ultra-processed, calorie-dense foods – the kinds most closely associated with cravings – saw the sharpest declines. Spending on savory snacks dropped by about 10%, with similarly large decreases in sweets, baked goods and cookies. Even staples like bread, meat and eggs declined.
Only a handful of categories showed increases. Yogurt rose the most, followed by fresh fruit, nutrition bars and meat snacks.
"The main pattern is a reduction in overall food purchases. Only a small number of categories show increases, and those increases are modest relative to the overall decline," Hristakeva said.
The effects extended beyond the supermarket. Spending at limited-service restaurants such as fast-food chains and coffee shops fell sharply as well.
[...] Notably, about one-third of users stopped taking the medication during the study period. When they did, their food spending reverted to pre-adoption levels – and their grocery baskets became slightly less healthy than before they started, driven in part by increased spending on categories such as candy and chocolate.
That movement underscores an important limitation, the authors caution. The study cannot fully separate the biological effects of the drugs from other lifestyle changes users may make at the same time. However, evidence from clinical trials, combined with the observed reversion in spending after discontinuation, suggests appetite suppression is likely a key mechanism behind the spending changes.
Journal Reference: Hristakeva, S., Liaukonytė, J., & Feler, L. (2025). EXPRESS: The No-Hunger Games: How GLP-1 Medication Adoption is Changing Consumer Food Demand. Journal of Marketing Research, 0(ja). https://doi.org/10.1177/00222437251412834
https://phys.org/news/2025-12-scientists-outline-atomic-scale-polaritons.html
Controlling light at dimensions thousands of times smaller than the thickness of a human hair is one of the pillars of modern nanotechnology.
An international team led by the Quantum Nano-Optics Group of the University of Oviedo and the Nanomaterials and Nanotechnology Research Center (CINN/Principality of Asturias-CSIC) has published a review article in Nature Nanotechnology detailing how to manipulate fundamental optical phenomena when light couples to matter in atomically thin materials.
The study focuses on polaritons, hybrid quasiparticles that emerge when light and matter interact intensely. In low-symmetry layered crystals, known as van der Waals materials, light ceases to propagate in a conventional way and instead travels along specific directions, a characteristic that gives rise to phenomena that challenge conventional optics.
Among the behaviors reviewed are negative refraction, where light bends in the opposite direction to the usual one when crossing a boundary between materials, and canalized propagation, which makes it possible to guide energy without it dispersing.
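For context, this is the textbook way to express negative refraction (our illustration, not an equation taken from the review): Snell's law still applies, but one medium behaves as if it had a negative effective refractive index.

```latex
% Snell's law; with an effective index n_2 < 0, the refracted ray emerges
% on the same side of the surface normal as the incident ray.
n_1 \sin\theta_1 = n_2 \sin\theta_2, \qquad n_2 < 0 \;\Rightarrow\; \theta_2 < 0
```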
"These properties offer unprecedented control over light–matter interaction in regions of the spectrum ranging from the visible to the terahertz," the team describes in the article.
This research is part of the TWISTOPTICS project, led by University of Oviedo professor Pablo Alonso González. This project is dedicated to the study of how twisting or stacking nanometric layers—a technique reminiscent of atomic-scale "Lego" pieces—makes it possible to design physical properties à la carte.
The publication is the result of an international collaboration in which—alongside the University of Oviedo—leading centers such as the Beijing Institute of Technology (BIT), the Donostia International Physics Center (DIPC), and the Max Planck Institute have participated.
The theoretical and experimental framework presented in this work lays the foundations for future practical implementations in various technological sectors, including integrated optical circuits, high-sensitivity biosensors, thermal management, and super-resolution imaging.
More information: Yixi Zhou et al, Fundamental optical phenomena of strongly anisotropic polaritons at the nanoscale, Nature Nanotechnology (2025). DOI: 10.1038/s41565-025-02039-3
One small step for chips, one giant leap for a lack of impurities:
A team from Cardiff, Wales, is experimenting with the feasibility of building semiconductors in space, and its most recent success is another step forward towards its goal. According to the BBC, Space Forge's microwave-sized furnace has been switched on in space and has reached 1,000°C (1,832°F) — one of the most important parts of the manufacturing process that the company needs to validate in space.
"This is so important because it's one of the core ingredients that we need for our in-space manufacturing process," Payload Operations Lead Veronica Vera told the BBC. "So being able to demonstrate this is amazing." Semiconductor manufacturing is a costly and labor-intensive endeavor on Earth, and while putting it in orbit might seem far more complicated, making chips in space offers some theoretical advantages. For example, microgravity conditions would help the atoms in semiconductors line up perfectly, while the lack of an atmosphere would also reduce the chance of contaminants affecting the wafer.
These two things would help reduce imperfections in the final wafer output, resulting in a much more efficient fab. "The work that we're doing now is allowing us to create semiconductors up to 4,000 times purer in space than we can currently make here today," Space Forge CEO Josh Western told the publication. "This sort of semiconductor would go on to be in the 5G tower in which you get your mobile phone signal, it's going to be in the car charger you plug an EV into, it's going to be in the latest planes."
Space Forge launched its first satellite in June 2025, hitching a ride on the SpaceX Transporter-14 rideshare mission. However, it still took the company several months before it finally succeeded in turning on its furnace, showing how complicated this project can get. Nevertheless, this advancement is quite promising, with Space Forge planning to build a bigger space factory with the capacity to output 10,000 chips. Aside from that, it also needs to work on a way to bring the finished products back to the surface. Other companies are also experimenting with orbital fabs, with U.S. startup Besxar planning to send "Fabships" into space on Falcon 9 booster rockets.
Putting semiconductor manufacturing in space could help reduce the massive amounts of power and water these processes consume on Earth, while also outputting more wafers with fewer impurities. However, we also have to consider the huge environmental impact of launching multiple rockets per day just to deliver the raw materials and pick up the finished products from orbit.
Consumes 1/3 the power of optical, and costs 1/3 as much:
Scale-up connectivity is crucial for the performance of rack-scale AI systems, but achieving high bandwidth and low latency for such interconnections using copper wires is becoming increasingly complicated with each generation. Optical interconnects are a possibility for scale-up connectivity, but they may be overkill, so start-ups Point2 and AttoTude propose using radio-based interconnections operating at millimeter-wave and terahertz frequencies over waveguides that connect to systems using standard pluggable connectors, reports IEEE Spectrum.
Point2's implementation uses what it calls an 'active radio cable' built from eight 'e-Tube' waveguides. Each waveguide carries data using two frequencies — 90 GHz and 225 GHz — and plug-in modules at both ends convert digital signals directly into modulated millimeter-wave radio and back again. A full cable delivers 1.6 Tb/s, measures 8.1 mm across (about half the volume of a comparable active copper cable), and can reach up to seven meters, more than enough for scale-up connectivity. Point2 says the design consumes roughly one-third the power of optical links, costs about one-third as much, and adds as little as one-thousandth the latency.
A notable aspect of Point2's approach is the relative maturity of its technology. The radio transceivers can be fabricated at standard semiconductor production facilities using well-known fabrication processes — the company has already demonstrated this approach using a 28nm chip with the Korea Advanced Institute of Science and Technology (KAIST). Also, its partners Molex and Foxconn Interconnect Technology have shown that the specialized cables can be produced on existing lines without major retooling.
AttoTude is pursuing a similar concept, but at even higher frequencies. Its system combines a digital interface, a terahertz signal generator, and a mixer that encodes data onto carriers between 300 and 3,000 GHz, then feeds the signal into a narrow dielectric waveguide. Early versions used hollow copper tubes, while later generations rely on fibers measuring approximately 200 micrometers across, with losses as low as 0.3 dB per meter (considerably lower than copper). The company has demonstrated 224 Gb/s transmission over four meters at 970 GHz and projects viable reaches of around 20 meters.
Both companies use waveguides instead of cables because copper fails at millimeter-wave and terahertz frequencies. Copper cables can pass signals at very high data rates, but only by becoming thicker, shorter, and more power-hungry; their losses and jitter rise so fast that the link budget collapses, so they cannot be used for such applications. Waveguides, meanwhile, are not an exotic choice: they are among the few viable options for interconnects with terabit-per-second-class bandwidth.
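As a rough illustration of why per-meter loss dominates reach, here is a minimal link-budget sketch in Python. The 0.3 dB/m figure is the waveguide loss quoted above; the copper loss value is a made-up placeholder for comparison, not a number from the article.

```python
# Minimal link-budget sketch: received power falls linearly (in dB)
# with distance, so per-meter loss sets the maximum usable reach.
def received_power_dbm(tx_dbm: float, loss_db_per_m: float, length_m: float) -> float:
    return tx_dbm - loss_db_per_m * length_m

# 0.3 dB/m is the dielectric-waveguide loss quoted above; the copper
# figure (10 dB/m) is a hypothetical placeholder for comparison.
for name, loss in [("dielectric waveguide", 0.3), ("hypothetical copper link", 10.0)]:
    print(f"{name}: {received_power_dbm(0.0, loss, 4.0):+.1f} dBm after 4 m")
```

Over the four meters AttoTude demonstrated, the waveguide loses just 1.2 dB, while the hypothetical lossy copper link would drop 40 dB, which is the kind of gap that makes a copper link budget collapse at these frequencies.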
A proof-of-concept is now available on the internet:
MongoBleed, a high-severity vulnerability plaguing multiple versions of MongoDB, can now easily be exploited, since a proof-of-concept (PoC) is available on the web.
Earlier this week, security researcher Joe Desimone published code that exploits a "read of uninitialized heap memory" vulnerability tracked as CVE-2025-14847. This vulnerability, rated 8.7/10 (high), stems from "mismatched length fields in Zlib compressed protocol headers".
By sending a poisoned message that claims a larger decompressed size than the payload actually contains, an attacker can cause the server to allocate an oversized memory buffer; the unfilled portion of that buffer can leak in-memory data containing sensitive information, such as credentials, cloud keys, session tokens, API keys, configurations, and other data.

What's more, attackers exploiting MongoBleed do not need valid credentials to pull off the attack.
In its writeup, BleepingComputer confirms that there are roughly 87,000 potentially vulnerable instances exposed on the public internet, as per data from Censys. The largest share is located in the United States (20,000), followed by China (17,000) and Germany (around 8,000).
Here is a list of all the vulnerable versions:
- MongoDB 8.2.0 through 8.2.3
- MongoDB 8.0.0 through 8.0.16
- MongoDB 7.0.0 through 7.0.26
- MongoDB 6.0.0 through 6.0.26
- MongoDB 5.0.0 through 5.0.31
- MongoDB 4.4.0 through 4.4.29
- All MongoDB Server v4.2 versions
- All MongoDB Server v4.0 versions
- All MongoDB Server v3.6 versions
If you are running any of the above, make sure to patch up; a fix for self-hosted instances has been available since December 19. A quick way to check which version a server reports is sketched below. Users running MongoDB Atlas don't need to do anything, since their instances were automatically patched.
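As a convenience, here is a minimal sketch (not an official tool) that uses pymongo to compare a server's reported version against the vulnerable ranges listed above; the connection URI is a placeholder you would replace with your own.

```python
# Sketch: flag a MongoDB server whose reported version falls inside the
# vulnerable ranges listed in this article. Requires the pymongo package.
from pymongo import MongoClient

# (low, high) inclusive ranges mirroring the list above; 999 stands in
# for "all patch releases" of the end-of-life 4.2/4.0/3.6 branches.
VULNERABLE_RANGES = [
    ((8, 2, 0), (8, 2, 3)),
    ((8, 0, 0), (8, 0, 16)),
    ((7, 0, 0), (7, 0, 26)),
    ((6, 0, 0), (6, 0, 26)),
    ((5, 0, 0), (5, 0, 31)),
    ((4, 4, 0), (4, 4, 29)),
    ((4, 2, 0), (4, 2, 999)),
    ((4, 0, 0), (4, 0, 999)),
    ((3, 6, 0), (3, 6, 999)),
]

def is_vulnerable(version: str) -> bool:
    """Compare a dotted version string against the ranges above."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return any(low <= parts <= high for low, high in VULNERABLE_RANGES)

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
version = client.server_info()["version"]
print(version, "-> VULNERABLE, patch now!" if is_vulnerable(version) else "-> not in the listed ranges")
```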
So far, there are no confirmed reports of in-the-wild abuse, although some researchers are linking MongoBleed to the recent Ubisoft Rainbow Six Siege breach.
Christmas is already behind us, but since this is an announcement from 11 December – that I missed – I'm calling this a very interesting and surprising Christmas present.
The team and I are beyond excited to share what we've been cooking up over the last little while: a full desktop environment running on QNX 8.0, with support for self-hosted compilation! This environment both makes it easier for newly-minted QNX developers to get started with building for QNX, but it also vastly simplifies the process of porting Linux applications and libraries to QNX 8.0.
↫ John Hanam at the QNX Developer Blog

What we have here is QNX 8.0 running the Xfce desktop environment on Wayland, a whole slew of build and development tools (clang, gcc, git, etc.), a ton of popular code editors and IDEs, a web browser (looks like GNOME Web?), access to all the ports on the QNX Open-Source Dashboard, and more. For now, it's only available as a QEMU image to run on top of Ubuntu, but the plan is to also release an x86 image in the coming months so you can run this directly on real hardware.
This isn't quite the same as the QNX of old with its unique Photon microGUI, but it's been known for a while now that Photon hasn't been actively developed in a long time and is basically abandoned. Running Xfce on Wayland is obviously a much more sensible solution, and one that's quite future-proof, too. As a certified QNX desktop enthusiast of yore, I can't wait for the x86 image to arrive so I can try this out properly.
There are downsides. This image, too, is encumbered by annoying non-commercial license requirements and sign-ups, and this also wouldn't be the first time QNX starts an enthusiast effort, only to abandon it shortly after. Buyer beware, then, but I'm cautiously optimistic.
= Related: QNX at Wikipedia
Every task we perform on our computer — whether number crunching, watching a video, or typing out an article — requires different components of the machine to interact with one another. "Communication is massively crucial for any computation," says former SFI Graduate Fellow Abhishek Yadav, a Ph.D. scholar at the University of New Mexico. But scientists don't fully grasp how much energy computational devices spend on communication.
Over the last decade, SFI Professor David Wolpert has spearheaded research to unravel the principles underlying the thermodynamic costs of computation. Wolpert notes that determining the "thermodynamic bounds on the cost of communication" is an overlooked but critical issue in the field, as it applies not only to computers but also to communication systems across the board. "They are everything that holds up modern society," he says.
Now, a new study in Physical Review Research, co-authored by Yadav and Wolpert, sheds light on the unavoidable heat dissipation that occurs when information is transmitted across a system, challenging an earlier view that, in principle, communication incurs no energetic cost. For the study, the researchers drew on and combined principles from computer science, communication theory, and stochastic thermodynamics, a branch of statistical physics that deals with real-world out-of-equilibrium systems such as smartphones and laptops.
Using a logical abstraction of generic communication channels, the researchers determined the minimum amount of heat a system must dissipate to transmit one unit of information. This abstraction could apply to any communication channel — artificial (e.g., optical cable) or biological (e.g., a neuron firing a signal in the brain). Real-world communication channels always have some noise that can interfere with the information transmission, and the framework developed by Yadav and Wolpert shows that the minimum heat dissipation is at least equal to the amount of useful information — technically called mutual information — that filters through the channel's noise.
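For reference, mutual information has a standard definition, reproduced below; the accompanying bound is our reading of the result in Landauer units (k_B T ln 2 of heat per bit), an interpretation rather than an equation quoted from the paper.

```latex
% Mutual information between channel input X and output Y:
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log_2\!\frac{p(x,y)}{p(x)\,p(y)}
% A Landauer-style reading of the reported bound (our interpretation):
Q_{\mathrm{diss}} \;\ge\; k_B T \ln 2 \cdot I(X;Y)
```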
Then, they used another broadly applicable abstraction of how modern-day computers perform computations to derive the minimum thermodynamic costs associated with encoding and decoding. Encoding and decoding steps ensure reliable transmission of messages by mitigating channel noise. Here, the researchers gained a significant insight: improving the accuracy of data transmission through better encoding and decoding algorithms comes at the cost of increased heat dissipation within the system.
Uncovering the unavoidable energy costs of sending information through communication channels could help build energy-efficient systems. Yadav reckons that the von Neumann architecture used in current computers presents significant energetic costs associated with communication between the CPU and memory. "The principles that we are outlining can be used to draw inspiration for future computer architecture," he says.
As these energy costs apply to all communication channels, the work presents a potential avenue for researchers to deepen the understanding of various energy-hungry complex systems where communication is crucial, from biological neurons to artificial logical circuits. Despite burning 20% of the body's calorie budget, the brain uses energy far more efficiently than artificial computers do, says Yadav. "So it would be interesting to see how natural computational systems like the brain are coping with the cost associated with communication."
Journal Reference: Abhishek Yadav and David Wolpert, Minimal thermodynamic cost of communication, Phys. Rev. Research 7, 043324 – Published 22 December, 2025 DOI: https://doi.org/10.1103/qvc2-32xr
https://finance.yahoo.com/news/fda-officially-confirms-kava-food-140000605.html
U.S. Food and Drug Administration (FDA), After Reviewing Historical Use and Modern Safety Evidence, Officially Confirms Kava is a Food Under Federal Law
The United States Food and Drug Administration (FDA) has officially confirmed that kava is a conventional food under federal law. This acknowledgment marks a pivotal moment in the national understanding of kava, providing long-needed clarity across federal and state systems and affirming that, when prepared and enjoyed as a beverage (i.e. kava tea), kava holds a legitimate and established place within the nation's food landscape.
This federal confirmation, issued through multiple FDA case responses, has already guided the State of Hawaii and the State of Michigan to determine that the kava beverage qualifies as Generally Recognized As Safe (GRAS) based on its extensive history of safe, cultural use; additional states are now reviewing the same evidence. For Pacific Island communities, including Native Hawaiians whose cultural practices, ceremonies, and community life have been intertwined with kava for generations, the people of American Samoa, and the many Fijian, Tongan, and other Pacific Islander families throughout the United States, this acknowledgment carries profound significance. It affirms the deep cultural legacy of kava, strengthens recognition of Pacific Islander heritage in the United States, and honors a cultural food that is now finding an increasingly meaningful place in modern American life.
FDA Issues Written Statements Affirming Kava Tea as a Conventional Food
Kava's longstanding cultural use as a beverage informs how federal law evaluates traditional foods, and this history shaped the FDA's recent clarification. When asked to confirm how kava should be treated under federal food law, the agency provided some of its clearest language to date. In responding, the FDA affirmed the classification of the kava beverage and stated:
"You are correct that kava mixed with water as a single ingredient conventional food would generally not be regulated as a food additive if the tea is consumed as food."
In another communication, the agency reinforces this, explaining that "Kava tea can be considered as a food, provided that the tea and labeling are compliant with FDA's food safety and food labeling regulations".
The left-wing Irish government has vowed to push for the European Union to prohibit the use of anonymous social media accounts in what may set the ground for another battle over free speech with the Trump administration in the United States.
Ireland will take over the rotating Presidency of the Council of the European Union for a six-month term starting in July and looks set to push for more restrictions on the internet, namely the imposition of ID verification for social media accounts. The move would effectively end anonymity on social media, which critics have warned will hinder dissidents from speaking out against power structures.
Speaking to the Extra news outlet, Deputy Prime Minister Simon Harris said that anonymous accounts and so-called disinformation are "an issue in relation to our democracy. And I don't just mean ours. I mean democracy in the world."
"This isn't just Ireland's view. If you look at the comments of Emmanuel Macron... of Keir Starmer... recently, in terms of being open to considering what Australia have done, if you look at the actions of Australia, you know this is a global conversation Ireland will and should be a part of," he said.
Harris also said that Dublin will consider following Australia's lead in banning children under the age of 16 from accessing social media.
"We've age requirements in our country for so many things. You can't buy a pint before a certain age. You can't drive a car before a certain age. You can't place a bet before a certain age," the Deputy PM said.
"We have a digital age of consent in Ireland, which is 16, but it's simply not being enforced. And I think that's a really important move. And then I think there's the broader issue, which will require work that's not just at an Irish level, around the anonymous bots."
It comes in the wake of the U.S. State Department announcing sanctions against five British and European figures for their roles in silencing Americans and American companies.
Among those to face a visa ban sanction was former European Commissioner for Internal Market Thierry Breton, who served as the EU's censorship czar until last year and who spearheaded the bloc's Digital Services Act.
The draconian set of restrictions demands that large social media companies scrub their platforms of so-called "hate speech" and "disinformation" or face the prospect of Brussels imposing a fine of up to six per cent of their global revenue. Earlier this month, the Digital Services Act was used to fine Elon Musk's X €120 million ($140 million).
Breton had previously threatened to use the DSA, which allows the bloc to ban social media firms from operating on the continent, against Musk for conducting a live interview on X with then-presidential candidate Donald Trump in the lead-up to last year's elections. The Frenchman warned that the interview could result in the "amplification of harmful content" that may "generate detrimental effects on civic discourse and public security".
Announcing the sanctions against Breton and others, Secretary of State Marco Rubio said last week: "For far too long, ideologues in Europe have led organized efforts to coerce American platforms to punish American viewpoints they oppose. The Trump Administration will no longer tolerate these egregious acts of extraterritorial censorship."
New Study Reveals How the Brain Measures Distance:
Whether you are heading to bed or seeking a midnight snack, you don't need to turn on the lights to know where you are as you walk through your house at night. This hidden skill comes from a remarkable ability called path integration: your brain constantly tallies your steps and turns, allowing you to mentally track your position like a personal GPS. You're building a map by tracking movement, not sight.
Scientists at the Max Planck Florida Institute for Neuroscience (MPFI) think that understanding how the brain performs path integration could be a critical step toward understanding how our brain turns momentary experiences into memories of events that unfold over time. Publishing their findings this week in Nature Communications, they have made big strides toward this goal. Their insights may also provide information about what may be happening to patients in the early stages of Alzheimer's disease, whose first symptoms are often related to difficulty tracking distance or time.
In their study, the team trained mice to run a specific distance in a gray virtual reality environment without visual landmarks, in exchange for a reward. The animals could only judge how far they had traveled by monitoring their own movement, not by relying on environmental cues. As mice performed this task, the scientists recorded tiny electrical pulses that neurons use to communicate, allowing them to observe the activity of thousands of neurons. They focused on the activity of neurons in the hippocampus, a region essential for both navigation and memory. Using computer modeling, they then analyzed these signals to reveal the computational rules the brain uses for path integration.
"The hippocampus is known to help animals find their way through the environment. In this brain region, some neurons become active at specific places. However, in environments full of sights, sounds, and smells, it is difficult to tell whether these neurons are responding to those sensory cues or to the animal's position itself," explains senior author and MPFI group leader Yingxue Wang. "In this study, we removed as many sensory cues as possible to mimic situations such as moving in the dark. In these simplified conditions, we found that only a small number of hippocampal cells signaled a specific place or a specific time. This observation made us wonder what the rest of the neurons were doing, and whether they were helping the animal keep track of where it is by integrating how far and how long it had been moving, a process called path integration."
The scientists discovered that during navigation without landmarks, most hippocampal neurons followed one of two opposite patterns of activity. These patterns were crucial for helping the animals keep track of how far they had traveled.
In one group of neurons, activity sharply increased when the animal started moving, as if marking the start of the distance-counting process. The activity of these neurons then gradually ramped down at different rates as the animal moved further, until reaching the set distance for a reward. A second group of neurons showed the opposite pattern. Their activity dropped when the animal started moving, but gradually ramped up as the animal traveled farther.
The team discovered that these activity patterns act as a neural code for distance, with two distinct phases. The first phase (the rapid change in neural activity) marks the start of movement and the beginning of distance counting. The second phase (the gradual ramping changes in neural activity) counts the distance traveled. Both short and long distances could be tracked in the brain by using neurons with different ramping speeds.
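To make the coding scheme concrete, here is a toy simulation, our illustration rather than the authors' model: a population of neurons ramps up or down at heterogeneous, assumed rates after movement onset, and a simple least-squares readout recovers the distance travelled from the population activity.

```python
import numpy as np

# Toy model (illustrative only): each neuron's activity ramps linearly
# from a baseline at its own rate once movement starts; ramp-down cells
# have negative sign, ramp-up cells positive, echoing the two groups
# described above. All rates and noise levels are assumed values.
rng = np.random.default_rng(0)
n_neurons = 50
rates = rng.uniform(0.005, 0.05, n_neurons)  # ramp speed per cm (assumed)
signs = rng.choice([-1.0, 1.0], n_neurons)   # ramp-down vs ramp-up group

def population_activity(distance_cm: float) -> np.ndarray:
    """Noisy activity of every neuron after travelling distance_cm."""
    return 1.0 + signs * rates * distance_cm + rng.normal(0.0, 0.05, n_neurons)

def decode_distance(activity: np.ndarray) -> float:
    """Least-squares readout: activity ≈ 1 + (signs*rates)*d, solve for d."""
    slopes = signs * rates
    return float(np.dot(slopes, activity - 1.0) / np.dot(slopes, slopes))

true_distance = 120.0  # cm
estimate = decode_distance(population_activity(true_distance))
print(f"decoded {estimate:.1f} cm (true {true_distance} cm)")
```

Because every neuron ramps at a different speed, the same readout works for short and long distances alike, which mirrors the paper's observation that heterogeneous ramping rates let the hippocampus track a range of distances.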
"We have discovered that the brain encodes the elapsed distance or time needed to solve this task using neurons that show ramping activity patterns," said lead scientist Raphael Heldman. "This is the first time distance has been shown to be encoded in a way that differs from the well-known place-based coding in the hippocampus. These findings expand our understanding that the hippocampus is using multiple strategies – ramping patterns in addition to the place-based coding – to encode elapsed time and distance."
When the researchers disrupted these patterns by manipulating the circuits that produce them, the animals had difficulty performing the task accurately and often searched for the reward in the wrong location.
Dr. Wang notes that "understanding how time and distance are encoded in the brain during path integration is especially important because this ability is one of the earliest to degrade in Alzheimer's disease. Patients report early symptoms of getting spatially disoriented in familiar surroundings or not knowing how they got to a particular place."
The research team is now turning its efforts to understanding how these patterns are generated in the brain, which may help reveal how our moment-to-moment experiences are encoded into memories.
Journal Reference: Heldman, R., Pang, D., Zhao, X. et al. Time or distance encoding by hippocampal neurons via heterogeneous ramping rates. Nat Commun 16, 11083 (2025). https://doi.org/10.1038/s41467-025-67038-3
Security researchers have found a range of security-relevant flaws in GnuPG and similar programs. Many of the vulnerabilities are (still) unfixed.
At the 39th Chaos Communication Congress, security researchers Lexi Groves, aka 49016, and Liam Wachter demonstrated a whole series of vulnerabilities in various tools for encrypting and signing data. In total, the researchers found 14 vulnerabilities in four different programs. All discovered problems are implementation errors, meaning they do not affect the fundamental security of the methods used, but rather their concrete – and indeed flawed – implementation in the respective tool.
The focus of the presentation was the popular PGP implementation GnuPG, whose code is generally considered mature and well-established. Nevertheless, the security researchers found numerous vulnerabilities, including classic C-string handling errors triggered by injected null bytes. Among other things, this allowed signatures to be falsely displayed as valid, or text to be prepended to signed data without the signature catching or exposing the modification.
The issues found in GnuPG cover a broad spectrum of causes: attackers could exploit outright buggy code or provoke misleading output that tempts users into fatal actions. Furthermore, they could inject ANSI escape sequences that, while correctly processed by GnuPG, lead to virtually arbitrary output in the victim's terminal. The latter can be exploited to give users malicious instructions that only appear to come from GnuPG, or to overwrite legitimate security queries from GnuPG with harmless-looking follow-up questions, causing users to unintentionally approve dangerous actions.
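To illustrate this class of attack (a generic demonstration, not the researchers' proof of concept), a few lines of Python show how escape sequences embedded in attacker-controlled data can overwrite a warning the user has already been shown:

```python
# Generic ANSI-injection demo: run in a real terminal. The second print
# moves the cursor up one line (\x1b[1A), erases that line (\x1b[2K),
# and overwrites the warning with a reassuring fake message.
warning = "gpg: WARNING: signature verification FAILED"
spoof = "\x1b[1A\x1b[2K" + 'gpg: Good signature from "Trusted User"'

print(warning)
print(spoof)  # the warning line above is replaced on screen
```

This is why the researchers recommend treating terminal output from cryptographic tools with suspicion whenever it may contain attacker-controlled data.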
The security researchers found some of the same problem classes in other tools as well, such as the newer PGP implementation Sequoia-PGP and the signature tool Minisign. In the encryption tool age, they discovered a way to execute arbitrary programs present on the victim's computer via the plug-in system. The researchers provide a comprehensive overview of all the issues found on the website gpg.fail.
Many vulnerabilities still open

Some of the vulnerabilities found have been fixed in the current versions of the affected programs, but many have not: partly because patches have been adopted but no new release containing them has shipped yet, and partly because the program authors do not consider the behavior in their tool a problem to be corrected.
The researchers particularly praised the reaction to the vulnerability in age: not only was the error fixed in the various age implementations, but the specification was also updated to prevent the problem. At the hacker congress itself, age developer Filippo Valsorda went a step further: he was in the audience of the presentation and used the Q&A session at the end to thank the researchers for their work. He also presented them with an improvised bug bounty in the form of stickers and pins.
The researchers also provide advice on their website on how to avoid the found errors – from both developer and user perspectives. In general, users should also perceive seemingly harmless error messages as serious warnings and avoid cleartext signatures – as recommended by the GnuPG man page. The researchers also suggest rethinking the use of cryptography tools on the command line in general: due to the mentioned ANSI sequences, users can be misled, even if all tools work without errors.
= Related video and/or audio:
- To sign or not to sign: Practical vulnerabilities in GPG & friends