We have had recent stories and discussion regarding the 16-pin power connector on GPUs, which is causing overheating and, in a small number of cases, actually catching fire. It seems that new adapters are being developed to work around the problem.
The 12VHPWR connector (and its 12V-2x6 successor) is notorious for its vulnerability to high temperatures on power-hungry GPUs, to the point where it can melt. To combat this on the adapter side, third-party manufacturers such as Ezdiy-fab and CableMod have been forced to resort to "exotic" solutions sporting copper PCBs, thermal pads, and aluminum heatsinks to ensure their adapters stay cool.
Ezdiy-fab's 90- and 180-degree adapters take advantage of a 2oz copper PCB strapped to a thermal pad and aluminum heatsink cover. The copper PCB allegedly keeps the voltage impedance low while the thermal pad and heatsink on top of it ensure cool operation. CableMod's latest adapter uses the same design but takes advantage of a copper foil applied to its copper PCBs, in addition to a thermal pad and aluminum heatsink.
All of this additional cooling shows how susceptible the new 16-pin connectors are to overheating. Virtually all current 16-pin adapters we could find (from various third-party makers) incorporate some kind of cooling system. By contrast, you can find angled 8-pin adapters that come with nothing more than a simple plastic shell, contributing almost nothing to cooling the interior components; some 8-pin adapters do include cooling hardware, but the point is that it doesn't seem to be required.
CableMod had to recall its original V1.0 adapters due to temperature problems associated with the connectors loosening unintentionally, a flaw in the original design. Even though the design flaw only affected 1% of units sold, the total amount of property damage was estimated to be over $74,500, thanks in no small part to the sky-high prices of flagship GPUs lately. The cable manufacturer replaced the original version with an updated model that rectified the adapter's previous issue.
Lately, there have been melting concerns regarding the new RTX 50 series, which comes with the revised 12V-2x6 power connector. It has been discovered that using previous-generation 12VHPWR cables with the RTX 5090 can result in melting issues regardless. We saw this when the first recorded RTX 5090 16-pin connector meltdown was posted online by a Reddit user who had used an old third-party 12VHPWR cable with his new GPU. The cable's maker came out with a statement clarifying that only its cables made in 2025 using the newer 12V-2x6 standard support RTX 50 series GPUs. (Reminder: 12V-2x6 is backward compatible with 12VHPWR.)
Initially, it was thought that the melting problem was due solely to improper connector seating, especially with the original 12VHPWR connectors. However, multiple theories have since emerged suggesting that the connector may be doomed to fail. One suggests that the 16-pin standard as a whole is pushed far too close to its physical limits. Another suggests that improper load balancing between the wires, owing to a lack of shunt resistors on RTX 40 and RTX 50 series GPUs, is also causing connectors to fail.
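To put the "physical limits" theory in perspective, here is a rough back-of-the-envelope comparison. (This is only an illustrative sketch: the 600 W and 150 W figures are the connectors' published ratings, but the ~9.5 A per-terminal limit in the comment is a commonly cited approximation, not a number from the article.)

```python
# Back-of-the-envelope per-pin current comparison (illustrative only).
# A 16-pin 12VHPWR/12V-2x6 connector is rated for up to 600 W over six 12 V pins;
# a classic 8-pin PCIe power connector is rated for 150 W over three 12 V pins.

def amps_per_pin(watts: float, volts: float, power_pins: int) -> float:
    """Current per 12 V pin, assuming a perfectly balanced load."""
    return watts / volts / power_pins

print(f"16-pin @ 600 W: {amps_per_pin(600, 12, 6):.1f} A per pin")  # ~8.3 A
print(f"8-pin  @ 150 W: {amps_per_pin(150, 12, 3):.1f} A per pin")  # ~4.2 A

# With terminals commonly rated around 9.5 A, the 16-pin design leaves far less
# headroom than the 8-pin one, and any load imbalance can push individual pins
# past their limit.
```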
Regardless of where exactly the problem is, it's clear that the new 16-pin connector standard is far less robust than its 8-pin and 6-pin predecessors. Maybe at some point, Nvidia and the PCI SIG committee will make an entirely new connector with a new design. But for now, those "lucky" enough to snag a high-end Nvidia GPU will have to live with the 16-pin connector, flaws and all.
Arthur T Knackerbracket has processed the following story:
In an unexpected turn of events, Justin Hotard, the executive vice president and general manager of the Data Center and AI Group (DCAI) at Intel, left the company to become chief executive of Nokia. Intel has appointed an internal head for its datacenter and AI unit and will start searching for a new permanent general manager immediately.
"We have a strong DCAI team that will continue to advance our priorities in service to our customers," a statement by Intel reads. "Karin Eibschitz Segal has been appointed interim head of the DCAI business and is an accomplished executive with nearly two decades of Intel leadership experience spanning products, systems and infrastructure roles. We are grateful for Justin Hotard's contributions and wish him the best in his new role."
Justin Hotard joined Intel from HPE in early 2024. His tenure was arguably a mixed bag, though much of what he oversaw was more or less in place before he arrived. Intel successfully launched its Xeon 6 'Granite Rapids' and 'Sierra Forest' CPUs for servers, but sales of its Gaudi 3 processors for AI missed the company's own rather modest expectations. In addition, the company had to cancel Falcon Shores as a product and delay its Clearwater Forest datacenter CPU by at least a quarter.
Justin Hotard has over 25 years of experience working at major technology companies. Before joining Intel, he held leadership positions at Hewlett Packard Enterprise and NCR Corporation. His background includes expertise in AI and datacenter markets, which are said to be critical areas for Nokia's future.
"I am delighted to welcome Justin to Nokia," said Sari Baldauf, Chair of Nokia’s Board of Directors. "He has a strong track record of accelerating growth in technology companies along with vast expertise in AI and datacenter markets, which are critical areas for Nokia's future growth. In his previous positions, and throughout the selection process, he has demonstrated the strategic insight, vision, leadership and value creation mindset required for a CEO of Nokia."
Nokia's current CEO Pekka Lundmark will step down on March 31, 2025, and Justin Hotard will take over the role starting April 1, 2025. Lundmark will stay on as an advisor until the end of the year. Hotard will be based in Espoo, Finland, where Nokia’s headquarters are located.
Lundmark has led Nokia since 2020, a period marked by significant challenges. Under his leadership, the company strengthened its position in 5G technology, cloud-based network infrastructure, and patent licensing. With this leadership change, Nokia aims to continue its transformation, focusing on AI, datacenters, and next-generation connectivity.
"I am honored by the opportunity to lead Nokia, a global leader in connectivity with a unique heritage in technology," said Justin Hotard. "Networks are the backbone that power society and businesses, and enable generational technology shifts like the one we are currently experiencing in AI. I am excited to get started and look forward to continuing Nokia's transformation journey to maximize its potential for growth and value creation."
Justin Hotard leaves a couple of months after Pat Gelsinger, chief executive of Intel, was ousted by the board of directors. As a result, Intel now does not have a permanent CEO or a permanent head of its key DCAI unit.
Don't worry - this will be a relatively short Meta, and it is not to explain another site outage!
Community Vote on Site Documentation
In December 2024 I released a Meta which detailed the proposed documentation for the site under the Soylent Phoenix board. This is a legal requirement resulting from the creation of a new company. I repeated the links to the documentation in January. The next step is for the community to accept or reject the proposed documentation. The previous voting software is no longer available to us but I believe that a straightforward count of comments will suffice.
I will publish another Meta which will contain the links to the proposed documentation, but it is not to be used for any discussion regarding the contents. Each current account in good standing (i.e. having a karma of 20+ and created on or before the publication of the December Meta on 16 Dec 2024 - that is, up to and including account #49487) will be eligible to vote. In order to cast your vote, your comment should be limited to a single word - "Yes" or "No" (upper or lower case is acceptable) - on a line all by itself. "Yes" will indicate your acceptance of the documentation and "No" will indicate your rejection of it. Your last comment, out of a maximum of 2 attempts, will be the one that counts, so you will have the opportunity to change your vote. Any more than 2 attempts from an account to cast a vote will be discarded. Comments may contain a single paragraph to overcome the 'lame comment' filter; the contents of the paragraph will be ignored. The vote will remain open for 1 week and will close at 23:59 (UTC) on 28 February 2025. The result will be made public once the Board are satisfied that the voting has been fair and democratic.
Existing votes will remain valid and do not have to be redone.
Entering into a discussion in the vote or justifying why you have voted in a particular fashion will nullify your comment. There has been a period of over 2 months for discussion and suggested changes.
It is important that you cast a vote. As an extreme example, if 1 person alone votes Yes and 2 people vote No then the documentation will NOT be accepted. Not casting a vote doesn't make any statement whatsoever but may result in the majority of true community opinion being ignored.
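For those curious how such a count might be performed, here is a minimal sketch of a tally that follows the rules above. (The input format, field names, and the karma/account-number checks are illustrative assumptions for this sketch, not the site's actual tooling.)

```python
# Illustrative tally of a Yes/No comment vote under the rules described above.
# Assumed input: comments as (account_id, karma, comment_text), in posting order.

ELIGIBLE_MAX_ACCOUNT = 49487   # accounts created on or before 16 Dec 2024
ELIGIBLE_MIN_KARMA = 20

def tally(comments):
    attempts = {}                                  # account_id -> list of votes cast
    for account_id, karma, text in comments:
        if account_id > ELIGIBLE_MAX_ACCOUNT or karma < ELIGIBLE_MIN_KARMA:
            continue                               # not eligible to vote
        # A vote is the single word "Yes" or "No" on a line by itself.
        votes = [line.strip().lower() for line in text.splitlines()
                 if line.strip().lower() in ("yes", "no")]
        if votes:
            attempts.setdefault(account_id, []).append(votes[0])
    result = {"yes": 0, "no": 0}
    for votes in attempts.values():
        if len(votes) > 2:                         # more than 2 attempts: discarded
            continue
        result[votes[-1]] += 1                     # the last attempt is the one counted
    return result

print(tally([(1234, 50, "Yes"), (1234, 50, "No"), (45000, 30, "yes")]))
# {'yes': 1, 'no': 1}  -- account 1234's second attempt overrides the first
```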
Essentially, the documentation is the same as that adopted in 2014 except it has been rewritten where necessary to clarify the meaning or intent. It also incorporates in one location changes to the rules that have been accepted by the community since 2014 (e.g. the definition of Spam which was adopted by the site in 2021).
An interesting thought experiment ...
Imagine a supervillain attacking you with his unique superpower of creating small black holes. An invisible force zips through your body at unimaginable speed. You feel no push, no heat, yet, deep inside your body, atoms momentarily shift in response to the gravitational pull of something tiny yet immensely dense — a Primordial Black Hole (PBH).
What would this do to you? Would it cause minor, localized damage, or would it simply rip through your entire body? Physicist Robert J. Scherrer from Vanderbilt University investigated this very scenario. His study examines what happens when a tiny black hole, like the ones formed in the early universe, passes through the human body.
[...] While the idea of a tiny black hole silently piercing through your body is an intriguing thought experiment, the actual probability of it happening is close to zero. And even if one did, it would have to be exceptionally massive (by microscopic standards) to cause harm.
[Journal Ref]: https://arxiv.org/pdf/2502.09734
[Source]: ZME Science
https://phys.org/news/2025-02-protein-emergence-spoken-language.html
The origins of human language remain mysterious. Are we the only animals truly capable of complex speech? Are Homo sapiens the only hominids who could give detailed directions to a far-off freshwater source or describe the nuanced purples and reds of a dramatic sunset?
Close relatives of ours such as the Neanderthals likely had anatomical features in the throat and ears that could have enabled the speaking and hearing of spoken language, and they share with us a variant of a gene linked to the ability to speak. And yet it is only in modern humans that we find expanded brain regions that are critical for language production and comprehension.
Now researchers from The Rockefeller University have unearthed intriguing genetic evidence: a protein variant found only in humans that may have helped shape the emergence of spoken language.
In a study published in Nature Communications, researchers in the lab of Rockefeller researcher Robert B. Darnell discovered that when they put this exclusively human variant of NOVA1—an RNA-binding protein in the brain known to be crucial to neural development—into mice, it altered their vocalizations as they called to each other.
The study also confirmed that the variant is not found in either Neanderthals or Denisovans, archaic humans that our ancestors interbred with, as is evidenced by their genetic traces that remain in many human genomes today.
"This gene is part of a sweeping evolutionary change in early modern humans and hints at potential ancient origins of spoken language," says Darnell, head of the Laboratory of Molecular Neuro-Oncology. "NOVA1 may be a bona fide human 'language gene,' though certainly it's only one of many human-specific genetic changes."
Anatomical adaptations of the vocal tract and intricate neural networks enable our language capabilities. But the genetics behind them isn't well understood.
One theorized genetic language driver is FOXP2, which codes for a transcription factor involved in early brain development. People with mutations in this gene exhibit severe speech defects, including the inability to coordinate lip and mouth movements with sound.
Humans have two amino acid substitutions in FOXP2 that aren't found in other primates or mammals—but Neanderthals had them too, suggesting that the variant arose in an ancestor of both human lineages. But some findings on FOXP2 have been disputed, and its role in human language development remains unclear.
Now NOVA1 has arisen as a candidate. The gene produces a neuron-specific RNA binding protein key to brain development and neuromuscular control that was first cloned and characterized by Darnell in 1993. It's found in virtually identical form across a wide swath of the biosphere, from mammals to birds—but not in humans.
Instead, humans have a unique form characterized by a single change of an amino acid, from isoleucine to valine, at position 197 (I197V) in the protein chain.
Journal Reference: Tajima, Y., Vargas, C.D.M., Ito, K. et al. A humanized NOVA1 splicing factor alters mouse vocal communications. Nat Commun 16, 1542 (2025). https://doi.org/10.1038/s41467-025-56579-2
It started with a bizarre burning sensation in her feet. Over the next two days, the searing pain crept up her legs. Any light touch made it worse, and over-the-counter pain medicine offered no relief.
On the third day, the 30-year-old, otherwise healthy woman from New England went to an emergency department. Her exam was normal. Her blood tests and kidney function were normal. The only thing that stood out was a high number of eosinophils—white blood cells that become active with certain allergic diseases, parasitic infections, or other medical conditions, such as cancer. The woman was discharged and advised to follow up with her primary care doctor.
[...]
At home again, with little relief, a family member gave her a prescription sleep aid to help her get some rest. The next day, she awoke confused, saying she needed to pack for a vacation and couldn't be reasoned with to return to bed.
[...]
In a case report published in the New England Journal of Medicine, doctors explain how they figured out the source of her fiery symptoms—worms burrowing into her brain. By this point, she was alert but disoriented and restless. She couldn't answer questions consistently or follow commands.
[...]
Blood smear tests showed no evidence of parasites, and a computed tomography (CT) scan of her head showed no acute intracranial abnormalities. But, the results of a spinal tap showed a clear problem: her cerebrospinal fluid showed a count of 694 white blood cells per microliter. The reference range was 0 to 5.
[...]
They went through them one by one, crossing things off the list that didn't quite fit with everything they knew of her case. They ended with angiostrongyliasis, caused by the nematode (roundworm) Angiostrongylus cantonensis, also known as rat lungworm.
[...] Humans crash this process by accidentally eating the L3 larvae. This can happen if they eat undercooked snails or slugs, or undercooked creatures that eat slugs or snails, such as land crabs, freshwater prawns, or frogs. The more troubling route is eating raw vegetables or fruits that are contaminated by snails or slugs. This is possible because the L3 larvae are present in mollusk slime. For instance, if a slug or snail traverses a leaf of lettuce, leaving a slime trail in its wake, the leaf can be contaminated with the larvae. The authors of the case study note that "the infectious dose of slime is not defined."
[...]
This nauseating roundworm is a known plague in Hawaii. In fact, it gained attention in recent years after sparking small outbreaks in the state. In 2017, there were 19 confirmed cases, but case totals in each of the years since have remained below 10.
[...]
In this case, the patient and her doctors decided to use a 14-day combination of the immunosuppressive steroid prednisone and the anti-parasitic drug albendazole. Fortunately, the woman's symptoms cleared with the treatment, and she was discharged from the hospital after six days.
Arthur T Knackerbracket has processed the following story:
It has been nearly a decade since famed cryptographer and privacy expert Bruce Schneier released the book Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World - an examination of how government agencies and tech giants exploit personal data. Today, his predictions feel eerily accurate.
At stake, he argued then, was a possibly irreversible loss of privacy, and the archiving of everything. As he wrote, science fiction author Charlie Stross described the situation as the "end of prehistory," in that every facet of our lives would be on a computer somewhere and available to anyone who knew how to find them.
Since the book was published, we've seen data harvesting continue, particularly for training AI models. The battle to keep even the most basic facts about us private seems all but lost.
We sat down with Bruce Schneier for an update on his work, and what we can expect in the future.
The Register: Data and Goliath came out nearly two years after Snowden's leaks and just months before Congress finally made a few moves on the surveillance issue with the USA Freedom Act. Ten years on, how do you feel things have changed, if at all?
At the same time, the information environment has gotten worse. More of our data is in the cloud, where companies have easier access to it. We have more Internet-of-Things devices around ourselves, which keep us under constant surveillance. And every one of us carries an incredibly sophisticated surveillance device around with us wherever we go: our smartphones. Everywhere you turn, privacy is losing.
[...]
The Register: If the mass privatization of the government that's looking likely happens, what are the implications of all that data being leased out to the private sector?
And by security, I mean two things. Obviously, there's the possibility that the data will be stolen and used by foreign governments and corporations. And there is the high probability that it will end up in the hands of data brokers, and then bought and sold and combined with other data.
Surveillance in the US is largely a corporate business; this will just make it worse.
Arthur T Knackerbracket has processed the following story:
Data storage has always depended on systems that toggle between "on" and "off" states. However, the physical size of the components storing these binary states has traditionally limited how much information can be packed into a device.
Now, researchers at the University of Chicago's Pritzker School of Molecular Engineering have developed a way to overcome this constraint. They've successfully demonstrated how missing atoms within a crystal structure can be used to store terabytes of data in a space no larger than a millimeter across.
"We found a way to integrate solid-state physics applied to radiation dosimetry with a research group that works strongly in quantum, although our work is not exactly quantum," said first author Leonardo França, a postdoctoral researcher in Zhong's lab.
Their study, published in Nanophotonics, explores how atomic-scale crystal defects can function as individual memory cells, merging quantum methodologies with classical computing principles.
Led by assistant professor Tian Zhong, the research team developed this novel storage method by introducing rare-earth ions into a crystal. Specifically, they incorporated praseodymium ions into a yttrium oxide crystal, though they suggest the approach could extend to other materials due to rare-earth elements' versatile optical properties.
The memory system is activated by a simple ultraviolet laser, which energizes the rare-earth ions, causing them to release electrons. These electrons then become trapped in the crystal's natural defects. By controlling the charge state of these gaps, the researchers effectively created a binary system, where a charged defect represents a "one" and an uncharged defect represents a "zero."
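As a purely conceptual illustration of that one/zero scheme (the class and its operations below are invented for this sketch and do not model the actual optics or physics), each defect site can be thought of as a one-bit cell:

```python
# Toy model: each crystal defect stores one bit -- an electron trapped at the
# defect ("charged") is a 1, an empty defect ("uncharged") is a 0. Illustrative only.

class DefectMemory:
    def __init__(self, num_defects: int):
        self.charged = [False] * num_defects       # all traps start empty

    def write(self, address: int, bit: int) -> None:
        # In the real device, a UV laser frees electrons from rare-earth ions,
        # which then are (or are not) trapped at the targeted defect.
        self.charged[address] = bool(bit)

    def read(self, address: int) -> int:
        # Read-out in the real device is optical; here it is just a lookup.
        return int(self.charged[address])

mem = DefectMemory(8)
for i, bit in enumerate([1, 0, 1, 1, 0, 0, 1, 0]):
    mem.write(i, bit)
print([mem.read(i) for i in range(8)])             # [1, 0, 1, 1, 0, 0, 1, 0]
```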
[...] The researchers believe this breakthrough could redefine data storage limits, paving the way for ultra-compact, high-capacity storage solutions in classical computing.
Journal Reference: França, Leonardo V. S., Doshi, Shaan, Zhang, Haitao and Zhong, Tian. "All-optical control of charge-trapping defects in rare-earth doped oxides" Nanophotonics, 2025. https://doi.org/10.1515/nanoph-2024-0635
Are noise-cancelling headphones to blame for young people's hearing problems? They are not going deaf, but their brains are having difficulty processing sounds: constant headphone use filters out the surrounding auditory reality, so the brain never gets trained to sort meaningful sounds from background noise.
... auditory processing disorder (APD), a neurological condition where the brain finds it difficult to understand sounds and spoken words.
Her audiologist and others in England are now calling for more research into whether the condition is linked to overuse of noise-cancelling headphones.
Five NHS audiology departments have told the BBC that there has been an increase in the number of young people referred to them from GPs with hearing issues - only to find their hearing is normal when tested and it is their ability to process sound that is struggling.
Noise-cancelling headphones do have their benefits, particularly for long-term ear health where their soundproofing feature can prevent high frequency and loud noise from reaching and damaging the ear - even while listening to music.
... by blocking everyday sounds such as cars beeping, there is a possibility the brain can "forget" to filter out the noise.
"You have almost created this false environment by wearing those headphones of only listening to what you want to listen to. You are not having to work at it,"
https://www.bbc.com/news/articles/cgkjvr7x5x6o
Scientists Just Discovered 'Quipu,' the New Largest Structure in Our Cosmos:
Humanity's growing understanding of the universe can be best described as a "Copernican journey"—the centuries-long discovery that we are far from the center of all things. Earth, for example, orbits around the Sun (thanks for that one, Copernicus). But it's also just one solar system among billions in the Milky Way, which in turn is part of the Virgo Supercluster and the even larger Laniakea Supercluster—one of the largest objects in the universe, at around 520 million light-years across.
However, even Laniakea isn't the largest structure in the known universe. In 2003, scientists discovered the Sloan Great Wall (SGW), believed to stretch beyond 1 billion light-years. But now, in a study published on the preprint server arXiv (and accepted for publication in the journal Astronomy and Astrophysics), scientists assert their belief that there's a structure even larger than this celestial behemoth.
Its name is Quipu, and astronomers estimate that its massive bulk stretches some 1.39 billion light-years across. According to Princeton astronomer J. Richard Gott III, who helped discover the SGW and who spoke with New Scientist, Quipu "end to end, is slightly longer" than SGW. The researchers also estimate that Quipu contains the equivalent mass of 200 quadrillion Suns.
"For a precise determination of cosmological parameters we need to understand the effects of the local large-scale structure of the Universe on the measurements," the authors wrote. "Characterizing these superstructures is also important for astrophysical research, for example the study of the environmental dependence of galaxy evolution as well as for precision tests of cosmological models."
The name Quipu—a reference to the textile-based recording devices used by several ancient cultures in the central Andes—is both catchy and descriptive. The authors note that one particular view gives "the best impression of the superstructure as a long filament with small side filaments, which initiated the naming of Quipu."
The team analyzed Quipu, along with four other superstructures, using data from the German Aerospace Center-led ROSAT X-ray satellite and the team's Cosmic Large-Scale Structure in X-rays (CLASSIX) Cluster Survey. They found that these structures together contain roughly 45 percent of all galaxy clusters, 30 percent of all galaxies, and 25 percent of matter in the observable universe. However, even larger structures might still exist. The Hercules-Corona Borealis Great Wall, located further afield than Quipu, has been estimated to stretch 10 billion light-years long (though its true size is still up for debate).
Understanding Quipu and other superstructures like it is vitally important, as they challenge our current understanding of cosmological evolution, which states that matter should be relatively evenly distributed throughout the universe. These superstructures are so huge that forming them could theoretically take longer than the universe is old.
However, Quipu isn't a fixture of the universe. Despite its immense stature, it too will eventually disappear from the cosmic stage. "In the future cosmic evolution, these superstructures are bound to break up into several collapsing units," the authors wrote. "They are thus transient configurations."
Even cosmic superstructures can't escape the inexorable march of time.
The new CPU could be a piece in the $500 billion Stargate AI project:
Chip designer Arm plans to unveil its own processor this year with Meta as the launch customer, The Financial Times reported. The chip would be a CPU designed for servers in data centers and would have the potential to be customized for clients. Manufacturing would be outsourced to a contract fab plant like TSMC (Taiwan Semiconductor Manufacturing Co.) and the first in-house chip could be revealed as early as this summer, according to the FT's sources.
Last month, Arm parent Softbank announced the Stargate project, a partnership with OpenAI to build up to $500 billion worth of AI infrastructure. Arm, along with Microsoft and NVIDIA, is a key technology partner for the project. Arm's chip could now play a role in that project, and also in Jony Ive's mysterious AI-powered personal device, reportedly being developed in collaboration with OpenAI's Sam Altman, according to the report.
[...] The move would put Arm in direct competition with many of its own customers like NVIDIA, which makes its own Arm-based server CPUs. To date, Arm has never made its own chips — instead, it licenses its technology and patents to major companies like Apple. Those companies then customize the designs for their own needs and use a contract manufacturer like TSMC or Samsung to build the chips.
Arm has begun recruiting from its own customers and competing against them for deals as it pushes toward selling its own chips, according to people familiar with the matter and a document viewed by Reuters.
Arm supplies the crucial intellectual property that firms such as Apple and Nvidia license to create their own central processing units (CPUs). It has also been seeking to expand its profits and revenues through a range of tactics, including considering whether to sell chips of its own.
Arm appears to be ramping up that effort.
The UK-based company has sought to recruit executives from licensees, two sources familiar with the matter told Reuters. And Arm is competing against Qualcomm, one of its largest customers, to sell data center CPUs to Meta Platforms, according to a person familiar with the matter.
The tech provider's moves to build out its own chip business could upend an industry that has long viewed the company as a neutral player rather than a competitor, by forcing companies who rely on Arm technology to consider whether they will end up competing against the firm for business.
https://newatlas.com/environment/indoor-air-pollution-scented-terpenes/
Using scented products indoors changes the chemistry of the air, producing as much air pollution as car exhaust does outside, according to a new study. Researchers say that breathing in these nanosized particles could have serious health implications.
When you hear or see the words 'air pollution,' you most likely think of things like factories and car exhaust. That's pollution that is out there – outside your house. But have you thought about how you're contributing to air pollution inside of where you live by using seemingly innocuous products like scented, non-combustible candles?
New research by Purdue University, the latest in a series of Purdue-led studies, examined how scented products – in this case, flame-free candles – are a significant source of nanosized particles small enough to get deep into your lungs, posing a potential risk to respiratory health.
"A forest is a pristine environment, but if you're using cleaning and aromatherapy products full of chemically manufactured scents to recreate a forest in your home, you're actually creating a tremendous amount of indoor air pollution that you shouldn't be breathing in," said Nusrat Jung, an assistant professor in Purdue's Lyles School of Civil and Construction Engineering and co-corresponding author of the study's.
Scented wax melts are marketed as a flameless, smoke-free, non-toxic alternative to traditional candles, a safer way of making your home or office smell nice. To assess the truth of these claims, the researchers comprehensively measured the nanoparticles formed when they warmed wax melts in their mechanically ventilated test house. The tiny house is actually an architectural engineering laboratory called the Purdue Zero Energy Design Guidance for Engineers (zEDGE) lab. Designed and engineered to test the energy efficiency of a larger building, it's full of sensors that monitor the impact of everyday activities on indoor air quality.
"To understand how airborne particles form indoors, you need to measure the smallest nanoparticles – down to a single nanometer," said Brandon Boor, associate professor in civil engineering at Purdue and the study's other corresponding author. "At this scale, we can observe the earliest stages of new particle formation, where fragrances react with ozone to form tiny molecular clusters."
The researchers knew from their previous research that new nanoparticle formation was initiated by terpenes – aromatic compounds that determine the smell of things like plants and herbs – released from the melts and reacting with indoor atmospheric ozone (O3). They'd found that activities such as mopping the floor with a terpene-rich cleaning agent, using a citrus-scented air freshener, or applying scented personal care products like deodorant inside the zEDGE house resulted in pulsed terpene emissions to the indoor air within five minutes. Conversely, using essential oil diffusers or peeling citrus fruits caused a more gradual increase in terpenes.
In the present study, heating the scented wax contributed significantly to the number of new particles formed in the indoor air, particularly those smaller than 100 nanometers (nm). The resulting atmospheric concentrations were over one million nanoparticles per cubic centimeter (10^6 cm^-3), which is comparable to concentrations emitted by traditional lighted candles (10^6 cm^-3), gas stoves (10^5 – 10^7 cm^-3), diesel engines (10^3 – 10^6 cm^-3), and natural gas engines (10^6 – 10^7 cm^-3). By comparison, there were no significant terpene emissions when unscented wax melts were heated.
https://pubs.acs.org/doi/10.1021/acs.estlett.4c00986
I expect that many noticed that the site went down and, if you are reading this you will also realise that it is now back up.
The entire server died leaving a wake of Out-Of-Memory messages, which resulted in the site itself, IRC and our email all failing. We (and by that I really mean kolie!) have restarted the server and doubled the amount of memory available to it.
Of course, that doesn't tell us why it ran out of memory, although we knew that it was a bit tight, nor what specifically happened today to push it over the edge. That will probably take a while to work out.
It might take us a while to put more stories in the queue but you should be able to comment on many of today's stories that have only just appeared on your screens.
We are sorry for the inconvenience and we are getting back on our feet again. As always, a big THANK YOU to kolie for his efforts.
Record-breaking neutrino is most energetic ever detected:
Highest-energy cosmic neutrino so far: 120 PeV (120×10^15 eV)
Astrophysicists have observed the most energetic neutrino ever. The particle — which probably came from a distant galaxy — was spotted by the Cubic Kilometre Neutrino Telescope (KM3NeT), a collection of light-detecting glass spheres on the floor of the Mediterranean Sea, on 13 February 2023. Researchers monitoring the telescope did not notice the detection until early 2024, when they completed the first analysis of their data. They unveiled it as a potentially record event last year at a conference in Milan, Italy, but did not disclose details such as the timing, direction or energy of the neutrino.
"We had to convince ourselves that it wasn't something strange or weird with the telescope," says Paschal Coyle, a neutrino physicist at Aix-Marseille University in France and KM3NeT spokesperson. The result was published on 12 February in Nature1, and will be described in four preprints due to be posted on the arXiv preprint server.
Neutrinos are electrically neutral particles more than one million times lighter than an electron. They are typically produced in nuclear reactions such as those at the centre of the Sun, from which they emerge with energies on the order of millions of electronvolts (10^6 eV). But for more than 10 years, researchers have been recording neutrinos carrying unprecedented energies of up to several quadrillion electronvolts (10^15 eV, or 1 petaelectronvolt), which are thought to originate in distant galaxies. (The most energetic particle ever detected, at 320,000 PeV, was not a neutrino but a cosmic ray dubbed the Oh-My-God particle.)
KM3NeT consists of strings of sensitive light detectors anchored to the sea floor at a depth of around 3,500 metres off the coast of the Italian island of Sicily, as well as in a second, smaller array near Toulon, France. These sensors pick up light emitted by high-energy, electrically charged particles such as muons. Muons are continuously raining down on Earth's surface, because they are produced when cosmic rays hit air molecules. But occasionally, a cosmic neutrino that smashes into the planet's surface also produces a muon.
In the February 2023 event detected by the Sicily observatory, the team estimated that the muon carried 120 PeV of energy, on the basis of the unusual amount of light it produced. The particle's path was close to horizontal with respect to Earth's surface and travelled eastwards, towards Greece.
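To get a feel for how much energy 120 PeV is in everyday terms, a quick conversion (a sketch using the standard electronvolt-to-joule factor; the macroscopic comparison at the end is just for scale) gives:

```python
# Convert 120 PeV into joules.
EV_TO_JOULE = 1.602176634e-19          # SI definition of the electronvolt

energy_ev = 120e15                     # 120 PeV = 120 x 10^15 eV
energy_j = energy_ev * EV_TO_JOULE
print(f"{energy_j:.2e} J")             # ~1.9e-02 J

# Roughly 0.02 joules: comparable to the kinetic energy of a ping-pong ball
# moving at a few metres per second, carried by a single subatomic particle.
```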
Journal Reference:
The KM3NeT Collaboration. Observation of an ultra-high-energy cosmic neutrino with KM3NeT. Nature 638, 376–382 (2025). https://doi.org/10.1038/s41586-024-08543-1
Arthur T Knackerbracket has processed the following story:
The theme cropped up repeatedly during 2025's State Of Open Conference, with speakers from tech giants and volunteer maintainers laying out the challenges. Much of the open source ecosystem relies on volunteers putting in too many hours for too little support and the cracks are growing.
This week, Hector Martin, the lead of the Asahi Linux project – a Linux distribution for Apple silicon – abruptly quit, citing factors including developer burnout and demanding users.
Jamie Tanna, who gave himself the title of "Tired Maintainer," put it simply: "Being an open source maintainer is really rewarding... except when it isn't."
Tanna has been active in the open source world for several years, although it was his experience as an oapi-codegen maintainer that he spoke about. For the uninitiated, oapi-codegen is a tool to convert OpenAPI specifications to Go code.
"It's used by a load of companies... and a load of angry users."
The story is a familiar one. Tanna had helped out with some issues on the project and volunteered for maintainer duty. There was a flurry of releases, but before long, the time between each release began to lengthen. Being a maintainer, he explained, on big or small projects (but especially big ones) means dealing with "fun" users who are very happy to express their feelings, as well as an ever-increasing list of requests.
The experience of feeling under pressure, isolated, and faced with a growing pile of work, while receiving the occasional unpleasant message from an entitled user demanding that their issue be dealt with now or that a contribution be merged immediately, is far too common.
Tanna is relatively fortunate – his employer gives him four hours a month to work on the project. However, that does not come close to meeting the demands of users and the "How hard can it be?" brigade. Maintainers are undoubtedly under pressure, and many have either quit or are considering doing so.
[...] Vargas used figures including a 2024 Tidelift survey that put a figure of 60 percent on maintainers who had either quit or were considering quitting, and another [PDF] from the Linux Foundation showing that most of the more widely used Free Open Source Software was developed by only a handful of contributors.
[...] Dealing with the problem is difficult. Do maintainers simply need to be paid in recognition of their efforts? Vargas is unsure that everything has a financial solution and noted research (https://dl.acm.org/doi/10.1145/3674805.3686667) presented at this year's FOSDEM. Vargas told The Register, "Money is not going to solve all problems."
"Each maintainer and project has their own context and challenges - while many maintainers would benefit from financial support, others really could use more contributors to complement their work and remove responsibilities from them - especially for non-code tasks like mentorship, community management, issue triage, promotion and fundraising, etc."
Rickard also worried about a potential squeeze on budgets as economic uncertainties bite and talked of raising awareness on platforms such as GitHub around sponsorship, given a contraction in the funding of projects by companies.
"You've got to have something as a catalyst for that change to happen. We, as a group of humans, don't seem to do proactively very well."
Cosgrove said, "I'm afraid it'll take a significant project falling over to convince them [the users] that paying for open source maintainers is worthwhile and, in fact, may actually be a requirement.
"I don't want to see that happen because the fallout will be ugly and gross, but I'm concerned that that's what it'll take."