Four baby planets show how super-Earths and sub-Neptunes form:
Thanks to the discovery of thousands of exoplanets to date, we know that planets bigger than Earth but smaller than Neptune orbit most stars. Oddly, our sun lacks such a planet. That's been a source of frustration for planetary scientists, who can't study them in as much detail as they'd like, leaving one big question: How did these planets form?
Now we know the answer.
An international team of astrophysicists from UCLA and elsewhere has witnessed four baby planets in the V1298 Tau system in the process of becoming super-Earths and sub-Neptunes. The findings are published in the journal Nature.
"I'm reminded of the famous 'Lucy' fossil, one of our hominid ancestors that lived 3 million years ago and was one of the 'missing links' between apes and humans," said UCLA professor of physics and astronomy and second author Erik Petigura. "V1298 Tau is a critical link between the star- and planet-forming nebulae we see all over the sky, and the mature planetary systems that we have now discovered by the thousands."
Planets form when a cloud of gas and dust, called a nebula, contracts under the force of gravity into a young star surrounded by a swirling disk of matter called a protoplanetary disk. Planets coalesce from this disk of gas, but it's a messy process: there are many ways a planet can grow or shrink during its infancy, a period of a few hundred million years. That messiness left major open questions about why so many mature planets end up between the sizes of Earth and Neptune.
The star V1298 Tau is only about 20 million years old compared to our 4.5-billion-year-old sun. Expressed in human terms, it's equivalent to a 5-month-old baby. Four giant, rapidly evolving planets between the sizes of Neptune and Jupiter orbit the star, but unlike growing babies, the new research shows that these planets are contracting in size and are steadily losing their atmospheres. Petigura and co-author Trevor David at the Flatiron Institute led the team that first discovered the planets in 2019.
"What's so exciting is that we're seeing a preview of what will become a very normal planetary system," said John Livingston, the study's lead author from the Astrobiology Center in Tokyo, Japan. "The four planets we studied will likely contract into 'super-Earths' and 'sub-Neptunes'—the most common types of planets in our galaxy, but we've never had such a clear picture of them in their formative years."
[...] Once they sorted out the shapes and timing of the four planets' orbits, the researchers could make sense of how the planets tugged on each other gravitationally, sometimes slowing down and sometimes speeding up, so that transits occurred sometimes early and other times late. These transit timing variations allowed the team to measure the masses of all four planets for the first time, which is akin to weighing them.
The shocking result? Despite being 5 to 10 times the radius of Earth, the planets had masses only 5 to 15 times that of Earth. This means they are very low-density, comparable to Styrofoam, whereas Earth has the density of rock.
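For a rough sense of the arithmetic (a back-of-the-envelope figure of ours, not one from the paper): mean density scales as mass divided by radius cubed, so a planet with about 10 Earth masses spread over 10 Earth radii comes out to

```latex
\rho \approx \rho_\oplus \cdot \frac{M/M_\oplus}{(R/R_\oplus)^3}
     = 5.5~\mathrm{g/cm^3} \times \frac{10}{10^3}
     \approx 0.06~\mathrm{g/cm^3},
```

which is right in the range of expanded polystyrene, hence the Styrofoam comparison.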
"The unusually large radii of young planets led to the hypothesis that they have very low densities, but this had never been measured," said Trevor David, a co-author from the Flatiron Institute who led the initial discovery of the system in 2019. "By weighing these planets for the first time, we have provided the first observational proof. They are indeed exceptionally 'puffy,' which gives us a crucial, long-awaited benchmark for theories of planet evolution."
"Our measurements reveal they are incredibly lightweight — some of the least dense planets ever found. It's a critical step that turns a long-standing theory about how planets mature into an observed reality," said Livingston.
[...] "These planets have already undergone a dramatic transformation, rapidly losing much of their original atmospheres and cooled faster than what we'd expect from standard models," said James Owen, a co-author from Imperial College London who led the theoretical modeling. "But they're still evolving. Over the next few billion years, they will continue to lose their atmosphere and shrink significantly, transforming into the compact systems of super-Earths and sub-Neptunes we see throughout the galaxy."
Journal Reference: Livingston, J.H., Petigura, E.A., David, T.J. et al. A young progenitor for the most common planetary systems in the Galaxy. Nature 649, 310–314 (2026). https://doi.org/10.1038/s41586-025-09840-z
Visual Studio Code extension faces March shutdown with no transition guidance:
Microsoft has abruptly announced the deprecation of Polyglot Notebooks with less than two months' notice, throwing the future of the .NET Interactive project into doubt.
The deprecation will come into effect on March 27, whereupon bug fixes and support will cease, and no new features will be added. However, the extension won't be automatically uninstalled from a user's Visual Studio Code installation.
Polyglot Notebooks is an important element of the Microsoft .NET Interactive project, which Microsoft describes as "an engine and API for running and editing code interactively." .NET Interactive can run as a kernel for notebooks and "enables a polyglot (multi-language) notebook experience," according to Microsoft. "For the best experience when working with multi-language notebooks, we recommend installing the Polyglot Notebooks extension for Visual Studio Code."
That recommendation presumably remains in place until Microsoft pulls the plug.
The deprecation announcement was made in the project's GitHub repository and the thread was locked, limiting conversation. However, users were quick to raise additional issues, questioning the reasoning behind the deprecation and the short time frame.
One pointed out that the Polyglot Notebooks extension in Visual Studio Code was Microsoft's recommendation for data analysts, since Azure Data Studio is retiring at the end of this month. Microsoft's reaction was to remove the recommendation.
It appears the author of the Azure Data Studio retirement documentation was unaware of the impending doom facing the Polyglot Notebooks extension. An individual claiming to be the author posted: "As a result of the deprecation announcement for Polyglot Notebooks, I am legally bound to remove that recommendation from the Azure Data Studio article, because it would mislead customers to keep it in."
Which is true. However, as another user noted: "Removing that documentation from the Azure Data Studio page – and giving no transition path at all for those users (like myself) who depend on those Azure Data Studio features – seems a pretty user-hostile approach. We've already followed Microsoft's transition guidance once and ended up in this situation. Should we now look elsewhere for this functionality?"
The short notice and mixed messaging speak more of dysfunctional management and communication within Microsoft than anything else. If only there were some tool at the company's disposal for Teams to communicate and collaborate.
We'll give the final word to another user reacting to the deprecation announcement, who said: "This is just another dark day for Microsoft customers, and the decision makers are nowhere to be seen taking accountability for the impact of their decisions."
In 2023, the science fiction literary magazine Clarkesworld stopped accepting new submissions because so many were generated by artificial intelligence. Near as the editors could tell, many submitters pasted the magazine's detailed story guidelines into an AI and sent in the results. And they weren't alone:
This is only one example of a ubiquitous trend. A legacy system relied on the difficulty of writing and cognition to limit volume. Generative AI overwhelms the system because the humans on the receiving end can't keep up.
This is happening everywhere. Newspapers are being inundated by AI-generated letters to the editor, as are academic journals. Lawmakers are inundated with AI-generated constituent comments. Courts around the world are flooded with AI-generated filings, particularly by people representing themselves. AI conferences are flooded with AI-generated research papers. Social media is flooded with AI posts. In music, open source software, education, investigative journalism and hiring, it's the same story.
Like Clarkesworld, some of these institutions initially shut down their submissions processes. Others have met the flood of AI inputs with some defensive response, often involving a counteracting use of AI.
[...] These are all arms races: rapid, adversarial iteration to apply a common technology to opposing purposes. Many of these arms races have clearly deleterious effects. Society suffers if the courts are clogged with frivolous, AI-manufactured cases. There is also harm if the established measures of academic performance – publications and citations – accrue to those researchers most willing to fraudulently submit AI-written letters and papers rather than to those whose ideas have the most impact. The fear is that, in the end, fraudulent behavior enabled by AI will undermine systems and institutions that society relies on.
TFA goes on to discuss the upsides of AI, how AI makes fraud easier, and some ideas on balancing harms with benefits. Originally spotted on Schneier on Security.
Dispute erupts between popular web archive and independent blogger:
Archive.today, also known as Archive.is and Archive.ph, has gained notoriety in recent years as a useful tool for archiving web pages and bypassing paywalls. However, the site's CAPTCHA page currently weaponizes visitor traffic in a DDoS campaign against a blogger who attempted to unmask Archive.today's mysterious operator(s). The behavior has prompted Wikipedia editors to debate whether to ban the archive site, which might be living on borrowed time and underpins hundreds of thousands of Wikipedia citations.
Wikipedia relies heavily on Archive.today because it is more effective than conventional alternatives, such as the Internet Archive. However, the properties that have made Archive.today so useful have also drawn the attention of the FBI, likely because the site circumvents the paywalls of numerous prominent media outlets.
In contrast with the Internet Archive, which is legally sanctioned and complies with takedown requests, Archive.today follows no such rules, and its creator remains anonymous. Its advanced scraping methods and free-wheeling nature have turned it into a repository for sources that are likely available nowhere else. If the site were to enter Wikipedia's blacklist, which occurred once from 2013 to 2016, nearly 700,000 citation links would become useless, and many would likely never be repaired.
The discussion arose after Archive.today used its CAPTCHA page to direct DDoS traffic toward blogger Jani Patokallio, who posted an inconclusive investigation into the site's origins in 2023. However, the blog did not draw much attention until 2025, when various outlets cited it while reporting on the FBI's investigation into Archive.today.
The CAPTCHA page currently contains code that drives requests to the search function of Patokallio's blog, meaning that every Wikipedia citation leading to Archive.today could potentially contribute to the DDoS attack. However, Patokallio claims that the attack has caused no real harm. Visiting the page with uBlock Origin installed also seems to neutralize the offending code.
[...] Wikipedia is currently weighing three options to address the issue: retaining the status quo, removing all links, or discouraging future citations while keeping existing links. Some also argue that pivoting away from Archive.today is prudent regardless of the current dispute due to the site's inherently precarious existence. In 2021, Archive.today's creator admitted that it is "doomed to die at any moment."
Another quarter, another gain for AMD:
AMD ended 2025 with fanfare: it increased its market share across all major CPU product segments, according to Mercury Research, and achieved a 29.2% share of all x86 processors shipped in the fourth quarter, an all-time record for the company. AMD now holds its highest-ever unit share across the desktop, laptop, and server CPU markets, captures the most lucrative parts of those markets, and controls 35.4% of x86 CPU revenue.
In the client PC segment, AMD finished 2025 with one of its strongest quarters ever, partly because Intel struggled to get enough client silicon from its own fabs and from TSMC, but largely because of AMD's highly competitive desktop CPUs and a meticulously calculated mobile CPU lineup.
AMD's client CPU unit share rose to 29.2% in Q4 2025, up 3.8% quarter-over-quarter (QoQ) and 4.6% year-over-year (YoY), driven by sales of both desktop and mobile offerings.
Intel remained the clear volume leader with about 70.8% of client CPU shipments, but that is a sharp decline both sequentially and year-over-year. This is not surprising, as Intel had to reassign internal manufacturing capacity to server CPUs instead of client silicon and could not get enough silicon from TSMC.
What is perhaps more alarming for Intel is that its client PC CPU revenue share declined to 68.8%, allowing AMD to control 31.2% of the dollar share of PC processor sales, up 2.9% QoQ and 7.4% YoY. This reflects AMD's higher average selling prices (ASPs), stronger sales of premium desktop and notebook processors, and continued gains in higher-margin segments.
Intel admits that it is hard to compete against AMD with its current lineup and hopes that things will begin to change in late 2026 – 2027, which means that AMD will likely continue to enjoy eating Intel's lunch in the coming quarters.
On Tuesday night, the Federal Aviation Administration closed airspace up to 18,000 feet above the El Paso International Airport in Texas, saying the restrictions would be in place for 10 days. Then, less than 10 hours later, the federal agency reopened the airspace, allowing planes to land and take off at the busy airport.
About an hour after lifting the restrictions, US Secretary of Transportation Sean Duffy, whose responsibilities include overseeing the FAA, explained the unexpected closure by saying, "The FAA and DOW acted swiftly to address a cartel drone incursion."
[...]
Not everyone agrees with Duffy's account. Based upon reporting from The New York Times and other publications, the military has been developing high-energy lasers to bring down drones.
[...]
The FAA had not resolved all of its concerns about airplane safety from the tests. Despite these apparently lingering concerns from the FAA, the military went ahead with a test earlier this week against what was thought to be a drone. The object was a party balloon.
[...]
One of the many lessons from the war in Ukraine, which has rapidly pushed forward drone technology in contested environments, is that it is not practical to shoot down drones with conventional missiles. So it is understandable that the US military is looking at alternatives. This all culminated in some sort of snafu between the FAA and military officials regarding coordination with this week's test.
[...]
action was taken without consulting local or state officials in Texas—who are understandably outraged
[...]
"I want to be very, very clear that this should've never happened," El Paso Mayor Renard Johnson said during a news conference on Wednesday. "That failure to communicate is unacceptable."
Relevant video from a commenter on the original article: 99 Luftballons [3:57 Ed]
https://nand2mario.github.io/posts/2026/80386_barrel_shifter/
I'm currently building an 80386-compatible core in SystemVerilog, driven by the original Intel microcode extracted from real 386 silicon. Real mode is now operational in simulation, with more than 10,000 single-instruction test cases passing, and work on protected-mode features is in progress. Along the way, I've examined many corners of the 386 microcode and silicon in detail; this series documents those findings.
In the previous post, we looked at multiplication and division -- iterative algorithms that process one bit per cycle. Shifts and rotates are a different story: the 386 has a dedicated barrel shifter that completes an arbitrary multi-bit shift in a single cycle. What's interesting is how the microcode makes one piece of hardware serve all shift and rotate variants -- and how the complex rotate-through-carry instructions are handled.
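To make the sharing idea concrete, here is a minimal Python sketch, assuming a 64-bit "extract a 32-bit window" primitive as the shared hardware. It illustrates the general technique, not the actual 386 datapath or its microcode:

```python
# A single 64-bit barrel-shift primitive serving several shift/rotate
# variants, selected only by what feeds each half and the shift amount.

MASK32 = 0xFFFFFFFF

def barrel(hi, lo, amount):
    """Extract the 32-bit window of the 64-bit value hi:lo starting
    `amount` bits above the bottom (amount in 0..31)."""
    combined = ((hi & MASK32) << 32) | (lo & MASK32)
    return (combined >> amount) & MASK32

def shl(x, n):  # shift left: window into x:0 at offset 32-n
    return barrel(x, 0, 32 - n) if n else x

def shr(x, n):  # logical shift right: window into 0:x at offset n
    return barrel(0, x, n)

def rol(x, n):  # rotate left: feed the operand into both halves
    return barrel(x, x, 32 - n) if n else x

def ror(x, n):  # rotate right: same hardware, different offset
    return barrel(x, x, n)

# Quick self-checks (shift counts assumed pre-masked to 0..31, as on x86).
assert shl(0x00000003, 4) == 0x00000030
assert shr(0x80000000, 31) == 0x00000001
assert rol(0x80000001, 1) == 0x00000003
assert ror(0x80000001, 1) == 0xC0000000
```

Rotate-through-carry (RCL/RCR) doesn't reduce quite so cleanly, since the carry flag effectively turns it into a 33-bit rotate, which is part of what the post digs into.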
https://www.theregister.com/2026/02/09/taiwan_us_chip_production/
Taiwan's vice-premier has ruled out relocating 40 percent of the country's semiconductor production to the US, calling the Trump administration's goal "impossible."
In an interview broadcast on the CTS channel, vice premier Cheng Li-chiun said she made clear to US officials that Taiwan's semiconductor ecosystem cannot be moved and its most advanced technologies will remain domestic.
"When it comes to 40 or 50 percent of production capacity being moved to the United States... I have made it very clear to the US side that this is impossible," she said, according to The Straits Times.
Cheng led Taiwan's trade delegation to Washington in January, which secured reduced US tariffs on Taiwanese goods - from 20 percent to 15 percent - in exchange for increased investment into America's tech sector.
At the time, US commerce secretary Howard Lutnick told CNBC the deal aimed to relocate 40 percent of Taiwan's entire chip manufacturing and production capacity to America.
A Department of Commerce release cast the agreement as a "massive reshoring of America's semiconductor sector."
Taiwan, which produces more than 60 percent of global semiconductors and roughly 90 percent of the world's most advanced chips, insists it gained this leadership position by investing in the tech when other countries didn't.
Former Intel chief Pat Gelsinger supports this view, publicly stating a couple of years ago that countries like Korea, Taiwan, and China put in place long-term industrial policies and investment in chipmaking, while the US and European nations failed to do the same.
Cheng reiterated this in her interview, saying that "an industrial ecosystem built up over decades cannot be relocated."
Taiwan views its semiconductor dominance as strategic defense against Chinese aggression. Beijing claims Taiwan as its territory and threatens reunification by force if necessary. Even Lutnick acknowledged this "silicon shield" dynamic last year, noting China's open ambitions:
"We need their silicon, the chips so badly that we'll shield them, we'll protect them."
TSMC considered relocating its chip fabs in 2024 because of threats from China, but decided against the idea given the difficulties involved.
Any Chinese invasion would devastate the global tech sector, as The Register pointed out recently. Most of Nvidia's GPUs are made in Taiwan, as are AMD's processors and Qualcomm's smartphone chips. The supply of these would be cut off by any invasion, and there is no other source these companies can easily turn to.
Last year, a team of scientists presented evidence that spruce trees in Italy's Dolomite mountains synchronized their bioelectrical activity in anticipation of a partial solar eclipse—a potentially exciting new insight into the complexities of plant communication. The findings naturally generated media interest and even inspired a documentary. But the claims drew sharp criticism from other researchers in the field, with some questioning whether the paper should even have been published. Those initial misgivings are outlined in more detail in a new critique published in the journal Trends in Plant Science.
For the original paper, Alessandro Chiolerio, a physicist at the Italian Institute of Technology, collaborated with plant ecologist Monica Gagliano of Southern Cross University and several others conducting field work in the Costa Bocche forest in the Dolomites. They essentially created an EKG for trees, attaching electrodes to three spruce trees (ranging in age from 20 to 70 years) and five tree stumps in the forest.
Those sensors recorded a marked increase in bioelectrical activity during a partial solar eclipse on October 22, 2022. The activity peaked mid-eclipse and faded away in its aftermath. Chiolerio et al. interpreted this spike in activity as a coordinated response among the trees to the darkened conditions brought on by the eclipse. And older trees' electrical activity spiked earlier and more strongly than the younger trees, which Chiolerio et al. felt was suggestive of trees developing response mechanisms—a kind of memory captured in associated gravitational effects. Older trees might even transmit this knowledge to younger trees, the authors suggested, based on the detection of bioelectrical waves traveling between the trees.
Soon, other plant scientists weighed in, expressing strong skepticism and citing the study's small sample size and large number of variables, among other concerns. Justine Karst, a forest ecologist at the University of Alberta in Canada, unfavorably compared Chiolerio et al.'s findings to a 2019 study claiming evidence for the controversial "wood-wide web" concept, in which trees communicate and share resources via underground networks of mycorrhizal fungi. Karst co-authored a 2023 study that found insufficient evidence for the wood-wide web.
Ariel Novoplansky, an evolutionary ecologist at Ben-Gurion University of the Negev in Israel, was among those who objected to the study's publication—so much so that he co-authored the new critique with his Ben-Gurion colleague Hezi Yizhaq. He thinks it's far more likely that the spikes in bioelectrical activity were due to temperature shifts or lightning strikes.
"My serious doubts had arisen from the very basic premise regarding the adaptive rationale the entire study hinged upon—namely, that those trees would be functionally affected by such a minor 'passing cloud' effects of such a (very) partial eclipse [with] a mere 10.5 percent reduction in sunlight for two hours," Novoplansky told Ars. "I then thought about the possibility that thunderstorms might be involved in the heightened 'anticipatory' electrical activity of the trees, and it rolled from there."
[...] "This field of plant behavior/communication is rampant with poorly designed 'studies' that are then twisted into a narrative that promotes personal worldviews and/or enhances personal celebrity," said James Cahill, a plant ecologist at the University of Alberta in Calgary, Canada, who voiced objections when the original paper was published and is cited in Novoplansky's acknowledgements. "The textbook example of this is the [Suzanne] Simard 'mother tree' debacle. Ariel is trying to get the science back on track, as are many of us."
[...] "He puts forward logical alternative hypotheses," said Cahill of Novoplansky's critique. "The original work should have tested among a number of different hypotheses rather than focusing on a single interpretation. This is in part what makes it pseudoscience and promoting a worldview."
[...] Chiolerio and Gagliano stand by their research, saying they have always acknowledged the preliminary nature of their results. "We measured [weather-related elements like] temperature, relative humidity, rainfall and daily solar radiation," Chiolerio told Ars. "None of them shows strong correlation with the transients of the electrome during the eclipse. We did not measure environmental electric fields, though; therefore, I cannot exclude effects induced by nearby lightnings. We did not have gravitational probes, did not check neutrinos, cosmic rays, magnetic fields, etc."
"I'm not going to debate an unpublished critique in the media, but I can clarify our position," Gagliano told Ars. "Our [2025] paper reports an empirical electrophysiological/synchrony pattern in the eclipse window, including changes beginning prior to maximum occultation, and we discussed candidate cues explicitly as hypotheses rather than demonstrated causes. Describing weather/lightning as 'more parsimonious' is not evidence of cause. Regional lightning strike counts and other proxies can motivate a competing hypothesis, but they do not establish causal attribution at the recording site without site-resolved, time-aligned field measurements. Without those measurements, the lightning/weather account remains a hypothesis among other possibilities rather than a supported or default explanation for the signals we recorded."
Journal Reference:
• Alessandro Chiolerio, Monica Gagliano, Silvio Pilia, et al.; Bioelectrical synchronization of Picea abies during a solar eclipse. R Soc Open Sci. 1 April 2025; 12 (4): 241786. https://doi.org/10.1098/rsos.241786
• Novoplansky, Ariel et al. Eclipse of reason: debunking speculative anticipatory behavior in trees. Trends in Plant Science (in press).
Previously: Do Trees Really 'Talk' to Each Other Through Underground Fungal Networks?
Like most cloud-enabled home security cameras, Google's Nest products don't provide long-term storage unless you pay a monthly fee. That video may not vanish into the digital aether right on time, though. Investigators involved with the high-profile abduction of Nancy Guthrie have released video from Guthrie's Nest doorbell camera—video that was believed to have been deleted because Guthrie wasn't paying for the service.
[...]
If you don't pay anything, Google only saves three hours of event history. After that, the videos are deleted, at least as far as the user is concerned.
[...]
Expired videos are no longer available to the user, and Google won't restore them even if you upgrade to a premium account later. However, that doesn't mean the data is truly gone. Nancy Guthrie was abducted from her home in the early hours of February 1, and at first, investigators said there was no video of the crime because the doorbell camera was not on a paid account. Yet, video showing a masked individual fiddling with the camera was published on February 10.
[...]
Investigators said in statements that the video was "recovered from residual data located in backend systems." It's unclear how long such data is retained or how easy it is for Google to access it. Some reports claim that it took several days for Google to recover the data.
[...]
There is a temptation to ascribe some malicious intent to Google's video storage setup. After all, this video expired after three hours, but here it is nine days later. That feels a bit suspicious on the surface, particularly for a company that is so focused on training AI models that feed on video.
[...]
every event recorded by the camera is going to Google's servers, and it's probably recoverable long past the deletion timeline stipulated in the company's policy.
[...]
there are still more traditional "DVR" security cameras, which record footage to dedicated local storage. Many NAS boxes also have support for storing and managing video from select security cameras. If you're sending video to the cloud, you can't expect it will be totally gone even if you no longer have access to it.
Elon Musk says launch windows and other logistics are behind the shift in strategy:
Elon Musk says SpaceX has shifted its near-term priorities from Mars settlement plans to building what he called a "self-growing city on the Moon," arguing the lunar target is faster and more achievable. In a post on X, Musk claims the company could complete this in less than 10 years, while doing the same on Mars would take over 20 years.
This marks a major shift for the aerospace company, as Musk points out that the logistics of first completing a proof of concept on the moon are easier with respect to launch windows and proximity to Earth. The SpaceX founder is notorious for promising optimistic timelines that never come to pass, and said in 2017 that a base on Mars would be ready for its first settlers as early as 2024.
In subsequent replies to other posts Musk predicted "Mars will start in 5 or 6 years, so will be done parallel with the Moon, but the Moon will be the initial focus." He also said a manned Mars flight might happen in 2031.
Early last year, Musk said in a post on X that SpaceX would be going "straight to Mars" and that "the Moon is a distraction." This was in response to space industry analyst Peter Hague pointing out that, among other considerations, lunar regolith, the loose material covering the moon's surface, is about 45 percent oxygen by mass. In 2023, NASA proved this oxygen could be extracted, which would yield enormous payload savings compared with shipping liquid oxygen from Earth.
NASA's Artemis missions, for which SpaceX is a contractor at certain stages, are planned to return humans to the lunar surface by 2028. Artemis II, during which astronauts will fly around the moon before returning to Earth, is set to launch in March of this year.
On February 1, Robert Tinney, the illustrator whose airbrushed cover paintings defined the look and feel of pioneering computer magazine Byte for over a decade, died at age 78 in Baker, Louisiana, according to a memorial posted on his official website.
As the primary cover artist for Byte from 1975 to the late 1980s, Tinney became one of the first illustrators to give the abstract world of personal computing a coherent visual language, translating topics like artificial intelligence, networking, and programming into vivid, surrealist-influenced paintings that a generation of computer enthusiasts grew up with.
Incident is at least the third time the exchange has been targeted by thieves:
Open source packages published on the npm and PyPI repositories were laced with code that stole wallet credentials from dYdX developers and backend systems and, in some cases, backdoored devices, researchers said.
"Every application using the compromised npm versions is at risk ...." the researchers, from security firm Socket, said Friday. "Direct impact includes complete wallet compromise and irreversible cryptocurrency theft. The attack scope includes all applications depending on the compromised versions and both developers testing with real credentials and production end-users."
dYdX is a decentralized derivatives exchange that supports hundreds of markets for "perpetual trading," or the use of cryptocurrency to bet that the value of a derivative future will rise or fall. Socket said dYdX has processed over $1.5 trillion in trading volume over its lifetime, with an average trading volume of $200 million to $540 million and roughly $175 million in open interest. The exchange provides code libraries that let developers build third-party apps such as trading bots, automated strategies, and backend services, all of which handle mnemonics or private keys for signing.
[...] The malicious code available on PyPI contained the same credential theft function, although it also implemented a remote access Trojan (RAT) that allowed the execution of new malware on infected systems. The backdoor received commands from dydx[.]priceoracle[.]site. The domain was registered on January 9, 17 days before the malicious package was uploaded to PyPI.
The RAT, Socket said:
- Runs as a background daemon thread
- Beacons to the C2 server every 10 seconds
- Receives Python code from the server
- Executes it in an isolated subprocess with no visible output
- Uses a hardcoded authorization token: 490CD9DAD3FAE1F59521C27A96B32F5D677DD41BF1F706A0BF85E69CA6EBFE75
Once installed, the threat actors could:
- Execute arbitrary Python code with user privileges
- Steal SSH keys, API credentials, and source code
- Install persistent backdoors
- Exfiltrate sensitive files
- Monitor user activity
- Modify critical files
- Pivot to other systems on the network
Socket said the packages were published to npm and PyPI by official dYdX accounts, an indication that they were compromised and used by the attackers. dYdX officials didn't respond to an email seeking confirmation and additional details.
The incident is at least the third time dYdX has been targeted in attacks. Previous events include a September 2022 uploading of malicious code to the npm repository and the commandeering in 2024 of the dYdX v3 website through DNS hijacking. Users were redirected to a malicious site that prompted them to sign transactions designed to drain their wallets.
"Viewed alongside the 2022 npm supply chain compromise and the 2024 DNS hijacking incident, this [latest] attack highlights a persistent pattern of adversaries targeting dYdX-related assets through trusted distribution channels," Socket said. "The threat actor simultaneously compromised packages in both npm and PyPI ecosystems, expanding the attack surface to reach JavaScript and Python developers working with dYdX."
Anyone using the platform should carefully examine all apps for dependencies on the malicious packages identified in Socket's report.
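As a starting point for that audit, here is a minimal Python sketch that scans the current environment against a blocklist of compromised releases. The package name and versions below are hypothetical placeholders; substitute the actual releases from Socket's advisory:

```python
# Minimal audit sketch: flag installed Python packages whose versions
# appear on a blocklist. Names/versions are HYPOTHETICAL placeholders.

from importlib import metadata

COMPROMISED = {
    "example-dydx-client": {"1.2.3", "1.2.4"},  # hypothetical entry
}

def find_compromised():
    hits = []
    for dist in metadata.distributions():
        name = (dist.metadata["Name"] or "").lower()
        if dist.version in COMPROMISED.get(name, ()):
            hits.append((name, dist.version))
    return hits

if __name__ == "__main__":
    hits = find_compromised()
    for name, version in hits:
        print(f"WARNING: compromised package installed: {name}=={version}")
    if not hits:
        print("No blocklisted packages found in this environment.")
```

An equivalent check on the JavaScript side would walk each project's lockfile for the affected npm versions.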
The study, from academics at Cardiff University, Loughborough University and the University of Oxford, used computer software to analyse the range of nouns and adjectives used in 33 of Sir Terry Pratchett's best-selling Discworld novels.
The results show a significant decrease in the diversity of nouns and adjectives in his later works. This shift was particularly marked in the diversity of adjectives, which decreased below a defined threshold approximately ten years before Pratchett's formal diagnosis.
Sir Terry Pratchett died in 2015 at the age of 66. He had posterior cortical atrophy, a rare form of early-onset Alzheimer's disease that primarily affects visual processing.
Study co-author Dr Melody Pattison, based at Cardiff University's School of English, Communication and Philosophy, said: "Our analysis of Sir Terry Pratchett's novels suggests that subtle changes in linguistic patterns, such as decreased lexical diversity, may precede clinical diagnosis of dementia by a considerable margin. In particular we found the richness of descriptive language in his books gradually narrowed."
We would normally expect less lexical diversity as texts get longer, but even after controlling for text length our findings were still significant. This was not something a reader would necessarily notice, but rather a subtle, progressive change. --Dr Melody Pattison
[...] "Research indicates that memory problems may not be the first symptom of dementia. We wanted to explore whether language could be an early warning sign, and to do this, we used Sir Terry Pratchett's books, who himself suffered dementia.
"Our analysis found that Sir Terry's use of language did indeed change during his career. These results suggest that language may be one of the first signs of dementia, and Sir Terry's books reveal a potential new approach for early diagnosis."
Journal Reference: Brain Sci. 2026, 16(1), 94; https://doi.org/10.3390/brainsci16010094
A team of physicists at MIT has managed to do something long thought impossible: peer into the ultrafast, quantum-scale motion of superconducting electrons. Using a microscope built around pulses of terahertz light – radiation oscillating trillions of times per second – they've captured a kind of atomic dance that has remained hidden until now.
The implications of the breakthrough could ripple through multiple industries. A better understanding of how superconductivity behaves at quantum scales could accelerate the development of room-temperature superconductors, radically improving electrical grids, quantum computers, and magnetic levitation systems.
The underlying terahertz technology itself – capable of transmitting and detecting signals at unprecedented speeds – could shape the future of wireless communications, sensing devices, and ultrafast data transfer for next-generation electronics.
The development, described in Nature, centers on bismuth strontium calcium copper oxide (BSCCO), a copper-based superconductor known for carrying electricity without resistance at relatively high temperatures.
When hit with precisely tuned terahertz bursts, the electrons inside the material began to move collectively, vibrating in unison at the same frequencies as the light itself. MIT physicist Nuh Gedik calls this previously unseen motion "a new mode of superconducting electrons."
The feat was accomplished using a terahertz microscope capable of compressing radiation that typically stretches hundreds of microns long down to the tiny scale of a quantum material. Terahertz radiation sits between microwaves and infrared on the electromagnetic spectrum, an energy range considered a sweet spot for imaging because it's non-ionizing, penetrates deeply, and matches the natural oscillation rate of atoms and electrons.
Yet until now, it's been all but useless for imaging small structures because of a fundamental barrier called the diffraction limit – light can't be focused to a spot smaller than its own wavelength.
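To put a number on that barrier (a back-of-the-envelope estimate, not a figure from the article): at 1 THz the free-space wavelength is

```latex
\lambda = \frac{c}{f}
        = \frac{3\times10^{8}~\mathrm{m/s}}{10^{12}~\mathrm{Hz}}
        = 3\times10^{-4}~\mathrm{m}
        = 300~\mu\mathrm{m},
```

so a conventionally focused terahertz spot can be no smaller than a few hundred microns, vastly larger than the micron-scale samples and features researchers want to image.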
MIT postdoctoral researcher Alexander von Hoegen and colleagues found a way to beat that limitation. They used a spintronic emitter, a layered metallic structure that generates sharp terahertz pulses when hit by a laser.
By placing microscopic samples extremely close to this source, the researchers trapped the light before it could spread out, focusing the energy into a region much smaller than its wavelength. That confinement allowed the microscope to resolve features that had been invisible under conventional terahertz illumination.
The design integrates the emitter with a Bragg mirror – a stack of ultrathin reflective layers that filter unwanted light while allowing the desired terahertz frequencies through. This setup protects the fragile sample from the optical laser but preserves the high-frequency terahertz signals scientists want to study.
In their first experiment, the researchers cooled a BSCCO sample to near absolute zero, where it enters its superconducting phase. As terahertz pulses moved through the chilled material, detectors picked up faint oscillations in the returning field – a telltale sign that electrons inside were moving collectively like a frictionless fluid.
The team compared the signals to theoretical predictions and confirmed that they had, for the first time, imaged the quantum superfluid motion itself. "It's this superconducting gel that we're sort of seeing jiggle," von Hoegen explained.
The visualization offers a new window into the quantum dynamics of superconductors and could help uncover factors that might one day enable superconductivity at room temperature – a long-sought goal in physics and energy technology.
Von Hoegen sees broad implications beyond basic physics. Future terahertz microscopes, he said, could study signal propagation in nanoscale antennas or sensors designed for terahertz-frequency telecommunications – the next frontier beyond today's Wi-Fi and millimeter-wave systems.
"There's a huge push to take Wi-Fi or telecommunications to the next level, to terahertz frequencies," he said. "If you have a terahertz microscope, you could study how terahertz light interacts with microscopically small devices that could serve as future antennas or receivers."
With the new microscope now operational, the team plans to explore other two-dimensional materials known for exotic electronic behaviors, hoping to capture their internal vibrations in the terahertz domain. Each experiment, they say, brings them closer to understanding how electrons cooperate when friction disappears – and what that could mean for the future of electronic materials.
Journal Reference: A. von Hoegen, T. Tai, C. J. Allington, M. Yeung, J. Pettine, M. H. Michael, E. Viñas Boström, X. Cui, K. Torres, A. E. Kossak, B. Lee, G. S. D. Beach, G. D. Gu, A. Rubio, P. Kim & N. Gedik. Imaging a terahertz superfluid plasmon in a two-dimensional superconductor. Nature. https://doi.org/10.1038/s41586-025-10082-2