posted by janrinok on Saturday March 15, @11:17PM   Printer-friendly
from the but-some-may-be-rather-like-peninsulas dept.

Author, sysadmin, and Grumpy BSD Guy, Peter N M Hansteen, has written a post about Software Bills of Materials (SBOMs) and how they relate to all software, both proprietary and Free and Open Source Software (FOSS). Increasingly, maintaining a machine-readable inventory of runtime and build dependencies in the form of an SBOM is becoming the cost of doing business, even for FOSS projects.

Whether or not you let others see the code you wrote, the software does not exist in isolation.

All software has dependencies, and in the open source world this fact has been treated as a truth out in the open. Every free operating system, and in fact most modern-ish programming languages, come with a package system to install software and to track and handle the web of dependencies, and you are supposed to use the corresponding package manager for the bulk of maintenance tasks.

So when security-relevant incidents hit, the open source world was fairly well stocked with code that did almost all of what was needed to produce what became known as a Software Bill of Materials, or SBOM for short.

So what would a Software Bill of Materials even look like?

Obviously nuts and bolts would not be involved, but items such as the source code files in your project and any libraries or tools needed to build the thing would be nice-to-knows. And once you have the thing built, whatever else -- libraries, suites of utilities, services that need to be running, or other software frameworks of any kind -- is required in order to have the thing run is an obvious item of interest.

So basically, any item your code would need comes out as a dependency, and you will find that your code has both build-time and run-time dependencies.
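
In concrete terms, an SBOM is just structured data listing those dependencies. As a rough illustration (not taken from Hansteen's post), a minimal CycloneDX-style inventory for a hypothetical project might be sketched in Python like this; the component names and versions are invented for the example:

    # Illustrative only: a simplified CycloneDX-style SBOM for a made-up project.
    # Real SBOMs carry much more metadata (licenses, hashes, package URLs, ...).
    import json

    sbom = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": [
            # run-time dependency: a library the finished program needs to run
            {"type": "library", "name": "libexample", "version": "2.4.1"},
            # build-time dependency: a tool needed only to produce the artifact
            {"type": "application", "name": "examplebuildtool", "version": "1.0.3"},
        ],
        "dependencies": [
            {"ref": "myproject", "dependsOn": ["libexample"]},
        ],
    }

    print(json.dumps(sbom, indent=2))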

There is increasing agreement that SBOMs are now necessary. The question is becoming how to implement them without adding undue burdens onto developers or even onto whole development teams. Perhaps the way forward would be to separate out the making of these machine-readable inventories, similarly to how packaging is generally kept separate from the main development activities.

Previously:
(2023) Managing Open Source Software and Software Bill of Materials
(2022) Open Source Community Sets Out Path to Secure Software


Original Submission

posted by janrinok on Saturday March 15, @06:33PM   Printer-friendly

https://phys.org/news/2025-03-rapidly-population-crocs-impacting-australia.html

A team of marine biologists, environmental researchers and land management specialists affiliated with several institutions in Australia, working with a colleague from Canada, has conducted a study of the ecological impact of a huge rise in the population of saltwater crocodiles in Australia's Northern Territory.

In their paper published in the journal Proceedings of the Royal Society B: Biological Sciences, the group describes what they learned about changes in croc size, diet, and the sharp rise in nutrients they excrete into the water system.

Fifty-four years ago, the Australian government banned the hunting of saltwater crocodiles in its Northern Territory. Since that time, the population of crocs has grown from approximately 1,000 to approximately 100,000. The research team wondered about the ecological impact of such a rapid change, and more specifically, whether it was possible to quantify the changes that had taken place.

The work involved two major studies. The first analyzed data amassed by various researchers over the past half-century and used it for bioenergetic modeling of croc size and population. The team then used the models to estimate how much of various foods the crocs have been consuming, and how much they have been excreting.

The other study involved analyzing bones that have been recovered in the region over the years 1970 to 2022. From these, the team was able to learn more about what the crocs had been eating and how much by measuring carbon and nitrogen isotopes.

The researchers found that the size of the crocs has been growing slightly and that increases in population have led to a total biomass increase from an average of 10 kg to 400 kg per kilometer of river. They also found that the amount of food eaten by the crocs as a group increased approximately nine-fold. Additionally, the amounts of phosphorus and nitrogen excreted rose 56- and 186-fold respectively—most of which went into the water.

Journal Reference: Mariana A. Campbell et al, Quantifying the ecological role of crocodiles: a 50-year review of metabolic requirements and nutrient contributions in northern Australia, Proceedings of the Royal Society B: Biological Sciences (2025). DOI: 10.1098/rspb.2024.2260


Original Submission

posted by janrinok on Saturday March 15, @01:48PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

A federal judge has dealt a blow to Elon Musk’s DOGE agenda. On Thursday, Judge William Alsup of San Francisco said that the firing of tens of thousands of federal probationary workers had been based on a “lie” and that the government had conducted the expulsions illegally—further calling the initiative a “sham.” Alsup ordered that the workers be reinstated immediately.

Probationary workers—that is, workers who are new to the workforce and haven’t received more advanced benefits and protections—have suffered massive cuts across the government, as DOGE and the Trump administration have attempted to greatly reduce the federal workforce. The case before Alsup concerns litigation brought by union groups representing those workers.

Alsup’s reinstatement order applies to thousands of federal workers fired from the Defense Department, the Department of Veterans Affairs, the Department of Agriculture, the Department of Energy, the Treasury Department, and the Department of the Interior. Government Executive reports that some 24,000 employees would regain their jobs as a result of the judge’s decision.

The government’s firing of the employees was illegitimate because the agencies impacted by the cuts were directed by the Office of Personnel Management to do so, Alsup said. The OPM does not have the authority to make such orders, as those orders could only be made by the agencies themselves, the judge concluded.

Many of the cuts in question took place not long after Musk's DOGE initiative was announced and a team of Musk-linked workers took over the OPM. That team is said to have included numerous current and former employees of Musk, including Amanda Scales, a former Musk employee who was appointed chief of staff at the agency. On January 31, Reuters reported that Musk aides had locked career civil servants out of the computer systems at the agency and were engaged in some sort of undisclosed work involving said systems. Democratic lawmakers subsequently accused Musk of leading a "hostile takeover" of the agency.

On February 14, Reuters reported that, as part of the government downsizing initiative being led by Musk, the Trump administration had begun to fire "scores" of government employees, a majority of whom were still on probation. A statement from the OPM at the time said that the Trump administration was "encouraging agencies to use the probationary period as it was intended: as a continuation of the job application process, not an entitlement for permanent employment."

Charles Ezell, the acting director of the OPM, met with the heads of numerous federal agencies on February 13 and ordered them to fire tens of thousands of employees, according to the unions representing the workers. The government has claimed that Ezell was not issuing orders and was merely providing “guidance.” However, Alsup recently determined that the OPM had, indeed, ordered the firings, and done so illegally.

“The court finds that Office of Personnel Management did direct all agencies to terminate probationary employees with the exception of mission critical employees,” Alsup recently said.

The case before Alsup took a turn this week when Ezell abruptly refused a court order to testify about his role in the firings. “The problem here is that Acting Director Ezell submitted a sworn declaration in support of defendants’ position, but now refuses to appear to be cross-examined, or to be deposed,” Alsup said.

Alsup, a Clinton appointee, had harsh words for the Trump administration’s conduct, claiming that attorneys working for the government had attempted to mislead him. “The government, I believe, has tried to frustrate the judge’s ability to get at the truth of what happened here, and then set forth sham declarations,” he said. “That’s not the way it works in the U.S. District Court.”

Outlets report that Alsup became visibly upset with Trump Justice Department lawyers at various points throughout the hearing. “Come on, that’s a sham. Go ahead. It upsets me, I want you to know that. I’ve been practicing or serving in this court for over 50 years, and I know how do we get at the truth,” Alsup said. “And you’re not helping me get at the truth. You’re giving me press releases, sham documents.”

"It is sad, a sad day," Alsup continued. "Our government would fire some good employee, and say it was based on performance. When they know good and well, that's a lie." He continued: "That should not have been done in our country. It was a sham in order to try to avoid statutory requirements."

Alsup also ordered discovery and deposition in the case to provide greater transparency about the government’s activities. He further dissuaded the government from trying to paint him as some sort of leftist radical. “The words that I give you today should not be taken as some kind of ‘wild and crazy judge in San Francisco has said that the administration cannot engage in a reduction in force.’ I’m not saying that at all,” Alsup said. The judge noted that the government could not break the law or violate the Constitution while working on such an agenda: “Of course, if he does, it has to comply with the statutory requirements: the Reduction In Force act, the Civil Service Act, the Constitution, maybe other statutes,” Alsup said. “But it can be done.”


Original Submission

posted by hubie on Saturday March 15, @09:07AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Earth's atmosphere is shrinking due to climate change and one of the possible negative impacts is that space junk will stay in orbit for longer, bonk into other bits of space junk, and make so much mess that low Earth orbits become less useful.

That miserable set of predictions appeared on Monday in a Nature Sustainability paper titled "Greenhouse gases reduce the satellite carrying capacity of low Earth orbit."

Penned by two boffins from MIT, and another from the University of Birmingham, the paper opens with the observation: "Anthropogenic contributions of greenhouse gases in Earth's atmosphere have been observed to cause cooling and contraction in the thermosphere."

The thermosphere extends from about 90 km to 500 km above Earth's surface. While conditions in the thermosphere are hellish, it's not a hard vacuum. NASA describes it as home to "very low density of molecules" compared to the exosphere's "extremely low density."

Among the molecules found in the thermosphere is carbon dioxide (CO2), which conducts heat from lower down in the atmosphere then radiates it outward.

"Thus, increasing concentrations of CO2 inevitably leads to cooling in the upper atmosphere. A consequence of cooling is a contraction of the global thermosphere, leading to reductions in mass density at constant altitude over time."

That's unwelcome because the very low density of matter in the thermosphere is still enough to create drag on craft in low Earth orbit – enough that the International Space Station requires regular boosts to stay in orbit.

It's also enough drag to gradually slow space junk, causing it to descend into denser parts of the atmosphere where it vaporizes. A less dense thermosphere, the authors warn, means more space junk orbiting for longer and the possibility of Kessler syndrome instability – space junk bumping into space junk and breaking it up into smaller pieces until there's so much space junk some orbits become too dangerous to host satellites.
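
As a rough back-of-the-envelope illustration of why thermospheric density matters (my own sketch, not from the paper): the drag deceleration on an orbiting object scales linearly with local air density, so a contracted, thinner thermosphere means proportionally less drag and debris that takes longer to fall out of orbit. The numbers below are invented for the example:

    # Toy illustration: drag deceleration a = 0.5 * rho * Cd * A * v^2 / m.
    # Density values and object properties are invented, not taken from the paper.
    def drag_deceleration(rho, v=7800.0, cd=2.2, area=1.0, mass=100.0):
        """Deceleration (m/s^2) from atmospheric drag on a simple object in LEO."""
        return 0.5 * rho * cd * area * v * v / mass

    rho_today = 1e-12         # kg/m^3, a representative upper-thermosphere density
    rho_contracted = 0.7e-12  # same altitude after a hypothetical 30% density drop

    print(drag_deceleration(rho_today))       # baseline drag
    print(drag_deceleration(rho_contracted))  # ~30% less drag, so slower orbital decay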

[...] The good news, the paper notes, is that satellite makers know Kessler syndrome instability is a possibility, so they often build in collision avoidance capabilities that let their craft dodge debris.

The authors hope manufacturers and operators work together on many debris-reduction tactics, and that greenhouse gas emissions are reduced to keep the thermosphere in fine trim.


Original Submission

posted by hubie on Saturday March 15, @04:22AM   Printer-friendly
from the mouthful-of-chiplets dept.

Arthur T Knackerbracket has processed the following story:

According to two leakers.

AMD’s upcoming Zen 6 processors will remain compatible with AM5, but they are set to introduce a new chiplet-based CPU design and significantly boost core counts across desktop and laptop products, according to sources of ChipHell, as well as Moore's Law Is Dead. Premium processors for gamers will also feature 3D V-Cache.

AMD's next-generation Ryzen processors based on the Zen 6 microarchitecture will feature 12-core core chiplet dies (CCDs), marking a major shift from the eight-core CCDs used in Zen 3/4/5 generation processors, if the linked reports are accurate. As a result, desktop AM5 processors will be able to feature up to 24 cores. Meanwhile, advanced laptop APUs will transition from a four Zen 5 plus eight Zen 5c (4+8) configuration to a 12-core structure, at least according to MLID. A Zen 6 CCD measures 75mm^2, MLID claims.

The increased number of cores is a big deal, but cache scales with it: premium versions of AMD's desktop processors will feature up to 96MB of L3 cache, or 4MB per core. That is in line with existing Zen 5 configurations, so AMD is not cutting down caches in favor of core count.
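
A quick sanity check of those leaked figures (my own arithmetic, not from the sources):

    # Back-of-the-envelope check of the rumored Zen 6 desktop configuration.
    cores_per_ccd = 12
    ccds_per_part = 2
    total_cores = cores_per_ccd * ccds_per_part    # 24 cores on a two-CCD AM5 part
    l3_total_mb = 96
    print(total_cores, l3_total_mb / total_cores)  # 24 cores, 4.0 MB of L3 per core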

AMD is expected to release Zen 6-based products in 2026, so it is reasonable to expect them to use a more advanced node than the TSMC 4nm-class process used today. Think TSMC's N3P (3nm-class), given that AMD does not exactly rush to leading-edge nodes (possibly due to supply constraints), and the leading edge next year will be N2 (2nm-class).

AMD's Zen 6-based Ryzens for gaming PCs will also feature 3D V-Cache. Some laptop processors with built-in graphics will also feature 3D V-Cache, though exact configuration is something that remains to be seen.

Interestingly, and according to MLID, AMD's standard APUs will be chiplet-based, moving away from the monolithic approach. Medusa Point — a laptop APU — is expected to feature a Zen 6 CCD with 12 cores and a 200mm^2 I/O die (IOD), featuring eight RDNA work groups, a 128-bit memory controller, and a large NPU. There is speculation that Infinity Cache may be added to enhance GPU performance.

MLID also claims that the desktop version of Medusa Point — allegedly called Medusa Ridge — will use up to two 12-core Zen 6 CCDs in the AM5 form-factor. That product will have a 155mm^2 IOD without an advanced built-in GPU, but possibly with a large NPU.


Original Submission

posted by hubie on Friday March 14, @11:34PM   Printer-friendly

Advanced transmission technologies could sidestep permitting challenges and clear the bottleneck holding up hundreds of gigawatts' worth of renewable-energy projects:

US electricity consumption is rising faster than it has in decades, thanks in part to the boom in data center development, the resurgence in manufacturing, and the increasing popularity of electric vehicles.

Accommodating that growth will require building wind turbines, solar farms, and other power plants faster than we ever have before—and expanding the network of wires needed to connect those facilities to the grid.

But one major problem is that it's expensive and slow to secure permits for new transmission lines and build them across the country. This challenge has created one of the biggest obstacles to getting more electricity generation online, reducing investment in new power plants and stranding others in years-long "interconnection queues" while they wait to join the grid.

Fortunately, there are some shortcuts that could expand the capacity of the existing system without requiring completely new infrastructure: a suite of hardware and software tools known as advanced transmission technologies (ATTs), which can increase both the capacity and the efficiency of the power sector.

ATTs have the potential to radically reduce timelines for grid upgrades, avoid tricky permitting issues, and yield billions in annual savings for US consumers. They could help us quickly bring online a significant portion of the nearly 2,600 gigawatts of backlogged generation and storage projects awaiting pathways to connect to the electric grid.

The opportunity to leverage advanced transmission technologies to update the way we deliver and consume electricity in America is as close to a $20 bill sitting on the sidewalk as policymakers may ever encounter. Promoting the development and use of these technologies should be a top priority for politicians in Washington, DC, as well as electricity market regulators around the country.

[...] ATTs generally fall into four categories (a toy sketch of the first follows the list):

  • Dynamic line ratings, which combine local weather forecasts and measurements on or near the transmission line to safely increase its capacity when conditions allow.

  • High-performance conductors, advanced wires that use carbon fiber, composite cores, or superconducting materials to carry more electricity than traditional steel-core conductors.

  • Topology optimization, which uses software to model fluctuating conditions across the grid and identify the most efficient routes to distribute electricity from moment to moment.

  • Advanced power flow control devices, which redistribute electricity to lines with available capacity.
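
To make that first category concrete, here is a deliberately simplified sketch of the idea behind a dynamic line rating (my own toy model, not the IEEE 738 method utilities actually use): the rating is the current at which resistive heating balances cooling from the surrounding air, so cooler or windier weather raises the allowable current.

    # Toy dynamic line rating: find the current I at which I^2 * R equals cooling.
    # Numbers are invented for illustration; real ratings follow IEEE Std 738.
    import math

    def toy_line_rating(cooling_watts_per_meter, resistance_ohm_per_meter):
        """Current (A) at which resistive heating balances cooling, per meter of line."""
        return math.sqrt(cooling_watts_per_meter / resistance_ohm_per_meter)

    r = 7e-5  # ohm per meter, a made-up conductor resistance
    print(toy_line_rating(40.0, r))  # calm, hot day: less cooling, lower rating (~760 A)
    print(toy_line_rating(90.0, r))  # windy, cool day: more cooling, higher rating (~1130 A)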

[...] So why are we not seeing an explosion in ATT investment and deployment in the US? Because despite their potential to unlock 21st-century technology, the 20th-century structure of the nation's electricity markets discourages adoption of these solutions.

For one thing, under the current regulatory system, utilities generally make money by passing the cost of big new developments along to customers (earning a fixed annual return on their investment). That comes in the form of higher electricity rates, which local public utility commissions often approve after power companies propose such projects.

That means utilities have financial incentives to make large and expensive investments, but not to save consumers money. When ATTs are installed in place of building new transmission capacity, the smaller capital costs mean that utilities make lower profits. For example, utilities might earn $600,000 per year after building a new mile of transmission, compared with about $4,500 per mile annually after installing the equipment and software necessary for line ratings. While these state regulatory agencies are tasked with ensuring that utilities act in the best interest of consumers, they often lack the necessary information to identify the best approach for doing so.

[...] In addition, federal agencies and state lawmakers should require transmission providers to evaluate the potential for using ATTs on their grid, or provide support to help them do so. FERC has recently taken steps in this direction, and it should continue to strengthen those actions.

Regulators should also provide financial incentives to transmission providers to encourage the installation of ATTs. The most promising approach is a "shared savings" incentive, such as that proposed in the recent Advancing GETS Act. This would allow utilities to earn a profit for saving money, not just spending it, and could save consumers billions on their electricity bills every year.

Finally, we should invest in building digital tools so transmission owners can identify opportunities for these technologies and so regulators can hold them accountable. Developing these systems will require transmission providers to share information about electricity supply and demand as well as grid infrastructure. Ideally, with such data in hand, researchers can develop a "digital twin" of the current transmission system to test different configurations of ATTs and help improve the performance and efficiency of our grids.

We are all too aware that the world often faces difficult policy trade-offs. But laws or regulations that facilitate the use of ATTs can quickly expand the grid and save consumers money. They should be an easy yes on both sides of the aisle.


Original Submission

posted by janrinok on Friday March 14, @06:47PM   Printer-friendly

Humans have a third set of teeth: Scientists discover medicine to grow them:

Kiran Mazumdar-Shaw called the newly developed drug capable of regrowing human teeth an "amazing discovery" that could make dental implants obsolete. Imagine a world in which losing a tooth does not require the use of dentures or implants. Scientists in Japan have achieved an important first in regenerative medicine: a medication that could enable humans to develop a third set of teeth. This study, which focuses on a single gene responsible for tooth growth, has begun clinical testing and could be accessible for general use by 2030. If successful, this finding has the potential to improve dental treatment and provide hope to millions of people who are missing teeth.

A team of Japanese researchers, led by Dr. Katsu Takahashi of the Medical Research Institute Kitano Hospital in Osaka, has been studying the genetic principles of tooth development. Their findings build on a 2021 study published in Scientific Reports, which found that reducing the USAG-1 gene in mice resulted in the creation of new teeth.

The USAG-1 gene produces a protein that suppresses tooth development. Researchers discovered that employing an antibody that disables this protein allowed mice to regenerate teeth. Encouraged by these findings, the team has shifted its focus to humans, assuming that comparable genetic systems exist within us.

[Handout images from the Medical Research Institute Kitano Hospital show before (top) and after images of the regrowth of teeth in a ferret (centre) and mice (right and left).]

Humans already have a hidden third set of teeth

One of the most intriguing aspects of this discovery is that humans already have the potential to grow a third set of teeth. "The idea of growing new teeth is every dentist's dream," Dr Takahashi told Mainichi. "We're hoping to see a time when tooth regrowth medicine is a third choice alongside dentures and implants."

While most people develop only two sets of teeth—baby teeth and permanent teeth—some individuals with a condition called hyperdontia naturally grow extra teeth. This suggests that the body already has the biological framework for an additional set. Scientists believe that activating these latent tooth buds using gene-targeting therapy could stimulate controlled regrowth in the general population.

How this discovery could revolutionise dentistry

The potential benefits of this breakthrough extend far beyond cosmetic dentistry. Around 1% of the global population suffers from anodontia, a genetic condition in which some or all permanent teeth fail to develop. Current treatment options, such as dental implants and dentures, are expensive and often come with complications.

A medication that allows the body to regrow its own teeth could be life-changing for those affected by tooth loss due to genetic disorders, accidents, or aging. "The number of teeth varied through the mutation of just one gene," Dr. Takahashi explained. "If we make that the target of our research, there should be a way to change the number of teeth people have."

The Japanese research team has already begun human clinical trials and hopes to make the drug available for general use by 2030. If successful, this treatment could mark one of the most significant advances in dental medicine.

A 2023 paper published in Regenerative Therapy highlighted the lack of existing treatments for tooth regrowth and emphasized the potential of anti-USAG-1 antibody therapy as a breakthrough. The researchers believe that continued progress could lead to a practical and widely available solution in the coming years.


Original Submission

posted by janrinok on Friday March 14, @02:01PM   Printer-friendly

https://phys.org/news/2025-03-attention-limitations-idea-thieves-workplaces.html

It happens all the time. You're in a meeting, brainstorming with your team to uncover the next big idea. As the discussion unfolds, one of the standout ideas is yours—or so you thought. Suddenly, you realize a colleague is getting the credit.

You've just encountered an idea thief.

Despite the high reputational cost of being caught, idea theft is surprisingly common. A 2015 poll of 1,000 British workers revealed nearly half had their ideas stolen by colleagues, while 1 in 5 admitted to stealing an idea themselves.

Why is idea theft so common? And how do so many idea thieves get away with it? Zoe Kinias, professor of organizational behavior and sustainability at Ivey Business School, tackled these questions with her colleagues in a new study, "Social inattentional blindness to idea stealing in meetings," published in Scientific Reports.

Today's managers and executives are juggling more than ever, balancing diverse tasks in dynamic and information-rich workplaces. It's hard to stay fully informed and keep a finger on the pulse of everything that matters, experts say.

"As humans, our senses are constantly working together to create a vivid and detailed perception of the world," said Kinias. "Yet, our brains process only a tiny fraction of the information around us, leaving much unnoticed. This phenomenon, known as inattentional blindness, highlights just how selective our attention truly is."

Inattentional blindness offers profound opportunities for understanding complex social dynamics. But how do you study something most people fail to notice? Enter Theodore C. Masters-Waage, then a Ph.D. student at Singapore Management University, who approached Kinias—an expert in empowering workers—with a bold idea: leveraging virtual reality (VR) to explore social attention in the workplace.

"While VR has long been a powerful tool in STEM (science, technology, engineering and math) fields, its use in organizational behavior research is still in its early days," Kinias said. "For this study, VR was essential. It allowed us to create a hyper-realistic scenario with complete control, enabling us to examine how subtle social changes influence where people focus, or fail to focus, their attention."

In their experiment, 154 participants used VR headsets to enter a virtual meeting, where they watched four team members brainstorm ideas. Their task was straightforward: Identify the best idea. But there was a twist—midway through the meeting, one person blatantly stole another's idea and claimed it as their own.

The results were surprising: While nearly all participants—more than 99%—could pinpoint the best idea, only 30% could recall who originally shared it. The study revealed that the person who swooped in and claimed the idea as their own reaped the rewards. In fact, 42% of participants mistakenly credited the idea thief.

Journal Reference: Masters-Waage, T.C., Kinias, Z., Argueta-Rivera, J. et al. Social inattentional blindness to idea stealing in meetings. Sci Rep 14, 8060 (2024). https://doi.org/10.1038/s41598-024-56905-6


Original Submission

posted by hubie on Friday March 14, @09:14AM   Printer-friendly

Exclusive: General Fusion fires up its newest steampunk fusion reactor:

General Fusion announced on Tuesday that it had successfully created plasma, a superheated fourth state of matter required for fusion, inside a prototype reactor. The milestone marks the beginning of a 93-week quest to prove that the outfit's steampunk approach to fusion power remains a viable contender.

The reactor, called Lawson Machine 26 (LM26), is General Fusion's latest iteration in a string of devices that have tested various parts of its unique approach. The company assembled LM26 in just 16 months, and it hopes to hit "breakeven" sometime in 2026.

General Fusion is one of the oldest fusion companies still operating. Founded in 2002, it has raised $440 million to date, according to PitchBook. Over that time, it has seen competitors rise and fall, and, like the fusion industry writ large, it has failed to meet breakeven promises, including one made over 20 years ago.

In fusion power, there are two points at which a reaction is said to break even. The one most people think of is called commercial breakeven. That's when a fusion reaction produces more power than the entire facility consumes, allowing the power plant to put electricity on the grid. No one has reached this milestone yet.

The other is known as scientific breakeven. In this case, the fusion reaction needs to produce at least as much power as was delivered directly to the fuel. Scientific breakeven only looks within the boundaries of the experimental system, ignoring the rest of the facility. Still, it's an important milestone for any fusion attempt. So far, only the U.S. Department of Energy's National Ignition Facility has reached it.
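
As a rough numerical illustration of the difference (the figures below are invented for the example, not General Fusion's or NIF's): scientific breakeven compares fusion output only with the energy delivered to the fuel, while commercial breakeven has to cover everything the facility draws, plus the losses in turning fusion heat back into electricity.

    # Illustrative only: made-up numbers to contrast scientific and commercial breakeven.
    energy_to_fuel_mj = 2.0        # energy actually delivered to the fuel per shot
    fusion_yield_mj = 3.0          # fusion energy released per shot
    facility_draw_mj = 300.0       # everything the plant consumes per shot
    thermal_to_electric = 0.4      # efficiency of converting fusion heat to electricity

    q_scientific = fusion_yield_mj / energy_to_fuel_mj
    net_electric_mj = fusion_yield_mj * thermal_to_electric - facility_draw_mj

    print(q_scientific)     # 1.5 > 1: scientific breakeven achieved
    print(net_electric_mj)  # deeply negative: nowhere near commercial breakeven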

General Fusion's approach to fusion power differs significantly from other startups. Called magnetized target fusion (MTF), it's similar in some regards to inertial confinement, the technique the National Ignition Facility used in late 2022 to prove that fusion reactions could generate more power than was required to start them.

But where the National Ignition Facility uses lasers to compress a fuel pellet, General Fusion's MTF reactor design relies on steam-driven pistons. Inside the chamber, deuterium-tritium fuel is zapped with a bit of electricity to generate a magnetic field, which helps keep the plasma contained. The pistons then drive a liquid lithium wall inward on the plasma, compressing it.

As the fuel is compressed, its temperature rises until it sparks a fusion reaction. The reaction then heats the liquid lithium, which the company plans to circulate through a heat exchanger to create steam and spin a generator.

MTF emerged in the 1970s from the U.S. Naval Research Laboratory, where researchers were developing concepts for compact fusion reactors. Those efforts didn't bear fruit. General Fusion says that's because the pistons compressing the liquid liner weren't controlled precisely enough, and that modern computers now provide a better chance at executing the complex choreography.

Whatever LM26 accomplishes, General Fusion still has more work to do. The device doesn't have the liquid lithium wall, instead relying on solid lithium compressed by electromagnets. That limits the number of test runs the company can take since it takes longer to reset the device. The company has made progress on a prototype of the liquid wall, performing over 1,000 tests to see how it holds up over time, but integrating everything will still be a monumental engineering challenge.

Flipping the switch on LM26 is nonetheless a significant step for a company that is now racing to deliver a power plant alongside a host of newcomers with their own deep pocketbooks and aggressive timelines.


Original Submission

posted by janrinok on Friday March 14, @04:29AM   Printer-friendly

https://apnews.com/article/conveyor-dune-oil-sand-texas-hydraulic-fracturing-1e98b8438de8dfa687118599ec2ff7f4

It's longer than the width of Rhode Island, snakes across the oil fields of the southwest U.S. and crawls at 10 mph – too slow for a truck and too long for a train.

It's a new sight: the longest conveyor belt in America.

Atlas Energy Solutions, a Texas-based oil field company, has installed a 42-mile-long (67-kilometer) conveyor belt to transport millions of tons of sand for hydraulic fracturing. The belt, which the company named "The Dune Express," runs from tiny Kermit, Texas, across the state border into Lea County, New Mexico. Tall and lanky with lids that resemble solar modules, the steel structure could almost be mistaken for a roller coaster.

In remote West Texas, there are few people to marvel at the unusual machine in Kermit, a city with a population of less than 6,000, where the sand is typically hauled by tractor-trailers. During fracking, liquid is pumped into the ground at a high pressure to create holes, or fractures, that release oil. The sand helps keep the holes open as water, oil and gas flow through it.


Original Submission

posted by janrinok on Thursday March 13, @11:42PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Chipmaking tool biz ASML plans to open a new facility in China this year amid rising trade tensions between Washington and Beijing.

The supplier of advanced lithography equipment disclosed in its latest Annual Report that it aims to inaugurate a Beijing-based Reuse & Repair Center in 2025, recognizing the importance of China as one of its largest markets, alongside Taiwan.

This is a facility for reconditioning and reusing materials from systems that have been returned from the field, so the unit won't manufacture from scratch.

The decision comes after US authorities extended the list of restrictions on suppliers of chip manufacturing tech in December to include metrology – the precise measurement and validation of semiconductor materials using e-beams, X-rays and more – and software. Meanwhile further fab locations, mainly in China, were added to the export blacklist.

In retaliation, Beijing kicked off an investigation in January to decide if US subsidies to chipmakers are harming its semiconductor companies and amount to unreasonable trade practices.

This was days before President Donald Trump's administration - which itself isn't keen on the CHIPS Act - took over in Washington and introduced a further hardening of its stance on China, hiking tariffs on goods imported from the country by an extra ten percent.

ASML is currently the world's only supplier of extreme ultraviolet (EUV) photolithography equipment, used in the making of advanced chips with smaller features to cram in more circuitry. Export of these products to China was blocked by the Dutch government several years ago.

Fresh reports from China now suggest local researchers may have found a way to produce light at a 13.5 nm wavelength – the same as in ASML's EUV kit – and are working to produce homegrown tech to sidestep the export ban.

According to TechPowerUp, Chinese megacorp Huawei is testing the system at a facility in the city of Dongguan, with trial production runs scheduled from September, and mass manufacturing targeted for 2026. It is said to use a technique called laser-induced discharge plasma (LDP), claimed to be simpler and less costly than ASML's laser-produced plasma (LPP) technology.

Huawei has previously caught the US off-guard by introducing a smartphone using a homegrown 7nm processor, technology it was not believed capable of producing.

In contrast to ASML, IBM shuttered its Chinese research and development operations at the start of March.

Big Blue signalled last year that it intended to close its China Development Lab and China Systems Lab after 32 years of operations, due to fractious relations between Washington and Beijing.

It was also reported at the time that the closure was blamed on competition from China's state-subsidized rivals, and that IBM was shifting R&D work outside the country. It is believed to impact more than 1,800 staff.


Original Submission

posted by janrinok on Thursday March 13, @06:57PM   Printer-friendly

https://phys.org/news/2025-03-psychological-booster-shots-resistance-misinformation.html

A new study has found that targeted psychological interventions can significantly enhance long-term resistance to misinformation. Dubbed "psychological booster shots," these interventions improve memory retention and help individuals recognize and resist misleading information more effectively over time.

The study, published in Nature Communications, explores how different approaches, including text-based messages, videos, and online games, can inoculate people against misinformation.

The researchers from the Universities of Oxford, Cambridge, Bristol, Potsdam and King's College London conducted five large-scale experiments with over 11,000 participants to examine the durability of these interventions and identify ways to strengthen their effects.

The research team tested three types of misinformation-prevention methods:

  • Text-based interventions, where participants read pre-emptive messages explaining common misinformation tactics.

  • Video-based interventions, short educational clips that expose the emotional manipulation techniques used in misleading content.

  • Gamified interventions, an interactive game that teaches people to spot misinformation tactics by having them create their own (fictional) fake news stories in a safe, controlled environment.

Participants were then exposed to misinformation and evaluated on their ability to detect and resist it over time. The study found that while all three interventions were effective, their effects diminished quickly over time, prompting questions about their long-term impact. However, providing memory-enhancing "booster" interventions, such as a follow-up reminder or reinforcement message, helped maintain misinformation resistance for a significantly longer period.

The study found that the longevity of misinformation resistance was primarily driven by how well participants remembered the original intervention. Follow-up reminders or memory-enhancing exercises were also found to significantly extend the effectiveness of the initial intervention, much like medical booster vaccines.

By contrast, the researchers found that boosters that did not focus on memory, but rather focused on increasing participants' motivation to defend themselves by reminding people of the looming threat of misinformation, did not have any measurable benefits for the longevity of the effects.

Journal Reference: Maertens, R., Roozenbeek, J., Simons, J.S. et al. Psychological booster shots targeting memory increase long-term resistance against misinformation. Nat Commun 16, 2062 (2025). https://doi.org/10.1038/s41467-025-57205-x


Original Submission

posted by hubie on Thursday March 13, @02:10PM   Printer-friendly

https://www.bleepingcomputer.com/news/software/mozilla-warns-users-to-update-firefox-before-certificate-expires/

  By Bill Toulas

        March 12, 2025 11:01 AM

Mozilla is warning Firefox users to update their browsers to the latest version to avoid facing disruption and security risks caused by the upcoming expiration of one of the company's root certificates.

The Mozilla certificate is set to expire this Friday, March 14, 2025, and was used to sign content, including add-ons for various Mozilla projects and Firefox itself.

Users need to update their browsers to Firefox 128 or later (released in July 2024); Extended Support Release (ESR) users need ESR 115.13 or later.

"On 14 March a root certificate (the resource used to prove an add-on was approved by Mozilla) will expire, meaning Firefox users on versions older than 128 (or ESR 115) will not be able to use their add-ons," warns a Mozilla blog post.

"We want developers to be aware of this in case some of your users are on older versions of Firefox that may be impacted."

A Mozilla support document explains that failing to update Firefox could expose users to significant security risks and practical issues, which, according to Mozilla, include:

        Malicious add-ons can compromise user data or privacy by bypassing security protections.
        Untrusted certificates may allow users to visit fraudulent or insecure websites without warning.
        Compromised password alerts may stop working, leaving users unaware of potential account breaches.

Users are recommended to check and confirm they're running Firefox version 128 or later via Menu > Help > About Firefox. This action should also automatically trigger a check for updates.
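
For users who prefer a command line (for example on Linux), the installed version can also be read from the browser's own version string. The snippet below is a small illustrative check, not an official Mozilla tool, and it assumes a firefox binary is on the PATH:

    # Illustrative check that the installed Firefox is new enough (major version >= 128).
    # Note: ESR 115.13 or later is also fine; this simple check only inspects the major version.
    import re
    import subprocess

    MINIMUM_MAJOR = 128

    out = subprocess.run(["firefox", "--version"], capture_output=True, text=True).stdout
    match = re.search(r"(\d+)\.", out)   # e.g. "Mozilla Firefox 136.0.1"
    major = int(match.group(1)) if match else 0

    status = "OK" if major >= MINIMUM_MAJOR else "Update needed"
    print(status, "-", out.strip())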

It is noted that the problem impacts Firefox on all platforms, including Windows, Android, Linux, and macOS, except for iOS, where there's an independent root certificate management system.

Mozilla says that users relying on older versions of Firefox may continue using their browsers after the expiration of the certificate if they accept the security risks, but the software's performance and functionality may be severely impacted.

"We strongly advise you to update to the latest version to avoid these issues and ensure your browser stays secure and efficient," advises Mozilla.

Mozilla has also set up a support thread for users who encounter problems or need help updating their Firefox browsers.

Users of Firefox-based browsers like Tor, LibreWolf, and Waterfox should also ensure they're running a version based on Firefox 128 or later.


Original Submission

posted by janrinok on Thursday March 13, @09:21AM   Printer-friendly

Woolly mice are cute and impressive – but they won't bring back mammoths or save endangered species:

US company Colossal Biosciences has announced the creation of a "woolly mouse" — a laboratory mouse with a series of genetic modifications that lead to a woolly coat. The company claims this is the first step toward "de-extincting" the woolly mammoth.

The successful genetic modification of a laboratory mouse is a testament to the progress science has made in understanding gene function, developmental biology and genome editing. But does a woolly mouse really teach us anything about the woolly mammoth?

Woolly mammoths were cold-adapted members of the elephant family, which disappeared from mainland Siberia at the end of the last Ice Age around 10,000 years ago. The last surviving population, on Wrangel Island in the Arctic Ocean, went extinct about 4,000 years ago.

The house mouse (Mus musculus) is a far more familiar creature, which most of us know as a kitchen pest. It is also one of the most studied organisms in biology and medical research. We know more about this laboratory mouse than perhaps any other mammal besides humans.

Colossal details its new research in a pre-print paper, which has not yet been peer-reviewed. According to the paper, the researchers disrupted the normal function of seven different genes in laboratory mice via gene editing.

Six of these genes were targeted because a large body of existing research on the mouse model had already demonstrated their roles in hair-related traits, such as coat colour, texture and thickness.

The modification to a seventh gene — FABP2 — was based on evidence from the woolly mammoth genome. The gene is involved in the transport of fats in the body.

Woolly mammoths had a slightly shorter version of the gene, which the researchers believe may have contributed to its adaptation to life in cold climates. However, the "woolly mice" with the mammoth-style variant of FABP2 did not show significant differences in body mass compared to regular lab mice.

This work shows the promise of targeted editing of genes of known function in mice. After further testing, this technology may have a future place in conservation efforts. But it's a long way from holding promise for de-extinction.

Colossal Biosciences claims it is on track to produce a genetically modified "mammoth-like" elephant by 2028, but what makes a mammoth unique is more than skin-deep.

De-extinction would need to go beyond modifying an existing species to show superficial traits from an extinct relative. Many aspects of an extinct species' biology remain unknown. A woolly coat is one thing. Recreating the entire suite of adaptations, including genetic, epigenetic and behavioural traits that allowed mammoths to thrive in ice age environments, is another.

Unlike the thylacine (or Tasmanian tiger) — another species Colossal aims to resurrect — the mammoth has a close living relative in the modern Asian elephant. The closer connections between the genomes of these two species may make mammoth de-extinction more technically feasible than that of the thylacine.

But whether or not a woolly mouse brings us any closer to that prospect, this story forces us to consider some important ethical questions. Even if we could bring back the woolly mammoth, should we? Is the motivation behind this effort conservation, or entertainment? Is it ethical to bring a species back into an environment that may no longer sustain it?

In Australia alone, we've lost at least 100 species to extinction since European colonisation in 1788, largely due to the introduction of feral predators and land clearing.

The idea of reversing extinction is understandably appealing. We might like to think we could undo the past.

Journal Reference: Rui Chen, Kanokwan Srirattana, Melissa L. Coquelin, et al., Multiplex-edited mice recapitulate woolly mammoth hair phenotypes, bioRxiv, https://doi.org/10.1101/2025.03.03.641227


Original Submission

posted by hubie on Thursday March 13, @04:37AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Rust is alive and well in the Linux kernel and is expected to translate into noticeable benefits shortly, though its integration with the largely C-oriented codebase still looks uneasy.

In a hopeful coda to the recent maintainer drama that raised questions about the willingness of Linux maintainers to accommodate Rust code, Josh Aas, who oversees the Internet Security Research Group's Prossimo memory-safety project, late last week hailed Miguel Ojeda's work to advance memory safety in the kernel without mentioning the programming language schism.

"While our goal was never to rewrite the entire kernel in Rust, we are glad to see growing acceptance of Rust's benefits in various subsystems," said Aas. "Today, multiple companies have full time engineers dedicated to working on Rust in the Linux kernel."

Since at least September last year, when Microsoft software engineer Wedson Almeida Filho left the Rust for Linux project citing "non-technical nonsense," it's been clear that acceptance had limits. Tensions between Rust and C kernel contributors flared again in January over concerns about the challenges of maintaining a mixed-language codebase – likened to cancer by one maintainer. Urged to intervene, Linux creator Linus Torvalds did so, making his annoyance known to both parties and prompting their departures as Linux maintainers.

Amid all that, Ojeda, who helms the Rust for Linux project, published a "Rust kernel policy" as a way to clarify that those contributing Rust code to the Linux kernel should stay the course and to underscore that Linux leaders still support the initiative.

According to Aas, the presence of Rust code is increasing in various Linux subsystems, including: PHY drivers, the null block driver, the DRM panic screen QR code generator, the Android binder driver, the Apple AGX GPU driver, the NVMe driver, and the Nova GPU driver.

"We expect that one of them will be merged into the mainline kernel in the next 12-18 months," said Aas, pointing to remarks from Linux lieutenant Greg Kroah-Hartman last November suggesting that the availability of Rust driver bindings represented a tipping point that would allow most driver subsystems to start getting Rust drivers.

Once this happens, said Aas, "the goal of the effort will start to be realized: Products and services running Linux with Rust drivers will be more secure, and that means the people using them will be more secure, too."

[...] "The good news is that with the rare exception of code that must be written in assembly for performance and/or security reasons (eg, cryptographic routines), we know how to get rid of memory safety vulnerabilities entirely: write code in languages that don't allow for those kinds of mistakes. It's a more or less solved research problem, and as such we don't need to suffer from this kind of thing any more. It can be relegated to the past like smallpox, we just have to do the work."

Between evocations of cancer and smallpox, it sounds like the Linux and Rust communities still have some issues to work out.


Original Submission