
posted by hubie on Wednesday June 11, @09:11PM

New Way to Covertly Track Android Users

Researchers have discovered a new way to covertly track Android users. Both Meta and Yandex were using it, but have suddenly stopped now that they have been caught.

The details are interesting and worth reading in full:

Tracking code that Meta and Russia-based Yandex embed into millions of websites is de-anonymizing visitors by abusing legitimate Internet protocols, causing Chrome and other browsers to surreptitiously send unique identifiers to native apps installed on a device, researchers have discovered. Google says it's investigating the abuse, which allows Meta and Yandex to convert ephemeral web identifiers into persistent mobile app user identities.

        The covert tracking, implemented in the Meta Pixel and Yandex Metrica trackers, allows Meta and Yandex to bypass core security and privacy protections provided by both the Android operating system and browsers that run on it. Android sandboxing, for instance, isolates processes to prevent them from interacting with the OS and any other app installed on the device, cutting off access to sensitive data or privileged system resources. Defenses such as state partitioning and storage partitioning, which are built into all major browsers, store site cookies and other data associated with a website in containers that are unique to every top-level website domain to ensure they're off-limits for every other site.
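
The research (first link below) describes the trackers relaying browser-side identifiers to native apps listening on local loopback sockets, a channel that sits outside both the cookie partitioning and the sandbox boundary described above. The sketch below illustrates only that general pattern; the port number, identifier, and protocol are hypothetical and are not the actual Meta Pixel or Yandex Metrica implementations.

    # Minimal illustration of the web-to-native "localhost relay" idea.
    # Port and identifier are invented; this is not the trackers' real code.
    import socket
    import threading

    PORT = 12345                      # hypothetical loopback port a native app might listen on
    WEB_ID = "ephemeral-web-id-123"   # hypothetical browser-side identifier (e.g., a cookie value)

    ready = threading.Event()

    def native_app_listener():
        """Stands in for a native app: any local process may bind a loopback port."""
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        ready.set()                   # signal that the "app" is listening
        conn, _ = srv.accept()
        data = conn.recv(1024).decode()
        # A real app could now join this web identifier with its logged-in user ID.
        print("native app received:", data)
        conn.close()
        srv.close()

    def web_tracker_send():
        """Stands in for tracker script in a browser tab talking to loopback."""
        with socket.create_connection(("127.0.0.1", PORT)) as cli:
            cli.sendall(WEB_ID.encode())

    t = threading.Thread(target=native_app_listener)
    t.start()
    ready.wait()
    web_tracker_send()
    t.join()

State and storage partitioning keep sites isolated from one another inside the browser, but both endpoints of this exchange sit outside that boundary, which is how an ephemeral web identifier can end up tied to a persistent, logged-in app identity.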

-- Links in article:

https://localmess.github.io/
https://www.facebook.com/business/tools/meta-pixel/
https://ads.yandex/metrica
https://source.android.com/docs/security/app-sandbox
https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/State_Partitioning
https://privacysandbox.google.com/cookies/storage-partitioning
https://www.washingtonpost.com/technology/2025/06/06/meta-privacy-facebook-instagram/

-- See Also:

- Meta and Yandex are de-anonymizing Android users' web browsing identifiers
https://arstechnica.com/security/2025/06/meta-and-yandex-are-de-anonymizing-android-users-web-browsing-identifiers/


Original Submission

posted by hubie on Wednesday June 11, @04:26PM

OpenAI defends privacy of hundreds of millions of ChatGPT users:

OpenAI is now fighting a court order to preserve all ChatGPT user logs—including deleted chats and sensitive chats logged through its API business offering—after news organizations suing over copyright claims accused the AI company of destroying evidence.

"Before OpenAI had an opportunity to respond to those unfounded accusations, the court ordered OpenAI to 'preserve and segregate all output log data that would otherwise be deleted on a going forward basis until further order of the Court (in essence, the output log data that OpenAI has been destroying)," OpenAI explained in a court filing demanding oral arguments in a bid to block the controversial order.

In the filing, OpenAI alleged that the court rushed the order based only on a hunch raised by The New York Times and other news plaintiffs. And now, without "any just cause," OpenAI argued, the order "continues to prevent OpenAI from respecting its users' privacy decisions." That risk extended to users of ChatGPT Free, Plus, and Pro, as well as users of OpenAI's application programming interface (API), OpenAI said.

The court order came after news organizations expressed concern that people using ChatGPT to skirt paywalls "might be more likely to 'delete all [their] searches' to cover their tracks," OpenAI explained. Evidence to support that claim, news plaintiffs argued, was missing from the record because so far, OpenAI had only shared samples of chat logs that users had agreed that the company could retain. Sharing the news plaintiffs' concerns, the judge, Ona Wang, ultimately agreed that OpenAI likely would never stop deleting that alleged evidence absent a court order, granting news plaintiffs' request to preserve all chats.

OpenAI argued the May 13 order was premature and should be vacated until, "at a minimum," news organizations can establish a substantial need for OpenAI to preserve all chat logs. The company warned that the privacy of hundreds of millions of ChatGPT users globally is at risk every day that the "sweeping, unprecedented" order continues to be enforced.

"As a result, OpenAI is forced to jettison its commitment to allow users to control when and how their ChatGPT conversation data is used, and whether it is retained," OpenAI argued.

Meanwhile, there is no evidence beyond speculation yet supporting claims that "OpenAI had intentionally deleted data," OpenAI alleged. And supposedly there is not "a single piece of evidence supporting" claims that copyright-infringing ChatGPT users are more likely to delete their chats.

"OpenAI did not 'destroy' any data, and certainly did not delete any data in response to litigation events," OpenAI argued. "The Order appears to have incorrectly assumed the contrary."

At a conference in January, Wang raised a hypothetical in line with her thinking on the subsequent order. She asked OpenAI's legal team to consider a ChatGPT user who "found some way to get around the pay wall" and "was getting The New York Times content somehow as the output." If that user "then hears about this case and says, 'Oh, whoa, you know I'm going to ask them to delete all of my searches and not retain any of my searches going forward,'" the judge asked, wouldn't that be "directly the problem" that the order would address?

[...] Before the order was in place mid-May, OpenAI only retained "chat history" for users of ChatGPT Free, Plus, and Pro who did not opt out of data retention. But now, OpenAI has been forced to preserve chat history even when users "elect to not retain particular conversations by manually deleting specific conversations or by starting a 'Temporary Chat,' which disappears once closed," OpenAI said. Previously, users could also request to "delete their OpenAI accounts entirely, including all prior conversation history," which was then purged within 30 days.

While OpenAI rejects claims that ordinary users use ChatGPT to access news articles, the company noted that including OpenAI's business customers in the order made "even less sense," since API conversation data "is subject to standard retention policies." That means API customers couldn't delete all their searches based on their customers' activity, which is the supposed basis for requiring OpenAI to retain sensitive data.

"The court nevertheless required OpenAI to continue preserving API Conversation Data as well," OpenAI argued, in support of lifting the order on the API chat logs.

[...] It's unclear if OpenAI will be able to get the judge to waver if oral arguments are scheduled.

Wang previously justified the broad order partly due to the news organizations' claim that "the volume of deleted conversations is significant." She suggested that OpenAI could have taken steps to anonymize the chat logs but chose not to, only making an argument for why it "would not" be able to segregate data, rather than explaining why it "can't."


Original Submission

posted by hubie on Wednesday June 11, @11:40AM

Do you think Internet SEARCH has gone sucky-sucky-so-so? Can you imagine a better experience? Do you have some coding (dis)ability, perhaps even friends-with-similar-benefits?

Then you -- yes, you -- might be interested in a project a bunch of European research institutions have been working on for the past two years, and now -- June 6 -- have released to the public.

The project -- imaginatively named the Open Web Search Initiative -- offers all the elements of a modern-day search engine in convenient open source packages, along with 6.61 billion URLs, 923 TiB of data in total, and 1 TiB of freshly crawled data per day. The only thing left for you to do is to download a partial index of all that data to your own server(s) and develop your own custom software on top of it (see the sketch after the list below). Then ...

  1. Off to some VC millions
  2. ???
  3. Internet billions!!!
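
In case "develop your own custom software on top" sounds abstract, here is a toy sketch of the smallest possible version: an in-memory inverted index and an AND-query over a few pretend pages. It is purely illustrative; the initiative's real index formats and tooling are documented on its own site and none of them are used here.

    # Toy inverted index over a few pretend "crawled" pages: the smallest
    # possible version of building search software on top of crawl data.
    from collections import defaultdict

    pages = {
        "https://example.org/a": "open web search needs an open index",
        "https://example.org/b": "a search engine ranks pages pulled from the index",
        "https://example.org/c": "crawled data feeds the index every day",
    }

    # Map every token to the set of URLs it appears in.
    index = defaultdict(set)
    for url, text in pages.items():
        for token in text.lower().split():
            index[token].add(url)

    def search(query):
        """Return URLs containing every query term (simple AND semantics)."""
        terms = query.lower().split()
        if not terms:
            return set()
        return set.intersection(*(index.get(t, set()) for t in terms))

    print(search("search index"))   # pages mentioning both "search" and "index"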

Please do return a percent of your revenue to this site though -- those private massages for the editor do not come cheaply, you know -- and an additional percent for this sub's author. Thank you's!

(Postscript: in case you're looking for funding as an open source developer; also, there's a free event on June 19-20 in Brussels.)


Original Submission

posted by janrinok on Wednesday June 11, @06:58AM

'We're definitely on the back foot': U.S. risks losing fusion energy race to China, industry leaders warn:

The race to lead in artificial intelligence isn't the only event in which the U.S. and China are competing for dominance. The pursuit of fusion — the "Holy Grail" of clean energy — is also pitting the superpowers against each other, and American tech leaders worry China could surge ahead.

At a Technology Alliance conference on Tuesday, Washington state companies building commercial fusion technologies raised concerns about China's strategy to pour resources into fusion.

"The U.S. is not committed to fusion. China is, by orders of magnitude," said Ben Levitt, the head of R&D for Zap Energy, speaking on a fusion panel at the Seattle Investor Summit+Showcase.

While the U.S. government spent approximately $800 million a year on fusion efforts during the Biden administration, China is investing more than twice that annually, IEEE Spectrum and others report. The Trump administration has taken action supporting nuclear fission, which powers today's nuclear reactors, but has not shown the same interest in fusion. The sector has become increasingly reliant on venture capital to fund its progress.

China is also focused on training fusion physicists and engineers, while President Trump is slashing funding for scientific research.

Fusion is so highly sought after given its potential to provide nearly limitless, carbon-free power, which could be critical to meet growing energy demands from AI applications and the global push to decarbonize transportation, the electrical grid, heating and cooling, industrial applications and elsewhere.

"The U.S. started with a very good hand in fusion and has played it extremely poorly," Levitt said. "So, yeah, we're definitely on the back foot."

The conference panel also included Brian Riordan, co-founder and chief operating officer of Avalanche Energy, and Anthony Pancotti, co-founder and head of R&D for Helion Energy.

Riordan argued that while China appears to be making strides in the race, what matters even more is who develops the most affordable technology.

Physicists for decades have pursued fusion energy. But replicating the reactions that power the Sun and stars is massively challenging and requires technologies that can generate super high pressure and temperatures of 100 million degrees Celsius, and sustain those conditions — plus efficiently capture the energy that fusion produces.

In December 2022, the U.S. National Ignition Facility (NIF) at Lawrence Livermore National Laboratory hit a key milestone in fusion research, demonstrating that fusion reactions here on Earth could release more power than required to produce them.
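
For a sense of scale, the widely reported public figures from that shot (approximate values, not taken from this article) work out as follows:

    # Back-of-the-envelope gain calculation for the December 2022 NIF shot,
    # using approximate, widely reported public figures.
    laser_energy_mj = 2.05    # laser energy delivered to the target, in megajoules
    fusion_yield_mj = 3.15    # fusion energy released by the target, in megajoules

    target_gain = fusion_yield_mj / laser_energy_mj
    print(f"target gain ~= {target_gain:.2f}")   # about 1.5: more energy out than laser energy in

    # The facility drew roughly 300 MJ of electricity to fire the lasers, so the
    # milestone is gain at the target, not net gain for the plant as a whole.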

Images published in January revealed that China appears to be building a fusion research facility modeled on NIF — but even larger. Others suggest the site could be a giant Z-pinch machine — similar to the technology being pursued by Zap.

Years ago, a Chinese website posted a graphic of a fusion device that bore a troubling resemblance to Helion's technology, the company has said.

"We have seen copycats in China already, and it is terrifying," Pancotti said on Tuesday. "They can mobilize people and money at a scale that is beyond even what venture capital can do in this country. And so I think there's real concern there, and there's real concern around supply chain, too."

Added Levitt: "I wouldn't be surprised if every single one of our [fusion] concepts has a city designated to it in China."

When it comes to world-ending tech, I'm not sure I want it to be a race.


Original Submission

posted by janrinok on Wednesday June 11, @02:14AM

https://distrowatch.com/dwres.php?resource=showheadline&story=20007

The Ubuntu team is following Fedora's example and dropping GNOME's X11 session in the distribution's next version. The announcement for the change reads, in part:

"The login screen (powered by GDM) will no longer offer the Ubuntu on Xorg option. All sessions based on GNOME Shell and Mutter are now Wayland-only and users who rely on X11-specific behaviors will not be able to use the GNOME desktop environment on Xorg. We understand that some users still depend on Xorg's implementation of X11; for example, in remote desktop setups, or highly specialized workflows. If you require Xorg specifically, you can install and use a non-GNOME desktop environment. Xorg itself is not going away, only GNOME's support for Xorg."


Original Submission

posted by janrinok on Tuesday June 10, @09:32PM

https://distrowatch.com/dwres.php?resource=showheadline&story=20005

The Linux Mint team is testing a new application for providing fingerprint authentication.

"Linux Mint 22.2 will feature a brand new app called Fingwit. Fingwit is a fingerprint configuration tool. It detects if your computer has a fingerprint reader and lets you record your fingerprints. It then configures your system to use fingerprint authentication for: The login screen, the screensaver; sudo commands, admin apps (pkexec)."

Fingwit will work across desktop environments and should function on any systems that have a fingerprint reader and PAM authentication support.

The Linux Mint May newsletter also reminds people that Linux Mint 20.x is reaching the end of its five years of support. People running version 20.x are advised to either perform a fresh install of Linux Mint 22 or upgrade in place to version 21. Tips for upgrading are provided in the newsletter.


Original Submission

posted by janrinok on Tuesday June 10, @04:47PM

""After almost 100 years on the planet, I now understand the most important place on Earth is not on land, but at sea.""

(Sir David Attenborough, at the presentation of Oceans.)

The world's oceans hold between 50 and 60 times more carbon dioxide than is present in the atmosphere; each year they absorb about 30% of the CO2 being released into the atmosphere. Annually, between 86 and 94 million tonnes of fish are caught in the wild from oceans and seas, while aquaculture yielded 92.4 million tonnes in 2022.

In short, the oceans are pretty important for humanity.

Not everybody is convinced about that though. Bottom trawling is still a dominant fishing "tactic", there's so much plastic pollution you can use it as an orientation point from space -- with an expectation that the amount of plastic reaching the oceans will double each year until 2040 -- and now, more recently, there's the push to start mining the ocean floor with robots in search of precious metals.

Let's try to manage our oceans responsibly for future generations, argued the United Nations in 2023 in New York -- and drafted a first version of the High Seas Treaty. That draft is now being worked out further during a conference at Nice, France, running from June 9 until June 30.

While the main highlight being reported in the media is about declaring 30 percent of the oceans to be off-limits for human industrial activity (including fishing) by 2030, the treaty is ostensibly going to be about much more than that, if you look at the inputs to the draft treaty by different countries.

The European Union wants more financial assistance and market access for small-scale fisheries, combined with strategies to minimize bycatch and discard rates; the United States also wants more attention to small-scale fishing, along with better monitoring and collaboration and focus on the impact of climate change; China is worried about ecosystem restoration and protection of deep-sea ecosystems, Indonesia wants restrictions on fisheries subsidies as these promote overcapacity and overfishing, ... and so on; even Interpol wants to have a say about marine pollution.

About a hundred countries signed the original draft proposal in New York, formally called the Treaty on Biological Diversity Beyond National Jurisdiction. If 60 of those original signatories ratify the finalised treaty during this conference, the oceans will have their first (global) legal protection.

According to Greenpeace, less than 2 percent of the world's oceans is currently protected against human industrial activity.


Original Submission

posted by janrinok on Tuesday June 10, @12:07PM

Arthur T Knackerbracket has processed the following story:

As if the regular detainment of children isn’t bad enough, documents show that U.S. immigration authorities are adding their DNA to a criminal database. In less than five years, the U.S. has collected DNA samples from over 130,000 minors, including children as young as four.

Since 2020, US Customs and Border Protection has ramped up its contributions to the Combined DNA Index System (CODIS), overseen by the Federal Bureau of Investigation. CODIS stores DNA profiles from convicted offenders, unsolved crime scenes, and missing persons cases for local, state, and federal law enforcement use. The actual, physical DNA samples are stored indefinitely by the federal government. Excluding forensic ones, CODIS currently has 23 million DNA profiles, and as many as 133,539 of them belong to detained children and teenagers, according to documents reviewed by Wired.

According to Wired’s report, CBP collected samples from between 829,000 and 2.8 million people from October 2020 to the end of 2024. This came after the Department of Justice updated its DNA-collection regulations in 2020, removing the Department of Homeland Security’s exemption. The California Law Review critiqued the DOJ’s decision as one that “may be the first to result in the government’s widespread, permanent retention of genetic materials based solely on a status other than a criminal arrest or conviction.”

To comply with the DOJ’s orders, CBP launched a pilot program that same year to begin collecting more samples from detained immigrants. At the time, CBP said it would collect samples from people between the ages of 14 and 19. However, CBP’s policy gives officers some discretion when it comes to younger children, and they have taken advantage of that. Per Wired, CBP obtained samples from as many as 227 children under the age of 13. In one case, CBP officers in El Paso, Texas, sent samples from a 4-year-old child to the FBI for processing.

“In order to secure our borders, CBP is devoting every resource available to identify who is entering our country. We are not letting human smugglers, child sex traffickers, and other criminals enter American communities,” Hilton Beckham, assistant commissioner of public affairs at CBP, told Wired. “Toward this end, CBP collects DNA samples for submission to [CODIS] from person[s] in CBP custody who are arrested on federal criminal charges, and from aliens detained under CBP’s authority who are subject to fingerprinting and not otherwise exempt from the collection requirement.”

However, Stephanie Glaberson, the director of research and advocacy at Georgetown University’s Center on Privacy and Technology, told Gizmodo via email, “The revelation that CBP collected DNA from a 4-year-old and added it to CODIS brings the absurdity of the government’s DNA program into sharp relief.”

Recently, the Center released a report stating that CBP has added 1.5 million DNA profiles to CODIS since 2020 where they are now housed under the “offender” label. Overall, ICE and CBP’s number of collected samples has increased by 5,000 percent but those numbers are even more shocking when broken down into smaller periods of time. According to Wired, officers in Laredo, Texas submitted as many as 3,930 DNA samples to the FBI with 252 listed as 17 or younger. Every sample was collected on a single day in January 2024.

Two years ago, the FBI requested a massive increase in funding to continue maintaining the system. With immigration authorities’ contributions, CODIS is likely to continue ballooning exponentially. As Georgetown’s report explained, there are several limitations on criminal law enforcement obtaining DNA samples. When it comes to immigrants, however, the only limitation is that they must be detained.

“The meaning of the term ‘detained’ in the immigration context is notoriously broad, vague, and ever-shifting,” the report stated. “And unlike in the criminal legal systems, ICE and CBP agents do not have to get judicial authorization to detain someone. There is no process for checking to make sure that every time they do detain someone, they meet constitutional requirements.”

“The lack of procedural safeguards means that DHS can amass data at a much quicker rate than police can, but all of the DNA DHS takes is accessible to the police,” the report added.

“No matter the age of the individuals compelled to hand over this most sensitive information, this program is morally bankrupt and unconstitutional,” Glaberson told Gizmodo. “Collecting migrants’ DNA like this serves no legitimate immigration purpose. What it does is place these individuals, their families, and communities under watch for life, and brings us all one huge step closer to genetic surveillance.”


Original Submission

posted by hubie on Tuesday June 10, @07:19AM
from the I-thought-the-FDA-was-against-hallucinogens dept.

An agency-wide LLM called Elsa was released weeks ahead of schedule:

Under the Trump administration, the Food and Drug Administration is eagerly embracing artificial intelligence tools that staff members are reportedly calling rushed, buggy, overhyped, and inaccurate.

On Monday, the FDA publicly announced the agency-wide rollout of a large language model (LLM) called Elsa, which is intended to help FDA employees—"from scientific reviewers to investigators." The FDA said the generative AI is already being used to "accelerate clinical protocol reviews, shorten the time needed for scientific evaluations, and identify high-priority inspection targets."

"It can summarize adverse events to support safety profile assessments, perform faster label comparisons, and generate code to help develop databases for nonclinical applications," the announcement promised.

In a statement, FDA Chief AI Officer Jeremy Walsh trumpeted the rollout, saying: "Today marks the dawn of the AI era at the FDA[. W]ith the release of Elsa, AI is no longer a distant promise but a dynamic force enhancing and optimizing the performance and potential of every employee."

Meanwhile, FDA Commissioner Marty Makary highlighted the speed with which the tool was rolled out. "I set an aggressive timeline to scale AI agency-wide by June 30," Makary said. "Today's rollout of Elsa is ahead of schedule and under budget, thanks to the collaboration of our in-house experts across the centers."

However, according to a report from NBC News, Elsa could have used some more time in development. FDA staff tested Elsa on Monday with questions about FDA-approved products or other public information, only to find that it provided summaries that were either completely or partially wrong.

FDA staffers who spoke with Stat news, meanwhile, called the tool "rushed" and said its capabilities were overinflated by officials, including Makary and those at the Department of Government Efficiency (DOGE), which was headed by controversial billionaire Elon Musk. In its current form, it should only be used for administrative tasks, not scientific ones, the staffers said.

"Makary and DOGE think AI can replace staff and cut review times, but it decidedly cannot," one employee said. The staffer also said that the FDA has failed to set up guardrails for the tool's use. "I'm not sure in their rush to get it out that anyone is thinking through policy and use," the FDA employee said.

According to Stat, Elsa is based on Anthropic's Claude LLM and is being developed by consulting firm Deloitte. Since 2020, Deloitte has been paid $13.8 million to develop the original database of FDA documents that Elsa's training data is derived from. In April, the firm was awarded a $14.7 million contract to scale the tech across the agency. The FDA said that Elsa was built within a high-security GovCloud environment and offers a "secure platform for FDA employees to access internal documents while ensuring all information remains within the agency."


Original Submission

posted by hubie on Tuesday June 10, @02:31AM
from the Amazon-for-everything? dept.

Part science outlet, part Radio Shack, part curio cabinet—American Science & Surplus is unique:

It was shortly after moving into Chicago's Jefferson Park neighborhood that I saw the sign for the first time: American Science & Surplus. My curiosity piqued, I pulled into the strip mall and walked into a store filled with an unimaginable variety of lab equipment, military surplus, tools, electronics, toys, and so much more.

Now, nearly 90 years after its launch selling "reject lenses" as American Lens & Photo, American Science & Surplus is facing an existential threat. The COVID-19 pandemic and increased costs hit the business hard, so the store has launched a GoFundMe campaign looking to raise $200,000 from customers and fans alike. What's happening in suburban Chicago is a microcosm of the challenges facing local retail, with big-box retailers and online behemoths overwhelming beloved local institutions. It's a story that has played out countless times in the last two-plus decades, and owner Pat Meyer is hoping this tale has a different ending.

Launching a fundraiser was a tough choice for Meyer. "I don't like asking people for money," he said.

With his voice catching, he continued: "It's hard for me to talk about sometimes, because the more I'm in the store, the more I see how much people care about it and don't want it to go away."

And the current environment is tough for small business owners. "Banks... are real hesitant [about lending] money," Meyer told Ars. "Interest rates are high, too. So we decided that we were going to try and reach out to the community that we built over the last 88 years."

[...] Over time, the store has moved far beyond lenses and lab equipment. There's a science toy section and an aisle devoted to Etsy-style craft supplies. But other, once-thriving areas of the business have suffered. When I first discovered American Science & Surplus in the early 2000s, I would always linger at the massive telescope section. The store staff was always more than happy to answer my questions and explain the differences between the scopes. Now, telescopes are just a small corner of the store, and sales are infrequent. "People come in to ask questions and then buy the telescopes online," Meyer explained.

In many ways, American Science & Surplus is a physical manifestation of the maker ethos. There is an endless array of motors, switches, cables, tools, and connectors. "Sometimes our customers will send us photos of their creations," said Meyer. "It's always cool to see how people are inspired by shopping here."

The store should feel familiar to those who were alive in the peak days of Radio Shack. In fact, there used to be a Radio Shack in the same strip mall as American Science & Surplus' old store in the Jefferson Park neighborhood on Chicago's northwest side. Meyer said that Radio Shack would frequently send customers a few doors down to his store to find things Radio Shack didn't stock. And one time, the surplus store sent a customer back. "Radio Shack sent one guy over to us after telling him they didn't have the item in stock," Meyer said. "We didn't have it, but one of our associates knew Radio Shack did, so he walked the customer back, pulled the part out of the bin, and handed it to him."

[...] American Science & Surplus has adapted over the years. There's now a well-stocked section of science toys. And Meyer has started hosting science nights. The next one, slated for June 7 at the Park Ridge store, will double as a fundraiser—in addition to the usual science experiments and demonstrations, there will be a silent auction and live music.

What will Meyer do with the money if the fundraising goal is reached? "We have to move our warehouse," he said. "It's too expensive, it's too big." Other plans include updating its operating software and updating the website. A quick look at the About Us page of the current site shows the need for an update. It contains a warning that the "heavy use of tables" may not be supported in all browsers—paired with a suggestion to download Netscape Navigator.

As of this writing, the GoFundMe campaign has raised $136,903. Meyer says contributing isn't just about supporting American Science & Surplus; it's about supporting local retail during a very challenging time. "Who wants to buy everything at Amazon, Walmart, Temu, and Target?" he asked.

[Ed. note: I've purchased oddball optics from them before and they truly are unique --hubie]


Original Submission

posted by hubie on Monday June 09, @09:45PM

New technologies help wood-burning stoves burn more efficiently, produce less smoke

Oregon State University researchers are gaining a more detailed understanding of emissions from wood-burning stoves and developing technologies that allow stoves to operate much more cleanly and safely, potentially limiting particulate matter pollution by 95%.

The work has key implications for human health as wood-burning stoves are a leading source of PM2.5 emissions in the United States. PM2.5 refers to fine particulate matter with a diameter of 2.5 micrometers or smaller that can be inhaled deeply into the lungs and even enter the bloodstream. Exposure to PM2.5 is a known cause of cardiovascular disease and is linked to the onset and worsening of respiratory illness.

Even though a relatively small number of households use wood stoves, they are the U.S.'s third-largest source of particulate matter pollution, after wildfire smoke and agricultural dust, said Nordica MacCarty of the OSU College of Engineering.

Residential wood combustion, especially the use of inefficient stoves, is also a significant source of other harmful emissions including polycyclic aromatic hydrocarbons, carbon monoxide, nitrogen oxides, methane, benzene and formaldehyde.

"Wood is an affordable, local, renewable, low-carbon fuel that should be an important part of the U.S. energy mix, but it must be burned cleanly to effectively protect health," MacCarty said.

"Folks typically think of pollution as coming from vehicles and industry, but household wood stoves are a larger source—just a few smoky stoves can create a harmful effect on air quality in an entire community."

MacCarty published a paper in the Journal of the Air & Waste Management Association showing that 70% of the pollution emitted from wood stove flues happens at two points in time: when a stove is first lit, and when it's reloaded. MacCarty's team gained that knowledge by developing a new monitoring technique and deploying equipment at a collection of wood stove users' homes in rural Oregon.

According to the Environmental Protection Agency, there are an estimated 6.5 million inefficient stoves in the U.S., most of them models that predate EPA clean-burning standards. In all, there are roughly 10 million wood-burning stoves in the country, or one for every 35 people.

"A lot of the older stoves are essentially just metal boxes with chimneys and they don't incorporate modern engineering principles to optimize heat transfer and combustion," said MacCarty, the Richard & Gretchen Evans Professor of Humanitarian Engineering and an associate professor of mechanical engineering.

"They have no catalysts or secondary combustion to reduce emissions and lower the risk of creosote buildup that can cause chimney fires."

MacCarty's group is developing automated technologies that inject jets of primary and secondary air into the fire to provide just the right amount of air and mixing at the right time and place in the fire. Prototypes are showing about a 95% reduction in particulate matter emissions compared to older models, she said.

The EPA has been reducing the allowable PM2.5 emissions rate regularly since the 1980s. In 2015 it was 4 grams per hour for cordwood stoves, and five years later it was reduced to 2.5 grams per hour. Regulation is driving innovation as stove makers improve their designs to meet certification requirements, MacCarty said.

But wood stoves perform differently in the lab than they do in real life, she noted, and stoves are certified based on laboratory tests—and often designed to pass the tests, rather than to operate well in someone's home.

"It's difficult to measure wood stove emissions in the field, so there has been relatively little in-use performance data available in the past to guide designs," MacCarty said. "Our study introduces a new system that makes collecting real-world emissions data more practical."

The project included Oregon State undergraduate student Jonah Wald and was a collaboration between OSU and the nonprofit Aprovecho Research Center based in Cottage Grove, Oregon. It builds on OSU and Aprovecho's ongoing work on efficient combustion for cooking with wood in the developing world.

Roughly 2.7 billion people rely on open fires for cooking, MacCarty said, and her team has been designing efficient cook stoves for them to use instead.

More information: Samuel Bentson et al, In-situ measurements of emissions and fuel loading of non-catalytic cordwood stoves in rural Oregon, Journal of the Air & Waste Management Association (2025). DOI: 10.1080/10962247.2025.2483217


Original Submission

posted by hubie on Monday June 09, @04:58PM
from the everyone-on-the-web-owes-him dept.

Several sites are reporting that the legendary programmer Bill Atkinson has died. He contributed QuickDraw to the early Macintosh and was also responsible for MacPaint and HyperCard. The former, MacPaint, inspired Photoshop. The latter, HyperCard, can be considered an important milestone in computing even though it lacked the networking on which the WWW is built.

He designed a program where information—text, video, audio—would be stored on virtual cards. These would link to each other. It was a vision that harkened back to a 1940s idea by scientist Vannevar Bush which had been sharpened by a technologist named Ted Nelson, who called the linking technique "hypertext." But it was Atkinson who made the software work for a popular computer. When he showed the program, called HyperCard, to Apple CEO John Sculley, the executive was blown away, and asked Atkinson what he wanted for it. "I want it to ship," Atkinson said. Sculley agreed to put it on every computer. HyperCard would become a forerunner of the World Wide Web, proof of the viability of the hyperlinking concept.

Bill Atkinson, Macintosh Pioneer and Inventor of Hypercard, Dies at 74, Wired.

The Internet Archive has a digitized edition of a two-part interview with him, recorded in 1985 and originally aired on KFOX.


Original Submission

posted by mrpg on Monday June 09, @12:11PM
from the wifi-h dept.

Arthur T Knackerbracket has processed the following story:

In a significant advance for brain-computer interface (BCI) technology, a University of Michigan research team has achieved the first in-human recording using Paradromics' Connexus device – a wireless, fully implantable BCI designed to restore communication and movement for people living with severe neurological conditions.

The procedure took place on May 14, 2025, during epilepsy surgery, where the device was temporarily placed on the patient's temporal lobe, an area essential for processing sound and memory. This opportunity allowed the team to safely test the device's ability to capture neural signals without adding risk to the patient, as the surgery already required access to the brain.

The Connexus stands out for its compact size – smaller than a dime – and its high-density array of 421 microelectrodes, each thinner than a human hair. Unlike many earlier BCIs, which often relied on fewer electrodes and required external wires, Connexus is engineered to be fully implantable.

The device collects electrical signals from individual neurons, transmitting them via a thin lead to a transceiver implanted in the chest. From there, the data is sent wirelessly to an external computer, where artificial intelligence algorithms interpret the patterns and translate them into actions, such as moving a cursor or generating synthesized speech.
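
The article does not say how Paradromics decodes the signals, but the "interpret the patterns and translate them into actions" step is commonly illustrated with a simple linear decoder fitted on calibration data. The sketch below is that generic textbook approach, with simulated data standing in for real recordings; only the 421-channel count is taken from the article, everything else is invented.

    # Generic linear decoder: map per-channel spike counts to 2-D cursor velocity.
    # Not Paradromics' algorithm; simulated data for illustration only.
    import numpy as np

    N_CHANNELS = 421
    rng = np.random.default_rng(0)

    # Calibration data: spike counts per channel, plus the cursor velocity the
    # user was trying to produce at the same moment (both simulated here).
    X_train = rng.poisson(5.0, size=(2000, N_CHANNELS)).astype(float)
    true_weights = rng.normal(0, 0.05, size=(N_CHANNELS, 2))
    y_train = X_train @ true_weights + rng.normal(0, 0.5, size=(2000, 2))

    # Fit a ridge-regularized linear map from neural activity to velocity.
    lam = 1.0
    A = X_train.T @ X_train + lam * np.eye(N_CHANNELS)
    W = np.linalg.solve(A, X_train.T @ y_train)

    # At run time, a new window of spike counts becomes a cursor command.
    x_now = rng.poisson(5.0, size=(1, N_CHANNELS)).astype(float)
    vx, vy = (x_now @ W)[0]
    print(f"decoded cursor velocity: ({vx:+.2f}, {vy:+.2f})")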

[...] The potential applications of Connexus extend beyond restoring speech and movement. By decoding neural signals at the level of individual neurons, the technology could one day help address mental health conditions or chronic pain by interpreting mood or discomfort directly from brain activity.


Original Submission

posted by mrpg on Monday June 09, @07:27AM
from the mega-giga-tera-peta-exa dept.

Arthur T Knackerbracket has processed the following story:

What happened to all the megafauna? From moas to mammoths, many large animals went extinct between 50,000 and 10,000 years ago. Learning why could provide crucial evidence about prehistoric ecosystems and help us understand future potential extinctions. But surviving fossils are often too fragmented to determine the original species, and DNA is not always recoverable, especially in hot or damp environments.

Now scientists have isolated collagen peptide markers which allow them to identify three key megafauna that were once present across Australia: a hippo-sized wombat, a giant kangaroo, and a marsupial with enormous claws.
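
Collagen peptide mass fingerprinting of this kind boils down to comparing peptide masses measured from a fossil against reference marker masses established for each candidate taxon. Here is a toy sketch of that matching step; the masses and taxon labels are invented for illustration and are not the markers published in the paper.

    # Toy peptide-mass matching: score each candidate taxon by how many of its
    # reference marker masses appear among the measured peaks. All values invented.
    TOLERANCE = 0.2   # mass tolerance in daltons

    reference_markers = {
        "giant_wombat_like":   [1105.6, 2143.2, 2883.4],
        "giant_kangaroo_like": [1105.6, 2131.1, 2869.5],
        "clawed_marsupial":    [1093.5, 2131.1, 2899.7],
    }

    measured_peaks = [1105.55, 2143.25, 2883.30]   # masses read off a spectrum

    def score(markers, peaks, tol=TOLERANCE):
        """Count reference markers matched by an observed peak within tolerance."""
        return sum(any(abs(m - p) <= tol for p in peaks) for m in markers)

    scores = {taxon: score(m, measured_peaks) for taxon, m in reference_markers.items()}
    best = max(scores, key=scores.get)
    print(scores, "-> best match:", best)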

"The geographic range and extinction date of megafauna in Australia, and potential interaction with early modern humans, is a hotly debated topic," said Professor Katerina Douka of the University of Vienna, senior author of the article in Frontiers in Mammal Science.

"The low number of fossils that have been found at paleontological sites across the country means that it is difficult to test hypotheses about why these animals became extinct," explained first author Dr. Carli Peters of the University of Algarve.


Original Submission

posted by mrpg on Monday June 09, @02:46AM
from the maybe dept.

Are Dead Sea Scrolls older than we thought?:

Over the years, scholars of the Dead Sea Scrolls have analyzed the ancient parchments with various methods: for example, X-rays, multispectral imaging, "virtual unfolding," and paleography, i.e., studying elements in their writing styles. The scrolls are believed to date back to between the third century BCE and the first century CE, but those dates rely largely on paleography, since only a handful of the scrolls have calendar dates written on them.

However, the traditional paleographic method is inherently subjective and based on a given scholar's experience. A team of scientists has combined radiocarbon dating from 24 scroll samples and machine-learning-based handwriting analysis to create their own AI program—dubbed Enoch. The objective was to achieve more accurate date estimates, according to a new paper published in the journal PLoS ONE. Among the findings: Many of the scrolls are older than previously thought.

[...] The development of Enoch grew out of the team's earlier deep neural network for ferreting out handwritten ink-trace patterns in digitized manuscripts, involving micro-level geometric shape analysis. "Enoch emphasizes shared characteristics and similarity matching between trained and test manuscripts, where traditional paleography focuses on subtle differences that are assumed to be indicative for style development," the authors wrote. "Combining dissimilarity matching and adaptive reinforcement learning can uncover hidden patterns."
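
As a rough illustration of the calibrate-then-predict idea (writing-style features regressed against radiocarbon dates, then used to estimate dates for undated manuscripts), here is a minimal sketch. Only the figure of 24 radiocarbon-dated samples comes from the article; the features, dates, and regression details are invented and this is not the Enoch model itself.

    # Sketch: fit a regressor on manuscripts with C-14 dates, then estimate a
    # date for a manuscript that has only handwriting-style features.
    import numpy as np

    rng = np.random.default_rng(1)

    # Each row: geometric shape features extracted from digitized ink traces
    # (e.g., curvature, stroke-width statistics); y: calibrated C-14 date,
    # negative numbers meaning BCE. All values here are simulated.
    n_cal, n_feat = 24, 10
    X_cal = rng.normal(size=(n_cal, n_feat))
    y_cal = -150 + 60 * X_cal[:, 0] + rng.normal(0, 15, size=n_cal)

    # Ridge regression keeps the fit stable with so few calibration samples.
    lam = 1.0
    Xb = np.hstack([X_cal, np.ones((n_cal, 1))])          # add an intercept column
    W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(n_feat + 1), Xb.T @ y_cal)

    # Date estimate for an undated manuscript from its style features alone.
    x_new = rng.normal(size=n_feat)
    estimate = np.append(x_new, 1.0) @ W
    print(f"estimated date: {estimate:.0f} (negative = BCE)")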

They tested Enoch by having paleographic experts evaluate the AI program's age estimate for several scrolls. The results: About 79 percent of Enoch's estimates were deemed "realistic," while its age estimates for the remaining 21 percent were either too young, too old, or just indecisive.

This new model revealed that many of the Dead Sea Scrolls are older than previous estimates based solely on paleography. That should be relevant for the question of when two ancient Jewish script styles—"Hasmonean" and "Herodian"—developed, for example. The former script was thought to have emerged between 150 and 50 BCE, but the authors believe Hasmonean could have emerged much earlier; ditto for the Herodian script. So both scripts may have coexisted since the late second century BCE, challenging the prevailing view that they only came into use by the mid-first century BCE.

Journal Reference:
Mladen Popović, Maruf A. Dhali, Lambert Schomaker, et al. Dating ancient manuscripts using radiocarbon and AI-based writing style analysis, PLOS ONE (DOI: 10.1371/journal.pone.0323185)


Original Submission