Attorneys General, HCA Settle Over Nurse Training Repayment Provisions:
California Attorney General Rob Bonta has announced a settlement with HCA Healthcare Inc. and Health Trust Workforce Solutions LLC (together, HCA), resolving allegations that HCA unlawfully required entry-level nurse employees to repay the cost of a mandatory training program if they did not remain employed with the company for two years.
One of the nation's largest hospital systems, for-profit HCA has several hospitals in California.
Today's settlement is the result of a years-long investigation by attorneys general in California, Colorado and Nevada, working in partnership with the Biden Administration's Consumer Financial Protection Bureau. The states' investigation found that HCA violated California employment and consumer protection laws as well as the federal consumer financial protection laws by using training repayment agreement provisions (TRAPs) in nurses' employment contracts. These TRAPs are a form of employer-driven debt, or debt obligations incurred by individuals through employment arrangements.
Here is how the California attorney general's office described HCA's nursing training program and the settlement: As a condition of employment at an HCA hospital, HCA generally requires that entry-level nurse employees complete the Specialty Training Apprenticeship for Registered Nurses (StaRN) Residency Program. The company has advertised StaRN as an avenue for entry-level RNs to get the education and training they need to land their first nursing jobs in an acute-care hospital setting, although StaRN does not provide nurses with education or training necessary for licensure as an RN.
Until the spring of 2023, HCA required that RNs hired through the StaRN program at facilities in several states, including California, sign a TRAP agreement in their new-hire paperwork. The TRAPs purported to require nurses to repay a prorated portion of the StaRN "value" if they did not work for HCA for two years. If a nurse left HCA before the end of the two-year period, then the TRAP loan was typically sent to debt collection.
HCA imposed TRAPs on nurses who worked at its five hospitals in California: Good Samaritan Hospital in San Jose; Regional Medical Center in San Jose; Los Robles Regional Medical Center in Thousand Oaks; Riverside Community Hospital in Riverside; and West Hills Hospital & Medical Center in West Hills (no longer under HCA ownership).
Under California's settlement, HCA will:
• Pay approximately $83,000 to provide full restitution to California nurses who made payments on their TRAP debt to HCA.
• Be prohibited from imposing TRAPs on nurse employees and attempting to collect on the approximately $288,000 in outstanding TRAP debt incurred by California nurses who signed TRAPs with HCA.
• Pay $1,162,900 in penalties to California.
• HCA will pay a total of $2,900,000 in penalties under settlements filed in California, Colorado, and Nevada today.
"All too often, employer-driven debt forces workers to remain in jobs that they would otherwise leave. That's not just wrong; it's illegal under state and federal law. Workers must be able to pursue better pay and better working conditions — not be trapped by debt that their employer makes them take out," said Attorney General Bonta in a statement. "I'm grateful to my fellow attorneys general in Colorado and Nevada for their partnership. With today's settlement, we are taking a stand for workers in our states by holding HCA Healthcare accountable — ensuring that all affected nurses are made whole financially, that the company pays a penalty for its wrongdoing, and that the company is subject to strong injunctive terms to deter future misconduct."
Nursing unions applauded the settlement. "California Nurses Association and our national union, National Nurses United, want to thank Attorney General Bonta for his leadership in addressing this growing trend of employers, such as HCA, using debt repayment contracts to lock nurses and other workers into jobs," said Sandy Reding, R.N., president of the California Nurses Association, in a statement. "HCA, the largest for-profit hospital system in the country, has a shameful track record of using predatory stay-or-pay contracts, or Training Repayment Agreement Provisions (TRAPS), which handcuff nurses to our employers through the threat of serious financial consequences or ruin. No nurses and no other workers should be locked into a job under the weight of debt to their employer."
Politico reports on a hack affecting Federal Courts in the USA:
The identities of confidential court informants are feared compromised in a series of breaches across multiple U.S. states.
The electronic case filing system used by the federal judiciary has been breached in a sweeping cyber intrusion that is believed to have exposed sensitive court data across multiple U.S. states, according to two people with knowledge of the incident.
The hack, which has not been previously reported, is feared to have compromised the identities of confidential informants involved in criminal cases at multiple federal district courts, said the two people, both of whom were granted anonymity because they were not authorized to speak publicly about the hack.
The Administrative Office of the U.S. Courts — which manages the federal court filing system — first determined how serious the issue was around July 4, said the first person. But the office, along with the Justice Department and individual district courts around the country, is still trying to determine the full extent of the incident.
It is not immediately clear who is behind the hack, though nation-state-affiliated actors are widely suspected, the people said. Criminal organizations may also have been involved, they added.
The Administrative Office of the U.S. Courts declined to comment. Asked whether it is investigating the incident, the FBI referred POLITICO to the Justice Department. The Justice Department did not immediately reply to a request for comment.
It is not immediately clear how the hackers got in, but the incident is known to affect the judiciary's federal core case management system, which includes two overlapping components: Case Management/Electronic Case Files, or CM/ECF, which legal professionals use to upload and manage case documents; and PACER, a system that gives the public limited access to the same data.
In addition to records on witnesses and defendants cooperating with law enforcement, the filing system includes other sensitive information potentially of interest to foreign hackers or criminals, such as sealed indictments detailing non-public information about alleged crimes, and arrests and search warrants that criminal suspects could use to evade capture.
Chief judges of the federal courts in the 8th Circuit — which includes Arkansas, Iowa, Minnesota, Missouri, Nebraska, North Dakota, and South Dakota — were briefed on the hack at a judicial conference last week in Kansas City, said the two people. It is unclear who delivered the brief, though the Director of the Administrative Office of the U.S. Courts, Judge Robert J. Conrad, Jr., was in attendance, per the first person. Supreme Court Justice Brett Kavanaugh was also in attendance but didn't address the breach in his remarks.
Staff for Conrad, a district judge in the Western District of North Carolina, declined to comment.
The hack is the latest sign that the federal court filing system is struggling to keep pace with a rising wave of cybersecurity threats.
Michael Scudder, who chairs the Committee on Information Technology for the federal courts' national policymaking body, told the House Judiciary Committee in June that CM/ECF and Pacer are "outdated, unsustainable due to cyber risks, and require replacement."
He also said that because the federal Judiciary holds such sensitive information, it faces "unrelenting security threats of extraordinary gravity."
As of July 2022, the Justice Department was investigating another hack of the federal court system that then-House Judiciary Committee Chair Jerrold Nadler (D-N.Y.) described as "startling." The incident involved three foreign hacking groups and dated back to early 2020, Nadler also said. It is not clear who the foreign hackers were or whether these incidents are connected.
"It's the first time I've ever seen a hack at this level," said the first of the two people, who has spent more than two decades on the federal judiciary.
The second person said that roughly a dozen court dockets were tampered with in one court district as a result of the hack. The first person was not aware of any tampering but said it was theoretically possible.
The incident does not appear to have exposed the most highly protected federal court witnesses, since the real identities of those thought to face exceptional risk for cooperating are held on separate systems maintained by the Justice Department, according to the first person.
During his testimony before the House Judiciary Committee, Scudder said that replacing CM/ECF and PACER was a "top priority" for the federal judiciary, but that a more modernized system would have to "be developed and rolled out on an incremental basis." He also called CM/ECF and Pacer the "backbone system federal courts depend on for mission-critical, day-to-day operation."
When a droplet falls on a surface, it spreads itself horizontally into a thin lamella. Sometimes — depending on factors like viscosity, impact speed, and air pressure — that drop splashes, breaking up along its edge into myriad smaller droplets. But a new study finds that a small electrical charge is enough to suppress a drop's splash, as seen below.
The electrical charge builds up along the drop's surface, providing an attraction that acts somewhat like surface tension. As a result, charged drops don't lift off the surface as much and they spread less overall; both factors inhibit splashing.* The effect could increase our control of droplets in inkjet printing, allowing for higher resolution printing.
*Note that this only works for non-conductive surfaces. If the surface is electrically conductive, the charge simply dissipates, allowing the splash to occur as normal.
Journal Reference:
Fanfei Yu, Aaron D. Ratschow, Ran Tao, et al. Why Charged Drops Do Not Splash, Physical Review Letters (DOI: 10.1103/PhysRevLett.134.134001)
Ever since the popularity of 3D-printing skyrocketed in the mid-aughts, people have manufactured everything from chocolate to rocket fuel—and that list now includes a microscopic elephant inside of a living cell. Technology has really leveled up since 2005.
As new biological opportunities for 3D printing keep emerging, a team of researchers—from the J. Stefan Institute, University of Ljubljana, and CENN Nanocenter in Slovenia—have found a way to pull the process off within a cell's cytoplasm. They successfully printed not only an elephant, but several other impossibly small structures using a liquid polymer and a hyperfocused petawatt laser.
"Intracellular 3D printing offers an unprecedented degree of control over the cellular interior, allowing the integration of synthetic structures with native biological functions," the team said in a study recently posted to the preprint server arXiv. "This platform could allow for reconfiguration of cellular architecture, embed logic or mechanical components within the cytoplasm, and design cells with enhanced or entirely new properties."
For this experiment, the team used a negative photoresist (a material that changes when exposed to certain wavelengths of energy), which became insoluble when exposed to light. It was also the most biocompatible formula possible. After a droplet of photoresist was injected into the cell, an object was printed using a process called two-photon photolithography, which involves targeting an area inside the droplet with a laser to create a microstructure. Anything zapped with two photons from the laser hardens, while any remaining photoresist that has not been lasered into a structure dissolves.
Along with the ironically tiny 10-micrometer elephant, the research team printed other microstructures, like barcodes and a sphere that acted as a micro-laser. The former could eventually allow scientists to track what is going on inside individual cells, and give experts much more detailed insight into cellular function than is currently possible. The latter could be produced in various sizes that all emit light slightly differently, labeling cells with specific light signatures.
Surviving cells continued to go on as if nothing had happened. When a few of them divided, the microstructure inside was passed down to one of the daughter cells. Viability was still an issue, however—even the biocompatible photoresist was still somewhat toxic, and injecting liquid polymer damaged the cell membrane and sometimes caused cell death. How likely cells were to survive depended on the type of cell, and in total, about half of the cells that had microstructures printed in them made it through the experiment.
Wikipedia loses challenge against Online Safety Act verification rules:
Wikipedia has lost a legal challenge to new Online Safety Act rules which it says could threaten the human rights and safety of its volunteer editors.
The Wikimedia Foundation - the non-profit which supports the online encyclopaedia - wanted a judicial review of regulations which could mean Wikipedia has to verify the identities of its users.
But it said despite the loss, the judgement "emphasized the responsibility of Ofcom and the UK government to ensure Wikipedia is protected".
The government told the BBC it welcomed the High Court's judgment, "which will help us continue our work implementing the Online Safety Act to create a safer online world for everyone".
Judicial reviews challenge the lawfulness of the way in which a decision has been made by a public body.
In this case the Wikimedia Foundation and a Wikipedia editor tried to challenge the way in which the government decided to make regulations covering which sites should be classed "Category 1" under the Online Safety Act - the strictest rules sites must follow.
It argued the rules were logically flawed and too broad, meaning a policy intended to impose extra rules on large social media companies would instead apply to Wikipedia.
In particular the foundation is concerned the extra duties required - if Wikipedia was classed as Category 1 - would mean it would have to verify the identity of its contributors, undermining their privacy and safety.
The only way it could avoid being classed as Category 1 would be to cut the number of people in the UK who could access the online encyclopaedia by about three-quarters, or disable key functions on the site.
The government's lawyers argued that ministers had considered whether Wikipedia should be exempt from the regulations but had reasonably rejected the idea.
Wikipedia can challenge Online Safety Act if strictest rules apply to it, says judge:
The operator of Wikipedia has been given permission by a high court judge to challenge the Online Safety Act if it is categorised as a high-risk platform, which would impose the most stringent duties.
The Wikimedia Foundation has said it might be forced to reduce how many people can access the site in order to comply with the regulations if it is classified as a category 1 provider by Ofcom later this summer.
As a non-profit, the site said, it "would face huge challenges to meet the large technological and staffing needs" required to comply with the duties, which include user-verification requirements, stringent protections for users and regular reporting responsibilities to prevent the spread of harmful content.
The Wikimedia Foundation calculated that the number of people in the UK who access Wikipedia would have to be reduced by about three-quarters in order for the site to not qualify as a category 1 service, which is defined as a large user-to-user platform that uses algorithmic content recommendations.
It said Wikipedia was different to other sites expected to be labelled as category 1 providers, such as Facebook, X and Instagram, because it was run by a charity and its users typically only encountered content that they sought out.
Mr Justice Johnson refused Wikipedia's legal challenge in the high court on several grounds, but he noted that the site "provides significant value for freedom of speech and expression" and added that the outcome did not give Ofcom or the government "a green light to implement a regime that would significantly impede Wikipedia's operations".
Any decision to make Wikipedia a category 1 provider would have to be "justified as proportionate if it were not to amount to a breach of the right to freedom of expression", he said, but he added that it would be "premature" to rule on this since Ofcom had not yet determined that Wikipedia was a category 1 service.
If Ofcom determines that Wikipedia is a category 1 service and this means Wikipedia is unable to operate as at present, Johnson suggested that the technology secretary, Peter Kyle, should "consider whether to amend the regulations or to exempt categories of service from the act" and said Wikipedia could bring a further challenge if he did not.
Phil Bradley-Schmieg, the lead counsel at the Wikimedia Foundation, said: "While the decision does not provide the immediate legal protections for Wikipedia that we hoped for, the court's ruling emphasised the responsibility of Ofcom and the UK government to ensure Wikipedia is protected as the OSA [Online Safety Act] is implemented.
https://distrowatch.com/?newsid=12524
https://archive.ph/pDIIb
Exactly one year after Kaisen Linux's most recent release candidate, the project has announced version 3.0 of its Debian-based, desktop distribution. In an unusual move, the release announcement also includes a report that the distribution is being discontinued. "I would like to begin this blog post by announcing the end of the Kaisen Linux project with this latest release. I wish to embark on other professional and personal projects that will take up a considerable amount of my time, and for this reason, I can no longer continue developing Kaisen Linux. This release will therefore be the last. However, security updates will still be provided for two years, giving you time to switch to another Linux system and familiarize yourself with your new environment." The announcement goes on to share highlights of the new version: "KDE is now the default interface for Kaisen Linux, and is in version 6. SDDM is now the default display manager instead of lightdm. Lightdm was used instead of SDDM due to some missing customization settings, which were introduced with KDE version 6. Xfce is now available in version 4.20...." [...]
https://tails.net/news/test_7.0-rc1/
Test 7.0~rc1
2025-08-07
We are very excited to present you with a release candidate of the upcoming Tails 7.0.
We plan to release Tails 7.0 officially on October 16. You can help us by testing this release candidate already.
Tails 7.0 will be the first version of Tails based on Debian 13 (Trixie) and GNOME 48. It will bring new versions of many applications included in Tails.
We have tested 7.0~rc1 with the same extensive automatic and manual test suites that we use for regular releases. But, Tails 7.0~rc1 might still contain undiscovered issues.
We will provide automatic security upgrades for Tails 7.0~rc1, like we do for regular versions.
What is Tails OS?
- https://en.wikipedia.org/wiki/Tails_(operating_system)
Announcement:
https://blog.torproject.org/tails-7_0-rc1-testing/
Changes and updates
Replace GNOME Terminal with GNOME Console. (#20161)
We broke the Root Terminal while working on this change.
To open a root terminal, execute the following command in a regular Console.
$ sudo -i
Replace GNOME Image Viewer with GNOME Loupe (#20640)
Remove Kleopatra from the Favorites menu. (#21072)
To start Kleopatra choose Apps ▸ Accessories ▸ Kleopatra.
Remove the obsolete Network Connection option from the Welcome Screen. (#21074)
Included software
Update the Tor client to 0.4.8.17.
Update Thunderbird to 128.13.0esr.
Update the Linux kernel to 6.1.14.
This improves support for newer hardware: graphics, Wi-Fi, and so on.
Update Electrum from 4.3.4 to 4.5.8.
Update OnionShare from 2.6.2 to 2.6.3.
Update KeePassXC from 2.7.4 to 2.7.10.
Update Kleopatra from 4:22.12 to 4:24.12.
Update Inkscape from 1.2.2 to 1.4.
Update GIMP from 2.10.34 to 3.0.4.
Update Audacity from 3.2.4 to 3.7.3.
Update Text Editor from 43.2 to 48.3.
Update Document Scanner from 42.5 to 46.0.
Removed software
Remove unar. (#20946)
Remove aircrack-ng. (#21044)
Remove sq. (#21042)
Fixed problems
Fix selecting the correct keyboard for certain languages. (#12638)
For more details, see the list of closed issues on the 7.0 milestone in GitLab.
Known issues
Tails 7.0~rc1 requires 3 GB of RAM instead of 2 GB to run smoothly. (#18040)
We estimated that less than 2% of current users will be affected.
Tails 7.0~rc1 takes longer to start.
We plan to fix this in the final Tails 7.0.
For more details, see the list of issues on the 7.0 milestone in GitLab.
Send your feedback
Please, report any new problem to either:
tails-testers@boum.org (public mailing list)
support@tails.net (private email)
Radioactive water from the base that holds the UK's nuclear bombs was allowed to leak into the sea after old pipes repeatedly burst, official files have revealed.
The radioactive material was released into Loch Long, a sea loch near Glasgow in western Scotland, because the Royal Navy failed to properly maintain a network of 1,500 water pipes on the base, a regulator found.
The armaments depot at Coulport on Loch Long is one of the most secure and secretive military sites in the UK. It holds the Royal Navy's supply of nuclear warheads for its fleet of four Trident submarines, which are based nearby.
Files compiled by the Scottish Environment Protection Agency (Sepa), a government pollution watchdog, suggest that up to half the components at the base were beyond their design life when the leaks occurred.
Sepa said the flooding at Coulport was caused by "shortfalls in maintenance", resulting in the release of "unnecessary radioactive waste" in the form of low levels of tritium, which is used in nuclear warheads.
In one report in 2022, the agency blamed the leaks on the navy's repeated failure to maintain the equipment in the area devoted to storing the warheads, and said plans to replace 1,500 old pipes at risk of bursting were "sub-optimal".
The leaks are revealed in a cache of confidential inspection reports and emails given to the investigative website the Ferret and shared with the Guardian, which Sepa and the Ministry of Defence fought to keep secret.
[...] The Sepa files show there had been a pipe burst at Coulport in 2010 and a further two in 2019. One leak in August 2019 released "significant amounts of water" that flooded a nuclear weapons processing area, where it became contaminated with low levels of tritium and passed through an open drain that fed into Loch Long.
While Sepa said radioactivity levels in that incident were very low and did not endanger human health, it found there were "shortfalls in maintenance and asset management that led to the failure of the coupling that indirectly led to the production of unnecessary radioactive waste".
After an internal investigation and a Sepa inspection, the MoD promised 23 actions to prevent more bursts and floods in March 2020. It accepted that its lack of preparedness had caused "confusion", "a breakdown in access control" and a "lack of communication of the hazards".
However, there were two further pipe bursts in 2021, including one in another area that also held radioactive substances, prompting another inspection by Sepa in 2022. Progress on completing the 23 remedial actions "had been slow and delayed in many cases", Sepa said. "The events have highlighted shortcomings in asset management across the naval base."
David Cullen, a nuclear weapons expert with the defence thinktank Basic in London, said the repeated pollution incidents were shocking and the attempts to keep them secret were "outrageous".
He said: "The MoD is almost 10 years into a nearly £2bn infrastructure programme at Faslane and Coulport, and yet they apparently didn't have a proper asset management system as recently as 2022. This negligent approach is far too common in the nuclear weapons programme, and is a direct consequence of a lack of oversight."
[...] An MoD spokesperson said it placed "the utmost importance on our responsibilities for handling radioactive substances safely and securely. There have been no unsafe releases of radioactive material into the environment at any stage."
In a new study published today [13 August 2025], scientists discovered that keratin, a protein found in hair, skin and wool, can repair tooth enamel and stop early stages of decay.
The King's College London team of scientists discovered that keratin produces a protective coating that mimics the structure and function of natural enamel when it comes into contact with minerals in saliva.
Acidic foods and drinks, poor oral hygiene, and ageing all contribute to enamel erosion and decay, leading to tooth sensitivity, pain and eventually tooth loss.
While fluoride toothpastes are currently used to slow this process, keratin-based treatments were found to stop it completely. Keratin forms a dense mineral layer that protects the tooth and seals off exposed nerve channels that cause sensitivity, offering both structural and symptomatic relief.
The treatment could be delivered through a toothpaste for daily use or as a professionally applied gel, similar to nail varnish, for more targeted repair. The team is already exploring pathways for clinical application and believes that keratin-based enamel regeneration could be made available to the public within the next two to three years.
In their study, published in Advanced Healthcare Materials, the scientists extracted keratin from wool. They discovered that when keratin is applied to the tooth surface and comes into contact with the minerals naturally present in saliva, it forms a highly organised, crystal-like scaffold that mimics the structure and function of natural enamel.
Over time, this scaffold continues to attract calcium and phosphate ions, leading to the growth of a protective enamel-like coating around the tooth. This marks a significant step forward in regenerative dentistry.
[...] Dr Elsharkawy concluded: "We are entering an exciting era where biotechnology allows us to not just treat symptoms but restore biological function using the body's own materials. With further development and the right industry partnerships, we may soon be growing stronger, healthier smiles from something as simple as a haircut.
Journal Reference: Sara Gamea, Elham Radvar, Dimitra Athanasiadou, et al., Biomimetic Mineralization of Keratin Scaffolds for Enamel Regeneration, Advanced Healthcare Materials [OPEN], First published: 12 August 2025 https://doi.org/10.1002/adhm.202502465
https://www.osnews.com/story/143044/firefox-new-ai-features-cause-cpu-spikes-and-battery-drain/
Almost three weeks ago, Mozilla released Firefox 141, which, among other features such as memory optimizations for Linux and a built-in unit converter, brought controversial AI-enhanced tab groups.
Powered by a local AI model, these groups identify related tabs and suggest names for them. There is even a "Suggest more tabs for group" button that users can click to get recommendations.
Now, several users have taken to the Firefox subreddit to complain about high CPU usage when using the feature, as well as express their disappointment in Mozilla for adding AI to the browser.
[...] If you are also dealing with CPU spikes and battery drain from Firefox's new AI features, you can disable them through the browser's advanced settings. Head to about:config in a new tab, accept the risk warning, and use the search bar to find the controls. To kill the AI chatbot feature, search for browser.ml.chat.enabled and set it to false. To stop smart tab grouping, search for browser.tabs.groups.smart.enabled and set it to false.
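As a convenience sketch, the same two switches can be made persistent with a user.js file in the Firefox profile directory, which Firefox reads at every startup. The preference names are the ones given above; applying them this way is equivalent to flipping them in about:config:

```javascript
// user.js — place in the Firefox profile directory.
// Disables the local AI chatbot feature:
user_pref("browser.ml.chat.enabled", false);
// Disables AI-powered smart tab grouping:
user_pref("browser.tabs.groups.smart.enabled", false);
```

Note that preferences set in user.js are re-applied on every launch, so changes made in about:config to these prefs will be overwritten at the next restart.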
An article in The Conversation discusses a scientific paper which looks at a "gravitational bounce" from the Quantum Exclusion Principle which may take place inside black holes. The speculation is that this may form new universes.
The article states:
In a new paper, published in Physical Review D, my colleagues and I propose a striking alternative. Our calculations suggest the Big Bang was not the start of everything, but rather the outcome of a gravitational crunch or collapse that formed a very massive black hole – followed by a bounce inside it.
The general reasoning is that inside the black hole, quantum effects prevent gravitational collapse from producing a singularity (in contrast to predictions from classical physics). Due to the quantum exclusion principle, the collapse is halted when a limit is reached and the mathematics predicts a bounce, producing a new universe "remarkably like our own." No new exotic theories are required to get this result.
A prediction from this theory is a small but non-zero positive spatial curvature which could be measured experimentally. Observations from the Euclid mission may be useful.
Is this plausible? It's an idea that's been around for as long as black holes have been predicted.
With platforms caving to pressure from payment processors, adult content creators are left to figure out what's next.
Ash Parrish is a reporter who covers the business, culture, and communities of video games, with a focus on marginalized gamers and the quirky, horny culture of video game communities.
In the aftermath of itch.io pulling the sale of over 20,000 pages of adult content, the creators of that work are left feeling betrayed, exhausted, and fearful. The number of platforms that permit the sale of adult material is shrinking, and there's no guarantee the ones that remain will still permit it in the future. But now, with their livelihoods at stake, many creators and their communities have begun to push back and search for new ways to thrive.
"Before [itch.io], the NSFW comics community would grouse and complain and share feelings of anxiety," said Brad Guigar, a smut comic artist. "This time around, we're actually doing something about it."
For some, that means organizing massive call campaigns to pressure payment processors to reverse course and allow itch.io to host the content it had before. Others have decided to abandon the fickleness of platforms for their own websites. And yet others have decided that if they can't sell their game directly, they'll just make it free.
To some creators, the most disheartening thing about itch.io removing thousands of pages of adult content is that it's relatively unsurprising. The storefront is one of several in recent years that have embraced adult content only to shun it later when payment processors start asking questions. They've now found themselves booted from platform to platform, moving from Tumblr to Patreon to Gumroad, only to have the rug pulled out from under them each time.
When adult creators are regularly forced to find new places for their work, their business overall suffers. "I can never get ahead," said PixelJail, a creator who makes BDSM and other kink-related comics and illustrations. "I have to stop doing paid work to set up new accounts, backlog posting, pay for new subscriptions or services" and other administrative tasks.
PixelJail has now opted to set up their own websites. But even without the burden of conforming to a platform's rules, having one's own website isn't a guarantee of absolute safety. In the UK, where PixelJail lives, the recently implemented Online Safety Act requires that online platforms have "strong age checks" in place to prevent children from accessing pornographic or "harmful" content.
"I had to geoblock my websites in the UK, including my webstore," PixelJail said, meaning they no longer sell their work in their own country.
Laws like the UK's Online Safety Act are slowly proliferating across the United States. The US Supreme Court recently ruled that age verification laws do not violate the First Amendment, and many states now require adult content sites to implement age verification tools, which can be expensive and raise privacy concerns. Rather than comply, sites like PornHub have simply decided to cease operations in areas where those laws are in effect. Individual creators might have to make a similar choice.
"I made my site years ago and didn't use it much at first," PixelJail said. "But it's gradually become the only real place I can go to sell and even now, that's at risk."
Creator platforms have repeatedly been forced to exile adult content creators. In 2017, Patreon tightened its rules related to adult content, causing some of those creators to abandon the site, with many choosing to set up shop with Gumroad, another e-commerce platform. Then, last year, Gumroad banned virtually all sexually explicit material, causing yet another adult creator mass migration. You can follow the line of adult creators hopping from platform to platform, fleeing content bans all the way back to one website: Tumblr.
"Between 2012 and 2018, there was a huge, and I truly do mean huge, NSFW community on Tumblr," said DieselBrain, a smut artist specializing in monster kink. For many of the creators I spoke to, the "Tumblr Purge" of 2018, when the social media site outright banned all adult content, was their first experience with having their previously accepted work suddenly prohibited. "This kicked the entire community off of there, and I'd argue that we never really recovered fully," DieselBrain said.
When porn creators move from one platform to another, they bring their communities with them, creating an influx of traffic any platform would welcome. Later, having capitalized on those audiences, sites discard what has become a troublesome vestige of their early success.
This was almost the case with OnlyFans, which, in 2021, briefly flirted with banning adult content, the kind of material the website was universally known for. In every case, payment processors like Stripe, PayPal, Visa, and Mastercard were the culprits for these crackdowns. While all payment processors have guidelines prohibiting the sale of illegal material, many host platforms overcorrect, banning material that would ostensibly be permitted in order to avoid the increased scrutiny (and cost) hosting that content requires.
"We have been asked to be more rigorous in enforcing our ToS and must comply," Gumroad CEO Sahil Lavingia said in an interview with TechCrunch regarding its ban of adult content. Lavingia declined to name the specific company asking.
To blunt the blow caused by platform disruption, creators often turn to their communities, both the ones made up of other creators and those made up of their personal fans. They act as information networks, sharing news about where a creator may have set up shop, and are more generally an avenue of commiseration and support. To help his fellow artists navigate the recent events with itch.io, Guigar, the NSFW artist, started a newsletter for adult creators called Uncensored Artists.
Game developer Cara Cadaver is leveraging her community to help support her game VILE: Exhumed. She made the game available for free on the Internet Archive after it was banned from Steam, a ban she says was made under false pretenses.
"There are a lot of intense visuals in VILE: Exhumed," Cara Cadaver wrote. "But there is no uncensored nudity, no depictions of sex acts, and no pornography whatsoever – which is one of the justifications bad actors are using right now to censor games."
Though the game is free, there are options to support Cadaver directly through donations, half of which, she said, will be donated to charity. "This censorship of my work is a direct attack on creative expression and artistic freedom, and it will not stop with false accusations of sexual content," Cadaver said.
There has virtually never been a stable time to be an adult creator on the internet. To them, it feels unfair to have come to places like Tumblr, Patreon, Gumroad, and now itch.io, places that were tolerant of the kinds of work they did, only to have those places taken away, often without warning or recourse, leaving them with one less way to make a living.
"Most of the creators I know are everyday people with bills to pay mired in late stage capitalism," said Mesmereye, an artist who specializes in hypnosis kink. "When you have a body, a camera, and an internet connection, why shouldn't you try to put the proverbial bread on the table with the assets and talents you're born with?"
Air pollution filters help scientists produce first UK wildlife survey using eDNA:
Social media post led to discovery that samplers measuring toxic particles in air can also detect fragments of DNA
As the UK's Big Butterfly Count reaches more than 100,000 submissions, an international group of scientists have produced the first national survey of biodiversity using an entirely different approach. Instead of looking for species by eye, they took advantage of the samplers around the UK that constantly measure toxic metal particles in the air, and used them to measure tiny fragments of DNA [YouTube video 4:09 --JE].
Dr Joanne Littlefair from University College London, part of the research team, said: "Organisms lose bits of themselves all the time – dead skin cells, fragments of hair or feathers, saliva, even faeces and urine. Some of this will blow up into the air and become airborne 'environmental' DNA or eDNA."
Researchers were able to detect more than 1,100 plants and animals, including familiar UK species – trees, commercial crops, earthworms, newts, robins and badgers – as well as species of conservation concern, including skylarks and hedgehogs. The team found 65 species of butterfly and moth, including the gatekeeper (no 3 in the Big Butterfly Count) as well as the purple hairstreak, a butterfly that lives mainly in oak trees and is often overlooked. They also found established invasive species such as grey squirrels and muntjacs, as well as species that have only just arrived in the UK, fungi that are considered crop pests, and the pathogen that causes ash dieback.
The UK national survey started from a chance spot on social media. Dr Andrew Brown from the National Physical Laboratory said: "We saw a social media post about airborne eDNA projects at a zoo in Cambridgeshire and wondered whether the air pollution filters in our labs contained hidden information about local biodiversity." These filters came from 15 samplers around the UK that constantly measure toxic metal particles in the air, installed in diverse locations from the kerb of London's Marylebone Road to rural Hampshire and a peat bog in Scotland.
Some detections were not part of the natural ecosystems, but this data was useful for learning about how far eDNA could travel. Edible fish including seabass and hake were detected at Marylebone Road and traced to seafood stalls, including a market about 1.1km away. Exotic pets including peacocks and parrots were traced to outdoor aviaries. From this the researchers estimated that each air pollution monitoring site could detect the biodiversity of an area with a radius of about 19km.
Prof Elizabeth Clare from York University, Canada, part of the research team, said: "I think that this is only the beginning. Taking large national and continental measurements is now really possible. No other method can really scale to this geographic breadth."
Airborne eDNA compared well with the UK's other biodiversity data: a third of the species detected this way were nocturnal creatures that are hard to observe and can be under-reported. Although some species were missed altogether, including blue tits and kestrels, the eDNA method may allow biodiversity changes to be tracked in places where they are not routinely surveyed, simply by taking samples from air pollution measurement equipment that is used routinely around the globe.
Journal Reference:
Tournayre, Orianne, Littlefair, Joanne E., Garrett, Nina R., et al. First national survey of terrestrial biodiversity using airborne eDNA [open], Scientific Reports (DOI: 10.1038/s41598-025-03650-z)
Phys.org is reporting on repurposing large electromagnets in research facilities:
Magnets are at the heart of many scientific instruments at DOE's Brookhaven National Laboratory. They are not like typical refrigerator magnets, which apply a relatively weak and uniform force to magnetic materials. These electromagnets are often incredibly large and powerful, with variable fields that can be controlled by changing the electric current that runs through them.
One of their applications is to apply magnetic force to subatomic particles. For example, the Relativistic Heavy Ion Collider (RHIC) is made of superconducting electromagnets that steer and focus particle beams as they circulate through the accelerator at nearly the speed of light.
To build these magnets from scratch or source brand-new ones, research facilities must make large investments in time and money. Fortunately, when experiments are upgraded or decommissioned, researchers can sometimes reuse magnets for a new purpose. The same electromagnets can be used for decades, placed in upgraded machines to help collect more precise data or even placed in entirely different machines to help carry out a brand-new scientific endeavor.
After 25 years of groundbreaking nuclear physics research, RHIC is completing its final run this year. Following the final collisions, Brookhaven will begin to transform this DOE Office of Science user facility into the Electron-Ion Collider (EIC), the world's first collider of its kind.
While the upgrade will reuse much of RHIC's existing infrastructure—including one of RHIC's superconducting magnet ion rings—the EIC requires a new electron storage ring. To make that ring, EIC designers need hundreds of electromagnets to steer electrons around the 2.4-mile-circumference tunnel.
Fortunately, in Illinois, at DOE's Argonne National Laboratory, a DOE Office of Science user facility called the Advanced Photon Source (APS) recently underwent a massive upgrade. Scientists at APS partnered with Brookhaven Lab to repurpose their electromagnets for the EIC. Argonne sent hundreds of their 30-year-old magnets, which are still in their prime and safely usable, to both Brookhaven and DOE's Thomas Jefferson National Accelerator Laboratory, Brookhaven's partner in building the EIC.
"It's a noble cause to reuse and repurpose these magnets for many reasons, the most important being cost and schedule savings, as well as not overburdening the world-wide magnet manufacturing base," said George Mahler, group leader of the EIC Magnet Systems group, who oversees the teams receiving the APS magnets at Brookhaven. "The EIC in its entirety will require approximately 4,000 of these magnets."
Mahler estimates that a brand-new sextupole magnet, one kind of magnet that makes up the electron storage ring, can cost up to $60,000. Sextupole magnets are named for their six inner magnetic poles, which correct focusing errors as the electron beams zip around the storage ring. Brookhaven Lab received about 360 quadrupole and sextupole magnets from APS in total, worth about $21 million.
In addition to reusing magnets, the EIC project will recycle or sell unused materials, such as copper, aluminum, and other metals, for scrap, which will save an additional $600,000.
Recycling magnets for major physics experiments isn't a new idea for Brookhaven. The Lab has a long history of saving years and millions of dollars by repurposing valuable instrumentation.
At RHIC, the former PHENIX detector was upgraded to create the sPHENIX experiment, which began operations in 2023. Among other updates, adding a new solenoid magnet enabled physicists to collect more precise measurements of particle collisions. And while the magnet provided new opportunities for Brookhaven, the magnet itself was not new at all.
It came from DOE's SLAC National Accelerator Laboratory in California, from the BaBar experiment that was decommissioned in 2008. This solenoid is a 30,000-pound donut-shaped magnet large enough for an elephant to walk through. The sPHENIX team had the right timing to propose taking over the solenoid in 2013, even before the transition from PHENIX to sPHENIX began.
Brookhaven and DOE estimated the superconducting solenoid magnet was worth approximately $12 million, even after 30 years of use. Producing a new magnet would have cost significantly more.
"DOE helped us very efficiently transfer ownership. They were clearly enthusiastic about the possibility of us reusing the magnet," said John Haggerty, who served as the sPHENIX project scientist during construction. "It arrived on a snowy night, but I still tracked the truck and jumped in my car to see it reach the main gate."
Just this past June, another mega magnet that had made a cross-country journey was in the news. That's when DOE's Fermi National Accelerator Laboratory made headlines for releasing the most precise measurement of the muon magnetic anomaly.
The 17-ton electromagnet storage ring that made this research possible started its scientific life 25 years ago in an earlier "muon g-2" experiment at Brookhaven Lab. The Brookhaven experiment found strong hints of an exciting new discovery. Those hints motivated the idea to move the ring to Fermilab to repeat the experiment with a more powerful muon-generating beam.
A team of scientists and engineers from both Brookhaven Lab and Fermilab then faced a puzzle: how to get a 50-foot-diameter magnet ring from Long Island to Illinois.
"The Fermilab collaboration looked at the Brookhaven results and estimated they could get 20 times more muons," explained physicist Bill Morse, who oversaw the team of physicists that coordinated "the Big Move." The upgraded experiment could yield an even clearer picture of the basic building blocks of the universe, but it required some clever engineering to move the experiment.
The team of Brookhaven physicists and engineers devised the most efficient and least risky route for the ring. They contracted a transportation company to load the magnet onto a truck and drive it to Smith Point Marina. For the average beachgoer, that is a 10-mile, straight shot down William Floyd Parkway. For the magnet truck, this easy drive required working with the local authorities to control traffic and make room for the magnet.
"We had to close down William Floyd Parkway because the ring took up all the lanes," said Morse.
After the first leg of the trip by truck, the magnet took a boat cruise south down the East Coast, around Florida, and up the Mississippi River. Then, under Fermilab's care, the magnet was driven across Illinois over three nights, parked in various supermarket parking lots during the day. Fermilab took advantage of this moving science museum and coordinated with local schools to visit the magnet and learn about the experiment along the way.
For each of these magnet moves, DOE's national lab scientists and contractors calculated the long routes to protect the magnets and cause as little disruption to traffic as possible, paying careful attention to safety. After each magnet's cross-country trip, teams subjected the cargo to rigorous testing to ensure all components worked after being jostled in trucks and boats.
For example, when magnet technicians at Brookhaven Lab received dozens of APS magnets for the EIC ring, they had to replace aged components, check for water leaks via pressure and flow tests, conduct high voltage testing of the electromagnets' coil assemblies, produce new components, and reconfigure outdated designs. Magnet engineers and physicists are still working to verify magnetic field measurements and adequacy of the magnets.
Engineers worked with a host of professionals to clear the magnets for reuse. Radiological safety teams assessed metal holding structures, beam equipment services aided disassembly work, and riggers carefully moved and placed the multi-ton magnets.
Before physicists and engineers could deem the BaBar magnet ready to be the central component of the new sPHENIX detector, they ran it through a "full field test"—ramping it up to full power to make sure it could safely produce a high-quality magnetic field.
"We had to build a steel box that could surround the magnet and contain its power during the test," said Haggerty. "We ramped up the field all the way."
Just like the APS magnet upgrade, Haggerty's team depended on expertise from across the Lab and external partners to carry out the test. The Collider-Accelerator Department designed the steel box, the cooling system, and the power supplies for the magnet, while the Superconducting Magnet Division modified the solenoid for use in sPHENIX and developed controls and monitoring for the full field test.
The sPHENIX team also worked with a group that traveled to Brookhaven from CERN, the European Organization for Nuclear Research, to map the magnetic field before the rest of the sPHENIX detector components were built within and around the magnet. The sPHENIX team recently published their first physics results from data collected when the full detector became operational starting in 2023.
The magnets at the heart of the sPHENIX experiment, in the Muon g-2 storage ring, and destined for the EIC would have been incredibly expensive and time-consuming to reproduce from scratch. It is no wonder that labs are jumping at the chance to repurpose these magnets for brand-new experiments at the cutting edge of science.
In the coming years, there may be even more cross-country road trips to build the scientific instruments of the future.
Site director says 'a kind of camp, a favela' was founded in the ruins of city destroyed in AD79:
Archaeologists have discovered new evidence pointing to the reoccupation of Pompeii after the AD79 eruption of Mount Vesuvius that left the city in ruins.
Despite the massive destruction suffered by Pompeii, an ancient Roman city home to more than 20,000 people before the eruption, some survivors who could not afford to start a new life elsewhere are believed to have returned to live in the devastated area.
Archaeologists believe they were joined by others looking for a place to settle and hoping to find valuable items left in the rubble by Pompeii's previous residents.
"Judging by the archaeological data, it must have been an informal settlement where people lived in precarious conditions, without the infrastructure and services typical of a Roman city," before the area was completely abandoned in the fifth century, the researchers said in a statement on Wednesday.
While some life returned to the upper floors of the old houses, the former ground floors were converted into cellars with ovens and mills.
"Thanks to the new excavations, the picture is now clearer: post-79 Pompeii re-emerges, more than a city, a precarious and grey agglomeration, a kind of camp, a favela among the still recognisable ruins of the Pompeii that once was," said Gabriel Zuchtriegel, the director of the site.
Evidence that the site was reoccupied had been detected in the past, but in the rush to access Pompeii's colourful frescoes and still-intact homes, "the faint traces of the site's reoccupation were literally removed and often swept away without any documentation".
"The momentous episode of the city's destruction in AD79 has monopolised the memory," said Zuchtriegel.
Archaeologists estimate that 15-20% of Pompeii's population died in the eruption, mostly from thermal shock as a giant cloud of gases and ash covered the city.
Volcanic ash then buried the Roman city, perfectly preserving the homes, public buildings, objects and even the people who had lived there until its discovery in the late 16th century.