

posted by hubie on Thursday June 27, @11:45PM   Printer-friendly
from the money-money-money dept.

https://arstechnica.com/tech-policy/2024/06/internet-archive-forced-to-remove-500000-books-after-publishers-court-win/

As a result of book publishers successfully suing the Internet Archive (IA) last year, the free online library that strives to keep growing online access to books recently shrank by about 500,000 titles.

IA reported in a blog post this month that publishers abruptly forcing these takedowns triggered a "devastating loss" for readers who depend on IA to access books that are otherwise impossible or difficult to access.

To restore access, IA is now appealing, hoping to reverse the prior court's decision by convincing the US Court of Appeals for the Second Circuit that IA's controlled digital lending of its physical books should be considered fair use under copyright law. An April court filing shows that IA intends to argue that the publishers have no evidence that the e-book market has been harmed by the open library's lending, and that copyright law is better served by allowing IA's lending than by preventing it.
[...]
IA will have an opportunity to defend its practices when oral arguments start in its appeal on June 28.

"Our position is straightforward; we just want to let our library patrons borrow and read the books we own, like any other library," Freeland wrote, while arguing that the "potential repercussions of this lawsuit extend far beyond the Internet Archive" and publishers should just "let readers read."
[...]
After publishers won an injunction stopping IA's digital lending, which "limits what we can do with our digitized books," IA's help page said, the open library started shrinking. While "removed books are still available to patrons with print disabilities," everyone else has been cut off, causing many books in IA's collection to show up as "Borrow Unavailable."
[...]
In an IA blog, one independent researcher called IA a "lifeline," while others claimed academic progress was "halted" or delayed by the takedowns.

"I understand that publishers and authors have to make a profit, but most of the material I am trying to access is written by people who are dead and whose publishers have stopped printing the material," wrote one IA fan from Boston.
[...]
In the open letter to publishers—which Techdirt opined "will almost certainly fall on extremely deaf ears"—the Internet Archive and its fans "respectfully" asked publishers "to restore access to the books" that were removed.

They also suggested that "there is a way" to protect authors' rights and ensure they're fairly compensated "while still allowing libraries to do what they have always done—help readers read."
[...]
For IA's digital lending to be considered fair use, the brief said, the court must balance all factors favoring a ruling of fair use, including weighing that IA's use is "non-commercial, serves important library missions long recognized by Congress, and causes no market harm."

Publishers with surging profits have so far struggled to show any evidence of market harm, while IA has offered multiple expert opinions showing that ebook licensing was not negatively impacted by IA's digital lending.

"Publishers' ebook revenues have grown since IA began its lending," IA argued.

And even when IA temporarily stopped limiting the number of loans to provide emergency access to books during the pandemic—which could be considered a proxy for publishers' fear that IA's lending could pose a greater threat if it became much more widespread—IA's expert "found no evidence of market harm."
[...]
While IA fights to end the injunction, its other library services continue growing, IA has said. IA "may still digitize books for preservation purposes" and "provide access to our digital collections" through interlibrary loan and other means. IA can also continue lending out-of-print and public domain books.

One IA fan in rural India fears that if publishers win, it would permanently cut many people like her off from one of the few reliable resources she has to access rare books.

"If you are going to ban online availability of these resources, what about us?" she asked.

Previously on SoylentNews:
Internet Archive's Legal Woes Mount as Record Labels Sue for $400M - 20230822
The Internet Archive Reaches An Agreement With Publishers In Digital Book-Lending Case - 20230815
A Federal Judge Has Ruled Against the Internet Archive in a Lawsuit Brought by Four Book Publishers - 20230327
Internet Archive Faces Uphill Battle in Lawsuit Over its Free Digital Library - 20230322
Internet Archive Files Answer and Affirmative Defenses to Publisher Copyright Infringement Lawsuit - 20200731
EFF and California Law Firm Durie Tangri Defending Internet Archive from Publisher Lawsuit - 20200629
Publishers Sue the Internet Archive Over its Open Library, Declare it a Pirate Site - 20200603

Related News:

Mickey, Disney, and the Public Domain: a 95-Year Love Triangle - 20231217
E-Books are Fast Becoming Tools of Corporate Surveillance - 20231217
Research Shows That, When Given the Choice, Most Authors Don't Want Excessively-long Copyright Terms - 20230302
'The Government Killed Him': A Tribute to Activist and Programmer Aaron Swartz - 20230112
2023's Public Domain is a Banger - 20221223
Public Domain Day 2022 - 20220101
Public Domain Day in the USA: Works from 1925 are Open to All! - 20210101
On the Disappearance of Open Access Journals Over Time - 20200920
Wayback Machine and Cloudflare Team Up to Archive More of the Web - 20200919
GitHub Buries 21 TB of Open Source Data in an Arctic Archive - 20200721
Internet Archive Ends "Emergency Library" Early to Appease Publishers - 20200612
Long-Lost Maxis Game "SimRefinery" Rediscovered, Uploaded to Internet Archive - 20200605
Project Gutenberg Public Domain Library Blocked in Italy for Copyright Infringement - 20200604
Internet Archive Adds "Context" with Warnings - 20200526
University Libraries Offer Online "Lending" of Scanned In-Copyright Books - 20200410
Authors Fume as Online Library "Lends" Unlimited Free Books - 20200401
Internet Archive Suspends E-Book Lending "Waiting Lists" During U.S. National Emergency - 20200328
French Internet Referral Unit Falsely Identifies Internet Archive Content as "Terrorist" - 20190419
Internet Archive Moving to Preserve Google+ Posts before April Shutdown - 20190318
Matching Donations at the Internet Archive / Wayback Machine - 20181130
Internet Archive's Open Library Now Supports Full-Text Searches for All 4+ Million Items - 20180717
Vint Cerf: Internet is Losing its Memory - 20180629
MSNBC Host Attributes Homophobic Blog Posts to Hacking, Internet Archive Responds - 20180429
The Pineapple Fund Gives $1M in Bitcoin to the Internet Archive - 20171227
Internet Archive to Duplicate Data in Canada in Response to Trump's Election - 20161130
Internet Archive: Proposed Changes To DMCA Would Make Us "Censor The Web" - 20160608
Decentralized Web Summit Event at Internet Archive in San Francisco - 20160524
Internet Archive Seeks Changes in DMCA Takedown Procedures - 20160324
All Issues of Sci-Fi Magazine "IF" Are Now Available for Free Download [UPDATED] - 20160229
Internet Archive: 900 Classic Arcade Games on your Web Browser - 20141104
Archiveteam Tries to Save Twitch.tv - 20140810
Internet Archive "Desperately" Needs Help with JSMESS & Web Audio API - 20140718
The Importance of Information Preservation - 20140530

Original Submission

posted by martyb on Thursday June 27, @07:00PM   Printer-friendly

Sprayable gel simplifies surgeries:

In an animal study, the researchers showed that the gel, called GastroShield, is simple to apply in the course of current endoscopic procedures and provides wound protection for three to seven days.

In addition to its potential in colonoscopies, this gel could be useful for treating stomach ulcers and inflammatory conditions such as Crohn's disease, or for delivering cancer drugs, says Natalie Artzi, a principal research scientist in MIT's Institute for Medical Engineering and Science, who coauthored a paper on the work with colleagues including Professor Elazer Edelman '78, SM '79, PhD '84, former MIT postdoc Pere Dosta, and former visiting student Gonzalo Muñoz Taboada.

Members of the research team have started a company called BioDevek that plans to further develop the new material for use in humans.

Journal Reference: DOI: https://onlinelibrary.wiley.com/doi/10.1002/adma.202311798


Original Submission

posted by janrinok on Thursday June 27, @03:20PM   Printer-friendly
from the rest-in-pieces dept.

The 6-ton (13,000 pounds) high-resolution observation satellite Resurs P1 was launched in 2013 and decommissioned in 2022 due to 'equipment malfunction'. Between 26 June 13:05 UTC and 27 June 00:51 UTC it 'released a number of fragments' where number is > 100.

Nine astronauts aboard the International Space Station climbed into their respective spacecraft to take refuge from potential impacts with space debris.

Apparently they stayed there for about one hour.

https://www.forbes.com/sites/ericmack/2024/06/27/iss-astronauts-take-shelter-after-russian-spacecraft-breaks-up-in-orbit/
https://www.usnews.com/news/world/articles/2024-06-27/russian-satellite-blasts-debris-in-space-forces-iss-astronauts-to-shelter


Original Submission

posted by martyb on Thursday June 27, @11:18AM   Printer-friendly
from the not-cool dept.

Sweat may protect against Lyme disease:

Most people's sweat contains a protein that can prevent Lyme disease, researchers at MIT and the University of Helsinki have discovered. They also found that about one-third of the population carries a less protective variant that makes the tick-borne infection more likely.

By running a genome-wide association study, the researchers identified three variants more common in people who'd had Lyme disease. One—in a gene for a secretoglobin, a type of protein that in this case is produced primarily in the sweat glands—was previously unknown. In vitro, it significantly inhibited growth of Lyme-causing bacteria, but a variant version required twice as much to do so. And when mice were injected with Lyme bacteria that had been exposed to the normal version of the sweat protein, they did not develop the disease.

It's unknown how the protein inhibits the bacteria, but the researchers hope it can be used in preventive skin creams or to treat the 10% or so of Lyme infections that don't respond to antibiotics.

"We think there are real implications here for a preventative and possibly a therapeutic," says Michal Caspi Tal of MIT's Department of Biological Engineering, one of the senior authors of the new study. She also plans to study whether the 10 other secretoglobins in the human body could have antimicrobial qualities too.


Original Submission

posted by martyb on Thursday June 27, @06:29AM   Printer-friendly
from the its-about-time! dept.

Drugs are more effective at certain times of day:

The study also revealed that the liver is more susceptible to infections such as malaria at certain points in the circadian cycle, when fewer inflammatory proteins are being produced—possibly because its response to pathogens declines after meals, when it has typically been exposed to an influx of microorganisms that might trigger inflammation even if they are not harmful.

"One of the earliest applications for this method could be fine-tuning drug regimens of already approved drugs to maximize their efficacy and minimize their toxicity," says Professor Sangeeta Bhatia, SM '93, PhD '97, a member of MIT's Koch Institute for Integrative Cancer Research and the Institute for Medical Engineering and Science (IMES), who is the senior author of the new study.

The MIT researchers are now working with collaborators to analyze a cancer drug they suspect may be affected by circadian cycles, and they hope to investigate whether this may be true of drugs used in pain management as well. They are also taking advantage of the cycles in inflammatory signals to study infections that are usually difficult to establish in engineered livers, including certain types of malaria.

Journal Reference:
Science Advances (DOI: 10.1126/sciadv.adm9281)


Original Submission

posted by hubie on Thursday June 27, @01:42AM   Printer-friendly
from the hyperbolic-hyperventilation dept.

The House Ban On DJI Drones Is Mindless Anticompetitive Fear Mongering

When it comes to China, the U.S. likes to pretend its business policies are well-crafted, logic-driven decisions based on the welfare of the markets and the public, but very often that's simply not the case. We've already noted how the TikTok ban is an unconstitutional mess that doesn't have the public's support, in large part because it doesn't actually fix any of the problems supporters of a ban like to claim.

"Essentially, the US government pressured drone manufacturers to implement privacy and safety features that required internet infrastructure to operate, DJI built those features, and now lawmakers say those same features could be used by China to spy on Americans and are the reason for the ban. Meanwhile, the only existing American drone manufacturers create far more invasive products that are sold exclusively to law enforcement and government entities, which are increasingly using them to conduct surveillance on American citizens and communities."

Who's the naive idiot who thinks Congress makes "well-crafted, logic-driven decisions based on the welfare of the markets and the public"?


Original Submission

posted by hubie on Wednesday June 26, @08:58PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Does proton decay exist and how do we search for it? This is what a recently submitted study to the arXiv preprint server hopes to address as a team of international researchers investigate a concept of using samples from the moon to search for evidence of proton decay, which remains a hypothetical type of particle decay that has yet to be observed and continues to elude particle physicists.

This study holds the potential to help solve one of the longstanding mysteries in all of physics, as it could enable new studies into deep-level physics and the laws of nature overall.

[...] Dr. Stengel tells Universe Today this research started around 2018 with lead author, Dr. Sebastian Baum, and other scientists regarding the use of paleo-detectors, a proposed method of examining traces left by particles in minerals over vast geological timeframes.

[...] For the study, the researchers proposed a hypothetical concept using paleo-detectors that would involve collecting mineral samples from more than 5 kilometers (3.1 miles) beneath the lunar surface and analyzing them for signs of proton decay, either on the moon itself or back on Earth.

[...] Dr. Stengel tells Universe Today, "For a lunar mineral sample which is both sufficiently radiopure to mitigate radiogenic backgrounds and buried at sufficient depths for shielding from other cosmic ray backgrounds, we show that the sensitivity of paleo-detectors to proton decay could in principle be competitive with next-generation conventional proton decay experiments."

As noted, proton decay continues to be a hypothetical type of particle decay and was first proposed in 1967 by the Soviet physicist and Nobel Prize laureate Dr. Andrei Sakharov. As its name implies, proton decay is hypothesized to occur when a proton decays into particles smaller than an atom, also called subatomic particles.

[...] Dr. Stengel tells Universe Today, "Proton decay is a generic prediction of particle physics theories beyond the Standard Model (SM). In particular, proton decay could be one of the only low energy predictions of so-called Grand Unified Theories (GUTs), which attempt to combine all of the forces which mediate SM interactions into one force at very high energies. Physicists have been designing and building experiments to look for proton decay for over 50 years."

[...] As noted, the hypothetical concept proposed by this study using paleo-detectors to detect proton decay on the moon would require collecting samples at least 5 kilometers (3.1 miles) beneath the lunar surface. For context, the deepest humans have ever collected samples from beneath the lunar surface was just under 300 centimeters (118 inches) with the drill core samples obtained from the Apollo 17 astronauts.

On Earth, the deepest human-made hole is the Kola Superdeep Borehole in northern Russia, which measures approximately 12.3 kilometers (7.6 miles) in true vertical depth and required several boreholes and several years of drilling to achieve. While the study notes the proposed concept using paleo-detectors on the moon is "clearly futuristic," what steps are required to take this concept from futuristic to realistic?

Dr. Stengel tells Universe Today, "As we are careful not to stray too far from our respective areas of expertise related to particle physics, we chose not to speculate much at all about the actual logistics of performing such an experiment on the moon. However, we also thought that this concept was timely as various scientific agencies across different countries are considering a return to the moon and planning for broad program of experiments."

[...] Dr. Stengel tells Universe Today, "Due to the exposure of paleo-detectors to proton decay over billion-year timescales, only one kilogram of target material is necessary to be competitive with conventional experiments. In combination with the scientific motivation and the recent push towards returning humans to the moon for scientific endeavors, we think paleo-detectors could represent the final frontier in the search for proton decay."
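
To see why a single kilogram can be competitive, it helps to do the rough exposure arithmetic the argument rests on. The numbers below are our own back-of-the-envelope illustration, not figures from the paper:

    # Back-of-the-envelope exposure arithmetic (illustrative; not from the paper).
    # Sensitivity to proton decay scales roughly with (number of protons) x (exposure time).

    AVOGADRO = 6.022e23

    # Assume a generic silicate-like mineral in which protons make up roughly
    # half of the nucleons (Z/A ~ 0.5).
    nucleons_per_kg = 1000 * AVOGADRO        # ~6.0e26 nucleons in 1 kg
    protons_per_kg = 0.5 * nucleons_per_kg   # ~3.0e26 protons

    exposure_years = 1e9                     # mineral shielded and exposed for ~1 billion years
    print(f"{protons_per_kg * exposure_years:.1e} proton-years")   # -> about 3.0e+35 proton-years

For comparison, a large underground detector watching on the order of 10^33 to 10^34 protons for a decade or two accumulates a broadly similar number of proton-years, which is the sense in which a single billion-year-old kilogram of rock can be "competitive."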

More information: Sebastian Baum et al, The Final Frontier for Proton Decay, arXiv (2024). DOI: 10.48550/arxiv.2405.15845


Original Submission

posted by hubie on Wednesday June 26, @04:13PM   Printer-friendly
from the Junk-Drawer dept.

The 2024 Old Computer Challenge has been announced. The challenge started four years ago with a simple premise: use a computer with one core running at no more than 1 GHz and 512 MB of RAM for a week. It has since grown a small community, with 34 entrants in 2023. This year's theme, however, is no theme at all. The announcement post includes suggestions, but there is no set of official rules this time around.

Anyone interested in participating can take a look at Headcrash's OCC Site to look at previous years' entries and find instructions for how to get listed this year.

Personally I'm planning on running a classic Clamshell Mac with OS9 as my daily driver :)


Original Submission

posted by hubie on Wednesday June 26, @11:25AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Congratulations, world. We’ve done it. Since passing the Clean Air Act in the 1970s, we’ve reduced cancer-causing particulate emissions from our cars and other sources dramatically, a change that has added years to our lives.

That’s the good news. The bad news is that we can now spend more time focusing on the remaining sources, including some unexpected ones. In an EV era, tires are becoming the greatest emitters of particulate matter, and as we’ve seen, whether it’s the microplastics in our shrimp or the preservatives in our salmon, they’re having a disturbing impact on our environment.

Gunnlaugur Erlendsson wants to do something about that. The affable Icelander founded Enso to tackle what he saw as a developing need for better EV tires. The UK-based company’s next big step is coming close to home: a $500 million US tire factory specifically for building eco-friendly tires for EVs. 

Well, eco-friendlier, anyway.

[...] While EV-specific tires are increasingly common, Erlendsson says most tire manufacturers are too focused on partnering with auto manufacturers, shipping new tires with new cars. “So even though technology exists to make tires much better today, it isn’t hitting the 90 percent of the tire industry, which is the aftermarket,” he said.

While Erlendsson said Enso is working to develop partnerships with those same vehicle manufacturers, the company’s US business model will focus on the 90 percent, creating tires in the correct fitments for popular EVs, regardless of brand, then selling them directly to customers.

What makes Enso’s tires different? Erlendsson was light on the technical details but promised 10 percent lower rolling resistance than regular tires, equating to a commensurate range increase. That’ll make your EV cheaper to run, while a 35 percent increase in tire life means lower wear, fewer particulates in the air, and fewer old tires sent to the incinerator, where half of all American tires go to die. 

Enso’s new factory will also handle recycling. It will be truly carbon neutral, not reliant on carbon offsets, and manufacture tires out of recycled carbon black and tire silica made from rice husks. 

[...] Enso is aiming for the production of 5 million tires from the new factory by 2027. Its location is still being finalized, but Enso cites Colorado, Nevada, Texas, or Georgia as likely locations. With the southeastern US becoming a hotbed for EV production and the so-called “Battery Belt” seeing huge investments from startups like Redwood Materials, that last option might be the safest bet.

A factory of that size will be a huge step up for Enso, which right now provides tires exclusively for fleet use in the UK, including the Royal Mail. Per The Guardian, a study from Transport for London, which regulates public transit in the city, shows Enso’s tires are living up to Erlendsson’s claims of increased efficiency, reduced wear, and reduced cost.

If Enso can deliver that on a larger scale to American drivers, it’ll fly in the face of typical corporate goals of selling more things to more people. Erlendsson sees this as a way to reset today’s tire economy.

“A proposition where you sell fewer tires is just not palatable to most listed companies in this industry,” he said. “It’s hard for someone with a legacy manufacturing and legacy supply chains and legacy distribution model to suddenly say, ‘I’m going to make fewer tires, and I’m going to spend more to make them,’ while not tanking your share price at the same time.”

Of course, upending a more than 150-year-old industry is no small feat, either. 


Original Submission

posted by hubie on Wednesday June 26, @06:42AM   Printer-friendly

https://gizmodo.com/detect-aliens-warp-drive-collapse-gravitational-waves-1851550746

Warp drives, inspired by Albert Einstein's grasp of cosmological physics, were first mathematically modeled by physicist Miguel Alcubierre in 1994. According to Alcubierre, a spacecraft could achieve faster-than-light travel (relative to an outside observer) through a mechanism known as a "warp bubble," which contracts space in front of it and expands space behind. The warp drive doesn't accelerate the spacecraft locally to faster-than-light speeds; instead, it manipulates spacetime around the vessel. Such a spaceship could travel vast distances in a short period by "warping" spacetime, bypassing the light-speed limit in a way that is consistent with general relativity.
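
For the mathematically inclined, the line element Alcubierre proposed (quoted here in its standard textbook form, not from the article itself) is:

    ds^2 = -c^2\,dt^2 + \left[\,dx - v_s(t)\,f(r_s)\,dt\,\right]^2 + dy^2 + dz^2

Here v_s(t) is the speed of the bubble's center along x, r_s is the distance from that center, and f(r_s) is a smooth shaping function equal to 1 inside the bubble and 0 far away, so spacetime is flat both inside the bubble and far from it; all of the distortion is confined to the bubble wall.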

The trouble is, this model requires negative energy, a speculative form of energy whose density is lower than that of empty space, and which is not currently understood or achievable with today's technology. This gap in our understanding keeps the actual construction of a warp drive, as portrayed in Star Wars and Star Trek, firmly within the realm of science fiction.

In a study uploaded to the arXiv preprint server, astrophysicist and mathematician Katy Clough from Queen Mary University of London, along with colleagues Tim Dietrich from the Max Planck Institute for Gravitational Physics and Sebastian Khan from Cardiff University, explore the possibility that the hypothetical collapse of warp drives could emit detectable gravitational waves.

Note: Simply disable CSS style sheets to bypass the "Continue Reading" button.


Original Submission

posted by janrinok on Wednesday June 26, @04:50AM   Printer-friendly
from the smell-that-fresh-air dept.

https://www.bbc.co.uk/news/live/world-69145409

It's currently just past 12:30 in Singapore, 05:30 in London and 14:30 in Canberra - where Assange is expected to land later this afternoon. If you're just joining us now, here's what you need to know:

  • As part of a plea deal reached with the US, the Wikileaks founder pleaded guilty to one charge of breaching the Espionage Act in relation to his role in leaking thousands of classified documents
  • In return, he was sentenced by Judge Ramona Manglona to time served, in recognition of the time already spent in London's Belmarsh prison, and allowed to walk free
  • The plea was part of a deal struck with the US and ends a years-long battle by Assange against extradition to the US to face 18 felony charges
  • One of Assange's lawyers says that Wikileaks's work will continue and that Assange "will be a continuing force for freedom of speech and transparency in government"
  • Assange is due to arrive in the Australian capital Canberra at around 18:41 local time (08:41 GMT)

A former CIA chief of staff, Larry Pfeiffer, has been talking to the Australian Broadcasting Corporation, saying he believes the plea deal is "fair" and "not unusual".

He theorised that the US likely came to the negotiating table to protect intelligence sources and methods from being revealed in court, and because the case was causing "diplomatic irritants" in its relationships with Australia and the UK.

Barnaby Joyce, a former deputy Prime Minister of Australia who lobbied in Washington for Assange, told the BBC's Newsday earlier this morning that he believes the extraterritorial aspect of Assange's case is worrying.

"He was not a citizen of the United States, nor was he ever in the United States. So we've sent a person to prison in a third country," said Joyce.

"I don't believe what he did was right. I'm not here to give a warrant to his character. But I do say is what he did in Australia was not illegal... There is no law he broke in Australia."

He also criticised the treatment the Wikileaks founder received while at Belmarsh prison.

"One day we'll look back at this case and everyone will wonder: honestly, who did he murder to be in solitary confinement 23 hours a day? What was the charge that inspired that?" Joyce said.

Touchdown! Free at last: Wikileaks has just posted a picture as the plane touched down, saying Assange was "free at last".

posted by hubie on Wednesday June 26, @01:57AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Whether it's physical phenomena, share prices or climate models—many dynamic processes in our world can be described mathematically with the aid of partial differential equations. Thanks to stochastics—an area of mathematics which deals with probabilities—this is even possible when randomness plays a role in these processes.

Something researchers have been working on for some decades now are so-called stochastic partial differential equations. Working together with other researchers, Dr. Markus Tempelmayr at the Cluster of Excellence Mathematics Münster at the University of Münster has found a method which helps to solve a certain class of such equations.

The basis for their work is a theory by Prof. Martin Hairer, recipient of the Fields Medal, developed in 2014 with international colleagues. It is seen as a great breakthrough in the research field of singular stochastic partial differential equations. "Up to then," Tempelmayr explains, "it was something of a mystery how to solve these equations. The new theory has provided a complete 'toolbox,' so to speak, on how such equations can be tackled."

The problem, Tempelmayr continues, is that the theory is relatively complex, with the result that applying the 'toolbox' and adapting it to other situations is sometimes difficult.

"So, in our work, we looked at aspects of the 'toolbox' from a different perspective and found and proved a method which can be used more easily and flexibly."

[...] Stochastic partial differential equations can be used to model a wide range of dynamic processes, for example, the surface growth of bacteria, the evolution of thin liquid films, or interacting particle models in magnetism. However, these concrete areas of application play no role in basic research in mathematics as, irrespective of them, it is always the same class of equations which is involved.
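
The article doesn't write out a specific equation, but a canonical member of this class, describing exactly the kind of interface and surface growth mentioned above, is the KPZ equation (given here for context; it is not quoted from the paper):

    \partial_t h = \partial_x^2 h + (\partial_x h)^2 + \xi

Here h(t, x) is the height of the growing surface and \xi is space-time white noise; the interplay between the rough random forcing \xi and the nonlinear term (\partial_x h)^2 is what makes such equations "singular" and hard to even give rigorous meaning to.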

The mathematicians are concentrating on solving the equations in spite of the stochastic terms and the resulting challenges such as overlapping frequencies which lead to resonances.

[...] The approach they took was not to tackle the solution of complicated stochastic partial differential equations directly, but, instead, to solve many different simpler equations and prove certain statements about them.

"The solutions of the simple equations can then be combined—simply added up, so to speak—to arrive at a solution for the complicated equation which we're actually interested in." This knowledge is something which is used by other research groups who themselves work with other methods.

More information: Pablo Linares et al, A diagram-free approach to the stochastic estimates in regularity structures, Inventiones mathematicae (2024). DOI: 10.1007/s00222-024-01275-z


Original Submission

posted by janrinok on Tuesday June 25, @09:12PM   Printer-friendly
from the walled-garden dept.

https://arstechnica.com/tech-policy/2024/06/eu-says-apple-violated-app-developers-rights-could-be-fined-10-of-revenue/

The European Commission today said it found that Apple is violating the Digital Markets Act (DMA) with App Store rules and fees that "prevent app developers from freely steering consumers to alternative channels for offers and content." The commission "informed Apple of its preliminary view" that the company is violating the law, the regulator announced.

This starts a process in which Apple has the right to examine documents in the commission's investigation file and reply in writing to the findings. There is a March 2025 deadline for the commission to make a final ruling.

[...] Apple was further accused of charging excessive fees. The commission said that Apple is allowed to charge "a fee for facilitating via the App Store the initial acquisition of a new customer by developers," but "the fees charged by Apple go beyond what is strictly necessary for such remuneration. For example, Apple charges developers a fee for every purchase of digital goods or services a user makes within seven days after a link-out from the app."

Apple says it charges a commission of 27 percent on sales "to the user for digital goods or services on your website after a link out... provided that the sale was initiated within seven days and the digital goods or services can be used in an app."

[...] The commission today also announced it is starting a separate investigation into Apple's "contractual requirements for third-party app developers and app stores," including its "Core Technology Fee." Apple charges the Core Technology Fee for app installs, whether they are delivered from Apple's own App Store, from an alternative app marketplace, or from a developer's own website. The first million installs each year are free, but a per-install fee of €0.50 applies after that.
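
To make the scale of those charges concrete, here is a small back-of-the-envelope sketch based only on the figures quoted in this summary (a 27 percent link-out commission and a EUR 0.50 per-install fee after the first million annual installs). The real fee schedules carry additional conditions, so treat the numbers as illustrative:

    # Rough illustration of the fees described above (figures from this summary only).

    def core_technology_fee(annual_installs: int,
                            fee_per_install: float = 0.50,
                            free_installs: int = 1_000_000) -> float:
        """Estimated Core Technology Fee in euros for one year of installs."""
        return max(0, annual_installs - free_installs) * fee_per_install

    def link_out_commission(qualifying_sales: float, rate: float = 0.27) -> float:
        """Estimated commission on digital-goods sales made within seven days of a link-out."""
        return qualifying_sales * rate

    # Hypothetical developer: 3 million installs and EUR 500,000 in qualifying sales.
    print(core_technology_fee(3_000_000))   # -> 1000000.0 (EUR 1 million)
    print(link_out_commission(500_000.0))   # -> about 135000.0 (EUR 135,000)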

The commission said it would investigate whether the Core Technology Fee complies with the DMA.


Original Submission

posted by janrinok on Tuesday June 25, @04:27PM   Printer-friendly

Climate models are numerical simulations of the climate system, which are used for predicting climate change from emissions scenarios, among many other applications. Let's take a closer look at how they work.

Why do we need climate models?

The climate system and its components like the atmosphere and hydrosphere are driven by many processes that interact with each other to produce the climate we observe. We understand some of these processes very well, such as many atmospheric circulations that drive the weather. Others like the role of aerosols and deep ocean circulations are more uncertain. Even when individual processes are well understood, the interaction between these processes makes the system more complex and produces emergent properties like feedbacks and tipping points. We can't rely on simple extrapolation to generate accurate predictions, which is why we need models to simulate the dynamics of the climate system. Global climate models simulate the entire planet at a coarse resolution. Regional climate models simulate smaller areas at a higher resolution, relying on global climate models for their initial and lateral boundary conditions (the edges of the domain).

How do climate models work?

A climate model is a combination of several components, each of which typically simulates one aspect of the climate system such as the atmosphere, hydrosphere, cryosphere, lithosphere, or biosphere. These components are coupled together, meaning that what happens in one component of the climate system affects all of the other components. The most advanced climate models are large software tools that use parallel computing to run across hundreds or thousands of processors. Climate models are a close cousin of the models we use for weather forecasting and even use a lot of the same source code.

The atmospheric component of the model, for example, has a fluid dynamics simulation at its core. The model numerically integrates a set of primitive equations such as the Navier-Stokes equation, the first law of thermodynamics, the continuity equation, the Clausius-Clapeyron equation, and the equation of state. Global climate models generally assume the atmosphere is in hydrostatic balance at all times, but that is not necessarily the case for regional models. Hydrostatic balance means that the force of gravity is exactly balanced by the upward pressure gradient force, so the air never accelerates upward or downward; in reality, such vertical accelerations do occur in some situations, such as inside thunderstorms.
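
In symbols, the hydrostatic assumption is the standard relation (a textbook identity, not specific to any one model):

    \frac{\partial p}{\partial z} = -\rho g

where p is pressure, z is height, \rho is air density, and g is gravitational acceleration. Non-hydrostatic models drop this assumption so they can resolve strong vertical accelerations like thunderstorm updrafts.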

Not all atmospheric processes can be described by these equations, and to simulate the atmosphere accurately we also need to predict things like aerosols (particulates suspended in the atmosphere), radiation (incoming solar radiation and heat radiated upward), microphysics (e.g., cloud droplets, rain drops, ice crystals, etc.), and, in models with coarse resolutions, deep convection like thunderstorms. These processes are instead parameterized to approximate their effects as accurately as possible in the absence of governing equations.
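
To make "parameterized" concrete, here is a minimal sketch of one classic example, a bulk aerodynamic formula for surface sensible heat flux: rather than resolving individual turbulent eddies, the flux is estimated from grid-scale quantities and an empirical exchange coefficient. This is a textbook-style illustration in Python, not code taken from any particular climate model:

    # Minimal sketch of a bulk-formula parameterization (illustrative only).
    # Turbulent heat exchange near the surface is far too small-scale to resolve,
    # so it is estimated from grid-cell averages and an empirical coefficient.

    RHO_AIR = 1.2    # near-surface air density, kg/m^3
    CP_AIR = 1005.0  # specific heat of air at constant pressure, J/(kg K)

    def sensible_heat_flux(wind_speed, t_surface, t_air, exchange_coeff=1.5e-3):
        """Upward sensible heat flux in W/m^2 from H = rho * c_p * C_H * U * (T_s - T_a)."""
        return RHO_AIR * CP_AIR * exchange_coeff * wind_speed * (t_surface - t_air)

    # Example: a 5 m/s wind blowing over a surface 3 K warmer than the air above it.
    print(sensible_heat_flux(wind_speed=5.0, t_surface=303.0, t_air=300.0))  # ~27 W/m^2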

The atmospheric simulations are generally more complex and run at a higher resolution for weather models than in climate models. However, weather models do not simulate the oceans, land surface, or the biosphere with the same level of complexity because it's not necessary to get accurate forecasts. For example, the deep oceans don't change enough on weather time scales to impact the forecast, but they do change in important ways on climate time scales. A weather model also probably isn't going to directly simulate how temperature and precipitation affect the type of vegetation growing in a particular location, or if there's just bare soil. Instead, a weather model might have data sets for land use and land cover during the summer and winter, use the appropriate data depending on the time of year being simulated, and then use that information to estimate things like albedo and evapotranspiration.

The main difference between climate models and weather models is that weather models are solving an initial condition problem whereas climate modeling is a boundary condition problem. Weather is highly sensitive to the initial state of the atmosphere, meaning that small changes in the atmosphere at the current time might result in large differences a week from now. Climate models depend on factors that occur and are predictable on much longer time scales like greenhouse gas concentrations, land use and land cover, and the temperature and salinity of the deep ocean. Climate models are also not concerned with accurately predicting the weather at a specific point in time, only its statistical moments like the mean and standard deviation over a period of time. We intuitively understand that these statistical moments are predictable on far longer time scales, which is why you could confidently insist that I'm wrong if I claimed that there would be heavy snow in Miami, Florida on June 20, 2050.
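
A toy way to see the difference between the two kinds of problem is to watch a simple chaotic system: individual trajectories from nearly identical starting points diverge quickly (the "weather"), while their long-run statistics remain essentially the same (the "climate"). The sketch below uses the logistic map as a stand-in for the atmosphere; the analogy is ours, not the article's:

    # Toy illustration: chaos ruins point forecasts but not statistics.
    # The logistic map is only an analogy for a chaotic atmosphere.

    def logistic_trajectory(x0, steps, r=3.9):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    a = logistic_trajectory(0.400000, 100_000)
    b = logistic_trajectory(0.400001, 100_000)  # almost identical starting point

    # Individual values decorrelate within a few dozen steps (the "forecast" is lost)...
    print(abs(a[50] - b[50]))                   # the two runs have fully decorrelated by now

    # ...but the long-run means of the two runs agree closely (the "climate" survives).
    print(sum(a) / len(a), sum(b) / len(b))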

How and why are climate models coupled?

Information from the various components in the model needs to be communicated to the other components to get an accurate simulation. For example, precipitation affects the land surface by changing the soil moisture, which may also affect the biosphere. The albedo of the land surface affects air temperatures. Soil moisture also affects temperature, with arid areas typically getting warmer during the day and colder at night. If the precipitation is snow, the snow cover prevents heat from being conducted from the ground into the atmosphere, causing colder temperatures. Warm ocean temperatures are conducive for tropical cyclones to form, but the winds in a strong cyclone can churn up cooler water from below, which will weaken a tropical cyclone.

Both weather and climate models are coupled models, meaning that information is communicated between different components of the system to allow the model to simulate interactions like these and many others. Each component of the climate system (e.g., atmosphere, hydrosphere, lithosphere, etc...) is generally a separate software module that is run simultaneously with the other components and interfaces with them. If the components of weather and climate models weren't coupled together, we couldn't simulate many of the feedbacks and tipping points that arise from these interactions.
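
In software terms, coupling just means that the components advance together and exchange fields every time step. The toy sketch below shows that control flow with a made-up two-component system trading a sea-surface temperature for a heat flux; real models exchange many more fields through a dedicated coupler, and none of the names or numbers below come from an actual model:

    # Toy coupled model: two components exchange fields each time step.
    # Purely illustrative; it only demonstrates the structure of coupling.

    class ToyOcean:
        def __init__(self):
            self.sst = 290.0  # sea-surface temperature, K

        def step(self, heat_flux_down):
            # Warm or cool the mixed layer in response to the atmospheric flux.
            self.sst += 1e-3 * heat_flux_down

    class ToyAtmosphere:
        def __init__(self):
            self.t_air = 285.0  # near-surface air temperature, K

        def step(self, sst):
            # Relax the air temperature toward the sea surface and report
            # the implied downward heat flux to the ocean.
            heat_flux_down = 5.0 * (self.t_air - sst)
            self.t_air += 0.1 * (sst - self.t_air)
            return heat_flux_down

    ocean, atmosphere = ToyOcean(), ToyAtmosphere()
    for _ in range(100):                      # the coupled time-stepping loop
        flux = atmosphere.step(ocean.sst)     # atmosphere sees the current SST
        ocean.step(flux)                      # ocean sees the atmospheric flux
    print(round(atmosphere.t_air, 2), round(ocean.sst, 2))  # the two states have adjusted toward each other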

What are climate models used for?

Perhaps the most frequently discussed application of climate models is simulating how various emissions scenarios will affect future climates. But climate models are also used for many other applications like sensitivity studies, attribution of extreme events, and paleoclimate studies.

An example of a sensitivity study might be to examine how deforestation of the Amazon affects the climate. Such a study would require two simulations: a control simulation with the Amazon rainforest intact, and a second with the rainforest replaced by grassland or bare soil. Most of the parameters that define these simulations, like greenhouse gas concentrations, would be kept identical so that only the presence or absence of the Amazon rainforest would be responsible for the differences in climate. The simulations would be run for a period of time, perhaps years or decades, and then the differences between them are analyzed to determine how sensitive the climate is to whatever was changed.

Extreme event attribution attempts to determine to what extent climate change is responsible for a particular extreme event. This is very similar to sensitivity studies in that there's a control simulation and a second simulation where some aspect of the climate system like greenhouse gas concentrations is different. For example, if we want to estimate the effect of climate change on an extreme heat wave in Europe, we might run a control simulation with preindustrial greenhouse gas levels and another simulation with present day levels. In this case, the greenhouse gas concentrations would probably be prescribed at a particular level and not permitted to vary during the simulation. These simulations might be run for hundreds or even thousands of years to see how often the extreme event occurs in the preindustrial and the modern simulation. If the heat wave occurs every hundred years with modern greenhouse gas levels but never occurs with preindustrial conditions, the event might be attributed entirely to climate change. If the event occurs in both simulations, we would compare the frequency it occurs in each simulation to estimate how much it can be attributed to climate change.
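
The frequency comparison at the end of that paragraph is often summarized with a probability ratio and the related "fraction of attributable risk." Here is a minimal sketch that simply counts threshold exceedances in two ensembles; the ensemble values are invented for illustration:

    # Minimal sketch of event attribution by counting exceedances in two ensembles.
    # Real studies also quantify uncertainty; the numbers here are invented.

    def exceedance_probability(yearly_maxima, threshold):
        """Fraction of simulated years whose maximum exceeds the threshold."""
        return sum(1 for value in yearly_maxima if value > threshold) / len(yearly_maxima)

    def attribution_metrics(preindustrial_years, present_day_years, threshold):
        p0 = exceedance_probability(preindustrial_years, threshold)
        p1 = exceedance_probability(present_day_years, threshold)
        probability_ratio = p1 / p0 if p0 > 0 else float("inf")
        fraction_attributable_risk = 1.0 - p0 / p1 if p1 > 0 else 0.0
        return probability_ratio, fraction_attributable_risk

    # Invented example: the heat wave shows up in 2% of preindustrial years
    # and 10% of present-day years.
    preindustrial = [30.0] * 98 + [41.0] * 2
    present_day = [30.0] * 90 + [41.0] * 10
    print(attribution_metrics(preindustrial, present_day, threshold=40.0))
    # -> approximately (5.0, 0.8): the event is 5x more likely, ~80% attributable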

For paleoclimate simulations, we have much more limited information about the climate. We might know the greenhouse gas concentrations from bubbles of air trapped in ice cores, for example. There may be proxy data like fossil evidence of the plants and animals that lived in a particular location, which can be used to infer whether a climate was hot or cold, or wet or dry. On the other hand, we certainly won't have detailed observations of things like extreme events, oceanic circulations, and many other aspects of the climate system. In this case, the climate model can be configured to match the known aspects of the past climate as closely as possible, and the simulation is then used to fill in the gaps where we don't have observations. Paleoclimate simulations can also be used to identify biases and errors in the model when it's unable to accurately reproduce past climates. When these errors are discovered, the model can be improved to better simulate past climates, which also increases our confidence in its ability to extrapolate future climates.

Can we trust climate models?

All weather models and climate models are wrong. A weather model will never forecast the weather with 100% accuracy, though they do a remarkably good job at forecasting a wide range of weather events. The model is still the best tool we have to predict the weather, especially beyond a day or two, where extrapolation just isn't going to be reliable. Many components are shared between weather and climate models, and if these components didn't work correctly, they would also prevent us from producing accurate weather forecasts. Weather models often do have some systematic bias, especially for longer range forecasts, but we can correct for these biases with statistical postprocessing. Every time a weather model is run, it's also helping to verify the accuracy of any components that are shared with climate models.
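
One simple form of the statistical postprocessing mentioned above is mean-bias correction against a historical record: estimate the model's average offset from observations over a common period and subtract it from new output. The sketch below uses invented numbers; operational postprocessing uses more sophisticated techniques such as quantile mapping:

    # Minimal sketch of mean-bias correction (invented data, illustrative only).

    def mean_bias(model_hindcast, observations):
        """Average model-minus-observation error over a common period."""
        errors = [m - o for m, o in zip(model_hindcast, observations)]
        return sum(errors) / len(errors)

    def bias_correct(model_output, bias):
        """Remove the estimated systematic bias from new model output."""
        return [value - bias for value in model_output]

    # Invented example: the model runs about 1.5 degrees too warm.
    hindcast = [21.4, 22.1, 20.9, 23.0]
    observed = [19.9, 20.7, 19.3, 21.5]
    bias = mean_bias(hindcast, observed)     # about 1.5
    print(bias_correct([24.0, 22.5], bias))  # -> approximately [22.5, 21.0]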

Climate models from a couple of decades ago generated forecasts for the present climate, and once differences in greenhouse gas concentrations are accounted for, they are very accurate at predicting our current climate. Climate models are also used to simulate past climates, and their ability to do so accurately means that we can be more confident in their ability to predict the climate under a much wider range of conditions.

Even when there is a known bias in climate models, it does not invalidate all climate model studies. For example, climate models typically underestimate greenhouse gas sinks, resulting in a high bias in greenhouse gas concentrations for a particular emissions scenario. But we may be able to correct for that bias with statistical postprocessing. Also, many applications of climate models like extreme event attribution, many sensitivity studies, and many paleoclimate simulations do not dynamically simulate the carbon cycle. This means that those applications of climate models would be completely unaffected by the issue with underestimating greenhouse gas sinks.

Many of the climate models like the Goddard Institute for Space Studies models, the Community Earth System Model, and the Weather Research and Forecasting Model (often used in regional climate modeling) are free and open source, meaning that anyone can download the model, examine the source code, and run their own simulations. Data from a large number of climate model simulations is often publicly shared, especially in various intercomparison projects. Climate models are not closely guarded secrets, so anyone can examine and test climate models for themselves, and modify the source code to fix bugs or make improvements.


Original Submission

posted by janrinok on Tuesday June 25, @11:44AM   Printer-friendly
from the cap-that! dept.

Arthur T Knackerbracket has processed the following story:

Scientists at Lawrence Berkeley National Laboratory and UC Berkeley have created "microcapacitors" that address the low energy density of conventional capacitors, as highlighted in a study published in Nature. Made from engineered thin films of hafnium oxide and zirconium oxide, these capacitors employ materials and fabrication techniques commonly used in chip manufacturing. What sets them apart is their ability to store significantly more energy than ordinary capacitors, thanks to the use of negative capacitance materials.

Capacitors are one of the basic components of electrical circuits. They store energy in an electric field established between two metallic plates separated by a dielectric material (non-metallic substance). They can deliver power quickly and have longer lifespans than batteries, which store energy in electrochemical reactions.
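
For reference, the standard textbook relations behind the "energy density" being compared here (not taken from the paper itself) are:

    U = \tfrac{1}{2} C V^2, \qquad u = \tfrac{1}{2}\,\varepsilon_0 \varepsilon_r E^2

where U is the energy stored in a capacitor of capacitance C charged to voltage V, and u is the energy per unit volume held in the dielectric's electric field E, with \varepsilon_r the relative permittivity of the dielectric. Engineering the dielectric film is therefore the main lever for packing more energy into the same on-chip footprint.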

However, these benefits come at the cost of significantly lower energy densities. Perhaps that's why we've only seen low-powered devices like mice powered by this technology, as opposed to something like a laptop. Plus, the problem is only exacerbated when shrinking them down to microcapacitor sizes for on-chip energy storage.

The researchers overcame this by engineering thin films of HfO2-ZrO2 to achieve a negative capacitance effect. By tuning the composition just right, they were able to get the material to be easily polarized by even a small electric field.

To scale up the energy storage capability of the films, the team placed atomically thin layers of aluminum oxide every few layers of HfO2-ZrO2, allowing them to grow the films up to 100 nm thick while retaining the desired properties.

These films were integrated into three-dimensional microcapacitor structures, achieving record-breaking properties: nine times higher energy density and 170 times higher power density compared to the best electrostatic capacitors today. That's huge.

"The energy and power density we got are much higher than we expected," said Sayeef Salahuddin, a senior scientist at Berkeley Lab, UC Berkeley professor, and project lead. "We've been developing negative capacitance materials for many years, but these results were quite surprising."

It's a major breakthrough, but the researchers aren't resting on their laurels just yet. Now they're working on scaling up the technology and integrating it into full-size microchips while improving the negative capacitance of the films further.


Original Submission