
posted by janrinok on Tuesday July 23, @11:15PM

Botanists vote to remove racist reference from plants' scientific names:

[ Editor's Comment: caffra means 'infidel' in Arabic, and it was used as a racial slur against Black (non-Arab) people, predominantly in South Africa. ]

Scientists have voted to eliminate the names of certain plants that are deemed to be racially offensive. The decision to remove a label that contains such a slur was taken last week after a gruelling six-day session attended by more than 100 researchers, as part of the International Botanical Congress, which officially opens on Sunday in Madrid.

The effect of the vote will be that all plants, fungi and algae names that contain the word caffra, which originates in insults made against Black people, will be replaced by the word affra to denote their African origins. More than 200 species will be affected, including the coast coral tree, which will be known as Erythrina affra instead of Erythrina caffra.
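
As a rough illustration of what the rule change amounts to in practice, the substitution can be sketched in a few lines of Python. This is only an illustration: the helper function and species list here are invented, not part of any official renaming tool.

    # Illustrative sketch of the epithet substitution described above.
    def rename_epithet(binomial: str) -> str:
        """Replace the specific epithet 'caffra' with 'affra', per the new rule."""
        genus, _, epithet = binomial.partition(" ")
        return f"{genus} {'affra' if epithet == 'caffra' else epithet}"

    # Erythrina caffra is the article's example; Protea repens is unaffected.
    for name in ["Erythrina caffra", "Dovyalis caffra", "Protea repens"]:
        print(f"{name} -> {rename_epithet(name)}")
    # Erythrina caffra -> Erythrina affra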

The scientists attending the nomenclature session also agreed to create a special committee which would rule on names given to newly discovered plants, fungi and algae. These are usually named by those who first describe them in the scientific literature. However, the names could now be overruled by the committee if they are deemed to be derogatory to a group or race.

A more general move to rule on other controversial historical labels was not agreed by botanists. Nevertheless, the changes agreed last week are the first alterations to the rules for naming species that taxonomists have officially approved, and were welcomed by the botanist Sandy Knapp of the Natural History Museum in London, who presided over the six-day nomenclature session.

"This is an absolutely monumental first step in addressing an issue that has become a real problem in botany and also in other biological sciences," she told the Observer. "It is a very important start."

The change to remove the word caffra from species names was proposed by the plant taxonomist Prof Gideon Smith of Nelson Mandela University in South Africa, and his colleague Prof Estrela Figueiredo. They have campaigned for years for changes to be made to the international system for giving scientific names to plants and animals in order to permit the deletion and substitution of past names deemed objectionable.

"We are very pleased with the retroactive and permanent eradication of a racial slur from botanical nomenclature," Smith told the Observer. "It is most encouraging that more than 60% of our international colleagues supported this proposal."

And the Australian plant taxonomist Kevin Thiele – who had originally pressed for historical past names to be subject to changes as well as future names – told Nature that last week's moves were "at least a sliver of recognition of the issue".

Plant names are only a part of the taxonomic controversy, however. Naming animals after racists, fascists and other controversial figures causes just as many headaches as those posed by plants, say scientists. Examples include Anophthalmus hitleri, a brown, eyeless beetle named after Adolf Hitler. Nor is it alone: many other species' names recall individuals who offend, such as the moth Hypopta mussolinii.

The International Commission on Zoological Nomenclature (ICZN) has so far refused to consider changing its rules to allow the removal of racist or fascist references. Renaming would be disruptive, while replacement names could one day be seen as offensive "as attitudes change in the future", it announced in the Zoological Journal of the Linnean Society last year. Nevertheless, many researchers have acknowledged that some changes will have to be made to zoological nomenclature rules in the near future.


Original Submission

posted by janrinok on Tuesday July 23, @05:31PM
from the fingers-crossed dept.

Academic journals are a lucrative scam – and we're determined to change that:


Giant publishers are bleeding universities dry, with profit margins that rival Google's. So we decided to start our own

If you've ever read an academic article, the chances are that you were unwittingly paying tribute to a vast profit-generating machine that exploits the free labour of researchers and siphons off public funds.

The annual revenues of the "big five" commercial publishers – Elsevier, Wiley, Taylor & Francis, Springer Nature, and SAGE – are each in the billions, and some have staggering profit margins approaching 40%, surpassing even the likes of Google. Meanwhile, academics do almost all of the substantive work to produce these articles free of charge: we do the research, write the articles, vet them for quality and edit the journals.

Not only do these publishers not pay us for our work; they then sell access to these journals to the very same universities and institutions that fund the research and editorial labour in the first place. Universities need access to journals because these are where most cutting-edge research is disseminated. But the cost of subscribing to these journals has become so exorbitantly expensive that some universities are struggling to afford them. Consequently, many researchers (not to mention the general public) remain blocked by paywalls, unable to access the information they need. If your university or library doesn't subscribe to the main journals, downloading a single paywalled article on philosophy or politics can cost between £30 and £40.

The commercial stranglehold on academic publishing is doing considerable damage to our intellectual and scientific culture. As disinformation and propaganda spread freely online, genuine research and scholarship remains gated and prohibitively expensive. For the past couple of years, I worked as an editor of Philosophy & Public Affairs, one of the leading journals in political philosophy. It was founded in 1972, and it has published research from renowned philosophers such as John Rawls, Judith Jarvis Thomson and Peter Singer. Many of the most influential ideas in our field, on topics from abortion and democracy to famine and colonialism, started out in the pages of this journal. But earlier this year, my co-editors and I and our editorial board decided we'd had enough, and resigned en masse.

We were sick of the academic publishing racket and had decided to try something different. We wanted to launch a journal that would be truly open access, ensuring anyone could read our articles. This will be published by the Open Library of Humanities, a not-for-profit publisher funded by a consortium of libraries and other institutions. When academic publishing is run on a not-for-profit basis, it works reasonably well. These publishers provide a real service and typically sell the final product at a reasonable price to their own community. So why aren't there more of them?

To answer this, we have to go back a few decades, when commercial publishers began buying up journals from university presses. Exploiting their monopoly position, they then sharply raised prices. Today, a library subscription to a single journal in the humanities or social sciences typically costs more than £1,000 a year. Worse still, publishers often "bundle" journals together, forcing libraries to buy ones they don't want in order to have access to ones they do. Between 2010 and 2019, UK universities paid more than £1bn in journal subscriptions and other publishing charges. More than 90% of these fees went to the big five commercial publishers (UCL and Manchester shelled out over £4m each). It's worth remembering that the universities funded this research, paid the salaries of the academics who produced it and then had to pay millions of pounds to commercial publishers in order to access the end product.

Even more astonishing is the fact these publishers often charge authors for the privilege of publishing in their journals. In recent years, large publishers have begun offering so-called "open access" articles that are free to read. On the surface, this might sound like a welcome improvement. But for-profit publishers provide open access to readers only by charging authors, often thousands of pounds, to publish their own articles. Who ends up paying these substantial author fees? Once again, universities. In 2022 alone, UK institutions of higher education paid more than £112m to the big five to secure open-access publication for their authors.

This trend is having an insidious impact on knowledge production. Commercial publishers are incentivised to try to publish as many articles and journals as possible, because each additional article brings in more profit. This has led to a proliferation of junk journals that publish fake research, and has increased the pressure on rigorous journals to weaken their quality controls. It's never been more evident that for-profit publishing simply does not align with the aims of scholarly inquiry.

There is an obvious alternative: universities, libraries, and academic funding agencies can cut out the intermediary and directly fund journals themselves, at a far lower cost. This would remove commercial pressures from the editorial process, preserve editorial integrity and make research accessible to all. The term for this is "diamond" open access, which means the publishers charge neither authors, editors, nor readers (this is how our new journal will operate). Librarians have been urging this for years. So why haven't academics already migrated to diamond journals?

The reason is that such journals require alternative funding sources, and even if such funding were in place, academics still face a massive collective action problem: we want a new arrangement but each of us, individually, is strongly incentivised to stick with the status quo. Career advancement depends heavily on publishing in journals with established name recognition and prestige, and these journals are often owned by commercial publishers. Many academics – particularly early-career researchers trying to secure long-term employment in an extremely difficult job market – cannot afford to take a chance on new, untested journals on their own.


Original Submission

posted by hubie on Tuesday July 23, @11:45AM

Arthur T Knackerbracket has processed the following story:

New research led by scientists at the University of Michigan reveals that the Arctic has lost approximately 25% of its cooling ability since 1980 due to diminishing sea ice and reduced reflectivity. Additionally, this phenomenon has contributed to a global loss of up to 15% in cooling power.

Using satellite measurements of cloud cover and the solar radiation reflected by sea ice between 1980 and 2023, the researchers found that the percent decrease in sea ice’s cooling power is about twice as high as the percent decrease in annual average sea ice area in both the Arctic and Antarctic. The added warming impact from this change to sea ice cooling power is toward the higher end of climate model estimates.
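
To see how the cooling-power decline can run at roughly twice the area decline, note that the remaining ice is also becoming less reflective (discussed further below), so the two losses compound. A back-of-envelope Python illustration, with invented numbers rather than the paper's data:

    # Crude illustration only: cooling power modeled as proportional to ice
    # area times the reflectivity of the remaining ice. The fractions below
    # are invented for illustration and are not taken from the study.
    area_decline = 0.12       # assumed fractional loss of annual-mean ice area
    albedo_decline = 0.12     # assumed fractional loss of reflectivity of remaining ice

    cooling_power_decline = 1 - (1 - area_decline) * (1 - albedo_decline)
    print(f"area down {area_decline:.0%}, cooling power down {cooling_power_decline:.1%}")
    # area down 12%, cooling power down 22.6% -- roughly double the area loss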

“When we use climate simulations to quantify how melting sea ice affects climate, we typically simulate a full century before we have an answer,” said Mark Flanner, professor of climate and space sciences and engineering and the corresponding author of the study published in Geophysical Research Letters.

“We’re now reaching the point where we have a long enough record of satellite data to estimate the sea ice climate feedback with measurements.”

[...] The Arctic has seen the largest and most steady declines in sea ice cooling power since 1980, but until recently, the South Pole had appeared more resilient to the changing climate. Its sea ice cover had remained relatively stable from 2007 into the 2010s, and the cooling power of the Antarctic’s sea ice was actually trending up at that time.

That view abruptly changed in 2016, when an area larger than Texas melted on one of the continent’s largest ice shelves. The Antarctic lost sea ice then too, and its cooling power hasn’t recovered, according to the new study. As a result, 2016 and the following seven years have had the weakest global sea ice cooling effect since the early 1980s.

Beyond disappearing ice cover, the remaining ice is also growing less reflective as warming temperatures and increased rainfall create thinner, wetter ice and more melt ponds that reflect less solar radiation. This effect has been most pronounced in the Arctic, where sea ice has become less reflective in the sunniest parts of the year, and the new study raises the possibility that it could be an important factor in the Antarctic, too—in addition to lost sea ice cover.

[...] The research team hopes to provide their updated estimates of sea ice’s cooling power and climate feedback from less reflective ice to the climate science community via a website that is updated whenever new satellite data is available.

Reference: “Earth’s Sea Ice Radiative Effect From 1980 to 2023” by A. Duspayev, M. G. Flanner and A. Riihelä, 17 July 2024, Geophysical Research Letters.
  DOI: 10.1029/2024GL109608


Original Submission

posted by hubie on Tuesday July 23, @06:10AM

https://pldb.io/blog/JohnOusterhout.html

Dr. John Ousterhout is a computer science luminary who has made significant contributions to the field, particularly in the areas of operating systems and file systems. He is the creator of the Tcl scripting language and has also worked on several major software projects, including the Log-Structured File System and the Sprite operating system. Ousterhout's creation of Tcl has had a lasting impact on the technology industry, transforming the way developers think about scripting and automation.
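
As an aside for readers who have never touched Tcl: Python ships with an embedded Tcl interpreter via its tkinter module, so a taste of Ousterhout's language is a few lines away (a quick illustration, not drawn from the linked article):

    # Python's tkinter module embeds a Tcl interpreter, so a few lines of
    # Ousterhout's language can be run without installing anything extra.
    import tkinter

    tcl = tkinter.Tcl()  # a bare Tcl interpreter, no GUI window
    tcl.eval('set greeting "Hello from Tcl"')
    print(tcl.eval('set greeting'))   # Hello from Tcl
    print(tcl.eval('expr {6 * 7}'))   # 42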


Original Submission

posted by hubie on Tuesday July 23, @01:26AM

Arthur T Knackerbracket has processed the following story:

Two new studies suggest that antibodies that attack people’s own tissues might cause ongoing neurological issues that afflict millions of people with the disease. 

When scientists transferred these antibodies from people with long COVID into healthy mice, certain symptoms, including pain, transferred to the animals too, researchers reported May 31 on bioRxiv.org and June 19 on medRxiv.org. 

Though scientists have previously implicated such antibodies, known as autoantibodies, as suspects in long COVID, the new studies are the first to offer direct evidence that they can do harm. “This is a big deal,” says Manali Mukherjee, a translational immunologist at McMaster University in Hamilton, Canada, who was not involved in the work. The papers make a good case for therapies that target autoantibodies, she says.

The work could also offer “peace of mind to some of the long-haulers,” Mukherjee says. As someone who has endured long COVID herself, she understands that when patients don’t know the cause of their suffering, it can add to their anxiety. They wonder, “What the hell is going wrong with me?” she says.

[...] Scientists have proposed many hypotheses for what causes long COVID, including SARS-CoV-2 virus lingering in the tissues and the reawakening of dormant herpes viruses (SN: 3/4/24). Those elements may still play a role in some people’s long COVID symptoms, but for pain, at least, rogue antibodies seem to be enough to kick-start the symptom all on their own. It’s not an out-of-the-blue role for autoantibodies; scientists suspect they may also be involved in other conditions that cause people pain, including fibromyalgia and myalgic encephalomyelitis/chronic fatigue syndrome.

But if doctors could identify which long COVID patients have pain-linked autoantibodies, they could try to reduce the amount circulating in the blood, says Akiko Iwasaki, an immunologist at Yale University and a Howard Hughes Medical Institute investigator. "I think that would really be a game changer for this particular set of patients."

The work represents a “very strong level of evidence” that autoantibodies could cause harm in people with long COVID, says Ignacio Sanz, an immunologist at Emory University in Atlanta. Both he and Mukherjee would like to see the findings validated in larger sets of participants. And the real clincher, Sanz says, would come from longer-term studies. If scientists could show that patients’ symptoms ease as these rogue antibodies disappear over time, that’d be an even surer sign of their guilt. 

References:
    • K. S. Guedes de Sa et al. A causal link between autoantibodies and neurological symptoms in long COVID. medRxiv.org. Posted June 19, 2024. doi: 10.1101/2024.06.18.24309100.
    • H.-J. Chen et al. Transfer of IgG from long COVID patients induces symptomology in mice. bioRxiv.org. Posted May 31, 2024. doi: 10.1101/2024.05.30.596590.


Original Submission

posted by hubie on Monday July 22, @08:38PM
from the private-sector-always-does-it-cheaper dept.

Arthur T Knackerbracket has processed the following story:

Europe’s largest local authority faces a $15.58 million (£12 million) bill for manually auditing accounts that should have been supported by an Oracle ERP system installed in April 2022.

The £3.2 billion ($4.1 billion) budget authority has become infamous for its ERP project disaster, which saw it switch from legacy SAP software to cloud-based Oracle Fusion, a customer win that Oracle co-founder and CTO Larry Ellison once flaunted to investors.

The delayed project left the council without auditable accounts and without security features, while costs climbed from around £20 million to as much as £131 million. The IT problems contributed to Birmingham City Council becoming effectively bankrupt in September last year.

A report from external auditors stated the council will not have a fully functioning cash system until April next year, three years after it went live on an Oracle ERP, and will have to wait until September 2025 for a fully functioning finance system.

Yesterday, Mark Stocks, head of public sector practice at external auditors Grant Thornton, told councillors that officials had told him the new accounting “out-of-the-box” system might not be ready until March 2026, nearly four years after the failing customized system first went live.

The lack of a functioning accounting system was making it costly and time consuming to produce a full audit, the auditors concluded after exploratory work.

[...] Problems with the customized ERP system were multiple, but cash management, bank reconciliation and accounts receivable were of particular concern. The council has bought third-party software — CivicaPay/Civica Income Management — as the replacement for the banking system.

Stocks said officials had been working hard to improve the current Oracle system, and said he did not “lose that message.”

Nonetheless, serious issues continue. “You're not going to have a fully functioning finance system and cash system [until] April next year. The actual financial ledger could be April 2026. That's really difficult from a finance officer point of view [and] it's particularly difficult from an external audit point of view to draw a conclusion on your accounts,” he said.


Original Submission

posted by hubie on Monday July 22, @03:52PM

Editor's note: Due to the extensive use of buzzwords, the submitter questions whether this was written by a human or not, but perhaps those who are knowledgeable in network architecture can comment on whether this idea is as revolutionary as TFA suggests.

Arthur T Knackerbracket has processed the following story:

A research team has proposed a revolutionary polymorphic network environment (PNE) in their study, which seeks to achieve global scalability while addressing the diverse needs of evolving network services. Their framework challenges traditional network designs by creating a versatile “network of networks” that overcomes the limitations of current systems, paving the way for scalable and adaptable network architectures.

A recent paper published in Engineering by scientists Wu Jiangxing and his research team introduces a theoretical framework that promises to transform network systems and architectures. The study tackles a critical issue in network design: how to achieve global scalability while meeting the varied demands of evolving services.

For decades, the quest for an ideal network capable of seamlessly scaling across various dimensions has remained elusive. The team, however, has identified a critical barrier known as the “impossible service-level agreement (S), multiplexity (M), and variousness (V) triangle” dilemma, which highlights the inherent limitations of traditional unimorphic network systems. These systems struggle to adapt to the growing complexity of services and application scenarios while maintaining global scalability throughout the network’s life cycle.

To overcome this challenge, the researchers propose a paradigm shift in network development—an approach they term the polymorphic network environment (PNE). At the core of this framework lies the separation of application network systems from the underlying infrastructure environment. By leveraging core technologies such as network elementization and dynamic resource aggregation, the PNE enables the creation of a versatile “network of networks” capable of accommodating diverse service requirements.

Through extensive theoretical analysis and environment testing, the team demonstrates the viability of the PNE model. Results indicate that the framework not only supports multiple application network modalities simultaneously but also aligns with technical and economic constraints, thus paving the way for scalable and adaptable network architectures.

Reference: “Theoretical Framework for a Polymorphic Network Environment” by Jiangxing Wu et al., 28 February 2024, Engineering. DOI: 10.1016/j.eng.2024.01.018


Original Submission

posted by hubie on Monday July 22, @11:06AM

Arthur T Knackerbracket has processed the following story:

A new study published in Science Advances reveals a surprising twist in the evolutionary history of complex life. Researchers at Queen Mary University of London have discovered that a single-celled organism, a close relative of animals, harbors the remnants of ancient giant viruses woven into its own genetic code. This finding sheds light on how complex organisms may have acquired some of their genes and highlights the dynamic interplay between viruses and their hosts.

The study focused on a microbe called Amoebidium, a unicellular parasite found in freshwater environments. By analyzing Amoebidium's genome, the researchers led by Dr. Alex de Mendoza Soler, Senior Lecturer at Queen Mary's School of Biological and Behavioural Sciences, found a surprising abundance of genetic material originating from giant viruses—some of the largest viruses known to science. These viral sequences were heavily methylated, a chemical tag that often silences genes.

[...] "These findings challenge our understanding of the relationship between viruses and their hosts," says Dr. de Mendoza Soler. "Traditionally, viruses are seen as invaders, but this study suggests a more complex story. Viral insertions may have played a role in the evolution of complex organisms by providing them with new genes. And this is allowed by the chemical taming of these intruders' DNA."

Furthermore, the findings in Amoebidium offer intriguing parallels to how our own genomes interact with viruses. Similar to Amoebidium, humans and other mammals have remnants of ancient viruses, called endogenous retroviruses, integrated into their DNA.

While these remnants were previously thought to be inactive "junk DNA," some might now be beneficial. However, unlike the giant viruses found in Amoebidium, endogenous retroviruses are much smaller, and the human genome is significantly larger. Future research can explore these similarities and differences to understand the complex interplay between viruses and complex life forms.

More information: Luke A. Sarre et al, DNA methylation enables recurrent endogenization of giant viruses in an animal relative, Science Advances (2024). DOI: 10.1126/sciadv.ado6406


Original Submission

posted by janrinok on Monday July 22, @06:23AM

CrowdStrike broke Debian and Rocky Linux months ago, but no one noticed:

A widespread Blue Screen of Death (BSOD) issue on Windows PCs disrupted operations across various sectors, notably impacting airlines, banks, and healthcare providers. The issue was caused by a problematic channel file delivered via an update from the popular cybersecurity service provider, CrowdStrike. CrowdStrike confirmed that this crash did not impact Mac or Linux PCs.

Although many may view this as an isolated incident, similar problems had been occurring for months without much awareness. Users of Debian and Rocky Linux also experienced significant disruptions as a result of CrowdStrike updates, raising serious concerns about the company's software update and testing procedures. These occurrences highlight potential risks for customers who rely on its products daily.

In April, a CrowdStrike update caused all Debian Linux servers in a civic tech lab to crash simultaneously and refuse to boot. The update proved incompatible with the latest stable version of Debian, despite the specific Linux configuration being supposedly supported. The lab's IT team discovered that removing CrowdStrike allowed the machines to boot and reported the incident.

A team member involved in the incident expressed dissatisfaction with CrowdStrike's delayed response. It took them weeks to provide a root cause analysis after acknowledging the issue a day later. The analysis revealed that the Debian Linux configuration was not included in their test matrix.

"Crowdstrike's model seems to be 'we push software to your machines any time we want, whether or not it's urgent, without testing it'," lamented the team member.

This was not an isolated incident. CrowdStrike users also reported similar issues after upgrading to Rocky Linux 9.4, with their servers crashing due to a kernel bug. CrowdStrike support acknowledged the issue, highlighting a pattern of inadequate testing and insufficient attention to compatibility issues across different operating systems.

To avoid such issues in the future, CrowdStrike should prioritize rigorous testing across all supported configurations. Additionally, organizations should approach CrowdStrike updates with caution and have contingency plans in place to mitigate potential disruptions.

Sources: Y Combinator, Rocky Linux


Original Submission

posted by hubie on Monday July 22, @01:40AM
from the mmmm-mmmm-good dept.

Arthur T Knackerbracket has processed the following story:

At the end of a small country road in Denmark is the "Enorm" factory, an insect farm set up by a Danish woman who wants to revolutionize livestock feed.

Jane Lind Sam and her father, Carsten Lind Pedersen, swapped pigs for soldier flies and created a 22,000-square-metre (237,000 square feet) factory where they intend to produce more than 10,000 tonnes of insect meal and oil a year.

The factory, which opened in December 2023, is the largest of its kind in northern Europe, and its products will initially be used by farmers for animal feed and, perhaps in the future, for human consumption.

The two entrepreneurs are making products that will be "substituting other, maybe less climate-friendly products", Lind Sam, co-owner and chief operations officer, explained to AFP.

They hope to contribute to the evolution of agriculture in a country where the sector's climate impact is under scrutiny.

[...] Under turquoise fluorescent lights, millions of black flies buzzed inside some 500 plastic cages, where they lay hundreds of thousands of eggs every day.

Inside the facility, it was impossible to escape the roar of insects that incessantly lay eggs throughout their 10-day lifespan.

"The female fly lays its eggs in this piece of cardboard," Lind Sam explained as she pulled out a sheet with a honeycomb pattern at the bottom of one of the cages.

About 25 kilograms (55 pounds) of eggs are produced per day. A single gram corresponds to about 40,000 eggs.
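
Taken at face value, those two figures imply an output of around a billion eggs per day; a quick back-of-envelope check using the article's round numbers:

    # Back-of-envelope arithmetic from the figures quoted above.
    eggs_per_gram = 40_000
    grams_per_day = 25 * 1_000    # 25 kg of eggs per day
    print(f"{eggs_per_gram * grams_per_day:,} eggs per day")
    # 1,000,000,000 eggs per day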

From these eggs come some of tomorrow's feeder flies, but also the future maggots which, once they have become pupae, will be transformed.

[...] "They are fascinating animals. And I think it's amazing that they can live on any organic matter," Lind Sam said.

Niels Thomas Eriksen, a biologist at Aalborg University, told AFP that "insects can eat materials that other animals probably won't so we can make better use" of agricultural byproducts and food waste.

Minimizing waste is one of Enorm's key aims and the manufacturer stressed that the rearing of insects facilitates "the recycling of nutrients".

It takes between 40 and 50 days to produce the finished product, which is mainly flour with a protein content of 55 percent.

It is then distributed across Europe—although Enorm remains discreet about the identity of its customers—used for feed for pig, poultry, fish and pet farms.

See Also: Fly larvae: Costa Rica's sustainable protein for animal feed


Original Submission

posted by hubie on Sunday July 21, @08:52PM

https://www.indiewire.com/news/breaking-news/spaceballs-sequel-amazon-mgm-mel-brooks-josh-gad-1235017862/

"Spaceballs" the t-shirt! "Spaceballs" the coloring book! "Spaceballs" the....sequel?

Yup. A source told IndieWire that Amazon MGM Studios is currently in early development on a sequel to "Spaceballs." Mel Brooks is returning to produce the feature, a direct follow-up to his 1987 "Star Wars" spoof. Josh Gad is also on board to star in and produce what we'll call "Spaceballs 2."

[...] Josh Greenbaum, who directed "Barb & Star Go to Vista Del Mar" and this year's documentary "Will & Harper," is attached to direct the sequel. He's working from a script by Benji Samit, Dan Hernandez, and Josh Gad. Samit and Hernandez are known for working on "Teenage Mutant Ninja Turtles: Mutant Mayhem," "Pokemon Detective Pikachu," and the upcoming "Lego Star Wars: Rebuild the Galaxy."

[...] The 1987 "Spaceballs" starred Brooks, Rick Moranis as Dark Helmet, the late John Candy as the Chewbacca parody Barf, and Bill Pullman as the hero Lone Starr. In classic Brooks fashion, the film mercilessly ripped off "Star Wars," featuring everything from heroes fighting with pseudo-lightsabers extending from rings to characters using the "Schwartz" to save the day, and even Brooks playing a Yoda parody, Yogurt, who shamelessly plugged fourth-wall-breaking "Spaceballs" merchandise.

The film made $38 million worldwide but has become part of the canon of staples for Brooks acolytes. Brooks recently wrote the Hulu series "History of the World: Part II," a sequel to his 1981 sketch film. That series also featured Gad in one episode playing Shakespeare.


Original Submission

posted by hubie on Sunday July 21, @04:04PM
from the poop dept.

https://arstechnica.com/tech-policy/2024/07/dirty-diaper-resold-on-amazon-ruined-a-family-business-report-says/

A feces-encrusted swim diaper tanked a family business after Amazon re-sold it as new, Bloomberg reported, triggering a bad review that quickly turned a million-dollar mom-and-pop shop into a $600,000 pile of debt.

Paul and Rachelle Baron, owners of Beau & Belle Littles, told Bloomberg that Amazon is supposed to inspect returned items before reselling them. But the company failed to detect the poop stains before reselling a damaged item that triggered a one-star review in 2020 that the couple says doomed their business after more than 100 buyers flagged it as "helpful."

"The diaper arrived used and was covered in poop stains," the review said, urging readers to "see pics."
[...]
Amazon says that it prohibits negative reviews that violate community guidelines, including by focusing on seller, order, or shipping feedback rather than on the item's quality. Other one-star reviews for the same product that the Barons seemingly accept as valid comment on quality, leaving feedback like the diaper fitting too tightly or leaking.
[...]
But Amazon ultimately declined to remove the bad review, Paul Baron told Bloomberg. The buyer who left the review, a teacher named Erin Elizabeth Herbert, told Bloomberg that the Barons had reached out directly to explain what happened, but she forgot to update the review and still has not as of this writing.

"I always meant to go back and revise my review to reflect that, and life got busy and I never did," Herbert told Bloomberg.

Her review remains online, serving as a warning for parents to avoid buying from the family business.
[...]
On Amazon's site, other sellers have complained about the company's failure to remove reviews that clearly violate community guidelines. In one case, an Amazon support specialist named Danika acknowledged that the use of profanity in a review, for example, "seems particularly cut and dry as a violation," promising to escalate the complaint. However, Danika appeared to abandon the thread after that, with the user commenting that the review remained up after the escalation.

[...] The Barons told Ars they've given up on resolving the issue with Amazon after a support specialist appeared demoralized, admitting that "it's completely" Amazon's "fault" but there was nothing he could do.
[...]
Amazon promises on its site that "each item at an Amazon return center is carefully inspected and evaluated to determine if it meets Amazon's high bar to be re-listed for sale."

The company supposedly evaluates the packaging for broken seals, then opens the package to "confirm the item matches the description, check for any signs of use, and assess any product damage" before it's deemed to meet Amazon's "high standards" and can be resold as new.
[...]
Earlier this year, the company apologized for selling a customer in India a "new" laptop that was obviously used and had a warranty that had started six months before it was purchased, Hindustan Times reported. In one Reddit thread accusing Amazon of a "laptop scam" viewed by thousands, a user claimed that Amazon's refund process resulted in an investigation on his account for "suspicious activity."
[...]
The Federal Trade Commission, though, is currently focused more on probing how Amazon allegedly stifles competition than on reports of harms to consumers and sellers through allegedly deceptive advertising. That investigation will take years to wrap up, Reuters reported, with the trial not expected to start until 2026.
[...]
For the Barons, the damage control continues despite a decade of mostly glowing reviews for their baby products and years of contacting Amazon seeking assistance. They worry Amazon might still be reselling used items, but they cannot stop using the platform because Amazon remains their primary source of sales, the couple told Ars. Last summer, The Strategist ranked the item hit by the bad review among the "best swim diapers," and this summer, so did Parents.com. So far, though, hoping to bury the bad review with positive endorsements seems to have done little to help the Barons turn their business around.

"Amazon talks a big game about helping small businesses," Paul Baron told Bloomberg. "But they really don't."

Related stories on SoylentNews:
Tech CEO Gets 6 Years for Selling Fake Cisco Gear on Amazon, eBay - 20240506
He Blew the Whistle on Amazon. He's Still Paying the Price - 20231214
Judge: Amazon "Cannot Claim Shock" That Bathroom Spycams Were Used as Advertised - 20231206
After Luring Customers With Low Prices, Amazon Stuffs Fire TVs With Ads - 20231112
Amazon Drivers' Urine Packaged as Energy Drink, Sold on Amazon - 20231024
FTC Files "the Big One," a Lawsuit Alleging Amazon Illegally Maintains Monopoly - 20230928
Amazon Adding Ads to Prime Video in 2024 Unless You Pay $2.99 Extra - 20230926
Amazon Won't Stop Sending Tortured Woman Unwanted Boxes of Shoes - 20230811
FTC Prepares "the Big One," a Major Lawsuit Targeting Amazon's Core Business - 20230630
Amazon's Smart Speakers Collecting Kids Data May Lead to Government Lawsuit - 20230402
Amazon, Google Busted Faking Small Business Opposition to Antitrust Reform - 20220406
Amazon Lied About Using Seller Data, Lawmakers Say, Urging DOJ Investigation - 20220314
Alexa Tells 10-Year-Old Girl to Touch Live Plug with Penny - 20211228
Amazon Unlawfully Confiscated Union Literature, NLRB (National Labor Relations Board) Finds - 20210804
Amazon Blames Social Media Companies for Sales of Fake Amazon Reviews - 20210616


Original Submission

posted by janrinok on Sunday July 21, @11:17AM

Arthur T Knackerbracket has processed the following story:

Opioids, like morphine, are effective painkillers but have led to widespread addiction and serious side effects such as respiratory depression, notably in the U.S. opioid crisis that claimed nearly 645,000 lives from 1999 to 2021. Researchers at Johannes Gutenberg University Mainz have identified a potential alternative: aniquinazolin B, a natural product from the marine fungus Aspergillus nidulans that binds to opioid receptors and, after rigorous testing that included over 750,000 calculations per substance on the MOGON supercomputer, shows promise as a replacement with fewer undesirable effects.

Opioids, recognized for their significant pharmacological effects, have long been used as effective painkillers. Morphine, a notable example first isolated and synthesized in the early 19th century, provides crucial relief for patients in the final stages of severe illness.

However, when opioids are used inappropriately they can cause addiction and even the development of extremely serious undesirable effects, such as respiratory depression. In the USA, opioids were once widely promoted through the media and, as a consequence, were often prescribed to treat what were in fact mild disorders. According to the Centers for Disease Control and Prevention (CDC), there were nearly 645,000 cases of mortality due to opioid overdose in the United States between 1999 and 2021.

And the opioid crisis has arrived in Germany, too. The main problem is street drugs and the fact that heroin, in particular, is cut with cheaper synthetic opioids such as fentanyl. While a dose of 200 milligrams of heroin is fatal, just two milligrams of fentanyl can kill. In 2022, more than 1,000 people in Germany died as a result of the consumption of opioids.

Governments have introduced measures to contain this epidemic. However, opioid addiction rates remain high, and other patients suffer from extreme pain that needs to be alleviated. There is thus an urgent need for safe analgesics. Researchers at Johannes Gutenberg University Mainz (JGU) – with the financial support of the Research Training Group “Life Sciences – Life Writing”, funded by the German Research Foundation (DFG) – have now made progress towards this goal.

“A natural product called aniquinazolin B that is isolated from the marine fungus Aspergillus nidulans stimulates the opioid receptors and could possibly thus be used instead of opioids in the future,” explained Roxana Damiescu, a member of the research team headed by Professor Thomas Efferth.

In the search for new compounds, the team started with a chemical database of more than 40,000 natural substances. Their aim was to determine how effectively each substance would bind to the corresponding receptor and, in addition, to ascertain whether it had the properties required of a pharmaceutical drug.

Such a compound must be water-soluble to some extent, for example. This research required approximate calculations whose results become increasingly precise the more often they are repeated; each substance was the subject of some 750,000 individual calculations. Such a colossal number would vastly exceed the capacity of a standard PC, so the team used the MOGON supercomputer at JGU. The top 100 candidates from these calculations were subsequently assessed using other analytical methods.
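
The funnel described here, scoring everything cheaply and promoting only the best candidates to costlier analysis, is a standard virtual-screening pattern. A minimal Python sketch of its shape; the scoring function is a random placeholder standing in for the docking-style binding-energy calculations the team ran on MOGON:

    # Minimal sketch of the screening funnel described above. The scorer is
    # a placeholder; the real study performed some 750,000 binding-energy
    # calculations per substance on the MOGON supercomputer.
    import random

    def predicted_binding_energy(compound: str) -> float:
        """Placeholder docking score (more negative = tighter binding)."""
        return random.uniform(-12.0, -2.0)   # kcal/mol, invented range

    library = [f"NP-{i:05d}" for i in range(40_000)]   # ~40,000 natural products
    scores = {c: predicted_binding_energy(c) for c in library}

    top_100 = sorted(scores, key=scores.get)[:100]   # lowest energies first
    print(top_100[:5])   # candidates promoted to further analysis, then the lab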

The resultant top ten found their way into the lab, where they underwent biochemical analysis. The initial priority was to establish safety. Using preparations of human kidney cells, the researchers looked at whether higher concentrations of each substance would prove toxic to the cells and even kill them. Finally, two other aspects had to be subjected to testing.

“We needed to confirm that the high binding energy of the substances to the pain receptors that had been predicted by the theoretical calculations was actually also produced in the real physical world,” said Professor Thomas Efferth, head of the JGU Department of Pharmaceutical Biology. However, binding of a substance to the receptors is not alone sufficient. The binding must also influence the functioning of the receptors.

Thus, the research team used a second test system to assess whether there was the kind of inhibition of biological activity that occurs during opioid use. One of the two compounds passed all tests with flying colors: aniquinazolin B, the substance present in the marine fungus Aspergillus nidulans. “The results of our investigations indicate that this substance may have effects similar to those of opioids. At the same time, it causes far fewer undesirable reactions,” concluded Roxana Damiescu.

Reference: “Aniquinazoline B, a Fungal Natural Product, Activates the μ-Opioid Receptor” by Roxana Damiescu, Mohamed Elbadawi, Mona Dawood, Sabine M. Klauck, Gerhard Bringmann and Thomas Efferth, 23 May 2024, ChemMedChem.
  DOI: 10.1002/cmdc.202400213


Original Submission

posted by janrinok on Sunday July 21, @06:32AM

Arthur T Knackerbracket has processed the following story:

North Korean nukes look like disco balls, olives, and peanuts, according to a group of scientists and researchers who study nuclear weapons. Newly released research puts the DPRK’s devastating stockpile of quirkily named nightmare machines at around 50. And it could get that number up to 130 by the end of the decade.

The world’s nuclear powers are cagey about the exact nature of their nukes. It’s a weapon you want everyone to know you have, but you don’t necessarily want them to know how many.

Enter the Federation of American Scientists [FAS], a U.S. nonprofit that attempts to use science to make the world a better place. One of its big projects is the Nuclear Notebook, a constantly updating list of the world’s nuclear weapons. Cataloging world-ending weapons is a challenge even in countries like France and the U.S., which have certain amounts of transparency around their arsenals. In North Korea, it’s almost impossible. Almost.

North Korea was not always as closed as it is now. International officials did once visit the country and knowledge from those visits gave the FAS critical information that it used to suss out what, exactly, the DPRK is capable of. North Korea also does a lot of media events that create pictures and videos that help experts figure out the size of its arsenal. Kim Jong-un loves to pose with nukes and launchers in parades.

“Using these resources and other open sources, including commercial satellite imagery and publicly available reports from the [International Atomic Energy Agency] and the UN Panel of Experts on North Korea, analysts at independent organizations have been able to examine industry networks, locate key facilities, and map North Korea’s nuclear fuel cycle to generate estimates of fissile material stockpiles and production—all of which are key factors in assessing the size, sophistication, and status of North Korea’s nuclear arsenal today,” the FAS said in its latest nuclear notebook.

In its research, the FAS identified three kinds of North Korean warheads, which it gave nicknames. There’s the disco ball, which the DPRK first showed off in 2016. Supposedly, this is a single-stage implosion nuke: basically, a big silver ball in which a core of nuclear material is surrounded by high explosives. Detonating the high explosives compresses the core and triggers the nuclear explosion. The design is similar to the nuclear device detonated at the Trinity site, as depicted in Oppenheimer.

In 2017, Kim Jong-un posed with what the FAS dubbed the peanut. This is supposedly a two-stage thermonuclear device, in which a series of nuclear explosions feed off each other to generate a massive blast. FAS said in its report that the peanut might not be a thermonuclear weapon at all, however; it could instead be a device boosted with tritium, which would improve the efficiency of a single-stage design.

In 2023, the DPRK unveiled photos of what the FAS called the olive. The small warhead appeared to be a single-stage nuke similar to the disco ball but designed to fit inside a variety of delivery systems. “North Korea’s display of different devices demonstrates an aspirational progression toward more sophisticated and efficient warhead design,” the FAS said in its research.

Based on the available knowledge, FAS also tried to guess how much nuclear material North Korea has. It then used that number to extrapolate the number of nukes it’s sitting on. “We estimate North Korea could possess up to 81 kilograms of plutonium and 1,800 kilograms of [highly-enriched uranium], which could supply North Korea with enough material to potentially build up to 90 nuclear weapons,” it said.

Its estimates were conservative. “These lower-end projections mean that North Korea could potentially build up to 20 uranium-only design and 33 composite design weapons if using the same fissile material allocations, for a possible capacity to build up to 53 nuclear weapons,” it said. The FAS estimated that the DPRK could build around 6 nukes a year and bring its numbers up to 130 by the end of the decade.
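
To get a feel for how fissile-material stockpiles translate into warhead counts, here is a back-of-envelope Python check. The per-weapon quantities are illustrative assumptions (commonly cited round figures), not the FAS report's own design-specific allocations:

    # Rough cross-check of the upper-bound estimate. The per-weapon material
    # requirements below are ASSUMED round numbers, not FAS's allocations.
    plutonium_kg = 81        # report's upper-bound plutonium estimate
    heu_kg = 1_800           # report's upper-bound highly enriched uranium estimate

    PU_PER_WEAPON_KG = 4     # assumption: oft-cited figure for an implosion device
    HEU_PER_WEAPON_KG = 25   # assumption: IAEA 'significant quantity' for HEU

    weapons = plutonium_kg // PU_PER_WEAPON_KG + heu_kg // HEU_PER_WEAPON_KG
    print(weapons)           # 92 -- the same ballpark as the report's 'up to 90'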

Buried in the report’s scientific research is something more troubling than the nukes themselves: a discussion of how North Korea plans to use them. Some, but not all, countries with nukes maintain something called a “no-first-use policy.” It’s a codified promise that they’ll only use their nukes if someone else attacks them with nukes first. China has a no-first-use policy. The United States and Russia do not.

North Korea once promised it would never use nuclear weapons preemptively, but it’s changed its mind. According to the FAS report, North Korea’s parliament passed a law giving it the right to launch nukes preemptively in 2022. One year later, the North Korean government codified under the country’s constitution its right to ‘deter war and protect regional and global peace by rapidly developing nuclear weapons to a higher level.’


Original Submission

posted by janrinok on Sunday July 21, @01:49AM

Arthur T Knackerbracket has processed the following story:

NASA and Boeing engineers are evaluating results from last week’s engine tests at NASA’s White Sands Test Facility in New Mexico as the team works through plans to return the agency’s Boeing Crew Flight Test from the International Space Station in the coming weeks.

Teams completed ground hot fire testing at White Sands and are working to evaluate the test data and inspect the test engine. The ongoing ground analysis is expected to continue throughout the week. Working with a reaction control system thruster built for a future Starliner spacecraft, ground teams fired the engine under inflight conditions similar to those the spacecraft experienced on the way to the space station. The ground tests also included stress-case firings and replicated the conditions Starliner’s thrusters will experience from undocking to the deorbit burn, when the thrusters fire to slow Starliner and bring it out of orbit for landing in the southwestern United States.

For a detailed overview of the test plans, listen to a replay of the recent media teleconference with NASA and Boeing leadership.

“I am extremely proud of the NASA, Boeing team for their hard work in executing a very complex test series,” said Steve Stich, manager, NASA’s Commercial Crew Program. “We collected an incredible amount of data on the thruster that could help us better understand what is going on in flight. Next, our team has moved into engine tear downs and inspections which will provide additional insight as we analyze the results and evaluate next steps.”

Integrated ground teams also are preparing for an in-depth Agency Flight Test Readiness Review, which will evaluate data related to the spacecraft’s propulsion system performance before its return to Earth. The date of the agency review has not yet been solidified.

NASA and Boeing leadership plan to discuss the testing and analysis work in detail during a media briefing next week. More information on the briefing will be made available soon.


Original Submission