Arthur T Knackerbracket has processed the following story:
There are few bigger names in the world of motorcycles than Harley-Davidson. The brand has been churning out two-wheeled classics left and right for over a century, cultivating a fanbase like no other. Along the way, Harley-Davidson has seen some incredibly important moments in its company history arise, in addition to plenty that have fallen by the wayside. Seeing as it's rarely talked about in the modern era, it's no stretch to say that one of these overlooked points on the Harley timeline is the brand's brief foray into lawn mower technology.
Yes, as it was making its name as one of the United States' foremost motorcycle manufacturers, Harley-Davidson took a detour into landscaping territory. Way back in 1929, the company began providing single-cylinder side-valve engines for the Worthington Mower Company's Overgreen lawn mowers. These massive mowers were used commercially to cut large swaths of grass at sites like golf courses. Harley's engines powered them throughout the Great Depression, though the motorcycle giant didn't stick with lawn mowers for the long haul. Thus, it was never destined to sit alongside the best and worst major lawn mower brands.
It has been nearly 100 years since Harley-Davidson's Worthington collaboration came and went, yet some folks are still keen on trying to power their mowers with the bike favorite's technology.
With modern perceptions of Harley-Davidson as strictly a motorcycle brand, the idea of it contributing to the Worthington Mower Company's Overgreen lawn mowers sounds absurd. After all, throughout its century-long history, it has only branched out to create non-motorcycle products a few times. For instance, it created the LR-64 drone rocket engine for the U.S. military, and took a dip in the marine technology pool with its purchase of the Tomahawk Boating Company in 1961. Still, this hasn't prevented some folks from trying to bring its modern tech back to the realm of landscaping.
Over on YouTube, Specialised Motorcycle Transport showcased what a modern lawn mower could look like with a Harley engine inside. It looks like a conventional riding mower with some unique aesthetic touches, but sounds just like a motorcycle and is designed to reach the speed of one too. This contraption was constructed for racing purposes rather than lawn maintenance. MrOildale on YouTube featured a similar piece on their channel. This orange riding mower — dubbed the "mower cycle" — came equipped with a 1,500cc Harley engine, certainly making it better suited for the racetrack than for cutting grass. It more than likely can't outrun the fastest Harley-Davidson motorcycles ever built, but it'll leave most average riding mowers out there in the dust.
Odds are Harley-Davidson's efforts in terms of lawn mowing are long over, but so long as mechanics can make its engines compatible with the latest riding mowers, chances are its legacy in the space will endure well into the future.
Arthur T Knackerbracket has processed the following story:
In an earlier article, I discussed a few of the flaws in Europe’s flagship data privacy law, the General Data Protection Regulation (GDPR). Building on that critique, I would now like to go further, proposing specifications for developing a robust privacy protection regime in the US.
Writers must overcome several hurdles to have a chance at persuading readers about possible flaws in the GDPR. First, some readers are skeptical of any piece criticizing the GDPR because they believe the law is still too young to evaluate. Second, some are suspicious of any piece criticizing the GDPR because they suspect that the authors might be covert supporters of Big Tech’s anti-GDPR agenda. (I can assure readers that I do not, and never have, worked to support any agenda of Big Tech companies.)
In this piece, I will highlight the price of ignoring the GDPR. Then, I will present several conceptual flaws of the GDPR that have been acknowledged by one of the lead architects of the law. Next, I will propose certain characteristics and design requirements that countries like the United States should consider when developing a privacy protection law. Lastly, I will provide a few reasons why everyone should care about this project.
People sometimes assume that the GDPR is mostly a “bureaucratic headache”—but this perspective is no longer valid. Consider the following actions by administrators of the GDPR in different countries.
In other words, the GDPR is not merely a bureaucratic matter; it can trigger hefty, unexpected fines. The notion that the GDPR can be ignored is a fatal error.
Axel Voss is one of the lead architects of the GDPR. He is a member of the European Parliament and authored the 2011 initiative report titled “Comprehensive Approach to Personal Data Protection in the EU” when he was the European Parliament's rapporteur. His call for action resulted in the development of the GDPR legislation. After observing the unfulfilled promises of the GDPR, Voss wrote a position paper highlighting the law's weaknesses. I want to mention nine of the flaws that Voss described.
First, while the GDPR was excellent in theory and pointed a path toward the improvement of standards for data protection, it is an overly bureaucratic law created largely using a top-down approach by EU bureaucrats.
Second, the law is based on the premise that data protection should be a fundamental right of EU persons. Hence, its stipulations are absolute, one-sided, and laser-focused on protecting the "fundamental rights and freedoms" of natural persons. In making this choice, the GDPR architects took the relationship between the state and the citizen and applied it to the relationship between citizens and companies, and to the relationship between companies and their peers. This construction is one reason the obligations imposed on data controllers and processors are so rigid.
Third, the GDPR aims to empower data subjects by enshrining their rights into law. Specifically, it enshrines nine data subject rights: the right to be informed, the right to access, the right to rectification, the right to be forgotten/to erasure, the right to data portability, the right to restrict processing, the right to object to the processing of personal data, the right to object to automated processing, and the right to withdraw consent. As with any list, there is always a concern that some rights may be missing; if critical rights are omitted, the law's effectiveness in protecting privacy and data protection is hindered. Indeed, the data subject rights protected by the GDPR are not exhaustive.
Fourth, the GDPR is grounded on a prohibition and limitation approach to data protection. For example, the principle of purpose limitation excludes chance discoveries in science. This ignores the reality that current technologies, e.g., machine learning and artificial intelligence applications, function differently. Hence, old data protection mindsets, such as data minimization and storage limitation, are no longer workable.
Fifth, the GDPR, on principle, posits that every processing of personal data restricts the data subject’s right to data protection. It requires, therefore, that each of these processes needs a justification based on the law. The GDPR deems any processing of personal data as a potential risk and forbids its processing in principle. It only allows processing if a legal ground is met. Such an anti-processing and anti-sharing approach may not make sense in a data-driven economy.
Sixth, the law does not distinguish between low-risk and high-risk applications, imposing the same obligations on every type of data processing application, with a few exceptions that require consulting the data protection authority for high-risk applications.
Seventh, the GDPR also excludes exemptions for low-risk processing scenarios or when SMEs, startups, non-commercial entities, or private citizens are the data controllers. Further, there are no exemptions or provisions that protect the rights of the controller and of third parties for such scenarios in which the data controller has a legitimate interest in protecting business and trade secrets, fulfilling confidentiality obligations, or the economic interest in avoiding huge and disproportionate efforts to meet GDPR obligations.
Eighth, the GDPR lacks a mechanism that allows SMEs and startups to shift the compliance burden onto third parties, which then store and process data.
Ninth, the GDPR relies heavily on government-based bureaucratic monitoring and administration of privacy compliance. This means an extensive bureaucratic system is needed to manage the compliance regime.
There are other issues with GDPR enforcement (see pieces by Matt Burgess and Anda Bologa) and its negative impacts on the EU’s digital economy and on Irish technology companies. This piece will focus only on the nine flaws described above. These nine flaws are some of the reasons why the US authorities should not simply copy the GDPR.
The good news is that many of these flaws can be resolved.
A SpaceX Dragon spacecraft lifted off from the Cape Canaveral Space Force Station in Florida at 1:17 p.m., according to NASA. It will take the Crew-9 mission 28.5 hours to dock at the ISS.
SpaceX's Crew Dragon left Earth with two empty seats for astronauts Butch Wilmore and Suni Williams, who have been aboard the ISS since June. The pair flew Boeing's first crewed Starliner mission to space.
[...] NASA said the Crew-9 mission has safely reached orbit and the nosecone has opened.
SpaceX launches rescue mission for NASA astronauts stuck in space until next year:
CAPE CANAVERAL, Fla. — SpaceX launched a rescue mission for the two stuck astronauts at the International Space Station on Saturday, sending up a downsized crew to bring them home but not until next year.
[...] Since NASA rotates space station crews approximately every six months, this newly launched flight with two empty seats reserved for Wilmore and Williams won't return until late February. Officials said there wasn't a way to bring them back earlier on SpaceX without interrupting other scheduled missions.
By the time they return, the pair will have logged more than eight months in space. They expected to be gone just a week when they signed up for Boeing's first astronaut flight that launched in June.
[...] Williams has since been promoted to commander of the space station, which will soon be back to its normal population of seven. Once Hague and Gorbunov arrive this weekend, four astronauts living there since March can leave in their own SpaceX capsule. Their homecoming was delayed a month by Starliner's turmoil.
Hague noted before the flight that change is the one constant in human spaceflight.
"There's always something that is changing. Maybe this time it's been a little more visible to the public," he said.
Hague was thrust into the commander's job for the rescue mission based on his experience and handling of a launch emergency six years ago. The Russian rocket failed shortly after liftoff, and the capsule carrying him and a cosmonaut catapulted off the top to safety.
Rookie NASA astronaut Zena Cardman and veteran space flier Stephanie Wilson were pulled from this flight after NASA opted to go with SpaceX to bring the stuck astronauts home. The space agency said both would be eligible to fly on future missions. Gorbunov remained under an exchange agreement between NASA and the Russian Space Agency.
[...] SpaceX has long been the leader in NASA's commercial crew program, established as the space shuttles were retiring more than a decade ago. SpaceX beat Boeing in delivering astronauts to the space station in 2020 and it's now up to 10 crew flights for NASA.
[...] Delayed by Hurricane Helene pounding Florida, the latest SpaceX liftoff marked the first for astronauts from Launch Complex 40 at Cape Canaveral Space Force Station. SpaceX took over the old Titan rocket pad nearly two decades ago and used it for satellite launches, while flying crews from Kennedy's former Apollo and shuttle pad next door. The company wanted more flexibility as more Falcon rockets soared.
Arthur T Knackerbracket has processed the following story:
As expected, Winamp's source code has been publicly released on GitHub. The Winamp for Windows project, as it's officially called by Llama Group, will receive a few updates per year, adding new features and ensuring proper security practices. Just don't get your hopes up about creating alternative projects from this "freely" available code.
Llama Group is looking to the GitHub community for help in developing new capabilities and maintaining (or even modernizing) a codebase that dates back several decades. Originally introduced in 1997, Winamp has long been praised for its flexibility and broad compatibility with various audio formats.
[...] Winamp's source code is now freely available, but the license it was released under has sparked controversy. The Winamp Collaborative License imposes significant restrictions on what people can do with the code, including a ban on releasing modified third-party versions. The license explicitly states that no public forks are permitted, and only the maintainers of the official repository are allowed to release the software or any new (approved) modifications.
Llama Group appears eager to benefit from community contributions to Winamp, but the software itself cannot be repurposed to create something new. Despite the legal threat, developers are undeterred, with hundreds of "unofficial" forks of Winamp's source code already surfacing online. As time goes on, this trend is only expected to grow.
Legal concerns aside, it seems clear that Llama Group no longer wants the full responsibility of maintaining the still-popular PC media player. The company has shifted its focus to different business ventures, including cloud-based products that leverage the "Winamp" brand to capitalize on music creators and streaming services.
Arthur T Knackerbracket has processed the following story:
A six-year investigation into the vast Thwaites glacier in Antarctica has concluded with a grim outlook on its future.
Often dubbed the “doomsday glacier”, this huge mass of ice is comparable in size to Britain or Florida and its collapse alone would raise sea levels by 65 centimetres. Worse still, this is expected to trigger a more widespread loss of the ice sheet covering West Antarctica, causing a calamitous sea level rise of 3.3 metres and threatening cities like New York, Kolkata and Shanghai.
It is an extremely remote and difficult area to get to, but the International Thwaites Glacier Collaboration (ITGC), a joint UK-US research programme, has managed to deploy 100 scientists there over the past six years, using planes, ships and underwater robots to study the dynamics of this ice in detail. “It was a tremendous challenge, and yet we really learned a lot,” says Ted Scambos at University of Colorado Boulder.
These discoveries include the fact that Thwaites glacier is particularly vulnerable, as it rests on a bed of rock that is well below sea level and is being melted from the underside by warmer seawater. What’s more, the bedrock slopes downwards towards the interior of the ice sheet, so, as the glacier retreats, even more ice is exposed to warm seawater, threatening to accelerate the collapse.
[...] “It’s not going to instantaneously lead to a catastrophic retreat in the next year or the year after, but, at the same time, we are very sure that Thwaites is going to continue to retreat, and ultimately the retreat is going to accelerate,” says Rob Larter at the British Antarctic Survey, another member of the team. “We can’t put an exact time frame on that.”
Ultimately, however, the ITGC researchers think that, by the end of the 23rd century, Thwaites glacier and much of the West Antarctic ice sheet might be lost.
On Monday, OpenAI CEO Sam Altman outlined his vision for an AI-driven future of tech progress and global prosperity in a new personal blog post titled "The Intelligence Age." The essay paints a picture of human advancement accelerated by AI, with Altman suggesting that superintelligent AI could emerge within the next decade.
"It is possible that we will have superintelligence in a few thousand days (!); it may take longer, but I'm confident we'll get there," he wrote.
OpenAI's current goal is to create AGI (artificial general intelligence), which is a term for hypothetical technology that could match human intelligence in performing many tasks without the need for specific training. By contrast, superintelligence surpasses AGI, and it could be seen as a hypothetical level of machine intelligence that can dramatically outperform humans at any intellectual task, perhaps even to an unfathomable degree.
[...]
Despite the criticism, it's notable when the CEO of what is probably the defining AI company of the moment makes a broad prediction about future capabilities—even if that means he's perpetually trying to raise money. Building infrastructure to power AI services is foremost on many tech CEOs' minds these days. "If we want to put AI into the hands of as many people as possible," Altman writes in his essay, "we need to drive down the cost of compute and make it abundant (which requires lots of energy and chips). If we don't build enough infrastructure, AI will be a very limited resource that wars get fought over and that becomes mostly a tool for rich people."
[...]
While enthusiastic about AI's potential, Altman urges caution, too, but vaguely. He writes, "We need to act wisely but with conviction. The dawn of the Intelligence Age is a momentous development with very complex and extremely high-stakes challenges. It will not be an entirely positive story, but the upside is so tremendous that we owe it to ourselves, and the future, to figure out how to navigate the risks in front of us."
[...]
"Many of the jobs we do today would have looked like trifling wastes of time to people a few hundred years ago, but nobody is looking back at the past, wishing they were a lamplighter," he wrote. "If a lamplighter could see the world today, he would think the prosperity all around him was unimaginable. And if we could fast-forward a hundred years from today, the prosperity all around us would feel just as unimaginable."
Related Stories on Soylent News:
Plan Would Power New Microsoft AI Data Center From Pa.'s Three Mile Island 'Unit 1' Nuclear Reactor - 20240921
Artificial Intelligence 'Godfather' on AI Possibly Wiping Out Humanity: 'It's Not Inconceivable' - 20230329
Microsoft Research Paper Claims Sparks of Artificial Intelligence in GPT-4 - 20230327
John Carmack's 'Different Path' to Artificial General Intelligence - 20230213
Just in time for spooky season:
If you're thinking of going down the unconventional route for this year's Halloween costume, we've got just the inspiration for you: a brand-new species of chimaera, better known as a ghost shark or spookfish.
Scientists discovered the new species, which goes by the scientific name Harriotta avia – avia meaning "grandmother" in Latin, in honor of study author Dr Brit Finucci's grandmother – and the common name Australasian Narrow-nosed Spookfish, off the coast of New Zealand and Australia.
This region is no stranger to ghost sharks – the study describing the new species dubs it "a global diversity [hotspot] for chimaeroids" – but it was previously thought that this particular species wasn't actually its own species at all. Instead, it was believed to be a variant of another species, Harriotta raleighana, which is found across the globe.
But with the help of genetics and a closer inspection of its morphology, the researchers were able to identify H. avia as a species in its own right.
"Harriotta avia is unique due to its elongated, narrow and depressed snout; long, slender trunk; large eyes; and very long, broad pectoral fins," said Finucci in a statement. "It is a lovely chocolate brown colour."
[...] "Ghost sharks like this one are largely confined to the ocean floor, living in depths of up to 2,600 [meters] [8530 feet]," said Finucci. The level of pressure to be found at such depths isn't exactly human-friendly.
"Their habitat makes them hard to study and monitor, meaning we don't know a lot about their biology or threat status, but it makes discoveries like this even more exciting," Finucci explained.
What we do know is that, despite the nickname, ghost sharks aren't actually sharks at all. Whilst the two groups are still related to each other, it's thought that they diverged from one another nearly 400 million years ago. Both remained cartilaginous, but ghost sharks have wound up with several physical differences from their relatives.
Journal Reference:
Finucci, Brittany, Didier, Dominique, Ebert, David A., et al. Harriotta avia sp. nov. – a new rhinochimaerid (Chimaeriformes: Rhinochimaeridae) described from the Southwest Pacific, Environmental Biology of Fishes (DOI: 10.1007/s10641-024-01577-4)
Motor Trend is running a piece on the systems in the recently released Mercedes-Benz "Drive Pilot 95", https://www.motortrend.com/reviews/mercedes-benz-drive-pilot-95-first-drive-review
Here are a few of the details I found interesting:
By the end of this year, pending final certification from the authorities, German customers will be able to order an upgraded version called Drive Pilot 95, which, under certain operating conditions, will allow their S-Class and EQS models to self-drive for an indefinite period in the right lane of autobahns at speeds of up to 95 km/h (59 mph). [The earlier version from 2022 only worked in congested traffic up to 65 km/h (40 mph).]
[...]
Why has it taken so long to implement a software tweak? Well, both the German legislators and Mercedes-Benz, which assumes legal responsibility for the functioning of its vehicles while they are operating in Level 3 autonomous drive mode, are cautious. The Silicon Valley 'move fast and break things' approach doesn't work for them.
[...]
In addition to the parking sensors in the front and rear bumpers and the 360 degree cameras in the rear view mirrors that are fitted to many Mercedes-Benz models, Drive Pilot equipped cars have multi-mode radars at each corner, a front-facing long-range radar and a lidar unit behind the grille, a stereo camera at the top of the windshield, a regular camera facing rearward through the backlight, and a moisture sensor in the front wheel well.
The rear-facing camera is used to detect the flashing lights of emergency vehicles approaching from behind, though the 'Hey Mercedes' voice activation microphone in the cabin will pick up the sound of the sirens even if the vehicle cannot be seen. The moisture detector, which measures the sound level of the spray from the tire on wet roads, is used to determine whether rain and spray could interfere with the camera, radar and the lidar systems.
[...]
In simple terms, the key difference between the original Drive Pilot system and Drive Pilot 95 is the latter will now operate autonomously at Level 3 for an indefinite period if the Mercedes-Benz is in the right lane of the autobahn and is following traffic traveling at no more than 95km/h. Without that traffic, which can be up to 1000 feet ahead, the system will not activate.
This is where the trucks come in: The traffic on German autobahns that most consistently conforms to that pattern is the swarm of semis constantly crisscrossing the country. "The trucks are generally limited to 80 km/h (50 mph)," says Drive Pilot test engineer Jochen Haab, "but they usually travel at about 90 km/h, and up to 95 km/h on downhills."
Drive Pilot 95 could operate without having to follow traffic, Haab says, but making that part of its operational design domain provides an additional safety redundancy: If there is traffic ahead, and it is moving, the car knows for certain the road ahead is clear without needing to process more data to double check.
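Pulling the activation rules above together, here's a minimal sketch of the decision logic as the article describes it. All function names, parameters, and thresholds are illustrative assumptions for this summary; this is not Mercedes-Benz code.

```python
from typing import Optional

# Illustrative constants taken from the article's description (assumed units).
MAX_SPEED_KPH = 95        # system operates at up to 95 km/h (59 mph)
MAX_LEAD_GAP_M = 305      # lead traffic may be "up to 1000 feet" (~305 m) ahead

def drive_pilot_95_may_engage(in_right_lane: bool,
                              lead_gap_m: Optional[float],
                              lead_speed_kph: Optional[float]) -> bool:
    """Hypothetical check of whether Level 3 mode may activate.

    Per the article: the car must be in the right lane of the autobahn
    and following moving traffic; without traffic ahead, the system
    will not activate.
    """
    if not in_right_lane:
        return False
    if lead_gap_m is None or lead_speed_kph is None:
        return False  # no traffic ahead to follow -> no activation
    return lead_gap_m <= MAX_LEAD_GAP_M and lead_speed_kph <= MAX_SPEED_KPH
```

Treating "traffic ahead" as a hard precondition mirrors Haab's point: a moving lead vehicle gives the car certainty that the road ahead is clear without extra sensor processing.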
A highly precise positioning antenna mounted in the roof enables the car to know, to within a fraction of an inch, exactly where it is in terms of its absolute position, its relative position, and its position correlated to carefully measured landmarks on an HD map built from data collected by Mercedes-Benz engineers who drove every single mile of Germany's 8,196-mile autobahn network in both directions and in every lane.
Still not a fan of Level 3, which requires the driver to be ready to accept a handoff, but this system gives the human 10 seconds to take control.
Many other interesting details in the link.
This seems to be the definitive answer to one of the original questions about self-driving:
"Mercedes-Benz, which assumes legal responsibility for the functioning of its vehicles while they are operating in Level 3 autonomous drive mode..."
http://www.righto.com/2024/09/ramtron-ferroelectric-fram-die.html
Ferroelectric memory (FRAM) is an interesting storage technique that stores bits in a special "ferroelectric" material. Ferroelectric memory is nonvolatile like flash memory, able to hold its data for decades. But, unlike flash, ferroelectric memory can write data rapidly. Moreover, FRAM is much more durable than flash and can be written trillions of times. With these advantages, you might wonder why FRAM isn't more popular. The problem is that FRAM is much more expensive than flash, so it is only used in niche applications.
[...] The history of ferroelectric memory dates back to the early 1950s. Many companies worked on FRAM from the 1950s to the 1970s, including Bell Labs, IBM, RCA, and Ford. The 1955 photo below shows a 256-bit ferroelectric memory built by Bell Labs. Unfortunately, ferroelectric memory had many problems, limiting it to specialized applications, and development was mostly abandoned by the 1970s.
Ferroelectric memory had a second chance, though. A major proponent of ferroelectric memory was George Rohrer, who started working on ferroelectric memory in 1968. He formed a memory company, Technovation, which was unsuccessful, and then cofounded Ramtron in 1984. Ramtron produced a tiny 256-bit memory chip in 1988, followed by much larger memories in the 1990s.
Arthur T Knackerbracket has processed the following story:
Lasso peptides are natural products made by bacteria. Their unusual lasso shape endows them with remarkable stability, protecting them from extreme conditions. In a new study, published in Nature Chemical Biology, researchers have constructed and tested models for how these peptides are made and demonstrated how this information might be used to advance lasso peptide-based drugs into the clinic.
"Lasso peptides are interesting because they are basically linear molecules that have been tied into a slip knot-like shape," said Susanna Barrett, a graduate student in the Mitchell lab (MMG). "Due to their incredible stability and engineerability, they have a lot of potential as therapeutics. They have also been shown to have antibacterial, antiviral, and anti-cancer properties."
Lasso peptides are ribosomally synthesized and post-translationally modified molecules. The peptide chains are formed from joining amino acids together in the form of a string, which is done by the ribosome. Two enzymes, a peptidase and a cyclase, then collaborate to convert a linear precursor peptide into the distinctive knotted lasso structure. Since their discovery over three decades ago, scientists have been trying to understand how the cyclase folds the lasso peptide.
"One of the major challenges of solving this problem has been that the enzymes are difficult to work with. They are generally insoluble or inactive when you attempt to purify them," Barrett said.
One rare counterexample is fusilassin cyclase, or FusC, which the Mitchell lab characterized in 2019. Former group members were able to purify the enzyme, and since then, it has served as a model to understand the lasso knot-tying process. Yet, the structure of FusC remained unknown, making it impossible to understand how the cyclase interacts with the peptide to fold the knot.
In the current study, the group used the artificial intelligence program AlphaFold to predict the FusC protein structure. They used the structure and other artificial intelligence-based tools, like RODEO, to pinpoint which cyclase active site residues were important for interacting with the lasso peptide substrate.
"FusC is made up of approximately 600 amino acids and the active site contains 120. These programs were instrumental to our project because they allowed us to do 'structural studies' and whittle down which amino acids are important in the active site of the enzyme," Barrett said.
They also used molecular dynamics simulations to computationally understand how the lasso is folded by the cyclase. "Thanks to the computing power of Folding@home, we were able to collect extensive simulation data to visualize the interactions at the atomic level," said Song Yin, a graduate student in the Shukla lab. "Before this study, there were no MD simulations of the interactions between lasso peptides and cyclases, and we think this approach will be applicable to many other peptide engineering studies."
From their computational efforts, the researchers found that among different cyclases, the backwall region of the active site seemed to be especially important for folding. In FusC, this corresponded to the helix 11 region. The researchers then carried out cell-free biosynthesis where they added all the cell components that are necessary for the synthesis of the lasso peptides to a test tube with enzyme variants that had different amino acids in the helix 11 region. Ultimately, they identified a version of FusC with a mutation on helix 11 that could fold lasso peptides which cannot be made by the original cyclase. This data confirms the model for lasso peptide folding that the researchers developed with their computational approaches.
"How enzymes tie a lasso knot is a fascinating question. This study provides a first glimpse of the biophysical interactions responsible for producing this unique structure," said Diwakar Shukla, an associate professor of chemical and biomolecular engineering.
"We also showed that these molecular contacts are the same in several different cyclases across different phyla. Even though we have not tested every system, we believe it's a generalizable model," Barrett said.
Collaborating with the San Diego-based company Lassogen, the researchers showed that the new insights can guide cyclase engineering to generate lasso peptides that otherwise cannot be made. As a proof-of-concept, they engineered a different cyclase, called McjC, to efficiently produce a potent inhibitor of a cancer-promoting integrin.
"The ability to generate lasso peptide diversity is important for optimizing drugs," said Mark Burk, CEO of Lassogen. "The enzymes from nature do not always allow us to produce the lasso peptides of interest and the ability to engineer lasso cyclases greatly expands the therapeutic utility of these amazing molecules."
"Our work would not have been possible without access to powerful computing and recent advances in artificial intelligence and cell-free biosynthetic methods," said Douglas Mitchell, John and Margaret Witt Professor of Chemistry. "This work is an extraordinary example of how interdisciplinary collaborations are catalyzed at the Carl R. Woese Institute for Genomic Biology."
Journal information: Nature Chemical Biology
More information: Susanna E. Barrett et al, Substrate interactions guide cyclase engineering and lasso peptide diversification, Nature Chemical Biology (2024). DOI: 10.1038/s41589-024-01727-w
Investors sold after the investment bank's analysts warned about what they called the 'China butterfly effect':
Shares of General Motors and Ford Motor traded lower on Wednesday after Morgan Stanley downgraded the overall U.S. auto sector, citing worries that Western automakers might struggle in the intensifying competition with Chinese rivals.
General Motors was downgraded to "underweight" from "equal weight," and its shares fell 5.4 percent, to $45.50. Ford went to "equal weight" from "overweight," with its shares dropping more than 4 percent, to $10.43.
Electric vehicle (EV) maker Rivian Automotive and Canadian parts manufacturer Magna International were both downgraded to "equal weight" from "overweight." Shares of Rivian were down 5.7 percent while Magna's were off 4.7 percent.
Investors sold after Morgan Stanley analysts warned about what they called the "China butterfly effect," a metaphor suggesting that even small surges in China's industrial production capacity could have significant ripple effects across the global market.
[...] Bolstered in part by massive government subsidies, Chinese manufacturers have rapidly emerged as major players in the EV industry, accounting for 60 percent of worldwide EV sales and almost one in five EVs sold in Europe last year.
Both Washington and Brussels have hiked tariffs in response to China's excess production of low-price EVs.
Previously:
Arthur T Knackerbracket has processed the following story:
UK government IT contracts worth £23.4 billion are due to end during the current five-year Parliament, according to researchers who warn that poor-performing suppliers are hardly ever excluded from bidding again.
A report by public spending research company Tussell and the Institute for Government found that a third of these, worth £9 billion, are supposed to finish up in 2025.
The report points out that large contracts expiring next year include the longstanding Post Office deal with Fujitsu to build and manage the Horizon IT system at the center of one of the greatest miscarriages of justice in the UK. From 1999 until 2015, 736 local branch managers were wrongfully convicted of fraud when errors in the system were to blame. The total value of the Horizon contract is £2.38 billion ($3.15 billion). It is due to expire on March 31, 2025.
[...] The researchers warn that poor-performing suppliers to UK government are virtually never excluded from supplying the public sector and often continue to receive government money. Meanwhile, a large number of contracts, totaling billions of pounds, are overseen by officials who are not commercial specialists.
The report also highlights that poor data across government departments meant officials didn't know how much they were spending and with whom. And new providers that could perhaps deliver better services for less money are discouraged from bidding for business.
[...] "Public procurement is a huge market hiding in plain sight, accounting for approximately one-third of all public spending and 10 percent of UK GDP," said Gus Tugendhat, founder of Tussell.
"In the context of tight budgets and strained public services, getting value for money out of government contracts is more important than ever," he said.
There is a fair, and long-running, body of research showing that playing Tetris helps people deal with trauma and PTSD of various kinds. More recent research adds to this, showing it can reduce PTSD symptoms in healthcare workers (nurses) who cared for trauma COVID-19 patients.
Playing something such as Tetris (it's a bit unclear whether it's just Tetris or a similar style of game, of which Tetris is the prime example) can induce a relaxing, zen-like state, or serve as a "cognitive vaccine". Twenty minutes is apparently the prescribed dosage of rotational healing experience, preceded by 15 minutes of talking before playing Tetris. But clearly the healing power of Tetris is at work ...
The study was carried out with healthcare workers in Sweden who worked with COVID-19 patients and were exposed to work-related trauma. It was conducted during the COVID-19 pandemic between September 2020 and April 2022. A total of 164 participants were included. Participants were recruited through information at workplaces. Participation was entirely voluntary. The criterion for participation was that the person had at least two intrusive memories per week due to traumatic events that occurred at work.
https://www.uu.se/en/press/press-releases/2024/2024-09-20-ptsd-symptoms-can-be-reduced-through-treatment-including-a-video-game
https://www.psych.ox.ac.uk/news/tetris-used-to-prevent-post-traumatic-stress-symptoms
Arthur T Knackerbracket has processed the following story:
A truck full of lithium-ion batteries is burning in Los Angeles, shutting down ports and a bridge. It’s not clear what the batteries were for — but LA’s Vincent Thomas Bridge, leading to the Port of Los Angeles and the next-door Port of Long Beach, has been shut down for at least 15 hours now while local firefighters let the truck burn. State Route 47 was also closed in both directions as of a couple of hours ago.
Amazingly, a local towing company caught the explosion on camera from a nearby drone:
Both the Port of Los Angeles and the Port of Long Beach have shut down a number of terminals while the fire continues to burn. As of 12:10PM PT on Friday, the truck was still on fire, and both the ports and bridge were still closed, Los Angeles Fire Department (LAFD) spokesperson Ren Medina told The Verge.
Firefighters are nearby and are actively monitoring the situation; as of 10PM PT on Thursday, the fire was expected to last “at least another 24-48 hours.”
As we’ve seen with several EV battery fires, big concentrated lithium battery fires can be very difficult to put out: firefighters sometimes douse them with thousands of gallons of water only to see the fire restart as additional battery cells heat up to the point that they combust. Once a cell gets hot enough, it’s said to go into “thermal runaway,” at which point it can sometimes restart a fire. The LAFD confirms this is a case of thermal runaway.
EV packs are particularly dense with cells, but we don’t yet know if they were involved here — the LA Fire Department spokesperson says it’s not clear who owns the truck, let alone what it was carrying. The LAFD could only confirm they are lithium-ion batteries at this point.
A senior RedMonk analyst tried to prove that shifting to proprietary licenses *doesn't* improve financial outcomes. But what's interesting is the reactions she got -- from a VC at OSS Capital, ex-Googlers, Chef's co-founder, and even Taylor Dolezal, head of ecosystem at the Cloud Native Computing Foundation. Plus analyst Lawrence Hecht, who concluded "these companies are nowhere closer to being profitable than before."
There are new quotes from the analyst herself. ("I asked Stephens if she thought the analysis would have an impact in the future on companies considering moves to proprietary licensing. 'I doubt it,' Stephens replied...") And Hecht pounds away at the missteps. ("The assumption has been that closing a company's license will allow the companies to increase their margins among their existing customers... The percentage of companies using a given technology is not changing... Elasticsearch fell from 14% to 13%..")
Interestingly, the study hits right as Elastic is switching *back* to an open-source license. Elastic weighs in in this article too...
It's the discussion about open source licensing that really needed to happen.