
Idiosyncratic use of punctuation - which of these annoys you the most?

  • Declarations and assignments that end with }; (C, C++, Javascript, etc.)
  • (Parenthesis (pile-ups (at (the (end (of (Lisp (code))))))))
  • Syntactically-significant whitespace (Python, Ruby, Haskell...)
  • Perl sigils: @array, $array[index], %hash, $hash{key}
  • Unnecessary sigils, like $variable in PHP
  • macro!() in Rust
  • Do you have any idea how much I spent on this Space Cadet keyboard, you insensitive clod?!
  • Something even worse...


posted by hubie on Friday June 09 2023, @07:48PM

A new study by researchers at the University of Rhode Island shows some of the best evidence yet for a feedback loop phenomenon in which species evolution drives ecological change:

The story of the peppered moths is a textbook evolutionary tale. As coal smoke darkened tree bark near England's cities during the Industrial Revolution, white-bodied peppered moths became conspicuous targets for predators and their numbers quickly dwindled. Meanwhile, black-bodied moths, which had been rare, thrived and became dominant in their newly darkened environment.

The peppered moths became a classic example of how environmental change drives species evolution. But in recent years, scientists have begun thinking about the inverse process. Might there be a feedback loop in which species evolution drives ecological change? Now, a new study by researchers at the University of Rhode Island shows some of the best evidence yet for that very phenomenon.

In research published in the Proceedings of the National Academy of Sciences, the researchers show that an evolutionary change in the length of lizards' legs can have a significant impact on vegetation growth and spider populations on small islands in the Bahamas. This is one of the first times, the researchers say, that such dramatic evolution-to-environment effects have been documented in a natural setting.

[...] Armed with specialized lizard wrangling gear—poles with tiny lassos made of dental floss at the end—the team captured hundreds of brown anoles. They then measured the leg length of each lizard, keeping the ones whose limbs were either especially long or especially short and returning the rest to the wild. Once they had distinct populations of short- and long-limbed lizards, they set each population free on islands that previously had no lizards living on them.

Since the experimental islands were mostly covered by smaller diameter vegetation, the researchers expected that the short-legged lizards would be better adapted to that environment, that is, more maneuverable and better able to catch prey in the trees and brush. The question the researchers wanted to answer was whether the ecological effects of those highly effective hunters could be detected.

After eight months, the researchers checked back on the islands to look for ecological differences between islands stocked with the short- and long-legged groups. The differences, it turned out, were substantial. On islands with shorter-legged lizards, populations of web spiders—a key prey item for brown anoles—were reduced by 41% compared to islands with lanky lizards. There were significant differences in plant growth as well. Because the short-legged lizards were better at preying on insect herbivores, plants flourished. On islands with short-legged lizards, buttonwood trees had twice as much shoot growth compared to trees on islands with long-legged lizards, the researchers found.

The results, Kolbe says, help to bring the interaction between ecology and evolution full circle.

Journal Reference:
Kolbe, Jason J. et al, Experimentally simulating the evolution-to-ecology connection: Divergent predator morphologies alter natural food webs, PNAS (2023). DOI: 10.1073/pnas.2221691120


Original Submission

posted by hubie on Friday June 09 2023, @03:03PM

Interesting article relating to Google/OpenAI vs. Open Source for LLMs

Leaked Internal Google Document Claims Open Source AI Will Outcompete Google and OpenAI:

The text below is a very recent leaked document, which was shared by an anonymous individual on a public Discord server who has granted permission for its republication. It originates from a researcher within Google. We have verified its authenticity. The only modifications are formatting and removing links to internal web pages. The document is only the opinion of a Google employee, not the entire firm. We do not agree with what is written below, nor do other researchers we asked, but we will publish our opinions on this in a separate piece for subscribers. We are simply a vessel to share this document, which raises some very interesting points.

We've done a lot of looking over our shoulders at OpenAI. Who will cross the next milestone? What will the next move be?

But the uncomfortable truth is, we aren't positioned to win this arms race and neither is OpenAI. While we've been squabbling, a third faction has been quietly eating our lunch.

I'm talking, of course, about open source. Plainly put, they are lapping us. Things we consider "major open problems" are solved and in people's hands today. Just to name a few:

While our models still hold a slight edge in terms of quality, the gap is closing astonishingly quickly. Open-source models are faster, more customizable, more private, and pound-for-pound more capable. They are doing things with $100 and 13B params that we struggle with at $10M and 540B. And they are doing so in weeks, not months. This has profound implications for us:

  • We have no secret sauce. Our best hope is to learn from and collaborate with what others are doing outside Google. We should prioritize enabling 3P integrations.

  • People will not pay for a restricted model when free, unrestricted alternatives are comparable in quality. We should consider where our value add really is.

  • Giant models are slowing us down. In the long run, the best models are the ones which can be iterated upon quickly. We should make small variants more than an afterthought, now that we know what is possible in the 20B parameter regime.

At the beginning of March the open source community got their hands on their first really capable foundation model, as Meta's LLaMA was leaked to the public. It had no instruction or conversation tuning, and no RLHF. Nonetheless, the community immediately understood the significance of what they had been given.

A tremendous outpouring of innovation followed, with just days between major developments (see The Timeline for the full breakdown). Here we are, barely a month later, and there are variants with instruction tuning, quantization, quality improvements, human evals, multimodality, RLHF, etc. etc. many of which build on each other.

Most importantly, they have solved the scaling problem to the extent that anyone can tinker. Many of the new ideas are from ordinary people. The barrier to entry for training and experimentation has dropped from the total output of a major research organization to one person, an evening, and a beefy laptop.
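
The quantization work mentioned above is a large part of why the barrier to entry dropped: shrinking weights from 32-bit floats to 8 (or fewer) bits is what lets big models fit on consumer hardware. As a toy illustration of the idea only (this is a naive symmetric scheme, not any particular project's method):

```python
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Naive symmetric post-training quantization: one scale per tensor."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights; rounding error is at most scale/2."""
    return q.astype(np.float32) * scale

# A float32 weight matrix shrinks to a quarter of its size plus one scale.
w = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
```

Real schemes quantize per channel or per block and use calibration data, but the storage arithmetic is the same: int8 cuts a float32 model's memory footprint by 4x, which is what puts a 13B-parameter model within reach of a single machine.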

Lots more stuff in the article. It would be interesting to hear from knowledgeable experts what the primary disagreements to these points are and whether you agree or disagree.


Original Submission

posted by janrinok on Friday June 09 2023, @10:13AM

Self-healing code is the future of software development:

One of the more fascinating aspects of large language models is their ability to improve their output through self reflection. Feed the model its own response back, then ask it to improve the response or identify errors, and it has a much better chance of producing something factually accurate or pleasing to its users. Ask it to solve a problem by showing its work, step by step, and these systems are more accurate than those tuned just to find the correct final answer.

While the field is still developing fast, and factual errors, known as hallucinations, remain a problem for many LLM powered chatbots, a growing body of research indicates that a more guided, auto-regressive approach can lead to better outcomes.

This gets really interesting when applied to the world of software development and CI/CD. Most developers are already familiar with processes that help automate the creation of code, detection of bugs, testing of solutions, and documentation of ideas. Several have written in the past on the idea of self-healing code. Head over to Stack Overflow's CI/CD Collective and you'll find numerous examples of technologists putting these ideas into practice.

When code fails, it often gives an error message. If your software is any good, that error message will say exactly what was wrong and point you in the direction of a fix. Previous self-healing code programs are clever automations that reduce errors, allow for graceful fallbacks, and manage alerts. Maybe you want to add a little disk space or delete some files when you get a warning that utilization is at 90%. Or hey, have you tried turning it off and then back on again?
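
The generative-AI version of that loop is easy to sketch. Below is a minimal, hypothetical harness (the names and structure are illustrative, not from the article): it runs a snippet, and if the snippet fails, hands the code plus its own error message to a `suggest_fix` callback, which in practice would be an LLM API call, then tries again.

```python
import subprocess
import sys
import tempfile

def run_snippet(code: str) -> tuple[bool, str]:
    """Execute a Python snippet in a subprocess and capture its stderr."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run([sys.executable, path],
                            capture_output=True, text=True, timeout=30)
    return result.returncode == 0, result.stderr

def self_heal(code: str, suggest_fix, max_attempts: int = 3):
    """Re-run code, feeding each failure back to `suggest_fix` (e.g. an LLM)."""
    for _ in range(max_attempts):
        ok, err = run_snippet(code)
        if ok:
            return code  # converged on a version that runs cleanly
        code = suggest_fix(code, err)  # the model sees its own error message
    return None  # give up and hand the last attempt to a human
```

Note the weak success criterion: "runs without crashing" is a much lower bar than "is correct", which is exactly the code-quality concern engineers keep raising about generated code.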

Developers love automating solutions to their problems, and with the rise of generative AI, this concept is likely to be applied to the creation, maintenance, and improvement of code at an entirely new level.

The ability of LLMs to quickly produce large chunks of code may mean that developers—and even non-developers—will be adding more to the company codebase than in the past. This poses its own set of challenges.

"One of the things that I'm hearing a lot from software engineers is they're saying, 'Well, I mean, anybody can generate some code now with some of these tools, but we're concerned about maybe the quality of what's being generated,'" says Forrest Brazeal, head of developer media at Google Cloud. The pace and volume at which these systems can output code can feel overwhelming. "I mean, think about reviewing a 7,000 line pull request that somebody on your team wrote. It's very, very difficult to do that and have meaningful feedback. It's not getting any easier when AI generates this huge amount of code. So we're rapidly entering a world where we're going to have to come up with software engineering best practices to make sure that we're using GenAI effectively."

"People have talked about technical debt for a long time, and now we have a brand new credit card here that is going to allow us to accumulate technical debt in ways we were never able to do before," said Armando Solar-Lezama, a professor at the Massachusetts Institute of Technology's Computer Science & Artificial Intelligence Laboratory, in an interview with the Wall Street Journal. "I think there is a risk of accumulating lots of very shoddy code written by a machine," he said, adding that companies will have to rethink methodologies around how they can work in tandem with the new tools' capabilities to avoid that.

[Editor's Comment: Much more discussion follows in the linked article.--JR]


Original Submission

posted by janrinok on Friday June 09 2023, @05:33AM
from the statistics-alphabet-soup dept.

Several days ago, a New York Times article titled "How New Rules Turned Back the Clock on Baseball" was posted over at Hacker News. The 2023 Major League Baseball (MLB) season has adopted several rule changes including implementing a pitch clock, limiting pickoff attempts, increasing the size of bases, and banning extreme defensive shifts. The results have been dramatic, with a much faster pace of play and a large increase in stolen bases. It is an effort to undo many trends in the game that have been influenced by the rise of advanced metrics.

Statistics have always been a part of baseball, whether it's trying to hit .400, strike out 300 batters, or trying to hit 60 home runs in a season. In the 1990s, typical statistics to measure hitting success were batting average (BA), home runs (HR), and runs batted in (RBI). Pitchers were evaluated with statistics like strikeouts (K), wins (W), earned run average (ERA), and walks and hits per inning pitched (WHIP). During this era, there was an increase in the amount and type of data collected during games, providing far more details for statisticians to analyze.

Some of these statistics like BA, HR, RBI, K, and W really aren't great indicators of the value of a player. For example, wins are heavily influenced both by a team's lineup and the defense behind a pitcher, so they don't correlate well to the quality of a pitcher. Home runs are valuable to an offense, but it's a count instead of a rate, meaning it's influenced heavily by how many plate appearances a hitter receives and how often the hitter takes walks. Statistics like ERA and WHIP were better because they are expressed as rates, though they were still influenced significantly by the quality of a team's defense. The development of advanced metrics, which are newer and more insightful statistical tools, provided a lot of insight into what is actually valuable to a team's success.

In the present day, statistics like weighted on-base average (wOBA) and wins above replacement (WAR), in addition to many others, are commonly used to measure the value of players. These statistics attempt to determine the true value of each play to a team's success and present them in a single metric. For example, examining the seasonal constants used to calculate wOBA shows that stolen bases aren't particularly valuable even compared to outcomes like taking a walk. It also shows that home runs are more than twice as valuable as a single. This was one factor in changing the typical approach taken by batters.
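
For readers who don't follow sabermetrics, wOBA is essentially a weighted average of a hitter's outcomes, with each weight reflecting how much that outcome contributes to run scoring. A minimal sketch (the weights and stat line are illustrative round numbers on the scale of recent seasonal constants, not official values, and the formula is simplified by not separating out intentional walks):

```python
# Illustrative wOBA ("weighted on-base average") weights. Hypothetical
# round numbers; the real seasonal constants change every year.
WEIGHTS = {"bb": 0.69, "hbp": 0.72, "single": 0.88,
           "double": 1.24, "triple": 1.56, "hr": 2.00}

def woba(bb, hbp, single, double, triple, hr, ab, sf):
    """Weighted sum of a hitter's outcomes per plate appearance (simplified)."""
    num = (WEIGHTS["bb"] * bb + WEIGHTS["hbp"] * hbp
           + WEIGHTS["single"] * single + WEIGHTS["double"] * double
           + WEIGHTS["triple"] * triple + WEIGHTS["hr"] * hr)
    return num / (ab + bb + sf + hbp)

# A hypothetical season: 100 singles, 25 doubles, 3 triples, 20 home runs,
# 50 walks, 5 hit-by-pitches, and 4 sacrifice flies in 500 at-bats.
print(round(woba(bb=50, hbp=5, single=100, double=25, triple=3,
                 hr=20, ab=500, sf=4), 3))  # → 0.361
```

The weights make the article's point directly: a walk (0.69) is worth nearly as much as a single (0.88), while a home run (2.00) is worth more than two singles.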

In some cases, the advanced metrics also differed significantly from conventional wisdom. For many decades, hitters were generally expected to change their hitting approach with two strikes, sacrificing power for just trying to make contact with the baseball. However, advanced metrics revealed that strikeouts weren't much worse for a team's success than groundouts or flyouts. The result was a more aggressive approach to hitting with two-strike counts, accepting much higher strikeout rates in exchange for more doubles, triples, and home runs. Additionally, there is significant value in just getting on base, and walks (BB) are valued almost as much as singles. Hitters generally swung more aggressively at pitches inside the strike zone but also avoided chasing pitches outside of the strike zone.

The result was a trend toward an increase in the three true outcomes (HR, K, and BB), which are plays where only the pitcher and catcher are involved in the defense. In front offices, nerds displaced people who had significant experience playing baseball, because teams coveted their skills in processing and analyzing data. But for many fans, the game had become much less interesting, with slower games and less action involving defense and baserunning. Baseball had been largely optimized with more data collection and many advanced metrics to evaluate players, but the result was a boring product for fans.

There's no way to take the analytics out of baseball, and teams aren't going to start replacing the nerds in front offices with people who have more playing experience. Instead, MLB introduced several new rules this season designed to make the game more entertaining and reduce the negative impacts from expanded use of advanced metrics. Although it has not completely reverted the game of baseball back to the 1990s, the statistics in the New York Times article show that the rule changes have created a faster-paced game with more baserunning.


Original Submission

posted by martyb on Friday June 09 2023, @12:45AM
from the even-their-trees-try-to-kill-you dept.

IMB researchers have identified a unique pain pathway targeted by a notorious Australian stinging tree and say it could point the way to new, non-opioid pain relief:

Professor Irina Vetter and her team have studied how toxins in the venom of the Gympie-Gympie tree cause intense pain that can last for weeks.

[...] "The gympietide toxin in the stinging tree has a similar structure to toxins produced by cone snails and spiders, but the similarity ends there," Professor Vetter said.

"This toxin causes pain in a way we've never seen before."

Many toxins cause pain by binding directly to sodium channels in sensory nerve cells, but the UQ researchers have found the gympietide toxin needs assistance to bind.

"It requires a partner protein called TMEM233 to function and in the absence of TMEM233 the toxin has no effect," Professor Vetter said.

"This was an unexpected finding and the first time we've seen a toxin that requires a partner to impact sodium channels."

The team is working to understand whether switching off this pain mechanism might lead to the development of new painkillers.

"The persistent pain the stinging tree toxins cause gives us hope that we can convert these compounds into new painkillers or anaesthetics which have long-lasting effects," Professor Vetter said.

Journal Reference:
Sina Jami, Jennifer R. Deuis, Tabea Klasfauseweh, et al. Pain-causing stinging nettle toxins target TMEM233 to modulate NaV1.7 function (https://doi.org/10.1038/s41467-023-37963-2)


Original Submission

posted by martyb on Thursday June 08 2023, @10:09PM
from the here's-the-rest-of-the-story dept.

Snowden Ten Years Later - Schneier on Security:

Snowden Ten Years Later

In 2013 and 2014, I wrote extensively about new revelations regarding NSA surveillance based on the documents provided by Edward Snowden. But I had a more personal involvement as well.

I wrote the essay below in September 2013. The New Yorker agreed to publish it, but the Guardian asked me not to. It was scared of UK law enforcement, and worried that this essay would reflect badly on it. And given that the UK police would raid its offices in July 2014, it had legitimate cause to be worried.

Now, ten years later, I offer this as a time capsule of what those early months of Snowden were like.

It’s a surreal experience, paging through hundreds of top-secret NSA documents. You’re peering into a forbidden world: strange, confusing, and fascinating all at the same time.

I had flown down to Rio de Janeiro in late August at the request of Glenn Greenwald. He had been working on the Edward Snowden archive for a couple of months, and had a pile of more technical documents that he wanted help interpreting. According to Greenwald, Snowden also thought that bringing me down was a good idea.

It made sense. I didn’t know either of them, but I have been writing about cryptography, security, and privacy for decades. I could decipher some of the technical language that Greenwald had difficulty with, and understand the context and importance of various documents. And I have long been publicly critical of the NSA’s eavesdropping capabilities. My knowledge and expertise could help figure out which stories needed to be reported.

I thought about it a lot before agreeing. This was before David Miranda, Greenwald’s partner, was detained at Heathrow airport by the UK authorities; but even without that, I knew there was a risk. I fly a lot—a quarter of a million miles per year—and being put on a TSA list, or being detained at the US border and having my electronics confiscated, would be a major problem. So would the FBI breaking into my home and seizing my personal electronics. But in the end, that made me more determined to do it.

I did spend some time on the phone with the attorneys recommended to me by the ACLU and the EFF. And I talked about it with my partner, especially when Miranda was detained three days before my departure. Both Greenwald and his employer, the Guardian, are careful about whom they show the documents to. They publish only those portions essential to getting the story out. It was important to them that I be a co-author, not a source. I didn’t follow the legal reasoning, but the point is that the Guardian doesn’t want to leak the documents to random people. It will, however, write stories in the public interest, and I would be allowed to review the documents as part of that process. So after a Skype conversation with someone at the Guardian, I signed a letter of engagement.

And then I flew to Brazil.

The story concludes:

[...] But now it’s been a decade. Everything he knows is old and out of date. Everything we know is old and out of date. The NSA suffered an even worse leak of its secrets by the Russians, under the guise of the Shadow Brokers, in 2016 and 2017. The NSA has rebuilt. It again has capabilities we can only surmise.

This essay previously appeared in an IETF publication, as part of an Edward Snowden ten-year retrospective.

EDITED TO ADD (6/7): Conversation between Snowden, Greenwald, and Poitras.

Posted on June 6, 2023 at 7:17 AM


Original Submission

posted by martyb on Thursday June 08 2023, @07:15PM
from the *BIG*-deal dept.

Preparing for the Incoming Computer Shopper Tsunami

There's no way for me to know where your awareness starts with all this, so let's just start at the beginning.

Computer Shopper was a hell of a magazine. I wrote a whole essay about it, which can be summarized as "this magazine got to be very large, very extensive, and probably served as the unofficial 'bible' of the state of hardware and software to the general public throughout the 1980s and 1990s." While it was just a pleasant little computer tabloid when it started in 1979, it quickly grew to a page count that most reasonable people would define as "intimidating".

[...] So, there I was whining online about how it was 2023 and nobody seemed to be scanning in Computer Shopper and we were going to be running into greater and greater difficulty to acquire and process them meaningfully, and I finally, stupidly said that if we happened on a somewhat-complete collection, I'd figure out how to do it.

And then an ebay auction came up that seemed to fit the bill.

Ed note: I well remember. Some editions stretched to 800 or more pages! It seemed that I could barely get through one edition when the next month's edition would come along. Who else remembers?


Original Submission

posted by hubie on Thursday June 08 2023, @02:56PM
from the Blackberry dept.

https://www.msn.com/en-us/news/technology/this-raspberry-pi-project-could-give-your-old-blackberry-a-second-life/ar-AA1c4WYV

Opinion:
Scientific studies have shown for decades now that the most efficient, pleasurable, and effective way of communicating with a cell phone is through a keyboard (also applies to laptops!). Double-blind studies of cave rats in Nambia showed that messages typed with a keyboard are 100% more readable than ones without keyboards, or they would be if cave rats knew how to spell. 9 out of 10 doctors agree based on our best analysis of their prescription handwriting legibility.

While on my weekly quest to see if any new keyboard phones might be somewhere in the future, I came across this article from Saturday.

Article:

This Raspberry Pi Project Could Give Your Old BlackBerry A Second Life

Indie tech collective Squarofumi, in collaboration with the creators of the Matrix-based chat app Beeper, has created a Raspberry Pi-powered device in the BlackBerry's image. This device is aptly named the Beepberry, and it combines that classic keyboard with a simplistic interface.

This device is powered by a Raspberry Pi Zero W hooked up to a high-contrast, low-power 400x240 Sharp Memory LCD and a classic, pleasantly tactile keyboard and trackpad. The Beepberry features native support for the Beeper app, a universal chat app that can be used to connect with users on 15 different major chat platforms like WhatsApp, Slack, Discord, and more.

In addition to the nostalgic BlackBerry-style keyboard, the interface of the Beepberry is designed to be as minimalistic as possible, rendering all apps exclusively with text (and some ASCII art, where applicable). If you'd prefer your mobile device to be a bit flashier, the Beepberry is highly customizable in terms of both hardware and software. It features programmable USB and GPIO ports and buttons, and can support any Linux app that's already operable on the Raspberry Pi Zero W. There's even a programmable RGB light on the front of the device for notifications.

With the Raspberry Pi Zero it's $99; without, $79. They are sold out, which is sad because I would buy one if they weren't. Keyboard phones are back, baby.
https://shop.sqfmi.com/products/beepberry?variant=43376334962843


Original Submission

posted by martyb on Thursday June 08 2023, @10:12AM
from the that's-a-smucking-fart-idea! dept.

'Ducking hell' to disappear from Apple autocorrect:

Apple has said it will no longer automatically change one of the most common swear words to 'ducking'.

The autocorrect feature, which has long frustrated users, will soon be able to use AI to detect when you really mean to use that expletive.

"In those moments where you just want to type a ducking word, well, the keyboard will learn it, too," said software boss Craig Federighi.

He announced the development at Apple's developers' conference in California.

iPhone users have often complained about how autocorrect forces them to rewrite their own messages - with the term "damn you autocorrect" becoming an acronym, a meme, an Instagram account and even a song.

[...] Initially flagged in a 2017 paper from Google, transformers are some of the most powerful classes of AI models, and autosuggest - or predictive text - systems are beginning to become more mainstream.

The autocorrect change will be part of the iOS 17 operating system upgrades which are expected to be available as a public beta in July, with the general release in September.


Original Submission

posted by janrinok on Thursday June 08 2023, @06:14AM
from the don't-dare-use-morse-code dept.

La Quadrature du Net has a detailed analysis of the "8 December Case", in which a number of suspects were rounded up and have been kept behind bars since December 2020. Their case is scheduled for October 2023 and hinges more or less entirely on the observation that the group used encryption software, especially communications software. The mere use of encryption is being used to hand-wave away questions about the lack of evidence.

La Quadrature du Net has been alerted to the fact that, in the context of the "8 December" case, not only the use of communications encryption tools (WhatsApp, Signal, Protonmail, Silence, etc.) but also the possession of technical documentation and the organisation of digital hygiene training courses are being used to "demonstrate" a so-called "clandestine behaviour" revealing the "terrorist nature" of the group.

We have had access to certain elements of the file confirming this information. We have chosen to make them visible in order to denounce the criminalisation of digital practices at the heart of our day-to-day work and the manipulation to which they are subjected in this affair.

Mixing fantasies, bad faith and technical incompetence, a police story has been constructed around the (good) digital practices of the accused, with the aim of staging a "clandestine", "conspiratorial" group, and therefore... a terrorist one.

The elements of the investigation that have been communicated to us are staggering. Here are just some of the practices being misused as evidence of terrorist behavior:

  • the use of applications such as Signal, WhatsApp, Wire, Silence, or ProtonMail to encrypt communications;
  • the use of Internet privacy tools such as VPNs, Tor, or Tails;
  • protecting against the exploitation of personal data by GAFAM via services such as /e/OS, LineageOS, and F-Droid;
  • encrypting digital media;
  • organizing and participating in digital hygiene training sessions;
  • simple possession of technical documentation.

The gist is that the authorities are seeking to establish a position where simply having used encryption is sufficient evidence in and of itself of crime and conspiracy to commit crime.


Original Submission

posted by mrpg on Thursday June 08 2023, @01:33AM

https://phys.org/news/2023-05-team-nanoparticles-brain-cancer-treatment.html

University of Queensland researchers have developed a nanoparticle to take a chemotherapy drug into fast growing, aggressive brain tumors.

Research team lead Dr. Taskeen Janjua from UQ's School of Pharmacy said the new silica nanoparticle can be loaded with temozolomide, a small molecule drug used to treat tumors known as glioblastoma.

"This chemotherapy drug has limitations—it doesn't stay in the blood for very long, it can be pushed out of the brain, and it doesn't have high penetration from blood into the brain," Dr. Janjua said.

"To make the drug more effective, we developed an ultra-small, large pore nanoparticle to help it move through the blood-brain barrier and penetrate the tumor while also reducing unwanted patient side effects.

"This strategy could be a more effective way to treat brain cancer and prevent it from coming back."

Journal Reference:
Taskeen Iqbal Janjua et al., Efficient delivery of Temozolomide using ultrasmall large-pore silica nanoparticles for glioblastoma, Journal of Controlled Release (2023). DOI: 10.1016/j.jconrel.2023.03.040


Original Submission

posted by mrpg on Wednesday June 07 2023, @09:02PM
from the another-Tom-Collins-paper-please dept.

Too much water can make whiskies taste the same:

While adding a little water is popularly thought to "open up" the flavor of whisky, a Washington State University-led study indicates there's a point at which it becomes too much: about 20%.

Researchers chemically analyzed how volatile compounds in a set of 25 whiskies responded to the addition of water, including bourbons, ryes, Irish whiskeys and both single malt and blended Scotches. They also had a trained sensory panel assess six of those whiskies, three Scotches and three bourbons.

Both tests found that adding a little water could change how the whiskies smelled, but after 20%, they may start to have the same aroma. Since smell and taste are often closely linked, this likely affected the spirit's flavor as well.

[...] Whisky is a mix of compounds that run the scale from hydrophilic to hydrophobic, in other words, ones that are attracted to water and others that are repelled by it. The addition of water sends the whisky's hydrophobic compounds into the headspace above the liquid and leaves the hydrophilic ones behind, changing the aroma of the drink.

Journal Reference:
P. Layton Ashmore, Aubrey DuBois, Elizabeth Tomasino, et al., Impact of Dilution on Whisky Aroma: A Sensory and Volatile Composition Analysis [open], Foods 2023, 12(6), 1276; https://doi.org/10.3390/foods12061276


Original Submission

posted by mrpg on Wednesday June 07 2023, @04:22PM
from the protons-find-a-way dept.

The first building blocks of life on Earth may have formed thanks to eruptions from our Sun:

A series of chemical experiments show how solar particles, colliding with gases in Earth's early atmosphere, can form amino acids and carboxylic acids, the basic building blocks of proteins and organic life. The findings were published in the journal Life.

To understand the origins of life, many scientists try to explain how amino acids, the raw materials from which proteins and all cellular life are formed, first arose. The best-known proposal originated in the late 1800s as scientists speculated that life might have begun in a "warm little pond": a soup of chemicals, energized by lightning, heat, and other energy sources, that could mix together in concentrated amounts to form organic molecules.

In 1953, Stanley Miller of the University of Chicago tried to recreate these primordial conditions in the lab. Miller filled a closed chamber with methane, ammonia, water, and molecular hydrogen – gases thought to be prevalent in Earth's early atmosphere – and repeatedly ignited an electrical spark to simulate lightning. A week later, Miller and his graduate advisor Harold Urey analyzed the chamber's contents and found that 20 different amino acids had formed.

[...] But the last 70 years have complicated this interpretation. Scientists now believe ammonia (NH3) and methane (CH4) were far less abundant; instead, Earth's air was filled with carbon dioxide (CO2) and molecular nitrogen (N2), which require more energy to break down. These gases can still yield amino acids, but in greatly reduced quantities.

[...] "During cold conditions you never have lightning, and early Earth was under a pretty faint Sun," Airapetian said. "That's not saying that it couldn't have come from lightning, but lightning seems less likely now, and solar particles seems more likely."

These experiments suggest our active young Sun could have catalyzed the precursors of life more easily, and perhaps earlier, than previously assumed.

Journal Reference:
Kensei Kobayashi, Jun-ichi Ise, Ryohei Aoki, et al., Formation of Amino Acids and Carboxylic Acids in Weakly Reducing Planetary Atmospheres by Solar Energetic Particles from the Young Sun [open], Life, 2023. https://doi.org/10.3390/life13051103


Original Submission

posted by mrpg on Wednesday June 07 2023, @02:30PM   Printer-friendly

New York's skyscrapers are causing it to sink – what can be done about it?:

The ground under New York City is sinking partly due to the sheer mass of all its buildings [...] As sea levels also rise to meet these concrete jungles, can they be saved?

[...] On the 300 sq miles (777 sq km) that comprise New York City sit 762 million tonnes (1.68 trillion pounds) of concrete, glass and steel, according to estimates by researchers at the United States Geological Survey (USGS). While that figure involves some generalisations about construction materials, that prodigious tonnage does not include the fixtures, fittings and furniture inside those million-odd buildings. Nor does it include the transport infrastructure that connects them, nor the 8.5 million people who inhabit them.

All that weight is having an extraordinary effect on the land on which it is built. That ground, according to a study published in May, is sinking by 1-2mm (0.04-0.08in) per year, partly due to the pressure exerted on it by the city buildings above. And that is concerning experts – add the subsidence of the land to the rising of sea levels, and the relative sea level rise is 3-4mm (0.12-0.16in) per year. That may not sound like much, but over a few years it adds up to significant problems for a coastal city.
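To see how those small annual rates compound, the quoted figures can be projected forward. A rough sketch, treating the rates as constant (a simplification) and inferring an absolute sea-level rise of about 2 mm/yr from the article's numbers (3-4 mm/yr relative minus 1-2 mm/yr subsidence):

```python
# Rough projection of relative sea-level rise for New York City.
# Rates are treated as constant; the 2.0 mm/yr absolute sea-level
# figure is inferred from the article, not stated there directly.

subsidence_mm_yr = (1.0, 2.0)   # land sinking, mm per year (low, high)
sea_rise_mm_yr = 2.0            # assumed absolute sea-level rise, mm/yr

for years in (10, 50, 100):
    low = (subsidence_mm_yr[0] + sea_rise_mm_yr) * years
    high = (subsidence_mm_yr[1] + sea_rise_mm_yr) * years
    print(f"{years:>3} years: {low:.0f}-{high:.0f} mm relative rise")
```

Even at these modest rates, a century of compounding yields roughly 30-40 cm of relative rise, which is why a few millimetres a year worries coastal planners.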

New York has already been suffering subsidence since the end of the last ice age. Relieved of the weight of ice sheets, some land on the Eastern Seaboard is expanding, while other parts of the coastal landmass, including the chunk on which New York City lies, seem to be settling down. "That relaxation causes subsidence," says Tom Parsons, a research geophysicist at the Pacific Coastal and Marine Science Center of the USGS in Moffett Field, California and one of the four authors of the study.

But the enormous weight of the city's built environment worsens this subsidence, Parsons says.

And this is a global phenomenon. New York City, says Parsons, "can be seen as a proxy for other coastal cities in the US and the world that have growing populations from people migrating to them, that have associated urbanisation, and that face rising seas".

There is a wide range of reasons for why coastal cities are sinking, but the mass of human infrastructure pressing down on the land is playing a role. The scale of this infrastructure is vast: in 2020 the mass of human-made objects surpassed that of all living biomass.

[...] Can anything be done to halt these cities – which between them have hundreds of millions of residents – from sinking into the sea?

It's a relatively long article, but it clearly describes the extent of the problem.

Journal References:

1.) Land Area and Population - Per Square Mile New York - Northern New Jersey - Long Island (NY-NJ-PA)

2.) Tom Parsons, Pei-Chin Wu, Meng (Matt) Wei, et al., The Weight of New York City: Possible Contributions to Subsidence From Anthropogenic Sources (DOI: 10.1029/2022EF003465)

3.) Asbury H. Sallenger, Kara S. Doran, Peter A. Howd, Hotspot of accelerated sea-level rise on the Atlantic coast of North America, Nature Climate Change (DOI: 10.1038/nclimate1597)

4.) Elhacham, Emily, Ben-Uri, Liad, Grozovski, Jonathan, et al., Global human-made mass exceeds all living biomass, Nature (DOI: 10.1038/s41586-020-3010-5)

5.) (DOI: 10.1029/2022GL098477)

6.) (DOI: 10.1029/2020JB020648)

7.) (DOI: 10.1016/j.cosust.2021.02.010)


Original Submission

posted by mrpg on Wednesday June 07 2023, @11:40AM   Printer-friendly

A conductive self-healing hydrogel to create flexible sensors:

Recent advancements in the field of electronics have enabled the creation of smaller and increasingly sophisticated devices, including wearable technologies, biosensors, medical implants, and soft robots. Most of these technologies are based on stretchy materials with electronic properties.

While materials scientists have already introduced a wide range of flexible materials that could be used to create electronics, many of these materials are fragile and easily damaged. Because damage can cause a material to fail and compromise the functioning of the system it is integrated into, many existing soft, conductive materials end up being unreliable and unsuitable for large-scale applications.

Researchers at Harbin University of Science and Technology in China recently developed a new conductive and self-healing hydrogel that could be used to create flexible sensors for wearables, robots or other devices. The material and its composition were outlined in the Journal of Science: Advanced Materials and Devices.

[...] In the future, the hydrogel created by this team of researchers could be used to develop a wide range of other sensors and wearable electronics, such as sensors that can detect human motion or medical devices that monitor specific biological signals. In addition, their work could pave the way for the development of similar flexible and conductive hydrogels with self-healing properties.

Journal Reference:
Xiaoming Wang et al, Constructing conductive and mechanical strength self-healing hydrogel for flexible sensor, Journal of Science: Advanced Materials and Devices (2023). DOI: 10.1016/j.jsamd.2023.100563


Original Submission