
Idiosyncratic use of punctuation - which of these annoys you the most?

  • Declarations and assignments that end with }; (C, C++, Javascript, etc.)
  • (Parenthesis (pile-ups (at (the (end (of (Lisp (code))))))))
  • Syntactically-significant whitespace (Python, Ruby, Haskell...)
  • Perl sigils: @array, $array[index], %hash, $hash{key}
  • Unnecessary sigils, like $variable in PHP
  • macro!() in Rust
  • Do you have any idea how much I spent on this Space Cadet keyboard, you insensitive clod?!
  • Something even worse...


posted by hubie on Thursday January 04 2024, @11:10PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

I recently managed to anger both the iPhone users who are in favor of sideloading and the ones who aren’t. I explained several times that I will never use sideloading, third-party app stores, or third-party payment systems on my iPhone. Some of our readers were not happy with that line of thinking.

I also said that Apple should make sideloading available internationally rather than only in specific markets where regulators force it to. That way, it would deal with the PR mess around these sensitive topics. Then I heard from another round of BGR readers who were unhappy with my point of view.

I might not want to sideload apps on my iPhone, but Apple might be forced to support the feature in more markets. Following the decision in the European Union (EU), other regions are looking to impose similar measures on Apple, including Japan and the US.

[...] Reports have said that Apple plans to restrict sideloading and all that comes with it to the EU region. International users won't even be able to use VPN services to trick Apple into allowing sideloading on their iPhones.

Japan is looking at similar laws to improve competition on iPhone and Android. They would impact Apple, Google, and other companies that manage similar platforms.

As for the US, the US Department of Justice (DOJ) has launched an antitrust investigation into the same App Store issues. Per The Financial Times (via 9to5Mac), the case against Apple is “firing on all cylinders.”

[...] Google recently settled a similar antitrust case. The company will have to pay $700 million and change how the Google Play store works. Google will have to let Android users install apps from any source. That’s what sideloading and third-party app stores amount to.

Google will also support third-party payment systems. However, the company will still collect a fee from developers for each transaction. And developers might not end up saving that much money, if anything, by offering apps through third-party repositories instead of the Play Store.

[...] Antitrust case aside, I think Apple would be better served by getting ahead of all this. Universal iPhone sideloading support will happen eventually. Resisting it only hurts the company at this point.

At the end of the day, it will be up to the users to decide whether to install apps from third parties. In the long run, I don’t think Apple’s App Store bottom line will see a big impact. Not as long as I, and others like me, continue to avoid iPhone sideloading and all the risks that it entails.


Original Submission

posted by hubie on Thursday January 04 2024, @06:23PM   Printer-friendly
from the until-gmail-pulls-the-plug-on-you dept.

System administrator and prolific blogger Chris Siebenmann has a brief note arguing that e-mail addresses are not good 'permanent' identifiers for accounts.

The biggest problem with email addresses as 'permanent' identifiers is that people's email addresses change even within a single organization (for example, a university). They change for the same collection of reasons that people's commonly used names and logins change. An organization that refuses to change or redo the email addresses it assigns to people is being unusually cruel in ways that are probably not legally sustainable in any number of places.
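
The usual way around this, a general pattern rather than anything from Siebenmann's post, is to key accounts on an opaque, immutable internal identifier and treat the email address as a mutable, re-verifiable contact attribute. A minimal sketch (the schema and function names here are illustrative assumptions):

    import sqlite3
    import uuid

    # Accounts are keyed by an opaque, permanent ID; the email address is just
    # a mutable attribute that can change without breaking anything that
    # references the account.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE accounts (
            account_id TEXT PRIMARY KEY,      -- permanent, never reused
            email      TEXT UNIQUE NOT NULL   -- current contact address, may change
        )
    """)

    def create_account(email: str) -> str:
        account_id = str(uuid.uuid4())
        conn.execute("INSERT INTO accounts (account_id, email) VALUES (?, ?)",
                     (account_id, email))
        return account_id

    def change_email(account_id: str, new_email: str) -> None:
        # Only the contact attribute changes; logs, foreign keys, and external
        # references keyed on account_id all stay valid.
        conn.execute("UPDATE accounts SET email = ? WHERE account_id = ?",
                     (new_email, account_id))

    uid = create_account("j.smith@university.example")
    change_email(uid, "j.jones@university.example")   # e.g. after a name change
    print(conn.execute("SELECT account_id, email FROM accounts").fetchall())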


Original Submission

posted by janrinok on Thursday January 04 2024, @01:54PM   Printer-friendly

https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html

I am deeply saddened to have received the news of Niklaus Wirth's passing and extend my heartfelt condolences to his family and all those who were dear to him. I wanted to take a moment to reflect on the profound and positive impact that Niklaus had on my life and career, and to express my gratitude for all that he meant to me.


Original Submission

posted by hubie on Thursday January 04 2024, @09:12AM   Printer-friendly
from the $1.3M-CEO-raise-is-clearly-putting-one-certain-person-over-profit dept.

Arthur T Knackerbracket has processed the following story:

In a nutshell: Mozilla is seemingly on a journey to change the internet again, but it will not get there with Firefox. The open-source browser is barely mentioned throughout the company's latest corporate manifesto, where AI algorithms have become the real focus of the show.

In the recently published State of Mozilla for 2023, the open-source foundation made some bold statements about its plans for the future. The organization aims to build a better internet "by the people, for the people," countering the overwhelming influence of Big Tech corporations with open data and AI services.

The updated State of Mozilla is essentially designed to be a company manifesto written in corporate lingo but it also comes with the latest financial statements related to 2022 results, providing some interesting food for thought about how and where the organization is spending its money.

Mozilla's CEO, Mitchell Baker, received a substantial compensation increase, as stated in the document, going from $5,591,406 (2021) to $6,903,089 in 2022. Baker mentioned that the organization is clearly moving in the right direction but needs to do more and have a larger impact on the market.

[...] Mozilla's CEO is being paid a lot more while Firefox keeps losing users, and someone has suggested that the organization's plan is now to fully transition away from the open-source browser. Mozilla is increasing its pile of financial assets, and Baker has clearly stated that the organization is ready to make "difficult choices" when it comes to shutting down unprofitable projects.

So, what future is Mozilla trying to build for itself and internet users? The foundation is pushing the idea of a trustworthy AI, improved ML algorithms with rich data, and privacy prioritization. Mozilla still wants to put people ahead of profit, the company's CEO said, but also take more risks and move quickly in the growing AI market.

See also: Linux Foundation Spending on Actual Linux Down to 2% of Their Budget


Original Submission

posted by hubie on Thursday January 04 2024, @04:24AM   Printer-friendly
from the foggy-research dept.

https://phys.org/news/2023-12-fog-mountainous-areas.html

Of the world's various weather phenomena, fog is perhaps the most mysterious, forming and dissipating near the ground with fluctuations in air temperature and humidity interacting with the terrain itself.

While fog presents a major hazard to transportation safety, meteorologists have yet to figure out how to forecast it with the precision they have achieved for precipitation, wind and other stormy events.

This is because the physical processes resulting in fog formation are extremely complex, according to Zhaoxia Pu, a professor of atmospheric sciences at the University of Utah.

"Our understanding is limited. In order to accurately forecast fog we should better understand the process that controls fog formation," said Pu, who led a fog study focusing on a northern Utah valley.

Now, in a recent paper published by the American Meteorological Society, Pu and her colleagues have reported their findings from the Cold Fog Amongst Complex Terrain (CFACT) project, conceived to investigate the life cycle of cold fog in mountain valleys.
...
Today, most forecasting uses a computer modeling approach known as Numerical Weather Prediction (NWP), which processes massive sets of meteorological observations to output predictions for precipitation, temperature, and all sorts of other elements of the weather. However, the current models don't work well for fog, and Pu's team hopes that improvements can be made using the masses of data they gathered over seven weeks in the winter of 2022 at several sites in the Heber Valley.

"Fog involves a lot of physics processes so it requires a computer model that can better represent all these processes," Pu said. "Because fog is clouds near the ground, it requires a high-resolution model to resolve it, so we need models at a very fine scale, which are computationally very expensive. The current models (relatively coarser in resolution) are not capable of resolving the fog processes, and we need to improve the models for better fog prediction."

Located about 50 miles southeast of Salt Lake City, Heber Valley is nestled behind the Wasatch Mountains and framed by two major reservoirs on the Provo River.

This scenic basin is a typical mountain valley, hemmed by Mt. Timpanogos and other high peaks, with the reservoirs serving as a moisture source. The seven-week study window covered the time of year when Heber Valley is the foggiest.

Valley fog is a perfect example of how topography and atmospheric processes converge to create a distinctive weather phenomenon.
...

Journal Reference:
Zhaoxia Pu et al, Cold Fog Amongst Complex Terrain, Bulletin of the American Meteorological Society (2023). DOI: 10.1175/BAMS-D-22-0030.1


Original Submission

posted by hubie on Wednesday January 03 2024, @11:39PM   Printer-friendly
from the even-if-you-fry-it-in-butter dept.

Daniel Stenberg of cURL fame has written about the impact that fake, LLM-generated bug reports have on his project, cURL. The main problem with LLM-generated bug reports is that they tend to be bunk while at the same time looking close enough to real bug reports to end up wasting a lot of developer time that could have been spent triaging and addressing real bugs.

A security report can take away a developer from fixing a really annoying bug. because a security issue is always more important than other bugs. If the report turned out to be crap, we did not improve security and we missed out time on fixing bugs or developing a new feature. Not to mention how it drains you on energy having to deal with rubbish.

Often wannabe security consultants will take the output of an LLM and modify it with their own language, thus intentionally or unintentionally obscuring some of the telltale warning signs of LLM-generated bunk.

Previously:
(2023) "cURL", the URL Code That Can, Marks 25 Years of Transfers
(2023) Half of Curl's Security Vulnerabilities Due to C Mistakes
(2020) curl up 2020 and Other Conferences Go Online Only
(2018) Daniel Stenberg, Author of cURL and libcurl, Denied US Visit Again


Original Submission

posted by hubie on Wednesday January 03 2024, @06:57PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Samsung is delaying the start of mass production at its Texas foundry to 2025, according to a report by Business Korea. The new fabbing plant, which was originally set to begin mass production in the second half of 2024, is now expected to have only limited production by that time. Samsung is apparently scaling back its Texas operation due to uncertain financial factors, including CHIPS Act subsidies and the global economy.

The Korean conglomerate will invest $200 billion in Texas alone, with 11 total foundries to produce 4nm chips, the first one being built in Taylor, Texas. The Taylor fab has been delayed, much like TSMC's Fab 21 in Arizona. But the delay at Samsung's first fab is apparently intentional, with the corporation deciding to postpone mass production in favor of smaller output levels until 2025. In contrast, TSMC's foundry ran into conflicts with local workers and unions.

[...] Business Korea claims that finances are a key concern of Samsung, especially when considering CHIPS Act subsidies and the state of the global economy. The CHIPS and Science Act is supposed to grant subsidies to semiconductor companies like Samsung to encourage the construction of foundries in the U.S. However, these subsidies are still largely in the pipeline, with just $35 million of the total $52 billion granted so far.

[...] The health of the global economy is also on Samsung's mind. Although the U.S. has seemingly achieved a so-called soft landing and has avoided a true recession, other parts of the world aren't so lucky. Samsung is still reeling from low SSD and RAM prices, even though revenue is finally on the rise. The PC market is forecasted to finally recover to 2020 levels in 2025, but the high peaks of 2021 are apparently well in the past. All of this just makes Samsung tepid about its $200 billion investment in the U.S., which could spell success or disaster for the company.  


Original Submission

posted by hubie on Wednesday January 03 2024, @02:09PM   Printer-friendly
from the milk-it-does-a-body-good dept.

Arthur T Knackerbracket has processed the following story:

Vitamin D deficiency, depression and diabetes are among a number of health issues that increase the risk of young-onset dementia, a major study suggests.

The condition - which around 70,000 people in the UK are thought to be living with - is when symptoms of dementia develop before the age of 65.

The study challenges the notion that genetics are the sole cause of the condition, researchers have said.

Targeting the factors they identified could help reduce the risk, they added.

A list of 15 factors - which is similar to that for late-onset dementia - includes alcohol abuse, stroke, social isolation and hearing impairment.

Those with a higher formal education were seen to be at less of a risk.

The study "breaks new ground" and could "herald a new era" for interventions to reduce new cases, said Dr Janice Ranson, one of the study's authors.

The most common feature of dementia is memory loss, but other symptoms include changes in behaviour, and becoming lost in familiar places.

[...] The study, conducted by scientists from the UK and the Netherlands, is the "largest and most robust study of its kind ever conducted", one of its authors, Professor David Llewellyn, has said.

[...] Prof Llewellyn said there was still much to learn but the study "reveals that we may be able to take action to reduce risk of this debilitating condition".

"This pioneering study shines important and much-needed light on factors that can influence the risk of young-onset dementia."

[...] "The cause is often assumed to be genetic, but for many people we don't actually know exactly what the cause is. This is why we also wanted to investigate other risk factors in this study."

The full list, from the paper's results: "In the final model, 15 factors were significantly associated with a higher YOD risk, namely lower formal education, lower socioeconomic status, carrying 2 apolipoprotein ε4 allele, no alcohol use, alcohol use disorder, social isolation, vitamin D deficiency, high C-reactive protein levels, lower handgrip strength, hearing impairment, orthostatic hypotension, stroke, diabetes, heart disease, and depression."

Journal Reference:
Hendriks S, Ranson JM, Peetoom K, et al. Risk Factors for Young-Onset Dementia in the UK Biobank. JAMA Neurol. Published online December 26, 2023. doi:10.1001/jamaneurol.2023.4929


Original Submission

posted by hubie on Wednesday January 03 2024, @09:20AM   Printer-friendly

Some excerpts from an Ars Technica interview with historian Rebecca Simon on the real-life buccaneer bylaws that shaped every aspect of a pirate's life:

One of the many amusing scenes in the 2003 film Pirates of the Caribbean: The Curse of the Black Pearl depicts Elizabeth Swann (Keira Knightley) invoking the concept of "parley" in the pirate code to negotiate a cease of hostilities with pirate captain Hector Barbossa (Geoffrey Rush). "The code is more what you'd call guidelines than actual rules," he informs her. Rebecca Simon, a historian at Santa Monica College, delves into the real, historical set of rules and bylaws that shaped every aspect of a pirate's life with her latest book, The Pirates' Code: Laws and Life Aboard Ship.

Ars Technica: How did the idea of a pirates' code come about?

Rebecca Simon: Two of the pirates that I mention in the book—Ned Low and Bartholomew Roberts—their code was actually published in newspapers in London. I don't know where they got it. Maybe it was made up for the sake of readership because that is getting towards the tail end of the Golden Age of Piracy, the 1720s. But we find examples of other codes in A General History of the Pyrates written by a man named Captain Charles Johnson in 1724. It included many pirate biographies and a lot of it was very largely fictionalized. So we take it with a grain of salt. But we do know that pirates did have a notion of law and order and regulations and ritual based on survivor accounts.

You had to be very organized. You had to have very specific rules because as a pirate, you're facing death every second of the day, more so than if you are a merchant or a fisherman or a member of the Royal Navy.  Pirates go out and attack to get the goods that they want. In order to survive all that, they have to be very meticulously prepared. Everyone has to know their exact role and everyone has to have a game plan going in. Pirates didn't attack willy-nilly out of control. No way. They all had a role.

[...] Ars Technica: Some of the pirate codes seemed surprisingly democratic. They divided the spoils equally according to rank, so there was a social hierarchy. But there was also a sense of fairness.

Rebecca Simon: You needed to have a sense of order on a pirate ship. One of the big draws that pirates used to recruit hostages to officially join them into piracy was to tell them they'd get an equal share. This was quite rare on many other ships, where payment was based per person, or maybe just a flat rate across the board. A lot of times your wages might get withheld or you wouldn't necessarily get the wages you were promised. On a pirate ship, everyone had the amount of money they were going to get based on the hierarchy and based on their skill level. The quartermaster was in charge of doling out all of the spoils or the stolen goods. If someone was caught taking more than their share, that was a huge deal.

You could get very severely punished perhaps by marooning or being jailed below the hold. The punishment had to be decided by the whole crew, so it didn't seem like the captain was being unfair or overly brutal. Pirates could also vote out their captain if they felt the captain was doing a bad job, such as not going after enough ships, taking too much of his share, being too harsh in punishment, or not listening to the crew. Again, this is all to keep order. You had to keep morale very high, you had to make sure there was very little discontent or infighting.

[...] Ars Technica: Much of what you do is separate fact from fiction, such as the legend of Captain Kidd's buried treasure. What are some of the common misconceptions that you find yourself correcting, besides buried treasure?

Rebecca Simon:  A lot of people ask me about the pirate accent: "Aaarr matey!" That accent we think of comes from the actor Robert Newton who played Long John Silver in the 1950 film Treasure Island. In reality, it just depended on where they were born. At the end of the day, pirates were sailors. People ask about what they wore, what they ate, thinking it's somehow different. But the reality is it was the same as other sailors. They might have had better clothes and better food because of how often they robbed other ships.

[...] Ars Technica: What were the factors that led to the end of what we call the Golden Age of Piracy?

Rebecca Simon: There were several reasons why piracy really began to die down in the 1720s. One was an increase in the Royal Navy presence so the seas were a lot more heavily patrolled and it was becoming more difficult to make a living as a pirate. Colonial governors and colonists were no longer supporting pirates the way they once had, so a lot of pirates were now losing their alliances and protections. A lot of major pirate leaders who had been veterans of the War of the Spanish Succession as privateers had been killed in battle by the 1720s: people like Charles Vane, Edward Teach, Benjamin Hornigold, Henry Jennings, and Sam Bellamy.

It was just becoming too risky. And by 1730 a lot more wars were breaking out, which required people who could sail and fight. Pirates were offered pardons if they agreed to become a privateer, basically a government-sanctioned mercenary at sea where they were contracted to attack specific enemies. As payment they got to keep about 80 percent of what they stole. A lot of pirates decided that was more lucrative and more stable.


Original Submission

posted by hubie on Wednesday January 03 2024, @04:33AM   Printer-friendly
from the how-low-can-you-go? dept.

Bryan Lunduke has gone over the 2023 Linux Foundation report. He observes that the foundation spends less on the kernel than ever, both in absolute dollars and as a percentage of its budget: around 2% on Linux and 98% on everything else.

While it's true that The Linux Foundation continues to grow substantially -- now bringing in over a quarter of a Billion dollars per year (seriously) -- the total amount spent on the Linux kernel dropped roughly $400,000 in 2023.   (Not surprising as The Lunduke Journal previously pointed out that lowering the total support of Linux appeared to be the goal.)

  • The percentage of The Linux Foundation revenue spent on Linux dropped in 2023.
  • And the total amount spent dropped as well.
  • All while funding of non-Linux projects (such as AI and Blockchain) continued to dominate.
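
Back-of-the-envelope arithmetic using only the figures quoted above (the exact revenue is an assumption; "over a quarter of a billion" is taken here as $250 million):

    # Rough scale check from the quoted figures; the revenue value is assumed.
    revenue      = 250_000_000   # "over a quarter of a Billion dollars per year"
    kernel_share = 0.02          # "around 2% on Linux"

    kernel_spend = revenue * kernel_share
    print(f"~${kernel_spend:,.0f} on the kernel, ~${revenue - kernel_spend:,.0f} on everything else")
    # -> roughly $5 million on the kernel out of a ~$250 million budget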

As many have noticed, budget aside, the foundation does not advance or promote the kernel; rather the opposite. It represents its members' corporate interests inside kernel development. Bruce Perens pointed out about six years ago that the membership basically amounts to a GPL infringers' club.

Previously:
(2023) Linux Foundation Launches New Organization to Maintain TLA+
(2021) Linux Foundation and Partners Announce "Open 3D Foundation"
(2021) Linux Foundation Unveils Sigstore
(2020) Linux Foundation Does Not Eat its Own Dogfood


Original Submission

posted by hubie on Tuesday January 02 2024, @10:03PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

[...] The rainy season in the Amazon should have started in October but it was still dry and hot until late November. This is an effect of the cyclical El Niño weather pattern, amplified by climate change.

El Niño causes water to warm in the Pacific Ocean, which pushes heated air over the Americas. This year the water in the North Atlantic has also been abnormally warm, and hot, dry air has enveloped the Amazon.

"When it was my first drought I thought, 'Wow, this is awful. How can this happen to the rainforest?'" says Flávia Costa, a plant ecologist at the National Institute for Amazonian Research, who has been living and working in the rainforest for 26 years.

"And then, year after year, it was record-breaking. Each drought was stronger than before."

She says it's too soon to assess how much damage this year's drought has done, but her team has found many plants "showing signs of being dead".

Past dry seasons give an indication of the harm that could be done. By some estimates the 2015 "Godzilla drought" killed 2.5bn trees and plants in just one small part of the forest - and it was less severe than this latest drought.

"On average, the Amazon stopped functioning as a carbon sink," Dr Costa says. "And we mostly expect the same now, which is sad."

[...] As it stands, the Amazon creates a weather system of its own. In the vast rainforest, water evaporates from the trees to form rain clouds which travel over the tree canopy, recycling this moisture five or six times. This keeps the forest cool and hydrated, feeding it the water it needs to sustain life.

But if swathes of the forest die, that mechanism could be broken. And once this happens there may be no going back.

[...] In her 30 years living in the Amazon, Dr Marmontel never imagined she would see it so dry. She is shocked by how quickly the climate is changing.

"It was like a slap in the face. Because it's the first time that I see and I feel what's happening to the Amazon," she says.

"We always say these animals are sentinels because they feel first what's going to come to us. It's happening to them, it's going to happen to us."

For Oliveira, too, this year has been a wake up call.

"We know that we are very much to blame for this, we haven't been paying attention, we haven't been defending our mother Earth. She is screaming for help," he says.

"It's time to defend her."


Original Submission

posted by requerdanos on Tuesday January 02 2024, @09:30PM   Printer-friendly
from the 21:00-UTC-is-4pm-Eastern-Standard-Time dept.

Meeting Announcement: The next meeting of the SoylentNews governance committee is scheduled for Wednesday, January 3, 2024 at 21:00 UTC (4pm Eastern) in #governance on SoylentNews IRC. Logs of the meeting will be available afterwards for review, and minutes will be published when complete.

Minutes, agendas, and other governance committee information are available on the SoylentNews Wiki at: https://wiki.staging.soylentnews.org/wiki/Governance

The community is welcome to observe and participate, and is encouraged to attend the meeting.

posted by martyb on Tuesday January 02 2024, @02:12PM   Printer-friendly

Google settles $5bn lawsuit for 'private mode' tracking:

Google has agreed to settle a US lawsuit claiming it invaded the privacy of users by tracking them even when they were browsing in "private mode".

The class action sought at least $5bn (£3.9bn) from the world's go-to search engine and parent company Alphabet.

Large technology firms have faced increased scrutiny of their practices in the US and beyond.

Lawyers representing Google and its users did not immediately respond to the BBC's requests for comment.

[...] Judge Rogers had rejected Google's bid to have the case dismissed earlier this year, saying she could not agree that users consented to allowing Google to collect information on their browsing activity.

The terms of the settlement were not disclosed. However, lawyers are expected to present a formal settlement for the court's approval by February 2024.

The class action, which was filed by law firm Boies Schiller Flexner in 2020, claimed that Google had tracked users' activity even when they set the Google Chrome browser to "Incognito" mode and other browsers to "private mode".

It said this had turned Google into an "unaccountable trove of information" on user preferences and "potentially embarrassing things".

It added that Google could not "continue to engage in the covert and unauthorized data collection from virtually every American with a computer or phone".

Google said it had been upfront about the data it collected when users viewed in private mode, even if many users assumed otherwise.

The search engine said the collection of search history, even in private viewing mode, helped site owners "better evaluate the performance of their content, products, marketing and more".

Incognito mode within Google's Chrome browser gives users the choice to search the internet without their activity being saved to the browser or device. But the websites visited can use tools such as Google Analytics to track usage.
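
The distinction is easy to demonstrate: private browsing changes what the browser keeps, not what the server sees. A minimal Flask sketch (a hypothetical endpoint, nothing to do with Google's systems) logs every visit regardless of the visitor's browsing mode:

    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        # Incognito/private mode only limits what the *browser* stores locally.
        # The server (and any analytics script the page embeds) still sees the
        # request: IP address, user agent, referrer, and path.
        print(f"{request.remote_addr} {request.user_agent} {request.path} "
              f"referrer={request.referrer}")
        return "This visit was logged server-side, private mode or not."

    if __name__ == "__main__":
        app.run(port=8000)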

Google faces other lawsuits challenging its search and digital advertising practices.


Original Submission

posted by hubie on Tuesday January 02 2024, @09:37AM   Printer-friendly

SSBs on Linux and when they may be most useful:

These days, users can usually assume that Linux has the same functionality as other operating systems. The application may differ, but the functionality is available. Occasionally, though, Linux may lack any equivalent. A case in point is Single-Site Browsers, aka Site-Specific Browsers (SSBs). Although Wikipedia lists a number of SSBs, Peppermint OS's Ice and its successor Kumo are the only free software versions of SSBs available on Linux. Fortunately for those who want this functionality, Peppermint OS is a Debian derivative, and both can be installed on Debian and most other derivatives.

As the names imply, SSBs are web browsers that open to a single URL. They are one effort to address the dichotomy that exists on modern computers between local applications and Internet resources. That is to say, while local applications are in a user's control – and can be positioned as desired on a workspace or on the desktop panel or menu – Internet resources are ordinarily accessed through the extra step of opening a web browser. Moreover, while most users long ago became accustomed to web browsers, they add another level of complication with bookmarks, tabs, and extensions that is often unnecessary with an SSB. The idea is that by creating SSBs, Internet resources can be accessed in the same way as local applications, making for a simpler, more efficient user experience. An SSB can also be isolated as a security measure. In addition, companies can install SSBs without a web browser so that employees can access selected Internet resources but cannot use the web for personal purposes during work hours. A business might also use SSBs to view its intranet or web page.
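
For a sense of how small an SSB really is, here is a toy single-site browser in PyQt6: a web view locked to one host, with navigation elsewhere simply refused. This is only a conceptual sketch assuming PyQt6-WebEngine is installed; it is not how Ice or Kumo are implemented.

    import sys
    from PyQt6.QtCore import QUrl
    from PyQt6.QtWidgets import QApplication
    from PyQt6.QtWebEngineCore import QWebEnginePage
    from PyQt6.QtWebEngineWidgets import QWebEngineView

    SITE = QUrl("https://soylentnews.org/")  # the single site this "browser" serves

    class SingleSitePage(QWebEnginePage):
        def acceptNavigationRequest(self, url, nav_type, is_main_frame):
            # Refuse top-level navigation away from the configured host; that
            # restriction is essentially what makes this a single-site browser.
            if is_main_frame and url.host() != SITE.host():
                return False
            return super().acceptNavigationRequest(url, nav_type, is_main_frame)

    app = QApplication(sys.argv)
    view = QWebEngineView()
    view.setPage(SingleSitePage(view))
    view.setWindowTitle("SoylentNews (SSB sketch)")
    view.resize(1000, 700)
    view.load(SITE)
    view.show()
    sys.exit(app.exec())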

Since SSBs first appeared in 2005, they have been available on both Windows and macOS. On Linux, however, the availability has come and gone. Firefox once had an SSB mode, but it was discontinued in 2020 on the grounds that it had multiple bugs that were time-consuming to fix and there was "little to no perceived user benefit to the feature." Similarly, Chromium once had a basic SSB menu item, Create Application Shortcut, which no longer appears in recent versions. As for GNOME Web's (Epiphany's) Install Site as Web Application, while it still appears in the menu, it is no longer functional. Today, Linux users who want to try SSBs have no choices except Ice or Kumo.

[...]

Are SSBs Still Useful?

Had SSBs come into existence in the mid-1990s when the Internet first became popular, they would have been a valuable tool for those trying to grasp the difference between local applications and online resources. But nearly a quarter of the way into the 21st century, complete newcomers to computers are a much smaller minority than they once were. After trying Ice and Kumo for a week, I found SSBs convenient, but remain uncertain whether to make them a standard tool on my desktop. While SSBs make for a better user experience, are they really that much more efficient than the dichotomy I have lived with for years? For better or worse, like most people, I am used to the dichotomy, and it would be inefficient to change, even for a more economical arrangement.

Perhaps SSBs make more sense on a network or in a business where their isolation provides another layer of security. Or perhaps the time for SSBs is past and there's a reason browsers have tried to implement them, and then discarded them.


Original Submission

posted by martyb on Tuesday January 02 2024, @04:56AM   Printer-friendly
from the just-think-about-it? dept.

During the pandemic, a third of people in the UK reported that their trust in science had increased, we recently discovered. But 7% said that it had decreased. Why is there such a variety of responses?

For many years, it was thought that the main reason some people reject science was a simple deficit of knowledge and a mooted(*) fear of the unknown. Consistent with this, many surveys reported that attitudes to science are more positive among those people who know more of the textbook science.

But if that were indeed the core problem, the remedy would be simple: inform people about the facts. This strategy, which dominated science communication through much of the later part of the 20th century, has, however, failed at multiple levels.

In controlled experiments, giving people scientific information was found not to change attitudes. And in the UK, scientific messaging over genetically modified technologies has even backfired.

[...] Recent evidence has revealed that people who reject or distrust science are not especially well informed about it, but more importantly, they typically believe that they do understand the science.

[...] A common quandary for much science communication may in fact be that it appeals to those already engaged with science. Which may be why you read this.

That said, the new science of communication suggests it is certainly worth trying to reach out to those who are disengaged.

(*) Moot; see: https://www.collinsdictionary.com/dictionary/english/moot or https://www.etymonline.com/word/moot.

[Source]: The Conversation

What solution would you suggest?


Original Submission