

posted by hubie on Friday October 07 2022, @11:03PM   Printer-friendly
from the eyes-have-it dept.

The countries say the pact will help combat serious crimes, but privacy advocates have raised concerns:

As of today, a data-sharing pact between the US and the UK is in effect, five years after it was first floated. The two sides claim that the Data Access Agreement, which was authorized by the Clarifying Lawful Overseas Use of Data (CLOUD) Act in the US, will help law enforcement to combat serious crimes in both countries. The Department of Justice called the initiative the first of its kind, adding that it would enable investigators "to gain better access to vital data" to fight serious crimes in a manner that's "consistent with privacy and civil liberties standards."

Under the agreement, authorities in one country can request data from ISPs in the other country, as long as it's related to preventing, detecting, investigating and prosecuting serious crimes including terrorism, transnational organized crime and child exploitation. US officials can't submit data requests targeting people in the UK and vice-versa — presumably the requests can either be used to assist domestic investigations or investigations into foreign nationals. Authorities also need to adhere to certain requirements, limitations and conditions when they access and use data.

[...] The US is looking to forge pacts with other countries under the CLOUD Act. It signed a deal with Australia last December and entered negotiations with Canada earlier this year.

Previously:
    Responsibility Deflected, the CLOUD Act Passes
    U.S. law to Snoop on Citizens' Info Stored Abroad


Original Submission

posted by hubie on Friday October 07 2022, @08:16PM   Printer-friendly
from the no-laughing-matter dept.

Exoplanet hunters should check for N2O:

Scientists at UC Riverside are suggesting something is missing from the typical roster of chemicals that astrobiologists use to search for life on planets around other stars — laughing gas.

Chemical compounds in a planet's atmosphere that could indicate life, called biosignatures, typically include gases found in abundance in Earth's atmosphere today.

"There's been a lot of thought put into oxygen and methane as biosignatures. Fewer researchers have seriously considered nitrous oxide, but we think that may be a mistake," said Eddie Schwieterman, an astrobiologist in UCR's Department of Earth and Planetary Sciences.

[...] "In a star system like TRAPPIST-1, the nearest and best system to observe the atmospheres of rocky planets, you could potentially detect nitrous oxide at levels comparable to CO2 or methane, [CH4]" Schwieterman said.

[...] Others who have considered N2O as a biosignature gas often conclude it would be difficult to detect from so far away. Schwieterman explained that this conclusion is based on N2O concentrations in Earth's atmosphere today. Because there isn't a lot of it on this planet, which is teeming with life, some believe it would also be hard to detect elsewhere.

"This conclusion doesn't account for periods in Earth's history where ocean conditions would have allowed for much greater biological release of N2O. Conditions in those periods might mirror where an exoplanet is today," Schwieterman said.

[...] The research team believes now is the time for astrobiologists to consider alternative biosignature gases like N2O because the James Webb telescope may soon be sending information about the atmospheres of rocky, Earth-like planets in the TRAPPIST-1 system.

Journal Reference:
Edward W. Schwieterman et al. Evaluating the Plausible Range of N2O Biosignatures on Exo-Earths: An Integrated Biogeochemical, Photochemical, and Spectral Modeling Approach [open], ApJ 937 109, 2022. DOI: 10.3847/1538-4357/ac8cfb


Original Submission

posted by martyb on Friday October 07 2022, @05:32PM   Printer-friendly

http://www.os2museum.com/wp/pc-mos386-source-code/

I missed this when it was initially announced. The source code for PC-MOS/386 version 5.01 is now available on GitHub under the GPLv3 license. Building it requires the user to supply Borland C++ 3.1, but binaries are checked in as well, including a bootable floppy image.

PC-MOS is a multi-tasking, multi-user DOS clone. When it was released in early 1987, it was one of the first commercial products to use the 386's virtual-8086 mode (though not the first; that was almost certainly CEMM in 1986).

Are there any [gray|grey|white]-beards here who remember using this old version? What is the oldest version you used?


Original Submission

posted by janrinok on Friday October 07 2022, @02:50PM   Printer-friendly
from the time-someone-did-a-bit-of-dusting dept.

NASA's DART asteroid impact test left a trail over 6,000 miles long:

NASA's successful asteroid impact test created a beautiful mess, apparently. As the Associated Press reports, astronomers using the Southern Astrophysical Research (SOAR) Telescope in Chile have captured an image revealing that DART's collision with Dimorphos left a trail of dust and other debris measuring over 6,000 miles long. The spacecraft wasn't solely responsible — rather, the Sun's radiation pressure pushed the material away like it would with a comet's tail.

[...] The capture was about more than obtaining a dramatic snapshot, of course. Scientists will use data collected using SOAR, the Astronomical Event Observatory Network and other observers to understand more about the collision and Dimorphos itself. They'll determine the amount and speed of material ejected from the asteroid, and whether DART produced large chunks of debris or 'merely' fine dust. Those details will help scientists understand how spacecraft can alter an asteroid's orbit, and potentially improve Earth's defenses against wayward cosmic rocks.


Original Submission

posted by janrinok on Friday October 07 2022, @12:07PM   Printer-friendly
from the I'm-really-sorry-for-your-ageing-head dept.

Biomarkers used to track benefits of anti-ageing therapies can be misleading:

We all grow old and die, but we still don't know why. Diet, exercise and stress all affect our lifespan, but the underlying processes that drive ageing remain a mystery. Often, we measure age by counting our years since birth, and yet our cells know nothing of chronological time—our organs and tissues may age more rapidly or slowly than what we'd expect from counting the number of orbits we take around the sun.

For this reason, many scientists seek to develop methods to measure the "biological age" of our cells, which can be different from our chronological age. In theory, such biomarkers of ageing could provide a measure of health that could revolutionize how we practice medicine. Individuals could use a biomarker of ageing to track their biological age over time, measure the effect of diet, exercise, and drugs, and predict whether they will extend lifespan or improve quality of life. Medicines could be designed and identified based on their effect on biological age. In other words, we could start to treat ageing itself.

However, no accurate and highly predictive test for biological age has been validated to date. In part, this is because we still don't know what causes ageing and so can't measure it. Definitive progress in the field will require validating biomarkers throughout a patient's lifetime, an impractical feat given human life expectancy.

[...] Describing their results in the journal PLOS Computational Biology, the research team found that nematodes have at least two partially independent ageing processes taking place at the same time – one that determines VMC [vigorous movement cessation] and another that determines time of death. While the two processes follow different trajectories, their rates are correlated: in individuals for whom VMC occurred at an accelerated rate, death also came earlier, and vice versa. In other words, the study revealed that each individual nematode has at least two distinct biological ages.

[...] The researchers also found that no matter which lifespan-altering mutations and interventions they gave the nematodes, the statistical correlation between the distinct biological ages remained constant. This suggests the existence of an invisible chain of command – or hierarchical structure – that regulates the worm's ageing processes, the mechanisms of which are yet to be discovered. This means that, while ageing processes can be independent, it is also true that some individuals are 'fast agers' and others 'slow agers', in that many of their ageing processes move similarly faster or slower than their peers.
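As a rough illustration of that picture (a toy model of my own, not the paper's code or data), the sketch below gives each simulated worm one shared ageing-rate factor plus independent noise for the two processes. The two event times stay distinct, yet end up correlated across individuals, which is the fast-ager/slow-ager pattern described above.

    import random
    import statistics   # statistics.correlation requires Python 3.10+

    random.seed(0)
    vmc_times, death_times = [], []
    for _ in range(1000):
        rate = random.lognormvariate(0, 0.3)                 # shared fast-/slow-ager factor
        vmc = 8 / (rate * random.lognormvariate(0, 0.2))     # days until vigorous movement stops (arbitrary scale)
        death = 15 / (rate * random.lognormvariate(0, 0.2))  # days until death (arbitrary scale)
        vmc_times.append(vmc)
        death_times.append(death)

    r = statistics.correlation(vmc_times, death_times)
    print(f"Correlation between VMC age and lifespan across individuals: {r:.2f}")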

The findings have implications for consumers being offered commercial products that assess their biological age. [...]

According to Dr. Stroustrup, the solution lies in finding biomarkers that measure distinct, interacting ageing processes that also minimally correlate with each other. "Biomarkers used to assess biological age can be changed without actually turning a 'fast ager' into 'slow ager'. Researchers should focus on measuring the effect of interventions on functional outcomes rather than assuming that changes in biomarkers will predict outcomes in a straightforward way," he concludes.

Journal Reference:
Natasha Oswal, Olivier M. F. Martin, Sofia Stroustrup, et al. A hierarchical process model links behavioral aging and lifespan in C. elegans [open], PLOS Computational Biology, 2022. DOI: 10.1371/journal.pcbi.1010415


Original Submission

posted by janrinok on Friday October 07 2022, @09:22AM   Printer-friendly
from the a-bus-factor-of-one dept.

The New Yorker has a non-technical article, The Thorny Problem of Keeping the Internet's Time, about the Network Time Protocol (NTP) from both the software and protocol perspectives. It gives a surprisingly good summary of the background of both, as well as the current situation and the issues holding back the next steps. If you have networked computers, especially servers, in any capacity, then you are certainly familiar with NTP or at least its supporting utilities. NTP was developed by David Mills, who by the late 1970s, after his PhD, had ended up at COMSAT, where he started working on it for ARPANET. He still works on it despite failing eyesight.

In N.T.P., Mills built a system that allowed for endless tinkering, and he found joy in optimization. "The actual use of the time information was not of central interest," he recalled. The fledgling Internet had few clocks to synchronize. But during the nineteen-eighties the network grew quickly, and by the nineties the widespread adoption of personal computers required the Internet to incorporate millions more devices than its first designers had envisioned. Coders created versions of N.T.P. that worked on Unix and Windows machines. Others wrote "reference implementations" of N.T.P.—open-source codebases that exemplified how the protocol should be run, and which were freely available for users to adapt. Government agencies, including the National Institute of Standards and Technology (NIST) and the U.S. Naval Observatory, started distributing the time kept by their master clocks using N.T.P.

A loose community of people across the world set up their own servers to provide time through the protocol. In 2000, N.T.P. servers fielded eighteen billion time-synchronization requests from several million computers—and in the following few years, as broadband proliferated, requests to the busiest N.T.P. servers increased tenfold. The time servers had once been "well lit in the US and Europe but dark elsewhere in South America, Africa and the Pacific Rim," Mills wrote, in a 2003 paper. "Today, the Sun never sets or even gets close to the horizon on NTP." Programmers began to treat the protocol like an assumption—it seemed natural to them that synchronized time was dependably and easily available. Mills's little fief was everywhere.

NTP servers keep the world's computers' clocks in synchrony, but a negligible amount of money has been kicked upstream to the project or even to Mills. Poul-Henning Kamp (PHK) gave a 2015 FOSDEM talk, Ntimed, an NTPD replacement, about where he saw things heading and why refactoring NTPd would be neither time nor resource efficient.
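The article treats NTP mostly as history, but the protocol itself is easy to poke at. The sketch below sends a minimal SNTP request (the simplified subset of NTP) using only the Python standard library and prints the server's clock next to the local one; pool.ntp.org is just an example of a public time server, not something named in the article.

    import socket
    import struct
    import time

    # Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
    NTP_EPOCH_OFFSET = 2208988800

    def sntp_time(server="pool.ntp.org", port=123, timeout=5.0):
        # A client-mode SNTP request is a 48-byte packet whose first byte
        # encodes leap indicator = 0, version = 3, mode = 3 (client) -> 0x1b.
        packet = b"\x1b" + 47 * b"\x00"
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            sock.sendto(packet, (server, port))
            response, _ = sock.recvfrom(48)
        # The server's transmit timestamp (seconds field) sits at bytes 40-43.
        transmit_seconds = struct.unpack("!I", response[40:44])[0]
        return transmit_seconds - NTP_EPOCH_OFFSET

    if __name__ == "__main__":
        server_time = sntp_time()
        print("NTP server time :", time.ctime(server_time))
        print("Local clock time:", time.ctime(time.time()))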

Previously:
(2015) New Attacks on Network Time Protocol can Defeat HTTPS and Create Chaos
(2015) Finance, Workload Troubles for Developer of Reference NTP Implementation
(2015) OpenNTPD 5.7p1 Released
(2014) What Time Is It? Time for Multiple NTP Vulnerabilities!


Original Submission

posted by janrinok on Friday October 07 2022, @06:36AM   Printer-friendly
from the about-time-too dept.

Uber's Former Security Chief Convicted of Covering Up 2016 Data Breach:

The firm's former chief information security officer was found guilty of hiding a massive data breach from federal investigators.

A federal jury has convicted Uber's former security chief of charges related to a 2016 cover-up involving the ride-share giant, according to journalists present in the courtroom.

Joe Sullivan, who was found guilty of one count of obstruction and one count of misprision of a felony on Wednesday, helped to conceal a massive 2016 data breach from authorities, while also obstructing a Federal Trade Commission investigation.

[...] Federal prosecutors alleged that Sullivan subsequently attempted to "conceal, deflect, and mislead the Federal Trade Commission about the breach." Sullivan's charges stem from the cover-up, not from paying the hackers; the latter practice has become increasingly common in the cybersecurity industry in recent years.

The case has decidedly split those in the cybersecurity community. The New York Times reports that this could be the first time that a security executive was held liable for a hacking incident in this way. The episode could ultimately set a new precedent for future cases in which CISOs must face legal consequences over data breaches.


Original Submission

posted by janrinok on Friday October 07 2022, @03:52AM   Printer-friendly

Samsung announces 36 Gbps GDDR7 memory standard, aims to release V-NAND storage solutions with 1000 layers by 2030

The new 36 Gbps GDDR7 standard offers 50% improved speeds over the current 24 Gbps GDDR6X one from Micron. Peak GDDR7 bandwidth could reach 1.7 TB/s with a 384-bit bus. Samsung also plans to release 32 Gb DDR5 chips this year, and envisions a future where 1000-layer V-NAND storage could be possible by 2030.
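A quick sanity check of the figures quoted above, using only the numbers in the summary rather than any vendor datasheet: 36 Gbps per pin is 50% faster than 24 Gbps, and across a 384-bit bus it comes to roughly 1.7 TB/s.

    # Back-of-the-envelope check of the quoted GDDR7 figures.
    gddr6x_rate_gbps = 24      # per-pin data rate of current GDDR6X
    gddr7_rate_gbps = 36       # per-pin data rate of the announced GDDR7
    bus_width_bits = 384       # bus width of a high-end graphics card

    speedup = (gddr7_rate_gbps - gddr6x_rate_gbps) / gddr6x_rate_gbps
    peak_bandwidth_tb_s = gddr7_rate_gbps * bus_width_bits / 8 / 1000

    print(f"Per-pin speedup: {speedup:.0%}")                   # 50%
    print(f"Peak bandwidth : {peak_bandwidth_tb_s:.3f} TB/s")  # ~1.728 TB/s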

[...] Furthermore, the 8.5 Gbps LPDDR5X DRAM solutions for mobile phones and ultrabooks are also expected to see increased adoption throughout the coming year.

The latest graphics cards from Nvidia, AMD, and Intel use GDDR6 or GDDR6X memory.

24 Gb DDR5 chips have already been announced as a stopgap between 16 Gb and 32 Gb, enabling memory modules with unusual capacities, e.g. 48 GiB instead of 32 or 64.

The 3D NAND currently in use by the industry has around 176 to 232 layers, so reaching 1000 layers could lead to quintupled SSD capacities.

See also: Samsung reports the first Q3 profit drop in three years


Original Submission

posted by hubie on Friday October 07 2022, @01:11AM   Printer-friendly
from the can't-you-just-3D-print-them? dept.

The first step will be figuring out the extent of the damage and then the difficulties really begin:

Until Russia's invasion of Ukraine, the Nord Stream 1 and 2 gas pipelines were a key part of Europe's energy infrastructure. In the fourth quarter of 2021, the Nord Stream lines supplied 18% of all Europe's gas imports. [...]

Since then, Nord Stream has become a geopolitical pawn as Russia has retaliated for economic sanctions imposed upon it after the invasion. [...]

Then, in late September, unexpected damage caused four leaks in the subsea pipeline system. Everyone except Russia believes it's sabotage by the pariah state as it attempts to squeeze supplies ahead of a tricky winter energy shortage in Europe, where countries are already planning to cut back on energy use.

[...] What we do know is that any mission will be an unprecedented challenge for the oil and gas sector, requiring complex robotics and imaginative engineering.

And while we don't even know for sure how bad the situation is, the damage is expected to be significant: the September 26 blasts believed to have caused the pipeline ruptures registered 2.2 on the Richter scale, according to the Swedish National Seismic Network. [...]

No matter who did it, it was deliberate, says van den Beukel. "These pipelines normally simply don't break down," he says. The steel Nord Stream pipes are 1.6 inches thick, with up to another 4.3 inches of concrete wrapped around them. Each of the 100,000 or so sections of the pipeline weighs 24 metric tons.

The repairs themselves would not be easy. There are a number of options, says Ribet. The first is to replace the damaged sections of the pipe in their totality—though that's the costliest. "You need the same diameter, the same kind of steel grade, and so on," he says. And you need to bring shipborne cranes that are strong enough to lift the heavy pipe segments out of the water.

The second repair option would be to install a clamp that covers the damaged sections of the pipe, essentially patching the ruptured areas. However, with an internal diameter of 1.153 meters, the Nord Stream pipelines would require huge clamps, as well as the temporary installation of an underwater caisson, a watertight chamber that would encase the section of pipeline so that engineers could work within it.

Marin believes this would be "the easiest solution." However, he adds, it would take months to procure a clamp big enough to encase the pipeline. This method also won't work if there turns out to be extensive damage, because it's not feasible to build clamps big enough to cover significant holes. A third option is a composite repair that mixes the two methods: replace the worst-damaged elements of the pipeline, and clamp those that are less affected.

Ribet suggests one potentially less likely fourth option: building and installing a new pipeline section that could bypass the damaged sections, which would be left in place. Russian analysts also note that one of Nord Stream's four individual pipelines appears not to have been affected, meaning it could continue to deliver gas, albeit at a lower rate.

[...] Asked if we've ever seen a subsea problem on this scale before, van den Beukel has a simple answer: "No. When you talk sabotage, it's usually onshore and on a much smaller scale," he says. "I can't think of anything similar to this—ever."


Original Submission

posted by hubie on Thursday October 06 2022, @10:22PM   Printer-friendly
from the chess-is-a-wearable-computing-man's-game? dept.

The American chess grandmaster at the centre of the sport's biggest scandal has been accused of cheating more than 100 times on a major online platform:

Hans Niemann, 19, has been the talk of the chess world after five-time world champion Magnus Carlsen accused him of cheating during their Sinquefield Cup game in September.

Niemann denied he cheated and even offered to play naked to prove he was clean, but did admit to cheating twice in his life, aged 12 and 16.

However, major website Chess.com has released a 72-page report that highlights more than 100 games where the platform believes Niemann cheated.

The games are dated between July 2015 and August 2020.

[...] "We present evidence in this report that Hans likely cheated online much more than his public statements suggest," the report states.

[...] Niemann categorically denied that he cheated in the Sinquefield Cup, which was played over-the-board and not online.

Chess.com in its report said it had no evidence that Niemann had ever cheated over-the-board or in his game against Carlsen.

However, the report said Chess.com found aspects of the Sinquefield Cup game "suspicious".

[...] Chess.com has said it has no evidence to suggest Niemann had cheated since 2020, after his ban was lifted.

"Our investigation has revealed that while there has been some noteworthy online play that has caught our attention as suspicious since August 2020, we are unaware of any evidence that Hans has engaged in online cheating since then," the report said.

"Our investigation has concluded that he did, however, cheat much more than he has publicly admitted to, including in many prize events, at least 25 streamed games, and 100+ rated games on Chess.com, as recently as when he was 17 years old."

Niemann has not commented on the report.

Oct. 2022 Final H. Niemann Report.pdf:


Original Submission #1 | Original Submission #2

posted by hubie on Thursday October 06 2022, @07:37PM   Printer-friendly
from the my-old-XF86Config-fears-realized! dept.

A patch is already available:

PSA: Users running Linux on laptops with Intel processors should avoid Linux Kernel 5.19.12 due to an error that might physically harm the display. Fortunately, kernel 5.19.13 has already fixed the issue. Versions 6.0 and 6.1 have also begun rolling out with many significant changes.

Recent reports from Intel laptop users running Linux Kernel 5.19.12 describe "white flashing" on their screens. A Linux engineer found that the issue could ruin the LCD, urging users to immediately roll back to an earlier iteration. The critical flaw prompted developers to issue a quick update.

The problem appears to originate from a faulty Intel graphics driver, which Linux kernel engineer Ville Syrjälä describes as a bad panel power sequencing delay. Greg Kroah-Hartman, the developer who released 5.19.13, said that users should only upgrade to the new kernel if they're experiencing this issue.

[...] Most Linux users likely have to wait until kernel 5.19.13 is available for their specific distro. The engineers examining the LCD problem didn't say whether the newly-released kernels 6.0 and 6.1 also include fixes for the issue.

Released for most major distros this week, Linux Kernel 6.0 supports the newest hardware architectures, including Raptor Lake, Meteor Lake, Arc Alchemist, and RDNA 3. [...]

Kernel 6.1 closes a significant Bluetooth security hole and makes the first steps towards supporting the Rust programming language, which Google uses to develop Android. [...]


Original Submission

posted by hubie on Thursday October 06 2022, @04:49PM   Printer-friendly
from the cloud-will-bring-me-some-security dept.

Most developers aren't particularly good at building authorization into their applications, but would they trust a third-party provider like Oso?:

It's increasingly evident that for security to work, security must be baked into the development process — not a bolt-on afterthought that a dedicated security team manages. This newfound appreciation for developers' roles in security has given rise to things like DevSecOps as well as open source projects like Oso.

Oso, which just announced today the general availability of Oso Cloud, offers an open source policy engine for authorization that represents security as code so developers can express security as a natural extension of their applications.

[...] Authorization is hard to get right, and while crucially important, it's not necessarily central to anyone's business. As such, authorization tends to be something that every company requires yet often goes about in ineffective ways. Arguably, it's time we stop thinking about authorization, or security in general, as an off-the-shelf product that someone can buy, and more about a new model or mindset that developers must apply.

[...] This brings us to authorization. Authorization has so far evaded becoming a third-party service offering, largely because no one has been able to make it generic enough to be broadly relevant while still being flexible enough to be useful. Oso thinks it has cracked that code.

[...] Some developers, Neray said, may have heard of RBAC or ABAC. More cutting-edge developers may have heard of Google's Zanzibar. None of these really handle the core problem. What does work, Neray continued, is to think of authorization as composed of three core abstractions — logic, data and enforcement — and "once you understand how each of them works, you can build (or adopt) structured solutions that let you bend authorization to your will."

In practice, this means it's a bit like SQL, where if you put your data in a standard format and give it a schema, you can then query it arbitrarily. In a similar manner, in Oso you put your authorization data in a standard format, write arbitrarily simple or complex authorization logic, and then can ask any question you want.
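To make the logic/data/enforcement split concrete, here is a minimal toy sketch. The class names and rule table are hypothetical, and this is not Oso's actual API or its policy language; it only illustrates keeping declarative rules (logic) separate from application objects (data) behind a single query point (enforcement).

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class User:              # authorization *data*: who the actor is
        name: str
        roles: frozenset

    @dataclass(frozen=True)
    class Document:          # authorization *data*: what the resource is
        owner: str

    # Authorization *logic*: declarative rules kept apart from application code.
    RULES = {
        ("read", Document): lambda user, doc: "viewer" in user.roles or doc.owner == user.name,
        ("edit", Document): lambda user, doc: doc.owner == user.name or "admin" in user.roles,
    }

    # *Enforcement*: one query point the application calls everywhere.
    def is_allowed(user, action, resource):
        rule = RULES.get((action, type(resource)))
        return bool(rule and rule(user, resource))

    alice = User("alice", frozenset({"viewer"}))
    doc = Document(owner="bob")
    print(is_allowed(alice, "read", doc))   # True  (alice has the viewer role)
    print(is_allowed(alice, "edit", doc))   # False (not the owner, not an admin)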

[...] But really, it comes down to whether a little bit of trust is worth the removal of a lot of bother from your application infrastructure. As Oso co-founder and CTO Sam Scott stressed: "Our vision is to decrease the amount of time and brain calories that developers spend thinking about authorization by 10x in the next 10 years."


Original Submission

posted by hubie on Thursday October 06 2022, @02:02PM   Printer-friendly
from the I-am-a-camera dept.

The future heart of the Vera C. Rubin Observatory will soon make its way to Chile:

The world's largest camera sits within a nondescript industrial building in the hills above San Francisco Bay.

If all goes well, this camera will one day fit into the heart of the future Vera C. Rubin Observatory in Chile. For the last seven years, engineers have been crafting the camera in a clean room at the SLAC National Accelerator Laboratory in Menlo Park, Calif. In May 2023, if all goes according to plan, the camera will finally fly to its destination, itself currently under construction in the desert highlands of northern Chile.

[...] "We're at the stage where we've got all the camera's mechanisms fully assembled," says Hannah Pollek, a staff engineer at SLAC.

Any typical camera needs a lens, and this camera is certainly no exception. At 1.57 meters (5 feet) across, this lens is the world's largest, as recognized by the Guinness Book of World Records. When it's installed, it will catch light reflected through a triplet of mirrors, built separately.

In action, the telescope will point at a parcel of sky, 3.5 degrees across—in other words, seven times the width of the full moon. The camera will take two exposures, back-to-back, approximately 15 seconds each—bracketed by the sweeping of a colossal shutter. Then, the telescope will move along to the next parcel, and so forth, in a mission to survey the southern sky for years on end.

Behind the lens sit the detectors, which are fashioned from charge-coupled device (CCD) sensors, common in astronomy. With the lens cap removed, the detectors are visible as a silver-and-blue grid, the different colors being a consequence of the camera having two different suppliers. Together, they can construct images that are as large as 3.2 gigapixels.

[...] If all goes well with the last phase of construction, this camera will soon depart California for Chile and catch its first glimpse of the night sky by 2024.


Original Submission

posted by martyb on Thursday October 06 2022, @11:16AM   Printer-friendly

Nobel Prize in Physics won for quantum entanglement findings:

Physicists Alain Aspect, John Clauser and Anton Zeilinger were awarded the Nobel Prize in Physics this week for performing breakthrough quantum entanglement experiments.

Quantum entanglement is a phenomenon in which a group of particles share a quantum state even when they are physically separate over some distance. Measuring the momentum, spin, or polarization of one particle instantaneously affects and determines the state of other entangled particles in the same system.

The nature of entanglement was fiercely debated among physicists. Some thought information could not travel faster than light and that there must be some other process affecting the particles in the system, while others believed the weird phenomenon showed a breakdown in classical physics, paving the way for quantum mechanics.

[...] In 1964, John Stewart Bell came up with a theoretical framework that tests if the entanglement effects were due to some hidden variables affecting the entangled particles. Bell's inequalities describe the mathematical constraints an entangled system must obey if it is affected by these local hidden variables.
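As a concrete example of what a Bell test measures, the sketch below evaluates the CHSH combination of correlations for polarization-entangled photons at the standard analyzer angles. This is an illustrative calculation of the textbook quantum prediction, not code from the laureates' experiments: quantum mechanics gives a value of about 2.83, while any local-hidden-variable model is bounded by 2, and that violation is what Clauser and Aspect observed.

    import math

    def corr(a, b):
        # Quantum prediction for the correlation between polarization
        # measurements at analyzer angles a and b (radians) on entangled photons.
        return math.cos(2 * (a - b))

    # Standard CHSH analyzer settings: 0, 45, 22.5 and 67.5 degrees.
    a, a_prime = 0.0, math.pi / 4
    b, b_prime = math.pi / 8, 3 * math.pi / 8

    S = corr(a, b) - corr(a, b_prime) + corr(a_prime, b) + corr(a_prime, b_prime)

    print(f"Quantum prediction S = {S:.3f}")   # about 2.828, i.e. 2*sqrt(2)
    print("Local hidden-variable bound: |S| <= 2")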

[...] Clauser, 79, and Aspect, 75, performed the initial experiments proving that entangled particles violated Bell's inequalities in separate projects conducted in the US and France. Zeilinger, 77, later applied the results in other experiments demonstrating other entanglement-related effects such as quantum teleportation of a qubit.

"It has become increasingly clear that a new kind of quantum technology is emerging," Anders Irbäck, chair of the Nobel Committee for Physics, said on Tuesday. "We can see that the laureates' work with entangled states is of great importance, even beyond the fundamental questions about the interpretation of quantum mechanics."


Original Submission

posted by Fnord666 on Thursday October 06 2022, @08:27AM   Printer-friendly
from the we-can't-mod-up-together-with-suspicious-minds dept.

A person's distrust in humans predicts they will have more trust in artificial intelligence's ability to moderate content online:

A person's distrust in humans predicts they will have more trust in artificial intelligence's ability to moderate content online, according to a recently published study. The findings, the researchers say, have practical implications for both designers and users of AI tools in social media.

"We found a systematic pattern of individuals who have less trust in other humans showing greater trust in AI's classification," said S. Shyam Sundar, the James P. Jimirro Professor of Media Effects at Penn State. "Based on our analysis, this seems to be due to the users invoking the idea that machines are accurate, objective and free from ideological bias."

The study, published in the journal New Media & Society, also found that "power users," who are experienced users of information technology, had the opposite tendency. They trusted the AI moderators less because they believe that machines lack the ability to detect the nuances of human language.

[...] "One of the reasons why some may be hesitant to trust content moderation technology is that we are used to freely expressing our opinions online. We feel like content moderation may take that away from us," said Maria D. Molina, an assistant professor of communication arts and sciences at Michigan State University, and the first author of this paper. [...]

"A major practical implication of the study is to figure out communication and design strategies for helping users calibrate their trust in automated systems," said Sundar, who is also director of Penn State's Center for Socially Responsible Artificial Intelligence. "Certain groups of people who tend to have too much faith in AI technology should be alerted to its limitations and those who do not believe in its ability to moderate content should be fully informed about the extent of human involvement in the process."

Journal Reference:
Molina, M. D., & Sundar, S. S. (2022). Does distrust in humans predict greater trust in AI? Role of individual differences in user responses to content moderation. New Media & Society, 2022. 10.1177/14614448221103534


Original Submission