


posted by janrinok on Friday February 09, @10:04PM   Printer-friendly
from the somebody-will-always-pay dept.

Arthur T Knackerbracket has processed the following story:

Scalpers have taken to Apple's new headset, but there's currently plenty of supply.

When Apple Vision Pro launched late last week, there were two main topics of conversation. The first is all of the things it can do and how well it can do them. The other is the price: it starts at $3,499 with 256GB of storage and goes up from there. That's a lot of money, but there's actually someone trying to charge more than Apple: scalpers. They're often trying to start around $4,000, with some asking for as much as $10,000 in an attempt to make extra cash.

Scalpers have unfortunately become a fixture of major technology launches. Remember the PlayStation 5 shortages that started in 2020? Those didn't resolve until just last year. Or what about graphics cards during the early pandemic? Those all went on third-party marketplaces as scalpers and the bots they employ served as unwanted middlemen for financial gain.

But with Vision Pro, that doesn't seem to be working. When I went to my local Apple Store on the evening of the launch for the demo experience, the specialist who gave me the demo told me that if I wanted the 512GB or 1TB models, I could get one immediately. That was right before the store closed. As I write this, I could get a 256GB model from Apple and pick it up tomorrow at a store near my office or the one closest to my home. Others are available this week. Shipping might take a bit more time, as it would arrive closer to the end of the month. And yet, scalpers are taking to eBay for a premium. Why would you do that when you could get it from the manufacturer?

"Well, that's the beauty of open markets and speculation," Ramon T. Llamas, a research director with the analysis firm IDC’s devices and displays team, told Tom’s Hardware. But the Vision Pro market is a bit different than recent tech scalping. For starters, Llamas points out, a lot of people are still trying to figure out what they're going to use a Vision Pro for. The PlayStation 5 has a very defined use case, which is part of why it was so in demand. Others may be waiting for later generations of the product and let early adopters work out the kinks. 

"It's easy to see there is some interest out there for this device, but when you're competing against the supplier itself, Apple, with a very fixed price and everything — a very public price — and… ample supply on hand, you're going to dive into some limitations," Llamas said. Which is to say, when I open eBay and Facebook Marketplace, I'm seeing a lot of listings.

The difficulty is compounded by the degree of customization involved in buying a Vision Pro. It requires two scans from an iPhone or iPad with Apple's Face ID. These measurements decide which size straps should come in the box, as well as which size light shield will fit your face.

Some sellers list the size they bought (presumably, revealing the size of their noggin in the process); in other cases, you may go in blind on the sizing. As long as there's stock in an Apple Store near you, it makes far more sense, for $3,499.99, to go get it fitted to your own head. The idea that someone would want to buy an ill-fitting Vision Pro for more money doesn't make much sense, especially because they might end up going to Apple anyway and shelling out $199 for a new light seal and cushions or $99 for a new headband.

[...] Apple didn't respond to a request for comment. We'll update if we hear back.


Original Submission

posted by janrinok on Friday February 09, @05:18PM   Printer-friendly
from the d'oh! dept.

https://arstechnica.com/space/2024/02/humanitys-most-distant-space-probe-jeopardized-by-computer-glitch/

Voyager 1 is still alive out there, barreling into the cosmos more than 15 billion miles away. However, a computer problem has kept the mission's loyal support team in Southern California from knowing much more about the status of one of NASA's longest-lived spacecraft.

The computer glitch cropped up on November 14, and it affected Voyager 1's ability to send back telemetry data, such as measurements from the spacecraft's science instruments or basic engineering information about how the probe was doing. [...] "It would be the biggest miracle if we get it back. We certainly haven't given up," said Suzanne Dodd, Voyager project manager at NASA's Jet Propulsion Laboratory, in an interview with Ars. "There are other things we can try. But this is, by far, the most serious since I've been project manager."

Dodd became the project manager for NASA's Voyager mission in 2010, overseeing a small cadre of engineers responsible for humanity's exploration into interstellar space. Voyager 1 is the most distant spacecraft ever, speeding away from the Sun at 38,000 mph (17 kilometers per second). [...] The latest problem with Voyager 1 lies in the probe's Flight Data Subsystem (FDS), one of three computers on the spacecraft working alongside a command-and-control central computer and another device overseeing attitude control and pointing. [...] In November, the data packages transmitted by Voyager 1 manifested a repeating pattern of ones and zeros as if it were stuck, according to NASA. Dodd said engineers at JPL have spent the better part of three months trying to diagnose the cause of the problem. She said the engineering team is "99.9 percent sure" the problem originated in the FDS, which appears to be having trouble "frame syncing" data. [...] "It's likely somewhere in the FDS memory," Dodd said. "A bit got flipped or corrupted. But without the telemetry, we can't see where that FDS memory corruption is."

[...] "We have sheets and sheets of schematics that are paper, that are all yellowed on the corners, and all signed in 1974," Dodd said. "They're pinned up on the walls and people are looking at them. That's a whole story in itself, just how to get to the information you need to be able to talk about the commanding decisions or what the problem might be." [...] "It is difficult to command Voyager," Dodd said. "We don't have any type of simulator for this. We don't have any hardware simulator. We don't have any software simulator... There's no simulator with the FDS, no hardware where we can try it on the ground first before we send it. So that makes people more cautious, and it's a balance between getting commanding right and taking risks."

[...] The spacecraft's vast distance and position in the southern sky require NASA to use the largest 230-foot (70-meter) antenna at a Deep Space Network tracking site in Australia, one of the network's most in-demand antennas.

"The data rates are very low, and this anomaly causes us not to have any telemetry," Dodd said. "We're kind of shooting in the blind a little bit because we don't know what the status of the spacecraft is completely."
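Part of why diagnosing the FDS takes so long is simple geometry: at Voyager 1's distance, every command-and-response cycle takes nearly two days. A quick back-of-the-envelope check, using the article's figure of 15 billion miles and standard conversion constants:

```python
# One-way light time to Voyager 1, using the ~15 billion mile
# distance quoted in the article.
MILES_TO_KM = 1.609344
SPEED_OF_LIGHT_KM_S = 299_792.458

distance_km = 15e9 * MILES_TO_KM
one_way_hours = distance_km / SPEED_OF_LIGHT_KM_S / 3600

print(f"One-way light time: {one_way_hours:.1f} hours")          # about 22.4 hours
print(f"Command + reply round trip: {2 * one_way_hours:.1f} hours")  # about 44.7 hours
```

So each "try it and see" attempt at poking the FDS costs the team roughly two days before any telemetry (or lack of it) comes back.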

Previously on SoylentNews:
Engineers Work to Fix Voyager 1 Computer - 20231215

Related stories on SoylentNews:
NASA Back in Touch With Voyager 2 After 'Interstellar Shout' - 20230808
The Farthest-away Pictures of Earth Ever Taken - 20230423
NASA Fixed the Glitch that Caused Voyager 1 to Send Back Jumbled Data - 20220902
Record-Breaking Voyager Spacecraft Begin to Power Down - 20220721
Engineers Investigating NASA's Voyager 1 Telemetry Data - 20220522
NASA Calls Voyager 2, and the Spacecraft Answers from Interstellar Space - 20201104
Voyager Spacecraft Detect an Increase in the Density of Space Outside the Solar System - 20201021
Five NASA Spacecraft That are Leaving Our Solar System for Good - 20200930
Revisiting Decades-Old Voyager 2 Data, Scientists Find One More Secret About Uranus - 20200327
Voyager 1's Pale Blue Dot - 20200215
Voyager 2 Returning to Normal Operation - 20200202
Future Stellar Flybys of the Voyager and Pioneer Spacecraft - 20190522
Voyager 2 Reaches Interstellar Space - 20190418
NASA Announces That Voyager 2 Has Exited the Heliosphere - 20181210
Voyager 1 Fires Up Trajectory Correction Maneuver Thrusters for the First Time in 37 Years - 20171204
NASA's Voyager Mission Turns 40 - 20170820
The Oldest Computer (Not) on Earth - 20170107
Voyager 1 Rides 'Tsunami Wave' in Interstellar Space - 20141218
Reviewing Voyager Data Leads to New Discoveries - 20141114

Search for voyager on SoylentNews:
https://soylentnews.org/search.pl?threshold=0&query=Voyager&op=stories&sort=2


Original Submission

posted by janrinok on Friday February 09, @12:37PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

AWS could rake in between $400 million and $1 billion a year from charging customers for public IPv4 addresses while migration to IPv6 remains slow.

The cloud computing kingpin signaled last year that it would start charging customers for public IPv4 addresses from February 1, as covered by The Register at the time.

AWS cited increasing scarcity and claimed the cost to acquire a single public IPv4 address for customer use had risen more than 300 percent over the past few years.

Fortunately, the charge is hardly ruinous – $0.005 (half a cent) per IP address per hour, which equates to a total cost of $43.80 per year for each public IPv4 address you have – excluding any IP addresses that you might own and opt to bring to AWS using Amazon's BYOIP (Bring Your Own IP) service.

However, a technologist has done the calculations and estimated that across all users, this will add up to a sum of between $400 million and $1 billion a year for AWS. Not bad for something that was being offered completely free just a few days ago (and is still offered for 750 hours a month at no cost in the AWS free tier).

The source of the billion-dollar claim is Andree Toonk, founder and CEO of network services biz Border0, who is presumably trying to generate business for his own company.

Toonk used Amazon's own IP address range data to estimate that the cloud colossus has at least 131,932,752 IPv4 addresses. Based on the average price for an IPv4 address being about $35 at the time of writing, this means that AWS is sitting on about $4.6 billion, should it wish to divest itself of them.

He also used a script to ping all of the IPv4 addresses in order to gauge how many were "alive" within the AWS network and came up with an answer of about 6 million. But many instances on AWS will have a security policy to not respond to a ping packet, so the actual number of active IPv4 addresses could be double that.

Even with just those six million addresses, that's $262.8 million AWS will earn from charging for IPv4 in a year.

He forecast the headline $400 million to $1 billion figure by projecting a "conservative" estimate that between 10 percent and 30 percent of the IPv4 addresses (approximately 7.9 million) published in the AWS JSON are used for a year.
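The per-address, six-million-address, and asset-value figures above can be checked with a few lines of arithmetic (rates and counts taken from the article):

```python
# Checking the article's IPv4 arithmetic.
HOURLY_RATE = 0.005            # dollars per public IPv4 address per hour
HOURS_PER_YEAR = 24 * 365      # 8,760

per_address_year = HOURLY_RATE * HOURS_PER_YEAR
print(f"Per address per year: ${per_address_year:.2f}")   # $43.80

active = 6_000_000             # addresses that answered Toonk's ping sweep
print(f"Revenue from active addresses: "
      f"${active * per_address_year / 1e6:.1f}M/yr")      # $262.8M

total = 131_932_752            # addresses in Amazon's published ranges
print(f"Asset value at $35 each: ${total * 35 / 1e9:.1f}B")  # $4.6B
```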

We asked AWS if it recognized any of these figures, and what the company itself estimated it would earn from charging customers for public IPv4 addresses, but it declined to answer, instead referring us to its original blog post disclosing the charges.

The general feeling among industry experts is that this is fair game, and customers should make plans to migrate to IPv6 if they don't like it – assuming their applications allow this, of course.

[...] "My view is that AWS has been smart in buying up IPv4 addresses, and this is a way for it to cash in until IPv6 adoption makes IPv4 redundant. It's just that organizations are not rushing to move to IPv6," he said.

Previously: AWS to Charge Customers for Public IPv4 Addresses From 2024


Original Submission

posted by janrinok on Friday February 09, @07:52AM   Printer-friendly
from the it's-popular-all-of-a-sudden dept.

Everyone knows we should be doing backups. The standard these days is an online backup (too expensive for a full backup; I use it for important, small things) or an external hard drive, but SSDs can lose their data after a few years of not being powered on, and hard drives are complicated mechanical beasts susceptible to their grease hardening, bearings seizing, etc.

The best option if I want long-term backups is to grab good quality Blu-rays and a burner. Is anyone else out there doing this? How are you handling splitting up your data (who only has 32 GB of data these days)? Do you just have a dedicated spot on your hard drive to stage backups before burning, or are there software tricks on modern computers, like in the old days, to burn a single "file" across multiple discs? How far back a backup have you recovered, now that Blu-ray is going on 20 years old?
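On the staging question, one answer is to script the split yourself: carve the staged archive into disc-sized chunks and keep a checksum manifest alongside them, so a chunk that rots on a shelf years later can at least be identified. A minimal Python sketch (the 25 GB figure is single-layer BD-R capacity; the paths and file names are hypothetical):

```python
# Split a staged backup archive into BD-R-sized parts and write a
# sha256 manifest so each part can be verified after restore.
import hashlib
from pathlib import Path

BD_R_BYTES = 25_025_314_816  # usable capacity of a single-layer BD-R

def split_for_discs(archive: Path, out_dir: Path,
                    chunk_size: int = BD_R_BYTES, buf: int = 1 << 20):
    out_dir.mkdir(parents=True, exist_ok=True)
    manifest = []
    with archive.open("rb") as src:
        index = 0
        while True:
            digest = hashlib.sha256()
            written = 0
            part = out_dir / f"{archive.name}.part{index:03d}"
            with part.open("wb") as dst:
                # Copy up to one disc's worth in small buffered reads.
                while written < chunk_size:
                    block = src.read(min(buf, chunk_size - written))
                    if not block:
                        break
                    dst.write(block)
                    digest.update(block)
                    written += len(block)
            if written == 0:
                part.unlink()  # no data left; drop the empty part
                break
            manifest.append((part.name, digest.hexdigest()))
            index += 1
    (out_dir / "MANIFEST.sha256").write_text(
        "\n".join(f"{d}  {n}" for n, d in manifest) + "\n"
    )
    return manifest
```

Burn each part plus the manifest to its own disc; restoring is just concatenating the parts back together and re-checking the hashes.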


Original Submission

posted by janrinok on Friday February 09, @03:05AM   Printer-friendly

https://spectrum.ieee.org/non-line-of-sight-infrared

Just because an object is around a corner doesn't mean it has to be hidden. Non-line-of-sight imaging can peek around corners and spot those objects, but it has so far been limited to a narrow band of frequencies. Now, a new sensor can help extend this technique from working with visible light to infrared. This advance could help make autonomous vehicles safer, among other potential applications.

Non-line-of-sight imaging relies on the faint signals of light beams that have reflected off surfaces in order to reconstruct images. The ability to see around corners may prove useful for machine vision—for instance, helping autonomous vehicles foresee hidden dangers to better predict how to respond to them, says Xiaolong Hu, the senior author of the study and a professor at Tianjin University in Tianjin, China. It may also improve endoscopes that help doctors peer inside the body.

The light that non-line-of-sight imaging depends on is typically very dim, and until now, the detectors that were efficient and sensitive enough for non-line-of-sight imaging could only detect either visible or near-infrared light. Moving to longer wavelengths might have several advantages, such as dealing with less interference from sunshine, and the possibility of using lasers that are safe around eyes, Hu says.

Now Hu and his colleagues have for the first time performed non-line-of-sight imaging using 1,560- and 1,997-nanometer infrared wavelengths. "This extension in spectrum paves the way for more practical applications," Hu says.

In the new study, the researchers experimented with superconducting nanowire single-photon detectors. In each device, a 40-nanometer-wide niobium titanium nitride wire was cooled to about 2 kelvins (about –271 °C), rendering the wire superconductive. A single photon could disrupt this fragile state, generating electrical pulses that enabled the efficient detection of individual photons.

The scientists contorted the nanowire in each device into a fractal pattern that took on similar shapes at various magnifications. This let the sensor detect photons of all polarizations, boosting its efficiency.

The new detector was up to nearly three times as efficient as other single-photon detectors at sensing near- and mid-infrared light. This let the researchers perform non-line-of-sight imaging, achieving a spatial resolution of roughly 1.3 to 1.5 centimeters.

Journal Reference:
Yifan Feng, Xingyu Cui, Yun Meng, Xiangjun Yin, Kai Zou, Zifan Hao, Jingyu Yang, and Xiaolong Hu, "Non-line-of-sight imaging at infrared wavelengths using a superconducting nanowire single-photon detector," Opt. Express 31, 42240-42254 (2023) https://doi.org/10.1364/OE.497802


Original Submission

posted by janrinok on Thursday February 08, @10:21PM   Printer-friendly
from the who-reads-those-anyway dept.

https://arstechnica.com/science/2024/02/trio-wins-700k-vesuvius-challenge-grand-prize-for-deciphering-ancient-scroll/

Last fall we reported on the use of machine learning to decipher the first letters from a previously unreadable ancient scroll found in an ancient Roman villa at Herculaneum—part of the 2023 Vesuvius Challenge. Tech entrepreneur and challenge co-founder Nat Friedman has now announced via X (formerly Twitter) that they have awarded the grand prize of $700,000 for producing the first readable text. The three winning team members are Luke Farritor, Yousef Nader, and Julian Schilliger.

As previously reported, the ancient Roman resort town Pompeii wasn't the only city destroyed in the catastrophic 79 AD eruption of Mount Vesuvius. Several other cities in the area, including the wealthy enclave of Herculaneum, were fried by clouds of hot gas called pyroclastic pulses and flows.

[...] Brent Seales' lab at the University of Kentucky has been working on deciphering the Herculaneum scrolls for many years. He employs a different method of "virtually unrolling" damaged scrolls, which he used in 2016 to "open" a scroll found on the western shore of the Dead Sea, revealing the first few verses from the book of Leviticus. The team's approach combined digital scanning via micro-computed tomography—a noninvasive technique often used for cancer imaging—with segmentation to digitally create pages, augmented with texturing and flattening techniques. Then they developed software (Volume Cartography) to unroll the scroll virtually.

[...] In October, Farritor, a college student and SpaceX intern, successfully read the first text hidden within one of the rolled-up scrolls using a machine-learning model. The achievement snagged him $40,000. Nader, an Egyptian bio-robotics student in Berlin, received a smaller $10,000 First Ink prize for essentially being the second person to decipher letters in a scroll. Schilliger, a Swiss robotics student at ETH Zurich, won three Segmentation Tooling prizes, which enabled 3D mapping of the papyrus.

Schilliger, Farritor, and Nader then formed a "superteam" to create the winning entry, extracting 15 columns of text from inside the carbonized scroll. In addition, there was a three-way tie for runner-up, with each entry winning $50,000 for devising new approaches to the subtleties of ink labeling and sampling: Shao-Qian Mah; Louis Schlessinger and Arefeh Sherafati; and Elian Rafael Dal Prá, Sean Johnson, Leonardo Scabini, Raí Fernando Dal Prá, João Vitor Brentigani Torezan, Daniel Baldin Franceschini, Bruno Pereira Kellm, Marcelo Soccol Gris, and Odemir Martinez Bruno.

[...] The Vesuvius Challenge co-founders thought when they started the challenge that there was less than a 30 percent chance of success within the year, since, at the time, no one had been able to read actual letters inside of a scroll. However, the crowdsourcing approach proved wildly successful. It's still just 5 percent of a single scroll, so Friedman, Seales, and Gross have announced a new challenge for 2024: $100,000 for the first entry that can read 90 percent of the four scrolls scanned thus far.

[...] "We have not yet found the villa's main library, which would have contained a much wider range of Greek and Latin literature," historian Garrett Ryan wrote on the Vesuvius Challenge site. "That library, with its thousands or even tens of thousands of scrolls, must still be buried. If those texts are discovered, and if even a small fraction can still be read, they will transform our knowledge of classical life and literature on a scale not seen since the Renaissance."


Original Submission

posted by hubie on Thursday February 08, @05:34PM   Printer-friendly

https://newatlas.com/medical/neuropathy-chronic-nerve-pain-non-opioid-compound/

Researchers have discovered a non-opioid compound that, in mice, effectively reduced the pain hypersensitivity associated with chronic and often debilitating nerve pain caused by diabetes or chemotherapy drugs. It's opened the door to developing a drug to treat the condition for which existing painkillers do little.

Diabetes, chemotherapy drugs, multiple sclerosis, injuries and amputations have all been associated with neuropathic pain, usually caused by damage to nerves in various body tissues, including the skin, muscles and joints. Mechanical hypersensitivity – or mechanical allodynia – is a major symptom of neuropathic pain, where innocuous stimuli like light touch cause severe pain.

Many available pain medications aren't effective in reducing this often-debilitating type of chronic pain. However, researchers at the University of Texas at Austin (UT Austin), in collaboration with UT Dallas and the University of Miami, may have advanced the treatment of neuropathic pain by discovering a molecule that reduces mechanical hypersensitivity in mice.

"We found it to be an effective painkiller, and the effects were rather long-lived," said Stephen Martin, a co-corresponding author of the study. "When we tested it on different models, diabetic neuropathy and chemotherapy-induced neuropathy, for example, we found this compound has an incredible beneficial effect."

[...] "It's our goal to make this compound into a drug that can be used to treat chronic pain without the dangers of opioids," Martin said. "Neuropathic pain is often a debilitating condition that can affect people their entire lives, and we need a treatment that is well tolerated and effective."

Journal Reference:
Muhammad Saad Yousuf, et al., Highly specific σ2R/TMEM97 ligand FEM-1689 alleviates neuropathic pain and inhibits the integrated stress response, PNAS, 2023. https://doi.org/10.1073/pnas.2306090120


Original Submission

posted by hubie on Thursday February 08, @12:48PM   Printer-friendly

The European Parliament has reached a provisional deal on EU regulations to strengthen consumers' right to repair.

Negotiations over the bill have been ongoing for a while now - the rules were first proposed in March 2022, and the hope is that new requirements will be finalized in 2024.

The thinking is that consumers should be better informed about the lifespan and repairability of products before buying them, and there should be measures to boost repair after the legal guarantee period has expired.

All told, the bill looks set to bolster the repair sector. Manufacturers must make spare parts and tools available for "a reasonable price" and will be prohibited from using contractual clauses or hardware or software blocks to obstruct repairs. While singling no company out specifically, the European Parliament said: "In particular, they should not impede the use of second-hand or 3D-printed spare parts by independent repairers."

Other consumer rights in the deal include the option to borrow a device while one's own is being repaired or to opt for a refurbished unit, an additional one-year extension of the legal guarantee for repaired goods, and free online access to indicative repair prices.

Under the regulation, manufacturers must inform customers of their obligation to repair and are required to fix "common household products," such as washing machines or smartphones. The European Parliament has left open the possibility of adding more items over time.

[...] It will be a while before this all goes into effect, though. Once both Council and Parliament adopt the directive, member states will have 24 months to get it into national law.


Original Submission

posted by hubie on Thursday February 08, @08:03AM   Printer-friendly

https://www.theregister.com/2024/02/07/failed_usb_sticks/

The report, from German data recovery company CBL, concluded that NAND chips from reputable manufacturers such as Hynix, SanDisk, or Samsung that had failed quality control were being resold and repurposed. While still working, the chips' storage capacity is reduced.

"When we opened defective USB sticks last year, we found an alarming number of inferior memory chips with reduced capacity and the manufacturer's logo removed from the chip. Clearly discarded and unrecognizable microSD cards are also soldered onto a USB stick and managed with the external one on the USB stick board instead of the microSD's internal controller," explains Conrad Heinicke, Managing Director of CBL Datenrettung GmbH.

[...] Technological advancements have also affected these NAND chips, but not in a good way. The chips originally used single-level cell (SLC) memory cells that only stored one bit each, offering less data density but better performance and reliability. In order to increase the amount of storage the chips offered, manufacturers started moving to four bits per cell (QLC), decreasing the endurance and retention. Combined with the questionable components, it's why CBL warns that "You shouldn't rely too much on the reliability of flash memory."

[...] It's always wise to be careful when choosing your storage device and beware of offers that seem too good to be true. Back in 2022, a generic 30TB M.2 external SSD was available for about $18 on Walmart's website. It actually held two 512MB SD cards stuck to the board with hot glue – their firmware had been modified to report each one as 15 TB. There was also the case of fake Samsung SSDs with unbelievably slow speeds uncovered last year.
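If you suspect a stick is lying about its capacity, the classic countermeasure (which tools like f3 and h2testw automate) is to fill the drive with deterministic pseudorandom data and read it back: a fake-capacity device that silently wraps or drops writes fails the verification. A minimal sketch of the idea; the mount path is hypothetical, and a real test should fill the whole device:

```python
# Capacity check in the spirit of f3/h2testw: write blocks derived
# deterministically from their index, then read and compare.
import hashlib
from pathlib import Path

BLOCK = 1 << 20  # 1 MiB per test block

def pattern(index: int) -> bytes:
    # Deterministic pseudorandom block: reproducible on readback
    # without having to store the data anywhere else.
    seed = hashlib.sha256(index.to_bytes(8, "big")).digest()
    return (seed * (BLOCK // len(seed) + 1))[:BLOCK]

def verify_capacity(mount: Path, blocks: int) -> int:
    """Write `blocks` test blocks to the drive, read them back,
    and return how many survived the round trip intact."""
    test_file = mount / "capacity_test.bin"
    with test_file.open("wb") as f:
        for i in range(blocks):
            f.write(pattern(i))
    good = 0
    with test_file.open("rb") as f:
        for i in range(blocks):
            if f.read(BLOCK) == pattern(i):
                good += 1
    test_file.unlink()
    return good
```

On an honest drive every block comes back as written; on a relabeled chip the count collapses once you pass the real capacity.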


Original Submission

posted by hubie on Thursday February 08, @03:23AM   Printer-friendly
from the no,-really,-it's-not-true. dept.

Documents show industry-backed Air Pollution Foundation uncovered the severe harm climate change would wreak

"The fossil fuel industry funded some of the world's most foundational climate science as early as 1954, newly unearthed documents have shown, including the early research of Charles Keeling, famous for the so-called "Keeling curve" that has charted the upward march of the Earth's carbon dioxide levels."

A coalition of oil and car manufacturing interests provided $13,814 (about $158,000 in today's money) in December 1954 to fund Keeling's earliest work in measuring CO2 levels across the western US, the documents reveal.

Keeling would go on to establish the continuous measurement of global CO2 at the Mauna Loa Observatory in Hawaii. This "Keeling curve" has tracked the steady increase of the atmospheric carbon that drives the climate crisis and has been hailed as one of the most important scientific works of modern times.

[...] Experts say the documents show the fossil fuel industry had intimate involvement in the inception of modern climate science, along with its warnings of the severe harm climate change will wreak, only to then publicly deny this science for decades and fund ongoing efforts to delay action on the climate crisis.

"They contain smoking gun proof that by at least 1954, the fossil fuel industry was on notice about the potential for its products to disrupt Earth's climate on a scale significant to human civilization," said Geoffrey Supran, an expert in historic climate disinformation at the University of Miami.

[...] "These documents talk about CO2 emissions having planetary implications, meaning this industry understood extraordinarily early on that fossil fuel combustion was profound on a planetary scale," he said.

"There is overwhelming evidence the oil and gas industry has been misleading the public and regulators around the climate risks of their product for 70 years. Trusting them to be part of the solutions is foolhardy. We've now moved into an era of accountability."

Obligatory XKCD


Original Submission

posted by janrinok on Wednesday February 07, @10:34PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Microsoft opened its arms to Linux during the Windows 10 era, inventing an entire virtualized subsystem to allow users and developers to access a real-deal Linux command line without leaving the Windows environment. Now, it looks like Microsoft may embrace yet another Linux feature: the sudo command.

Short for "superuser do" or "substitute user do" and immortalized in nerd-leaning pop culture by an early xkcd comic, sudo is most commonly used at the command line when the user needs administrator access to the system—usually to install or update software, or to make changes to system files. Users who aren't in the sudo user group on a given system can't run the command, protecting the rest of the files on the system from being accessed or changed.

In a post on X, formerly Twitter, user @thebookisclosed found settings for a Sudo command in a preview version of Windows 11 that was posted to the experimental Canary channel in late January. WindowsLatest experimented with the setting in a build of Windows Server 2025, which currently requires Developer Mode to be enabled in the Settings app. There's a toggle to turn the sudo command on and off and a separate drop-down to tweak how the command behaves when you use it, though as of this writing the command itself doesn't actually work yet.

The sudo command is also part of the Windows Subsystem for Linux (WSL), but that version of the sudo command only covers Linux software. This one seems likely to run native Windows commands, though obviously we won't know exactly how it works before it's enabled and fully functional. Currently, users who want a sudo-like command in Windows need to rely on third-party software like gsudo to accomplish the task.

The benefit of the sudo command for Windows users—whether they're using Windows Server or otherwise—would be the ability to elevate the privilege level without having to open an entirely separate command prompt or Windows Terminal window. According to the options available in the preview build, commands run with sudo could be opened up in a new window automatically, or they could happen inline, but you'd never need to do the "right-click, run-as-administrator" dance again if you didn't want to.


Original Submission

posted by janrinok on Wednesday February 07, @05:52PM   Printer-friendly
from the AI-overlords dept.

https://arstechnica.com/information-technology/2024/02/microsoft-in-deal-with-semafor-to-create-news-stories-with-aid-of-ai-chatbot/

Microsoft is working with media startup Semafor to use its artificial intelligence chatbot to help develop news stories—part of a journalistic outreach that comes as the tech giant faces a multibillion-dollar lawsuit from the New York Times.

As part of the agreement, Microsoft is paying an undisclosed sum of money to Semafor to sponsor a breaking news feed called "Signals." The companies would not share financial details, but the amount of money is "substantial" to Semafor's business, said a person familiar with the matter.

[...] The partnerships come as media companies have become increasingly concerned over generative AI and its potential threat to their businesses. News publishers are grappling with how to use AI to improve their work and stay ahead of technology, while also fearing that they could lose traffic, and therefore revenue, to AI chatbots—which can churn out humanlike text and information in seconds.

The New York Times in December filed a lawsuit against Microsoft and OpenAI, alleging the tech companies have taken a "free ride" on millions of its articles to build their artificial intelligence chatbots, and seeking billions of dollars in damages.

[...] Semafor, which is free to read, is funded by wealthy individuals, including 3G Capital founder Jorge Paulo Lemann and KKR co-founder Henry Kravis. The company made more than $10 million in revenue in 2023 and has more than 500,000 subscriptions to its free newsletters. Justin Smith said Semafor was "very close to a profit" in the fourth quarter of 2023.

Related stories on SoylentNews:
AI Threatens to Crush News Organizations. Lawmakers Signal Change Is Ahead - 20240112
New York Times Sues Microsoft, ChatGPT Maker OpenAI Over Copyright Infringement - 20231228
Microsoft Shamelessly Pumping Internet Full of Garbage AI-Generated "News" Articles - 20231104
Google, DOJ Still Blocking Public Access to Monopoly Trial Docs, NYT Says - 20231020
After ChatGPT Disruption, Stack Overflow Lays Off 28 Percent of Staff - 20231017
Security Risks Of Windows Copilot Are Unknowable - 20231011
Microsoft AI Team Accidentally Leaks 38TB of Private Company Data - 20230923
Microsoft Pulls AI-Generated Article Recommending Ottawa Food Bank to Tourists - 20230820
A Jargon-Free Explanation of How AI Large Language Models Work - 20230805
The Godfather of AI Leaves Google Amid Ethical Concerns - 20230502
The AI Doomers' Playbook - 20230418
Ads Are Coming for the Bing AI Chatbot, as They Come for All Microsoft Products - 20230404
Deepfakes, Synthetic Media: How Digital Propaganda Undermines Trust - 20230319


Original Submission

posted by janrinok on Wednesday February 07, @01:07PM   Printer-friendly

https://www.bbc.co.uk/news/science-environment-68172162

Researchers at the world's biggest particle accelerator in Switzerland have submitted proposals for a new, much larger, supercollider. Its aim is to discover new particles that would revolutionise physics and lead to a more complete understanding of how the Universe works. If approved, it will be three times larger than the current giant machine. But its £12bn price tag has raised some eyebrows, with one critic describing the expenditure as "reckless".

The biggest achievement of the Large Hadron Collider (LHC) was the detection of a new particle called the Higgs boson in 2012. But since then, the two holy grails of physics it set out to track down - dark matter and dark energy - have proved elusive, and some researchers believe there are cheaper options. The new machine is called the Future Circular Collider (FCC). Cern's director general, Prof Fabiola Gianotti, told BBC News that, if approved, it will be a "beautiful machine".

[...] The proposal is for the larger FCC to be built in two stages. The first will begin operating in the mid-2040s and will collide electrons together. It is hoped the increased energy will produce large numbers of Higgs particles for scientists to study in detail.

The second phase will begin in the 2070s and require more powerful magnets, so advanced that they have not yet been invented. Instead of electrons, heavier protons will be used in the search for brand new particles.


Original Submission

posted by janrinok on Wednesday February 07, @08:25AM   Printer-friendly

https://www.science.org/content/article/electrifying-new-ironmaking-method-could-slash-carbon-emissions

By extracting metallic iron without producing carbon dioxide, the new process could even be carbon negative, at least for part of the world's iron production

Making iron, the main ingredient of steel, takes a toll on Earth's delicate atmosphere, producing 8% of all global greenhouse gas emissions. Now, a team of chemists has come up with a way to make the business much more eco-friendly. By using electricity to convert iron ore and salt water into metallic iron and other industrially useful chemicals, researchers report today in Joule that their approach is cost effective, works well with electricity provided by wind and solar farms, and could even be carbon negative, consuming more carbon dioxide (CO2) than it produces.

"It's a very clever approach," says Karthish Manthiram, a chemical engineer at the California Institute of Technology who was not involved with the study. He notes that the process has other advantages, including working at a low temperature, and being amenable to working with intermittent renewable electricity. "It checks all the boxes."

Iron is one of the most abundant elements on Earth, but in its natural state it is bound to oxygen in the various minerals that make up iron ore. To extract metallic iron from this ore, workers typically mix it with a high-carbon form of coal called coke and heat the combination to about 1500°C in a blast furnace. At that temperature, the carbon atoms strip the oxygen atoms from the iron, producing CO2 that wafts into the atmosphere and leaves behind the molten metal. Steelmakers then combine this iron with a small amount of carbon and other trace metals to forge steel.

Although this way to make iron and steel is cheap and time tested, it produces significant amounts of CO2. The world mines 2.5 billion tons of iron ore every year, and reducing it to metallic iron emits as much CO2 as the tailpipes of all passenger vehicles combined. So, scientists are looking for economically viable ways to produce metallic iron that don't generate greenhouse gases.
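The scale of those emissions follows directly from the chemistry. A back-of-the-envelope sketch of the overall carbothermic reduction (2 Fe2O3 + 3 C -> 4 Fe + 3 CO2) gives the CO2 released by the reduction step alone; real blast furnaces emit considerably more, since coke is also burned to supply heat.

```python
# Stoichiometry of the blast-furnace reduction described above:
#   2 Fe2O3 + 3 C -> 4 Fe + 3 CO2
# This counts only the CO2 from the reduction chemistry itself.
M_FE, M_C, M_O = 55.845, 12.011, 15.999   # molar masses, g/mol
M_CO2 = M_C + 2 * M_O

fe_mass = 4 * M_FE    # grams of iron per formula unit of reaction
co2_mass = 3 * M_CO2  # grams of CO2 released alongside it

co2_per_ton_iron = co2_mass / fe_mass
print(f"~{co2_per_ton_iron:.2f} t CO2 per t Fe from reduction alone")
```

So even before any fuel is burned for heat, every tonne of iron made this way carries roughly 0.6 tonnes of unavoidable CO2.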

To that end, Paul Kempler, a chemical engineer at the University of Oregon, and colleagues wondered whether an industrial process for making chlorine from saltwater could be repurposed for ironmaking. In this "chlor-alkali" process, water containing sodium chloride is placed in an electrochemical cell resembling a battery that contains two electrodes submerged in a liquid electrolyte. The positively charged electrode, the anode, pulls electrons from chloride ions, causing chlorine atoms to pair up into chlorine gas. At the same time, electrons flowing in from the cathode split water molecules into pieces that pair with the sodium ions and one another to make sodium hydroxide and hydrogen gas.

To tweak the setup to purify iron, Kempler's team added iron oxide particles to its cathode. Now, the electrons sent to it would also release the oxygen atoms from iron oxide and again form sodium hydroxide—as well as leave behind solid metallic iron. The process is highly efficient, the researchers claim. In fact, they estimate that selling the chlorine and some of the sodium hydroxide at current market prices should enable the overall process to produce iron at roughly the same price as making it in blast furnaces. And because sodium hydroxide can bind CO2 and convert it into carbon-based minerals, the process could be used to help capture CO2, making it carbon negative.
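Because the iron here is freed electron by electron rather than by burning carbon, the energy cost can be estimated with Faraday's law: reducing each Fe(III) ion to metal takes three electrons. The sketch below does that arithmetic for one tonne of iron; the cell voltage is an assumed round number for illustration, not a figure from the Joule paper.

```python
# Faraday's-law estimate for electrochemical iron reduction:
# Fe(III) + 3 e- -> Fe(0), so 3 moles of electrons per mole of iron.
F = 96485.0           # Faraday constant, C per mol of electrons
M_FE = 55.845         # g/mol
ELECTRONS_PER_FE = 3
CELL_VOLTAGE = 1.6    # volts -- assumed for illustration, not from the paper

mol_fe_per_tonne = 1e6 / M_FE
charge = mol_fe_per_tonne * ELECTRONS_PER_FE * F   # coulombs per tonne Fe
energy_kwh = charge * CELL_VOLTAGE / 3.6e6         # 1 kWh = 3.6e6 J
print(f"{charge:.3e} C, ~{energy_kwh:.0f} kWh per tonne of iron")
```

A few thousand kilowatt-hours per tonne is exactly the kind of load that pairs well with cheap, intermittent wind and solar power, which is why Manthiram highlights that property above.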


Original Submission

posted by janrinok on Wednesday February 07, @03:39AM   Printer-friendly

A majority of Paris residents have voted in favor of heavy parking fees for sport utility vehicles weighing over 1.6 tonnes. The new fees rise to €18 an hour in the city center, with lower rates further out. The goals are to improve air quality and road safety, and to make commuting by bicycle more attractive.

Starting September 1st, gas or hybrid SUVs, and other larger vehicles weighing over 1.6 tonnes (1.76 tons), will be charged €18 (around $19.40) per hour to park in the center of Paris, and €12 (around $12.90) per hour in the rest of the city. The new pricing also applies to electric vehicles weighing over two tonnes (2.20 tons). Exemptions are in place for taxis and city residents, which means those traveling into Paris from outside the city will be most impacted. According to one of the posters for the referendum, only three in 10 Parisians even own a personal vehicle.
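The surcharge rules above reduce to a simple decision table. Here is a minimal sketch of them, using only the thresholds and rates the article states; the function and parameter names are invented for the example.

```python
# Paris SUV parking surcharge, per the referendum as described above:
# combustion/hybrid vehicles over 1.6 t and EVs over 2.0 t pay
# EUR 18/hour in the center, EUR 12/hour elsewhere;
# taxis and Paris residents are exempt.
from typing import Optional

def suv_parking_rate(weight_t: float, is_ev: bool, in_center: bool,
                     exempt: bool = False) -> Optional[float]:
    """Hourly surcharge in euros, or None if it doesn't apply."""
    if exempt:                       # taxis and city residents
        return None
    threshold = 2.0 if is_ev else 1.6
    if weight_t <= threshold:        # under the weight cutoff
        return None
    return 18.0 if in_center else 12.0

print(suv_parking_rate(1.8, is_ev=False, in_center=True))   # 18.0
print(suv_parking_rate(1.9, is_ev=True, in_center=False))   # None (EV under 2 t)
```

Note the higher threshold for EVs: a 1.9-tonne electric SUV parks at normal rates, while the same weight with a combustion engine pays the surcharge.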

[...] Under Hidalgo, a Socialist, the streets of Paris have been transformed with 84 kilometers (52 miles) of cycle lanes created since 2020 and a 71% jump in bike usage between the end of the COVID-19 lockdowns and 2023, according to City Hall.

[...] SUVs have become increasingly popular in France, favored by families in particular.

The Verge: Paris votes to crack down on SUVs

Previously:
(2023) Test Bike Generators in Paris, Rotterdam, and Barcelona
(2023) Parisians Say Au Revoir to Shared E-scooters


Original Submission