
SoylentNews is people




posted by hubie on Friday April 21 2023, @10:49PM   Printer-friendly
from the argument-full-of-holes dept.

A punctured bone fragment predates eyed needles in Western Europe by about 15,000 years:

An animal bone fragment full of human-made pits hints at how prehistoric people in Western Europe may have crafted clothing.

The nearly 40,000-year-old artifact probably served as a punch board for leatherwork, researchers report April 12 in Science Advances. They suggest that the bone fragment rested beneath animal hide while an artisan pricked holes in the material, possibly for seams. If so, it's the earliest-known tool of its kind and predates eyed needles in the region by about 15,000 years.

Found at an archaeological site south of Barcelona, the roughly 11-centimeter-long bone fragment contains 28 punctures scattered across one flat side, with 10 of them aligned and fairly evenly spaced.

The marks don't seem to have been a notation system or decoration, given that some holes are hard to see and the bone fragment wasn't otherwise shaped, says archaeologist Luc Doyon of the University of Bordeaux in France. He thought leatherwork could have made the marks. But it wasn't until he visited a cobbler shop and saw one of the artisan's tools that the hypothesis solidified, guiding Doyon's next steps.

[...] Scientists knew that humans wore clothing long before the oldest-known eyed needles existed (SN: 4/20/10). "What [the new finding] tells us is that the first modern humans who lived in Europe had the technology in their toolkit for making fitted clothes," Doyon says.

Journal Reference:
Luc Doyon, Thomas Faure, Montserrat Sanz, et al., A 39,600-year-old leather punch board from Canyars, Gavà, Spain [open], Sci. Adv., 2023. DOI: 10.1126/sciadv.adg0834


Original Submission

posted by janrinok on Friday April 21 2023, @08:07PM   Printer-friendly
from the got-kefir? dept.

Ancient protein evidence shows milk consumption was a powerful cultural adaptation that stimulated human expansion onto the highland Tibetan Plateau:

The Tibetan Plateau, known as the "third pole", or "roof of the world", is one of the most inhospitable environments on Earth. While positive natural selection at several genomic loci enabled early Tibetans to better adapt to high elevations, obtaining sufficient food from the resource-poor highlands would have remained a challenge.

Now, a new study in the journal Science Advances reveals that dairy was a key component of early human diets on the Tibetan Plateau. The study reports ancient proteins from the dental calculus of 40 human individuals from 15 sites across the interior plateau.

[...] Ancient protein evidence indicates that dairy products were consumed by diverse populations, including females and males, adults and children, as well as individuals from both elite and non-elite burial contexts. Additionally, prehistoric Tibetan highlanders made use of the dairy products of goats, sheep, and possibly cattle and yak. Early pastoralists in western Tibet seem to have had a preference for goat milk.

"The adoption of dairy pastoralism helped to revolutionize people's ability to occupy much of the plateau, particularly the vast areas too extreme for crop cultivation," says Prof. Nicole Boivin, senior author of the study.

[...] "We were excited to observe an incredibly clear pattern," says Li Tang. "All our milk peptides came from ancient individuals in the western and northern steppes, where growing crops is extremely difficult. However, we did not detect any milk proteins from the southern-central and south-eastern valleys, where more farmable land is available."

Surprisingly, all the individuals with evidence for milk consumption were recovered from sites higher than 3700 meters above sea level (masl); almost half were above 4000 masl, with the highest at the extreme altitude of 4654 masl.

"It is clear that dairying was crucial in supporting early pastoralist occupation of the highlands," notes Prof. Shargan Wangdue. "Ruminant animals could convert the energy locked in alpine pastures into nutritional milk and meat, and this fueled the expansion of human populations into some of the world's most extreme environments," Li Tang concludes.

Journal Reference:

Li Tang, Shevan Wilkin, Kristine Korzow Richter, et al., Palaeoproteomic evidence reveals dairying supported prehistoric occupation of the highland Tibetan Plateau [open], Sci. Adv., 2023. DOI: 10.1126/sciadv.adf0345


Original Submission

posted by janrinok on Friday April 21 2023, @05:23PM   Printer-friendly
from the more-good-news-for-your-children-and-yourself dept.

It looks like the Paris Agreement is as dead as the fried chicken at my local deli.

In Paris in 2015, the world agreed to limit the global temperature rise to 1.5 degrees Celsius. The latest report of the EU's Climate Change Service shows (summary pdf) that this target has been royally breached, at least for Europe: temperatures there, averaged over the last 5 years, have increased by 2.2 degrees Celsius.

Europe, at least, has a climate change service to measure these things. As for the rest of the world, an extrapolation of the pattern shown in Figure 1c, here, indicates that, there too, demand for swimming pools and flood insurance will grow.

To illustrate the complexity of the problem: the heatwave in mid-July of 2022 was caused by hot air from the Sahara moving into Europe, driving temperatures above 40 degrees Celsius. By mid-August, a stationary high-pressure system with clear skies and weak winds took hold and caused a second heatwave, made worse because the soil had been dried out by the mid-July event and no rain had fallen since.

Events over the Sahara may have come into play a second time here. Increasing temperatures lead to stronger evaporation over the sea, while the land heats up more. The resulting stronger temperature gradient draws rain deeper inland: heavier rainfall is now reported in the central Sahara in summer, with formerly dry valleys being put under four meters of water. This means less Saharan dust in the atmosphere, which shields the land less from solar radiation: the EU's report mentions that 2022 surface solar radiation was the highest in a 40-year record, and part of a positive trend.

To end on a positive note, the EU ain't doing so bad compared to Greenland: three different heatwaves in 2022, and an average September temperature more than 8 degrees Celsius higher than normal.


Original Submission

posted by janrinok on Friday April 21 2023, @02:39PM   Printer-friendly

The Fermi bubbles may have started life as jets of high-energy charged particles:

Bubbles of radiation billowing from the galactic center may have started as a stream of electrons and their antimatter counterparts, positrons, new observations suggest. An excess of positrons zipping past Earth suggests that the bubbles are the result of a burp from our galaxy's supermassive black hole after a meal millions of years ago.

For over a decade, scientists have known about bubbles of gas, or Fermi bubbles, extending above and below the Milky Way's center (SN: 11/9/10). Other observations have since spotted the bubbles in microwave radiation and X-rays (SN: 12/9/20). But astronomers still aren't quite sure how they formed.

A jet of high-energy electrons and positrons, emitted by the supermassive black hole in one big burst, could explain the bubbles' multi-wavelength light, physicist Ilias Cholis reported April 18 at the American Physical Society meeting.

In the initial burst, most of the particles would have been launched along jets aimed perpendicular to the galaxy's disk. As the particles interacted with other galactic matter, they would lose energy and cause the emission of different wavelengths of light.

Those jets would have been aimed away from Earth, so those particles can never be detected. But some of the particles could have escaped along the galactic disk, perpendicular to the bubbles, and end up passing Earth. "It could be that just now, some of those positrons are hitting us," says Cholis, of Oakland University in Rochester, Mich.

So Cholis and Iason Krommydas of Rice University in Houston analyzed positrons detected by the Alpha Magnetic Spectrometer on the International Space Station. The pair found an excess of positrons whose present-day energies could correspond to a burst of activity from the galactic center between 3 million and 10 million years ago, right around when the Fermi bubbles are thought to have formed, Cholis said at the meeting.

The result, Cholis said, supports the idea that the Fermi bubbles came from a time when the galaxy's central black hole was busier than it is today.

Journal Reference:
Have we found the counterpart signal of the Fermi bubbles at the cosmic-ray positrons?, Bulletin of the American Physical Society. https://meetings.aps.org/Meeting/APR23/Session/U13.1


Original Submission

posted by janrinok on Friday April 21 2023, @11:51AM   Printer-friendly

U.S. government imposes record fine on Seagate for violating sanctions against Huawei:

Seagate has been hit with a massive $300 million fine by the U.S. Department of Commerce [PDF] for violating export control restrictions imposed on Huawei in 2020. According to the department, Seagate shipped millions of hard drives to Huawei in 2020 – 2021 and became the sole supplier of HDDs to the company, while its rivals Toshiba and Western Digital declined to work with the conglomerate.

Seagate shipped 7.4 million hard drives to Huawei on 429 occasions between August 2020 and September 2021 without obtaining an export license from the U.S. Department of Commerce's Bureau of Industry and Security. Those drives were worth around $1.104 billion at the time, a significant sum for Seagate, whose revenue totaled $10.681 billion in 2021.

To settle the matter, Seagate has agreed to pay the $300 million fine in quarterly instalments of $15 million over five years starting in October 2023. The civil penalty of $300 million is more than double the estimated net profits that Seagate made from the alleged illegal exports to or involving Huawei, according to BIS. In fact, $300 million is a record fine for BIS.
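As a quick arithmetic check (mine, not from the settlement documents), the quoted payment schedule does add up to the headline figure:

```python
# Sanity-check the settlement schedule described above:
# $15 million per quarter, paid over five years.
quarterly_payment = 15_000_000
quarters = 4 * 5  # four quarterly instalments a year for five years

total = quarterly_payment * quarters
print(f"${total:,}")  # → $300,000,000, matching the civil penalty
```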

"Today's action is the consequence: the largest standalone administrative resolution in our agency's history," said Matthew S. Axelrod, Assistant Secretary for Export Enforcement. "This settlement is a clarion call about the need for companies to comply rigorously with BIS export rules, as our enforcement team works to ensure both our national security and a level playing field."

As of mid-August 2020, the U.S. Department of Commerce's Bureau of Industry and Security mandated that any company planning to sell semiconductor hardware, software, equipment, or any other asset using American intellectual property to Huawei and its entities must obtain a special export license. The export controls on Huawei mostly pertain to semiconductors. However, Seagate's hard drives also fall under the export-controlled items category because they use controllers and memory designed with electronic design automation tools developed by American companies and produced using U.S.-made equipment.

These export licenses were subject to a presumption of denial policy, meaning they were difficult to obtain. However, multiple companies were granted appropriate licenses between 2020 and 2022, which allowed Huawei to acquire various products that were developed or manufactured in the United States.

Seagate did not apply for an appropriate license and said in September 2020 that its drives could be shipped to Huawei without a license, an opinion not shared by its rival Western Digital. Since Huawei was not supposed to get HDDs at all, Republican senator Roger Wicker wondered in mid-2021 how exactly the sanctioned company obtained such storage devices and whether the three global makers of hard drives were complying with the export rules.

As it turned out, although Toshiba and Western Digital ceased to sell HDDs to Huawei, Seagate continued to do so. In fact, the company became Huawei's exclusive hard drive supplier and even signed a three-year Strategic Cooperation Agreement and then a Long-Term Agreement to purchase over five million HDDs with the Chinese conglomerate in 2021.


Original Submission

posted by janrinok on Friday April 21 2023, @09:06AM   Printer-friendly

The Hyena code is able to handle amounts of data that make GPT-style technology run out of memory and fail:

In a paper published in March, artificial intelligence (AI) scientists at Stanford University and Canada's MILA institute for AI proposed a technology that could be far more efficient than GPT-4 -- or anything like it -- at gobbling vast amounts of data and transforming it into an answer.

Known as Hyena, the technology is able to achieve equivalent accuracy on benchmark tests, such as question answering, while using a fraction of the computing power. In some instances, the Hyena code is able to handle amounts of text that make GPT-style technology simply run out of memory and fail.

"Our promising results at the sub-billion parameter scale suggest that attention may not be all we need," write the authors. That remark refers to the title of a landmark AI report of 2017, 'Attention is all you need'. In that paper, Google scientist Ashish Vaswani and colleagues introduced the world to Google's Transformer AI program. The transformer became the basis for every one of the recent large language models.

But the Transformer has a big flaw. It uses something called "attention," where the computer program takes the information in one group of symbols, such as words, and moves that information to a new group of symbols, such as the answer you see from ChatGPT, which is the output.

That attention operation -- the essential tool of all large language programs, including ChatGPT and GPT-4 -- has "quadratic" computational complexity (see the Wikipedia entry on time complexity). That complexity means the amount of time it takes for ChatGPT to produce an answer grows as the square of the amount of data it is fed as input.

At some point, if there is too much data -- too many words in the prompt, or too many strings of conversations over hours and hours of chatting with the program -- then either the program gets bogged down providing an answer, or it must be given more and more GPU chips to run faster and faster, leading to a surge in computing requirements.
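The quadratic cost is easy to see in a toy implementation. The NumPy sketch below of single-head scaled dot-product attention is illustrative only (it is not the code behind ChatGPT); the point is the (n, n) score matrix, whose size grows with the square of the sequence length n:

```python
import numpy as np

def naive_attention(Q, K, V):
    """Single-head scaled dot-product attention.

    The score matrix Q @ K.T has shape (n, n) for a sequence of n
    tokens, so both time and memory grow with the square of n --
    the "quadratic" cost described above.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (n, n): the quadratic part
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 1024, 64  # 1,024 tokens, 64-dimensional head
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = naive_attention(Q, K, V)
print(out.shape)  # (1024, 64), but an intermediate (1024, 1024)
                  # score matrix was built along the way
```

Doubling the number of tokens quadruples that intermediate matrix, which is why very long prompts get slow or simply exhaust memory.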

In the new paper, 'Hyena Hierarchy: Towards Larger Convolutional Language Models', posted on the arXiv pre-print server, lead author Michael Poli of Stanford and his colleagues propose to replace the Transformer's attention function with something sub-quadratic, namely Hyena.

[...] The paper's contributing authors include luminaries of the AI world, such as Yoshua Bengio, MILA's scientific director, who is a recipient of a 2019 Turing Award, computing's equivalent of the Nobel Prize. Bengio is widely credited with developing the attention mechanism long before Vaswani and team adapted it for the Transformer.

Also among the authors is Stanford University computer science associate professor Christopher Ré, who has helped in recent years to advance the notion of AI as "software 2.0".

To find a sub-quadratic alternative to attention, Poli and team set about studying how the attention mechanism is doing what it does, to see if that work could be done more efficiently.

A recent practice in AI science, known as mechanistic interpretability, is yielding insights about what is going on deep inside a neural network, inside the computational "circuits" of attention. You can think of it as taking apart software the way you would take apart a clock or a PC to see its parts and figure out how it operates.

One work cited by Poli and team is a set of experiments by researcher Nelson Elhage of AI startup Anthropic. Those experiments take apart the Transformer programs to see what attention is doing.

In essence, what Elhage and team found is that attention functions at its most basic level by very simple computer operations, such as copying a word from recent input and pasting it into the output.

For example, if one starts to type into a large language model program such as ChatGPT a sentence from Harry Potter and the Sorcerer's Stone, such as "Mr. Dursley was the director of a firm called Grunnings...", just typing "D-u-r-s", the start of the name, might be enough to prompt the program to complete the name "Dursley" because it has seen the name in a prior sentence of Sorcerer's Stone. The system is able to copy from memory the record of the characters "l-e-y" to autocomplete the sentence.
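That copy-and-paste behaviour can be mimicked in a few lines of ordinary code. This toy function is a hypothetical illustration of the idea only, not Anthropic's actual circuit analysis: it finds the most recent earlier occurrence of the partial word and copies the characters that followed it.

```python
def copy_complete(context, partial):
    """Complete `partial` by copying from the most recent earlier
    occurrence of it in `context`, up to the next space."""
    i = context.rfind(partial)
    if i == -1:
        return partial  # never seen before; nothing to copy
    j = i + len(partial)
    end = context.find(" ", j)
    if end == -1:
        end = len(context)
    return partial + context[j:end]  # paste the remembered suffix

context = "Mr. Dursley was the director of a firm called Grunnings"
print(copy_complete(context, "Durs"))  # → Dursley
```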

However, the attention operation runs into the quadratic complexity problem as the number of words grows. More words require more of what are known as "weights," or parameters, to run the attention operation.

As the authors write: "The Transformer block is a powerful tool for sequence modeling, but it is not without its limitations. One of the most notable is the computational cost, which grows rapidly as the length of the input sequence increases."

While the technical details of ChatGPT and GPT-4 haven't been disclosed by OpenAI, it is believed they may have a trillion or more such parameters. Running these parameters requires more GPU chips from Nvidia, thus driving up the compute cost.

To reduce that quadratic compute cost, Poli and team replace the attention operation with what's called a "convolution", which is one of the oldest operations in AI programs, refined back in the 1980s. A convolution is just a filter that can pick out items in data, be it the pixels in a digital photo or the words in a sentence.
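To make the contrast concrete, here is a minimal example of a convolution as a sliding filter (illustrative only; Hyena's actual filters are learned, not hand-written). Its cost is linear in the signal length times the filter size, rather than quadratic:

```python
import numpy as np

# A convolution really is just a sliding filter. Here a 3-tap
# difference kernel is slid along a 1-D signal; the response is
# positive where the signal steps up and negative where it steps down.
signal = np.array([0, 0, 0, 1, 1, 1, 0, 0], dtype=float)
kernel = np.array([1, 0, -1], dtype=float)  # simple edge detector

response = np.convolve(signal, kernel, mode="same")
print(response.tolist())  # → [0.0, 0.0, 1.0, 1.0, 0.0, -1.0, -1.0, 0.0]
```

The same filtering idea applies whether the data are pixels in a photo or (embedded) words in a sentence.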

Poli and team do a kind of mash-up: they take work done by Stanford researcher Daniel Y. Fu and team to apply convolutional filters to sequences of words, and they combine that with work by scholar David Romero and colleagues at the Vrije Universiteit Amsterdam that lets the program change filter size on the fly. That ability to flexibly adapt cuts down on the number of costly parameters, or weights, the program needs to have.

The result of the mash-up is that a convolution can be applied to an unlimited amount of text without requiring more and more parameters in order to copy more and more data. It's an "attention-free" approach, as the authors put it.


Original Submission

posted by hubie on Friday April 21 2023, @06:22AM   Printer-friendly
from the streamlining-processes dept.

Proposed emissions from a Mississippi Chevron plant could raise locals' cancer risk by 250,000x the acceptable level and a community group is fighting back:

We need climate action. But just because something gets grouped under the umbrella of things that theoretically combat climate change doesn't mean it's actually good for the planet or people. In an alarming example, production of certain alternative "climate-friendly" fuels could lead to dangerous, cancer-causing emissions.

A Chevron scheme to make new plastic-based fuels, approved by the Environmental Protection Agency, could carry a 1-in-4 lifetime cancer risk for residents near the company's refinery in Pascagoula, Mississippi. A February joint report from ProPublica and the Guardian brought the problem to light. Now, a community group is fighting back against the plan, suing the EPA for approving it in the first place, as first reported by ProPublica and the Guardian in a follow-up report on Tuesday.

Cherokee Concerned Citizens, an organization representing a subdivision of roughly 130 homes less than two miles from Chevron's Pascagoula refinery, filed its suit with the U.S. Court of Appeals for the D.C. Circuit on April 7. The petition demands that the court review and revisit the EPA's rubber-stamping of the Chevron proposal.

[...] Last year, the EPA greenlit Chevron's plan to emit some unnamed, truly gnarly, cancer-causing chemicals at a refinery in Pascagoula. The approval fell under an effort described as fast tracking the review of "climate-friendly new chemicals." Chevron proposed turning plastics into novel fuels, and the EPA hopped on board, in accordance with a Biden Administration policy to prioritize developing replacements for standard fossil fuels.

By opting to "streamline the review" of certain alternative fuels, the agency wrote it could help "displace current, higher greenhouse gas emitting transportation fuels," in a January 2022 press release. But also, through that "streamlining," the EPA appears to have pushed aside some major concerns.

[...] That 1-in-4 risk is about 250,000 times higher than the 1-in-1 million acceptable cancer risk threshold that the EPA generally applies when considering harm to the public. Another chemical, listed in the approval document as P-21-0150, carries a lifetime cancer risk estimate of 1-in-8,333 for those exposed to fugitive air emissions, also far above the EPA's acceptable risk threshold. [...]
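A quick back-of-the-envelope check of those ratios (my arithmetic, not the EPA's):

```python
acceptable = 1 / 1_000_000     # EPA's usual acceptable lifetime cancer risk
chevron_risk = 1 / 4           # risk cited for the Pascagoula plastic-based fuels
p21_0150_risk = 1 / 8_333      # risk cited for chemical P-21-0150

print(round(chevron_risk / acceptable))   # → 250000, the "250,000x" figure
print(round(p21_0150_risk / acceptable))  # → 120, i.e. ~120x the threshold
```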

[...] For some reason though, despite its own internal risk cut-offs and federal regulation surrounding new chemical approvals, the EPA allowed Chevron to move forward without any further testing or a clear mitigation plan in place.

It's hard to say, specifically, what these EPA-approved compounds are because in the single relevant agency document obtained by ProPublica and the Guardian, chemical names are blacked out. However, the substances in question are all plastic-based fuels, as outlined in another, related document. Though opaque, their approval seems to stem from a recently renewed national program to promote biofuel development, through a loophole that allows for fuels derived from waste.

[...] Nonetheless, the Biden Administration's push for more "biofuels" and re-upped Renewable Fuel Standard makes wide allowances for any fuel source that comes from trash—apparently regardless of the possible fallout.


Original Submission

posted by hubie on Friday April 21 2023, @03:34AM   Printer-friendly

Chinese tech giant claims better performance than competing GPUs:

Chinese social media, cloud, and entertainment giant Tencent on Monday revealed that it has started mass production of a homebrew video transcoding accelerator.

The announcement comes nearly two years after the company unveiled a trio of custom chips designed to accelerate everything from streaming video to networking and artificial intelligence workloads.

In a post published on WeChat, Tencent Cloud revealed that "tens of thousands" of its Canghai chips, which are designed to offload video encode/decode for latency sensitive workloads, have been deployed internally to accelerate cloud gaming and live broadcasting.

Tencent says the Canghai chip can be paired with GPUs from a variety of vendors to support low-latency game streaming. When used for video transcoding, Tencent said a single node equipped with Canghai can deliver up to 1,024 video channels. We'll note that Nvidia, with the launch of its L4 GPUs last month, made similar claims. Without real-world benchmarks, it's hard to say how either firm's claims stack up.

[...] When it comes to spinning custom chips to improve the efficiency and economics of cloud computing, Amazon Web Services gets a lot of credit. The American e-tail giant and cloud titan has developed everything from custom CPUs and AI training and inference accelerators to smartNICs that offload many housekeeping workloads.

And while Google has developed an accelerator of its own, called the Tensor Processing Unit (TPU), most US cloud providers have largely stuck with commercially available parts from the likes of Intel, AMD, Ampere, Broadcom, or Nvidia, rather than designing their own.

However, in China, custom chips appear to be more prevalent, with development an imperative accelerated by US sanctions that mean some tech products can't be exported to the Middle Kingdom.


Original Submission

posted by hubie on Friday April 21 2023, @12:48AM   Printer-friendly
from the need-more-filament dept.

The company says it learned much from Terran-1's debut flight and is choosing to go bigger for its successor:

After its rocket failed to reach orbit last month, California-based Relativity Space doesn't want to dwell on the past. Instead, the company is leaping forward with its next launch vehicle, which promises to be bigger and better.

On Wednesday, Relativity Space announced its lessons learned from the launch of Terran-1, a 3D-printed, methane-fueled rocket that was set to break records on its first flight. The rocket took off from Cape Canaveral Space Force Station on March 22, but an engine failure prevented it from reaching orbit.

Shortly after stage separation, the second-stage engine did not reach full thrust, according to Relativity Space. The company shared key findings from the rocket anomaly, detailing that the engine's main valves opened more slowly than expected, preventing the propellant from reaching the thrust chamber in time.

Terran-1 is 85% 3D-printed by mass and it's also powered by a liquid methane-oxygen propellant known as methalox. [...]

[...] Unlike its predecessor, Terran-R is designed to be a much larger 3D printed, medium-to-heavy lift orbital launch vehicle capable of carrying 33.5 metric tons to orbit. The rocket's first stage will be outfitted with 13 3D-printed Aeon engines while its second stage will have a single methane-fueled engine.

Terran-R's design focuses on reusability of the first stage rather than the second; made from printed aluminum, the first stage is intended to allow up to 20 re-flights. The plan is to land the rockets on drone ships stationed in the Atlantic Ocean, similar to how SpaceX lands its Falcon 9 first stage.

[...] "Terran 1 was like a concept car, redefining the boundaries of what is possible by developing many valuable brand-new technologies well ahead of their time," Ellis said.

It's a bold move for Relativity Space to move onto the next project despite Terran-1 not fulfilling its inaugural mission. But in this, the new commercial space race, it's important for companies to move quickly or risk being left behind.

Previously:
    With Eyes on Reuse, Relativity Plans Rapid Transition to Terran R Engines
    Relativity Space Announces Fully Reusable "Terran R" Rocket, Planned for 2024 Debut
    Relativity Space Selected to Launch Satellites for Telesat
    Aerospace Startup Making 3D-Printed Rockets Now Has a Launch Site at America's Busiest Spaceport


Original Submission

posted by hubie on Thursday April 20 2023, @10:03PM   Printer-friendly
from the what-did-the-article-say? dept.

Potentially good news for old machinists and over-the-hill heavy metal fans:

Five years ago, a team of researchers at the University of Rochester Medical Center (URMC) was able to regrow cochlear hair cells in mice for the first time. These hair cells are found in the cochlear region of ears in all mammals. They sense sound vibrations, convert those into brain signals, and eventually allow a person to hear and understand the different sounds around them. The new study from URMC researchers sheds light on the underlying mechanism that allowed the ear hairs to regrow in mice.

"We know from our previous work that expression of an active growth gene, called ERBB2, was able to activate the growth of new hair cells (in mammals), but we didn't fully understand why. This new study tells us how that activation is happening—a significant advance toward the ultimate goal of generating new cochlear hair cells in mammals," said Patricia White, one of the study authors and a neuroscience professor at URMC.

https://www.zmescience.com/science/news-science/can-we-reverse-hearing-loss-yes-we-can-here-is-how-it-works/


Original Submission

posted by janrinok on Thursday April 20 2023, @07:14PM   Printer-friendly
from the I-love-to-work-at-nothing-all-day dept.

Big tech companies were apparently hiring workers to keep them from joining rival firms:

Many former employees at big tech companies are admitting that they had very little to do at their jobs, despite earning high salaries. One such under-worked and overpaid former tech worker is 33-year-old Madelyn Machado, who left Microsoft to join Facebook's parent company Meta as a recruiter in the fall of 2021.

In a viral TikTok video, Machado claimed she was hired for a $190,000 yearly salary, but had basically nothing to do during her stint at the company. "I do think a lot of these companies wanted there to be work, but there wasn't enough," she said. Talking to The Wall Street Journal, Machado said that on most days, her work included attending virtual meetings from noon until 3:30 pm before logging off for the day.

Curiously, Machado says she was told by her recruiters at Meta that she wouldn't be hiring anybody during her first year at the company. She also claims that some of her colleagues told her that they had spent two years at the company without ever hiring anyone. Unfortunately for her, she only worked for six months at Meta before being fired last year for posting TikTok videos that the company said posed a conflict of interest.

Another former Meta worker who recounted a similar story is 35-year-old Britney Levy, who says she joined the company in April 2022 but received her first and only assignment shortly before being laid off in November. Since then, companies across the tech industry, including Amazon, Meta, Microsoft, Twitter, PayPal, Yahoo, Zoom, IBM, Spotify, and others, have announced massive layoffs, affecting tens of thousands of employees.

Talking to the WSJ, experts said they believe companies overhired during the pandemic-era boom not because they needed more workers, but to hoard talent from rival companies. According to Vijay Govindarajan, professor at Dartmouth's Tuck School of Business, the hiring spree was initially fueled by a shortage of tech talent but eventually became a competition, which led to companies "hiring ahead of demand." He also pointed out that the situation was very similar to what happened in the finance industry in the early 2000s, when companies overhired during periods of high growth, leaving many workers with not enough work.


Original Submission

posted by janrinok on Thursday April 20 2023, @04:28PM   Printer-friendly
from the I'm-not-pirating-this-movie-I'm-training-my-AI-model dept.

Inside the secret list of websites that make AI like ChatGPT sound smart:

AI chatbots have exploded in popularity over the past four months, stunning the public with their awesome abilities, from writing sophisticated term papers to holding unnervingly lucid conversations.

Chatbots cannot think like humans: They do not actually understand what they say. They can mimic human speech because the artificial intelligence that powers them has ingested a gargantuan amount of text, mostly scraped from the internet.

This text is the AI's main source of information about the world as it is being built, and it influences how it responds to users. If it aces the bar exam, for example, it's probably because its training data included thousands of LSAT practice sites.

Tech companies have grown secretive about what they feed the AI. So The Washington Post set out to analyze one of these data sets to fully reveal the types of proprietary, personal, and often offensive websites that go into an AI's training data.

To look inside this black box, we analyzed Google's C4 data set, a massive snapshot of the contents of 15 million websites that have been used to instruct some high-profile English-language AIs, called large language models, including Google's T5 and Facebook's LLaMA. (OpenAI does not disclose what datasets it uses to train the models backing its popular chatbot, ChatGPT.)

The Post worked with researchers at the Allen Institute for AI on this investigation and categorized the websites using data from Similarweb, a web analytics company. About a third of the websites could not be categorized, mostly because they no longer appear on the internet. Those are not shown.

We then ranked the remaining 10 million websites based on how many "tokens" appeared from each in the data set. Tokens are small bits of text used to process disorganized information — typically a word or phrase.
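The ranking method described above can be sketched in a few lines. This is a minimal toy illustration with hypothetical URLs, using naive whitespace splitting in place of the subword tokenizers real pipelines use; the idea is simply to tally tokens per domain and sort.

```python
from collections import Counter
from urllib.parse import urlparse

def count_tokens_per_site(pages):
    """Tally tokens per domain for an iterable of (url, text) pairs.

    Whitespace splitting stands in for a real tokenizer here; actual
    data-set analyses use subword tokenizers, but the ranking logic
    is the same: sum token counts per domain, then sort.
    """
    tally = Counter()
    for url, text in pages:
        domain = urlparse(url).netloc  # e.g. "example.org"
        tally[domain] += len(text.split())
    return tally

# Hypothetical pages standing in for a web crawl:
pages = [
    ("https://example.org/a", "a short page of text"),
    ("https://example.org/b", "another page"),
    ("https://example.net/x", "one more document here"),
]
tally = count_tokens_per_site(pages)
print(tally.most_common(2))  # example.org ranks first with 7 tokens
```

Ranking by token count rather than page count weights each site by how much text it actually contributes to the training data, which is why a single text-heavy domain like patents.google.com can top the list.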

The data set was dominated by websites from industries including journalism, entertainment, software development, medicine and content creation, helping to explain why these fields may be threatened by the new wave of artificial intelligence. The three biggest sites were patents.google.com No. 1, which contains text from patents issued around the world; wikipedia.org No. 2, the free online encyclopedia; and scribd.com No. 3, a subscription-only digital library. Also high on the list: b-ok.org No. 190, a notorious market for pirated e-books that has since been seized by the U.S. Justice Department. At least 27 other sites identified by the U.S. government as markets for piracy and counterfeits were present in the data set.

[...] Others raised significant privacy concerns. Two sites in the top 100, coloradovoters.info No. 40 and flvoters.com No. 73, had privately hosted copies of state voter registration databases. Though voter data is public, the models could use this personal information in unknown ways.

[...] The Post's analysis suggests more legal challenges may be on the way: The copyright symbol — which denotes a work registered as intellectual property — appears more than 200 million times in the C4 data set.

The News and Media category ranks third across categories. But half of the top 10 sites overall were news outlets: nytimes.com No. 4, latimes.com No. 6, theguardian.com No. 7, forbes.com No. 8, and huffpost.com No. 9. (Washingtonpost.com No. 11 was close behind.) Like artists and creators, some news organizations have criticized tech companies for using their content without authorization or compensation.

[...] Technology is the second largest category, making up 15 percent of categorized tokens. This includes many platforms for building websites, like sites.google.com No. 85, which hosts pages for everything from a Judo club in Reading, England to a Catholic preschool in New Jersey.

The data set contained more than half a million personal blogs, representing 3.8 percent of categorized tokens. Publishing platform medium.com No. 46 was the fifth largest technology site and hosts tens of thousands of blogs under its domain. Our tally includes blogs written on platforms like WordPress, Tumblr, Blogspot and Live Journal.

[...] Social networks like Facebook and Twitter — the heart of the modern web — prohibit scraping, which means most data sets used to train AI cannot access them. Tech giants like Facebook and Google that are sitting on mammoth troves of conversational data have not been clear about how personal user information may be used to train AI models that are used internally or sold as products.

[...] A web crawl may sound like a copy of the entire internet, but it's just a snapshot, capturing content from a sampling of webpages at a particular moment in time. C4 began as a scrape performed in April 2019 by the nonprofit CommonCrawl, a popular resource for AI models. CommonCrawl told The Post that it tries to prioritize the most important and reputable sites, but does not try to avoid licensed or copyrighted content.

[...] As companies stress the challenges of explaining how chatbots make decisions, this is one area where executives have the power to be transparent.


Original Submission

posted by janrinok on Thursday April 20 2023, @01:43PM   Printer-friendly

The Moon still has much to tell us about the early solar system:

The Moon still has much to tell us about the early solar system. Encouragingly, it also has scientific value as a platform for observational astronomy.

Lunar exploration is undergoing a renaissance. Dozens of missions, organised by multiple space agencies—and increasingly by commercial companies—are set to visit the Moon by the end of this decade. Most of these will involve small robotic spacecraft, but NASA's ambitious Artemis program aims to return humans to the lunar surface by the middle of the decade.

[...] The potential role for astronomy of Earth's natural satellite was discussed at a Royal Society meeting earlier this year. The meeting itself had, in part, been sparked by the enhanced access to the lunar surface now in prospect. Several types of astronomy would benefit. The most obvious is radio astronomy, which can be conducted from the side of the Moon that always faces away from Earth—the far side.

The lunar far side is permanently shielded from the radio signals generated by humans on Earth. During the lunar night, it is also protected from the Sun. These characteristics make it probably the most "radio-quiet" location in the whole solar system as no other planet or moon has a side that permanently faces away from the Earth. It is therefore ideally suited for radio astronomy.

[...] Radio waves with wavelengths longer than about 15m are blocked by Earth's ionosphere. But radio waves at these wavelengths reach the Moon's surface unimpeded. For astronomy, this is the last unexplored region of the electromagnetic spectrum, and it is best studied from the lunar far side. Observations of the cosmos at these wavelengths come under the umbrella of "low frequency radio astronomy." These wavelengths are uniquely able to probe the structure of the early universe, especially the cosmic "dark ages," an era before the first galaxies formed.
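As a quick sanity check on these numbers, the standard free-space relation f = c/λ puts the ~15 m ionospheric cutoff near 20 MHz; anything below that frequency is effectively invisible from the ground.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_to_frequency(wavelength_m):
    """Convert a free-space wavelength in metres to frequency in hertz."""
    return C / wavelength_m

# Wavelengths longer than ~15 m are blocked by Earth's ionosphere,
# so ground-based radio astronomy stops at roughly this frequency:
cutoff_hz = wavelength_to_frequency(15.0)
print(f"ionospheric cutoff: {cutoff_hz / 1e6:.1f} MHz")  # prints 20.0 MHz
```

By the same relation, the >100 m wavelengths expected from exoplanet magnetospheres correspond to frequencies below about 3 MHz, well under the cutoff, which is why such searches need a radio-quiet site in space like the lunar far side.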

[...] ... another potential application of far side radio astronomy is trying to detect radio waves from charged particles trapped by magnetic fields—magnetospheres—of planets orbiting other stars. This would help to assess how capable these exoplanets are of hosting life. Radio waves from exoplanet magnetospheres would probably have wavelengths greater than 100m, so they would require a radio-quiet environment in space. Again, the far side of the Moon will be the best location.

The Moon offers opportunities for other types of astronomy as well. Astronomers have lots of experience with optical and infrared telescopes operating in free space, such as the Hubble telescope and JWST. However, the stability of the lunar surface may confer advantages for these types of instrument. Moreover, there are craters at the lunar poles that receive no sunlight. Telescopes that observe the universe at infrared wavelengths are very sensitive to heat and therefore have to operate at low temperatures. JWST, for example, needs a huge sunshield to protect it from the Sun's rays. On the Moon, a natural crater rim could provide this shielding for free.

Journal References Mentioned:
DOI: https://royalsocietypublishing.org/doi/10.1098/rsta.2019.0564
DOI: https://royalsocietypublishing.org/doi/10.1098/rsta.2019.0570
DOI: https://royalsocietypublishing.org/doi/10.1098/rsta.2020.0212
DOI: https://royalsocietypublishing.org/doi/10.1098/rsta.2019.0562


Original Submission

posted by janrinok on Thursday April 20 2023, @10:56AM   Printer-friendly

Netflix Will Block Password Sharing Before July 2023

Netflix Will Block Password Sharing Before July 2023:

Netflix has been working on a way to block people from sharing their Netflix passwords. It was supposed to roll out in the United States already, but now it's coming to the US and other regions sometime soon.

Netflix confirmed in its recent earnings report that it will start rolling out the new account sharing limitations in the second quarter of 2023 — meaning sometime between now and June 30. The company said in the report, "In Q1, we launched paid sharing in four countries and are pleased with the results. We are planning on a broad rollout, including in the US, in Q2."

In other countries where Netflix has already rolled out the changes, Netflix accounts have a "primary location" that is determined using your account history, home Wi-Fi network, and other data. Devices that watch Netflix without connecting to that network are automatically blocked after 31 days. The only way around the block is to add a paid "extra member" to your account, which costs less than an individual subscription but isn't available for all types of Netflix plans.

Netflix to Charge for Password Sharing in the U.S. as Soon as This Summer

Netflix to charge for password sharing in the U.S. as soon as this summer:

In a detailed letter to shareholders, Netflix described the plans for a broad rollout, including the U.S., as ones that will grow the paid membership base and increase profits rather than shrink them.

Paid sharing was rolled out in the first quarter of 2023 in Canada, New Zealand, Spain, and Portugal. "In Canada, which we believe is a reliable predictor for the US, our paid membership base is now larger than prior to the launch of paid sharing and revenue growth has accelerated and is now growing faster than in the U.S.," the letter reads.

This rollout comes after paid sharing tests conducted in Latin America in 2022 were deemed successful. Netflix explains that it saw an initial wave of cancellations in each of the three countries where it tested the paid sharing program when the news was announced. But it then saw increased acquisition and revenue as the "borrowers" activated their own paid accounts and existing members began adding extra shared accounts.

"Longer term, paid sharing will ensure a bigger revenue base from which we can grow as we improve our service," Netflix adds.


Original Submission #1Original Submission #2

posted by janrinok on Thursday April 20 2023, @08:13AM   Printer-friendly

A European Chips Act to play catch-up with the US and Asia:

The European Union has finally agreed on a new plan to boost its microchip industry. The multi-billion-euro investment is focused on strengthening Europe's technological leadership, the EU said, but it could very well be an attempt to put the Old Continent on par with what market leaders are already doing right now.

After months of negotiations between the European Council and the European Parliament, the European Union has now officially approved a generous subsidy plan for its semiconductor industry. The European Chips Act will put €43 billion (roughly $47 billion) toward bolstering Europe's "competitiveness and resilience" in the microchip business, promoting an effective digital and green transition powered by advanced technology.

Right now, Europe has a 10% market share of global chip manufacturing; with the EU Chips Act, Brussels plans to double the EU's production capacity to 20% of the global market by 2030. The plan is also focused on strengthening Europe's research and technology capabilities over chip advancements, building innovation capacity in design, manufacturing, and packaging, developing an in-depth understanding of the global semiconductor supply chain, and addressing the skills shortage by attracting new talent and growing its own skilled workforce.

Microchips are already "strategic assets for key industrial value chains," the EU said, while the digital transformation has opened new markets for the chip industry such as highly automated cars, cloud computing, IoT, connectivity, space, defense, and supercomputers. The recent global semiconductor shortages also showed how the global supply chain has an "extreme" dependency on very few actors in a complex geopolitical context.

[...] As a matter of fact, the final EU Chips Act contains some additional provisions which were not included in the initial draft. Besides funding the manufacturing of cutting-edge semiconductor technology, the plan will also cover the entire value chain with older chips and research & design facilities. The EU Chips Act is coming after the world's powerhouses in the chip industry (USA, Taiwan, South Korea, Japan) have already approved or are in the process of approving their own subsidy initiatives. Therefore, Brussels' money to boost EU semiconductor output won't guarantee success.


Original Submission