

posted by janrinok on Tuesday October 17 2023, @07:25PM
from the not-quite-ice-nine dept.

ScienceAlert has a summary of a report on a new phase of superionic ice created in the lab.

Scientists confirmed in 2019 what physicists had predicted back in 1988: a structure where the oxygen atoms in superionic ice are locked in a solid cubic lattice, while the ionized hydrogen atoms are let loose, flowing through that lattice like electrons through metals.

This gives superionic ice its conductive properties. It also raises its melting point such that the frozen water remains solid at blistering temperatures.

In this latest study, physicist Arianna Gleason of Stanford University and colleagues bombarded thin slivers of water, sandwiched between two diamond layers, with some ridiculously powerful lasers.

Successive shockwaves raised the pressure to 200 GPa (about 2 million atmospheres) and the temperature to about 5,000 K (8,500 °F) – hotter than in the 2019 experiments, but at lower pressures.
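As a quick sanity check on those figures (my arithmetic, not from the paper), the small Python snippet below converts the reported pressure and temperature into the quoted units using standard conversion factors:

```python
# Check the unit conversions quoted above (illustrative only).

GPA_TO_PA = 1e9           # pascals per gigapascal
ATM_IN_PA = 101_325       # pascals per standard atmosphere

pressure_gpa = 200
pressure_atm = pressure_gpa * GPA_TO_PA / ATM_IN_PA
print(f"{pressure_gpa} GPa = {pressure_atm:,.0f} atm (~2 million atmospheres)")

temperature_k = 5_000
temperature_f = (temperature_k - 273.15) * 9 / 5 + 32
print(f"{temperature_k} K = {temperature_f:,.0f} °F (~8,500 °F)")
```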

"Recent discoveries of water-rich Neptune-like exoplanets require a more detailed understanding of the phase diagram of [water] at pressure–temperature conditions relevant to their planetary interiors," Gleason and colleagues explain in their paper, from January 2022.

X-ray diffraction then revealed the hot, dense ice's crystal structure, despite the pressure and temperature conditions only being maintained for a fraction of a second.

The resulting diffraction patterns confirmed the ice crystals were in fact a new phase distinct from superionic ice observed in 2019. The newly discovered superionic ice, Ice XIX, has a body-centered cubic structure and increased conductivity compared to its predecessor from 2019, Ice XVIII.

Superionic ice might be the most common form of water in the universe.

Cite: Dynamic compression of water to conditions in ice giant interiors


Original Submission

posted by janrinok on Tuesday October 17 2023, @02:38PM

Arthur T Knackerbracket has processed the following story:

Just when you thought it was safe to use your 12VHPWR cable with your RTX 4090 again, another incident of GPU meltdown pops up in the forums. Redditor Byogore reports that he bought an Asus 4090 in Germany a year ago because of US supply issues. The card worked totally fine until it self-immolated two days ago. This incident is unusual because typically failures have happened much sooner.

Other Redditors quickly questioned whether Byogore had the connectors seated securely since "user error" was one of Nvidia's excuses when the issue arose shortly after the RTX 4090 launch. Byogore defended his ability to properly attach a computer component.

[...] "The fact that the 90-degree cable mod adapter [to prevent bending] was plugged in fully to the connector and the connector still melted, then we know that we have a problem with the card. We have a problem with the card. We do not have a problem with the cable. It's not user error, it's not a cable not plugged in properly problem, it's not a cable mod problem, but it's a problem with the design and the engineering of the card."

NorthridgeFix also points out that even if the problem was caused by user error, it's still Nvidia's fault. For instance, if a user is plugging in a cable as they do with any other, past or present, and it leaves a 1mm gap that causes failure, that is not the user's fault. When there is a tolerance issue like that, it is the manufacturer's responsibility to fix it or design a mechanism to prevent it, not blame the user for not plugging it in correctly and doing nothing more about it.

Byogore says that the meltdown happened while he was playing Battlefield 2042. The screen turned black, but the audio continued. Then, the computer rebooted itself. As it started up, he could smell burnt plastic. The card still worked, but only briefly before crashing and burning again. Byogore mentioned that he has a 1,000W Corsair PSU and that the 4090 was undervolted at the time of the catastrophic failure. However, he didn't note whether his PSU was ATX 3.0. The problem only seems to occur on older ATX 2.0 power supplies.

[...] Nvidia has been reluctant to accept any blame for the 4090's hot-button woes. It initially blamed users for not securing the cable tightly to the socket. Later, it said that "poorly designed" 12VHPWR adapters were to blame. The last time the issue popped up enough to make news was last May. The ongoing problem has even sucked Nvidia into a class-action lawsuit.


Original Submission

posted by janrinok on Tuesday October 17 2023, @09:52AM
from the like-I-said dept.

New study finds that large group size and mating systems where males have multiple mates drove evolution of deeper male voices in primates, including humans:

Deeper male voices in primates, including humans, offer more than sex appeal — they may have evolved as another way for males to drive off competitors in large groups that favored polygyny, or mating systems where a male has multiple mates, according to researchers. The research is the most comprehensive investigation of differences in vocal pitch between sexes to date and has the potential to help to shed light on social behavior in humans and their closest living relatives.

The average speaking pitch of an adult male human is about half that of an adult female human, or roughly an octave lower, said David Puts, professor of anthropology at Penn State and study co-author.

"It's a sex difference that emerges at sexual maturity across species and it probably influences mating success through attracting mates or by intimidating competitors," he said. "I thought it has to be a trait that's been subjected to sexual selection, in which mating opportunities influence which traits are passed down to offspring. Humans and many other primates are highly communicative, especially through vocal communication. So it seems like a really relevant trait for thinking about social behavior in humans and primates in general."

The researchers used specialized computer software to visualize vocalizations and measure voice pitch in recordings from 37 anthropoid primate species, or those most closely related to humans, including gorillas and chimpanzees, as well as recordings of 60 humans evenly divided by sex. Samples for each species included at least two male and two female vocal recordings, for a total of 1,914 vocalizations. The team then calculated average male and female vocal fundamental frequency for each species to see how pronounced the difference was between sexes.

[...] The researchers used these data to test five hypotheses simultaneously to identify which factors may have played the strongest roles in driving sex differences in vocal pitch. The hypotheses were: intensity of mating competition, large group size, multilevel social organization, trade-off against the intensity of sperm competition, and poor acoustic habitats. Previous research has looked at one or two of these hypotheses at a time. The current study is the first to test multiple hypotheses simultaneously for vocal pitch differences using a robust dataset, ensuring data consistency and garnering convincing results, according to Puts.

The team found that fundamental frequency differences by sex increased in larger groups and those with polygynous mating systems, especially in groups with a higher female-to-male ratio. They reported their findings today (July 10) in Nature Communications.

[...] Deeper male voices may act as an additional way to fend off mating competitors without having to engage in costly fighting by making males sound bigger, in addition to other physical traits like height and muscle size, according to the researchers. In adult humans, for instance, males vocalize at an average of 120 hertz whereas females vocalize at an average of about 220 hertz, putting humans right in the middle of polygynous societies, the researchers reported.
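To make the "roughly an octave" claim concrete, the short sketch below (my illustration, not from the study) computes the frequency ratio and its distance in octaves from the two averages quoted above; an exact octave would be a ratio of 2.

```python
import math

# Average speaking fundamental frequencies quoted above (Hz).
male_f0 = 120
female_f0 = 220

ratio = female_f0 / male_f0
octaves = math.log2(ratio)   # 1.0 would be exactly one octave

print(f"Female/male ratio     : {ratio:.2f}")    # ~1.83
print(f"Difference in octaves : {octaves:.2f}")  # ~0.87, i.e. a bit under an octave
```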

"Although social monogamy is really common in humans, mating and reproduction in our ancestors was substantially polygynous," Puts said. "Our findings help us to understand why male and female voices of our species differ so drastically. It may be a product of our evolutionary history, particularly our history of living in large groups in which some males reproduced with multiple females."

Journal Reference:
Aung, T., Hill, A.K., Pfefferle, D. et al. Group size and mating system predict sex differences in vocal fundamental frequency in anthropoid primates. Nat Commun 14, 4069 (2023). https://doi.org/10.1038/s41467-023-39535-w


Original Submission

posted by hubie on Tuesday October 17 2023, @05:11AM

Arthur T Knackerbracket has processed the following story:

The 69th Annual IEEE International Electron Devices Meeting is set to start on 9 December, and the conference teaser shows that researchers have been extending the roadmap for a number of technologies, notably those used to make CPUs and GPUs.

Because chip companies can’t keep on increasing transistor density by scaling down chip features in two dimensions, they have moved into the third dimension by stacking chips on top of each other. Now they’re working to build transistors on top of each other within those chips. Next, it appears likely, they will squeeze still more into the third dimension by designing 3D circuits with 2D semiconductors, such as molybdenum disulfide. All of these technologies will likely serve machine learning, an application with an ever-growing appetite for processing power. But other research to be presented at IEDM shows that 3D silicon and 2D semiconductors aren’t the only things that can keep neural networks humming.

Increasing the number of transistors you can squeeze into a given area by stacking up chips (called chiplets in this case) is both the present and future of silicon. Generally, manufacturers are striving to increase the density of the vertical connections between chips. But there are complications.

[...] Scaling down nanosheet transistors (and CFETs, too) will mean ever-thinner ribbons of silicon at the heart of transistors. Eventually, there won’t be enough atoms of silicon to do the job. So researchers are turning to materials that are semiconductors even in a layer that’s just one atom thick.

Three problems have dogged the idea that 2D semiconductors could take over from silicon. One is that it’s been very difficult to produce (or transfer) a defect-free layer of 2D semiconductor. The second is that the resistance between the transistor contacts and the 2D semiconductor has been way too high. And finally, for CMOS you need a semiconductor that can conduct both holes and electrons equally well, but no single 2D semiconductor seems to be good for both. Research to be presented at IEDM addresses all three in one form or another.

[...] Among the biggest issues in machine learning is the movement of data. The key data involved are the so-called weights and activations that define the strength of the connections between artificial neurons in one layer and the information that those neurons will pass to the next layer. Top GPUs and other AI accelerators address this problem by keeping data as close as they can to the processing elements. Researchers have been working on multiple ways to do this, such as moving some of the computing into the memory itself and stacking memory elements on top of computing logic.

Two cutting-edge examples caught my eye from the IEDM agenda. The first is the use of analog AI for transformer-based language models (ChatGPT and the like). In that scheme, the weights are encoded as conductance values in a resistive memory element (RRAM). The RRAM is an integral part of an analog circuit that performs the key machine learning calculation, multiply and accumulate. That computation is done in analog as a simple summation of currents, potentially saving huge amounts of power.
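The multiply-and-accumulate idea maps naturally onto Ohm's and Kirchhoff's laws: weights are stored as conductances, input activations are applied as voltages, and each column's summed current is the accumulated product. The toy Python sketch below simulates that analog summation digitally to show the principle; it is my illustration, not from the IEDM work, and all values are made up.

```python
import numpy as np

# Toy model of an analog in-memory multiply-accumulate (MAC) array.
# Weights sit in RRAM cells as conductances G (siemens); inputs arrive as
# voltages V (volts). Each column's output current is I_j = sum_i V_i * G_ij,
# which is exactly the dot product a neural-network layer needs.

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # 4 inputs x 3 outputs, conductances
V = rng.uniform(0.0, 0.2, size=4)          # input activations encoded as voltages

I_columns = V @ G                          # per-column current summation (Kirchhoff)
print("Column currents (A):", I_columns)

# Same result as an ordinary digital dot product, for comparison:
print("Digital reference  :", np.dot(V, G))
```

In the real analog circuit the summation costs essentially nothing beyond driving the array, which is where the claimed power savings come from.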

IBM’s Geoff Burr explained analog AI in depth in the December 2021 issue of IEEE Spectrum. At IEDM, he’ll be delivering a design for ways analog AI can tackle transformer models.

Another interesting AI scheme coming up at IEDM originates with researchers at Tsinghua University and Peking University. It’s based on a three-layer system that includes a silicon CMOS logic layer, a carbon nanotube transistor and RRAM layer, and another layer of RRAM made from a different material. This combination, they say, solves a data transfer bottleneck in many schemes that seek to lower the power and latency of AI by building computing in memory. In tests it performed a standard image recognition task with similar accuracy to a GPU but almost 50 times faster and with about 1/40th the energy.


Original Submission

posted by hubie on Tuesday October 17 2023, @12:22AM

Arthur T Knackerbracket has processed the following story:

Chip designer Qualcomm has revealed it intends to shed over 1,000 California-based employees, delivering on previously foreshadowed plans to address its economic woes.

Qualcomm reported a near-sixty-percent profit plunge in August – largely due to a slowdown in demand for smartphones leading to lower sales of its silicon for such devices.

At the time, CEO Cristiano Amon told investors "We're taking a conservative view of the market, and we'll be proactively taking additional cost actions."

[...] Execs, including vice presidents, will be axed. So will hundreds of engineers: reportedly more than 750 of them. Qualcomm employs about 50,000 globally, up from 45,000 in 2021, the year it bought Nuvia.

Cruelly, the ax will fall starting from December 13.

A friend of The Reg tells us around 150 jobs will go in the UK, too. Yet Qualcomm recently found a rumored £180 million ($220m) to sponsor British soccer team Manchester United for three years, with the Snapdragon CPU brand to be featured on team shirts.

The sponsorship creates an interesting metaphor: Manchester United has underperformed for years, and its recent investments have not led to desired improvements in performance. Which is maybe not quite the story Qualcomm wants to associate with Snapdragon.

[...] Demand for engineers is high elsewhere: Taiwan's TSMC has flagged its intention to hire thousands of engineers.


Original Submission

posted by hubie on Monday October 16 2023, @07:35PM

Arthur T Knackerbracket has processed the following story:

Natural and man-made disasters have caused $3.8 trillion in crop and livestock losses over 30 years, the UN's Food and Agriculture Organization said on Friday.

Floods, droughts, insect infestations, storms, disease and war have caused about $123 billion per year in lost food production between 1991 and 2021, the equivalent of five percent of total production or enough to feed up to half a billion people per year, the FAO said in a report.

This is the first time the UN body has tried to compile such an estimate, with the aim of putting into context the scale of the cost of disasters on both a global and personal scale.

"The international community is taking stock of the fact that disasters are... increasing tremendously... quadrupling since the 1970s" and are having an increasing impact on food production, the deputy head of FAO's statistics department, Piero Conforti, told AFP.

The FAO report, entitled "The Impact of Disasters on Agriculture and Food Security", found that disasters are increasing in severity and frequency, from 100 per year in the 1970s to around 400 events per year in the past 20 years.

[...] It identified the "systemic drivers of disaster risk" as climate change, pandemics, epidemics and armed conflicts.

[...] The FAO further found that poorer nations suffered the highest losses due to extreme events in terms of the percentage of their agricultural output, at up to 10 percent.

[...] Despite the increasing frequency and intensity of disasters, it is possible to reduce risks to agriculture.

"There is no one size fits all solution," said the FAO's Conforti, but "there are a range of practices that can enhance the resilience of agricultural systems."

That includes agronomic techniques such as using different plant varieties and different methods to prepare the soil, as well as creating and improving warning systems.

When locusts invaded the Horn of Africa region in 2020 and 2021, early warning provided the time necessary to treat 2.3 million hectares (5.6 million acres) in the region and nearby Yemen.

Some $1.77 billion in grain and dairy production losses were avoided, the FAO estimates.

Moreover, it was extremely cost-effective, with each dollar invested in prevention measures resulting in $15 of avoided crop losses.


Original Submission

posted by hubie on Monday October 16 2023, @02:53PM

Discoveries at Kalambo Falls, Zambia offer insights into ancient human technology:

Recent research has revealed that nearly half a million years ago, ancient human ancestors, predating Homo sapiens, were already engaging in advanced woodworking.

The artifacts found indicate that these humans were building structures, potentially laying the foundation of platforms or parts of dwellings, much earlier than what was once believed.

A team from the University of Liverpool and Aberystwyth University excavated preserved wood at Kalambo Falls, Zambia, dating back an impressive 476,000 years. Analyzing the stone tool cut marks on the wood, the team deduced that these early humans intentionally shaped and combined two logs, showcasing the deliberate crafting of logs to fit together. Prior to this discovery, these humans were believed to have used wood only for simpler purposes such as creating fire, crafting digging sticks, and making spears.

The preservation of this wood is in itself remarkable. Typically, wood from such ancient times deteriorates and disappears. However, at Kalambo Falls, high water levels have protected and preserved these ancient wooden structures.

These findings cast doubt on the previously held belief that Stone Age humans were strictly nomadic. The abundance of resources in the vicinity of Kalambo Falls suggests that these ancient humans could have settled, tapping into the perennial water source and the surrounding forest for sustenance, allowing them to engage in construction.

Professor Larry Barham from the University of Liverpool articulated the significance of this discovery, stating, "They used their intelligence, imagination, and skills to create something they'd never seen before, something that had never previously existed."

Journal Reference:
Barham, L., Duller, G. A. T., Candy, I., et al. Evidence for the earliest structural use of wood at least 476,000 years ago [open], Nature (DOI: 10.1038/s41586-023-06557-9)


Original Submission

posted by hubie on Monday October 16 2023, @10:04AM

Europe is subsidizing the launch of Internet satellites for Jeff Bezos:

Nearly a decade ago, the European Space Agency announced plans to develop the next generation of its Ariane rocket, the Ariane 6 booster. The goal was to bring a less costly workhorse rocket to market that could compete with the likes of SpaceX's Falcon 9 booster and begin flying by 2020.

It has been well documented that development of the Ariane 6 is running years behind—the vehicle is now unlikely to fly before the middle of 2024 and is subject to further delays. For example, a critical long-duration hot fire of the vehicle's Vulcain 2.1 main engine had been scheduled for "early October," but there have been no recent updates on when this key test will occur.

However, there are also increasing concerns that the Ariane 6 rocket will not meet its ambitious price targets. For years, European officials have said they would like to cut the price of launches by half with a rocket that is easier to manufacture and by flying an increased cadence of missions.

[...] However, as Ars previously reported, a 50 percent cost reduction is no longer achievable. Speaking in June at the Paris Air Show, the European Space Agency's Toni Tolker-Nielsen said the Ariane 6 is projected to come in at a higher cost per launch than first predicted. The Ariane 6's cost per flight will be about 40 percent lower than that of the now-retired Ariane 5, short of the previous goal.

[...] Since 2021, the publicly funded European Space Agency has provided a subsidy of 140 million euros annually to ArianeGroup in order to make the Ariane 6 rocket more competitive in the commercial market. That is to say, taxpayers are subsidizing the cost of building Ariane 6 rockets so that they will be more attractive to private satellite operators seeking a ride to space.

However, according to the French news report, ArianeGroup is asking for a substantial increase to this subsidy, to 350 million euros a year. If this were approved by the European Space Agency, it would blow any cost savings for the Ariane 6 rocket, compared to the Ariane 5, out of the water.

Given the large development costs and ongoing subsidy, one might start to question why Europe developed the Ariane 6 in the first place. After all, the Ariane 5 has a good success rate and, if it had not been retired earlier this year, would be seeing significant demand on the launch market. Instead, Europe has a gap in its ability to launch medium and large satellites until the Ariane 6 is operationally ready.

Another significant downside to all of this is that Europe spent a decade developing a rocket that is somewhat more modern than the Ariane 5 but still performs the same basic function at the same basic price. During that lost decade, SpaceX has amply demonstrated the value of re-flying first stages and has kickstarted a stampede within the launch industry toward reuse. Because it has been focused on the expendable Ariane 6, the European Space Agency has missed out on this opportunity. It is now years behind and only starting basic technology demonstrations rather than bringing a true Falcon 9 competitor to market.

[...] "There is very little we can do now," Parsonson said. "I know that. Ariane 6 is a pill that we're just going to have to swallow. We cannot cancel the program and any new development would take several years to mature. What we can do, however, is make sure that ArianeGroup is not involved in the future of European launch."


Original Submission

posted by Fnord666 on Monday October 16 2023, @05:17AM
from the updated-prescription dept.

Arthur T Knackerbracket has processed the following story:

The European Space Agency's Euclid space telescope is back to normal and will resume its mission, thanks to a software update that was required after its navigation sensors mistakenly identified solar ray signals as stars.

But shortly after its instruments were deployed and it snapped a first picture a month later, mission control discovered the telescope was failing to focus on stars. Squiggly lines and circles captured in another image revealed that Euclid was looping around and struggling to lock onto distant stars to keep it steady during its observations.

Mission control pinpointed the issue and realized the telescope's Fine Guidance Sensor (FGS), which the craft uses to locate a set of landmark stars to help navigate and align its instruments on designated targets, was to blame. The optical sensors were mistakenly identifying photons, ejected by the Sun during periods of high solar activity, as stars.

The FGS analyzes light from distant sources and helps control Euclid's orientation. Since it was misinterpreting the solar rays as stars, the telescope was moving haphazardly, making it difficult to focus its sight. Working together with aerospace companies Thales Alenia Space and Leonardo, ESA engineers updated its software to change the way its sensors characterize stars.

The patch was uploaded to Euclid, and the telescope is now functioning normally. Mission control will continue testing its performance for a little while longer before it officially begins to collect data. 

"The performance verification phase that was interrupted in August has now fully restarted and all the observations are carried out correctly," Giuseppe Racca, Euclid Project Manager, said before the weekend. "This phase will last until late November, but we are confident that the mission performance will prove to be outstanding and the regular scientific survey observations can start thereafter."


Original Submission

posted by Fnord666 on Monday October 16 2023, @12:33AM
from the new-and-improved dept.

Arthur T Knackerbracket has processed the following story:

In August and September, threat actors unleashed the biggest distributed denial-of-service attacks in Internet history by exploiting a previously unknown vulnerability in a key technical protocol. Unlike other high-severity zero-days in recent years—Heartbleed or log4j, for example—which caused chaos from a torrent of indiscriminate exploits, the more recent attacks, dubbed HTTP/2 Rapid Reset, were barely noticeable to all but a select few engineers.

HTTP/2 Rapid Reset is a novel technique for waging DDoS, or distributed denial-of-service, attacks of an unprecedented magnitude. It wasn’t discovered until after it was already being exploited to deliver record-breaking DDoSes. One attack on a customer using the Cloudflare content delivery network peaked at 201 million requests per second, almost triple the previous record Cloudflare had seen of 71 million rps. An attack on a site using Google’s cloud infrastructure topped out at 398 million rps, more than 7.5 times bigger than the previous record Google recorded of 46 million rps.

[...] The vulnerability that HTTP/2 Rapid Reset exploits resides in HTTP/2, which went into effect in 2015 and has undergone several overhauls since then. Compared to the HTTP/1 and HTTP/1.1 protocols that predated it, HTTP/2 provided the ability for a single HTTP connection to carry 100 or more concurrent “streams”, or requests, that a server can receive all at once. The resulting throughput can lead to almost 100 times higher utilization of each connection, compared with the earlier HTTP protocols.

The increased efficiency wasn’t just useful for distributing video, audio, and other sorts of benign content. DDoSers began leveraging HTTP/2 to deliver attacks that were orders of magnitude larger. There are two properties in the protocol allowing for these new efficient DDoSes. Before discussing them, it’s useful to review how DDoS attacks work in general and then move on to the way HTTP protocols prior to 2.0 worked.

The type of attack carried out by HTTP/2 Rapid Reset falls into a third form of DDoS known as Application Layer attacks. Rather than trying to overwhelm the incoming connection (volumetric) or exhaust the routing infrastructure (network protocol), application-level DDoSes attempt to exhaust the computing resources available in layer 7 of a target’s infrastructure. Floods to server applications for HTTP, HTTPS, and SIP voice are among the most common means for exhausting a target’s computing resources.
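To illustrate why HTTP/2 multiplexing changes the economics of such an application-layer flood, here is a rough back-of-the-envelope Python sketch; it is my illustration, not from the article, and the connection count, round-trip time, and stream cap are arbitrary assumptions rather than measured values.

```python
# Back-of-the-envelope comparison of per-connection request pressure.
# All numbers are illustrative assumptions, not measurements from the article.

CONNECTIONS = 10_000        # hypothetical number of attacker connections
ROUND_TRIP_S = 0.1          # assumed time to complete one HTTP/1.1 request/response
STREAMS_PER_CONN = 100      # concurrent HTTP/2 streams allowed per connection

# HTTP/1.1: one request at a time per connection, gated by the round trip.
http1_rps = CONNECTIONS * (1 / ROUND_TRIP_S)

# HTTP/2: up to ~100 streams can be outstanding at once on each connection.
http2_rps = http1_rps * STREAMS_PER_CONN

# Rapid Reset goes further: each stream is opened and then immediately
# cancelled (RST_STREAM), so the attacker never waits for responses at all;
# the request rate is then limited mainly by how fast HEADERS + RST_STREAM
# frame pairs can be written, while the server may still begin work on
# every cancelled request.
print(f"HTTP/1.1 ceiling : {http1_rps:,.0f} requests/s")
print(f"HTTP/2 ceiling   : {http2_rps:,.0f} requests/s")
```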


Original Submission

posted by janrinok on Sunday October 15 2023, @07:47PM

https://gizmodo.com/neanderthals-hunted-cave-lions-skeleton-spear-1850921295

Marks on the ribcage of a 48,000-year-old cave lion skeleton suggest the animal was killed by Neanderthals, making it the first evidence that our nearest human cousins hunted the Ice Age predators.

A team of paleoanthropologists and archaeologists recently scrutinized the remains of four lions: the aforementioned skeleton, which was excavated in 1985 in Siegsdorf, Germany, and phalanges and sesamoid bones from three lion specimens excavated from Einhornhöhle, Germany, in 2019. The former showed evidence of being punctured by a wooden-tipped spear—a known weapon of Neanderthals—and the latter three had cut marks that suggested they were butchered in a way to keep the animals' claws preserved on the fur. The team's research is published today in Scientific Reports.

"The notion that Neanderthals interacted with cave lions holds deep significance," said Gabriele Russo, a paleoanthropologist at Eberhard Karls University of Tübingen and the study's lead author, in an email to Gizmodo. "It reveals that Neanderthals were actively engaged with their environment, which included encounters with formidable creatures like lions. These interactions encompassed not only the cultural use of lion body parts but also the ability to hunt them."

Cave lions (Panthera spelaea) are now extinct, but they inhabited most of northern Eurasia during the Pleistocene, recently enough that some preserved cave lions look like they're just sleeping. They made up a remarkable tableau of megafauna on the Ice Age steppe, alongside creatures like the woolly rhinoceros, ancient, extinct elephant species, and the woolly mammoth. And while the mammoth is a known quarry of Neanderthals, it now appears that the human group also hunted cave lions, one of the most prominent Ice Age predators.


Original Submission

posted by janrinok on Sunday October 15 2023, @03:03PM

Arthur T Knackerbracket has processed the following story:

Speaking to partners last week as part of its annual Open Innovation Platform forum in Europe, TSMC dedicated a big portion of its roadshow to the next generation of the company's foundry technology. TSMC's 2 nm-class N2, N2P, and N2X process technologies are set to introduce multiple innovations, including nanosheet gate-all-around (GAA) transistors, backside power delivery, and super-high-performance metal-insulator-metal (SHPMIM) capacitors over the next few years. But in order to take advantage of these innovations, TSMC warns, chip designers will need to use all-new electronic design automation (EDA), simulation, and verification tools as well as IP. And while making such a big shift is never an easy task, TSMC is bringing some good news to chip designers early on: even with N2 still a couple of years out, many of the major EDA tools, verification tools, foundation IP, and even analog IP for N2 are already available for use.

[...] Preparations for the start of N2 chip production, scheduled for sometime in the second half of 2025, began long ago. Nanosheet GAA transistors behave differently than familiar FinFETs, so EDA and other tool and IP makers had to build their products from scratch. This is where TSMC's Open Innovation Platform (OIP) demonstrated its prowess and enabled TSMC's partners to start working on their products well in advance.

By now, major EDA tools from Cadence and Synopsys as well as many tools from Ansys and Siemens EDA have been certified by TSMC, so chip developers can already use them to design chips. Also, EDA software programs from Cadence and Synopsys are ready for analog design migration. Furthermore, Cadence's EDA tools already support N2P's backside power delivery network.

With pre-built IP designs, things are taking a bit longer. TSMC's foundation libraries and IP, including standard cells, GPIO/ESD, PLL, SRAM, and ROM, are ready both for mobile and high-performance computing applications. Meanwhile, some PLLs exist in pre-silicon development kits, whereas others are silicon proven. Finally, blocks such as non-volatile memory, interface IP, and even chiplet IP are not yet available - bottlenecking some chip designs - but these blocks are in active development or planned for development by companies like Alphawave, Cadence, Credo, eMemory, GUC, and Synopsys, according to a TSMC slide. Ultimately, the ecosystem of tools and libraries for designing 2 nm chips is coming together, but it's not all there quite yet.

[...] Although many of the major building blocks for chips are N2-ready, a lot of work still has to be done by many companies before TSMC's 2 nm-class process technologies go into mass production. Large companies, which tend to design (or co-design) IP and development tools themselves are already working on their 2 nm chips, and should be ready with their products by the time mass production starts in 2H 2025. Other players can also fire up their design engines because 2 nm preps are well underway at TSMC and its partners.


Original Submission

posted by janrinok on Sunday October 15 2023, @10:23AM

https://www.techdirt.com/2023/10/06/a-reagan-judge-the-first-amendment-and-the-eternal-war-against-pornography/

Using "Protect the children!" as their rallying cry, red states are enacting digital pornography restrictions. Texas's effort, H.B. 1181, requires commercial pornographic websites—and others, as we'll see shortly—to verify that their users are adults, and to display state-drafted warnings about pornography's alleged health dangers. In late August, a federal district judge blocked the law from taking effect. The U.S. Court of Appeals for the Fifth Circuit expedited Texas's appeal, and it just held oral argument. This law, or one of the others like it, seems destined for the Supreme Court.

So continues what the Washington Post, in the headline of a 1989 op-ed by the columnist Nat Hentoff, once called "the eternal war against pornography."

It's true that the First Amendment does not protect obscenity—which the Supreme Court defines as "prurient" and "patently offensive" material devoid of "serious literary, artistic, political, or scientific value." Like many past anti-porn crusaders, however, Texas's legislators blew past those confines. H.B. 1181 targets material that is obscene to minors. Because "virtually all salacious material" is "prurient, offensive, and without value" to young children, the district judge observed, H.B. 1181 covers "sex education [content] for high school seniors," "prurient R-rated movies," and much else besides. Texas's attorneys claim that the state is going after "teen bondage gangbang" films, but the law they're defending sweeps in paintings like Manet's Olympia (1863):

Incidentally, this portrait appears—along with other nudes—in a recent Supreme Court opinion.


Original Submission

posted by hubie on Sunday October 15 2023, @05:38AM

Arthur T Knackerbracket has processed the following story:

We hear plenty of legitimate concerns regarding the new wave of generative AI, from the human jobs it could replace to its potential for creating misinformation. But one area that often gets overlooked is the sheer amount of energy these systems use. In the not-so-distant future, the technology could be consuming the same amount of electricity as an entire country.

Alex de Vries, a researcher at the Vrije Universiteit Amsterdam, authored 'The Growing Energy Footprint of Artificial Intelligence,' which examines the environmental impact of AI systems.

De Vries notes that the training phase for large language models is often considered the most energy-intensive, and therefore has been the focus of sustainability research in AI.

Following training, models are deployed into a production environment and begin the inference phase. In the case of ChatGPT, this involves generating live responses to user queries. Little research has gone into the inference phase, but De Vries believes there are indications that this period might contribute significantly to an AI model's life-cycle costs.

According to research firm SemiAnalysis, OpenAI required 3,617 Nvidia HGX A100 servers, with a total of 28,936 GPUs, to support ChatGPT, implying an energy demand of 564 MWh per day. For comparison, an estimated 1,287 MWh was used in GPT-3's training phase, so the inference phase's energy demands were considerably higher.
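Working backwards from those SemiAnalysis figures gives a feel for the per-GPU load. The quick Python check below is my own arithmetic, not from the report, and simply derives GPUs per server and the implied average draw per GPU (including its share of server overhead) from the numbers quoted above.

```python
# Rough check of the ChatGPT inference figures quoted above (illustrative only).

servers = 3_617
gpus = 28_936
energy_mwh_per_day = 564

gpus_per_server = gpus / servers                   # HGX A100 boxes hold 8 GPUs
kwh_per_gpu_day = energy_mwh_per_day * 1_000 / gpus
avg_watts_per_gpu = kwh_per_gpu_day * 1_000 / 24   # average continuous draw,
                                                   # including server overhead

print(f"GPUs per server        : {gpus_per_server:.1f}")
print(f"Energy per GPU per day : {kwh_per_gpu_day:.1f} kWh")
print(f"Average power per GPU  : {avg_watts_per_gpu:.0f} W")
```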

Google, which reported that 60% of AI-related energy consumption from 2019 to 2021 stemmed from inference, is integrating AI features into its search engine. Back in February, Alphabet Chairman John Hennessy said that a single user exchange with an AI-powered search service "likely costs ten times more than a standard keyword search."

[...] "It would be advisable for developers not only to focus on optimizing AI, but also to critically consider the necessity of using AI in the first place, as it is unlikely that all applications will benefit from AI or that the benefits will always outweigh the costs," said De Vries.


Original Submission

posted by hubie on Sunday October 15 2023, @12:52AM

Arthur T Knackerbracket has processed the following story:

The biggest acquisition in gaming history and one of the largest in the tech industry is in the books. Twenty-one months after the deal was announced, Microsoft has bought Activision Blizzard for $68.7 billion, the largest acquisition in the company's history. CEO of Microsoft Gaming Phil Spencer has asked Activision CEO Bobby Kotick to stay on until the end of 2023, at which point he'll be leaving the company. It's been a long road filled with plenty of twists and turns to get to this point.

[...] In an attempt to win over the UK regulator, Microsoft agreed to sell the cloud gaming rights for Activision Blizzard titles to Ubisoft. That means that not only should Activision Blizzard's games be on Xbox Game Pass, but they'll land on Ubisoft+ and any other game-streaming service Ubisoft decides to work with. Concerns about competition in the cloud gaming market were the CMA's reasoning for initially blocking Microsoft's takeover of Activision, but the watchdog said in September that the Ubisoft concession "opens the door to the deal being cleared." A few weeks later, the CMA rubberstamped the merger.

Microsoft also signed 10-year agreements with Nintendo and several cloud-gaming companies to offer its titles on their platforms. Those moves led to the European Union giving the merger the green light. The bloc's competition officials reportedly didn't see anything in the amended merger agreement (with the Ubisoft plan factored in) that would prompt a fresh antitrust investigation.

[...] The FTC still plans to challenge the merger. If that effort is successful, Microsoft could be forced to divest some or all of Activision Blizzard.

But for now, the deal is done. It means, among other things, that Activision Blizzard titles will be available on cloud gaming platforms for the first time since the publisher pulled its titles from GeForce Now in early 2020. Its games will surely join Game Pass in the coming months, including on Xbox Cloud Gaming, and they'll pop up on Ubisoft+ and other platforms Ubisoft works with.

[...] One of the key reasons Microsoft gave for pursuing the deal was to accelerate its aim of becoming a major player in the mobile gaming market. With Activision Blizzard pulling in $1.9 billion in mobile revenue in the first six months of 2023 alone, it will achieve that goal practically overnight.

[...] Spencer hinted at efforts to improve the publisher's workplace culture. "Today is a good day to play. We officially welcome Activision Blizzard King to Team Xbox," he wrote on X. "Together, we’ll create stories and experiences that bring players together, in a culture empowering everyone to do their best work and celebrate diverse perspectives." Spencer added that "whether you play on Xbox, PlayStation, Nintendo, PC or mobile, you’re always welcome here — even if Xbox isn’t where you play your favorite franchise. Because when everyone plays, we all win."


Original Submission