https://www.ifixit.com/News/108371/right-to-repair-laws-have-now-been-introduced-in-all-50-us-states
With the introduction of a bill in Wisconsin, Right to Repair legislation has now been introduced in every single US state.
We've been fighting for the simple right to fix everything we own for the last eleven years—and we've been joined in that fight by more and more advocates, tinkerers, farmers, students, and lawmakers. Today, that movement has touched every corner of the country. Lawmakers in every state in the union have filed legislation demanding access to the parts, tools, and documentation we need for repair. This year alone, legislation is active in 24 states.
Some of those laws have passed: Five states (New York, California, Minnesota, Oregon, and Colorado) have passed electronics Right to Repair legislation. One in five Americans lives in a state that has passed Right to Repair—and the remaining states are working hard to restore repair competition.
[...] "This is more than a legislative landmark—it's a tipping point. We've gone from a handful of passionate advocates to a nationwide call for repair autonomy," said Kyle Wiens, CEO of iFixit. "People are fed up with disposable products and locked-down devices. Repair is the future, and this moment proves it."
On Wednesday, Clone Robotics released video footage of its Protoclone humanoid robot, a full-body machine that uses synthetic muscles to create unsettlingly human-like movements. In the video, the robot hangs suspended from the ceiling as its limbs twitch and kick, marking what the company claims is a step toward its goal of creating household-helper robots.
Poland-based Clone Robotics designed the Protoclone with a polymer skeleton that replicates 206 human bones.
[...] It contains over 1,000 artificial muscles built with the company's "Myofiber" technology, which builds on the McKibben pneumatic artificial muscle concept.
[...] While the Protoclone is a twitching, dangling robotic prototype right now, there's a lot of tech packed into its body. Protoclone's sensory system includes four depth cameras in its skull for vision, 70 inertial sensors to track joint positions, and 320 pressure sensors that provide force feedback. This system lets the robot react to visual input and learn by watching humans perform tasks.
[...] Other companies' robots typically use other types of actuators, such as solenoids and electric motors. Clone's pressure-based muscle system is an interesting approach, though getting Protoclone to stand and balance without the need for suspension or umbilicals may still prove a challenge.
Clone Robotics plans to start production with 279 units, called Clone Alpha, and to open preorders later in 2025.
Related stories on SoylentNews:
Search SoylentNews for robot
Arthur T Knackerbracket has processed the following story:
Repurposing old GPUs to help relive your favorite legacy games.
A few days ago, it came to light that Nvidia has dropped support for 32-bit CUDA applications with its latest RTX 50-series (Blackwell) GPUs. Support for PhysX has gradually faded over the years. However, PhysX can still be offloaded to an RTX 40-series (Ada Lovelace) or older GPU, and that's exactly what a user on Reddit has done. In addition, we've gathered some interesting benchmarks, courtesy of VerbalSilence on YouTube and the same Reddit user, in which the GTX 980 Ti handily outperforms the RTX 5090 in 32-bit PhysX games.
PhysX is fully functional in 64-bit applications like Batman: Arkham Knight, so Nvidia hasn't abandoned the technology entirely. However, the GPU maker has retired 32-bit CUDA support for RTX 50-series GPUs (and likely beyond). Given the age of the technology, most games with PhysX were compiled using 32-bit CUDA libraries. This is a software limitation, for the most part, though maintaining support for legacy environments is easier said than done.
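Since it isn't always obvious which games were built as 32-bit applications, one practical check is to read the executable's PE header. The Python sketch below is our illustration, not from the article; the function names are invented, and it only distinguishes the common x86/x64 machine types.

```python
import struct

# Classify a Windows executable as 32-bit (x86) or 64-bit (x64) by
# reading its PE header. Illustrative sketch only.
MACHINE_X86 = 0x014C
MACHINE_X64 = 0x8664

def pe_machine(data: bytes) -> str:
    """Return '32-bit' or '64-bit' for a PE image, given its leading bytes."""
    if data[:2] != b"MZ":
        raise ValueError("not an MZ/PE executable")
    # Offset 0x3C of the DOS header holds the file offset of the "PE\0\0" signature.
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # The COFF header's Machine field immediately follows the signature.
    (machine,) = struct.unpack_from("<H", data, pe_off + 4)
    return {MACHINE_X86: "32-bit", MACHINE_X64: "64-bit"}.get(machine, hex(machine))

def classify(path: str) -> str:
    # The first kilobyte is plenty for the headers.
    with open(path, "rb") as f:
        return pe_machine(f.read(1024))
```

Running `classify()` against a game's main `.exe` tells you whether its PhysX effects would hit the retired 32-bit CUDA path on Blackwell.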
As the news dropped, a Redditor snagged a separate RTX 3050 GPU to pair with the primary RTX 5090 to maintain PhysX support in older 32-bit titles. Using the Nvidia Control Panel, you can offload PhysX computations to a separate GPU or CPU, something most users never need to do. Around 20 years back, dedicated processors for physics calculations were dubbed PPUs (Physics Processing Units). Ageia used to make such devices; Nvidia later acquired the company.
In older 32-bit titles, there's a substantial performance gap between running with the RTX 3050 handling PhysX and running without it. With legacy PhysX no longer supported, RTX 50 GPUs either crash when you enable the setting or fall back to CPU processing. The user mentions that despite setting the RTX 3050 as a dedicated PhysX processor, 64-bit games utilize the RTX 5090 anyway. As mentioned above, modern PhysX implementations, at least the handful that exist, should still run fine on Blackwell.
Another test conducted by VerbalSilence reveals a striking difference in Mirror's Edge, where in some scenes, the RTX 5080 plummets to less than 10 FPS while the GTX 980 Ti sits comfortably at almost 150 FPS. The performance delta is heavily dependent on the game's PhysX implementation. Still, Borderlands 2 sees the GTX 980 Ti lead the RTX 5080 by almost 2X, and that's telling. Here, the GTX 980 Ti system is coupled with a Core i5-4690K, with the Ryzen 7 9800X3D reserved for the RTX 5080 setup.
It's unlikely that Nvidia will reinstate compatibility for legacy CUDA applications. If you genuinely wish to play your favorite 32-bit titles with PhysX, maybe it's time to dust off that old GPU in your cabinet and restore it to service.
https://phys.org/news/2025-02-fluid-discovery-defies-logic.html
A team led by researchers at UNC-Chapel Hill has made an extraordinary discovery that is reshaping our understanding of bubbles and their movement. Picture tiny air bubbles inside a container filled with liquid. When the container is shaken up and down, these bubbles engage in an unexpected, rhythmic "galloping" motion—bouncing like playful horses and moving horizontally, even though the shaking occurs vertically.
This counterintuitive phenomenon, revealed in a new study published in Nature, has significant implications for technology, from cleaning surfaces to improving heat transfer in microchips and even advancing space applications.
These galloping bubbles are already garnering significant attention: their impact in the field of fluid dynamics has been recognized with an award for their video entry at the most recent Gallery of Fluid Motion, organized by the American Physical Society.
"Our research not only answers a fundamental scientific question but also inspires curiosity and exploration of the fascinating, unseen world of fluid motion," said Pedro Sáenz, principal investigator and professor of applied mathematics at UNC-Chapel Hill. "After all, the smallest things can sometimes lead to the biggest changes."
In collaboration with a colleague at Princeton University, the research team sought to answer a seemingly simple question: Could shaking bubbles up and down make them move continuously in one direction?
To their surprise, not only did the bubbles move—but they did so perpendicularly to the direction of shaking. This means that vertical vibrations were spontaneously transformed into persistent horizontal motion, something that defies common intuition in physics. Moreover, by adjusting the shaking frequency and amplitude, the researchers discovered that bubbles could transition between different movement patterns: straight-line motion, circular paths, and chaotic zigzagging reminiscent of bacterial search strategies.
"This discovery transforms our understanding of bubble dynamics, which is usually unpredictable, into a controlled and versatile phenomenon with far-reaching applications in heat transfer, microfluidics, and other technologies," explained Connor Magoon, joint first author and graduate student in mathematics at UNC-Chapel Hill.
Bubbles play a key role in a vast range of everyday processes, from the fizz in soft drinks to climate regulation and industrial applications such as cooling systems, water treatment, and chemical production.
Controlling bubble movement has long been a challenge across multiple fields, but this study introduces an entirely new method: leveraging a fluid instability to direct bubbles in precise ways.
One immediate application is in cooling systems for microchips. On Earth, buoyancy naturally removes bubbles from heated surfaces, preventing overheating. However, in microgravity environments such as space, there is no buoyancy, making bubble removal a major issue. This newly discovered method allows bubbles to be actively removed without relying on gravity, which can enable improved heat transfer in satellites and space-based electronics.
Rust, a modern and notably more memory-safe language than C, once seemed like it was on a steady, calm, and gradual approach into the Linux kernel.
In 2021, Linux kernel leaders, like founder and leader Linus Torvalds himself, were impressed with the language but had a "wait and see" approach. Rust for Linux gained supporters and momentum, and in October 2022, Torvalds approved a pull request adding support for Rust code in the kernel.
By late 2024, however, Rust enthusiasts were frustrated with stalls and blocks on their efforts, with the Rust for Linux lead quitting over "nontechnical nonsense."
[...]
Over the last two months, things in one section of the Linux Kernel Mailing List have gotten tense and may now be heading toward resolution—albeit one that Torvalds does not think "needs to be all that black-and-white."
[...]
Hector Martin, the lead of the Asahi Linux project, resigned from the list of Linux maintainers while also departing the Asahi project, citing burnout and frustration with roadblocks to implementing Rust in the kernel.
[...]
Christoph Hellwig, maintainer of the Direct Memory Access (DMA) API, was opposed to Rust code in his section on the grounds that a cross-language codebase was painful to maintain.
[...]
Hellwig posted a longer missive, outlining his opposition to Rust bindings—or translations of Rust libraries that can work with equivalents in C—and continuing with his prior comparison of such multi-language allowances to "cancer."
[...]
Torvalds' response from Thursday offers some clarification not only on Rust bindings in the kernel, but also on what die-hard C coders can and cannot control.
Maintainers like Hellwig who do not want to integrate Rust do not have to. But they also cannot dictate the language or manner of code that touches their area of control but does not alter it. The pull request Hellwig objected to "DID NOT TOUCH THE DMA LAYER AT ALL," Torvalds writes (all-caps emphasis his), and was "literally just another user of it, in a completely separate subdirectory."
[...]
Torvalds tells Hellwig that "I respect you technically, and I like working with you," and that he likes when Hellwig "call[s] me out on my bullshit," as there "needs to be people who just stand up to me and tell me I'm full of shit." But, Torvalds writes, "Now I'm calling you out on *YOURS*."
[...]
In an earlier response to the "Rust kernel policy" topic, Greg Kroah-Hartman suggests that, "As someone who has seen almost EVERY kernel bugfix and security issue for the past 15+ years ... I think I can speak on this topic." As the majority of bugs are due to "stupid little corner cases in C that are totally gone in Rust," Kroah-Hartman is "wanting to see Rust get into the kernel," so focus can shift to more important bugs. While there are "30 million lines of C code that isn't going anywhere any year soon," new code and drivers written in Rust are "a win for all of us, why wouldn't we do this?"
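The "corner cases" Kroah-Hartman alludes to are easy to illustrate outside the kernel. The toy Rust snippet below is our illustration, not kernel code: an out-of-bounds access that would be silent undefined behavior in C is a checked, recoverable condition in safe Rust.

```rust
fn main() {
    let buf = vec![10u8, 20, 30];

    // Checked access: a past-the-end read yields None instead of
    // returning whatever happens to sit beyond the allocation.
    assert_eq!(buf.get(1), Some(&20));
    assert_eq!(buf.get(99), None);

    // Plain indexing is bounds-checked too: `buf[99]` would panic
    // deterministically rather than corrupt memory.

    // Use-after-free is ruled out at compile time: with
    // `let r = &buf; drop(buf);` the borrow checker rejects any
    // later use of `r`, so that whole bug class never reaches runtime.
    println!("checked access ok");
}
```

None of this makes logic bugs impossible, which is exactly Kroah-Hartman's point: it removes the "stupid little" memory-safety cases so review effort can go to the bugs that matter.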
[...]
Rust may or may not become an ascendant language in the kernel. But maintaining C as the dominant language, to the point of actively tamping down even non-direct interaction with any C code, did not seem like a viable long-term strategy.
Previously on SoylentNews:
Torvalds Weighs In On 'Nasty' Rust Vs C For Linux Debate - 20240923
"Rust for Linux" Lead Retires Rather Than Deal With More "Nontechnical Nonsense" - 20240904
Linux Kernel 6.10 Arrives - 20240717
Linus Torvalds Is Annoyed With Linux Developers' Late Kernel Homework - 20221018
Linus Torvalds: Rust Will Go Into Linux 6.1 - 20220919
Related stories on SoylentNews:
Google: After Using Rust, We Slashed Android Memory Safety Vulnerabilities - 20221203
Beyond C++: The promise of Rust, Carbon, and Cppfront - 20221116
Rust Programming Language Outlines Plan for Updates to Style Guide - 20221010
It's Time to Stop Using C and C++ for New Projects, Says Microsoft Azure CTO - 20220923
Arthur T Knackerbracket has processed the following story:
The software engineering job market has experienced a significant downturn, with job openings hitting a five-year low, according to an analysis of data from Indeed by The Pragmatic Engineer. The statistics reveal a stark 35 percent decrease in software developer job listings compared to five years ago, marking a dramatic shift in the industry's employment landscape.
This decline is particularly noteworthy when compared to other sectors. While the overall job market has seen a 10 percent increase in listings since February 2020, software development positions have plummeted. This contrasts sharply with growth in areas such as construction (25 percent), accounting (24 percent), and electrical engineering (20 percent).
The software development sector has also been unusually volatile in recent years. Job listings more than doubled during the pandemic-era boom of 2021 and 2022, outpacing all other industries. However, this surge was followed by an equally dramatic fall, leaving current vacancy numbers at less than a third of their mid-2022 peak.
Several factors contribute to this decline. The end of zero-percent interest rates has had a significant impact on the tech industry, affecting hiring practices, venture capital funding, and the survival of tech startups. However, this alone doesn't explain the hiring slowdown and layoffs at highly profitable Big Tech companies like Microsoft, Meta, Amazon, and Google, according to author Gergely Orosz.
He posits that many companies may still be adjusting after over-recruiting in 2021-2022, leading to a more cautious approach to hiring.
Additionally, the rise of generative AI and LLMs may be influencing the job market. These technologies have shown particular promise in coding – 75 percent of engineers reported the use of AI coding tools in a recent survey. Some speculate that companies might be adopting a "wait and see" approach, assessing the potential productivity gains from these tools before expanding their engineering teams.
[...] Orosz also points out that Indeed's data may not provide a complete picture of the job market. The platform may be losing popularity for posting software engineering jobs, particularly among startups and some Big Tech companies. For instance, Microsoft lists more software-related jobs on its own site than are reflected in Indeed's data.
So, while the Indeed data should be considered directionally correct, indicating a genuine decrease in developer job listings, it may not fully represent hiring trends in startups or accurately track Big Tech hiring. Still, it is clear that the software engineering job market is undergoing significant changes, influenced by economic factors, technological advancements, and evolving company strategies.
Companies are advised to constantly update their apps and software, and patch known network vulnerabilities to prevent such attacks:
A ransomware group called "Ghost" is exploiting the network vulnerabilities of various organizations to gain access to their systems, according to a joint advisory issued by multiple U.S. federal agencies.
"Beginning early 2021, Ghost actors began attacking victims whose internet-facing services ran outdated versions of software and firmware," the Cybersecurity and Infrastructure Security Agency (CISA) said in the Feb. 19 joint advisory. "Ghost actors, located in China, conduct these widespread attacks for financial gain."
The attacks have targeted schools and universities, government networks, critical infrastructure, technology and manufacturing companies, health care, and several small and mid-sized businesses.
[...] The criminals use publicly available code to exploit "common vulnerabilities and exposures" of their targets to secure access to servers. They leverage vulnerabilities in servers running Adobe ColdFusion, Microsoft Exchange, and Microsoft SharePoint.
Also at BleepingComputer.
Arthur T Knackerbracket has processed the following story:
A scheme involving the resale of used Seagate Exos enterprise-grade hard drives as new was uncovered earlier this month, but we have now learned that it reportedly affects not only Exos HDDs but also Seagate's IronWolf Pro HDDs, according to an investigation conducted by Lutz Labs from ComputerBase. Fraudsters erase usage records, alter serial numbers, and modify labels to deceive buyers. Even so, it is still possible to determine that a particular drive has been in use, and by now there are multiple ways to detect such falsified HDDs.
Both Exos and IronWolf Pro are extremely reliable hard disk drives. Seagate's Exos HDDs are aimed at enterprises and hyperscale cloud service providers and are meant to operate 24/7, whereas IronWolf Pro is designed for enterprise-grade NAS environments that also work in 24/7 mode. While these drives share a lot in terms of hardware platforms, they have different firmware. Given the reliability, performance, and capacity points of IronWolf Pro HDDs, they are good candidates for use in Chia mining. As such, the current theory is that Chia miners are selling off used hard drives from mining farms, and it would make sense they may have both Exos and IronWolf Pro devices.
Falsified Seagate Exos and IronWolf Pro hard drives are sold by retailers in different countries and generally look almost new. The drives appear unused to software because their internal usage logs, specifically SMART parameters, were wiped. However, a closer look may reveal slight dents and scratches on the chassis as well as scratches on the SATA connector, which are clear signs of previous use.
Also, the QR codes on counterfeit units have been tampered with. Instead of linking to Seagate's usual verification page, they redirect to a warranty check that does not display the serial number or storage capacity, making it harder to verify authenticity. Since the labels on the drives are false, there are slight variations in label alignment and scaling. Finally, tools like smartmontools that can read Seagate's FARM (field-accessible reliability metrics) values reveal that some had operated for over 50,000 hours.
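That cross-check can be automated. The Python sketch below is our illustration, not ComputerBase's tooling: it assumes you have already read power-on hours from the standard SMART attributes (e.g. `smartctl -A`) and from the FARM log (`smartctl -l farm`, available for supported Seagate drives in recent smartmontools releases), and it simply flags drives where the two counters contradict each other. The 24-hour tolerance is an arbitrary assumption.

```python
# Flag drives whose wiped SMART counters contradict the FARM log.
# A freshly "zeroed" drive whose FARM log still shows tens of
# thousands of hours has almost certainly been tampered with.
def looks_tampered(smart_hours: int, farm_hours: int, tolerance: int = 24) -> bool:
    """Return True when the FARM log reports far more runtime than SMART."""
    return farm_hours - smart_hours > tolerance

# A falsified drive: SMART wiped to near zero, FARM shows heavy use.
assert looks_tampered(smart_hours=5, farm_hours=50_000)

# An honest drive: both counters agree within the tolerance.
assert not looks_tampered(smart_hours=12_000, farm_hours=12_010)
```

On a real drive the two inputs would come from parsing smartctl's output (the `-j` flag emits JSON), but the comparison itself is as simple as shown.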
So far, no similar cases have been reported for Toshiba or Western Digital. However, detecting tampering in these brands is more difficult since they lack Seagate's FARM values (which are only available on Exos, IronWolf, IronWolf Pro, and perhaps SkyHawk drives, according to ComputerBase), which store extensive usage history.
Arthur T Knackerbracket has processed the following story:
It’s not every day that we get to see a glimpse of what a mysterious space plane is up to in orbit. This week, the US Space Force shared a picture it says was snapped last year by the X-37B, showing Earth in the distance and a bit of the craft itself. X-37B launched on its seventh mission at the end of 2023, though not much is known about what that mission entails. Its previous flight, which wrapped up in 2022, set a new endurance record for the space plane, logging 908 days in orbit.
There isn't too much information to glean from the photo, but it does offer a rare look at X-37B in space. “An X-37B onboard camera, used to ensure the health and safety of the vehicle, captures an image of Earth while conducting experiments in HEO in 2024,” the Space Force wrote on X.
One thing we have been told about the current mission is that it marks the first time the Boeing-made X-37B has tried out a maneuver known as aerobraking, or a more fuel-efficient method of changing orbit through “a series of passes using the drag of Earth's atmosphere.” The Space Force said back in October that the vehicle had begun the process, and the latest update indicates it was successful. “The X-37B executed a series of first-of-kind maneuvers, called aerobraking, to safely change its orbit using minimal fuel,” the Space Force noted. It's unknown how much longer the mission is expected to go on.
From our shy community member: https://www.fincen.gov/boi
As discussed here earlier, the US Treasury FinCEN beneficial ownership information (BOI) reporting requirements for millions of small businesses are back again. They have ping-ponged on and off several times since the end of 2024.
Now it seems there was a judgement in the Texas lawsuit:
... However, because the Department of the Treasury recognizes that reporting companies may need additional time to comply with their BOI reporting obligations, FinCEN is generally extending the deadline 30 calendar days from February 19, 2025, for most companies.
Notably, in keeping with Treasury's commitment to reducing regulatory burden on businesses, during this 30-day period FinCEN will assess its options to further modify deadlines, while prioritizing reporting for those entities that pose the most significant national security risks.
FinCEN also intends to initiate a process this year to revise the BOI reporting rule to reduce burden for lower-risk entities, including many U.S. small businesses.
[...] For the vast majority of reporting companies, the new deadline to file an initial, updated, and/or corrected BOI report is now March 21, 2025. FinCEN will provide an update before then of any further modification of this deadline, recognizing that reporting companies may need additional time to comply with their BOI reporting obligations once this update is provided. [continues with exceptions]
Reading between the lines, your SN small business owner is guessing that, "reduce burden for lower-risk entities, including many U.S. small businesses" means that the Treasury is expecting Musk and DOGE to hit this topic, any day now.
Arthur T Knackerbracket has processed the following story:
When a group of amateur treasure seekers set out on an expedition in Poland at the end of January, they weren’t sure what they’d find. On previous trips, while sweeping the ground with metal detectors, they found fascinating trinkets, including thirteenth-century Carolingian dynasty coins. This time around, they found something more mighty: a big honking sword from the Middle Ages.
The giant blade, which was clearly meant to be handled with two hands, was found alongside two axes in the country’s Nowomiejskie district. The discovery was made by members of a group that calls itself GRYF—Biskupieckie Stowarzyszenie Detektorystyczne, which Google translates to the Diocesan Detection Association.
Alas, this is not a band of plucky private eyes, but rather a “group of history enthusiasts and treasure hunters,” according to their Facebook page. The club actually sounds pretty rad, with fun activities planned, including an upcoming hunt for Napoleonic-era artifacts. If you’ll be in Poland when it happens, you should definitely join in. They’ve also done some good in the community, having organized a cleanup of a forgotten Jewish cemetery in the woods close to the town of Lubawa in November.
The January search was conducted in conjunction with the Ostroda Museum, which will eventually make the weapons part of its permanent display. “We are starting to work on their permanent security, proper preservation is preceded by a series of x-rays,” the museum said on its Facebook page. “This year we plan to present the monuments as part of our permanent exhibition.”
The sword, which measures just under 3.2 feet (1 meter), was, as you would expect, extremely rusted and weathered by the centuries, but is otherwise well preserved, with its blade, pommel, and handle all intact. The axe blades were in similarly good condition, albeit less complete.
Details about the weapons' origins are scant, as the museum didn't specify their age or who might have wielded them. All that's known is that they are medieval in origin, though that's vague, as the Middle Ages lasted roughly 1,000 years, from the fifth to the fifteenth century. According to a paper published by University of Lodz associate professor Anna Kowalska-Pietrzak, Poland during that time was largely inhabited by a number of Slavic tribes, though there was an invasion by Teutonic Knights in the fifteenth century.
As Archaeology News reported, the sword’s design is similar to “hand-and-a-half” weapons that were popular in Western Europe during the Late Middle Ages, and were crafted to stab through armor. The publication cited unnamed experts who said that, as the weapons were found near the Osa River, they may have spent centuries underwater, which would have contributed to their remarkably preserved state.
See, kids? Cool things happen when you put down the cellphones and go outside.
Over at ACM.org, Bjarne Stroustrup presents the key contemporary C++ mechanism designed to maintain compatibility over decades:
It is now 45+ years since C++ was first conceived. As planned, it evolved to meet challenges, but many developers use C++ as if it was still the previous millennium. This is suboptimal from the perspective of ease of expressing ideas, performance, reliability, and maintainability. Here, I present the key concepts on which performant, type safe, and flexible C++ software can be built: resource management, life-time management, error-handling, modularity, and generic programming. At the end, I present ways to ensure that code is contemporary, rather than relying on outdated, unsafe, and hard-to-maintain techniques: guidelines and profiles.
The article lays out the ideals and talks about resource management, modularity and generic programming, among other topics. It concludes:
C++ was designed to evolve. When I started, not only didn't I have the resources to design and implement my ideal language, but I also understood that I needed the feedback from use to turn my ideals into practical reality. And evolve it did while staying true to its fundamental aims. Contemporary C++ (C++23) is a much better approximation to the ideals than any earlier version, including support for better code quality, type safety, expressive power, performance, and for a much wider range of application areas.
However, the evolutionary approach caused some serious problems. Many people got stuck with an outdated view of what C++ is. Today, we still see endless mentions of the mythical language C/C++, usually implying a view of C++ as a minor extension of C embodying all the worst aspects of C together with grotesque misuses of complex C++ features. Other sources describe C++ as a failed attempt to design Java. Also, tool support in areas such as package management and build systems have lagged because of a community focus on older styles of use.
Arthur T Knackerbracket has processed the following story:
United States president Donald Trump last Friday issued a memorandum that suggests imposition of tariffs on nations that dare to tax big tech companies.
The memorandum mentions that digital services taxes (DSTs) were introduced to tax revenue that tech companies generate in one country but collect in another. Netflix is often cited as an example of why such taxes are needed, because many of its customers around the world paid their subscriptions to an entity in the Netherlands. Governments argued that was inappropriate because Netflix was selling to their citizens, who consumed the vid-streamer’s services in their territory, and that a Netflix subscription therefore represented economic activity in their jurisdictions that should be taxed like any other.
Another reason DSTs were considered was that Netflix’s Netherlands scheme, like many other structures used by Big Tech companies, is a legal-but-cynical effort to reduce tax bills to levels well below those local companies pay.
The OECD developed measures to prevent multinational companies from using such tactics, and they have been widely adopted without stopping all of Big Tech’s tax tricks. DSTs were pitched as necessary – perhaps temporarily – while the OECD approach was developed and adopted.
Trump’s opposition to DSTs is not new: the Biden administration felt they disproportionately targeted US businesses and threatened 25 percent tariffs if they were not removed. The UK and Europe dropped some of the taxes, as did India.
Tariffs are now back on the agenda for remaining digital services taxes. As outlined in a Friday memorandum, Trump stated: “My Administration will not allow American companies and workers and American economic and national security interests to be compromised by one-sided, anti-competitive policies and practices of foreign governments. American businesses will no longer prop up failed foreign economies through extortive fines and taxes.”
“All of these measures violate American sovereignty and offshore American jobs, limit American companies’ global competitiveness, and increase American operational costs while exposing our sensitive information to potentially hostile foreign regulators,” the Memorandum adds.
The document also calls for US authorities to consider DSTs in its report on the OECD tax measures mentioned above, which Trump also feels unjustly penalize American businesses.
The Memorandum instructs the US Trade Representative to “identify tools the United States can use to secure among trading partners a permanent moratorium on customs duties on electronic transmissions.” Just when those tools will be identified, and implemented, is unknown.
However, the administration’s intent is clear: Big Tech should not be taxed by any nation other than the US, which itself struggles to tax its top tech companies thanks to the tax minimization schemes the OECD deal was designed to dent.
Arthur T Knackerbracket has processed the following story:
The semiconductor industry is expanding rapidly as countries race to build new fabs. While it takes around 19 months to build a fab in Taiwan, it takes a whopping 38 months to build a fab in the U.S. due to the extensive time it takes to get a permit and because fabs are not constructed 24/7, according to Exyte, a leading engineering, construction, and design company that specializes in high-tech facilities like chip production plants, reports Semiconductor Digest.
Taiwan completes fabs in around 19 months, followed by Singapore and Malaysia at 23 months. European projects take 34 months, while the U.S. is the slowest at 38 months. A key reason for this is Taiwan's streamlined permit process and round-the-clock construction, whereas the U.S. and Europe face delays in approvals and do not construct 24/7. The U.S. has enacted a law that exempts certain U.S. fabs from federal environmental assessments, but that is obviously not enough to be on par with Taiwan.
Costs also differ widely. Constructing a plant in the U.S. is about twice as expensive as in Taiwan, despite similar equipment costs, according to Exyte. This discrepancy arises from higher labor costs, extensive regulatory requirements, and inefficiencies in supply chains. In addition, the Taiwanese workforce is highly experienced: builders there are familiar with every step of the process and require fewer detailed blueprints, which speeds up completion of fab projects, according to Herbert Blaschitz, an executive at Exyte.
To compete efficiently with Taiwan — which has a well-integrated supply chain, experienced workforce, and efficient regulatory processes — the U.S. and Europe must streamline permitting, optimize construction techniques, and adopt advanced planning tools like digital twins. Blaschitz suggests adopting 'virtual commissioning,' where a digital model of the plant is created before physical construction begins. This allows potential problems to be identified early, reducing costs and environmental impact while improving speed and efficiency.
Modern semiconductor production facilities are huge, both in terms of dimensions and investments. A leading-edge fab — such as those operated by Intel, Samsung Foundry, or TSMC — requires investment exceeding $20 billion, with $4-6 billion allocated just for the structure itself, according to Blaschitz, who highlighted Taiwan's advantages at the SEMI Industry Strategy Symposium.
Arthur T Knackerbracket has processed the following story:
The atmosphere of a distant world has been mapped in detail for the first time, revealing a strange, topsy-turvy weather system, with the fastest winds ever seen inexplicably blowing around the planet’s stratosphere.
Astronomers have studied WASP-121b, also known as Tylos, since 2015. The planet, which is 900 light years away, is a vast ball of gas double the size of Jupiter, and it orbits its star extremely closely, completing a full orbit in just 30 Earth hours. This close orbit heats the planet’s atmosphere to temperatures of 2500°C, hot enough to boil iron.
Now, Julia Seidel at the European Southern Observatory in Chile and her colleagues have looked inside Tylos’s scorchingly hot atmosphere using the observatory’s Very Large Telescope, and they found it has at least three distinct layers of gas moving in different directions around the planet – a structure unlike anything astronomers have ever seen. “It’s absolutely crazy, science fiction-y patterns and behaviours,” says Seidel.
The planetary atmospheres in our solar system share a broadly similar structure to one another, where a jet stream of powerful winds blowing in the lower portion of the atmosphere is driven by internal temperature differences, while winds in the upper layers are more affected by temperature differences created by the sun’s heat, which warms the daylight side of the planet but not the other.
Yet in Tylos’s atmosphere, it is the winds in the lower layer that are driven by heat from the planet’s star, travelling away from the warm side, while the jet stream appears to be mostly in the middle layer of the atmosphere, travelling around Tylos’s equator in the direction of the planet’s rotation. An upper layer of hydrogen also shows jetstream-like features, flowing around the planet but also drifting outwards into space. This is difficult to explain using our current models, says Seidel. “What we see now is actually exactly the inverse of what comes out of theory.”
What’s more, the jet stream on Tylos is the most powerful ever seen, blasting at around 70,000 kilometres per hour across half the planet – double the speed of the previous record holder. Exactly what is driving this speed is unclear, but the researchers think that it may be due to the planet’s strong magnetic field or because of ultraviolet radiation from its star. “This could possibly change the flow patterns, but this is all highly speculative,” says Seidel.