A SoylentNews contributor writes:
This trade magazine article, https://www.autonomousvehicleinternational.com/news/adas/mit-report-forecasts-global-emissions-impact-from-autonomous-vehicle-computers.html, reports on a new study and modeling effort:
A new study conducted by MIT researchers has found that in the future, the amount of energy required to run computers on board an international fleet of AVs could generate the same amount of greenhouse gas emissions as all the world's current data centers. The study explored the potential energy consumption and related carbon emissions if autonomous vehicles were widely adopted.
At present, the data centers that house the physical computing infrastructure used for running applications account for approximately 0.3% of global greenhouse gas (GHG) emissions. As there has been little focus on the potential footprint of AVs, MIT researchers developed a statistical model to study the potential issue.
The research team calculated that one billion AVs driving for an hour each day, with each vehicle's computer using 840W, would consume enough energy to generate roughly the same amount of emissions as global data centers do currently. Researchers also found that in 90% of modeled scenarios, to keep AV emissions from surpassing present-day data center emissions, each vehicle would have to use under 1.2 kW of computing power. To achieve this target, more efficient hardware for AVs would be required.
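The headline figure is easy to sanity-check with a back-of-the-envelope script using only the numbers quoted above (one billion vehicles, one hour of driving per day, 840 W per onboard computer):

```python
# Back-of-the-envelope check of the study's headline energy figure.
n_vehicles = 1_000_000_000  # one billion AVs
hours_per_day = 1           # one hour of driving per day
power_w = 840               # watts per onboard computer

daily_energy_gwh = n_vehicles * power_w * hours_per_day / 1e9  # W·h -> GWh
annual_energy_twh = daily_energy_gwh * 365 / 1000              # GWh -> TWh

print(f"{daily_energy_gwh:.0f} GWh/day, {annual_energy_twh:.1f} TWh/year")
```

That works out to roughly 840 GWh per day, or about 307 TWh per year, which is the scale the researchers compare against today's global data centers.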
In one test, the team modeled a scenario in 2050 where 95% of the global fleet is made up of AVs. During this scenario, computational workloads doubled every three years and the Earth continued to decarbonize at the current rate. Upon completion of this simulation, researchers found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.
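As a toy illustration of why the doubling rates matter (our own sketch, not the paper's full model, which also accounts for fleet size and grid decarbonization): if onboard workload doubles every `t_work` years while hardware efficiency doubles every `t_eff` years, per-vehicle power scales as the ratio of the two growth curves.

```python
def power_multiplier(years, t_work=3.0, t_eff=1.1):
    """Relative per-vehicle computing power after `years`, assuming the
    workload doubles every t_work years and hardware efficiency doubles
    every t_eff years."""
    return 2 ** (years / t_work) / 2 ** (years / t_eff)

# Efficiency doubling every 1.1 years outpaces workload doubling every
# 3 years, so per-vehicle power shrinks over a ~27-year horizon to 2050:
print(power_multiplier(27.0))

# If efficiency only doubled as fast as the workload, power would stay flat:
print(power_multiplier(27.0, t_eff=3.0))
```

This is why the 1.1-year figure is the threshold: any slower, and per-vehicle power (and with it, emissions) grows instead of shrinking.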
"If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn't seem like it is going to be enough to constrain the emissions from computing on board autonomous vehicles," said first author Soumya Sudhakar, a graduate student in aeronautics and astronautics. "This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start."
The source paper's abstract is here: https://ieeexplore.ieee.org/document/9942310 with the full text linked (may be behind an IEEE paywall).
[Paper made available by MIT Open Access --hubie]
Seattle Public Schools is joining a growing number of school districts banning ChatGPT, the natural language chatbot from OpenAI that has sparked widespread attention in recent weeks.
ChatGPT has garnered praise for its ability to quickly answer complex queries and instantly produce content.
But it's also generating concern among educators worried that students will use the technology to do their homework.
SPS blocked ChatGPT on all school devices in December, said Tim Robinson, a spokesman for Seattle Public Schools, in an email to GeekWire.
"Like all school districts, Seattle Public Schools does not allow cheating and requires original thought and work from students," he said.
The district also blocks other "cheating tools," Robinson said.
Germany wants to force its power-hungry data centers to harness excess heat for warming residential homes — an effort which the industry warns is likely to fall flat:
The country has become one of the largest global hubs for data centers thanks to its clear data protection and security laws. Politicians are now trying to re-purpose some of their controversial excess heat to improve efficiency in light of the energy crisis.
While in theory an innovative way to reduce the industry's immense carbon footprint, experts have pointed to a flaw in the government's proposal expected to be passed this month: potential recipients of waste heat are not being compelled to take it.
[...] "Data center operators are mostly ready and willing to give away their waste heat," according to Ralph Hintemann, senior researcher at the data center lobby group Borderstep. "The challenge here is finding someone who can use that heat economically."
What's at stake for Europe's largest economy is that it either risks scaring off IT investments and slowing its efforts to digitize, or falling behind on climate goals. The energy efficiency law being prepared by the government aims to save some 500 terawatt-hours of energy by 2030 — pegged in part on a requirement to reuse at least 30% of a new data center's heat by 2025.
Not a new idea, as demonstrated by this article from almost a decade ago.
Original article on Bloomberg but javascript and all...
Physics models and real-world experiments help keep bubbles from popping:
Blowing soap bubbles, besides being a favorite pastime for children, also happens to be an art form and a subject of interest for physicists. Emmanuelle Rio, François Boulogne, Marina Pasquet, and Frédéric Restagno from the Laboratory of Solid State Physics at the University of Paris-Saclay have been studying bubbles for years, trying to understand the different processes at play in these innocuous-looking structures.
"Bubbles are important as they appear in many places, including washing products, cosmetics, building materials, and also in nature. For example, sea foam plays a role in terms of the exchanges between the atmosphere and the sea," Boulogne said.
Now, the team has described a key event in the life of bubbles: when they pop.
[...] Boulogne stated that although there is a link between temperature and aging of the bubbles, the impact of low temperatures on when the bubbles pop remains to be understood—and is likely to stay that way for a while. "So far, we have no model that can make this prediction. Understanding the stability of bubbles is a challenge that will take several decades," he said.
[...] Working with the French artist Pierre-Yves Fusier, who specializes in bubble art, Rio and her colleagues developed the recipe, which consists of 40 milliliters of dishwashing liquid, 100 milliliters of glycerol, and 1 gram of a long polymer such as the naturally occurring guar gum mixed into 1 liter of water. Using this recipe, Rio created 5 cm-diameter bubbles in her laboratory that lasted an hour.
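For anyone wanting to try it at home, the published proportions scale linearly with the amount of water. A small helper (our own illustration, using only the quantities quoted above):

```python
# Quantities per liter of water, as reported in the article.
RECIPE_PER_LITER = {
    "dishwashing liquid (mL)": 40,
    "glycerol (mL)": 100,
    "guar gum (g)": 1,
}

def scale_recipe(water_liters):
    """Scale the giant-bubble recipe to a given volume of water."""
    return {ingredient: qty * water_liters
            for ingredient, qty in RECIPE_PER_LITER.items()}

# e.g. a half batch for half a liter of water:
print(scale_recipe(0.5))
```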
While adding glycerol may make the bubbles more stable, Rio said the impact of other ingredients on the bubbles' stability is still an open question. "Glycerol is a hygroscopic molecule which can help condense water. But we know the surfactant (dishwashing liquid) and the polymer also impact evaporation. The next step in our study, therefore, is to find out how our recipe impacts the evaporation," Rio said.
Journal Reference:
Pasquet, M., Wallon, L., Fusier, PY. et al. An optimized recipe for making giant bubbles. Eur. Phys. J. E 45, 101 (2022). https://doi.org/10.1140/epje/s10189-022-00255-6
Peer review has failed, and that's great news – for diamond open access, science and society:
[...] Mastroianni explains that although ubiquitous today, peer review is a relatively new phenomenon. After World War II governments poured huge amounts of money into research; peer review was supposed to make sure the money was well spent. But as Mastroianni documents, peer review has failed on just about every metric.
Research productivity has been flat or declining for decades; reviewers consistently miss major flaws in submitted papers; fraudulent work is published all the time. Peer review often encourages bad research because of unhelpful comments; and scientists themselves don't care about peer review: they actively seek to circumvent it, and ignore it in their own reading.
[...] last month I published a paper, by which I mean I uploaded a PDF to the internet. I wrote it in normal language so anyone could understand it. I held nothing back—I even admitted that I forgot why I ran one of the studies. I put jokes in it because nobody could tell me not to. I uploaded all the materials, data, and code where everybody could see them. I figured I'd look like a total dummy and nobody would pay any attention, but at least I was having fun and doing what I thought was right.
Then, before I even told anyone about the paper, thousands of people found it, commented on it, and retweeted it.
What Mastroianni describes is essentially the diamond open access approach, something Walled Culture has discussed several times. It is designed to provide an extremely simple and lightweight publishing platform to help researchers get their papers quickly and easily before as many people as possible. It is costless, for both the person uploading their paper, and those who download it.
Whenever you see cryptocurrency prices suddenly rise, that's probably why:
We're just a couple of weeks into 2023 and crypto prices are already spiking. Seeing number go up might entice you to throw some money into Bitcoin or Ethereum. After all, maybe this is the beginning of another crypto bull market? You wouldn't want to miss out!
Well, just wait a minute. Consider this first: Why are crypto prices suddenly rising?
There are plenty of analysts out there trying to make logical sense of the recent bump in cryptocurrencies' value – inflation is slowing, belief that the Federal Reserve is done hiking interest rates, bullish news on crypto – but no, that's not really it.
There's been no big positive news in the industry. There aren't reports of some new, mainstream avenues of adoption. Sure, the stock market is up a bit right now in the new year, but not at the same level cryptocurrency is.
So, what's going on here? Market manipulation.
[...] As longtime cryptocurrency writer and critic David Gerard explains: The big players in the industry are "buying" in order to control the market.
"The bitcoin price is whatever the large players need it to be," writes Gerard. "The market is very thin and trivially manipulable with the billions of pseudo-dollars in unbacked stablecoins on the unregulated offshore exchanges. The price needs to be high enough so the big boys' loans don't get liquidated; but it needs to be low enough so that the bagholders don't attempt to cash out."
John Reed Stark, a former SEC official, concurred with Gerard's assessment.
"A recent Forbes analysis of 157 crypto exchanges found that 51 percent of daily bitcoin trading volume being reported was likely bogus," tweeted Stark, referring to a Forbes report from last summer.
[...] It's just yet another way the big crypto companies and investment funds manipulate the market.
Lightning strikes flow along laser's path for tens of meters:
Lightning rods protect buildings by providing a low-resistance path for charges to flow between the clouds and the ground. But they only work if lightning finds that path first. The actual strike is chaotic, and there's never a guarantee that the processes that initiate it will happen close enough to the lightning rod to ensure that things will work as intended.
A team of European researchers decided they didn't like that randomness and managed to direct a few lightning strikes safely into a telecom tower located on top of a Swiss mountain. Their secret? Lasers, which were used to create a path of charged ions to smooth the path to the lightning rod.
The basic challenge with directing lightning bolts is that the atmospheric events that create charged particles occur at significant altitudes relative to lightning rods. This allows lightning to find paths to the ground that don't involve the lightning rod. People have successfully created a connection between the two by using small rockets to shoot conductive cables to the heights where the charges were. But using this regularly would eventually require a lot of rockets and leave the surroundings draped in cables.
The idea of using lasers to guide lightning is an old one, with the suggestion first appearing in the scientific literature back in the 1970s. A sufficiently high-intensity laser beam has a complicated relationship with the air it travels through. The changes it makes to the air help focus the laser, while the electrons it knocks loose tend to disperse it. Meanwhile, the molecules in the atmosphere that absorb the light heat up and shoot out of its path, creating a low-pressure path in the laser's wake. Critically, many of the particles left behind in these low-pressure filaments are charged, providing a potential path for lightning.
It's also possible to shape laser pulses so that you control where the generation of these filaments starts—up to a kilometer away from the laser source.
[...] Aside from the frequency, the tower is a great site for these experiments since it is equipped to study lightning. Instruments measure the current that flows through its lightning rod and the electromagnetic fields in the area and can perform imaging at various wavelengths, including X-rays.
In any case, the researchers thought this time might be different for a key reason: Lasers have improved considerably. They're now able to fire much more rapidly; the one set up for this work is capable of a 1 kilohertz frequency. That is a firing rate over 100 times larger than any laser used for this sort of work previously. Models that include this rapid cycling, the researchers show, suggest it creates a more persistent filament of charged particles in the air.
And it worked. Over six hours of testing, the tower saw four strikes while the laser was active. Imaging of one of these, which occurred in a clear sky, clearly shows the lightning bolt moving along the path defined by the laser until it reached the tower.
Like almost all the lightning strikes previously recorded at the Säntis Tower, all four strikes started at the ground and propagated upward. But 85 percent of the strikes recorded there involve a connection to a pool of negative charges in the clouds. By contrast, all four laser-guided strikes connected to a positively charged pool.
Journal Reference:
Houard, Aurélien, Walch, Pierre, Produit, Thomas, et al. Laser-guided lightning [open], Nature Photonics (DOI: 10.1038/s41566-022-01139-z)
Microsoft announces 10,000 layoffs, 5% of its workforce:
Microsoft is "focusing on our short- and long-term opportunity", which is to say it's laying off 10,000 people.
First, we will align our cost structure with our revenue and where we see customer demand. Today, we are making changes that will result in the reduction of our overall workforce by 10,000 jobs through the end of FY23 Q3. This represents less than 5 percent of our total employee base, with some notifications happening today. It's important to note that while we are eliminating roles in some areas, we will continue to hire in key strategic areas. We know this is a challenging time for each person impacted. The senior leadership team and I are committed that as we go through this process, we will do so in the most thoughtful and transparent way possible.
Second, we will continue to invest in strategic areas for our future, meaning we are allocating both our capital and talent to areas of secular growth and long-term competitiveness for the company, while divesting in other areas. These are the kinds of hard choices we have made throughout our 47-year history to remain a consequential company in this industry that is unforgiving to anyone who doesn't adapt to platform shifts. As such, we are taking a $1.2 billion charge in Q2 related to severance costs, changes to our hardware portfolio, and the cost of lease consolidation as we create higher density across our workspaces.
And third, we will treat our people with dignity and respect, and act transparently. These decisions are difficult, but necessary. They are especially difficult because they impact people and people's lives – our colleagues and friends. We are committed to ensuring all those whose roles are eliminated have our full support during these transitions. U.S.-benefit-eligible employees will receive a variety of benefits, including above-market severance pay, continuing healthcare coverage for six months, continued vesting of stock awards for six months, career transition services, and 60 days' notice prior to termination, regardless of whether such notice is legally required. Benefits for employees outside the U.S. will align with the employment laws in each country.
Netherlands refuses to summarily agree to US export restrictions on China over silicon chips:
The United States of America has asked a number of countries in Europe and Asia to impose sanctions on Chinese chip manufacturing firms. One of these, the Netherlands, has come out with a statement saying that it will not summarily accept new US restrictions on exporting chip-making technology to China, and is consulting with European and Asian allies.
Dutch Trade Minister Liesje Schreinemacher said on Sunday that she expects Dutch Prime Minister Mark Rutte to discuss export policy with President Joe Biden when Rutte visits the US.
In effect though, the Netherlands has stopped ASML Holding from shipping its most advanced machines to China and is only allowing them to sell machinery and technology that were made before 2019.
The Dutch government has denied ASML permission to ship its most advanced machines to China since 2019 following a pressure campaign by the Trump administration, but ASML did sell 2 billion euros worth of older machines to China in 2021.
The US took action in October to limit China's capacity to produce its own chips, and US trade officials stated at the time that they anticipated the Netherlands and Japan to follow suit soon. ASML has said that should the rules proposed by the US come into play, it could impact roughly 5 per cent of its group sales.
Previously: Dutch Chip Equipment Maker ASML's CEO Questions U.S. Export Rules on China
Unix is dead. Long live Unix!:
It's the end of an era. As The Reg covered last week, IBM has transferred development of AIX to India. Why should IBM pay for an expensive US-based team to maintain its own proprietary flavor of official Unix when it paid 34 billion bucks for its own FOSS flavor in Red Hat?
Here at The Reg FOSS desk, we've felt this was coming ever since we reported that Big Blue was launching new POWER servers which didn't support AIX – already nearly eight years ago. Even if it was visibly coming over the horizon, this is a significant event: AIX is the last proprietary Unix which was in active development, and constitutes four of the 10 entries in the official Open Group list.
Within Oracle, Solaris is in maintenance mode. Almost exactly six years ago, we reported that the next major release, Solaris 12, had disappeared from Oracle's roadmap. HPE's HP-UX is also in maintenance mode because there's no new hardware to run it on: Itanium really is dead now, and at the end that's all HP-UX could run on. It's over a decade since we reported that HP investigated but canceled an effort to port it to x86-64.
The last incarnation of the SCO Group, Xinuos, is still around and offers not one but two proprietary UNIX variants: SCO OpenServer, descended from SCO Xenix, and UnixWare, descended from Novell's Unix. We note that OpenServer 10, a more modern OS based on FreeBSD 10, has disappeared from Xinuos's homepage. It's worth pointing out that the SCO Group was the company formerly known as Caldera, and isn't the same SCO as the Santa Cruz Operation which co-created Xenix with Microsoft in the 1980s.
There used to be two Chinese Linux distros which had passed the Open Group's testing and could use the Unix trademark: Inspur K/UX and Huawei EulerOS. Both companies have let the rather expensive trademark lapse, though. But the important detail here is that Linux passed and was certified as a UNIX™. And it wasn't just one distro, although both were CentOS Linux derivatives. We suspect that any Linux would breeze through, because several un-Unix-like OSes have passed before.
[...] Ever since Windows NT in 1993, Windows has had a POSIX environment. Now, with WSL, it arguably has two of them, and we suspect that if Microsoft were so inclined, it could have Windows certified as an official Unix-compatible OS.
[...] But this illustrates the difficulty of defining precisely what the word "Unix" means in the 21st century. It hasn't meant "based on AT&T code" since Novell bought Unix System Labs from AT&T in 1993, kept the code, and donated the trademark to the Open Group. Since that time, if it passes the Open Group's testing (and you pay a fee to use the trademark), it's UNIX™. Haiku hasn't so it isn't. Linux has so it is. But then so is z/OS, which is a direct descendant of OS/390, or IBM MVS as it was called when it was launched in 1974. In other words, an OS which isn't actually based on, similar to, or even related to Unix.
US firm Getty Images on Tuesday threatened to sue a tech company it accuses of illegally copying millions of photos for use in an artificial intelligence (AI) art tool:
Getty, which distributes stock images and news photos including those of AFP, accused Stability AI of profiting from its pictures and those of its partners. Stability AI runs a tool called Stable Diffusion that allows users to generate mash-up images from a few words of text, but the firm uses material it scrapes from the web often without permission.
The question of copyright is still in dispute, with creators and artists arguing that the tools infringe their intellectual property and AI firms claiming they are protected under "fair use" rules.
Tools like Stable Diffusion and Dall-E 2 exploded in popularity last year, quickly becoming a global sensation with absurd images in the style of famous artists flooding social media.
CAMM = Compression Attached Memory Module
CAMM to Usurp SO-DIMM Laptop Memory Form Factor Says JEDEC Member
So, farewell, SO-DIMM. After a quarter century of service in laptops, all-in-ones and other compact designs, it looks like the end of the road for SO-DIMM is in sight. JEDEC committee member and Dell Senior Distinguished Engineer, Tom Schnell, told PC World that the new 'CAMM Common Spec' will be the next RAM standard for laptops. There already seems to have been a lot of progress in the background, with the v0.5 spec already approved by 20 or so companies in the task group, and JEDEC expected to finalize the v1.0 spec in the second half of this year.
[...] The new information from PC World editor Gordon Ung's chat with Tom Schnell helps us weigh up some of the pros and cons of CAMM, and point to some ways it has progressed from Dell's pre-JEDEC-approved spec. Apparently, as well as improved density (more RAM capacity in a smaller space), CAMM is amenable to "scaling to ever higher clock speeds." Specifically, the new information indicates that the DDR5-6400 'brick wall' for SO-DIMMs will be shrugged off by CAMMs.
When CAMM reaches devices, there are a couple of tech advances which could help spur its adoption. We mentioned the faster DDR5 speeds above, but it is thought that CAMM could really take off when DDR6 arrives. Another appealing variation might be for adding LPDDR(6) memory to laptops. Traditionally LPDDR memory is soldered, so the new spring contact fitting modules might mean much better upgradability for the thinnest and lightest devices which tend to use LPDDR memory.
Previously: Dell Defends its Controversial New Laptop Memory (CAMM)
Controversy erupts over non-consensual AI mental health experiment:
On Friday, Koko co-founder Rob Morris announced on Twitter that his company ran an experiment to provide AI-written mental health counseling for 4,000 people without informing them first, The Verge reports. Critics have called the experiment deeply unethical because Koko did not obtain informed consent from people seeking counseling.
Koko is a nonprofit mental health platform that connects teens and adults who need mental health help to volunteers through messaging apps like Telegram and Discord.
On Discord, users sign into the Koko Cares server and send direct messages to a Koko bot that asks several multiple-choice questions (e.g., "What's the darkest thought you have about this?"). It then shares a person's concerns—written as a few sentences of text—anonymously with someone else on the server who can reply anonymously with a short message of their own.
During the AI experiment—which applied to about 30,000 messages, according to Morris—volunteers providing assistance to others had the option to use a response automatically generated by OpenAI's GPT-3 large language model instead of writing one themselves (GPT-3 is the technology behind the recently popular ChatGPT chatbot).
SEC sues law firm for client list in the Hafnium cyberattack:
The US Securities and Exchange Commission (SEC) has sued international law firm Covington & Burling for details about 298 of the biz's clients whose information was accessed by a Chinese state-sponsored hacking group in November 2020.
The data theft in question is the now-infamous Microsoft Exchange attack in which Hafnium exploited four zero-day vulnerabilities in the email platform to steal data from US-based defense contractors, law firms, and infectious disease researchers.
Covington was one of the breached law firms, and the intrusion gave the Beijing-backed cyberspies access to some of Covington's clients that are regulated by the US agency.
"Covington has admitted that a foreign actor intentionally and maliciously accessed the files of Covington's clients, including companies regulated by the Commission," the lawsuit says [PDF]. "In light of this reported breach, the Commission is seeking to determine whether the malicious activity resulted in violations of the federal securities laws to the detriment of investors."
Airbus Begins Testing Autonomous Emergency Flight Tech:
If you've traveled by plane a handful of times, chances are you've been on an Airbus. The aerospace corporation's planes are some of the most commonly used commercial aircraft in the world, comparable only with Boeing's 747 line and Antonov's An-24. Now, with a project titled DragonFly, there's a chance Airbus' passenger jets could eventually incorporate autonomous flight technology.
DragonFly is an initiative under Airbus UpNext, a division responsible for testing and validating new tech prior to rollout. In a blog post Thursday, UpNext shared that DragonFly focuses on "derisking" emergency operations by detecting issues and autonomously solving them if crew members are unable to take action. Airbus is hoping to achieve this through biomimicry, or engineering that takes inspiration from living things. It should come as no surprise that UpNext is modeling its autonomous system after an actual dragonfly, which uses its massive eyes to see in 360° and differentiate important landmarks.
"The systems we are developing and testing are similarly designed to review and identify features in the landscape that enable the aircraft to 'see' and safely maneuver within its surroundings," the division's blog post reads. At the DragonFly's core are a series of sensors, which work alongside computer algorithms to process visual data. These calculations are designed to help pilots land in low visibility and extreme weather conditions. In a situation in which the crew is busy or incapacitated, DragonFly will use these novel insights to land autonomously, redirecting to the nearest appropriate airport if necessary. UpNext claims the system will eventually reach a point where it can independently land at any airport in the world.