SoylentNews is people

posted by janrinok on Wednesday March 15, @11:08PM

A tough time for big tech workers continues:

Founder and CEO Mark Zuckerberg announced Tuesday in a written statement that the tech giant would lay off 10,000 more workers, adding to the 11,000 people it laid off back in November. Additionally, around 5,000 open roles that hadn't been filled yet will be closed. In other words, it's a hiring freeze on top of a large number of layoffs.

Zuckerberg acknowledged the cuts in a blog post updating Meta's "Year of Efficiency."

This will be tough and there's no way around that. It will mean saying goodbye to talented and passionate colleagues who have been part of our success. They've dedicated themselves to our mission and I'm personally grateful for all their efforts. We will support people in the same ways we have before and treat everyone with the gratitude they deserve.

Amid the layoffs, Meta has also announced that it is stepping away from NFTs to focus on other projects.

According to TechCrunch, Meta's employee headcount came in at around 76,000 after November's layoffs. This week's job cuts would bring the count down to around 66,000.

This is, unfortunately, just part of a wider trend in the world of big tech. Other tech firms of varying sizes like Lyft, Groupon, Vimeo, and Microsoft have all laid off workers in the last year due to broader economic difficulties.

Previously: Meta Employees Brace for Layoffs Ahead of Zuckerberg's Paternity Leave

Original Submission

posted by janrinok on Wednesday March 15, @08:27PM

After unusually low amounts of rain and snow this winter, the continent faces a severe water shortage:

The drought in parts of France is so bad right now that some authorities have banned new home-building projects—for the next four years. Despite a severe housing shortage in France, new homes just aren't worth the drain on water resources that construction, and eventual new residents, would cause, say nine communes in the south of the country.

It's just one of many signs that Europe is running dry. "What we are looking at is something like a multiyear drought," says Rohini Kumar of the Helmholtz Centre for Environmental Research in Germany. Unusually low rainfall and snowfall were recorded this winter not just in France but also in the UK, Ireland, Switzerland, and parts of Italy and Germany. The current predicament follows European droughts in 2018, 2019, 2020, and 2022.

Last summer, drought exacerbated by record temperatures around the continent was in the headlines. The subsequent dry winter has meant that many aquifers—places underground that retain water—and surface reservoirs have not had a chance to recover. Now, summer beckons once again, and experts who spoke to WIRED are worried that a severe water shortage could threaten lives, industry, and biodiversity in a big way.

The European Drought Observatory tracks indicators of drought across the continent, including from satellite measurements, and suggests that vast regions are far drier than they should be. "Honestly, all over Central Europe, this issue, it's a widespread problem," says Carmelo Cammalleri at the Polytechnic University of Milan.

He estimates that reservoirs in France and northern Italy are about 40 to 50 percent lower than they should be. The longest river in Italy, the Po, is 60 percent below its normal levels. Not only that, the Alps hold roughly half the snow that would be expected at this time of year. That's a huge problem, because much of Central Europe relies on meltwater from these famous mountains every spring. "The Alps are known as the water towers of Europe for a reason," says Cammalleri.

France has just experienced its driest winter for 60 years. In some places, you can find extreme examples of how people have been affected. Take the village of Coucouron in the south of the country, where a truck has had to deliver drinking water up to 10 times a day since July—without any hiatus during the supposedly wetter months.

Many rivers in the UK are also at record lows. And look to the Rhine, an arterial river that rises in the Alps and flows through multiple countries toward the North Sea. It fell considerably last year, causing massive headaches for the barges that use it to transport goods. Right now, the river level is 1 to 2 meters below average for this time of year, according to some estimates. Lucie Fahrner, a spokeswoman for the Central Commission for the Navigation of the Rhine, maintains that the river is not problematically low at present, despite those lower-than-average levels, adding that various measures to help shipping cope with future droughts are currently being evaluated.

What happens during the next few months will really matter. Abundant rainfall could ease the situation and stave off the worst-case scenario. But Europe needs a lot. "We're talking about a sea, a sea's worth of water," says Hannah Cloke at the University of Reading in the UK. In terms of volume, hundreds of millions of liters of rain would have to fall across the continent to fill the deficit, she estimates. It would have to amount to higher-than-average rainfall for France and certain other places, including parts of the UK. The chances of that are, unfortunately, not high.

Original Submission

posted by janrinok on Wednesday March 15, @05:40PM
from the livin'-off-cloud-nine dept.

After being laid off, many people are starting their own businesses as cloud pros for hire:

The recent tech industry layoffs are driving a wave of what some are calling "solopreneurs" doing gig work or independent contracting. Think DoorDash or Uber Eats, but instead of delivering Thai food, people are delivering key cloud advisory services or even completed cloud-based systems ready for deployment.

This is driven by the anticipation that a slowing economy is likely to drive down tech sales. But also, a cloud skills shortage is occurring simultaneously. We're not preparing enough cloud professionals to keep up with demand, but, at the same time, tech companies are laying them off. Go figure.

This has been evolving for years as workers come to understand the value of the gig economy and look for more independence and less reliance on employment with the larger technology players. Many technology professionals are exploring more entrepreneurial options instead of opting for standard full-time jobs and cushy benefits with companies that can't guarantee a job for life—and never could.

Indeed, 63% of tech workers report they have started their own company post-layoff, according to a recent survey of 1,000 professionals laid off in recent years. Most of these new ventures (83%) exist in the technology industry, especially cloud computing.

[...] I suspect that many of these entrepreneurs will reach a valuation of many millions of dollars (depending on the type of cloud tech business) after a couple of years and average growth. I've seen this personally a great many times. Beats most 401(k)s.

Also interesting, according to the survey, 93% report they are now competing with the company that let them go. [...]

This will have an overall positive effect on the technology industry and cloud computing specifically, given that these types of businesses drive more innovation. They are not hindered by large corporate governance and company politics. Creativity and innovation are directly rewarded with sales and higher business value. This will also increase the number of wealthy people in the technology industry, since this model will better distribute wealth among more technology industry contributors.

Original Submission

posted by janrinok on Wednesday March 15, @02:54PM

The STEM feed comes as TikTok faces increasing scrutiny:

TikTok has a large science community, and the social network wants everyone to know it on Pi Day (March 14). The company is launching a dedicated STEM (science, technology, engineering and math) feed that shows only these more educational videos. You may learn to code or discuss experiments without having to wade through TikTok's usual entertainment-focused content.

Not surprisingly, TikTok is taking steps to block misinformation in this new section. Curator Common Sense Networks will study content to make sure it's relevant to the STEM feed, while the fact-checkers at Poynter will gauge the accuracy. Any videos that don't pass both inspections won't reach the new feed.

Users in the US will start seeing the STEM feed in the "coming weeks," TikTok says. The social media giant has already been experimenting with a "Topic Feed" in some regions to court fans of gaming, sports and other common subjects. The science-oriented feed is considered an expansion of this initiative.

[...] Whether or not this helps with TikTok's survival in the US is another matter. Some politicians want to ban TikTok outright over fears it's a national security threat. Officials are concerned China may collect data about key Americans or spread propaganda.

See also: TikTok is Adding a Dedicated Feed for STEM Content

Original Submission

posted by janrinok on Wednesday March 15, @12:04PM
from the I-wish-I-understood-quantum-physics dept.

First demonstration of universal control of encoded spin qubits:

HRL Laboratories, LLC, has published the first demonstration of universal control of encoded spin qubits. This newly emerging approach to quantum computation uses a novel silicon-based qubit device architecture, fabricated in HRL's Malibu cleanroom, to trap single electrons in quantum dots. Spins of three such single electrons host energy-degenerate qubit states, which are controlled by nearest-neighbor contact interactions that partially swap spin states with those of their neighbors.

[...] The encoded silicon/silicon germanium quantum dot qubits use three electron spins and a control scheme whereby voltages applied to metal gates partially swap the directions of those electron spins without ever aligning them in any particular direction. The demonstration involved applying thousands of these precisely calibrated voltage pulses in strict relation to one another over the course of a few millionths of a second.

The quantum coherence offered by the isotopically enriched silicon used, the all-electrical and low-crosstalk-control of partial swap operations, and the configurable insensitivity of the encoding to certain error sources combine to offer a strong pathway toward scalable fault tolerance and computational advantage, major steps toward a commercial quantum computer.

[...] "It is hard to define what the best qubit technology is, but I think the silicon exchange-only qubit is at least the best-balanced," said Thaddeus Ladd, HRL group leader and co-author.

Journal Reference: Aaron J. Weinstein et al, Universal logic with encoded spin qubits in silicon, Nature (2023).

Original Submission

posted by hubie on Wednesday March 15, @09:23AM
from the put-this-in-your-spice-model dept.

It'll allow researchers to develop 'a mechanistic understanding of how the brain works':

Researchers understand the structure of brains and have mapped them out in some detail, but they still don't know exactly how brains process data — for that, a detailed "circuit map" of the brain is needed.

Now, scientists have created just such a map for the most advanced creature yet: a fruit fly larva. Called a connectome, it diagrams the insect's 3,016 neurons and 548,000 synapses, Neuroscience News has reported. The map will help researchers better understand how the brains of both insects and other animals control behavior, learning, body functions and more. The work may even inspire improved AI networks.

"Up until this point, we've not seen the structure of any brain except of the roundworm C. elegans, the tadpole of a low chordate, and the larva of a marine annelid, all of which have several hundred neurons," said professor Marta Zlatic from the MRC Laboratory of Molecular Biology. "This means neuroscience has been mostly operating without circuit maps. Without knowing the structure of a brain, we're guessing on the way computations are implemented. But now, we can start gaining a mechanistic understanding of how the brain works."

[...] As a next step, the team will investigate the structures used for behavioural functions like learning and decision making, and examine connectome activity while the insect does specific activities. And while a fruit fly larva is a simple insect, the researchers expect to see similar patterns in other animals. "In the same way that genes are conserved across the animal kingdom, I think that the basic circuit motifs that implement these fundamental behaviours will also be conserved," said Zlatic.

Original Submission

posted by hubie on Wednesday March 15, @06:37AM
from the things-expand-to-exceed-the-space-provided dept.

Hackaday has a story about a simple non-scientific calculator that packs an Allwinner A50 tablet SoC and the Android operating system:

As shipped, they lack the Android launcher, so they aren't designed to run much more than the calculator app. Of course, that won't stop somebody who knows their way around Google's mobile operating system for very long - at the end of the review, there are some shots of the gadget running Minecraft and playing streaming video.

But it does raise the question of why such a product was put into production when the same task could have been performed by a very cheap microcontroller. Further, having done so, the manufacturer made it a non-scientific machine, not even bestowing it with anything that could possibly justify the hardware.

Embedded has a more general post about overengineering in embedded systems:

Embedded systems have traditionally been resource-constrained devices that have a specific purpose. They are not general computing devices but often some type of controller, sensor node, etc. As a result, embedded systems developers often are forced to balance bill-of-material (BOM) costs with software features and needs, resulting in a system that does a specific purpose efficiently and economically.

Over the last few years, I've noticed many systems being built that seem to ignore this balance. For example, I've seen intelligent thermostats that could be built using an Arm Cortex-M4 with a clock speed of less than 100 MHz and several hundred kilobytes of memory. Instead, these systems are designed using multicore Arm Cortex-M7 (or even Cortex-A!) parts running at 600 MHz+ with several megabytes of memory! This leads me to ask, are embedded systems developers today overengineering their systems?

I think there are more systems today that are designed with far more memory and processing power than is necessary to get the job done. To some degree, the push for IoT and edge devices has driven a new level of complexity into embedded systems that were once optimized for cost and performance. In addition, connectivity and the need to potentially add new features to a product for a decade or more into the future are leading developers to overestimate their needs and overengineer their systems.

While leaving extra headroom in a system for future expansion is always a great idea, I've seen the extras recently move into excess. It's not uncommon for me to encounter a team that doesn't understand its system's performance or software requirements. Yet, they've already selected the most cutting-edge microcontroller they can find. When asked how their part selection relates to their requirements, I've heard multiple times, "We don't know, so we picked the biggest part we could find just in case". Folks, that's not engineering; that's design by fear!

Original Submission

posted by hubie on Wednesday March 15, @03:52AM
from the doing-the-right-thing-for-the-wrong-reasons dept.

FISA Oversight Board Member Says Americans Need More Privacy Protections As Congress Debates Section 702 Reauthorization:

One of the NSA's most powerful spying tools is up for renewal at the end of the year. The problem with this power isn't necessarily the NSA. I mean, the NSA has its problems, but the issue here is the domestic surveillance performed by the FBI via this executive power — something it shouldn't be doing but has almost always done.

The FBI is currently catching a lot of heat for its "backdoor" access to US persons' data and communications, something it has shown little interest in controlling or tracking. Abuse is a regular occurrence and this abuse finally received some high profile attention after Congressional Republicans got bent out of shape because some of their own people ended up under the FBI's backdoor Section 702 microscope.

[...] Section 702 allows the NSA to perform "upstream" collections of data and communications. It's foreign-facing but it also collects any communications between foreign targets and US persons. That's where the FBI steps in. It's only supposed to be able to access minimized data and communications, but these restrictions are often ignored by the agency.

[...] Specifically, the program needs constraints on the FBI's access and use of the data collected by the NSA. For years, the FBI has abused its access to perform backdoor searches of Americans' data. And for years, it has been unable to explain why it can't stop violating minimization procedures and what, if anything, this unexpected, "incidental" treasure trove contributes to its law enforcement work.

[...] To that end, LeBlanc suggests a couple of changes. First, there's the court order requirement. Then Congress could limit the NSA's haystack-building apparatus by ending its "about" collection, which allows it to also gather communications that merely mention certain individuals, rather than limiting collection to those actually communicating with the agency's targets. Finally, Congress should act to limit or forbid "batch searches" of 702 collections by the FBI, preventing it from engaging in mass violations of the Fourth Amendment that courts (so far) have ruled the government should never have to answer for.

If anyone can get this done, it's Congressional leaders motivated by personal animus and political grandstanding. An entire party is, at the moment, extremely angry at the FBI. Blatant self-interest may finally achieve what privacy advocates and activists have been seeking for several years. If the ends are going to justify the means, it may as well be these ends and those means. Some concern for the little people would be nice, but as an advocate of restricted surveillance powers, I'm willing to take what I can get.

Original Submission

posted by hubie on Wednesday March 15, @01:10AM

The pioneering project cuts cement from the recipe and replaces it with industrial waste and carbon dioxide captured from the atmosphere:

Block-Lite is a small concrete manufacturer in an industrial corridor of Flagstaff, Arizona. The third-generation family business makes bricks and other masonry materials for retaining walls, driveways, and landscaping projects. The company was already a local leader in sustainability — in 2020, it became the first manufacturer in Flagstaff to power its operations with on-site solar panels. But now it's doing something much more ambitious.

On Tuesday, Block-Lite announced a pioneering collaboration with climate tech startups Aircapture and CarbonBuilt to suck carbon dioxide from the atmosphere and stash it in concrete blocks. The companies estimate the project will reduce the carbon footprint of Block-Lite's products by 70 percent, creating a model they hope could reshape the industry.

[...] CarbonBuilt has developed a solution that addresses the issue in two distinct ways. First, the company found a proprietary way to replace cement with a mix of inexpensive, locally sourced industrial waste materials. CEO Rahul Shendure told Grist they include common byproducts of coal plants, steelmaking, and chemical production that would, for the most part, otherwise be destined for landfills. The company's second feat is the way its equipment hardens that slurry into concrete blocks—by curing it with carbon dioxide. That's where Aircapture comes in. The company will build one of its machines, which extract carbon dioxide from the ambient air, directly on Block-Lite's site.

[...] Block-Lite did not respond to Grist's inquiry, but in a press release, the company suggested that the new concrete products would be no costlier than its current offerings. "All too often, sustainable building materials require a trade-off between cost and performance, but what is unique about this project is that there's no 'green premium,'" Block-Lite said. "We're going to be able to produce on-spec, ultra-low-carbon blocks at price parity with traditional blocks, which should speed adoption and impact."

Original Submission

posted by janrinok on Tuesday March 14, @10:20PM

To check that atomic weapons work, scientists run simulations of explosions using high-energy lasers—and Russia is building the strongest one of all:

In the town of Sarov, roughly 350 kilometers east of Moscow, scientists are busy working on a project to help keep Russia's nuclear weapons operational long into the future. Inside a huge facility, 10 storeys high and covering the area of two football fields, they are building what's officially known as UFL-2M—or, as the Russian media has dubbed it, the "Tsar Laser." If completed, it will be the highest-energy laser in the world.

High-energy lasers can concentrate energy on groups of atoms, increasing temperature and pressure to start nuclear reactions. Scientists can use them to simulate what happens when a nuclear warhead detonates. By creating explosions in small samples of material—either research samples or tiny amounts from existing nuclear weapons—scientists can then calculate how a full-blown bomb is likely to perform. With an old warhead, they can check that it still works as intended. Laser experiments allow testing without letting a nuke off. "It's a substantial investment by the Russians in their nuclear weapons," says Jeffrey Lewis, a nuclear non-proliferation researcher at the Middlebury Institute of International Studies in California.

Until now, Russia has been unique among the best-established nuclear powers in not having a high-energy laser. The United States has its National Ignition Facility (NIF), currently the world's most energetic laser system. Its 192 separate beams combine to deliver 1.8 megajoules of energy. Looked at in one way, a megajoule is not an enormous amount—it's equivalent to 240 food calories, similar to a light meal. But concentrating this energy onto a tiny area can create very high temperatures and pressures. France meanwhile has its Laser Mégajoule, with 80 beams currently delivering 350 kilojoules, though it aims to have 176 beams delivering 1.3 megajoules by 2026. The UK's Orion laser produces 5 kilojoules of energy; China's SG-III laser, 180 kilojoules.
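The article's comparison of laser energies to food calories can be sanity-checked with a short sketch. The figures are the nominal energies quoted above; the only assumption added here is the standard conversion of 4184 joules per food calorie (kilocalorie):

```python
# Convert the laser energies quoted above into food calories (kcal).
# 1 food calorie = 4184 joules, so 1 megajoule is about 239 kcal,
# matching the article's "240 food calories" rule of thumb.
JOULES_PER_KCAL = 4184

def to_kcal(joules: float) -> float:
    """Joules to food calories (kilocalories)."""
    return joules / JOULES_PER_KCAL

lasers = {  # nominal energies as quoted in the article
    "NIF (US)": 1.8e6,
    "Laser Megajoule (France, current)": 350e3,
    "Orion (UK)": 5e3,
    "SG-III (China)": 180e3,
}
for name, joules in lasers.items():
    print(f"{name}: {to_kcal(joules):.0f} kcal")
```

NIF's 1.8 megajoules works out to roughly a light meal's worth of energy, which is why the concentration of that energy onto a tiny target, rather than the raw total, is what produces the extreme temperatures and pressures.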

If completed, the Tsar Laser will surpass them all. Like the NIF, it's due to have 192 beams, but with a higher combined output of 2.8 megajoules. Currently, though, only its first stage has launched. At a Russian Academy of Sciences meeting in December 2022, an official revealed that the laser boasts 64 beams in its current state. Their total output is 128 kilojoules, about 5 percent of the planned final capability. The next step would be testing them, the official said.

[...] In experiments, these lasers blast their target materials into a high-energy state of matter known as plasma. In gases, solids, and liquids, electrons are usually locked tight to their atoms' nuclei, but in plasma they roam freely. The plasmas throw out electromagnetic radiation, such as flashes of light and x-rays, and particles like electrons and neutrons. The lasers therefore also need detection equipment that can record when and where these events happen. These measurements then allow scientists to extrapolate how a full warhead might behave.

[...] Researchers have used lasers in nuclear weapons testing since at least the 1970s. At first they combined them with underground tests of actual weapons, using data from both to build theoretical models of how plasma behaves. But after the US stopped live-testing nuclear weapons in 1992 while seeking agreement on the Comprehensive Nuclear-Test-Ban Treaty, it switched to "science-based stockpile stewardship"—namely, using supercomputer simulations of warheads detonating to assess their safety and reliability.

But the US and other countries following this approach still needed to physically test some nuclear materials, with lasers, to ensure their models and simulations matched reality and that their nukes were holding up. And they still need to do this today.

[...] But Tikhonchuk [emeritus professor at the Center for Intense Lasers and Applications at the University of Bordeaux, France] believes that Russia will struggle now because it has lost much of the expertise needed, with scientists moving overseas. He notes that the Tsar Laser's beam arrays are very large, at 40 centimeters across, which poses a significant challenge for making their lenses. The larger the lens, the greater the chance there will be a defect in it. Defects can concentrate energy, heating up and damaging or destroying the lenses.

The fact that Russia is developing the Tsar Laser indicates it wants to maintain its nuclear stockpile, says Lewis. "It's a sign that they plan for these things to be around for a long time, which is not great." But if the laser is completed, he sees a sliver of hope in Russia's move. "I'm quite worried that the US, Russia, and China are going to resume explosive testing." The Tsar Laser investment might instead show that Russia thinks it already has enough data from explosive nuclear tests, he says.

Original Submission

posted by janrinok on Tuesday March 14, @07:34PM

Resulting in the birth of several mice that were produced without mothers:

Same-sex reproduction has historically required donor cells, as is the case with egg implantation and some instances of in-vitro fertilization (IVF). Thanks to genetic engineering, however, this might not always be the case. Scientists in Japan have successfully created eggs using male cells, resulting in the birth of several mice that were produced without mothers.

Renowned Kyushu University stem cell researcher Katsuhiko Hayashi presented his team's achievement this week at the Third International Summit on Human Genome Editing in London. Hayashi had led his colleagues through "reprogramming" a male mouse's skin cells into induced pluripotent stem (iPS) cells, or former non-reproductive cells that can be engineered into various cell forms. Because male cells contain the XY chromosome combination, Hayashi had to remove the Y chromosome and replace it with an X chromosome from another cell. (Hayashi's team attempted to devise a way to duplicate the first cell's X chromosome but was unsuccessful, resulting in the need to pull from a donor.)

Hayashi implanted the makeshift eggs inside a mouse ovary organoid, a ball of tissues that function similarly to a natural ovary. After fertilizing the eggs with sperm, his team implanted the resulting 600 embryos into surrogate mice. Seven of these embryos became mouse pups, which grew into adults with normal lifespans and successful mating routines.

Should Hayashi and his colleagues successfully produce eggs in the lab, it could pave the way for novel infertility treatments and for same-sex procreation that incorporates both partners' genes.

Original Submission

posted by janrinok on Tuesday March 14, @04:53PM

And scientists have only seen four percent of the data so far:

A project to map the earliest structures of the universe has found 15,000 more galaxies in its first snapshot than captured in an entire deep field survey conducted 20 years ago.

The James Webb Space Telescope, the new preeminent observatory in the sky, saw about 25,000 galaxies in that single image, dramatically surpassing the nearly 10,000 shown in the Hubble Space Telescope's Ultra Deep Field Survey. Scientists say that little piece of the space pie represents just four percent of the data they'll discover from the new Webb survey by the time it's completed next year.

"When it is finished, this deep field will be astoundingly large and overwhelmingly beautiful," said Caitlin Casey, a University of Texas at Austin astronomer co-leading the investigation, in a statement.

[...] A deep field image is much like drilling deep into Earth to take a core sample: It's a narrow but distant view of the cosmos, revealing layers of history by cutting across billions of light-years. In Hubble's deep field, the oldest visible galaxies dated back to the first 800 million years after the Big Bang. That's an incredibly early period relative to the universe's estimated age of 13.8 billion-with-a-B years.

[Image: Four different types of galaxies observed through the COSMOS-Web deep field survey.]

[...] The COSMOS-Web survey will map 0.6 square degrees of the sky—about the area of three full moons.

The first images from COSMOS-Web, the largest program in Webb's first year, show a rich variety of structures, teeming with spiral galaxies, gravitational lensing, and galaxy mergers. Furthermore, hundreds of galaxies that were previously identified by Hubble are getting reclassified with different characteristics after being shown in more detail with Webb.

Original Submission

posted by janrinok on Tuesday March 14, @02:12PM
from the more-data-for-spreadsheet-nerds dept.

SSD Reliability is Only Slightly Better Than HDD, Backblaze Says

A surprising outcome for the first SSD-based AFR report:

Backblaze is a California-based company dealing with cloud storage and data backup services. Every year, the organization provides some interesting reliability data about the large fleet of storage units employed in its five data centers around the world.

For the first time, Backblaze's latest report on storage drive reliability is focusing on Solid State Drives (SSD) rather than HDD units alone. The company started using SSDs in the fourth quarter of 2018, employing the NAND Flash-based units as boot drives rather than data-storing drives. Backblaze uses consumer-grade drives, providing Annualized Failure Rate (AFR) information about 13 different models from five different manufacturers.

The 2022 Drive Stats review is based on data recorded from 2,906 SSD boot units, Backblaze states, and it essentially confirms what the company was saying in its 2022 mid-year report. SSDs are more reliable than HDDs, Backblaze says, as they show a lower AFR (0.98%) compared to HDDs (1.64%).

The fact that the difference in reliability isn't exactly staggering (0.66 percentage points of AFR) is rather surprising, however, as SSDs are essentially just moving electrons through memory chips, while hard drives have to deal with a complex (and failure-prone) mechanism of spinning platters and extremely sensitive read/write magnetic heads.

The reasons behind the failing drives aren't known, as only an SSD manufacturer would have the equipment needed to make a reliable diagnosis. For 2022, Backblaze says that seven of the 13 drive models had no failures at all. Six of those seven models accumulated a limited number of "drive days" (fewer than 10,000), the company concedes, meaning there is not enough data to make a reliable projection of their failure rates.
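Backblaze defines AFR as failures per drive-year of service. A minimal sketch (the fleet totals below are illustrative assumptions, not Backblaze's published per-model counts) shows why models with fewer than 10,000 drive days yield unstable estimates:

```python
# Annualized Failure Rate (AFR) as Backblaze defines it:
#   AFR = failures / (drive_days / 365) * 100
# Illustrative numbers only, not Backblaze's actual per-model data.

def afr(failures: int, drive_days: int) -> float:
    """Percent of drives expected to fail per drive-year of service."""
    return failures / (drive_days / 365) * 100

# A large fleet: 25 failures over ~931,000 drive days -> ~0.98% AFR
large_fleet = afr(25, 931_000)

# A model with under 10,000 drive days: one failure swings the figure wildly
small_fleet_zero = afr(0, 9_000)  # 0.0% -- looks perfect
small_fleet_one = afr(1, 9_000)   # ~4.1% -- one failure, four times the fleet AFR
```

With so few drive days, the difference between a flawless record and an alarming one is a single failed drive, which is why Backblaze declines to project failure rates for those models.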

An interesting tidbit about Backblaze's report is that the company hasn't used a single SSD unit made by Samsung, which is a major player in the SSD consumer market. One possible explanation is that Samsung drives aren't cheap, and Backblaze is essentially using the cheapest drives they can buy in bulk quantities.

The SSD Edition: 2022 Drive Stats Review:

Welcome to the 2022 SSD Edition of the Backblaze Drive Stats series. The SSD Edition focuses on the solid state drives (SSDs) we use as boot drives for the data storage servers in our cloud storage platform. This is opposed to our traditional Drive Stats reports which focus on our hard disk drives (HDDs) used to store customer data.

We started using SSDs as boot drives beginning in Q4 of 2018. Since that time, all new storage servers and any with failed HDD boot drives have had SSDs installed. Boot drives in our environment do much more than boot the storage servers. Each day they also read, write, and delete log files and temporary files produced by the storage server itself. The workload is similar across all the SSDs included in this report.

In this report, we look at the failure rates of the SSDs that we use in our storage servers for 2022, for the last 3 years, and for the lifetime of the SSDs. In addition, we take our first look at the temperature of our SSDs for 2022, and we compare SSD and HDD temperatures to see if SSDs really do run cooler.

As of December 31, 2022, there were 2,906 SSDs being used as boot drives in our storage servers. There were 13 different models in use, most of which are considered consumer grade SSDs, and we'll touch on why we use consumer grade SSDs a little later. In this report, we'll show the Annualized Failure Rate (AFR) for these drive models over various periods of time, making observations and providing caveats to help interpret the data presented.

The dataset on which this report is based is available for download on our Drive Stats Test Data webpage. The SSD data is combined with the HDD data in the same files. Unfortunately, the data itself does not distinguish between SSD and HDD drive types, so you have to use the model field to make that distinction. If you are just looking for SSD data, start with Q4 2018 and go forward.
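Since the files don't carry a drive-type column, separating SSDs means matching on the model field. A sketch of that filtering step (the model strings and CSV layout here are assumptions for illustration; check them against the real daily files on the Drive Stats Test Data page):

```python
# Split Backblaze Drive Stats rows into SSD vs. HDD by model string,
# since the published files have no drive-type column.
import csv
import io

# Hypothetical subset of boot-drive SSD models; a real analysis would need
# the full model list from Backblaze's report.
SSD_MODELS = {"Seagate BarraCuda SSD ZA250CM10002", "DELLBOSS VD"}

# Stand-in for one day's CSV file from the Drive Stats dataset.
sample = io.StringIO(
    "date,serial_number,model,capacity_bytes,failure\n"
    "2022-12-31,S1,Seagate BarraCuda SSD ZA250CM10002,250059350016,0\n"
    "2022-12-31,H1,ST4000DM000,4000787030016,0\n"
    "2022-12-31,S2,DELLBOSS VD,480036847616,1\n"
)

ssd_rows = [row for row in csv.DictReader(sample) if row["model"] in SSD_MODELS]
```

Each retained row is one drive-day for an SSD; summing the `failure` column over all such rows gives the failure count that feeds the AFR calculation.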

Click on the link to get the actual figures.

Original Submission #1 | Original Submission #2

posted by janrinok on Tuesday March 14, @11:26AM   Printer-friendly
from the lost-in-a-crowd dept.

Our brain has its own GPS and it helps us navigate by detecting the movements of the people around us:

Whether you are making your way through a crowded pedestrian zone or striving towards the goal in a team game, in both situations it is important to think not only about your own movements but also those of others. These navigation and orientation processes are carried out by brain cells that register our current position, where we are coming from, where we are moving towards and in which direction we are looking. Through their joint activity, they create a "map" of our surroundings. A special type of these cells are the so-called grid cells in the entorhinal cortex, a small brain region in the medial temporal lobe. They function like the brain's own GPS, because they not only represent our position in space, but can also put it in relation to other points in the same space.

[...] They found that the brain activity recorded while watching others was comparable to the activity of grid cells. In addition, the team was able to show that this activity was part of a larger network of brain regions that are associated with navigation processes. Interestingly, however, it turned out that the better a subject was at following the path of others, the less active this network was. "We interpret this as greater efficiency of the grid cells, which might make it less necessary to engage the larger brain network," Wagner explains.

The results of the study thus suggest that grid cells belong to a larger network of brain regions that, among other aspects, coordinates navigation processes. However, this network is particularly affected by ageing processes and especially by dementia. Wagner explains: "The function of grid cells decreases with age and dementia. As a result, people can no longer find their way around and their orientation is impaired." The group's further research is now dedicated to the question of whether grid cells are also involved in recognising other people - an aspect that is often impaired in advanced dementia.

Journal Reference:
Wagner, I.C., Graichen, L.P., Todorova, B. et al. Entorhinal grid-like codes and time-locked network dynamics track others navigating through space. Nat Commun 14, 231 (2023).

Original Submission

posted by hubie on Tuesday March 14, @08:42AM   Printer-friendly

Wildfire Smoke Eroded Ozone Layer By 10 Percent In 2020: Study:

The havoc wreaked by wildfires isn't just on the ground. Researchers at MIT have found that wildfire smoke particles actively erode Earth's protective ozone layer, widening the hole we've spent decades trying to close.

When something burns and produces smoke, those smoke particles—otherwise called wildfire aerosol—can drift into the stratosphere, where they hang out for a year or more. According to a study published Wednesday in the journal Nature, chemists and atmospheric scientists have found that suspended wildfire aerosol sparks chemical reactions that ultimately degrade the ozone layer, the thin atmospheric layer responsible for shielding Earth from the Sun's ultraviolet radiation.

The newly discovered chemical reaction increases hydrochloric acid's solubility. While hydrochloric acid is already present in the atmosphere, MIT found that larger quantities of it activate chlorine in the air and increase ozone loss rates when warmer temperatures strike. This spells danger for the storied hole in the ozone layer, which environmental activists, scientists, and policymakers have been fighting to shrink for decades.

[...] Thankfully, recent attempts to mitigate damage to the ozone layer have been quite successful. International treaties like the Montreal Protocol have helped phase out the use of ozone-depleting pollutants. The world's gradual adoption of electric vehicles might have also helped. The US National Oceanic and Atmospheric Administration even found that the Antarctic ozone hole was slightly smaller in 2022 than in 2021 and far smaller than in 2006 when its size peaked. That said, it's difficult to know right now whether these efforts are enough to compensate for the ozone damage caused by wildfire smoke.

Journal Reference:
Solomon, S., Stone, K., Yu, P. et al. Chlorine activation and enhanced ozone depletion induced by wildfire aerosol. Nature 615, 259–264 (2023).

Original Submission