The US Food and Drug Administration on Monday advised consumers to avoid nine types of hand sanitizers that may contain methanol, a toxic alcohol that can cause blindness if swallowed and systemic effects if absorbed through the skin.
All nine hand sanitizers are made by Eskbiochem SA de CV in Mexico. The agency said in its advisory that it discovered methanol while testing two of the company's products. One, called Lavar Gel, contained 81 percent methanol and no ethanol, the safe alcohol meant to be used in hand sanitizers. Another, CleanCare No Germ, contained 28 percent methanol.
From the FDA advisory, the Eskbiochem SA de CV products are:
- All-Clean Hand Sanitizer (NDC: 74589-002-01)
- Esk Biochem Hand Sanitizer (NDC: 74589-007-01)
- CleanCare NoGerm Advanced Hand Sanitizer 75% Alcohol (NDC: 74589-008-04)
- Lavar 70 Gel Hand Sanitizer (NDC: 74589-006-01)
- The Good Gel Antibacterial Gel Hand Sanitizer (NDC: 74589-010-10)
- CleanCare NoGerm Advanced Hand Sanitizer 80% Alcohol (NDC: 74589-005-03)
- CleanCare NoGerm Advanced Hand Sanitizer 75% Alcohol (NDC: 74589-009-01)
- CleanCare NoGerm Advanced Hand Sanitizer 80% Alcohol (NDC: 74589-003-01)
- Saniderm Advanced Hand Sanitizer (NDC: 74589-001-01)
Strainoptronics: A New Way To Control Photons:
Researchers discovered a new way to engineer optoelectronic devices by stretching a two-dimensional material on top of a silicon photonic platform. Using this method, which the team led by George Washington University professor Volker Sorger coined "strainoptronics," the researchers demonstrated for the first time that a 2-D material wrapped around a nanoscale silicon photonic waveguide creates a novel photodetector that can operate with high efficiency at the technology-critical wavelength of 1550 nanometers.
[...] 2-D materials have scientific and technologically relevant properties for photodetectors. Because of their strong optical absorption, designing a 2-D material-based photodetector would enable an improved photo-conversion, and hence more efficient data transmission and telecommunications. However, 2-D semiconducting materials, such as those from the family of transition metal dichalcogenides, have, so far, been unable to operate efficiently at telecommunication wavelengths because of their large optical bandgap and low absorption.
[...] Realizing the potential of strainoptronics, the researchers stretched an ultrathin layer of molybdenum telluride, a 2-D material semiconductor, on top of a silicon photonic waveguide to assemble a novel photodetector. They then used their newly created strainoptronics "control knob" to alter its physical properties to shrink the electronic bandgap, allowing the device to operate at near infrared wavelengths, namely at the telecommunication (C-band) relevant wavelength around 1550 nm.
Journal Reference:
R. Maiti, C. Patil, M. A. S. R. Saadi, et al. Strain-engineered high-responsivity MoTe2 photodetector for silicon photonic integrated circuits, Nature Photonics (DOI: 10.1038/s41566-020-0647-4)
Ampere's Product List: 80 Cores, up to 3.3 GHz at 250 W; 128 Core in Q4
The Ampere Altra range, as part of today's release, will offer parts from 32 cores up to 80 cores, at up to 3.3 GHz, with a variety of TDPs up to 250 W. As we've described in our previous news items on the chip, this is an Arm v8.2 core with a few v8.3+/v8.5 features; it offers support for FP16 and INT8, supports 8 channels of DDR4-3200 ECC at 2 DIMMs per channel, and up to 4 TiB of memory per socket in a 1P or 2P configuration. Each CPU will offer 128 PCIe 4.0 lanes, 32 of which can be used for socket-to-socket communications implemented with the CCIX protocol over PCIe. This means 50 GB/s in each direction, and 192 PCIe 4.0 lanes in a dual-socket system for add-in cards. Each of the PCIe lanes can bifurcate down to x2.
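The headline platform numbers above can be sanity-checked with a little arithmetic. This sketch, using only figures from the announcement, derives the theoretical per-socket DRAM bandwidth and the dual-socket PCIe lane count:

```python
# Sanity-check the Altra platform numbers quoted above.

channels = 8            # DDR4 channels per socket
mt_per_s = 3200         # DDR4-3200: mega-transfers per second
bytes_per_transfer = 8  # 64-bit data bus per channel

# Peak theoretical DRAM bandwidth per socket
dram_gb_s = channels * mt_per_s * bytes_per_transfer / 1000
print(f"DRAM bandwidth per socket: {dram_gb_s:.1f} GB/s")  # -> 204.8 GB/s

lanes_per_cpu = 128
lanes_for_2p_link = 32  # repurposed for the CCIX socket-to-socket link

# In a dual-socket system, each CPU gives up 32 lanes to the link
free_lanes_2p = 2 * (lanes_per_cpu - lanes_for_2p_link)
print(f"PCIe 4.0 lanes for add-in cards (2P): {free_lanes_2p}")  # -> 192
```

Both results match the figures Ampere quotes: about 205 GB/s of theoretical memory bandwidth per socket, and 192 usable PCIe 4.0 lanes in a 2P system.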
[...] Previously, Ampere had stated it was going for 80 cores at 3.0 GHz at 210 W; however, the Q80-33 pushes that frequency another 300 MHz for another 40 W. We understand that the taped-out silicon from TSMC performed better than expected, hence this new top processor.
[...] If that wasn't enough, Ampere dropped a sizeable nugget into our pre-announcement briefing. The company is set to launch a 128-core version of Altra later this year.
This will be a new silicon design, beyond Ampere's initial layout of 80 cores for Altra, however Ampere states that while they are using the same platform as the regular Altra, they have done extensive tweaking and optimizations within the mesh interconnect for Altra Max to hide the additional contention that might occur when using the same main memory speeds.
Altra Max will be socket and pin-compatible with Altra, also support dual socket deployments, and Ampere states that the silicon will be ready for early sampling with partners in Q4, and is looking to move into high volume in mid-2021.
Previously: Ampere Launches its First ARM-Based Server Processors in Challenge to Intel
80-Core Arm CPU To Bring Lower Power, Higher Density To A Rack Near You
Related: Amazon Announces 64-core Graviton2 Arm CPU
Marvell Announces ThunderX3, an ARM Server CPU With 96 Cores, 384 Threads
AMD and Intel Have a Formidable New Foe (Amazon)
New #1 Supercomputer: Fujitsu's Fugaku
High performance computing is now at a point where, to be number one, you need very powerful, very efficient hardware, lots of it, and the capability to deploy it. Deploying a single rack of servers totaling a couple of thousand cores isn't going to cut it. The former #1 supercomputer, Summit, is built from 22-core IBM Power9 CPUs paired with NVIDIA GV100 accelerators, totaling 2.4 million cores and consuming 10 megawatts of power. The new Fugaku supercomputer, built at Riken in partnership with Fujitsu, takes the top spot on the June 2020 TOP500 list, with 7.3 million cores and consuming 28 megawatts of power.
The new Fugaku supercomputer is bigger than Summit in practically every way. It has 3.05x the cores, 2.8x the score in the official LINPACK test, and consumes 2.8x the power. It also marks the first time an Arm-based system sits at number one on the TOP500 list.
Also at NYT.
Fujitsu Fugaku report by Jack Dongarra (3.3 MB PDF)
The Fujitsu A64FX is a 64-bit Arm CPU with 48 compute cores and 2-4 assistant cores for the operating system. It uses 32 GiB of on-package High Bandwidth Memory 2 (HBM2). There are no GPUs or accelerators in the Fugaku supercomputer.
Fugaku can reach as high as 537 petaflops of FP64 (boost mode), or 1.07 exaflops of FP32, 2.15 exaflops of FP16, and 4.3 exaOPS of INT8. Theoretical peak memory bandwidth is 163 petabytes per second.
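Those mixed-precision figures follow the usual pattern of throughput doubling each time the operand width halves. A quick check, assuming exact 2x scaling from the FP64 boost figure, reproduces the quoted numbers:

```python
# Check that the quoted FP32/FP16/INT8 rates are ~2x/4x/8x the FP64 rate.
fp64_exaflops = 0.537  # 537 petaflops FP64 in boost mode

for width, factor, quoted in [("FP32", 2, 1.07), ("FP16", 4, 2.15), ("INT8", 8, 4.3)]:
    derived = fp64_exaflops * factor
    print(f"{width}: {derived:.2f} exa-ops/s (quoted: {quoted})")
```

Each derived value lands within rounding distance of the figure in the article (1.074, 2.148, and 4.296 exa-ops/s respectively).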
RMAX of #10 system: 18.2 petaflops (November 2019), 21.23 petaflops (June 2020)
RMAX of #100 system: 2.57 petaflops (November 2019), 2.802 petaflops (June 2020)
RMAX of #500 system: 1.142 petaflops (November 2019), 1.23 petaflops (June 2020)
See also: Arm Aligns Its Server Ambitions To Those Of Its Partners
AMD Scores First Top 10 Zen Supercomputer... at NVIDIA
Every six months TOP500.org announces its list of the top 500 fastest supercomputers. The new TOP500 list -- their 55th -- was announced today with a brand new system at the top.
Installed at the RIKEN Center for Computational Science, the system is named Fugaku. It is comprised of Fujitsu A64FX SoCs, each of which sports 48 cores at 2.2 GHz and is based on the ARM architecture. In total, it has 7,299,072 cores and attains an Rmax of 415.5 PFlop/s on the High Performance Linpack benchmark.
The previous top system is now in 2nd place. The Summit is located at the Oak Ridge National Laboratory and was built by IBM. Each node has two 22-core 3.07 GHz Power9 CPUs and six NVIDIA Tesla V100 GPUs. With a total of 2,414,592 cores, it is rated at an Rmax of 148.6 PFlop/s.
Rounding out the top 3 is the Sierra, which is also by IBM. It has 22-core POWER9 CPUs running at 3.1 GHz and NVIDIA Volta GV100 GPUs. Its score is an Rmax of 94.6 PFlop/s.
When the list was first published in June of 1993, the top system, installed at Los Alamos National Laboratory, was a CM-5/1024 by Thinking Machines Corporation. Comprising 1,024 cores, it was rated at an Rmax of 59.7 GFlop/s. (It would require over 8.6 million of them to match the compute power of today's number one system.) In June 1993, #100 was a Cray Y-MP8/8128 installed at Lawrence Livermore National Laboratory, rated at an Rmax of 2.1 GFlop/s. On that first list, 500th place went to an HPE C3840 having 4 cores and an Rmax of 0.4 GFlop/s. Yes, that is 400 MFlop/s.
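The parenthetical "over 8.6 million" appears to compare Fugaku's theoretical peak (roughly 514 PFlop/s, a figure from the TOP500 listing rather than quoted above) against the CM-5's Rmax. The arithmetic checks out:

```python
# Roughly how many CM-5/1024 systems equal one Fugaku?
fugaku_rpeak_gflops = 513_855_000  # ~513.9 PFlop/s theoretical peak (TOP500)
cm5_rmax_gflops = 59.7             # CM-5/1024 Rmax from the June 1993 list

ratio = fugaku_rpeak_gflops / cm5_rmax_gflops
print(f"{ratio / 1e6:.1f} million CM-5s")  # -> 8.6 million CM-5s
```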
I wonder how today's cell phones would rate against that first list?
For the curious, the benchmark code can be downloaded from http://www.netlib.org/benchmark/hpl/.
When planting trees threatens the forest:
Campaigns to plant huge numbers of trees could backfire, according to a new study that is the first to rigorously analyze the potential effects of subsidies in such schemes.
The analysis, published on June 22 in Nature Sustainability, reveals how efforts such as the global Trillion Trees campaign and a related initiative (H. R. 5859) under consideration by the U.S. Congress could lead to more biodiversity loss and little, if any, climate change upside. The researchers emphasize, however, that these efforts could have significant benefits if they include strong subsidy restrictions, such as prohibitions against replacing native forests with tree plantations.
"If policies to incentivize tree plantations are poorly designed or poorly enforced, there is a high risk of not only wasting public money but also releasing more carbon and losing biodiversity," said study co-author Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in Stanford's School of Earth, Energy & Environmental Sciences. "That's the exact opposite of what these policies are aiming for."
[...] The researchers set out to quantify the full impact of the afforestation subsidies and calculate their effects on net carbon and biodiversity changes across the entire country. They compared the area of Chilean forests under three scenarios: actual observed subsidy patterns, no subsidies and subsidies combined with fully enforced restrictions on the conversion of native forests to plantations. They found that, relative to a scenario of no subsidies, afforestation payments expanded the area covered by trees, but decreased the area of native forests. Since Chile's native forests are more carbon dense and biodiverse than plantations, the subsidies failed to increase carbon storage, and accelerated biodiversity losses.
"Nations should design and enforce their forest subsidy policies to avoid the undesirable ecological impacts that resulted from Chile's program," said study coauthor Cristian Echeverría, a professor at the University of Concepción in Chile. "Future subsidies should seek to promote the recovery of the many carbon- and biodiversity-rich natural ecosystems that have been lost."
Journal Reference:
Robert Heilmayr, Cristian Echeverría, Eric F. Lambin. Impacts of Chilean forest subsidies on forest cover, carbon and biodiversity, Nature Sustainability (DOI: 10.1038/s41893-020-0547-0)
US beekeepers reported lower winter losses but abnormally high summer losses
Beekeepers across the United States lost 43.7% of their managed honey bee colonies from April 2019 to April 2020, according to preliminary results of the 14th annual nationwide survey conducted by the nonprofit Bee Informed Partnership (BIP). These losses mark the second-highest loss rate the survey has recorded since it began in 2006 (4.7 percentage points higher than the average annual loss rate of 39.0%). The survey results highlight the cyclical nature of honey bee colony turnover. Although the high loss rate was driven by the highest summer losses ever reported by the survey, winter losses were markedly lower than in most years. As researchers learn more about what drives these cycles of loss, this year's results emphasize the importance of summer losses for beekeepers.
This past year, winter losses were reported at 22.2%, which is 15.5 percentage points lower than last year and 6.4 points lower than the survey average. However, high summer losses were reported at 32.0%, which is 12.0 percentage points higher than last year and 10.4 points higher than the survey average.
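Note that the two seasonal figures do not simply sum to the 43.7% annual number, because colony counts change through the year as beekeepers split and replace hives. As a rough illustration only (assuming, unrealistically, a fixed population with independent seasonal losses), compounding the two rates gives a ballpark figure in the same range:

```python
winter_loss = 0.222  # reported winter loss rate
summer_loss = 0.320  # reported summer loss rate

# Fraction surviving both seasons if losses compounded on a fixed population
surviving = (1 - winter_loss) * (1 - summer_loss)
combined_loss = 1 - surviving
print(f"compounded loss: {combined_loss:.1%}")  # -> 47.1%

# The reported annual figure (43.7%) is lower because real operations
# replace and split colonies during the year, so the seasonal rates
# apply to different colony populations.
```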
"This year, summer loss was actually the highest we've ever recorded, even higher than winter losses, which is only the second time we've seen that, and it's mostly commercial beekeepers that are driving that loss number, which is unusual," says Nathalie Steinhauer, BIP's science coordinator and a post-doctoral researcher in the University of Maryland Department of Entomology. "So that makes this year different and interesting to us, because we want to know what is driving their losses up in comparison to previous years."
Commercial beekeepers typically have lower losses than backyard and smaller operations. Commercial honey bees pollinate $15 billion worth of food crops in the United States each year, so their health is critical to food production and supply.
Survey data, Loss map, Preliminary report (abstract (pdf))
Seeing isn't always believing: Google starts fact-checking images
Google said Monday it will start labeling some misleading photos in its images search feature with a fact-check label, expanding that function beyond search and videos as misinformation continues to run rampant online. If a website or news article debunks an image in some way, the company will add a small "fact-check" label to the description of photos in search. A larger preview of the photo will show a short summary of the fact-check and direct users to its source.
[...] Fact-checking from social media and other tech companies has become common in the past three years — Facebook, Twitter and Google all do it to some extent — but it is by no means universal and often relies on news media and other partners to publish a fact-check and make sure the companies see it. It can also be applied unevenly, something that triggers complaints.
[...] The company used the example of an image showing a giant shark swimming along a Houston street. Now a search for the shark image — which was edited to make it seem as though a storm had caused the ocean wildlife to swim alongside cars — will show a small fact-check label next to a photo attached to a PolitiFact article.
Google said it is launching the feature fully this week.
Apple announces Mac architecture transition from Intel to its own ARM chips, offers emulation story
Apple has just announced its plans to switch from Intel CPUs in Macs to silicon of its own design, based on the ARM architecture. This means that Apple will now design its own chips for iOS devices and its Mac desktops and laptops. Apple said it will ship its first ARM Mac before the end of the year, and complete the Intel-to-ARM transition within two years.
Apple will bring industry-leading performance and performance per watt with its custom silicon. Apple's chips will combine a custom CPU, GPU, SSD controller, and many other components. The Apple silicon will include the Neural Engine for machine learning applications.
[...] "Most apps will just work".
The Next Phase: Apple Lays Out Plans To Transition Macs from x86 to Apple SoCs
[From] an architecture standpoint, the timing of the transition is a bit of an odd one. As noted by our own Arm guru, Andrei Frumusanu, Arm is on the precipice of announcing the Arm v9 ISA, which will bring several notable additions such as Scalable Vector Extension 2 (SVE2). So either Arm is about to announce v9, and Apple's A14 SoCs will be among the first to implement the new ISA, or Apple will be setting the baseline for macOS-on-Arm at v8.2 and its NEON extensions fairly late into the ISA's lifecycle. This will be something worth keeping an eye on.
[...] [In] order to bridge the gap between Apple's current software ecosystem and where they want to be in a couple of years, Apple will once again be investing in a significant software compatibility layer in order to run current x86 applications on future Arm Macs. To be sure, Apple wants developers to recompile their applications to be native – and they are investing even more into the Xcode infrastructure to do just that – but some degree of x86 compatibility is still a necessity for now.
The cornerstone of this is the return of Rosetta, the PowerPC-to-x86 binary translation layer that Apple first used for the transition to x86 almost 15 years ago. Rosetta 2, as it's called, is designed to do the same thing for x86-to-Arm, translating x86 macOS binaries so that they can run on Arm Macs. Rosetta 2's principal mode of operation will be to translate binaries at install time.
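The install-time approach can be illustrated with a toy sketch. This is emphatically not Apple's Rosetta 2 (which translates real machine code and handles registers, calling conventions, and memory ordering); it only shows the idea of translating a whole program once, ahead of time, instead of interpreting each instruction every run:

```python
# Toy illustration of ahead-of-time (install-time) binary translation.
# Hypothetical "source ISA" program, as (op, dest, src) tuples:
X86ISH_PROGRAM = [
    ("mov", "eax", 2),
    ("add", "eax", 3),
    ("ret", "eax", None),
]

# Static mapping from source ops to target ops (a real translator
# does far more: register allocation, ABI fixups, memory-model fences).
OP_MAP = {"mov": "movz", "add": "add", "ret": "ret"}

def translate(program):
    """Translate the whole program once, at 'install time'."""
    return [(OP_MAP[op], dst, src) for op, dst, src in program]

def run(program):
    """Execute the translated program on a toy 'target' machine."""
    regs = {}
    for op, dst, src in program:
        if op == "movz":
            regs[dst] = src
        elif op == "add":
            regs[dst] += src
        elif op == "ret":
            return regs[dst]

translated = translate(X86ISH_PROGRAM)  # done once, up front
print(run(translated))                  # -> 5
```

The payoff of doing this at install time is that the translation cost is paid once, and every subsequent launch runs already-translated native code.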
See also: Apple Announces iOS 14 and iPadOS 14: An Overview
Apple's First ARM-Based (Mac) Product Is a Mac mini Featuring an A12Z Bionic, but Sadly, Regular Customers Can't Buy It
Previously: Apple Will Reportedly Sell a New Mac Laptop With its Own Chips Next Year
You know the drill, right? The FBI keeps insisting that it has a "going dark" problem due to encryption making it impossible to access key evidence of supposedly criminal behavior, in theory allowing crime to happen without recourse. The problem, though, is that nearly every single bit of this claim is false. It's kind of stunning.
- It appears that, in practice, the FBI almost never runs into encryption.
- In the rare cases where it has (and we don't know how many, because the FBI admitted it exaggerated how many "locked" devices it had and has since refused to provide an updated count), there do appear to be ways to get into those devices anyway.
- But the key issue, by far, is that the opposite of going dark is happening. Thanks to our increasingly electronic lives, the government actually has way more access to information than ever before.
Two recent articles highlight this in practice, with regards to the FBI trying to track down the rare cases of criminal activity happening around some of the protests.
Wirecard says missing $2.1 billion likely did not exist; withdraws forecasts
Scandal-hit German payments firm Wirecard AG on Monday said a quarter of its assets totalling 1.9 billion euros ($2.13 billion) that auditor EY has been unable to account for likely did not exist in the first place.
The company, whose stock has plummeted 75% since EY refused to sign off its 2019 accounts last week, also said it has withdrawn its preliminary 2019 and first-quarter 2020 financial results as well as forecasts.
"The Management Board of Wirecard assesses on the basis of further examination that there is a prevailing likelihood that the bank trust account balances in the amount of 1.9 billion EUR do not exist," the company said in a statement.
The development comes after Chief Executive Officer Markus Braun quit on Friday with the company scrambling to secure a financial lifeline from its banks, while its search for the money hit a dead end in the Philippines.
'Total disaster': Phantom billions plunge Wirecard into chaos
The one-time investor darling is holding emergency talks with its banks, which are owed roughly 1.75 billion euros, to avert a looming cash crunch triggered by the missing money.
The episode marks a dramatic turn in the fortunes of a homegrown tech firm that attracted some of the world's biggest investors before a whistleblower alleged that it owed its success in part to a web of sham transactions.
See also: Wirecard shares crash again after payments firm says missing $2 billion likely doesn't exist
German finance minister on Wirecard oversight — 'the supervisory institutions did their job'
Bank of China Weighs Ending Wirecard's Credit Line
Short sellers made $2.6 bln off Wirecard plunge
NASA thinks it's time to return to Neptune with its Trident mission:
It's been 30 years since NASA's Voyager 2 spacecraft flew past the ice giant and its largest moon, and that flyby raised more questions than it answered. Maybe we'll get some answers in 2038, when the positions of Jupiter, Neptune and Triton will be just right for a mission.
NASA is deliberating over the next mission in its Discovery Program, narrowing it down to four possibilities: a mission to study Venus' atmosphere, one to observe volcanic activity on Jupiter's moon Io, one to map Venus' surface and study its geology, and one to explore Neptune's moon Triton.
The conceptual mission to Triton is called Trident, and it's competing with the other three to become a full-fledged mission.
[...] The Trident mission would launch in 2026, taking advantage of a rare and efficient alignment between Jupiter, Neptune, and Triton in 2038. It would perform gravity-assist fly-bys of Earth, Venus and Jupiter, each of which would propel the spacecraft toward its goal, before continuing on to Neptune. It would then perform a fly-by of Neptune and a fly-by of Triton. Sadly, the mission profile doesn't include any orbiters or landers.
The spacecraft's unique path would mean that even with only one fly-by of Triton, it would be able to map the surface of the moon almost completely. It would also be able to fly within 500 km (310 miles) of the surface, right through Triton's thin atmosphere.
[...] The Triton mission is only a concept at this point. And it's competing with three other missions for selection. By summer 2021, NASA will have narrowed the choice down to two finalists, or possibly one winner.
More information:
Mitchell, Prockter, et al. Implementation of Trident: A Discovery Class Mission to Triton. 50th Lunar and Planetary Science Conference Abstracts. (2019) www.hou.usra.edu/meetings/lpsc2019/pdf/3200.pdf
'BlueLeaks' Exposes Files from Hundreds of Police Departments
Hundreds of thousands of potentially sensitive files from police departments across the United States were leaked online last week. The collection, dubbed "BlueLeaks" and made searchable online, stems from a security breach at a Texas web design and hosting company that maintains a number of state law enforcement data-sharing portals.
The collection — nearly 270 gigabytes in total — is the latest release from Distributed Denial of Secrets (DDoSecrets), an alternative to Wikileaks that publishes caches of previously secret data.
In a post on Twitter, DDoSecrets said the BlueLeaks archive indexes "ten years of data from over 200 police departments, fusion centers and other law enforcement training and support resources," and that "among the hundreds of thousands of documents are police and FBI reports, bulletins, guides and more."
Fusion centers are state-owned and operated entities that gather and disseminate law enforcement and public safety information between state, local, tribal and territorial, federal and private sector partners.
BlueLeaks from Distributed Denial of Secrets. [Dataset link has been nonresponsive since this story was submitted.]
Also at Vice, Forbes, ZDNet, and SecurityWeek.
Related: Virginia Police Have Been Secretively Stockpiling Private Phone Records
Washington State Fusion Center Accidentally Releases Records on Remote Mind Control
https://edition.cnn.com/2020/06/19/tech/north-face-facebook-ads/index.html:
Outdoor apparel brand The North Face has become the best-known company yet to commit to an advertising boycott of Facebook in light of the social media platform's handling of misinformation and hate speech — a move that could open the door for other brands to do the same.
The brand's decision responds to a pressure campaign by top civil rights groups, including the NAACP and the Anti-Defamation League, known as #StopHateForProfit, which on Wednesday began calling for advertisers to suspend their marketing on Facebook in the month of July.
"We're in," The North Face tweeted. "We're out @Facebook #StopHateForProfit."
Hours later, outdoor equipment retailer REI said it will join the boycott.
[...] The activists demanding change face an enormously ambitious task. Facebook is the second-largest player in US digital marketing after Google, and last year generated $69.7 billion from advertising worldwide.
Scientists find huge ring of ancient shafts near Stonehenge:
Archaeologists said Monday that they have discovered a major prehistoric monument under the earth near Stonehenge that could shed new light on the origins of the mystical stone circle in southwestern England.
Experts from a group of British universities led by the University of Bradford say the site consists of at least 20 huge shafts, more than 10 meters (32 feet) in diameter and 5 meters (16 feet) deep, forming a circle more than 2 kilometers (1.2 miles) in diameter.
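For a sense of scale, if the 20 known shafts were spaced evenly around the 2 km-diameter ring (an assumption for illustration only; the actual layout is irregular and more shafts may yet be found), the average spacing works out to roughly 300 meters:

```python
import math

diameter_m = 2000  # circle diameter reported by the researchers
num_shafts = 20    # "at least 20" shafts found so far

circumference_m = math.pi * diameter_m
spacing_m = circumference_m / num_shafts
print(f"circumference: {circumference_m / 1000:.1f} km")  # -> 6.3 km
print(f"average spacing: {spacing_m:.0f} m")              # -> 314 m
```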
The new find is at Durrington Walls, the site of a Neolithic village about 2 kilometers (1.2 miles) from Stonehenge.
Researchers say the shafts appear to have been dug around 4,500 years ago, and could mark the boundary of a sacred area or precinct around a circular monument known as the Durrington Walls henge.
The hollows were initially thought to be natural voids in the limestone before the larger picture emerged to show a circle.
To evade detection, hackers are requiring targets to complete CAPTCHAs:
CAPTCHAs, those puzzles with muffled sounds or blurred or squiggly letters that websites use to filter out bots (often unsuccessfully), have been annoying end users for more than a decade. Now, the challenge-and-response tests are likely to vex targets in malware attacks.
Microsoft recently spotted an attack group distributing a malicious Excel document on a site requiring users to complete a CAPTCHA, most likely in an attempt to thwart automated detection by good guys. The Excel file contains macros that, when enabled, install GraceWire, a trojan that steals sensitive information such as passwords. The attacks are the work of a group Microsoft calls Chimborazo, which company researchers have been tracking since at least January.
Previously, Microsoft observed Chimborazo distributing the Excel file in attachments included in phishing messages and later spreading through embedded Web links. In recent weeks, the group has begun sending phishing emails that change things up again. In some cases, the phishes include links that lead to redirector sites (usually legitimate sites that have been compromised). In other cases, the emails have an HTML attachment that contains a malicious iframe tag.
Either way, clicking on the link or attachment leads to a site where targets download the malicious file, but only after completing the CAPTCHA (which is short for completely automated public Turing test to tell computers and humans apart). The purpose: to thwart automated analysis defenders use to detect and block attacks and get attack campaigns shut down. Typically the analysis is performed by what are essentially bots that download malware samples and run and analyze them in virtual machines.
Requiring the successful completion of a CAPTCHA means analysis will only happen when a live human being downloads the sample. Without the automation, the chances of the malicious file flying under the radar are much better. Microsoft has dubbed Chimborazo’s ongoing attack campaign Dudear.
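The evasion logic described above can be modeled as a simple gate. This is hypothetical code for understanding the technique, not the actual attack infrastructure: the payload is served only to a request carrying a token that, in the real campaign, only a human solving the CAPTCHA would hold, so automated sandboxes never obtain a sample to detonate:

```python
# Toy model of CAPTCHA-gated malware delivery (illustrative only).

def serve_request(params):
    """A hypothetical malicious download endpoint."""
    if params.get("captcha_token") != "human-solved":
        # Automated analysis bots stop here: they fetch the page but
        # cannot solve the CAPTCHA, so they never see the payload.
        return "captcha_challenge_page"
    return "malicious_excel_file"

# A sandbox bot fetching the URL directly gets no sample to analyze:
print(serve_request({}))                                  # -> captcha_challenge_page
# A human victim who solved the CAPTCHA receives the payload:
print(serve_request({"captcha_token": "human-solved"}))   # -> malicious_excel_file
```

This is why the technique works: detection pipelines built on automated fetch-and-detonate bots see only the challenge page, while the file itself stays out of their corpus.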