U.S. drinking water widely contaminated with 'forever chemicals': environment watchdog
The contamination of U.S. drinking water with man-made "forever chemicals" is far worse than previously estimated, with some of the highest levels found in Miami, Philadelphia and New Orleans, an environmental watchdog group said in a report on Wednesday.
The chemicals, which resist breaking down in the environment, are known as per- and polyfluoroalkyl substances, or PFAS. Some have been linked to cancers, liver damage, low birth weight and other health problems.
The findings by the Environmental Working Group (EWG) show that the group's previous 2018 estimate, based on unpublished U.S. Environmental Protection Agency (EPA) data, that 110 million Americans may be contaminated with PFAS could be far too low.
Per- and polyfluoroalkyl substances
Velodyne Will Sell a Lidar for $100
Velodyne claims to have broken the US $100 barrier for automotive lidar with its tiny Velabit, which it unveiled at CES earlier this month.
"Claims" is the mot juste because this nice, round dollar amount is an estimate based on the mass-manufacturing maturity of a product that has yet to ship. Such a factoid would hardly be worth mentioning had it come from some of the several-score odd lidar startups that haven't shipped anything at all. But Velodyne created this industry back during DARPA-funded competitions, and has been the market leader ever since.
"The projection is $100 at volume; we'll start sampling customers in the next few months," Anand Gopalan, the company's chief technology officer, tells IEEE Spectrum.
The company says in a release that the Velabit "delivers the same technology and performance found on Velodyne's full suite of state-of-the-art sensors." Given the device's small size, that must mean the solid-state version of the technology. That is, the non-rotating kind.
Related: Why Experts Believe Cheaper, Better Lidar is Right Around the Corner
Nikon Will Help Build Velodyne's Lidar Sensors for Future Self-Driving Cars
Contrary To Musk's Claims, Lidar Has Some Advantages In Self Driving Technology
Artificial Eyes: How Robots Will See In The Future
How to verify that quantum chips are computing correctly:
In a step toward practical quantum computing, researchers from MIT, Google, and elsewhere have designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can't.
Quantum chips perform computations using quantum bits, called "qubits," that can represent the two states corresponding to classical binary bits — a 0 or 1 — or a "quantum superposition" of both states simultaneously. The unique superposition state can enable quantum computers to solve problems that are practically impossible for classical computers, potentially spurring breakthroughs in material design, drug discovery, and machine learning, among other applications.
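As a rough illustration of the superposition idea (not specific to the MIT and Google hardware), a single qubit's state can be written as a two-component complex vector, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes. A minimal NumPy sketch:

```python
import numpy as np

# A single qubit state |psi> = a|0> + b|1>, stored as a complex 2-vector.
# Equal superposition: a = b = 1/sqrt(2).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2          # -> [0.5, 0.5]

# Simulate 1,000 measurements: roughly half 0s and half 1s.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=1000, p=probs)
print(probs, np.bincount(outcomes))
```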
Full-scale quantum computers will require millions of qubits, which isn't yet feasible. In the past few years, researchers have started developing "Noisy Intermediate Scale Quantum" (NISQ) chips, which contain around 50 to 100 qubits. That's just enough to demonstrate "quantum advantage," meaning the NISQ chip can run certain algorithms that are intractable for classical computers. Verifying that the chips performed operations as expected, however, can be very inefficient. The chip's outputs can look entirely random, so it can take a long time to simulate the chip's steps and determine whether everything went according to plan.
In a paper published today in Nature Physics, the researchers describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations. They validated their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.
"As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time critical," says first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE). "Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting."
[...] The researchers' work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.
At the core of the new protocol, called "Variational Quantum Unsampling," lies a "divide and conquer" approach, Carolan says, that breaks the output quantum state into chunks. "Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way," Carolan says.
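The real protocol runs on quantum hardware, but the layer-by-layer idea can be caricatured classically: if a circuit is a product of unitary layers, undoing one layer at a time and tracking how much of the known input state has been recovered localizes where a faulty layer would sit. The NumPy toy below illustrates only that intuition, not the Variational Quantum Unsampling algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(dim):
    # Any QR factorisation of a complex Gaussian matrix gives a unitary Q.
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, _ = np.linalg.qr(z)
    return q

dim, n_layers = 4, 3
layers = [random_unitary(dim) for _ in range(n_layers)]

# Known input state, and the output the "chip" produced by applying the layers.
psi_in = np.zeros(dim, dtype=complex)
psi_in[0] = 1.0
psi_out = psi_in
for U in layers:
    psi_out = U @ psi_out

# Unscramble layer by layer: apply each layer's inverse in reverse order and
# track the overlap (fidelity) with the known input recovered so far.
state = psi_out
for i, U in enumerate(reversed(layers), 1):
    state = U.conj().T @ state
    fidelity = abs(np.vdot(psi_in, state)) ** 2
    print(f"after undoing {i} layer(s): fidelity with input = {fidelity:.3f}")
# A layer that differs from what was programmed shows up as a fidelity that
# fails to return to 1 once its inverse has been applied.
```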
Arthur T Knackerbracket has found the following story:
Political polarization among Americans has grown rapidly in the last 40 years—more than in Canada, the United Kingdom, Australia or Germany—a phenomenon possibly due to increased racial division, the rise of partisan cable news and changes in the composition of the Democratic and Republican parties.
That's according to new research co-authored by Jesse Shapiro, a professor of political economy at Brown University. The study, conducted alongside Stanford University economists Levi Boxell and Matthew Gentzkow, was released on Monday, Jan. 20, as a National Bureau of Economic Research working paper.
In the study, Shapiro and colleagues present the first ever multi-nation evidence on long-term trends in "affective polarization"—a phenomenon in which citizens feel more negatively toward other political parties than toward their own. They found that in the U.S., affective polarization has increased more dramatically since the late 1970s than in the eight other countries they examined—the U.K., Canada, Australia, New Zealand, Germany, Switzerland, Norway and Sweden.
"A lot of analysis on polarization is focused on the U.S., so we thought it could be interesting to put the U.S. in context and see whether it is part of a global trend or whether it looks more exceptional," Shapiro said. "We found that the trend in the U.S. is indeed exceptional."
Using data from four decades of public opinion surveys conducted in the nine countries, the researchers used a so-called "feeling thermometer" to rate attitudes on a scale from 0 to 100, where 0 reflected no negative feelings toward other parties. They found that in 1978, the average American rated the members of their own political party 27 points higher than members of the other major party. By 2016, Americans were rating their own party 45.9 points higher than the other party, on average. In other words, negative feelings toward members of the other party compared to one's own party increased by an average of 4.8 points per decade.
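A quick back-of-the-envelope check on those figures (the per-decade rate reported by the study is presumably a fitted trend over the whole survey series, so a simple endpoint calculation lands only roughly in the same place):

```python
# Quick arithmetic check on the in-party vs out-party rating gap.
gap_1978 = 27.0      # in-party rated 27 points warmer than out-party in 1978
gap_2016 = 45.9      # gap reported for 2016
years = 2016 - 1978  # 38 years

increase = gap_2016 - gap_1978            # 18.9 points
per_decade = increase / (years / 10)      # ~5.0 points per decade
print(increase, round(per_decade, 2))
# The study quotes 4.8 points per decade, consistent with a trend fitted over
# the full survey series rather than just these two endpoints.
```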
The researchers found that polarization had also risen in Canada, New Zealand and Switzerland in the last 40 years, but to a lesser extent. In the U.K., Australia, Germany, Norway and Sweden, polarization decreased.
More information: Levi Boxell et al, Cross-Country Trends in Affective Polarization, (2020). DOI: 10.3386/w26669
Xerox is done playing mister nice guy – the company has named a slate of directors it wants to shoehorn onto HP's board to spearhead its $33bn hostile takeover bid.
The copier giant will nominate 11 directors at HP's upcoming shareholder meeting in an attempt to gain control of the company's 12-person board. The list includes former senior directors at Aetna, United Airlines, Hilton Hotels, Novartis and Verizon.
Nominees
Betsy Atkins – Chief executive of Baja Corporation, a venture capital firm.
George Bickerstaff – Co-founder and MD of M.M. Dillon & Co., a healthcare and tech boutique investment bank.
Carolyn Byrd – Chair and CEO of GlobalTech Financial, a consulting firm.
Jeannie Diefenderfer – A former Verizon executive, who is now a member of the Workforce Development & Support Advisory Panel at the NSA, where she advises on workforce development and diversity and inclusion.
Kim Fennebresque – Former chairman, president, and CEO of financial services firm Cowen Group.
Carol Flaton – Former MD at AlixPartners, a global consulting firm.
Matthew Hart – President and CEO of Hilton Hotels until its buyout by Blackstone in 2007.
Fred Hochberg – Chairman and president of the Export-Import Bank of the United States during the Obama Administration.
Jacob Katz – Former chairman of Grant Thornton, a leading independent audit, tax, and advisory firm.
Nichelle Maynard-Elliott – Recently served as executive director of M&A for industrial gas giant Praxair.
Thomas Sabatino Jr – Former executive vice president and general counsel of Aetna, an American health insurer.

Xerox has been chasing its much larger rival since November, offering $33.5bn, or $22 per share. HP has so far refused to play ball, thrice rebuffing Xerox on the grounds that its offer "significantly undervalues" the business.
The nominations signal a more aggressive approach from Xerox – the company previously warned things were going to get nasty – pressuring HP to negotiate a deal. A vote on the nominees could act as a proxy referendum on the proposed merger, and installing several directors more favourable to it will help.
"We believe HP shareholders will be better served by a new slate of independent directors who understand the challenges of operating a global enterprise and appreciate the value that can be created by realising the synergies of a combination with Xerox," CEO John Visentin said in a statement.
HP responded by calling the nominations "a self-serving tactic by Xerox to advance its proposal, which significantly undervalues HP and creates meaningful risk to the detriment of HP shareholders".
Additional, earlier coverage at The Register
Tool predicts how fast code will run on a chip:
[...] In [a] series of conference papers, the researchers describe a novel machine-learning pipeline that automates this process, making it easier, faster, and more accurate. In a paper presented at the International Conference on Machine Learning in June, the researchers presented Ithemal, a neural-network model that trains on labeled data in the form of “basic blocks” — fundamental snippets of computing instructions — to automatically predict how long it takes a given chip to execute previously unseen basic blocks. Results suggest Ithemal performs far more accurately than traditional hand-tuned models.
Then, at the November IEEE International Symposium on Workload Characterization, the researchers presented a benchmark suite of basic blocks from a variety of domains, including machine learning, compilers, cryptography, and graphics, that can be used to validate performance models. They pooled more than 300,000 of the profiled blocks into an open-source dataset called BHive. During their evaluations, Ithemal predicted how fast Intel chips would run code even better than a performance model built by Intel itself.
Ultimately, developers and compilers can use the tool to generate code that runs faster and more efficiently on an ever-growing number of diverse and “black box” chip designs. “Modern computer processors are opaque, horrendously complicated, and difficult to understand. It is also incredibly challenging to write computer code that executes as fast as possible for these processors,” says co-author on all three papers Michael Carbin, an assistant professor in the Department of Electrical Engineering and Computer Science (EECS) and a researcher in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “This tool is a big step forward toward fully modeling the performance of these chips for improved efficiency.”
Most recently, in a paper presented at the NeurIPS conference in December, the team proposed a new technique to automatically generate compiler optimizations. Specifically, they automatically generate an algorithm, called Vemal, that converts certain code into vectors, which can be used for parallel computing. Vemal outperforms hand-crafted vectorization algorithms used in the LLVM compiler — a popular compiler used in the industry.
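Vemal's output is a compiler transformation, but the effect of vectorization is easy to picture in user code: a loop that processes one element per iteration becomes an operation over whole arrays of elements at once. A hand-written analogue in NumPy (illustrative only, not LLVM or Vemal output):

```python
import numpy as np

def saxpy_scalar(a, x, y):
    # One element per iteration: the kind of loop a vectorizer targets.
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vectorized(a, x, y):
    # Whole-array form: conceptually what the vectorized loop computes,
    # several elements per instruction instead of one at a time.
    return a * x + y

x = np.arange(1_000, dtype=np.float32)
y = np.ones_like(x)
assert np.allclose(saxpy_scalar(2.0, x, y), saxpy_vectorized(2.0, x, y))
```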
[...] “Intel’s documents are neither error-free nor complete, and Intel will omit certain things, because it’s proprietary,” says co-author on all three papers Charith Mendis, a graduate student in EECS and CSAIL. “However, when you use data, you don’t need to know the documentation. If there’s something hidden you can learn it directly from the data.”
[...] In training, the Ithemal model analyzes millions of automatically profiled basic blocks to learn exactly how different chip architectures will execute computation. Importantly, Ithemal takes raw text as input and does not require manually adding features to the input data. In testing, Ithemal can be fed previously unseen basic blocks and a given chip, and will generate a single number indicating how fast the chip will execute that code.
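Ithemal itself is a hierarchical LSTM over tokenized assembly; the sketch below shows only the general shape of such a model in PyTorch (embed the block's tokens, run an LSTM over them, regress a single throughput number). The class name, tokenization, and training data here are hypothetical stand-ins, not the authors' code:

```python
import torch
import torch.nn as nn

class BlockThroughputModel(nn.Module):
    """Toy stand-in for an Ithemal-style predictor: embed the tokens of a
    basic block, run an LSTM over them, and regress one throughput estimate
    (e.g. cycles) from the final hidden state."""
    def __init__(self, vocab_size=256, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1]).squeeze(-1)  # one number per basic block

# Hypothetical usage: token IDs would come from lexing a block's assembly,
# and targets would be measured throughputs from a dataset such as BHive.
model = BlockThroughputModel()
fake_blocks = torch.randint(0, 256, (8, 20))   # 8 blocks, 20 tokens each
pred_cycles = model(fake_blocks)
loss = nn.functional.mse_loss(pred_cycles, torch.rand(8) * 10)
loss.backward()
```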
The researchers found Ithemal cut error rates — the difference between predicted and real-world speed — by 50 percent compared with traditional hand-crafted models. Further, in their next paper, they showed that Ithemal's error rate was 10 percent, while the Intel performance-prediction model's error rate was 20 percent, on a variety of basic blocks across multiple domains.
Articles:
Charith Mendis, Alex Renda, Saman Amarasinghe, Michael Carbin. Ithemal: Accurate, Portable and Fast Basic Block Throughput Estimation using Deep Neural Networks. http://proceedings.mlr.press/v97/mendis19a/mendis19a.pdf
Yishen Chen, Ajay Brahmakshatriya, Charith Mendis, Alex Renda, Eric Atkinson, Ondrej Sykora, Saman Amarasinghe, Michael Carbin. BHive: A Benchmark Suite and Measurement Framework for Validating x86-64 Basic Block Performance Models. http://groups.csail.mit.edu/commit/papers/19/ithemal-measurement.pdf
Charith Mendis, Cambridge Yang, Yewen Pu, Saman Amarasinghe, Michael Carbin. Compiler Auto-Vectorization with Imitation Learning. http://papers.nips.cc/paper/9604-compiler-auto-vectorization-with-imitation-learning.pdf
3D Glasses Work On Cuttlefish And It's Adorable:
Every so often, scientists grace the undeserving public with an experiment seemingly designed solely to make us giggle in delight. Without any further ado, Cuttlefish Wearing 3D Glasses: [YouTube Link]
[...] Initial attempts to glue the glasses straight on the cuttlefish (yikes!) led to skin damage, and the scientists solved this by instead gluing little velcro strips (double yikes!) to both their heads and the glasses, so that they could strap them on.
Cuttlefish love to snack on shrimp, and the way they catch them is by swimming forward and backward to adjust for distance, then launching their tentacles at prey from just the right spot. Scientists weren't sure if the cuttlefish would do this for the 3D animated shrimp in their little movie, but they totally did. [...] The cuttlefish adjusted their distance, they launched their tentacles, and even though they came up empty-handed, they proved that scientists were at least on the right track.
The study in the American Chemical Society's ACS Applied Energy Materials describes a previously unknown mechanism by which lithium gets trapped in batteries, thus limiting the number of times it can be charged and discharged at full power.
[...] The Rice lab of chemical and biomolecular engineer Sibani Lisa Biswal found a sweet spot in the batteries that, by not maxing out their storage capacity, could provide steady and stable cycling for applications that need it.
Biswal said conventional lithium-ion batteries utilize graphite-based anodes that have a capacity of less than 400 milliamp hours per gram (mAh/g), but silicon anodes have potentially 10 times that capacity. That comes with a downside: Silicon expands as it alloys with lithium, stressing the anode. By making the silicon porous and limiting its capacity to 1,000 mAh/g, the team's test batteries provided stable cycling with still-excellent capacity.
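For a sense of the trade-off being described, using only the round numbers quoted above:

```python
# Rough comparison of the anode capacities quoted in the article (mAh/g).
graphite_capacity = 400                        # "less than 400 mAh/g"
silicon_theoretical = 10 * graphite_capacity   # "potentially 10 times that"
silicon_limited = 1000                         # capacity the Rice team capped at

# The capped silicon anode still holds roughly 2.5x what graphite does...
print(silicon_limited / graphite_capacity)     # 2.5
# ...while using only about a quarter of silicon's theoretical ceiling,
# trading raw capacity for stable cycling as the silicon swells and shrinks.
print(silicon_limited / silicon_theoretical)   # 0.25
```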
[...] The team led by postdoctoral fellow Anulekha Haridas tested the concept of pairing the porous, high-capacity silicon anodes (in place of graphite) with high-voltage nickel manganese cobalt oxide (NMC) cathodes. The full cell lithium-ion batteries demonstrated stable cyclability at 1,000 mAh/g over hundreds of cycles.
Some cathodes had a 3-nanometer layer of alumina (applied via atomic layer deposition), and some did not. The alumina coating protected the cathode from breaking down in the presence of hydrofluoric acid, which forms if even minute amounts of water invade the liquid electrolyte. Testing showed, however, that the alumina also accelerated the battery's charging speed, reducing the number of times it can be charged and discharged.
There appears to be extensive trapping as a result of the fast lithium transport through alumina, Haridas said. The researchers already knew of possible ways silicon anodes trap lithium, making it unavailable to power devices, but she said this is the first report of the alumina itself absorbing lithium until saturated. At that point, she said, the layer becomes a catalyst for fast transport to and from the cathode.
"This lithium-trapping mechanism effectively protects the cathode by helping maintain a stable capacity and energy density for the full cells," Haridas said.
More information: Anulekha K. Haridas et al, ALD-Modified LiNi0.33Mn0.33Co0.33O2 Paired with Macroporous Silicon for Lithium-Ion Batteries: An Investigation on Lithium Trapping, Resistance Rise, and Cycle-Life Performance, ACS Applied Energy Materials (2019). DOI: 10.1021/acsaem.9b01728
China Battles Coronavirus Outbreak: All the Latest Updates:
The virus thought to have originated in a Wuhan food market continues to spread as China steps up containment efforts.
[...] China is extending the Lunar New Year holiday for three days and enforcing strict containment measures in an attempt to curb the spread of a new coronavirus that has killed 80 people and infected more than 2,700, most of them in the central province of Hubei, where the virus first emerged.
The holiday season was due to end on Friday but will now be extended until February 2.
More than 56 million people in almost 20 cities, including the Hubei capital of Wuhan, have been affected by travel restrictions, introduced amid fears the transmission rate will balloon as hundreds of millions of Chinese travel during the Lunar New Year celebrations.
[...] Health authorities around the world are taking action to prevent a pandemic as more countries report cases. Confirmed cases have so far been announced in several Asian countries, Europe and North America.
[...] The World Health Organization (WHO) has acknowledged the respiratory illness, which has been traced to the city of Wuhan, is an emergency in China but the organisation said on Thursday it was too early to declare the outbreak a public health emergency of international concern.
Intel on Thursday reported $20.2bn revenue for the fourth quarter of 2019, a gain of eight per cent year-on-year, and $72bn for the full-year, a two per cent increase.
Analysts had been expecting something less, around $19.23bn and $70.98bn on average, and the results lifted the chip giant's stock in after-hours trading.
CEO Bob Swan, in a canned statement, said more or less that things had gone well, without providing any specifics: "In 2019, we gained share in an expanded addressable market that demands more performance to process, move and store data," he said. "One year into our long-term financial plan, we have outperformed our revenue and EPS expectations."
Not the sort of stuff that gets one booked for a commencement address.
Chipzilla's numbers said as much, though more succinctly. Its earnings per share came to $1.58 for the quarter and $4.71 for the year. Its gross margin for the quarter was 58.8 per cent, down from 60.2 per cent in Q4 2018; for the year, its gross margin was 58.6 per cent, down from 61.7 per cent in 2018.
Intel's fourth quarter operating margin came in at 36 per cent, up half a percentage point from the same period a year ago.
Operating income was $6.8bn for Q4 and $22bn for 2019; net income was $6.9bn and $21bn respectively. In fact so much money has been rolling in that Chipzilla increased its per-share annual dividend to $1.32, an increase of five per cent.
Intel generated $33.1bn in cash from its operations in 2019 and $16.9bn in free cash flow while routing about $19.2bn back to shareholders. In Q4 2019, Chipzilla's cash machine created $9.9bn and dispensed $1.4bn in dividends, with another $3.5bn going to buy back Intel shares to support the share price.
The chip maker's revenue is split more or less evenly between its data-center business (DCG) and its PC-centric trade (CCG). However, its data-center, sorry, -centric business is expected to grow faster than its counterpart.
Arthur T Knackerbracket has found the following story:
Google security researchers have published details about the flaws they identified last year in Intelligent Tracking Prevention (ITP), a privacy scheme developed by Apple's WebKit team for the company's Safari browser.
In December, Apple addressed some of these vulnerabilities (CVE-2019-8835, CVE-2019-8844, and CVE-2019-8846) through software updates, specifically Safari 13.0.4 and iOS 13.3. Those bugs could be exploited to leak browsing and search history and to perform denial of service attacks.
But they're not quite fixed, according to Google's boffins. In a paper [PDF] titled, "Information Leaks via Safari's Intelligent Tracking Prevention," authors Artur Janc, Krzysztof Kotowicz, Lukas Weichselbaum, and Roberto Clapis claim that the proposed mitigations "will not address the underlying problem."
And on Wednesday, Justin Schuh, Google engineering director for Chrome security and privacy, made a similar claim via Twitter. Google, he said, had found similar security flaws in a Chrome tool called XSS Auditor and had decided they were fundamentally unfixable.
"After several back and forths with the team that discovered the issue, we determined that it was inherent to the design and had to remove the code," he explained.
-- submitted from IRC
Scientists from the University of Surrey and University of Geneva have discovered that the bacterium which causes bovine TB can survive and grow in small, single-celled organisms found in soil and dung. It is believed that originally the bacterium evolved to survive in these single-celled organisms known as amoebae and in time progressed to infect and cause TB in larger animals such as cattle.
During the study, published in The ISME Journal, scientists sought to understand more about how the bacterium survives inside amoebae, and what that might reveal about how it goes on to infect cattle and humans.
Scientists also discovered that M. bovis remained metabolically active and continued to grow, although at a slower pace, at lower temperatures than expected. Previously it was thought the bacterium could only replicate at 37°C, the body temperature of cattle and humans; however, replication of the bacterium was identified at 25°C. Researchers believe that the bacterium's ability to adapt to ambient temperatures and survive in amoebae may partially explain high transmission rates of the bacterium between animals.
[...] "An important additional benefit is that our research shows the potential for carrying out at least some future TB research in amoebae rather than in large animals."
More information: Rachel E Butler et al. Mycobacterium bovis uses the ESX-1 Type VII secretion system to escape predation by the soil-dwelling amoeba Dictyostelium discoideum, The ISME Journal (2020). DOI: 10.1038/s41396-019-0572-z
Everyone has seen the warning at the bottom of an email: "Please consider the environment before printing." But if you care about global warming, you might also want to consider not sending so many emails in the first place.
More and more, people rely on their electronic mailboxes as a life organizer. Old emails, photos, and files from years past sit undisturbed, awaiting your search for a name, lost address, or maybe a photo of an old boyfriend. The problem is that all those messages require energy to preserve. And despite the tech industry's focus on renewables, the advent of streaming and artificial intelligence is only accelerating the burning of fossil fuels to keep data servers up, running, and cool.
Right now, data centers consume about 2% of the world's electricity, but that's expected to reach 8% by 2030. Moreover, only about 6% of all data ever created is in use today, according to research from Hewlett Packard Enterprise. That means that 94% is sitting in a vast "cyber landfill," albeit one with a massive carbon footprint.
"It's costing us the equivalent of maintaining the airline industry for data we don't even use," says Andrew Choi, a senior research analyst at Parnassus Investments, a $27 billion environmental, social, and governance firm in San Francisco.
[...] Choi says the problem is getting too big too fast: How many photos are sitting untouched in the cloud? Is there a net benefit from an internet-connected toothbrush? Is an AI model that enables slightly faster food delivery really worth the energy cost? (Training an AI model emits about as much carbon as the lifetime emissions associated with running five cars.)
Parnassus has been focusing on Advanced Micro Devices and Nvidia, companies that are researching more efficient storage technology. But Choi says real solutions may require more radical thoughts.
"Data is possibly overstated as an advantage for business, and no one's really asking the question," he says. "If a small group of people are the only ones really benefiting from this data revolution, then what are we actually doing, using all of this power?"
Biophysicists from the MIPT Center for Molecular Mechanisms of Aging and Age-Related Diseases have teamed up with colleagues from Canada, the U.S., Japan, France, and Germany to shed light on the structure and functioning mechanism of the CysLT receptors, which regulate inflammatory responses associated with allergic disorders. Their findings are reported in Nature Communications.
[...] As of today, the development of more effective medications for asthma and associated conditions is hindered by the lack of information on how and to what ligands the CysLT receptors bind. Their functioning mechanisms have not been clearly understood either, as this requires high-resolution structural biology data. Once these are available, researchers can proceed using computer simulations.
[...] In their recent study, researchers from the Moscow Institute of Physics and Technology identified the most critical ligand-binding determinants of the CysLT1 and CysLT2 receptors based on the structural analysis the team performed for CysLT2R and the structural data on CysLT1R published by the laboratory in October.
"The new structures have greatly improved the accuracy of ligand docking and helped us better understand the properties of ligands with respect to both receptors. Now we know how to alter drug design templates to inhibit the activity of both the CysLT1 and CysLT2 receptors or do that selectively for either of them," commented Anastasiia Gusach , a Ph.D. student at MIPT and a junior researcher at the MIPT Laboratory of Structural Biology of G Protein-Coupled Receptors.
In the future, these structures could be further developed to serve as drug candidates or tool compounds, aiding in understanding the specific role of each of the CysLT receptor subtypes in various physiological and pathological processes.
[...] This implies that with the rapid development of genome sequencing technologies and the accumulation of large volumes of statistical data, structure-function studies will not only allow accurate prediction of disease for every patient but will also enable predicting drug efficacy and improving patient safety based on how these variations in genes affect the response to certain medications.
More information: Anastasiia Gusach et al. Structural basis of ligand selectivity and disease mutations in cysteinyl leukotriene receptors, Nature Communications (2019). DOI: 10.1038/s41467-019-13348-2
The past 10 months have not been good for Boeing for all sorts of reasons—capped off by the failure of the company's Starliner commercial crew vehicle to achieve the right orbit in its uncrewed premiere in December. But the biggest of the company's problems remains the 737 Max, grounded since last spring after two crashes that killed 346 people between them. Combined, the crashes are the worst air disaster since September 11, 2001.
Both were at least partially caused by a sensor failure with no redundancy and a problem with MCAS (the Maneuvering Characteristics Augmentation System, new software controlling the handling of the aircraft) that the air crews had not been trained to overcome.
Boeing executives are now telling the company's 737 Max customers that the software fix required to make the airliner airworthy will not be approved in the near future, and that it will likely be June or July before the Federal Aviation Administration certifies the aircraft for flight again—meaning that the aircraft will have been grounded for at least 16 months.
The FAA, for its part, has not committed to any timeframe for re-certifying the aircraft. In an emailed statement, an FAA spokesperson said, "We continue to work with other safety regulators to review Boeing's work as the company conducts the required safety assessments and addresses all issues that arise during testing."