[Ed note: This story was originally posted 2019-08-14 23:09 UTC but was lost when we had the site crash Thursday morning. Prior comments have, unfortunately, been lost. --martyb]
Neanderthals commonly suffered from 'swimmer's ear'
Abnormal bony growths in the ear canal were surprisingly common in Neanderthals, according to a study published August 14, 2019 in the open-access journal PLOS ONE by Erik Trinkaus of Washington University and colleagues.
External auditory exostoses are dense bony growths that protrude into the ear canal. In modern humans, this condition is commonly called "swimmer's ear" and is known to be correlated with habitual exposure to cold water or chilly air, though there is also a potential genetic predisposition for the condition. Such exostoses have been noted in ancient humans, but little research has examined how the condition might inform our understanding of past human lifestyles.
In this study, Trinkaus and colleagues examined well-preserved ear canals in the remains of 77 ancient humans, including Neanderthals and early modern humans from the Middle to Late Pleistocene Epoch of western Eurasia. While the early modern human samples exhibited similar frequencies of exostoses to modern human samples, the condition was exceptionally common in Neanderthals. Approximately half of the 23 Neanderthal remains examined exhibited mild to severe exostoses, at least twice the frequency seen in almost any other population studied.
Also at CNN and New Scientist.
External auditory exostoses among western Eurasian late Middle and Late Pleistocene humans (open, DOI: 10.1371/journal.pone.0220464) (DX)
Submitted via IRC for SoyCow7671
Attackers Try to Evade Defenses with Smaller DDoS Floods, Probes
Cybercriminals are increasingly targeting corporate networks, websites, and online services with low-bandwidth distributed denial-of-service (DDoS) attacks that exploit weaknesses in application infrastructure to disrupt business, Internet infrastructure firm Neustar stated in an August 14 threat report.
In its "Q2 2019 Cyberthreats & Trends" report, Neustar found that DDoS attacks using less than 5 Gbit/s make up a greater share of packet floods, with more than 75% of all attacks using less than 5 Gbit/s in the second quarter of 2019, up from less than 70% the previous year. The average attack consisted of a 0.99 Gbit/s stream of packets, so small that most companies may not notice the impact, says Michael Kaczmarek, vice president of product for Neustar Security.
"People think DDoS is going away," he says. "They think it is this unsophisticated brute-force attack, but by no means is it gone; it has just morphed."
Overall, DDoS attacks increased by 133%, more than doubling, according to Neustar's report. That is a reversal from last year, when security firms documented a decrease in attacks for most of the year. Attacks also showed greater complexity, with 82% using two or more different threat vectors.
The different vectors aim to find a vulnerable spot in a company's infrastructure and abuse the weakness, Kaczmarek says.
"The attackers are getting more sophisticated in what they are targeting," he says. "They are going after not the most vulnerably guy, but the most vulnerable component of the infrastructure."
In an analysis of all suitable sites for onshore wind farms, the new study reveals that Europe has the potential to supply enough energy for the whole world until 2050. If all of Europe's capacity for onshore wind farms were realised, the installed nameplate capacity would be 52.5 TW -- equivalent to 1 MW for every 16 European citizens.
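As a quick sanity check of the quoted ratio, the per-citizen comparison can be backed out in a couple of lines of Python. The implied 840 million population figure is an inference from the article's own numbers, not a figure taken from the study:

```python
# Back-of-envelope check of the quoted figures.
nameplate_tw = 52.5
nameplate_mw = nameplate_tw * 1e6            # 1 TW = 1,000,000 MW

citizens_per_mw = 16
implied_population = nameplate_mw * citizens_per_mw
print(f"Implied European population: {implied_population:,.0f}")
# -> Implied European population: 840,000,000
```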
Co-author Benjamin Sovacool, Professor of Energy Policy at the University of Sussex, said: "The study is not a blueprint for development but a guide for policymakers indicating the potential of how much more can be done and where the prime opportunities exist.
"Our study suggests that the horizon is bright for the onshore wind sector and that European aspirations for a 100% renewable energy grid are within our collective grasp technologically.
"Obviously, we are not saying that we should install turbines in all the identified sites but the study does show the huge wind power potential right across Europe which needs to be harnessed if we're to avert a climate catastrophe."
Spatial analysis of Geographical Information System (GIS)-based wind atlases allowed the research team to identify the roughly 46% of Europe's territory that would be suitable for siting onshore wind farms.
The advanced GIS data at sub-national levels provided far more detailed insight and allowed the team to factor in a far greater range of exclusionary factors, including houses, roads, areas restricted for military or political reasons, and terrain unsuitable for wind power generation.
The greater detail of this approach allowed the research team to identify more than three times as much onshore wind potential in Europe as previous studies had.
Submitted via IRC for SoyCow7671
New Cerberus Android Banker Uses Pedometer to Avoid Analysis
A new banking trojan for Android devices relies on the accelerometer sensor to delay its running on the system and thus evade analysis from security researchers.
Cerberus malware has recently stepped into the malware-as-a-service business, filling the void left by the demise of previous Android bankers.
The malware author(s) claim that it was used privately for the past two years and that they created Cerberus from scratch over several years.
Security researchers from Amsterdam-based cybersecurity company ThreatFabric analyzed a sample of the malware and found that it did not borrow from Anubis, an Android banker whose source code got leaked, sparking the creation of clones.
Payload and string obfuscation are normal techniques for making analysis and detection more difficult, but Cerberus also uses a mechanism that determines if the infected system is moving or not.
The trojan achieves this by reading data from the accelerometer sensor present on Android devices, measuring the acceleration force on all three physical axes (X, Y, and Z), including the force of gravity.
By implementing a simple pedometer, Cerberus can track if the victim is moving [...]. A real person will move around, generating motion data and increasing the step counter.
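ThreatFabric describes the mechanism only at a high level. The sketch below is a hypothetical Python simulation of the general idea, not Cerberus's actual code (which runs on Android); the threshold and step-count values are illustrative assumptions:

```python
import math
import random

STEP_THRESHOLD_G = 1.2   # assumed spike threshold (in g) for counting a "step"
STEPS_BEFORE_RUN = 30    # assumed number of steps gating the payload

def magnitude(x, y, z):
    """Total acceleration across the three physical axes, gravity included."""
    return math.sqrt(x * x + y * y + z * z)

def fake_accelerometer(moving):
    """Simulated sensor: ~1 g at rest, with extra spikes while walking."""
    while True:
        g = 1.0 + random.uniform(-0.02, 0.02)             # sensor noise
        spike = random.uniform(0.4, 0.8) if moving else 0.0
        yield (0.0, 0.0, g + spike)

def gate_payload(readings, limit=STEPS_BEFORE_RUN, samples=10_000):
    """Return True only after enough step-like spikes are seen, i.e. on a
    device carried by a moving person rather than an idle sandbox/emulator."""
    steps = 0
    for _ in range(samples):
        if magnitude(*next(readings)) > STEP_THRESHOLD_G:
            steps += 1
            if steps >= limit:
                return True
    return False

print("idle sandbox:", gate_payload(fake_accelerometer(moving=False)))  # False
print("real device :", gate_payload(fake_accelerometer(moving=True)))   # True
```

The design point is simply that an emulator sitting idle on an analyst's desk generates no motion data, so the step counter never reaches the limit and the payload never fires.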
New research has found that in 15 major cities in the global south, almost half of all households lack access to piped utility water, affecting more than 50 million people. Access is lowest in the cities of sub-Saharan Africa, where only 22% of households receive piped water.
The research also found that of those households that did have access, the majority received intermittent service. In Karachi, Pakistan, the city's population of 15 million received piped water an average of only three days a week, for less than three hours at a time.
These new findings add to data from the World Resources Institute's (WRI) Aqueduct tool, which recently found that by 2030, 45 cities with populations over 3 million could experience high water stress. The research, detailed in the "Unaffordable and Undrinkable: Rethinking Urban Water Access in the Global South" report, shows that even in some places where water sources are available, water is not reaching many residents. Some cities, like Dar es Salaam, have relatively abundant supplies, yet daily access to clean, reliable and affordable water continues to be problematic for many residents.
"Decades of increasing the private sector's role in water provision has not adequately improved access, especially for the urban under-served," said Diana Mitlin, lead author, professor of global urbanism at The Global Development Institute at The University of Manchester. "Water is a human right and a social good, and cities need to prioritize it as such."
Analysis in the report showed that alternatives to piped water, like buying from private providers that truck water in from elsewhere, can cost up to 25% of monthly household income and can be 52 times more expensive than public tap water.
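Combining the report's two figures gives a sense of the affordability gap. In this back-of-envelope sketch the income value is an arbitrary placeholder, since only the ratios matter:

```python
income = 100.0                      # any monthly household income (placeholder)
vendor_cost = 0.25 * income         # up to 25% of income on trucked-in water
tap_equivalent = vendor_cost / 52   # the same volume at ~1/52 the price

print(f"Tap-water equivalent: {tap_equivalent / income:.2%} of income")
# -> Tap-water equivalent: 0.48% of income
```

In other words, the same volume of water that consumes a quarter of a poor household's income would cost well under 1% of income from a public tap.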
Global indicators used for the Millennium Development Goals and Sustainable Development Goals have largely underestimated this urban water crisis because they do not take into account affordability, intermittency or quality of water. UNICEF and the World Health Organization reported in 2015 that more than 90% of the world's population used improved drinking water sources. But "improved" encompasses such a wide variety of sources (public taps, boreholes, wells) that it fails to reflect the reality for individuals and families in today's rapidly growing cities.
Whether water is affordable is simply not measured; while efforts have been made to increase water coverage, public authorities have paid little attention to affordability.
The giant waddling seabird stood 1.6 metres (63 inches) tall and weighed 80 kilograms, making it about four times heavier and 40 cm taller than the modern Emperor penguin, researchers said.
Named "crossvallia waiparensis", it hunted off New Zealand's coast in the Paleocene era, 66-56 million years ago. An amateur fossil hunter found leg bones belonging to the bird last year and it was confirmed as a new species in research published this week in Alcheringa: An Australasian Journal of Palaeontology.
Canterbury Museum researcher Vanesa De Pietri said it was the second giant penguin from the Paleocene era found in the area.
"It further reinforces our theory that penguins attained great size early in their evolution," she said.
Scientists have previously speculated that the mega-penguins eventually died out due to the emergence of other large marine predators such as seals and toothed whales.
[Ed note: This story was originally posted 2019-08-14 21:36 UTC but was lost when we had the site crash this morning. Prior comments have, unfortunately, been lost. takyon: This story has been further updated to avoid confusion.]
Update: Tim Chen has retracted his earlier comments and has stated that there is actually no agreement currently in place with SpaceX for RUAG to produce taller fairings out of its new Decatur, AL factory.
[...] SpaceX has three obvious responses at its disposal: design and build an entirely new variant of its universal Falcon fairing, purchase the necessary fairings from an established supplier, or bow out of launch contract competitions that demand it. The last option is immediately untenable given that it could very well mean bowing out of the entire US military competition, known as Phase 2 of the National Security Space Launch program's (NSSL; formerly EELV) Launch Services Procurement (LSP).
For dubious reasons, the US Air Force (USAF) has structured the NSSL Phase 2 acquisition in such a way that – despite there being four possible competitors – only two will be awarded contracts at its conclusion. The roughly 30 launch contracts up for grabs would be split 60:40 between the two victors, leaving the other two competitors completely empty-handed. In short, bowing out of the Phase 2 competition could mean forgoing as many as one or two-dozen contracts worth at least $1-2B, depending on the side of the 60:40 split.
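The "one or two-dozen contracts" range follows directly from applying the 60:40 split to roughly 30 contracts:

```python
total_contracts = 30                 # approximate number up for grabs
for share in (0.60, 0.40):
    print(f"{share:.0%} winner: ~{round(total_contracts * share)} contracts")
# -> 60% winner: ~18 contracts
# -> 40% winner: ~12 contracts
```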
[...] Interestingly, although ULA's RUAG-built Atlas V fairing is slightly narrower than SpaceX's 5.2m (17 ft) diameter fairing, Atlas V's largest fairing is significantly taller, supporting payloads up to 16.5m (54 ft) tall compared to 11m (36 ft) for Falcon 9 and Heavy. Given that just a tiny portion of military spacecraft actually need fairings that tall, SpaceX is apparently not interested in simply modifying its own fairing design and production equipment to support a 20-30% stretch.
This likely relates in part to the fact that each of SpaceX's three NSSL Phase 2 competitors – Northrop Grumman (Omega), Blue Origin (New Glenn), and ULA (Vulcan) – is guaranteed to receive hundreds of millions of dollars of development funding after winning one of the two available slots (60% or 40% of contracts). SpaceX, on the other hand, would receive no such funding while still having to meet the same stringent USAF requirements to compete in LSP Phase 2. Of note, Congressman Adam Smith managed to insert a clause into FY2020's defense authorization bill that could disburse up to $500M to SpaceX in the event that the company is one of Phase 2's two winners.
Previously: The Military Chooses Which Rockets It Wants Built for the Next Decade
Blue Origin Urges U.S. Air Force to Delay Launch Provider Decision
SpaceX Sues the U.S. Air Force, Again
SpaceX's attempts to buy bigger Falcon fairings complicated by contractor's ULA relationship
Earlier this morning (around 0520 UTC) we experienced a sequence of faults which we do not yet fully understand. Unfortunately, this coincided with several key admins being asleep, as they should be.
Several of the team started trying to recover the site, but all the obvious efforts (e.g. restarting the failed software, and switching it all off and back on again) failed to have the desired results. Chromas took the lead but was unable to resolve the problems. SemperOSS showed up next and made a mighty effort, but things failed to cooperate. Martyb was next in and checked on the status of our servers and a few other things, but ultimately could not provide much assistance.
TheMightyBuzzard turned up at his usual time and he and SemperOSS set to work trying to recover the site. After a long slog it became apparent that the existing software had suffered some form of corruption and that a simple recovery was not going to be possible. The decision was finally made to restore to yesterday's snapshot, but this comes with a downside. All the stories, comments, and journal entries since that snapshot was taken (approximately 2019-08-14 22:02:36 UTC) have been lost. This is regrettable but under the circumstances unavoidable.
Some of you joined us on IRC and provided real or moral support to those carrying out the recovery task. Others sent us emails informing us of the outage, and we also received offers of help. For this we thank you. We also thank our community at large for your patience while we carried out the work that needed to be done. The investigation into what happened has yet to begin and will take some time. The site is now up again but, as we are not 100% sure what caused the problem, we cannot guarantee that it will function properly; we will have to wait a day or two to be confident that we have resolved the issue.
We tried to keep everyone updated on our progress via our IRC channel, but if you could not get there, there was little else we could do to keep you informed. Displaying a short 'Site Down' message actually exacerbated the problems, so we decided it was better to leave the database alone until the problems had been resolved.
Finally, I would like to say thank you to the guys who did all the work under difficult circumstances. Their efforts are appreciated. We used the usual piece of software on IRC (~blame) to find out who should bear the blame for this debacle; it responded with the only name it has in its choice of staff: Bytram!
https://www.jpl.nasa.gov/edu/news/2016/3/16/how-many-decimals-of-pi-do-we-really-need/
Earlier this week, we received this question from a fan on Facebook who wondered how many decimals of the mathematical constant pi (π) NASA-JPL scientists and engineers use when making calculations:
Does JPL only use 3.14 for its pi calculations? Or do you use more decimals like say: 3.141592653589793238462643383279502884197169399375105820974944592307816406286208998628034825342117067982148086513282306647093844609550582231725359408128481117450284102701938521105559644622948954930381964428810975665933446128475648233786783165271201909145648566923460348610454326648213393607260249141273724587006606315588174881520920962829254091715364367892590360
We posed this question to the director and chief engineer for NASA's Dawn mission, Marc Rayman. Here's what he said:
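His full answer is at the link above; the flavor of it can be reproduced with a short calculation. The sketch below truncates pi to 15 decimals and uses a radius of roughly Voyager 1's distance from Earth; both numbers are taken here as illustrative assumptions:

```python
# How much error does truncating pi to 15 decimals introduce when computing
# a circumference at interplanetary scale?
from decimal import Decimal, getcontext

getcontext().prec = 60
PI_50 = Decimal("3.14159265358979323846264338327950288419716939937510")
PI_15 = Decimal("3.141592653589793")      # pi truncated to 15 decimals

radius_miles = Decimal(12_500_000_000)    # ~12.5 billion miles
error_miles = 2 * radius_miles * (PI_50 - PI_15)
print(f"Circumference error: {float(error_miles) * 63360:.2f} inches")
# -> Circumference error: 0.38 inches
```

Even around a circle 25 billion miles across, dropping everything past the 15th decimal changes the computed circumference by less than half an inch.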
Submitted via IRC for SoyCow6430
Schrödinger's Cat with 20 Qubits
"Qubits in the cat state are considered extremely important for the development of quantum technologies," explains Jian Cui. "The secret of the enormous efficiency and performance expected of future quantum computers is to be found in this superposition of states," says the physicist from the Peter Grünberg Institute at Jülich (PGI-8).
Classical bits in a conventional computer always have exactly one definite value, either 0 or 1, so values can only be processed one bit at a time. Qubits, which occupy several states simultaneously thanks to the superposition principle, can store and process several values in parallel in one step. The number of qubits is crucial here: you don't get far with just a handful, but with 20 qubits the number of superposed states already exceeds one million, and 300 qubits could store more numbers simultaneously than there are particles in the universe.
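The scaling claims are easy to check directly, since n qubits span 2^n basis states (the ~1e80 particle estimate for the visible universe is the usual order-of-magnitude figure, not a number from the article):

```python
for n in (14, 20, 300):
    print(f"{n:>3} qubits -> 2**{n} = {2**n:.3e} basis states")
# ->  14 qubits -> 2**14 = 1.638e+04
# ->  20 qubits -> 2**20 = 1.049e+06   (already over one million)
# -> 300 qubits -> 2**300 = 2.037e+90  (vs ~1e80 particles in the visible universe)
```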
The new result of 20 qubits comes a little closer to that scale; the previous record of 14 qubits had stood since 2011. For their experiment, the researchers used a programmable quantum simulator based on Rydberg atom arrays. In this approach, individual atoms, in this case rubidium atoms, are captured by laser beams and held in place side by side in a row, a technique also known as optical tweezers. An additional laser excites the atoms into the Rydberg state, in which an electron orbits far from the nucleus.
This process is rather complicated and usually takes so much time that the delicate cat state is destroyed before it can even be measured. The group in Jülich contributed their expertise in Quantum Optimal Control to solve this issue: by cleverly switching the lasers off and on at the right rate, they achieved a speed-up in the preparation process which made the new record possible.
[...] A recent study showed that termite activity in the soils of wetlands can help improve soil structure and nutrient content.
To study this question, Deborah S. Page-Dumroese and her colleagues researched various types of bedding systems in eastern South Carolina. "Microorganisms and termites are the primary wood decay agents in forests of southeastern United States," says Page-Dumroese. Previous research showed that raised planting beds on poorly-drained soils greatly improve the survival and growth of planted seedlings. Page-Dumroese's research team showed that bedding in wetlands could be a good management practice, too.
[...] The decay of dead trees (and any plant product) produces organic matter. And, this organic matter can increase crucial soil carbon content. All living things are made of carbon, and it is important to keep carbon in the soil (carbon sequestration) because it helps hold and filter water, reduces nutrient leaching, and improves forest health.
Researchers created beds in the study area using tractors, mixing surface organic debris from the wetland floor with the mineral soil. They created beds of various heights for study. Raised planting beds improve soil aeration, raise soil temperature, and increase nutrient availability.
To measure the activity of microbes and termites, the team placed wooden stakes into the various sized beds. They chose stakes made of aspen as well as loblolly pine, both prevalent trees in eastern South Carolina. They compared the decomposition of the stakes over 23 months, in beds that ranged from flat to about 30 cm (12 inches) high.
Indeed, the team found many differences in decomposition between stake species and bedding heights. Termites damaged or consumed 45% of aspen stakes in double-height beds, compared with only 11% of the loblolly pine stakes. Microbial decay of both types of stakes increased with greater bedding height.
Effects of Bed Height and Termite Activity on Wood Decomposition, Soil Science Society of America (DOI: 10.2136/sssaj2018.12.0492)
Reportedly, a girl had her phone confiscated by her mother after the stove caught fire while she was watching YouTube. There were so many devices around the house with which to circumvent the grounding that confiscation only slowed her access to the source of her addiction, a saga that earned some attention around the net:
In her recounting over Twitter DM, Dorothy told me that her mom took away her phone after she "was boiling rice and was too busy on phone and stove burst into flames." She was watching YouTube at the time.
After her phone was confiscated, she began desperately searching for other ways to tweet. "I've been bored all summer and twitter passes the time for me," she said. She also worried that if she stayed off the platform too long, she'd lose her mutuals — internet shorthand for users who follow each other.
In her search for other posting methods, Dorothy came up with increasingly elaborate ways to daisy-chain systems not designed for tweeting. In her first post, she managed to send a tweet from her Nintendo 3DS, a video-game console with a rudimentary camera and web browser. [...]
She also allegedly used a Nintendo Wii U before settling on the family LG Smart Refrigerator.
Whether this is a gag or not, there are several important issues raised here regarding both addiction and the proliferation of poorly secured consumer-grade devices.
The periodic table has been a vital foundational tool for material research since it was first created 150 years ago. Now, Martin Rahm from Chalmers University of Technology presents a new article which adds an entirely new dimension to the table, offering a new set of principles for material research. The article is published in the Journal of the American Chemical Society.
The study maps how both the electronegativity and the electron configuration of elements change under pressure. These findings offer materials researchers an entirely new set of tools. Primarily, it means it is now possible to make quick predictions about how certain elements will behave at different pressures, without requiring experimental testing or computationally expensive quantum mechanical calculations.
"Currently, searching for those interesting compounds which appear at high pressure requires a large investment of time and resources, both computationally and experimentally. As a consequence, only a tiny fraction of all possible compounds has been investigated. The work we are presenting can act as a guide to help explain what to look for and which compounds to expect when materials are placed under high pressure," says Martin Rahm, Assistant Professor in Chemistry at Chalmers, who led the study.
At high pressures the properties of atoms can change radically. The new study shows how the electron configuration and electronegativity of atoms change as pressure increases. Electron configuration is fundamental to the structure of the periodic table. It determines which group in the system different elements belong to. Electronegativity is also a central concept in chemistry and can be viewed as a third dimension of the periodic table. It indicates how strongly different atoms attract electrons. Together, electron configuration and electronegativity are important for understanding how atoms react with one another to form different substances. At high pressure, atoms which normally do not combine can create new, never-before-seen compounds with unique properties. Such materials can inspire researchers to try other methods for creating them under more normal conditions, and give us new insight into how our world works.
"At high pressure, extremely fascinating chemical structures with unusual qualities can arise, and reactions that are impossible under normal conditions can occur. A lot of what we as chemists know about elements' properties under ambient conditions simply doesn't hold true any longer. You can basically take a lot of your chemistry education and throw it out the window! In the dimension of pressure there is an unbelievable number of new combinations of atoms to investigate" says Martin Rahm.
A well-known example of what can happen at high pressure is how diamonds can be formed from graphite. Another example is the polymerisation of nitrogen gas, where nitrogen atoms are forced together to bond in a three-dimensional network. These two high-pressure materials are very unlike one another. Whereas carbon retains its diamond structure, polymerised nitrogen is unstable and reverts to gas form when the pressure is released. If the polymeric structure of nitrogen could be maintained at normal pressure, it would without doubt be the most energy-dense chemical compound on Earth.
Currently, several research groups use high pressures to create superconductors—materials which can conduct electricity without resistance. Some of these high-pressure superconductors function close to room temperature. If such a material could be made to work at normal pressure, it would be revolutionary, enabling, for example, lossless power transfer and cheaper magnetic levitation.
[...] Only some materials that form at high pressure retain their structure and properties when returned to ambient pressure.
The leaked draft executive order would give bureaucratic government agencies unprecedented control over how Internet platforms moderate speech by allowing them to revoke the essential protections Congress laid out in Section 230 of the Communications Decency Act (CDA). CDA 230 is the basic law that makes it possible for online platforms to let users post our own content, and to make basic decisions about what types of content they as private entities want to host. Every meme, every social media post, every blog and user-created video on the Internet has been made possible by this crucial free speech protection.
In practice, this executive order would mean that whichever political party is in power could dictate what speech is allowed on the Internet. If the government doesn't like the way a private company is moderating content, it could shut the company's entire website down.
From https://www.salon.com/2019/08/12/leaked-draft-of-trump-executive-order-deemed-unconstitutional_partner/ we get the following:
According to CNN, which obtained a copy of the draft, the new rule "calls for the FCC to develop new regulations clarifying how and when the law protects social media websites when they decide to remove or suppress content on their platforms. Although still in its early stages and subject to change, the Trump administration's draft order also calls for the Federal Trade Commission to take those new policies into account when it investigates or files lawsuits against misbehaving companies."
While Politico was the first to report how the draft was being circulated by the White House, CNN notes that if put into effect, "the order would reflect a significant escalation by President Trump in his frequent attacks against social media companies over an alleged but unproven systemic bias against conservatives by technology platforms. And it could lead to a significant reinterpretation of a law that, its authors have insisted, was meant to give tech companies broad freedom to handle content as they see fit."
"[...] It's hard to put into words how mind bogglingly absurd this executive order is," said Evan Greer, deputy director of Fight for the Future, in a tweet. "In the name of defending free speech it would allow mass government censorship of online content. In practice, it means whichever party is in power can decide what speech is allowed on the internet."
This authoritarian order is being pushed with the claim that it will do the opposite of censorship, while in fact giving the federal government even broader power. It reminds me of the following quote: "I like taking guns away early," Trump said. "Take the guns first, go through due process second."
It has been coming for some time, but now a major breach of a biometric database has actually been reported: facial recognition records, fingerprints, log data and personal information have all been found on "a publicly accessible database." The damage is not yet clear, but the report claims that actual fingerprints and facial recognition records for millions of people have been exposed.
The issue with biometric data being stored in this way is that, unlike usernames and passwords, it cannot be changed. Once it’s compromised, it’s compromised. And for that reason this breach report will sound all kinds of alarms.
The report, published by security researchers Noam Rotem and Ran Locar at vpnMentor, relates to Suprema, a company describing itself as a "global Powerhouse in biometrics, security and identity solutions," with a product range that "includes biometric access control systems, time and attendance solutions, fingerprint live scanners, mobile authentication solutions and embedded fingerprint modules."
The news of the breach was first published by Wednesday’s Guardian newspaper in the U.K., which highlighted the use of Suprema solutions by the "Metropolitan Police, defence contractors and banks." The breach, though, is international, with Suprema's Biostar 2 biometric identity SDK integrated into the AEOS access control system "used by 5,700 organisations in 83 countries, including governments, banks and the police."
[...] Almost 28 million records across more than 23 gigabytes of data were exposed, records that include "fingerprint data, facial recognition data, face photos of users, unencrypted usernames and passwords, logs of facility access, security levels and clearance, and personal details of staff."
Highly sensitive data was left unencrypted, including (most alarmingly of all) usernames and passwords. "We were able to find plain-text passwords of administrator accounts," Rotem told the Guardian. "The access allows first of all seeing millions of users are using this system to access different locations and see in real time which user enters which facility or which room in each facility." The researchers were even "able to change data and add new users."
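The plain-text passwords are the most straightforwardly avoidable failure in that list. Below is a minimal sketch of the standard alternative: salting and hashing each password with a memory-hard key-derivation function so the database never stores the password itself. It uses only Python's standard library and is a generic illustration, not a description of Suprema's stack:

```python
import hashlib
import hmac
import os

def hash_password(password):
    """Store only (salt, digest); the password itself is never persisted."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)      # memory-hard parameters
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))     # True
print(verify_password("wr0ng", salt, digest))       # False
```

A leak of (salt, digest) pairs still forces attackers to brute-force each password individually; a leak of plain-text passwords hands them over outright.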
[...] The final interesting takeaway from this story doesn't relate to any of the specifics; it's a much more general point. We are currently giving away biometric information to multiple platforms and providers: our phones, our banks, our immigration services, to name but a few. Every time we do this, our risk increases. At some point the realization will hit that we need some kind of unified platform where we limit the number of parties who actually hold such data, with others accessing those trusted holders on an "as a service" basis.