
posted by Fnord666 on Wednesday August 15, @04:11PM   Printer-friendly
from the better-than-sniffing-glue dept.

If you missed the OpenSSL update released in May, go back and get it: a Georgia Tech team recovered a 2048-bit RSA key from OpenSSL using smartphone processor radio emissions, in a single pass.

The good news is that their attack was on OpenSSL 1.1.0g, which was released last November, and the library has been updated since then. Dubbed “One&Done”, the attack was carried out by Georgia Tech's Monjur Alam, Haider Adnan Khan, Moumita Dey, Nishith Sinha, Robert Callan, Alenka Zajic, and Milos Prvulovic.

The researchers only needed a simple and relatively low-cost Ettus USRP B200 mini receiver (costing less than $1,000/€900/£800) to capture the revealing radio noise from a Samsung Galaxy phone, an Alcatel Ideal phone, and an A13-OLinuXino single-board computer.

In Georgia Tech's announcement, the group explained that its attack is the first to crack OpenSSL without exploiting cache timing or organisation.

[...] The good news is that not only was mitigation relatively simple, it improved OpenSSL's performance. “Our mitigation relies on obtaining all the bits that belong to one window at once, rather than extracting the bits one at a time,” the paper stated. “For the attacker, this means that there are now billions of possibilities for the value to be extracted from the signal, while the number of signal samples available for this recovery is similar to what was originally used for making a binary (single-bit) decision”.

“This mitigation results in a slight improvement in execution time of the exponentiation,” the paper continued.
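The mitigation the paper describes (consuming a whole window of exponent bits at once instead of one bit per iteration) can be sketched in Python. This is an illustrative reimplementation, not OpenSSL's actual code, and the function names are invented:

```python
def modexp_bitwise(base, exp, mod):
    """Square-and-multiply: each loop iteration depends on a single
    exponent bit, which is what the side channel recovered."""
    result = 1
    for bit in bin(exp)[2:]:
        result = (result * result) % mod
        if bit == "1":
            result = (result * base) % mod
    return result

def modexp_fixed_window(base, exp, mod, w=5):
    """Fixed-window variant: the exponent is processed w bits at a
    time, so a single observation must now distinguish 2**w possible
    window values instead of making a binary (single-bit) decision."""
    # Precompute base**0 .. base**(2**w - 1) mod mod.
    table = [1]
    for _ in range(2 ** w - 1):
        table.append((table[-1] * base) % mod)
    result = 1
    bits = bin(exp)[2:]
    # Pad so the bit string splits evenly into w-bit windows.
    bits = "0" * (-len(bits) % w) + bits
    for i in range(0, len(bits), w):
        for _ in range(w):
            result = (result * result) % mod
        window = int(bits[i:i + w], 2)
        result = (result * table[window]) % mod
    return result
```

Both functions compute the same value; the difference is how many exponent bits each observable step of the computation depends on.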

Here's the link to the group's upcoming Usenix talk.

Original Submission

posted by Fnord666 on Wednesday August 15, @02:37PM   Printer-friendly
from the anthropological-science-FTW dept.

The diet and eating habits of earlier civilizations have been inferred from old manuscripts and artwork, but there is always a question as to how representative those are of the common diet at the time, in much the same way as whether, a millennium from now, one could infer our modern-day diet from surviving "foodie" magazines. It is always a bonus when you have access to actual tissue to analyze. In a recent paper in the open-access Nature journal Scientific Reports, Atsushi Maruyama and colleagues in Japan acquired a number of book sets produced during the Edo period and analyzed samples of human hair found in the books. By analyzing the abundances of various carbon and nitrogen isotopes, they were able to make inferences about the early modern Japanese diet.

The covers of such books are made of recycled thick paper, which, for financial reasons, was believed to have been produced soon before book printing, using waste paper collected in the same cities where the books were printed. Because the hairs are embedded in the paper fibres, the hairs are thought to have been mixed accidentally during waste paper collection or blended intentionally for reinforcement during paper production. In either case, the hairs most likely belong to people living in the city and year of book printing, both of which are available from the records (colophon) on the book. Thus, the hairs found in each book, together with the records of time and place, constitute the ideal human tissue samples to reconstruct the eating habits at the time and place of the book printing, using isotope analysis.

They found that these people relied on rice, vegetables, and fish far more exclusively than contemporary Japanese people do. They also noticed that nitrogen isotope levels increased over the 200-year period, indicating a growing contribution of marine fish as both food and fertilizer, which generally confirms what literature-based studies have found.
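Such isotope measurements are conventionally reported in delta notation, the per-mil deviation of a sample's isotope ratio from a reference standard. A minimal sketch of the conversion (the reference value below is approximate and included for illustration only):

```python
def delta_permil(r_sample, r_standard):
    """Convert an isotope abundance ratio (e.g. 13C/12C or 15N/14N)
    to delta notation in per mil relative to a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Approximate VPDB 13C/12C reference ratio (illustrative value).
VPDB_13C_12C = 0.011180

# A sample with a lower ratio than the standard gives a negative delta,
# i.e. the sample is depleted in the heavier isotope.
print(delta_permil(0.011065, VPDB_13C_12C))
```

Trends in these delta values across book-printing dates are what let the authors track the rising marine-fish contribution over time.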

Atsushi Maruyama, Jun'ichiro Takemura, Hayato Sawada, Takaaki Kaneko, Yukihiro Kohmatsu & Atsushi Iriguchi, Hairs in old books isotopically reconstruct the eating habits of early modern Japan, Scientific Reports volume 8, Article number: 12152 (2018)

Original Submission

posted by martyb on Wednesday August 15, @01:02PM   Printer-friendly
from the won't-you-be-my-neighbor? dept.

The nearest neighbor problem asks where a new point fits into an existing data set. A few researchers set out to prove that there was no universal way to solve it. Instead, they found such a way.

If you were opening a coffee shop, there's a question you'd want answered: Where's the next closest cafe? This information would help you understand your competition.

This scenario is an example of a type of problem widely studied in computer science called "nearest neighbor" search. It asks, given a data set and a new data point, which point in your existing data is closest to your new point? It's a question that comes up in many everyday situations in areas such as genomics research, image searches and Spotify recommendations.
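Under the everyday Euclidean metric, the baseline is a brute-force linear scan over all points. A minimal sketch (the coffee-shop data here is invented):

```python
import math

def nearest_neighbor(points, query):
    """Return the point in `points` closest to `query` under the
    Euclidean metric, scanning every candidate (O(n) per query)."""
    return min(points, key=lambda p: math.dist(p, query))

cafes = [(0.0, 0.0), (3.0, 4.0), (6.0, 1.0)]
print(nearest_neighbor(cafes, (5.0, 2.0)))  # → (6.0, 1.0)
```

The hard part, and the subject of the papers discussed below, is answering such queries faster than a full scan when the distance function is not a nice geometric metric like this one.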

And unlike the coffee shop example, nearest neighbor questions are often very hard to answer. Over the past few decades, top minds in computer science have applied themselves to finding a better way to solve the problem. In particular, they've tried to address complications that arise because different data sets can use very different definitions of what it means for two points to be "close" to one another.

Now, a team of computer scientists has come up with a radically new way of solving nearest neighbor problems. In a pair of papers, five computer scientists have elaborated the first general-purpose method of solving nearest neighbor questions for complex data.

Original Submission

posted by martyb on Wednesday August 15, @11:25AM   Printer-friendly
from the Your-honor,-there-was-a-tree-branch-blocking-the-sign! dept.

Utilizing FOIA and some clever software, Mr. Chapman quickly identifies a trouble spot for parking in Chicago and gets results!

The story relates how the author used Freedom of Information Act requests to gather raw data on parking tickets issued in Chicago. What he received was a semicolon-delimited text file containing a great number of data entry errors. The author outlines the steps taken to clean the data and extract a likely problematic parking location. Armed with this data, he visited the location and discovered very confusing signage. He reported this to the city, which rectified the signage. This led to a 50 percent decrease in the number of tickets issued at that location.
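The core of the cleanup step can be sketched with the standard library alone. The column name `address` is an assumption for illustration; the real dump's schema isn't given in the story:

```python
import csv
from collections import Counter

def top_ticket_locations(path, n=10):
    """Count tickets per address in a semicolon-delimited dump,
    skipping rows whose address field is blank or missing."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f, delimiter=";"):
            address = (row.get("address") or "").strip().upper()
            if address:  # drop rows with an obviously broken address
                counts[address] += 1
    return counts.most_common(n)
```

Normalizing case and whitespace before counting is what lets variant spellings of the same address (a common data-entry error in the dump) collapse into one hot spot.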

I immediately asked myself three things:

1. How much more effective has that corner become?
2. Who's grumbling about the loss of revenue?
3. What would happen if more of us did this very thing?

Original Submission

posted by Fnord666 on Wednesday August 15, @09:48AM   Printer-friendly
from the hello-DefCon dept.

Submitted via IRC for BoyceMagooglyMonkey

Research funded by the Department of Homeland Security has found a "slew" of vulnerabilities in mobile devices offered by the four major U.S. cell phone carriers, including loopholes that may allow a hacker to gain access to a user's data, emails, and text messages without the owner's knowledge.

The flaws allow a user "to escalate privileges and take over the device," Vincent Sritapan, a program manager at the Department of Homeland Security's Science and Technology Directorate, told Fifth Domain during the Black Hat conference in Las Vegas.

The vulnerabilities are built into devices before a customer purchases the phone. Researchers said it is not clear if hackers have exploited the loophole yet.

Department of Homeland Security officials declined to say which manufacturers have the underlying vulnerabilities.

Millions of users in the U.S. are likely at risk, a source familiar with the research said, although the total number is not clear.

Because of the size of the market, it is likely that government officials are also at risk. The vulnerabilities are not limited to the U.S.

Researchers are expected to announce more details about the flaws later in the week.


Original Submission

posted by Fnord666 on Wednesday August 15, @08:16AM   Printer-friendly
from the tick-tock-tick-zap dept.

Submitted via IRC for SoyCow1984

Life-saving pacemakers manufactured by Medtronic don't rely on encryption to safeguard firmware updates, a failing that makes it possible for hackers to remotely install malicious wares that threaten patients' lives, security researchers said Thursday.

At the Black Hat security conference in Las Vegas, researchers Billy Rios and Jonathan Butts said they first alerted medical device maker Medtronic to the hacking vulnerabilities in January 2017. So far, they said, the proof-of-concept attacks they developed still work. The duo on Thursday demonstrated one hack that compromised a CareLink 2090 programmer, a device doctors use to control pacemakers after they're implanted in patients.

Because updates for the programmer aren't delivered over an encrypted HTTPS connection and firmware isn't digitally signed, the researchers were able to force it to run malicious firmware that would be hard for most doctors to detect. From there, the researchers said, the compromised machine could cause implanted pacemakers to make life-threatening changes in therapies, such as increasing the number of shocks delivered to patients.
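What "digitally signed" buys is easy to sketch. The check below verifies an image against a pinned SHA-256 digest with a constant-time comparison; a real updater would go further and verify a vendor signature (e.g. ECDSA) over that digest, so only the vendor's private key can produce acceptable images. Names here are illustrative:

```python
import hashlib
import hmac

def firmware_digest_ok(image: bytes, expected_sha256_hex: str) -> bool:
    """Reject any firmware image whose SHA-256 digest does not match
    the pinned value. hmac.compare_digest avoids timing leaks in the
    comparison itself."""
    digest = hashlib.sha256(image).hexdigest()
    return hmac.compare_digest(digest, expected_sha256_hex)
```

Even this minimal gate would have blocked the demonstrated attack's unsigned replacement firmware, provided the pinned digest itself arrives over an authenticated channel such as HTTPS.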


Related: A Doctor Trying to Save Medical Devices from Hackers
Security Researcher Hacks Her Own Pacemaker
Updated: University of Michigan Says Flaws That MedSec Reported Aren't That Serious
Fatal Flaws in Ten Pacemakers Make for Denial of Life Attacks
After Lawsuits and Denial, Pacemaker Vendor Finally Admits its Product is Hackable
8,000 Vulnerabilities Found in Software to Manage Cardiac Devices
465,000 US Patients Told That Their Pacemaker Needs a Firmware Upgrade
Abbott Addresses Life-Threatening Flaw in a Half-Million Pacemakers

Original Submission

posted by martyb on Wednesday August 15, @06:42AM   Printer-friendly
from the building-up-to-it dept.

Home Depot's Sales Rebound Muted by Inflation in Fuel and Lumber

Home Depot Inc.'s sales rebounded last quarter as Americans took on more remodeling projects, but rising costs for lumber and transportation are weighing on profitability.

[...] Home Depot and its smaller rival Lowe's Cos. are often seen as proxies for the health of the housing sector because property owners spend more on their homes when they believe values are rising. But for several quarters there's been increasing concern that years of robust home-price gains are cooling. For its part, Home Depot has continually said that a shortage of available homes in many markets would actually underpin higher home-improvement spending.

[...] Even as the overall housing market looks to be cooling, several trends are driving demand for home-improvement products. A shortage of available listings has slowed property purchases, causing some owners to opt for sprucing up their homes instead. Additionally, more people are staying longer in their homes, which also supports the uptick.

The labor market also plays a role: A strong run of hiring, coupled with moderate wage growth, has boosted Americans' wherewithal to spend money on fixing up their homes. Spending on home improvement -- which accounts for about 38 percent of private residential construction outlays -- surged 13.8 percent in June from a year earlier to reach $221 billion, according to Commerce Department data. Going forward, the job market may continue to propel housing and remodeling demand. But potential hurdles include a pickup in mortgage rates, a shortage of skilled workers for building and remodeling projects, and rising costs for construction materials such as lumber, which is affected by tariffs.

Also at CNN and CNBC.

Original Submission

posted by Fnord666 on Wednesday August 15, @05:05AM   Printer-friendly
from the his-heart-grew-three-sizes-that-day dept.

Many studies have attempted to identify a single transcription factor that can induce formation of the mesoderm, an early layer in embryonic development, without help from other cellular proteins. None have been successful, until now.

In a new study published in Cell Stem Cell, titled "Tbx6 Induces Nascent Mesoderm from Pluripotent Stem Cells and Temporally Controls Cardiac versus Somite Lineage Diversification," a research team, including experts from the University of Tsukuba, screened over 50 transcription factors and found that Tbx6 alone was able to stimulate mesoderm formation in laboratory-grown stem cells, and could cause those stem cells to become cardiovascular or musculoskeletal cells.

[...] In the study, temporary production of Tbx6 caused the formation of mesoderm that later produced cardiovascular cells, while continuous Tbx6 expression suppressed this cardiovascular-forming mesoderm and caused formation of mesoderm that later produced musculoskeletal cells.

"Our analyses revealed a connection between early Tbx6 expression and cardiovascular lineage differentiation, and we believe that our study and similar studies may change the current view of lineage specification during development," Dr. Ieda explains. "Importantly, this essential and unappreciated function of Tbx6 in mesoderm and cardiovascular specification is conserved from lower organisms to mammals, so this discovery may have wide-ranging applicability in regenerative medicine."

Tbx6 Induces Nascent Mesoderm from Pluripotent Stem Cells and Temporally Controls Cardiac versus Somite Lineage Diversification (DOI: 10.1016/j.stem.2018.07.001) (DX)

Original Submission

posted by mrpg on Wednesday August 15, @03:33AM   Printer-friendly
from the nemure-akira dept.

Submitted via IRC for cmn32480

NASA's Opportunity rover has had an incredible career already, spending years upon years studying the Martian surface and proving to be an incredibly reliable and hardy piece of hardware. Unfortunately, a Martian dust storm that began kicking up in May may have abruptly ended its historic run.

In mid-June, the solar-powered Opportunity ran out of juice and was forced into its dormant standby mode. The dust storm, which swallowed the entirety of Mars, had blocked out the Sun, cutting the rover off from its only available source of power. NASA engineers remained optimistic that the rover would wake back up when the skies began to clear, but things aren't looking good thus far.

[...] That's...not great news. NASA knew that the rover would be forced to sit dormant for a while because of the intensity of the storm, but that was several weeks ago. The dust has since begun to settle, and enough light should be pushing its way down to the surface to begin recharging Opportunity's batteries once again.

Source: NASA's Opportunity rover still hasn't woken up from a Mars dust storm, and engineers are getting nervous

Original Submission

posted by mrpg on Wednesday August 15, @02:02AM   Printer-friendly
from the [sigh] dept.

Intel's SGX blown wide open by, you guessed it, a speculative execution attack

Another day, another speculative execution-based attack. Data protected by Intel's SGX—data that's meant to be protected even from a malicious or hacked kernel—can be read by an attacker thanks to leaks enabled by speculative execution.

Since publication of the Spectre and Meltdown attacks in January this year, security researchers have been taking a close look at speculative execution and the implications it has for security. All high-speed processors today perform speculative execution: they assume certain things (a register will contain a particular value, a branch will go a particular way) and perform calculations on the basis of those assumptions. It's an important design feature of these chips that's essential to their performance, and it has been for 20 years.

[...] What's in store today? A new Meltdown-inspired attack on Intel's SGX, given the name Foreshadow by the researchers who found it. Two groups of researchers found the vulnerability independently: a team from KU Leuven in Belgium reported it to Intel in early January—just before Meltdown and Spectre went public—and a second team from the University of Michigan, University of Adelaide, and Technion reported it three weeks later.

SGX, standing for Software Guard eXtensions, is a new feature that Intel introduced with its Skylake processors that enables the creation of Trusted Execution Environments (TEEs). TEEs are secure environments where both the code and the data the code works with are protected to ensure their confidentiality (nothing else on the system can spy on them) and integrity (any tampering with the code or data can be detected). SGX is used to create what are called enclaves: secure blocks of memory containing code and data. The contents of an enclave are transparently encrypted every time they're written to RAM and decrypted on being read. The processor governs access to the enclave memory: any attempt to access the enclave's memory from outside the enclave should be blocked.

[...] As with many of the other speculative execution issues, a large part of the fix comes in the form of microcode updates, and in this case, the microcode updates have already been released and in the wild for some weeks. With the updated microcode, every time the processor leaves execution of an enclave, it also flushes the level 1 cache. With no data in the level 1 cache, there's no scope for L1TF to take effect. Similarly, with the new microcode, leaving System Management Mode flushes the level 1 cache, protecting SMM data.

Also at Engadget and Wired.

Original Submission

posted by Fnord666 on Wednesday August 15, @12:21AM   Printer-friendly
from the just-getting-to-know-you dept.

Submitted via IRC for SoyCow1984

Students are suing a major college admissions test maker for allegedly selling information about their disability statuses to universities, which they say could hurt their chances of getting into schools and impact the rest of their lives.

When students register to take the ACT—a standardized test used for college admissions taken by more than a million high schoolers each year—they answer a barrage of personal questions. As part of this, they are asked to note if they have disabilities that require "special provisions from the educational institution."

The ACT, which is administered by ACT, Inc., is the only real competitor to the College Board's SAT exam. The lawsuit claims that the ACT is selling the data it gleans from those student questionnaires—connected directly to students' individual identities—to colleges, which then use it to make important decisions about admissions and financial aid.

"A lot of students and parents have no idea how these testing agencies, which are gatekeepers to college, are using very sensitive and confidential data in the college admissions process," Jesse Creed, one of the plaintiffs' lawyers, told me in a phone call. "[Colleges are] hungry for disability data, because they have limited resources, and it's expensive to educate people with disabilities."


Original Submission

posted by Fnord666 on Tuesday August 14, @10:49PM   Printer-friendly

A novel laboratory-synthesized molecule, based on natural compounds known as marinoquinolines found in marine gliding bacteria, is a strong candidate for the development of a new antimalarial drug.

In tests, the molecule proved capable of killing even the strain that resists conventional antimalarials. The molecule displays low toxicity and high selectivity, acting only on the parasite and not on other cells of the host organism.

The molecule was developed in Brazil at the Center for Research and Innovation in Biodiversity and Drug Discovery (CIBFar). The researchers tested the molecule in strains cultured in vitro as well as in mice using Plasmodium berghei, since mice are immune to infection by Plasmodium falciparum, which causes the most aggressive type of malaria.

"In mice, the number of parasites in the bloodstream (parasitemia) had fallen 62 percent by the fifth day of the test. After 30 days, all the mice given doses of the molecule were still alive," said Rafael Guido, a professor at the University of São Paulo's São Carlos Physics Institute (IFSC-USP).

Guido co-authors an article published in the Journal of Medicinal Chemistry, in which the researchers describe the molecule's inhibitory action in the blood and liver stages of the parasite's asexual cycle, which is responsible for the signs and symptoms of the disease.

[...] According to Duarte Correia, the first 50 molecules developed from marinoquinolines were tested in the FAPESP-supported study. "This work hasn't ended with this publication. We're still developing other compounds," he said.

The researchers are also characterizing the potential of this class to treat malaria caused by P. vivax, the most prevalent form in Brazil, and are developing the pharmacokinetic part of the project (how drugs move through the organism).

"If the pharmacokinetic properties, especially solubility, absorption, distribution, metabolism and excretion, aren't adequate, the compound can build up in the organism and become toxic to the patient, making it inappropriate for treatment. After completing this step, we plan to perform preclinical and clinical trials," Guido said.

Anna Caroline Campos Aguiar et al., Discovery of Marinoquinolines as Potent and Fast-Acting Plasmodium falciparum Inhibitors with in Vivo Activity, Journal of Medicinal Chemistry (2018). DOI: 10.1021/acs.jmedchem.8b00143

Original Submission

posted by Fnord666 on Tuesday August 14, @09:17PM   Printer-friendly
from the dense-at-the-center dept.

A team of scientists from the Faculty of Physics and Sternberg State Astronomical Institute, MSU, leading an international collaboration with members from Europe, Chile, the U.S. and Australia discovered a supermassive black hole in the center of the Fornax galaxy. The results of the research were published in Monthly Notices of the Royal Astronomical Society journal.

Fornax UCD3 is part of the Fornax galaxy cluster and belongs to a very rare and unusual class of galaxies, ultracompact dwarfs (UCDs). The mass of such dwarf galaxies reaches several tens of millions of solar masses, and their radius typically does not exceed 300 light years. This ratio between mass and size makes UCDs the densest stellar systems in the universe.

"We have discovered a supermassive black hole in the center of Fornax UCD3. The black hole mass is 3.5 million times that of the Sun, similar to the central black hole in our own Milky Way," explained Anton Afanasiev, the first author of the article and a student in the Faculty of Physics, MSU.

[...] The black hole discovered by the authors is the fourth ever found in a UCD and corresponds to 4 percent of the total galaxy mass. In typical galaxies, this ratio is considerably lower (about 0.3 percent). Though there are few known examples, the existence of massive black holes in UCDs is a strong argument for the tidal origin of such galaxies. According to this hypothesis, an average-sized galaxy passed a bigger and more massive one at a certain stage of its evolution and, as a result of tidal forces, lost the majority of its stars. The remaining compact nucleus has become what we know as an ultracompact dwarf.

"To be able to say with complete assurance that this hypothesis is correct, we need to discover more supermassive black holes in UCDs. This is one of the prospects of this work.

Moreover, a similar methodology may be applied to more massive and less dense compact elliptical galaxies. In one of our next works, we will study the population of central black holes in objects of this kind," concluded the scientist.
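As a back-of-envelope check, the quoted black-hole mass and mass fraction together pin down the galaxy's total mass, consistent with the tens-of-millions-of-solar-masses scale typical of UCDs:

```python
# Figures quoted in the article, in solar-mass units.
bh_mass = 3.5e6        # black hole: 3.5 million solar masses
bh_fraction = 0.04     # black hole is 4 percent of the galaxy's mass

galaxy_mass = bh_mass / bh_fraction
print(f"{galaxy_mass:.2e}")  # 8.75e+07 solar masses
```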

Original Submission

posted by Fnord666 on Tuesday August 14, @07:45PM   Printer-friendly
from the just-use-the-front-door dept.

Australia's promised “not-a-backdoor” crypto-busting bill is out and the government has kept its word - it doesn't want a backdoor, just the keys to your front one.

The draft of the Assistance and Access Bill 2018 calls for anyone using or selling communications services in Australia to be subject to police orders for access to private data.

That includes all vendors of computers, phones, apps, social media and cloud services in the Lucky Country, and anyone within national borders using them. These data-tapping orders will be enforced with fines of up to AU$10m (US$7.3m) for companies or AU$50,000 (US$36,368) for individuals.

The draft legislation also provides for five years in prison for anyone who reveals that a data-slurping investigation is under way. And while there are no explicit encryption-backdoor requirements in the 110-page draft bill, our first look suggests there doesn't need to be.

Original Submission

posted by Fnord666 on Tuesday August 14, @05:43PM   Printer-friendly
from the can-you-ID-me-now? dept.

Browser fingerprinting is where JavaScript or other means are used to scrape uniquely identifying information from browser metadata and behavior, such as how the browser draws a canvas object. In its latest release, Apple will defeat browser fingerprinting by making all Mac users look alike to advertisers and websites that use fingerprinting to track users. Apple can afford to do this because it doesn't have skin in the online advertising game.
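The principle can be sketched from the tracker's side: many weakly identifying attributes are hashed into one strong identifier. The attribute names and values below are invented for illustration:

```python
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    """Hash a set of scraped browser attributes into a single stable
    identifier. Each attribute leaks little on its own; together they
    are often unique per machine."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Even a tiny difference in one attribute yields a different ID.
a = fingerprint({"ua": "Safari/605", "fonts": 212, "canvas": "af01"})
b = fingerprint({"ua": "Safari/605", "fonts": 212, "canvas": "af02"})
print(a != b)
```

Apple's countermeasure amounts to making the attribute set identical for every Mac user, so the resulting identifiers collide and stop being identifying.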

[This is likely only going to be for the Safari browser. - Ed]

Original Submission