posted by janrinok on Wednesday August 15 2018, @10:38PM   Printer-friendly
from the bags-of-dosh dept.

Chinese electric carmaker NIO has filed for a $1.8 billion initial public offering in the United States as the burgeoning company seeks to compete with US rival Tesla. NIO is one of dozens of new automakers to crop up in China as policymakers in Beijing push an all-electric future for the world's largest auto market.

American, Japanese and European auto giants dominate sales of combustion engine vehicles in China, but homegrown firms unencumbered by the billions sunk into refining gasoline engines are in the driver's seat when it comes to electric cars.

While Tesla chief Elon Musk is in talks with Saudi Arabia's sovereign wealth fund and other investors to take his company private, NIO filed papers at the US Securities and Exchange Commission on Monday to go public.

But the upstart Chinese automaker faces a long road ahead for its proposed float of up to $1.8 billion on the New York Stock Exchange. The company had delivered only 481 units of its first mass-production electric SUV, the ES8, by the end of July, with reservations and deposits in place for an additional 17,000.


Original Submission

posted by janrinok on Wednesday August 15 2018, @09:13PM   Printer-friendly
from the to-infinity-and-beyond dept.

SpaceX's Falcon Heavy eyed by Europe/Japan

According to RussianSpaceWeb, SpaceX's Falcon Heavy rocket is under serious consideration for launches of major European and Japanese payloads associated with the Lunar Orbital Platform-Gateway (formerly the Deep Space Gateway).

[...] The first payload considering Falcon Heavy for launch services is the Japanese Space Agency's (JAXA) HTV-X, an upgraded version of a spacecraft the country developed to assist in resupplying the International Space Station (ISS). HTV-X is primarily being designed with an ISS-resupply role still at the forefront, but RussianSpaceWeb recently reported that JAXA is seriously considering the development of a variant of the robotic spacecraft dedicated to resupplying the Lunar Orbital Platform-Gateway (LOPG; and I truly wish I were joking about both the name and acronym).

[...] Regardless of the LOPG's existential merits, a lot of energy (and money) is currently being funneled into planning and initial hardware development for the lunar station's various modular segments. JAXA is currently analyzing ways to resupply LOPG and its crew complement with its HTV-X cargo spacecraft, currently targeting its first annual ISS resupply mission by the end of 2021. While JAXA will use its own domestic H-III rocket to launch HTV-X to the ISS, that rocket simply is not powerful enough to place a minimum of ~10,000 kg (22,000 lb) on a trans-lunar insertion (TLI) trajectory. As such, JAXA is examining SpaceX's Falcon Heavy as a prime (and affordable) option: by recovering both side boosters on SpaceX's drone ships and sacrificing the rocket's center core, a 2/3rds-reusable Falcon Heavy should be able to send as much as 20,000 kg to TLI (lunar orbit), according to comments made by CEO Elon Musk.

That impressive performance would also be needed for another LOPG payload, this time for ESA's 5-6 ton European System Providing Refueling Infrastructure and Telecommunications (ESPRIT) lunar station module. That component is unlikely to reach launch readiness before 2024, but ESA is already considering Falcon Heavy (over its own Ariane 6 rocket) in order to save some of the module's propellant. At no more than 6 metric tons, ESPRIT could most likely be launched by a Falcon Heavy that still recovers all three of its first-stage boosters.

Previously: NASA's Chief of Human Spaceflight Rules Out Use of Falcon Heavy for Lunar Station

Related: NASA and International Partners Planning Orbital Lunar Outpost
Russia Assembles Engineering Group for Lunar Activities and the Deep Space Gateway
This Week in Space Pessimism: SLS, Mars, and Lunar Gateway


Original Submission

posted by takyon on Wednesday August 15 2018, @07:33PM   Printer-friendly
from the legislated-in-America dept.

President Trump yesterday signed a defense funding bill that included a sweeping ban on the US government using technology supplied by Chinese telecommunications giants ZTE and Huawei. The bill also includes a narrower ban on using surveillance gear provided by Chinese companies Hytera Communications, Hangzhou Hikvision Digital Technology, or Dahua Technology for national security applications.

The legislation directs federal agencies to stop using the Chinese-made hardware within two years. If that proves impractical, an agency can apply for a waiver to permit a longer phase-out period.

Obviously, being banned from selling to the US government is a significant blow to these companies. But overall, the bill actually represents something of a reprieve for ZTE. Back in June, the US Senate passed a version of the bill that would have re-imposed an export ban, a de facto death sentence for ZTE, which is heavily dependent on US components such as Qualcomm chips and Google's Android operating system.

Previously: Verizon Cancels Plans to Sell Huawei Phone Due to U.S. Government Pressure
U.S. Intelligence Agency Heads Warn Against Using Huawei and ZTE Products
The U.S. Intelligence Community's Demonization of Huawei Remains Highly Hypocritical
Huawei CEO Still Committed to the U.S. Market
Rural Wireless Association Opposes U.S. Government Ban on Huawei and ZTE Equipment
ZTE Suspends Operations Due to U.S. Ban (UPDATED)


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @05:44PM   Printer-friendly
from the faster-path-to-skynet dept.

Submitted via IRC for SoyCow1984

Students from Fast.ai, a small organization that runs free machine-learning courses online, just created an AI algorithm that outperforms code from Google's researchers, according to an important benchmark.

Fast.ai's success is important because it sometimes seems as if only those with huge resources can do advanced AI research.

Fast.ai consists of part-time students keen to try their hand at machine learning—and perhaps transition into a career in data science. It rents access to computers in Amazon's cloud.

But Fast.ai's team built an algorithm that beats Google's code, as measured using a benchmark called DAWNBench, from researchers at Stanford. This benchmark uses a common image classification task to track the speed of a deep-learning algorithm per dollar of compute power.
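
In DAWNBench's cost category, the ranking boils down to multiplying the time needed to reach a target accuracy by the hourly price of the hardware used. A minimal sketch of that bookkeeping is below; every figure in it is an invented placeholder, not a number from DAWNBench, Fast.ai, or Google.

```python
# Sketch of DAWNBench-style "cost to train" bookkeeping.
# All prices and training times are made-up placeholders, not benchmark data.

def training_cost(hours: float, hourly_rate_usd: float, num_instances: int = 1) -> float:
    """Total cloud bill for training one model to the target accuracy."""
    return hours * hourly_rate_usd * num_instances

entries = {
    # name: (hours to reach target accuracy, $/hour per instance, instance count)
    "rented_cloud_gpus": (3.0, 12.00, 1),
    "custom_accelerators": (0.5, 8.00, 4),
}

for name, (hours, rate, n) in entries.items():
    cost = training_cost(hours, rate, n)
    print(f"{name}: ${cost:.2f} ({hours} h on {n} instance(s) at ${rate}/h)")
```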

Google's researchers topped the previous rankings, in a category for training on several machines, using a custom-built collection of its own chips designed specifically for machine learning. The Fast.ai team was able to produce something even faster, on roughly equivalent hardware.

"State-of-the-art results are not the exclusive domain of big companies," says Jeremy Howard, one of Fast.ai's founders and a prominent AI entrepreneur. Howard and his cofounder, Rachel Thomas, created Fast.ai to make AI more accessible and less exclusive.

Source: https://www.technologyreview.com/s/611858/small-team-of-ai-coders-beats-googles-code/


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @04:11PM   Printer-friendly
from the better-than-sniffing-glue dept.

If you missed the OpenSSL update released in May, go back and get it: a Georgia Tech team recovered a 2048-bit RSA key from OpenSSL using smartphone processor radio emissions, in a single pass.

The good news is that their attack was on OpenSSL 1.1.0g, which was released last November, and the library has been updated since then. Dubbed “One&Done”, the attack was carried out by Georgia Tech's Monjur Alam, Haider Adnan Khan, Moumita Dey, Nishith Sinha, Robert Callan, Alenka Zajic, and Milos Prvulovic.

The researchers only needed a simple and relatively low-cost Ettus USRP B200 mini receiver (costing less than $1,000/€900/£800) to capture the revealing radio noise from a Samsung Galaxy phone, an Alcatel Ideal phone, and an A13-OLinuXino single-board computer.

In Georgia Tech's announcement, the group explained that its attack is the first to crack OpenSSL without exploiting cache timing or organisation.

[...] The good news is that not only was mitigation relatively simple, it improved OpenSSL's performance. “Our mitigation relies on obtaining all the bits that belong to one window at once, rather than extracting the bits one at a time,” the paper stated. “For the attacker, this means that there are now billions of possibilities for the value to be extracted from the signal, while the number of signal samples available for this recovery is similar to what was originally used for making a binary (single-bit) decision”.

“This mitigation results in a slight improvement in execution time of the exponentiation,” the paper continued.
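
The mitigation quoted above is, in effect, windowed exponentiation: the exponent is consumed several bits at a time via a table of precomputed powers, so no single multiply in the captured signal corresponds to a single secret bit. The Python sketch below contrasts the two approaches for illustration only; it is not OpenSSL's code and omits the constant-time and blinding measures a real library also needs.

```python
# Illustrative contrast between bit-at-a-time square-and-multiply and a
# fixed-window exponentiation that consumes several exponent bits per step.
# Teaching sketch only: not OpenSSL's implementation.

def modexp_bit_by_bit(base: int, exponent: int, modulus: int) -> int:
    """Left-to-right square-and-multiply: one exponent bit per iteration."""
    result = 1
    for bit in bin(exponent)[2:]:
        result = (result * result) % modulus
        if bit == "1":                           # a per-bit decision an attacker
            result = (result * base) % modulus   # might try to observe
    return result

def modexp_fixed_window(base: int, exponent: int, modulus: int, w: int = 5) -> int:
    """Process w exponent bits at a time using a table of base powers."""
    table = [1] * (1 << w)
    for i in range(1, 1 << w):
        table[i] = (table[i - 1] * base) % modulus
    bits = bin(exponent)[2:]
    bits = bits.zfill(((len(bits) + w - 1) // w) * w)   # pad to a multiple of w
    result = 1
    for i in range(0, len(bits), w):
        for _ in range(w):
            result = (result * result) % modulus
        window = int(bits[i:i + w], 2)           # all bits of the window at once
        result = (result * table[window]) % modulus
    return result

assert modexp_bit_by_bit(7, 123456789, 1000003) == pow(7, 123456789, 1000003)
assert modexp_fixed_window(7, 123456789, 1000003) == pow(7, 123456789, 1000003)
```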

Here's the link to the group's upcoming Usenix talk.


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @02:37PM   Printer-friendly
from the anthropological-science-FTW dept.

The diet and eating habits of earlier civilizations have been inferred from old manuscripts and artwork, but there is always a question of how representative those sources are of the common diet of the time, much as one might wonder whether, a millennium from now, our modern-day diet could be inferred from surviving "foodie" magazines. It is always a bonus when you have access to actual tissue to analyze. In a recent paper in the open-access journal Scientific Reports, Atsushi Maruyama and colleagues in Japan acquired a number of book sets produced during the Edo period and analyzed samples of human hair found in the books. By measuring the abundances of various carbon and nitrogen isotopes in the hair, they were able to make inferences about the early Japanese diet.

The covers of such books are made of recycled thick paper, which, for financial reasons, was believed to have been produced soon before book printing, using waste paper collected in the same cities where the books were printed. Because the hairs are embedded in the paper fibres, the hairs are thought to have been mixed accidentally during waste paper collection or blended intentionally for reinforcement during paper production. In either case, the hairs most likely belong to people living in the city and year of book printing, both of which are available from the records (colophon) on the book. Thus, the hairs found in each book, together with the records of time and place, constitute the ideal human tissue samples to reconstruct the eating habits at the time and place of the book printing, using isotope analysis.

They found that people of the period depended on rice, vegetables, and fish more exclusively than contemporary Japanese people do. They also noticed that nitrogen isotope levels increased over the 200-year span, indicating a growing contribution of marine fish as both food and fertilizer, which generally confirms what literature-based studies have found.
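
For readers unfamiliar with the method, such studies report carbon and nitrogen isotope abundances in delta (δ) notation, per mil relative to a standard; higher δ15N in hair keratin generally points to more animal or marine protein in the diet. The sketch below shows the arithmetic with invented sample ratios; none of the numbers are measurements from the paper.

```python
# Sketch of the delta (δ) notation used in stable-isotope diet studies.
# Reference ratios are the commonly quoted values for the VPDB and
# atmospheric-N2 standards; the "hair" ratios are invented for illustration.

VPDB_13C_12C = 0.0111802   # approx. 13C/12C of the VPDB carbon standard
AIR_15N_14N = 0.0036765    # approx. 15N/14N of atmospheric nitrogen

def delta_per_mil(sample_ratio: float, standard_ratio: float) -> float:
    """δ = (R_sample / R_standard - 1) * 1000, in per mil (‰)."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Hypothetical hair measurements: a rising δ15N over time is how a growing
# contribution of marine fish would show up in samples like these.
hair_13c_12c = 0.0109600
hair_15n_14n = 0.0037100

print(f"δ13C ≈ {delta_per_mil(hair_13c_12c, VPDB_13C_12C):+.1f} ‰")
print(f"δ15N ≈ {delta_per_mil(hair_15n_14n, AIR_15N_14N):+.1f} ‰")
```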

Atsushi Maruyama, Jun'ichiro Takemura, Hayato Sawada, Takaaki Kaneko, Yukihiro Kohmatsu & Atsushi Iriguchi, Hairs in old books isotopically reconstruct the eating habits of early modern Japan, Scientific Reports volume 8, Article number: 12152 (2018)


Original Submission

posted by martyb on Wednesday August 15 2018, @01:02PM   Printer-friendly
from the won't-you-be-my-neighbor? dept.

The nearest neighbor problem asks where a new point fits in to an existing data set. A few researchers set out to prove that there was no universal way to solve it. Instead, they found such a way.

If you were opening a coffee shop, there's a question you'd want answered: Where's the next closest cafe? This information would help you understand your competition.

This scenario is an example of a type of problem widely studied in computer science called "nearest neighbor" search. It asks, given a data set and a new data point, which point in your existing data is closest to your new point? It's a question that comes up in many everyday situations in areas such as genomics research, image searches and Spotify recommendations.

And unlike the coffee shop example, nearest neighbor questions are often very hard to answer. Over the past few decades, top minds in computer science have applied themselves to finding a better way to solve the problem. In particular, they've tried to address complications that arise because different data sets can use very different definitions of what it means for two points to be "close" to one another.
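
As a point of reference for what the problem asks, the naive answer is a brute-force linear scan parameterised by whatever distance function the data set calls for, and that distance function is exactly the part that varies. The sketch below is this baseline, not the new general-purpose method from the papers discussed next.

```python
# Brute-force nearest-neighbour search, parameterised by the distance measure.
# This linear scan is the naive baseline; the research described here is about
# general-purpose *sublinear* data structures that work for many metrics.
from math import dist          # Euclidean distance, Python 3.8+
from typing import Callable, Sequence

Point = Sequence[float]

def nearest_neighbor(query: Point,
                     data: list[Point],
                     distance: Callable[[Point, Point], float] = dist) -> Point:
    """Return the point in `data` closest to `query` under `distance`."""
    return min(data, key=lambda p: distance(query, p))

cafes = [(0.0, 0.0), (2.0, 1.0), (5.0, 5.0)]
new_shop = (1.5, 1.5)
print(nearest_neighbor(new_shop, cafes))            # Euclidean: (2.0, 1.0)

# The same routine under a different notion of "close" (Manhattan distance):
manhattan = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))
print(nearest_neighbor(new_shop, cafes, manhattan))
```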

Now, a team of computer scientists has come up with a radically new way of solving nearest neighbor problems. In a pair of papers, five computer scientists have elaborated the first general-purpose method of solving nearest neighbor questions for complex data.


Original Submission

posted by martyb on Wednesday August 15 2018, @11:25AM   Printer-friendly
from the Your-honor,-there-was-a-tree-branch-blocking-the-sign! dept.

Utilizing FOIA requests and some clever software, Mr. Chapman quickly identifies a trouble spot for parking in Chicago and gets results!

http://mchap.io/using-foia-data-and-unix-to-halve-major-source-of-parking-tickets.html

The story relates how the author used Freedom of Information Act requests to gather raw data on parking tickets issued in Chicago. What he received was a semicolon-delimited text file containing a great number of data entry errors. The author outlines the steps taken to clean the data and extract records for a likely problematic parking location. Armed with this data, he visited the location and discovered very confusing signage. He reported this to the city, which rectified the signage. This led to a 50 percent decrease in the number of tickets issued at that location.
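
The linked write-up does the work with standard Unix command-line tools; a rough Python equivalent of the central step, tallying tickets per location from the semicolon-delimited dump, might look like the sketch below. The file name and column name are hypothetical, since the data set's actual schema isn't reproduced in this summary.

```python
# Rough Python equivalent of the article's Unix pipeline: tally tickets per
# address from a semicolon-delimited FOIA dump and surface the worst spots.
# "chicago_parking_tickets.csv" and "violation_address" are hypothetical names.
import csv
from collections import Counter

counts = Counter()
with open("chicago_parking_tickets.csv", newline="", encoding="utf-8",
          errors="replace") as f:
    for row in csv.DictReader(f, delimiter=";"):
        address = (row.get("violation_address") or "").strip().upper()
        if address:                      # skip rows mangled by data-entry errors
            counts[address] += 1

for address, n in counts.most_common(10):
    print(f"{n:>7}  {address}")
```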

I immediately asked myself three things:

1. How much more effective has that corner become?
2. Who's grumbling about the loss of revenue?
3. What would happen if more of us did this very thing?


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @09:48AM   Printer-friendly
from the hello-DefCon dept.

Submitted via IRC for BoyceMagooglyMonkey

Research funded by the Department of Homeland Security has found a "slew" of vulnerabilities in mobile devices offered by the four major U.S. cell phone carriers, including loopholes that may allow a hacker to gain access to a user's data, emails, and text messages without the owner's knowledge.

The flaws allow a user "to escalate privileges and take over the device," Vincent Sritapan, a program manager at the Department of Homeland Security's Science and Technology Directorate, told Fifth Domain during the Black Hat conference in Las Vegas.

The vulnerabilities are built into devices before a customer purchases the phone. Researchers said it is not clear if hackers have exploited the loophole yet.

Department of Homeland Security officials declined to say which manufacturers have the underlying vulnerabilities.

Millions of users in the U.S. are likely at risk, a source familiar with the research said, although the total number is not clear.

Because of the size of the market, it is likely that government officials are also at risk. The vulnerabilities are not limited to the U.S.

Researchers are expected to announce more details about the flaws later in the week.

Source: https://www.fifthdomain.com/show-reporters/black-hat/2018/08/07/manufacturing-bugs-allow-millions-of-phones-to-be-taken-over-dhs-project-to-announce/


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @08:16AM   Printer-friendly
from the tick-tock-tick-zap dept.

Submitted via IRC for SoyCow1984

Life-saving pacemakers manufactured by Medtronic don't rely on encryption to safeguard firmware updates, a failing that makes it possible for hackers to remotely install malicious wares that threaten patients' lives, security researchers said Thursday.

At the Black Hat security conference in Las Vegas, researchers Billy Rios and Jonathan Butts said they first alerted medical device maker Medtronic to the hacking vulnerabilities in January 2017. So far, they said, the proof-of-concept attacks they developed still work. The duo on Thursday demonstrated one hack that compromised a CareLink 2090 programmer, a device doctors use to control pacemakers after they're implanted in patients.

Because updates for the programmer aren't delivered over an encrypted HTTPS connection and firmware isn't digitally signed, the researchers were able to force it to run malicious firmware that would be hard for most doctors to detect. From there, the researchers said, the compromised machine could cause implanted pacemakers to make life-threatening changes in therapies, such as increasing the number of shocks delivered to patients.
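
The two absent controls the researchers describe, transport encryption and digitally signed firmware, are standard defenses. Below is a minimal sketch of the signature-check half using the Python cryptography package; the keys are generated inline purely for illustration, and nothing here represents Medtronic's actual update mechanism.

```python
# Minimal sketch of verifying a digitally signed firmware image before
# installing it. Keys are generated inline purely for demonstration; a real
# device would ship with the vendor's public key embedded in it.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: sign the firmware image with the private key.
vendor_private_key = Ed25519PrivateKey.generate()
firmware_image = b"\x7fELF...firmware bytes..."        # placeholder payload
signature = vendor_private_key.sign(firmware_image)

# Device side: verify with the embedded public key before flashing.
vendor_public_key = vendor_private_key.public_key()

def install_firmware(image: bytes, sig: bytes) -> None:
    try:
        vendor_public_key.verify(sig, image)           # raises on any tampering
    except InvalidSignature:
        raise RuntimeError("Refusing to install: firmware signature invalid")
    print("Signature OK; proceeding with update")      # flashing would happen here

install_firmware(firmware_image, signature)            # succeeds
try:
    install_firmware(firmware_image + b"\x00", signature)   # tampered image
except RuntimeError as err:
    print(err)
```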

Source: https://arstechnica.com/information-technology/2018/08/lack-of-encryption-makes-hacks-on-life-saving-pacemakers-shockingly-easy/

Related: A Doctor Trying to Save Medical Devices from Hackers
Security Researcher Hacks Her Own Pacemaker
Updated: University of Michigan Says Flaws That MedSec Reported Aren't That Serious
Fatal Flaws in Ten Pacemakers Make for Denial of Life Attacks
After Lawsuits and Denial, Pacemaker Vendor Finally Admits its Product is Hackable
8,000 Vulnerabilities Found in Software to Manage Cardiac Devices
465,000 US Patients Told That Their Pacemaker Needs a Firmware Upgrade
Abbott Addresses Life-Threatening Flaw in a Half-Million Pacemakers


Original Submission

posted by martyb on Wednesday August 15 2018, @06:42AM   Printer-friendly
from the building-up-to-it dept.

Home Depot's Sales Rebound Muted by Inflation in Fuel and Lumber

Home Depot Inc.'s sales rebounded last quarter as Americans took on more remodeling projects, but rising costs for lumber and transportation are weighing on profitability.

[...] Home Depot and its smaller rival Lowe's Cos. are often seen as proxies for the health of the housing sector because property owners spend more on their homes when they believe values are rising. But for several quarters there's been increasing concern that years of robust home-price gains are cooling. For its part, Home Depot has continually said that a shortage of available homes in many markets would actually underpin higher home-improvement spending.

[...] Even as the overall housing market looks to be cooling, several trends are driving demand for home-improvement products. A shortage of available listings has slowed property purchases, causing some owners to opt for sprucing up their homes instead. Additionally, more people are staying longer in their homes, which also supports the uptick.

The labor market also plays a role: A strong run of hiring, coupled with moderate wage growth, has boosted Americans' wherewithal to spend money on fixing up their homes. Spending on home improvement -- which accounts for about 38 percent of private residential construction outlays -- surged 13.8 percent in June from a year earlier to reach $221 billion, according to Commerce Department data. Going forward, the job market may continue to propel housing and remodeling demand. But potential hurdles include a pickup in mortgage rates, a shortage of skilled workers for building and remodeling projects, and rising costs for construction materials such as lumber, which is affected by tariffs.

Also at CNN and CNBC.


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @05:05AM   Printer-friendly
from the his-heart-grew-three-sizes-that-day dept.

Many studies have attempted to identify a single transcription factor that can induce formation of the mesoderm, an early layer in embryonic development, without help from other cellular proteins. None have been successful, until now.

In a new study published in Cell Stem Cell, titled "Tbx6 Induces Nascent Mesoderm from Pluripotent Stem Cells and Temporally Controls Cardiac versus Somite Lineage Diversification," a research team, including experts from the University of Tsukuba, screened over 50 transcription factors and found that Tbx6 alone was able to stimulate mesoderm formation in laboratory-grown stem cells, and could cause those stem cells to become cardiovascular or musculoskeletal cells.

[...] In the study, temporary production of Tbx6 caused the formation of mesoderm that later produced cardiovascular cells, while continuous Tbx6 expression suppressed this cardiovascular-forming mesoderm and caused formation of mesoderm that later produced musculoskeletal cells.

"Our analyses revealed a connection between early Tbx6 expression and cardiovascular lineage differentiation, and we believe that our study and similar studies may change the current view of lineage specification during development," Dr. Ieda explains. "Importantly, this essential and unappreciated function of Tbx6 in mesoderm and cardiovascular specification is conserved from lower organisms to mammals, so this discovery may have wide-ranging applicability in regenerative medicine."

Tbx6 Induces Nascent Mesoderm from Pluripotent Stem Cells and Temporally Controls Cardiac versus Somite Lineage Diversification (DOI: 10.1016/j.stem.2018.07.001) (DX)


Original Submission

posted by mrpg on Wednesday August 15 2018, @03:33AM   Printer-friendly
from the nemure-akira dept.

Submitted via IRC for cmn32480

NASA's Opportunity rover has had an incredible career already, spending years upon years studying the Martian surface and proving to be an incredibly reliable and hardy piece of hardware. Unfortunately, a Martian dust storm that began kicking up in May may have abruptly ended its historic run.

In mid-June, the solar-powered Opportunity ran out of juice and was forced to go into its dormant standby mode. The dust storm, which swallowed the entirety of Mars, had blocked out the Sun, cutting the rover off from its only available source of power. NASA engineers had remained optimistic that the rover would wake back up when the skies began to clear, but things aren't looking good thus far.

[...] That's...not great news. NASA knew that the rover would be forced to sit dormant for a while because of the intensity of the storm, but that was several weeks ago. The dust has since begun to settle, and enough light should be pushing its way down to the surface to begin recharging Opportunity's batteries once again.

Source: NASA's Opportunity rover still hasn't woken up from a Mars dust storm, and engineers are getting nervous


Original Submission

posted by mrpg on Wednesday August 15 2018, @02:02AM   Printer-friendly
from the [sigh] dept.

Intel's SGX blown wide open by, you guessed it, a speculative execution attack

Another day, another speculative execution-based attack. Data protected by Intel's SGX—data that's meant to be protected even from a malicious or hacked kernel—can be read by an attacker thanks to leaks enabled by speculative execution.

Since publication of the Spectre and Meltdown attacks in January this year, security researchers have been taking a close look at speculative execution and the implications it has for security. All high-speed processors today perform speculative execution: they assume certain things (a register will contain a particular value, a branch will go a particular way) and perform calculations on the basis of those assumptions. It's an important design feature of these chips that's essential to their performance, and it has been for 20 years.

[...] What's in store today? A new Meltdown-inspired attack on Intel's SGX, given the name Foreshadow by the researchers who found it. Two groups of researchers found the vulnerability independently: a team from KU Leuven in Belgium reported it to Intel in early January—just before Meltdown and Spectre went public—and a second team from the University of Michigan, University of Adelaide, and Technion reported it three weeks later.

SGX, standing for Software Guard eXtensions, is a new feature that Intel introduced with its Skylake processors that enables the creation of Trusted Execution Environments (TEEs). TEEs are secure environments where both the code and the data the code works with are protected to ensure their confidentiality (nothing else on the system can spy on them) and integrity (any tampering with the code or data can be detected). SGX is used to create what are called enclaves: secure blocks of memory containing code and data. The contents of an enclave are transparently encrypted every time they're written to RAM and decrypted on being read. The processor governs access to the enclave memory: any attempt to access the enclave's memory from outside the enclave should be blocked.

[...] As with many of the other speculative execution issues, a large part of the fix comes in the form of microcode updates, and in this case, the microcode updates have already been released and in the wild for some weeks. With the updated microcode, every time the processor leaves execution of an enclave, it also flushes the level 1 cache. With no data in the level 1 cache, there's no scope for the L1TF attack to take effect. Similarly, with the new microcode, leaving System Management Mode flushes the level 1 cache, protecting SMM data.

Also at Engadget and Wired.


Original Submission

posted by Fnord666 on Wednesday August 15 2018, @12:21AM   Printer-friendly
from the just-getting-to-know-you dept.

Submitted via IRC for SoyCow1984

Students are suing a major college admissions test maker for allegedly selling information about their disability statuses to universities, which they say could hurt their chances of getting into schools and impact the rest of their lives.

When students register to take the ACT—a standardized test used for college admissions taken by more than a million high schoolers each year—they answer a barrage of personal questions. As part of this, they are asked to note if they have disabilities that require "special provisions from the educational institution."

The ACT, which is administered by ACT, Inc., is the only real competitor to the College Board's SAT exam. The lawsuit claims that the ACT is selling the data it gleans from those student questionnaires—connected directly to students' individual identities—to colleges, which then use it to make important decisions about admissions and financial aid.

"A lot of students and parents have no idea how these testing agencies, which are gatekeepers to college, are using very sensitive and confidential data in the college admissions process," Jesse Creed, one of the plaintiffs' lawyers, told me in a phone call. "[Colleges are] hungry for disability data, because they have limited resources, and it's expensive to educate people with disabilities."

Source: https://motherboard.vice.com/en_us/article/43pbep/lawsuit-claims-the-act-sells-students-disability-data-to-colleges


Original Submission