
SoylentNews is people





posted by martyb on Saturday September 23 2017, @11:14PM   Printer-friendly
from the license?-we-don't-need-no-stinkin'-license! dept.

Uber will lose its license to operate inside London. The issue may be only a temporary setback since the license expires on September 30th and Uber can continue to operate in London while appealing the decision:

London's transportation agency dealt a major blow to Uber on Friday, declining to renew the ride-hailing service's license to operate in its largest European market. [...] "Uber's approach and conduct demonstrate a lack of corporate responsibility in relation to a number of issues which have potential public safety and security implications," the agency, Transport for London, said in a statement.

[...] In issuing its decision, Transport for London, which is responsible for the city's subways and buses as well as regulating its taxicabs, declared that Uber was not "fit and proper" to operate in the city — a designation that carries significant weight in Britain. "Fit and proper" is a benchmark applied across different sectors of business and the charitable organizations in the country to ensure that people or organizations meet the requirements of their industry or specialty. Tests typically assess factors like an individual or company's honesty, transparency and competence, though there is no formal exam. In Uber's case, Transport for London said it examined issues of how it dealt with serious criminal offenses, how it conducted background checks on drivers and its justification for a software program called Greyball that "could be used to block regulatory bodies from gaining full access to the app."

Opinion: London's Uber Ban Is a Big Brexit Mistake


Original Submission

posted by martyb on Saturday September 23 2017, @08:53PM   Printer-friendly
from the best-of-luck-to-them dept.

Scientists have engineered a "tri-specific antibody" that they say can attack 99% of HIV strains:

Scientists have engineered an antibody that attacks 99% of HIV strains and can prevent infection in primates. It is built to attack three critical parts of the virus - making it harder for HIV to resist its effects.

The work is a collaboration between the US National Institutes of Health and the pharmaceutical company Sanofi. The International Aids Society said it was an "exciting breakthrough". Human trials will start in 2018 to see if it can prevent or treat infection.

Trispecific broadly neutralizing HIV antibodies mediate potent SHIV protection in macaques (DOI: 10.1126/science.aan8630) (DX)

The development of an effective AIDS vaccine has been challenging due to viral genetic diversity and the difficulty in generating broadly neutralizing antibodies (bnAbs). Here, we engineered trispecific antibodies (Abs) that allow a single molecule to interact with three independent HIV-1 envelope determinants: 1) the CD4 binding site, 2) the membrane proximal external region (MPER) and 3) the V1V2 glycan site. Trispecific Abs exhibited higher potency and breadth than any previously described single bnAb, showed pharmacokinetics similar to human bnAbs, and conferred complete immunity against a mixture of SHIVs in non-human primates (NHP) in contrast to single bnAbs. Trispecific Abs thus constitute a platform to engage multiple therapeutic targets through a single protein, and could be applicable for diverse diseases, including infections, cancer and autoimmunity.


Original Submission

posted by martyb on Saturday September 23 2017, @06:31PM   Printer-friendly
from the let-me-think-about-that-one dept.

A new study of a Neanderthal child's skeleton has suggested that Neanderthal brains developed more slowly than previous studies had indicated:

A new study shows that Neanderthal brains developed more slowly than ours. An analysis of a Neanderthal child's skeleton suggests that its brain was still developing at a time when the brains of modern human children are fully formed. This is further evidence that this now extinct human was not more brutish and primitive than our species. The research has been published in the journal Science.

Until now it had been thought that we were the only species whose brains develop slowly. Unlike other apes and more primitive humans, modern humans have an extended period of childhood lasting several years. This is because it takes time and energy to develop our large brain. Previous studies of Neanderthal remains indicated that they developed more quickly than modern humans - suggesting that their brains might be less sophisticated.

But a team led by Prof Antonio Rosas of the Museum of Natural Sciences in Madrid found that if anything, Neanderthal brains may develop more slowly than ours. "It was a surprise," he told BBC News. "When we started the study we were expecting something similar to the previous studies."

Also at Science Magazine, NYT, and Discover Magazine.

The growth pattern of Neandertals, reconstructed from a juvenile skeleton from El Sidrón (Spain) (open, DOI: 10.1126/science.aan6463) (DX)


Original Submission

posted by martyb on Saturday September 23 2017, @04:10PM   Printer-friendly
from the information-wants-to-be-free? dept.

Adobe is showing that it can be transparent about its security practices:

Having some transparency about security problems with software is great, but Adobe's Product Security Incident Response Team (PSIRT) took that transparency a little too far today when a member of the team posted the PGP keys for PSIRT's e-mail account—both the public and the private keys. The keys have since been taken down, and a new public key has been posted in its stead.

The faux pas was spotted at 1:49pm ET by security researcher Juho Nurminen:

Oh shit Adobe pic.twitter.com/7rDL3LWVVz
— Juho Nurminen (@jupenur) September 22, 2017

Nurminen was able to confirm that the key was associated with the psirt@adobe.com e-mail account.

Also at The Register and Wccftech.
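[A leak like this is easy to catch before publishing. As a minimal sketch (not Adobe's actual process; the sample strings below are placeholders), a publishing step can refuse any ASCII-armored export that contains a private key block:

```python
# Pre-publication check: reject an armored PGP export if it contains
# secret key material. The armor header line is standardized, so a
# plain substring test is enough for ASCII-armored exports.
PRIVATE_MARKER = "BEGIN PGP PRIVATE KEY BLOCK"

def safe_to_publish(armored_text: str) -> bool:
    """Return False if the export contains a private key block."""
    return PRIVATE_MARKER not in armored_text

leaked = "-----BEGIN PGP PRIVATE KEY BLOCK-----\n..."
public_only = "-----BEGIN PGP PUBLIC KEY BLOCK-----\n..."
print(safe_to_publish(leaked))       # False
print(safe_to_publish(public_only))  # True
```

This only catches armored text; a binary keyring export would need a real parser such as `gpg --list-packets`. --Ed.]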

[How many here have done something like this? Perhaps an extra file accidentally uploaded to GitHub? --Ed.]


Original Submission

posted by martyb on Saturday September 23 2017, @01:48PM   Printer-friendly
from the innovations-in-financial-engineering dept.

SolarCity, a company Tesla acquired in Nov. 2016, has agreed to pay $29.5 million to resolve allegations that it lied to the government by submitting inflated claims to cash in on a solar stimulus program, the Department of Justice announced Friday.

SolarCity has agreed to drop charges it had against the US government as part of the settlement, which is not an admission of guilt. The settlement is a SolarCity obligation, a SolarCity representative told Business Insider.

The investigation centered on a program — Section 1603 — set up under the Obama administration that was meant to encourage solar adoption by subsidizing installation costs. The program allowed solar companies to receive a federal grant that was equal to 30% of the cost to install or acquire solar systems.

The Justice Department alleges that SolarCity made thousands of claims that overstated the costs of its installations, allowing it to receive inflated grant payments. It has been probing SolarCity and several other solar companies since 2012.

Source: Business Insider

SolarCity was founded in 2006 by brothers Peter and Lyndon Rive, based on a suggestion for a solar company concept by their cousin, Elon Musk, who is the chairman and helped start the company. The Rive brothers left SolarCity earlier this year.


Original Submission

posted by martyb on Saturday September 23 2017, @11:27AM   Printer-friendly
from the on-the-rocks dept.

More ice has been found to exist in permanently shadowed craters and terrain on Mercury's surface:

The study, published in Geophysical Research Letters [DOI: 10.1002/2017GL074723] [DX], adds three new members to the list of craters near Mercury's north pole that appear to harbor large surface ice deposits. But in addition to those large deposits, the research also shows evidence of smaller-scale deposits scattered around Mercury's north pole, both inside craters and in shadowed terrain between craters. Those deposits may be small, but they could add up to a lot more previously unaccounted-for ice.

"The assumption has been that surface ice on Mercury exists predominantly in large craters, but we show evidence for these smaller-scale deposits as well," said Ariel Deutsch, the study's lead author and a Ph.D. candidate at Brown. "Adding these small-scale deposits to the large deposits within craters adds significantly to the surface ice inventory on Mercury."

[...] To seek further evidence that such smaller-scale deposits exist, the researchers looked through the altimeter data in search of patches that were smaller than the big crater-based deposits, but still large enough to resolve with the altimeter. They found four, each with diameters of less than about 5 kilometers. "These four were just the ones we could resolve with the MESSENGER instruments," Deutsch said. "We think there are probably many, many more of these, ranging in size from a kilometer down to a few centimeters."

A Mercury Colony?

Also at the American Geophysical Union.


Original Submission

posted by mrpg on Saturday September 23 2017, @09:06AM   Printer-friendly
from the woosh dept.

China is once again operating the world's fastest train service after a speed cap was lifted:

China increased the maximum speed of bullet trains on the Shanghai-Beijing line to 350 kilometers per hour yesterday, six years after a fatal accident led to a speed cap. The limit was reduced to 300kph after 40 people died in a high-speed train crash near Wenzhou, east China's Zhejiang Province, in July 2011.

The decision to increase the speed means that China once again has the world's fastest train service. The new limit cuts the time of the 1,318-kilometer journey between Shanghai and the capital to four hours and 28 minutes, saving passengers nearly an hour. A total of 14 trains a day will run between the two cities at the higher speed.
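A quick back-of-the-envelope check, using only the figures quoted above, shows why the schedule's average speed sits well below the 350 km/h cap: trains spend part of the journey accelerating, braking and stopping.

```python
# Average speed implied by the new Shanghai-Beijing schedule:
# 1,318 km in 4 hours 28 minutes.
distance_km = 1318
journey_h = 4 + 28 / 60

avg_speed = distance_km / journey_h
print(f"average speed: {avg_speed:.0f} km/h")  # average speed: 295 km/h
```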

Also at Xinhua and NextBigFuture.


Original Submission

posted by mrpg on Saturday September 23 2017, @06:34AM   Printer-friendly
from the I-like-counterfeit dept.

Researchers at The University of Manchester have developed the world's first handheld SORS device that can detect fake spirits, such as vodka and whisky, whilst still in their bottles.

SORS, or 'spatially offset Raman spectroscopy', devices give highly accurate chemical analysis of objects and contents beneath concealing surfaces, such as glass bottles. It works by using 'an optical approach' where lasers are directed through the glass, enabling the isolation of chemically-rich information that is held within the spirits.

Such devices are already commercially available but are usually used for security and hazmat detection, screening and pharmaceutical analysis. This latest version, developed at the University's School of Chemistry in the Manchester Institute of Biotechnology (MIB), marks the first time such a handheld tool has been used for a food or beverage product. The research has been published in Nature today (21st September).

Spirit drinks are the EU's biggest agri-food export, with EU governments' revenues of at least €23 billion in excise duties and VAT, and approximately 1 million jobs linked to the production, distribution and sale of spirit drinks.

Bah, I make my own.


Original Submission

posted by takyon on Saturday September 23 2017, @04:13AM   Printer-friendly
from the wishful-thinking dept.

From the lowRISC blog:

We are looking for a talented hardware engineer to join the lowRISC team and help make our vision for an open source, secure, and flexible SoC a reality. Apply now!

lowRISC C.I.C. is a not-for-profit company that aims to demonstrate, promote and support the use of open-source hardware. The lowRISC project was established in 2014 with the aim of bringing the benefits of open-source to the hardware world. It is working to do this by producing a high quality, secure, open, and flexible System-on-Chip (SoC) platform. lowRISC C.I.C. also provides hardware and software services to support the growing RISC-V ecosystem. Our expertise includes the LLVM Compiler, hardware security extensions and RISC-V tools, hardware and processor design.

[...] lowRISC is an ambitious project with a small core team, so you will be heavily involved in the project's development direction. This role will involve frequent work with external contributors and collaborators. While much of the work will be at the hardware level, the post will offer experience of the full hardware/software stack, higher-level simulation tools and architectural design issues.

Some practical experience of hardware design with a HDL such as Verilog/SystemVerilog is essential, as is a good knowledge of the HW/SW stack. Ideally, candidates will also have experience or demonstrated interest in some of: SoC design, large-scale open source development, hardware or software security, technical documentation, board support package development and driver development. Industrial experience and higher degree levels are valued, but we would be happy to consider an enthusiastic recent graduate with a strong academic record.

Informal enquires should be made to Alex Bradbury asb@lowrisc.org.

takyon (thanks to an AC): lowRISC is a project to create a "fully open-sourced, Linux-capable, system-on-a-chip"; it is based around RISC-V, the "Free and Open RISC Instruction Set Architecture", which is meant to provide an extensible platform that scales from low-level microcontrollers up to highly parallel, high-bandwidth general-purpose supercomputers.

Reduced instruction set computer (RISC).

Previously: RISC-V Projects to Collaborate
LowRISC Announces its 0.4 Milestone Release
SiFive and UltraSoC Partner to Accelerate RISC-V Development Through DesignShare


Original Submission

posted by Fnord666 on Saturday September 23 2017, @01:52AM   Printer-friendly
from the just-no-words dept.

From Wired:

WIRED wants to take you on the deepest dive yet into the science behind the Impossible Burger.

To bite into an Impossible Burger is to bite into a future in which humanity has to somehow feed an exploding population and not further imperil the planet with ever more livestock. Because livestock, and cows in particular, go through unfathomable amounts of food and water (up to 11,000 gallons a year per cow) and take up vast stretches of land. And their gastrointestinal methane emissions aren't doing the fight against global warming any favors either (cattle gas makes up 10 percent of greenhouse gas emissions worldwide).

This is the inside story of the engineering of the Impossible Burger, the fake meat on a mission to change the world with one part soy plant, one part genetically engineered yeast—and one part activism. As it happens, though, you can't raise hell in the food supply without first raising a few eyebrows.

[...] Technicians take genes that code for the soy leghemoglobin protein and insert them into a species of yeast called Pichia pastoris. They then feed the modified yeast sugar and minerals, prompting it to grow and replicate and manufacture heme with a fraction of the footprint of field-grown soy. With this process, Impossible Foods claims it produces a fake burger that uses a 20th of the land required for feeding and raising livestock and uses a quarter of the water, while producing an eighth of the greenhouse gases (based on a metric called a life cycle assessment).

Now, engineering a "beef" burger from scratch is of course about more than just heme, which Impossible Foods bills as its essential ingredient. Ground beef features a galaxy of different compounds that interact with each other, transforming as the meat cooks. To piece together a plant-based burger that's indistinguishable from the real thing, you need to identify and recreate as many of those flavors as possible.

To do this, Impossible Foods is using what's known as a gas chromatography mass spectrometry system. This heats a sample of beef, releasing aromas that bind to a piece of fiber. The machine then isolates and identifies the individual compounds responsible for those aromas. "So we will now have kind of a fingerprint of every single aroma that is in beef," says Celeste Holz-Schietinger, principal scientist at Impossible Foods. "Then we can say, How close is the Impossible Burger? Where can we make improvements and iterate to identify how to make each of those particular flavor compounds?"
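The fingerprint comparison Holz-Schietinger describes can be illustrated with a toy calculation. This is not Impossible Foods' actual pipeline: the compound intensities below are invented, and cosine similarity is just one plausible way to score how close two aroma profiles are.

```python
# Toy aroma-fingerprint comparison: each profile maps a volatile
# compound (as identified by GC-MS) to a relative intensity, and the
# score is the cosine similarity between the two intensity vectors.
import math

def cosine_similarity(a: dict, b: dict) -> float:
    compounds = set(a) | set(b)
    dot = sum(a.get(c, 0.0) * b.get(c, 0.0) for c in compounds)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# Hypothetical profiles; the compound names are real beef aroma
# volatiles, but the numbers are made up for illustration.
beef = {"2-methylfuran": 0.8, "methional": 0.6, "1-octen-3-ol": 0.3}
plant_patty = {"2-methylfuran": 0.5, "methional": 0.4, "hexanal": 0.4}
print(f"similarity: {cosine_similarity(beef, plant_patty):.2f}")
```

A score near 1.0 would mean the two profiles are nearly indistinguishable; compounds present in one profile but not the other pull the score down, pointing at the flavors still to be recreated.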

This sort of deconstruction is common in food science, a way to understand exactly how different compounds produce different flavors and aromas. "In theory, if you knew everything that was there in the right proportions, you could recreate from the chemicals themselves that specific flavor or fragrance," says Staci Simonich, a chemist at Oregon State University.


Original Submission

posted by martyb on Saturday September 23 2017, @12:29AM   Printer-friendly
from the head-for-the-hills dept.

As if the onslaught of hurricanes Irma and Maria were not enough, the National Weather Service in San Juan is reporting that a major dam is failing in Puerto Rico and that 70,000 people are being evacuated by bus. From CBS:

The National Weather Service in San Juan said Friday that the northwestern municipalities of Isabela and Quebradillas, home to some 70,000 people, were being evacuated with buses because the nearby Guajataca Dam was failing after Hurricane Maria hit the U.S. territory.

Maria poured more than 15 inches of rain on the mountains surrounding the dam, swelling the reservoir behind it.

Details remained slim about the evacuation with communications hampered after the storm, but operators of the dam reported that the failure was causing flash-flooding downstream. The 345-yard dam holds back a man-made lake covering about 2 square miles and was built decades ago, U.S. government records show.

"Move to higher ground now," the weather service said in a statement. "This is an extremely dangerous and life-threatening situation. Do not attempt to travel unless you are fleeing an area subject to flooding or under an evacuation order."

"Act quickly to protect your life," it added. "Buses will be evacuating people from these areas."

Wikipedia has a page about the Guajataca Dam.

NWS report on Twitter; also at Al Jazeera and BBC.


Original Submission

posted by Fnord666 on Friday September 22 2017, @11:39PM   Printer-friendly
from the cars-aren't-intelligent,-they-only-*think*-they-are dept.

According to CNBC, Tesla is teaming up with AMD to develop a custom chip optimized for AI, to be used for self-driving features in Tesla cars. The head of Tesla's "Autopilot" team is Jim Keller, formerly of AMD and Apple, who helped design the A4 and A5 chips while working at Apple and was lead architect on the Athlon 64 at AMD.

Also at Engadget, TechCrunch, and Business Insider

GlobalFoundries, which fabricates chips for Advanced Micro Devices Inc, said on Thursday that Tesla had not committed to working with it on any autonomous driving technology or product, contradicting an earlier media report. [...] The spokesperson for GlobalFoundries said that Jha’s comments at the GlobalFoundries Technology Conference were not reported accurately.


Original Submission

posted by Fnord666 on Friday September 22 2017, @10:06PM   Printer-friendly
from the what-about-the-Greeks? dept.

India's contributions to mathematics:

It should come as no surprise that the first recorded use of the number zero, recently discovered to be made as early as the 3rd or 4th century, happened in India. Mathematics on the Indian subcontinent has a rich history going back over 3,000 years and thrived for centuries before similar advances were made in Europe, with its influence meanwhile spreading to China and the Middle East.

As well as giving us the concept of zero, Indian mathematicians made seminal contributions to the study of trigonometry, algebra, arithmetic and negative numbers among other areas. Perhaps most significantly, the decimal system that we still employ worldwide today was first seen in India.

With such a significant technical lead, how did they fall behind?


Original Submission

posted by Fnord666 on Friday September 22 2017, @08:32PM   Printer-friendly
from the should-have-drafted-Demi-Moore dept.

The U.S. Marine Corps will soon have its first female infantry officer. The unnamed lieutenant is expected to lead an infantry platoon of about 40 marines:

The Marine Corps is set to have its first female infantry officer, a milestone in its nearly 250-year-long history.

The lieutenant is scheduled to graduate with her all-male peers on Monday after she completed all of the graduation requirements in the service's grueling 13-week Infantry Officer Course, the Corps said. Her completion of the course was first reported by The Washington Post. The officer's name was not made public.

The course was opened to women in 2012 on an experimental basis. More than 30 women attempted it, but when none passed, the course was once again closed to women in the spring of 2015. After the Pentagon opened all military jobs to women, four additional women tried the course without success.

Also at The Hill.


Original Submission

posted by martyb on Friday September 22 2017, @06:59PM   Printer-friendly
from the should-I-stay-or-should-I-go-now? dept.

I knew this day would eventually come. We had been warned that Firefox 57 would force some significant changes on us users, including the removal of support for extensions that did not conform to the WebExtensions model, along with the introduction of the new Photon user interface appearance.

Although I have always only wanted to run the stable releases, long ago I had been forced to run the Developer Edition of Firefox just so I could easily use some extensions I had written on my own. Now Firefox was showing me that an update to Firefox 57.0b1 was available. Should I do it? Should I install this update? I debated with myself for several minutes. But in the end I knew I would have no choice. I would at some point have to update to Firefox 57 if I wanted to keep receiving security fixes and other important updates. So I did it. I upgraded to Firefox Developer Edition 57.0b1.

The update itself was uneventful. It installed as past updates have, and I restarted my browser to start using the new version. The first thing I noticed are the user interface changes. My initial reaction was that I had accidentally started my Vivaldi browser installation instead of my Firefox Developer Edition installation. A quick check of the About dialog did confirm that I was in fact using Firefox, and not Vivaldi.

There's not much to say about the Photon user interface. While Australis-era Firefox looked almost identical to Chrome to me, Photon-era Firefox looks like Vivaldi to me. I couldn't see any improvements, however. The menu shown after clicking the three line toolbar button may have had its appearance changed to be more like a traditional menu, but it is still muddled and much too busy to be useful. I didn't notice any increase in the responsiveness of the user interface. It still feels slower to me than Chrome's user interface.

This would be a good time to talk about the overall performance of the browser. I can't perceive any improvement. I don't think it's worse than it was, but I also don't think that it's any better. From what I can see, pages aren't loading any faster. Changing between tabs doesn't feel any faster to me. Scrolling through loaded pages isn't any smoother. Chrome still feels snappier. If there were improvements on the performance front, I'm not seeing them.

Now it's time to talk about extensions. Although I was expecting breakage, it's still a painful feeling to see many of your favorite extensions labeled as "legacy" and no longer working. While a small number of my installed extensions already supported Firefox 57, there were others where I had to visit the developers' websites and download special dev or pre-release versions. In other cases I wasn't so lucky. Sometimes the developers had given up on supporting Firefox 57, and openly acknowledged that they wouldn't be making any further updates to the extensions. I had to find alternatives. Sometimes there were alternatives, but in at least one of the cases the alternative was much less capable than the extension I had been using. I spent well over an hour just trying to get the third-party extensions I use back to a state similar to how they had been when I'd been using Firefox 56.

Then there are my own personal extensions. I had written these over a number of years, and had been using them with Firefox for quite some time. But now they were deemed "legacy" and they no longer could be used now that I was running Firefox 57. I started to read up about what it would take to convert them to be WebExtensions compatible, and I soon learned that it would not be a trivial task. I will need to set aside a sizable chunk of time to get these ported over.
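For anyone facing the same port, the WebExtensions target looks roughly like the manifest below. This is a minimal sketch with placeholder names and permissions, not a drop-in conversion of any particular legacy add-on; the old XUL/XPCOM code has to be rewritten against the `browser.*` APIs that the manifest's permissions unlock.

```json
{
  "manifest_version": 2,
  "name": "my-personal-extension",
  "version": "1.0",
  "description": "Placeholder port of a legacy add-on",
  "permissions": ["tabs", "storage"],
  "background": {
    "scripts": ["background.js"]
  },
  "browser_action": {
    "default_title": "My Extension"
  }
}
```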

I've been using Firefox for a long time. I've experienced its highs, and I've experienced its many lows. Of these lows, I think that Firefox 57 is perhaps the lowest of them yet. Many of the extensions I have used for years no longer work. I will need to put in much time and effort to convert extensions I had written for my own personal use. I will need to learn to use its new user interface. But worst of all, I do not see any improvements or benefits. I don't think it performs any better now than it did in the past.

I feel particularly sorry for the Firefox users who aren't as technical as I'm lucky to be. They might not fully understand the implications of Firefox 57 when it comes time for them to eventually upgrade. They likely won't be able to deal with the many broken extensions. They too will need to learn a new user interface that doesn't really provide anything in the way of improvement. As bad as I found the experience of upgrading to Firefox 57 to be, I fear that these average users without a technical background will find it even more painful.

I'm now in a bind. I don't want to use one of the pre-57 ESR releases of Firefox, because I'll eventually end up in the same position that I am in today. I will have to rewrite my extensions either now or later. But since doing that will likely make them compatible with Chrome, I must ask myself, is it still worth using Firefox? I ponder: if my extensions will work with both Firefox and Chrome, but I find Chrome to perform much better, why not just use Chrome instead? That may very well be what I do. While some say that Firefox offers more privacy, I am doubtful about this. It has a long and complex privacy policy that talks of sending various data here and there.

I never really seriously considered moving away from Firefox in the past, even as my user experience got worse and worse over time. But I think the time to leave Firefox permanently has finally arrived. Firefox 57 takes away the few remaining advantages that Firefox had for me, namely the ability to run the extensions I had already written for myself.

I think that I should be feeling more sorrow and regret about finally leaving Firefox behind. But I don't feel any of that. In fact, I feel a sense of optimism that I haven't felt in a long time. Chrome, or more likely Chromium, will probably bring me a faster browsing experience than I've become accustomed to while using Firefox. I will have to rework my extensions, but at least they will then work with a better browser platform. They may even work with other browsers like Vivaldi and Brave, as well.

So while Firefox 57 has so far been one of the worst web browser user experiences for me yet, in some ways it may also be the best: it finally gives me a reason to move away from Firefox to an ecosystem that offers me so much more than what Firefox did. It may very well be putting me in a better position than I would have been in had I not tried Firefox 57 and been so disappointed with it.

Should you update to Firefox 57 as soon as it becomes available to you? If I were you, I would be cautious. While it's important to get the latest fixes to try and achieve a safe browsing experience, please be aware of the potential to break extensions, some of which may have no equivalent WebExtensions-compatible replacements. Firefox 57 does include changes that could cause you a lot of problems. My advice would be to prepare before the upgrade, and be ready for your browsing experience to suffer. If you do choose to upgrade to Firefox 57, I sincerely hope that your upgrade goes better than mine has gone.


Original Submission