

posted by CoolHand on Thursday February 11 2016, @10:47PM   Printer-friendly
from the code-on dept.

A link to an interesting article appeared in my e-mail box this morning with the following introduction:

I just watched an interesting discussion between Java and Node supporters on YouTube that got my brain ticking. This article is an explosion of various ideas I have about Java, Node, JavaScript, and explicitly typed and weakly typed languages.

I won't go into any more detail about the ideas that were produced as the result of the "explosion", but I'd be interested in reading about what others have to say. I will start the discussion with the following snippet from the article:

I don't remember the first release of the Netscape browser with JavaScript, but I remember the reasons behind JavaScript's introduction:

  • Provide a simple scripting language "inspired by" Java to control embedded Java applets in web pages.
  • A simple scripting language to control forms.

Later, Netscape introduced a public DOM representation of the page (partial at first), visible to JavaScript, to support page changes driven by user actions beyond forms.

As you can see, the initial motivation was not to create a complete and powerful language to develop web client applications.

History is replete with examples of programming languages that have been bastardized and transmogrified into doing things they were never intended to do originally.

Enjoy!


Original Submission

posted by cmn32480 on Thursday February 11 2016, @09:08PM   Printer-friendly
from the glass-on-steroids dept.

Google will reportedly release a smartphone-assisted virtual reality headset in 2016 and build virtual reality software features into Android rather than rely on an app. The device will use plastic casing, add extra sensors, and include better lenses than those distributed with Google Cardboard:

We've said a few times now that Google's virtual reality initiative is too big for the company to just be working on Google Cardboard, and now The Financial Times has published a report detailing what seems to be the next phase of Google's VR push. The report says that Google is working on "a successor to Cardboard," creating a higher-quality headset and building VR software directly into Android.

The device sounds like a Google version of Samsung's Gear VR. Like Cardboard, the headset will be powered by your existing smartphone, with a "more solid plastic casing" along with better lenses and sensors. Also like Cardboard, this won't be limited to just a handful of devices, with the report saying that the headset "will be compatible with a much broader range of Android devices than Gear VR."

Such a device sounds like it would occupy a compelling spot in the market. The Gear VR is a great device—the $100 headset is a powerful entry-level VR experience—but it only works with Samsung phones. Cardboard has much wider phone compatibility, but it comes with a huge list of compromises that lead to a subpar experience. Taking the Gear VR model and expanding it to accept most popular smartphones sounds like a solid idea.


Original Submission

posted by on Thursday February 11 2016, @07:45PM   Printer-friendly
from the seeing-the-saab-for-the-trees dept.

Remember wood paneled station wagons? Well, wood is back, but this time it's not for aesthetics—it's for reducing vehicle weight with renewable materials. Swedish researchers have produced the world's first model car with a roof and battery made from wood-based carbon fiber.

Although it's built on the scale of a toy, the prototype vehicle represents a giant step towards realizing a vision of new lightweight materials from the forest, one of the benefits of a so-called bioeconomy.

The demo is a joint project of KTH Royal Institute of Technology in Stockholm, the Swedish research institute Innventia and Swerea, a research group for industrial renewal and sustainable development.

The key ingredient in the carbon fiber composite is lignin, a constituent of the cell walls of nearly all plants that grow on dry land. Lignin is the second most abundant natural polymer in the world, surpassed only by cellulose.

Göran Lindbergh, Professor of Chemical Engineering at KTH, says that the use of wood lignin as an electrode material came from previous research he did with Innventia. Lignin batteries can be produced from renewable raw materials, in this case the byproduct from paper pulp production.

"The lightness of the material is especially important for electric cars because then batteries last longer," Lindbergh says. "Lignin-based carbon fiber is cheaper than ordinary carbon fiber. Otherwise batteries made with lignin are indistinguishable from ordinary batteries."

Research along similar lines is being done at Oak Ridge National Laboratory and North Carolina State University.


Original Submission

posted by takyon on Thursday February 11 2016, @06:22PM   Printer-friendly
from the where-is-mccoy-when-you-need-him dept.

Moore's Law is named for Gordon Moore, co-founder of Intel Corporation, who in a 1965 paper famously observed that component densities on integrated circuits double every twelve months. He amended his observation in 1975 to a doubling every 24 months. Since then, the chip industry has borne out Moore's observation/prediction. However, there are still those who claim that Moore's Law is dying, just as many have done before.

However, Peter Bright over at Ars Technica notes a change in focus for the chip industry away from chasing Moore's Law. From the article:

Gordon Moore's observation was not driven by any particular scientific or engineering necessity. It was a reflection on just how things happened to turn out. The silicon chip industry took note and started using it not merely as a descriptive, predictive observation, but as a prescriptive, positive law: a target that the entire industry should hit.

Apparently, the industry isn't going to keep trying to hit that particular target moving forward, as we've seen with the recent delay of Intel's 10nm Cannonlake chips. This is for several reasons:

In the 2000s, it was clear that this geometric scaling was at an end, but various technical measures were devised to keep pace with the Moore's law curves. At 90nm, strained silicon was introduced; at 45nm, new materials to increase the capacitance of each transistor layered on the silicon were introduced. At 22nm, tri-gate transistors maintained the scaling.

But even these new techniques were up against a wall. The photolithography process used to transfer the chip patterns to the silicon wafer has been under considerable pressure: currently, light with a 193 nanometre wavelength is used to create chips with features just 14 nanometres. The oversized light wavelength is not insurmountable but adds extra complexity and cost to the manufacturing process. It has long been hoped that extreme UV (EUV), with a 13.5nm wavelength, will ease this constraint, but production-ready EUV technology has proven difficult to engineer.

Even with EUV, it's unclear just how much further scaling is even possible; at 2nm, transistors would be just 10 atoms wide, and it's unlikely that they'd operate reliably at such a small scale. Even if these problems were resolved, the specter of power usage and dissipation looms large: as the transistors are packed ever tighter, dissipating the energy that they use becomes ever harder.

The new techniques, such as strained silicon and tri-gate transistors, took more than a decade to put in production. EUV has been talked about for longer still. There's also a significant cost factor. There's a kind of undesired counterpart to Moore's law, Rock's law, which observes that the cost of a chip fabrication plant doubles every 4 years. Technology may provide ways to further increase the number of transistors packed into a chip, but the manufacturing facilities to build these chips may be prohibitively expensive—a situation compounded by the growing use of smaller, cheaper processors.
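The figures quoted above are easy to sanity-check. Here is a quick back-of-the-envelope sketch; the ~0.2 nm atomic spacing for silicon is my own assumption, chosen only to illustrate the article's "10 atoms" figure:

```python
# Sanity-checking the doubling periods and the 2nm/10-atom figure
# quoted in the excerpt above.

def doublings(years, period_years):
    """Number of doublings over a span, given a doubling period."""
    return years / period_years

# Moore's law (1975 form): density doubles every 24 months.
growth = 2 ** doublings(10, 2)
print(f"Density growth over 10 years: {growth:.0f}x")     # 32x

# Rock's law: fab cost doubles every 4 years.
fab_cost = 2 ** doublings(10, 4)
print(f"Fab cost growth over 10 years: {fab_cost:.1f}x")  # 5.7x

# With an assumed silicon atomic spacing of roughly 0.2 nm,
# a 2 nm feature is only about 10 atoms across.
print(f"Atoms across 2 nm: {2 / 0.2:.0f}")                # 10
```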

The article goes on to discuss how the industry will focus moving forward:

[More]

These difficulties mean that the Moore's law-driven roadmap is now at an end. ITRS decided in 2014 that its next roadmap would no longer be beholden to Moore's "law," and Nature writes that the next ITRS roadmap, published next month, will instead take a different approach.

Rather than focus on the technology used in the chips, the new roadmap will take an approach it describes as "More than Moore." The growth of smartphones and Internet of Things, for example, means that a diverse array of sensors and low power processors are now of great importance to chip companies. The highly integrated chips used in these devices mean that it's desirable to build processors that aren't just logic and cache, but which also include RAM, power regulation, analog components for GPS, cellular, and Wi-Fi radios, or even microelectromechanical components such as gyroscopes and accelerometers.

So what say you, Soylentils? Is Moore's Law really dead, or is this just another round of hyperbole?

posted by takyon on Thursday February 11 2016, @05:00PM   Printer-friendly
from the commonwealth-navigator dept.

That businessman/reality TV star who just won the New Hampshire primary is far from the only famous person addicted to sharing his current thoughts and mood on Twitter. When you do that, you're bound to eventually make a mistake that has consequences. This time it was Marc Andreessen, venture capitalist and co-founder of Netscape (and lead developer for the Mosaic Web browser before that), who got busted for tweeting a thought that shouldn't have left the hotel bar:

Anti-colonialism has been economically catastrophic for the Indian people for decades. Why stop now?

Indians complained; evidently they've grown accustomed to having their own country. It was noticed that Andreessen sits on the board of Facebook, which has been unsuccessfully trying to peddle free Internet service (featuring Facebook, of course) to India for a while. Oops. Mark Zuckerberg wasn't pleased.

Andreessen, a master of the multi-part tweet, quickly backpedaled. And the original tweet was deleted.

takyon: The Register's Andrew Orlowski has a partial defense of Andreessen's comments that you may find illuminating and/or entertaining.


Original Submission

posted by on Thursday February 11 2016, @04:15PM   Printer-friendly
from the feeling-heavier-then-lighter-today dept.

As expected, gravitational waves, which were predicted by Albert Einstein a century ago, have been detected by the LIGO Collaboration:

Scientists are claiming a stunning discovery in their quest to fully understand gravity. They have observed the warping of space-time generated by the collision of two black holes more than a billion light-years from Earth. The international team says the first detection of these gravitational waves will usher in a new era for astronomy.

It is the culmination of decades of searching and could ultimately offer a window on the Big Bang. The research, by the LIGO Collaboration, has been accepted for publication in the journal Physical Review Letters. The collaboration operates a number of labs around the world that fire lasers through long tunnels, trying to sense ripples in the fabric of space-time. Expected signals are extremely subtle, and disturb the machines, known as interferometers, by just fractions of the width of an atom. But the black hole merger was picked up by two widely separated LIGO facilities in the US.
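To put "fractions of the width of an atom" in perspective, a rough scale calculation helps. The strain figure below is the commonly quoted order of magnitude for LIGO's sensitivity, an assumption on my part rather than a number from this article:

```python
# Order-of-magnitude sketch of how small a LIGO signal is.
strain = 1e-21          # typical quoted strain sensitivity (assumed)
arm_length = 4_000      # metres: length of each LIGO interferometer arm
atom_diameter = 1e-10   # metres: roughly one angstrom

displacement = strain * arm_length
print(f"Arm length change: {displacement:.1e} m")
print(f"Fraction of an atom: {displacement / atom_diameter:.1e}")
# A few hundred-millionths of an atomic diameter.
```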

The historic paper in question: Observation of Gravitational Waves from a Binary Black Hole Merger (open, DOI: 10.1103/PhysRevLett.116.061102)

Archived video of the press conference webcast will be available here.

NASA provided an infographic for their Astronomy Picture of the Day feature with details about the discovery.

Also at NPR, NYT, Scientific American, Ars Technica Live, and The New Yorker. BBC's Jonathan Amos offers an analysis of the discovery.


Original Submission

posted by martyb on Thursday February 11 2016, @03:51PM   Printer-friendly
from the does-the-vehicle-pay-the-tickets,-too? dept.

In a letter to Chris Urmson, the Director of Google's Self-Driving Car Project, the National Highway Traffic Safety Administration (NHTSA) has entertained the possibility of treating the machine intelligence of an autonomous vehicle as a "driver" under federal law. The move could help "streamline the process of putting autonomous vehicles on the road," according to Karl Brauer, an analyst for Kelley Blue Book. From the letter:

As a foundational starting point for the interpretations below, NHTSA will interpret "driver" in the context of Google's described motor vehicle design as referring to the [Self-Driving System (SDS)], and not to any of the vehicle occupants. We agree with Google its [self-driving vehicle (SDV)] will not have a "driver" in the traditional sense that vehicles have had drivers during the last more than one hundred years. The trend toward computer-driven vehicles began with such features as antilock brakes, electronic stability control, and air bags, continuing today with automatic emergency braking, forward crash warning, and lane departure warnings, and continuing on toward vehicles with Google's SDV and potentially beyond. No human occupant of the SDV could meet the definition of "driver" in Section 571.3 given Google's described motor vehicle design - even if it were possible for a human occupant to determine the location of Google's steering control system, and sit "immediately behind" it, that human occupant would not be capable of actually driving the vehicle as described by Google. If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the "driver" as whatever (as opposed to whoever) is doing the driving. In this instance, an item of motor vehicle equipment, the SDS, is actually driving the vehicle.

California's Department of Motor Vehicles recently insisted that self-driving cars have a human driver.

The story was first reported at Reuters. Additional coverage at BBC and MarketWatch.


Original Submission

posted by cmn32480 on Thursday February 11 2016, @02:13PM   Printer-friendly
from the evolution-beats-chemicals dept.

Wild tomatoes are better able to protect themselves against the destructive whitefly than our modern, commercial varieties, new research has shown.

The study, published today in the academic journal Agronomy for Sustainable Development, shows that in our quest for larger, redder, longer-lasting tomatoes we have inadvertently bred out key characteristics that help the plant defend itself against predators.

Led by Newcastle University, UK, the research shows that wild tomatoes have a dual line of defence against these voracious pests: an initial mechanism which discourages the whitefly from settling on the plant in the first place, and a second line of defence inside the plant, where a chemical reaction causes the plant sap to "gum up", blocking the whitefly's feeding tube.

Thomas McDaniel, the PhD student who led the research, says the findings highlight the natural resistance of wild plant varieties and suggests we need to "breed some of that wildness back in" instead of continuously looking for new methods of pest control.

"By selecting for certain characteristics we have inadvertently lost some really useful ones," explains McDaniel, who is based in the School of Biology at Newcastle University.

"The tomatoes we buy in the supermarket may have a long shelf life and be twice as big as the wild varieties but the trade-off is an intensive and costly pest control regime—both biological and in the form of chemical pesticides.

"Our research suggests that if we can breed the whitefly resistant genes back into our commercial varieties then we can produce a super tomato that not only has all the characteristics that we have selected for but is also naturally resistant to the whitefly."

Novel resistance mechanisms of a wild tomato against the glasshouse whitefly

Older article: Control of tomato whiteflies using the confusion effect of plant odours (open, DOI: 10.1007/s13593-014-0219-4)


Original Submission

posted by martyb on Thursday February 11 2016, @12:36PM   Printer-friendly
from the who-does? dept.

Bruce Schneier opines about AT&T's CEO saying tech companies shouldn't have any input into the crypto debate:

My guess is that AT&T is so deep in bed with the NSA and FBI that he's just saying things he believes justif[y] his position.

Ars Technica has a few words about the matter, or you can head over to the The Wall Street Journal's [paywalled] original interview.


Original Submission

posted by martyb on Thursday February 11 2016, @10:54AM   Printer-friendly
from the he-should-demonstrate-by-personal-example dept.

Ars Technica has a good write-up on Director of National Intelligence James Clapper's public comments about the Internet of Things (IoT), and how it may be used.

http://arstechnica.com/tech-policy/2016/02/us-intelligence-chief-says-iot-climate-change-add-to-global-instability/

Considering that many IoT devices are intended for use in the home, will citizens push back against this type of deeply intrusive monitoring?

Will people reject convenience that has the feature to be monitored in real time across many toys, products, and "smart" devices? How long will the data be retained, and can any of the collection be turned off?

What if it becomes a social norm that tampering with an IoT device, because it may contain or transmit something embarrassing, is tantamount to concealing evidence? Could disabling the reporting functionality, by blocking DNS or any other means of keeping the traffic inside the home, count as obstruction of justice?

What will the public let the government do with IoT data to foster better protection of civilian freedom from terror and tyranny?


[Update: James Clapper is the Director of National Intelligence (not the US FBI chief). -Ed.]

Original Submission

posted by martyb on Thursday February 11 2016, @09:19AM   Printer-friendly
from the what-would-YOU-do? dept.

Seven years after the global financial crisis erupted in 2008, the world economy continued to stumble in 2015. According to the United Nations' report World Economic Situation and Prospects 2016, the average growth rate in developed economies has declined by more than 54% since the crisis. An estimated 44 million people are unemployed in developed countries, about 12 million more than in 2007, while inflation has reached its lowest level since the crisis.

More worryingly, advanced countries' growth rates have also become more volatile. This is surprising, because, as developed economies with fully open capital accounts, they should have benefited from the free flow of capital and international risk sharing – and thus experienced little macroeconomic volatility. Furthermore, social transfers, including unemployment benefits, should have allowed households to stabilise their consumption.

[...] Neither monetary policy nor the financial sector is doing what it's supposed to do. It appears that the flood of liquidity has disproportionately gone towards creating financial wealth and inflating asset bubbles, rather than strengthening the real economy. Despite sharp declines in equity prices worldwide, market capitalization as a share of world GDP remains high. The risk of another financial crisis cannot be ignored.

There are other policies that hold out the promise of restoring sustainable and inclusive growth. These begin with rewriting the rules of the market economy to ensure greater equality, more long-term thinking, and reining in the financial market with effective regulation and appropriate incentive structures.

But large increases in public investment in infrastructure, education, and technology will also be needed. These will have to be financed, at least in part, by the imposition of environmental taxes, including carbon taxes, and taxes on the monopoly and other rents that have become pervasive in the market economy – and contribute enormously to inequality and slow growth.

There's likely no certain and simple solution. Witness recent efforts in Europe with major austerity initiatives and in the United States with quantitative easing. If you were the Emperor of the World, what would you do?


Original Submission

posted by martyb on Thursday February 11 2016, @07:45AM   Printer-friendly
from the Pons-Fleischmann dept.

Thursday Feb 11th will likely go down in scientific history as the formal announcement of the widely-leaked and hinted-at first detection of gravitational waves.

The LIGO gravitational wave team is holding a press conference on Thursday at 10:30am EST to announce the first detection of gravitational waves, a result widely expected to lead to Nobel Prizes.

The LIGO team's press release notes:

(Washington, DC) -- Journalists are invited to join the National Science Foundation as it brings together the scientists from Caltech, MIT and the LIGO Scientific Collaboration (LSC) this Thursday at 10:30 a.m. at the National Press Club for a status report on the effort to detect gravitational waves - or ripples in the fabric of spacetime - using the Laser Interferometer Gravitational-wave Observatory (LIGO).

Do any Soylentils have the "secret" URL for the webcast? Please don't do anything stupid on this historic occasion, but it would be cool to watch history being made. It's kind of the physics equivalent of a moon rocket launch. It's very widely leaked that history will be made Thursday morning... wouldn't you like to see it?

Backreaction has everything you need to know about gravitational waves for preparation for the webcast.

It's an exciting time to be alive! On the other hand, if the endless leaks and insinuations are bogus, it's also an exciting time to be pissed off.


Original Submission

posted by martyb on Thursday February 11 2016, @06:08AM   Printer-friendly
from the comparing-apples-to-orchards dept.

Most people understand that investing in the future is important, and that goes for conserving nature and natural resources, too. But in the case of investing in such "natural" assets as groundwater, forests, and fish populations, it can be challenging to measure the return on that investment.

A Yale-led research team has adapted traditional asset valuation approaches to measure the value of such natural capital assets, linking economic measurements of ecosystem services with models of natural dynamics and human behavior.

This innovation will enable policymakers to better evaluate conservation and natural resource management programs, make apples-to-apples comparisons between investing in conservation of natural capital and other investments, and provide a component critical to measuring sustainability.

Writing in the Proceedings of the National Academy of Sciences, the authors demonstrate how to price natural capital using the example of the Kansas High Plains' groundwater aquifer -- a critical natural resource that supports the region's agriculture-based economy.
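The study itself is behind the PNAS link, but the general idea of pricing a natural asset can be sketched as discounting the stream of service flows it provides. A minimal illustration follows; the numbers and the constant-flow assumption are invented for the example, not taken from the paper (which also models the asset's own dynamics and human behavior):

```python
def natural_capital_value(annual_service_value, discount_rate, years):
    """Present value of a constant stream of ecosystem-service flows.

    Simplified sketch: assumes the annual flow is constant, whereas a
    real valuation would model resource dynamics (e.g. aquifer drawdown)
    and behavioral responses.
    """
    return sum(annual_service_value / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Hypothetical aquifer supporting $10M/year of agriculture,
# valued at a 3% discount rate over 50 years:
value = natural_capital_value(10_000_000, 0.03, 50)
print(f"Present value: ${value:,.0f}")
```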

Another method might be to compel politicians and the ultra-wealthy to live in Love Canal for 6 months and then ask them how much they'd pay to get out.

Original Study


Original Submission

posted by cmn32480 on Thursday February 11 2016, @04:28AM   Printer-friendly
from the hunt-and-peck dept.

The number of fingers used does not determine typing speed, a new study shows. People using self-taught typing strategies were found to be as fast as trained typists.

Researchers from Aalto University studied the typing behavior of 30 people covering a broad range of age and skill. Their findings challenge the common belief that you need to have taken a touch typing course – to learn how to type with all 10 fingers – in order to be fast:

"We were surprised to observe that people who took a typing course performed at similar average speed and accuracy as those who taught themselves to type and only used 6 fingers on average," explains doctoral candidate Anna Feit.

This is the first study to explore how people type if they never learned the touch typing system. To record the exact finger movements during typing, the researchers used a so-called optical motion capture system: they placed reflective markers on the joints of the fingers and recorded their positions with 12 high-speed infrared cameras. Similar high-fidelity systems have been used in professional film-making.

"When you ask a person which fingers they use for typing, they cannot tell much. The motion tracking data exposes it, and for the first time we can exactly say which finger presses which key", explains Dr. Daryl Weir.


Original Submission

posted by CoolHand on Thursday February 11 2016, @02:46AM   Printer-friendly
from the big-brother-in-action dept.

A user on Voat going by the handle CheesusCrust has done an analysis of Windows 10 telemetry, using a router running DD-WRT to log traffic remotely to a Linux machine. They found that even with all of the telemetry options disabled, a clean Windows 10 Enterprise Edition install still appears to send substantial amounts of data back to Microsoft. In an eight-hour period, the experiment identified 3967 connection attempts to 51 distinct Microsoft IP addresses; after letting the machine sit for 30 hours, a total of 113 different external IPs had been accessed. CheesusCrust also performed a further test using the popular anti-telemetry application DisableWinTracking, and found that while it reduces the data being sent back to Microsoft, even its most stringent options cannot completely eliminate it.
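The kind of tally described above can be reproduced from any router's remote syslog output. A rough sketch follows; the `DST=` field is the convention used by iptables/netfilter log lines, but the exact format of DD-WRT's logs (and the file name here) are assumptions, so adjust the pattern to your configuration:

```python
import re
from collections import Counter

def count_destinations(log_path):
    """Count connection attempts per destination IP in a syslog dump,
    roughly as the experiment above did."""
    dst = re.compile(r"DST=(\d{1,3}(?:\.\d{1,3}){3})")
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            m = dst.search(line)
            if m:
                counts[m.group(1)] += 1
    return counts

# Hypothetical usage:
# counts = count_destinations("router-syslog.log")
# print(len(counts), "distinct IPs,", sum(counts.values()), "attempts")
```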

Previous SN coverage


Original Submission

posted by CoolHand on Thursday February 11 2016, @01:01AM   Printer-friendly
from the fun-with-quantums dept.

A researcher from IBM's quantum computing research group has created a startup that could compete with the likes of D-Wave and Google:

The airy Berkeley office space of startup Rigetti Computing boasts three refrigerators—but only one of them stores food. The other two use liquid helium to cool experimental computer chips to a fraction of a degree from absolute zero. The two-year-old company is trying to build the hardware needed to power a quantum computer, which could trounce any conventional machine by tapping into quantum mechanics.

The company aims to produce a prototype chip by the end of 2017 that is significantly more complex than those built by other groups working on fully programmable quantum computers. The following generation of chips should be able to accelerate some kinds of machine learning and run highly accurate chemistry simulations that might unlock new kinds of industrial processes, says Chad Rigetti, the startup's founder and CEO.

[...] Rigetti aims to ultimately set up a kind of quantum-powered cloud computing service, where customers pay to run problems on the company's superconducting chips. It is also working on software to make it easy for other companies to write code for its quantum hardware.

[...] The startup is currently testing a three-qubit chip made using aluminum circuits on a silicon wafer, and the design due next year should have 40 qubits. Rigetti says that's possible thanks to design software his company has created that reduces the number of prototypes that will need to be built on the way to a final design. Versions with 100 or more qubits would be able to improve on ordinary computers when it comes to chemistry simulations and machine learning, he says.
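For a sense of why the jump from 3 to 40 qubits matters: the state of an n-qubit machine is described by 2^n complex amplitudes, so classically simulating one quickly becomes infeasible. A quick sketch of the scaling (the 16 bytes per amplitude assumes double-precision complex numbers):

```python
# Memory needed to hold a full n-qubit state vector on a classical
# machine, at 16 bytes per complex amplitude (two 64-bit floats).
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (3, 40, 100):
    print(f"{n:>3} qubits: {state_vector_bytes(n):.3e} bytes")
# 3 qubits fits in 128 bytes; 40 qubits already needs 16 TiB;
# 100 qubits is far beyond any conceivable classical machine.
```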

Paywall buster.


Original Submission