[Editor's Comment: This article might sound a bit like a soyvertisement but it has been submitted by one of our community and someone who is well qualified in his field - David Eccles from the Malaghan Institute of Medical Research in New Zealand. It is interesting to read about what is considered currently to be state of the art in field genome sequencing.]
On the 14th and 15th of May, 2015, Oxford Nanopore Technologies held their inaugural nanopore sequencing conference, London Calling. The conference was set up to inform people about the current progress of Oxford Nanopore's first sequencing device, the muesli bar-sized, USB-powered MinION. Over 250 people were in attendance at the conference, representing 35 countries, including two from New Zealand: Nicole Moore from Environmental Science and Research, and David Eccles from the Malaghan Institute of Medical Research. Over the course of two days, these attendees discovered how the MinION is quietly turning the world of sequencing inside out.
Everything needed for sample preparation and sequencing can fit into a single piece of checked luggage on an airplane. The MinION is robust enough to make it across unsealed roads to remote parts of Africa, where it has been used for sequencing on-location during the Ebola outbreak. The MinION has also been put through its paces for tracking the traffic of organisms. Detection at the species level can be achieved in under 20 minutes of sequencing, and very subtle changes for the same species from different origins can be identified in less than an hour.
Clive Brown, Chief Technical Officer for Oxford Nanopore Technologies, gave a brief summary of what is to come in the near future of nanopore sequencing:
A new class of magnets that expand their volume when placed in a magnetic field and generate negligible amounts of wasteful heat during energy harvesting has been discovered by researchers at Temple University and the University of Maryland.
The researchers, Harsh Deep Chopra, professor and chair of mechanical engineering at Temple, and Manfred Wuttig, professor of materials science and engineering at Maryland, published their findings, "Non-Joulian Magnetostriction," in the May 21st issue of the journal, Nature. This transformative breakthrough has the potential to not only displace existing technologies but create altogether new applications due to the unusual combination of magnetic properties.
"Our findings fundamentally change the way we think about a certain type of magnetism that has been in place since 1841," said Chopra, who also runs the Materials Genomics and Quantum Devices Laboratories at Temple's College of Engineering.

In the 1840s, physicist James Prescott Joule discovered that iron-based magnetic materials changed their shape but not their volume when placed in a magnetic field. This phenomenon is referred to as "Joule Magnetostriction," and since its discovery 175 years ago, all magnets have been characterized on this basis.
"We have discovered a new class of magnets, which we call 'Non-Joulian Magnets,' that show a large volume change in magnetic fields," said Chopra. "Moreover, these non-Joulian magnets also possess the remarkable ability to harvest or convert energy with minimal heat loss."
[Abstract]: http://www.nature.com/nature/journal/v521/n7552/full/nature14459.html
Robots.txt files are simple text files that website owners place in directories to keep web crawlers such as Google and Yahoo from indexing the contents of those directories. It's a game of trust: webmasters don't actually trust the spiders not to access every file in the directories, they just expect those documents not to appear in search engines. By and large, the bargain has been kept.

But hackers have made no such bargain, and the mere presence of a robots.txt file is like an X on a treasure map. Website owners get careless, and, yes, some operate under the delusion that the promise of the spiders actually protects these documents.

The Register has an article explaining that hackers and rogue web crawlers actually use robots.txt files to find directories worth crawling.
Melbourne penetration tester Thiebauld Weksteen is warning system administrators that robots.txt files can give attackers valuable information on potential targets by giving them clues about directories their owners are trying to protect.
Once a hacker gets into a system, it is standard reconnaissance practice to compile and update detailed lists of interesting sub directories by harvesting robots.txt files. It requires less than 100 lines of code.
If you watch your logs, you've probably seen web crawler tracks, and you've probably seen some walk right past your robots.txt file. If you are smart, there really isn't anything of value "protected" by your robots.txt. But the article lists some examples of people who should know better leaving lots of sensitive information hiding behind one.
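To see how little effort the harvesting step described above takes, here is a minimal sketch of the parsing half of it. This is not the pentester's actual tool (that code isn't public here); it just pulls the Disallow'd paths out of a robots.txt file's text, which is exactly the list of directories an owner is hinting are worth a closer look:

```python
# Sketch of robots.txt reconnaissance: extract the paths a site owner
# asked crawlers to skip. Parsing only -- fetching the file is left out.
def disallowed_paths(robots_txt):
    """Return the non-empty Disallow paths listed in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split('#', 1)[0].strip()       # strip comments
        if line.lower().startswith('disallow:'):
            path = line.split(':', 1)[1].strip()
            if path:                               # an empty Disallow allows everything
                paths.append(path)
    return paths

sample = """User-agent: *
Disallow: /admin/
Disallow: /backups/
Disallow:            # empty: allows all for this agent
"""
print(disallowed_paths(sample))   # → ['/admin/', '/backups/']
```

Wrap that in a loop over a target list and an HTTP fetch, and you can see why the article says the whole job fits in under 100 lines.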
The Register and Threatpost report that the U.S. Department of Commerce may enshrine the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies into law, banning the export of zero-day vulnerabilities without permission:
The Bureau of Industry and Security (BIS) proposes to implement the agreements by the Wassenaar Arrangement (WA) at the Plenary meeting in December 2013 with regard to systems, equipment or components specially designed for the generation, operation or delivery of, or communication with, intrusion software; software specially designed or modified for the development or production of such systems, equipment or components; software specially designed for the generation, operation or delivery of, or communication with, intrusion software; technology required for the development of intrusion software; Internet Protocol (IP) network communications surveillance systems or equipment and test, inspection, production equipment, specially designed components therefor, and development and production software and technology therefor.
BIS proposes a license requirement for the export, reexport, or transfer (in-country) of these cybersecurity items to all destinations, except Canada. Although these cybersecurity capabilities were not previously designated for export control, many of these items have been controlled for their "information security" functionality, including encryption and cryptanalysis.
This rule thus continues applicable Encryption Items (EI) registration and review requirements, while setting forth proposed license review policies and special submission requirements to address the new cybersecurity controls, including submission of a letter of explanation with regard to the technical capabilities of the cybersecurity items. BIS also proposes to add the definition of "intrusion software" to the definition section of the EAR pursuant to the WA 2013 agreements.
A 60-day comment period ends July 20th.
Climate Central reports
The ravages of climate change could severely hurt the ability of utilities in the 11 Western states to generate power unless they "climate proof" their power grid using renewables and energy efficiency, something they are not prepared for, according to a new study[1] [by researchers at Arizona State University, published May 18 in the journal Nature Climate Change].
[...]Higher temperatures and low stream flow reduce coal-fired power plants' ability to use water for cooling, preventing them from operating at full capacity. The most vulnerable power plants could see a reduction in power generation capacity by up to 8.8 percent, the study says.
Renewables take a hit too, but are much less vulnerable to climate change.
[...]The Arizona State study recommends Western states invest in wind, solar, and other "resilient" renewable energy sources while upgrading the power grid and encouraging conservation as ways to overcome some of the challenges climate change poses to the region's power supply.
[1] Link in TFA redirects to the URL that I included.
Barclays PLC analyst Brian Johnson predicts that U.S. automobile sales will drop 40% within the next 25 years due to disruption caused by driverless technology, and that vehicle ownership rates will be cut in half as families move to having just one car:
Large-volume automakers "would need to shrink dramatically to survive," Johnson wrote. "GM and Ford would need to reduce North American production by up to 68 percent and 58 percent, respectively."
Self-driving cars have become a frequent topic for auto executives as the technology for the vehicles emerges. The market for autonomous technology will grow to $42 billion by 2025 and self-driving cars may account for a quarter of global auto sales by 2035, according to Boston Consulting Group. By 2017, partially autonomous vehicles will become available in "large numbers," the firm said in a report in April.
Johnson's report, entitled "Disruptive Mobility," contends that the shift to cars that drive themselves will upend the auto industry. "While extreme, a historical precedent exists," Johnson wrote. "Horses once filled the many roles that cars fill today, but as the automobile came along, the population of horses dropped sharply."
"By removing the driver from the equation (the largest cost in a taxi ride), the average cost per mile to the consumer could be 44 cents for a private ride in a standard sedan and 8 cents for a shared ride in a two-seater," Johnson wrote, noting that would be "well below" the $3 to $3.50 a mile consumers now pay to ride in an UberX car or the $1 to $1.50 a mile for an UberPool vehicle.
Algorithms tell you how to vote. Algorithms can revoke your driver's license and terminate your disability benefits. Algorithms predict crimes. Algorithms ensured you didn't hear about #FreddieGray on Twitter. Algorithms are everywhere, and, to hear critics tell it, they are trouble. What's the problem? Critics allege that algorithms are opaque, automatic, emotionless, and impersonal, and that they separate decision-makers from the consequences of their actions. Algorithms cannot appreciate the context of structural discrimination, are trained on flawed datasets, and are ruining lives everywhere. There needs to be algorithmic accountability. Otherwise, who is to blame when a computational process suddenly deprives someone of his or her rights and livelihood?
But at heart, criticism of algorithmic decision-making makes an age-old argument about impersonal, automatic corporate and government bureaucracy. The machine-like bureaucracy has simply become the machine. Instead of a quest for accountability, much of the rhetoric and discourse about algorithms amounts to a surrender—an unwillingness to fight the ideas and bureaucratic logic driving the algorithms that critics find so creepy and problematic. Algorithmic transparency and accountability can only be achieved if critics understand that transparency (no modifier is needed) is the issue. If the problem is that a bureaucratic system is impersonal, unaccountable, creepy, and uses flawed or biased decision criteria, then why fetishize and render mysterious the mere mechanical instrument of the system's will?
The U.S. Consumer Financial Protection Bureau is asking a federal court to penalize PayPal for "illegally signing up and billing tens of thousands of consumers for its online credit product, PayPal Credit." Under the proposed order, PayPal would return $15 million to customers, pay a fine of $10 million for its actions, and make its PayPal Credit practices more clear:
Since 2008, the company has offered PayPal Credit, formerly called Bill Me Later, which is a financial product that operates like other forms of credit. Consumers make purchases using it as a form of payment and then repay the debt over time. As with credit cards and similar products, consumers using PayPal Credit may incur interest, late fees, and other charges.
From the first encounter a consumer may have had with PayPal Credit, there were problems. Tens of thousands of consumers who were attempting to enroll in a regular PayPal account, or make an online purchase, were signed up for the credit product without realizing it. The company enrolled other consumers while they tried to cancel or close out of the application process. Many people ended up enrolled without knowing how or why, only to discover unexpectedly that they actually had an account when they learned of a credit-report inquiry, or when they received emails welcoming them to PayPal Credit, billing statements, or debt-collection calls.
One reason so many consumers ended up having this product, unbeknownst to them, was that PayPal set the default payment method for all purchases to PayPal Credit. Other consumers were simply not able to select another payment method when they tried to pay.
Then, for those who did willingly sign up for the product, PayPal in many instances failed to honor advertised promotions, such as the promise of a $5 or $10 credit toward consumer purchases. This was deceptive advertising.
Finally, once enrolled, consumers encountered headache after headache. PayPal failed to post payments properly, lost payment checks, and mishandled billing disputes that consumers had with merchants or the company itself. Numerous consumers reported that the company took more than a week to process payment checks. And even when customers were unable to pay because of website failures, they still got charged late fees.
Also at The Register.
Advanced Micro Devices (AMD) has shared more details about the High Bandwidth Memory (HBM) in its upcoming GPUs.
HBM in a nutshell takes the wide & slow paradigm to its fullest. Rather than building an array of high speed chips around an ASIC to deliver 7Gbps+ per pin over a 256/384/512-bit memory bus, HBM at its most basic level involves turning memory clockspeeds way down – to just 1Gbps per pin – but in exchange making the memory bus much wider. How wide? That depends on the implementation and generation of the specification, but the examples AMD has been showcasing so far have involved 4 HBM devices (stacks), each featuring a 1024-bit wide memory bus, combining for a massive 4096-bit memory bus. It may not be clocked high, but when it's that wide, it doesn't need to be.
AMD will be the only manufacturer using the first generation of HBM, and will be joined by NVIDIA in using the second generation in 2016. HBM2 will double memory bandwidth over HBM1. The benefits of HBM include increased total bandwidth (from 320 GB/s for the R9 290X to 512 GB/s in AMD's "theoretical" 4-stack example) and reduced power consumption. Although HBM1 triples memory bandwidth per watt compared to GDDR5, the memory in AMD's example draws a little less than half the power (down from 30 W on the R9 290X to 14.6 W) rather than a third, because total bandwidth increased as well. HBM stacks will also use 5-10% as much area as GDDR5 would need to provide the same amount of memory. That could potentially halve the size of the GPU package:
By AMD's own estimate, a single HBM-equipped GPU package would be less than 70 mm × 70 mm (4,900 mm²), versus 110 mm × 90 mm (9,900 mm²) for the R9 290X.
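The "wide and slow" trade-off is easy to sanity-check with back-of-the-envelope arithmetic. The figures below are the ones quoted above (a 512-bit GDDR5 bus at 5 Gbps per pin on the R9 290X, versus four 1024-bit HBM stacks at 1 Gbps per pin); the helper function is just illustrative:

```python
# Aggregate memory bandwidth = bus width (bits) x per-pin rate (Gbps) / 8 bits-per-byte.
def bus_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Return peak bandwidth in GB/s for a memory bus."""
    return bus_width_bits * gbps_per_pin / 8

# GDDR5 on the R9 290X: 512-bit bus, 5 Gbps per pin
print(bus_bandwidth_gbs(512, 5))        # → 320.0 (GB/s)

# HBM: 4 stacks x 1024-bit, just 1 Gbps per pin
print(bus_bandwidth_gbs(4 * 1024, 1))   # → 512.0 (GB/s)
```

An eightfold wider bus at one fifth the per-pin rate still comes out 60% ahead, which is the whole point of the design.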
HBM will likely be featured in high-performance computing GPUs as well as accelerated processing units (APUs). HotHardware reckons that Radeon 300-series GPUs featuring HBM will be released in June.
Diane Cardwell reports at the NYT that once the next generation of larger, taller turbines in development hits the market, all 50 states could become wind energy producers and the bigger machines — reaching as high as 460 feet — could eventually make faster winds at higher altitudes an economical source of electricity. "We believe very much in the central role of wind in meeting our climate challenges, and we're very committed in this direction," says Ernest Moniz, the secretary of energy. "It's going to require being able to take advantage of a broader set of resources," and it will give wind power a "bigger footprint," onshore and off.
Energy officials and executives are pushing toward machinery that would reach 360 to 460 feet high. That would open up an additional 700,000 square miles — more than a fifth of the United States — to wind development, bringing the total area to 1.8 million square miles. The potential expansion would affect areas where wind farms already exist and bring new areas into the market. The main regions where height would increase potential wind production include the Southeast, Northeast, states around the Ohio River valley and the Great Lakes, and parts of the interior West and Pacific Northwest. In all, the DOE report "Enabling Wind Power Nationwide" says, land-based and offshore wind could produce 16,150 gigawatts of electricity a year, more than 10 times the country's consumption (PDF). Wind installations now account for 65 gigawatts, just under 5 percent of national demand. "We've proven out as an industry in Europe, with a fair number of turbines in Europe at 120 meters," says Tom Kiernan. "By going to 100 or 110 meters, we can open up all 50 states."
Japanese airbag manufacturer Takata has doubled estimates of the number of vehicles affected by an airbag defect to 34 million. Moisture can infiltrate the defective airbags, which causes the chemical propellant inside to ignite too quickly, breaking the inflator and sending "metal shards into the passenger cabin that can lead to serious injury or death." The airbags have been linked to six deaths and over 100 injuries.
The NHTSA's Recalls Spotlight site advises owners to keep checking its VIN search tool, since it can take up to several weeks after a recall announcement for affected vehicles to appear. Models affected include cars from Acura/Honda (5.5 million), BMW (765,000), Chrysler/Dodge/Ram (2.88 million), Ford (538,977), Infiniti/Nissan (1,091,000), Toyota/Lexus/Pontiac (1,514,000), Mazda (330,000), Mitsubishi (11,985), Saab, and Subaru (17,516).
(Numbers are subject to change.)
A new TLS protocol security vulnerability, named "Logjam," has been found that can be exploited via a protocol downgrade left over from old export restrictions. It affects servers supporting the Diffie-Hellman key exchange, and it's caused by export restrictions mandated by the U.S. government during the Clinton administration. "Attackers with the ability to monitor the connection between an end user and a Diffie-Hellman-enabled server that supports the export cipher can inject a special payload into the traffic that downgrades encrypted connections to use extremely weak 512-bit key material. Using precomputed data prepared ahead of time, the attackers can then deduce the encryption key negotiated between the two parties."
Internet Explorer is the only browser yet updated to block such an attack — patches for Chrome, Firefox, and Safari are expected soon. The researchers add, "Breaking the single, most common 1024-bit prime used by web servers would allow passive eavesdropping on connections to 18% of the Top 1 Million HTTPS domains. A second prime would allow passive decryption of connections to 66% of VPN servers and 26% of SSH servers. A close reading of published NSA leaks shows that the agency's attacks on VPNs are consistent with having achieved such a break." Here is their full technical report (PDF).
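Why does the size of the Diffie-Hellman prime matter so much, and why does precomputation pay off? A toy illustration (not real TLS code, and nothing like the researchers' actual number-field-sieve attack): baby-step giant-step solves a discrete logarithm in roughly √p work, so the cost of breaking a DH group is set by the prime's size. And because most servers share a handful of standard primes, the expensive per-prime work can be done once and reused against every connection that uses that prime — which is the heart of Logjam against 512-bit export groups:

```python
# Toy discrete-log solver (baby-step giant-step) over a tiny prime,
# standing in for an "export-grade" DH group. Requires Python 3.8+
# for the modular inverse via pow(g, -m, p).
from math import isqrt

def discrete_log(g, y, p):
    """Return the smallest x >= 0 with pow(g, x, p) == y, or None."""
    m = isqrt(p) + 1
    # Baby steps: table of g^j mod p for j in [0, m) -- this is the
    # per-group precomputation that can be reused across victims.
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: walk y, y*g^-m, y*g^-2m, ... until we hit the table.
    factor = pow(g, -m, p)
    gamma = y
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * factor) % p
    return None

p, g = 2579, 2            # tiny prime; a real export group is 512 bits
secret = 1001             # the DH private exponent we want to recover
y = pow(g, secret, p)     # the public value an eavesdropper sees
x = discrete_log(g, y, p)
print(x, pow(g, x, p) == y)   # → 1001 True
```

The √p scaling is the point: each extra bit of prime length doubles p and multiplies the attack cost by √2, which is why 512-bit groups are within reach of a well-funded precomputation while 2048-bit groups are not.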
Time for a complete overhaul?
[Update: Thanks to Canopic Jug for locating and providing a link to the Common Vulnerabilities and Exposures entry CVE-2015-4000; check there for official information and updates.]
A woman at a gym tells her friend she pays rent higher than $2,000 a month. An ex-Microsoft employee describes his work as an artist to a woman he's interviewing to be his assistant—he makes paintings and body casts, as well as something to do with infrared light that's hard to discern from his foreign accent. Another man describes his gay lover's unusual sexual fetish, which involves engaging in fake fistfights, "like we were doing a scene from Batman Returns."
These conversations—apparently real ones, whose participants had no knowledge an eavesdropper might be listening—were recorded and published by the NSA. Well, actually no, not the NSA, but an anonymous group of anti-NSA protestors claiming to be contractors of the intelligence agency and launching a new "pilot program" in New York City on its behalf. That spoof of a pilot program, as the prankster provocateurs describe and document it in videos on their website, involves planting micro-cassette recorders under tables and benches around New York City, retrieving the tapes and embedding the resulting audio on their website: Wearealwayslistening.com.
Could actions like these, while they will surely be dismissed as childish stunts by some, succeed at driving home the real impact of NSA spying to the general public in a way that hasn't been managed yet?
The Electronic Frontier Foundation has highlighted two amendments to Fast Track (a bill which would authorize the President to enter into binding trade agreements like the Trans-Pacific Partnership, or TPP, without Congressional oversight). Senator Elizabeth Warren and 14 other senators are backing an amendment that would eliminate "Fast Track" for any trade bill containing an investor-state dispute settlement clause. That would include the TPP as well as the Transatlantic Trade and Investment Partnership (between the U.S. and European Union).
A second amendment, from Sens. Blumenthal, Brown, Baldwin, and Udall, addresses the lack of transparency of the agreement, and would require "all formal proposals advanced by the United States in negotiations for a trade agreement" to be published on the Web within five days of those proposals being shared with other parties to the negotiations. This would bring the United States up to the same level as the European Commission, which has already begun publishing its own TTIP position papers and text proposals to the public.
Treasury Secretary Jacob Lew has said that the White House will veto TPP if a bipartisan currency manipulation measure he called a "poison pill" is passed:
"If a trade agreement is required to come back with a currency discipline that is enforceable through trade mechanisms, I don't think there is another country in the world that would agree to that," Mr. Lew said at a Bretton Woods Committee conference in Washington. "It's a poison pill in terms of getting agreement on TPP."
The Brookings Institution has an article about the TPP's effect on "biosimilar" regulations in the U.S. and other TPP partner nations. "Biologic drugs" are expensive therapies derived from biological sources. They include vaccines, anti-toxins, proteins, and monoclonal antibodies. The high costs have encouraged the development of biosimilars, follow-on versions of an original biologic. Some estimates predict that "competition from biosimilars could reduce US spending on biologics by $44 to $66 billion over the next ten years".
Current Food and Drug Administration regulations grant biologic drugs 12 years of "data exclusivity" following approval, during which the FDA "may not approve a biosimilar application that relies on the data submitted as part of the original biologic application". Without the ability to reuse original clinical trial data, biosimilars face the same high costs of biologic drugs. Opponents to the long exclusivity period argue that restrictions on biosimilar competition keep drug prices high. The TPP would require member states to match the U.S.'s data exclusivity period, and make it harder to change:
For the 11 countries besides the U.S. that are involved in the TPP, current data exclusivity protections range from zero (Brunei) to eight years (Japan). Under the Obama Administration's current proposal, participating countries would increase those periods to match the US standard of 12 years. Curiously, this proposal directly contradicts the administration's ongoing domestic efforts to lower the period of data exclusivity. Since the ACA passed, the Obama administration has repeatedly proposed reducing it to seven, arguing that this would save Medicare $4.4 billion over the next decade. Some have noted that, once the 12-year period is enshrined in the TPP, it will become significantly more difficult to change it through the US legislative process. Furthermore, imposing US standards on the 11 member countries would inevitably restrict competition at the global level, and many patient advocacy and international humanitarian organizations have argued that doing so would undermine the efforts of US global health initiatives like the Vaccine Alliance and the Global Fund to Fight AIDS, Tuberculosis and Malaria, which rely on price competition to manage program costs.
TPP: Wikipedia and Google News.
News.Com in Australia has a story and pictures of a pestilence of spiders that happens every few years when the weather is just right.
It's the strange phenomenon everyone's talking about. The unearthly sight of hundreds of gossamer white threads floating through the air and settling on fields and houses.
...
The astonishing spectacle usually occurs in May or August in Australia, when sunshine follows rainfall. It is rare because it requires an unusual weather pattern for this time of year, which is when spiders are hatching. The spiderlings are light enough to float on threads, sometimes for hundreds of kilometres at up to 20,000 feet. They have even been spotted by aircraft.
It's a migration tactic used by juvenile spiders: spin a bit of web, then be blown great distances, landing en masse.

The site has photos of fields covered by webs, as well as webs covered with adult spiders. An arachnophobe's worst nightmare.