SoylentNews is people


posted by janrinok on Sunday December 28 2014, @09:45PM   Printer-friendly
from the OS-for-teh-win dept.

ITWeb reports

Zimbra, a global leader in unified collaboration software, [together with] open source specialist LSD Information Technology, [December 11] released the Ponemon Institute's "The Open Source Collaboration Study: Viewpoints on Security and Privacy in the US and EMEA" report.

The report findings confirm a changing perception of open source: it is increasingly seen as a platform for developing quality software that enhances privacy and security. Findings from the survey, which was conducted in 18 countries across Europe, the Middle East, and Africa (EMEA) as well as the United States, show that 66% of EMEA respondents agreed that commercial backing and code transparency reduce applications' privacy risks, and 67% agreed that these improve application security.

[...]By its nature, open source provides a layer of transparency not available in proprietary software. This transparency offers a way to trust but verify: to check a patch's efficacy, to look for hidden components or backdoors, and to confirm that software development best practices are followed. It also improves code quality, which is an equal partner to security. In support of this, the Ponemon Institute survey found that 76% of EMEA respondents agree that commercial backing and code transparency improve application integrity and trustworthiness.

[...]A sampling frame of 17,680 US and 16,700 EMEA experienced IT and IT security practitioners were selected as participants to this survey. By design, 79% of US respondents and 74% of EMEA respondents are at or above the supervisory levels.[...]

[Ed's Note: There is a hint of 'slashvertisement' here, as Zimbra are pushing their own products which, surprise surprise, are based on open source software, although that need not negate the results of the survey.]

posted by janrinok on Sunday December 28 2014, @07:38PM   Printer-friendly
from the my-Dad-is-bigger-than-your-Dad! dept.

From the NY Times comes the following story:

North Korea accused the United States on Saturday of being responsible for Internet outages it experienced in recent days amid a confrontation between them over the hacking of the film studio Sony Pictures. [...]

"The United States, with its large physical size and oblivious to the shame of playing hide and seek as children with runny noses would, has begun disrupting the Internet operations of the main media outlets of our republic," the North's National Defense Commission said in a statement.

"It is truly laughable," a spokesman for the commission said in comments carried by the North's official KCNA news agency. [...]

"Obama had better thrust himself to cleaning up all the evil doings that the U.S. has committed out of its hostile policy against (North Korea) if he seeks peace on U.S. soil. Then all will be well."

Kim Jong-un must have been unhappy that he couldn't watch YouTube videos during the outage. Well, he can continue to pout unless he gets his hackers to shut down Google.

posted by janrinok on Sunday December 28 2014, @05:03PM   Printer-friendly
from the what's-up-Doc? dept.

http://arstechnica.com/science/2014/12/when-the-doctors-away-the-patient-is-more-likely-to-survive/

A study shows that cardiology patients are more likely to survive when specialists are absent. It was done as a follow-up to earlier studies showing increased mortality among patients admitted on weekends, to investigate whether differences in staffing caused the increase. Unexpectedly, the researchers found that patients did significantly better when the relevant specialists were unavailable. The study considered a number of possible causes, including a lower number of aggressive procedures and internal staffing changes.

posted by LaminatorX on Sunday December 28 2014, @02:16PM   Printer-friendly
from the by-my-heart-and-by-my-hand dept.

In its year-end review, The Scientist is carrying two stories that trumpet the bad news in science over the last year.

The first lists the Top Ten Retractions of 2014, which seems like more than in previous years.

The retractions include:

  • The STAP stem cell paper retractions from Nature
  • Rabbit blood samples spiked with human blood to make it look like an HIV vaccine was working
  • A "peer review and citation ring" that got 60 articles yanked
  • 120 bogus papers produced by the SCIgen random text generator

In addition, there was a list of the top science scandals of the year, some of which overlap with the above, but also major containment issues at US government labs, including the discovery of undocumented pathogens in questionable storage.

It wasn't all bad news: a third story listed their nominations for the year's greatest breakthroughs.

Regardless of what we hear in the popular press, it is interesting to see what scientists themselves find most troublesome in their various fields. And it is interesting to note that many of the issues revolve around the review and publishing process.

posted by LaminatorX on Sunday December 28 2014, @11:00AM   Printer-friendly
from the road-trip dept.

The Tesla Motors Blog announces

We have long been excited to apply our learning back to our first vehicle, and are thrilled to do just that with the prototype Roadster 3.0 package. It consists of three main improvement areas.

1. Batteries
[...]We have identified a new cell that has 31% more energy than the original Roadster cell.

2. Aerodynamics
[...]we expect to make a 15% improvement

3. Rolling Resistance
[...]about a 20% improvement

[...]Combining all of these improvements we can achieve a predicted 40-50% improvement on range between the original Roadster and Roadster 3.0. There is a set of speeds and driving conditions where we can confidently drive the Roadster 3.0 over 400 miles. We will be demonstrating this in the real world during a non-stop drive from San Francisco to Los Angeles in the early weeks of 2015.

Appointments for upgrading Roadsters will be taken this spring once the new battery pack finishes safety validation. We are confident that this will not be the last update the Roadster will receive in the many years to come.

posted by janrinok on Sunday December 28 2014, @08:10AM   Printer-friendly

AirAsia flight QZ8501 from Indonesia to Singapore missing

An AirAsia flight travelling from Indonesia to Singapore with 162 people on board lost contact with air traffic control.

Flight QZ8501 lost contact at 07:24 (23:24 GMT), Malaysia-based AirAsia tweeted.

Search and rescue operations are under way.

posted by LaminatorX on Sunday December 28 2014, @07:33AM   Printer-friendly
from the Blue-Danube dept.

Spaceflight has faded from American consciousness even as our performance in space has reached a new level of accomplishment. In the past decade, America has become a truly, permanently spacefaring nation. All day, every day, half a dozen men and women, including two Americans, are living and working in orbit, and have been since November 2000. Charles Fishman has a long, detailed article about life aboard the ISS in The Atlantic that is well worth the read where you are sure to learn something you didn't already know about earth's permanent outpost in space. Some excerpts:

The International Space Station is a vast outpost, its scale inspiring awe even in the astronauts who have constructed it. From the edge of one solar panel to the edge of the opposite one, the station stretches the length of a football field, including the end zones. The station weighs nearly 1 million pounds, and its solar arrays cover more than an acre. It’s as big inside as a six-bedroom house, more than 10 times the size of a space shuttle’s interior. Astronauts regularly volunteer how spacious it feels. It’s so big that during the early years of three-person crews, the astronauts would often go whole workdays without bumping into one another, except at mealtimes.

On the station, the ordinary becomes peculiar. The exercise bike for the American astronauts has no handlebars. It also has no seat. With no gravity, it’s just as easy to pedal furiously, feet strapped in, without either. You can watch a movie while you pedal by floating a laptop anywhere you want. But station residents have to be careful about staying in one place too long. Without gravity to help circulate air, the carbon dioxide you exhale has a tendency to form an invisible cloud around your head. You can end up with what astronauts call a carbon-dioxide headache.

Even by the low estimates, it costs $350,000 an hour to keep the station flying, which makes astronauts’ time an exceptionally expensive resource—and explains their relentless scheduling: Today’s astronauts typically start work by 7:30 in the morning, Greenwich Mean Time, and stop at 7 o’clock in the evening. They are supposed to have the weekends off, but Saturday is devoted to cleaning the station—vital, but no more fun in orbit than housecleaning down here—and some work inevitably sneaks into Sunday.

Life in space is so complicated that a lot of logistics have to be off-loaded to the ground if astronauts are to actually do anything substantive. Just building the schedule for the astronauts in orbit on the U.S. side of the station requires a full-time team of 50 staffers.

Almost anyone you talk with about the value of the Space Station eventually starts talking about Mars. When they do, it’s clear that we don’t yet have a very grown-up space program. The folks we send to space still don’t have any real autonomy, because no one was imagining having to “practice” autonomy when the station was designed and built. On a trip to Mars, the distances are so great that a single voice or e‑mail exchange would involve a 30-minute round-trip. That one change, among the thousand others that going to Mars would require, would alter the whole dynamic of life in space. The astronauts would have to handle things themselves.
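The 30-minute figure is easy to sanity-check from Earth-Mars geometry (the distances below are rounded textbook values, not from the article):

```python
# Round-trip light delay between Earth and Mars at a given separation.
C = 299_792_458  # speed of light, m/s

def round_trip_minutes(distance_km):
    """Minutes for a signal to travel to Mars and back at light speed."""
    return 2 * distance_km * 1000 / C / 60

# Earth-Mars distance varies between roughly 55 and 400 million km.
closest = round_trip_minutes(55e6)    # ~6 minutes
average = round_trip_minutes(225e6)   # ~25 minutes, matching the article
farthest = round_trip_minutes(400e6)  # ~44 minutes
```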

That could be the real value of the Space Station—to shift NASA’s human exploration program from entirely Earth-controlled to more astronaut-directed, more autonomous. This is not a high priority now; it would be inconvenient, inefficient. But the station’s value could be magnified greatly were NASA to develop a real ethic, and a real plan, for letting the people on the mission assume more responsibility for shaping and controlling it. If we have any greater ambitions for human exploration in space, that’s as important as the technical challenges. Problems of fitness and food supply are solvable. The real question is what autonomy for space travelers would look like—and how Houston can best support it. Autonomy will not only shape the psychology and planning of the mission; it will shape the design of the spacecraft itself.

posted by LaminatorX on Sunday December 28 2014, @04:01AM   Printer-friendly
from the applied-solipsism dept.

Humans and software see some images differently, pointing out shortcomings of recent breakthroughs in machine learning.

A technique called deep learning has enabled Google and other companies to make breakthroughs in getting computers to understand the content of photos. Now researchers at Cornell University and the University of Wyoming have shown how to make images that fool such software into seeing things that aren’t there.

The researchers can create images that appear to a human as scrambled nonsense or simple geometric patterns, but are identified by the software as an everyday object such as a school bus. The trick images offer new insight into the differences between how real brains and the simple simulated neurons used in deep learning process images.
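The researchers used evolutionary search, but the same effect can be sketched with plain gradient ascent on the input of a toy classifier (everything below, including the random "network", is an illustrative stand-in, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained classifier: a fixed random linear map + softmax
# over 10 classes on a 64-"pixel" image.
W = rng.normal(size=(10, 64))

def confidence(x, target):
    """Softmax probability the classifier assigns class `target` to image x."""
    logits = W @ x
    p = np.exp(logits - logits.max())
    return (p / p.sum())[target]

def fool(target, steps=200, lr=0.5):
    """Gradient-ascend a random-noise image until the classifier is highly
    confident it shows class `target` -- to a human it remains noise."""
    x = rng.normal(size=64) * 0.01
    for _ in range(steps):
        logits = W @ x
        p = np.exp(logits - logits.max())
        p /= p.sum()
        onehot = np.zeros(10)
        onehot[target] = 1.0
        x += lr * (W.T @ (onehot - p))  # gradient of log p(target) w.r.t. x
        x = np.clip(x, -1, 1)           # keep "pixel" values in range
    return x

x = fool(target=3)
```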

http://www.technologyreview.com/news/533596/smart-software-can-be-tricked-into-seeing-what-isnt-there/

[Paper]: http://arxiv.org/pdf/1412.1897v1.pdf

posted by LaminatorX on Sunday December 28 2014, @01:50AM   Printer-friendly
from the merging-with-AT&T dept.

EETimes has an article on 10 of the top single-board computer projects from 2014. (The link is to the single-page print version rather than the 10-page split version; the original multi-page version is available here.)

...Intel, iCOP, and other tech companies are expected to reveal new boards at the upcoming International CES, which is only a few short weeks away (January 6 through January 9, 2015). However, this year was no slouch for DIY designs for single-board computers.

The 10 projects are a mix of fun toys and a couple of serious tools, and feature a variety of development kits.

posted by martyb on Saturday December 27 2014, @10:49PM   Printer-friendly
from the get-your-ideas-in dept.

The Center for American Progress reports

A majority of the justices on the Supreme Court [...] have refused to police partisan gerrymandering, largely because they believe that doing so would be too difficult. As Justice Scalia wrote in his plurality opinion in Vieth v. Jubelirer, "no judicially discernible and manageable standards for adjudicating political gerrymandering claims have emerged."

Scalia's view, however, is questionable. Mathematical models do exist that can measure when a state's electoral map produces results that are wildly out of line with voter preferences. And, in some recent gerrymandering cases, states have even openly stated that they tried to enhance some voters' power at the expense of others. Texas Attorney General and Governor-elect Greg Abbott (R) admitted in a 2013 court filing (PDF) that "In 2011, both houses of the Texas Legislature were controlled by large Republican majorities, and their redistricting decisions were designed to increase the Republican Party's electoral prospects at the expense of the Democrats." At the very least, a court should be able to discern that a partisan gerrymander occurred when the state freely admits as much.
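One concrete example of such a model (my illustration; the article does not name a specific metric) is the "efficiency gap", which totals each party's wasted votes: a winner's votes beyond the bare majority needed, plus all of a loser's votes.

```python
def efficiency_gap(districts):
    """districts: list of (party_a_votes, party_b_votes) per district.
    A positive result means the map wastes more of party A's votes."""
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        total += a + b
        threshold = (a + b) // 2 + 1  # bare majority needed to win
        if a > b:
            wasted_a += a - threshold  # winner's surplus votes
            wasted_b += b              # all of the loser's votes
        else:
            wasted_b += b - threshold
            wasted_a += a
    return (wasted_a - wasted_b) / total

# Party A wins 160 of 300 votes overall but only 1 of 3 seats:
# packed into one district, cracked across the other two.
gap = efficiency_gap([(70, 30), (45, 55), (45, 55)])
```

A gap well above zero flags exactly the pattern a court would want to see quantified: many seats won by one party on a minority of the statewide vote.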

A potentially significant writing contest sponsored by the Washington, DC advocacy group Common Cause seeks to further repudiate the claim that there is no meaningful way for judges to determine when a legislative map was drawn to give one party an advantage over another. The contest offers a $5,000 top prize to lawyers and scholars who submit papers "creating a new definition for partisan gerrymandering or further developing an existing definition." (PDF) So the contest seeks to show that, if enticed by a cash prize, a community of scholars can find what the justices themselves cannot (or at least claim they have not): a way to prevent lawmakers from choosing their own voters.

The deadline for submissions is Friday, February 27, 2015.
The winners will be announced in May 2015.

posted by janrinok on Saturday December 27 2014, @08:42PM   Printer-friendly
from the axe-to-grind? dept.

http://www.npr.org/blogs/codeswitch/2014/12/26/371716376/hollywoods-acceptance-of-white-privilege-revealed-by-sony-hack

In short, evidence revealed by the Sony hack, and other outstanding evidence, points to Hollywood's continued resistance to reformation of racial perceptions.

posted by janrinok on Saturday December 27 2014, @07:10PM   Printer-friendly
from the anything-is-better-than-nothing dept.

Two stories appeared recently, "The Problem with License Trends" on Bruce Byfield's blog at Linux Magazine's site and "The Fall of the GPL and the Rise of Permissive Licenses" on ZDNet, which both point out some worrying trends in open source licensing.

The first point both articles make is that projects are increasingly cavalier in their choice of open software licenses. Many seem to choose a license at random, or simply adopt whatever license was in vogue on the platform where they started development. Aaron Williamson, senior staff counsel at the Software Freedom Law Center, discovered that 85.1 percent of GitHub programs had no license at all.

Byfield spends much of his article explaining just how hard it is to obtain any reliable statistics. So many software sites simply fail to mention licenses in their repository directories that one is reduced to searching Google for license names, which often turns up nothing at all. There are few ways to gather statistics other than brute-force download or researching project by project. Byfield's point is that nobody has done this in a believable way.
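A brute-force survey of the kind Byfield describes boils down to fetching each repository's license file and pattern-matching it. A minimal classifier might look like this (the marker strings are my own shortlist, nothing like a complete detector):

```python
# Map distinctive phrases to license families. Real tooling uses far more
# patterns and normalization than this sketch; order matters so that AGPL
# and LGPL are tried before the plain GPL marker.
MARKERS = [
    ("gnu affero general public license", "AGPL"),
    ("gnu lesser general public license", "LGPL"),
    ("gnu general public license", "GPL"),
    ("apache license", "Apache"),
    ("permission is hereby granted, free of charge", "MIT"),
    ("redistribution and use in source and binary forms", "BSD"),
]

def classify_license(text):
    """Return a license family name, or None -- the fate of the ~85% of
    GitHub projects that ship no license text at all."""
    lowered = text.lower()
    for marker, name in MARKERS:
        if marker in lowered:
            return name
    return None
```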

The trend seems clear that those who do choose a license are increasingly choosing MIT/BSD virtually un-restricted licensing for new projects as opposed to any versions of the GPL. From the ZDnet article:

(Apache/BSD/MIT) ... collectively are employed by 42 percent. They represent, in fact, three of the five most popular licenses in use today." These permissive licenses have been gaining ground at the GPL's expense. The two biggest gainers, the Apache and MIT licenses, were up 27 percent, while the GPLv2, Linux's license, has declined by 24 percent.

It could be that those NOT choosing a license are simply tired of the whole argument, realize they will never be in a position to enforce any license anyway, and simply cast their code to the wind and trust to the mantra of "prior art". You would think that the generation that grew up with Groklaw and the SCO wars would actually care about this issue. One would think that watching Apple take BSD from open source to closed source while hurling sue-balls left and right would have served as a warning.

Or it could be a realization that the restrictions imposed by the GPL and other "copyleft" licenses are, in their own way, almost as burdensome as some commercial licenses. Or maybe the subtle differences among the GPL, GPLv2, GPLv3, LGPL, LGPLv2, LGPLv3, Affero-v1, Affero-v2 (ad infinitum) are so confusing that even a comparison chart of license features is bewildering to many who just want to cut code.

Are those of us in the software industry just making a mess for the future with all these licenses? Have we thrown up our collective hands in despair? Has the GPL-style copyleft license outlived its usefulness?

posted by janrinok on Saturday December 27 2014, @04:05PM   Printer-friendly
from the let's-try-to-hide-this-one dept.

Another story from the No-Shit-Sherlock Department.

El Reg is reporting that the NSA is admitting to what we already know:

The files have been heavily censored, but still manage to show that, either by accident or design, NSA staff routinely engaged in illegal surveillance with almost no comeback from management.

As the story points out, the report admitting this malfeasance was published when most people were not looking for news:

Slipping out unpleasant news at awkward times is a well-known PR practice – but the NSA has excelled itself by publishing on Christmas Eve internal reports detailing its unlawful surveillance. The agency dumped the docs online shortly after lunchtime on December 24, when most journalists are either heading home to their families or already drunk.

The report also points out one more reason why we should thank Edward Snowden:

The civil liberties body ACLU sued the NSA for the right to see the reports, and the courts sided with the group. The organization was only able to file the request thanks to knowing specifically what to ask for, thanks to internal documents leaked to the world by Edward Snowden.

posted by martyb on Saturday December 27 2014, @01:27PM   Printer-friendly
from the big-problem dept.

Ars Technica recently published a story: Immune cells tweak the body’s metabolism to help control obesity.

Obesity has reached epic proportions in the United States and is rising in other developed and developing countries as they adopt our diet and lifestyle. Autoimmune diseases, like celiac disease and multiple sclerosis, and allergies, also immune-mediated, have blossomed recently, too.

These conditions have exploded within too short a time period to be attributable to genetic changes, so environmental factors, from synthetic pesticides to plastics to antibiotics, have been blamed for their increased prevalence. While it's probably simplistic to search for one cause to explain away both these types of modern ills, some studies are indicating that immune cells and molecules are important for regulating metabolism—and are dysregulated in obesity.

A specific type of immune cell, called Group 2 innate lymphoid cells (ILC2s), were found in white adipose tissue in mice last year. Now, they have been found in the same tissue in humans. Obese people, along with obese mice, have fewer of these cells in their fat than lean individuals do.

These cells respond to an immune signaling molecule called interleukin 33 (IL-33); that same molecule diminishes obesity by increasing caloric expenditure. This increased caloric expenditure is not due to increased physical activity or to burning more calories as more food is consumed. Instead, IL-33 just enhances the number of calories burned by normal physiological processes. Researchers figured all of this out by playing with mice deficient in IL-33 as well as those deficient in ILC2s—feeding them high fat versus regular chow, treating them with injections of IL-33, and comparing them to normal mice.

[Abstract]: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature14115.html

posted by martyb on Saturday December 27 2014, @10:21AM   Printer-friendly
from the didn't-see-that-coming dept.

Phys.org reports that in a new paper accepted by the journal Astroparticle Physics, Robert Ehrlich, a recently retired physicist from George Mason University, claims that the electron neutrino is very likely a tachyon, or faster-than-light particle. Ehrlich's new claim is based on a much more sensitive method than measuring the neutrino's speed, namely finding its mass. The result relies on tachyons having an imaginary mass, or a negative mass squared. Imaginary-mass particles have the weird property that they speed up as they lose energy, the value of their imaginary mass being defined by the rate at which this occurs.

According to Ehrlich, the magnitude of the neutrino's imaginary mass is 0.33 electronvolts, or 2/3 of a millionth that of an electron. He deduces this value by showing that six different observations from cosmic rays, cosmology, and particle physics all yield this same value (PDF) within their margin of error. One check on Ehrlich's claim could come from the experiment known as KATRIN, which should start taking data in 2015. In this experiment the mass of the neutrino could be revealed by looking at the shape of the spectrum in the beta decay of tritium, the heaviest isotope of hydrogen.
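The "speeds up as it loses energy" behavior follows directly from the tachyonic dispersion relation: writing the imaginary mass as m = iμ, the familiar E = mc²/√(1 − v²/c²) becomes E = μc²/√(v²/c² − 1), giving v/c = √(1 + (μc²/E)²), which is always above 1 and grows as E shrinks. A quick sketch with Ehrlich's μc² = 0.33 eV:

```python
def tachyon_speed(energy_eV, mu_c2_eV=0.33):
    """v/c for a tachyon whose imaginary-mass magnitude is mu_c2 (in eV)
    at total energy E: v/c = sqrt(1 + (mu*c^2 / E)^2), always > 1."""
    return (1 + (mu_c2_eV / energy_eV) ** 2) ** 0.5

# The lower the energy, the faster it goes; at high energy, v approaches c
# from above. At E equal to the mass scale itself, v/c = sqrt(2).
low_energy, high_energy = tachyon_speed(0.33), tachyon_speed(1e6)
```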

But be careful. There have been many such claims, the most recent in 2011, when the OPERA experiment measured the speed of neutrinos and claimed they travelled a tiny amount faster than light. When their speed was measured again, the original result was found to be in error, the result of a loose cable no less. "Before you try designing a 'tachyon telephone' to send messages back in time to your earlier self, it might be prudent to see if Ehrlich's claim is corroborated by others."