The Center for American Progress reports
A majority of the justices on the Supreme Court [...] have refused to police partisan gerrymandering, largely because they believe that doing so would be too difficult. As Justice Scalia wrote in his plurality opinion in Vieth v. Jubelirer, "no judicially discernible and manageable standards for adjudicating political gerrymandering claims have emerged."
Scalia's view, however, is questionable. Mathematical models do exist that can measure when a state's electoral map produces results that are wildly out of line with voter preferences. And, in some recent gerrymandering cases, states have even openly stated that they tried to enhance some voters' power at the expense of others. Texas Attorney General and Governor-elect Greg Abbott (R) admitted in a 2013 court filing (PDF) that "In 2011, both houses of the Texas Legislature were controlled by large Republican majorities, and their redistricting decisions were designed to increase the Republican Party's electoral prospects at the expense of the Democrats." At the very least, a court should be able to discern that a partisan gerrymander occurred when the state freely admits as much.
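One concrete example of such a model, offered here purely as an illustration (the article itself does not name a specific measure), is the "efficiency gap" from the political science literature, which compares how many of each party's votes are "wasted" across a state's districts:

W_A = \sum_{d \in \text{lost}_A} v_{A,d} \;+\; \sum_{d \in \text{won}_A} \left( v_{A,d} - \tfrac{1}{2} t_d \right), \qquad \mathrm{EG} = \frac{W_A - W_B}{\sum_{d} t_d}

Here v_{A,d} is party A's vote count in district d, t_d is the total vote in that district, and a party's wasted votes are all of its votes in districts it lost plus its surplus beyond the roughly 50% needed in districts it won. A large efficiency gap that persists across elections is exactly the sort of numerical red flag such models are designed to surface.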
A potentially significant writing contest sponsored by the Washington, DC advocacy group Common Cause seeks to further repudiate the claim that there is no meaningful way for judges to determine when a legislative map was drawn to give one party an advantage over the others. The contest offers a $5,000 top prize to lawyers and scholars who submit papers "creating a new definition for partisan gerrymandering or further developing an existing definition." (PDF) So the contest seeks to show that, if enticed by a cash prize, a community of scholars can discover something that the justices themselves cannot find—or at least that they claim not to have found—a way to prevent lawmakers from choosing their own voters.
The deadline for submissions is Friday, February 27, 2015.
The winners will be announced in May 2015.
In short, evidence revealed by the Sony hack, along with other evidence already on the record, points to Hollywood's continued resistance to reforming its racial perceptions.
Two stories appeared recently, The problem with license trends on Bruce Byfield's blog at Linux Magazine's site, and The fall of the GPL and the rise of permissive licenses on ZDNet, both pointing out some worrying trends in open source licensing.
The first point both articles make is that projects are increasingly cavalier about which open source license they choose. Many seem to pick a license at random, or simply adopt whatever license was in vogue on the platform where they started development. Aaron Williamson, senior staff counsel at the Software Freedom Law Center, discovered that 85.1 percent of GitHub programs had no license at all.
Byfield spends much of his article explaining just how hard it is to obtain any reliable statistics. So many software sites simply fail to mention licenses in their repository directories that one is reduced to searching Google for license names, which often turns up nothing at all. There are few ways to gather statistics other than brute-force downloading or researching project by project, and Byfield's point is that nobody has done this in a believable way.
The trend seems clear: those who do choose a license for new projects are increasingly picking the virtually unrestricted MIT or BSD licenses rather than any version of the GPL. From the ZDNet article:
(Apache/BSD/MIT) ... collectively are employed by 42 percent. They represent, in fact, three of the five most popular licenses in use today. These permissive licenses have been gaining ground at the GPL's expense. The two biggest gainers, the Apache and MIT licenses, were up 27 percent, while GPLv2, Linux's license, has declined by 24 percent.
It could be that those NOT choosing a license are simply tired of the whole argument, realize they will never be in a position to enforce any license anyway, and just cast their code to the wind, trusting to the mantra of "prior art". You would think that the generation that grew up with Groklaw and the SCO wars would actually care about this issue, and that watching Apple take BSD from open source to closed source while hurling sue-balls left and right would have served as a warning.
Or it could be a realization that the restrictions imposed by the GPL and other "copyleft" licenses are, in their own way, almost as burdensome as some commercial licenses. Or maybe the subtle differences among the GPL, GPLv2, GPLv3, LGPL, LGPLv2, LGPLv3, Affero v1, Affero v2 (ad infinitum) are so numerous that even a comparison chart of license features is bewildering to many who just want to cut code.
Are those of us in the software industry just making a mess for the future with all these licenses? Have we thrown up our collective hands in despair? Has the GPL-style copyleft license outlived its usefulness?
Another story from the No-Shit-Sherlock Department.
El Reg is reporting that the NSA is admitting to what we already know:
The files have been heavily censored, but still manage to show that, either by accident or design, NSA staff routinely engaged in illegal surveillance with almost no comeback from management.
As the story points out, the report admitting this malfeasance was published when most people were not looking for news:
Slipping out unpleasant news at awkward times is a well-known PR practice – but the NSA has excelled itself by publishing on Christmas Eve internal reports detailing its unlawful surveillance. The agency dumped the docs online shortly after lunchtime on December 24, when most journalists are either heading home to their families or already drunk.
The report also points out one more reason why we should thank Edward Snowden:
The civil liberties body ACLU sued the NSA for the right to see the reports, and the courts sided with the group. The organization was only able to file the request thanks to knowing specifically what to ask for, thanks to internal documents leaked to the world by Edward Snowden.
Ars Technica recently published a story: Immune cells tweak the body’s metabolism to help control obesity.
Obesity has reached epic proportions in the United States and is rising in other developed and developing countries as they adopt our diet and lifestyle. Autoimmune diseases, like celiac disease and multiple sclerosis, and allergies, also immune-mediated, have blossomed recently, too.
These conditions have exploded within too short of a time period to be attributable to genetic changes, so environmental factors, from synthetic pesticides to plastics to antibiotics, have been blamed for their increased prevalence. While it's probably simplistic to search for one cause to explain away both these types of modern ills, some studies are indicating that immune cells and molecules are important for regulating metabolism—and are dysregulated in obesity.
A specific type of immune cell, the group 2 innate lymphoid cell (ILC2), was found in white adipose tissue in mice last year. Now, these cells have been found in the same tissue in humans. Obese people, along with obese mice, have fewer of these cells in their fat than lean individuals do.
These cells respond to an immune signaling molecule called interleukin 33 (IL-33); that same molecule diminishes obesity by increasing caloric expenditure. This increased caloric expenditure is not due to increased physical activity or to burning more calories as more food is consumed. Instead, IL-33 just enhances the number of calories burned by normal physiological processes. Researchers figured all of this out by playing with mice deficient in IL-33 as well as those deficient in ILC2s—feeding them high fat versus regular chow, treating them with injections of IL-33, and comparing them to normal mice.
[Abstract]: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature14115.html
Phys.org reports that in a new paper accepted by the journal Astroparticle Physics, Robert Ehrlich, a recently retired physicist from George Mason University, claims that the electron neutrino is very likely a tachyon or faster-than-light particle. Ehrlich's new claim of faster-than-light neutrinos is based on a much more sensitive method than measuring their speed, namely finding their mass. The result relies on tachyons having an imaginary mass, or a negative mass squared. Imaginary-mass particles have the weird property that they speed up as they lose energy – the value of their imaginary mass being defined by the rate at which this occurs. According to Ehrlich, the magnitude of the neutrino's imaginary mass is 0.33 electronvolts, or 2/3 of a millionth that of an electron. He deduces this value by showing that six different observations from cosmic rays, cosmology, and particle physics all yield this same value (PDF) within their margin of error. One check on Ehrlich's claim could come from the experiment known as KATRIN, which should start taking data in 2015. In this experiment the mass of the neutrino could be revealed by looking at the shape of the spectrum in the beta decay of tritium, the heaviest isotope of hydrogen.
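For readers wondering why an "imaginary mass" (a negative mass squared) implies faster-than-light travel, and why such a particle speeds up as it loses energy, here is a quick special-relativity sketch (an illustration only, not taken from Ehrlich's paper):

E^2 = p^2 c^2 + m^2 c^4, \qquad m^2 c^4 \approx -(0.33\ \text{eV})^2 \ \text{(Ehrlich's claimed value)}

Writing m = i\mu with \mu real, the energy of a free particle becomes

E = \frac{\mu c^2}{\sqrt{v^2/c^2 - 1}},

which is real only for v > c, diverges as v approaches c from above, and falls to zero as v grows without bound: the less energy the particle has, the faster it moves.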
But be careful. There have been many such claims, the most recent being in 2011, when the OPERA experiment measured the speed of neutrinos and claimed they travelled a tiny amount faster than light. When their speed was measured again, the original result was found to be in error – the result of a loose cable, no less. "Before you try designing a 'tachyon telephone' to send messages back in time to your earlier self, it might be prudent to see if Ehrlich's claim is corroborated by others."
For most city-dwellers, the elevator is an unremarkable machine that inspires none of the passion or interest that Americans afford trains, jets, and even bicycles. But according to Daniel Wilk, the automobile and the elevator have been locked in a "secret war" for over a century, with cars making it possible for people to spread horizontally, encouraging sprawl and suburbia, and elevators pushing them toward life in dense clusters of towering vertical columns.
Elevators first arrived in America during the 1860s, in the lobbies of luxurious hotels, where they served as a plush conveyance that saved the well-heeled traveler the annoyance of climbing stairs. It wasn’t until the 1870s, when elevators showed up in office buildings, that the technology really started to leave a mark on urban culture. Business owners stymied by the lack of available space could look up and see room for growth where there was previously nothing but air—a development that was particularly welcome in New York, where a real estate crunch in Manhattan’s business district had, for a time, forced city leaders to consider moving the entire financial sector uptown. Advances in elevator technology combined with new steel frame construction methods to push the height limits of buildings higher and higher. In the 1890s, the tallest building in the world was the 20-story Masonic Temple in Chicago. By 1913, when hydraulic elevators had been replaced with much speedier and more efficient electrical ones, it was the 55-story Woolworth Building in New York, still one of the one-hundred tallest buildings in the United States as well as one of the twenty tallest buildings in New York City. "If we didn't have elevators," says Patrick Carrajat, the founder of the Elevator Museum in New York, "we would have a megalopolis, one continuous city, stretching from Philadelphia to Boston, because everything would be five or six stories tall."
But the elevator did more than make New York the city of skyscrapers; it changed the way we live. “The elevator played a role in the profound reorganization of the building,” writes Andreas Bernard. That means a shift from single-family houses and businesses to apartments and office buildings. “Suddenly … it was possible to encounter strangers almost anywhere.” The elevator, in other words, made us more social — even if that social interaction often involved muttered small talk and staring at doors. Elevators also reinforced a social hierarchy: while we rode the same elevators, those who rode higher lived above the fray. "It put the 'Upper' into the East Side. It prevented Fifth Avenue from becoming Wall Street," writes Stephen Lynch. "It made 'penthouse' the most important word in real estate."
Android Lollipop will include a feature that techrepublic.com's Jack Wallen believes could be a game changer for certain people. That feature? RAW image support.
When you snap a shot with your Android camera, the internal software compresses the image into a .jpg file. To the untrained, naked eye, that photo usually looks pretty spectacular. The thing is, what you see is what you get: you can't really manipulate that photo at any low level. It's compressed and saved in a lossy format that any bitmap editor (such as GIMP or Photoshop) can read and write, but much of the data the sensor originally captured is gone.
With RAW images, the data from the image sensor has been only minimally processed. Many consider RAW to be the digital equivalent of the old-school negative. RAW images have a wider dynamic range and preserve the closest possible record of what the sensor actually saw.
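To make this concrete, here is a minimal sketch of how an app on Lollipop could save such a frame using the Camera2 API, which can package untouched sensor data as an Adobe DNG file. The class and method names (RawCaptureSketch, saveRawAsDng) are invented for illustration; only CameraCharacteristics, CaptureResult, DngCreator and Image are real framework classes, and the snippet assumes a RAW_SENSOR capture has already completed:

import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.DngCreator;
import android.media.Image;

import java.io.FileOutputStream;
import java.io.IOException;

public final class RawCaptureSketch {
    // Writes a completed RAW_SENSOR capture out as a DNG file.
    // characteristics and result come from the Camera2 capture session;
    // rawImage is the frame delivered by an ImageReader configured for
    // ImageFormat.RAW_SENSOR.
    static void saveRawAsDng(CameraCharacteristics characteristics,
                             CaptureResult result,
                             Image rawImage,
                             String outputPath) throws IOException {
        // DngCreator bundles the raw pixel data with the capture metadata
        // (color matrices, noise model, etc.) into a standard DNG container.
        try (DngCreator dng = new DngCreator(characteristics, result);
             FileOutputStream out = new FileOutputStream(outputPath)) {
            dng.writeImage(out, rawImage);
        } finally {
            rawImage.close();
        }
    }
}

The resulting .dng file can then be opened in a desktop raw developer and adjusted (exposure, white balance, highlight recovery) in ways a baked .jpg simply cannot.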
Artificial Intelligence is no match for natural stupidity:
What will intelligent machines mean for society and the economy 30, 50, or even 100 years from now?
That's the question that Stanford University scientists are hoping to take on with a new project, the One Hundred Year Study on Artificial Intelligence (AI100).
The university is inviting artificial intelligence researchers, roboticists and other scientists to begin what they hope will be a long term — 100 years long — effort to study and anticipate the effects of advancing artificial intelligence (AI) technology. Scientists want to consider how machines that perceive, learn and reason will affect the way people live, work and communicate.
http://www.computerworld.com/article/2862983/stanford-launches-100-year-study-of-artificial-intelligence.html
[Source]: http://news.stanford.edu/news/2014/december/ai-century-study-121614.html
Building a moderately complex Web page requires understanding a whole stack of technologies, from HTML to JavaScript. Now a researcher from the Massachusetts Institute of Technology (MIT) has wrapped these technologies into a single language that could streamline development, speed up performance and better secure Web sites.
The language, called Ur/Web, provides a way for developers to write pages as self-contained programs. It incorporates many of the most widely used Web technologies, freeing the developer from working with each language individually.
http://www.computerworld.com/article/2863069/application-development/mit-unifies-web-dev-into-a-single-speedy-new-language-urweb.html
[Related]: http://www.impredicative.com/ur/
[Source]: http://newsoffice.mit.edu/2014/new-programming-language-coordinates-web-page-components-1223
The Independent reports:
A British firm could be set to net billions of pounds after making a major breakthrough in cybersecurity. Scientists at Scentrics, working with University College London, say they can guarantee total privacy for emails and text messages. It also means that for the first time laptop and smartphone users will be able to connect to wifi hotspots on the move without worrying about hackers. Only the security services would be able to gain access to the messages, if they needed to. The Scentrics application can be embedded into a mobile handset or computer device, enabling the user to obtain "one-click privacy" at the press of a button. Or it can be downloaded as an app, so the sender can pay a small fee for security every time, for instance, they send an image of family or friends over the internet.
The patent assignee modestly states:
"In terms of British Intellectual Property [IP], it is only dwarfed by the invention of the world wide web itself," said Mr Chandrasekaran. "The internet was born without this in its DNA and we've done it." He explained: "What we've done is to patent the IP for a standards-based, fully automatic, cryptographic key management and distribution protocol for UMTS and TCP/IP." In layman's terms, the company and UCL have found a way of defeating what cryptologists call "the man-in-the-middle attack" or MITM - the ability of someone to hack and intercept an electronic message.
The venture comes from a heavy-hitting institution and the people involved seem to be well connected, but the scheme only works by having secure access to a public key infrastructure. Unfortunately, as I previously noted when the last one-step crypto system flamed out (but before the next five went nowhere):
any one-step, hermetically-sealed, silver-bullet solution is poor technology and, in the case of security, is actively dangerous. Although it should never be necessary to pull something to pieces or understand its innards, technology is far from waving a magic wand and having something work 100% of the time. Technology is built upon tiers of leaky abstractions. Therefore, *when* it fails, it needs to be divisible so that debugging can proceed. Ideally, technology should be a binary tree of components, so that faults can be isolated the way a failed string of Christmas tree lights can be fixed.
Even when packaged and idiot-proofed, anything significant needs to be a multi-step process for end users: for example, install the application, install the certificates, test the certificates. Anything less will have a horrendous corner case which will be awkward to detect, diagnose or correct. And in the case of security, these corner cases foreseeably threaten liberty.
Full disclosure: I may or may not be connected to one of the parties mentioned in a previous article.
Kitsap Sun reports at Military.com that the USS Ranger, a 1,050-foot-long, 56,000-ton Forrestal-class aircraft carrier, is being towed from the inactive ship maintenance facility at Puget Sound for a 3,400-mile, around-Cape Horn voyage to a Texas dismantler who acquired the Vietnam-era warship for a penny for scrap metal. “Under the contract, the company will be paid $0.01. The price reflects the net price proposed by International Shipbreaking, which considered the estimated proceeds from the sale of the scrap metal to be generated from dismantling,” said officials for NAVSEA. “[One cent] is the lowest price the Navy could possibly have paid the contractor for towing and dismantling the ship.”
The Ranger was commissioned Aug. 10, 1957, at Norfolk Naval Shipyard and decommissioned July 10, 1993, after more than 35 years of service. It was stricken from the Naval Vessel Register on March 8, 2004, and redesignated for donation. After eight years on donation hold, the USS Ranger Foundation was unable to raise the funds to convert the ship into a museum or to overcome the physical obstacles of transporting the ship up the Columbia River to Fairview, Oregon. As a result, the Ranger was removed from the list of ships available for donation and designated for dismantling. The Navy, which can't retain inactive ships indefinitely, can't donate a vessel unless the application fully meets the Navy's minimum requirements. The Ranger had been in pristine condition, but for a week in August, volunteers from other naval museums were allowed to remove items to improve their own ships. The Ranger was in a slew of movies and television shows, including "The Six Million Dollar Man," "Flight of the Intruder" and "Star Trek IV: The Voyage Home," where it stood in for the carrier USS Enterprise. But the Ranger’s most famous role was in the 1980s Tom Cruise hit "Top Gun." “We would have liked to have seen it become a museum, but it just wasn’t in the cards,” said Navy spokesman Chris Johnson. “But unfortunately, it is a difficult proposition to raise funds. The group that was going to collect donations had a $35 million budget plan but was only able to raise $100,000.”
Europe's top court has ruled that obese people can be considered disabled, meaning that they can be covered by an EU law barring discrimination at work.
The decision on Thursday followed a question from a Danish court, which was reviewing a complaint of unfair dismissal brought by Karsten Kaltoft, a child-minder, against a Danish local authority.
The Court of Justice of the European Union was asked to rule on whether EU law forbids discrimination on the grounds of obesity or whether obesity could be considered a disability. The Luxembourg-based court ruled that EU employment law did not specifically prohibit discrimination on the grounds of obesity, and that the law should not be extended to cover it. However, the court said that if an employee's obesity hindered "full and effective participation of that person in professional life on an equal basis with other workers" then it could be considered a disability.
(According to statistics from the World Health Organisation, based on 2008 estimates, roughly 23 percent of European women and 20 percent of European men were obese.)
H2O is an optimized HTTP server with support for HTTP/1.x and the upcoming HTTP/2; it can be used either as a standalone server or a library. With this first release, H2O concentrates on serving static files / working as a reverse proxy at high performance.
Built around PicoHTTPParser (a very efficient HTTP/1 parser), H2O outperforms Nginx by a considerable margin. It also excels in HTTP/2 performance.
Together with the contributors [the author] will continue to optimize / add more features to the server, and hopefully reach a stable release (version 1.0.0) when HTTP/2 becomes standardized in the coming months.
(TFA of course explains the second part of the headline - why performance will matter)
Nicholas St. Fleur reports at The Atlantic that, according to researchers, our convenient, sedentary way of life is making our bones weak, foretelling a future of increasing fractures, breaks, and osteoporosis. For thousands of years, hunter-gatherers trekked on strenuous ventures for food, their dense skeletons supporting their movements; a new study pinpoints the origin of weaker bones at the beginning of the Holocene epoch, roughly 12,000 years ago, when humans began adopting agriculture. “Modern human skeletons have shifted quite recently towards lighter—more fragile, if you like—bodies. It started when we adopted agriculture. Our diets changed. Our levels of activity changed,” says Habiba Chirchir. A second study attributes joint bone weakness to different levels of physical activity in ancient human societies, also related to hunting versus farming.
The team scanned circular cross-sections of seven bones in the upper and lower limb joints in chimpanzees, Bornean orangutans and baboons. They also scanned the same bones in modern and early modern humans as well as Neanderthals, Paranthropus robustus, Australopithecus africanus and other Australopithecines. They then measured the amount of white bone in the scans against the total area to find the trabecular bone density. Crunching the numbers confirmed their visual suspicions. Modern humans had 50 to 75 percent less dense trabecular bone than chimpanzees, and some hominins had bones that were twice as dense compared to those in modern humans. Both studies have implications for modern human health and the importance of physical activity to bone strength. “The lightly-built skeleton of modern humans has a direct and important impact on bone strength and stiffness,” says Tim Ryan. That's because lightness can translate to weakness—more broken bones and a higher incidence of osteoporosis and age-related bone loss. The researchers warn that with the desk-bound lives that many people lead today, our bones may have become even more brittle than ever before. “We are not challenging our bones with enough loading," says Colin Shaw, "predisposing us to have weaker bones so that, as we age, situations arise where bones are breaking when, previously, they would not have."