SoylentNews is people




posted by CoolHand on Tuesday November 17 2015, @11:54PM   Printer-friendly
from the just-news dept.

Microsoft's plan to run Android applications on Windows phones and tablets, known as "Project Astoria", may be indefinitely shelved, or at least delayed:

Microsoft has sidelined its plan to allow Windows 10 devices to run Android apps before it could do any serious damage, according to a report. Daniel Rubino at the Windows Central blog gathered some convincing evidence that Microsoft's Project Astoria has been wound down, while the runtime allowing the Android-on-Win10 magic to work has disappeared. Microsoft declined to elaborate on its fate, but stressed that "other tools offer great options for developers".

The plan to bridge the "app gap" by allowing Android binaries to run on Windows 10 mobile devices was famously, and not unjustifiably, described as a "suicide note" by Microsoft watcher Paul Thurrott when its existence was widely discussed back in April. The fear was that the existence of an Android runtime on Windows 10 phones and tablets would remove the incentive for developers to create native Windows applications. Windows would become a device driver layer and as a consequence, Microsoft's best chance to lure users into its d̶a̶t̶a̶-̶s̶l̶u̶r̶p̶i̶n̶g̶ cloud consumer services such as Cortana would disappear.

[...] In September, the Astoria forums went silent. Microsoft no longer briefs developers about it and the runtime has been removed from the latest builds of Windows 10 Mobile. Rubino suggests that it was labour intensive, with as many as 80 developers involved, which can't have helped. But it's only part of the picture. If you've followed the travails of the BlackBerry 10 (BB10) operating system, it's a vivid demonstration of Thurrott's "suicide prediction". The essential dilemma is this: the better you make your runtime, the less incentive there is to create native applications.

Project Islandwood, a similar Microsoft effort for iOS that requires Objective-C apps to be recompiled, appears intact. The Astoria team reportedly had 60-80 Microsoft devs working on it, compared to just 5 for Islandwood. The recompilation rather than emulation approach would also make piracy more difficult. Projects Westminster and Centennial, for porting "web apps" and legacy Win32 desktop applications respectively, also remain on track.


Original Submission

posted by CoolHand on Tuesday November 17 2015, @10:22PM   Printer-friendly
from the showing-you-the-light dept.

Researchers have designed a nanoscale device that, under ideal conditions, can confine a "bit" of light (that is, light with a single precise energy value) for an infinite amount of time. Although a physically realized device would inevitably lose some of the trapped light due to material imperfections, the researchers expect that it should be possible to completely compensate for this loss by incorporating some form of optical gain like that used in lasers, so that in principle the lifetime can be infinitely large even in a real device.
...
But now the scientists may have found a way to keep light in.
...
To overcome light's penchant for escaping, Lannebère and Silveirinha utilized an idea proposed by John von Neumann and Eugene Wigner in 1929, and later extended by others, which has led to the discovery that transparent structures with tailored geometries can perfectly confine light by scattering it in a very specific way.

Lannebère and Silveirinha showed that this strategy for confining light can be achieved by shining light on a spherical "meta-atom," so-named because it allows light to have only a specific quantized energy value (creating a light "bit"), similar to how an atom allows electrons to occupy only certain quantized energy levels.


Original Submission

posted by martyb on Tuesday November 17 2015, @08:52PM   Printer-friendly
from the karma dept.

The Prenda Law porn copyright troll saga continues.

John Steele and Paul Hansmeier formed a law firm which concentrated on copyright matters, which is to say, they sued John Does and sometimes individuals for allegedly downloading or sharing copyrighted pornographic videos. Steele Hansmeier became Prenda Law, which was succeeded by Hansmeier's Alpha Law Firm. More recently, Paul Hansmeier's law firm Class Justice has been suing small businesses for allegedly illegally discriminating against disabled people.

Now the Minnesota Lawyers Professional Responsibility Board has petitioned the Minnesota Supreme Court to disbar or suspend Hansmeier for his antics. The 43-page petition (PDF) is an eye-opener about Prenda's tactics, and the growing impatience of judges with them.

Previous related stories:
Prenda-Linked Copyright Trolling Lawyer Paul Duffy Dead at Age 55
Porn Studio Asks Judge to Ban Talk About "Copyleft" Blogs at Trial
Appeals Court: Shell Game Over, Prenda Law Must Pay Sanctions in Full

This story is also being covered by Techdirt.


Original Submission

posted by martyb on Tuesday November 17 2015, @07:26PM   Printer-friendly
from the moah-more dept.

I was a first-year graduate student at UC Berkeley in 1978. I had been an undergraduate at MIT, and had used the ITS timesharing systems there, which ran on PDP-10's. ITS put a "--MORE--" at the bottom of the screen when one typed out files; you pressed the space bar to continue.

At Berkeley, we'd just gotten our first VAX UNIX system, though there were already PDP-11 UNIX systems. There was a very simple program through which one could pipe stdout to do screen-at-a-time output. It rang the terminal bell after printing 24 lines, and waited for a carriage return. It was called cr3. My guess is that in some version of UNIX, someone had hacked a page-at-a-time output mode into the tty output drivers. Using stty, one could already say cr0, cr1, and cr2, which added different amounts of delay when printing a carriage return, for the benefit of slow printing terminals. cr3 was probably unused, and the page-at-a-time mode was piggybacked on it. But our version of UNIX didn't have this cr3 stty mode; instead we had the cr3 program that provided equivalent functionality.

Many of the terminals at Berkeley were Lear-Siegler ADM-3 and ADM-3A "dumb" terminals. Both models (or maybe just the ADM-3's) rang the terminal bell when the cursor advanced to near the right margin, as a typewriter bell would. Unfortunately, they rang the bell on output as well as keyboard input, which made for incessant beeping. It was particularly maddening in a room full of terminals. So most of the bell speakers had been disconnected. Since cr3 rang the terminal bell to indicate that a full page had been output, you couldn't tell when it was waiting for input on those muted terminals. The problem was exacerbated by the slow response time of the overloaded UNIX systems.

So I wrote a simple cr3-like program, but had it print "--More--" instead of ringing the bell. I had it accept space instead of carriage return to continue, because that was what I was used to from ITS. I also made it take multiple filenames, and had it print lines of colons ("::::::::::::") before and after it printed each filename.
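The behavior described above (a screenful at a time, a "--More--" prompt instead of a bell, and lines of colons around each filename) can be sketched in a few lines of Python. This is a hypothetical reimplementation for illustration, not the original Berkeley code:

```python
# Minimal sketch of a cr3/"more"-style pager, as described in the story.
import sys

PAGE = 24  # lines per screenful, matching a 24-line terminal


def page_file(name, lines, out=sys.stdout, wait=lambda: None):
    """Print a file's lines a screenful at a time with '--More--' prompts.

    Prints a colon banner with the filename first, then pauses (via wait,
    which in the real program blocks until the user presses space) after
    every PAGE lines of output.
    """
    out.write("::::::::::::\n%s\n::::::::::::\n" % name)
    for i, line in enumerate(lines, 1):
        out.write(line + "\n")
        if i % PAGE == 0:
            out.write("--More--")
            wait()
```

The key design point is the same one the author makes: the prompt is visible text rather than an audible bell, so it works even on a terminal with the speaker disconnected.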

An old article, but it's occasionally good to touch on the foundations of what we take for granted.


Original Submission

posted by martyb on Tuesday November 17 2015, @05:52PM   Printer-friendly
from the approaching-ludicrous-fast dept.

The 46th edition of the TOP500 list of supercomputers has been released. While the familiar Tianhe-2 continues to lead the list with a performance of 33.86 petaflops, China has nearly tripled its representation within the top 500 systems, to 109 supercomputers today from just 37 in June. The United States has 200 systems on the list, down from 231 in June and the country's lowest share since the list was first published in 1993.

There are two new entrants within the top 10 systems. The U.S. Department of Energy's unfinished Trinity supercomputer debuts at #6 with a LINPACK of 8.1 petaflops. Trinity's performance is expected to grow to around 42.2 peak petaflops once Intel's Knights Landing Xeon Phi coprocessors are added in 2016. University of Stuttgart's High Performance Computing Center Stuttgart (HLRS) has doubled the performance of Hazel-Hen. It is now a 5.6 petaflops system that reaches #8 on the list and is Germany's most powerful supercomputer, edging out the 5 petaflops JUQUEEN ranked at #11. Trinity and Hazel-Hen are both Cray XC systems, reflecting a recent resurgence in Cray Inc.'s representation on the list (now with a 24.9% share of total installed performance).


Here are some more trends from the current list:

  • Today's #500 system has a performance of 204.3 teraflops (164 teraflops in June).
  • Total combined performance of all 500 systems has reached 420 petaflops (361 petaflops in June, 309 petaflops one year ago).
  • 80 systems on the list have reached 1 petaflops or greater performance (67 systems in June).
  • 104 systems use accelerators or coprocessors, such as NVIDIA GPUs and Xeon Phi "manycore" chips (90 in June).
  • Chinese vendor Sugon now beats IBM in system share, 49-45, although 14 systems not counted are labeled either IBM/Lenovo or Lenovo/IBM. Sugon had just 5 systems on the June 2015 list. Many of the Sugon systems appear to be from commercial telecom and Internet customers, and are ranked closer to the bottom of the list.

TOP500 also published interviews this week with TOP500 "co-authors" Horst Simon and Jack Dongarra. The Next Platform (formerly The Platform) has coverage and analysis of the list.


Original Submission

posted by martyb on Tuesday November 17 2015, @04:29PM   Printer-friendly
from the Judges-have-DNA,-too dept.

The Electronic Frontier Foundation has filed an amicus brief challenging DNA collection from people arrested in California:

Californians who've merely been arrested and not charged, much less convicted of a crime, have a right to privacy when it comes to their genetic material, EFF said in an amicus brief filed Nov. 13 with the state's highest court.

EFF is urging the California Supreme Court to hold that the state's arrestee DNA collection law violates privacy and search and seizure protections guaranteed under the California constitution. The law allows police to collect DNA from anyone arrested on suspicion of a felony—without a warrant or any finding by a judge that there was sufficient cause for the arrest. The state stores arrestees' DNA samples indefinitely, and allows access to DNA profiles by local, state, and federal law enforcement agencies.

EFF is weighing in on People v. Buza, a case involving a San Francisco man who challenged his conviction for refusing to provide a DNA sample after he was arrested. EFF argues that the state should not be allowed to collect DNA from arrestees because our DNA contains our entire genetic makeup—private and personal information that maps who we are, where we come from, and who we are related to. Arrestees, many of whom will never be charged with or convicted of a crime, have a right to keep this information out of the state's hands.

"Nearly a third of those arrested for suspected felonies in California are later found to be innocent in the eyes of the law. Hundreds of thousands of Californians who were once in custody but never charged still have their DNA stored in law enforcement databases, subject to continuous searches," said EFF Senior Staff Attorney Jennifer Lynch. "This not only violates the privacy of those arrested, it could impact their family members who may someday be identified through familial searches. The court must recognize that warrantless and suspicionless DNA collection from arrestees puts us on a path towards a future where anyone's DNA can be gathered, searched, and used for surveillance."


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @02:54PM   Printer-friendly
from the scary-thoughts dept.

National Security experts Steven Simon and Daniel Benjamin write in The New York Times that in the aftermath of the terror attacks in Paris, most Americans probably feel despair, and a presentiment that it is only a matter of time before something similar happens here. "But such anxiety is unwarranted. In fact, it's a mistake to assume that America's security from terrorism at home is comparable to Europe's. For many reasons, the United States is a significantly safer place. While vigilance remains essential, no one should panic."

According to Simon and Benjamin, the slaughter in France depended on four things: easy access to Paris, European citizens happy to massacre their compatriots, a Euro-jihadist infrastructure to supply weapons, and security agencies that lacked the resources to monitor the individuals involved. These are problems the United States does not have, at least not nearly to the degree that Europe does. For example, Europe's porous external borders, combined with free movement inside most of the European Union, make life simple for criminals.

But the United States doesn't have this problem. Pretty much anyone coming to the United States from Middle Eastern war zones or the radical underground of Europe would need to come by plane, and, since 9/11, we have made it tough for such people to fly to the United States. The United States has another advantage: an intelligence, law enforcement, and border-control apparatus that has been vastly improved since the cataclysm of 9/11. Post-9/11 visa requirements and no-fly lists weed out most bad actors, and both the Bush and Obama administrations demanded that countries in our visa waiver program provide data on extremists through information-sharing pacts called HSPD-6 agreements.

"None of this should lead American authorities, or the American people, to settle into a false sense of security," conclude Simon and Benjamin adding that "what the Paris attacks show is that the world needs America's intelligence and security resources even more than its military might."


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @01:12PM   Printer-friendly
from the 599-more-to-buy-a-toilet-seat dept.

A neat little technical division of the GSA, known as 18F, which is modeled on startup culture and bringing a much more innovative take on technology to the government (it's the same group that's going around making the rest of the federal government encrypt their websites...), recently ran an experiment which (somewhat unexpectedly to all involved) resulted in the GSA awarding a $1 contract for a bit of open source software. And, yes, that's ONE DOLLAR.

A few weeks back, 18F announced this experiment in "micro-purchase" contracts, with the idea being to see if they could create a quick and simple process to both (1) do small focused contracts and (2) make it easy for smaller tech firms to actually provide their products and services to the government. So 18F posted the details of a specific problem it was trying to solve to Github, and then created a Google form, to serve as a sort of blind reverse auction. Here's how 18F described things:

If you're interested in bidding, the closing time for the bid is 12 p.m. on Thursday, October 29, 2015. The opening bid starts at $3,499, and the lowest bid at the closing time will have 10 working days to ship the code necessary to satisfy the criteria. If the criteria are met, the vendor gets paid. If the criteria aren't met, the vendor shall not receive payment, and the next lowest bidder will have the opportunity.
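The selection process 18F describes is a blind reverse auction with a fallback: the lowest bidder wins, and if that bidder fails to deliver working code, the next-lowest gets a turn. A sketch of that logic (the vendor names and bid amounts here are illustrative, not the actual bids):

```python
# Sketch of 18F's reverse-auction award process: ascend through bids
# from lowest to highest, awarding the first vendor who delivers
# code that meets the acceptance criteria.

def award(bids, delivered):
    """bids: list of (vendor, amount) pairs.
    delivered: callable vendor -> bool, True if the vendor shipped
    working code within the window.
    Returns the winning (vendor, amount), or None if nobody delivered."""
    for vendor, amount in sorted(bids, key=lambda b: b[1]):
        if delivered(vendor):
            return vendor, amount
    return None


# Illustrative: vendor "C" bids $1 and actually delivers.
bids = [("A", 3499), ("B", 250), ("C", 1)]
winner = award(bids, delivered=lambda v: v == "C")
```

If the $1 bidder had failed to ship, the same call with a different `delivered` predicate would fall through to the $250 bid, which is exactly the fallback the announcement describes.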

[...] No one expected someone to (a) bid $1 and (b) then deliver working code that not only met the requirements, but exceeded them. But that's what happened.

In some respects, this result was the best possible outcome for the experiment. It proved that some of our core assumptions about how it would work were wrong. But the experiment also validated the core concept that open-source micro-purchasing can work, and it's a thing we should try to do again. A few weeks ago, micro-purchasing for code was just an idea, but now that we've done our first experiment, the data demonstrate that the idea has potential and can be improved upon.

You can see the "winning" $1 pull request by Brendan Sudol over at Github, which went above and beyond the requirements:

Not only did Brendan Sudol meet the requirements of loading the data, the new code had 100 percent test coverage, an A grade from Code Climate, and included some new functionality to boot.


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @11:35AM   Printer-friendly
from the will-they-taste-like-chicken dept.

A recently published study from two Iowa State University scientists shows that a gene found only in a single plant species can increase protein content when introduced into staple crops.

The research has implications for a wide array of crops, especially for staples grown in the developing world, where sufficient sources of protein are sometimes limited.

"We've found that introducing this gene to plants such as corn, rice and soybean increases protein without affecting yields," said Ling Li, an adjunct assistant professor of genetics, development and cell biology.

Li has worked for years with Eve Syrkin Wurtele, a professor of genetics, development and cell biology, on a gene they discovered in 2004 that appears only in Arabidopsis, a small flowering plant. Their studies of this gene, called QQS, have yielded several publications in peer-reviewed academic journals, a U.S. patent and multiple pending patents.

Li and Wurtele refer to QQS as an "orphan gene" because it's not present in the genome of any other organism.

The gene regulates the protein content in Arabidopsis seeds and leaves, so Li and Wurtele wondered what would happen if they used transgenic technology to introduce the gene to other plants. Could it lead to increased protein in plants that humans commonly eat?

Especially significant news for vegans and vegetarians.


Original Submission

posted by NCommander on Tuesday November 17 2015, @10:32AM   Printer-friendly
from the prevention-goes-a-long-way dept.

So, every once in a while, if things have been quiet, I like to pop a post seeing what the general feelings of the community are towards the site, seeing what we could do better, etc. I think it's been a few months since the last time I posted in Meta, so I think this is a good time to get a pulse on the community, and provide a venue to get any feedback (good or bad).

On the staff side, I know things have been relatively quiet over the summer due to most of us being busy with life and such, which is why we haven't made a major site upgrade since the original rehash upgrade at the end of May. For myself, I've been working at a new job, and dealing with a fair bit of non-Soylent related things to the point that I really only have the time or energy to post and occasionally check in. I'd like to thank the rest of the staff, and mrcoolbp in particular, for holding down the fort for me.

Anyway, in contrast to my usual posts, I'm going to cut this off here, and will be looking forward to your comments below.

73 from NCommander

posted by cmn32480 on Tuesday November 17 2015, @09:54AM   Printer-friendly
from the they-do-it-to-piss-us-off dept.

New research suggests that cats possess the genes that protect vegetarian animals from ingesting poisonous plants by giving them the ability to taste bitter. Animals use their sense of taste to detect whether a potential food is nutritious or harmful. A sweet taste signals the presence of sugars, an important source of energy. A bitter taste, on the other hand, evolved as a defence mechanism against harmful toxins commonly found in plants and unripe fruits.

Evolution has repeatedly tweaked animals' taste buds to suit various dietary needs. Changes in an animal's diet can eliminate the need to sense certain chemicals in food, and so receptor genes mutate, destroying their ability to make a working protein.

One example of this comes from strictly meat-eating cats, who can no longer taste sweetness. But if bitter detection evolved to warn of plant toxins, then it stands to reason that cats, which (usually) eschew plants, shouldn't be able to taste bitter either. Humans and other vegetable-munching animals can taste bitter because we possess bitter taste receptor genes. If cats have lost the ability to taste bitterness, we should find that their receptor genes are riddled with mutations.


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @08:12AM   Printer-friendly
from the time-to-jump-to-the-next-big-thing dept.

Facebook is seeing a huge surge in the amount of data being requested by governments.

The site has released its new information on government information requests, showing a huge rise in the amount of data that governments are trying to get hold of about Facebook's users.

The number of posts being taken down for contravening local laws is also surging, to almost double the number of posts censored last year.

Facebook said that it had received 35,051 requests from governments for access to account data. That was up 18 per cent, across all countries.

http://www.independent.co.uk/life-style/gadgets-and-tech/news/facebook-data-requests-from-governments-surge-site-says-a6732011.html


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @06:28AM   Printer-friendly
from the buy-low-sell-high dept.

The Guardian, a proponent of fossil fuel divestment, reports on an analysis by Corporate Knights:

The Bill and Melinda Gates Foundation would have had $1.9bn (£1.3bn) more to spend on its lifesaving health projects if it had divested from fossil fuels and instead invested in greener companies, according to a new analysis.

The Canadian research company Corporate Knights examined the stock holdings of 14 funds, worth a combined $1tn, and calculated how they would have performed if they had dumped shares in oil, coal and gas companies three years ago.

Overall, the funds would have been $23bn better off with fossil fuel divestment. The Wellcome Trust, which is the world's biggest health charity after the Gates Foundation, would have been $353m better off. The huge Dutch pension fund ABP would have had $9bn in higher returns, while Canada's CPP would have had $7bn more.


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @03:51AM   Printer-friendly
from the the-world-is-grateful dept.

Dr Stewart Adams knew he had found a potential new painkiller when it cured his hangover ahead of an important speech.

"I was first up to speak and I had a bit of a headache after a night out with friends. So I took a 600mg dose, just to be sure, and I found it was very effective."

Now 92, Dr Adams remembers the years of research, the endless testing of compounds and the many disappointments before he and his research team pinpointed ibuprofen as a drug with potential more than 50 years ago.


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @02:16AM   Printer-friendly
from the cost-of-living-differential dept.

From Ryan McMaken at Mises.org:

Given the importance of the cost of living, it is very problematic that the official poverty rate totals for US states do not take costs into account.

When measuring poverty rates internationally, poverty is just defined as households that make 50 percent or 60 percent of the national median income. [...] It simply makes poverty a purely relative measure, so we end up with a situation where purchasing power for a median household in one country (say, Portugal) is actually lower than a poverty-level household in another country (say, the US).

The US official measure, on the other hand, attempts to get around this problem by defining the poverty rate as an actual dollar amount based on what a household can buy.

[...] The problem is this dollar amount is applied nationwide and then used to calculate poverty rates.

[...] Many have noticed certain regional trends here, and that has led to a myriad of articles claiming that so-called "red states" have higher poverty rates than the "blue states." In many cases, "red states" is really code for "low tax" or "free-market-ish" state. In other words, this map "proves" that low taxes and freer economies cause more poverty.

[...] If we adjust the states and poverty rates for the cost of living, however, the map looks a bit different [...]

In this case, the state with the highest poverty rate is California at 23 percent. Arizona and Florida are close behind with rates of 22 percent and 20 percent, respectively. New York has risen to sixth place with a poverty rate of 18 percent, while Mississippi has fallen to eighth place with a rate of 17 percent.

Here we see our bias-confirming assumptions no longer seem to apply since no correlation is apparent along the lines of the red-state/blue-state claims.

[...] Obviously, we have to look somewhere beyond our neat-and-nice ideas about red states and blue states to come up with an explanation.
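The adjustment McMaken describes is simple in principle: scale the national poverty threshold by a regional price index, so a dollar threshold buys the same basket everywhere. A toy sketch, with made-up price indices (the real analysis uses published regional price parities):

```python
# Sketch of a cost-of-living adjustment to a national poverty threshold.
# The threshold and price indices below are hypothetical, for illustration.

NATIONAL_THRESHOLD = 24_000  # hypothetical poverty line in dollars

# Regional price index, where 1.0 = the national average price level.
price_parity = {
    "California": 1.14,   # more expensive than average
    "Mississippi": 0.86,  # cheaper than average
}


def adjusted_threshold(state):
    """Raise or lower the poverty line in proportion to local prices."""
    return NATIONAL_THRESHOLD * price_parity[state]
```

With numbers like these, a household counted as poor in California may be well above the line by national standards, and vice versa in Mississippi, which is why the adjusted map reshuffles the state rankings.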


Original Submission

posted by cmn32480 on Tuesday November 17 2015, @12:32AM   Printer-friendly
from the proprietary dept.

Editorial Projects in Education reports

To promote wider use of open educational resources by states and schools, the U.S. Department of Education proposed [October 29] a new regulation that would require any new intellectual property developed with grant funds from the department to be openly licensed.

That would make such materials available for free use, revision, and sharing by anyone. It would also represent a big, federally supported step away from the textbook publishing industry, long a backbone of K-12 education in the U.S.

[...] The announcement is just one part of the department's new #GoOpen campaign. At an Open Education Symposium being hosted [October 29] in Washington by the department and the White House Office of Science and Technology Policy, school districts, and companies pledged to support the new drive for [Open Educational Resources] (OER).

A group of 10 districts in California, Delaware, Kansas, Missouri, Ohio, and Wisconsin, as well as Department of Defense schools, are pledging to replace at least one textbook with openly licensed educational resources within the next year. So-called "Ambassador Districts" that already use OER--including Virginia's Chesterfield County schools and Pennsylvania's Upper Perkiomen schools--also committed to help other districts make similar moves.

[...] The Association of American Publishers, and the software industry association that represents education technology companies, responded to the announcement with reservations.

[...] The department's efforts are just the latest step in a growing trend toward open educational content. Efforts in the U.S. Senate to overhaul the Elementary and Secondary Education Act have included language that would encourage schools to use OER, and adaptive-learning company Knewton recently launched a new platform to bring its technology to the open-content marketplace. States such as New York have robust existing initiatives to develop and share open content, and last spring, California-based nonprofit the Learning Accelerator announced contracts with 10 companies to develop open materials for 12 states.

The Alexandria (Virginia) News has more names and more specifics.


Original Submission