



posted by janrinok on Tuesday September 08 2015, @11:08PM   Printer-friendly
from the when-the-system-doesn't-work dept.

Iron Speed, a firm which provided a rapid application development tool for creating .NET apps, is shuttering itself thanks to "litigation with a patent troll", according to a letter sent to customers by co-founder and chairman Alan Fisher.

The Iron Speed designer enabled developers to create applications for web, cloud and mobile using a point-and-click interface. Customers include AT&T, Cisco, DHL, Disney, HP and the US Army, according to the company's website. Yet all this is no more, writes Fisher:

There are several reasons for this, one of which has been the ongoing expense of litigation with a patent troll who has challenged our intellectual property. While we feel this is baseless, patent litigation is generally a multi-million dollar exercise. This has put a drain on our resources we can no longer afford, and coupled with excessive cracked key use and license sharing, our product sales have been severely impaired.

We will continue offering Technical Support through December 31 2015, but it is unlikely that there will be future software releases.

Because we are unable to issue any refunds, any customer with current software update or technical support subscriptions has been issued an additional perpetual license in his account.

A thread on the Iron Speed forums confirms the situation and provides more details.

The patent issue seems related to the way the Iron Speed designer generates applications automatically based on a database schema, removing much of the gruntwork in building applications that are essentially forms over data.
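
The general technique is easy to sketch: introspect the database schema, then emit UI code for each column. A minimal, hypothetical illustration in Python (the type-mapping rules and HTML shape are invented for this sketch, not Iron Speed's actual output):

```python
import sqlite3

def generate_form(conn: sqlite3.Connection, table: str) -> str:
    """Emit a bare-bones HTML form with one input per column of `table`.

    A toy sketch of schema-driven code generation; real tools like
    Iron Speed also generated validation, data binding, and full CRUD pages.
    """
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    fields = []
    for _cid, name, sqltype, *_rest in cols:
        # Crude SQL-to-HTML type mapping; everything non-numeric becomes text.
        input_type = "number" if sqltype.upper() in ("INTEGER", "REAL") else "text"
        fields.append(f'  <label>{name} <input name="{name}" type="{input_type}"></label>')
    return f'<form action="/{table}" method="post">\n' + "\n".join(fields) + "\n</form>"

# Build a throwaway schema and generate a form from it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")
print(generate_form(conn, "customer"))
```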

Microsoft has its own tool which does this, called LightSwitch, but this has not been updated much in the latest edition of Visual Studio, causing developers to doubt its future. Another issue with LightSwitch is its reliance on the deprecated Silverlight for desktop applications, though it can also generate HTML and JavaScript.


Original Submission

posted by martyb on Tuesday September 08 2015, @09:25PM   Printer-friendly
from the it's-more-or-less-that-less-is-more-or-more-is-less dept.

If you walk along South Beach in Miami right now, you will notice something strange, even by Florida standards: Dotting the sandscapes are sky-blue boxes that supply free sunscreen. In a novel experiment this year, the City of Miami Beach has put 50 free sunscreen dispensers in public spaces, and those dispensers are full of radiation-mitigating goo, free to any and all passersby. BBC reports that one in five people living in Florida will eventually suffer from skin cancer but the new campaign hopes that increasing people's awareness will lead to a change in behavior. "[The sunscreen dispensers'] visibility - even without additional messaging - could be a good cue to action," says Dr Richard De Visser, a psychologist who has researched health campaigns.

The sunscreen is the type that is effective at preventing cancer and premature skin aging: broad-spectrum, water-resistant, and SPF 30. You can buy a product labeled higher than SPF 30, but it's almost always a waste, and potentially harmful. Above SPF 30, the difference is essentially meaningless: SPF 15 filters out about 93 percent of UV-B rays, SPF 30 filters out 97 percent, SPF 50 filters out 98 percent, and SPF 100 might get you to 99. The problem, though, is the psychology of the larger number. "We put on the 'more powerful' sunscreens and then suddenly think we're Batman or some other superhero who can stay out in the sun indefinitely," says James Hamblin. "But no sunscreen is meant to facilitate prolonged exposure of bare skin to direct sunlight." Dr. Jose Lutzky, head of the melanoma program at Mount Sinai, says Florida is second behind California in incidence of melanoma, but the trend is going in the wrong direction. "Unfortunately, our numbers are growing. That is really something we do not want to be first in."
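
Those percentages follow directly from the definition of SPF: an SPF-N product transmits roughly 1/N of the UV-B reaching the skin, so it blocks 1 - 1/N. A quick check:

```python
# SPF is roughly the factor by which UV-B exposure is reduced: an SPF-N
# sunscreen transmits about 1/N of the UV-B, so it blocks 1 - 1/N.
# (A simplification: real-world protection also depends on how thickly
# and how often the product is applied.)
for spf in (15, 30, 50, 100):
    blocked = 1 - 1 / spf
    print(f"SPF {spf:>3}: blocks {blocked:.1%} of UV-B")
```

The jump from SPF 30 to SPF 100 moves the blocked fraction by barely two percentage points, which is the article's point.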


Original Submission

posted by martyb on Tuesday September 08 2015, @08:05PM   Printer-friendly
from the Here-Comes-the-Sun-♪♪♪♪ dept.

A new report finds solar will be cheaper than wholesale electricity prices across Europe by 2030 – without the need for any technological breakthroughs. The implications for fossil fuels are obvious – and they mean that high renewable targets might reduce energy costs rather than increase them.

The study, by the EU-sponsored European Photovoltaic Technology Platform, found that by 2030 the generation costs of solar PV – including grid-integration costs of 2c/kWh – will be lower than the wholesale price of electricity in most of Europe. In southern European states it already is cheaper, and by 2030 the cost of solar PV could be as low as €20-€25/MWh, depending on the cost of capital. Even in London, the cost of large-scale solar PV will be around €50/MWh – equal to the current wholesale price and way below the cost of nuclear.


Original Submission

posted by janrinok on Tuesday September 08 2015, @06:59PM   Printer-friendly
from the I-don't-suppose-my-RaspPi-will-be-up-to-the-task dept.

Fujitsu Laboratories today announced the development of a machine-learning technology that can generate highly accurate predictive models from datasets of more than 50 million records in a matter of hours.

Fujitsu's technique estimates machine-learning results from a small set of sample data and the accuracy of past predictive models, extracts the learning-algorithm and configuration combination that produces the most accurate result, and applies that combination to the larger dataset. This yields highly accurate predictive models from datasets of 50 million records in a few hours. Predictive models produced by this technology can be used to quickly make improvements, such as minimizing membership cancellations on e-commerce websites and speeding responses to equipment failures. Details of this technology are being presented at the meeting of Information-Based Induction Sciences and Machine Learning (IBISML), opening Monday, September 14 at Ehime University in Japan.

The popularity of smartphones and other advances make it possible to gather massive quantities of sensor data, and machine learning and other advanced analytic techniques are being used extensively to extract valuable information from that data. Using the access logs of e-commerce websites, for example, it is possible to discover when people are most likely to cancel memberships on a given website, to identify those people quickly, and to take measures to discourage cancellation. Using detailed daily power-consumption data, it is possible to discover patterns of increased or decreased usage and to predict periods and times when power usage will increase. This can lead to a reduction in power costs by applying more precise controls over power generation, transmission, and storage. Developing predictive models by machine learning is considered an effective way to obtain accurate predictions. There are numerous learning algorithms, however, and the accuracy of a model also depends on fine-tuning each algorithm's configuration. Therefore, generating an effective predictive model requires examining many combinations of algorithms and configurations.

Attempting to examine every possible combination of algorithm and conditions causes the number of combinations to balloon quickly. Furthermore, a single combination can take days to evaluate, making it impractical to use machine learning extensively. Instead, algorithms and conditions are typically selected by analysts based on their experience, so the results ultimately depend heavily on the analyst's skill. In cases where the volume of data is great and analysis ends up taking more than one night, examinations are usually limited to a restricted number of combinations, or analysis can only be applied to a small portion of the data, and it is not possible to automatically derive accurate predictive models in a limited period of time.

Fujitsu Laboratories has developed a technology that estimates machine-learning results, able to generate and automatically tune an accurate predictive model from a small amount of sample data. It has prototyped this on Apache Spark, an open-source platform for parallel execution.


For each standard machine-learning algorithm, Fujitsu Laboratories measured actual machine-learning run times while varying the number of records in a dataset and the number of attributes used to represent the data, and built a run-time estimation model based on those measurements. Additionally, to improve the accuracy of those estimates, actual on-the-fly run-time measurements are used for correction. The company built up a database of combinations of previously used algorithms and configurations, along with the accuracy of the predictive model they produced, and uses this to estimate the predictive accuracy of new combinations. This makes it possible to make an assessment based on the smallest amount of data possible without sacrificing predictive accuracy. Estimating the run time and the accuracy of a predictive model produces accurate predictive models quickly. Techniques for estimating the predictive accuracy of a single machine-learning algorithm do exist, but there has been no such technology that can be applied to multiple algorithms and multiple dataset sizes. Because this technique incorporates actual run-time measurements into estimates based on the conditions for each machine-learning run (including the algorithm, number of records, number of attributes, infrastructural information, and so forth), it gets more accurate the more it is used.

This technology selects time-efficient candidates from among all the candidate combinations, and iterates over them efficiently and in parallel. With existing techniques, there is no way to rank combinations of machine-learning conditions; instead, analysts manually pick conditions based on their know-how. This technology combines estimates of run time and predictive accuracy to select candidate combinations of algorithms and configurations that are expected to deliver large improvements in predictive accuracy for a short run time. Each selected combination is then run in a distributed manner. Taking run time into consideration when selecting candidates makes it possible to execute each algorithm in an optimal order and quickly obtain the most accurate machine-learning model. Because this technique automatically focuses on the most effective combinations, it does not depend on the know-how of an analyst.
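
Fujitsu's exact selection algorithm isn't published in the announcement, but the idea it describes (rank candidates by expected accuracy gain per unit of estimated run time, then spend a time budget on the best ones) can be sketched as a greedy loop. All names and the scoring rule below are illustrative assumptions:

```python
import heapq

def pick_candidates(candidates, est_accuracy_gain, est_runtime, budget_sec):
    """Greedily pick algorithm/config combinations with the best estimated
    accuracy gain per second of estimated run time, within a time budget.

    est_accuracy_gain and est_runtime stand in for the learned estimators
    described above; this greedy loop is an illustrative assumption, not
    Fujitsu's published algorithm.
    """
    scored = [(-est_accuracy_gain(c) / est_runtime(c), est_runtime(c), c)
              for c in candidates]
    heapq.heapify(scored)  # best gain-per-second pops first
    chosen, spent = [], 0.0
    while scored:
        _score, runtime, cand = heapq.heappop(scored)
        if spent + runtime > budget_sec:
            continue  # too slow to fit in the remaining budget
        chosen.append(cand)
        spent += runtime
    return chosen

# Toy estimates: gain (accuracy points) and run time (seconds) per combination.
gains = {"svm": 0.01, "rf_small": 0.02, "rf_big": 0.05}
times = {"svm": 5.0, "rf_small": 10.0, "rf_big": 100.0}
print(pick_candidates(list(gains), gains.get, times.get, budget_sec=20))
```

In this toy run the slow-but-accurate "rf_big" is skipped because it cannot fit in the budget, mirroring the article's point about ordering work by time efficiency.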

The company ran internal tests using a dataset of 50 million records on eight servers with 12 processor cores each. Existing techniques would take roughly one week to develop a predictive model with 96% accuracy; Fujitsu Laboratories confirmed that its technique reached that level in slightly more than two hours. The company also demonstrated that the technology makes machine learning practical for access-log analysis, using 30 million lines of web access logs. It could, for example, also be used to predict electrical-power demand for every household in an area the size of the Tokyo metropolitan area, or to detect early-warning signs of intent to cancel among users of online services with hundreds of thousands of members.

Fujitsu Laboratories is conducting field trials of this technology in Fujitsu's big-data analytics solutions, with a goal of practical implementation during fiscal 2015.


Original Submission

posted by martyb on Tuesday September 08 2015, @06:01PM   Printer-friendly
from the unstable-population dept.

Japan's shrinking population has been a problem for many years. The problem can feed on itself, as the increasing workload on the non-elderly makes raising a family more difficult. Harassment of pregnant workers, called matahara, is common, with the worst of it actually coming from other women; some workers are even pressured to have abortions. This is likely related to the effect motherhood has on a career, since businesses rarely hire temporary help to fill in for women who give birth. Japan currently has fewer than two-thirds of the births per woman required just to keep the population stable.


Original Submission

posted by martyb on Tuesday September 08 2015, @04:59PM   Printer-friendly
from the basque-in-their-uniqueness dept.

DNA from ancient remains seems to have solved the puzzle of one of Europe's most enigmatic people: the Basques.

The distinct language and genetic make-up of the Basque people in northern Spain and southern France has puzzled anthropologists for decades.

One theory proposed that they were an unmixed pocket of indigenous hunters.

Now, a study in PNAS journal suggests they descend from early farmers who mixed with local hunters before becoming isolated for millennia.

The article is well worth reading in its entirety. According to it, the Basque language, Euskera, is unrelated to every other European language because the region was spared the influx of Indo-European speakers that covered the rest of the continent.


Original Submission

posted by martyb on Tuesday September 08 2015, @02:57PM   Printer-friendly
from the billiards-on-steroids dept.

According to Ars Technica there are some reported results out of CERN's LHC (Large Hadron Collider) that suggest consistent oddities in lepton decay.

They note that the results are likely to be a "statistical fluke and will vanish as the current, high-energy run starts pumping out data in earnest." The results were found in the LHCb (bottom-quark) detector. These particles are well understood, so any deviations are relatively easy to spot and could point to problems with the Standard Model of particles and their interactions. In this case, the LHCb detected tau particles in the decay of neutral B mesons into D mesons at a rate that was 2.1 standard deviations off from what one would expect from the Standard Model.

Now, in particle physics, 2.1 standard deviations is the sort of result that frequently goes away as more data is gathered—it takes three standard deviations to get physicists excited, and five before they start saying they've found something. Which is why, based on the abstract of the paper, there's nothing to get excited about here.
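
For context, those sigma thresholds map onto probabilities: assuming Gaussian statistics, the chance of a fluctuation at least as large as the one observed is the one-sided tail probability. A 2.1-sigma excess has roughly a 1-in-56 chance of appearing from noise alone, which is why such excesses routinely come and go:

```python
from math import erfc, sqrt

def sigma_to_pvalue(sigma: float) -> float:
    """One-sided Gaussian tail probability for a `sigma`-standard-deviation
    excess: the chance of a fluctuation at least this large from noise alone."""
    return 0.5 * erfc(sigma / sqrt(2))

for sigma in (2.1, 3.0, 5.0):
    print(f"{sigma} sigma -> p = {sigma_to_pvalue(sigma):.2e}")
```

The 5-sigma discovery threshold corresponds to a tail probability of about 3 in 10 million, which is why physicists hold out for it before claiming anything.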

But deep in the discussion, there's an intriguing indication that something unusual might be going on here: "The measured value is in good agreement with previous measurements at BaBar and Belle." These other two detectors studied B mesons produced by electron/positron collisions. So that means three different detectors, using different types of particle collisions, have seen a similar (if similarly weak) excess.

So, are these the first signs of new physics, or just statistical noise? We cannot yet tell, but it will certainly bear watching.

The abstract is available on arXiv 1506.08614 and is slated to be published in Physical Review Letters.


Original Submission

posted by n1 on Tuesday September 08 2015, @01:11PM   Printer-friendly
from the technology-finds-a-way dept.

The Norwegian Pirate Party has made a big statement by launching a free DNS service which allows Internet users to bypass the local Pirate Bay blockade. The party advocates a free and open Internet for everyone and believes that the recent website blockades set a dangerous precedent.

Last week Norway became the latest country to block access to The Pirate Bay.

A local court ordered Internet providers to block users' access to several large 'pirate' websites in the hope that it will decrease online copyright infringement.

The local Pirate Party is now vigorously protesting the ruling and has decided to fight back. Since the sites will be blocked at the DNS level, the party is countering by providing its own DNS servers.

"We want a free and open Internet for everyone. The copyright industry's fight for control over culture has put us in a situation where this is no longer the case in Norway," Pirate Party co-chairman Øystein Middelthun tells TF.

"The censorship is easy to bypass, by simply changing your name server, so we decided to practice what we preach and offer such a service to all those affected by the problem," he adds.

Indeed, since the sites' IP addresses are not blocked, the blockade can easily be circumvented by changing the DNS settings on one's device or computer. The Pirate Party is not the only organization offering alternative DNS; OpenDNS and Google run similar services.
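
For the curious, a DNS-level block only bites if your machine asks the ISP's resolvers; pointing it at a third-party resolver sidesteps it. A minimal illustration on a Unix-like system (Google's public resolver 8.8.8.8 is used as a stand-in here, since the story doesn't list the Pirate Party's server addresses):

```shell
# /etc/resolv.conf -- swap the ISP-assigned resolver for a third-party one.
# 8.8.8.8 is Google Public DNS, used here only as a stand-in for any
# alternative resolver.
nameserver 8.8.8.8

# To test a single lookup against a specific resolver without touching
# system settings:
#   dig example.org @8.8.8.8
```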

The Pirate Party's DNS has added benefits, though, as it supports additional top-level domains, including .geek, .pirate, and the Namecoin-based .bit. In addition, it operates from Norway with minimal logging to guarantee users' privacy.


Original Submission

posted by janrinok on Tuesday September 08 2015, @11:20AM   Printer-friendly
from the its-always-someone-else's-fault dept.

http://www.theregister.co.uk/2015/09/07/ashley_madison_amazon_godaddy/

Three John Doe plaintiffs have filed a complaint (PDF) against Amazon Web Services, GoDaddy, and 20 John Roes (anonymous defendants), in the Arizona District Court, for "intentionally inflicting emotional distress upon Ashley Madison users."

The plaintiffs want not less than $3m in damages or losses, and a jury trial to boot, and complain that the hack has resulted in them becoming victims to threats and extortion.


Original Submission

posted by janrinok on Tuesday September 08 2015, @09:49AM   Printer-friendly
from the I'll-drink-to-that dept.

Japanese firm Suntory wasn't the first distiller to get whisky to the International Space Station (ISS). Ardbeg Distillery has characterized samples of whisky sent to the ISS, finding hints of "antiseptic smoke, rubber and smoked fish, along with a curious, perfumed note, like violet or cassis, and powerful woody tones, leading to a meaty aroma."

That's the verdict of Dr Bill Lumsden, director of distilling and whisky creation at Ardbeg Distillery, which sent vials of its pre-maturation Ardbeg new spirit distillate aloft in 2011 to determine "the effect of micro-gravity on the behaviour of terpenes, the building blocks of flavour for whisky spirits as well as for many other foods and wines".

The experiment, organised by US space research outfit NanoRacks, involved mixing 6ml of the distillate ("the liquid resulting from distillation, which is normally filled into oak barrels for maturation") with "oak wood shavings from the inside of a charred American White Oak ex-Bourbon barrel".

The malt launched to the ISS in August 2011, returning to terra firma in September 2014. The liquid and control samples kept on Earth were then subjected to comparative "gas chromatography (GC) for major volatile congener analysis, high-pressure liquid chromatography (HPLC) for key maturation related congener analysis".

Ardbeg Space Experiment - Final Frontier Film: The Pier Review

The upshot of this, as revealed in Lumsden's paper The impact of micro-gravity on the release of oak extractives into spirit (PDF) is "no significant difference" in values of "major volatile congeners", such as alcohols, aldehydes, ketones and esters.

However, "[a] significant variable between the ISS and Earth samples was discovered when the results of the HPLC analysis of key maturation related congeners (wood extractives) were considered... The absolute concentration of these compounds was far higher than would normally be expected in standard, barrel-matured spirit, almost certainly as a result of the much higher surface area of woody material that the spirit was exposed to," the paper explains.


Original Submission

posted by janrinok on Tuesday September 08 2015, @08:16AM   Printer-friendly
from the er,-got-to-keep-fit dept.

NPR reports that more adults across the country are strapping on helmets and hopping on bikes to get to work, but between 1998 and 2013 the rate of bicycle-related injuries among all adults increased by 28 percent, from 96 injuries per 100,000 people in 1998-1999 to 123 injuries per 100,000 people in 2012-2013. And while the death rate among child cyclists has plummeted in the past four decades, the mortality rate among cyclists ages 35 to 54 has tripled. "There are just more people riding and getting injured in that age group. It's definitely striking," says Dr. Benjamin Breyer. Breyer isn't sure what's driving the surge in accidents among Generation Xers and baby boomers, but one reason could be what's known as the Lance Armstrong effect. "After Lance Armstrong had all of his success at the Tour de France, a lot more people were riding, and there were a lot more older riders that took up the bicycle for sport," says Breyer, adding that the problem is that "if you consider a 65-year-old who falls off their bike exactly the same way a 25-year-old does, the 65-year-old is going to sustain more injuries even if they're in great shape."
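
As a quick sanity check on the figures above:

```python
# Injury rates per 100,000 adults: 96 in 1998-1999 vs 123 in 2012-2013.
before, after = 96, 123
print(f"increase: {(after - before) / before:.0%}")
```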

The most recent National Household Travel Survey showed that the vast majority of the increase in bicycling between 1995 and 2009 came from Americans older than 25, with the biggest increases coming in the oldest groups. That has meant more men in their 50s and 60s on road bikes, riding at high speeds, Breyer says — a recipe for serious injuries. Though a rapidly growing share of older people would like to ride, American cities built during the last 60 years don't make it easy for most people to do so. "I think it's very fair to say that many older bikers don't find themselves in that highly expert category," says Kathryn Lawler. "We have to make the kind of infrastructure for that middle group if we're going to find the benefits." At the end of the day, reducing cycling accidents may boil down to something simple: making sure that bikers know the rules of the road — and that drivers know how to deal with bikers. "As the population of cyclists in the United States shifts to an older demographic, further investments in infrastructure and promotion of safe riding practices are needed to protect bicyclists from injury," say the researchers.


Original Submission

posted by janrinok on Tuesday September 08 2015, @06:42AM   Printer-friendly
from the stoned dept.

Stone monoliths found buried near Stonehenge could have been part of the largest Neolithic monument built in Britain, archaeologists believe. The 4,500-year-old stones, some measuring 15ft (4.5m) in length, were discovered under 3ft of earth at Durrington Walls "superhenge". From the BBC:

The monument was on "an extraordinary scale" and unique, researchers said. The Stonehenge Hidden Landscapes team has been creating an underground map of the area in a five-year project. Remote sensing and geophysical imaging technology has been used to reveal evidence of nearly 100 stones without the need for excavation.

The monument is just under two miles (3km) from Stonehenge, Wiltshire, and is thought to have been a Neolithic ritual site. Experts think it may have surrounded traces of springs and a dry valley leading into the River Avon. Although no stones have been excavated they are believed to be fashioned from sarsen blocks found locally. Sarsen stones are sandstone blocks found mainly on Salisbury Plain and the Marlborough Downs in Wiltshire. A unique sarsen standing stone, The Cuckoo Stone, remains in the field next to Durrington Walls.

The stones are believed to have been deliberately toppled over the south-eastern edge of the bank of the circular enclosure before being incorporated into it. Lead researcher Vince Gaffney, of the University of Bradford, said: "We don't think there's anything quite like this anywhere else in the world."


Original Submission

posted by cmn32480 on Tuesday September 08 2015, @05:08AM   Printer-friendly
from the flights-of-fantasy dept.

Ever since Marvel Comics first popularized the idea, the world has been waiting for someone to invent an honest-to-goodness flying aircraft carrier. Now, DARPA says it's going to give it a try.

Not all at once, though. At least, not initially. Late last month, DARPA -- the U.S. Defense Advanced Research Projects Agency -- announced the launch of its "Gremlins" program. While not quite a true Marvel-icious flying "helicarrier," Gremlins sounds pretty ambitious in its own right.

Essentially, the plan is to design a fleet of small, unmanned aerial vehicles that can both take off from and land back aboard a larger aircraft.

The U.S. gave the idea a try in the 1930s. A similar article is at ExtremeTech (JavaScript required).


Original Submission

posted by CoolHand on Tuesday September 08 2015, @03:41AM   Printer-friendly
from the isn't-graphene-what-in-pencils? dept.

http://phys.org/news/2015-09-layering-technique-graphene-fiber-strength.html
http://www.sciencemag.org/content/349/6252/1083

(Phys.org)—A team of researchers working at Rensselaer Polytechnic Institute has found a way to create a graphene fiber that is stronger and maintains conductive properties better than prior efforts. In their paper published in the journal Science, the team describes their technique and suggests possible uses for the resultant material.

Graphene has excellent conductivity and mechanical strength when in its 2D form—getting it to maintain both attributes when using it to make 3D products, however, has been problematic. In this new effort, the researchers report on a new technique they developed for creating graphene fiber that offers higher thermal and electrical conductivity and better strength than prior methods produce.


Original Submission

posted by CoolHand on Tuesday September 08 2015, @01:34AM   Printer-friendly
from the prehistoric-finance dept.

The New York Times recently published an article describing how the Gravity Model, a formula developed by a practitioner of the dismal science in the 1960s to predict the amount of trade between two markets, also applies to ancient economic systems, and how economists were able to test it. The article whets one's appetite:

One morning, just before dawn, an old man named Assur-idi loaded up two black donkeys. Their burden was 147 pounds of tin, along with 30 textiles, known as kutanum, that were of such rare value that a single garment cost as much as a slave. Assur-idi had spent his life's savings on the items, because he knew that if he could convey them over the Taurus Mountains to Kanesh, 600 miles away, he could sell them for twice what he paid.

At the city gate, Assur-idi ran into a younger acquaintance, Sharrum-Adad, who said he was heading on the same journey. He offered to take the older man's donkeys with him and ship the profits back. The two struck a hurried agreement and wrote it up, though they forgot to record some details. Later, Sharrum-Adad claimed he never knew how many textiles he had been given. Assur-idi spent the subsequent weeks sending increasingly panicked letters to his sons in Kanesh, demanding they track down Sharrum-Adad and claim his profits.

These letters survive as part of a stunning, nearly miraculous window into ancient economics. In general, we know few details about economic life before roughly 1000 A.D. But during one 30-year period — between 1890 and 1860 B.C. — for one community in the town of Kanesh, we know a great deal. Through a series of incredibly unlikely events, archaeologists have uncovered the comprehensive written archive of a few hundred traders who left their hometown Assur, in what is now Iraq, to set up importing businesses in Kanesh, which sat roughly at the center of present-day Turkey and functioned as the hub of a massive global trading system that stretched from Central Asia to Europe. Kanesh's traders sent letters back and forth with their business partners, carefully written on clay tablets and stored at home in special vaults. [...]

Economists were drawn to the Kanesh archive because it offered an unprecedented chance to see how well the Gravity Model applied in an economy entirely unlike our own. This was trade conducted via donkey, through a land of independent city-states whose legal and cultural systems were totally dissimilar to any we know. But still, the model held up: Ali Hortacsu, a University of Chicago economist on the Kanesh team, says that the trade figures between Assur and Kanesh matched the formula almost perfectly. "It was a very nice surprise," he told me.
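
The Gravity Model itself is compact: predicted trade between two markets scales with the product of their economic sizes and inversely with the distance between them. A sketch with placeholder constants (the Kanesh team's actual fitted parameters aren't given in the article):

```python
def gravity_trade(size_a: float, size_b: float, distance: float,
                  g: float = 1.0, beta: float = 1.0) -> float:
    """Gravity model of trade: T = G * (M_a * M_b) / D**beta.

    G (a scaling constant) and beta (distance sensitivity) are fitted to
    data in practice; the defaults here are placeholders for illustration.
    """
    return g * (size_a * size_b) / distance ** beta

# Doubling one market's size doubles predicted trade; with beta = 1,
# doubling the distance halves it.
base = gravity_trade(100, 50, 600)
print(gravity_trade(200, 50, 600) / base)
print(gravity_trade(100, 50, 1200) / base)
```

The model's appeal is exactly this simplicity: two sizes and a distance are things one can estimate even for Bronze Age city-states.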

It just goes to show that what's old is new again.


Original Submission

posted by janrinok on Tuesday September 08 2015, @12:27AM   Printer-friendly
from the it-ain't-over-till-the-fat-lady-sings dept.

Microsoft Corp. gets a second chance to prove it's entitled to keep data stored overseas out of the hands of U.S. investigators when its lawyers appear before a federal appeals court Wednesday, but the computer software giant is already hedging its bets, calling on Congress to clarify the law. The 2nd U.S. Circuit Court of Appeals will hear Microsoft's challenge to a July 2014 lower court ruling concluding that a court or law enforcement agency in the United States is empowered to order a person or entity to produce materials, even if the information is housed outside the country.

The Redmond, Washington-based company hopes the appeals court will overturn the decision upholding the U.S. government's right to search a consumer email account that Microsoft stores in Dublin, Ireland. The government wants to search the account as part of a narcotics investigation.

A warrant for the information was issued in December 2013, saying there was probable cause to believe the account in a facility opened in 2010 was being used to further narcotics trafficking. Microsoft turned over the customer's address book, which was stored in the United States.

In court papers, Microsoft calls on Congress to "grapple with the question whether, and when, law enforcement should be able to compel providers like Microsoft to help it seize customer emails stored in foreign countries."

"Only Congress has the institutional competence and constitutional authority to balance law enforcement needs against our nation's sovereignty, the privacy of its citizens and the competitiveness of its industry," it said.

But Manhattan federal prosecutors said in court filings that "powerful government interests" override potential negative effects on Microsoft's business or any other company seeking to profit on the storage of information overseas. "The fact remains that there exists probable cause to believe that evidence of a violation of U.S. criminal law, affecting U.S. residents and implicating U.S. interests, is present in records under Microsoft's control," they wrote. "With the benefits of corporate citizenship in the United States come corresponding responsibilities, including the responsibility to comply with a disclosure order issued by a U.S. court. Microsoft should not be heard to complain that doing so might harm its bottom line."

Prosecutors noted Microsoft still controls the foreign-based data and U.S.-based employees can retrieve it. They said Microsoft customers also have no right under the company's terms of service to demand that data be stored at any particular data center.

In a filing in the appeal, the government of Ireland noted that the Irish Supreme Court has ruled that Irish courts have the power to order production of documents by an Irish registered company by one of its branches situated in a foreign country. It said Irish taxation authorities also can force Irish banks to produce records of accounts held by customers wherever the information is located.

"Ireland continues to facilitate cooperation with other states, including the United States, in the fight against crime and would be pleased to consider, as expeditiously as possible, a request under the treaty, should one be made," it said.

In another appeals submission, 29 major U.S. and foreign news and trade organizations asked the court to reverse the lower court, saying journalists and publishers worldwide rely on email and cloud-storage services provided by Microsoft and others to gather, store and review documents protected by the First Amendment.

"Even if the subscriber today is not a reporter—although we do not know for sure—the next subscriber may be," the court papers said.


Original Submission