

posted by martyb on Thursday June 07 2018, @11:31PM   Printer-friendly
from the how-big-is-that-in-Libraries-of-Congress? dept.

Okay, Last Year's Kilonova Did Probably Create a Black Hole

In August of 2017 [open, DOI: 10.1103/PhysRevLett.119.161101] [DX], another major breakthrough occurred when the Laser Interferometer Gravitational-Wave Observatory (LIGO) detected waves that were believed to be caused by a neutron star merger. Shortly thereafter, scientists at LIGO, Advanced Virgo, and the Fermi Gamma-ray Space Telescope were able to determine where in the sky this event (known as a kilonova) occurred.

This source, known as GW170817/GRB, has been the target of many follow-up surveys since it was believed that the merger could have led to the formation of a black hole. According to a new study by a team that analyzed data from NASA's Chandra X-ray Observatory since the event, scientists can now say with greater confidence that the merger created a new black hole.

[...] While the LIGO data provided astronomers with a good estimate of the resulting object's mass after the neutron stars merged (2.7 Solar Masses), this was not enough to determine what it had become. Essentially, this amount of mass meant that it was either the most massive neutron star ever found or the lowest-mass black hole ever found (the previous record holders being four or five Solar Masses).
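As a rough illustration of why 2.7 Solar Masses sits in this ambiguous range, here is a back-of-envelope check (not from the paper) comparing the object's Schwarzschild radius with a typical neutron-star radius of roughly 11-12 km; the constants and the comparison radius are standard textbook values, not figures from the study.

    # Back-of-envelope check (not from the paper): how compact is a
    # 2.7 solar-mass remnant?  If its Schwarzschild radius approaches a
    # typical neutron-star radius (~11-12 km), the "massive neutron star
    # vs. light black hole" ambiguity is easy to see.
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    M_SUN = 1.989e30     # solar mass, kg

    m = 2.7 * M_SUN
    r_s = 2 * G * m / c**2                               # Schwarzschild radius, metres
    print(f"Schwarzschild radius: {r_s / 1e3:.1f} km")   # about 8 km
    print("Typical neutron-star radius: ~11-12 km")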

Previously: "Kilonova" Observed Using Gravitational Waves, Sparking Era of "Multimessenger Astrophysics"
Neutron-Star Merger Grows Brighter


Original Submission

posted by martyb on Thursday June 07 2018, @09:54PM   Printer-friendly
from the Film-At-11-Maybe...-Or-Maybe-Not... dept.

It's easy to think that film cameras are gone forever. But Marketplace has a short story about how Kodak is apparently close to re-releasing the Ektachrome 100 film line; the announcement tweet is covered in the story.

There's news that Kodak is about to bring back Ektachrome 100, a popular slide film for analog cameras that's been gone for five years. Launched in the 1940s, Ektachrome was one of the first commercially available color films and became the "preferred choice of magazine and advertising shooters." (It was a favorite of National Geographic.)

As far as I can tell, the re-release has been hanging for quite some time; here is one of several stories from January 2017 stating it was coming back. I guess software isn't the only industry that suffers from vaporware potential. Marketplace's question could also be asked here: what pieces of discontinued technology do you wish would come back?


Original Submission

posted by janrinok on Thursday June 07 2018, @08:17PM   Printer-friendly
from the very-late-steve-jobs-promised dept.

Commentary by Sean Hollister over at CNet reminds us that Steve Jobs promised to make FaceTime an open standard when he announced it on June 7th, 2010, eight years ago. Sean hasn't forgotten, but perhaps everyone else has, especially over at Apple. It probably never will be released as an open standard, even now with declining market share. As a reminder, Sean links to the YouTube video of the WWDC 2010 keynote address where it was announced and provides the relevant quote:

"Now, FaceTime is based on a lot of open standards -- H.264 video, AAC audio, and a bunch of alphabet soup acronyms -- and we're going to take it all the way. We're going to the standards bodies starting tomorrow, and we're going to make FaceTime an open industry standard."

As of the time of publication, Apple had not responded to CNet's request for comment.


Original Submission

posted by janrinok on Thursday June 07 2018, @06:53PM   Printer-friendly
from the begun-the-core-wars-have dept.

AMD released Threadripper CPUs in 2017, built on the same 14nm Zen architecture as Ryzen, but with up to 16 cores and 32 threads. Threadripper was widely believed to have pushed Intel to respond with the release of enthusiast-class Skylake-X chips with up to 18 cores. AMD also released Epyc-branded server chips with up to 32 cores.

This week at Computex 2018, Intel showed off a 28-core CPU intended for enthusiasts and high end desktop users. While the part was overclocked to 5 GHz, it required a one-horsepower water chiller to do so. The demonstration seemed to be timed to steal the thunder from AMD's own news.

Now, AMD has announced two Threadripper 2 CPUs: one with 24 cores, and another with 32 cores. They use the "12nm LP" GlobalFoundries process instead of "14nm", which could improve performance, but are currently clocked lower than previous Threadripper parts. The TDP has been pushed up to 250 W from the 180 W TDP of Threadripper 1950X. Although these new chips match the core counts of top Epyc CPUs, there are some differences:

At the AMD press event at Computex, it was revealed that these new processors would have up to 32 cores in total, mirroring the 32-core versions of EPYC. On EPYC, those processors have four active dies, with eight active cores on each die (four for each CCX). On EPYC, however, there are eight memory channels, while AMD's X399 platform only has support for four channels. For the first generation this meant that each of the two active dies would have two memory channels attached. In the second-generation Threadripper this is still the case: the two newly 'active' dies on the chip do not have direct memory access.

This also means that the number of PCIe lanes remains at 64 for Threadripper 2, rather than the 128 of Epyc.

Threadripper 1 had a "game mode" that disabled one of the two active dies, so it will be interesting to see if users of the new chips will be forced to disable even more cores in some scenarios.
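To make the platform differences easier to see, here is a small sketch tabulating only the figures quoted above (core count, memory channels, PCIe lanes, TDP); specs not mentioned in the summary, such as Epyc's TDP, are deliberately left out.

    # Comparison using only the numbers quoted in the summary above.
    parts = {
        "Threadripper 1950X":   {"cores": 16, "mem_channels": 4, "pcie_lanes": 64,  "tdp_w": 180},
        "Threadripper 2 (32C)": {"cores": 32, "mem_channels": 4, "pcie_lanes": 64,  "tdp_w": 250},
        "Epyc (32C)":           {"cores": 32, "mem_channels": 8, "pcie_lanes": 128, "tdp_w": None},  # TDP not given above
    }

    print(f"{'Part':<22}{'Cores':>6}{'Mem ch':>8}{'PCIe':>6}{'TDP (W)':>9}")
    for name, s in parts.items():
        tdp = s["tdp_w"] if s["tdp_w"] is not None else "n/a"
        print(f"{name:<22}{s['cores']:>6}{s['mem_channels']:>8}{s['pcie_lanes']:>6}{tdp:>9}")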


Original Submission

posted by janrinok on Thursday June 07 2018, @04:29PM   Printer-friendly
from the be-sure-auto-replay-is-on dept.

Submitted via IRC for SoyCow8317

New research presented at this year's Euroanaesthesia congress in Copenhagen, Denmark shows that the quality of chest compressions during cardiopulmonary resuscitation (CPR) can be improved by using either a smartphone app or by using the song 'La Macarena' as a mental memory aid.

Improving the quality of compressions performed during CPR can significantly increase the chance of survival and lead to better health outcomes. The goal of the study was to compare the effectiveness of a smartphone metronome application, and a musical mental metronome in the form of the song "La Macarena" at improving the quality of chest compressions. Both the app and the song provide a regular rhythm to help time compressions.

The team selected a group of 164 medical students from the University of Barcelona to perform continuous chest compressions on a manikin for 2 minutes. Subjects either received no guidance (control), were provided with the smartphone app (App group), or were asked to perform compressions to the mental beat of the song "La Macarena" (Macarena group).

The authors conclude that: "Both the app and using mental memory aid 'La Macarena' improved the quality of chest compressions by increasing the proportion of adequate rate but not the depth of compressions. The metronome app was more effective but with a significant onset delay."

Source: https://www.sciencedaily.com/releases/2018/06/180601225705.htm

[Editor's Comment: This is not a new idea; there was publicity a few years back in the UK for a similar approach using "Staying Alive" by the Bee Gees for the musical beat, and another editor was told during medical training to use "Another One Bites the Dust" by Queen. Both seem more appropriate titles than "La Macarena" - perhaps this is just a twist on whatever music is currently popular. Of course, 'using an app' might replace 'using a computer' as the most pointless claim to being different.]
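For the curious, the metronome side of this is trivial to sketch. The following is a minimal, hypothetical pacing script (not the app used in the study); it assumes the widely cited guideline rate of 100-120 compressions per minute and uses 103 BPM, roughly the tempo of "La Macarena".

    # Minimal metronome sketch (not the study's app): print a beat at a steady
    # rate to pace chest compressions.  Assumes the guideline rate of 100-120
    # compressions per minute; 103 BPM is roughly the tempo of "La Macarena".
    import time

    BPM = 103                  # assumed tempo, beats per minute
    INTERVAL = 60.0 / BPM      # seconds between beats

    def run_metronome(duration_s=120):
        """Tick for the two-minute compression period used in the study."""
        start = time.monotonic()
        for i in range(int(duration_s / INTERVAL)):
            print(f"beat {i + 1}")
            # sleep until the next scheduled beat to avoid cumulative drift
            time.sleep(max(0.0, start + (i + 1) * INTERVAL - time.monotonic()))

    if __name__ == "__main__":
        run_metronome()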


Original Submission

posted by janrinok on Thursday June 07 2018, @03:05PM   Printer-friendly
from the don't-give-huge-blocks-to-businesses dept.

Things are looking up for our next-generation internet.

[...] But the shortage of IPv4 elbow room became a steadily worsening issue -- have you noticed all those phones that can connect to the network now, for example? So tech companies banded together to try to advance IPv6. The result: World IPv6 Day on June 8, 2011, when tech giants like Google, Facebook and Yahoo tested IPv6 sites to find any problems. For a sequel, they restarted those IPv6 connections and left them on starting on World IPv6 Launch Day, June 6, 2012.

Back then, there was still a risk that IPv6 wouldn't attract a critical mass of usage even with the tech biggies on board. The result would've been an internet complicated by multilayer trickery called network address translation, or NAT, that let multiple devices share the same IP address. But statistics released Wednesday by one IPv6 organizer, the Internet Society, show that IPv6 is growing steadily in usage, with about a quarter of us now using it worldwide. It looks like we're finally moving into a future that's been within our grasp since the Clinton administration.

"While there is obviously more to be done -- like roll out IPv6 to the other 75 percent of the Internet -- it's becoming clear that IPv6 is here to stay and is well-positioned to support the Internet's growth for the next several decades," said Lorenzo Colitti, a Google software engineer who's worked on IPv6 for years.

[...] How much room does IPv6 have? Enough to give network addresses to 340 undecillion devices -- that's two to the 128th power, or 340,282,366,920,938,463,463,374,607,431,768,211,456 if you're keeping score.
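The address-space figure is easy to verify for yourself; the short sketch below just redoes the arithmetic quoted above and contrasts it with IPv4's 32-bit space.

    # Verify the address-space figures quoted above.
    ipv4 = 2 ** 32
    ipv6 = 2 ** 128
    print(f"IPv4 addresses: {ipv4:,}")   # 4,294,967,296
    print(f"IPv6 addresses: {ipv6:,}")   # 340,282,366,920,938,463,463,374,607,431,768,211,456
    print(f"IPv6 / IPv4:    {ipv6 // ipv4:,}")   # 2**96 times more addresses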


Original Submission

posted by janrinok on Thursday June 07 2018, @01:39PM   Printer-friendly
from the EU-got-something-right dept.

Gervase Markham has a thorough blog post making the case for the total abolition of software patents. He bases his case on their complete failure to promote innovation and aims to identify the principles involved. Given the heavy politics involved, eliminating them may seem like a very distant policy possibility.

One immediate question is: how does one define a software patent? Where is the boundary? Various suggestions have been made, but actually, this question is not as important as it appears, for two reasons. Firstly, if we can demonstrate that there is a group of clearly identifiable patents which are harmful, or harmful when enforced in particular situations, then we can adopt the principle that such patents should not be granted or should not be enforceable, and where one draws the exact line between them and other patents becomes a secondary, practical, definitional issue beyond the initial principle. Secondly, some methods proposed for dealing with the problem of software patents do not actually require one to define what a software patent is. For example, one proposal is that one could change the law such that no program written to run on a general purpose computer could ever be said to be infringing a patent. In this case, you need a definition of "general purpose computer", but you don't need one for "software patent". Given these two points, I don't intend to spend time on definitional issues.

Software patents are currently a problem affecting the US, but they are prohibited in the EU under Article 52 of the European Patent Convention of 1973 (EPC). They are nonetheless being pushed by the European Patent Office (EPO) in the name of "harmonization", despite being invalid. Many consider the fact that Europe remains unafflicted by software patents to be a moderating influence on the US market, holding back a free-for-all.


Original Submission

posted by janrinok on Thursday June 07 2018, @12:13PM   Printer-friendly
from the Darth-Vader-doesn't-help-though dept.

"Some alien planets in multiple-star systems — such as the two-sun Tatooine, Luke's home world in the 'Star Wars' universe — may indeed nestle in orbits that are stable for long stretches of time, a new study suggests.

In the study, researchers ran more than 45,000 computer simulations, examining where planets of various masses and dimensions could exist in two- and three-star systems.

"We ran the simulations for periods ranging from 1 million to 10 million years, in order to see if the systems are stable over very long periods," study lead author Franco Busetti, of the School of Computer Science and Applied Mathematics at the University of the Witwatersrand in South Africa, said in a statement.

"The analysis shows that most configurations had large enough stable regions for planets to exist," added Busetti, who presented the results Monday (June 4) at the 232nd meeting of the American Astronomical Society in Denver. "Many of these areas are actually very habitable for planets."

Fewer than 40 three-star planets are known. But the new study, which has been submitted to the journal Astronomy & Astrophysics, could help astronomers find more, Busetti said.

"It could assist in selecting suitable candidates for a survey of such systems and guide the observational searches for them," he said. "The geometry of the stable zone indicates not only where to look for planets but how to look."


Original Submission

posted by janrinok on Thursday June 07 2018, @10:46AM   Printer-friendly
from the voices-in-my-head dept.

Two American diplomats stationed in China were reportedly evacuated from the region after being sickened by a mysterious ailment linked to odd sounds.

The two Americans evacuated worked at the American Consulate in the southern city of Guangzhou, the New York Times reported Wednesday, adding that their colleagues and relatives are also being tested by a State Department medical team.

American officials have been worried for months that American diplomats and their families in Cuba -- and now China -- have been subjected to a "sonic attack," leading to symptoms similar to those "following concussion or minor traumatic brain injury," the State Department said in a statement Tuesday.

The new cases broaden a medical mystery that began affecting American diplomats and their families in Cuba in 2016. Since then, 24 Americans stationed in Havana have experienced dizziness, headaches, fatigue, hearing loss and cognitive issues, the State Department said.

[...] The nature of the injury, and whether a common cause exists, hasn't been established yet, the department said.

Previously: Sonic Attack? U.S. Issues Health Alert After Employee Experiences Brain Trauma in Guangzhou, China

Related: US Embassy Employees in Cuba Possibly Subjected to 'Acoustic Attack'
U.S. State Department Pulls Employees From Cuba, Issues Travel Warning Due to "Sonic Attacks"
A 'Sonic Attack' on Diplomats in Cuba? These Scientists Doubt It
Cuban Embassy Victims Experiencing Neurological Symptoms
Computer Scientists May Have Solved the Mystery Behind the 'Sonic Attacks' in Cuban Embassy


Original Submission

posted by janrinok on Thursday June 07 2018, @09:24AM   Printer-friendly
from the are-we-surprised? dept.

Gizmodo writes that FCC emails show that the agency spread lies to bolster false DDoS attack claims. Their system became overwhelmed in early 2017 after John Oliver directed his audience to flood the agency with comments supporting net neutrality. A similar surge had happened for similar reasons back in 2014. However, the current FCC team appears to have lied about both occasions.

Internal emails reviewed by Gizmodo lay bare the agency's efforts to counter rife speculation that senior officials manufactured a cyberattack, allegedly to explain away technical problems plaguing the FCC's comment system amid its high-profile collection of public comments on a controversial and since-passed proposal to overturn federal net neutrality rules.

The FCC has been unwilling or unable to produce any evidence an attack occurred—not to the reporters who've requested and even sued over it, and not to U.S. lawmakers who've demanded to see it. Instead, the agency conducted a quiet campaign to bolster its cyberattack story with the aid of friendly and easily duped reporters, chiefly by spreading word of an earlier cyberattack that its own security staff say never happened.


Original Submission

posted by mrpg on Thursday June 07 2018, @07:55AM   Printer-friendly
from the molecular-motion-pictures-presents dept.

Submitted via IRC for SoyCow8093

Scientists from IUPUI, MIT, Nokia Bell Labs, NTT and the University of Bristol in England, which led the study, have shown how an optical chip can simulate the motion of atoms within molecules at the quantum level. The study is published in the May 31 issue of the journal Nature.

[...] Understanding the behavior of molecules requires an understanding of how they vibrate at the quantum level. But modeling these dynamics requires massive computational power, beyond what exists or is expected from coming generations of supercomputers.

An optical chip uses light instead of electricity and can operate as a quantum computing circuit. In the study published in Nature, data from the chip allows a frame-by-frame reconstruction of atomic motions to create a virtual movie of how a molecule vibrates.

Source: Scientists Use Photonic Chip to Make Virtual Movies of Molecular Motion
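The photonic experiment itself can't be reproduced in a few lines, but the underlying picture (a vibrational mode evolving in time and being reconstructed frame by frame) can be illustrated with a textbook single-mode harmonic oscillator. The sketch below is a generic numerical illustration, not a simulation of the chip; the mode frequency and coherent-state amplitude are arbitrary assumptions.

    # Illustrative sketch only (not the photonic experiment): evolve a coherent
    # vibrational state of a single harmonic mode and record <x> frame by frame,
    # i.e. a "virtual movie" of the oscillation.
    import numpy as np

    N = 30          # truncated number-basis dimension
    omega = 1.0     # mode frequency (natural units, hbar = 1; assumed)
    alpha = 2.0     # coherent-state amplitude (assumed)

    n = np.arange(N)
    log_fact = np.cumsum(np.log(np.maximum(n, 1)))            # log(n!)
    psi0 = np.exp(-abs(alpha)**2 / 2) * alpha**n / np.exp(0.5 * log_fact)

    a = np.diag(np.sqrt(np.arange(1, N)), k=1)                # annihilation operator
    x = (a + a.T) / np.sqrt(2)                                # position operator

    frames = []
    for t in np.linspace(0, 4 * np.pi / omega, 80):           # two oscillation periods
        psi_t = psi0 * np.exp(-1j * omega * (n + 0.5) * t)    # H is diagonal in this basis
        frames.append(np.real(np.conj(psi_t) @ x @ psi_t))

    print(frames[:5])   # <x>(t) traces out sqrt(2) * alpha * cos(omega * t)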


Original Submission

posted by mrpg on Thursday June 07 2018, @06:20AM   Printer-friendly
from the help!-I'm-a-prisoner-in-an-izakaya dept.

Submitted via IRC for SoyCow8317

Scientists at Hokkaido University and Kyoto University have developed a theoretical approach to quantum computing that is 10 billion times more tolerant to errors than current theoretical models. Their method brings us closer to developing quantum computers that use the diverse properties of subatomic particles to transmit, process and store extremely large amounts of complex information.

Quantum computing has the potential to solve problems involving vast amounts of information, such as modelling complex chemical processes, far better and faster than modern computers.

[...] In a paper published in the journal Physical Review X, Akihisa Tomita, an applied physicist at Hokkaido University, and his colleagues suggested a novel way to dramatically reduce errors when using this approach. They developed a theoretical model that uses both the properties of quantum bits and the modes of the electromagnetic field in which they exist. The approach involves squeezing light by removing error-prone quantum bits when quantum bits cluster together.

This model is ten billion times more tolerant to errors than current experimental methods, meaning that it tolerates up to one error every 10,000 calculations.

Source: Hokkaido University and Kyoto University Scientists Develop More Error-Tolerant Approach to Quantum Computing
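The "ten billion times" figure can be read as a statement about error rates: one error per 10,000 calculations is a rate of 10^-4, which would put the implied baseline at roughly 10^-14. The baseline value in the sketch below is inferred from that arithmetic, not stated in the summary.

    # Back-of-envelope reading of the claim (the baseline figure is inferred,
    # not stated in the summary).
    tolerated_rate = 1 / 10_000       # one error per 10,000 calculations -> 1e-4
    improvement = 10_000_000_000      # "ten billion times more tolerant"
    implied_baseline = tolerated_rate / improvement
    print(f"tolerated error rate: {tolerated_rate:.0e}")     # 1e-04
    print(f"implied baseline:     {implied_baseline:.0e}")   # 1e-14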


Original Submission

posted by mrpg on Thursday June 07 2018, @04:44AM   Printer-friendly
from the finally-good-news dept.

The Center for American Progress reports

Hawaii's Governor David Ige (D) signed a bill on [June 4] implementing the most ambitious climate law in the United States. The new law, which comes into effect July 1, sets the goal for the state to become carbon neutral by 2045.

Along with the carbon neutral bill, Governor Ige also signed two other climate bills into law. A second bill requires the state to use carbon offsets to restore the state's forests by planting trees which will help absorb carbon from the atmosphere. And a third law mandates that sea level rise be factored into the review process for building projects.

[...] Ige [noted]: "Sea level rise is already having an impact on beaches, roadways, and homes near the shoreline. As a result, we face difficult land-use decisions, and requiring an analysis of sea level rise before beginning construction is just plain common sense."

The carbon neutral bill notes that Hawaii could see $19 billion worth of damage from sea level rise.

Hawaii now surpasses Rhode Island as the most ambitious state when it comes to tackling climate change. Rhode Island aims to cut greenhouse gas emissions by 95 percent below 1990 levels by 2050.

Before introducing the three laws, Hawaii already had some of the strongest climate policies in the country. This includes a target introduced in 2015 to reach 100 percent renewable energy by 2045. And last year, despite President Trump's announcement to withdraw from the Paris climate agreement, Hawaii became the first state to introduce a law to uphold the Paris climate target of limiting warming to 2 degrees Celsius.

As an archipelago in the Central Pacific, however, Hawaii relies on carbon-heavy modes of transport: planes and ships. This is one reason why the state chose to pursue a carbon offset program, Scott Glenn, head of the state's environmental quality office [said].


Original Submission

posted by mrpg on Thursday June 07 2018, @03:05AM   Printer-friendly
from the [sigh-2] dept.

[...] The social media company said Huawei, computer maker Lenovo Group, and smartphone makers OPPO and TCL Corp were among about 60 companies worldwide that received access to some user data after they signed contracts to re-create Facebook-like experiences for their users.

Members of Congress raised concerns after The New York Times reported on the practice on Sunday, saying that data of users’ friends could have been accessed without their explicit consent. Facebook denied that and said the data access was to allow its users to access account features on mobile devices.

[...] Chinese telecommunications companies have come under scrutiny from U.S. intelligence officials who argue they provide an opportunity for foreign espionage and threaten critical U.S. infrastructure, something the Chinese have consistently denied.

[...] Senators John Thune, the committee’s Republican chairman, and Bill Nelson, the ranking Democrat, on Tuesday wrote to Zuckerberg after The New York Times reported that manufacturers were able to access data of users’ friends even if the friends denied permission to share the information with third parties.


Original Submission

posted by mrpg on Thursday June 07 2018, @01:39AM   Printer-friendly
from the [sigh] dept.

A city watchdog has launched a stinging attack on TSB chief Paul Pester for portraying "an optimistic view" of its catastrophic IT meltdown in April that prevented 1.9 million customers from using online bank services.

[...] "For example, TSB referred to 'the vast majority' of customers being able to access their online accounts, at a time when there was a successful first-time login rate of only 50 per cent on the web channel."

[...] TSB planned to shift off Lloyds Banking Group's (LBG's) infrastructure after it was bought by Spanish bank Sabadell for £1.7bn in 2013. At the time, Sabadell estimated the system switch would save the bank some £160m a year.

[...] Following the incident, a number of customers fell victim to fraud via phishing calls, emails and texts sent by scammers purporting to be TSB and asking them to verify their bank details.

A TSB spokeswoman said: "We look forward to updating the committee on the work TSB has undertaken to resolve problems for customers since our last appearance.

"We recognise that we have more to do to restore the bank's operations to the level that customers expect and are completely focused on that and ensuring customers are not left out of pocket."

related: Warning Signs for TSB's IT Meltdown were Clear a Year Ago


Original Submission

posted by mrpg on Thursday June 07 2018, @12:03AM   Printer-friendly
from the cows-and-poultry-agree dept.

[...] Agricultural data from 38,700 farms plus details of processing and retailing in 119 countries show wide differences in environmental impacts — from greenhouse gas emissions to water used — even between producers of the same product, says environmental scientist Joseph Poore of the University of Oxford. The amount of climate-warming gases released in the making of a pint of beer, for example, can more than double under high-impact production scenarios. For dairy and beef cattle combined, high-impact providers released about 12 times as many greenhouse gases as low-impact producers, Poore and colleague Thomas Nemecek report in the June 1 Science.

[...] The greatest changes in the effect of a person’s diet on the planet, however, would still come from choosing certain kinds of food over others. On average, producing 100 grams of protein from beef leads to the release of 50 kilograms of greenhouse gas emissions, which the researchers calculated as a carbon-dioxide equivalent. By comparison, 100 grams of protein from cheese releases 11 kg in production, from poultry 5.7 kg and from tofu 2 kg.

[...] Producing food overall accounts for 26 percent of global climate-warming emissions, and takes up about 43 percent of the land that’s not desert or covered in ice, the researchers found. Out of the total carbon footprint from food, 57 percent comes from field agriculture, livestock and farmed fish. Clearing land for agriculture accounts for 24 percent and transporting food accounts for another 6 percent.
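Using only the per-protein averages quoted above, the relative footprints are straightforward to compare; the short sketch below expresses each protein source's emissions as a multiple of tofu's.

    # Relative footprints per 100 g of protein, using only the averages quoted
    # above (kg CO2-equivalent).
    kg_co2e_per_100g_protein = {"beef": 50, "cheese": 11, "poultry": 5.7, "tofu": 2}

    baseline = kg_co2e_per_100g_protein["tofu"]
    for food, kg in sorted(kg_co2e_per_100g_protein.items(), key=lambda kv: -kv[1]):
        print(f"{food:8s} {kg:5.1f} kg CO2e  ({kg / baseline:.1f}x tofu)")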


Original Submission
