By joining the Open Invention Network, Microsoft is offering its entire patent portfolio -- with the legacy exception of its Windows and desktop application code -- to all of the open-source patent consortium's members.
Before Microsoft joined, OIN had more than 2,650 community members and owns more than 1,300 global patents and applications. OIN is the largest patent non-aggression community in history and represents a core set of open-source intellectual-property values. Its members include Google, IBM, Red Hat, and SUSE. The OIN patent license and member cross-licenses are available royalty-free to anyone who joins the OIN community.
This may be the biggest Microsoft news since the company "acquired" The Linux Foundation nearly two years ago, in November 2016.
Also at Ars Technica.
Last week, TSMC made two important announcements concerning its progress with extreme ultraviolet lithography (EUVL). First up, the company has successfully taped out its first customer chip using its second-generation 7 nm process technology, which incorporates limited EUVL usage. Secondly, TSMC disclosed plans to start risk production of 5 nm devices in April.
TSMC initiated high-volume manufacturing of chips using its first generation 7 nm fabrication process (CLN7FF, N7) in April. N7 is based around deep ultraviolet (DUV) lithography with ArF excimer lasers. By contrast, TSMC's second-generation 7 nm manufacturing technology (CLN7FF+, N7+) will use extreme ultraviolet lithography for four non-critical layers, mostly in a bid to speed up production and learn how to use ASML's Twinscan NXE step-and-scan systems for HVM. Factual information on the improvements from N7 to N7+ is rather limited: the new tech will offer a 20% higher transistor density (because of tighter metal pitch) and ~8% lower power consumption at the same complexity and frequency (between 6% and 12% to be more precise).
[...] After N7+ comes TSMC's first-generation 5 nm (CLN5FF, N5) process, which will use EUV on up to 14 layers. This will enable tangible improvements in terms of density, but will require TSMC to extensively use EUV equipment. When compared to N7, N5 technology will enable TSMC's customers to shrink the area of their designs by ~45% (i.e. the transistor density of N5 is ~1.8x higher than that of N7), increase frequency by 15% (at the same complexity and power), or reduce power consumption by 20% (at the same frequency and complexity).
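The quoted density figures are internally consistent; a quick sanity check shows how the ~45% area shrink implies the ~1.8x density ratio:

```python
# Sanity check of the scaling claims above: if an N5 design occupies
# ~45% less area than the same design on N7, the same transistor count
# fits in 55% of the area, so density rises by the reciprocal.
area_shrink = 0.45

density_ratio = 1 / (1 - area_shrink)
print(f"N5 density vs. N7: ~{density_ratio:.2f}x")  # ~1.82x, matching the quoted ~1.8x
```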
TSMC will be ready to start risk production of chips using its N5 tech in April 2019. Keeping in mind that it typically takes foundries and their customers about a year to get from risk production to HVM, it seems TSMC is on track for mass production of 5 nm chips in Q2 2020, just in time to address smartphones due in the second half of 2020.
Tape-out: the point at which a finished chip design is sent to the fab for manufacture. Risk production = early, limited-volume production.
Related: TSMC to Build 7nm Process Test Chips in Q1 2018
"3nm" Test Chip Taped Out by Imec and Cadence
TSMC Will Make AMD's "7nm" Epyc Server CPUs
GlobalFoundries Abandons "7nm LP" Node, TSMC and Samsung to Pick Up the Slack
"We have to be focused on what we want to enable," said Ben Gomes, Google's search engine chief. "And then when the opening happens, we are ready for it." It was Wednesday, July 18, and Gomes was addressing a team of Google employees who were working on a secretive project to develop a censored search engine for China, which would blacklist phrases like "human rights," "student protest," and "Nobel Prize."
"You have taken on something extremely important to the company," Gomes declared, according to a transcript of his comments obtained by The Intercept. "I have to admit it has been a difficult journey. But I do think a very important and worthwhile one. And I wish ourselves the best of luck in actually reaching our destination as soon as possible." [...] Gomes, who joined Google in 1999 and is one of the key engineers behind the company's search engine, said he hoped the censored Chinese version of the platform could be launched within six to nine months, but that it could be sooner. "This is a world none of us have ever lived in before," he said. "So I feel like we shouldn't put too much definite into the timeline."
[...] Google has refused to answer questions or concerns about Dragonfly. On Sept. 26, a Google executive faced public questions on the censorship plan for the first time. Keith Enright told the Senate Commerce, Science and Transportation Committee that there "is a Project Dragonfly," but said "we are not close to launching a product in China." When pressed to give specific details, Enright refused, saying that he was "not clear on the contours of what is in scope or out of scope for that project."
Senior executives at Google directly involved in building the censorship system have largely avoided any public scrutiny. But on Sept. 23, Gomes briefly addressed Dragonfly when confronted by a BBC reporter at an event celebrating Google's 20th anniversary. "Right now, all we've done is some exploration," Gomes told the reporter, "but since we don't have any plans to launch something, there's nothing much I can say about it." Gomes' statement was in keeping with the company's official line. But it flatly contradicted what he had privately told Google employees who were working on Dragonfly, which disturbed some of them. One Google source told The Intercept that Gomes' comments to the BBC were "bullshit."
Here's an article written by Dave Lee, the BBC reporter whom Ben Gomes misled.
Previously: Google Plans to Launch Censored Search Engine in China, Leaked Documents Reveal
Uproar at Google after News of Censored China Search App Breaks
"Senior Google Scientist" Resigns over Chinese Search Engine Censorship Project
Google Suppresses Internal Memo About China Censorship; Eric Schmidt Predicts Internet Split
The "ring rain" of material falling from Saturn's rings into the planet's atmosphere is a much more intense, contaminated downpour than scientists thought.
For decades, astronomers have suspected that Saturn's rings pelt the planet with grains of water ice, but some of the final observations from NASA's Cassini spacecraft provide the first detailed views of these celestial showers (SN: 4/14/18, p. 6). Ring rain is highly contaminated with organic matter and other molecules, and hammers Saturn at thousands of kilograms per second, researchers report in the Oct. 5 Science. Understanding the rain's surprising quantity and quality could help clarify the origins and evolution of Saturn's rings.
Researchers analyzed data collected by Cassini's Ion Neutral Mass Spectrometer during the spacecraft's final few orbits in 2017, as it sailed through the gap between Saturn and its innermost ring, known as the D ring (SN Online: 9/15/17). Water constituted only about 24 percent of the material tumbling from Saturn's ring system into its atmosphere; the rest was methane, carbon monoxide, dinitrogen, ammonia, carbon dioxide and fragments of organic nanoparticles.
The ring rain's diverse chemical composition "was a big surprise," because remote observations show that Saturn's ring system, on the whole, is almost entirely water ice, says Cassini project scientist Linda Spilker of NASA's Jet Propulsion Laboratory in Pasadena, Calif., who wasn't involved in the study. Researchers aren't sure why ring rain is so deprived of water.
Submitted via IRC for chromas
New computerized weapons systems currently under development by the US Department of Defense (DOD) can be easily hacked, according to a new report published today.
The report was put together by the US Government Accountability Office (GAO), an agency that provides auditing, evaluation, and investigative services for Congress.
Congress ordered the GAO report in preparation for approving over $1.66 trillion in DOD funding, so the Pentagon could expand its weapons portfolio with new toys in the coming years.
But according to the new report, GAO testers "playing the role of adversary" found a slew of vulnerabilities of all sorts affecting these new weapons systems.
"Using relatively simple tools and techniques, testers were able to take control of systems and largely operate undetected, due in part to basic issues such as poor password management and unencrypted communications," GAO officials said.
The report detailed some of the most eye-catching hacks GAO testers performed during their analysis.
In one case, it took a two-person test team just one hour to gain initial access to a weapon system and one day to gain full control of the system they were testing.
Some programs fared better than others. For example, one assessment found that the weapon system satisfactorily prevented unauthorized access by remote users, but not insiders and near-siders. Once they gained initial access, test teams were often able to move throughout a system, escalating their privileges until they had taken full or partial control of a system.
In one case, the test team took control of the operators' terminals. They could see, in real-time, what the operators were seeing on their screens and could manipulate the system. They were able to disrupt the system and observe how the operators responded.
Another test team reported that they caused a pop-up message to appear on users' terminals instructing them to insert two quarters to continue operating.
Multiple test teams reported that they were able to copy, change, or delete system data, including one team that downloaded 100 gigabytes, approximately 142 compact discs, of data.
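The GAO's "approximately 142 compact discs" comparison follows from the standard 700 MB capacity of a CD-R (assuming decimal gigabytes):

```python
# Reproducing the report's CD comparison: 100 GB of exfiltrated data
# divided by a standard 700 MB CD-R capacity.
data_gb = 100
cd_capacity_mb = 700

cds = data_gb * 1000 // cd_capacity_mb
print(f"{data_gb} GB is about {cds} CDs")  # about 142 CDs
```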
The report claims the DOD documented many of these "mission-critical cyber vulnerabilities," but Pentagon officials who met with GAO testers claimed their systems were secure, and "discounted some test results as unrealistic."
As you probably have noticed, our site has been a bit sluggish lately.
We are aware of the issue and are developing plans for dealing with it. The primary issue lies in the database structure and contents. On-the-fly joins across multiple tables cause a performance hit which is exacerbated by the number of stories we have posted over the years (yes, it HAS been that long... YAY!). Further, stories which have been "archived" — allowing no further comments or moderation — are still sitting in the in-RAM DB and could be offloaded to disk for long-term access. Once offloaded, there would be much less data in the in-RAM database (queries against empty DBs tend to be pretty quick!) so this should result in improved responsiveness.
A complicating factor is that changing the structure on a live, replicated database would cause nearly every page load to 500 out. So the database has to be taken offline and the code updated. That downtime would likely amount to the better part of a day. Obviously, shorter is better. On the other hand, "The longest distance between two points is a short cut." We're aiming to do it right, the first time, and be done with it, rather than doing it quick-and-dirty, which usually ends up being not quick and quite dirty.
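The archiving scheme described above (copy closed stories out of the hot database into long-term storage, then delete them from the hot store) can be sketched roughly as follows. This is an illustrative sketch only: SoylentNews runs rehash on MySQL, and the table and column names here ("stories", "archived") are invented for the example.

```python
import sqlite3

# Hot store stands in for the in-RAM database; the archive would be an
# on-disk file in practice (":memory:" here just keeps the sketch self-contained).
hot = sqlite3.connect(":memory:")
hot.execute("CREATE TABLE stories (id INTEGER PRIMARY KEY, title TEXT, archived INTEGER)")
hot.executemany("INSERT INTO stories VALUES (?, ?, ?)",
                [(1, "old story", 1), (2, "new story", 0)])

archive = sqlite3.connect(":memory:")
archive.execute("CREATE TABLE IF NOT EXISTS stories (id INTEGER PRIMARY KEY, title TEXT)")

# Copy archived rows to long-term storage, then drop them from the hot DB.
rows = hot.execute("SELECT id, title FROM stories WHERE archived = 1").fetchall()
with archive:
    archive.executemany("INSERT OR REPLACE INTO stories VALUES (?, ?)", rows)
with hot:
    hot.execute("DELETE FROM stories WHERE archived = 1")
```

With the archived rows gone, queries against the hot store touch far less data, which is the responsiveness win described above.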
So, we ARE aware of the performance issues, are working towards a solution, and don't want to cause any more disruption than absolutely necessary.
We will give notice well in advance of taking any actions.
Just weeks after Hurricane Florence battered the US southeast with historic rains and flooding, another major hurricane is now bearing down on the Florida panhandle with 145 mph (~240 kph) winds and heavy rains forecast. From Ars Technica:
Hurricane Michael continued to intensify during Tuesday night, bringing an unprecedentedly strong storm to the northwest Florida coast on Wednesday. This is a serious situation for the Florida Panhandle and downstream areas in southeastern Alabama, Georgia, and the Carolinas.
As of the National Hurricane Center's 9 a.m. ET update, Michael had 145 mph sustained winds, solidly in the range of a Category 4 major hurricane. Winds along the Florida coast were already rising above tropical storm strength at the time, all but closing the window for further evacuations as the storm nears shore and moves inland later today.
Perhaps most concerning, Michael's central pressure continued to fall during the overnight hours, down to 933 millibars by Wednesday morning. This is an indication of the storm's organization, and with Michael's satellite appearance actually improving as the storm approaches land, some slight further intensification is possible today before landfall near Panama City. If Michael's central pressure falls further, to 930 millibars, it would rank among the 10 most intense hurricanes to make landfall in the US on record in terms of central pressure.
Meteorologists are reacting to the rapidly intensifying storm with some measure of alarm. Mike Bettes, a meteorologist with The Weather Channel, noted on Twitter Wednesday morning that his crew was pulling out of Apalachicola, a small coastal community to the right of Michael's projected landfall that will likely bear the brunt of the storm's winds and surge.
If you are in the path of this storm, please take whatever measures necessary to keep yourself safe.
Submitted via IRC for Bytram
The first-ever detection of highly energetic radiation from a microquasar has astrophysicists scrambling for new theories to explain the extreme particle acceleration. A microquasar is a black hole that gobbles up debris from a nearby companion star and blasts out powerful jets of material.
"What's amazing about this discovery is that all current particle acceleration theories have difficulties explaining the observations," said Hui Li, a theorist in Los Alamos National Laboratory's Theoretical Division who served on the team. "This surely calls for new ideas on particle acceleration in microquasars and black hole systems in general."
The team's observations, described in the Oct. 4 issue of the journal Nature, strongly suggest that particle collisions at the ends of the microquasar's jets produced the powerful gamma rays. Scientists think that studying messages from this microquasar, dubbed SS 433, may offer a glimpse into more extreme events happening at the centers of distant galaxies.
A. U. Abeysekara, et al. Very-high-energy particle acceleration powered by the jets of the microquasar SS 433. Nature, 2018; 562 (7725): 82 DOI: 10.1038/s41586-018-0565-5
Five days after Bloomberg stunned the world with still-unconfirmed allegations that Chinese spies embedded data-sniffing chips in hardware used by Apple, Amazon, and dozens of other companies, the news organization is doubling down. Bloomberg is now reporting that a different factory-seeded manipulation from the previously described one was discovered in August inside the network of a major US telecommunications company.
Bloomberg didn't name the company, citing a non-disclosure agreement between the unnamed telecom and the security firm it hired to scan its data centers. AT&T, Sprint and T-Mobile all told Ars they weren't the telecom mentioned in the Bloomberg post. Verizon and CenturyLink also denied finding backdoored Supermicro hardware in their datacenters, Motherboard reported.
Tuesday's report cites documents, analysis, and other evidence provided by Yossi Appleboum, who is co-CEO of a hardware security firm called Sepio Systems. Bloomberg said that, while Sepio was scanning servers belonging to the unnamed telecom, the firm detected unusual communications from a server designed by Supermicro. Supermicro, according to last week's Bloomberg report, is the hardware manufacturer whose motherboards were modified in the factory to include a tiny microchip that caused attached servers to come under the control of a previously unreported division of China's People's Liberation Army. Supermicro told Bloomberg it had no knowledge of the implant, marking the second time the hardware maker has denied knowing anything about the reported manipulations.
[...] The criticism was still at full pitch on Tuesday morning when Bloomberg published its follow-up article. While it names a single source, some security experts quickly challenged the credibility of the report. "Sure this story has one named source but it technically makes even less sense than the first one," Cris Thomas, a security expert who tweets under the handle SpaceRogue, wrote. "Come on @Bloomberg get somebody who knows what they're talking about to write these stories. Calling BS on this one as well."
Arthur T Knackerbracket has found the following story:
For one brief shining moment after the 2015 detection of gravitational waves from colliding black holes, astronomers held out hope that the universe's mysterious dark matter might consist of a plenitude of black holes sprinkled throughout the universe.
UC Berkeley physicists have dashed those hopes.
Based on a statistical analysis of 740 of the brightest supernovas discovered as of 2014, and the fact that none of them appear to be magnified or brightened by hidden black hole "gravitational lenses," the researchers concluded that primordial black holes can make up no more than about 40 percent of the dark matter in the universe. Primordial black holes could only have been created within the first milliseconds of the Big Bang as regions of the universe with a concentrated mass tens or hundreds of times that of the sun collapsed into objects a hundred kilometers across.
The results suggest that none of the universe's dark matter consists of heavy black holes, or any similar object, including massive compact halo objects, so-called MACHOs.
-- submitted from IRC
Arthur T Knackerbracket has found the following story:
Ken Bowles, a UC San Diego software engineer who helped popularize personal computers in the 1970s and '80s through advances that were exploited by such entrepreneurs as Apple's Steve Jobs, died on Aug. 15 in Solana Beach. He was 89.
His passing was announced by the university, which said that Bowles, an emeritus professor of computer science, had died peacefully.
Bowles was not well-known to the general public. But he was famous in computer science for helping researchers make the leap from huge, expensive mainframe computers to small "microcomputers," the forerunner of PCs.
He was driven by the desire to make it faster and easier for researchers and programmers to work on their own, and to develop software that could be used on many types of computers.
By 1968, Bowles found himself in the perfect spot to push his vision. He was appointed director of the university's computer center, just three years after joining the faculty.
University historians say Bowles taught his students to write and rewrite code on the world's first microprocessors, the chips that revolutionized the computer industry in the 1970s. They were soon writing programs expressly for microcomputers, bypassing mainframes.
Bowles and his team also adopted and modified Pascal, an early programming language that was opening up computer science. The modified version became known as UCSD Pascal and was widely used to teach people how to program.
[...] "The development of UCSD Pascal was a transformative event not just for UCSD but for all of computer science," according to a statement by Dean Tullsen, chair of the department of computer science and engineering at UC San Diego.
"It was arguably the first high-level programming system that both worked on small systems that schools, most businesses, and eventually individuals could afford, and was portable across many systems."
-- submitted from IRC
Arthur T Knackerbracket has found the following story:
A new study led by an infectious disease epidemiologist at Tulane University School of Public Health and Tropical Medicine could change the way doctors treat a common sexually transmitted disease.
Professor Patricia Kissinger and a team of researchers found the recommended single dose of medication isn't enough to eliminate trichomoniasis, the most common curable STD, which can cause serious birth complications and make people more susceptible to HIV. Results of the research are published in Lancet Infectious Diseases.
Globally, an estimated 143 million new cases of trichomoniasis among women occur each year and most do not have symptoms, yet the infection is causing unseen problems. The recommended treatment for more than three decades has been a single dose of the antibiotics metronidazole or tinidazole.
The researchers recruited more than 600 women for the randomized trial in New Orleans; Jackson, Mississippi; and Birmingham, Alabama. Half the women took a single dose of metronidazole and the other half received treatment over seven days.
Kissinger and her team found the women who received multiple doses of the treatment were half as likely to still have the infection after taking all the medication compared to women who only took a single dose.
-- submitted from IRC
Patricia Kissinger, et al. Single-dose versus 7-day-dose metronidazole for the treatment of trichomoniasis in women: an open-label, randomised controlled trial. The Lancet Infectious Diseases, 2018; DOI: 10.1016/S1473-3099(18)30423-7
Arthur T Knackerbracket has found the following story:
This won't be good news for criminals who forget their gloves.
Researchers at Flinders University in South Australia have developed a new test that can measure the amount of DNA we "shed", the university revealed Friday. This could mean that a single tap of a finger on a door handle can be used to link potential suspects to a crime, and also potentially reveal who an item last came into contact with.
In a test involving 11 donors, the researchers took 264 fingerprints, getting participants to wash their hands, then give prints of their thumbs at intervals of 180 minutes.
"We know that some people pass on more of their DNA because when they touch something more of their cells are left behind," lead researcher Adrian Linacre said. "They are called shedders but it's very difficult at the moment to see who is a shedder."
The researchers developed a dye which can identify deposits of DNA at a crime scene, accurately pointing investigators to where DNA samples lie, instead of leaving them to guess.
"The dye binds within a number of seconds... certainly within 10 seconds we can see all the DNA that's there, and we can count it," Linacre told the Australian Broadcasting Corporation Friday. This lets the researchers determine whether someone is a "shedder".
"By counting the amount of cellular material, which appears as green dots, we know if someone's a heavy shedder or a poor shedder," Linacre told the ABC.
-- submitted from IRC
Intel — or to be precise, a company Intel hired to create a whitepaper on Core i9 gaming performance — has crossed that line. According to Forbes, Intel contracted with Principled Technologies to distribute a whitepaper containing various claims about gaming performance between Intel's upcoming Core i9-9900K and Core i7-8700K and the AMD Threadripper 2990WX, 2950X, and Ryzen 7 2700X. With AMD having surged into competitive positioning in the past 18 months and Intel taking heat from its 10nm delays, Chipzilla has every reason to push a narrative that puts it in the driving seat of gaming. But Intel is using this whitepaper to claim that it's up to 50 percent faster than AMD in gaming based on Ashes of the Singularity in particular, and that's where the problems start. The Intel results are somewhat higher than we'd expect, but the AMD CPUs — particularly the Ryzen 7 2700X — are crippled.
There are several problems with the AMD benchmarks as run by Principled Technologies. PT was careful to document its own configuration steps on both systems, which is why we know what, precisely, the company did wrong. First, the Ryzen systems were tested without XMP enabled. XMP is the high-end memory timing standard that enthusiast kits use to hit maximum performance, and Ryzen gaming performance is often tied directly to its RAM clock and sub-timings. Using substandard timings could lower Ryzen's performance by 5-15 percent. Second, all of the benchmarks in question were run using a GTX 1080 Ti and a resolution of just 1080p. If you wanted to create a report tailor-made to Ryzen's weaknesses, that's the resolution you'd use. Unfair? Not necessarily — it's the most common resolution after all. But there's a reason we include 1440p and 4K results in our resolution comparisons for gaming, and Intel/Principled didn't do so.
Third, Principled Technologies notes that it enabled "Game Mode" in AMD's Ryzen Master utility. The implication is that it did this on both systems. This can have serious side effects on how well an AMD system benchmarks. On Threadripper, engaging Game Mode cuts the CPU core count in half and enables NUMA to allow the remaining CPU cores to schedule workloads on the cores closest to the memory controllers. On Ryzen 7, clicking Game Mode just cuts the core count in half. That's why AMD's user guide for Ryzen 7 specifically states that Game Mode is reserved principally for Threadripper and that Ryzen customers shouldn't use it [...] the 50 percent performance gain that Intel claims for itself is exactly the kind of result we'd expect if the 2700X had been crippled by having its CPU neutered.
In addition to what is mentioned above, AMD's stock Ryzen 7 2700X Wraith Prism cooler was used for the AMD system while a premium Noctua NH-U14S cooler was used for the Intel system. This could allow the system to hit higher frequencies for longer periods of time.
If you think Apple products are overpriced now, wait until they’re 50 years old.
This original Apple I recently sold at auction for $375,000, making it one of the most expensive 6502-based computers in history. Given that only something like 60 or 70 of the machines were ever made, most built by hand by [Jobs] and [Wozniak], it’s understandable how collectors fought for the right to run the price up from the minimum starting bid of $50,000. And this one was particularly collectible. According to the prospectus, this machine had few owners, the most recent of whom stated that he attended a meeting of the legendary Homebrew Computer Club to see what all the fuss was. He bought it second-hand from a coworker for $300, fiddled with it a bit, and stashed it in a closet. A few years later, after the Apple ][ became a huge phenomenon, he tried to sell the machine to [Woz] for $10,000. [Woz] didn’t bite, and as a result, the owner realized a 125,000% return on his original investment, before inflation.
The machine was restored before hitting the auction block, although details of what was done were not shared. But it couldn’t have been much since none of the previous owners had even used the prototyping area that was so thoughtfully provided on the top edge of the board. It was sold with period-correct peripherals including a somewhat janky black-and-white security monitor, an original cassette tape interface, and a homebrew power supply. Sadly, there’s no word who bought the machine – it was an anonymous purchase.
Submitted via IRC for Bytram
Car accidents, sports injuries, even too much typing and texting can injure the peripheral nerves, leaving people with numbness, tingling and weakness in their hands, arms or legs. Recovery can take months, and doctors have little to offer to speed it along.
Now, researchers at Washington University School of Medicine in St. Louis and Northwestern University have developed an implantable, biodegradable device that delivers regular pulses of electricity to damaged peripheral nerves in rats, helping the animals regrow nerves in their legs and recover their nerve function and muscle strength more quickly. The size of a quarter, the device lasts about two weeks before being completely absorbed into the body. The findings are published Oct. 8 in Nature Medicine.
For most people with peripheral nerve injuries, doctors suggest painkillers such as aspirin and physical therapy. Severe cases may require surgery, and standard practice is to administer some electrical stimulation to the injured nerves during the surgery to aid recovery. "We know that electrical stimulation during surgery helps, but once the surgery is over, the window for intervening is closed," said co-senior author Wilson "Zack" Ray, MD, an associate professor of neurosurgery, of biomedical engineering and of orthopedic surgery at Washington University. "With this device, we've shown that electrical stimulation given on a scheduled basis can further enhance nerve recovery."
Wireless bioresorbable electronic system enables sustained nonpharmacological neuroregenerative therapy (DOI: 10.1038/s41591-018-0196-2) (DX)