posted by Fnord666 on Tuesday January 08 2019, @11:39PM
from the einstein-dismisses-india-scientists dept.


Some academics at the annual Indian Science Congress dismissed the findings of Isaac Newton and Albert Einstein.

Hindu mythology and religion-based theories have increasingly become part of the Indian Science Congress agenda.

But experts said remarks at this year's summit were especially ludicrous.

[...] The head of a southern Indian university cited an old Hindu text as proof that stem cell research was discovered in India thousands of years ago.

G Nageshwar Rao, vice chancellor of Andhra University, also said a demon king from the Hindu religious epic, the Ramayana, had 24 types of aircraft and a network of landing strips in modern-day Sri Lanka.

Another scientist from a university in the southern state of Tamil Nadu told conference attendees that Isaac Newton and Albert Einstein were both wrong and that gravitational waves should be renamed "Narendra Modi Waves" [Narendra Modi is the current Prime Minister of India].

Original Submission

posted by Fnord666 on Tuesday January 08 2019, @10:02PM
from the seeing-the-future dept.

CES 2019 Quick Bytes: Consumer 10nm is Coming with Intel's Ice Lake

We've been on Intel's case for years to tell us when its 10nm parts are coming to the mass market. Technically Intel already shipped its first 10nm processor, Cannon Lake, but this was low volume and limited to specific geographic markets. This time Intel is promising that its first volume consumer processor on 10nm will be Ice Lake. It should be noted that Intel hasn't put a date on Ice Lake launching, but has promised 10nm on shelves by the end of 2019. It has several products that could qualify for that, but Ice Lake is the likely suspect.

At Intel's Architecture Day in December, we saw chips designated as 'Ice Lake-U', built for 15W TDPs with four cores using the new Sunny Cove microarchitecture and Gen11 graphics. Intel went into some details about this part, which we can share with you today.

The 15W processor is a quad core part supporting two threads per core, and will have 64 EUs of Gen11 graphics. 64 EUs will be the standard 'GT2' mainstream configuration for this generation, up from 24 EUs today. In order to drive that many execution units, Intel stated that they need 50-60 GB/s of memory bandwidth, which will come from LPDDR4X memory. In order for those numbers to line up, they will need LPDDR4X-3200 at a minimum, which gives 51.2 GB/s. [...] For connectivity, the chips will support Wi-Fi 6 (802.11ax) if the laptop manufacturer uses the correct interface module, but the support for Wi-Fi 6 is in the chip. The processor also supports native Thunderbolt 3 over USB Type-C, marking the first Intel chip with native TB3 support.
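The bandwidth figure above follows directly from the memory transfer rate and bus width. A quick sanity check (the 128-bit bus width is an assumption on my part; the article only gives the transfer rate and the resulting bandwidth):

```python
def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak DRAM bandwidth in GB/s: mega-transfers per second times bytes per transfer."""
    bytes_per_transfer = bus_width_bits // 8
    return transfer_rate_mts * 1e6 * bytes_per_transfer / 1e9

# LPDDR4X-3200 on an assumed 128-bit bus, matching the 51.2 GB/s minimum quoted above
print(peak_bandwidth_gbs(3200, 128))  # 51.2
```

At LPDDR4X-4267, the same assumed bus would deliver about 68 GB/s, comfortably inside the 50-60 GB/s range Intel stated.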

CES 2019 Quick Bytes: Intel's 10nm Hybrid x86 Foveros Chip is Called Lakefield

The reason this chip exists is because one of Intel's customers requested a processor with integrated graphics that can idle at 2 milliwatts. After a few years of engineering, Intel is finally there. There's also another trick at play here.

The chip uses a combination of Intel's high power and low power cores. Inside the new chip, which Intel announced at CES is called Lakefield, is one of its high-powered Core architecture Sunny Cove cores, and four low-powered Tremont Atom cores. This is the first Intel chip, or consumer chip at least, to use both core designs at once. This is fairly common for Arm chips in smartphones, but we have not seen it yet in the PC space. We have a block diagram showing cache layouts and things, and at the first showing, Intel's Jim Keller said that the company was having fun with the technology, designing things that could become future parts.

Intel is also announcing an "AI" focused chip that will compete with Nvidia's similar GPU products:

Intel has just announced a brand new class of AI processor: the Intel Nervana NNP-1. This is one of the first truly powerful AI processors that Intel has promised to produce. All previous AI chips the company made drew milliwatts of power; this one is going to draw "hundreds of watts". While no specific details were given in the demo, it was implied that the technology will take advantage of Intel's DL Boost technology to offer a CPU-based competitor to GPUs.

See also: Intel's Keynote at CES 2019: 10nm, Ice Lake, Lakefield, Snow Ridge, Cascade Lake
Intel's New 9th Gen Desktop CPUs: i3-9350KF, i5-9400F, i5-9400, i5-9600KF, i7-9700KF, i9-9900KF

Original Submission

posted by Fnord666 on Tuesday January 08 2019, @08:25PM
from the scoping-things-out dept.

Submitted via IRC for takyon

New BGU System Produces High-Res Images at Low Cost

A study in the December issue of the journal Optica demonstrated that nanosatellites the size of milk cartons, arranged in a circular (annular) configuration, were able to capture images that match the resolution of the full-frame, lens-based or concave mirror systems used on today's telescopes.

BGU Ph.D. candidate Angika Bulbul, working under the supervision of Prof. Joseph Rosen of BGU's Department of Electrical and Computer Engineering, explains the groundbreaking nature of this study, saying it proves that a high-resolution image can be generated even with a partial aperture, which reduces the cost of traditionally large telescopic lenses.

"We found that you don't need the entire telescope lens to obtain the right images. Even by using a partial aperture area of a lens, as low as 0.43%, we managed to obtain a similar image resolution to the full aperture area of mirror or lens-based imaging system. The huge cost, time and material needed for gigantic traditional optical space telescopes with large curved mirrors can be slashed," she said.
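The physics behind the claim is that diffraction-limited angular resolution depends on the outer diameter of the aperture, not on how much of it is filled: a sparse ring spanning the same diameter as a filled mirror has the same resolution limit while collecting only a tiny fraction of the light. A sketch of the Rayleigh criterion (the 550 nm wavelength and 2.4 m diameter are illustrative values, not figures from the study):

```python
import math

def rayleigh_limit_arcsec(wavelength_m: float, diameter_m: float) -> float:
    """Diffraction-limited angular resolution (Rayleigh criterion), in arcseconds."""
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

# A filled 2.4 m mirror and a thin ring 2.4 m across share the same limit,
# because only the outer diameter enters the formula.
print(f"{rayleigh_limit_arcsec(550e-9, 2.4):.3f} arcsec")  # 0.058 arcsec
```

What a sparse aperture gives up is light-gathering area (and it adds sidelobe artifacts that must be handled computationally), not the baseline that sets resolution.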

Original Submission

posted by Fnord666 on Tuesday January 08 2019, @06:48PM
from the RIP dept.

Submitted via IRC for Bytram

ARP Founder Alan R. Pearlman Has Died

Alan R. Pearlman, engineer and founder of the pioneering synth manufacturer ARP Instruments, died yesterday at the age of 93.

His daughter, Dina Pearlman, shared the news:

My father passed away today after a long illness.

At 93, too weak to speak, he still managed to play the piano this morning, later passing away peacefully in the afternoon. He was a great man and contributed much to the world of music you all know today.

Hopefully I can find something more eloquent to say, but I am too sad for words right now.

Pearlman (1925–2019) founded ARP Instruments, Inc. (originally Tonus, Inc.) in 1969, in the very early days of the synth industry. "ARP" was Pearlman's nickname as a kid growing up in New York City.

Original Submission

posted by Fnord666 on Tuesday January 08 2019, @05:11PM
from the making-a-difference dept.

Submitted via IRC for takyon

Can a set of equations keep U.S. census data private?

The U.S. Census Bureau is making waves among social scientists with what it calls a "sea change" in how it plans to safeguard the confidentiality of data it releases from the decennial census.

The agency announced in September 2018 that it will apply a mathematical concept called differential privacy to its release of 2020 census data after conducting experiments that suggest current approaches can't assure confidentiality. But critics of the new policy believe the Census Bureau is moving too quickly to fix a system that isn't broken. They also fear the changes will degrade the quality of the information used by thousands of researchers, businesses, and government agencies.

The move has implications that extend far beyond the research community. Proponents of differential privacy say a fierce, ongoing legal battle over plans to add a citizenship question to the 2020 census has only underscored the need to assure people that the government will protect their privacy.

[...] Differential privacy, first described in 2006, isn't a substitute for swapping and other ways to perturb the data. Rather, it allows someone—in this case, the Census Bureau—to measure the likelihood that enough information will "leak" from a public data set to open the door to reconstruction.

"Any time you release a statistic, you're leaking something," explains Jerry Reiter, a professor of statistics at Duke University in Durham, North Carolina, who has worked on differential privacy as a consultant with the Census Bureau. "The only way to absolutely ensure confidentiality is to release no data. So the question is, how much risk is OK? Differential privacy allows you to put a boundary" on that risk.

A database can be considered differentially private if the information it yields about someone doesn't depend on whether that person is part of the database. Differential privacy was originally designed to apply to situations in which outsiders make a series of queries to extract information from a database. In that scenario, each query consumes a little bit of what the experts call a "privacy budget." After that budget is exhausted, queries are halted in order to prevent database reconstruction.
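The query-and-budget mechanism described above can be sketched with the classic Laplace mechanism. This is a toy illustration, not the Census Bureau's actual implementation; the `PrivateCounter` class and its parameters are invented for the example:

```python
import math
import random

class PrivateCounter:
    """Toy Laplace mechanism with a simple privacy budget (illustrative only).

    Each count query is answered with Laplace(1/eps) noise and spends eps of
    the total budget; once the budget is exhausted, further queries are refused.
    """

    def __init__(self, true_count: int, total_budget: float):
        self.true_count = true_count
        self.remaining = total_budget

    def query(self, eps: float) -> float:
        if eps > self.remaining:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= eps
        # Inverse-CDF sample of a zero-mean Laplace variate with scale 1/eps
        u = random.random() - 0.5
        noise = -(1.0 / eps) * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
        return self.true_count + noise

# Two queries at eps = 0.5 each exhaust a total budget of 1.0
counter = PrivateCounter(true_count=1000, total_budget=1.0)
print(counter.query(0.5))  # roughly 1000, give or take a few units of noise
print(counter.query(0.5))
```

A third `counter.query(...)` after the budget is spent raises an error, mirroring the halt-queries behavior the experts describe; smaller eps per query means more noise but a budget that stretches further.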

In the case of census data, however, the agency has already decided what information it will release, and the number of queries is unlimited. So its challenge is to calculate how much the data must be perturbed to prevent reconstruction.

Original Submission

posted by The Mighty Buzzard on Tuesday January 08 2019, @04:32PM
from the spit-and-baling-wire dept.

After a morning of slow query logging and cussing, a misplaced "GROUP BY" that was turning a 0.04 second query into one that took over fifteen seconds has been fixed. I'll leave the slow query log running overnight though just to make sure I didn't miss any less common ones. If you're still seeing any serious site slowdowns, let us know.

posted by Fnord666 on Tuesday January 08 2019, @03:44PM
from the shave-and-a-haircut-qubits dept.

Quantum scientists demonstrate world-first 3D atomic-scale quantum chip architecture

UNSW researchers at the Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) have shown for the first time that they can build atomic precision qubits in a 3D device -- another major step towards a universal quantum computer.

The team of researchers, led by 2018 Australian of the Year and Director of CQC2T Professor Michelle Simmons, have demonstrated that they can extend their atomic qubit fabrication technique to multiple layers of a silicon crystal -- achieving a critical component of the 3D chip architecture that they introduced to the world in 2015. This new research was published today in Nature Nanotechnology.

The group is the first to demonstrate the feasibility of an architecture that uses atomic-scale qubits aligned to control lines -- which are essentially very narrow wires -- inside a 3D design.

What's more, the team was able to align the different layers in their 3D device with nanometer precision -- and showed they could read out qubit states in a single shot, i.e. within one measurement, with very high fidelity.

Spin read-out in atomic qubits in an all-epitaxial three-dimensional transistor (DOI: 10.1038/s41565-018-0338-1) (DX)

Related: UNSW Proposes 7nm Architecture for Quantum Computing Chips

Original Submission

posted by Fnord666 on Tuesday January 08 2019, @02:07PM
from the zesty-sauce dept.

Gene editing could create spicy tomatoes, say researchers

Spicy tomatoes could soon be on the menu thanks to the rise of genome-editing technology, say researchers. It is not the first time experts have claimed the techniques could help to precisely and rapidly develop fruits and vegetables with unusual traits: scientists have already been looking at changing the colour of kiwi fruits and tweaking the taste of strawberries.

But researchers in Brazil and Ireland say such methods also could offer practical advantages, with spicy tomatoes offering a way of harvesting capsaicinoids, the pungent chemicals found in chilli peppers.

[...] Tomatoes and chilli peppers developed from a common ancestor but diverged about 19 million years ago. "All the genes to produce capsaicinoids exist in the tomato, they are just not active," Zsögön said.

Capsaicinoids: Pungency beyond Capsicum (open, DOI: 10.1016/j.tplants.2018.11.001) (DX)

Original Submission

posted by martyb on Tuesday January 08 2019, @12:30PM
from the if-your-parents-didn't-have-children,-then-you-probably-won't,-either dept.

Monogamy may have a telltale signature of gene activity

In the animal world, monogamy has some clear perks. Living in pairs can give animals some stability and certainty in the constant struggle to reproduce and protect their young—which may be why it has evolved independently in various species. Now, an analysis of gene activity within the brains of frogs, rodents, fish, and birds suggests there may be a pattern common to monogamous creatures. Despite very different brain structures and evolutionary histories, these animals all seem to have developed monogamy by turning on and off some of the same sets of genes.

"It is quite surprising," says Harvard University evolutionary biologist Hopi Hoekstra, who was not involved in the new work. "It suggests that there's a sort of genomic strategy to becoming monogamous that evolution has repeatedly tapped into."

Conserved transcriptomic profiles underpin monogamy across vertebrates (DOI: 10.1073/pnas.1813775116) (DX)

Original Submission

posted by martyb on Tuesday January 08 2019, @10:53AM
from the to-the-moon-and-back dept.

Soon, three companies will be able to perform resupply missions for the International Space Station, and that may be one too many:

How Sierra Nevada's "Dream Chaser" Could Become a Nightmare for Northrop Grumman

[Sierra Nevada Corporation (SNC)] intends to perform its obligations under [Commercial Resupply Services (CRS-2)] using its new "Dream Chaser" spaceplane, a privately developed space shuttle (but only one-quarter the size of the Space Shuttle) that will launch into orbit atop a rocket, make its delivery, then land back on Earth under its own power like an airplane.

[...] Dream Chaser is designed to be reusable, with a service life of 15 missions. In this regard, SNC is similar to SpaceX, which sends cargo to ISS aboard reusable Dragon space capsules launched into orbit by also-reusable Falcon rockets. Utilizing reusable spacecraft, both SNC and SpaceX should be able to save considerably on the cost of their missions, because they will not need to build new spacecraft for each supply run. In contrast, Northrop Grumman performs its ISS resupply missions using disposable Cygnus cargo capsules carried by expendable Antares rockets -- likely a more expensive proposition.

[...] Currently, plans are for SNC to purchase Atlas V rockets from United Launch Alliance for this purpose. But in 2016, SNC's then-VP of Space Systems John Olson let on that SNC was designing the spaceplane to be "agnostic" as to which launcher it uses to get into orbit. So in theory, at least, SNC could use a SpaceX Falcon rocket to carry Dream Chaser instead. Because SpaceX's Falcons are cheaper than the expendable rockets used by other space launch companies, this would probably result in a lower launch cost for SNC (and it could be lower still if SNC uses reusable Falcons).

Granted, this would necessitate giving money to a competitor. However, seeing as Sierra Nevada is going to have to buy its launch vehicles from somebody, it might as well buy them from the cheapest provider. And if it does so, this will almost certainly mean that not only SpaceX, but SNC, too, can bid below what Northrop Grumman must charge to perform CRS-2 supply missions for NASA -- giving SNC a leg up in future competitions to resupply ISS.

Related: United Nations to Launch a Space Mission
NASA to Continue Funding Private Spaceflight, Considers Sixth Hubble Upgrade Mission

Original Submission

posted by martyb on Tuesday January 08 2019, @09:16AM
from the picture-this dept.

The NVIDIA GeForce RTX 2060 6GB Founders Edition Review: Not Quite Mainstream

In the closing months of 2018, NVIDIA finally released the long-awaited successor to the Pascal-based GeForce GTX 10 series: the GeForce RTX 20 series of video cards. Built on their new Turing architecture, these GPUs were the biggest update to NVIDIA's GPU architecture in at least half a decade, leaving almost no part of NVIDIA's architecture untouched.

So far we've looked at the GeForce RTX 2080 Ti, RTX 2080, and RTX 2070 – and along with the highlights of Turing, we've seen that the GeForce RTX 20 series is designed on a hardware and software level to enable realtime raytracing and other new specialized features for games. While the RTX 2070 is traditionally the value-oriented enthusiast offering, NVIDIA's higher price tags this time around meant that even this part was $500 and not especially value-oriented. Instead, it would seem that the role of the enthusiast value offering is going to fall to the next member in line of the GeForce RTX 20 family. And that part is coming next week.

Launching next Tuesday, January 15th is the 4th member of the GeForce RTX family: the GeForce RTX 2060 (6GB). Based on a cut-down version of the same TU106 GPU that's in the RTX 2070, this new part shaves off some of RTX 2070's performance, but also a good deal of its price tag in the process.

Previously: Nvidia Announces RTX 2080 Ti, 2080, and 2070 GPUs, Claims 25x Increase in Ray-Tracing Performance

Original Submission

posted by martyb on Tuesday January 08 2019, @07:39AM
from the now-THAT-is-fast-(for-humans) dept.

A mission to send a spacecraft covering a distance of 20 astronomical units[*] per year could be used to explore interesting targets in the Kuiper belt:

The proposed interstellar probe itself is a suggestion that's been kicking around for a while now. The idea is that the team could use existing and near-term technology, and rely on speed boosted by a sequence of gravity assists, to send a spacecraft racing across the solar system faster than any vessel to date.

[...] Once instruments are set, it's a matter of picking a dream destination — or several. [Kathleen Mandt, a planetary scientist at the Johns Hopkins University Applied Physics Laboratory,] and other team members have studied how potential targets will align, assuming the probe could launch by 2030 as desired. That means looking at a whole host of Kuiper Belt objects found beyond Neptune's orbit.

Take, for example, Quaoar, a Kuiper Belt object that's about half as wide as Pluto. Scientists have spotted the signature of methane on this object's surface, which could mean it still clings to a thin atmosphere. But scientists aren't sure how much Quaoar resembles its larger, more famous cousin.

And Quaoar is just one potential large target. "I would love to do a flyby of Eris, because it's similar in size to Pluto but farther out in the solar system," Mandt said. In particular, she would want to pursue planetology, investigating how Eris matches or differs from Pluto. She'd want to be able to answer questions like whether Eris has an atmosphere and what volatile elements are still at its surface, if any, she said.

Other possible destinations from this class of objects include Makemake, which has its own moon and is outshined in the Kuiper Belt only by Pluto, and Haumea, a football-shaped dwarf planet. For all of these worlds, a zippy flyby could tell scientists about the object's surface composition and geology, as well as whether their surfaces hide oceans.

[*] Wikipedia's Astronomical Unit page notes it was originally defined as the average distance of the Earth from the Sun. It works out to being approximately 150 million km or 93 million miles. A craft travelling at 20 au per year would, therefore, be travelling at: (20 au)*(150e6 km/au)/((365 days)*(24 hours/day)) which reduces to over 340,000 kph (200,000 mph). By comparison, the average lunar distance is nearly 390,000 km (240,000 miles). Assuming instantaneous acceleration and deceleration, a trip to the moon at that speed would take about 72 minutes!
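The footnote's arithmetic is easy to check. Note that the 72-minute figure comes from the rounded values (240,000 miles / 200,000 mph = 1.2 hours exactly); the unrounded numbers give closer to 68 minutes:

```python
AU_KM = 150e6            # astronomical unit in km (rounded, as in the footnote)
LUNAR_DIST_KM = 390_000  # average Earth-Moon distance in km (rounded)

speed_kph = 20 * AU_KM / (365 * 24)         # 20 au per year expressed in km/h
moon_trip_min = LUNAR_DIST_KM / speed_kph * 60

print(round(speed_kph))      # 342466
print(round(moon_trip_min))  # 68
```

Either way, the order of magnitude stands: at 20 au per year, the Moon is about an hour away.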

Original Submission

posted by martyb on Tuesday January 08 2019, @06:02AM
from the down-from-a-trillion dept.

Amazon is now the most valuable publicly-traded company in the USA:

Amazon ended trading Monday with a market value of about $797 billion, compared with Microsoft's $783 billion. Apple, which had been part of a close three-way race for the top spot, is now down to about $702 billion in market value after plunging last week on the news of its weak iPhone sales. Google parent company Alphabet has surpassed Apple with a market value of about $748 billion.

Previously: Microsoft Overtakes Amazon as Second Most Valuable U.S. Company

Original Submission

posted by martyb on Tuesday January 08 2019, @04:25AM
from the we've-come-a-long-way-since-Colossal-Cave dept.

(This is semi-advertising, but is definitely interesting to many "nerds," and one of those "why did nobody tell me these things" when I first heard about it. Who says advertising is bad?)

The "Awesome Games Done Quick" (AGDQ) marathon began on Sunday (1/6). This week-long 24/7 stream showcases people finishing games as quickly as possible, as in finishing the game Portal in about 10 minutes.

It is a semi-annual gaming event run for charity which, along with its partner event "Summer Games Done Quick" (SGDQ), raises millions of dollars in donations for two charities. The charity for AGDQ is the Prevent Cancer Foundation. The event itself is free to watch online; no donation is needed.

You can find more information on their website, or join them on their Twitch stream.

Original Submission

posted by martyb on Tuesday January 08 2019, @02:48AM
from the it-takes-a-thief-... dept.

This coming summer the Society of Automotive Engineers (SAE) is running its 8th annual security workshop.

The SAE CyberAuto™ Challenge brings together students and engineers from different backgrounds, industries, and organizations to collaboratively seek new information on automotive cybersecurity. No matter your perspective of participation at CyberAuto Challenge, your experience will benefit you now and in the future:

  • High school and college students work with in-service vehicles and their production code, software stacks, and internal electronics
  • Automotive engineers learn new ways to think about vehicle security and safety
  • Government officials gain new perspectives about vehicle security and safety while engaging one-on-one with the next generation of cyber professionals
  • Researchers develop emerging techniques to find real solutions to cybersecurity challenges and engage the next generation of cyber-auto engineers

This AC has no idea if you can really teach security, but at least someone is trying. It's also possible that SAE is training the other side? The page has a glowing testimonial that ends:

“To sum it all up: thank you. That five days of the CyberAuto Challenge changed my life.”

–Vanya Gorbachev, 2018 CyberAuto Challenge participant

Original Submission

posted by takyon on Tuesday January 08 2019, @01:11AM
from the metamesh dept.

Submitted via IRC for SoyCow1984

Engineers can now reverse-engineer 3D models

A system that uses a technique called constructive solid geometry (CSG) is allowing MIT researchers to deconstruct 3D models into trees of simple shapes, making it possible to reverse-engineer complex objects.

The system appeared in a paper entitled "InverseCSG: Automatic Conversion of 3D Models to CSG Trees" by Tao Du, Jeevana Priya Inala, Yewen Pu, Andrew Spielberg, Adriana Schulz, Daniela Rus, Armando Solar-Lezama, and Wojciech Matusik.

"At a high level, the problem is reverse engineering a triangle mesh into a simple tree. Ideally, if you want to customize an object, it would be best to have access to the original shapes — what their dimensions are and how they're combined. But once you combine everything into a triangle mesh, you have nothing but a list of triangles to work with, and that information is lost," said Tao Du to 3DPrintingIndustry. "Once we recover the metadata, it's easier for other people to modify designs."
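The "simple tree" Du describes — primitive shapes combined with boolean set operations — can be sketched in a few lines. This is a toy point-membership version for illustration; the actual InverseCSG system recovers such trees from triangle meshes via program synthesis, which this sketch does not attempt:

```python
# Primitives are point-membership predicates; CSG operators combine them.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r

def box(x0, y0, z0, x1, y1, z1):
    return lambda x, y, z: x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def intersect(a, b):
    return lambda x, y, z: a(x, y, z) and b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# A cube with a spherical hole bored through its center -- the kind of
# editable design that is lost once everything is flattened to triangles.
shape = difference(box(-1, -1, -1, 1, 1, 1), sphere(0, 0, 0, 0.5))

print(shape(0.9, 0.9, 0.9))  # True: inside the cube, outside the hole
print(shape(0, 0, 0))        # False: inside the drilled-out sphere
```

With the tree in hand, editing the design means changing a parameter (say, the hole radius) and re-evaluating, rather than reworking thousands of raw triangles.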

Original Submission