



posted by martyb on Thursday July 04 2019, @10:46PM
from the down-payment dept.

Boeing pledges $100M to families of 737 Max crash victims – TechCrunch

Boeing has said it will offer $100 million to the families and communities of those who died aboard the two 737 Max passenger jets that crashed earlier this year. This “initial outreach” will likely only be a small part of the company’s penance for the mistakes that led to the deaths of 346 people.

In a statement, the company said it expected the money to “address family and community needs,” and “support education, hardship and living expenses.”

[...] CEO and president Dennis Muilenburg... earlier this year accepted the blame, acknowledging that “it is apparent that in both flights, the Maneuvering Characteristics Augmentation System, known as MCAS, activated in response to erroneous angle of attack information.”

[...] This initial payout is voluntary; it is highly unusual for an airplane maker to pay such a sum to the victims of a crash ahead of any lawsuits. Boeing, Airbus and other companies involved in passenger flight have certainly in the past paid damages, directly or via insurance or some other means, but that was generally after a lawsuit forced them to. Sometimes a company will approach families with ready money to prevent them from filing a lawsuit, but that’s not often publicized.

And lawsuits are certainly underway already, with dozens of families bringing suits for each crash. The amounts these could bring are very difficult to predict, but given the loss of life and that the flaws that led to it can be traced directly to mistakes by Boeing, the company could be on the hook for hundreds of millions more.


Original Submission

posted by janrinok on Thursday July 04 2019, @08:23PM

OpenPGP protocol developer Daniel Kahn Gillmor has written up what is happening with an attack on OpenPGP's infrastructure. In recent days the SKS keyserver network has come under an attack that is particularly hard to mitigate and, problematically, also difficult to resolve permanently. The problem lies in the design of that part of the infrastructure; although replacements are available, the move to them has not yet happened.

Some time in the last few weeks, my OpenPGP certificate, 0xC4BC2DDB38CCE96485EBE9C2F20691179038E5C6 was flooded with bogus certifications which were uploaded to the SKS keyserver network.

SKS is known to be vulnerable to this kind of Certificate Flooding, and is difficult to address due to the synchronization mechanism of the SKS pool. (SKS's synchronization assumes that all keyservers have the same set of filters). You can see discussion about this problem from a year ago along with earlier proposals for how to mitigate it. But none of those proposals have quite come to fruition, and people are still reliant on the SKS network.
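
Certificate flooding works because anyone may attach third-party certifications ("signatures") to a public key, and SKS keyservers never discard them. As a purely illustrative aside, here is a minimal Python sketch of how one might check a local copy of a key for flooding by counting the certification records GnuPG reports. It assumes gpg is installed and the key is already in the local keyring; the threshold is an arbitrary number chosen for the illustration, not an official cutoff.

import subprocess

# Arbitrary illustrative cutoff: ordinary keys carry at most a few
# hundred certifications, while flooded ones can carry tens of thousands.
FLOOD_THRESHOLD = 1000

def certification_count(fingerprint):
    # --with-colons produces machine-readable output; certification
    # records are the lines beginning with "sig:".
    out = subprocess.run(
        ["gpg", "--with-colons", "--list-sigs", fingerprint],
        capture_output=True, text=True, check=True,
    ).stdout
    return sum(1 for line in out.splitlines() if line.startswith("sig:"))

n = certification_count("C4BC2DDB38CCE96485EBE9C2F20691179038E5C6")
print(n, "certifications", "(possibly flooded)" if n > FLOOD_THRESHOLD else "")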

Also covered at Vice as Someone Is Spamming and Breaking a Core Component of PGP's Ecosystem and ZDNet

Earlier on SN: Op-Ed: Why I'm Not Giving Up on PGP (2016)


Original Submission

posted by janrinok on Thursday July 04 2019, @06:05PM
from the because-we-can dept.

Submitted via IRC for SoyCow4463

The Department of Defense wants ideas for a tiny autonomous space station

The Pentagon's Defense Innovation Unit (DIU) has issued a solicitation for a tiny, "self-contained and free flying orbital outpost" that can host experiments and equipment in orbit and could eventually be scaled up for human habitation.

The Orbital Outpost that's being solicited would be small: it needs to have at least a cubic meter of space inside, be able to carry 80 kilograms, have continuous power, and support pressurization anywhere from 0 to 1 atmospheres. It should be able to move around in orbit on its own, and it has to be built quickly; the military wants it ready to go within two years of a contract being awarded.

The military also says that it eventually wants the station to be modular (able to attach other components or other outposts), have a robotic arm, be able to carry people, and be hardened against radiation for "beyond [low Earth orbit] applications."

[...] A solicitation is just a proposal for contractors to submit an idea for any number of things that the military thinks it might need, like an autonomous aircraft to shepherd people on and off a battlefield. It's not an indication that the Department of Defense is imminently ready to establish its own fleet of space stations in orbit. Colonel Steve Butow, the director of the DIU's Space Portfolio, told Breaking Defense in an email that his outfit is "casting a wide net for commercial solutions that can meet the basic needs described in the first part of the solicitation (autonomous/robotic, etc)" and that the military is "more interested in the 'how' rather than the 'why'."


Original Submission

posted by chromas on Thursday July 04 2019, @03:41PM
from the should've-had-an-X12 dept.

Chris Siebenmann, a UNIX herder at the University of Toronto CS Lab, asserts that the death watch for the X Window System (aka X11) has probably started:

I was recently reading Christian F.K. Schaller's On the Road to Fedora Workstation 31 (via both Fedora Planet and Planet Gnome). In it, Schaller says in one section (about Gnome and their move to fully work on Wayland):

Once we are done with this we expect X.org to go into hard maintenance mode fairly quickly. The reality is that X.org is basically maintained by us and thus once we stop paying attention to it there is unlikely to be any major new releases coming out and there might even be some bitrot setting in over time. We will keep an eye on it as we will want to ensure X.org stays supportable until the end of the RHEL8 lifecycle at a minimum, but let this be a friendly notice for everyone who rel[ies on] the work we do maintaining the Linux graphics stack, get onto Wayland, that is where the future is.

X11, for all its advantages, also has several incurable design flaws relating to security. However, the major distros have not yet been in any hurry to replace it. Wayland is touted as the next step in graphical interfaces. What are Soylentils' thoughts on Wayland and the demise of X11?
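
For readers wondering which stack their own desktop is running, the session type is exposed through standard environment variables. Here is a small sketch: XDG_SESSION_TYPE is set by systemd-logind on most distros, and the fallback checks the sockets each display server advertises (DISPLAY is also set under XWayland, so WAYLAND_DISPLAY must be checked first).

import os

def session_type():
    # systemd-logind exports "wayland" or "x11" here on most distros
    session = os.environ.get("XDG_SESSION_TYPE")
    if session:
        return session
    # Check WAYLAND_DISPLAY first: XWayland sets DISPLAY even on Wayland
    if os.environ.get("WAYLAND_DISPLAY"):
        return "wayland"
    if os.environ.get("DISPLAY"):
        return "x11"
    return "unknown"

print("This session appears to be running on:", session_type())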


Original Submission

posted by chromas on Thursday July 04 2019, @01:20PM
from the fake-chews dept.

Grubhub says its contract allowed it to create fake restaurant websites

Grubhub CEO Matt Maloney has responded to reports that his company creates fake websites for its restaurant partners, claiming that, according to its contract agreements, these businesses have signed away permission for Grubhub to engage in the marketing tactic on their behalf. According to a partial Grubhub contract obtained by the Los Angeles Times, a provision states that Grubhub "may create, maintain and operate a microsite ("MS") and obtain the URL for such MS on restaurant's behalf."

Grubhub provided The Verge with a similar snippet. Grubhub charges varying tiers of commission fees[pdf], the highest being "marketing commission" at 20-plus percent. The contract does not explicitly specify whether these microsites are considered "marketing" or what exactly these microsites would look like. Crucially, the contract does not specify whether Grubhub microsites would use proprietary restaurant photos, logos, or domain names that sound similar to / compete with the business's actual site. (Disclosure: my parents own a restaurant business that is listed on Grubhub / Seamless.)

In New Food Economy's report last Friday, restaurant owners say they "never gave [Grubhub] permission" to create these microsites and say the company is intercepting customers' direct orders in an effort to charge high commission fees. It is, however, possible that this fine print was overlooked when the contracts were signed.


Original Submission

posted by takyon on Thursday July 04 2019, @11:00AM
from the washed-away dept.

Heavy Rain Forces Evacuation Order for 1.1 Million People in Southwestern Japan:

Authorities in southwestern Japan have instructed over 1.1 million residents to evacuate Wednesday as torrential rains continued, triggering flooding and mudslides.

Houses and fields were inundated after river dikes washed away in Miyazaki and Kagoshima prefectures, with Japan's Ground Self-Defense Force dispatching troops to affected areas at the request of Kagoshima Gov. Satoshi Mitazono.

[...] In the city of Kagoshima, where all 590,000 residents were told to evacuate, elderly people and others huddled in shelters to wait out the storm. Some evacuees in the neighboring prefecture's Kumamoto city were seen carrying their bedding to evacuation centers to spend the night.

The Japan Meteorological Agency has warned that heavy rains in southwestern and western Japan could continue for another day.

[...] The heavy rain also disrupted sections of the Kyushu shinkansen bullet train line and forced over 150 schools to cancel classes.

In the 24 hours through 6 a.m. Thursday, the agency forecast up to 350 millimeters [(13.8 inches)] of rain in southern Kyushu, up to 300 mm [(11.8 inches)] in northern Kyushu, and up to 250 mm [(9.8 inches)] in the Shikoku region.

The Kinki region, covering Osaka, was forecast to get 150 mm [(5.9 inches)] of rain, while the Tokai region centered on Nagoya anticipated 120 mm [(4.7 inches)]. The Chugoku region around Hiroshima and the Hokuriku area facing the Sea of Japan were also expected to see heavy rainfall, according to the agency.

It said a rainy front is expected to stay over the Japanese archipelago through Saturday, and could also drench eastern Japan.

Get familiar with the shape and location of Japan and then load earth.nullschool.net's real-time maps of the area. The green circle marks the approximate location of Tokyo. Here is a map showing the most-recent 3-hour precipitation amount (3HPA). This map shows Total Precipitable Water (TPW).

When there is 1-2 inches of rain in my area, it makes for a most unpleasant day. I cannot begin to imagine what it would feel like to receive over a foot of rain in 24 hours... and then to try to evacuate in those conditions, too? Besides that, where do they go? Downhill leads to more flooding. Uphill leads to being on slopes that could be involved in a landslide. A major storm last year in Japan took the lives of over 200 people. Here's hoping this year's increased awareness and evacuation orders help to reduce the number of fatalities.

Also at CNN, BBC, NPR, and Deutsche Welle.


Original Submission

posted by martyb on Thursday July 04 2019, @08:27AM
from the I-know-what-you-did^W-said-last-summer dept.

Amazon confirms it keeps your Alexa recordings basically forever

If you (like so many of us) hate listening to recordings of your own voice, you may be in for an unpleasant future, as Amazon has confirmed it hangs on to the recordings of every conversation you've ever had with an Alexa-enabled device until or unless you specifically delete them.

That confirmation comes as a response to a list of questions Sen. Chris Coons (D-Delaware) sent to Amazon CEO Jeff Bezos in May expressing "concerns" about how Amazon uses and retains customers' Alexa voice assistant data.

Amazon's response to Coons, as first reported by CNET, confirms that the company keeps your data as long as it wants unless you deliberately specify otherwise.

"We retain customers' voice recordings and transcripts until the customer chooses to delete them," Amazon said—but even then there are exceptions.

Amazon, as well as third parties that deploy "skills" on the Alexa platform, keep records of interactions customers have with Alexa, the company said. If, for example, you order a pizza, purchase digital content, summon a car from a ride-hailing service, or place an Amazon order, "Amazon and/or the applicable skill developer obviously need to keep a record of the transaction," Amazon said, without clarifying the specific kind of data that's in that record.

[...] If you would like to review and delete any Alexa voice or transcript data in your Amazon account, you can do so under the Alexa Privacy section, found under "Change your digital and device settings" in the "Your Devices and Content" section of your account.

See also TechCrunch.


Original Submission

posted by martyb on Thursday July 04 2019, @06:06AM
from the building-better-bot-blocks dept.

https://thenextweb.com/google/2019/07/02/google-wants-to-make-the-25-year-old-robots-txt-protocol-an-internet-standard/:

Google's main business has been search, and now it wants to make a core part of it an internet standard.

The internet giant has outlined plans to turn the Robots Exclusion Protocol (REP) — better known as robots.txt — into an internet standard after 25 years. To that end, it has also made the C++ robots.txt parser that underpins the Googlebot web crawler available on GitHub for anyone to access.

"We wanted to help website owners and developers create amazing experiences on the internet instead of worrying about how to control crawlers," Google said. "Together with the original author of the protocol, webmasters, and other search engines, we've documented how the REP is used on the modern web, and submitted it to the IETF."

The REP is one of the cornerstones of web search engines, and it helps website owners manage their server resources more easily. Web crawlers — like Googlebot — are how Google and other search engines routinely scan the internet to discover new web pages and add them to their list of known pages.

A follow-on post to Google's blog expands on the proposal.

The draft specification is available here, and Google has put its open-source repository up on GitHub.
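
To make the mechanics concrete, here is a short sketch of the REP in action using Python's standard-library parser rather than Google's newly released C++ one. The rules are a made-up example, not any real site's robots.txt; note that this particular parser applies rules in file order, so the more specific Allow line is listed first.

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: everything under /private/ is off limits,
# except for one explicitly allowed report.
rules = """\
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ("https://example.com/index.html",
            "https://example.com/private/secret.html",
            "https://example.com/private/public-report.html"):
    print(parser.can_fetch("Googlebot", url), url)

# Prints True, False, True: the crawler may fetch the index page and the
# explicitly allowed report, but nothing else under /private/.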


Original Submission

posted by martyb on Thursday July 04 2019, @03:47AM
from the forewarned-is-forearmed dept.

Updated (20190704_023935 UTC) Server reboots have completed. I am unaware of any issues from these reboots; please reply here and post to the #dev channel on IRC if anything is amiss. Original story follows, with minor updates to mark that all servers have been rebooted. --martyb

For those who might not be aware, SoylentNews operations run on servers from Linode. I have recently become aware of their plans to reboot servers:

To complete our mitigations against the recent MDS (ZombieLoad) CPU vulnerability, we will be performing maintenance on a subset of Linode’s host machines. This maintenance will update the underlying infrastructure that Linodes reside on and will not affect the data stored within them.

Here is the schedule for our affected systems:

fluorine   (*)  2019-06-25 05:00 AM UTC
beryllium  (*)  2019-06-27 09:00 AM UTC
helium     (*)  2019-06-28 03:00 AM UTC
boron      (*)  2019-06-28 04:00 AM UTC
hydrogen   (*)  2019-07-02 09:00 AM UTC
sodium     (*)  2019-07-03 02:00 AM UTC

(*) Completed.

Historically, there is a two-hour window for reboots to occur, but it usually takes far less time than that.

We will attempt to minimize any impact on site operations, but wanted to let the community know what was coming up.

posted by chromas on Thursday July 04 2019, @01:28AM
from the This-is-important-information-if-you-or-your-loved-one-is-different-from-pondering-your-boat-engine dept.

Endless AI-generated spam risks clogging up Google's search results

Over the past year, AI systems have made huge strides in their ability to generate convincing text, churning out everything from song lyrics to short stories. Experts have warned that these tools could be used to spread political disinformation, but there's another target that's equally plausible and potentially more lucrative: gaming Google.

Instead of being used to create fake news, AI could churn out infinite blogs, websites, and marketing spam. The content would be cheap to produce and stuffed full of relevant keywords. But like most AI-generated text, it would only have surface meaning, with little correspondence to the real world. It would be the information equivalent of empty calories, but still potentially difficult for a search engine to distinguish from the real thing.

Just take a look at this blog post answering the question: "What Photo Filters are Best for Instagram Marketing?" At first glance it seems legitimate, with a bland introduction followed by quotes from various marketing types. But read a little more closely and you realize it references magazines, people, and — crucially — Instagram filters that don't exist:

You might not think that a mumford brush would be a good filter for an Insta story. Not so, said Amy Freeborn, the director of communications at National Recording Technician magazine. Freeborn's picks include Finder (a blue stripe that makes her account look like an older block of pixels), Plus and Cartwheel (which she says makes your picture look like a topographical map of a town).

The rest of the site is full of similar posts, covering topics like "How to Write Clickbait Headlines" and "Why is Content Strategy Important?" But every post is AI-generated, right down to the authors' profile pictures. It's all the creation of content marketing agency Fractl, who says it's a demonstration of the "massive implications" AI text generation has for the business of search engine optimization, or SEO.

"Because [AI systems] enable content creation at essentially unlimited scale, and content that humans and search engines alike will have difficulty discerning [...] we feel it is an incredibly important topic with far too little discussion currently," Fractl partner Kristin Tynski tells The Verge.

[...] The key question, then, is: can we reliably detect AI-generated text? Rowan Zellers of the Allen Institute for AI says the answer is a firm "yes," at least for now. Zellers and his colleagues were responsible for creating Grover, the tool Fractl used for its fake blog posts, and were also able to engineer a system that can spot Grover-generated text with 92 percent accuracy.

"We're a pretty long way away from AI being able to generate whole news articles that are undetectable," Zellers tells The Verge. "So right now, in my mind, is the perfect opportunity for researchers to study this problem, because it's not totally dangerous."

Spotting fake AI text isn't too hard, says Zellers, because it has a number of linguistic and grammatical tells. He gives the example of AI's tendency to re-use certain phrases and nouns. "They repeat things ... because it's safer to do that rather than inventing a new entity," says Zellers. It's like a child learning to speak, trotting out the same words and phrases over and over without considering the diminishing returns.
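
As a toy illustration of the repetition "tell" Zellers describes, one crude heuristic is to measure what fraction of a text's three-word phrases occur more than once. This sketch is our own, not how Grover's detector works (Grover is a trained neural model), and a single statistic like this is easily fooled:

from collections import Counter

def repeated_trigram_fraction(text):
    # Fraction of all 3-word phrases that appear more than once.
    words = text.lower().split()
    trigrams = [tuple(words[i:i + 3]) for i in range(len(words) - 2)]
    if not trigrams:
        return 0.0
    counts = Counter(trigrams)
    repeats = sum(c for c in counts.values() if c > 1)
    return repeats / len(trigrams)

human = "The quick brown fox jumps over the lazy dog near the riverbank."
machine = ("the filter makes your picture look bright and the filter makes "
           "your picture look sharp and the filter makes your picture look old")
print(repeated_trigram_fraction(human))    # 0.0   -- no phrase reuse
print(repeated_trigram_fraction(machine))  # ~0.57 -- heavy phrase reuse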


Original Submission

posted by martyb on Wednesday July 03 2019, @11:07PM
from the super-hyper-ultra-turbo dept.

The GeForce RTX 2070 Super & RTX 2060 Super Review: Smaller Numbers, Bigger Performance

NVIDIA is launching a mid-generation kicker for their mid-to-high-end video card lineup in the form of their GeForce RTX 20 series Super cards. Based on the same family of Turing GPUs as the original GeForce RTX 20 series cards, these new Super cards – all suffixed Super, appropriately enough – come with new configurations and new clockspeeds. They are, essentially, NVIDIA's 2019 card family for the $399+ video card market.

When they are released on July 9th, the GeForce RTX 20 series Super cards are going to be sharing store shelves with the rest of the GeForce RTX 20 series cards. Some cards like the RTX 2080 and RTX 2070 are set to go away, while other cards like the RTX 2080 Ti and RTX 2060 will remain on the market as-is. In practice, it's probably best to think of the new cards as NVIDIA executing either a price cut or a spec bump – depending on whether you see the glass as half-empty or half-full – all without meaningfully changing their price tiers.

In terms of performance, the RTX 2060 Super and RTX 2070 Super cards aren't going to bring anything new to the table. In fact, if we're being blunt, the RTX 2070 Super is basically a slightly slower RTX 2080, and the RTX 2060 Super may as well be the RTX 2070. What has changed instead is the price these performance levels are available at, and ultimately the performance-per-dollar ratios in parts of NVIDIA's lineup. The performance of NVIDIA's former $699 and $499 cards will now be available for $499 and $399, respectively. This leaves the vanilla RTX 2060 to hold the line at $349, and the upcoming RTX 2080 Super to fill the $699 spot. Which means if you're in the $400-$700 market for video cards, your options are about to get noticeably faster.

Also at Tom's Hardware, The Verge, and Ars Technica.

Previously: Nvidia Announces RTX 2080 Ti, 2080, and 2070 GPUs, Claims 25x Increase in Ray-Tracing Performance
Nvidia Announces RTX 2060 GPU
AMD and Nvidia's Latest GPUs Are Expensive and Unappealing

Related: AMD and Intel at Computex 2019: First Ryzen 3000-Series CPUs and Navi GPU Announced
AMD Details Three Navi GPUs and First Mainstream 16-Core CPU


Original Submission

posted by Fnord666 on Wednesday July 03 2019, @09:29PM
from the say-cheese dept.

Canon is crowdfunding a tiny clippable camera that connects to your phone

Canon is turning to Indiegogo to crowdfund the Ivy Rec, a tiny outdoor camera built into a keychain carabiner. It's about the size of a USB flash drive, and it wirelessly connects via Wi-Fi or Bluetooth to the companion Canon Mini Cam app to show a live preview on your phone. The empty square space of the clip doubles as a viewfinder, and there's a single dial on the back that lets you switch between modes.

The Ivy Rec has a 13-megapixel 1/3-inch CMOS sensor that can record 1080p / 60 fps video, and it's waterproof up to 30 minutes for depths of up to three feet. With no pricing information yet, it's hard to say if it'll be worth the buy or who it's really for. Canon says the camera is shockproof and great for the outdoors, so it could be useful if you clip it onto your backpack while you ride a bike. Or maybe clip it onto your dog or cat's collar so you can see the world from your pet's POV? (I mean, GoPros are already a thing.)

Too vibrant, not small enough to work as a spy camera.

Also at Engadget.


Original Submission

posted by Fnord666 on Wednesday July 03 2019, @07:52PM
from the moore-of-a-guideline dept.

Intel's Senior Vice President Jim Keller (who previously helped to design AMD's K8 and Zen microarchitectures) gave a talk at the Silicon 100 Summit that promised continued pursuit of transistor scaling gains, including a roughly 50x increase in gate density:

Intel's New Chip Wizard Has a Plan to Bring Back the Magic (archive)

In 2016, a biennial report that had long served as an industry-wide pledge to sustain Moore's law gave up and switched to other ways of defining progress. Analysts and media—even some semiconductor CEOs—have written Moore's law's obituary in countless ways. Keller doesn't agree. "The working title for this talk was 'Moore's law is not dead but if you think so you're stupid,'" he said Sunday. He asserted that Intel can keep it going and supply tech companies ever more computing power. His argument rests in part on redefining Moore's law.

[...] Keller also said that Intel would need to try other tactics, such as building vertically, layering transistors or chips on top of each other. He claimed this approach will keep power consumption down by shortening the distance between different parts of a chip. Keller said that, using nanowires and stacking, his team had mapped a path to packing transistors 50 times more densely than possible with Intel's 10 nanometer generation of technology. "That's basically already working," he said.

The ~50x gate density claim combines ~3x density from additional pitch scaling (from "10nm"), ~2x from nanowires, another ~2x from stacked nanowires, ~2x from wafer-to-wafer stacking, and ~2x from die-to-wafer stacking: 3 × 2 × 2 × 2 × 2 = 48, or roughly 50x.

Related: Intel's "Tick-Tock" Strategy Stalls, 10nm Chips Delayed
Intel's "Tick-Tock" is Now More Like "Process-Architecture-Optimization"
Moore's Law: Not Dead? Intel Says its 10nm Chips Will Beat Samsung's
Another Step Toward the End of Moore's Law


Original Submission

posted by Fnord666 on Wednesday July 03 2019, @06:15PM
from the noooooo! dept.

Fire destroys Jim Beam warehouse filled with 45,000 bourbon barrels

A fire destroyed a massive Jim Beam warehouse filled with 45,000 barrels of bourbon, sending flames shooting into the night sky and generating so much heat that firetruck lights melted, authorities said Wednesday.

Firefighters from four counties responded to the blaze that erupted late Tuesday. Lightning might have been a factor, but fire investigators haven't been able to start looking for the cause, Woodford County Emergency Management Director Drew Chandler said.

No injuries were reported, Chandler said. The fire was contained but was being allowed to burn for several more hours Wednesday, he said.

[...] Officials from Jim Beam's parent company, Suntory Food and Beverage, said the multi-story warehouse that burned contained "relatively young whiskey," meaning it had not reached maturity for bottling for consumers. "Given the age of the lost whiskey, this fire will not impact the availability of Jim Beam for consumers," the spirits company said in a statement. The whiskey maker suffered a total loss in the warehouse. The destroyed whiskey amounted to about 1% of Beam's bourbon inventory, it said.

Also at CNN.


Original Submission

posted by chromas on Wednesday July 03 2019, @04:45PM
from the crisper-mice-now-with-crystals dept.

CRISPR and LASER ART Eliminate HIV from Mice

Today, a paper in Nature Communications, titled "Sequential LASER ART and CRISPR Treatments Eliminate HIV-1 in a Subset of Infected Humanized Mice" [open, DOI: 10.1038/s41467-019-10366-y] [DX] reports new work from a collaborative effort showing that a combination of long-acting slow-effective release antiviral therapy (LASER) and CRISPR-Cas9 successfully cleared HIV from infected humanized mice.

Howard Gendelman, MD, professor of internal medicine and infectious diseases at the University of Nebraska Medical Center and senior author on the paper, does not withhold his excitement over the result, which may be the reason for his hyperbole. He tells GEN that the conclusion is "almost unbelievable, but, it's true," an idea that, he adds, "has been science fiction up until now." He notes that "for the first time in the world" they have shown total elimination of HIV infection from a model with an established infection and, even though there are caveats, "there is a real possibility that an HIV cure can be realized."

The team used a technique developed by co-author Kamel Khalili, PhD, professor in the department of neuroscience at the School of Medicine at Temple University, that uses the CRISPR-Cas9 system to remove the integrated HIV DNA from genomes. They combined the genome editing technique with the LASER ART, a technique developed by Gendelman's lab that targets viral sanctuaries by packaging the ART drugs into nanocrystals. LASER ART distributes the drugs to areas of the body where HIV harbors and releases them slowly over time. Testing the combination of these two methods together in mice led the authors to conclude that permanent elimination of HIV is possible.

[...] Gendelman explains that "It's kind of like being [on] a beach and trying to find the right shell—you might want a certain color or shape." When HIV replicates, he says, there are "billions and trillions" of particles, so you're asking CRISPR to excise every single DNA provirus in this morass. He adds, "it would be inconceivable that it would be efficient enough to destroy every DNA molecule.... If one infectious particle remains, it will grow and replicate. You have to destroy every single one in the body." So the ART reduced the number of viable targets. If you are inhibiting viral replication, he explains, you reduce the amount of HIV DNA in the host—in the cells and in the body, and that allows the CRISPR to be more effective. "It's a numbers game," Gendelman notes.

But efficiency is not the only problem in the relationship between CRISPR and HIV. The sequence specificity of the approach is a double-edged sword, notes Coffin. On the one hand, it minimizes off-target effects. But, as Coffin explains, it also sets the stage for rapid selection of resistance. In a virus that mutates as rapidly as HIV, the changes could quickly render CRISPR useless. Lastly, the mice were infected with a clonal virus, with CRISPR delivered shortly after the infection, leaving little opportunity for a diverse viral population to develop. This is not analogous to human patients, as most do not report for treatment so soon after an infection. An effective treatment for humans would have to be designed to treat diverse viral populations with many mutations.

Also at ScienceAlert, CBS, and CNBC.


Original Submission
