
posted by CoolHand on Tuesday February 28 2017, @02:43PM   Printer-friendly
from the natural-beauty-in-clouds-and-skylakes dept.

https://www.hpcwire.com/2017/02/27/google-gets-first-dibs-new-skylake-chips/

As part of an ongoing effort to differentiate its public cloud services, Google made good this week on its intention to bring custom Xeon Skylake chips from Intel Corp. to its Google Compute Engine. The cloud provider is the first to offer the next-gen Xeons, and is getting access ahead of traditional server-makers like Dell and HPE.

Google announced plans to incorporate the next-generation Intel server chips into its public cloud last November. On Friday (Feb. 24), Urs Hölzle, Google's senior vice president for cloud infrastructure, said the Skylake upgrade would deliver a significant performance boost for demanding applications and workloads ranging from genomic research to machine learning.

The cloud vendor noted that Skylake includes Intel Advanced Vector Extensions (AVX-512) that target workloads such as data analytics, engineering simulations and scientific modeling. When compared to previous generations, the Skylake extensions are touted as doubling floating-point performance "for the heaviest calculations," Hölzle noted in a blog post.


Original Submission

 
  • (Score: 0) by Anonymous Coward on Tuesday February 28 2017, @05:52PM (3 children)

    by Anonymous Coward on Tuesday February 28 2017, @05:52PM (#472906)

    To be honest, I haven't experienced much difference in CPU speed for about 10 years. I had a little Core 2 Duo on my desktop that I did some number crunching on, then the guy in the next office blew $10k on a 32-core machine with 64 GB of RAM (this was when that was a lot of RAM). I begged him to give me a login to run some stuff and he did... but it was pretty disappointing. More or less the same as my beater machine.

    I'm kind of intrigued by the "double performance" for intensive tasks but I won't hold my breath. In any case, GPUs "literally" (not literally) blow CPUs out of the water for most tasks. Instant 10X faster... and I'm saving up my pennies for a Pascal, which should be another 2X. For my uses (mainly research / testing) I used to think chasing speed was unimportant, since you can always run something overnight... but it is immensely productive to be able to work interactively rather than planning / executing / analyzing experiments separately.

  • (Score: 0) by Anonymous Coward on Tuesday February 28 2017, @06:07PM (2 children)

    by Anonymous Coward on Tuesday February 28 2017, @06:07PM (#472924)

    Your old machine already had enough juice to play your porn collection.

    • (Score: 2) by DannyB on Tuesday February 28 2017, @09:41PM (1 child)

      by DannyB (5839) Subscriber Badge on Tuesday February 28 2017, @09:41PM (#473051) Journal

      It is safer to use a VM rather than the bare thing.

      Especially if the VM can easily be reset back to its initial hard drive state for each session.

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 0) by Anonymous Coward on Wednesday March 01 2017, @12:30AM

        by Anonymous Coward on Wednesday March 01 2017, @12:30AM (#473144)

        When it comes to video or audio, I can really detect a difference when it's not touching the bare metal. I don't like VMs either.