posted by martyb on Saturday April 08 2017, @12:37PM   Printer-friendly
from the how-many-Cray-1s-is-that? dept.

NVIDIA issued a press release for its new card, Titan Xp:

Introduced today [April 6], the Pascal-powered TITAN Xp pushes more cores, faster clocks, faster memory and more TFLOPS than its predecessor, the 2016 Pascal-powered TITAN X.

With the new TITAN Xp we're delivering a card to users who demand the very best NVIDIA GPU, directly from NVIDIA and supported by NVIDIA.

Key stats:

  • 12GB of GDDR5X memory running at 11.4 Gbps
  • 3,840 CUDA cores running at 1.6GHz
  • 12 TFLOPS of brute force

This is extreme performance for extreme users where every drop counts.
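
Checking the math: each CUDA core can retire one fused multiply-add (two floating-point operations) per clock, so the quoted core count and boost clock line up with the headline figure. A quick back-of-the-envelope in Python:

    cuda_cores = 3840
    boost_clock_ghz = 1.6
    flops_per_core_per_clock = 2   # one fused multiply-add counts as 2 FLOPs
    print(cuda_cores * boost_clock_ghz * flops_per_core_per_clock / 1000)   # 12.288 TFLOPS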

Open to Mac Community

Speaking of users, we're also making the new TITAN Xp open to the Mac community with new Pascal drivers, coming this month. For the first time, this gives Mac users access to the immense horsepower delivered by our award-winning Pascal-powered GPUs.

TITAN Xp is available now for $1,200 direct from nvidia.com, and select system builders soon.

Don't shoot the messenger.

[More details can be found on the TITAN Xp product page where you can also place an order (Limit 2 per customer). --Ed.]


Original Submission

Related Stories

Nvidia Announces Titan V

Nvidia has announced the Titan V, a $3,000 Volta-based flagship GPU capable of around 15 teraflops single-precision and 110 teraflops of "tensor performance (deep learning)". It has slightly greater performance but less VRAM than the Tesla V100, a $10,000 GPU aimed at professional users.

Would you consider it a card for "consumers"?

It seems like Nvidia announces the fastest GPU in history multiple times a year, and that's exactly what's happened again today; the Titan V is "the most powerful PC GPU ever created," in Nvidia's words. It represents a more significant leap than most products that have made that claim, however, as it's the first consumer-grade GPU based around Nvidia's new Volta architecture.

That said, a liberal definition of the word "consumer" is in order here — the Titan V sells for $2,999 and is focused around AI and scientific simulation processing. Nvidia claims 110 teraflops of performance from its 21.1 billion transistors, with 12GB of HBM2 memory, 5120 CUDA cores, and 640 "tensor cores" that are said to offer up to 9x the deep-learning performance of its predecessor.

Previously: Nvidia Releases the GeForce GTX 1080 Ti: 11.3 TFLOPS of FP32 Performance
More Extreme in Every Way: The New Titan Is Here – NVIDIA TITAN Xp


Original Submission

Nvidia Announces Titan RTX

Nvidia has announced its $2,500 Turing-based Titan RTX GPU. It is said to have a single-precision performance of 16.3 teraflops and "tensor performance" of 130 teraflops. Double-precision performance has been neutered to 0.51 teraflops, down from 6.9 teraflops for last year's Volta-based Titan V.

The card includes 24 gigabytes of GDDR6 VRAM clocked at 14 Gbps, for a total memory bandwidth of 672 GB/s.
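That bandwidth figure is just the per-pin data rate times the bus width. The 384-bit bus isn't stated above, but it is the Titan RTX's published spec; a quick check in Python:

    bus_width_bits = 384   # Titan RTX memory bus width (from the published spec, not the summary above)
    data_rate_gbps = 14    # per-pin GDDR6 data rate, from the summary above
    print(bus_width_bits * data_rate_gbps / 8)   # 672.0 GB/s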

Drilling a bit deeper, there are really three legs to Titan RTX that set it apart from NVIDIA's other cards, particularly the GeForce RTX 2080 Ti. Raw performance is certainly one of those; we're looking at about 15% better performance in shading, texturing, and compute, and around a 9% bump in memory bandwidth and pixel throughput.

However, arguably the lynchpin to NVIDIA's true desired market of data scientists and other compute users is the tensor cores. Present on all of NVIDIA's Turing cards, and the heart and soul of NVIDIA's success in the AI/neural networking field, the tensor cores on the GeForce cards carry a singular limitation that is nonetheless very important to the professional market. In their highest-precision FP16 mode, Turing is capable of accumulating at FP32 for greater precision; however, on the GeForce cards this operation is limited to half-speed throughput. This limitation has been removed for the Titan RTX, and as a result it's capable of full-speed FP32 accumulation throughput on its tensor cores.

Given that NVIDIA's tensor cores have nearly a dozen modes, this may seem like an odd distinction to make between the GeForce and the Titan. However for data scientists it's quite important; FP32 accumulate is frequently necessary for neural network training – FP16 accumulate doesn't have enough precision – especially in the big money fields that will shell out for cards like the Titan and the Tesla. So this small change is a big part of the value proposition to data scientists, as NVIDIA does not offer a cheaper card with the chart-topping 130 TFLOPS of tensor performance that Titan RTX can hit.
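
For a toy illustration of why accumulation precision matters, the same rounding behavior can be reproduced in plain NumPy on the CPU (this is not tensor-core code, just the underlying floating-point effect): once an FP16 accumulator grows large enough, each small addend rounds away to nothing.

    import numpy as np

    addend = np.float16(0.1)                # stored as ~0.0999756 in FP16
    acc16 = np.float16(0.0)
    acc32 = np.float32(0.0)
    for _ in range(20000):
        acc16 = np.float16(acc16 + addend)  # FP16 accumulate
        acc32 = acc32 + np.float32(addend)  # FP32 accumulate
    print(acc16)   # 256.0 -- stuck where adding ~0.1 rounds back to the same value
    print(acc32)   # ~1999.5 -- close to the exact sum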

Previously: More Extreme in Every Way: The New Titan Is Here – NVIDIA TITAN Xp
Nvidia Announces Titan V
Nvidia Announces Turing Architecture With Focus on Ray-Tracing and Lower-Precision Operations
Nvidia Announces RTX 2080 Ti, 2080, and 2070 GPUs, Claims 25x Increase in Ray-Tracing Performance
Nvidia's Turing GPU Pricing and Performance "Poorly Received"


Original Submission

  • (Score: 1, Interesting) by Anonymous Coward on Saturday April 08 2017, @01:09PM (4 children)

    by Anonymous Coward on Saturday April 08 2017, @01:09PM (#490820)

    You know, I grew up using the Macintosh, and there was virtually nothing available for it, relative to what was available for the "PC".

    In my adulthood, I've moved to Linux, and there's virtually nothing available for it, relative to what is available for the Macintosh.

    • (Score: 2) by requerdanos on Saturday April 08 2017, @01:50PM (3 children)

      by requerdanos (5997) on Saturday April 08 2017, @01:50PM (#490836) Journal

      I've moved to Linux, and there's virtually nothing available for it, relative to what is available for the Macintosh.

      Are Titan X series cards not functional in Linux, as they are on Macintosh?

      Macintosh + proprietary binary driver supports these cards, and Linux + proprietary binary driver [phoronix.com] supports these cards.

      Phoronix (link above) does mention that acceleration for Titan X series cards is not supported under the free Nouveau driver, due to the firmware being unavailable from Nvidia.

      But, of course, there's no free driver for Macintosh either, so there's no lack of parity in availability.

      • (Score: 0) by Anonymous Coward on Saturday April 08 2017, @03:35PM

        by Anonymous Coward on Saturday April 08 2017, @03:35PM (#490859)

        See here [nvidia.com] for NVIDIA's minimum requirements for a Linux operating system.

      • (Score: 0) by Anonymous Coward on Saturday April 08 2017, @05:08PM (1 child)

        by Anonymous Coward on Saturday April 08 2017, @05:08PM (#490900)

        Use http://www.geforce.com/drivers [geforce.com] to check for support.
        Currently there are Linux 32-bit ARM drivers, but not x86 or x64. FreeBSD (and Solaris) have both x86 and x64 drivers, though, and I assume Linux will shortly.

        I don't care about mac or windows, so I didn't check them.

        I went with Nvidia for my GPU because of their good Linux and FreeBSD support.

  • (Score: 0) by Anonymous Coward on Saturday April 08 2017, @01:13PM (3 children)

    by Anonymous Coward on Saturday April 08 2017, @01:13PM (#490824)

    "Are we really going to call this 'Xp'? You know, like that old Windows that Microsoft wants everybody to stop using?

    • (Score: 2) by takyon on Saturday April 08 2017, @05:42PM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday April 08 2017, @05:42PM (#490916) Journal

      I was thinking of the emoticon. At least they didn't call it Titan XD.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by fishybell on Saturday April 08 2017, @06:26PM (1 child)

      by fishybell (3156) on Saturday April 08 2017, @06:26PM (#490925)

      I once worked for a company that used XP in their new product name to signify that it was cross-platform (i.e., it no longer ran only on OS/2).

      When Windows XP came out they changed their name so people wouldn't think it only ran on Windows XP.

      • (Score: 2) by takyon on Saturday April 08 2017, @07:09PM

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday April 08 2017, @07:09PM (#490936) Journal

        The kind of people who are going to buy this overpriced GPU are not going to be confused much by the name. Maybe search engines will be confused by it, returning results like "Can I use my NVIDIA Titan with Windows XP?"

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Interesting) by bzipitidoo on Saturday April 08 2017, @01:49PM (5 children)

    by bzipitidoo (4388) on Saturday April 08 2017, @01:49PM (#490835) Journal

    I never buy top-end graphics hardware. Pay $1,200 for hardware that five years later will be equaled by low-end stuff that uses less power, needs less cooling, and costs a tenth the price? Not me! But the trickle-down effect is nice: the bottom end gets better. Last time I bought Nvidia, I got a GeForce GT 610, because it was fanless, to replace a previous-generation GeForce that had its fan fail. Can't stand those graphics card fans anyway; they sound like hair dryers.

    Lately I've been running with Intel's integrated HD graphics, which has improved enough to be competitive with the cheapest, least powerful stuff from Nvidia and AMD. Not that I'm in love with Intel, but they've promised and delivered better support for 3D acceleration in open drivers for Linux. I hate having to turn back to the proprietary Nvidia driver, which must be recompiled for every kernel update, whenever I find that the Nouveau driver still can't match the 3D performance. So one solution is just to avoid Nvidia (Nvidious) altogether. As for playing games at maximum graphics settings on a gigantic 4K-or-larger monitor: meh, I don't care that much about that aspect of games.

    Learning OpenCL is on my bucket list. I have a sense of what it can do. The problem is figuring out how to put all that parallel processing power to good use.
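
    For anyone else starting out, the canonical first OpenCL program is a vector add. A minimal sketch using the pyopencl bindings (assuming pyopencl and a working OpenCL runtime are installed):

        import numpy as np
        import pyopencl as cl

        a = np.random.rand(50000).astype(np.float32)
        b = np.random.rand(50000).astype(np.float32)

        ctx = cl.create_some_context()   # pick an OpenCL device
        queue = cl.CommandQueue(ctx)

        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

        prg = cl.Program(ctx, """
            __kernel void vadd(__global const float *a,
                               __global const float *b,
                               __global float *out) {
                int i = get_global_id(0);
                out[i] = a[i] + b[i];
            }
        """).build()

        prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element

        out = np.empty_like(a)
        cl.enqueue_copy(queue, out, out_buf)
        assert np.allclose(out, a + b)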

    • (Score: 2) by kaszz on Saturday April 08 2017, @01:54PM (3 children)

      by kaszz (4211) on Saturday April 08 2017, @01:54PM (#490839) Journal

      Any recommendations on 3D hardware? What to select, not what to avoid.

      • (Score: 0) by Anonymous Coward on Saturday April 08 2017, @03:37PM

        by Anonymous Coward on Saturday April 08 2017, @03:37PM (#490860)

        The command line is where it's at.

      • (Score: 2) by Hairyfeet on Sunday April 09 2017, @09:40AM

        by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Sunday April 09 2017, @09:40AM (#491126) Journal

        I usually stay a gen or two behind with AMD (currently an R9 280, previously an HD 7750, and before that an HD 4850) and have been nothing but happy; they seem to run just fine no matter what OS I throw at them. Windows of course runs fine, and Linux drivers have usually caught up with the hardware by then, so that is rarely an issue. You can always go to Phoronix and look up whatever card you are considering; they usually have Linux benches for it if you want to go that direction.

        Generally I find Nvidia runs a little cooler, while AMD cards tend to last longer performance-wise (it even has a nickname, AMD "FineWine", because they age so well), so it's really a question of which matters more to you.

        --
        ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
      • (Score: 2) by tibman on Sunday April 09 2017, @03:49PM

        by tibman (134) Subscriber Badge on Sunday April 09 2017, @03:49PM (#491175)

        The AMD RX 480 is a good deal right now. The 580 will be coming out soon, but it looks like just a more tuned-up version of the 480 (so don't wait for it). It appears to have good Linux support as well. Doom plays at over 100 fps at 1440p under Vulkan. Played Left 4 Dead 2 last night at 290 fps, lol.

        --
        SN won't survive on lurkers alone. Write comments.
    • (Score: 0) by Anonymous Coward on Saturday April 08 2017, @04:57PM

      by Anonymous Coward on Saturday April 08 2017, @04:57PM (#490893)

    Are you just cheap, and saying in a roundabout fashion that you will never buy a discrete video card if you can get video included with the CPU?

    I have a few fanless GPUs in some dedicated workstations; they replaced the function of integrated GPUs. Let's face it: integrated GPUs are only as good as what the lowest common denominator demands of them.

      That's why good hardware doesn't come with integrated GPUs.

  • (Score: 2) by hamsterdan on Saturday April 08 2017, @05:46PM (1 child)

    by hamsterdan (2829) on Saturday April 08 2017, @05:46PM (#490917)

    Are they talking about the first-gen Mac Pro (2012?) or about running via Thunderbolt in an external PCIe enclosure on newer machines? Just curious.

    • (Score: 2) by zeigerpuppy on Saturday April 08 2017, @05:55PM

      by zeigerpuppy (1298) on Saturday April 08 2017, @05:55PM (#490920)

      I guess they may be preparing for the next Mac Pro.
      In the meantime there are quite a few people doing high-end graphics/video production work on hackintoshes who have been waiting for Pascal support.
      My dual Xeon 2680 rig cost me a quarter the price of a Mac Pro for twice the performance.
      Most of us are using the Asus Z9PE or Z10PE board; really nice rigs, but definitely a little fiddly to get configured.

  • (Score: 0) by Anonymous Coward on Sunday April 09 2017, @07:33AM (1 child)

    by Anonymous Coward on Sunday April 09 2017, @07:33AM (#491114)

    This card is about 1% different from cards available six months ago. Still waiting for the "extreme" to arrive.

    • (Score: 0) by Anonymous Coward on Sunday April 09 2017, @09:40AM

      by Anonymous Coward on Sunday April 09 2017, @09:40AM (#491127)

      It's an extreme-ly bad deal. At least the difference in VRAM is more than 1%.

  • (Score: 2) by kaszz on Sunday April 09 2017, @06:28PM

    by kaszz (4211) on Sunday April 09 2017, @06:28PM (#491220) Journal

    Consider the open graphics FPGA effort, the Open Graphics Project [wikipedia.org]. What 3D transforms or functions are the most important ones to implement (speaking mathematically)?
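
    One candidate answer: the workhorse of any 3D pipeline is the 4x4 homogeneous transform. Model, view, and projection are all just matrix multiplies followed by a perspective divide, so a fast 4x4 multiply (plus the divide) gets you most of the way. A sketch in NumPy (the matrix values are illustrative, for a 90-degree field of view):

        import numpy as np

        # Perspective projection: fov = 90 degrees, aspect = 1, near = 1, far = 100
        n, f = 1.0, 100.0
        proj = np.array([
            [1.0, 0.0,  0.0,          0.0],
            [0.0, 1.0,  0.0,          0.0],
            [0.0, 0.0, (f+n)/(n-f),   2*f*n/(n-f)],
            [0.0, 0.0, -1.0,          0.0],
        ])

        v = np.array([0.5, 0.5, -2.0, 1.0])  # a view-space vertex (homogeneous)
        clip = proj @ v                      # transform into clip space
        ndc = clip[:3] / clip[3]             # perspective divide
        print(ndc)                           # [0.25  0.25  ~0.01]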
