NVIDIA's GeForce GTX 1650 GDDR6 Released: GDDR6 Reaching Price Parity With GDDR5
Tucked inside NVIDIA's announcement of their spring refresh of their mobile GPU lineup, the company included a new low-end mobile part, the GeForce GTX 1650 GDDR6. Exactly as it says on the tin, this was a version of the company's GTX 1650 accelerator, except with newer GDDR6 instead of the GDDR5 it launched with. Now, in one of NVIDIA's more poorly kept secrets, their desktop product stack is getting a version of the card as well.
[...] The entry-level card is the cheapest (and the slowest) of the Turing family, offering as much performance as NVIDIA can pack into a 75 Watt TDP.
[...] Overall, this low-key release should mark a more important turning point in the state of GDDR memory. If NVIDIA and its partners are now willing to release GDDR6 versions of low-end cards, then this is a strong indicator that GDDR6 has finally lost most of its new technology price premium, and that memory prices have fallen by enough to be competitive with 8Gbps GDDR5. GDDR6 prices were a sticking point for the profit-sensitive NVIDIA during the original Turing product stack launch, so while it has taken an extra year, the company is finally offering a top-to-bottom GDDR6-based product stack.
Let's see more GPUs and APUs with HBM already.
(Score: 2) by Runaway1956 on Tuesday April 07 2020, @01:07AM (12 children)
I purchased one of these. Zotac makes a low-profile card that doesn't need any extra power connectors. Just plug it into any x16 slot and away it goes. For $150 you get a low-power-consumption card that outperforms the 980, and runs head-to-head with the 1080, both of which sold for near a thousand dollars.
Had I known about the Super cards, I would have waited a few more weeks to make that purchase. Extra cores for almost the same price, basically. But, I haven't found the 1650 Super in low profile yet. Give it some time, I guess.
No, it isn't going to run the latest and bestest games at the highest settings, but the card isn't aimed at gamers. For that, you want the 2080 cards.
(Score: 2) by Booga1 on Tuesday April 07 2020, @02:13AM
Yeah, I have a friend who could use something like this as well. He bought an eMachines computer with a dinky 250 watt power supply in a custom form factor that isn't replaceable. He's currently on another 75 watt nVidia card, so this would be a perfect option to swap it out for.
(Score: 2) by takyon on Tuesday April 07 2020, @02:36AM (4 children)
I am more interested to see what performance will fit in this kind of TDP in the future than in the monster cards.
For comparison:
RTX 2080 Ti: 13.45 TFLOPS
RTX 2080: 10.07 TFLOPS
GTX 1080: 8.9 TFLOPS
GTX 980: 5 TFLOPS
PlayStation 4 Pro: 4.2 TFLOPS
GTX 1650 (GDDR5): 3 TFLOPS
GTX 1650 (GDDR6): 2.85 TFLOPS
PlayStation 4: 1.84 TFLOPS
Ryzen 7 4800U: 1.79 TFLOPS
This GDDR6 version offers no advantage over the one you have, which is probably why it was quietly launched.
Going by the TFLOPS numbers (which don't tell the whole story), it's weaker than the 980 or 1080. This list [tomshardware.com] confirms that, although the performance gap between the 1650 and the 980 is only about 25%.
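For the curious, the FP32 figures in that list fall straight out of shader count × 2 FLOPs per clock (FMA) × boost clock. A rough sketch using the published reference specs (actual boards vary, so treat these as approximations):

```python
# Back-of-envelope check of the peak-FP32 numbers above.
# Peak TFLOPS = CUDA cores x 2 FLOPs/clock (FMA) x boost clock (GHz) / 1000.
def peak_tflops(cores, boost_ghz):
    return cores * 2 * boost_ghz / 1000.0

# Reference specs (cores, boost clock in GHz); approximate.
cards = {
    "GTX 1650 (GDDR5)": (896, 1.665),   # ~2.98 TFLOPS
    "GTX 1650 (GDDR6)": (896, 1.590),   # ~2.85 TFLOPS (lower boost clock)
    "GTX 1080":         (2560, 1.733),  # ~8.87 TFLOPS
}
for name, (cores, ghz) in cards.items():
    print(f"{name}: {peak_tflops(cores, ghz):.2f} TFLOPS")
```

Note the GDDR6 card's slightly lower TFLOPS figure comes entirely from its lower boost clock; the core count is unchanged, and the faster memory partially offsets the clock deficit in practice.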
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Tuesday April 07 2020, @03:41PM (3 children)
VR isn't quite portable yet.
The GTX 1080 is the "low bar" and isn't found in laptops, and the 2080 is still too expensive when inside a laptop?
(Score: 2) by takyon on Tuesday April 07 2020, @06:53PM (2 children)
VR is still in early adopter hell. Any headset without eye tracking and foveated rendering is obsolete. Those could be used to push framerate and (maximum) resolution up, with the eventual target being 8K-16K and 200+ FPS. Software advancements may be almost as important as the GPU hardware advancements.
Portability goes beyond laptops. All headsets will eventually become standalone, like today's Snapdragon 8XX headsets but with 100x the performance.
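A quick bit of arithmetic shows why those targets demand foveated rendering. Taking the high end of the range stated above (8K per eye at 200 FPS; the 4K comparison figure is my own):

```python
# Pixel-rate arithmetic for the stated VR targets: 8K per eye, 200 FPS.
width, height = 7680, 4320   # one "8K" eye buffer
fps, eyes = 200, 2

pixels_per_second = width * height * fps * eyes
print(f"{pixels_per_second / 1e9:.1f} billion shaded pixels/second")
# Compare: a single 4K monitor at 240 Hz needs only about 2 billion/s.
# Shading an order of magnitude more pixels than today's highest-end
# desktop displays is part of why foveated rendering (rendering far
# fewer pixels where the eye isn't looking) is treated as a prerequisite.
```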
Nvidia just refreshed its high-end laptop GPUs:
NVIDIA’s 2020 Laptop Refresh: Launches GeForce RTX 2080 Super, 2070 Super, & GTX 1650 Ti [anandtech.com]
If you want a "VR laptop", you'll have to pay out the ass. Unless you can find a laptop that "fell off the back of the truck".
(Score: 1, Informative) by Anonymous Coward on Thursday April 09 2020, @01:05AM (1 child)
I'm waiting for the trifecta of eye tracking, foveated rendering, and dynamic focal-length compensation
(your eyes naturally want to focus at a nearer plane when aimed at something closer, and providing only a fixed focal depth gives you a headache).
Although for truly perfect rendering of movement, it's also necessary to apply per-eye 3D motion compensation to the image, covering rotation about all three axes centred on each eye. (Yes, eyes rotate about the z axis too. Don't believe me? Look into a mirror at your iris, then tilt your head: your eye will probably stay level, which requires it to rotate in its socket.)
That compensation lets the image stay locked to the fovea despite eye movement. It's the only way to provide motion-blur effects that work properly when your eyes are tracking moving objects, as well as to eliminate the loss of detail that occurs when there's a big velocity difference between the two.
A nice additional benefit is that the actual render resolution (and actual display resolution) can then be adapted to the fovea itself, which allows it to be dropped surprisingly low -- less than a few megapixels, IIRC -- without any loss of visible detail. One can still have ~4 pixels per photoreceptor in the fovea without truly rendering the massive framebuffers we currently use, which yields massive improvements in both dynamic performance (frame rate / latency) and energy efficiency.
It makes a good case for a deferred HW renderer like PowerVR, since the selection of which pixels really need to be rendered could be made by that kind of hardware quite late in the pipeline, although it would require some adaptation to deal with the non-linearity in the pixel spacing.
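The "less than a few megapixels" claim above is easy to sanity-check. A purely illustrative sketch, assuming (my assumption, a common first-order acuity model) that the required pixels-per-degree falls off roughly as ppd_peak / (1 + e/e2) with eccentricity e; the values for ppd_peak, e2, and the field of view are ballpark, not measured:

```python
# Illustrative (not rigorous) estimate of a foveated pixel budget.
import math

def uniform_pixels(fov_deg=100, ppd=60):
    # Rendering the entire square FOV at full foveal resolution.
    return (fov_deg * ppd) ** 2

def foveated_pixels(fov_deg=100, ppd_peak=60, e2=2.0, steps=10000):
    # Numerically integrate pixel density over annuli out to the FOV edge,
    # with pixels-per-degree decaying as ppd_peak / (1 + e/e2).
    r_max = fov_deg / 2
    de = r_max / steps
    total = 0.0
    for i in range(steps):
        e = (i + 0.5) * de          # eccentricity at annulus midpoint
        ppd = ppd_peak / (1 + e / e2)
        total += 2 * math.pi * e * ppd ** 2 * de
    return total

print(f"uniform:  {uniform_pixels() / 1e6:.1f} MP")
print(f"foveated: {foveated_pixels() / 1e6:.2f} MP")
```

Under these assumptions the eccentricity-matched budget comes out well under a megapixel, versus tens of megapixels for uniform full-acuity rendering, which is at least consistent with the direction of the claim.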
But, yeah, 3D movement systems with feedback fast enough to effectively neutralise the movements of two human eyes, while remaining lightweight, are a bit beyond us just yet, it seems.
'til then, I just don't think VR is serious. It's getting better, but it's just not quite there yet. It'll continue to give you a headache after a few hours of use, and require games to work around the issues with blur due to movement.
(Score: 2) by takyon on Thursday April 09 2020, @02:25AM
The focal length problem is being worked on:
Half Dome Updates: FRL Explores More Comfortable, Compact VR Prototypes for Work [oculus.com]
Yeah, that's another great reason why VR is in early adopter hell.
As far as eye tracking in headsets goes, it should be standard soon. There's too much performance left on the table by not using foveated rendering.
Eye tracking is the next phase for VR, ready or not [cnet.com]
(Score: 2) by jasassin on Tuesday April 07 2020, @03:24AM (5 children)
Not being a jerk here. Who is it aimed at?
jasassin@gmail.com GPG Key ID: 0xE6462C68A9A3DB5A
(Score: 5, Interesting) by Runaway1956 on Tuesday April 07 2020, @04:18AM (2 children)
It's aimed primarily at "entry level" gamers, whatever the heck that means. You'll find that phrase on various sites. The low-profile card is very specifically aimed at people who have small form factor boxes, which is exactly what I was looking for. It fits into a 2U server, and a bunch of those mini-ITX, micro-ATX, and whatever else boxes. It will make an excellent card for enterprise, and for regular people who need something better than integrated Intel graphics but don't need a gaming rig.
Let me hit on that gaming thing again: the 1650 will play just about any five-year-old game very nicely, on higher settings if not the highest settings.
Games being released now and in the next year or so are actually waiting on optimised drivers for the RTX 2070 and 2080. My low-end 1650 isn't even under consideration by real gamers.
(Score: 2) by takyon on Tuesday April 07 2020, @11:07PM (1 child)
Check this out:
Single Board Computer + GTX 1650 Can it Game? ODYSSEY / ReComputer SBC [youtube.com]
lol
(Score: 2) by Runaway1956 on Tuesday April 07 2020, @11:35PM
That's pretty cool.
(Score: 2) by shortscreen on Tuesday April 07 2020, @08:12AM (1 child)
There are gamers and then there are gamers. For some, it seems that spending the most money and obtaining the most bragging rights is the primary game. For them, anything other than the flagship part is "low end" even though 99% of games would run fine on it. (Since when is the GTX moniker used for low end parts anyway?)
I play plenty of games, but I've never once spent even $100 on a video card let alone $500. But it's nice that somebody buys that stuff, so I can buy it from them on ebay five years later for much less.
(Score: 2) by takyon on Tuesday April 07 2020, @11:40AM
Since the ultra expensive RTX 20-series cash grab was launched. Now "GTX" is used for a handful of cards [wikipedia.org] without any dedicated raytracing (or ML) cores. Turing might be the only generation to do this since Nvidia (as well as AMD) are going to want some minimum level of raytracing functionality on every card eventually.