Nvidia Turing Product Reviews and Previews (Super, Ti, 2080, 2070, 2060, 1660, etc.)

Nvidia's proprietary raytracing doesn't matter to the end gamer, when you have DX12, Vulkan, DXR and other open standards. Nvidia is just using marketing propaganda to sway people away from performance and focused on visuals.

What proprietary raytracing? Not even going to try to understand what you mean by “focused on visuals” given the site you’re posting on.

Gamers want/demand lower latencies & better sustained frames, not higher reflection & lighting fidelity...

Ugh no they don’t. There are many game genres where fidelity is far far more important than fps. The only genre where that’s not true is competitive fps and that’s a far cry from all gamers.
 
Say that to the gamers that start complaining every time there is a graphics downgrade on game release compared to trailers...

"Gamers" want every bloody thing under the sun and will whine about a puddle being removed. At some point internet culture turned from "don't feed the trolls" to doing nothing but, to the point where trolls get people fired from companies.

But when it comes down to what will actually matter, instead of what will be whined about (which is everything), price vs. performance is probably top dog. And considering a 2080 Ti is 35% faster than a 1080 Ti while costing more than 35% more, well, that's not good for Nvidia's sales. It kind of feels like Nvidia allowed their engineers to throw everything under the sun at Turing, without regard to cost, die size, or market demand, and this is the end result.

If AMD can get its mid-tier 7nm Navi cards out in, say, the first half of next year, then assuming they're any good at all they'll crush Nvidia out of that entire segment until Nvidia also shows up with 7nm parts. And 7nm is difficult as hell to work with right now; the amount of engineering hours you need to pour into taping out a single chip is ridiculous, so they can't do it overnight. Of course this all assumes AMD actually gets Navi out by then. It seems like they knew what was up with GlobalFoundries a while ago, since Zen 2 and the pro Vega 20 or whatever are both on TSMC 7nm, but I don't know what the plan for Navi was.
 
None of that^ even matters if nVidia's $1,200 graphics card... cannot give us stable 75~100+ frames at 4K.
I don’t understand this request for “a GPU that can give stable 75fps at 4K.”

GPUs do calculations. It’s up to the programmers to determine how many calculations they want to do per frame. That’s it.

All we can ask for is that new GPU improve the number of operations per second, and, as far as I can tell, the new GPUs will give us that.
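To put rough numbers on that, here is a back-of-the-envelope sketch of how a fixed per-second throughput turns into a per-pixel, per-frame budget; the ~13.4 TFLOPS figure is an assumed placeholder for illustration, not a spec quoted anywhere in this thread:

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers, for illustration only.
    const double gpu_flops    = 13.4e12;          // assumed ~13.4 TFLOPS of FP32 throughput
    const double pixels_4k    = 3840.0 * 2160.0;  // ~8.3 million pixels per frame
    const double target_fps[] = {60.0, 75.0, 100.0};

    for (double fps : target_fps) {
        double frame_budget_ms = 1000.0 / fps;                   // time available per frame
        double flops_per_pixel = gpu_flops / (fps * pixels_4k);  // rough ops budget per pixel
        std::printf("%6.1f fps -> %5.2f ms/frame, ~%7.0f FLOPs/pixel\n",
                    fps, frame_budget_ms, flops_per_pixel);
    }
    return 0;
}
```

The budget is fixed by the hardware; how many of those operations a game spends per pixel is entirely the developer's call, which is the point being made above.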

Most of us "advanced gamers" will just stick with our g-sync 3880 x 1440 & 3440 x 1440 gaming displays, and our 1080ti’s
Nvidia will be more than happy to accept "advanced gamers'" money for one of their Pascal based GPUs. It's not as if AMD has an alternative.

(RTX is a bust)...
We will see.

... and just wait 3 months until AMD releases their 7nm Vega 128 freesync2 cards for $1,200... then upgrade to a nice 4k display.
I think you’re setting yourself up for multiple disappointments.

First of all, it's questionable that a Vega 128 (?) even exists. Second, the chances that they'll be available in just 3 months are close to nil. Third, it's unlikely that they'll sell a 128 CU GPU with 4 HBM2 stacks at $1200. And, finally, it's an open question whether they'll finally be able to make something that's actually faster than an RTX 2080 Ti.

Nvidia's proprietary raytracing doesn't matter to the end gamer, when you have DX12, Vulkan, DXR and other open standards.
Do you have any proof at all that the likes of BF V are not using DXR? (What is it with people who are stuck in this endless loop of not understanding that RTX is used to provide DXR functionality? It’s not that hard.)
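For what it's worth, DXR support is something an application queries through plain D3D12 like any other feature; RTX hardware is simply what makes the query come back positive. A minimal sketch of that check (Windows-only, recent Windows SDK assumed, error handling trimmed):

```cpp
// Minimal sketch: ask the default D3D12 adapter whether it exposes DXR.
// Windows-only, assumes a recent Windows SDK, link against d3d12.lib.
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // DXR capability is reported through the regular feature-support query.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR supported (tier enum value %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    else
        std::printf("DXR not supported on this adapter.\n");

    device->Release();
    return 0;
}
```

A game written against DXR doesn't care whose hardware answers that query; RTX is Nvidia's implementation underneath it.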

Nvidia is just using marketing propaganda to sway people away from performance and focused on visuals.

Gamers want/demand lower latencies & better sustained frames, not higher reflection & lighting fidelity...

If you only care about performance, then nothing prevents you from playing BF 1942. Or playing Counter-Strike at 500fps.

Of course it's about visuals. Visuals have been driving the whole industry for the past three decades.
 
But when it comes down to what will actually matter, instead of what will be whined about (which is everything), price vs. performance is probably top dog. And considering a 2080 Ti is 35% faster than a 1080 Ti while costing more than 35% more, well, that's not good for Nvidia's sales.
The 2080 Ti is for the Titan Xp crowd and the ultra enthusiasts. NVIDIA has repeatedly made sure, over the previous generations, that there's a market at that price point. It's no different this time.
And 7nm is difficult as hell to work with right now; the amount of engineering hours you need to pour into taping out a single chip is ridiculous, so they can't do it overnight.
So you think NVIDIA worked with TSMC to create a custom 12nm process (a modified 16nm) and then somehow neglected to plan for 7nm entirely? That's highly illogical.
If AMD can get its mid-tier 7nm Navi cards out in, say, the first half of next year, then assuming they're any good at all they'll crush Nvidia out of that entire segment
That's assuming the 2060 and 2050 are not price competitive; NVIDIA can adjust their pricing at any time.

Of course this all assumes AMD actually gets Navi out by then. It seems like they knew what was up with GlobalFoundries a while ago,
The way I see it, AMD is in a bit of a pickle this time. I think they can't afford to be late with accelerated RT the way they were late to AI, or to Compute/CUDA, or to Tessellation/Geometry performance. They need to step up this time, especially as I think Intel will be following in NVIDIA's footsteps on RT as well.
 
None of that^ even matters if nVidia's $1,200 graphics card... cannot give us stable 75~100+ frames at 4K.

Most of us "advanced gamers" will just stick with our g-sync 3880 x 1440 & 3440 x 1440 gaming displays, and our 1080ti's (RTX is a bust)... and just wait 3 months until AMD releases their 7nm Vega 128 freesync2 cards for $1,200... then upgrade to a nice 4k display.

Nvidia's proprietary raytracing doesn't matter to the end gamer, when you have DX12, Vulkan, DXR and other open standards. Nvidia is just using marketing propaganda to sway people away from performance and focused on visuals.

Gamers want/demand lower latencies & better sustained frames, not higher reflection & lighting fidelity...

Turing is the latest technology on the streets. That is why it commands a premium.

Turing vs Vega isn't about raw game numbers, it is about building blocks for future games. If the 2080 can run last year's games on par with its competitor's 7nm Vega, then it is a far better value to the end user and gamer.

Again, not being attached to any brand and looking logically at the present and the future, is any unbiased gamer building or buying a new rig over the upcoming holiday really going to pass up the technology Turing and the 2080 bring, all because the two-year-old Vega can crunch certain outdated games extremely well?

Latest tech = Premium



Honestly, what would buying a Vega card today get you over buying a Turing card? And in the future?

Even at equal cost, the 2080 is a much better choice for the premium gamer, because it supports the most standards, has better compute, has the latest VR tech, and is prepped and ready for future game titles and game engines. Nvidia has announced many gaming partners and paraded many of them on stage at their events. They always mention their "partners" in tweets, etc. Nvidia has a lot of collaboration in the gaming industry, and what seems like a lot of development studios are on board and behind them.

I suspect Nvidia is holding back. I think there are several huuuge announcements yet to be made about Turing.

I am not worried about $50 here or there. I see the 2080 as the latest tech, and it demands a premium.

I am buying the 2080 because it won't hinder my gaming.
 
I suspect Nvidia is holding back. I think there are several huuuge announcements yet to be made about Turing.
I have no idea what those could be.

I am not worried about $50 here or there. I see the 2080 as the latest tech, and it demands a premium.
It's undeniable that the price is much higher than what it used to be, and it's not unreasonable for many to choke on that.

But I don't remember if there ever was a moment where a GPU vendor not only outperformed their competitor with their latest product line, but with the previous one as well.

I wonder if Nvidia will simply keep on producing Pascal for some time to come. I thought it was really funny on RTX launch day to see a bunch of people commenting about how they'll stick it to Nvidia by buying a cheaper 1080 Ti. That will teach them! :)
 
Most of us "advanced gamers" will just stick with our g-sync 3880 x 1440 & 3440 x 1440 gaming displays, and our 1080ti's
As I recall it, you were the one preaching about Vega 64's dominance over the 1080 Ti, and now you claim you'll stick with your 1080 Ti?
if nVidia's $1,200 graphics card... cannot give us stable 75~100+ frames at 4K.
The 2080 Ti is going to be 35~45% faster than a 1080 Ti, more once OC'ed, and it will also be 60~70% faster than a Vega 64. There isn't anything faster than this card at 4K right now. If the 1080 Ti came close to running 4K60, the 2080 Ti will be closer.
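As a rough sanity check of that last sentence, here is a tiny worked example; the ~45 fps 4K Ultra baseline for the 1080 Ti is an assumed illustrative figure, not a measurement from this thread:

```cpp
#include <cstdio>

int main() {
    // Hypothetical baseline and the quoted uplift range, for illustration only.
    const double gtx1080ti_fps = 45.0;          // assumed 4K Ultra average
    const double uplift[]      = {1.35, 1.45};  // the quoted 35~45% range

    for (double u : uplift)
        std::printf("+%.0f%% -> ~%.0f fps\n", (u - 1.0) * 100.0, gtx1080ti_fps * u);
    // ~61-65 fps: close to a locked 4K60, still short of the demanded 75-100+ fps.
    return 0;
}
```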
Nvidia is just using marketing propaganda to sway people away from performance and focused on visuals.
Say that to the PC crowd who mod the hell out of their games to push the visuals far beyond the game's original graphics, say it to the PC crowd who activate MSAA, SSAA, and super resolutions just to eliminate every last drop of jaggies on their screens, say it to the crowd who push for Ultra settings all the time at all costs, and who activate taxing technologies such as advanced shadowing/physics/tessellation/draw distance/hair simulations just to enjoy better scenery. The fact is, visuals are what sell video games and graphics cards, not just performance. If you want 4K60 or 100fps at 1080p you can do it with a 1060/580 just like a console; all you need to do is blast those video settings down to low and you are set to go. Heck, the console crowd has been satisfied with a mere 30fps for the vast majority of their time, only upgrading to enjoy better visuals once a new gen comes in.
 
Say that to the gamers that start complaining every time there is a graphics downgrade on game release compared to trailers...
And what is this going to change with "RAYTRACING" and "DLSS"? LOL, companies don't like spending man-hours on something they know is gonna sell regardless of the quality. No matter what technology it has, in the end they know it will sell well.

Batman: Arkham Knight is a prime example of that, a clusterF**K.
 
I doubt the average gamer will ever own a GeForce RTX 2080 Ti, let alone a 1080 Ti. These are halo parts and the price reflects that. At some point there will be an inflection point where the features and performance can be had at a comfortable price. It's about paving the way for future generations, like Formula One does for cars and driving safety.

Casual gaming is still much bigger than competitive gaming or PC gaming for that matter.

When are reviews expected? Or when does the NDA expire?
 
And what is this going to change with "RAYTRACING" and "DLSS"? LOL, companies don't like spending man-hours on something they know is gonna sell regardless of the quality. No matter what technology it has, in the end they know it will sell well.
Batman: Arkham Knight is a prime example of that, a clusterF**K.
AFAIK, raytracing requires less effort compared to how things are currently done. So I'd guess that a hypothetical "raytraced-only" Arkham Knight would require fewer man-hours than the real one.
We'll get there someday.
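To illustrate the "less effort" point, here is a toy CPU-side sketch of a hard-shadow test against a single sphere; it is purely illustrative and has nothing to do with any real engine, but it shows how a shadow query reduces to one ray test instead of a shadow-map pipeline (render depth from the light, pick a resolution, tune bias, etc.):

```cpp
// Toy CPU sketch: a hard shadow test is conceptually one ray-vs-scene query.
// Illustration only; nothing here is engine code.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true if the sphere blocks the segment from point 'p' to 'light'.
static bool occluded(Vec3 p, Vec3 light, Vec3 center, double radius) {
    Vec3 d = sub(light, p);                   // shadow ray toward the light
    double maxT = std::sqrt(dot(d, d));
    Vec3 dir = {d.x / maxT, d.y / maxT, d.z / maxT};

    Vec3 oc = sub(p, center);                 // standard ray-sphere intersection
    double b = dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - c;
    if (disc < 0.0) return false;             // ray misses the sphere entirely
    double t = -b - std::sqrt(disc);
    return t > 1e-4 && t < maxT;              // hit between surface point and light
}

int main() {
    Vec3 light  = {0.0, 10.0, 0.0};
    Vec3 sphere = {0.0, 5.0, 0.0};            // occluder hovering above the ground
    std::printf("point under sphere in shadow: %s\n",
                occluded({0, 0, 0}, light, sphere, 1.0) ? "yes" : "no");
    std::printf("point off to the side in shadow: %s\n",
                occluded({5, 0, 0}, light, sphere, 1.0) ? "yes" : "no");
    return 0;
}
```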
 
AFAIK, raytracing requires less effort compared to how things are currently done. So I'd guess that a hypothetical "raytraced-only" Arkham Knight would require fewer man-hours than the real one.
We'll get there someday.

Yeah, the transition should be interesting. Now that there’s a more ‘natural’ alternative I imagine at some point developers will tire of the acrobatics required to emulate proper lighting in screen space. There are already some pretty impressive algos for faking this stuff but it’s not clear that people will spend a lot more cycles on further improvement.

It’s probably going to come down to whether consoles embrace RT. If they do then a lot of brainpower will likely be redirected to optimizing ray budget instead of fumbling around with screen space buffers.

Also, it’s not clear what this means for voxel based AO and GI. Is that still a performance win vs RT or does it also die off?
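Something like the ray-budget tuning mentioned above could, hypothetically, look like the sketch below; every name and threshold is made up for illustration:

```cpp
// Hypothetical sketch of a per-frame ray budget heuristic: spend whatever
// frame time is left after rasterization on extra rays per pixel, instead of
// tuning screen-space buffer tricks. All names and numbers are made up.
#include <algorithm>   // std::clamp (C++17)
#include <cstdio>

struct FrameStats {
    double target_ms;   // frame budget, e.g. 16.6 ms for 60 fps
    double raster_ms;   // measured cost of the raster passes last frame
    double ms_per_rpp;  // measured cost of one ray per pixel last frame
};

// Pick how many rays per pixel we can afford this frame.
static int raysPerPixel(const FrameStats& s, int minRpp = 1, int maxRpp = 4) {
    double spare_ms = s.target_ms - s.raster_ms;
    int rpp = static_cast<int>(spare_ms / s.ms_per_rpp);
    return std::clamp(rpp, minRpp, maxRpp);
}

int main() {
    FrameStats light_scene {16.6, 6.0, 2.0};   // plenty of headroom
    FrameStats heavy_scene {16.6, 12.0, 2.5};  // raster already expensive
    std::printf("light scene: %d rays/pixel\n", raysPerPixel(light_scene));
    std::printf("heavy scene: %d rays/pixel\n", raysPerPixel(heavy_scene));
    return 0;
}
```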
 
Also, it’s not clear what this means for voxel based AO and GI. Is that still a performance win vs RT or does it also die off?

My armchair assumption is that volumetric representations of the scene will still be more efficient, and provide better looking results within a limited compute budget, than pure polygon-based ray tracing for diffuse GI and glossy reflections, and that ray tracing will actually push more experimentation in that area as well. They are working with polygon raytracing because it's the new thing and the most straightforward thing on current tech. Future engines will trace a bit of everything simultaneously, depending on material properties, distance/LOD, ray length, light type, etc. A made-up sketch of that kind of per-ray dispatch follows below.
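Here is that sketch, where the tracing technique is chosen from material roughness and ray length; none of the names or thresholds correspond to a real engine:

```cpp
// Made-up sketch of a hybrid tracer's dispatch policy: sharp, nearby rays go
// to triangle ray tracing; rough or distant ones fall back to a cheaper voxel
// cone trace. Purely illustrative; not a real engine interface.
#include <cstdio>

enum class Technique { TriangleRayTrace, VoxelConeTrace };

struct RayRequest {
    double roughness;    // 0 = mirror, 1 = fully diffuse
    double rayLength;    // how far the ray may travel, in meters
};

// Hypothetical policy: mirror-like, short rays need exact geometry; everything
// else is blurry enough that a prefiltered voxel representation looks fine.
static Technique choose(const RayRequest& r) {
    const double kRoughnessCutoff = 0.3;
    const double kDistanceCutoff  = 20.0;
    if (r.roughness < kRoughnessCutoff && r.rayLength < kDistanceCutoff)
        return Technique::TriangleRayTrace;
    return Technique::VoxelConeTrace;
}

int main() {
    RayRequest mirrorNearby {0.05, 5.0};
    RayRequest diffuseBounce{0.9, 50.0};
    std::printf("mirror reflection: %s\n",
                choose(mirrorNearby) == Technique::TriangleRayTrace ? "triangle RT" : "voxel cone trace");
    std::printf("diffuse GI bounce: %s\n",
                choose(diffuseBounce) == Technique::TriangleRayTrace ? "triangle RT" : "voxel cone trace");
    return 0;
}
```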
 
Why are there different days for the non-Ti? What would be the reasoning behind it? I wonder when reviewers actually get cards.
 
Why are there different days for the non-Ti? What would be the reasoning behind it? I wonder when reviewers actually get cards.

Two things come to mind. It gives reviewers more time for testing and lets them create two content pieces instead of one (three if you include the embargo lift for the architecture slides). The latter gives more clicks/views and more overall coverage for the product, benefiting both the media outlets and Nvidia.
 
9 more games adding support for NVIDIA DLSS

Newly-Announced DLSS Titles:

  • Darksiders III from Gunfire Games / THQ Nordic
  • Deliver Us The Moon: Fortuna from KeokeN Interactive
  • Fear The Wolves from Vostok Games / Focus Home Interactive
  • Hellblade: Senua's Sacrifice from Ninja Theory
  • KINETIK from Hero Machine Studios
  • Outpost Zero from Symmetric Games / tinyBuild Games
  • Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
  • SCUM from Gamepires / Devolver Digital
  • Stormdivers from Housemarque
Other Titles Implementing DLSS:

  • Ark: Survival Evolved from Studio Wildcard
  • Atomic Heart from Mundfish
  • Dauntless from Phoenix Labs
  • Final Fantasy XV: Windows Edition from Square Enix
  • Fractured Lands from Unbroken Studios
  • Hitman 2 from IO Interactive / Warner Bros.
  • Islands of Nyne from Define Human Studios
  • Justice from NetEase
  • JX3 from Kingsoft
  • Mechwarrior 5: Mercenaries from Piranha Games
  • PlayerUnknown’s Battlegrounds from PUBG Corp.
  • Remnant: From The Ashes from Arc Games
  • Serious Sam 4: Planet Badass from Croteam / Devolver Digital
  • Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
  • The Forge Arena from Freezing Raccoon Studios
  • We Happy Few from Compulsion Games / Gearbox
https://www.guru3d.com/news-story/9-more-games-adding-support-for-nvidia-dlss.html
 