Nvidia Turing Speculation thread [2018]

Status
Not open for further replies.
GPUs got so good at "faking" everything that RT enhancements don't offer that huge a noticeable overall improvement. It's not really a gameplay enhancement unless you use shadows and light as part of the gameplay, and even those are easily faked.
There's an enormous amount of additional cost in faking everything, though: driver issues, etc.

Ray tracing is going to help solve a lot of these weird problems, and everything is going to look better.

Big-budget games can build more complex scenes, and narrative games will have more options for visual impact now.
 
This BF5 demo is really cool though.

It was pretty neat, but sooooooooo much could still be reasonably faked, especially the lighting. Reflections are not that big a deal. I bet some of the raster fallback methods were left out by agreement, to make the RTX enhancement stand out.
 
BFV had an RTX alpha tag on it, and you could see some bugs when the camera was pointed at the rifle. For instance, another soldier was firing off-screen, and you could see weird flames shooting out at strange angles.

Still, very impressive imho.
 
RTX games to be released :

[image attachment: 53_575px.jpg]
 
Hard to discern what the basic performance difference will be, ray tracing aside. They blurred the lines a lot between the ray-tracing feature and general performance.

He said the 2070 will "have the same performance as the 1080 Ti", but the 2070 supposedly has 2304 cores and the 1080 Ti has 3584... so, ahh, call me sceptical of that statement.
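For what it's worth, a quick back-of-envelope check of raw shader throughput supports the scepticism. This is only a sketch: it assumes the announced reference boost clocks (~1.62 GHz for the 2070, ~1.58 GHz for the 1080 Ti) and the usual peak-FP32 formula of cores × 2 FLOPs per clock (one FMA), ignoring any per-core architectural gains Turing might bring:

```python
# Back-of-envelope peak FP32 throughput: cores * 2 FLOPs/cycle (FMA) * clock.
# The boost clocks below are the announced reference figures (assumptions);
# real sustained clocks vary with GPU Boost.

def tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS assuming one FMA (2 FLOPs) per core per cycle."""
    return cuda_cores * 2 * boost_ghz / 1000.0

rtx_2070 = tflops(2304, 1.62)      # ~7.5 TFLOPS
gtx_1080_ti = tflops(3584, 1.582)  # ~11.3 TFLOPS

print(f"2070:    {rtx_2070:.1f} TFLOPS")
print(f"1080 Ti: {gtx_1080_ti:.1f} TFLOPS")
```

By raw numbers the 2070 lands around two-thirds of the 1080 Ti, so "same performance" would require a large per-clock improvement.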
 
I was left unimpressed by the actual hardware and pricing but still excited about what's to come eventually. Might wait for 7nm at this point, I guess there must be a reason Nvidia weren't willing to talk about non-rtx perf (like at all), maybe 12nm+ is the limiting factor here.
 
What does everyone think of the new measuring metric? I'm finding it difficult to figure out the actual performance of these GPUs thanks to this RTX-OPS thing. I'm assuming the ray-tracing capability is somewhat separate from everything else going on, and RTX-OPS was invented to take it into account?
 
What does everyone think of the new measuring metric? I'm finding it difficult to figure out the actual performance of these GPUs

And that's exactly its purpose, imo: to obfuscate the performance of these GPUs in existing non-RTX workloads. The 2070 might be significantly slower than the 1080 Ti in non-RTX tasks (while catching up in RTX tasks), and that's something Nvidia has generally never done before.

Therefore: create a magic, nonsensical metric to compare against your existing non-RTX lineup. It's the most Nvidia-like thing they could do, really :p

Edit: Also a problem they created for themselves with GPU Boost 3.0, which essentially overclocks all Pascal GPUs close to their limits by default.
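The metric can be roughly reverse-engineered from Nvidia's own slide numbers. The sketch below assumes the weighted sum Nvidia described (each unit type derated by the fraction of a frame it's supposedly busy: FP32 80%, INT32 28%, RT cores 40%, Tensor cores 20%) and the announced 2080 Ti figures; treat all of those inputs as assumptions:

```python
# Rough reconstruction of Nvidia's "RTX-OPS" metric as a weighted sum of
# per-unit throughputs. Weights and inputs are assumptions based on Nvidia's
# announcement material: 14 FP32 TFLOPS, 14 INT32 TIPS, 10 Gigarays/s
# (~100 "TFLOPS-equivalent" at Nvidia's 10-TFLOPS-per-Gigaray conversion),
# and ~114 Tensor TFLOPS (FP16) for the RTX 2080 Ti.

def rtx_ops(fp32: float, int32: float, rt_equiv: float, tensor: float) -> float:
    """Weighted sum: FP32 x 80% + INT32 x 28% + RT x 40% + Tensor x 20%."""
    return 0.8 * fp32 + 0.28 * int32 + 0.4 * rt_equiv + 0.2 * tensor

print(rtx_ops(fp32=14, int32=14, rt_equiv=100, tensor=114))  # ~78, matching the advertised 78 RTX-OPS
```

Notice how little of the headline number comes from the FP32 shaders (11.2 of ~78), which is exactly why it says so little about non-RTX performance.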
 
The 2070 might be significantly slower than the 1080 Ti
Well, the 2070 is priced considerably lower than the 1080 Ti...

The real question is the 2080 vs the 1080 Ti at the same price. I think performance will be very similar, with the 2080 having the newer architecture vs the 1080 Ti having 3 GB more memory.
 
Is it just me / Twitch, or does the Tomb Raider tech demo show barely any difference with RT on?
It does show a difference, but it's subtle, and in motion it's a bit pointless...
GPUs got so good at "faking" everything that RT enhancements don't offer that huge a noticeable overall improvement. It's not really a gameplay enhancement unless you use shadows and light as part of the gameplay, and even those are easily faked.
What are you talking about guys? The difference is MASSIVE. Night and day. That kid's shadow's penumbra size was pretty large. I swear, when I saw that this was my reaction:

 
What are you talking about guys? The difference is MASSIVE. Night and day. That kid's shadow's penumbra size was pretty large. I swear, when I saw that this was my reaction:

Likely a human-perception effect. People get so used to faked scenes that look good that they have no point of comparison with something significantly more accurate. They don't see a big difference because they're used to seeing the fake. In reverse, if you're used to seeing a correct image and then it's taken away, it's much more noticeable.

TL;DR: it's sometimes harder to see an addition than a removal.
 
Well... RT is such a step up. Having a non-RT-capable GPU will feel like having nothing within 1-2 GPU refreshes. And with Nvidia pushing this, it will take off on PC, I guess. I guess it will also recreate a gap between high-end PC and the PS5 / next Xbox.

I hope AMD has an answer to that for games, but I don't believe Vega or Navi does.

Intel being such a powerhouse, I guess they can adjust in time for their 2020 GPU.

I really didn't see RT coming to gaming GPUs this soon...

Now, I know the performance won't be good enough this gen, like with every new tech, but it's a hell of a start.
 