Are ROPs still even that important? MSAA is pretty much a dead technology.
4K requires as much fillrate as 1080p 4xAA. Add all those fancy buffers and shadow maps that games create every frame and more fillrate may actually be useful.
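That equivalence is simple pixel arithmetic: a 4K frame has exactly four times the pixels of a 1080p frame, which is the same sample count the ROPs see at 1080p with 4x MSAA. A quick sanity check:

```python
# Samples the ROPs must write per frame, ignoring overdraw and blending.
uhd_4k = 3840 * 2160           # 4K, one sample per pixel
fhd_4x_msaa = 1920 * 1080 * 4  # 1080p, four coverage samples per pixel

print(uhd_4k)                  # 8294400
print(fhd_4x_msaa)             # 8294400
print(uhd_4k == fhd_4x_msaa)   # True
```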
64 ROPs seems like an... interesting design decision, if that's what it ends up as. Higher clocks might help a bit compared to Navi 10, but it'll be interesting to see how it stacks up at super-high resolutions and frame rates - for example, games at 8K, or at 4K with a 120fps target, or anything VR, which tends to be very light on compute but heavy on ROPs and bandwidth. Doubly so for titles that use forward renderers and MSAA.

8K resolution is completely irrelevant, even at the highest end. As long as you can do 4K, it's enough.
64 ROPs has been with us since Hawaii in 2013... oh boy.

On the other hand, clocks have nearly doubled since then too, and bandwidth would need to scale accordingly.
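Rough numbers on that clock scaling - a sketch, assuming one pixel per ROP per clock, Hawaii's 1.0 GHz reference clock (R9 290X), and the ~2.2 GHz figure floated for big Navi:

```python
def pixel_fillrate_gps(rops: int, clock_ghz: float) -> float:
    """Theoretical pixel fillrate in gigapixels/s: one pixel per ROP per clock."""
    return rops * clock_ghz

hawaii = pixel_fillrate_gps(64, 1.0)    # R9 290X reference clock -> 64.0 GP/s
big_navi = pixel_fillrate_gps(64, 2.2)  # rumoured clock          -> 140.8 GP/s
print(big_navi / hawaii)                # -> 2.2
```

So the same 64 ROPs would deliver a bit over twice the theoretical fillrate, while the bandwidth to feed them has to grow in step.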
Yeah, but have ROPs been a bottleneck at all in the last 10 or so years? And as already stated, clock speeds have skyrocketed recently.
VR is an incredibly niche market, though, and it's not showing much growth either.

Forward rendering and MSAA are definitely not dead for VR, either - in fact, forward rendering with MSAA is the recommended state-of-the-art approach, although clustered forward rendering is probably a better compromise. An awful lot of effort has gone into building good forward renderers specifically because in VR you want 4x or even 8x MSAA.

Combine that with a 4K-or-higher frame buffer per eye, rendered at 90fps in most cases - or even 120 or 144Hz with the most modern headsets - and it's easy to see how you can become fillrate-bound in a hurry.

https://docs.unrealengine.com/en-US/Engine/Performance/ForwardRenderer/index.html

It's one of the reasons the 1080 Ti punches way above its usual weight class against Turing in a lot of VR titles. Clock speeds barely budged with Turing, and ROP counts didn't change tier to tier. In a non-VR game you'd expect the 2080, and especially the 2080 Super, to beat the 1080 Ti in every single scenario. In VR that's not the case: the +50% fillrate advantage actually matters, and VR titles are generally quite light on compute.
Here's hoping big Navi is a beast in that regard!
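To put rough numbers on how fast VR eats fillrate - a sketch, where the 2160×2160-per-eye resolution, the 144Hz refresh, and 4x MSAA are illustrative assumptions rather than any specific headset's spec:

```python
def vr_samples_per_second(width: int, height: int, eyes: int,
                          refresh_hz: int, msaa: int) -> int:
    """Colour samples the ROPs must write per second for one pass over
    every pixel; real overdraw multiplies this several times."""
    return width * height * eyes * refresh_hz * msaa

demand = vr_samples_per_second(2160, 2160, eyes=2, refresh_hz=144, msaa=4)
print(demand / 1e9)  # ~5.37 billion samples/s, before any overdraw
```

With a few layers of overdraw that lands well into the tens of gigasamples per second, which is why ROP throughput still matters here.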
N23 (the small one) is Q1'21.
Same for mainstream nVidia furnaces.
So you basically have to wait 2Q.
I hope you’re exaggerating. Have no room for a furnace in my HTPC.
Well, "furnace".
he doesn't know

I wish, but alas.
We still need ROPs because they include a blending unit. Even Nvidia's Marbles RTX demo, in all of its glory, lacked ray-traced caustics on the marbles themselves... (their shadowing felt unnaturally uniform for a transparent object).

Alpha blending/transparency is part of the wider phenomenon of light scattering. Light rays don't just reflect; they can also scatter inside a medium, as we see with volumetric objects such as clouds, which can either brighten or darken with respect to their thickness.

ROPs were traditionally used as a rough approximation of some of these wider phenomena, and we'll continue to use them, because I highly doubt the near future will involve ray-traced volumetric rendering or transparency/caustics on a widespread basis. That'll be a problem to attempt for the generation after this coming one...

Actually, I think Marbles RTX did have caustics on the marbles; the thing is that the denoiser erases them in motion. Still shots from the demo have caustics on the marbles, for example.
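The blending the ROP hardware provides is, in the common case, the classic "over" operator - the crude-but-cheap stand-in for real scattering mentioned above. A minimal, API-agnostic sketch:

```python
def blend_over(src_rgb, src_a, dst_rgb):
    """Standard alpha 'over' blend, per channel:
    out = src_a * src + (1 - src_a) * dst."""
    return tuple(src_a * s + (1.0 - src_a) * d
                 for s, d in zip(src_rgb, dst_rgb))

# A 50%-opaque white fragment over a black background lands at mid grey.
print(blend_over((1.0, 1.0, 1.0), 0.5, (0.0, 0.0, 0.0)))  # (0.5, 0.5, 0.5)
```

The ROPs do this read-modify-write against the frame buffer for free; doing the same with ray-traced transparency means tracing continuation rays instead.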
Fun experiment - take a glass in your kitchen and fill it with water. Put it on the table. Barring sun shining through a window and falling on the glass, you won't see any caustics at all. Next, take it out into the sun. If you put it down in the sun you will see caustics, but question #1 - could you have accurately predicted the pattern before you saw it? Question #2 - could you have accurately predicted its intensity given the degree of haze/clouds and the surface the glass stands on?
For me, even having done these experiments and explicitly looking for the results, the answer to both of these is still NO!
The phenomenon is both complex, and belongs to that plethora of visual data that the brain wisely treats as noise and discards.
And the conclusion from that is that you can either fake the effect (quite crudely is perfectly fine) or disregard it entirely.
Your brain won’t mind, it doesn’t care and has no idea what would be physically correct anyway.
Maybe for glasses of water, but my brain certainly doesn't discard all caustics. Take a swim in the Caribbean - I guarantee you will notice and appreciate them!

Of course. It's not that there aren't conditions where they exist; it's that outside academia it's pointless to have an accurate rendition of them.
So it's 64 ROPs again, for a ~20TFLOPs GPU? Even if the core is clocking at 2.2GHz or so, the fillrate-to-compute ratio will be completely different to Navi 10 and Vega.

Does it matter, though? It ought to be more than enough for 4K.

The FLOPS-to-fillrate ratio is very close to Polaris / RX 480 (2304 SPs / 32 ROPs) and not far from Tahiti / HD 7970 in 2012 (2048 SPs / 32 ROPs).
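That ratio comparison can be made clock-independent: peak FLOPS is 2 ops (one FMA) × shader cores × clock, and pixel fillrate is ROPs × clock, so the clocks cancel and the ratio reduces to 2 × SPs / ROPs. A quick check - the 5120-SP figure for big Navi is the rumoured configuration, not a confirmed spec:

```python
def flops_per_pixel(shader_cores: int, rops: int) -> float:
    """Peak shader FLOPs per pixel written:
    (2 * SPs * clk) / (ROPs * clk) - the clock cancels out."""
    return 2 * shader_cores / rops

print(flops_per_pixel(2048, 32))  # Tahiti / HD 7970   -> 128.0
print(flops_per_pixel(2304, 32))  # Polaris / RX 480   -> 144.0
print(flops_per_pixel(5120, 64))  # rumoured big Navi  -> 160.0
```

So if the rumoured config holds, the compute-per-pixel balance really does sit in GCN territory rather than Navi 10's.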