Nvidia Post-Volta (Ampere?) Rumor and Speculation Thread

Out of which only Control is anywhere near worth mentioning.
That's your purely subjective opinion, which other people (especially those who value graphics fidelity) don't particularly share.

Fact is, when you buy a GPU, you at the very least future-proof it with current DX features; you don't buy a DX10 GPU when DX11 is on the market. That's not smart. And what we have here is basically the same situation.
 
That's your purely subjective opinion, which other people (especially those who value graphics fidelity) don't particularly share.
No, that's an opinion many more people share.
Fact is, when you buy a GPU, you at the very least future-proof it with current DX features
No such thing.
Turing GPUs are for the most part woefully underpowered for any sizeable DXR job.
Either wait for RDNA2 or Ampere, whichever floats your boat, or don't cry when Turing chokes in a year on snazzier DXR implementations.
 
No, that's an opinion many more people share.

No such thing.
Turing GPUs are for the most part woefully underpowered for any sizeable DXR job.
Either wait for RDNA2 or Ampere, whichever floats your boat, or don't cry when Turing chokes in a year on snazzier DXR implementations.

I happen to be someone who enjoyed Control immensely, not only for the story and gameplay but also the graphics. I run a water-cooled 2080 Ti and a 3440x1440 ultrawide 100Hz G-Sync display on my gaming rig. I can assure you that the game ran very well at the highest in-game settings and native resolution. If the performance in a given DXR-enabled title is inadequate, that's why features like G-Sync, DLSS, and VRS exist.
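For the curious, VRS in particular is cheap to wire up on the app side. A minimal D3D12 sketch of the idea (the function name is mine, and the device and command list are assumed to come from normal D3D12 setup, not from any specific engine): check the reported VRS tier, then request coarser per-draw shading.

#include <d3d12.h>

// Check whether the device exposes Variable Rate Shading at all, then
// request 2x2 coarse shading for subsequent draws: one pixel-shader
// invocation per 2x2 pixel quad instead of per pixel, trading a little
// sharpness for a sizeable shading-cost reduction.
void EnableCoarseShading(ID3D12Device* device,                 // assumed created elsewhere
                         ID3D12GraphicsCommandList5* cmdList)  // assumed recording
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6))) &&
        options6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
    {
        // Tier 1 VRS: one base rate for the whole draw; nullptr keeps
        // the default rate combiners.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }
}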
 
Turing GPUs are for the most part woefully underpowered for any sizeable DXR job.
As opposed to RDNA1, which is not even capable of running DXR?

Either wait for RDNA2 or Ampere, whichever floats your boat, or don't cry when Turing chokes in a year on snazzier DXR implementations.
That's laughable; you'd need to see some comparative numbers before declaring who is going to choke. AMD didn't even bother to show a live demo at their RT announcement. I wonder why?!
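To make the "capable of running DXR" point concrete: DXR support is just a feature tier the driver reports per device, and games gate their RT paths on it. A minimal sketch, assuming an already-created ID3D12Device (the helper name is mine):

#include <d3d12.h>

// Ask the driver whether this GPU exposes DXR at all. RDNA1 parts report
// D3D12_RAYTRACING_TIER_NOT_SUPPORTED, so any DXR path is simply skipped;
// Turing reports at least D3D12_RAYTRACING_TIER_1_0.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}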
 
No such thing.
Turing GPUs are for the most part woefully underpowered for any sizeable DXR job.
Either wait for RDNA2 or Ampere, whichever floats your boat, or don't cry when Turing chokes in a year on snazzier DXR implementations.

I don't expect RDNA2 to be any better than Turing when it comes to DXR.
 
When you're first to ship a big feature, performance is always problematic. And without competition, they didn't have to push that hard.
 
FAD is FAD, my child.
You'll get your demo when both AMD and their partners deem it necessary.

Good, always leave yourself some room to be surprised.

Your argument makes no sense. A few posts ago you acknowledged that both the next Xbox and PlayStation will feature ray tracing and that the feature will go mainstream at that point. Which is it, then? Is ray tracing a fad, or is it on the verge of becoming mainstream?
 
Your argument makes no sense. A few posts ago you acknowledged that both the next Xbox and PlayStation will feature ray tracing and that the feature will go mainstream at that point. Which is it, then? Is ray tracing a fad, or is it on the verge of becoming mainstream?
FAD as in Financial Analyst Day.
You know, the one that happened yesterday.
 