Gamers want/demand lower latencies & better sustained frames, not higher reflection & lighting fidelity...
Say that to the gamers that start complaining every time there is a graphics downgrade on game release compared to trailers...
> Gamers want/demand lower latencies & better sustained frames, not higher reflection & lighting fidelity...
Nvidia's proprietary raytracing doesn't matter to the end gamer when you have DX12, Vulkan, DXR, and other open standards. Nvidia is just using marketing propaganda to sway people away from performance and toward visuals.
> None of that^ even matters, if nVidia's $1,200 graphics card... can not give us stable 75 ~100+ frames at 4k.
I don't understand this request for "a GPU that can give stable 75fps at 4K."
> Most of us "advanced gamers" will just stick with our g-sync 3840 x 1440 & 3440 x 1440 gaming displays, and our 1080ti's
Nvidia will be more than happy to accept "advanced gamers'" money for one of their Pascal-based GPUs. It's not as if AMD has an alternative.
> (RTX is a bust)...
We will see.
> and just wait 3 months until AMD releases their 7nm Vega 128 freesync2 cards for $1,200... then upgrade to a nice 4k display.
I think you're setting yourself up for multiple disappointments...
> Nvidia's proprietary raytracing doesn't matter to the end gamer when you have DX12, Vulkan, DXR, and other open standards.
Do you have any proof at all that the likes of BF V are not using DXR? (What is it with people who are stuck in this endless loop of not understanding that RTX is used to provide DXR functionality? It's not that hard.)
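[For what it's worth, the layering is easy to see in code: DXR capability is queried through plain D3D12, and RTX is just NVIDIA's hardware/driver backend that fills in the answer. A minimal sketch, assuming a recent Windows 10 SDK; everything here is standard d3d12.h, nothing vendor-specific:]

    // Query DXR support through the vendor-neutral D3D12 API.
    // Any vendor that implements DXR (NVIDIA RTX today, other GPUs later)
    // reports its raytracing tier through this same call.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        // nullptr = default adapter, whatever brand it happens to be
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("no D3D12 device available");
            return 1;
        }
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5)))) {
            // D3D12_RAYTRACING_TIER_1_0 or higher means DXR is supported.
            std::printf("raytracing tier: %d\n", (int)opts5.RaytracingTier);
        }
        return 0;
    }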
> But when it comes down to what will actually matter, instead of what will be whined about (which is everything), price v performance is probably top dog. And considering a 2080ti is 35% faster than a 1080ti, while costing more than 35% more, well that's not good for Nvidia's sales.
2080Ti is for the TitanXp crowd, and for the ultra enthusiasts. NVIDIA made sure there is always a crowd at that price range repeatedly over the previous generations. It's not different this time.
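[To put a rough number on the quoted claim: assuming launch prices of $699 for the 1080 Ti and $1,199 for the 2080 Ti Founders Edition (those figures are an assumption here, not from the comment above), a 35% performance uplift works out to

\[
\frac{1.35}{\,1199/699\,} \approx \frac{1.35}{1.72} \approx 0.79,
\]

i.e. roughly 21% less performance per dollar than the card it replaces.]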
> And 7nm is difficult as hell to work with right now, the amount of engineering hours you need to pour into taping out a single chip is ridiculous, so they can't do it overnight.
So you think NVIDIA worked with TSMC to create a custom 12nm (a modified 16nm) and then had the oversight to forgo 7nm entirely? That's highly illogical.
> If AMD can get its mid-tier 7nm Navi cards out in, say, the first half of next year, assuming they're any good at all, they'll crush Nvidia out of that entire segment
That's assuming the 2060 and 2050 are not price competitive; NVIDIA can adjust their pricing anytime.
The way I see it, AMD is in a bit of a pickle this time. I think they can't afford to be late with accelerated RT the way they were late to AI, or Compute/CUDA, or Tessellation/Geometry performance. They need to step up this time, especially as I think Intel will be following in NVIDIA's footsteps on RT as well. Of course, this all assumes AMD actually gets Navi out by then; it seems like they knew what was up with Global Foundries a while ago...
None of that^ even matters, if nVidia's $1,200 graphics card... can not give us stable 75 ~100+ frames at 4k. Most of us "advanced gamers" will just stick with our g-sync 3840 x 1440 & 3440 x 1440 gaming displays, and our 1080ti's (RTX is a bust)... and just wait 3 months until AMD releases their 7nm Vega 128 freesync2 cards for $1,200... then upgrade to a nice 4k display.
> I suspect Nvidia is holding back. I think there are several huuuge announcements yet to be made about Turing.
I have no idea what those could be.
> I am not worried about $50 here, or there. I see the 2080 as the latest tech, and it demands a premium.
It's undeniable that the price is much higher than what it used to be, and it's not unreasonable for many to choke on that.
I thought it was really funny on RTX launch day to see a bunch of people commenting about how they'll stick it to Nvidia by buying a cheaper 1080 Ti. That will teach them!
> Most of us "advanced gamers" will just stick with our g-sync 3840 x 1440 & 3440 x 1440 gaming displays, and our 1080ti's
As I recall it, you were the one preaching about Vega 64 dominance over the 1080Ti; now you claim to stick to your 1080Ti?
> if nVidia's $1,200 graphics card... can not give us stable 75 ~100+ frames at 4k.
The 2080Ti is going to be 35~45% faster than a 1080Ti, more once OC'ed, and it will also be 60~70% faster than a Vega 64. There isn't anything faster than this card @4K right now. If the 1080Ti came close to running 4K60, the 2080Ti will be closer.
> Nvidia is just using marketing propaganda to sway people away from performance and toward visuals.
Say that to the PC crowd who mod the hell out of their games to push the visuals far beyond the original graphics of the game; say it to the PC crowd who activate MSAA, SSAA, and super resolutions just to eliminate every last drop of jaggies on their screens; say it to the crowd who push for Ultra settings all the time at all cost, and who activate taxing technologies such as advanced shadowing/physics/tessellation/draw distance/hair simulations just to enjoy better scenery. Fact is, visuals are what sells video games and graphics cards, not just performance. If you want 4K60 or 100fps @1080p you can do it with a 1060/580 just like a console; all you need to do is blast those video settings down to low and you are set to go. Heck, the console crowd has been satisfied with a mere 30fps for the vast majority of their time, only upgrading to enjoy better visuals once a new gen comes in.
> Say that to the gamers that start complaining every time there is a graphics downgrade on game release compared to trailers...
And what is this going to change with "raytracing" and "DLSS"? LOL, companies don't like spending man-hours on something they know is gonna sell regardless of quality; no matter what technology it has, in the end they know it will sell well.
> And what is this going to change with "raytracing" and "DLSS"? LOL, companies don't like spending man-hours on something they know is gonna sell regardless of quality...
AFAIK, raytracing requires less effort compared to how things are currently done. So I guess that a hypothetical "raytraced-only" Arkham Knight game would require fewer man-hours than the real one.
Batman: Arkham Knight is a prime example of that, a clusterF**K.
> AFAIK, raytracing requires less effort compared to how things are currently done. So I guess that a hypothetical "raytraced-only" Arkham Knight game would require fewer man-hours than the real one.
We'll get there someday.
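[The "less effort" intuition is easiest to see with shadows. A rasterizer needs a separate shadow-map pass plus bias, resolution, and cascade tuning; a ray tracer just asks whether anything sits between the shaded point and the light. A toy sketch of that one-ray visibility test, with hypothetical types and a made-up scene, not code from any shipping engine:]

    // One occlusion ray per shaded point answers "is the light visible?"
    // directly - no shadow-map render pass, no bias or resolution tuning.
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Sphere { Vec3 center; float radius; };

    // True if the ray from `origin` along unit `dir` hits the sphere before
    // reaching distance `maxT` (i.e. the light is occluded).
    bool occluded(Vec3 origin, Vec3 dir, float maxT, const Sphere& s) {
        Vec3 oc = origin - s.center;
        float b = dot(oc, dir);
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0.0f) return false;       // ray misses the sphere entirely
        float t = -b - std::sqrt(disc);      // nearest intersection distance
        return t > 1e-4f && t < maxT;        // hit between surface and light
    }

    int main() {
        Sphere blocker{{0, 1, 0}, 0.5f};     // occluder hovering above the point
        Vec3 shadedPoint{0, 0, 0};
        Vec3 toLight{0, 1, 0};               // light sits 3 units straight up
        bool inShadow = occluded(shadedPoint, toLight, 3.0f, blocker);
        std::printf("point is %s\n", inShadow ? "in shadow" : "lit");
    }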
Also, it's not clear what this means for voxel-based AO and GI. Is that still a performance win vs RT, or does it also die off?
> Review dates according to videocardz: https://videocardz.com/newz/nvidia-geforce-rtx-2080-reviews-go-live-on-september-17th
> 2080: 9/17
> 2080 Ti: 9/19
> Architecture slide embargo: 9/14
Why are there different days for the non-Ti? What would be the reasoning behind it? I wonder when reviewers actually get cards.