Phone boo boo. Fixed, thanks.

2070ti?
Yeah, the only reason I can think of myself, with a limited understanding of salvage, is that the RT and/or Tensor core structures don't support disabling portions.

Yeah, but what's the point? It has the same bus width as TU104 and 75% of the SMs. Are yields so wacky on 12nm that it made sense to spin a separate chip instead of salvaging TU104 dies? Maybe yields are pretty good and they didn't want to have to sell an expensive-to-manufacture TU104 chip with 1/4 of the units disabled.
I could understand if RTX 2060 used a salvaged TU106 but if there’s really a TU116 the whole setup seems strange.
NVIDIA RTX 2080 & 2080Ti spotted in FINAL FANTASY XV Benchmark
http://benchmark.finalfantasyxv.com/result/
This graphics card is not released yet.
Data on this page may change in the future.
Yep, posted back here.

Videocardz has supposedly published nVidia's review guide with comparison numbers.
https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled
3840x2160 Standard FFXV benchmark (no gameworks)
What I assume to be an FE 2080 Ti is about 12% faster at 2160p compared against an overclocked 1080 Ti.
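A "~12% faster" figure like this comes from the ratio of the two benchmark scores. A minimal sketch, with made-up scores purely for illustration (the actual review-guide numbers are not reproduced here):

```python
# Hypothetical FFXV benchmark scores; both values are assumptions,
# not the leaked figures.
def relative_speedup(score_a: float, score_b: float) -> float:
    """Percent by which card A outscores card B."""
    return (score_a / score_b - 1.0) * 100.0

# e.g. 6000 vs 5357 points works out to roughly a 12% lead
print(round(relative_speedup(6000, 5357), 1))
```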
Seems to me that disabling Gameworks would shift the bottleneck away from the GPU for this title - no? I mean, if consoles can run the game at "4k" without Gameworks effects then how demanding can it be? Since this benchmark doesn't provide average FPS, let alone frame times, it's hard to know what kind of performance difference is observable between the tested GPUs.
It's all GPU bound. This game is very demanding on the GPU, especially at 3840x2160, there's no way the 2080 or 2080 Ti are CPU bound at those settings.
"Standard quality" as indicated by the benchmark is equivalent to a "normal" preset (i.e. low detail by the standards of most PC gamers). This is exactly the type of settings preset that is used in CPU bottleneck testing, while keeping things relatively comparable to real-world scenarios (as opposed to a "low" preset). 4k or not, if we have the same CPU horsepower as yesterday but add say 50% more GPU power, then turn off GPU-intensive features, you have a recipe for a CPU bottleneck. Sure, it's *less* of a CPU bottleneck than at 1080p, but there's no getting around the fact that you're asking the GPU to do less work. Naturally, this presents a scenario wherein the CPU is being asked to feed the GPU data more quickly in order to generate frames quicker. At some point the CPU won't be able to keep up.
What are you on about? This benchmark is not CPU bound even at 1080p.
If my explanation escapes your grasp I won't repeat myself. Far better things to do with my time.
The 19th is review day. It was the 17th for the 2080, but they changed it to the 19th for both cards.

Nvidia's NDA for reviews of the RTX 2080 ends when, tomorrow?