The worst thing about the HairWorks and tessellation-related GameWorks effects is that AMD wouldn't have suffered as much if they had fixed their driver while it was still the 290X series, not after they rebranded it as the 390.
HardOCP tested a couple of games with those GameWorks effects enabled and...
Vega was a decent upgrade over Polaris, but not enough. Its clocks were rated at around 1.5-1.6GHz with an 'up to' qualifier, while Nvidia cards were sustaining at least 1.8GHz, and close to 2GHz actual on non-FE cards.
RDNA3 has been stuck at the same clocks as later RDNA2 chips. The doubled FLOPS made...
Perhaps it's different now, but I played through Portal RTX when it launched and FG was simply too 'floaty' for me. Thankfully, ultra-perf mode looks very decent in this game, so I used that.
Cyberpunk, OTOH, has been way better, though I can still feel the added latency since my base frame-rate...
RTINGS have a new article out on WOLED vs QD-OLED with burn-in updates on the latest panels from both camps. The S95C looks much better than the G3 at 8 months, but they're circumspect about how the S95C's brightness might fall off.
https://www.rtings.com/tv/learn/qd-oled-vs-woled#burn-in
RDNA3 is a return to the status quo where AMD push their flagship chip hard to beat the second-best from Nvidia, which can be clocked comfortably since Nvidia have a bigger chip that is >=50% larger.
I remember that in the very first level, where Psycho leads you up the stairs on the ship, even 8xAA wasn't removing the aliasing on the metal railings and the whole room/chamber just shimmered badly. Can't recall the resolution I was playing at, but very likely <1080p.
Considering how TAA improves...
Running a different aspect ratio, with the resulting black bars, is causing faster burn-in within 700hrs on the QD-OLEDs, since they boost the brightness of the lit 16:9 area on the ultrawide (the black bars drop the average picture level, so ABL lets the lit region run brighter). The 1st-gen panels also have the software issue with compensation cycles.
I also noticed the brightness increase on the S90C when running...
The QD-OLEDs were not properly running their compensation cycles. Sony were especially egregious with this.
If you check the S95C vs G3 images at 4 months here (the G3 started 4 months earlier), the S95C looks better.
https://www.rtings.com/tv/learn/longevity-results-after-10-months
When it comes to L2 cache, the 4090's is a meager 50% larger than the 4070 Ti's (72MB vs 48MB), and a mere 8MB more than the 4080's 64MB.
Not sure how much that affects performance, and Nvidia don't seem to be in the mood to deploy a 4090 Super with the full 96MB of L2 cache.
While reducing system memory usage isn't as valuable as reducing VRAM usage, I was surprised at how many games simply run out of virtual memory even on a 32GB system if the pagefile is shrunk to around 1GB.
Jedi Survivor is the worst I've come across, with virtual memory usage crossing 50GB.
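If anyone wants to check their own games, here's a minimal sketch using psutil; the executable name is just a placeholder (an assumption, not the verified binary name). On Windows, psutil reports a process's commit charge (pagefile-backed allocations) as vms:

```python
# Minimal sketch: check a game's commit charge with psutil (pip install psutil).
# "JediSurvivor.exe" is a placeholder; swap in the actual executable name.
import psutil

GAME = "JediSurvivor.exe"  # assumption, not the verified binary name

for p in psutil.process_iter(["name", "memory_info"]):
    if p.info["name"] == GAME:
        # On Windows, vms is the process's commit charge (pagefile-backed).
        print(f"{GAME}: {p.info['memory_info'].vms / 2**30:.1f} GiB committed")

# The system-wide commit limit is roughly RAM + pagefile, so a ~1GB pagefile
# on a 32GB box caps total commit near 33GB, which a 50GB workload blows past.
vm, sm = psutil.virtual_memory(), psutil.swap_memory()
print(f"commit limit: ~{(vm.total + sm.total) / 2**30:.0f} GiB")
```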
The best I'm hoping for is AMD pushing beyond 3GHz on RDNA4 chips, as was rumored for RDNA3, so that the PS5 Pro can easily clock at 7800 XT levels within a console-level TDP.
8K 120Hz just in time for next-gen? :runaway:
https://www.tweaktown.com/news/94879/tcl-unveils-27-inch-8k-65-120hz-oled-and-57-240hz-mega-pc-gaming-monitor/index.html
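For a sense of scale, a quick back-of-the-envelope on the raw bandwidth 8K 120Hz implies (assuming uncompressed 10-bit RGB and ignoring blanking intervals) shows why DSC would be mandatory even over DP 2.1:

```python
# Back-of-the-envelope: raw bandwidth for 8K 120Hz at 10-bit RGB (30 bits/pixel),
# ignoring blanking and protocol overhead.
w, h, hz, bpp = 7680, 4320, 120, 30
gbps = w * h * hz * bpp / 1e9
print(f"~{gbps:.0f} Gbit/s raw")  # ~119 Gbit/s vs DP 2.1 UHBR20's ~77 Gbit/s payload
```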