Digital Foundry Article Technical Discussion [2024]

Display makers must be lazy and incompetent. It's the only explanation. Yeah, sure, it seems to be a universal thing that none of them can create some flawless display technology even when the nature of capitalistic competition would clearly benefit them to do so, but that's probably because of some engineering brain drain among the entire industry, and not because there are simply inherent issues that they all face and people are simply expecting too much.
 
But this isn't 2014 anymore. HDR (and VRR) and strobing will be a problem. OLEDs can't even do 300 nits full screen, and strobing will massively reduce brightness.
Don't you mean "OLEDs can't even do 300 nits full screen with strobing"? Because I am pretty sure high-end OLEDs can do 600+ nits full screen with no problem.

Spider-Man 2 upgrades for PS5 Pro: ray-traced shadows, ambient occlusion, and improved reflections, plus screen-space global illumination. Bodes well for the upcoming PC version.

Two things to note: reflections are still not on par with the PC versions of previous Spider-Man games, suggesting the PS5 Pro's ray tracing hardware still isn't as powerful as PC ray tracing hardware; also, screen-space global illumination suggests an abundance of compute resources more than ray tracing resources. Or maybe I am reading too much into this.

Secondly, the PS5 Pro has too many graphics options now! Almost like a stripped-down PC version!
 
Display makers must be lazy and incompetent. It's the only explanation. Yeah, sure, it seems to be a universal thing that none of them can create some flawless display technology even when the nature of capitalistic competition would clearly benefit them to do so, but that's probably because of some engineering brain drain among the entire industry, and not because there are simply inherent issues that they all face and people are simply expecting too much.
It actually doesn't. There's only a handful of major manufacturers of high-end display panels for the consumer market - Samsung, LG, Sharp, Japan Display, JOLED, AU Optronics, HannStar, and BOE - and they'd rather not compete at all. Besides BOE and JOLED, all of these companies (or their corporate predecessors) have been fined for price-fixing before. And ultimately, the number of people willing to pay a high premium for better gaming monitors is not large enough to justify significant R&D and capital expenditure.
 
They're not real-time either, are they? So they'd still add a tad of latency, if nothing else?

MiniLED is just another type of backlighting for LCD. MicroLED, like OLED, has no backlight.
I'm aware; it's still good, however. Beyond a certain number of dimming zones, most people cannot tell the difference between self-emissive pixels and zoned backlights.
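For a rough sense of scale (the zone counts below are illustrative, not specs of any real panel), compare zone counts against the pixel count of a 4K panel:

```python
# Back-of-the-envelope: how many pixels each dimming zone covers on a 4K panel.
# The zone counts are hypothetical examples, not any specific monitor or TV.
PIXELS_4K = 3840 * 2160  # ~8.3 million pixels

for zones in (96, 512, 2000, 5000):
    print(f"{zones:>5} zones -> ~{PIXELS_4K // zones:,} pixels per zone")
```

Even at a few thousand zones each zone still covers thousands of pixels; the point above is that past some zone density, that coarseness stops being visible to most people in typical content.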
 
It actually doesn't. There's only a handful of major manufacturers of high-end display panels for the consumer market - Samsung, LG, Sharp, Japan Display, JOLED, AU Optronics, HannStar, and BOE - and they'd rather not compete at all. Besides BOE and JOLED, all of these companies (or their corporate predecessors) have been fined for price-fixing before. And ultimately, the number of people willing to pay a high premium for better gaming monitors is not large enough to justify significant R&D and capital expenditure.

I just looked at LG Display's financials for 2023, and if I'm reading them right, they spent $1 billion USD on R&D in 2023.
 
Don't you mean "OLEDs can't even do 300 nits full screen with strobing"? Because I am pretty sure high-end OLEDs can do 600+ nits full screen with no problem.

I haven't seen any OLED TV or monitor able to do 600 nits full screen. Most are 200-250 nits at best.
In theory it could be a bit brighter in BFI mode, because the limit on brightness is mostly heat, but ASUS's simulated BFI, for example, only allows around 70% brightness. So it's not as bad as a 50% cut, but it still can't maintain 100% brightness.
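As a back-of-the-envelope model (the numbers are illustrative, and it assumes brightness falls roughly in proportion to how long the image is lit each refresh):

```python
# Rough model: time-averaged brightness under strobing/BFI, assuming brightness
# scales with the fraction of each refresh the image is actually lit.
# Peak nits and lit fractions are example values, not measurements of any panel.
def effective_nits(sustained_nits: float, lit_fraction: float) -> float:
    return sustained_nits * lit_fraction

panel = 250  # roughly the full-screen sustained brightness mentioned above
for lit in (1.0, 0.7, 0.5):
    print(f"lit {lit:.0%} of each refresh -> ~{effective_nits(panel, lit):.0f} nits average")
```

So a panel that already tops out around 250 nits full screen ends up somewhere in the 125-175 nit range once strobed, which is the thread's point about HDR and strobing not mixing well.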
 
Worth bearing in mind it was ported by something like a four-person team, so performance optimisation for the port was probably way down the list of priorities.


If you look at the TPU performance numbers for God of War Ragnarok, the performance on Radeon is abysmal.

The RX 6800 is on par with the RTX 3060 Ti, and the RTX 3070 Ti is faster than the RX 6800 XT.

First it was Zen 2 CPUs and now RDNA 2 GPUs (even RDNA 3 performance is bad): the 4080 is 13% faster than the RX 7900 XTX at 1440p (the 4090 is 43% faster than the 7900 XTX, which is huge). It is odd considering these are the same architecture as the PS5.


https://www.techpowerup.com/review/god-of-war-ragnarok-fps-performance-benchmark/5.html
 
(the 4090 is 43% faster than the 7900 XTX, which is huge)
Is it, though, considering the 4090 is more or less 100% more expensive (going by the non-discounted price), and the 4080 is easily more than 13% more expensive than a 7900 XTX (going by the non-discounted price)?
If anything, it's the 4080 which is the bad choice.
Just noticed all the 4080s are out of stock, and the other UK retailer I buy from has none either.
[attached image: 1731490115519.png]
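For what it's worth, here's a quick value check using the 1440p deltas quoted above (7900 XTX = 100, 4080 = 113, 4090 = 143) and US launch MSRPs as stand-in prices; UK street prices will differ, so treat the prices as assumptions:

```python
# Rough perf-per-dollar comparison. Relative performance comes from the TPU
# 1440p figures quoted above; prices are US launch MSRPs, used as placeholders.
cards = {
    "RX 7900 XTX": {"rel_perf": 100, "price_usd": 999},
    "RTX 4080":    {"rel_perf": 113, "price_usd": 1199},
    "RTX 4090":    {"rel_perf": 143, "price_usd": 1599},
}

for name, c in cards.items():
    per_1000 = c["rel_perf"] / c["price_usd"] * 1000
    print(f"{name:12s} perf {c['rel_perf']:>3} | ${c['price_usd']:>4} | {per_1000:.0f} perf per $1000")
```

By that crude measure the 7900 XTX comes out ahead of both GeForces at these assumed prices; actual UK pricing (like the out-of-stock listings above) will shift the picture.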
 
Is it, though, considering the 4090 is more or less 100% more expensive (going by the non-discounted price), and the 4080 is easily more than 13% more expensive than a 7900 XTX (going by the non-discounted price)?
If anything, it's the 4080 which is the bad choice.

Price isn't relevant in a technical comparison, though. I thought the discussion was about game optimization.

I haven't seen any OLED TV or monitor able to do 600 nits full screen. Most are 200-250 nits at best.
In theory it could be a bit brighter in BFI mode, because the limit on brightness is mostly heat, but ASUS's simulated BFI, for example, only allows around 70% brightness. So it's not as bad as a 50% cut, but it still can't maintain 100% brightness.

I will probably go back to LCD for my next TV. My OLED struggles mightily during the day.

On the persistence-blur issue, I'll just have to wait for MicroLED to be a thing.
 
Display makers must be lazy and incompetent. It's the only explanation. Yeah, sure, it seems to be a universal thing that none of them can create some flawless display technology even when the nature of capitalistic competition would clearly benefit them to do so,
To be fair, it might be possible, just not economically viable or aligned with business sense. Capitalistic competition is about maximising money, typically in the shorter term, so investing in SuperNew Tech that'll lead to better TVs might not be worth it if you can keep making SameOldSameOld with fat margins. There might be a startup out there with a transformative display tech that's unable to get funding because the business sense isn't there - the product is too expensive, too niche, too hard to get off the ground.

However, that argument aside, there isn't any magic alternative. ;) Maybe, off the top of my head, something like carbon nanotube electron emitters: instead of a vacuum tube, you have a grid of emitters, one per pixel, plus phosphors. Keep the natural light and decay of the phosphors like a CRT, and do whatever you like with the emission to maximise smoothness.

One thing I think is an absolute certainty, though: consumers will never go back from flat, lightweight displays. If a new tech isn't as thin as current TVs, I highly doubt it'll get anywhere.
 
If you look at the TPU performance numbers for God of War Ragnarok, the performance on Radeon is abysmal.

The RX 6800 is on par with the RTX 3060 Ti, and the RTX 3070 Ti is faster than the RX 6800 XT.

First it was Zen 2 CPUs and now RDNA 2 GPUs (even RDNA 3 performance is bad): the 4080 is 13% faster than the RX 7900 XTX at 1440p (the 4090 is 43% faster than the 7900 XTX, which is huge). It is odd considering these are the same architecture as the PS5.


https://www.techpowerup.com/review/god-of-war-ragnarok-fps-performance-benchmark/5.html
AMD massively underperforms in this title, yeah. The 3080 trounced the 6800 XT by around 20% when they’re usually close.
 
This frame time graph is what @Dictator saw under his bed as a child when he thought there were monsters.

[attached image: YJEdrZD.png (frame-time graph)]
 
lol, that 7800X3D result there is funny, as RANDOMLY, for NO REASON, I have recorded similar behaviour across processors in that game. It is random and has nothing to do with what is happening on screen. DD2 still sucks, IMO.
It's still Mixed on Steam after 8 months. I don't think Capcom is ever fixing it. It seems to be mostly fine on the PS5 Pro, with a much more stable frame rate than anywhere else.
 
It's still Mixed on Steam after 8 months. I don't think Capcom is ever fixing it. It seems to be mostly fine on the PS5 Pro, with a much more stable frame rate than anywhere else.
You have to take a look at the frame-time length there on that 7800X3D spurt. The length is still short enough to be completely hidden by a 33.3 ms VSync update. On a console like the PS5 Pro, we cannot actually know the real frame-time even if it is erratic, because we cannot induce tearing, so VSync hides anything below the VSync time.

Such a frame-time spurt on PS5 Pro could potentially look like a handful of 33.3 ms VSync frames on a frame-time graph. That looks "cleaner" from a graphing perspective.
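To illustrate (the frame times are made up, and the model is deliberately simplified): with a 30 fps VSync cap, every frame that finishes inside the 33.3 ms window gets presented at exactly 33.3 ms, so an erratic-but-sub-33.3 ms spurt flattens out completely on a console frame-time graph.

```python
import math

# Simplified model of what a capture tool sees with VSync: each frame is held
# to the next 33.3 ms boundary, so raw frame times round up to a multiple of
# the VSync interval. The raw times below are invented for illustration.
VSYNC_MS = 33.3

def presented(frame_times_ms):
    return [math.ceil(t / VSYNC_MS) * VSYNC_MS for t in frame_times_ms]

raw = [16.0, 15.5, 30.8, 29.1, 31.5, 16.2]   # erratic spurt in the middle
print(presented(raw))  # every frame lands on 33.3 ms -> the spurt vanishes
```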
 