Really?
1 - Dynamic scalability isn't a holy grail that solves everything. You most probably need to set boundaries for resolution and framerate.
Imagine a game that pushes the Anaconda with insane amounts of per-pixel processing and uses a dynamic resolution that ranges between e.g. 1800p30 + upscaling for indoors and 1200p30 + upscaling for outdoors.
Put that game on a Lockhart that has a 3x smaller GPU and the 1200p30 parts (2133*1200 ≈ 2.56 MPixels) will go down to a resolution that is 3x smaller, so 2.56 MPixels / 3 ≈ 853 KPixels, which at 16:9 is something like 1232*693.
Such a rendering resolution, on the typical >= 50" TV sets of today, would probably look like ass. So what the devs must do is make the game for the lowest common denominator to avoid going below 1080p on it, meaning all the IQ juice they wanted to put there for Anaconda now needs to be scaled down for dynamic scalability to work (while the PS5 gets all the juice regardless). That, or they'd need to treat Lockhart as yet another entirely new device to develop for, which defeats the purpose.
Setting boundaries is now extremely difficult? And clarity is suddenly important, on consoles where AF is the first thing tossed out?
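For what it's worth, a quick back-of-the-envelope in Python to sanity-check the scaling math above (a sketch only, assuming pixel throughput scales linearly with GPU size, which glosses over fixed costs and bandwidth):

```python
# Scale a resolution's pixel count by a GPU-size ratio and refit to 16:9.
# Assumption: pixel throughput scales linearly with GPU compute.

def scaled_resolution(width, height, gpu_ratio, aspect=16 / 9):
    """Divide the pixel count by gpu_ratio and refit it to the given aspect."""
    pixels = width * height / gpu_ratio      # e.g. 2133*1200 / 3
    new_h = (pixels / aspect) ** 0.5         # solve (aspect*h)*h = pixels
    new_w = aspect * new_h
    return round(new_w), round(new_h), round(pixels)

w, h, px = scaled_resolution(2133, 1200, gpu_ratio=3)
print(f"{w}x{h} (~{px / 1e3:.0f} KPixels)")  # -> 1232x693 (~853 KPixels)
```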
2 - Geometry performance is now more dependent on compute performance (primitive culling via compute shader on some engines, primitive shaders on Navi). That means if geometry setup takes up a bunch of GPU core time to make sure you can use very large meshes, and many of those, then the percentage of GPU compute needed for geometry is going to be different between the larger GPU and the smaller one. Since the performance hit from geometry will be asymmetrically larger for the smaller GPU, the devs need to cut down geometry for the lowest common denominator or make two sets of geometry, one per console, that they have to put on the disc.
You seem to be making different assumptions with respect to the make-up of the APU - or rather, in which way the GPU portion would have to be cut down.
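If I read the geometry argument right, it boils down to a fixed compute cost eating a larger slice of a smaller GPU's frame budget. A hypothetical illustration (the 4 ms figure and the linear 3x slowdown are assumptions for the sake of the example, not measurements):

```python
# Point 2's asymmetry: geometry/culling work scales with scene complexity
# rather than resolution, so it doesn't shrink when the resolution does.

FRAME_BUDGET_MS = 1000 / 30        # 30 fps frame budget

def geometry_share(geometry_ms_on_big_gpu, gpu_ratio):
    """Fraction of the frame budget eaten by geometry, assuming the same
    compute work simply takes gpu_ratio times longer on the smaller GPU."""
    return geometry_ms_on_big_gpu * gpu_ratio / FRAME_BUDGET_MS

big = geometry_share(4.0, gpu_ratio=1)    # hypothetical 4 ms on Anaconda
small = geometry_share(4.0, gpu_ratio=3)  # same meshes on a 3x smaller GPU
print(f"big GPU: {big:.0%} of frame, small GPU: {small:.0%} of frame")
# -> big GPU: 12% of frame, small GPU: 36% of frame
```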
3 - More work for devs. It doesn't matter how close the architecture is; if it's a console you still need to do all the QA for that hardware. Having the game consistently CtD on a PC when running on GPU XyZ with ZyX software / driver version / whatever won't be newsworthy. But on a console? That's opening a can of worms. Release year is going to be crazy for QA devs because multiplatform cross-gen titles will release for XBone, PS4, XBoneX, PS4 Pro, PS5 and Scarlett. Four of those are fairly known quantities, but the two new ones are not. With Lockhart we'd be looking at three instead.
That's a single example, and a somewhat dubious one given the small updates made to the architecture between Liverpool & Neo. If there are widespread issues for other titles, I'm not aware of them at this time.
Exactly how is "more work" defined? Would it be more or less work to bust the resolution down to 720p or 900p for Durango titles, with the higher-end GPU on PS4 serving as the target console? Would it be a lot more work than Switch?
----------
It may sound like a sarcastic set of questions, but they're not meant to be.
----------
What's odd to me is that we have a fair level of abstraction, and a lot of devs have been involved in multiplatform development for a long time now, so it seems a bit of a dubious complaint when it comes to Xbox & Windows development.
I'll certainly acknowledge extant factors on the manufacturing side that would be far more detrimental to a multi-SKU approach at non-phone costs.
4 - Virtual Reality. VR is extremely demanding on performance: devs must target an absolute minimum of 60 FPS at the headset's maximum resolution, which definitely won't be lower than WMR's current minimum of 1440*1440 + SMAA/TXAA (though preferably SSAA) per eye. It's not like a VR game can dip down to 25 FPS and make people vomit, so performance must be very finely tuned. How would devs handle VR games for the slower console? Have it significantly reduce the immersive experience by lowering rendering resolution and/or AA, and hurt the baseline? Say the VR headset isn't compatible with the new console?
er.... what
VR hasn't crossed my mind (or that of a significant portion of the population), but it seems like a rather premium experience. How do devs cope with 4Base?
The CPU & the way engines are written can be a much bigger limiting factor for sustaining high framerates (along with bandwidth). As for needing a GPU to feed native lol-resolution, well, perhaps a $499 console isn't the solution for a premium experience that requires high-end PC GPUs (whatever price those command).
That said, I don't know enough about the pros/cons for VR development budgets with respect to pushing the requirements (or art budget) so high.
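To put some numbers on how demanding that VR floor actually is, a rough pixel-rate comparison (a sketch only; raw pixel rate ignores latency, reprojection and CPU costs, so treat it as a lower bound):

```python
# Minimum WMR panel resolution at 60 fps (both eyes) versus a flat 1080p30
# baseline, compared purely on pixels pushed per second.

def pixel_rate(width, height, fps, views=1):
    """Pixels per second for a given resolution, framerate and view count."""
    return width * height * fps * views

vr = pixel_rate(1440, 1440, 60, views=2)   # WMR minimum, per eye, two eyes
flat = pixel_rate(1920, 1080, 30)          # a 1080p30 baseline
print(f"VR needs ~{vr / flat:.1f}x the pixel rate of 1080p30")
# -> VR needs ~4.0x the pixel rate of 1080p30
```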