Digital Foundry Article Technical Discussion [2021]

"The PlayStation 5 edition costs $69.99, and the PlayStation 4 version is $59.99—though you can upgrade a PS4 copy of the base game to the Director's Cut for $19.99"
Sounds very rude charging $20; it should have been only $10.
Though perhaps they are arguing the PS5 version is really $80 (with a $10 discount included), but I don't think anyone would believe that argument.
Ah, yeah, but it's cheaper to just get the standard PS4 version on disc and upgrade it to the DC.
 
"The PlayStation 5 edition costs $69.99, and the PlayStation 4 version is $59.99—though you can upgrade a PS4 copy of the base game to the Director's Cut for $19.99"
Sounds very rude charging $20; it should have been only $10.
Though perhaps they are arguing the PS5 version is really $80 (with a $10 discount included), but I don't think anyone would believe that argument.
I'm not sure I follow your reasoning: the $19.99 in question is the price for the upgrade from an old PS4 copy of GoT to a new PS4 copy of the DC. Essentially this is the price of the Iki Island DLC plus the small adjustments SP has made to camera control, lock-on, etc. If you then want to upgrade from this PS4 DC to the PS5 DC (so, getting on PS5 a boost in resolution, real-time cutscenes, 3D audio, and the PS5 controller's vibration and adaptive triggers), you need to pay another $10.
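
For anyone (like me) losing track of the SKUs, here is a quick back-of-envelope sketch of the upgrade maths as laid out in the post above. The prices are just the ones quoted in this thread, and I'm taking the PS4 DC to PS5 DC step as the $10 mentioned there rather than something I've verified:

```python
# Upgrade maths as described above; prices are the ones quoted in this thread.
PS4_DC_OUTRIGHT  = 59.99   # PS4 Director's Cut bought new
PS5_DC_OUTRIGHT  = 69.99   # PS5 Director's Cut bought new
PS4_TO_PS4_DC    = 19.99   # existing PS4 copy -> PS4 DC (Iki Island + tweaks)
PS4_DC_TO_PS5_DC = 10.00   # PS4 DC -> PS5 DC (the extra $10 per the post above)

# If you already own the PS4 base game:
ps4_dc_via_upgrade = PS4_TO_PS4_DC                      # 19.99
ps5_dc_via_upgrade = PS4_TO_PS4_DC + PS4_DC_TO_PS5_DC   # 29.99

print(f"PS4 DC: ${ps4_dc_via_upgrade:.2f} via upgrade vs ${PS4_DC_OUTRIGHT:.2f} outright")
print(f"PS5 DC: ${ps5_dc_via_upgrade:.2f} via upgrade vs ${PS5_DC_OUTRIGHT:.2f} outright")
```

So if you already own the PS4 game, the full path to the PS5 DC works out to roughly $30 rather than $70, which is presumably the argument behind the tiered pricing.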
 

Crysis 3 on the Switch. Hopefully the materials are revised like in Crysis 2. According to the various statements in the video, Crysis 3 will unfortunately see fewer changes; the materials are probably the same as in the 2013 PC version. Visually, I think Crysis 2 will move more in the direction of Ryse thanks to its PBR.

Crysis 3 looks very good on the Switch. DF has stated in the past that TAA on the Switch often makes the image too soft and that anti-aliasing methods such as FXAA would suit the Switch better. In my opinion the image in Crysis 3 is still good even with TAA. The DOOM footage shown in the video looks much blurrier. FXAA would also flicker a lot with that amount of vegetation and shiny materials. Since Alien Isolation uses TAA on the Switch, it looks better there than on the PlayStation 4, in my view.

For Crysis 3 I always used a temporal AA called TXAA. This uses MSAA as well, which makes it much more resource-intensive.

It's psychological.

Not that it's what I personally want, but I've been advocating for some years now that developers should stop offering any sort of 'extreme' settings that hurt performance too much. Or, if they do, separate them out as 'Future' settings in a dedicated submenu, or make them adjustable through an ini file or something.

Too many gamers judge how 'optimized' a game is by sticking everything on max and seeing how it runs. So developers could get a better reception by taking away many high-cost settings: gamers could put everything on max and then have surprisingly good performance. They would claim the game is 'very optimized', be happy with it, and give good Steam reviews and whatnot.

I feel like Far Cry 5 is a good example of this. Gamers were very happy with the PC version of the game and how well it ran, all because there were no extreme graphics options; all of its highest settings would be more like 'High' settings in another game.
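
Purely as an illustration of that idea, here is a minimal sketch of how the 'hidden extreme settings' approach could look: sensible presets stay in the menu, while the genuinely expensive options only activate via an opt-in flag in an ini file. All of the setting names, sections and values below are invented for the example, not taken from any real game:

```python
# Illustrative only: the menu exposes sane presets, an ini flag unlocks a 'Future' tier.
import configparser

MENU_PRESETS = ("low", "medium", "high", "ultra")        # what the in-game menu exposes
FUTURE_SETTINGS = {"volumetric_clouds_extreme": True,    # hidden, very expensive options
                   "shadow_resolution_8k": True}

def load_settings(path="settings.ini"):
    cfg = configparser.ConfigParser()
    cfg.read(path)
    preset = cfg.get("graphics", "preset", fallback="high")
    if preset not in MENU_PRESETS:
        preset = "high"                                  # clamp unknown values
    active = {"preset": preset}
    # The 'Future' options only apply if the player explicitly opts in via the ini.
    if cfg.getboolean("advanced", "enable_future_settings", fallback=False):
        active.update(FUTURE_SETTINGS)
    return active
```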

Players who only evaluate the degree of optimisation by looking at the highest settings are stupid anyway and would do better to say nothing.

They should make separate graphics settings for more knowledgeable players, but those should still be found in the game menu.
 
I'm not sure I follow your reasoning: the $19.99 in question is the price for the upgrade from an old PS4 copy of GoT to a new PS4 copy of the DC. Essentially this is the price of the Iki Island DLC plus the small adjustments SP has made to camera control, lock-on, etc. If you then want to upgrade from this PS4 DC to the PS5 DC (so, getting on PS5 a boost in resolution, real-time cutscenes, 3D audio, and the PS5 controller's vibration and adaptive triggers), you need to pay another $10.
Well, I'm not really understanding what is what then (and your post isn't really clearing things up).
It's all too complicated for my limited attention span.

https://ghostoftsushima.fandom.com/wiki/Ghost_of_Tsushima#Edition_Variants
Standard Edition
Standard Launch Edition
Digital Deluxe Edition
Special Edition
Collector's Edition
Digital Deluxe Upgrade

What a mess! And they don't even mention PS5 (which adds more variations). Don't worry mate, it's of little importance.
 
Impressive CB implementation (image sharper than 1872p native).
So, well, yes, that 4K CBR is sharper than native 1872p and has about the same sharpness as the native 2016p image on Xbox. It's really the best implementation of CBR done by a multiplatform dev.
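
To put rough numbers on why that is impressive: checkerboarding shades only about half the samples of its output resolution each frame, so the 4K CB mode is actually pushing fewer pixels per frame than native 1872p while still resolving a sharper image. A quick sketch using the resolutions mentioned in this thread, assuming 16:9 output and the usual half-sample CB cost:

```python
# Rough per-frame shaded pixel counts for the modes discussed above.
def pixels(height, aspect=16 / 9):
    width = round(height * aspect)
    return width * height

native_1872p = pixels(1872)        # ~6.23 Mpix shaded per frame
native_2016p = pixels(2016)        # ~7.23 Mpix shaded per frame (Xbox mode cited above)
cb_4k        = pixels(2160) // 2   # ~4.15 Mpix shaded, reconstructed to ~8.29 Mpix

print(f"native 1872p: {native_1872p / 1e6:.2f} Mpix")
print(f"native 2016p: {native_2016p / 1e6:.2f} Mpix")
print(f"4K CB:        {cb_4k / 1e6:.2f} Mpix shaded per frame")
```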
 
The PS4 Pro also had CB enabled (so half the pixels on both).

Curious why they haven't patched in at least 1080p for the Series S, given that they added this patch. Shouldn't be much of a problem.
Are you sure? So the info in the DF video is wrong? Edit: so they went a similar way to Sucker Punch with the Ghost DC, doubling the fps and increasing the resolution from PS4 Pro to PS5 while still using CB. Seeing the results and performance, a very good choice.
 
Is there a reasonable answer for why devs aren't just going for resolution parity between PS5 and XSX and then increasing detail/quality elsewhere with the extra GPU horsepower of the XSX?
 
Is there a reasonable answer for why devs aren't just going for resolution parity between PS5 and XSX and then increasing detail/quality elsewhere with the extra GPU horsepower of the XSX?

Two reasons.
  • Keeping IQ and asset quality the same makes life easier for the developers.
    • Scaling resolution is just changing a simple parameter, requiring little to no work on the part of the developer (see the sketch after this post).
  • Keeping IQ and asset quality the same is generally the fairest way to treat gamers on different consoles, as differences in resolution are unlikely to be noticed from typical living room distances.
    • Without side by side comparisons almost all consumers are unlikely to notice the difference in resolution, especially if any UI or text elements are at native resolution.
      • This was true for PS4/XBO and is even more true for the smaller resolution differences between PS5/XBS-X.
    • There are, of course, exceptions to this, such as consumers who sit much closer than typical living room distances to their TV. But even then, they aren't going to be able to tell without looking up side-by-side comparisons on the internet, or having both consoles side by side in their living room.
Regards,
SB
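
As a purely hypothetical illustration of the "simple parameter" point above: per-platform tuning can boil down to one render-scale number while every quality setting ships identical across consoles. The platform names are real, but the scale values and setting names below are invented for the example:

```python
# Invented example: identical quality settings, one render-scale knob per platform.
QUALITY = {"texture_quality": "high", "shadow_quality": "high", "draw_distance": 1.0}

RENDER_SCALE = {       # the one value that actually differs between consoles
    "ps5":  0.90,      # e.g. ~1944p from a 2160p target
    "xbsx": 1.00,      # full 2160p target
    "xbss": 0.50,      # e.g. ~1080p target
}

def resolution_for(platform, target_height=2160, aspect=16 / 9):
    height = round(target_height * RENDER_SCALE[platform])
    return round(height * aspect), height

for platform in RENDER_SCALE:
    print(platform, resolution_for(platform), QUALITY)
```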
 
Is there a reasonable answer for why devs aren't just going for resolution parity between PS5 and XSX and then increasing detail/quality elsewhere with the extra GPU horsepower of the XSX?
It requires a lot more work; changing the resolution requires hardly any effort. Though it does often seem as if the Series X tries to do too much and its framerate suffers, like in the Tomb Raider example above. Why did they think letting the Series X drop down to 40fps, where the PS5 is nearly locked at 60fps, was a good improvement? But maybe it's not so simple; perhaps it's the variable clocks in the PS5 that enable the better framerates, and dropping the resolution on the Series X wouldn't improve things much (I suspect it would, though).
Perhaps in the future the Series X gets an OS update that enables variable clocks (I think all AMD GPUs have this, right, so it must be possible?).
Another game where it's like, WTF is up with the Series S: 900p is a drop of more than 2.5x in pixel count from the 1440p this console was marketed around.

Edit: actually, I had another quick look and the PS5 does nearly drop to 50fps at one point.
 
We can be certain the XSX won't get any downclocking or trading of power between components, since MS promised they wouldn't advertise 12TF that could turn out lower when things get hammered.

In the PC space, a GPU has its rated performance metric at a given TF number. A GPU like that boosts upwards, not down because the CPU takes too much.
GPUs there clock down when the power isn't needed, not because the CPU needs it.

And this shows, as XSX games tend to perform around 15 to 18% better than the PS5 versions. Variable clocks don't really have a home in the console space, where fixed clocks deliver a stated performance level, whereas variable clocks need more developer input to account for a performance level between 9.2 and 10TF.
 
I really don't think we've seen any credible evidence yet that the variable clocks were a bad idea. No leaked developer complaints, no consistent underperformance, etc. I get being suspicious of it, and there's a chance it will turn out badly as more and more fully next-gen games start coming out, but for now I think the engineers at AMD and Sony deserve the benefit of the doubt.
 
I really don't think we've seen any credible evidence yet that the variable clocks were a bad idea.
Maybe I'm misunderstanding what variable clocks are. Isn't it just the MHz of the GPU ramping up and down based on workload?
My current GPU does this, my last one as well, and the one before that. Maybe I'm talking out of my ass, but hasn't this been standard for many years? I'm not a hardware nerd, so perhaps it isn't, but if there were big problems with variable clock speeds surely people would have noticed by now?
One good thing with a fixed clock rate I can see is that you are less likely to have the fan change speed, which can be an annoying sound, but apart from that what else could be the issue?
Hmmm, maybe the life of the device is shorter because of the changing speeds? (Just throwing shit out there.)
 
Maybe I'm misunderstanding what variable clocks are. Isn't it just the MHz of the GPU ramping up and down based on workload?
Just throwing shit

The difference is the stated base performance. Let's say a 2080 Ti (I have one) has a claimed 13.5TF in a given performance metric; that is its base clock, the performance you're always getting if all things are equal (correct PSU, airflow, matching CPU and other components, of course). The GPU then uses its game clock to boost upwards if thermals, load, etc. allow for it, resulting in higher performance metrics, not lower ones whenever the CPU is eating too much of the load.

The PS5, on the other hand, gives you its spec of 10.2TF at its max, when the CPU isn't eating from its power budget. The tech is much the same (especially as in laptops with SmartShift and Max-Q), but reversed: it starts from the max performance metric and is allowed to go down, with the CPU and GPU trading power load with each other. That's not exactly what's happening in the PC space, and probably won't be for gaming GPUs; it never has been.
I can't imagine Nvidia or AMD stating 13TF at the 'game clock' (the max metric) and never stating what the lowest clocks are. In the PC space we talk about base clocks and game/boost clocks for whenever the situation allows for it. What Sony has done is claim the 'game clock' (boost), i.e. the max potential, and downclock from there when the CPU needs it, or the other way around. Also, downclocking the CPU because the GPU needs more juice isn't what happens in my Zen 2 system.
Again, it's not exactly the same thing, as both have different starting points.
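
For reference, the TF numbers being argued about all fall straight out of shader count times clock, so the PS5's variable clock maps directly onto the "9.2 to 10TF" range mentioned above (10.28TF at the full 2.23GHz boost). A quick sketch using the publicly stated CU counts and clocks; the 2.0GHz point is just an example of a lower clock, not a confirmed floor:

```python
# FP32 TFLOPS = CUs * 64 ALUs per CU * 2 ops per clock * clock (GHz) / 1000
def tflops(cus, clock_ghz, alus_per_cu=64):
    return cus * alus_per_cu * 2 * clock_ghz / 1000

xsx_fixed   = tflops(52, 1.825)   # ~12.15 TF at the fixed clock
ps5_max     = tflops(36, 2.23)    # ~10.28 TF at the advertised max boost
ps5_at_2ghz = tflops(36, 2.0)     # ~9.22 TF, i.e. the '9.2' figure quoted above
                                  # (2.0 GHz is an assumed example, not a known floor)

print(f"XSX: {xsx_fixed:.2f} TF | PS5 max: {ps5_max:.2f} TF | PS5 @ 2.0GHz: {ps5_at_2ghz:.2f} TF")
```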
 
Did they say anywhere in the video whether the PS5 and Xbox Series versions of Crysis 2 and 3 are backwards-compatible versions or native apps? I'd assume backwards-compatible enhanced, but...
 
We can be certain the XSX won't get any downclocking or trading of power between components, since MS promised they wouldn't advertise 12TF that could turn out lower when things get hammered.

In the PC space, a GPU has its rated performance metric at a given TF number. A GPU like that boosts upwards, not down because the CPU takes too much.
GPUs there clock down when the power isn't needed, not because the CPU needs it.

And this shows, as XSX games tend to perform around 15 to 18% better than the PS5 versions. Variable clocks don't really have a home in the console space, where fixed clocks deliver a stated performance level, whereas variable clocks need more developer input to account for a performance level between 9.2 and 10TF.

XSX has ~17% more compute and 25% more memory bandwidth, so even if PS5 were maintaining its maximum clocks at all times it would still be ~17-25% slower on paper, which is well in line with your 15-18% figure.

The fact that PS5 is competing as well as it is in multiplats goes some way to proving it's not having issues maintaining and running at its maximum allowed clocks all the time.

What also needs to be kept in mind is that certain aspects of the GPU pipeline run ~17% faster on PS5 due to its higher clocks, so it's not as simple as saying XSX's GPU is faster across the board than PS5's.

So as it stands there's nothing at all to show that variable clocks have no place in a console, and you could even argue that currently PS5 is showing they can work very well.

What would be interesting to know is the lowest clock speed PS5's GPU is allowed to drop to during extreme loads.
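
Here is roughly where those percentages come from, using the publicly stated peak specs. Note the exact ratios land a touch off the rounded figures above: the PS5 clock advantage over XSX's fixed 1.825GHz works out closer to ~22% than ~17%:

```python
# On-paper gaps between the two consoles from the publicly stated peak specs.
xsx_tf, ps5_tf = 12.15, 10.28   # FP32 TFLOPS (PS5 at its max boost clock)
xsx_bw, ps5_bw = 560, 448       # GB/s (XSX figure is for its faster 10 GB pool)

compute_gap   = (xsx_tf / ps5_tf - 1) * 100   # ~18% more compute for XSX
bandwidth_gap = (xsx_bw / ps5_bw - 1) * 100   # 25% more bandwidth for XSX
clock_gap     = (2.23 / 1.825 - 1) * 100      # ~22% higher GPU clock for PS5

print(f"XSX compute: +{compute_gap:.0f}%  XSX bandwidth: +{bandwidth_gap:.0f}%  "
      f"PS5 clock: +{clock_gap:.0f}%")
```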
 