Digital Foundry Article Technical Discussion [2021]

I don't understand how the Series S can perform worse than the PS4 Pro in any game.
The GPUs have around the same teraflops, but it's RDNA 2 vs GCN, plus a much stronger CPU, more bandwidth... so how can the Series S perform worse than a PS4 Pro? It makes no sense to me. It's just as mysterious as the Series X not showing its GPU advantage.
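For reference, the "around the same teraflops" point roughly checks out. Here's a minimal sanity-check sketch of the FP32 math using the commonly cited CU counts and clocks (figures assumed from public spec sheets; treat them as approximate, not exact):

Code:
#include <cstdio>

// FP32 throughput = CUs * 64 shader lanes * 2 ops per clock (FMA) * clock (GHz) / 1000
double tflops(int computeUnits, double clockGHz) {
    return computeUnits * 64.0 * 2.0 * clockGHz / 1000.0;
}

int main() {
    // Commonly cited configurations; approximate figures.
    std::printf("PS4 Pro  (36 CUs @ 0.911 GHz): %.2f TFLOPS\n", tflops(36, 0.911)); // ~4.2
    std::printf("Series S (20 CUs @ 1.565 GHz): %.2f TFLOPS\n", tflops(20, 1.565)); // ~4.0
    return 0;
}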
It's not mysterious at all; it's likely just a simple thing or setting in the code that they need to change for the Series S (the Series S beats the PS4 Pro in every respect, but the code has been tuned for the PS4 Pro's sweet spot). The differences between the PS5 and the Series X are more readily explained: the PS5 GPU is better than the Series X GPU at certain things. And as every programmer knows, you are only as strong as your weakest spot; every tenth of a millisecond counts.
 
I understand that, but don't they have enough brute force to at least equal performance? Are the Microsoft APIs so much worse? It's hard to believe.
I really doubt it's about anything being better or worse.

It's about resources and time when moving old software, with all its pieces, to a completely new hardware and software chain (while creating a yearly sports title...).

It certainly will be interesting to see how the Xbox versions evolve in future revisions.
If there have been performance pitfalls, they will most likely be ironed out, and we might even see new hardware features being used properly at some point.
 
I understand that, but don't they have enough brute force to at least equal performance? Are the Microsoft APIs so much worse? It's hard to believe.

Although I am certain things are vastly improved, back when the last-generation consoles launched there were instances where Xbox One could be much slower than PS4 with some API calls. The following is from Digital Foundry's interview with 4A Games about what it was like to work with the [then] new consoles.

Oles Shishkovstov said:
Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result. ... On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.
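To illustrate the quote, here's a minimal sketch of the difference between a "thin" submission path that just appends a few DWORDs to a command buffer and a "thick" path that does per-draw bookkeeping first. This is hypothetical code, not the actual GNM or Direct3D command formats; the packet header and the validation steps are placeholders:

Code:
#include <cstdint>
#include <vector>

struct CommandBuffer {
    std::vector<uint32_t> dwords;

    // Thin path: the draw really is just a handful of DWORDs appended to memory
    // that the GPU front-end consumes later. Cost: a few stores.
    void drawIndexedThin(uint32_t indexCount, uint32_t firstIndex) {
        dwords.push_back(0xC0001000u);  // hypothetical packet header / opcode
        dwords.push_back(indexCount);
        dwords.push_back(firstIndex);
    }
};

// Thick path: a driver/API layer that validates state, tracks residency, patches
// resources, etc. before emitting the same packet. Each step costs CPU time on
// every single draw call; that is the "bookkeeping" the quote refers to.
struct ValidatingApi {
    CommandBuffer cb;

    void drawIndexedThick(uint32_t indexCount, uint32_t firstIndex) {
        validatePipelineState();    // placeholder for per-draw validation work
        resolveResourceBindings();  // placeholder for hazard/residency tracking
        cb.drawIndexedThin(indexCount, firstIndex);
    }

    void validatePipelineState() { /* ... */ }
    void resolveResourceBindings() { /* ... */ }
};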
 
Only Battlefield 4 and Titanfall 2 show any noticeable drops below 120 fps. Interesting that the Series X, even in back-compat mode, is offering 2x+ the CPU performance of my overclocked 6700K in Battlefield V.

At last gen console settings and player numbers?

This doesn't add up as here's a 7700k running at 144fps average at max settings:

https://www.techspot.com/amp/review/1754-battlefield-5-cpu-multiplayer-bench/

I'd expect the console CPUs to be faster, but not 2x faster.
 
At last gen console settings and player numbers?

This doesn't add up as here's a 7700k running at 144fps average at max settings:

https://www.techspot.com/amp/review/1754-battlefield-5-cpu-multiplayer-bench/

I'd expect the console CPUs to be faster, but not 2x faster.

I'm pretty sure that's an SDK thing, since BF4 most likely uses the old draw-call-limited DX11 mono-driver SDK, which has notoriously bad CPU performance. What also points in that direction is that the newer BF1 and BFV, which run at higher resolution with better graphics, have a basically perfect framerate.
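A back-of-the-envelope sketch of why draw-call overhead alone can cap the framerate. All the numbers below are made-up assumptions chosen only to show the shape of the problem; they are not measurements of BF4 or of the mono driver:

Code:
#include <cstdio>

int main() {
    // Assumed figures, purely illustrative.
    const double drawCallsPerFrame = 2000.0;  // busy multiplayer scene
    const double highOverheadUs    = 10.0;    // CPU cost per draw call, high-overhead driver path
    const double lowOverheadUs     = 0.5;     // CPU cost per draw call, low-overhead path
    const double budgetMsAt120fps  = 1000.0 / 120.0;

    const double highMs = drawCallsPerFrame * highOverheadUs / 1000.0;
    const double lowMs  = drawCallsPerFrame * lowOverheadUs  / 1000.0;

    // If submission alone blows past the ~8.3 ms budget, the GPU never even matters.
    std::printf("high-overhead path: %.1f ms of CPU submission per frame (budget: %.1f ms)\n",
                highMs, budgetMsAt120fps);
    std::printf("low-overhead path:  %.1f ms of CPU submission per frame\n", lowMs);
    return 0;
}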
 
At last gen console settings and player numbers?

This doesn't add up as here's a 7700k running at 144fps average at max settings:

https://www.techspot.com/amp/review/1754-battlefield-5-cpu-multiplayer-bench/

I'd expect the console CPUs to be faster, but not 2x faster.
They test on the Narvik map. Performance is highly variable depending on which part of the map you're in and how many enemies are there. A 6700K/7700K has absolutely no chance of a 90 fps minimum in many areas of the map, especially in Breakthrough mode. It's a terrible experience. I'm not sure what you mean by last-gen player numbers, but the settings in BFV don't have a big impact on CPU performance.
 
I wonder how much they have to cut back RT on the consoles in order to get playable FPS... in before the "RT does not look worth it to me" crowd.

It looks very nice, but somehow I think Metro doesn't really impress me personally all that much. Technically perhaps, but something's missing. CP2077 still holds the candle as the most impressive-looking title so far, ranking above DS on DF's list (on the highest-end hardware, that is).
 
I wonder how much they have to cut back RT on the consoles in order to get playable FPS... in before the "RT does not look worth it to me" crowd.
I think it's going to come down to what they want to release on consoles. Since they redid all the lighting in the game just for RT, it's a whole new game. If players are looking to increase the frame rate by dropping RT, it would have to be a different game entirely. So I'm not sure what the developers are going to choose for their release here.
 
Seeing the amount of time artists can save with this method is really everything we discussed in theory about the advantages of RT lighting and getting rid of traditional lighting. Man, really happy with this video. Very good potential for the future to see more developers with smaller budgets keep up with the bigger studios. Goodbye, baking.
 
Seeing the amount of time artists can save with this method is really everything we discussed in theory about the advantages of RT lighting and getting rid of traditional lighting.

RT will also present some issues. If your game is aiming for a particular lighting aesthetic then game directors, much like lighting directors in movies, will be fighting realistic light using diffusers and colours that mute reflections.
 
RT will also present some issues. If your game is aiming for a particular lighting aesthetic then game directors, much like lighting directors in movies, will be fighting realistic light using diffusers and colours that mute reflections.
That's fine, and I'm sure there are workarounds for that, as artists are still free to add fixed lighting points back in as required. But they don't need to add them just to fix the lighting in a room.

So start with RTGI, then do whatever you need.

The overall benefits outweigh the costs.
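As a rough sketch of that "start with RTGI, then add what you need" idea (hypothetical renderer-side code, not taken from any particular engine):

Code:
struct Color { float r, g, b; };

// RTGI supplies the physically based bounce lighting; artist-placed fill/accent
// lights are layered on top only where the art direction wants more control,
// instead of being used everywhere to fake bounce light.
Color shadeDiffuse(const Color& rtgiIrradiance, const Color& artistFillLight, const Color& albedo) {
    Color incoming { rtgiIrradiance.r + artistFillLight.r,
                     rtgiIrradiance.g + artistFillLight.g,
                     rtgiIrradiance.b + artistFillLight.b };
    return { incoming.r * albedo.r, incoming.g * albedo.g, incoming.b * albedo.b };
}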
 
Seeing the amount of time artists can save with this method is really everything we discussed in theory about the advantages of RT lighting and getting rid of traditional lighting. Man, really happy with this video. Very good potential for the future to see more developers with smaller budgets keep up with the bigger studios. Goodbye, baking.

Pretty much my take! Really curious how this would run on my RTX2060.
 
ACE Update

Alex is spitting RT bars in this video.

This is next-gen lighting.

Just a refresher...
[image: pyHfJQT.png]
 