Digital Foundry Article Technical Discussion [2017]

Status
Not open for further replies.
First, I don't think it's at all true that PS4, Pro and 1X would run faster deferred. If that were true, they'd all be running deferred, but they're not.

Not necessarily true. It must be a compromise between performance and visual quality, or their artistic goals.
 
Expect to see more of this type of thing going forward now that the 1X is released. You'll always market the best-looking variants (Pro/1X), so the non-marketed ones will have less time dedicated to optimizing them. Resolution on the XBO will likely suffer if you don't want to deal with esram.

This is contrary to past generations, where games get better technically as developers learn to eke more performance out of the hardware, combined with general OS and API improvements. Look at the CODs and GTA games for evidence, or Halo. Devs aren't about to shovel shit onto the console SKU with the vastly larger market, because that is where most sales will be (or lost).
 
Not necessarily true. It must be a compromise between performance and visual quality, or their artistic goals.

What compromise do you see in the xb1 version that you can pin down to that one difference?

Isn't resolution the only difference between the two visually?
 
This is contrary to past generations, where games get better technically as developers learn to eke more performance out of the hardware, combined with general OS and API improvements. Look at the CODs and GTA games for evidence, or Halo. Devs aren't about to shovel shit onto the console SKU with the vastly larger market, because that is where most sales will be (or lost).

I don't think anyone's going to shovel shit, but I do think that X1 will compete more for consideration in terms of software design, and optimisation at crunch, going forward.

I think PS3 suffered from this a little towards the end - games had almost caught up with the 360, but as PS3 games sales dropped off faster than 360 in core markets like America (perhaps due to the huge success of the PS4), the PS3 in some cases started to lose a little ground, relatively speaking.

For X1 I think it'll simply mean resolution and perhaps frame rates dropping a little relative to PS4.
 
We don't use traditional forward rendering anymore; at least, not the same type as represented in that article. We're well into Forward+ or Clustered Forward+, Tiled Forward, 2.5+, etc.
Modern forward renderers can all handle a very high number of light sources (if not quite as high as deferred), something that traditional forward could not.
Doom 2016 is forward, as is the Forza series, as is The Order: 1886... lots of titles are forward.

This quick article might highlight some of the challenges you may face when choosing one over the other; at the least, it's more representative of modern pipeline challenges.

http://www.yosoygames.com.ar/wp/2016/11/clustered-forward-vs-deferred-shading/
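To make the tiled/clustered idea concrete, here is a minimal sketch of the light-binning step those renderers share. This is a generic illustration, not code from any title mentioned in the thread; the screen size, tile size, and light representation are all assumptions:

```python
# Illustrative sketch: tiled forward shading bins lights into
# screen-space tiles once per frame, so each pixel only loops over
# the lights touching its tile instead of every light in the scene.

SCREEN_W, SCREEN_H = 1920, 1080
TILE = 16  # 16x16-pixel tiles, a common (assumed) choice

def tiles_for_light(cx, cy, radius):
    """Return the clamped range of tile coordinates covered by a
    light's screen-space bounding circle."""
    x0 = max(int((cx - radius) // TILE), 0)
    x1 = min(int((cx + radius) // TILE), SCREEN_W // TILE - 1)
    y0 = max(int((cy - radius) // TILE), 0)
    y1 = min(int((cy + radius) // TILE), SCREEN_H // TILE - 1)
    return x0, x1, y0, y1

def build_light_bins(lights):
    """lights: list of (cx, cy, radius) in screen space.
    Returns a dict mapping (tile_x, tile_y) -> list of light indices."""
    bins = {}
    for i, (cx, cy, r) in enumerate(lights):
        x0, x1, y0, y1 = tiles_for_light(cx, cy, r)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

# A pixel at (px, py) then shades only bins.get((px // TILE, py // TILE), []),
# which is how modern forward renderers scale to high light counts.
```

A clustered renderer extends the same binning into depth slices, but the principle is identical.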
 
For X1 I think it'll simply mean resolution and perhaps frame rates dropping a little relative to PS4.

The difference in DOOM was already quite large.

The PS4/XB1 have almost the same architecture. The PS3/360 were completely different.

Last game tested by DF with old gen consoles : http://www.eurogamer.net/articles/digitalfoundry-2015-metal-gear-solid-5-phantom-pain-face-off

I think he was hoping for some concrete examples, and not some 'tips' in the form of a link to a general article on deferred vs forward renderers.

Maybe, but my point is that forward rendering may have fit their artistic goals in Wolfenstein 2 better. This article explains some of the visual issues with deferred rendering.
 
This is contrary to past generations...
Past generations haven't had mid-gen refreshes to provide a different marketing face for the same game. Not saying I agree with Iroboto, but there is a consideration there - do you invest in the main console, or in the flagship console from which you'll be making all your PR materials? The latter means better videos and screenshots and more 'oo, that's lovely' from gamers who then buy the game and get the inferior (but may not even notice?) cheap version.

All Sony's videos seem to be PS4Pro now. That's less of a concern because it's the same architecture running the same optimisations, save perhaps a little double-rate FP16, so devs can optimise for PS4 and PS4Pro at the same time. For XB1, there's more of a schism and more need to either choose or invest in both. If XBOX optimisation ties in with PC and PS4 optimisation, there's definitely a case that XB1 will miss out as not worth the effort.
 
Not saying I agree with Iroboto,
I think Scott did a better job at trying to explain what I said, and yeah, I agree with him. The XBO's architecture is starting to show its age. Unless they've developed some way to get around the inherent disadvantages of esram combined with its ALU deficit, it's hard to see the XBO keeping up with the other three as they continue to push forward. At best it will get a stripped-down, bare-bones version, or 720p is going to become the reality for the XBO that @Shifty Geezer talked about in another thread so many years ago.
 
The difference in DOOM was already quite large

And it just got a little larger!

The PS4/XB1 have almost the same architecture. The PS3/360 were completely different

Despite their similar architecture PS4 and X1 won't have the same performance profile as you scale render targets. X1 will hit a wall as high bandwidth consumers spill increasingly from super fast esram to decidedly less fast DDR3.

Mitigating this requires extra work - as evidenced by Wolf 2 - but it still won't be as optimal as designing the game around esram from the ground up.

In the absence of this extra work, id would have had to drop resolution further until they got to an acceptable level of performance. Their optimisations saved them 2 ms on the X1. If purely limited by pixel shading (ALU and not BW), that's at least a 12% drop in resolution to equal those last-minute optimisations.
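The 12% figure holds up as back-of-envelope arithmetic, assuming a 60 fps target and pixel-shading cost that scales linearly with pixel count (both assumptions, but standard ones):

```python
# Sanity check: if pixel shading scales linearly with resolution,
# saving 2 ms of a 60 fps frame budget is equivalent to rendering
# ~12% fewer pixels.

frame_budget_ms = 1000.0 / 60.0   # ~16.67 ms per frame at 60 fps
saved_ms = 2.0

pixel_fraction = saved_ms / frame_budget_ms
print(f"{pixel_fraction:.1%}")    # -> 12.0%
```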

It will be increasingly less important for many developers to spend that time on X1 as its importance shrinks. Fortunately, if you shrink the resolution enough, the render targets will start to crawl back into the esram. So there is at least an easy, if crude, workaround.
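To make the "crawl back into esram" point concrete: the 32 MB ESRAM capacity is real, but the render-target layout below (a 64-bit HDR colour target plus a 32-bit depth buffer resident at once) is an assumption for the example, not Wolfenstein 2's actual setup:

```python
# Rough sizing: do the frame's main render targets fit in the
# Xbox One's 32 MB of ESRAM at a given resolution?

ESRAM_BYTES = 32 * 1024 * 1024
BYTES_PER_PIXEL = 8 + 4  # assumed 64-bit HDR colour + 32-bit depth

def fits(width, height):
    """True if the assumed render targets fit entirely in ESRAM."""
    return width * height * BYTES_PER_PIXEL <= ESRAM_BYTES

print(fits(1920, 1080))  # True  -> ~23.7 MB, fits
print(fits(2560, 1440))  # False -> ~42.2 MB, spills to DDR3
```

Dropping resolution shrinks every target quadratically, which is why a modest resolution cut can pull the high-bandwidth consumers back inside ESRAM.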
 

Wolfenstein 2 is not a traditional forward renderer. Lots of async compute, and compute in general. id software is not going to pick a lesser renderer for PS4/PS4 Pro and PC, which make up the vast majority of their customers. Their forward renderer is the default, highly optimized and high quality. There is no way that you could consider Xbox One the "base" platform for that title, when it's using a different renderer than the rest. There's also no way that they'd ship on PS4 or PS4 Pro without using the best option for that platform, which is the forward renderer. So that's really all there is to it.

 
Past generations haven't had mid-gen refreshes to provide a different marketing face for the same game. Not saying I agree with Iroboto, but there is a consideration there - do you invest in the main console, or in the flagship console from which you'll be making all your PR materials?

No, you'd be idiotic to focus your technical efforts on marketing to a tiny, insignificant market. Game reviews have a demonstrable impact on sales, and poor reviews because of technical issues will impact revenue from the larger install base. And from a technical perspective, it's always easier to scale up rather than down. You can always scale up; sometimes you just can't scale down, because there just isn't sufficient performance. Then you're redesigning and rebalancing, and that takes time, and time costs money.

You're forgetting that publishers are as greedy as developers are lazy.
 
This thread is putting pressure on me to buy an Xbox One X. I was under the impression that the original plan around the X1X was to scale up X1 games, and taking into account the number of enhanced games announced, that seemed to be the path forward.

Now, this thread is painting a different scenario for me and I would like to think that Microsoft somehow covered the scenario that is being painted here.

What's the real impact of optimizing code for the ESRAM and then just running that optimized code on the X1X? I remember that someone from Microsoft said it was really small due to the bandwidth from GDDR5, but how accurate is that comment?
 
Probably very. There's nothing special about the ESRAM meaning anything that works with it will work from VRAM. The only potential problem would be 'lower latency' but it seems that was never a significant feature of the ESRAM and doesn't realistically impact graphics work.
 
but it still won't be as optimal as designing the game around esram from the ground up.

I don't think so. Those optimizations are good for every system and developers try to push the consoles as much as they can. So, a game optimized for the PS4 is, in a certain way, optimized for the XB1 by default.

Any optimization to save as much bandwidth as you can is good for any system.

id software is not going to pick a lesser renderer for PS4/PS4 Pro and PC, which make up the vast majority of their customers.

I never said that. I said that they probably chose the best compromise between their artistic goals and performance. Forward rendering was the best choice if the hardware was strong enough.

Deferred rendering is faster but doesn't exactly meet their visual criteria. But it was the best compromise for the XB1.
 
Performance improvements with deferred are not universal across all gpu families, at least from what I've read. What exactly is the visual quality tradeoff for deferred rendering in Wolfenstein 2?

I don't know... developers often have their own standards. Anyway, this option is there to improve performance. The gain may depend on what type of GPU you have, but it's there to help performance...

There is no logical reason to believe that deferred would only be faster on XB1... it makes no sense to me.
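One reason the gain does vary by GPU: deferred trades shading work for bandwidth, writing a fat G-buffer and re-reading it in the lighting pass, so the win depends on a GPU's bandwidth/ALU balance. A hypothetical per-frame traffic comparison (the buffer formats below are illustrative assumptions, not id's actual layout):

```python
# Hypothetical bandwidth comparison at 1080p. Deferred pays to write
# and re-read a G-buffer; forward writes a single colour target but
# re-evaluates material shading per light instead.

W, H = 1920, 1080
MB = 1024 * 1024

# Assumed G-buffer: albedo, normals, material params, depth (4+4+4+4 B)
gbuffer_bytes_per_pixel = 16
deferred_traffic = W * H * gbuffer_bytes_per_pixel * 2  # write + read back

# Forward: a single assumed 64-bit HDR colour target write
forward_traffic = W * H * 8

print(f"deferred G-buffer traffic: {deferred_traffic / MB:.1f} MB/frame")
print(f"forward colour traffic:   {forward_traffic / MB:.1f} MB/frame")
```

On a bandwidth-starved GPU the extra G-buffer traffic can eat the ALU savings, which is why "deferred is faster" isn't universal across GPU families.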
 