Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
Digital Foundry vs DriveClub

Eurogamer have DF's analysis of DriveClub up.

Digital Foundry said:
There was a lot of pressure on Evolution Studios to deliver a state-of-the-art flagship title that defined the PS4's next-generation credentials - a racing game that blurs the line between arcade and simulation while boasting the latest graphical features only possible on higher-end hardware. Despite some teething problems earlier in development, this is a feat that DriveClub manages to accomplish. Indeed, the game has benefitted from a remarkable makeover: from scrappy contender to visually polished technological showcase for PlayStation 4 hardware. The attention to detail and complex effects work is undeniably superb, while the handling model delivers an experience to suit both casual players and the more hardcore driving fans looking for something different from the likes of Gran Turismo or Forza Motorsport.
 
Apparently Ryse makes heavy use of compute shaders. Crytek have 'advised' us that we could see some big performance differences between PC GPU architectures given how well different architectures handle compute. I'm certainly looking forward to seeing those results.

I assume it isn't more so than previous generations? What with it not being custom architecture anymore.
 
Oh, I don't disagree with any of your statements. I am not new here, though; I know the purpose of the thread. I just don't see what it has to do with any dev forcing parity. Everyone knows the power difference. I just don't see why it is considered dev-forced parity if a multiplat title targets 1080p on both consoles. 1080p could have been a main priority for those devs, just as framerate is the priority for other devs. It just seems like any time a third-party game is 1080p on the X1 it is treated as a result of parity instead of a dev design choice. Sure, the X1 is more suited to 900p for demanding games or games that strive for 60fps, but acting like any title released at 1080p is a result of devs pushing for something the X1 is incapable of is wrong. That is the point of my post.

It seems clear from my X1 resolution list you're essentially going to get a mix of 1080, 900, and even some 720/792 on X1. In fact you will get more 1080 than any other resolution, but you will get a mix. There is a good amount of 1080 on the box, considering that many titles like Disney Infinity 2.0, perhaps even Destiny and Diablo 3, are not ultra demanding. Probably the skew is like 50% 1080, 25% 900, 25% 720/792. Very high profile titles though, seemingly tend to be the most demanding and there the resolution mix will shift downward. But for me, it was good to understand there's plenty of 1080P to be found on Xbox One. I was pretty bummed to not even get 1080P after an 8 year gen when Xbox One first came out, but I feel better about it now.

For now it's a strictly ad hoc list, but it should be representative and most major releases are covered: the numbers sit at roughly twenty-six 1080P titles, ten 900-ish titles, and nine 720/792 titles. Going forward I'd hope 720 becomes increasingly rare, and that seems to be the case. All of the major releases this fall seem to be at least 900-ish.

Not sure how to handle Master Chief Collection though.
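The rough "50% / 25% / 25%" skew estimate above can be checked against the list's own tallies (26 / 10 / 9, which are the poster's ad hoc counts, not an exhaustive survey). A quick sketch of the arithmetic:

```python
# Resolution mix from the ad hoc Xbox One list above:
# 26 x 1080p, 10 x ~900p, 9 x 720/792p. These counts come from
# the post itself and are not a complete survey of the library.
counts = {"1080p": 26, "~900p": 10, "720/792p": 9}
total = sum(counts.values())  # 45 titles

for res, n in counts.items():
    print(f"{res}: {n}/{total} = {n / total:.0%}")
# 1080p comes out at 58%, ~900p at 22%, 720/792p at 20% --
# a bit more 1080p-heavy than the 50/25/25 guess.
```

So the eyeballed skew slightly undersells the 1080p share on this particular list.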
 
It seems clear from my X1 resolution list you're essentially going to get a mix of 1080, 900, and even some 720/792 on X1. In fact you will get more 1080 than any other resolution, but you will get a mix.

It's more likely that as time goes on, and as devs learn to push these boxes to their limits, that the XB1 will start seeing more 900p and even 720p games. The only exception would be if devs en-masse started artificially limiting their games on the PS4 version so as to allow the XB1 to keep up.
 
It's more likely that as time goes on, and as devs learn to push these boxes to their limits, that the XB1 will start seeing more 900p and even 720p games. The only exception would be if devs en-masse started artificially limiting their games on the PS4 version so as to allow the XB1 to keep up.

On what evidence is this based?

There are still many multiplatform engines being carried forward from last gen and modified for the new consoles. Many of them with deferred renderers, because that was the best fit to primarily accommodate PS3 architecture. Deferred rendering does not suit the X1's architecture well but seems to pose no problems on PS4.

Just as last gen it took a shift toward deferred rendering for the PS3 to reach its potential, it will take a similar engine shift for the X1 to reach its potential. Meanwhile, just as last gen, the console with the more straightforward setup gets close to its potential a lot quicker in the hardware cycle.

Forza Horizon shows that forward+ seems to be a very good fit right now, but who knows what is around the corner?
 
It's more likely that as time goes on, and as devs learn to push these boxes to their limits, that the XB1 will start seeing more 900p and even 720p games. The only exception would be if devs en-masse started artificially limiting their games on the PS4 version so as to allow the XB1 to keep up.

I hope not. As a PS4 owner, if the primary manifestation of the difference in power between the two systems is slightly higher resolution I'm going to be disappointed.
 
It's more likely that as time goes on, and as devs learn to push these boxes to their limits, that the XB1 will start seeing more 900p and even 720p games. The only exception would be if devs en-masse started artificially limiting their games on the PS4 version so as to allow the XB1 to keep up.


There are other exceptions too. But what you wrote is most likely [path of least resistance] in terms of multiplatforms.

The way I see it, if bandwidth is ultimately a measure of completed (or to-be-completed) work, then in a straight drag race the X1 is still capable of more, with or without CPU contention issues. The number of ROPs or CUs is not relevant in this scenario if you are looking at sustained average bandwidth. If the X1 were truly maxed out it would peak close to 192GB/s and average 140GB/s. That's well above the realistic limits for the PS4, and there is no answer for the PS4 except to wait for the work; there's definitely going to be more idle time for the CUs on the PS4.

Games optimized to sustain an average bandwidth higher than the PS4 is capable of will ultimately be in the Xbox's favour. If we are talking theoretical extremes, of course.
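The drag-race claim above rests on adding the two Xbox One pools together. A back-of-envelope sketch using the post's own ESRAM figures (192GB/s peak, 140GB/s average) alongside the commonly quoted spec-sheet peaks (68GB/s DDR3, 176GB/s GDDR5) makes the caveat visible: the combined X1 number only applies to whatever fits in 32MB of ESRAM, whereas the PS4 figure covers its whole 8GB pool.

```python
# Theoretical/claimed peaks in GB/s -- not measured throughput.
# 192 and 140 are the figures cited in the post above; 68 and 176
# are the commonly quoted spec-sheet numbers for each console.
x1_ddr3_peak = 68.0      # Xbox One DDR3 main memory
x1_esram_peak = 192.0    # ESRAM read+write peak cited above
x1_esram_avg = 140.0     # "realistic average" cited above
ps4_gddr5_peak = 176.0   # PS4 unified GDDR5

print(f"X1 combined peak (ESRAM + DDR3): {x1_ddr3_peak + x1_esram_peak:.0f} GB/s")
print(f"X1 realistic (ESRAM avg + DDR3): {x1_esram_avg + x1_ddr3_peak:.0f} GB/s")
print(f"PS4 peak: {ps4_gddr5_peak:.0f} GB/s")
# The X1 sums exceed 176, but only for data resident in 32 MB of
# ESRAM; PS4's single figure applies to its entire memory pool.
```

Whether real workloads ever sustain those combined numbers is exactly what the thread is arguing about.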
 
Many of them with deferred renderers, because that was the best fit to primarily accommodate PS3 architecture
I disagree. Deferred rendering wasn't introduced to support PS3 (it's not a good fit for XB360) but because it was the best technique to get the results developers were wanting. And from that...

Just as last gen it took a shift toward deferred rendering for the PS3 to reach it's potential, it will take a similar engine shift for the X1 to reach it's potential.
Nope. Devs aren't creating cross-platform engine architectures based on one fractional hardware platform. They'll go with whatever technique gives the best results on the systems they are targeting, balanced by overall business strategy (some may pick an architecture better suited to one platform whereas others may go with a more generally focussed architecture). Most notably, future engines will be based around the new abilities of the GPU and things like compute-based rendering. Machines that aren't a good fit for those new engines (older PCs) will either get a throwback engine (UE3 instead of UE4, say) or a cut-down game.

No individual console has been important enough to influence software architecture since PS2. Perhaps PS3 gets a nod for pushing multithreaded design a bit, but the industry was headed that way anyway.
 
I disagree. Deferred rendering wasn't introduced to support PS3 (it's not a good fit for XB360) but because it was the best technique to get the results developers were wanting.

I recall that the first deferred shaders on consoles came out of Sony studios. Killzone 2 was the poster boy for it, and deferred lighting/shading was used on a great many Sony 1st party games.

Maybe Sony were ahead of the curve with renderers (like they were with AA techniques) and maybe deferred renderers were going to happen anyway, but the net result was widescale adoption of engines that benefitted the PS3 more than the 360 when compared to the more traditional forward renderers.


Devs aren't creating cross-platform engine architectures based on one fractional hardware platform. They'll go with whatever technique gives the best results on the systems they are targeted, balanced by overall business strategy

That's kind of what I was saying. Forward+ (or whatever new techniques crop up) has a better chance of fitting both platforms this generation than deferred renderers, which will fit the PS4 better due to buffer sizes.

Engines will tend to transition from best-fit-last-gen to best-fit-this-gen for multiplatform games. Until that transition is over multiplatform engines will be, on balance, more suited to the PS4's architecture.
 
The fact forward+ is a best fit for all platforms wouldn't mean its adoption was because it was a best fit for XB1. Forward+ is just a better way to render in a lot of cases, and it'll be adopted no matter how well it fits each particular platform due to the similarities in core architectures. I agree with your last line, but not the premise that devs will choose an engine architecture to fit XB1 specifically. No hardware is that significant. If Forward+ is a good fit for contemporary compute-based, DX11+ class GPUs featured in the majority of hardware, it'll be used, and any platform it doesn't run well on will just have to struggle. That's exactly how it was with deferred, with devs adopting the technique to implement their artistic vision regardless of it not being ideal for 360.
 
The fact forward+ is a best fit for all platforms wouldn't mean its adoption was because it was a best fit for XB1. Forward+ is just a better way to render in a lot of cases, and it'll be adopted no matter how well it fits each particular platform due to the similarities in core architectures.

:yep2::yep2::yep2:

If the goal is 4xMSAA, 1080p, high geometry and a high number of dynamic lights, forward+ is a good solution.
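The reason forward+ handles many dynamic lights is its tiled light-culling pass: a compute shader bins lights per screen tile so the forward shading pass only evaluates a short per-tile list. A minimal CPU-side sketch of that binning (all names are illustrative; on GPU this runs as a compute shader over the depth buffer, with a per-tile depth test this toy version omits):

```python
# Toy Forward+ light binning: divide the screen into tiles and give
# each tile the indices of lights whose screen-space bounding circle
# overlaps it. Illustrative only -- real implementations also cull
# against the tile's min/max depth from the depth prepass.
TILE = 16  # tile size in pixels, a common choice

def cull_lights(width, height, lights):
    """lights: list of (x, y, radius) in screen space."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for i, (lx, ly, r) in enumerate(lights):
        # Conservative overlap: clamp the light's bounding box to the
        # tile grid and append its index to every touched tile.
        x0 = max(int((lx - r) // TILE), 0)
        x1 = min(int((lx + r) // TILE), tiles_x - 1)
        y0 = max(int((ly - r) // TILE), 0)
        y1 = min(int((ly + r) // TILE), tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[(tx, ty)].append(i)
    return bins

bins = cull_lights(64, 64, [(8, 8, 4), (40, 40, 20)])
print(len(bins))     # 16 tiles for a 64x64 target
print(bins[(0, 0)])  # only the small light touches the top-left tile
```

Because shading stays forward, hardware MSAA works normally, which is exactly why the 4xMSAA goal above pairs well with this approach.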
 
I agree with your last line, but not the premise that devs will choose an engine architecture to fit XB1 specifically. No hardware is that significant. If Forward+ is a good fit for contemporary compute-based, DX11 + class GPUs featured in the majority of hardware, it'll be used, and any platform it doesn't run well on will just have to struggle.

What about publisher pressure to avoid PR situations about platform parity? These can quickly escalate as we can see, there's definitely value in choosing tech that would help to minimize them. Sure, this is a technology focused forum for the most part, but we shouldn't completely ignore other factors...
 
What about publisher pressure to avoid PR situations about platform parity?
I think that's now a borderline impossibility. Any future game on both consoles which isn't measurably better on PS4 will suffer the same conspiracy theories that Ubisoft brought upon themselves with AC:U.
 
What about publisher pressure to avoid PR situations about platform parity?
If a game's starting focus and basis of its whole technology to release on multiple platforms is 'will it cause an internet riot', either the industry is a lost cause, or the game's likely not going to be any good. ;)

I presume developers start with, "this is a great idea for a game," followed with, "let's make it look like this (concept art)," followed with, "we'll need XYZ tech to make this happen." Quite possibly followed with, "what middleware exists to enable this?"

We're obviously covering a huge range of games with these generalisations, and there'll always be variations. My indie game is on Unity. I've no idea how well it runs particularly on any architecture, and don't care. If one platform doesn't run it well, tough! Nowt I can do about it. Anyone using an off-the-shelf middleware, which extends to some pretty big titles these days, is going to be at the mercy of the middleware engineers, who in turn are likely independent of console politics. I doubt Epic's internal teams are choosing a rendering path for UE based on how much console fanboys grumble about parity or lack of.

Certainly no-one (en masse, anyhow) chose their multiplatform engine tech to support a singular platform last gen. Publishers didn't make devs use forward rendering on XB360 because it's a better fit (and it isn't anyway if your game style calls for zillions of lights), nor even make sure PS3's games ran with decent parity. Lots of lower framerates and resolutions on PS3. There's a lot of noise about parity at the moment, but taken in the scope of the entire industry, we're talking about a couple of AAA games - hardly a representative selection.
 