Digital Foundry Article Technical Discussion Archive [2013]

Status
Not open for further replies.
They are both essentially identical, other than one having a 50% higher resolution and a marginally better frame rate, while the other has a very heavy sharpening filter applied. The remaining differences were caused by problems with DF's capture equipment, will be sorted by launch (the missing AO on XB1), or are simply down to the playthroughs being different.

So while I respect your opinion, it's still just that. An unbiased analysis would suggest the PS4 has quite a significant advantage.

An unbiased analysis of the tech would conclude that (ignoring a variety of other factors). That's not the same thing as an analysis of how the game looks visually. Tech graphics and visuals aren't synonymous. The utility of high-end tech graphics is ONLY to provide the end user with as nice visuals as possible. When pixels are (evidently) no longer the limiting factor for IQ and clarity, other things need to be considered as well (i.e. AA, AF, no tearing, contrast ratios, color spectrum). Do those not count for some reason? The objective answer should be that they DO count. We just aren't proficient at quantifying them in our common discourse on the topic.

IQ has been conflated with resolution for years now by common forumites. It's one thing if the assets are held back by the frame resolution. It's quite another when that's not the case, though, especially in light of major advantages one version might have in those other areas I noted.
 
PS4 has display planes like XB1. It's unclear whether devs have access to the display planes for the HUD, or if the second plane is reserved for the OS. Without that knowledge, there's an argument that the scaler in that component is upscaling the game and compositing it with the UI, and is doing a poor job.

PS4 only has 2 (1 for OS, 1 for games). It's not using 2 for games (and thus not 1 for game world; 1 for HUD).
 
While it's possible, I wouldn't expect any overhead from virtualizing the GPU to manifest itself as an effect on fill rate or shader performance. It's much more likely to add overhead to draw calls, which would manifest itself as a CPU-time issue where reducing resolution wouldn't help.
Even there, given the hardware has multiple ring buffers, there is no real need for virtualization to introduce overhead.
I would surmise (having never used the hardware) that either the issue is the way the engines interact with the ESRAM, resulting in a significant bandwidth issue, or it's just the limitations of the 16 ROPs.
If it's the former, it's likely that some of the deficit can be reclaimed as engines are targeted at the hardware; if it's the latter, it's likely things won't improve dramatically.
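For a sense of scale on the ROP question, here's a quick peak fill-rate comparison using the publicly reported figures from the time (XB1: 16 ROPs at 853 MHz; PS4: 32 ROPs at 800 MHz). Real throughput also depends on bandwidth and blend modes, so treat these as theoretical ceilings only.

```python
# Peak theoretical fill rate: one pixel per ROP per clock.
# ROP counts and clocks are the publicly reported 2013 figures.

def peak_fill_gpix(rops, clock_mhz):
    """Peak fill rate in gigapixels per second."""
    return rops * clock_mhz * 1e6 / 1e9

xb1 = peak_fill_gpix(16, 853)   # ~13.6 Gpix/s
ps4 = peak_fill_gpix(32, 800)   # 25.6 Gpix/s
print(f"XB1: {xb1:.1f} Gpix/s vs PS4: {ps4:.1f} Gpix/s")
```

If a renderer is genuinely ROP-bound, that near-2x gap is exactly the kind of deficit that resolution drops would paper over.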

If the ROPs are the limiting factor for, say, BF4, it really isn't a good sign for Xbox One in the long run, especially as COD appears to be less taxing on a graphical level and we already know that it is also 720p. I think it's quite obvious that fitting a 1080p render into the 32MB of ESRAM has caused developers massive problems, and the fact that deadlines were so tight meant that the drop to 720p was really the only option at hand.
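A rough sketch of why a 1080p render strains 32MB of ESRAM: assuming a typical deferred setup with four render targets plus a 32-bit depth buffer (layouts vary a lot by engine, so this is purely illustrative), the buffers alone overflow at 1080p but fit at 720p.

```python
# Illustrative G-buffer sizing: 4 render targets + depth, 4 bytes/pixel each.
# Actual engine layouts differ; this just shows the order of magnitude.

MB = 1024 * 1024

def gbuffer_mb(width, height, targets=4, bytes_per_pixel=4, depth_bpp=4):
    """Total size in MB of the color targets plus depth buffer."""
    return width * height * (targets * bytes_per_pixel + depth_bpp) / MB

print(f"1080p: {gbuffer_mb(1920, 1080):.1f} MB")  # ~39.6 MB -> exceeds 32MB
print(f" 720p: {gbuffer_mb(1280, 720):.1f} MB")   # ~17.6 MB -> fits
```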
 
The interesting part about all of this for me (a 360 gamer) is that the PS4 isn't just getting minor increases over the Xbox One, it's getting massive differences. We've seen that BF4 renders >50% more pixels on PS4, and now it looks like COD is >100%. That's huge!! I honestly think Microsoft must be happy that these devs are only increasing the resolution, because if they had the resources and intent they'd be able to keep the PS4 game at 720p and increase the graphical complexity instead, so that it's far and away better on PS4, to the point where anyone would be able to see the difference.
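The ">50%" and ">100%" figures fall straight out of the pixel counts, assuming the render resolutions reported at the time (BF4: 1600x900 on PS4 vs 1280x720 on XB1; COD: 1920x1080 vs 1280x720):

```python
# Pixel-count comparison behind the ">50%" and ">100%" claims.
# Resolutions are the figures reported for BF4 and COD at the time.

def pixel_ratio(w1, h1, w2, h2):
    """Fraction of additional pixels the first mode renders over the second."""
    return (w1 * h1) / (w2 * h2) - 1.0

bf4 = pixel_ratio(1600, 900, 1280, 720)   # ~0.56 -> ">50% more pixels"
cod = pixel_ratio(1920, 1080, 1280, 720)  # 1.25  -> ">100% more pixels"
print(f"BF4: {bf4:.0%} more pixels; COD: {cod:.0%} more pixels")
```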
 
*AHEM* Can we PLEASE keep on topic to what the Digital Foundry articles mean from a Technical perspective?

That means keeping the business discussions of both companies out of this thread. There's already an existing thread for that. It's over here: http://forum.beyond3d.com/showthread.php?t=63567
 
20 FPS average in most situations; pop-in is improved from E3 but texture streaming is still a major issue. 720p upscaled to 1080p looks jarring and very bad, as the UI actually looks clean and pristine in comparison to the game around it. Must be the effect of having a 1080p display plane overlaid on top.

With all of this disappointing stuff, I don't even know if the increased zombie count will make up for it.
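The "clean UI next to a blurry game" effect follows from the display-plane setup described above: the 720p game layer goes through a non-integer 1.5x upscale (so output pixels sample between source pixels and get interpolated), while a native-1080p HUD plane composites on top 1:1 with no resampling. A tiny sketch of the sampling math:

```python
# Where each 1080p output row samples in a 720p source: non-integer positions
# force interpolation (blur), whereas a native 1080p HUD plane maps 1:1.

def source_coord(out_px, src_res=720, dst_res=1080):
    """Source-space coordinate sampled by a given output pixel index."""
    return out_px * src_res / dst_res

# Output rows 0, 1, 2 sample source rows 0.0, ~0.667, ~1.333 -> only every
# third output row lands exactly on a source row; the rest are blends.
print([source_coord(i) for i in range(4)])
```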
 
720p sub-30fps? And it's an exclusive game. That's very disappointing.
 
DF vs Dead Rising 3

http://www.eurogamer.net/articles/digitalfoundry-vs-dead-rising-3

Key points: 720p
Less than 30fps in action situations.

Ok, this is disappointing. Especially since the devs stated in a not-too-long-ago interview how important it is for them to have a steady 30fps framerate:

http://www.eurogamer.net/articles/2...-more-to-do-with-what-you-do-with-those-specs

Josh Bridge said:
And now we're at locked 30...

Just as a side note: here is one of the first released PS3 exclusives... Heavenly Sword:

720p resolution + 4xMSAA+ Nao32 HDR + massive amounts of enemies on screen + about the same framerate as DR3.

Sorry, could not find 720p vid, game is too old...lol!!!

Go to about 2:00 minute mark to see the amount of DR-style on-screen enemies:

http://www.youtube.com/watch?v=mV2xJ0Ll8T8


In conclusion: one of the few launch exclusives I will skip.

PS: please don't kill the messenger
 
Or it could be the talent of the developer. Or launch-title woes. But even if it were strictly down to launch-title issues, you'd expect the game to run decently on much more powerful hardware, as this was originally a 360 game as far as I know.

Likely a combination of the ESRAM, the anemic GPU (on top of the OS reserves taking a chunk out afterward) and, of course, the CPU.

But for some reason my mind has issues wrapping around the thought of any next-gen game performing this badly at 720p. It doesn't even look discernible from current gen in any real form outside of pure zombie count. The fact that the game is open world isn't really much to speak of, considering how many current-gen games have done that.
 
It's clear by now that the first few titles on X1 are simply projects which started on 360 and have not been optimised due to deadlines. Or even tested as much as they could have been, as surely they would have noticed the dips in framerate and adjusted things accordingly.
 
It also appears to have a tacky sharpen filter to combat the blurring effect of upscaling (like BF4). I don't subscribe to the idea that this is some magical feature of the scaling hardware, but I would not be shocked if MS is evangelizing the technique to every Xbox One developer producing a sub-1080p game.
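A sharpening pass like the one described is typically just a small post-process convolution (unsharp-mask style). The actual filter in the XB1 scaler is not public, so the following is only a generic illustration of the technique using a classic 3x3 sharpen kernel:

```python
# Generic 3x3 sharpen kernel; weights sum to 1, so flat regions pass through
# unchanged while edges get over/undershoot (the "crunchy" halo look).
KERNEL = [[ 0, -1,  0],
          [-1,  5, -1],
          [ 0, -1,  0]]

def sharpen(img):
    """Apply the sharpen kernel to a 2D grayscale image (borders clamped)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    sy = min(max(y + dy, 0), h - 1)  # clamp at image border
                    sx = min(max(x + dx, 0), w - 1)
                    acc += KERNEL[dy + 1][dx + 1] * img[sy][sx]
            out[y][x] = min(max(acc, 0.0), 255.0)   # clip to valid range
    return out
```

Clipped overshoot at hard edges is what produces the haloing people describe as a "tacky" look when the filter is cranked up to mask upscale blur.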
 
It's clear by now that the first few titles on X1 are simply projects which started on 360 and have not been optimised due to deadlines. Or even tested as much as they could have been, as surely they would have noticed the dips in framerate and adjusted things accordingly.

I agree. All games so far started as such: DR3, Ryse, BF and CoD are cross-gen, plus Crimson Dragon iirc.

What about Killer Instinct?

Forza seems to be the only X1 title so far (in this sense) and hits 1080p/60Hz.
 
I agree. All games so far started as such: DR3, Ryse, BF and CoD are cross-gen, plus Crimson Dragon iirc.

What about Killer Instinct?

Forza seems to be the only X1 title so far (in this sense) and hits 1080p/60Hz.

But this is what I don't get. These are games that started on 360, a generation before. Many of the assets aren't even next-generation. CoD started on a less powerful console; AC too. You would expect it to be easier to optimize a game to run smoother and at a higher resolution on a next-gen console.
It's not like they are porting a game from a next-gen platform.
All these games except Ryse and DR3 have better performance on the PS4.
 
I really think that it is due to lack of time: launch release window, new hardware, apparently quite a few late changes to the OS reservation/functionality like party chat, and efficient use of ESRAM seems to need somewhat different data structures compared to last gen... which takes a lot of work (time) to redesign, extrapolating from my coding experience.

All in all it is due to launch problems imo... and the fact that Sony and MS went with relatively low next-gen specs.
 
All those zombie models and open-world assets must be stored in the bandwidth-starved main RAM, I guess. Hence the low res and framerate.

Another confirmation of the hardware sharpening upscaler on the XOne.
 