Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
The full image is available at source from DF. Right click the thumbnails to copy/paste/download. That worst-case image with the blurry head isn't indicative of the rest of the captures, and the blur is universal to that image.

I hadn't realised you could do that, cheers. Downloading both and flicking between the two makes the difference between PC at Ultra settings and PS4 a lot more pronounced. There's clearly a lot more going on in the PC shot ambient-occlusion-wise, and the detail in the distance looks a little sharper again, most obviously on the guy in the background when comparing the PS4 and PC High shots (the PC Ultra and Medium shots are at slightly different angles, which makes a direct comparison more difficult).
 
I've never tried a direct comparison to see which could push more triangles but it's kind of uninteresting in a vacuum. In real life PS3's vert unit was quite bad, and was indeed a bottleneck for many games. Standard optimizations were SPU backface culling, constant patching, SPU skinning, and interpolator packing... all more or less aimed at helping along the vertex/tri hardware.
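To illustrate the first of those optimizations, here's the core test behind backface culling in miniature (a Python sketch of the general idea, not actual SPU code; the function names and winding convention are my own assumptions): a triangle whose face normal points along the view direction can be rejected before it ever reaches the vertex/triangle hardware.

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_backfacing(v0, v1, v2, view_dir):
    """True if a counter-clockwise-wound triangle faces away from the
    viewer and can be culled before submission to the GPU."""
    e1 = tuple(v1[i] - v0[i] for i in range(3))
    e2 = tuple(v2[i] - v0[i] for i in range(3))
    n = cross(e1, e2)                # face normal from the winding order
    return dot(n, view_dir) >= 0.0   # pointing along the view ray -> cull
```

On PS3 a test like this would be run over large batches of triangles on the SPUs, so the RSX setup unit only ever saw the survivors.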

Thanks for the info!

And yah, this is the kind of thing I was trying to describe having read about. It does appear to have been standard to use Cell significantly for GPU assistance after the first few years.

Everything I have read suggests that triple buffering reduces the negative impact of vsync on framerate, but doesn't completely eliminate it. Wii U's biggest advantage seems to most certainly be the memory: both the main DDR3 and the eDRAM are over twice as plentiful. If DF's analysis is to be believed, the framerate dips are typically caused by fillrate-heavy effects such as particles and post-processing. If my novice-level understanding of things serves me correctly, that should suggest the eDRAM on the Wii U's GPU is in fact sufficient not to be a bottleneck.

The overheads for triple buffering should be very small. I'm not a developer by any stretch of the imagination, but my time with admittedly simple OGL programming seemed to show that there was almost no performance penalty for triple buffering, beyond memory used for additional buffers (didn't test for input latency).
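A toy model of why vsync punishes double buffering but not triple buffering (the model and numbers are illustrative assumptions, not measurements): with only two buffers, a frame that misses a vblank stalls the GPU until the next one, so framerate quantizes to 60/30/20/15 fps; a third buffer lets the GPU start the next frame immediately, costing little beyond that buffer's memory.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one vblank every ~16.7 ms on a 60 Hz display

def double_buffered_fps(render_ms):
    # with vsync and two buffers the GPU idles until the next vblank,
    # so the effective frame interval rounds up to a whole refresh
    interval = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return 1000.0 / interval

def triple_buffered_fps(render_ms):
    # the spare buffer lets the GPU render continuously; the display
    # simply picks up the newest completed frame at each vblank
    return min(60.0, 1000.0 / render_ms)
```

A 20 ms frame lands at 30 fps double buffered but 50 fps triple buffered, which fits the observation that the penalty is mostly just the extra buffer's memory.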

It may well be that DF are right, and that fillrate is the cause of the drops. If so, I would expect it to be related to more efficient use of the GPU - perhaps early Z rejection causing obscured fragments/pixels not to be rendered. In a straight contest of fillrate I'd expect the 360 to win ... but it's not like I can say that for sure.
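The early Z idea can be sketched like this (a toy software model, not any specific GPU's pipeline; the function names are mine): test each fragment's depth against the depth buffer before running the expensive shading, so fragments hidden behind nearer geometry cost almost nothing.

```python
def shade_visible(fragments, shade):
    """fragments: (pixel, depth) pairs in submission order, smaller
    depth being nearer.  Returns how many fragments were actually shaded."""
    depth_buffer = {}
    shaded = 0
    for pixel, depth in fragments:
        # early Z: reject before shading if something nearer already won
        if depth >= depth_buffer.get(pixel, float("inf")):
            continue
        depth_buffer[pixel] = depth
        shade(pixel, depth)   # the expensive work we skip for occluded pixels
        shaded += 1
    return shaded
```

Submitting opaque geometry roughly front-to-back maximizes the rejections, which is one way a nominally slower GPU can effectively out-fill a faster one.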

I still have a feeling that vertex transformation and triangle setup will favour the WiiU in this game compared to the 360, and I'd bet ten English pounds that without Cell assistance WiiU Bayonetta will fare massively better than the PS3 here.

I hadn't heard that the PS3 was at such a disadvantage in polygon performance. You mentioned Cell being good at culling; was Xenon not very good at culling? I know polygons per second are no longer a big benchmark for games, but still, that was a pretty big deficit.

A Xenon core should be almost as fast as an SPU, I reckon. Which is to say quite fast (128-bit vector units and high clockspeed). You can predict what you'll need to have in the cache and so should be able to prefetch, test, and write back to memory without random accesses or nasty branching. Cell having twice as many cores means it could do additional vertex work on top of matching whatever Xenon was doing though, once developers had got up to speed with Cell.

The extra headroom in Cell seemed to get used in supporting RSX.

The WiiU CPU should be much worse at vector work than either Cell or Xenon, but on the other hand the GPU probably needs the least assistance of the three (I'd guess).
 
It may well be that DF are right, and that fillrate is the cause of the drops. If so, I would expect it to be related to more efficient use of the GPU - perhaps early Z rejection causing obscured fragments/pixels not to be rendered. In a straight contest of fillrate I'd expect the 360 to win ... but it's not like I can say that for sure.

During rasterization, the 360's ROPs could operate without being held back by a memory bandwidth limitation. I theorize that the Wii U has a similar advantage thanks to its eDRAM. Even though I doubt it has nearly the bandwidth the 360's did, I still suspect it's enough to be sufficient for these operations. I think there is a good chance it's because of the eDRAM that the 176 GFLOP GPU is able to compete with and arguably exceed Xenos's performance; perhaps the shaders find themselves bandwidth-limited at times on Xenos, and the texture units likewise. I'm not trying to declare a number for the eDRAM's bandwidth, but my theory is that it's good enough for the Wii U's GPU to never be bandwidth-limited on any of its operations, thus making for a very efficient GPU.
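As a rough sanity check on the fillrate side of that theory (every figure here is a hypothetical placeholder, not a known Wii U or 360 spec): peak ROP bandwidth demand is roughly ROP count × clock × bytes touched per pixel, and it quickly exceeds what a DDR3 main bus could supply, which is exactly the gap on-die eDRAM is there to cover.

```python
def rop_bandwidth_gb_s(rops, clock_mhz, bytes_per_pixel):
    """Peak demand if every ROP touches a pixel every clock."""
    return rops * clock_mhz * 1e6 * bytes_per_pixel / 1e9

# hypothetical figures: 8 ROPs at 550 MHz, with 4 B colour write plus
# 4 B depth read + 4 B depth write per pixel = 12 B touched
demand = rop_bandwidth_gb_s(8, 550, 12)   # tens of GB/s of demand
```

With alpha blending or MSAA the per-pixel traffic multiplies further, which is why a modest main-memory bus alone would leave the ROPs starved.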
 
I suppose if it has no AA at all on low, it would make sense. That's a pretty blurry AA to be considered high detail. I didn't think their AA on Xbox One was supposed to introduce that much of a blur. The game at 1080p on PC I thought would appear a lot sharper.

As Shifty said, the picture is zoomed in. The full image does not look as blurry.
And it is most definitely the AA; you can see it when you pay attention to Marius's shoulder pads. The low-quality AA is cleaner but provides less edge smoothing compared to high quality.

PC low
PC high
XB1
 
We need DF to double confirm! *edit: ugh, might be better in the frame analysis thread.



http://www.gamespot.com/videos/middle-earth-shadow-of-mordor-graphics-comparison/2300-6421635/
 
Did you just compromise the integrity of DF's well researched and immaculately presented oversights with something from GameSpot? :runaway:

J/K, I'm interested in reading DF's analysis on Shadow of Mordor. Hopefully tomorrow :)
 
Did you just compromise the integrity of DF's well researched and immaculately presented oversights with something from GameSpot? :runaway:

J/K, I'm interested in reading DF's analysis on Shadow of Mordor. Hopefully tomorrow :)


Lol haha. Not likely the situation. There is a small footnote that the comparison is from the Sept 30 patch. I have a small fear that GameSpot made a mistake, but we'll really need DF this time to double-check, since GameSpot came out of the gates first.
 
From DF:
PS4 is basically a match for the high quality setting on PC, which requires a graphics card with 3GB of RAM for best performance.

PS4:
http://images.eurogamer.net/2013/ar.../0/PS4_003.bmp.jpg/EG11/quality/90/format/jpg

PC @High
http://images.eurogamer.net/2013/ar...0/High_003.bmp.jpg/EG11/quality/90/format/jpg


http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures

IMO texture quality looks identical, HBAO on PC vs SSAO on PS4, FXAA on PS4 vs MSAA on PC, clothing details on PS4 nearly identical to PC @ Ultra settings.
 
Since I'm dying to know the resolution of, say, Shadow of Mordor and Alien Isolation on X1, but we have to wait for DF articles, and since this is the info we really want, I wonder if they could just make mini articles about the resolution quickly? Just a couple of paragraphs, and then do the full looks/face-offs in due time.

You'd still get the hits.

But I always find it funny that we don't know the resolution until somebody tells us. That's how little difference it makes, I guess.

Although, to be fair, people do often note that a 900p game looks "fuzzy" or so forth before we know it's 900p officially, even at a non-technical place like GAF. Then again, sometimes they are probably wrong and it's just FXAA or something, which gets us back to square one: we have to have someone tell us.
 
-PS4 has higher resolution shadows and more dense vegetation (PC quite a bit more dense than both).
I've seen barely any vegetation at all on PS4. There are clearly placed bushes for concealment (a la Assassin's Creed), but that's about it. I must not have ventured far enough.
 
I've seen barely any vegetation at all on PS4.
You don't want there to be. It clogs the air vents and attracts bugs and spiders that can make a mess of the internals. I always remove early grass and moss growths as soon as I see them, and I advise the same.
 

I'm pretty sure that the PC version has no kind of AA. Some lines do look to be smoothed (mostly around the fur area), but upon closer inspection, it's apparent that the PC version has no AA. You can inject it though.
 
Shadow of Mordor face-off is up. Mostly the same: PS4 has a few minor graphical pluses, plus 1080p; X1 comes in at 900p. Neither has framerate problems, holding a steady 30.

So I guess the idea there was some kind of "day one patch" to put the X1 at 1080 must have been rubbish?
 
Shadow of Mordor face-off is up. Mostly the same: PS4 has a few minor graphical pluses, plus 1080p; X1 comes in at 900p. Neither has framerate problems, holding a steady 30.



So I guess the idea there was some kind of "day one patch" to put the X1 at 1080 must have been rubbish?


Yup. Mistake on GameSpot's part. Poor way to come out of the gates if they want to get in on the comparison-video market.
 
That's, I suppose, because the fur is partly transparent, making it appear AA'd.
 
No sign of the full article yet, but the framerate analysis for Forza Horizon 2 is up on youtube, and is entirely uneventful, with no deviations at all from a rock-solid 30fps with perfect frame-timing (there's a couple of occasions where the frametime display records a 16ms frame, but those are all at points where the video is cut between different recorded segments, so I'm pretty sure they're false positives).
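The sort of check behind a frame-time analysis like that can be sketched as follows (a guess at the general method, not DF's actual tooling): difference consecutive unique-frame timestamps and flag any interval that strays from the 33.3 ms target of a locked 30 fps game. An isolated ~16 ms interval at a video cut would surface as exactly the kind of false positive described above.

```python
TARGET_MS = 1000.0 / 30.0   # ~33.3 ms per frame for a locked 30 fps

def frametime_outliers(timestamps_ms, tolerance_ms=2.0):
    """Return the frame intervals that deviate from the 30 fps target."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return [d for d in deltas if abs(d - TARGET_MS) > tolerance_ms]
```

A perfectly frame-timed capture produces an empty list; a single short interval where two recorded segments were spliced together shows up as one outlier.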
 