Digital Foundry Article Technical Discussion Archive [2014]

For your test, try using Crysis and Crysis 3 if I remember correctly... I've done plenty of foliage scaling/performance testing with those games. But yes, vegetation (lots of it) can eat away at resources...

I've run into the problem of custom resolutions throwing my aspect ratio out of whack!

I've tried widescreen monitors but don't like 'em (mainly because of multi-monitor desk space), so I normally game at 1600x1000 on my trusty 1600x1200 PVA monitor. I've had to switch to 4:3 and test with 1xxx by 1200 (upscaled on the GPU to 1600x1200) to try and get an idea of the impact of horizontal-only scaling, but Lost Coast (a quick-to-install game I have a habit of testing display stuff with) doesn't like my funky resolution and forces a bad aspect ratio.

I'll test vegetation performance some other time, but my first impression is that on the desktop, horizontal-only scaling is like having an annoying eye defect (astigmatism, I guess). In-game, though, and particularly with MSAA enabled, horizontal-only scaling leads to a generally sharper perceived image than the same number of pixels scaled on both axes - perhaps because in a first-person game you spend a lot of time scanning along the ground and into the distance ('up' the screen)...?
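If anyone wants to repeat the comparison without fighting their monitor over custom resolutions, here's a minimal sketch (Python with Pillow; the filename and resolutions are placeholders for my setup) that fakes both kinds of scaling from a full-res capture:

from PIL import Image

src = Image.open("capture_1600x1200.png")  # hypothetical full-res desktop capture

# Horizontal-only: simulate a 1200x1200 render stretched back out to 1600x1200.
h_only = src.resize((1200, 1200), Image.BILINEAR).resize((1600, 1200), Image.BILINEAR)

# Both axes: a uniformly smaller render (~same pixel count) scaled back up.
both = src.resize((1386, 1040), Image.BILINEAR).resize((1600, 1200), Image.BILINEAR)

h_only.save("h_only.png")
both.save("both_axes.png")

Flicking between the two outputs full-screen is a decent approximation of what a GPU scaler does, at least for bilinear filtering.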

This could be a good subject for Digital Foundry to look into if they ever Face Off fast enough to have any spare time ....

This might be misleading... I know setting the res to 900p makes PC games look really terrible to me on my 1080p monitor, yet even 720p last-gen games don't look bad to me on my HDTV in the other room.

I guess it's a combination of how close you're sitting, and perhaps HDTVs having significantly better scaling/image processing than monitors, which are designed with little image processing?

A lot of monitors have cheap bilinear scalers, if that's where you do your scaling. You're also generally very close to the pixels (about 18 inches for me). My old Dell has a nice variable-sharpness scaler that can go from a blurry mess to nice (better than bilinear anyway) all the way up to nearest-neighbour sampling, but using it does add a little input latency. My plasma telly just looks good with everything, even though it's cheap by plasma standards...
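To see how much the filter alone matters, here's a small sketch (Python with Pillow again; the input file is a placeholder) that upscales the same 720p frame to 1080p three ways:

from PIL import Image

src = Image.open("frame_720p.png")  # hypothetical 1280x720 capture

filters = {
    "nearest": Image.NEAREST,    # blocky but artificially "sharp"
    "bilinear": Image.BILINEAR,  # the soft look of a cheap monitor scaler
    "lanczos": Image.LANCZOS,    # closer to what a decent TV scaler manages
}
for name, f in filters.items():
    src.resize((1920, 1080), f).save("upscaled_" + name + ".png")

Viewed at monitor distance the differences are obvious; from across the room, much less so.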

Also, seeing the X1 gain a little CPU edge now makes me think they should have tried to push that a little more for PR purposes. If they could have got the CPU to, say, 1.9GHz, then with significantly better frame rates in some games they could perhaps have convinced people the tech race was essentially a tie ("they have res, we have framerate", type thing), without any major overhaul to the already locked-in hardware or great added expense.

Similarly, I feel like these 1300/1400x1080 resolutions sound a whole lot better to people than 900p, even though the pixel counts are similar. The former has 1080 in it, and laypeople probably think the distinction is even more meaningless, or can't really grasp it at all, whereas "1080p" and "900p" have become loaded terms - in a negative way for Xbox. Possibly another PR win to be had there.
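The arithmetic behind "similar pixel count" (plain Python, nothing clever):

print(1920 * 1080)  # 2,073,600 - "1080p"
print(1600 *  900)  # 1,440,000 - "900p", about 31% fewer pixels than 1080p
print(1400 * 1080)  # 1,512,000 - one of the "anamorphic" widths above
print(1440 * 1080)  # 1,555,200 - 25% fewer pixels, yet "1080" stays in the name

So 1400x1080 really does carry only ~5% more pixels than 900p while sounding far closer to full HD.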

I think there's marketing clout in both of the things you suggest: CPU and "anamorphic 1080p" :eek: . What's really interesting from an IQ perspective is the impact of different types of scaling. There's got to be a way for us to look into that ....
 
DF are talking about traffic density being the same, not about the precise position and type of vehicle on the roads. There are semi-random elements used in their traffic spawning routines.

It's the consistency with which the X1 is (marginally) ahead that leads to their entirely reasonable speculation.

They said "traffic patterns are indeed identical" and as pointed out there are points where PS4 has more traffic and a better FPS. Watch from 3:38 to 3:43, if it were down to the CPU then the PS4 would have dipped lower at this point.
 
I would say the number of cars and locations... the type of traffic is arguable - using the term 'identical' implies everything is exactly the same when it's not. Even the patterns are not the same (check out 3:38 to 3:43). Again, the only identical footage all shows PS4 > XBO in fps... even at high speed through some junctions.
 
"the density of active vehicles is matched for both PS4 and Xbox One", they say. Not sure I agree that the patterns are identical as there are random elements in traffic, but the density of active vehicles does look the same. X1 does have a small but undeniable advantage going through junctions.

What you can't see is what's going on in the background - cars not on-screen being driven (pathfinding, collision), monitoring of density, spawning of cars, etc. It's the pattern of the X1 generally having a small advantage in these areas, hour after hour after hour, even with the PS4 using an SSD, that makes a strong case for some kind of processing advantage.
 
What's weird is that at night, even with high traffic, there doesn't seem to be any slowdown; or you can go through an intersection where there is a slowdown, make a U-turn, go through it again at high speed, and the slowdown doesn't repeat. Whatever it is, for now it looks like it's impacting the PS4 more.
People keep talking about the small CPU advantage of the Xbox One, but can the eSRAM's fast access also help in those situations?
 
I would say the number of cars and locations... the type of traffic is arguable - using the term 'identical' implies everything is exactly the same when it's not. Even the patterns are not the same (check out 3:38 to 3:43). Again, the only identical footage all shows PS4 > XBO in fps... even at high speed through some junctions.

I noticed that too - the video (analysed closely) doesn't match the editorial content.

It's unfortunately been a trend in many DF articles since the Watch Dogs one. More recently there was the DF article about the Evolve alpha, in which they concluded that the PS4 version generally performed significantly worse. But they were mostly comparing different scenarios and only showed two videos (not a face-off): a different weapon was used most of the time, the scenes were constantly busier with more NPCs and alpha effects on PS4, they showed mainly outdoor scenes on PS4 but many indoor scenes on XB1, many XB1 scenes were just about exploring rather than shooting, etc.

Finally I found two similar scenes in the two videos - 2:12 for XB1, 1:26 for PS4 (same weapon used, same outdoor level, same enemy, same alphas from the flamethrower, etc.). In this scene the PS4 version performed very similarly to its XB1 equivalent; technically the XB1 version even dipped 1fps lower at one point.

Maybe Evolve performs better on XB1 but what they selected on both videos certainly doesn't prove it.
 
Well, that's my point. I don't disagree with the fact that AI is 'working in the background', but if that's the case then why does it only show up at junctions? And again, why are there junctions where the PS4 performs better?

If it were cut and dried I'd expect more consistency than I'm seeing - the only consistent thing is that every other bit of footage, which is more scripted, nearly always favours the PS4... even more so during the shootouts.

All I'm saying is that 'this shows the XBO has a better CPU' is a lazy throwaway comment which cannot be confirmed 100%. It could be other bottlenecks on the PS4 - it could be that there's something else the PS4 is doing that the XBO isn't - only R* knows at this point.
 
Face-Off: Far Cry 4


First impressions of Far Cry 4 on Xbox One are positive. Image quality is very clean and the overall presentation compares favourably to the PS4 game. On close inspection, detail looks a little softer and less refined, but otherwise it holds up very well during gameplay. Pixel-counting - not very easy here, for reasons we'll go into later - reveals a 1440x1080 framebuffer horizontally scaled up to full-HD resolution (1920x1080), although artefacts from the resizing process appear subdued compared to most sub-1080p games. In comparison we see a native 1080p image deployed on the PS4 that appears suitably sharp, and indeed clearer than the Xbox One game, but the Microsoft console punches above its weight with a presentation that - by and large - defies its sub-native pixel count.

What HRAA brings to the table is worthy of a Digital Foundry feature in its own right, but let's look at the basics. For Xbox One, the choice of resolution means that scaling artefacts are only apparent on one axis. On top of that, for every four horizontal output pixels, it draws upon three rendered source pixels, an additional three from the previous frame, plus accumulated data from earlier frames. HRAA works nicely not just for anti-aliasing, then, but also in reconstituting something approaching the quality of a full 1080p framebuffer when it is working at its best. PlayStation 4 looks even cleaner, mostly because it appears to have access to more temporal data than the Xbox One version of the algorithm and doesn't need to upscale at all.
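The ratio described there - four output pixels rebuilt from three rendered this frame plus three from the last - is easier to see in code. Here's a toy sketch (Python with NumPy) of one way temporal samples from a jittered previous frame can feed a horizontal reconstruction; to be clear, Ubisoft's actual HRAA isn't public, so this only illustrates the general idea, not their algorithm:

import numpy as np

def reconstruct_row(cur, prev):
    """cur, prev: 1440 rendered samples each -> one 1920-pixel output row."""
    n = cur.size
    # Assume the previous frame was rendered with a half-pixel horizontal
    # offset, so its samples fall between this frame's samples.
    combined = np.empty(2 * n)
    combined[0::2] = cur
    combined[1::2] = prev
    # 2880 combined samples comfortably cover 1920 output pixels.
    return np.interp(np.linspace(0, 2 * n - 1, 1920), np.arange(2 * n), combined)

row = reconstruct_row(np.random.rand(1440), np.random.rand(1440))
print(row.shape)  # (1920,)

With enough stable temporal samples the upscale starts to behave like supersampled reconstruction; where motion invalidates the previous frame, quality presumably falls back towards a plain 3:4 stretch.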

Overall, performance sticks closely to 30fps on both consoles, but the Xbox One comes out on top, displaying slightly higher frame-rates during intense shoot-outs and generally fewer dips elsewhere. It's likely that the difference in resolution between the two versions is the cause here, with the Xbox One rendering 25 per cent fewer pixels, and using a less refined version of HRAA into the bargain. That said, the difference doesn't stand out during a general run of play, and neither version suffers from performance issues that get in the way of enjoying how the unpredictable gunfights turn out.

In terms of the multi-platform comparison, the PS4 gets the nod here for its sharper native 1080p presentation and almost solid 30fps frame-rates. The image quality isn't quite as pristine on the Xbox One, although frame-rates are slightly higher under load, and the overall quality of the presentation remains excellent. It's a great buy on either platform.
 
After watching the videos, the most I saw was a 2-3fps difference (in an intensive scene), lasting about 1 second... and happening about 2-3 times. That's very acceptable in my book...

If the game were a complete slide-show, or even if the target (30fps) were only being met 80% of the time, then I would agree. But scaling back IQ for an occasional dip or two isn't warranted in this case. PC gamers (myself included) wouldn't scale back IQ until it's truly needed... so a few dips aren't going to spoil your gaming.
 
It would if it was frequent, but as it stands in the DF footage I think both platforms are fine from a performance POV. I certainly wouldn't drop the resolution for what we're seeing.

It's frequent fluctuation and microstutter and tearing that bring a game down. Speaking of other games ...

The thing about frequent "29fps" (as seen in other games) is that it's often accompanied by a frame being displayed late. This causes a 16.7ms "hitch" in timing, and sometimes a 16.7ms "speed-up" as frames return to normal pacing afterwards. This mucks up the timing, and it's this that's so visible to many folks, rather than simply a missing frame.

Once in a while isn't too bad, but when it's every second, or several times a second, it's nasty. Some people can't see it, but for others it's intensely irritating and massively worse than larger occasional drops.
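The distinction is easy to demonstrate with some hypothetical frame times (plain Python; the numbers are made up to mimic a "29fps" second at a 30fps target):

frame_times_ms = [33.3] * 29 + [50.0]  # one late frame in an otherwise clean second

fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
hitches = sum(1 for t in frame_times_ms if t > 33.3 + 8)  # half-frame tolerance
print(f"{fps:.1f} fps average, {hitches} timing hitch(es)")
# -> ~29.5 fps: the counter barely moves, but that 50ms frame is a visible
#    16.7ms stutter, followed by the pacing snapping back.

An fps counter averages the hitch away; a frame-time graph (as DF use) is what actually shows it.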
 
Yeah I don't think 27fps justifies 900p vs 1080p

Assuming the problem is just framebuffer size, they might be able to get away with a modest black bar (1920x972, or even 1920x960 for an integer multiple of 8). Use the black space for subtitles? It could even be a single large black bar at the bottom.

37ms (27fps) vs 33ms (30fps) ~10% difference
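Worked through (plain Python), the two numbers line up surprisingly well:

full   = 1920 * 1080  # 2,073,600 pixels
barred = 1920 * 960   # 1,843,200 pixels once the black bars are excluded
print(f"pixel saving: {1 - barred / full:.1%}")           # -> 11.1%

print(f"frame-time gap: {1000 / 27 - 1000 / 30:.1f} ms")  # -> 3.7ms (37.0 vs 33.3)

So an ~11% pixel saving from letterboxing is in the same ballpark as the ~10% frame-time shortfall - assuming, of course, that the drops are fill-rate bound rather than CPU or streaming related.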
 
The PS4 version of GTA5 looks spectacular; selling off resolution or graphical detail for a guaranteed 30fps is simply not worth it. Dips happen so rarely and are usually very hard to actually pick out when so much shit is going on.
 
Assuming the problem is just framebuffer size, they might be able to get away with a modest black bar (1920x972, or even 1920x960 for an integer multiple of 8). Use the black space for subtitles? It could even be a single large black bar at the bottom.

37ms (27fps) vs 33ms (30fps) ~10% difference

So you would drop the resolution to avoid a short drop that happens once during the opening sequence? The XB1 also drops slightly during this moment (an explosion), so should we reduce the resolution on both versions?

That's really not worth it. Drops on both platforms are really, really occasional (and short), even in the frame-rate stress test provided by DF. The first is caused by an explosion (CPU bottleneck?) and the two other significant ones are most probably the usual stutters caused by loading data, so any drop in resolution probably wouldn't resolve things.

When playing the game I haven't noticed any of those drops during explosions or shootouts or even loading; they must be really, really rare. Most interestingly, the only drops I noticed are not during shooting/explosions/loading but in exploration phases, in some rare spots on the map where vegetation is really dense plus alphas from smoke behind the vegetation. I am really surprised that DF didn't record those spots, which are most probably GPU bottlenecked (you can hear the fan spinning significantly faster in those areas).

But DF didn't select them in their video; they only showed CPU-heavy and loading moments, not purely GPU-heavy moments, which is very odd... or not.
 
It's unfortunate (that the consoles are not overly powerful), but I guess devs are having to strike the right balance, and with GTAV & FC4 I think they got it right. It would be nice, however, if devs gave us a switch to decide - even just a basic 'quality vs performance' toggle.

I think things will improve as devs get to grips with these consoles - I certainly expect it on PS4 as devs start to utilise the CUs better. Uncharted 4 will be a good showcase for things to come :)
 
I'm starting to think some PS4 games should also run at a slightly reduced resolution, if the difference can't even be seen under normal circumstances.

So you would drop the resolution to avoid a short drop that happens once during the opening sequence? The XB1 also drops slightly during this moment (an explosion), so should we reduce the resolution on both versions?

That's really not worth it. Drops on both platforms are really, really occasional (and short), even in the frame-rate stress test provided by DF. The first is caused by an explosion (CPU bottleneck?) and the two other significant ones are most probably the usual stutters caused by loading data, so any drop in resolution probably wouldn't resolve things.

When playing the game I haven't noticed any of those drops during explosions or shootouts or even loading; they must be really, really rare. Most interestingly, the only drops I noticed are not during shooting/explosions/loading but in exploration phases, in some rare spots on the map where vegetation is really dense plus alphas from smoke behind the vegetation. I am really surprised that DF didn't record those spots, which are most probably GPU bottlenecked (you can hear the fan spinning significantly faster in those areas).

But DF didn't select them in their video; they only showed CPU-heavy and loading moments, not purely GPU-heavy moments, which is very odd... or not.

I agree. I wish DF would provide more varied situations and more data to look at. I must admit that while playing this game for a few hours... it felt really, really smooth. Even in the clip they showed, what is the average fps? It must be in the range of 29fps for both versions...

Also, I wish they had invested a bit more time in their tests to make the situations equal:


[image: zl7F2uH.jpg]
 
DF play through hours of the game on each platform. The few minutes of video they present are not the entirety of their testing, but they are representative.
 
So you would drop the resolution to avoid a short drop that happens once during the opening sequence? The XB1 also drops slightly during this moment (an explosion), so should we reduce the resolution on both versions?

That's really not worth it. Drops on both platforms are really, really occasional (and short), even in the frame-rate stress test provided by DF. The first is caused by an explosion (CPU bottleneck?) and the two other significant ones are most probably the usual stutters caused by loading data, so any drop in resolution probably wouldn't resolve things.

When playing the game I haven't noticed any of those drops during explosions or shootouts or even loading; they must be really, really rare. Most interestingly, the only drops I noticed are not during shooting/explosions/loading but in exploration phases, in some rare spots on the map where vegetation is really dense plus alphas from smoke behind the vegetation. I am really surprised that DF didn't record those spots, which are most probably GPU bottlenecked (you can hear the fan spinning significantly faster in those areas).

But DF didn't select them in their video; they only showed CPU-heavy and loading moments, not purely GPU-heavy moments, which is very odd... or not.

Well, I don't find the occasional and totally acceptable slowdowns in games like Far Cry 4 anything to worry about. I was just thinking aloud that if a very well AA'd 1440x1080 (or whatever other resolution) gives real-life results as good as DF says, then it would be interesting to see games running on PS4 at that resolution and how that would affect performance. It's been a standard staple lately to see face-offs where the PS4 looks sharper but the XO runs smoother. Perhaps the reason has nothing to do with the lowered resolution, but it would still be nice to see.
 