Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
Because people are making it seem as if the game plays poorly because of the occasional frame drop. They're looking at that number and passing a judgement that is not correct. The framerate never drops too low for too long.

The game is not a tech showcase. There seems to be a lot of room for improvement. I would prefer stable 60hz over resolution or anything else. Hopefully their second effort will be better.

What forum do you think you're on? Because this is a tech forum and the analysis that playability suffers due to a framerate varying from 60 to single digits and gobs of screen tear is most certainly a proper analysis.
 
Agree, it's useful to know what the average framerate is, and you have the video to show you in what context the framerate fluctuates, so it's not like other publications that only show the average framerate without any context.

Average frame rate would work if they actually had a benchmark. Otherwise it's open to even more abuse.
 
Because people are making it seem as if the game plays poorly because of the occasional frame drop. They're looking at that number and passing a judgement that is not correct. The framerate never drops too low for too long.

The game is not a tech showcase. There seems to be a lot of room for improvement. I would prefer stable 60hz over resolution or anything else. Hopefully their second effort will be better.
Not going to comment on how it actually feels as I haven't played it. But it depends on what you consider a frame drop. The framerate drops quite often to the 40s and sometimes to the 30s. For at least some people, that might be considered 'playing poorly'. Tomb Raider PS4 has similar performance and I've seen some (albeit few) people complain, even though framerate matters much less there than in a twitch shooter like Titanfall.
 
Because people are making it seem as if the game plays poorly because of the occasional frame drop. They're looking at that number and passing a judgement that is not correct. The framerate never drops too low for too long.

The game is not a tech showcase. There seems to be a lot of room for improvement. I would prefer stable 60hz over resolution or anything else. Hopefully their second effort will be better.

This is a thread about games tech, and a game performing as it does is bound to attract comment. If it's fun anyway (and I haven't read anyone say it isn't) then great, but I'm fascinated by why it doesn't reach the goal that Respawn have always emphasised, which is 60fps. I'm further intrigued by the choice of res at 1408x792; to my knowledge it's literally never been used before, and it doesn't seem to offer many benefits over 720p when the framerate still isn't locked at 60fps or their old 'perceptual 60fps' concept.
 
The resolution chosen may be related to esram size e.g. the largest buffer they could use without tiling the primary render target.

Not all of the frame rate drops may be due to GPU load.

They had to beat the engine into shape to allow them to handle everything they wanted to. On the PC, with its vastly, vastly superior threaded performance (3+ times faster), this might not have been an issue. On console the game might be choking on a single thread, meaning they thought they might as well bump up the resolution, as the very worst drops (~20 fps with 12 Titans) weren't GPU related.
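The ESRAM theory above is easy to sanity-check with back-of-the-envelope arithmetic. A sketch in Python, where the 4-bytes-per-pixel format and the five-target count (four colour plus depth) are illustrative assumptions, not Respawn's actual setup:

```python
# Rough render-target budget vs. the Xbox One's 32 MiB of ESRAM.
# Pixel format (RGBA8, 4 bytes/pixel) and number of targets are
# assumptions for illustration only.

ESRAM_BYTES = 32 * 1024 * 1024

def target_bytes(width, height, bytes_per_pixel=4):
    """Size in bytes of one full-screen render target."""
    return width * height * bytes_per_pixel

def fits_in_esram(width, height, num_targets=5):
    """Do num_targets full-screen buffers fit in ESRAM without tiling?"""
    return num_targets * target_bytes(width, height) <= ESRAM_BYTES

for label, (w, h) in [("792p", (1408, 792)), ("900p", (1600, 900)),
                      ("1080p", (1920, 1080))]:
    mib = target_bytes(w, h) / 2**20
    print(f"{label}: {mib:.2f} MiB per target, "
          f"5 targets fit: {fits_in_esram(w, h)}")
```

Under these assumptions 792p and 900p fit while 1080p does not, which is consistent with (though far from proof of) the tiling explanation.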
 
Why would you automatically make the assumption that a drop to 720p would improve the frame rate? Do you have access to the profiler and know that it's pixel bound in those scenarios?
 
Of course no one thinks performance is automatically tied to resolution in all cases. But it's such a marginal bump, there's literally no reason for it to really exist outside of saying your game isn't 720p. They obviously can't get it to 900p without compromises to begin with, so going for a less than half-hearted approach just seems silly.
 
900p is such a marginal bump over 792p there's literally no reason for it to really exist outside of saying your game isn't 792p. 900p means you obviously can't get it to 1440p* without compromises to begin with, so going for a less than half-hearted approach just seems silly.

*lolsigh
 
I understand what you're saying. It just seems like they aren't even close to their target, so the effort would have been more wisely spent elsewhere instead of spending all that time trying to reassure people the resolution wasn't final. I think the framerate should be the most important priority by far considering its state on X1, especially taking into account that they have repeatedly said that "framerate is king".

It doesn't seem like it.

Also...why did you skip 1080p and go to 1440?? :/
 
Because everyone knows 1080p is just a compromise of 1440p which is just a compromise of 4K res.
 
Titanfall frame drops are understandable once you realize how hectic the game can get, especially in "Last Titan Standing". When you have 12 Titans dropping rocket salvos, firing off their primary weapons, dashing to and fro, dropping smoke, going for melee attacks, going nuclear, and all that happening in a confined area (this mode is usually pack-on-pack with very little lone-wolfing), I doubt any game could handle those types of scenarios without frame drops.
 
Because everyone knows 1080p is just a compromise of 1440p which is just a compromise of 4K res.

You might be 100% right if it weren't for the scaling issues. Afaik very few have a native 792p screen, or 1440p or 4K. Apart from the obvious higher resolution and therefore better picture quality, there are other reasons for wanting the superior resolution. Thankfully the incredibly bad scaling job on the XB1 has been improved, but it's still no match for a native resolution.

When I play this game on my PC I would love to be able to drop the resolution to get a better and more steady FPS (aka Xbox One mode), but thanks to the native resolution of the monitors on my PC and my laptop it's simply not an option; it looks like shit when I do that.
 
When I play this game on my PC I would love to be able to drop the resolution to get a better and more steady FPS (aka Xbox One mode), but thanks to the native resolution of the monitors on my PC and my laptop it's simply not an option; it looks like shit when I do that.

Have you tried applying the setting in the control panel to make the GPU do the scaling? It should work a lot better than letting the monitor do it.
 
Have you tried applying the setting in the control panel to make the GPU do the scaling? It should work a lot better than letting the monitor do it.

Yes, it helps compared to the monitors I have, but I only use it on my notebook now and then; still not native, which was my point (somewhat Captain Obvious, I know, sorry). But thanks for the tip anyway.
 
900p is such a marginal bump over 792p there's literally no reason for it to really exist outside of saying your game isn't 792p. 900p means you obviously can't get it to 1440p* without compromises to begin with, so going for a less than half-hearted approach just seems silly.
720p = 921600 pixels
792p = 1115136 pixels
900p = 1440000 pixels
1080p = 2073600 pixels
1440p = 3686400 pixels

792p = 21% increase over 720p
900p = 29% increase over 792p (56% increase over 720p)
1080p = 44% increase over 900p (125% increase over 720p)
1440p = 78% increase over 1080p (300% increase over 720p)

Although it's easy to dismiss next-step resolutions as marginal increases, they actually represent significant percentage increases. And even then, the visual results are questionable (how much better really is 1080p than 900p in the eyes of most gamers?). Titanfall is already struggling at 792p. What do those 21% extra pixels get you that contributes in a noticeable way on screen? If it's a compromise that doesn't benefit the experience at all, it was the wrong one. 720p with a higher framerate or whatever would likely be a better experience.

Of course, if the resolution isn't the bottleneck here, an extra 20% pixels could be a freebie. Respawn may have been targeting 720p and found they could give a little extra. We don't really know. I don't disagree with Inuhanyou's thinking though. 792p is a marginal increase that'll lead to more blurring on 720 native sets and no significant visual advantage on other displays, so one has to wonder why choose that resolution? I won't go so far to suggest that it's only to avoid the 'last gen 720p' label, but I wouldn't try to counter argue with every resolution increase being marginal, because they're not. Especially compared to 720p which is an option for any game wanting to target smooth, high framerates.
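For reference, the percentages quoted above fall straight out of the raw pixel counts; a quick Python check:

```python
# Pixel counts and step-to-step percentage increases for the
# resolutions discussed above.
RES = {"720p": (1280, 720), "792p": (1408, 792), "900p": (1600, 900),
       "1080p": (1920, 1080), "1440p": (2560, 1440)}

def pixels(name):
    w, h = RES[name]
    return w * h

def increase(a, b):
    """Percentage increase in pixel count going from a to b."""
    return round(100 * (pixels(b) - pixels(a)) / pixels(a))

print(increase("720p", "792p"))    # 21
print(increase("792p", "900p"))    # 29
print(increase("900p", "1080p"))   # 44
print(increase("1080p", "1440p"))  # 78
print(increase("720p", "1440p"))   # 300
```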
 
@tkf you could turn off scaling totally, you'd have black borders around the outside but no scaling blurriness.

edit:
Just watched Total Biscuit's review on YouTube, since he's one of the few people who goes through the options in a review, and there seem to be a good few things you could turn down instead of the res to improve framerate, e.g. AA / ragdolls / impact marks or shadows.

PS: he says he gets a constant 120fps, but he does have 2 Titans (that's Nvidia Titans, obviously ;))
 
Guess I just imagined them saying AMD FX-6300 with a GeForce GTX 760
Not when they said the experience would be better on a PC; that machine shouldn't hold a candle to a mature Xbox One, and the Xbox One is still a premature console.

Heck, even the PS4 is; look at the library of games... X1 in that sense is ahead, but not by much.


They didn't use a "powerful PC"
They used different PCs, and one of them was quite powerful, hence unfair.

Sorry, but seeing as the video does not show the game running as well as possible on the PC (e.g. they limited the resolution to 1920x1080), how can you possibly make that claim? And Digital Foundry themselves disagree with you.

It's an £80 CPU and a £175 GPU, £255 total; you could complete the rest of the PC for not a huge amount more than the cost of an XB1. To say that the cost of the XB1 (£400) is marginal compared to the PC DF used is laughable.
Granted, but they did choose a PC that has bottom-of-the-barrel sound hardware (aka onboard sound).
Alas, that's the sound hardware most PCs have these days. The Xbox One is much more capable than anything else sound-wise, something that is always forgotten and swept under the rug when these unfair comparisons are made.


Let's get something straight: having more colours is not inferior. Blame the game/system/monitor for not being able to handle the full range.


I really hope you're not pointing to that as an example of something good? If the game needs six months of work on it, it shouldn't have been released.

Seems like you're hurt that the game is better on the PC, so you've made a bunch of stuff up to defend the XB1.
If you are going to use a TV to play, limited range is undoubtedly the best choice. If you are going to use a computer monitor, it is a matter of preference. Still, full range sucks quite a bit.

Microsoft recommend on their Xbox.com site to use standard range, 'cos for a TV it is best, and you will never have problems with that range.

https://support.xbox.com/en-US//xbox-one/system/adjust-display-settings

Color space

“We highly recommend that you leave the color space setting set to TV (RGB Limited). RGB Limited is the broadcast standard for video content and is intended for use with televisions.”
Why is it the best advice to NEVER use full range on a TV?

Limited range works on all televisions, and basically almost all the video material you can see is created with standard RGB in mind, usually the original format in which that video material was created. Moreover, many, many TVs made in 2013 and 2014 don't even support full range.

That being said, and since this is a tech forum, I shall explain what limited/full range are as I understand them.

Limited range (standard RGB) and full range are two existing ways of defining the values of black and white. Full range spans 0 to 255; that is, counting 0 as the first step, there are 256 steps from black to white. :)

In contrast, standard RGB (or limited RGB) features 36 fewer steps compared to full RGB, and absolute black to absolute white ranges from the value 16 to 235.

In other words, with standard RGB the value of black is 16, which is the first step, and absolute white is placed at step 235.

So why choose limited range on a TV, and what problems may arise if you don't? First, basically all the movies and videos you see on DVD or Blu-ray are encoded in YCbCr and limited range...

Furthermore, the problem with choosing full range on a TV that does not accept full RGB is that values which should read as near-black display as grey instead (e.g. the values 19-20-21 are almost black using limited range, where black starts at 16, but steps 19-20-21 are grey if you use full range, etc.).
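The 16-235 convention described above amounts to a simple linear remap. A minimal sketch in Python, using the standard BT.601/BT.709-style scaling (not tied to any particular console):

```python
# Linear remap between full-range (0-255) and limited/video-range
# (16-235) values, per the convention described above.

def full_to_limited(v):
    """Map a full-range value 0-255 onto the 16-235 limited range."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range value back onto 0-255, clipping the
    below-black (under 16) and above-white (over 235) regions."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))  # 16 235
# A value of 10 is 'blacker than black' in limited range, so a
# full-range conversion crushes it to 0; conversely, a full-range
# black (0) fed to a limited-range display sits below its black point.
print(limited_to_full(10))                       # 0
```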

This is an example of a full range image, represented step by step :

blacktest.png


In the picture above you can see that step 20, which is close to step 16 (absolute black on limited RGB), should be black using standard RGB, but it is grey instead. :rolleyes:

BUT if you display this image on a limited-range TV (steps 16 to 235), you should hardly see steps 15 and under. If that happens, no worries, it's not your fault; you are viewing a full-range picture on a standard RGB/limited-range display.
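A step-wedge test image like the one above is easy to generate yourself; a sketch using only the standard library, writing a binary PGM greyscale file (the filename and dimensions are arbitrary choices):

```python
# Generate a greyscale step wedge: one 16px-wide column per level 0..40.
# On a correctly set-up limited-range display, the columns below level
# 16 should all look like the same black; on full range each column
# from 0 upward is a distinct shade.

STEPS = 41           # grey levels 0..40, spanning the disputed region
COL_W = 16           # width of each column in pixels
HEIGHT = 64

def write_wedge(path):
    width = STEPS * COL_W
    # One scanline: each grey level repeated COL_W times.
    row = bytes(level for level in range(STEPS) for _ in range(COL_W))
    with open(path, "wb") as f:
        f.write(f"P5\n{width} {HEIGHT}\n255\n".encode("ascii"))
        f.write(row * HEIGHT)          # repeat the row to full height

write_wedge("blacktest.pgm")
```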

If in doubt always use Limited range and the image will look good to everyone regardless of the TV.

That's why full range sucks so much, despite DF treating it as if it were the Holy Grail, which is not the way to go.
 
Because developers have spent the last couple of years learning how to specifically optimise their games for FX6300 / GTX 760 based systems with the benefits of a low level and low overhead API. Oh no wait...
Okay, still... they know what they're getting: a good CPU and a lot of video memory. That's not a luxury you have with the Xbox One and the eSRAM.

If used well, the Xbox One would give that PC a very hard time. It is in the article, though: the settings don't match those of the console, but the experience is better...

Of course, that PC has the upgradability factor, so the PC outpaces the current consoles, I believe, especially at the current rate of development.

But in the future there'll be newer consoles that can beat the current home pc. It's a cycle. Then the pc outpaces consoles again.

And on and on.

In hardware, no doubt, but the PC has a ton of spare CPU power to burn, so any advanced game audio that the One might implement on SHAPE can probably be replicated by the PC's CPU.
Give me specialised hardware any day of the week. 'Cos yes, that's really nice: a PC CPU could produce great sound, but (dunno what bkillian might think) many of the possibilities of SHAPE couldn't be replicated on a PC.
 
Lego the Movie next-gen Digital Foundry face-off:

http://www.eurogamer.net/articles/digitalfoundry-2014-lego-the-movie-next-gen-face-off

A native 1080p presentation is handed in on both PS4 and Xbox One once again...post-process anti-aliasing is used across both consoles, with the PS4 getting an ever-so-slightly stronger version of the effect

Well, they're wrong. Both use the exact same AA but PS4 has a better horizontal resolution. I suspect native 1920x1080 resolution for XB1 vs 1920x1200 for PS4:

XB1 up, PS4 bottom respectively:

crooped_lego_movie_XO_009_bmp.png

crooped_lego_movie_PS4_009_bmp.png


Second example:

crooped_lego_movie_XO_008_bmp.png

crooped_lego_movie_PS4_008_bmp.png


Another:

crooped_lego_movie_XO_002_bmp.png

crooped_lego_movie_PS4_002_bmp.png


Original links of DF images:

http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_008.bmp.jpg
http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_008.bmp.jpg
http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_002.bmp.jpg
http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_002.bmp.jpg

On those images it's obvious the horizontal resolution is different; within 5 seconds I could tell PS4 had a better horizontal resolution. How could they miss the PS4 supersampling?
 
If the game is rendering to the full colour space, and the video output pipeline doesn't wreck that, full range is better.

Limited range is a crappy, hacky solution in the age of digital output. Anyone wanting to use their PC with limited range for gaming has a screw loose.
 