Digital Foundry Article Technical Discussion Archive [2014]

Discussion in 'Console Technology' started by DieH@rd, Jan 11, 2014.

Thread Status:
Not open for further replies.
  1. damienw

    Regular

    Joined:
    Sep 29, 2008
    Messages:
    478
    Likes Received:
    42
    Location:
    Seattle
    What forum do you think you're on? Because this is a tech forum, and the observation that playability suffers when the framerate varies from 60 down to single digits, with gobs of screen tear, is most certainly a proper analysis.
     
  2. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,415
    Likes Received:
    38
    Location:
    Somewhere out there
    Average frame rate would work if they actually had a benchmark. Otherwise it's open to even more abuse.
     
  3. djskribbles

    Legend Veteran

    Joined:
    Jan 27, 2007
    Messages:
    5,198
    Likes Received:
    615
    Not going to comment on how it actually feels as I haven't played it, but it depends on what you consider a frame drop. The framerate drops quite often into the 40s and sometimes the 30s; for at least some people, that might be considered 'playing poorly'. Tomb Raider on PS4 has similar performance and I've seen some (albeit few) people complain, even though it matters much less there than in a twitch shooter like Titanfall.
     
  4. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    509
    Likes Received:
    165
    This is a thread about games tech, and a game performing as this one does is bound to attract comment. If it's fun anyway (and I haven't read anyone say it isn't) then great, but I'm fascinated by why it doesn't reach the goal that Respawn have always emphasised, which is 60fps. I'm further intrigued by the choice of 1408x792 as a resolution; to my knowledge it's literally never been used before, and it doesn't seem to offer many benefits over 720p when the framerate still isn't locked at 60fps, nor their old 'perceptual 60fps' concept.
     
    #904 Lalaland, Mar 16, 2014
    Last edited by a moderator: Mar 16, 2014
  5. function

    function None functional
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,943
    Likes Received:
    1,940
    Location:
    Wrong thread
    The resolution chosen may be related to esram size, e.g. the largest buffer they could use without tiling the primary render target.

    Not all of the frame rate drops may be due to GPU load.

    They had to beat the engine into shape to handle everything they wanted. On the PC, with its vastly, vastly superior per-thread performance (3+ times faster), this might not have been an issue. On console the game might be choking on a single thread, meaning they figured they might as well bump up the resolution, as the very worst drops (~20 fps with 12 Titans) weren't GPU related.
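    To put rough numbers on the esram idea: a back-of-envelope sketch, assuming a 32bpp colour target plus a 32bpp depth buffer and the 2x MSAA reported for the shipping game. Respawn's actual buffer layout is unknown, so this only illustrates how a primary target's footprint scales against the 32 MiB of esram.

    ```python
    # Back-of-envelope: primary render target footprint vs 32 MiB of esram.
    # Assumed costs: 32bpp colour + 32bpp depth, 2x MSAA. This layout is a
    # guess for illustration, not Respawn's confirmed setup.
    ESRAM_BYTES = 32 * 1024 * 1024

    def primary_target_bytes(w, h, msaa=2, bpp_colour=4, bpp_depth=4):
        # MSAA multiplies the per-pixel storage of both buffers.
        return w * h * (bpp_colour + bpp_depth) * msaa

    for name, (w, h) in {"720p": (1280, 720), "792p": (1408, 792),
                         "900p": (1600, 900), "1080p": (1920, 1080)}.items():
        used = primary_target_bytes(w, h)
        print(f"{name}: {used / 2**20:.1f} MiB ({used / ESRAM_BYTES:.0%} of esram)")
    ```

    Under those assumed costs even 1080p only just fits (~31.6 MiB), so whatever else the engine keeps resident in esram (shadow maps, intermediate buffers) would be the deciding factor.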
     
  6. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    926
    Likes Received:
    39
    Location:
    On my rock
    Why would you automatically make the assumption that a drop to 720p would improve the frame rate? Do you have access to the profiler and know that it's pixel bound in those scenarios?
     
  7. Inuhanyou

    Regular

    Joined:
    Dec 23, 2012
    Messages:
    785
    Likes Received:
    48
    Location:
    New Jersey, USA
    Of course no one thinks performance is automatically tied to resolution in all cases. But it's such a marginal bump, there's literally no reason for it to exist outside of saying your game isn't 720p. They obviously can't get it to 900p without compromises to begin with, so going for a less than half-hearted approach just seems silly.
     
  8. function

    function None functional
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,943
    Likes Received:
    1,940
    Location:
    Wrong thread
    900p is such a marginal bump over 792p, there's literally no reason for it to exist outside of saying your game isn't 792p. 900p means you obviously can't get it to 1440p* without compromises to begin with, so going for a less than half-hearted approach just seems silly.

    *lolsigh
     
  9. Inuhanyou

    Regular

    Joined:
    Dec 23, 2012
    Messages:
    785
    Likes Received:
    48
    Location:
    New Jersey, USA
    I understand what you're saying. It just seems like they aren't even close to their target, so the effort would have been more wisely spent elsewhere instead of spending all that time trying to reassure people the resolution wasn't final. I think the framerate should be by far the most important priority considering its state on X1, especially taking into account that they have repeatedly said that 'framerate is king'.

    It doesn't seem like it.

    Also...why did you skip 1080p and go to 1440?? :/
     
  10. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    10,355
    Likes Received:
    5,758
    Location:
    Cleveland
    Because everyone knows 1080p is just a compromise of 1440p which is just a compromise of 4K res.
     
  11. dobwal

    Veteran

    Joined:
    Oct 26, 2005
    Messages:
    4,831
    Likes Received:
    844
    Titanfall's frame drops are understandable once you realize how hectic the game can get, especially in "Last Titan Standing". When you have 12 titans dropping rocket salvos, firing off their primary weapons, dashing to and fro, dropping smoke, going for melee attacks, going nuclear, and all of that happening in a confined area (this mode is usually pack on pack with very little lone wolfing), I doubt any game could handle those kinds of scenarios without frame drops.
     
    #911 dobwal, Mar 16, 2014
    Last edited by a moderator: Mar 16, 2014
  12. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    You might be 100% right if it weren't for the scaling issues. AFAIK very few people have a native 792p screen, or a 1440p or 4K one for that matter. Apart from the obvious higher resolution and therefore better picture quality, there are other reasons for wanting the superior resolution. Thankfully the incredibly bad scaling job on the XB1 has been improved, but it's still no match for a native resolution.

    When I play this game on my PC I would love to be able to drop the resolution to get a better and steadier FPS (aka Xbox One mode), but thanks to the native resolution of the monitors on my PC and my laptop it's simply not an option; it looks like shit when I do that.
     
  13. Dr Evil

    Dr Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,633
    Likes Received:
    624
    Location:
    Finland
    Have you tried applying the setting in the control panel that makes the GPU do the scaling? It should work a lot better than letting the monitor do it.
     
  14. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    Yes, it helps compared to letting the monitors I have do it, but I only use it on my notebook now and then. Still not native, which was my point (somewhat Captain Obvious, I know, sorry). But thanks for the tip anyway.
     
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    38,307
    Likes Received:
    7,756
    Location:
    Under my bridge
    720p (1280x720) = 921,600 pixels
    792p (1408x792) = 1,115,136 pixels
    900p (1600x900) = 1,440,000 pixels
    1080p (1920x1080) = 2,073,600 pixels
    1440p (2560x1440) = 3,686,400 pixels

    792p = 21% increase over 720p
    900p = 29% increase over 792p (56% increase over 720p)
    1080p = 44% increase over 900p (125% increase over 720p)
    1440p = 78% increase over 1080p (300% increase over 720p)

    Although it's easy to state next-step resolutions as marginal increases, they actually represent significant percentage increases. And even then, the visual results are questionable (how much better, really, is 1080p than 900p in the eyes of most gamers?). Titanfall is already struggling at 792p. What do those 21% extra pixels get you that contributes in a noticeable way on screen? If it's a compromise that doesn't benefit the experience at all, it was the wrong one. 720p with a higher framerate or whatever would likely be a better experience.

    Of course, if the resolution isn't the bottleneck here, the extra 21% of pixels could be a freebie. Respawn may have been targeting 720p and found they could give a little extra; we don't really know. I don't disagree with Inuhanyou's thinking, though. 792p is a marginal increase that'll lead to more blurring on 720p native sets and no significant visual advantage on other displays, so one has to wonder why they chose that resolution. I won't go so far as to suggest it's only to avoid the 'last gen 720p' label, but I wouldn't counter-argue that every resolution increase is marginal, because they're not, especially compared to 720p, which is an option for any game wanting to target smooth, high framerates.
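    For anyone who wants to check the arithmetic, a trivial sketch reproducing the counts and percentages above (the width/height pairs are the standard 16:9 figures, plus Titanfall's 1408x792):

    ```python
    # Pixel counts and step-to-step increases for the resolutions discussed.
    resolutions = [("720p", 1280, 720), ("792p", 1408, 792),
                   ("900p", 1600, 900), ("1080p", 1920, 1080),
                   ("1440p", 2560, 1440)]

    base = 1280 * 720    # 720p as the baseline
    prev = None
    for name, w, h in resolutions:
        pixels = w * h
        vs_prev = f", +{pixels / prev - 1:.0%} over the previous step" if prev else ""
        print(f"{name}: {pixels:,} pixels (+{pixels / base - 1:.0%} vs 720p{vs_prev})")
        prev = pixels
    ```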
     
  16. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,302
    Likes Received:
    1,803
    @tkf you could turn off scaling entirely; you'd have black borders around the outside, but no scaling blurriness.

    edit:
    Just watched TotalBiscuit's review on YouTube, since he's one of the few people who goes through the options in a review, and there seem to be a good few things you could turn down instead of the res to improve framerate, e.g. AA, ragdolls, impact marks or shadows.

    ps: he says he gets a constant 120fps, but he does have 2 Titans (that's Nvidia Titans, obviously ;))
     
  17. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    7,767
    Likes Received:
    1,632
    Not when they said the experience would be better on a PC. That machine shouldn't hold a candle to a mature Xbox One, and the Xbox One is still a premature console.

    Heck, even the PS4 is premature; look at its library of games... the X1 in that sense is ahead, but not by much.


    They used different PCs, and one of them was quite powerful, hence unfair.

    Alas, that's the sound most PCs have these days. The Xbox One is much more capable than anything else sound-wise, something that is always forgotten and swept under the rug when these unfair comparisons are made.


    If you are going to use a TV to play, limited range is undoubtedly the best choice. If you are going to use a computer monitor it is a matter of preference. Still... full range sucks quite a bit.

    Microsoft recommend on their Xbox.com site to use standard range, 'cos for a TV it is best, and you will never have problems with that range.

    https://support.xbox.com/en-US//xbox-one/system/adjust-display-settings

    Why is it the best advice to NEVER use full range on a TV?

    Limited range works on all televisions, and basically almost all the video material you can watch is created with standard RGB in mind, as that is usually the original format in which the material was created. Moreover, many, many TVs made in 2013 and 2014 don't even support full range.

    That being said, and since this is a tech forum, I shall explain what limited and full range are, as I understand them.

    Limited range (or standard RGB) and full range are two existing ways of defining the values of black and white. Full range is set to 0 to 255; that is, counting 0 as the first step, there are 256 steps from black to white. :)

    In contrast, standard RGB (or limited RGB) features 36 fewer steps than full RGB, and runs from absolute black to absolute white over the values 16 to 235.

    In other words, with standard RGB the value of black is 16, which is the first step, and absolute white is placed at step 235.

    So why choose limited range on a TV, and what problems may arise if you don't? First, basically all the movies, videos and other material you see on DVD or Blu-ray is encoded in YCbCr and limited range...

    Furthermore, the problem with choosing full range on a TV that does not accept full RGB is that you would see values that should be typical 'black' appear grey instead (e.g. the values 19-20-21 are almost black using limited range, where black starts at 16, BUT steps 19-20-21 etc. are grey if you use full range...).
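    To make the two ranges concrete, here is a minimal sketch of the value mapping (the function names are illustrative, not from any particular API):

    ```python
    # Map an 8-bit full-range value (0-255) to limited/"standard" range
    # (16-235) and back. Illustrative only.
    def full_to_limited(v):
        """Scale full-range [0, 255] into limited-range [16, 235]."""
        return round(16 + v * 219 / 255)

    def limited_to_full(v):
        """Scale limited-range [16, 235] back to full range, clamping the
        below-black (<16) and above-white (>235) excursions first."""
        v = min(max(v, 16), 235)
        return round((v - 16) * 255 / 219)

    print(full_to_limited(20))    # -> 33
    print(limited_to_full(33))    # -> 20 (round trip)
    # A display expecting limited range treats 16 as black, so content and
    # display must agree on the range in use, or blacks and greys shift
    # exactly as described above.
    ```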

    This is an example of a full-range image, represented step by step:

    [image: full-range greyscale step chart]

    In the picture above you can see that step 20, which is close to step 16 (absolute black on limited RGB), should be black using standard RGB, but it is grey instead. :roll:

    BUT if you display this image on a limited-range TV (steps 16 to 235) you should hardly see steps 15 and under. If that happens, no worries, it's not your fault; you are viewing a full-range picture on a standard RGB/limited-range display.

    If in doubt always use Limited range and the image will look good to everyone regardless of the TV.

    That's why full range sucks so much, despite DF treating it as if it were the Holy Grail, which is not the way to go.
     
  18. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    7,767
    Likes Received:
    1,632
    Okay, still... they know what they're getting: a good CPU and a lot of video memory. That's not a luxury you have with the Xbox One and its eSRAM.

    If used well, the Xbox One would give that PC a very hard time. It is in the article, though: the settings don't match those of the console, but the experience is better...

    Of course, that PC has the upgradability factor, so the PC outpaces the current consoles, I believe, especially at the current rate of development.

    But in the future there'll be newer consoles that can beat the current home PC. It's a cycle. Then the PC outpaces consoles again.

    And on and on.

    Give me specialised hardware any day of the week, 'cos yes, that's really nice: a PC CPU could produce great sound, but (dunno what bkillian might think) many of the possibilities of SHAPE couldn't be replicated on a PC.
     
  19. Globalisateur

    Globalisateur Globby
    Veteran Regular

    Joined:
    Nov 6, 2013
    Messages:
    2,563
    Likes Received:
    1,368
    Location:
    France
    Lego the Movie next-gen Digital Foundry face-off:

    http://www.eurogamer.net/articles/digitalfoundry-2014-lego-the-movie-next-gen-face-off

    Well, they're wrong. Both use the exact same AA, but the PS4 has a higher vertical resolution. I suspect a native 1920x1080 resolution for XB1 vs 1920x1200 for PS4:

    XB1 top, PS4 bottom in each pair:

    [images: three pairs of XB1 vs PS4 comparison crops]

    Original links to the DF images:

    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_008.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_008.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_002.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_002.bmp.jpg

    On those images it's obvious the vertical resolution is different; within about 5 seconds I could tell the PS4 had a higher resolution. How could they miss the PS4 supersampling?
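    If those suspected figures are right, the implied supersampling is easy to quantify. A small sketch using the resolutions guessed above (not numbers confirmed by DF):

    ```python
    # How much supersampling a 1920x1200 render downsampled to a 1920x1080
    # output would amount to. Both figures are the suspicion above, not
    # confirmed numbers.
    native_w, native_h = 1920, 1200    # suspected PS4 render resolution
    out_w, out_h = 1920, 1080          # the 1080p output frame

    scale_y = native_h / out_h
    extra = (native_w * native_h) / (out_w * out_h) - 1
    print(f"vertical downsample: {scale_y:.2f}x, extra pixels: {extra:.0%}")
    # -> vertical downsample: 1.11x, extra pixels: 11%
    ```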
     
  20. function

    function None functional
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,943
    Likes Received:
    1,940
    Location:
    Wrong thread
    If the game is rendering to the full colour space, and the video output pipeline doesn't wreck that, full range is better.

    Limited range is a crappy, hacky solution in the age of digital output. Anyone wanting to use their PC with limited range for gaming has a screw loose.
     