Digital Foundry Article technical discussion

Discussion in 'Console Technology' started by Shifty Geezer, Nov 14, 2008.

  1. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    907
    Location:
    On my rock
    Why would you automatically make the assumption that a drop to 720p would improve the frame rate? Do you have access to the profiler and know that it's pixel bound in those scenarios?
     
  2. Inuhanyou

    Regular

    Joined:
    Dec 23, 2012
    Messages:
    749
    Location:
    New Jersey, USA
    Of course no one thinks performance is automatically tied to resolution in all cases. But it's such a marginal bump, there's literally no reason for it to really exist outside of saying your game isn't 720p. They obviously can't get it to 900p without compromises to begin with, so going for a less than half-hearted approach just seems silly.
     
  3. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,201
    900p is such a marginal bump over 792p there's literally no reason for it to really exist outside of saying your game isn't 792p. 900p means you obviously can't get it to 1440p* without compromises to begin with, so going for a less than half-hearted approach just seems silly.

    *lolsigh
     
  4. Inuhanyou

    Regular

    Joined:
    Dec 23, 2012
    Messages:
    749
    Location:
    New Jersey, USA
    I understand what you're saying. It just seems like they aren't even close to their target, so the effort would have been more wisely spent elsewhere instead of spending all that time trying to reassure people the resolution wasn't final. I think the framerate should be the most important priority by far, considering its state on X1, especially taking into account that they have repeatedly said that 'framerate is king'.

    It doesn't seem like it.

    Also...why did you skip 1080p and go to 1440?? :/
     
  5. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    7,501
    Location:
    Cleveland
    Because everyone knows 1080p is just a compromise of 1440p which is just a compromise of 4K res.
     
  6. dobwal

    Veteran

    Joined:
    Oct 26, 2005
    Messages:
    4,390
    Titanfall frame drops are understandable once you realize how hectic the game can get, especially in "Last Titan Standing". When you have 12 titans dropping rocket salvos, firing off their primary weapons, dashing to and fro, dropping smoke, going for melee attacks, going nuclear, and all of that happening in a confined area (this mode is usually pack on pack with very little lone-wolfing), I doubt any game could handle that type of scenario without frame drops.
     
  7. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,627
    You might be 100% right if it weren't for the scaling issues. AFAIK very few people have a native 792p screen, or 1440p, or 4K. Apart from the obvious higher resolution and therefore better picture quality, there are other reasons for wanting the superior resolution. Thankfully the incredibly bad scaling job on the XB1 has been improved, but it's still no match for native resolution.

    When I play this game on my PC I would love to be able to drop the resolution to get a better and more steady FPS (aka Xbox One mode), but thanks to the native resolution of the monitors on my PC and my laptop it's simply not an option; it looks like shit when I do that.
     
  8. Dr Evil

    Dr Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,291
    Location:
    Finland
    Have you tried applying the setting in the control panel to make the GPU do the scaling? It should work a lot better than letting the monitor do it.
     
  9. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,627
    Yes, it helps compared to the monitors I have, but I only use it on my notebook now and then. It's still not native, which was my point (somewhat Captain Obvious, I know, sorry). But thanks for the tip anyway.
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    35,039
    Location:
    Under my bridge
    720p = 921600 pixels
    792p = 1115136 pixels
    900p = 1440000 pixels
    1080p = 2073600 pixels
    1440p = 3686400 pixels

    792p = 21% increase over 720p
    900p = 29% increase over 792p (56% increase over 720p)
    1080p = 44% increase over 900p (125% increase over 720p)
    1440p = 78% increase over 1080p (300% increase over 720p)

    Although it's easy to dismiss next-step resolutions as marginal increases, they actually represent significant percentage increases. And even then, the visual results are questionable (how much better, really, is 1080p than 900p in the eyes of most gamers?). Titanfall is already struggling at 792p. What do those 21% extra pixels get you that contributes in a noticeable way on screen? If it's a compromise that doesn't benefit the experience at all, it was the wrong one. 720p with a higher framerate, or whatever, would likely be a better experience.

    Of course, if the resolution isn't the bottleneck here, an extra 20% or so of pixels could be a freebie. Respawn may have been targeting 720p and found they could give a little extra. We don't really know. I don't disagree with Inuhanyou's thinking though. 792p is a marginal increase that'll lead to more blurring on 720p-native sets and no significant visual advantage on other displays, so one has to wonder why choose that resolution. I won't go so far as to suggest that it's only to avoid the 'last-gen 720p' label, but I wouldn't try to counter-argue that every resolution increase is marginal, because they're not. Especially compared to 720p, which is an option for any game wanting to target smooth, high framerates.
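
    For anyone who wants to re-run those numbers, here's a quick sketch, assuming the usual 16:9 widths (1280, 1408, 1600, 1920, 2560):

```python
# Pixel counts for the 16:9 resolutions above, plus the step-to-step increases.
resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
names = list(pixels)

for prev, cur in zip(names, names[1:]):
    step = 100 * (pixels[cur] / pixels[prev] - 1)      # increase over the previous step
    total = 100 * (pixels[cur] / pixels["720p"] - 1)   # cumulative increase over 720p
    print(f"{cur}: {pixels[cur]} px, +{step:.0f}% over {prev}, +{total:.0f}% over 720p")
```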
     
  11. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    12,791
    @tkf you could turn off scaling entirely; you'd have black borders around the outside but no scaling blurriness.

    edit:
    Just watched TotalBiscuit's review on YouTube, since he's one of the few people who goes through the options in a review, and there seem to be a good few things you could turn down instead of the resolution to improve the framerate, e.g. AA, ragdolls, impact marks or shadows.

    PS: he says he gets a constant 120fps, but he does have 2 Titans (that's Nvidia Titans, obviously ;))
     
  12. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    5,882
    Not when they said the experience would be better on a PC. That machine shouldn't hold a candle to a mature Xbox One, and the Xbox One is still an immature console.

    Heck, even the PS4 is; look at the library of games... The X1 in that sense is ahead, but not by much.


    They used different PCs, one of them was quite powerful, hence unfair.

    Alas, that's the sound most PCs have these days. The Xbox One is much more capable than anything else sound-wise, something that is always forgotten and swept under the rug when these unfair comparisons are made.


    If you are going to use a TV to play, limited range is undoubtedly the best choice. If you are going to use a computer monitor it is a matter of preference. Still... full range sucks quite a bit.

    Microsoft recommend on their Xbox.com site to use Standard Range, ‘cos for a TV it is best, and you will never have problems with that range.

    https://support.xbox.com/en-US//xbox-one/system/adjust-display-settings

    Why is it the best advice to NEVER use full range on a TV?

    Limited range works on all televisions, and basically almost all the video material you can see is created with standard RGB in mind; it's usually the original format in which that video material was created. Moreover, many, many TVs made in 2013 and 2014 don't even support full range.

    That being said, and since this is a tech forum, I shall explain what limited/full range are as I understand them.

    Limited range (or standard RGB) and full range are two ways of defining the values of black and white. Full range spans 0 to 255. That is, counting 0 as the first step, there are 256 steps from black to white. :)

    In contrast, standard RGB (or limited RGB) has 36 fewer steps than full RGB, and absolute black to absolute white ranges from 16 to 235.

    In other words, with standard RGB the value of black is 16, which is the first step, and absolute white sits at step 235.

    So why choose limited range on a TV, and what problems may arise if you don't? First, basically all the movies, videos and other material you see on DVD or Blu-ray is encoded in YCbCr and limited range...

    Furthermore, the problem with choosing full range on a TV that does not accept full RGB is that values around typical "black" end up displayed wrongly (e.g. the values 19-20-21 are almost black using limited range, where black starts at 16, but those same 19-20-21 steps read as grey if you treat them as full range, etc.).
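
    Since this is a tech forum, here's a minimal sketch of the remap itself in Python (my own illustration, assuming plain 8-bit levels and ignoring any YCbCr conversion):

```python
# Full range uses codes 0-255; limited/video range uses 16-235 (220 of the 256 codes).

def full_to_limited(v):
    """Remap a full-range value (0..255) into limited range (16..235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Remap a limited-range value (16..235) back to full range (0..255),
    clamping anything outside the nominal black/white points."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))    # 16 235: black and white land on the video levels
print(limited_to_full(16), limited_to_full(235))   # 0 255
# The mismatch described above: code 20 is essentially black as a limited-range value,
# but a display expecting full range shows it as a faint dark grey instead.
```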

    This is an example of a full-range image, represented step by step:

    [image: full-range greyscale step chart]

    In the picture above you can see that step 20, which is close to step 16 (absolute black in limited RGB), should be black using standard RGB, but it is grey instead. :roll:

    BUT if you display this image on a limited-range TV (steps 16 to 235) you should barely see steps 15 and under. If that happens, no worries, it's not your fault; you are viewing a full-range picture on a standard RGB/limited-range display.

    If in doubt, always use limited range and the image will look good to everyone regardless of the TV.

    That's why full range sucks so much, despite DF treating it as if it were the Holy Grail, which is not the way to go.
     
  13. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    5,882
    Okay, still... they know what they're getting: a good CPU and a lot of video memory. That's not a luxury you have with the Xbox One and its eSRAM.

    If used well, the Xbox One would give that PC a very hard time. It is in the article, though: the settings don't match those of the console, but the experience is better...

    Of course, that PC has the upgradability factor, so the PC outpaces the current consoles, I believe, especially at the current rate of development.

    But in the future there'll be newer consoles that can beat the current home PC. It's a cycle. Then the PC outpaces consoles again.

    And on and on.

    Give me specialised hardware any day of the week, 'cos yes, that's really nice; a PC CPU could produce great sound, but (dunno what bkillian might think) many of the possibilities of SHAPE couldn't be replicated on a PC.
     
  14. Globalisateur

    Globalisateur Globby
    Veteran Regular

    Joined:
    Nov 6, 2013
    Messages:
    1,888
    Location:
    France
    Lego the Movie next-gen Digital Foundry face-off:

    http://www.eurogamer.net/articles/digitalfoundry-2014-lego-the-movie-next-gen-face-off

    Well, they're wrong. Both use the exact same AA, but the PS4 has better horizontal resolution. I suspect a native 1920x1080 resolution for the XB1 vs 1920x1200 for the PS4:

    XB1 top, PS4 bottom, respectively:

    [image: XB1 screenshot]
    [image: PS4 screenshot]

    Second example:

    [image: XB1 screenshot]
    [image: PS4 screenshot]

    Another:

    [image: XB1 screenshot]
    [image: PS4 screenshot]

    Original links of DF images:

    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_008.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_008.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/XO_002.bmp.jpg
    http://cfa.gamer-network.net/2013/articles//a/1/6/6/2/7/1/1/PS4_002.bmp.jpg

    On those images it's obvious the horizontal resolution is different; within 5 seconds I could tell the PS4 had better horizontal resolution. How could they miss the PS4 supersampling?
     
  15. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,201
    If the game is rendering to the full colour space, and the video output pipeline doesn't wreck that, full range is better.

    Limited range is a crappy, hacky solution in the age of digital output. Anyone wanting to use their PC with limited range for gaming has a screw loose.
     
  16. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,425
    Location:
    Guess...
    No, it's the exact opposite. With a PC you have no idea what you're going to get. You just develop the game to be as scalable as possible and hope whatever hardware configuration it lands on can cope. Titanfall is a perfect example of that, with its use of uncompressed audio to accommodate low-end dual-core CPUs. The XB1 is the exact opposite, in that the developers know EXACTLY what hardware they need to target and have access to that hardware at a much lower level than afforded by DX11 on the PC.

    As for video memory, you couldn't have picked a worse example. That's probably the X1's only big advantage over mid-range gaming PCs. With the X1 you know you have 5GB of memory available to do what you want with. On the PC you have to build the game in such a way that it can run on GPUs with as little as 1GB of dedicated video memory.

    Why would the XB1 give that PC a hard time? Do you have any idea how they actually compare on a technical level or is this just your opinion based on what you wish to be true rather than hard evidence?

    The GTX 760 doubles, and in some cases more than doubles, the GPU in the XB1 in almost every way. The only exception is memory bandwidth, but the XB1 only has an advantage there if you assume the eSRAM is used to 100% of its theoretical potential (which would not be the case in the real world). Obviously it also falls short on the amount of local video memory available.

    I'm not sure what you're saying in the second part of your comment. The settings are the same apart from texture resolution, which is lower on the PC (on account of it being limited to 2GB of video memory). Other than that, the PC runs at a much higher resolution and has a higher framerate.

    So your argument is to just wait another 8 years for the next round of consoles to be more powerful than PCs? Even if someone were willing to do that, they are likely to be disappointed. Just as this generation was already significantly outpaced by gaming PCs of the time when it launched, it's pretty likely that if there is a next generation of consoles the same situation will be true again. The bottom line is that there will likely never again be a console launch where the console is more powerful than gaming PCs.

    Actually it's the other way around. You can do pretty much anything on a CPU, whereas SHAPE is limited to what's pre-programmed into the hardware. As far as I understand it, SHAPE doesn't allow the developer to program custom audio routines into their engines, unlike both a CPU and TrueAudio, so in that respect, although it's very fast, SHAPE definitely wouldn't be a substitute for those solutions.
     
  17. function

    function Wrong thread
    Veteran

    Joined:
    Mar 27, 2003
    Messages:
    4,201
    I quite agree! I was trying to use the original language to demonstrate that once you step outside the 720/1080 mindset you basically have choices about detail vs load.

    Ryse looked amazing at 900p, but the frame rate suffered. Lower might have been better.

    We don't know, but it's fun to speculate, and I'd speculate that while in many cases the resolution does cause a lower frame rate, the most grievous and sudden drops (the sub-20 fps ones) are CPU-related, and that they probably figure they might as well get the extra detail.

    I disagree with Inuhanyou and yourself on this one. 900p is roughly as marginal an increase over 792p as 792p is over 720p.

    Additionally, most 720p sets (and indeed most TV sets) aren't set to 1:1 pixel mapping, and many '720p' sets aren't actually 1280 x 720. Plasmas - god's own TV choice - are all 768 lines, for instance. And even my glorious 768 plasma defaults to 1080p input.

    I'd speculate that the number of 1280 x 720 panels receiving a 1280 x 720 input, that have been set to 1:1 pixel mapping, and that are going to be used as an Xbox One gaming display, is insignificant compared to the number of sets that will benefit from having a more detailed image.

    792p will provide more detail for any set taking a 1080 input and will also provide a degree of supersampling for 720/768 TV sets. It will even 'overpower' the overscan on those 720p panels that aren't set to 1:1 (which will be, like, almost all of them).
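
    A rough back-of-the-envelope check (my own numbers, assuming a 1408x792 framebuffer and typical panel resolutions):

```python
# How a 1408x792 framebuffer compares, pixel for pixel, with common panels.
render_w, render_h = 1408, 792
panels = {
    "1280x720 (true 720p panel)":   (1280, 720),
    "1366x768 (typical 768 panel)": (1366, 768),
    "1920x1080 (full HD panel)":    (1920, 1080),
}

render_px = render_w * render_h
for name, (w, h) in panels.items():
    ratio = render_px / (w * h)
    note = "mild supersampling when downscaled" if ratio >= 1 else "upscaled to fit"
    print(f"{name}: {ratio:.2f}x the panel's pixels -> {note}")
```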

    DF should do an article investigating this! :D
     
  18. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    8,011
    Location:
    London, UK
    If the source material is full RGB, your device can send full RGB and your display is capable of displaying full RGB (and is calibrated), then you will get a fuller image going full. Using limited RGB will result in colour space remapping, which costs you roughly 14% of your available range (36 of the 256 codes per channel).

    I don't know what the Xbox offers, but PlayStation offers limited (forces limited always), full (forces full always) or automatic. Automatic is the best: it'll check the display device (DVI/HDMI only) and see if it's capable of full range. If it is, and the source material is full, it'll use full; if not, it'll use limited. You can test this by checking your TV's service menu.
     
  19. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    5,882
    What can I say? You're a PC snob... That was overkill. PC gamers have always been, to me, the *rich upper class* of gaming. Literally, pjbliverpool, you always expect your games to be longer, deeper, better-looking, easier to configure and control... mainly just plain better than anything else.

    As for the sound, I dare you: run a sound benchmark on any wimpy little 2GHz CPU and compare it to what SHAPE could produce. Then there are the GPUs... ugh... The Xenos 2, so to speak, kicks the pants off many PC GPUs. Regardless, PC games sometimes are amazing, not saying otherwise. And PC hardware is better still. Besides that, you can upgrade. For the Xbox One, yes, you are right that the games are ALL built around one hardware profile, but it is an evolving console too; when they free up an extra 8% of GPU then things might get even more interesting, and they don't risk incompatibility issues by doing that. You are informed about the PC world and your PCs are built to be better, I am not disputing that, but you have a long history of having the best of the best GPUs, which leads me to the point that DF comparisons aren't fair. You are very happy with those comparisons because you love the PC, okay... I forgive you for that.

    Besides, in my opinion there is a big difference between sitting 3 metres away from your TV using a handheld gamepad and sitting 1 metre away from a PC monitor playing with a desk-mounted mouse and keyboard combo, so it is not really comparable. Yet DF insists on it, and talks crap about the Xbox One. Again, unfair.

    Those are very biased comparisons that don't make people happy, at least not me. You know the terms they mention and talk about because you are geeky like most PC gamers (not saying that in a negative tone), and you understand what AA is, or SMAA, AF, what a bug is (console games weren't usually buggy in my Xbox days), and so on and so forth.

    The point is that with a console, you pop a game in, it's ready to go.

    EDIT: forgot to mention that the PC doesn't have to deal with a small pool of RAM (like the eSRAM), unlike the Xbox One.

    DSoup and function, I gotta go soon, but I will just say that something which isn't compatible with every single device out there isn't worth it. Sure, I play all games on my desktop and laptop PCs at full range, but those displays aren't standard displays or TVs; they are standard RGB compatible, I am sure, although I haven't tested it.

    The Xbox One is a console made for the TV, so their recommendation telling people that enabling standard RGB might be beneficial is understandable. My TV from 2013 doesn't have a service menu, not that I know of; the manual doesn't mention it.
     
  20. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    8,011
    Location:
    London, UK
    By your logic, games should not support anything more than basic stereo sound because surround isn't standard for everybody. Similarly, it was folly for the PS3 and 360, let alone the PS4 and One, to support 1080p because not everybody has a 1080p TV.

    There are still cruddy displays out there, but the Samsung I bought for my PS3 in 2007 supported full RGB, as did the Sony I replaced it with in 2010. I do find your attitude somewhat bizarre coming from somebody who has posted and posted and posted about their attempts to calibrate their TV for the best possible image quality.

    What's the point of having, IIRC, a 36-bit, 69-billion-colour panel if you're going to remap a 16.7m colour space into an effective ~10.6m colour space? :???:
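
    For the record, the per-channel arithmetic behind those figures (assuming straight 8-bit RGB with no dithering) looks like this:

```python
full_levels = 256                  # 0-255 per channel
limited_levels = 235 - 16 + 1      # 16-235 per channel = 220

print(full_levels ** 3)            # 16,777,216 RGB triplets in full range
print(limited_levels ** 3)         # 10,648,000 once everything is squeezed into 16-235
print(f"{(1 - limited_levels / full_levels) * 100:.1f}% of codes per channel go unused")  # ~14.1%
```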
     
