*spin-off* Ryse Trade-Offs

Discussion in 'Console Industry' started by shredenvain, Sep 17, 2013.

Thread Status:
Not open for further replies.
  1. FordGTGuy

    Newcomer

    Joined:
    Jul 15, 2013
    Messages:
    106
    Likes Received:
    0
You're taking Crytek's comment out of context; they said they didn't compromise Ryse because it has always been 900p.

    Was it a compromise to choose graphics over 1080p? Of course, but that is not what he is saying here.

    If it has always been 900p, though, the product itself was not compromised in the way people thought it was.
     
  2. Solarus

    Newcomer

    Joined:
    Jan 12, 2009
    Messages:
    156
    Likes Received:
    0
    Location:
    With My Brother
    What does he mean by an upscaler for AA? I didn't think you would get the appearance of AA by scaling an image up, only down. Or is he talking about something like a sharpening filter?
     
  3. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,418
    Likes Received:
    40
    Location:
    Somewhere out there
    You know as well as most people here that graphics fidelity and resolution go hand in hand in eating up resources. Upping one will undeniably decrease your capacity for the other.

    By choosing 900p, they will undeniably have more resources left for graphics fidelity, everything else being equal. So when they now suddenly say "we're lowering polygons" and "in fact we're running at 900p", there is no doubt they're reallocating resources to other areas of the game. Had they gone with 1080p instead of 900p, they would probably have had to turn certain things off. This is a compromise, and there's nothing wrong with that.


    Stop trying to put a spin on it, because a compromise is a compromise. Rewording it doesn't suddenly make it not so.
     
  4. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    I think he means "900p is not a compromise from the E3 demo", which people thought was 1080p.
     
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,156
    Likes Received:
    5,090
    What? From a fixed-hardware standpoint, 1080p is ALWAYS a compromise when it comes to what, how many, and how advanced the things you can render are. It isn't like the PC, where you can just throw more powerful hardware at the problem in order to do the things you could do at a lower resolution.

    You also basically gain nothing at typical living-room viewing distances on typical (30-60 inch) living-room TVs. Sure, you could make an argument for people playing their console on a desktop monitor or sitting 1 meter away from their TV (maybe; it depends on how good their upscale + AA algorithm is, and 900p to 1080p shouldn't present many upscale artifacts), but that's not the use case that console developers are coding for.
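
    As a sanity check, here's a back-of-the-envelope Python sketch. It assumes the common ~60 pixels-per-degree figure for 20/20 acuity, and the 50-inch screen and 2.5 m couch distance are just illustrative numbers, not anything measured:

```python
import math

def pixels_per_degree(diag_inches, horiz_px, distance_m, aspect=16 / 9):
    """Rough angular pixel density of a 16:9 TV as seen by the viewer."""
    # Screen width derived from the diagonal and the aspect ratio.
    width_m = diag_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_m = width_m / horiz_px
    deg_per_px = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / deg_per_px

# 50" TV viewed from 2.5 m; ~60 px/degree is the usual 20/20 acuity limit.
for horiz_px in (1600, 1920):
    print(horiz_px, round(pixels_per_degree(50, horiz_px, 2.5)))
# -> roughly 63 and 76 px/degree: both at or beyond what the eye resolves.
```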

    Just like 60 FPS versus 30 FPS: 60 FPS is quite obviously a compromise with regard to what can be rendered. But in that case, depending on the game type, there are actual benefits that will be felt by a large number of people, unlike the difference between 1080p and 900p.

    Anyway...

    So, basically, what you are trying to say is that every single game on every single console is a compromise, and so every single developer in the world should just come out and say that the graphics for their game are a compromise. Gotcha.

    And as others have noted, going from 900p (E3) to 900p (now) is certainly not a compromise between E3 and now.

    And yes, lowering the polygon count on in-game characters in order to boost the graphical effects applied to those characters, or the visible detail of the background, is a compromise in terms of raw numbers. But if the reduction in polygons is unnoticeable while the effects added are noticeable, is it really a compromise? Considering I have been vocal about how disappointed I am that ALL next-gen games still have easily seen straight polygon edges on characters, I'm not so sure. The 150k-poly character likely had just as many noticeable straight poly edges on character outlines as the 85k-poly character; they might have been smaller segments, however. And going back to my whole "living room" console experience, I may not notice them on a TV the way I do in screenshots on my monitor. But I'm not sure about that.

    Regards,
    SB
     
  6. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,418
    Likes Received:
    40
    Location:
    Somewhere out there
    What the heck is your argument? Nobody's saying 1080p won't require more resources than 900p or anything below.

    Surely you're not saying that they don't care about anybody sitting in front of a computer monitor. I do that, and I know a LOT of people who do. Just because you don't, and don't care much for those who do, doesn't mean they aren't important or that the devs can just ignore them.

    You're again alienating people you obviously don't care about.
    There are people who don't care whether it's 30 fps or 60 fps, while there are people who care more about resolution.

    If anybody says with a straight face that it's not a compromise, you know there's obviously something wrong. The fact that it is a compromise doesn't mean they have to tell you it is, but that doesn't excuse flat-out lying about it.

    "I care about what I see and only what I see and if I don't notice it it's irrelevant."


    If that is so, then it's fine, as long as the E3 version wasn't 1080p, which we probably don't have precise info on.
     
    #406 Strange, Sep 30, 2013
    Last edited by a moderator: Sep 30, 2013
  7. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,018
    Likes Received:
    1,024
    1080p is a TV format. Unless I'm mistaken, I don't remember 1080p being some desirable target resolution for game development, and I don't remember it having any special hold on the PC, which had hardware to support that resolution well before the term HDTV became part of the general technology lexicon.

    Is there a special reason why 1080p was chosen as a standard TV resolution, other than being the number of pixels that neatly fits into a dimension that readily supports movie formats? Movie formats seem to be shaped by the need to serve an audience rather than one particular set of eyes. If not, then I can see devs balancing resolution against a number of other variables to determine where resources should be allocated in order to produce the best visuals possible.

    The "44% more pixels" of 1080p over 900p isn't readily apparent to me when comparing images, especially weighed against the extra GPU processing power needed to shade roughly 630k more pixels.
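
    For reference, the raw numbers work out like this (a trivial Python sketch):

```python
# Raw pixel counts behind the 1080p vs. 900p debate.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_900p = 1600 * 900     # 1,440,000 pixels

print(px_1080p - px_900p)   # 633,600 extra pixels to shade per frame
print(px_1080p / px_900p)   # ~1.44: 1080p is 44% more pixels than 900p
```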

    I see developers being far more concerned with frame rates than with any particular TV resolution.
     
    #407 dobwal, Sep 30, 2013
    Last edited by a moderator: Sep 30, 2013
  8. Billy Idol

    Legend Veteran

    Joined:
    Mar 17, 2009
    Messages:
    5,940
    Likes Received:
    768
    Location:
    Europe
    Potentially, with all the small details they have in Ryse, it is clear that 1080p would be a huge benefit.

    But the clips we have so far also show a lot of post-processing and 'artificial' blurring from the cinematic effects, so I guess Ryse will get away with the lower resolution without real impact. Furthermore, it seems the areas are quite small, without a huge draw distance.

    I'd say in the case of Ryse, 900p is OK... I just hope we don't get lots of edge-aliasing issues/shimmering.
     
  9. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,572
    Likes Received:
    2,292
    As Shifty pointed out already, the game looks better now than it used to. Examples:

    [screenshots]

    http://i.imgur.com/Y883qro.jpg

    http://i.imgur.com/sP2qeB7.jpg


    Before and after:

    [before and after screenshots]

    So if they get better results, I would call that reallocation. 900p is a compromise, though, if they preferred to go with a lower resolution to enhance the antialiasing.

    I'd say that 1080p + 2x AA should look great, but if they found a better use for those resources in a launch game then I am certainly not repulsed.

    Crytek also said that the game couldn't have run at 1080p on the PS4, although I'd expect some kind of bump if the game ran on Sony's console, because of the GPU.

    http://gearnuke.com/crytek-ceo-ryse-wouldnt-have-run-at-1080p-on-ps4-decision-was-choice-not-hurdle/

    Finally, regarding the polygons: nowadays it is not about rendering more polygons, like it was in the past.

    I bet many people didn't know this, but Halo 1 uses more polygons for Master Chief than Halo 2 does. That's the wonder of modern technology and smarter GPUs.

    Despite the fact that Master Chief in Halo 1 had a lot more polygons than in Halo 2, he looked far worse in comparison. (Now compare Xenon to Jaguar, too, but that's another story.)
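
    To make the trick concrete, here is a minimal numpy sketch of per-pixel normal mapping (my own illustration, nothing to do with Bungie's actual engine): lighting reads the surface normal from a texture, so a flat, low-poly surface can shade as if it carried far more geometry.

```python
import numpy as np

def lambert_with_normal_map(normals, light_dir):
    """Diffuse-shade a surface using per-pixel normals from a texture.

    normals:   (H, W, 3) array of unit normals decoded from a normal map.
    light_dir: 3-vector pointing toward the light.
    """
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    # N . L per pixel, clamped so surfaces facing away from the light go dark.
    return np.clip(normals @ l, 0.0, None)

# A flat 4x4 quad: every stored normal points straight up (+Z)...
flat = np.zeros((4, 4, 3))
flat[..., 2] = 1.0
# ...but perturbing the stored normals makes the same two triangles
# shade like a bumpy surface, with zero extra geometry.
bumpy = flat.copy()
bumpy[1:3, 1:3] = [0.6, 0.0, 0.8]

print(lambert_with_normal_map(flat, [0.3, 0.0, 1.0]))
print(lambert_with_normal_map(bumpy, [0.3, 0.0, 1.0]))
```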

    Links:

    http://beyond3d.com/showthread.php?t=43975 (AllNets response)

    http://halo.bungie.org/misc/bollmc2/ (flash animation showing the polygon mesh of Halo 2's Master Chief and the transition to how it looked with normal mapping http://halo.bungie.org/miscellaneous.html?search=bumpmapping)

    http://previews.teamxbox.com/xbox/395/Halo-2/p2/ (from the Halo 2 preview; my favourite Halo ever, btw)

    http://xbox.gamespy.com/xbox/halo-2/528851p8.html (another preview during the good ol' times of the extinct Gamespy)

    And the best article on the subject, called The Halo Effect; it's worth the read!

    http://www.cgw.com/Publications/CGW/2005/Volume-28-Issue-1-Jan-2005-/The-Halo-Effect.aspx

     
  10. temesgen

    Veteran Regular

    Joined:
    Jan 1, 2007
    Messages:
    1,536
    Likes Received:
    327
    1080p or 900p doesn't matter to me if the game looks good, and this one does. Hopefully the gameplay matches the visuals; if so, MS will have a solid new IP.
     
  11. COPS N RAPPERS

    Regular

    Joined:
    Nov 2, 2008
    Messages:
    957
    Likes Received:
    32
    I find it interesting that they mention they would have bumped into the same situation on the PS4. Makes me wonder about the gap.

    All of the big devs should have final or near final devkits by now of both consoles.

    We've been seeing the game at 900p the whole time. I can't recall much in the way of jaggies; honestly, it fooled everyone into thinking it was 1080p.


    ----------------------------------------------

    For anyone fishing for high-quality footage of Ryse, Xbox Live has the old eight-minute E3 demo in super high fidelity. It should be under the "coming soon" tab in the video games section.

    Or you could do the next best thing and see glimpses of it in these vids:
    http://www.gamersyde.com/download_cryengine_gc_cryengine_demo-30611_en.html
    http://www.gamersyde.com/download_ryse_son_of_rome_behind_the_scenes-30509_en.html

    They have a little compression, but they're much better than YouTube.
     
    #411 COPS N RAPPERS, Sep 30, 2013
    Last edited by a moderator: Sep 30, 2013
  12. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,418
    Likes Received:
    40
    Location:
    Somewhere out there
    Did we have any screens or videos that were 1080p to lead anybody to think it was 1080p?

    As other people have mentioned, I think it was some rep saying it was 1080p that led everybody to assume "it is probably 1080p". We should know better here than to take a sub-1080p source and conclude that it's 1080p.
     
  13. COPS N RAPPERS

    Regular

    Joined:
    Nov 2, 2008
    Messages:
    957
    Likes Received:
    32
    What I was saying is that it didn't cross anyone's mind that it could be lower than 1080p; it didn't even cross DF's when they first saw it in person months ago. And honestly, raw 1080p looks bad compared to sub-1080p + multisampling.

    The big clincher for the next-gen consoles, I think, is Battlefield 4 running at 720p, and Watch Dogs with low-detail shadows, a poor framerate, and horrible pop-in. From what I've been seeing, nothing looks too good about them anymore.
     
  14. KKRT

    Veteran

    Joined:
    Aug 10, 2009
    Messages:
    1,040
    Likes Received:
    0
    Oh yeah, it's definitely the Xbone's performance at fault that there is no wet shader on armor and skin...

    The PS4 must be weaker than the PS3, then, if KZ:SF doesn't have wet shaders on armor and cloth compared to Beyond...
     
  15. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,418
    Likes Received:
    40
    Location:
    Somewhere out there
    I don't take anything as native 1080p unless it has been properly tested. If devs state it, I usually try to take their word for it but leave some room for doubt.

    I thought this was the generally accepted stance from last generation :smile:
     
  16. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    5,438
    Likes Received:
    1,628
    Location:
    Australia
    How do you know KZ:SF doesn't have wet shaders in SP? Besides, it's genuine 1080p, no less ;).
     
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,734
    Likes Received:
    11,208
    Location:
    Under my bridge
    Nah. In creating Crysis, they made zero compromises (and optimisations :p) and just left the user to upgrade their hardware. It wasn't until they started creating for consoles that they realised how to design properly, and Crytek are far less experienced in that space than most other devs. Indeed, they were 'negatively experienced', coming from the PC's abundance of power, where their choices and habits were reckless in comparison to how console design works. I am not saying that Ryse is the result of a poor developer; I am only correcting your idea that Yerli is well versed in compromises. He (and Crytek) are compromise noobs.

    The context of Crytek's comments is sadly left to readers to interpret. Twitter is a lousy way to convey meaningful info, and we see, time and again, that the response to a tweet is lots of discussion about ambiguous, or plain stupid, remarks.

    Native resolution is desired for maximum clarity on contemporary fixed-panel displays.

    That's a whole other debate. Suffice to say it's the resolution of many TVs out there, and if you want the clarity those TVs offer, you need to render at 1080p, which is why MS supports a separate GUI layer for rendering 1080p UIs. Of course, with more photographic rendering you can get away with less clarity, and it won't be as obvious as it is when rendering UIs at sub-native resolution.

    Inconsistent framerates all through this generation suggest otherwise!
     
  18. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,277
    Likes Received:
    3,726
    Despite the absolutely terrible-looking gameplay, here I was thinking the graphics looked really good, but I guess I was wrong, because number.
     
  19. Mianca

    Regular

    Joined:
    Aug 7, 2010
    Messages:
    330
    Likes Received:
    0
    Well, I guess the interpolation needed to upscale the image can basically be seen as a rough type of antialiasing:

    160x90 original image: [image]

    192x108 bicubic resize: [image]

    And as the HUD will be on a separate, native 1080p display plane, text etc. won't even get blurry in the process. If done right, I really doubt the resulting image quality will be a whole lot different from a native 1080p rendering with added AA.
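
    You can mock up the two-plane idea yourself; here's a rough Pillow sketch (the file names are placeholders, and real hardware compositing obviously doesn't go through Python):

```python
from PIL import Image

# Rough analogue of the two display planes: the 3D scene is rendered
# at 900p and upscaled, while the HUD overlay stays at native 1080p.
scene = Image.open("scene_1600x900.png").convert("RGBA")  # placeholder file
hud = Image.open("hud_1920x1080.png").convert("RGBA")     # transparent overlay

scene_up = scene.resize((1920, 1080), Image.BICUBIC)  # interpolation ~ rough AA
Image.alpha_composite(scene_up, hud).save("final_frame.png")  # HUD stays sharp
```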

    The people bothered by this are probably the same people arguing that a 440 ppi phone screen looks sharper than a 400 ppi one... while 99% of users can't even make out individual pixels to begin with.

    Don't get me wrong: if a game is native 1080p, that's great and the preferable solution. But I honestly think all the outrage directed at Crytek's decision to go with 900p is getting a little out of proportion.
     
    #419 Mianca, Sep 30, 2013
    Last edited by a moderator: Sep 30, 2013
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,734
    Likes Received:
    11,208
    Location:
    Under my bridge
    Your example images are inaccurate: you'd need to compare a natively rendered image against an upscaled one, rather than a smaller image against a larger, upscaled one.

    The attached shows vector graphics rendered natively into a 192x108 image, and rendered natively into a 160x90 buffer and then upscaled with a simple upscaler.

    For some content it makes a negligible observable difference. For other content, like alternating lines, it very obviously blurs the results. Whether it makes an impact on this game doesn't need to be discussed from a theoretical POV, as people can actually see the game. If you look at it and think, "My god, that's blurry, I can't play that!" then don't buy it. Otherwise, care not what resolution it's rendering at. ;)
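
    For anyone without the attachments, the same experiment is easy to reproduce; here's a small Pillow sketch (the stripe pattern is just an illustrative worst case, not the actual attached graphics):

```python
from PIL import Image

def alternating_lines(w, h):
    """Worst-case content: one-pixel black/white horizontal stripes."""
    img = Image.new("L", (w, h))
    img.putdata([255 * (y % 2) for y in range(h) for _ in range(w)])
    return img

native = alternating_lines(192, 108)      # rendered straight at the target size
scaled = alternating_lines(160, 90).resize((192, 108), Image.BICUBIC)  # upscaled

# Native stripes stay pure black/white; the upscaled ones smear toward grey.
print([native.getpixel((0, y)) for y in range(4)])
print([scaled.getpixel((0, y)) for y in range(4)])
```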
     

    Attached Files:
