*spin-off* Ryse Trade-Offs

Discussion in 'Console Industry' started by shredenvain, Sep 17, 2013.

Thread Status:
Not open for further replies.
  1. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    1,919
    Likes Received:
    1,069
    I don't think anyone is saying that 900p is right for every game.
    Just that they could have decided on 900p from the start due to the game.
     
  2. Renegade_43

    Newcomer

    Joined:
    Jul 23, 2005
    Messages:
    36
    Likes Received:
    10
    I already addressed this exact comment in my earlier post. Eventually there will be a breaking point. Maybe with hardware 4 times faster, that little extra sharpness might be a worthwhile tradeoff on top of the increased pixel quality. I hate to use this word... but it is a "balance" between which tradeoffs give the best image.

    What Crytek is telling you is that they can make a better "overall" looking game by not simply aiming for the 1080p checkbox. Isn't that what matters most?
     
  3. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,139
    Likes Received:
    356
    The breaking point depends on the size of your display, your vision and the distance between your display and your eyes.
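    The three variables named above can be folded into a single figure, pixels per degree of visual angle. A minimal sketch of that calculation (the ~60 px/deg figure often quoted for 20/20 vision is a common rule of thumb, not a claim from this thread; the 50"/8-foot example is illustrative):

    ```python
    import math

    def pixels_per_degree(horizontal_pixels: int, diagonal_inches: float,
                          distance_inches: float, aspect: float = 16 / 9) -> float:
        """Angular pixel density of a display as seen from a given distance."""
        # Width of the panel from its diagonal, for the given aspect ratio.
        width_inches = diagonal_inches * aspect / math.hypot(aspect, 1.0)
        # Angle subtended by the full screen width, in degrees.
        fov_deg = 2 * math.degrees(math.atan((width_inches / 2) / distance_inches))
        return horizontal_pixels / fov_deg

    # A 50" TV viewed from 8 feet (96"): both resolutions sit near the
    # ~60 px/deg acuity rule of thumb, which is why the difference is subtle.
    print(f"1080p: {pixels_per_degree(1920, 50, 96):.0f} px/deg")
    print(f" 900p: {pixels_per_degree(1600, 50, 96):.0f} px/deg")
    ```

    Move closer or enlarge the screen and both numbers drop, which is exactly where the "breaking point" shifts.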
     
  4. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,889
    Likes Received:
    2,304
    So you think if Crytek had the choice of the game running exactly as they want at 900p or exactly as they want at 1080p, they would choose 900p?
     
  5. KKRT

    Veteran

    Joined:
    Aug 10, 2009
    Messages:
    1,040
    Likes Received:
    0
    How do you define 'exactly as they want'? :>

    I think 'exactly as they want' would be something along the lines of the beginning of this trailer
    http://www.youtube.com/watch?v=uA88bWXUo_o

    ---edit---
    fixed link :)
     
    #485 KKRT, Oct 1, 2013
    Last edited by a moderator: Oct 1, 2013
  6. Gerry

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    747
    Likes Received:
    82
    Christ Davros, maybe you should stick to PC threads. You've had a mare in this one.
     
  7. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,271
    Likes Received:
    3,721
    I think that's the point. Some people are suggesting (right or wrong) that they'd target 900p regardless. So if for some reason they found more rendering time on the table (improved drivers, final hardware, improved SDK, fairy dust), they'd stick with 900p and use the resources elsewhere.

    I think it's reasonable that some devs would do that. Who knows if that is the case here.
     
  8. Renegade_43

    Newcomer

    Joined:
    Jul 23, 2005
    Messages:
    36
    Likes Received:
    10
    The majority of people have televisions under 60". Clearly the optimal balance of resolution vs pixel quality shifts with larger televisions, but they are catering to the best "overall" image for the majority of users! The majority do not have 100" screens. Crytek is choosing 900p because the typical living room shows more OVERALL benefit from higher quality pixels than from simply shooting for 1080p at random.
     
  9. SlimJim

    Banned

    Joined:
    Aug 29, 2013
    Messages:
    590
    Likes Received:
    0
    They should have gone with 1920*1080 24p :p

    To me, animation, art direction, sound design, and other factors trump polycount or resolution.

    I can imagine some scenes from GOW3 or GOW:A obliterating anything that Ryse puts out, just because Ryse is grounded in reality and also because they went with a high polycount from the beginning, leaving almost no room for other effects. The animations will take you out of the game, no matter how many millions of polygons they claim to have.
     
  10. KKRT

    Veteran

    Joined:
    Aug 10, 2009
    Messages:
    1,040
    Likes Received:
    0
  11. Bagel seed

    Veteran

    Joined:
    Jul 23, 2005
    Messages:
    1,533
    Likes Received:
    16
    They should've made the game 900x1080 pillarboxed FPS, to simulate a helmet cam. Or, keep the 3rd person view but zoom way in over the shoulder so that half the screen is Marius' upper right back and head. These are much more innovative ways to increase rendering power than to drop res and upscale.
     
  12. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,578
    Likes Received:
    1,985
    Saying Ryse is 900p due to performance limitations of the XBOne hardware is no more or less valid than saying any other characteristic of the graphical output is limited by the hardware. Isolating that one characteristic without considering that all aspects of CG visuals are limited by available performance, though, isn't valid. It is entirely possible that with more available power Crytek may still have chosen to remain at 900p, or even render at a lower resolution, and instead use the available power to boost other aspects of the rendering if they felt that would result in better overall image quality.
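    The arithmetic behind reallocating power between resolution and pixel quality is straightforward; a sketch comparing the two framebuffers discussed in this thread (the assumption that shading cost scales linearly with pixel count is mine, and ignores fixed per-frame costs like vertex work and shadow maps):

    ```python
    # Per-frame pixel counts for the two render targets discussed in the thread.
    pixels_1080p = 1920 * 1080   # 2,073,600
    pixels_900p  = 1600 * 900    # 1,440,000

    # Assuming per-pixel shading cost dominates and scales linearly
    # (a simplification), rendering at 900p frees this share of the budget
    # to spend on better pixels instead of more of them:
    freed = 1 - pixels_900p / pixels_1080p
    print(f"900p shades {pixels_900p / pixels_1080p:.0%} of the pixels, "
          f"freeing ~{freed:.0%} of a purely pixel-bound frame budget")
    ```

    Roughly 30% of a pixel-bound budget is a large enough pool to fund visibly better lighting or shading, which is the trade-off being argued over here.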
     
  13. SlimJim

    Banned

    Joined:
    Aug 29, 2013
    Messages:
    590
    Likes Received:
    0
  14. KKRT

    Veteran

    Joined:
    Aug 10, 2009
    Messages:
    1,040
    Likes Received:
    0
    Nope, there is no CGI in this game; everything you see here is real-time.

    In Cevat's last presentation, the smoke in the background bugged out, like particles sometimes bug out in my SDK editor. They're all real particles. The animation is pre-canned, but everything else is real-time, so it's lit by light sources, it's shadowed and self-shadowed, it's affected by wind, etc.
     
  15. COPS N RAPPERS

    Regular

    Joined:
    Nov 2, 2008
    Messages:
    957
    Likes Received:
    32
    Not sure if this is the topic for this thread, but it was brought up many months ago: native 1080p isn't a credible bar any more for gauging the consoles. It's too simple and inaccurate; any game can reach it simply by shrugging off a couple of complex features.

    A good example of poor gauging would be UE4's Elemental demo on PC. It wasn't even at native 1080p, yet should it be classified as inferior tech because of that? No, the technology it was pushing was clearly much more advanced.

    The standard wasn't enforced this gen because it would obviously tie next-gen consoles down. The only reason the "HD era" came up last gen was because TVs started supporting the signal. If 4K became the standard because of newer TVs now, these consoles would have to resort to sub-PS3/360-level graphics.

    What should gauge technology is the content being pushed. Framebuffer resolution is a secondary gauge; more importantly, eliminating jaggies is what people want the most.
     
  16. Cranky

    Newcomer

    Joined:
    May 22, 2013
    Messages:
    134
    Likes Received:
    0

    and the quality of the scaler and how the display planes are used. There is a huge difference between scaling one of the two display planes to 1080, thus outputting a native 1080p signal, and forcing a non-native resolution onto a PC monitor, which is what most of the folks claiming to be able to see the difference are doing.
     
  17. Bigus Dickus

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    943
    Likes Received:
    16
    You want to have a "technical discussion" about why a console (that you show utter contempt for) can't achieve an arbitrary goalpost you equate with competency, kicking off the "discussion" with loaded terms like "hardware limitations" and "crippled performance." Right. Accusing someone else of being a shill is quite the punchline.

    Why 1080p? Because 1080p is a fairly common display resolution? Isn't more = better in your book? There are quite a few 4K displays out there. What is broken in these next-gen consoles that they can't handle high-res output at 60 frames/sec? Is it gimped memory buses? Underpowered CPUs? Underachieving GPUs? What failures did Sony and MS have in producing hardware capable of high-quality graphics, knowing years ahead of time what the target displays were?

    Or, from an equally absurd perspective, when Crytek releases a new PC game that brings top-of-the-line graphics cards to their knees, what is broken in these cards such that they can't handle the game at full HD resolution? Limited bandwidth from their ultrawide and fast GDDR5 memory systems? Should they have not cut corners and gone with a 1024-bit-wide bus?

    No less FUD than what you wrote. No less loaded language.

    Now, if you want to have a technical discussion on the topic, we can start with the old tried and true, as the rules of physics haven't changed for a new generation. What is human visual acuity in arc seconds, and what resolutions does that equate to for a given display size at a given distance? How does that change with AA, i.e. by what percentage does resolution need to increase without AA to maintain visual equivalence with a lower resolution at 2xAA? 4x? Different sampling methods? Now, how does the processing power compare at visually equivalent combinations? Bandwidth? Does esram or a single fast pool fit some choices better than others? If you have an abundance of compute power, or bandwidth, does that make certain choices more favorable? Are there situations in which a lower-resolution, visually equivalent or nearly equivalent choice frees enough resources to make a noticeable impact on pixel quality? What does the hardware look like in such a case, compute heavy or compute constrained? Bandwidth heavy or bandwidth constrained?

    Someone else asked if, given the choice, Crytek would prefer to run at 1080p looking just like they want, or at 900p looking just like they want. That hypothetical presumes there is enough processing power to enable such a choice... i.e., infinite. Of course if you have infinite power, you pick the higher res. Back in realityland, we have consoles with cost, power, and heat budgets, and devs will have to make choices that give the best on-screen appearance possible within those budgets. This is no different from any other generation. Some games going for a certain look or having a certain style may target lower-res, higher-quality pixels. We saw plenty of that last gen. For other game styles, more pixels may be achievable or even desirable. Maybe some game will decide to target higher than 1080p and sacrifice where necessary to achieve that. The monitors I use daily are well over twice that resolution, so I would find that intriguing.
     
  18. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    Res scaling on PC, IQ-wise, has been good for a few generations. The problem with PC games looking lacklustre at non-native resolutions comes down to viewing distance, and to the mostly static UI not being rendered at native res, which makes it look obvious.
     
  19. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,715
    Likes Received:
    5,812
    Location:
    ಠ_ಠ
    Discussion has run its course. Trade-offs exist on consoles. The End. (Please don't stay for the after-credits cut-scene).
    It's nappy time, kay?
     
Thread Status:
Not open for further replies.


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.