*spin-off* Ryse Trade-Offs

Discussion in 'Console Industry' started by shredenvain, Sep 17, 2013.

Thread Status:
Not open for further replies.
  1. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,135
    Likes Received:
    2,248
    Location:
    Wrong thread
    Why is this so difficult to understand?

    If they could go to 1080p with no performance hit that would mean they were making a game at 900p and intentionally leaving a huge amount of power unused. That would be absolutely stupid.

    According to Crytek, they made a choice that 900p (god I hate that term) was the optimal resolution to allow them to achieve what they want to achieve. Because fewer pixels means they have more time to spend on each pixel.

    That is not just true for Xbone, it is true for PS4, PS3, 360, Wii U, and basically anything in use today.
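    The arithmetic behind "fewer pixels means more time on each pixel" is easy to sketch. The resolutions come from the thread; the fixed per-frame GPU budget is an illustrative assumption, not a claim about the actual engine:

```python
# Back-of-the-envelope pixel budgets for the resolutions discussed.
# Assumes a fixed per-frame GPU budget; numbers are illustrative only.

def pixels(width, height):
    """Total pixels in a framebuffer."""
    return width * height

p900 = pixels(1600, 900)     # 1,440,000
p1080 = pixels(1920, 1080)   # 2,073,600

# 1080p pushes 44% more pixels than 900p...
extra = p1080 / p900 - 1.0

# ...so at the same frame budget, each 1080p pixel gets ~31% less time.
per_pixel_loss = 1.0 - p900 / p1080

print(f"1080p has {extra:.0%} more pixels than 900p")
print(f"per-pixel budget drops by {per_pixel_loss:.1%} at 1080p")
```

    So a 900p renderer has roughly 44% more shading time per pixel to spend on effects, which is the trade-off being described.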
     
  2. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    1,928
    Likes Received:
    1,079
    They could do that on PC because they could develop a game that scales and maxes out hardware that improves every year, and still not be able to run it on max settings (on reasonable budget hardware) for years.

    On consoles you don't have that luxury.
    Why compromise the visuals by going for 1080p?
    Not saying every game should go 900p; it may depend on art direction, the engine, and whether scaling produces an image that most people would think is native, or close enough.

    The point is the GPU could have had 60%+ more processing power and they may still have gone for 900p, for the reasons in the last paragraph, or it still may not have been enough to hit 1080p with the same visual fidelity at a locked frame rate.
     
  3. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,135
    Likes Received:
    2,248
    Location:
    Wrong thread
    If what the Crytek dude is saying is true, and Ryse has always been 1600 x 900 with a custom 'AA' scaler taking the game up to a final framebuffer size of 1080, then that means that no-one who has been crying about a "downgrade" can even tell the difference.

    That's proof that they need to stop clinging so tightly to the single metric they think they can understand.
     
  4. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    1,928
    Likes Received:
    1,079
    I'm starting to think both Sony and MS should mandate that you can't say what res a game is prior to release. lol.

    It just doesn't mean what people think it does any more.
    At most it's going to become a measure of whether something is sharp or blurry.
    720p = blurry -> 1080p = sharp, regardless of what the actual resolution is, however you choose to measure it.

    Leave it up to the pixel counters to work it out, and everyone else will just judge whether it's sharp or not. That way a game won't be marked down because everything isn't rendered in 1080p or something.
     
  5. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,276
    Likes Received:
    3,725
    I don't understand why people care. If you look at the game and think it looks good, what does it matter what resolution it is? It's an interesting thing to discuss from a technical perspective, but I don't understand why anyone would be upset about it, or why some people demand that games are forced to render at 1080p (which buffers?).

    I am curious about what they're doing for upscaling and AA. I think it would make sense that they're applying anti-aliasing to the upscaled image, and then adding the UI on top, as some here have suggested.
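    That suggested ordering (render at 900p, upscale, anti-alias the upscaled image, then draw the UI at native resolution) can be sketched as a toy pipeline. This is purely speculative: the function names and the pass order just mirror the guess above, tracking only buffer dimensions, and don't reflect anything Crytek has confirmed:

```python
# Toy sketch of the hypothesized pipeline ordering. Each stage only
# tracks (width, height); the real scaler and AA pass are unknown.

RENDER_RES = (1600, 900)    # internal render target
OUTPUT_RES = (1920, 1080)   # final framebuffer

def render_scene():
    # The 3D scene is rendered at the internal resolution.
    return RENDER_RES

def upscale(buf, target):
    # Placeholder for whatever custom scaler the engine uses.
    return target

def anti_alias(buf):
    # Post-process AA on the upscaled image; size is unchanged.
    return buf

def composite_ui(buf, ui_res):
    # UI/HUD is drawn on top at native output resolution.
    assert buf == ui_res, "UI should match the output resolution"
    return buf

frame = composite_ui(anti_alias(upscale(render_scene(), OUTPUT_RES)), OUTPUT_RES)
print(frame)  # (1920, 1080)
```

    The appeal of this ordering is that the HUD text and overlays stay pixel-sharp at 1080p even though the scene itself is rendered at 900p.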
     
  6. BoardBonobo

    BoardBonobo My hat is white!
    Veteran

    Joined:
    May 30, 2002
    Messages:
    3,255
    Likes Received:
    153
    Location:
    SurfMonkey's Cluster...
    Oh. I thought it was 900p@60. I'm confused as to why 900p is the sweet spot then, apart from being the most appropriate for upscaling. What was it that stopped them hitting 1080p?

    Apart from timing profiles etc., the similarity between the XB1 and the 360 shouldn't have led to any big surprises, should it?
     
  7. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,418
    Likes Received:
    40
    Location:
    Somewhere out there
    It's not hard to understand; most people here probably do understand it.

    Obviously they chose 900p because they wanted a certain amount of effects, and doing it at a higher resolution would force them to sacrifice IQ for the same fluidity, or sacrifice fluidity to maintain IQ (if we define IQ here as independent of frame rate and resolution).

    Saying "Xbox One will struggle for power (to do what they want to do@1080p)" or "900p gives better results" is really the same.

    It's just the same situation described differently.
     
  8. Bagel seed

    Veteran

    Joined:
    Jul 23, 2005
    Messages:
    1,533
    Likes Received:
    16
    The first 720p 30fps game that comes out this gen better look like Infiltrator. That's my minimum expectation. And I don't think that's unreasonable, a few years in.
     
  9. DrJay24

    Veteran

    Joined:
    May 16, 2008
    Messages:
    3,891
    Likes Received:
    633
    Location:
    Internet
    I think this is a straw man. I don't see many people claiming they can see a downgrade; they are not talking about the IQ before or after. The discussion (to me) was about why they chose a sub-1080p res, i.e. the technical trade-offs. For example, we know why Halo 3 ran at 640p; I'm interested to know why Crytek chose 900p for the XB1. Was it the eSRAM size, the fillrate, etc.? Clearly they would run at 1080p if they could, but they can't.

    People are being a little too defensive; these types of trade-offs are made on every console for every game. We just happen to be talking about one game on one console in this thread.
     
  10. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    11,380
    Likes Received:
    1,820
    Considering how beautiful the game looks, 900p was freakin' worth it! It's probably the only game that looks truly next gen.
     
  11. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,276
    Likes Received:
    3,725
    What do you mean by sweet spot? They picked 900p because it allowed them to fit in all of the rendering "effects" they wanted, at the framerate they were targeting. I imagine that resolution was seen as not compromising image quality significantly, so it was deemed acceptable. For all we know they started with the idea of 900p, assuming they could upscale and anti-alias with minimal impact to image quality, and never intended to try for 1080p.

    If you're asking why they could not hit 1080p, assuming all other things remained equal, then I'd say it's a question that can only be answered by Crytek. I mean, the amount of information that is collected in profiling, and understanding how each piece of the rendering pipeline can impact the framerate, is quite complicated. We just don't have the information, and honestly there are probably only a dozen people on this forum who could make useful educated guesses if we did.

    What does the 360 have to do with this? It's a completely different hardware architecture.
     
  12. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,891
    Likes Received:
    2,309
    I stand corrected :D

    And the reason is they don't have the horsepower to do 1080p.
    Crytek have a history of this: when Crysis was released, no one was playing it at 1080p @ 60fps at full quality with max AA.
    You never know, AMD could still be in the process of optimizing their drivers, and by the time the XB1 comes out there's a chance Ryse could end up running at 1080p.
    We've seen some big improvements in games on the PC from newer drivers, although there is probably less performance to uncover on a console, with it being closer to the metal.
     
  13. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,135
    Likes Received:
    2,248
    Location:
    Wrong thread
    I didn't think that was the point you were trying to make. I still don't, for that matter. You said, to me:

    "So you're saying they wouldn't go for 1080p if they could do it with little other stuff turned off."

    And I'm saying that they might not. If Xbone had the power to do Ryse as is at 1080p, Crytek might very well have chosen to stay at "900p" and up detail elsewhere. Because as it now appears, no one claiming it's an issue could even tell it wasn't rendering at 1080p in the first place.

    You are acting like 1080p is some kind of implied goal. It doesn't have to be. And Crytek claim it wasn't. It might never have been. Even with double the power it still might not be.

    NOOOOOOOOOOOOOOO.

    IT IS NOT CLEAR THAT THEY WOULD RUN AT 1080 IF THEY COULD. THEY MIGHT HAVE CHOSEN TO FOCUS ON HIGHER QUALITY PIXELS INSTEAD OF HAVING MORE OF THEM.

    And it is not a "strawman" to point out that people don't seem to have been able to tell that the game was always 1600 x 900 and not 1080p. It's not a strawman because it validates Crytek's choice, and people here are claiming that they are trying to understand Crytek's choices and why the game is as it is.

    I'm not getting defensive, I'm getting annoyed. Annoyed that people are making so little effort to understand that resolution is a CHOICE that is made as part of an OVERALL SET OF COMPROMISES aimed at delivering THE BEST OVERALL VISUAL PACKAGE.

    *facepalm*
     
  14. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    3,796
    Likes Received:
    1,903
    Then I will bitch, moan and be transparent, because anything dealing with the technical aspects of XB1 hardware or its games is off limits to little dickus, crack-function and whatever other company shill doesn't want to address questions. :wink:
     
    #474 Shortbread, Oct 1, 2013
    Last edited by a moderator: Oct 1, 2013
  15. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    That's the part which is mildly hysterical. People are basically asking why Crytek haven't implemented something most people can't see. If people can't see it, as in this case where people clearly couldn't tell 900 from 1080, then why waste cycles on it to begin with? Presumably people want the best looking game, no? If so, isn't it logical to remove cycles spent on stuff that can't be noticed and spend them elsewhere where they can be? That would be the smart developer choice, no? It's weird; I don't get why people purposely want a game to make inefficient use of resources and spend them where they aren't noticed. Why would Crytek ever willingly do that?
     
  16. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,276
    Likes Received:
    3,725
    Because 1080p.
     
  17. BoardBonobo

    BoardBonobo My hat is white!
    Veteran

    Joined:
    May 30, 2002
    Messages:
    3,255
    Likes Received:
    153
    Location:
    SurfMonkey's Cluster...
    Then why ever have anything over 900p? If nobody can tell the difference why bother?
     
  18. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,709
    Likes Received:
    11,163
    Location:
    Under my bridge
    We may see that every game gets optimised to 900p for that very reason. We're certainly seeing 720p games for that reason. But it's clearly going to depend on the game, e.g. a Geometry Wars type game could look noticeably sharper at 1080p than at 900p.
     
  19. COPS N RAPPERS

    Regular

    Joined:
    Nov 2, 2008
    Messages:
    957
    Likes Received:
    32

    Exactly. The Xbox One can have a standard of 1080p 30fps like the rest of the games coming, but Ryse isn't like the rest of the games, and this is the reason for going unorthodox.


    For fixed hardware, even when you have an architecture laid out with fewer bottlenecks, there's nothing that says devs can't favor more shaders or polys over pixels. An unorthodox pixel count goes with any fixed hardware; Battlefield 4 is another example of other favorable choices.
     
  20. RudeCurve

    Banned

    Joined:
    Jun 1, 2008
    Messages:
    2,831
    Likes Received:
    0
    +1...haha nicely put.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.