*spin-off* Ryse Trade-Offs

F*ck! Only one game at 1080p? That's bad news.

One starts to wonder if there is ever going to be some good news regarding the Xbox One. I'm getting it regardless, for my kids to try out Kinect, but it's not like I'm getting more excited as time goes on. :-|
 
I wonder if this downgrade is simply due to a lack of graphics power, or if it has something to do with needing to fit more buffers/data into the ESRAM at this lower resolution.

According to the presentation "A bit more deferred - Cryengine 3", it is supposed to be some kind of deferred lighting solution rather than a fully deferred shading engine. I don't know if this is still true for Ryse's engine, but if so, they would require somewhere between 3 and 4 render targets at a time: one for normals, one for depth, and 1-2 for the light accumulation target. And then probably another one for the final composition (the back buffer).

According to the same presentation I mentioned above, this process consists of 3 passes:
1) Forward G-buffer generation: this creates normals, depth, and specular power. Let's say they can fit this into 2x 32-bit render targets. That's easily possible, and also probable, since they have limited space available in the ESRAM.
2) Deferred light accumulation: this requires another single render target at 32 or 64 bits. This is already an important decision, since you would have to account for HDR lighting in this step, for which you would need a floating-point target. That is doable in 32 bits, and because both space and bandwidth are at a premium, it is probable they do it with a 32-bit target.
---------------------------------------------------------------
The three render targets in these two steps have to be persistent in memory at the same time, since the light accumulation pass reads from the other two render targets. At 1080p this would mean the ESRAM is 3/4 utilized, with the remaining 1/4 available for other use. After this step, however, it should be possible to scrap the depth and normal buffers, freeing the 2x 32-bit targets since they aren't required anymore.
---------------------------------------------------------------
3) Forward shading with the light accumulation texture: this requires another 1x 32-bit render target for the final composition of the picture (the back buffer). So we have the accumulation buffer (read) and the back buffer (write) in the ESRAM. The other things needed in this final step are the albedo textures for all the objects in the frame. Now the ESRAM is half empty (@1080p), so ~16MB would be available at that time. That can be PRTs or any other stuff they would require in ESRAM.

Please correct me if I made a mistake somewhere in there. Now, 16MB for PRTs is the figure MS mentioned at the Build conference during their PRT demo presentation. However, I would argue that this demo was nowhere near a real-world situation within a game: too little texture variety, etc., for a modern AAA game. So it would be beneficial to have more than 16MB available, to prevent more lookups from DRAM.

Now at 900p (assuming 1600x900), during the final composition stage the ESRAM would be filled with 1600x900x4 bytes = ~5.5MB per target, x2 render targets = ~11MB. So in this case they would have ~5MB more free space in the ESRAM at that point. Doesn't seem all that significant to me. I know this isn't really a technical thread, but what do you guys think?
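
To sanity-check the arithmetic above, here is a rough back-of-the-envelope sketch in Python. The pass layout and the 32-bit-per-target assumption are my guesses from the presentation, not anything Crytek has confirmed:

[code]
# Back-of-the-envelope ESRAM budget for the pass layout guessed above.
# Assumes 32-bit (4 bytes/pixel) render targets and a 32 MiB ESRAM pool;
# the actual layout Ryse uses is not public.

ESRAM_MIB = 32.0

def rt_mib(width, height, bytes_per_pixel=4):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1920, 1080), (1600, 900)]:
    rt = rt_mib(w, h)
    lighting = 3 * rt      # passes 1+2: 2x G-buffer targets + light accumulation
    composition = 2 * rt   # pass 3: light accumulation (read) + back buffer (write)
    print(f"{w}x{h}: one RT = {rt:.2f} MiB, "
          f"lighting = {lighting:.2f} MiB ({lighting / ESRAM_MIB:.0%} of ESRAM), "
          f"free at composition = {ESRAM_MIB - composition:.2f} MiB")
[/code]

This gives ~16 MiB free during composition at 1080p versus ~21 MiB at 900p, i.e. roughly the ~5MB difference mentioned above.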

I have to say it worries me that Crytek, who did such a great job on the Xbox 360 with Crysis 2/3, have to tone down the resolution to get solid 30fps gameplay on what is supposed to be next-gen hardware. Don't get me wrong, the game still looks incredible, but it is troubling that a first-party title has to make such sacrifices this early in the next-gen console's lifecycle. I mean, it isn't even out yet, and assuming games will look even better a few years from now, it is highly unlikely that they will be able to raise the resolution later while providing even more eye candy.
 
RYSE clearly doesn't run under 30fps, and Forza 5 is running at 1080p@60fps.

Never said it did... but I wouldn't be surprised if Forza 5 is being upscaled; then again, it doesn't look that hardware-taxing to begin with. Anyhow, back to the topic... a few sites have reported Ryse as having frame-rate drops... not many, but a perfect 30fps it isn't as of now.
 
Haven't heard a single person complaining about blurring during gameplay, or even suggesting it wasn't running at 1080p.

In fact I've mostly been hearing about how beautiful the game is and some people that have played it calling it the best looking game they saw at the show.

We'll have to see when we finally get it home and play it ourselves.

Must be a good upscaler then :smile:. For one, I can see the difference between native 1080p and upscaled 1600x900 on my monitor. TVs have pretty good upscalers built into them, but the difference in quality from an actually more detailed picture is always visible. Like I said earlier, most people won't mind. My own TV is an HD-ready one, meaning it's a 720p TV that can take a 1080p signal. But I game on my monitor, and I can easily tell when a game is 720p vs 1080p. You can also try it out on your PC: run a game at 1600x900 in fullscreen, then run the same game at 1920x1080 fullscreen. The blurriness of the lower-res image is very clear, because it is being upscaled by the monitor. Having the XBOne upscale it is a better option, of course, but you cannot get the same detail in the image as native 1080p. I mean, it's nothing new: more pixels = more detail. :)
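
For what it's worth, the raw pixel counts make the gap concrete. A trivial sketch:

[code]
# Pixel-count comparison between native 1080p and upscaled 900p.
native = 1920 * 1080    # 2,073,600 pixels
lower  = 1600 * 900     # 1,440,000 pixels

print(f"1080p renders {native / lower:.2f}x the pixels of 900p")
# The per-axis scale factor (1.2x) is non-integer, so every output pixel
# blends neighbouring source pixels; that blending is the blur you see.
print(f"per-axis scale: {1920 / 1600:.1f}x")
[/code]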
 
I thought MS said this is the Gears of War for the Xbone. Not very convincing when you have to drop it down to 900p at launch.
 
I have to say it worries me that Crytek, who did such a great job on the Xbox 360 with Crysis 2/3, have to tone down the resolution to get solid 30fps gameplay on what is supposed to be next-gen hardware.

Actually Crysis was not native 720p on 360...or PS3.
Also, Crytek never delivered a solid frame rate on consoles either.
 
Never said it did... but I wouldn't be surprised if Forza 5 is being upscaled; then again, it doesn't look that hardware-taxing to begin with. Anyhow, back to the topic... a few sites have reported Ryse as having frame-rate drops... not many, but a perfect 30fps it isn't as of now.

Forza 5 has been pointed out on multiple occasions to be 1080p, including in the tweet from Aaron about RYSE being 900p native.
 
Actually Crysis was not native 720p on 360...or PS3.
Also, Crytek never delivered a solid frame rate on consoles either.

To be fair, not even one game with very advanced graphical features delivers a consistent framerate on current-gen consoles.
The problem is that Ryse is for a next-gen console, so they could have budgeted it accordingly. It's not like with C2 and C3, where they had to downscale a next-gen game to current-gen hardware; instead they are going all out again and sacrificing IQ in the meantime ;\

Framerate and then IQ are the most important things, Crytek! All your assets and post-process techniques degrade with worse IQ. Why bother with the highest-precision bokeh or high-quality SSR when you render at 900p? Lower quality and 1080p would look better.
 

On a TV, the difference between 900p and 1080p is not going to be noticeable. On a 28" monitor 18-24" away it might be just noticeable, but not on a TV at typical viewing distances. Just as, for most people, 30 versus 60 FPS is unnoticeable. Given the latest movies, it seems obvious that Crytek and MS decided that IQ trumps resolution. Also, I wouldn't be surprised if Marius is rendered at 1080p and the rest of the FOV at 900p.

Currently, based on the latest vidoc, Ryse is by FAR the best-looking launch game on either console. Quantum Break seems like it will crush it later on, but who knows when that will release and what PS4 games will look like by then.

If 900p looks that good, then thanks to the display planes, I expect the multiplats to look identical to PS4 games. Given the display planes and the way that the human eye operates, I would render the main character at 1080 along with the center 960 x 540 pixels at 1080 and the rest of the screen at somewhere between 720 and 1080, whatever is needed to hit the frame rate target.
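
Purely as a thought experiment (nothing here is a confirmed technique in any shipping title), the pixel-budget saving of such a mixed-resolution scheme is easy to estimate:

[code]
# Rough pixel budget for the idea above: a 960x540 central region kept at
# native 1080p density, the periphery rendered at 900p density and upscaled.
# All numbers are illustrative assumptions.
full_1080p = 1920 * 1080                        # 2,073,600 pixels
center = 960 * 540                              # kept at full density
periphery_screen = full_1080p - center          # remaining screen area
periphery_density = (1600 * 900) / full_1080p   # ~0.69x if done at 900p density

rendered = center + periphery_screen * periphery_density
print(f"rendered: {rendered:,.0f} pixels ({rendered / full_1080p:.0%} of native 1080p)")
[/code]

That would shade only about three quarters of the pixels of a full 1080p frame, which is where the frame-rate headroom would come from.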
 
If 900p looks that good, then thanks to the display planes, I expect the multiplats to look identical to PS4 games. Given the display planes and the way that the human eye operates, I would render the main character at 1080 along with the center 960 x 540 pixels at 1080 and the rest of the screen at somewhere between 720 and 1080, whatever is needed to hit the frame rate target.

I think this is the best way to give an incredibly jarring and downright ugly presentation. People keep talking about doing this, but I never hear a dev suggesting it; there must be some reason others aren't seeing that makes it either not worthwhile or hard to implement.
 
I find that Killer Instinct is the greater offender, but then again, KI is made by a dev studio that has had really problematic games in the past.

As for Ryse... this is most probably the result of making the game on PC for too long a period [they aimed too high] and then having to optimize the game for a modest GPU. I haven't followed much of the discussion; is it possible that 32MB of ESRAM and 16 ROPs are not enough for HQ 1080p games?
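
On the ROP side, some quick napkin math, using the publicly reported 16 ROPs and 853MHz GPU clock. Whether the ROPs or the ESRAM are the actual limiter is pure speculation on my part:

[code]
# Napkin math: are 16 ROPs at 853 MHz a 1080p bottleneck?
rops, clock_hz = 16, 853e6
fill_rate = rops * clock_hz          # ~13.6 Gpixels/s of raw colour writes

pixels_1080p = 1920 * 1080
for fps in (30, 60):
    budget = fill_rate / (pixels_1080p * fps)
    print(f"at {fps} fps: ~{budget:.0f} pixel writes per screen pixel per frame")
# Plenty for simple scenes, but overdraw, blending, MSAA and fat G-buffers
# eat it fast, and ESRAM bandwidth/capacity may bite before the ROPs do.
[/code]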
 
If 900p looks that good, then thanks to the display planes, I expect the multiplats to look identical to PS4 games. Given the display planes and the way that the human eye operates, I would render the main character at 1080 along with the center 960 x 540 pixels at 1080 and the rest of the screen at somewhere between 720 and 1080, whatever is needed to hit the frame rate target.

AFAIK display planes are a fancy way of saying 'alpha-blended overlay'.

If you want to render different parts of the scene in different resolutions and merge them, then AFAIK you'd need to do that manually.
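
For illustration only, a minimal sketch of what that manual merge could look like, using numpy as a stand-in for what would really be a GPU pass (the resolutions and the inset region are made-up examples):

[code]
# Toy "manual merge" of two differently-sized renders (assumes numpy).
import numpy as np

def upscale_nearest(img, out_h, out_w):
    """Nearest-neighbour upscale; a real title would at least filter bilinearly."""
    h, w = img.shape[:2]
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return img[ys][:, xs]

scene_900p = np.zeros((900, 1600, 4), dtype=np.uint8)  # low-res scene render
center_hi  = np.ones((540, 960, 4), dtype=np.uint8)    # native-res inset

frame = upscale_nearest(scene_900p, 1080, 1920)        # fill the back buffer
y0, x0 = (1080 - 540) // 2, (1920 - 960) // 2
frame[y0:y0 + 540, x0:x0 + 960] = center_hi            # paste the native region
print(frame.shape)  # (1080, 1920, 4)
[/code]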

As for Ryse... this is most probably the result of making the game on PC for too long a period [they aimed too high] and then having to optimize the game for a modest GPU.

I'll probably be killed for it, but another possibility is that the console's internal target spec was missed (a 150MHz boost is awesome, unless you were told to expect something more significant).
 
I find that Killer Instinct is the greater offender, but then again, KI is made by a dev studio that has had really problematic games in the past.

As for Ryse... this is most probably the result of making the game on PC for too long a period [they aimed too high] and then having to optimize the game for a modest GPU. I haven't followed much of the discussion; is it possible that 32MB of ESRAM and 16 ROPs are not enough for HQ 1080p games?

It would seem that way. According to the EG article, the ESRAM is going to be the preserve of lower-resolution render targets, and that's where the performance will be.

For the first wave of games that isn't going to be much of an issue, but I can see the difference becoming more apparent later in the generation. Both consoles have pretty straightforward architectures, so it's going to be a far quicker process to squeeze the best out of them than it was this generation. We are only now starting to see the best from the 360/PS3; IMO that will happen within a couple of game cycles on the next-gen boxes.
 
The good news is that we will have more eye candy. I think it's a good move, considering a lot of people will only play on 720p televisions. But for people on 1080p monitors or TVs, the blurring will be visible.

Not necessarily. The reduction in rendering resolution is more likely meant to maintain a solid framerate and to use the ESRAM efficiently. The eye candy will probably remain the same, as long as the Photoshop edits aren't hiding things and we are seeing the XB version and not the PC one.
 
Ryse was reported by Eurogamer to be running at 1080p at E3, with a few frame drops.
Do you think it was always 900p, or did they change it to get better performance because they were running out of time for optimization?

Eurogamer incorrectly thought KI was 1080p at E3 too.
 
AFAIK display planes are a fancy way of saying 'alpha-blended overlay'.

If you want to render different parts of the scene in different resolutions and merge them, then AFAIK you'd need to do that manually.
There was scope in their description to render a background and a foreground. However, given the PIP capabilities, I'm more inclined to believe one plane is the game, the second is the game UI, and the third is the OS overlay, including HDMI passthrough.
 