Rendering Tech of Ryse

http://www.crytek.com/download/2014_03_25_CRYENGINE_GDC_Schultz.pdf
 
Hadn't read through that yet. Interesting that they didn't use the Xbox hardware scaler or the display plane controls. Wonder if those still aren't properly exposed in the SDK.
 
Neat... Hybrid TBDS & Forward+

I talked to quite a few graphics devs at GDC, and the hybrid approach is becoming pretty popular. It makes sense, since you can just dump out your tile indices from your lighting compute shader.
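To make that concrete, here's a minimal CPU-side sketch (plain C++; all names and numbers are illustrative assumptions, not Crytek's actual implementation) of what building and "dumping out" per-tile light lists amounts to. The real version runs in a compute shader and also culls against per-tile depth bounds, which this skips:

```cpp
// Minimal CPU-side sketch of per-tile light list construction; the real version
// runs as a compute shader. Names and sizes here are illustrative assumptions.
#include <algorithm>
#include <cstdio>
#include <vector>

constexpr int kScreenW = 1600, kScreenH = 900;  // Ryse's 900p output
constexpr int kTileSize = 16;                   // a common tile size for tiled lighting
constexpr int kTilesX = (kScreenW + kTileSize - 1) / kTileSize;
constexpr int kTilesY = (kScreenH + kTileSize - 1) / kTileSize;

struct ScreenLight {        // a light already projected to screen space
    float x, y, radius;     // bounding circle of its influence, in pixels
};

int main() {
    std::vector<ScreenLight> lights = {
        {200.f, 150.f, 64.f}, {800.f, 450.f, 200.f}, {1500.f, 100.f, 32.f},
    };

    // One light-index list per tile; a tiled-deferred resolve and a Forward+
    // shading pass can both consume these same lists.
    std::vector<std::vector<int>> tileLights(kTilesX * kTilesY);

    for (int i = 0; i < (int)lights.size(); ++i) {
        const ScreenLight& l = lights[i];
        const int x0 = std::max(0, (int)((l.x - l.radius) / kTileSize));
        const int x1 = std::min(kTilesX - 1, (int)((l.x + l.radius) / kTileSize));
        const int y0 = std::max(0, (int)((l.y - l.radius) / kTileSize));
        const int y1 = std::min(kTilesY - 1, (int)((l.y + l.radius) / kTileSize));
        for (int ty = y0; ty <= y1; ++ty)
            for (int tx = x0; tx <= x1; ++tx)
                tileLights[ty * kTilesX + tx].push_back(i);  // "dump out" the tile's light indices
    }

    printf("tile (50, 28) is touched by %zu lights\n",
           tileLights[28 * kTilesX + 50].size());
    return 0;
}
```

Since both the deferred and the Forward+ paths can index into the same per-tile lists, supporting the hybrid comes almost for free once the culling pass exists.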
 
Target hardware less powerful than high-end PCs at launch

Interesting that they would even put that in there. Like a pre-warning or something of what most people already knew.
 
I want to know what type of voodoo Crytek did when rendering the eyes during the cutscenes, and why nothing currently out comes close.

I feel like I should do a thread about this later.
 
Neat... Hybrid TBDS & Forward+
Isn't that eSRAM unfriendly?

I mean, the game looks like no other game I've ever seen before, so I am glad they did that, but I thought that Forward Rendering is a better approach for the memory architecture of the Xbox One.
 
I want to know what type of voodoo Crytek did when rendering the eyes during the cutscenes, and why nothing currently out comes close.

I feel like I should do a thread about this later.

They discuss it in reasonable detail, with links, in the last two slides (52-53)
 
Isn't that eSRAM unfriendly?

I mean, the game looks like no other game I've ever seen before, so I am glad they did that, but I thought that Forward Rendering is a better approach for the memory architecture of the Xbox One.
Found this
Digital Foundry: What is your experience with the ESRAM on Xbox One? How do you utilise it in Ryse? Is 32MB really enough for the high-bandwidth rendering elements you'd want to utilise? How important is allocating graphics between DDR3 and ESRAM for Xbox One development?
Cevat Yerli: We put our most accessed render targets like the G-Buffer targets into ESRAM. Writing to ESRAM yields a considerable speed-up. While 32MB may not be enough to use something like MSAA to the fullest, with a smart memory management strategy it is possible to deal with that.
http://www.eurogamer.net/articles/digitalfoundry-crytek-the-next-generation
 
Interesting that they would even put that in there. Like a pre-warning or something of what most people already knew.

Seems like a legit consideration for developing Ryse. They were already making a high-end title mostly with PC in mind, with a performance target for the highest settings of around 6 TFLOPS, so they had to figure out how Ryse could hold up against that title on much slower hardware.

Isn't that eSRAM unfriendly?

I mean, the game looks like no other game I've ever seen before, so I am glad they did that, but I thought that Forward Rendering is a better approach for the memory architecture of the Xbox One.

At 900p, not really, and they were already using a similar setup in Crysis 3 on the past-gen consoles.
Their G-buffer in Ryse is only about 17 MB [3x 32-bit targets * 900p].
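Spelling out that back-of-envelope number, plus the MSAA remark from the Digital Foundry quote above (the 3x 32-bit layout is taken from the post; the MSAA case is a hypothetical comparison, not Ryse's actual configuration):

```cpp
// Rough ESRAM budget check for a 3-target, 32-bit-per-target G-buffer.
// Layout from the post above; the MSAA case is a hypothetical comparison.
#include <cstdio>

long long GBufferBytes(long long w, long long h, long long targets,
                       long long bytesPerPixel, long long samples) {
    return w * h * targets * bytesPerPixel * samples;
}

int main() {
    const double MiB = 1024.0 * 1024.0;
    printf("900p,  no MSAA: %.1f MiB\n", GBufferBytes(1600,  900, 3, 4, 1) / MiB); // ~16.5 MiB
    printf("1080p, 4x MSAA: %.1f MiB\n", GBufferBytes(1920, 1080, 3, 4, 4) / MiB); // ~94.9 MiB
    // The first case fits comfortably in 32 MB of ESRAM (with room left for depth);
    // the second clearly does not, which is what the MSAA caveat and the
    // "smart memory management" comment in the Digital Foundry answer are about.
    return 0;
}
```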

---
I really liked their approach to shading aliasing with the G-buffer filter, and the way they are authoring textures seems very efficient with good quality.
The global 8k shadow map for the distant geometry of whole levels seems like an awesome idea, and it also shows the advantage of the higher RAM count. On PS3 this one shadow map alone would take half of the available VRAM (quick math below); mind-boggling how limited we were for years.
A little disappointed by the lack of info about their AO upgrades and SMAA 1TX tech.
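The quick math on the shadow map (the 16-bit depth format is my assumption; with a 32-bit format it would be the entire 256 MB):

```cpp
// Rough footprint of a single 8192x8192 shadow map, assuming a 16-bit depth
// format (the format is an assumption; the slides don't state it here).
#include <cstdio>

int main() {
    const long long dim = 8192;
    const long long bytesPerTexel = 2;                    // D16-style depth
    const long long bytes = dim * dim * bytesPerTexel;    // 134,217,728 bytes
    printf("8k shadow map: %lld MiB\n", bytes / (1024 * 1024));  // 128 MiB
    // The PS3 had 256 MB of dedicated VRAM, so this single texture really
    // would have consumed about half of it.
    return 0;
}
```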

---
@Warchild
There are a few slides about eye rendering in the additional slides.
 
I'd also like to say that I'm very impressed with the journey of Crytek since Crysis 1.

Back then they had some very talented graphics programmers coming up with a LOT of inventive solutions for implementing many visual features; but they were quite lacking on the performance side, and even high-end PCs struggled to realize their vision.

And now on the X1, they seem to have conquered that as well, having both one of the most fully featured engines and the best image quality, while running at reasonably good performance and pretty much the highest resolution* on the system. We're now at a point where I'm more interested in seeing games use this engine instead of UE4 (and by the way, where are Epic's next-gen titles??).

It's a bit of a disappointment that they have yet to make a game that scores 90+ overall, but maybe that'll require less involvement from Yerli. He seems to be doing a good job at managing the studios and the tech development and all, but game design doesn't seem to be his strong suit. The Crysis games never really grabbed me, and I'm not at all interested in playing Ryse either - but the tech is there to make something truly outstanding. Maybe they should start to pursue all the design talent leaving the Sony studios recently?...

* yeah, Forza is 1080p, but it does not have to bother with detailed characters and everything they require, and the IQ is not the best either; it seems more like a need to hit a checklist feature than the best solution.
 
It's a bit of a disappointment that they have yet to make a game that scores 90+ overall, but maybe that'll require less involvement from Yerli. He seems to be doing a good job at managing the studios and the tech development and all, but game design doesn't seem to be his strong suit. The Crysis games never really grabbed me, and I'm not at all interested in playing Ryse either - but the tech is there to make something truly outstanding. Maybe they should start to pursue all the design talent leaving the Sony studios recently?...

They have some design people from ND/SSM, and some of their old design people are at ND now, etc. :)

I think the biggest problems for Crytek last gen were hardware limitations and time limitations. They need to make the game they want, big and sandbox-focused, and they need as much time as they want to make it.
Crysis 2 was done within console limitations to some degree, it was also changed a few times [at first it was more like Crysis 1] and was developed on an almost completely new engine. Personally it's the Crytek game I enjoyed most, still with tons of flaws, especially in encounter and enemy design.
Crysis 3 was made in 23 months and was generally good from a polish standpoint, but it lacked ambition in enemy variety and encounters. It was still made within some past-gen limitations, which was most visible in the scenes with tanks and Pingers in the last levels.
Ryse was in development hell until it landed at Crytek Frankfurt less than two years ago, and then they didn't have much time left before the Xbox One launch window. It was essentially made by a new team.

In comparison, The Last of Us was in development for almost 5 years.

They really need time, and a game they really want to make.
 
As far as I know this flag is set by TBDR and TBR GPUs. It was a bad design choice by MS to have them both under one bit, but it is what it is. It's not about capabilities but about what the HW actually does under the covers. PowerVRs would set it, and I bet Adreno would as well.

//edit
Oh, oh. It is important to understand that "deferred" in TBDR hardware means a completely different thing than "deferred" in, say, a deferred lighting renderer or whatever. The two should not be confused.
 
Yeah,

The tile-based deferred renderer flag indicates whether you're running on a hardware implementation that already works as a deferred renderer (i.e. PowerVR).

An increasing number of game engines want the efficiency benefits of deferred (and tiled) rendering, to avoid shading hidden pixels and to speed up lighting, but most hardware can't do it by default. So they're starting to do some of the computation up front in software, like submitting geometry twice, to allow "normal" immediate-mode rendering hardware (i.e. NVIDIA/AMD) to do deferred.

The flag you've found is useful because if you know you're running on a platform that already has hardware TBDR support then you can potentially skip lots of the extra setup costs of doing it in software.
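For reference, assuming the flag being discussed is the D3D11 architecture-info cap (my reading of the context, not something stated outright in the thread), it is queried like this; everything around the documented CheckFeatureSupport call is just a sketch:

```cpp
// Query whether the GPU reports itself as a tile-based (deferred) renderer via
// D3D11's architecture-info feature query (requires a D3D11.1-era runtime).
#include <d3d11.h>

bool IsHardwareTileBasedDeferred(ID3D11Device* device) {
    D3D11_FEATURE_DATA_ARCHITECTURE_INFO info = {};
    const HRESULT hr = device->CheckFeatureSupport(
        D3D11_FEATURE_ARCHITECTURE_INFO, &info, sizeof(info));
    // If the query isn't supported (older runtime/driver), assume a conventional
    // immediate-mode GPU and keep the software tiling path.
    return SUCCEEDED(hr) && info.TileBasedDeferredRenderer;
}
```

An engine could branch on this to decide between its software tiling path and leaning on the hardware's own binning, which is the "skip the extra setup costs" point above.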
 
I'd also like to say that I'm very impressed with the journey of Crytek since Crysis 1.

Back then they had some very talented graphics programmers coming up with a LOT of inventive solutions for implementing many visual features; but they were quite lacking on the performance side, and even high-end PCs struggled to realize their vision.

And now on the X1, they seem to have conquered that as well, having both one of the most fully featured engines and the best image quality, while running at reasonably good performance and pretty much the highest resolution* on the system. We're now at a point where I'm more interested in seeing games use this engine instead of UE4 (and by the way, where are Epic's next-gen titles??).

It's a bit of a disappointment that they have yet to make a game that scores 90+ overall, but maybe that'll require less involvement from Yerli. He seems to be doing a good job at managing the studios and the tech development and all, but game design doesn't seem to be his strong suit. The Crysis games never really grabbed me, and I'm not at all interested in playing Ryse either - but the tech is there to make something truly outstanding. Maybe they should start to pursue all the design talent leaving the Sony studios recently?...

* yeah, Forza is 1080p, but it does not have to bother with detailed characters and everything they require, and the IQ is not the best either; it seems more like a need to hit a checklist feature than the best solution.

I agree, CryEngine is absolutely destroying the other licensable engines in visuals.
Imagine what a game like Thief could have looked like using the current CryEngine.
It's not an ugly game, but it could have been much better. I think UE3 still has a lot of potential to shine on the new consoles if the Samaritan demo is any indication. I'm not sure I can agree that 900p is the highest achievable resolution on the Xbox One though. Aside from Forza 5, Tomb Raider looks pretty decent at 1080p on the X1, as does NBA 2K14. If 900p ends up being the resolution of choice on the X1 I won't be upset, especially if great AA is being used like that in Ryse.
 