*spin-off* Ryse Trade-Offs

If Yerli is telling the truth, that Ryse has always been 900p, then that would mean Crytek knew ahead of time that certain XB1 limitations would cripple the new CryEngine's performance. Because we damn well know the new CryEngine goes well beyond 900p rendering...


You know, there is a big misunderstanding when it comes to hardware limitations outside of Beyond3D. Fixed hardware will always be locked to its limits, and you can't easily jump outside those boundaries without resorting to unorthodox methods.

Imagine the sacrifices Guerrilla Games would have to make if they wanted to get rid of Shadow Fall's seven LODs in favor of a single 80k LOD. There is a reason they chose seven LODs (40k max, descending to 20k and lower) at 1080p: to leave room for environments and shaders.

1080p is a compromise in itself. Polygons, shaders, and frame rate are compromises in themselves as well.


Even when you upgrade hardware, you are still trading it in for something with similar limitations. This is something that should cross everyone's mind. So when developers talk about compromises, they are not insulting the box; they are simply stating the obvious about working with fixed hardware.
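To make the LOD trade-off concrete, here is a minimal sketch of distance-based LOD selection in Python. The triangle budgets loosely echo the 40k-and-down figures mentioned above, but the switch distances and the intermediate budget steps are invented for illustration; the actual Shadow Fall setup is not public.

```python
# Hypothetical LOD table: 7 LODs, highest-detail first. Budgets roughly
# follow the 40k-down-to-20k-and-lower figures quoted above; distances
# are made up purely for illustration.
LOD_TRIANGLE_BUDGETS = [40_000, 30_000, 20_000, 12_000, 6_000, 3_000, 1_500]
LOD_SWITCH_DISTANCES = [5.0, 10.0, 20.0, 40.0, 80.0, 160.0]  # metres, invented

def select_lod(distance_m: float) -> int:
    """Return the LOD index to draw for a mesh at the given camera distance."""
    for lod, switch in enumerate(LOD_SWITCH_DISTANCES):
        if distance_m < switch:
            return lod
    return len(LOD_TRIANGLE_BUDGETS) - 1  # furthest, cheapest LOD

for d in (2.0, 15.0, 200.0):
    lod = select_lod(d)
    print(f"{d:>6.1f} m -> LOD{lod} ({LOD_TRIANGLE_BUDGETS[lod]:,} tris)")
```

The point of the sketch is simply that every on-screen mesh beyond the nearest few draws a fraction of the full budget; dropping the LOD chain for a single high-poly model would blow that budget many times over.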
 
That was a helpful post.

Certainly a lot more helpful than some of the utter noise here that people actually try to pass off as signal. I'm tired of seeing intellectually dishonest hypotheses getting backed without passing through any grey matter in multiple threads, and from the looks of it, it's not just me. Anyway, that's all I'll say without digressing any further.


Considering the amount of fine-grained detail that would be lost to morphological antialiasing anyway at native 1080, why wouldn't Crytek decide to render at 900 and then apply MLAA/SMAA to the upscaled result, killing potential upscaling artifacts along the way?

The illusion that "native resolution" is some sort of ideal, a panacea for graphics, is rather naive and founded on the misconceptions of the previous generation's resolution wars, about which I'm pretty sure some people have more than a chip on their shoulders.
The starker reality is that in real-time rendering there are limitations to work with; trying to infer the hardware's limitations from some paper-thin hodgepodge completely discounts the plurality of decisions made by TDs, ADs and other associated ninjas (TM) to deliver a certain level of holistic visual quality.
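For what it's worth, the ordering being suggested (render at 900p, upscale to 1080p, then run a morphological-style AA pass on the upscaled image) can be mocked up roughly. The filter below is only a toy luminance-edge blend standing in for MLAA/SMAA, run on random data; it is not Crytek's pipeline, just an illustration of where such a pass would sit.

```python
import numpy as np

def luminance(img):
    # Rec. 709 luma approximation for an RGB image in [0, 1]
    return img @ np.array([0.2126, 0.7152, 0.0722])

def toy_post_aa(img, strength=0.5):
    """Crude edge-blend pass standing in for MLAA/SMAA: find strong luminance
    edges and blend those pixels toward a local average. Real SMAA does
    pattern classification and directed blends; this is only a toy."""
    luma = luminance(img)
    gx = np.zeros_like(luma)
    gy = np.zeros_like(luma)
    gx[:, 1:-1] = luma[:, 2:] - luma[:, :-2]
    gy[1:-1, :] = luma[2:, :] - luma[:-2, :]
    edge = np.clip(np.abs(gx) + np.abs(gy), 0.0, 1.0)[..., None]
    blur = img.copy()
    blur[1:-1, 1:-1] = 0.25 * (img[:-2, 1:-1] + img[2:, 1:-1] +
                               img[1:-1, :-2] + img[1:-1, 2:])
    return img * (1 - strength * edge) + blur * (strength * edge)

def upscale_nearest(img, out_h, out_w):
    """Nearest-neighbour upscale; a real scaler would filter more gently."""
    in_h, in_w = img.shape[:2]
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    return img[ys][:, xs]

# 900p render target -> upscale to 1080p -> AA pass on the upscaled result
frame_900p = np.random.rand(900, 1600, 3).astype(np.float32)
frame_1080p = upscale_nearest(frame_900p, 1080, 1920)
final = toy_post_aa(frame_1080p)
print(final.shape)  # (1080, 1920, 3)
```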
 
The Ryse E3 demo was built with the "bad" drivers, no? What if, with the new drivers and their (supposedly) better performance, instead of upgrading the resolution to 1080p they chose to enhance the shaders?
 
You know, there is a big misunderstanding when it comes to hardware limitations outside of Beyond3D. Fixed hardware will always be locked to its limits, and you can't easily jump outside those boundaries without resorting to unorthodox methods.

Imagine the sacrifices Guerrilla Games would have to make if they wanted to get rid of Shadow Fall's seven LODs in favor of a single 80k LOD. There is a reason they chose seven LODs (40k max, descending to 20k and lower) at 1080p: to leave room for environments and shaders.

1080p is a compromise in itself. Polygons, shaders, and frame rate are compromises in themselves as well.


Even when you upgrade hardware, you are still trading it in for something with similar limitations. This is something that should cross everyone's mind. So when developers talk about compromises, they are not insulting the box; they are simply stating the obvious about working with fixed hardware.

Is that a fair comparison? Seven LODs, but that's for NPCs; the fixed geometry for Ryse only applies to the main character, which makes more sense since it's third person.
 
Is that a fair comparison? Seven LODs, but that's for NPCs; the fixed geometry for Ryse only applies to the main character, which makes more sense since it's third person.

It's fair in that it makes a point. You could have five times the performance of the Xbox One and still target 900p. No matter how powerful your hardware, the percentage difference in pixels from 900p to 1080p is still the same. You can gain a lot in per-pixel quality by rendering at a lower resolution, which in Crytek's view provides the best overall image.

And you may ask, why not just do 480p then? There is clearly an "optimal compromise" between resolution and effects on all systems: PS4, XB1 and PC. When the resolution gets too low, you start to lose the benefits of your higher per-pixel quality. Games should be judged on their overall image, not one arbitrary number that was decided based on film standards.
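The fixed-ratio point is easy to check with quick arithmetic, assuming "900p" means 1600x900 and "480p" means widescreen 854x480:

```python
# Pixel-count comparison; the ratio between resolutions is independent
# of how fast the hardware is.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "480p": (854, 480)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, p in pixels.items():
    print(f"{name}: {p:,} pixels")
print(f"1080p has {pixels['1080p'] / pixels['900p'] - 1:.0%} more pixels than 900p")
# -> roughly 44% more pixels, a gap that doesn't shrink on faster hardware.
```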
 
The way I see it, they found the best settings to achieve the graphical fidelity they wanted to showcase. There is nothing weird or wrong about it, nor does it show any inability or ability of the hardware. They prioritized graphical effects over resolution for this title. If the difference in image sharpness isn't much, then I think every developer would go for more effects. It's just a matter of goals. Internally, they want a good-looking game, not a 1080p game.
 
If Yerli is telling the truth, that Ryse has always been 900p, then that would mean Crytek knew ahead of time that certain XB1 limitations would cripple the new CryEngine's performance. Because we damn well know the new CryEngine goes well beyond 900p rendering...

What possible XB1 hardware limitations did Crytek foresee ahead of time when developing Ryse on the system? Is it the ESRAM configuration? Not enough raw system memory bandwidth? Not enough CUs/ROPs? Or a combination of everything?

Will the XB1 sweet spot for rendering-intensive games be 1600x900, scaled to 1920x1080? Because it seems, IMHO, that Yerli is implying 1080p native render output was never an option during Ryse's development on XB1.

FUD, plain and simple. You aren't trying to ask a question, you are trying to make a point. Bitch and moan about just wanting to have a technical discussion all you want... that post of yours is transparent and IMO has no place here.
 
Yeah, I'm getting tired of posts like that one from Shortbus.

Most HDTVs don't even have 1:1 pixel mapping turned on. When I visit people I have to fight the owners for the remote to turn it on. I wouldn't be surprised if "900p" is indiscernible from "1080p" for the vast majority of HDTV owners, and not a big deal for most of the tiny number who can push their nose up against the TV and actually know what they're looking at.

And I expect that there's very little overlap between those that know what they're looking at and those that make the most noise.
 
I wonder what effects would have been cut if they had gone with 1080p, and whether the trade-offs would be worth it. Just by looking at it, I would cut the object motion blur, get rid of the bokeh DOF, tone down the character polys a bit further to 65k, reduce the joints and maybe cut back some of the grass. This should at least keep everything clean and sharp without looking too much different.
 
I wonder what effects would have been cut if they had gone with 1080p, and whether the trade-offs would be worth it. Just by looking at it, I would cut the object motion blur, get rid of the bokeh DOF, tone down the character polys a bit further to 65k, reduce the joints and maybe cut back some of the grass. This should at least keep everything clean and sharp without looking too much different.
Yet it would still look worse, so why would you?
Although I'm not sure about you saying it wouldn't look _too_ different anyway.

This talk of bottlenecks; what bottlenecks?
Maybe the only bottleneck is that it is on a _console_, and achieving what some of you go on about would require it to run on a PC.
How is that even remotely a bottleneck?

It's up there in the top three best-looking launch games on any platform (going by most people's view), and yet people are still going on like the graphics are gimped in some way. :rolleyes:
 
Anyway, back to the discussion.

Does anyone believe the scaling that was mentioned is their own software implementation, or could it be done by the hardware scaler?
Do we know enough about the hardware scaler to know whether it could do what they mentioned?
 
If consoles are similar to PCs, then the GPU does the scaling (there isn't a separate scaling chip).
And if you're struggling for horsepower, and it looks like Ryse is, that would suggest scaling would be handed off to the TV.
Just send a 900p signal to the TV and it will upscale (unless you've set your TV not to upscale).
 
Could they not just take the same route as GG have done with KZ:SF? Run the single player at 1080p@30 with all the effects switched on, instead of crippling the resolution to enable effects at 60fps?

After all, Ryse is a brand-new IP; there is no conditioning in place to expect 60fps.
 
If consoles are similar to PCs, then the GPU does the scaling (there isn't a separate scaling chip).
And if you're struggling for horsepower, and it looks like Ryse is, that would suggest scaling would be handed off to the TV.
Just send a 900p signal to the TV and it will upscale (unless you've set your TV not to upscale).

You're assuming that Ryse isn't "1080p" because the Xbox is "struggling for power". Crytek are saying they chose 900p for [reasons].

Scaling takes no power on the Xbone because there's a variable-resolution scaler built in, even better than the one in the 360. Also, there probably aren't many TVs that could even take a "900p" input. None of the HDTVs I've used can accept a resolution like that. Even PC inputs are limited to a handful of resolutions.

Mine can't take anything other than 1080, 720, 576, 480.
 
Could they not just take the same route as GG have done with KZ:SF? Run the single player at 1080p@30 with all the effects switched on, instead of crippling the resolution to enable effects at 60fps?

After all, Ryse is a brand-new IP; there is no conditioning in place to expect 60fps.

Ryse is 30fps in single player from what I've read.
 
If consoles are similar to PCs, then the GPU does the scaling (there isn't a separate scaling chip).
There is a separate scaling chip, or rather a logic block, that supports three layered buffers which can be scaled independently, and it uses a reportedly high-quality, though undisclosed, upscaler. It costs zero CPU or GPU power to upscale a lower-resolution framebuffer.
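As a rough software mock of what that buys you, here is a sketch that upscales a hypothetical 900p game plane and composites a native-1080p HUD plane over it. On the console the scaling and blending would be done by the fixed-function display hardware; code like this just illustrates the idea of independently scaled planes.

```python
import numpy as np

def upscale_bilinear(img, out_h, out_w):
    """Simple bilinear resize (software stand-in for the hardware scaler)."""
    in_h, in_w = img.shape[:2]
    y = np.linspace(0, in_h - 1, out_h)
    x = np.linspace(0, in_w - 1, out_w)
    y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
    y1, x1 = np.minimum(y0 + 1, in_h - 1), np.minimum(x0 + 1, in_w - 1)
    wy, wx = (y - y0)[:, None, None], (x - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Game plane rendered at 900p; HUD plane authored at native 1080p with alpha.
game_900p = np.random.rand(900, 1600, 3).astype(np.float32)
hud_1080p = np.zeros((1080, 1920, 4), dtype=np.float32)  # RGBA, mostly transparent

game_1080p = upscale_bilinear(game_900p, 1080, 1920)   # scaled game plane
alpha = hud_1080p[..., 3:]
composited = game_1080p * (1 - alpha) + hud_1080p[..., :3] * alpha  # HUD stays crisp
print(composited.shape)  # (1080, 1920, 3)
```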
 
You're assuming that Ryse isn't "1080p" because the Xbox is "struggling for power". Crytek are saying they chose 900p for [reasons].

Scaling takes no power on the Xbone because there's a variable-resolution scaler built in, even better than the one in the 360. Also, there probably aren't many TVs that could even take a "900p" input. None of the HDTVs I've used can accept a resolution like that. Even PC inputs are limited to a handful of resolutions.

Mine can't take anything other than 1080, 720, 576, 480.

So you're saying they wouldn't go for 1080p even if they could do it with only a little other stuff turned off?

Of course they would, they're Crytek. Push everything to the limit and watch the hardware break down. :LOL:
 