*spin-off* Ryse Trade-Offs

Yep, I've made a test with a shot from the SDK level:

1080p
http://i2.minus.com/iDTL2e6mXarfi.png

Lanczos resample
http://i5.minus.com/id7SUAnSYKaIs.png

Lanczos + 1 DPI sharpening
http://i2.minus.com/iyMxFvY3ddp8z.png

Mitchell resample
http://i4.minus.com/i3FeJQgXYxz7x.png

---
On my TV, there's almost no noticeable difference between 1080p and Lanczos, but I have a crappy 1080p mode and only a 32" set. When I compare them on my PC monitor, the difference is more noticeable.
Ryse will still have better scaling than those.
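
For anyone who wants to reproduce this kind of test, here is a minimal sketch using Pillow. Assumptions: Pillow >= 9.1 is installed, "native_1080p.png" is a placeholder filename, BICUBIC stands in for Mitchell (which Pillow doesn't ship), and the 900p case is approximated by downscaling the 1080p shot first.

```python
# Minimal sketch of the resample comparison above (placeholder filenames;
# BICUBIC used as a stand-in for Mitchell, which Pillow doesn't provide).
from PIL import Image, ImageFilter

native = Image.open("native_1080p.png")                      # 1920x1080 reference shot
low = native.resize((1600, 900), Image.Resampling.LANCZOS)   # approximate a 900p render

# Reconstruct 1080p output with different filters.
lanczos_up = low.resize((1920, 1080), Image.Resampling.LANCZOS)
cubic_up = low.resize((1920, 1080), Image.Resampling.BICUBIC)

# Optional light sharpening pass after the Lanczos upscale.
lanczos_sharp = lanczos_up.filter(ImageFilter.UnsharpMask(radius=1, percent=60))

lanczos_up.save("upscaled_lanczos.png")
lanczos_sharp.save("upscaled_lanczos_sharpened.png")
cubic_up.save("upscaled_bicubic.png")
```

Note that this only approximates the 900p case by downscaling the 1080p shot, which is exactly the objection raised below.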
 
Your example images are inaccurate. You'd need to compare a natively rendered image to an upscaled one, rather than a smaller image to a larger, upscaled one.

The attached shows vector graphics rendered natively in a 192x108 image, and natively in a 160x90 buffer and then upscaled with a simple upscaler.

For some content it makes a negligible observable difference. For other stuff, like alternating lines, it very obviously blurs the results. Whether it makes an impact on this game or not doesn't need to be discussed from a theoretical POV, as people can actually see the game. If one looks at the game and thinks, "my god, that's blurry. I can't play that!" then don't buy it. Otherwise, care not what resolution it's rendering at. ;)
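
For a concrete illustration of the alternating-lines case (this is a sketch, not the actual attachment), the following draws a 1px line pattern natively at 192x108 and at 160x90 and then upscales the smaller one; bilinear is an assumed stand-in for the simple upscaler:

```python
# Illustrative only: native 192x108 vs 160x90 upscaled to 192x108.
from PIL import Image, ImageDraw

def draw_lines(width, height):
    """Render an alternating-line test pattern at the given native size."""
    img = Image.new("L", (width, height), 0)
    draw = ImageDraw.Draw(img)
    for y in range(0, height, 2):  # every other row is white
        draw.line([(0, y), (width - 1, y)], fill=255)
    return img

native = draw_lines(192, 108)
upscaled = draw_lines(160, 90).resize((192, 108), Image.Resampling.BILINEAR)

native.save("lines_native_192x108.png")     # crisp 1px lines
upscaled.save("lines_160x90_upscaled.png")  # lines smear into grey bands
```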

I really wonder what the discourse would be on these games if we had no transparency into native resolutions or upscaled resolutions via the "Quaz Method". Regardless of Crytek's motives, I believe companies will make decisions based on general efficiencies or artistic merit, and the focus on resolution is getting in the way of meaningful discussion on those topics.
 
@ Shifty Geezer: Thanks. I mainly wanted to visualize the pseudo-AA effect - but for a direct comparison of final results, your image is of course way better.

Given your examples, I would actually concede that the diagonal lines do demonstrate a certain loss of detail when directly juxtaposed to the native resolution image ... not so much with the rest of the graphics, though.

Overall conclusion is that 900p at least handily beats our "108p" examples ;)
 
Your example images are inaccurate. You'd need to compare a natively rendered image to an upscaled one, rather than a smaller image to a larger, upscaled one.

The attached shows vector graphics rendered natively in a 192x108 image, and natively in a 160x90 buffer and then upscaled with a simple upscaler.

For some content it makes a negligible observable difference. For other stuff, like alternating lines, it very obviously blurs the results. Whether it makes an impact on this game or not doesn't need to be discussed from a theoretical POV, as people can actually see the game. If one looks at the game and thinks, "my god, that's blurry. I can't play that!" then don't buy it. Otherwise, care not what resolution it's rendering at. ;)

It's quincunx all over again :oops:
 
If Yerli is telling the truth, that Ryse has always been 900p, then that would mean Crytek knew ahead of time certain XB1 limitations would cripple performance on handling the new CryEngine. Because we damn well know the new CryEngine goes well beyond 900p rendering...

What possible XB1 hardware limitations did Crytek foresee ahead of time when developing Ryse on the system? Is it the ESRAM configuration? Not enough raw system memory bandwidth throughput? Not enough CUs/ROPs? Or a combination of everything?

Will the XB1 sweet spot for rendering-intensive games be 1600x900, scaled to 1920x1080? Because it seems, IMHO, Yerli is implying that 1080p native render output was never an option with Ryse development on XB1.
 
What possible XB1 hardware limitations did Crytek foresee ahead of time when developing Ryse on the system?

That the XB1 didn't come with a fast, high-end multi-core CPU and a 32+ CU GPU with 200-300 GB/s of off-chip DRAM bandwidth.

This isn't the PC space, where you target flagship GPUs and then allow PC gamers to adjust settings to accommodate lesser GPUs as well as their individual tastes as gamers.

They picked a resolution that best expressed the visuals they wanted with Ryse.
 
If Yerli is telling the truth, that Ryse has always been 900p, then that would mean Crytek knew ahead of time certain XB1 limitations would cripple performance on handling the new CryEngine. Because we damn well know the new CryEngine goes well beyond 900p rendering...

What possible XB1 hardware limitations did Crytek foresee ahead of time when developing Ryse on the system? Is it the ESRAM configuration? Not enough raw system memory bandwidth throughput? Not enough CUs/ROPs? Or a combination of everything?

Will the XB1 sweet spot for rendering-intensive games be 1600x900, scaled to 1920x1080? Because it seems, IMHO, Yerli is implying that 1080p native render output was never an option with Ryse development on XB1.

He just said that Ryse was always shown running at 900p. It's clear that they were experimenting with resolutions while developing the game; every developer does that.
 
Oh yeah, it's definitely the Xbone's performance that's at fault for there being no wet shader on armor and skin ...

PS4 must be weaker than PS3 then, if KZ:SF doesn't have wet shaders on armor and cloth compared to Beyond ...

Some parts of KZ:SF also have pixelated shadows.
 
He just said that Ryse was always shown running at 900p. It's clear that they were experimenting with resolutions while developing the game; every developer does that.

Wasn't the Stonehenge video found to be 1080p? Does that mean it wasn't running on an XBone?
 
If Yerli is telling the truth, that Ryse has always been 900p, then that would mean Crytek knew ahead of time certain XB1 limitations would cripple performance on handling the new CryEngine. Because we damn well know the new CryEngine goes well beyond 900p rendering...

What possible XB1 hardware limitations did Crytek foresee ahead of time when developing Ryse on the system? Is it the ESRAM configuration? Not enough raw system memory bandwidth throughput? Not enough CUs/ROPs? Or a combination of everything?

Will the XB1 sweet spot for rendering-intensive games be 1600x900, scaled to 1920x1080? Because it seems, IMHO, Yerli is implying that 1080p native render output was never an option with Ryse development on XB1.

New low of shifted goalposts :rolleyes:
 
Agreed. Plus, he's also questioning whether the source is telling the truth. FUD at its best. :rolleyes:

Tommy McClain

You guys have such problems with people having questions... this is a tech discussion board. I don't sit around constantly debating PS4/PC questions or pure FUD answers. Because at the end of the day, it's only hardware... so questions will be asked, regardless of your take on them.
 
New low of shifted goalposts :rolleyes:

That was a helpful post.

Back to his question: if you ran Ryse at 1080p and it dropped frames (which is the implication), what would be the cause? 1080p @ 30fps is not an unreasonable expectation considering the hardware improvements. MS was just telling us that they saw dropped frames from CPU bottlenecks, but that clearly isn't the case here. So what is the issue: CUs, ROPs, bandwidth? We will probably never know, but these types of questions are the point of the forum.
 
That was a helpful post.

Back to his question: if you ran Ryse at 1080p and it dropped frames (which is the implication), what would be the cause? 1080p @ 30fps is not an unreasonable expectation considering the hardware improvements. MS was just telling us that they saw dropped frames from CPU bottlenecks, but that clearly isn't the case here. So what is the issue: CUs, ROPs, bandwidth? We will probably never know, but these types of questions are the point of the forum.

So you want to know what the bottleneck would be to prevent it from running at 1080p, all other things remaining equal? Are you assuming the resolution of other buffers remains the same, or increases proportionally? Without an in-depth technical interview, I don't think you're going to get an answer. There are just too many moving parts to take a stab in the dark by looking at screenshots. We don't even know how the architecture works yet, or how it is being used. It could be fill-rate-, shader-, or CPU-limited at different times.
 
That was a helpful post.

Back to his question: if you ran Ryse at 1080p and it dropped frames (which is the implication), what would be the cause? 1080p @ 30fps is not an unreasonable expectation considering the hardware improvements. MS was just telling us that they saw dropped frames from CPU bottlenecks, but that clearly isn't the case here. So what is the issue: CUs, ROPs, bandwidth? We will probably never know, but these types of questions are the point of the forum.
You'd probably get dropped frames because it's 900p now. If it were 1080p there'd be some other trade-off.

Were any Crytek PS360 games 720p?
 
I'm wondering how much input MS took from developers during the XBone design period. If the limits of the design were known ahead of time by the likes of Crytek, then why was this never fed back to MS? Or perhaps it was, but it was ignored because of the non-game focus?
 
You'd probably get dropped frames because it's 900p now. If it were 1080p there'd be some other trade-off.

Were any Crytek PS360 games 720p?

Yes, I was assuming everything else was held constant. How do the additional pixels stress the hardware; where is the bottleneck?
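
As a rough back-of-the-envelope on how the extra pixels add up (the render-target format below is an assumption for illustration, not Ryse's actual configuration):

```python
# Back-of-the-envelope only; the 4-byte RGBA8 render target is an assumed
# example, not Ryse's actual buffer setup.
pixels_900p = 1600 * 900     # 1,440,000 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(pixels_1080p / pixels_900p)            # 1.44 -> ~44% more pixels to fill and shade

bytes_per_pixel = 4                          # assumed RGBA8 target
mib = 1024 * 1024
print(pixels_900p * bytes_per_pixel / mib)   # ~5.5 MiB per render target
print(pixels_1080p * bytes_per_pixel / mib)  # ~7.9 MiB per render target
# Several such targets plus a depth buffer have to share the XB1's 32 MB of
# ESRAM, so a 900p set of buffers is noticeably easier to fit than a 1080p one.
```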
 
That was a helpful post.

Back to his question: if you ran Ryse at 1080p and it dropped frames (which is the implication), what would be the cause? 1080p @ 30fps is not an unreasonable expectation considering the hardware improvements. MS was just telling us that they saw dropped frames from CPU bottlenecks, but that clearly isn't the case here. So what is the issue: CUs, ROPs, bandwidth? We will probably never know, but these types of questions are the point of the forum.

But if ROPs were an issue, wouldn't that mean Forza 5 shouldn't be able to run at 1080p@60fps with 2xMSAA(?)?

Next gen is just weak compared to the top of the mid-range PC segment; by that I mean 7870-level cards, not even the 79xx cards. That still shouldn't keep people from enjoying the games.
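
For what it's worth, a very crude pixel-throughput comparison behind the Forza question (it ignores MSAA resolve, the forward-vs-deferred difference, and shader cost entirely):

```python
# Crude pixels-per-second comparison; ignores MSAA, shader load and everything
# else the two renderers do differently.
forza_rate = 1920 * 1080 * 60   # ~124.4M pixels/s at 1080p60
ryse_rate = 1600 * 900 * 30     # ~43.2M pixels/s at 900p30
print(forza_rate / ryse_rate)   # ~2.9x -> raw pixel output alone is unlikely
                                # to be what caps Ryse at 900p
```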
 
I'm wondering how much input MS took from developers during the XBone design period. If the limits of the design were known ahead of time by the likes of Crytek, then why was this never fed back to MS? Or perhaps it was, but it was ignored because of the non-game focus?

They probably received and listened to a lot of feedback, including from Crytek. How do you know the feedback was ignored, without knowing what it was or who presented it? And do you mean the non-game focus is why it was ignored?
 