*spin-off* Ryse Trade-Offs

So you're saying they wouldn't go for 1080p if they could do it with a little other stuff turned off.

Of course they would, they're Crytek. Push everything to the limit and watch the hardware break down.:LOL:

Why is this so difficult to understand?

If they could go to 1080p with no performance hit that would mean they were making a game at 900p and intentionally leaving a huge amount of power unused. That would be absolutely stupid.

According to Crytek, they made a choice that 900p (god I hate that term) was the optimal resolution to allow them to achieve what they want to achieve. Because fewer pixels means they have more time to spend on each pixel.

That is not just true for Xbone, it is true for PS4, PS3, 360, Wii U, and basically anything in use today.
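
To put rough numbers on that "more time per pixel" point (pure arithmetic, nothing Crytek has published):

```python
# Per-pixel shading budget at a fixed frame rate. Just arithmetic,
# not based on anything Crytek has published.
frame_ms = 1000.0 / 30  # 30 fps target, ~33.3 ms per frame

for name, w, h in (("900p", 1600, 900), ("1080p", 1920, 1080)):
    pixels = w * h
    ns_per_pixel = frame_ms / pixels * 1e6  # ms -> ns
    print(f"{name}: {pixels:,} pixels, ~{ns_per_pixel:.1f} ns of frame time each")

# 1080p is 2,073,600 / 1,440,000 = 1.44x the pixels of 900p, so at the
# same frame rate every pixel gets roughly 30% less time.
```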
 
So you're saying they wouldn't go for 1080p if they could do it with a little other stuff turned off.

Of course they would, they're Crytek. Push everything to the limit and watch the hardware break down.:LOL:
They could do that on PC because they could develop a game that would scale and max out hardware that improves every year, and still not be able to run it on max settings for years (on reasonable budget hardware).

On consoles you don't have that luxury.
Why compromise the visuals by going for 1080p?
I'm not saying every game should go 900p; it may depend on art direction, engine, and whether scaling produces an image that most people would think is native, or close enough.

The point is the GPU could have had 60%+ more processing power and they may have still gone for 900p, for the reasons in the last paragraph, or it still may not have been enough to hit 1080p with the same visual fidelity at a locked frame rate.
 
If what the Crytek dude is saying is true, and Ryse has always been 1600 x 900 with a custom 'AA' scaler taking the game up to a final framebuffer size of 1080, then that means that no-one who has been crying about a "downgrade" can even tell the difference.

They are proof that they need to stop clinging so tightly to the single metric that they think they can understand.
 
I'm starting to think both Sony and MS should mandate that you can't say what res a game is prior to release. lol.

It just doesn't mean what people think it does any more.
At most it's going to become a measure of whether something is sharp or blurry.
720p = blurry -> 1080p = sharp, regardless of what the actual resolution is, however you choose to measure it.

Leave it up to the pixel counters to work it out, and everyone else will just judge whether it's sharp or not. That way a game won't be marked down because everything isn't rendered in 1080p or something.
 
I don't understand why people care. If you look at the game and think it looks good, what does it matter what resolution it is? It's an interesting thing to discuss from a technical perspective, but I don't understand why anyone would be upset about it, or why some people demand that games are forced to render at 1080p (which buffers?).

I am curious about what they're doing for upscaling and AA. I think it would make sense that they're applying anti-aliasing to the upscaled image, and then adding the UI on top, as some here have suggested.
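
Purely as a sketch of that guess, the frame flow might look something like this (every function below is a made-up stand-in for illustration, not Crytek's confirmed pipeline):

```python
import numpy as np

def upscale_nearest(img, w, h):
    """Crude nearest-neighbour upscale, standing in for the custom scaler."""
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[ys][:, xs]

def post_aa(img):
    """Stand-in 'AA': a separable 3x3 box blur as a very crude post filter."""
    out = img.astype(np.float32)
    for axis in (0, 1):
        out = (np.roll(out, 1, axis=axis) + out + np.roll(out, -1, axis=axis)) / 3
    return out.astype(img.dtype)

scene_900p = np.random.randint(0, 256, (900, 1600, 3), dtype=np.uint8)  # shaded scene at 900p
scene_1080p = upscale_nearest(scene_900p, 1920, 1080)  # scaler runs first
aa_1080p = post_aa(scene_1080p)                        # AA applied to the upscaled image
ui_layer = np.zeros_like(aa_1080p)                     # placeholder native-res UI
frame = np.maximum(aa_1080p, ui_layer)                 # trivial 'composite' of UI on top
print(frame.shape)  # -> (1080, 1920, 3)
```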
 
Ryse is 30fps in single player from what I've read.

Oh. I thought it was a 900p@60. I'm confused as to why 900p is the sweet spot then apart from being the most appropriate for upscaling. What was it that stopped them hitting 1080p then?

Apart from timing profiles etc., the similarity between the XB1 and the 360 shouldn't have led to any big surprises, should it?
 
Why is this so difficult to understand?

If they could go to 1080p with no performance hit that would mean they were making a game at 900p and intentionally leaving a huge amount of power unused. That would be absolutely stupid.

According to Crytek, they made a choice that 900p (god I hate that term) was the optimal resolution to allow them to achieve what they want to achieve. Because fewer pixels means they have more time to spend on each pixel.

That is not just true for Xbone, it is true for PS4, PS3, 360, Wii U, and basically anything in use today.

It's not hard to understand, and most people here probably do.

You're assuming that Ryse isn't "1080p" because the Xbox is "struggling for power". Crytek are saying they chose 900p for [reasons].
Obviously they chose 900p because they wanted to have a certain amount of effects, and doing it at a higher resolution would force them to sacrifice IQ for the same fluidity, or sacrifice fluidity to maintain IQ (if we define IQ here as independent of frame rate and resolution).

Saying "Xbox One will struggle for power (to do what they want to do@1080p)" or "900p gives better results" is really the same.

It's just the same situation described differently.
 
The first 720p 30fps game that comes out this gen better look like Infiltrator. That's my minimum expectation. And I don't think that's unreasonable, a few years in.
 
If what the Crytek dude is saying is true, and Ryse has always been 1600 x 900 with a custom 'AA' scaler taking the game up to a final framebuffer size of 1080, then that means that no-one who has been crying about a "downgrade" can even tell the difference.

They are proof that they need to stop clinging so tightly to the single metric that they think they can understand.

I think this is a straw man. I don't see many people claiming they see a downgrade; they are not talking about the IQ before or after. The discussion (to me) was about why they chose a sub-1080P res, the technical trade-offs. For example, we know why Halo 3 ran at 640P; I'm interested to know why Crytek chose 900P for the XB1. Was it the eSRAM size, the fillrate, etc.? Clearly they would run at 1080P if they could, but they can't.

People are being a little too defensive; these types of trade-offs are made on every console for every game. We just happen to be talking about one game on one console in this thread.
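
For what it's worth, the eSRAM question at least has numbers you can sketch out. A rough back-of-the-envelope (the render-target layout here is a generic deferred-renderer guess, not anything Crytek has disclosed):

```python
# Render-target footprint vs. the XB1's 32 MB of eSRAM. The G-buffer
# layout below is a generic deferred-renderer guess, NOT Crytek's setup.
ESRAM_MB = 32

def target_mb(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / (1024 * 1024)

for name, w, h in (("900p", 1600, 900), ("1080p", 1920, 1080)):
    gbuffer = 4 * target_mb(w, h)  # e.g. three 32-bit colour targets + depth
    print(f"{name}: ~{gbuffer:.1f} MB of {ESRAM_MB} MB eSRAM for four 32-bit targets")

# 900p:  ~22.0 MB, fits with headroom for shadow maps etc.
# 1080p: ~31.6 MB, nearly the whole eSRAM before anything else.
```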
 
Considering how beautiful the game looks, 900p was freakin' worth it! It's probably the only game that looks truly next gen.
 
Oh. I thought it was a 900p@60. I'm confused as to why 900p is the sweet spot then apart from being the most appropriate for upscaling. What was it that stopped them hitting 1080p then?

Apart from timing profiles etc., the similarity between the XB1 and the 360 shouldn't have led to any big surprises, should it?

What do you mean by sweet spot? They picked 900p because it allowed them to fit in all of the rendering "effects" they wanted, at the framerate they were targeting. I imagine that resolution was seen as not compromising image quality significantly, so it was deemed acceptable. For all we know they started with the idea of 900p, assuming they could upscale and anti-alias with minimal impact to image quality, and never intended to try for 1080p.

If you're asking why they could not hit 1080p, assuming all other things remained equal, then I'd say it's a question that can only be answered by Crytek. I mean, the amount of information collected in profiling, and understanding how each piece of the rendering pipeline can impact the framerate, is quite complicated. We just don't have the information, and honestly there are probably only a dozen people on this forum who could make a useful educated guess if we did.

What does the 360 have to do with this? It's a completely different hardware architecture.
 
There is a separate scaling chip, or rather logic block,
I stand corrected :D

You're assuming that Ryse isn't "1080p" because the Xbox is "struggling for power". Crytek are saying they chose 900p for [reasons].
And the reason is they don't have the horsepower to do 1080p.
Crytek have a history of this; when Crysis was released, no one was playing it at 1080 @ 60fps, full quality and max AA.
You never know, AMD could still be in the process of optimizing their drivers, and by the time the XB1 comes out there's a chance Ryse could end up running at 1080.
We've seen some big improvements in games on the PC from newer drivers, although there is probably less performance to uncover on a console, it being closer to the metal.
 
It's not hard to understand, and most people here probably do.

Obviously they chose 900p because they wanted to have a certain amount of effects, and doing it at a higher resolution would force them to sacrifice IQ for the same fluidity, or sacrifice fluidity to maintain IQ (if we define IQ here as independent of frame rate and resolution).

Saying "Xbox One will struggle for power (to do what they want to do@1080p)" or "900p gives better results" is really the same.

It's just the same situation described differently.

I didn't think that was the point you were trying to make. I still don't, for that matter. You said, to me:

"So you're saying they wouldn't go for 1080p if they could do it with little other stuff turned off."

And I'm saying that they might not. If Xbone had the power to do Ryse as is at 1080p, Crytek might very well have chosen to stay at "900p" and up detail elsewhere. Because as it now appears, no one claiming it's an issue could even tell it wasn't rendering at 1080p in the first place.

You are acting like 1080p is some kind of implied goal. It doesn't have to be. And Crytek claim it wasn't. It might never have been. Even with double the power it still might not be.

I think this is a straw man. I don't see many people claiming they see a downgrade; they are not talking about the IQ before or after. The discussion (to me) was about why they chose a sub-1080P res, the technical trade-offs. For example, we know why Halo 3 ran at 640P; I'm interested to know why Crytek chose 900P for the XB1. Was it the eSRAM size, the fillrate, etc.? Clearly they would run at 1080P if they could, but they can't.

NOOOOOOOOOOOOOOO.

IT IS NOT CLEAR THAT THEY WOULD RUN AT 1080 IF THEY COULD. THEY MIGHT HAVE CHOSEN TO FOCUS ON HIGHER QUALITY PIXELS INSTEAD OF HAVING MORE OF THEM.

And it is not a "strawman" to point out that people don't seem to have been able to tell that the game was always 1600 x 900 and not 1080p. It's not a strawman because it validates Crytek's choice, and people here are claiming that they are trying to understand Crytek's choices and why the game is as it is.

People are being a little too defensive; these types of trade-offs are made on every console for every game. We just happen to be talking about one game on one console in this thread.

I'm not getting defensive, I'm getting annoyed. Annoyed that people are making so little effort to try and understand that resolution is a CHOICE that is made as part of an OVERALL SET OF COMPROMISES that are part of trying to deliver THE BEST OVERALL VISUAL PACKAGE.

And the reason is they don't have the horsepower to do 1080p.

*facepalm*
 
FUD, plain and simple. You aren't trying to ask a question, you are trying to make a point. Bitch and moan about just wanting to have a technical discussion all you want... that post of yours is transparent and IMO has no place here.

Yeah, I'm getting tired of posts like that one from Shortbus.

Then I will bitch, moan and be transparent, because anything dealing with the technical aspects of XB1 hardware or its games is off limits to little dickus, crack-function and whichever other company shills don't want to address questions. ;)
 
If Xbone had the power to do Ryse as is at 1080p, Crytek might very well have chosen to stay at "900p" and up detail elsewhere. Because as it now appears, no one claiming it's an issue could even tell it wasn't rendering at 1080p in the first place.

That's the part which is mildly hysterical. People are basically asking why Crytek haven't implemented something most people can't see. If people can't see it, as in this case where people clearly couldn't tell 900 from 1080, then why waste cycles on it to begin with? Presumably people want the best looking game, no? If so, isn't it logical to remove cycles spent on stuff that can't be noticed and spend them elsewhere where they can be? That would be the smart developer choice, no? It's weird; I don't get why people purposely want a game to make inefficient use of resources and spend them where they aren't noticed. Why would Crytek ever willingly do that?
 
That's the part which is mildly hysterical. People are basically asking why Crytek haven't implemented something most people can't see. If people can't see it, as in this case where people clearly couldn't tell 900 from 1080, then why waste cycles on it to begin with? Presumably people want the best looking game, no? If so, isn't it logical to remove cycles spent on stuff that can't be noticed and spend them elsewhere where they can be? That would be the smart developer choice, no? It's weird; I don't get why people purposely want a game to make inefficient use of resources and spend them where they aren't noticed. Why would Crytek ever willingly do that?

Because 1080p.
 
That's the part which is mildly hysterical. People are basically asking why Crytek haven't implemented something most people can't see. If people can't see it, as in this case where people clearly couldn't tell 900 from 1080, then why waste cycles on it to begin with? Presumably people want the best looking game, no? If so, isn't it logical to remove cycles spent on stuff that can't be noticed and spend them elsewhere where they can be? That would be the smart developer choice, no? It's weird; I don't get why people purposely want a game to make inefficient use of resources and spend them where they aren't noticed. Why would Crytek ever willingly do that?

Then why ever have anything over 900p? If nobody can tell the difference, why bother?
 
Then why ever have anything over 900p? If nobody can tell the difference, why bother?
We may see that every game gets optimised to 900p for that very reason. We're certainly seeing 720p games for that reason. But it's clearly going to depend on the game, e.g. a Geometry Wars type game could look noticeably sharper at 1080p than at 900p.
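
Whether those extra pixels are even resolvable depends on screen size and viewing distance. A rough sketch using the common ~1 arcminute (60 pixels per degree) acuity rule of thumb (the screen width and distances are just example numbers):

```python
import math

# Rough acuity check: when is 900p vs 1080p even resolvable? Uses the
# common ~60 pixels-per-degree (1 arcminute) rule of thumb; screen width
# and viewing distances are example numbers only.
def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

WIDTH = 1.11  # a ~50 inch 16:9 TV is roughly 1.11 m wide
for dist in (1.5, 3.0):
    for label, px in (("900p source (1600 wide)", 1600), ("native 1080p (1920 wide)", 1920)):
        ppd = pixels_per_degree(px, WIDTH, dist)
        verdict = "beyond" if ppd >= 60 else "within"
        print(f"{dist} m, {label}: {ppd:.0f} px/deg ({verdict} ~60 px/deg acuity)")

# At ~3 m both figures exceed ~60 px/deg, so the extra pixels are hard to
# see; sit closer (or render razor-sharp 2D like Geometry Wars) and the
# difference becomes visible.
```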
 
Considering how beautiful the game looks, 900p was freakin' worth it! It's probably the only game that looks truly next gen.


Exactly. The Xbox One can have a standard of 1080p 30fps like the rest of the games coming, but Ryse isn't like the rest of the games, and this is the reason for going unorthodox.

For fixed hardware, even when you have an architecture laid out with fewer bottlenecks, there's nothing that says devs can't favor more shaders or polys over pixels. An unorthodox pixel count goes with any fixed hardware; Battlefield 4 is another example of such choices.
 
If what the Crytek dude is saying is true, and Ryse has always been 1600 x 900 with a custom 'AA' scaler taking the game up to a final framebuffer size of 1080, then that means that no-one who has been crying about a "downgrade" can even tell the difference.

They are proof that they need to stop clinging so tightly to the single metric that they think they can understand.

+1...haha nicely put.
 