*spin-off* Ryse Trade-Offs

1080p is a compromise with regards to graphics rendering.
Graphics rendering is a compromise with regards to 1080p.

Do you see anything wrong with either statement?
I don't.

In both cases, they're all compromises, and there's quite frankly nothing wrong with admitting that (except for PR problems). But saying a fish is not a fish will probably invite more backlash.

Stop trying to put a spin on it, because a compromise is a compromise. Rewording it doesn't suddenly make it not so.

You're messing with the context of Crytek's comment; they said they didn't compromise RYSE, as it has always been 900p.

Was it a compromise to choose graphics over 1080p? Of course, but that is not what he is saying here.

If it has always been 900p, though, then the product itself was not compromised in the way people thought it was.
 
What does he mean by using the upscaler for AA? I didn't think you would get the appearance of AA by scaling an image up, only down. Or is he talking about something like a sharpening filter?
 
Too long to read that whole thing? 1080p is a compromise with regards to graphics rendering, not the other way around. Lower resolution always allows for better graphics. The question is how low you can go before people will legitimately notice without being told.

You know, as well as most people here do, that graphics fidelity and resolution go hand in hand in eating up resources. Upping one side will undeniably decrease your capacity on the other.

By choosing 900p, they will undeniably have more resources to increase graphics fidelity, provided everything else is equal. So when today they suddenly say "we're lowering polygons" and "in fact we're running at 900p", there is no doubt they're reallocating resources to other areas of the game. Without going with 900p instead of 1080p, they would probably have to turn certain things off. This is a compromise, and there's nothing wrong with that.


Stop trying to put a spin on it, because a compromise is a compromise. Rewording it doesn't suddenly make it not so.
 
You know, as well as most people here do, that graphics fidelity and resolution go hand in hand in eating up resources. Upping one side will undeniably decrease your capacity on the other.

By choosing 900p, they will undeniably have more resources to increase graphics fidelity, provided everything else is equal. So when today they suddenly say "we're lowering polygons" and "in fact we're running at 900p", there is no doubt they're reallocating resources to other areas of the game. Without going with 900p instead of 1080p, they would probably have to turn certain things off. This is a compromise, and there's nothing wrong with that.


Stop trying to put a spin on it, because a compromise is a compromise. Rewording it doesn't suddenly make it not so.

I think he means "900p is not a compromise compared to the E3 demo", which people thought was 1080p.
 
You know, as well as most people here do, that graphics fidelity and resolution go hand in hand in eating up resources. Upping one side will undeniably decrease your capacity on the other.

By choosing 900p, they will undeniably have more resources to increase graphics fidelity, provided everything else is equal. So when today they suddenly say "we're lowering polygons" and "in fact we're running at 900p", there is no doubt they're reallocating resources to other areas of the game. Without going with 900p instead of 1080p, they would probably have to turn certain things off. This is a compromise, and there's nothing wrong with that.


Stop trying to put a spin on it, because a compromise is a compromise. Rewording it doesn't suddenly make it not so.

What? From a fixed hardware standpoint, 1080p is ALWAYS a compromise when it comes to what, how many, and how advanced the things you can render are. It isn't like the PC, where you can just throw more powerful hardware at it in order to do the things you could do at a lower resolution.

You also basically gain nothing at typical living room viewing distances on typical (30-60 inch) living room TVs. Sure, you could make an argument for people playing their console on a desktop monitor or sitting 1 meter away from their TV (maybe; it depends on how good their upscale + AA algorithm is, though 900p to 1080p shouldn't present many upscale artifacts), but that's not the use case that console developers are coding for.
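As a rough sanity check on that viewing-distance claim, here's a quick back-of-the-envelope Python sketch (my own numbers, not anything from Crytek or the thread). It assumes roughly 60 pixels per degree as the 20/20 acuity limit, and the 50-inch panel and 3 m couch distance are illustrative guesses:

```python
# Rough back-of-the-envelope check, not a rigorous perceptual model.
# Assumes ~60 px/degree as the 20/20 acuity limit; screen size and
# viewing distance below are illustrative guesses, not measurements.
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_m, aspect=16 / 9):
    width_m = diag_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = width_m / horiz_pixels
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_m / (2 * distance_m)))
    return 1 / deg_per_pixel

# 1080p vs 900p content filling the same hypothetical 50" panel at 3 m.
for horiz in (1920, 1600):
    ppd = pixels_per_degree(50, horiz, 3.0)
    print(f"{horiz} px wide: {ppd:.0f} px/degree (acuity limit ~60)")
```

Under those assumptions both resolutions land above the ~60 px/degree acuity limit at 3 m, which is the gist of the "you gain nothing on a living-room TV" argument.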

Just like 60 FPS versus 30 FPS: 60 FPS is quite obviously a compromise with regards to what can be rendered. But in that case, depending on the game type, there are actual benefits that will be felt by a large number of people, unlike the difference between 1080p and 900p.

Anyway...

So, basically what you are trying to say is that every single game on every single console is a compromise. And so every single developer in the world should just come out and say that the graphics for their game is a compromise. Gotcha.

And as others have noted, going from 900p (E3) to 900p (now) is certainly not a compromise between E3 and now.

And yes, lowering the polygon count on in-game characters in order to boost the graphical effects applied to those characters, or visible improvements to the background, is a compromise in terms of numbers. But if the reduction in polygons is unnoticeable while the effects added are noticeable, is it really a compromise? Considering I have been vocal about how disappointed I am that ALL next-gen games still have easily visible straight polygon edges on characters, I'm not so sure. The 150k-poly character likely had just as many noticeable straight poly edges on character outlines as the 85k-poly character. They might have been smaller segments, however. And going back to my whole "living room" console experience, I may not notice them on a TV as I do in screenshots on my monitor. But I'm not sure on that.

Regards,
SB
 
What? From a fixed hardware standpoint, 1080p is ALWAYS a compromise when it comes to what, how many, and how advanced the things you can render are. It isn't like the PC, where you can just throw more powerful hardware at it in order to do the things you could do at a lower resolution.

What the heck is your argument? Nobody's saying 1080p won't require more resources than 900p or anything below.

You also basically gain nothing at typical living room viewing distances on typical (30-60 inch) living room TVs. Sure, you could make an argument for people playing their console on a desktop monitor or sitting 1 meter away from their TV (maybe; it depends on how good their upscale + AA algorithm is, though 900p to 1080p shouldn't present many upscale artifacts), but that's not the use case that console developers are coding for.

Surely you're not saying that they don't care about anybody sitting in front of their computer monitor. I do that, and I know a LOT of people who do. Just because you don't, and don't care much for those who do, doesn't mean they aren't important and that devs can just ignore them.

Just like 60 FPS versus 30 FPS: 60 FPS is quite obviously a compromise with regards to what can be rendered. But in that case, depending on the game type, there are actual benefits that will be felt by a large number of people, unlike the difference between 1080p and 900p.

You're again alienating people you obviously don't care about.
There are people who don't care whether it's 30 fps or 60 fps, while there are people who care more about resolution.

Anyway...

So, basically what you are trying to say is that every single game on every single console is a compromise. And so every single developer in the world should just come out and say that the graphics for their game is a compromise. Gotcha.

If anybody says that it's not a compromise with a straight face, you know there's obviously something wrong. The fact that it is a compromise doesn't mean they have to tell you it is, but that doesn't justify flat-out lying about it.

And as others have noted, going from 900p (E3) to 900p (now) is certainly not a compromise between E3 and now.

And yes, lowering the polygon count on in-game characters in order to boost the graphical effects applied to those characters, or visible improvements to the background, is a compromise in terms of numbers. But if the reduction in polygons is unnoticeable while the effects added are noticeable, is it really a compromise? Considering I have been vocal about how disappointed I am that ALL next-gen games still have easily visible straight polygon edges on characters, I'm not so sure. The 150k-poly character likely had just as many noticeable straight poly edges on character outlines as the 85k-poly character. They might have been smaller segments, however. And going back to my whole "living room" console experience, I may not notice them on a TV as I do in screenshots on my monitor. But I'm not sure on that.

Regards,
SB

"I care about what I see and only what I see and if I don't notice it it's irrelevant."


I think he means "900p is not a compromise from E3 demo", that people thought it was 1080p.
If that is so, then it's fine, as long as the E3 version wasn't 1080p, which we probably don't have precise info on.
 
1080p is a TV format. Unless I'm mistaken, I don't remember 1080p being some desirable type of target for game development. I don't remember 1080p having any special hold on the PC, which had the hardware to support the resolution well before the term HDTV became part of the general technology lexicon.

Is there a special reason why 1080p was chosen as a standard TV resolution, other than being the number of pixels that neatly fit into a dimension that readily supports movie formats? Movie formats that seem to be influenced by the concept of serving an audience rather than just one particular set of eyes. If not, then I can see devs balancing resolution with a number of other variables to determine where resources should be allocated in order to produce the best visuals possible.

The "more pixels" argument (roughly 630k extra pixels, or about 44% more) is, to me, not readily visible when comparing 1080p and 900p images, especially weighed against the GPU processing power needed to render those extra pixels.
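For reference, the raw arithmetic behind those pixel counts (a quick sketch of my own, not taken from any post in the thread):

```python
# Raw pixel-count arithmetic for 1080p vs 900p.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_900p = 1600 * 900     # 1,440,000 pixels

extra = res_1080p - res_900p
print(f"1080p renders {extra:,} more pixels ({extra / res_900p:.0%} more than 900p)")
print(f"900p is {1 - res_900p / res_1080p:.1%} fewer pixels than 1080p")
```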

I see developers being far more concerned with frame rates than with a particular TV resolution.
 
Potentially, with all the small details they have in Ryse, it is clear that 1080p would be a huge benefit.

But the clips we have so far also show a lot of post-processing and 'artificial' blurring due to the cinematic effects, so I guess Ryse will get away with the lower resolution without real impact. Furthermore, it seems that the areas are quite small, without a huge draw distance.

I'd say in the case of Ryse, 900p is OK... I just hope we don't get lots of edge aliasing issues/shimmering.
 
He is saying that it's been the same resolution all along, so there was no compromise on the game's resolution. Also, if they were able to optimize their use of polygons, making it look better with fewer, are we really going to consider that a compromise?

The funny thing is that no one had a hard time believing it was 1080p before we knew it wasn't.
As Shifty pointed out already, the game looks better now than it used to. Examples:

[image: ryse1_by_raziel1992-d6odjcy.gif]

[image: rkq09Hu.jpg]

[image: pKufYcm.jpg]

[image: 2L9GUTX.jpg]


http://i.imgur.com/Y883qro.jpg

http://i.imgur.com/sP2qeB7.jpg


Before and after:

[image: ibr2dgbhWoGRSU.gif]

[image: ib11ggBsW8mOGS.gif]


So if they get better results, I would call that re-allocation. 900p would be a compromise if they preferred to go with a lower resolution to enhance the anti-aliasing, though.

I'd say that 1080p + 2xAA should look great, but if they found a better use of those resources for a launch game, then I am certainly not repulsed.

Crytek also said that the game couldn't run at 1080p on the PS4 -- although I'd expect some kind of bump if the game ran on Sony's console, because of the GPU.

http://gearnuke.com/crytek-ceo-ryse-wouldnt-have-run-at-1080p-on-ps4-decision-was-choice-not-hurdle/

Finally, regarding the polygons, nowadays it is not about rendering more polygons like it was in the past.

I bet many people didn't know this: Halo 1 uses more polygons for Master Chief than Halo 2 does. That's the wonder of modern technology and smarter GPUs.

Despite the fact that Master Chief in Halo 1 had a lot more polygons than in Halo 2, he looked far worse in comparison. (Now compare Xenon to Jaguar, too... but that's another story.)

Links:

http://beyond3d.com/showthread.php?t=43975 (AllNets response)

http://halo.bungie.org/misc/bollmc2/ (flash animation showing the polygon mesh of Halo 2's Master Chief and the transition to how it looked with normal mapping http://halo.bungie.org/miscellaneous.html?search=bumpmapping)

http://previews.teamxbox.com/xbox/395/Halo-2/p2/ (from the Halo 2 preview, my favourite Halo ever btw.)

Less Polygons: A good thing

Yes, you read that right, and I'm not crazy. HALO 2 has models with fewer polygons than the original. And yes, that's a good thing. In fact, it is a benefit of per-pixel lighting that makes models less dense. Fewer polygons per model simplifies collision, improves physics (which were already amazing in HALO 1), and eases memory issues. This way, the game will have faster loading times. Another benefit of having fewer polygons per mesh will be for online functionality. With this new rendering method, level data is smaller, and so is the info transmitted over internet networks during online games. The improvement in the visuals is now achieved on every Xbox when the rendering process happens, and not by having bigger, more complex data that has to be transmitted when playing online.
http://xbox.gamespy.com/xbox/halo-2/528851p8.html (another preview during the good ol' times of the extinct Gamespy)

In the previous game, while everything looked nice, it all had a very flat, boring appearance. With the addition of a new bump-mapping engine, every object in the game has the illusion of depth. When combined with the lighting effects, nothing in the game looks flat. Metal surfaces are slightly pitted and bumpy, and look so real that you almost want to reach out and touch them. The same goes for the rock walls, as well as the weapons that you'll come across.
And the best article on the subject, called The Halo Effect, is worth the read!

http://www.cgw.com/Publications/CGW/2005/Volume-28-Issue-1-Jan-2005-/The-Halo-Effect.aspx

Following this mandate to surpass the quality of the previous title, the artists had two main goals for Halo 2: make it better and make it run faster. This required the team to build all the new and returning characters with lighter geometry.

Despite the lower polygon counts, the new models outshine their predecessors in detail, thanks to an increased reliance on normal mapping, a form of texture mapping that greatly enhances the realism and lighting of a surface by encoding details using three vectors of information instead of the usual two vectors designated by bump maps.

In Halo 2, the spotlight is on Master Chief, which 3D artist Eric Arroyo built vertex by vertex within Discreet’s 3ds max. While modeling the cyber-soldier’s smoothly curving metallic surfaces, with their sharp ridges, and beveled edges, Arroyo and the rest of the team paid special attention to the Master Chief’s joints, to ensure that the complex structural design of the highly detailed battle suit would deform realistically.
The character Master Chief returns in Halo 2, this time with fewer polygons but a higher level of detail
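To make the normal-mapping point concrete, here's a tiny toy sketch of my own (not Bungie's actual pipeline): it shades a completely flat, low-poly quad per pixel from a synthetic tangent-space normal map, which is the trick that let Halo 2's lower-poly Master Chief look more detailed than Halo 1's. The map values and light direction are made up for illustration.

```python
# Toy illustration of normal mapping: per-pixel lighting from a normal map
# makes a flat surface shade as if it had geometric detail. numpy only;
# the "normal map" here is synthetic, not real game data.
import numpy as np

h, w = 64, 64
# Synthetic tangent-space normal map: gentle ridges across a flat quad.
x = np.linspace(0, 4 * np.pi, w)
nx = 0.3 * np.sin(x)[None, :].repeat(h, axis=0)
ny = np.zeros((h, w))
nz = np.sqrt(np.clip(1.0 - nx**2 - ny**2, 0.0, 1.0))
normals = np.stack([nx, ny, nz], axis=-1)          # (h, w, 3) per-pixel normals

light = np.array([0.3, 0.2, 0.93])
light = light / np.linalg.norm(light)

# Per-pixel Lambert term: the flat quad now shades as if it had ridges,
# even though the underlying geometry never changed.
lambert = np.clip(normals @ light, 0.0, 1.0)
print("shading range across the flat quad:", lambert.min(), "-", lambert.max())
```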
 
1080p or 900p doesn't matter to me if the game looks good, and this one does; hopefully the gameplay matches the visuals. If so, MS will have a solid new IP.
 

I find it interesting that they mention they would have bumped into the same situation with the PS4. Makes me wonder about the gap.

All of the big devs should have final or near-final devkits for both consoles by now.

Potentially, with all the small details they have in Ryse, it is clear that 1080p would be a huge benefit.

But the clips we have so far also show a lot of post-processing and 'artificial' blurring due to the cinematic effects, so I guess Ryse will get away with the lower resolution without real impact. Furthermore, it seems that the areas are quite small, without a huge draw distance.

I'd say in the case of Ryse, 900p is OK... I just hope we don't get lots of edge aliasing issues/shimmering.

We've been seeing the game in 900p the whole time; I can't recall much in the way of jaggies. Honestly, it fooled everyone into thinking it was 1080p.


----------------------------------------------

For anyone fishing for high-quality footage of Ryse, Xbox Live has the old E3 8-minute demo in super high fidelity. It should be under the "coming soon" tab in their video games section.

Or you could do the next best thing and see glimpses of it in these vids:
http://www.gamersyde.com/download_cryengine_gc_cryengine_demo-30611_en.html
http://www.gamersyde.com/download_ryse_son_of_rome_behind_the_scenes-30509_en.html

They have a little compression, but it's much better than YouTube.
 
We've been seeing the game in 900p the whole time; I can't recall much in the way of jaggies. Honestly, it fooled everyone into thinking it was 1080p.

Did we have any screens or videos that were 1080p to lead anybody to think it was 1080p?

As others have mentioned, I think it was some rep saying it was 1080p that led everybody to think "it is probably 1080p". We should know better here than to take a sub-1080p source and conclude that it's 1080p.
 
Did we have any screens or videos that were 1080p to lead anybody to think it was 1080p?

What I was saying is that it didn't cross anyone's mind that it could be lower than 1080p. It didn't even cross DF's mind when they first saw it in person months ago. And honestly, raw 1080p looks bad compared to sub-1080p plus multisampling.

The big clincher for the next-gen consoles, I think, is Battlefield 4 running at 720p, and Watch Dogs with low-detail shadows, poor framerate, and horrible pop-in. From what I've been seeing, nothing looks too good about them anymore.
 
And even at 900p, they still couldn't get a decent wet shader on the armour and skin in that rain level.
Oh yeah, it's definitely the fault of the Xbone's performance that there is no wet shader on armor and skin ...

The PS4 must be weaker than the PS3 then, if KZ:SF doesn't have wet shaders on armor and cloth in comparison to Beyond ...
 
What I was saying is that it didn't cross anyone's mind that it could be lower than 1080p. It didn't even cross DF's mind when they first saw it in person months ago. And honestly, raw 1080p looks bad compared to sub-1080p plus multisampling.

The big clincher for the next-gen consoles, I think, is Battlefield 4 running at 720p, and Watch Dogs with low-detail shadows, poor framerate, and horrible pop-in. From what I've been seeing, nothing looks too good about them anymore.

I don't take anything as native 1080p unless it's been tested properly. If devs state it, I usually try to take their word for it but leave some room for doubt.

I thought this was the generally accepted stance from last generation :smile:
 
Oh yeah, it's definitely the fault of the Xbone's performance that there is no wet shader on armor and skin ...

The PS4 must be weaker than the PS3 then, if KZ:SF doesn't have wet shaders on armor and cloth in comparison to Beyond ...

How do you know KZSF doesn't have wet shaders in SP? Besides, it's genuine 1080p, no less ;).
 
Really? I'd say Cevat Yerli is quite well versed in the compromises that have to be made with regards to graphics rendering quality in order to support higher resolutions, considering that CryEngine is considered by most to be the most graphically advanced rendering engine on the PC.
Nah. In creating Crysis, they made zero compromises (and optimisations :p) and just left the user to upgrade their hardware. It wasn't until they started creating for consoles that they realised how to design properly, and Crytek are far less experienced in that space than most other devs. Indeed, they were 'negatively experienced' coming from the power of the PC, where their choices and habits were reckless in comparison to how console design works. I am not saying that Ryse is the result of a poor developer - only correcting your idea that Yerli is well versed in compromises. He (and Crytek) are compromise noobs.

You're messing with the context of Crytek's comment; they said they didn't compromise RYSE, as it has always been 900p.
The context of Crytek's comments is sadly left to readers to interpret. Twitter is a lousy way for people to convey meaningful info, and we see, time and again, that the response to a tweet is lots of discussion about ambiguous, or plain stupid, remarks.

1080p is a TV format. Unless I'm mistaken, I don't remember 1080p being some desirable type of target for game development.
Native resolution is desired for maximum clarity on contemporary fixed-panel displays.

Is there a special reason why 1080p was chosen as a standard TV resolution, other than being the number of pixels that neatly fit into a dimension that readily supports movie formats?
That's a whole other debate. Suffice it to say it's the resolution of many TVs out there, and if you want the clarity that those TVs offer, you need to render at 1080p, which is why MS supports a separate GUI layer for rendering 1080p UIs. Of course, with more photographic rendering, you can get away with less clarity and not have it be as obvious as it is when rendering UIs at sub-native resolution.

I see developers being far more concerned with frame rates than with a particular TV resolution.
Inconsistent framerates all through this generation suggest otherwise!
 
Despite the absolutely terrible-looking gameplay, here I was thinking the graphics looked really good, but I guess I was wrong, because number.
 
What does he mean by using the upscaler for AA? I didn't think you would get the appearance of AA by scaling an image up, only down. Or is he talking about something like a sharpening filter?
Well, I guess the interpolation needed to upscale the image basically can be seen as a rough type of antialiasing:

160px*90px original image:

[image: 160x90px.gif]


192px*108px bicubic resize:
[image: 192x108px.gif]


And as the HUD will be on a separate, native 1080p display plane, text etc. won't even get blurry in the process. If done right, I really doubt the resulting image quality will be a whole lot different from native 1080p rendering with added AA.

People bothered by this are probably the same people arguing that a 440 ppi cell phone screen looks sharper than a 400 ppi one... while 99% of users won't even be able to make out any pixels to begin with.

Don't get me wrong: if a game is native 1080p, that's great and the preferable solution. But I honestly think all the outrage directed at Crytek's decision to go with 900p is getting blown a little out of proportion.
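If anyone wants to reproduce the resize above, a minimal Pillow sketch along these lines should do it (the filenames are placeholders for the images in this post, and 160 to 192 is the same 1.2x factor as 1600 to 1920):

```python
# Minimal sketch of the 160x90 -> 192x108 bicubic upscale shown above.
# Requires Pillow; the filenames are placeholders for the post's images.
from PIL import Image

src = Image.open("160x90px.gif").convert("RGB")        # 160x90 original render
dst = src.resize((192, 108), resample=Image.BICUBIC)   # 1.2x, same ratio as 900p -> 1080p
dst.save("192x108px_bicubic.png")
```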
 
Your example images are inaccurate. You'd need to compare a natively rendered image to an upscaled one, rather than a smaller image to a larger, upscaled one.

The attached image shows vector graphics rendered natively in a 192x108 image, and natively in a 160x90 buffer and then upscaled with a simple upscaler.

For some content it makes a negligible observable difference. For other stuff, like alternating lines, it very obviously blurs the results. Whether it makes an impact on this game or not doesn't need to be discussed from a theoretical POV, as people can actually see the game. If one looks at the game and thinks, "my god, that's blurry. I can't play that!" then don't buy it. Otherwise, care not what resolution it's rendering at. ;)
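For anyone who wants to repeat that test themselves, here's a sketch of the comparison described above (my own; the filenames, the bilinear filter standing in for the "simple upscaler", and the error readout are all assumptions):

```python
# Sketch of the comparison described above: a native-resolution render vs a
# lower-resolution render upscaled to the same size, then a per-pixel diff.
# Requires Pillow; filenames and the bilinear filter are placeholder choices.
from PIL import Image, ImageChops

native = Image.open("native_192x108.png").convert("RGB")     # rendered at 192x108
low = Image.open("native_160x90.png").convert("RGB")         # rendered at 160x90
upscaled = low.resize(native.size, resample=Image.BILINEAR)  # the "simple upscaler"

diff = ImageChops.difference(native, upscaled)
print("per-channel (min, max) error:", diff.getextrema())
```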
 

Attachment: [image: Image1.png]