*spin-off* Ryse Trade-Offs

Obviously. If lower resolutions didn't affect IQ, we wouldn't have higher resolutions. ;) How much it affects IQ is the real question, and DF's article suggested 'very little'.

I've seen some people who prefer more AA to more resolution, which is why I ask whether a lower resolution "always" affects IQ.
 
All things being equal, yes. Whether people prefer reduced resolution with increased AA will always be subjective.
 
Resolution used to be king, and for good reason.

I'm not sure I agree. I don't recall resolution ever being "king", and I've been gaming since the teletype era. It's always been a trade-off. If you want more resolution, and assuming your application is already utilizing the available hardware well, you will likely need to accept a reduction in one or more of the following areas (and plenty more I'm not thinking of); a quick sketch of the arithmetic follows the list:
  1. Frame rate
  2. Color accuracy/depth (Remember all those crazy graphics modes?)
  3. Poly count (I guess this one won't always be the case.)
  4. Antialiasing (Is that the same as resolution?)
  5. Lighting, post-processing, and graphical bells 'n' whistles in general
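
As a back-of-the-envelope illustration of why those trade-offs exist (this is pure arithmetic, not a measurement of any real game):

```python
# Pixel counts per frame for common modes; fill-rate- and shading-bound
# work scales roughly with this count at a fixed GPU budget.
modes = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
base = modes["900p"][0] * modes["900p"][1]
for name, (w, h) in modes.items():
    px = w * h
    print(f"{name}: {px:,} px, {px / base:.2f}x the 900p pixel work")
```

1080p shades roughly 1.44x the pixels of 900p every frame; with fixed hardware, that difference has to come out of frame rate, effects, or one of the items above.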

Could someone sacrifice all of the above in a quest for the absolute highest resolution possible? Sure. Has that ever happened? Seldom, I would guess. I have certainly seen people turning "everything" down in an all-out quest to maximize frame rate, but I don't know that I've ever encountered a corresponding "resolution nazi". Maybe amongst CAD/CAM aficionados?
 
@RealtimeCevat Native res > upscaled. Hypothetically, if the title were developed for PS4, would it hit the same hurdle?

 
I'm not sure I agree
That's fair enough. But my comment was about consoles, as this was a console discussion.

I think that now, with things like dynamic resolution and GPU horsepower appropriate to the targeted resolutions, games should not look blurry; even if a game isn't 1080p it will look amazing, especially because of the exact things you listed. A toy sketch of the dynamic-resolution idea follows.
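
Here's a minimal sketch of what "dynamic resolution" means here; the budget numbers and step sizes are made-up assumptions, not any engine's actual logic:

```python
# Toy dynamic-resolution controller: nudge the render scale up or down
# based on how long the GPU took on the previous frame.
def next_render_scale(scale, gpu_ms, budget_ms=33.3,
                      lo=0.75, hi=1.0, step=0.05):
    if gpu_ms > budget_ms:            # over budget: render fewer pixels
        return max(lo, scale - step)
    if gpu_ms < 0.85 * budget_ms:     # clear headroom: add pixels back
        return min(hi, scale + step)
    return scale                      # close to budget: hold steady
```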

The current generation talked about 720p-1080p but never really had the power necessary to pull it off. So everyone was buying full-HD TVs after being told that's what games needed to shine.

But now, even at the same resolution, things will look a lot better.

Resolution was a compromise in the current generation that you could see but lived with; next generation you probably won't be able to tell that a game isn't 1080p unless you're really trying hard to. So there may be resolution compromises, but not ones that will be readily apparent.

Resolution was king for good reason; yeah, I was playing it up a bit, haha. I just think it's become a lot less relevant given what can be done at the resolutions that will be targeted.
 
Can anyone get anything out of these screenshots in terms of real resolution?
They're from the high-quality version of the MP trailer. I seriously can't find a good straight line that is long enough and not blurred to hell. (A rough pixel-counting sketch follows the links.)

http://i.minus.com/iY3VerKPTgBGw.jpg
http://i.minus.com/izNbD75aU5Mz1.jpg
http://i.minus.com/iwnyMoAmg38RQ.jpg
http://i.minus.com/iz2qsEnlVvmlm.jpg
http://i.minus.com/i7q70V7y6Uz1.jpg
http://i.minus.com/iRAXS3QjjZlvY.jpg
http://i.minus.com/iC0JrTUQ8jx4V.jpg
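
For what it's worth, here's a rough sketch of the usual "pixel counting" approach, in case someone wants to try it on these shots. Everything here (the crop coordinates, the threshold, the assumption of a hard, mostly-vertical dark edge) is hypothetical, and blurry or post-AA'd edges like the ones above will defeat it:

```python
# Estimate native vertical resolution by counting stair-steps along a
# mostly-vertical, high-contrast edge in an upscaled screenshot.
from PIL import Image
import numpy as np

def estimate_native_height(path, top, bottom, left, right, threshold=128):
    img = np.asarray(Image.open(path).convert("L"))
    crop = img[top:bottom, left:right]
    # Column of the first below-threshold pixel in each row, i.e. where
    # the dark edge sits on that scanline (rows with no hit return 0).
    edge_cols = np.argmax(crop < threshold, axis=1)
    # A straight edge shifts columns once per *native* row, so each run
    # of identical columns is one stair-step.
    steps = 1 + np.count_nonzero(np.diff(edge_cols))
    rows = bottom - top
    return img.shape[0] * steps / rows

# E.g., if a 200-row span of edge in a 1080-line shot shows ~167 steps,
# the estimate is 1080 * 167 / 200, i.e. roughly 900 native lines.
```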

There's loads of aliasing, and I only looked at the first screen. It's more apparent in high-contrast areas, like the bottom-right edge of the character on the right. Some edges appear to have something approximating 2x MSAA or slightly better, while others appear to have little to no AA. I'm guessing some form of post-process AA. Unfortunately, I have yet to see a really good purely post-process AA solution in any shipping game on any platform. The only good AA I've seen involving post-processing has been when it was combined with MSAA.

So I'm not surprised that aliasing is pretty evident at times. Still, it's better than no AA.

Regards,
SB
 
There's loads of aliasing, and I only looked at the first screen. [...]
I'm not talking about IQ, because I know they are using SMAA T2x there. I'm talking about resolution.
 

So choosing 900p over 1080p wasn't because of a hurdle; it's more efficient!

Okay. Apparently all the other developers aiming for 1080p native output are idiots. Why do 1080p when 900p is more efficient and nobody can see the difference! :smile:

I think I know which bin to put his future comments in.
 
So choosing 900p over 1080p wasn't because of a hurdle; it's more efficient! [...]

Really? I'd say Cevat Yerli is quite well versed in the compromises that have to be made in graphics rendering quality to support higher resolutions, considering that CryEngine is regarded by most as the most graphically advanced rendering engine on the PC. Although DICE with Frostbite 3 might disagree. :)

Yes, higher resolution = compromised graphics rendering quality. The higher your resolution, the more constrained your options become with regard to which graphical effects you can implement. The lower the resolution, the more graphical effects are available to you.

On PC, you can always throw more powerful hardware at the problem to mitigate the cost of going to a higher resolution while keeping the rendering algorithms you wish to use. On console, you have a fixed hardware target that you cannot change.

900p was likely the point where they decided that, at typical living room distances with typical living room TVs, less than 1% (or whatever the number) of people would be able to tell the difference. Below that, they likely felt there was an increasing chance that some small percentage of people could tell. Above that, they would constrain themselves too much with regard to what they could implement. A quick angular-size calculation (below) shows why that's plausible.
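
To put a number on that claim (the screen size and viewing distance are my assumptions, not anything from Crytek):

```python
# Arcminutes subtended by one pixel on a 50" 16:9 panel viewed from 3 m.
import math

def arcmin_per_pixel(diag_in, h_pixels, distance_m):
    width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width
    pitch_m = width_m / h_pixels                          # pixel pitch
    return math.degrees(math.atan2(pitch_m, distance_m)) * 60

for h in (1920, 1600, 1280):   # 1080p, 900p, 720p horizontal counts
    print(h, round(arcmin_per_pixel(50, h, 3.0), 2))
```

That works out to roughly 0.66 arcmin per pixel at 1080p versus 0.79 at 900p, both around or under the ~1 arcmin often quoted for 20/20 acuity, which is why the difference is so hard to see from the couch.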

In that context, of course he'd keep 900p as the rendering resolution on PS4; anything above that is irrelevant, because the vast majority of people would not be able to see the difference and it would limit what they could do graphically.

You'll notice that he didn't say the rendering would be identical on PS4. It's possible they could have enabled even more advanced graphical effects on PS4, or not. The choice of resolution determines what combination of effects they can implement.

So rather than doing 1080p because it's a checkmark feature, they went with 900p and more advanced rendering.

People really need to understand that with a fixed hardware target, resolution is your main limitation with regard to what you can do. As long as the visual impact of resolution is negligible (I'm sure someone will make an asinine suggestion of 480p, but that represents a significant visual impact), it's purely a benefit to go lower in order to do more. The rough per-pixel budget sketched below makes the point.
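
Here's the per-pixel budget I mean, using a hypothetical ~1.3 TFLOPS console GPU at 30 fps (the exact figures are assumptions for illustration):

```python
# Available ALU work per pixel per frame at a fixed GPU throughput.
GPU_FLOPS = 1.3e12   # hypothetical console GPU
FPS = 30
for name, px in (("1080p", 1920 * 1080), ("900p", 1600 * 900)):
    print(f"{name}: ~{GPU_FLOPS / (px * FPS):,.0f} FLOPs per pixel per frame")
```

Dropping from 1080p to 900p frees up about 44% more work per pixel for lighting and post-processing, which is exactly the trade being described.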

How low you go will be entirely dependent on the developer and their budget. Most developers may not have the budget to take advantage of the extra headroom, and for them 1080p will be the target resolution, because they wouldn't be able to use the extra performance from going below 1080p anyway.

Some developers may instead target a set of rendering goals, optimize as much as possible, and release the game at whatever resolution they end up at with a playable frame rate. This is similar to what DICE and Guerrilla Games appear to be doing.

Crytek, by contrast, has set resolution at X and is fitting in as many rendering effects, algorithms, features, etc. as it can within that limitation.

Especially right now at the start of a generation, it may be easier for a developer to target 1080p, as they don't have time to both hit launch and properly exploit the performance headroom that a lower resolution offers.

And that doesn't even take into account the other rendering constraint that is just as limiting as resolution: frame rate.

Too long to read that whole thing? 1080p is a compromise with regard to graphics rendering, not the other way around. Lower resolution always allows for better graphics. The question is how low you can go before people legitimately notice without being told.

For the X360/PS3 generation, where the comparison point was 720p, we saw games go as low as 540p without people noticing until they were told, at which point they automagically could see it where they couldn't before.

It'll be interesting to see how low one of the big-budget AAA developers will go this generation to get all the features they want at the frame rate they want. I'm willing to bet 900p isn't going to be the lowest... on both platforms.

Regards,
SB
 
There's been so many, it's hard to keep track of which one is contradicting the other or frantically backpedaling from some verbal gaffe. :LOL: :rolleyes:

You're not helping your own image with that comment; at least show a little respect to other human beings.

So choosing 900p over 1080p wasn't because of a hurdle; it's more efficient! [...]

Depends on the game and the developer; if it works out better for them, then why not?

It certainly managed to deceive everyone from E3 through Gamescom into thinking it was 1080p.

Too long to read that whole thing? 1080p is a compromise with regard to graphics rendering, not the other way around. [...] For the X360/PS3 generation, where the comparison point was 720p, we saw games go as low as 540p without people noticing until they were told. [...]

Nicely put. I think Halo 3 looked great even though it was 640p.

http://www.gamegrep.com/news/5430-halo_3_runs_640p_native_resolution/
 
Too long to read that whole thing? 1080p is a compromise with regard to graphics rendering, not the other way around. Lower resolution always allows for better graphics. The question is how low you can go before people legitimately notice without being told.

1080p is a compromise with regard to graphics rendering.
Graphics rendering is a compromise with regard to 1080p.

Do you see anything wrong with either statement?
I don't.

In both cases there's a compromise, and there's frankly nothing wrong with admitting that (except for the PR problems). But saying a fish is not a fish will probably draw more backlash.

Stop trying to put a spin on it; a compromise is a compromise. Rewording it doesn't suddenly make it not so.
 