*spin-off* Ryse Trade-Offs

Then why ever have anything over 900p? If nobody can tell the difference why bother?
Why compromise the visuals by going for 1080p?
Not saying every game should go 900p; it may depend on art direction, engine, and whether scaling produces an image most people would take for native, or close enough to it.

I don't think anyone is saying that 900p is right for every game.
Just that they could have decided on 900p from the start, given the nature of the game.
 
Then why ever have anything over 900p? If nobody can tell the difference why bother?

I already addressed this exact comment in my earlier post. Eventually there will be a breaking point. Maybe with hardware four times faster, that little extra sharpness might be a worthwhile trade-off on top of the increased pixel quality. I hate to use this word... but it is a "balance" between which trade-offs give the best image.

What Crytek is telling you is that they can make a better "overall" looking game by not simply aiming for the 1080p checkbox. Isn't that what matters most?
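To put rough numbers on that (my own back-of-the-envelope arithmetic, not anything Crytek has published): 1600x900 is about 30% fewer pixels per frame than 1920x1080, and that is budget that can be reinvested in per-pixel quality.

```python
# Back-of-the-envelope sketch: how much per-frame shading work does 900p save vs 1080p?
# Purely illustrative; real savings depend on where the frame is actually bound.

pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_900p = 1600 * 900     # 1,440,000 pixels

saving = 1 - pixels_900p / pixels_1080p
print(f"1080p pixels per frame: {pixels_1080p:,}")
print(f"900p  pixels per frame: {pixels_900p:,}")
print(f"Freed up by rendering at 900p: {saving:.1%}")
# -> roughly 30% of the per-pixel budget available for richer shading,
#    assuming the frame is dominated by per-pixel cost (often not the whole story).
```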
 
The breaking point depends on the size of your display, your vision and the distance between your display and your eyes.
 
So you think if Crytek had the choice of the game running exactly as they want at 900p or exactly as they want at 1080p, they would choose 900p?

Christ Davros, maybe you should stick to PC threads. You've had a mare in this one.
 
So you think if Crytek had the choice of the game running exactly as they want at 900p or exactly as they want at 1080p, they would choose 900p?

I think that's the point. Some people are suggesting (right or wrong) that they'd target 900p regardless. So if for some reason they found more rendering time on the table (improved drivers, final hardware, improved SDK, fairy dust), they'd stick with 900p and use the resources elsewhere.

I think it's reasonable that some devs would do that. Who knows if that is the case here.
 
The breaking point depends on the size of your display, your vision and the distance between your display and your eyes.

The majority of people have televisions under 60". Clearly the optimal resolution vs pixel quality balance shifts with larger televisions, but they are catering to the best "overall" image for the majority of users! The majority do not have 100" screens. Crytek is choosing 900p because in the typical living room there is more OVERALL benefit from higher-quality pixels than from simply chasing 1080p for its own sake.
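A quick sketch of the geometry behind that claim (my own numbers and the conventional ~1 arc-minute figure for 20/20 acuity, not anything from the thread): at typical couch distances on a common screen size, the individual source pixels of both a 900p and a 1080p frame fall below the acuity threshold.

```python
import math

# Rough geometry sketch (my own numbers, not from the thread): at what viewing
# distance does a single source pixel fall below ~1 arc-minute, the conventional
# 20/20 acuity figure?

def distance_where_pixel_hits_one_arcmin(diagonal_inches, horizontal_pixels):
    """Distance (metres) beyond which one pixel subtends less than 1 arc-minute."""
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 panel width
    pixel_m = width_m / horizontal_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_m / math.tan(one_arcmin)

for label, px in [("1080p native", 1920), ("900p source on the same panel", 1600)]:
    d = distance_where_pixel_hits_one_arcmin(50, px)
    print(f'50" TV, {label}: pixels merge beyond roughly {d:.1f} m')
# At a typical couch distance of 2.5-3 m both fall under the threshold, which is
# the crux of the "higher-quality pixels over more pixels" argument.
```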
 
They should have went with 1920*1080 24p :p

To me, animation, art direction, sound design, and the rest trump polycount or resolution.

I can imagine some scenes from GOW3 or GOW:A obliterating anything that Ryse puts out, just because Ryse is grounded in reality and also because they went with a high polycount from the beginning, leaving almost no room for other effects. The animations will take you out of the game, no matter how many millions of polygons they claim to have.
 
They should've made the game 900x1080 pillarboxed FPS, to simulate a helmet cam. Or, keep the 3rd person view but zoom way in over the shoulder so that half the screen is Marius' upper right back and head. These are much more innovative ways to increase rendering power than to drop res and upscale.
 
Saying Ryse is 900p due to performance limitations of the XBOne hardware is no more or less valid than saying any other characteristic of the graphical output is limited by the hardware. Isolating that one characteristic without acknowledging that all aspects of CG visuals are limited by available performance, though, isn't valid. It is entirely possible that with more available power Crytek would still have chosen to remain at 900p, or even render at a lower resolution, and instead use the extra power to boost other aspects of the rendering if they felt that would result in better overall image quality.
 
I believe that is like the start of Lost Odyssey: it's probably CGI that seamlessly transitions into real-time.

Nope, there is no CGI in this game; everything you see here is real-time.

In Cevat's last presentation, the smoke in the background bugged out, like particles sometimes bug out in my SDK editor. They're all real particles. The animation is pre-canned, but everything else is real-time, so it's lit by light sources, it's shadowed and self-shadowed, it's affected by wind, etc.
 
Not sure if the topic is for this thread, but it was brought up many months ago: native 1080p isn't a credible bar any more for gauging the consoles. It's too simple and inaccurate; any game can reach it simply by shrugging off a couple of complex features.

A good example of poor gauging would be UE4's Elemental demo on PC. It wasn't even at native 1080p, yet could it be classified as inferior tech because of that? No, the technology it was pushing was clearly much more advanced.

The standard wasn't enforced this gen because it would obviously tie the next-gen consoles down. The only reason the "HD era" was brought up last gen was because TVs supported the signal. If 4K became the standard now because of newer TVs, these consoles would have to resort to sub-PS3/360 graphics.

What should gauge technology is the content being pushed. Framebuffer resolution is a secondary gauge, but more importantly, eliminating jaggies is what people want the most.
 
The breaking point depends on the size of your display, your vision and the distance between your display and your eyes.


and the quality of the scaler, and how the display planes are used. There is a huge difference between scaling one of the two display planes to 1080p, thus outputting a native 1080p signal, and forcing a non-native resolution onto a PC monitor, which is what most of the folks claiming to be able to see the difference are doing.
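A conceptual sketch of that difference in software (using Pillow purely for illustration; the console's actual display-plane hardware is obviously not exposed like this): the game plane is rendered at 900p and scaled up, while a second plane holding the HUD is composited at native 1080p, so the display always receives a native signal and the UI stays sharp.

```python
from PIL import Image

# Illustrative only: mimic a two-plane setup in software.
# Plane 0: 3D scene rendered at 1600x900, scaled to 1920x1080 by "the scaler".
# Plane 1: HUD/UI authored directly at 1920x1080, composited on top unscaled.
# The display always receives a native 1080p signal, unlike forcing a PC monitor
# to accept and rescale a non-native mode itself.

scene_900p = Image.new("RGBA", (1600, 900), (30, 60, 90, 255))   # stand-in for the game frame
hud_1080p = Image.new("RGBA", (1920, 1080), (0, 0, 0, 0))        # transparent UI layer
hud_1080p.paste((255, 255, 255, 255), (40, 40, 400, 100))        # stand-in HUD element

scene_scaled = scene_900p.resize((1920, 1080), Image.BILINEAR)   # game plane scaled to output res
final_frame = Image.alpha_composite(scene_scaled, hud_1080p)     # UI stays pixel-sharp

print(final_frame.size)  # (1920, 1080) -> native signal to the display
```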
 
Then I will bitch, moan and be transparent, because anything dealing with the technical aspects of XB1 hardware or its games is off limits to little dickus, crack-function and whatever other company shill doesn't want to address questions. ;)

You want to have a "technical discussion" about why a console (that you show utter contempt for) can't achieve an arbitrary goalpost you equate with competency, kicking off the "discussion" with loaded terms like "hardware limitations" and "crippled performance." Right. Accusing someone else of being a shill is quite the punchline.

Why 1080p? Because 1080p is a fairly common display resolution? Isn't more = better in your book? There are quite a few 4K displays out there. What is broken in these next-gen consoles that they can't handle high-res output at 60 frames/sec? Is it gimped memory buses? Underpowered CPUs? Underachieving GPUs? What failures did Sony and MS have in producing hardware capable of high-quality graphics, knowing years ahead of time what the target displays were?

Or, from an equally absurd perspective: when Crytek releases a new PC game that brings top-of-the-line graphics cards to their knees, what is broken in those cards that they can't handle the game at full HD resolution? Limited bandwidth from their ultra-wide and fast GDDR5 memory systems? Should they not have cut corners, and gone with a 1024-bit-wide bus?

No less FUD than what you wrote. No less loaded language.

Now, if you want to have a technical discussion on the topic, we can start with the old tried and true, as the rules of physics haven't changed for a new generation. What is human visual acuity in arc seconds, and what resolution does that equate to for a given display size at a given distance? How does that change with AA, i.e. how much does resolution need to increase, as a percentage, to maintain visual equivalence without AA to a lower resolution with 2xAA? 4x? Different sampling methods? Now, how does the processing power compare at visually equivalent combinations? Bandwidth? Does esram or a single fast pool fit some choices better than others? If you have an abundance of compute power, or bandwidth, does that make certain choices more favourable? Are there situations in which a lower-resolution, visually equivalent or nearly equivalent choice frees enough resources to make a noticeable impact on pixel quality? What does the hardware look like in such a case, compute heavy or compute constrained? Bandwidth heavy or bandwidth constrained?
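None of those questions has a single numeric answer, but the raw sample-count arithmetic underneath them is easy to lay out (my own framing and numbers; a sketch of the cost side only, not the perceptual side):

```python
# Raw sample-count arithmetic behind the "resolution vs AA" questions above.
# Perceptual equivalence is NOT captured here; this only tallies the cost side.

configs = {
    "1080p, no AA":   (1920 * 1080, 1),
    "1080p, 2x MSAA": (1920 * 1080, 2),
    "900p, no AA":    (1600 * 900, 1),
    "900p, 2x MSAA":  (1600 * 900, 2),
    "900p, 4x MSAA":  (1600 * 900, 4),
}

baseline = 1920 * 1080  # 1080p with no AA as the reference point
for name, (pixels, samples) in configs.items():
    total = pixels * samples
    print(f"{name:16s} {total:>9,} coverage samples ({total / baseline:.2f}x of 1080p no-AA)")
# With MSAA the shading cost stays roughly per-pixel while bandwidth/resolve cost
# scales with samples, so "equivalence" also depends on whether the game is
# shader-bound or bandwidth-bound -- which is the esram vs single-fast-pool question.
```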

Someone else asked if, given a choice, Crytek would prefer to run at 1080p looking just like they want, or 900p looking just like they want. That hypothetical presumes there is enough processing power to enable such a choice... i.e., infinite. Of course if you have infinite power, you pick the higher res. Back in realityland, we have consoles with cost, power, and heat budgets, and devs will have to make choices that give the best on-screen appearance possible within those budgets. This is no different than any other generation. Some games going for a certain look or style may target lower res, higher-quality pixels. We saw plenty of that last gen. For other game styles, more pixels may be achievable or even desirable. Maybe some game will decide to target higher than 1080p and sacrifice where necessary to achieve that. The monitors I use daily are well over twice that resolution, so I would find that intriguing.
 
and the quality of the scaler, and how the display planes are used. There is a huge difference between scaling one of the two display planes to 1080p, thus outputting a native 1080p signal, and forcing a non-native resolution onto a PC monitor, which is what most of the folks claiming to be able to see the difference are doing.

Res scaling on PC, IQ-wise, has been good for a few generations. The problem with PC games looking lacklustre at non-native resolution is the close viewing distance, and the mostly static UI not being rendered at native res, which makes it look obvious.
 
Discussion has run its course. Trade-offs exist on consoles. The End. (Please don't stay for the after-credits cut-scene).
It's nappy time, kay?
 