Ryse: Son of Rome [XO]

Have you guys seen the gamersyde video for PC Ryse?
http://www.gamersyde.com/news_ryse_ready_to_conquer_pc-15812_en.html

In some ways the PC version looks inferior to the Xbox version. I wonder if it's just the blurring effect from the upscaling which hides imperfections, or if the eSRAM affords the Xbox version some extra bandwidth, allowing them to pull off some higher-quality rendering techniques that were removed from this trailer.

misterxmedia will jump for joy if it's true, and he can switch from stating that the Xbox has 3 TFLOPS of ALU computational capacity to stating that the eSRAM allows for "effective" image quality equivalent to a 3 TFLOP GPU, thus maintaining his legitimacy in the face of the inevitable fact that the GPU isn't 3 TFLOPS.
 
They would probably make a lot more money by bringing the whole Crysis trilogy to PS4 and Xbone than they'll make with Ryse on PC.

I'd certainly buy the Crysis trilogy on PS4. I played Crysis and Crysis 2 on PS3 multiple times and would again, because I just love the freedom of the gameplay. Crysis 3 is sitting on the PS3 waiting for me; it was free with PS+ a month or so ago.

Then again, the trilogy should take a lot more time to port.
Given what 4A did with Metro Redux in 4 and 6 months (Xbox One and PS4 respectively) I'd hope not too much time! It does seem like a really obvious thing so I hope the success of Tomb Raider Definitive Edition and Metro Redux at least have Crytek thinking about this if they aren't already working on it :yep2:
 
Have you guys seen the gamersyde video for PC Ryse?
http://www.gamersyde.com/news_ryse_ready_to_conquer_pc-15812_en.html

In some ways the PC version looks inferior to the Xbox version.

Looks the same to me aside from the much higher resolution.

I wonder if it's just the blurring effect from the upscaling which hides imperfections,

If we're really going with the lower resolution = better graphics argument (I thought that died when the new consoles started being capable of running at 1080p), then the obvious solution for PC gamers would be to simply lower the resolution. Oh, the irony of that. Now that consoles are finally running games in full HD, it turns out sub-HD is better, and only PC gamers have access to it in all games thanks to the ability to manually configure resolution. ;)

or if the esram affords the Xbox version some extra bandwidth, allowing them to pull off some higher-quality rendering techniques that were removed from this trailer.

Considering the eSRAM, even running at its totally unrealistic/unreachable theoretical peak, still offers much less bandwidth than the highest-end PC GPUs, that seems unlikely.
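
For reference, a rough back-of-the-envelope comparison (a sketch only; the ~204 GB/s eSRAM figure is the commonly quoted theoretical peak with simultaneous read/write, and the R9 290X is just one example of a high-end card from the same era, not something from this thread):

# Illustrative bandwidth comparison; the figures below are assumptions.
def gddr_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbps) / 8
    return bus_width_bits * data_rate_gbps / 8

esram_peak = 204.0                      # GB/s, quoted theoretical peak (read + write)
r9_290x = gddr_bandwidth_gbs(512, 5.0)  # 512-bit GDDR5 @ 5 Gbps -> 320 GB/s
print(esram_peak, r9_290x)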

misterxmedia will jump for joy if it's true,

Since when did something have to be true for him to start spreading it around the internet as proof that the XBO was built by aliens from the future? ;)
 
Have you guys seen the gamersyde video for PC Ryse?
http://www.gamersyde.com/news_ryse_ready_to_conquer_pc-15812_en.html

In some ways the PC version looks inferior to the Xbox version. I wonder if it's just the blurring effect from the upscaling which hides imperfections, or if the eSRAM affords the Xbox version some extra bandwidth, allowing them to pull off some higher-quality rendering techniques that were removed from this trailer.

misterxmedia will jump for joy if it's true, and he can switch from stating that the Xbox has 3 TFLOPS of ALU computational capacity to stating that the eSRAM allows for "effective" image quality equivalent to a 3 TFLOP GPU, thus maintaining his legitimacy in the face of the inevitable fact that the GPU isn't 3 TFLOPS.

You're joking... right?
 
Have you guys seen the gamersyde video for PC Ryse?
http://www.gamersyde.com/news_ryse_ready_to_conquer_pc-15812_en.html

In some ways the PC version looks inferior to the Xbox version. I wonder if it's just the blurring effect from the upscaling which hides imperfections, or if the eSRAM affords the Xbox version some extra bandwidth, allowing them to pull off some higher-quality rendering techniques that were removed from this trailer.

misterxmedia will jump for joy if it's true, and he can switch from stating that the Xbox has 3 TFLOPS of ALU computational capacity to stating that the eSRAM allows for "effective" image quality equivalent to a 3 TFLOP GPU, thus maintaining his legitimacy in the face of the inevitable fact that the GPU isn't 3 TFLOPS.

Again with the blurring thing? Also, what does misterxmedia have to do with Ryse coming to PC? Like everyone here I haven't played the PC version, but I am pretty sure your rig will determine image quality.
 
What do you mean?? Can you elaborate?
DOF (during gameplay) and chromatic aberration both contribute to blurring the IQ, yet people still consider the game one of the best-looking games ever due to the lighting, materials, physics and animation. The much-lauded crispness of native resolution is not present in this game.
 
DOF (during gameplay) and chromatic aberration both contribute to blurring the IQ, yet people still consider the game one of the best-looking games ever due to the lighting, materials, physics and animation. The much-lauded crispness of native resolution is not present in this game.

But there's obviously a world of difference between specific effects applied in specific circumstances to select areas of the screen, with the sole purpose of making the game look better, and a permanent full-screen loss of detail coupled with aliasing and shimmer caused by low resolutions.
 
I thought the slight blur in Ryse was from their AA solution that incorporates a temporal element. The Order looks to have a slight blur at all times from the post-process chain.
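
For anyone wondering why a temporal component tends to soften the image at all, here's a minimal sketch of a generic history blend (just the general idea, not necessarily how Crytek's actual resolve works):

# Generic temporal resolve: blend the current frame with the reprojected
# history buffer. The running average suppresses edge crawl between frames,
# but it also acts as a mild low-pass filter, which reads as slight softness.
import numpy as np

def temporal_resolve(current, history, blend=0.1):
    # current, history: float arrays of the same shape (history already reprojected)
    return blend * current + (1.0 - blend) * history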
 
But there's obviously a world of difference between specific effects applied in specific circumstances to select areas of the screen, with the sole purpose of making the game look better, and a permanent full-screen loss of detail coupled with aliasing and shimmer caused by low resolutions.

Have you actually played Ryse? The SMAA is great in Ryse. You rarely ever see any aliasing, sub-pixel scattering or shimmering. Its AA is second only to ISS.
Like I posted earlier: again with the blur? It seems to me the only people who complain of the blur in Ryse are people who haven't actually played the game.
So there is a disagreement between those who have played it and those who haven't.
Can we please just drop it? If not, I'm sure Brit will clean it up just like last time it happened in this thread.
 
Have you actually played Ryse? The SMAA is great in Ryse. You rarely ever see any aliasing, sub-pixel scattering or shimmering. Its AA is second only to ISS.
Like I posted earlier: again with the blur? It seems to me the only people who complain of the blur in Ryse are people who haven't actually played the game.
So there is a disagreement between those who have played it and those who haven't.
Can we please just drop it? If not, I'm sure Brit will clean it up just like last time it happened in this thread.

I didn't mention anything about blur in Ryse. I'm talking about the general blur that exists from upscaling (which is a fact, not opinion) as well as the general increase in aliasing and shimmer caused by lower resolution (again, fact, not opinion). Great AA, as well as the general tone of the graphics, can reduce how obvious those things are, but resolution always has an impact, however small.

My point is that comparing the negative impact of lowering resolution to the positive impact of techniques which apply targeted "blur-type effects" to specific areas of the screen is totally invalid.
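
Just to put numbers on the "resolution always has an impact" point (assuming the commonly reported 900p figure for the console version; purely illustrative):

# 1080p vs 900p pixel counts
native_1080p = 1920 * 1080   # 2,073,600 pixels
rendered_900p = 1600 * 900   # 1,440,000 pixels, later scaled up to 1080p
print(native_1080p / rendered_900p)  # -> ~1.44x as many pixels at native 1080p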
 
I thought the slight blur in Ryse was from their AA solution that incorporates a temporal element. The Order looks to have a slight blur at all times from the post-process chain.

How are you guys comparing a released game with an unreleased one? Has anyone here played The Order?
 
I thought the slight blur in Ryse was from their AA solution that incorporates a temporal element. The Order looks to have a slight blur at all times from the post-process chain.

It's probably both since by definition upscaling must introduce some level of blur (however tiny). If people really do think that makes the game look better then I'm sure the PC version will offer the same SMAA antialiasing solution along with the ability to run at 900p. Hell, you could even pimp out the PC version's graphics to the max and lower the resolution all the way down to 720p. Just imagine the awesome blur effects! ;)
 
It's probably both since by definition upscaling must introduce some level of blur (however tiny).
That may be true in most implementations, but not by definition, e.g. a nearest-neighbour upscale introduces no blur, nor does this information reconstruction approach.
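
To illustrate the difference (a generic sketch, not tied to any particular game's scaler): nearest neighbour only repeats existing pixel values, while an interpolating filter like bilinear creates averaged in-between values, and that averaging is exactly what reads as blur.

import numpy as np

def upscale_nearest(img, factor):
    # Every source pixel is simply repeated: no new values, no softening.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def upscale_linear_1d(row):
    # 1-D 2x linear upscale: new samples are averages of their neighbours,
    # which low-pass filters hard edges (the source of upscaling blur).
    mids = (row[:-1] + row[1:]) / 2.0
    out = np.empty(row.size + mids.size)
    out[0::2] = row
    out[1::2] = mids
    return out

edge = np.array([0.0, 0.0, 1.0, 1.0])        # a hard edge
print(upscale_nearest(edge[None, :], 2)[0])  # [0 0 0 0 1 1 1 1] - edge stays hard
print(upscale_linear_1d(edge))               # [0 0 0 0.5 1 1 1] - edge is softened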

The Order's graphics, as used as an example (which may be horribly OT and require a nuking of this thread once again), aren't applying visual effects very selectively, so I think it's fair to say that it has a uniformly present blur, and that does contribute to its cinematic realism. Blur does, because cinema footage is blurred a majority of the time (DOF, movement of the camera, movement of actors with longer exposures).

I hope this is all on topic and relevant, because I'd hate to have to delete my own post as part of the collateral damage. :nope:
 
DOF (during gameplay) and chromatic aberration both contribute to blurring the IQ, yet people still consider the game one of the best-looking games ever due to the lighting, materials, physics and animation. The much-lauded crispness of native resolution is not present in this game.

I have not played the Order on my screen yet, so I can't comment on the crispness atm.

But I think that when you stand still, with no movement, the post-processing effects should lessen, which should 'allow' you to see the crispness and increased detail, right?

During movement, it is difficult to judge anyway, as the temporal resolution plays an equally important role.

PS: But on the other hand, you are right... I did a blind test of a few games with my GF and she insta-voted The Order as the most realistic and best-looking game. Something like The Tomorrow Children, for example, does not impress at all.
 
so I think it's fair to say that it has a uniformly present blur, and that does contribute to its cinematic realism. Blur does, because cinema footage is blurred a majority of the time (DOF, movement of the camera, movement of actors with longer exposures).

I guess that's where motion blur comes in. I still think it's very different to the blur (and other negative aspects) introduced from lowering resolution and then upscaling, though. Otherwise developers are getting things very wrong and everyone should be gaming on low-end PCs for the best possible graphics!
 
I guess that's where motion blur comes in. I still think it's very different to the blur (and other negative aspects) introduced from lowering resolution and then upscaling, though. Otherwise developers are getting things very wrong and everyone should be gaming on low-end PCs for the best possible graphics!

Don't exaggerate. Clearly a lot of people care a great deal about the crispness of native resolutions. However, many others don't. Some even prefer the slight blurriness precisely because it doesn't look as crisp. It's one of the reasons chromatic aberration is applied in the first place.
 
Don't exaggerate. Clearly a lot of people care a great deal about the crispness of native resolutions. However, many others don't. Some even prefer the slight blurriness precisely because it doesn't look as crisp. It's one of the reasons chromatic aberration is applied in the first place.

Okay, so let's say these people who prefer the full-screen blur introduced by non-native resolution have a valid point. Does that mean that every game where the PS4 exceeds the XBO resolution looks better on XBO to these people? And by extension, does that mean that any game on the XBO or PS4 running at 1080p looks better on a PC that can run it at 900p or lower?

This sounds like awesome news to me for PC gamers, who can apparently, at least for a subset of people, achieve better graphics than the consoles using less powerful hardware. Have I got that wrong?
 