I don't think the blur you guys are talking about is coming from upscaling in Ryse. I play Ryse on a 32-inch TV at 720p, so there isn't any upscaling going on when I play. The only blur I see in Ryse occurs with quick camera movements, and I think it's intentional motion blur for cinematic purposes.
You're not going to see any upscaling if your TV is 720p and you set the output to 720p, since Ryse's native resolution is 900p and that image gets scaled down, not up...
How would you get to witness upscale blur if you are playing the game at 720p (presumably downscaled from 900p) on a 720p TV?
I know that was my point. The only blur I have noticed is motion blur.
Is the blur that everyone is referring to being seen in screenshots, or is it coming from first-hand experience playing the game at 1080p?
If it's from screenshots, my point is that it's motion blur.
It would be better in shredenvain's case if the game recognized his 720p TV and rendered at that res; at least he'd get some extra fps.
I can't think why, unless the game is written to count processing cycles instead of using the system clock. I'd expect any and every engine to use the system clock for timing and test against elapsed milliseconds etc. The PC shows arbitrary resolutions are comfortably supported in all games. You'd have to be hitting the hardware pretty hard for a change of resolution to affect a game (other than framerate) and I very much doubt CryEngine will be so sensitive on consoles.

Once again, no. The developers likely haven't tested at that speed, so you can't guarantee the user experience. The intricate timings inside the engine could completely break it.
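For anyone wondering what the quoted post means by timing against the system clock, here's a minimal, generic sketch (my own illustration, not CryEngine code; `updateGame` and the numbers are made up):

```cpp
// Advance game state by measured wall-clock time rather than assuming a
// fixed amount of time per rendered frame.
#include <chrono>
#include <thread>
#include <cstdio>

static void updateGame(double dtSeconds) {
    // Anything time-sensitive scales by dtSeconds, so a slower or faster
    // framerate changes smoothness but not game speed.
    static double position = 0.0;
    const double unitsPerSecond = 5.0;
    position += unitsPerSecond * dtSeconds;
    std::printf("step %.4fs, position %.3f\n", dtSeconds, position);
}

int main() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    for (int frame = 0; frame < 10; ++frame) {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;
        updateGame(dt);
        // Stand-in for a frame whose cost varies; rendering would go here.
        std::this_thread::sleep_for(std::chrono::milliseconds(16 + (frame % 3) * 8));
    }
    return 0;
}
```

Because the update is driven by elapsed real time, rendering faster or slower (say, at a different resolution) only changes how smooth things look, not how fast the game runs.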
Lack of something due to you yourself artificially removing it and then not seeing it doesn't prove it doesn't exist in the first place.
An example:
I have a deck of cards and I remove the face cards from it.
Because I don't see any kings in this deck, there must be no kings in any deck of cards.
Doesn't make a whole lot of sense, does it?
That's not the norm though. Surely?! You can't even begin coding a game without numerous tutorials telling you how to decouple IO and rendering from framerate! Any game reliant on the framerate for time-sensitive aspects is asking for trouble and poorly engineered.
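For reference, the decoupling those tutorials teach usually looks something like the fixed-timestep pattern below (a rough sketch under my own assumptions, not taken from any particular engine):

```cpp
// Run game logic at a fixed timestep and render as often as the hardware
// allows, carrying leftover real time in an accumulator.
#include <chrono>
#include <thread>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const double fixedStep = 1.0 / 60.0;   // simulation always advances in 1/60s slices
    double accumulator = 0.0;
    double simTime = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 20; ++frame) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed updates as the elapsed real time requires.
        while (accumulator >= fixedStep) {
            simTime += fixedStep;          // game-logic tick would go here
            accumulator -= fixedStep;
        }

        std::printf("frame %2d rendered, simulated time %.3fs\n", frame, simTime);
        // Stand-in for rendering whose cost varies from frame to frame.
        std::this_thread::sleep_for(std::chrono::milliseconds(frame % 2 ? 16 : 33));
    }
    return 0;
}
```

The simulation always advances in fixed slices of real time, so the framerate (and, by extension, the rendering resolution) can change without affecting game logic.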
The blur from upscaling is very subtle, unlike motion blur. It is definitely there (it can't not be there, as there's no way to upscale by a fractional amount without introducing blur, other than pixel resizing, which looks far worse), and is equivalent to something like a 0.75 pixel radius Gaussian going by observation of tests relative to screenshots.

OK, I was not being clear earlier. Yes, I play Ryse at 720p, but I have also played the game at 1080p at a friend's house for around an hour. No matter what resolution I have played it at, there is blur, but only with quick camera movements. That is why I don't think the blur is from upscaling.
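To make the "fractional upscale must blur" point concrete: 900p to 1080p is a 1.2x resize, so destination pixels land between source pixels and a linear filter has to blend neighbours. The toy example below (one scanline of my own invention, not the console's actual scaler) shows a hard edge turning into intermediate grey values after a 1.2x linear resample, while nearest-neighbour keeps hard steps but duplicates pixels unevenly:

```cpp
#include <cstdio>
#include <cmath>
#include <vector>
#include <algorithm>

int main() {
    // One scanline with a hard black/white edge, standing in for a 900-wide source.
    std::vector<double> src = {0, 0, 0, 0, 0, 1, 1, 1, 1, 1};
    const double scale = 1080.0 / 900.0;                 // 1.2x, same ratio as 900p -> 1080p
    const int dstWidth = static_cast<int>(src.size() * scale);

    for (int x = 0; x < dstWidth; ++x) {
        double srcPos = (x + 0.5) / scale - 0.5;          // map destination pixel centre back to source
        int i0 = static_cast<int>(std::floor(srcPos));
        int i1 = std::min(i0 + 1, static_cast<int>(src.size()) - 1);
        i0 = std::max(i0, 0);
        double frac = srcPos - std::floor(srcPos);
        // Linear blend of the two nearest source pixels -> fractional values at the edge.
        double linear = src[i0] * (1.0 - frac) + src[i1] * frac;
        // Nearest-neighbour keeps hard values but repeats some source pixels unevenly.
        double nearest = src[std::clamp(static_cast<int>(std::round(srcPos)), 0,
                                        static_cast<int>(src.size()) - 1)];
        std::printf("dst %2d  linear %.2f  nearest %.0f\n", x, linear, nearest);
    }
    return 0;
}
```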