Digital Foundry Article Technical Discussion Archive [2013]

Status
Not open for further replies.
Post AA is getting better, but it still has a long way to go before it matches the 4xMSAA + 4xTrSSAA that I use.

The best post AA I've seen in a game has to be in the latest Tomb Raider; the FXAA in that game nails pretty much all the edges.
 
I don't think the blur you guys are talking about is coming from upscaling in Ryse. I play Ryse on a 32-inch TV at 720p, so there isn't any upscaling going on when I play. The only blur I see in Ryse occurs with quick camera movements. I think it is intentional motion blur for cinematic purposes.

You're not going to see any upscaling from Ryse's native 900p if your TV is 720p and you set the output to 720p...
 
You're not going to see any upscaling from Ryse's native 900p if your TV is 720p and you set the output to 720p...

I know, that was my point. The only blur I have noticed is motion blur.
Is the blur everyone is referring to being seen in screenshots, or does it come from first-hand experience playing the game at 1080p?

If it's from screenshots, my point is that it's motion blur.
 
I know, that was my point. The only blur I have noticed is motion blur.
Is the blur everyone is referring to being seen in screenshots, or does it come from first-hand experience playing the game at 1080p?

If it's from screenshots, my point is that it's motion blur.

Lack of something due to you yourself artificially removing it and then not seeing it doesn't prove it doesn't exist in the first place.

An example:
I have a deck of cards, and I remove the suits from the deck.
Because I don't see any kings in this deck of cards, there are no kings in any set of cards.

Doesn't make a whole lot of sense does it?
 
It would be better, in shredenvain's case, if the game recognized his 720p TV and rendered at that res; at least he'd get some extra fps.
 
It would be better, in shredenvain's case, if the game recognized his 720p TV and rendered at that res; at least he'd get some extra fps.

Once again, no. The developers likely haven't tested at that speed, so you can't guarantee the user experience. The intricate timings inside the engine could completely break the user experience. The entire point of console games is a guaranteed experience for everyone. The moment you add in additional complexities is when you compromise the user experience.
 
Once again, no. The developers likely haven't tested at that speed, so you can't guarantee the user experience. The intricate timings inside the engine could completely break the user experience.
I can't think why, unless the game is written to count processing cycles instead of using the system clock. I'd expect any and every engine to use system clock for timing, and test against elapsed milliseconds etc. The PC shows arbitrary resolutions are comfortably supported in all games. You'd have to be hitting the hardware pretty hard for a change of resolution to affect a game (other than framerate) and I very much doubt CryEngine will be so sensitive on consoles.
 
It increases the testing matrix; to do it properly you really need to have two HUDs.
Developers don't like having to deal with it; PC developers have no choice.
 
You'd have to be hitting the hardware pretty hard for a change of resolution to affect a game (other than framerate) and I very much doubt CryEngine will be so sensitive on consoles.

Hitting the hardware pretty hard, isn't that the exact benefit of console development? TTM: To The Metal! Physics engines or AI engines can have quirks when running faster than they were tested for. And your menus and other user displays need to be tweaked as well. It more than doubles the testing matrix.
 
That reminds me of a bug I once worked on. We had a level that was supposed to end automatically after a certain amount of time had elapsed, but the end sequence would never trigger when the game ran at 60fps. We ended up tracking it down to a scripting component that was using an accumulator to keep track of elapsed time to know when to end the level; due to some janky math, it wouldn't accumulate when the time delta was too small. The funny part was that you could get through the level by spamming enough magic to flood the screen with particles so that the framerate would tank. :p
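A hypothetical Python sketch of that shape of bug (not the actual script, whose math I don't remember exactly, and `min_dt_s` is an invented threshold): a timer that discards "too small" deltas stalls at 60fps but still works at 30fps.

```python
def make_level_timer(end_after_s, min_dt_s=0.02):
    """Level-end timer with a janky guard that ignores small frame deltas."""
    elapsed = 0.0

    def tick(dt_s):
        nonlocal elapsed
        # Bug: deltas under min_dt_s are discarded, so at 60fps
        # (dt ~0.0167s) the timer never advances at all.
        if dt_s >= min_dt_s:
            elapsed += dt_s
        return elapsed >= end_after_s  # True once the level should end

    return tick

timer = make_level_timer(end_after_s=5.0)
print(any(timer(1 / 30) for _ in range(300)))   # 30fps: level ends (True)

timer = make_level_timer(end_after_s=5.0)
print(any(timer(1 / 60) for _ in range(1000)))  # 60fps: never ends (False)
```

Tanking the framerate with particles pushes the deltas back above the guard, which is exactly why the magic-spam workaround got players through.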
 
I can't think why, unless the game is written to count processing cycles instead of using the system clock. I'd expect any and every engine to use system clock for timing, and test against elapsed milliseconds etc. The PC shows arbitrary resolutions are comfortably supported in all games. You'd have to be hitting the hardware pretty hard for a change of resolution to affect a game (other than framerate) and I very much doubt CryEngine will be so sensitive on consoles.

It's not even that. Strange things can happen.

For example, the PC port of the original Dead Space. The higher your FPS (above 30) the laggier your controls became.

That one confused a lot of people when the game came out, as they couldn't understand why the controls for the game were so shite on their enthusiast-class gaming machine.

Regards,
SB
 
That's not the norm though, surely?! You can't even begin coding a game without numerous tutorials telling you how to decouple I/O and rendering from the framerate! Any game reliant on the framerate for time-sensitive aspects is asking for trouble and poorly engineered.
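For reference, the standard pattern those tutorials teach is the fixed-timestep loop: read real elapsed time from the clock, then advance the simulation in fixed steps so game speed is independent of render rate. A minimal illustrative Python sketch (not any particular engine's code):

```python
def simulate(frame_dts, sim_hz=60):
    """Run a fixed-timestep simulation over a stream of frame deltas.

    Rendering can happen at any rate; the simulation always advances in
    exact 1/sim_hz steps, so game speed is independent of framerate."""
    step = 1.0 / sim_hz
    accumulator = 0.0
    sim_steps = 0
    for dt in frame_dts:            # dt would come from the system clock
        accumulator += dt
        while accumulator >= step:  # consume elapsed time in fixed steps
            # update(step) would run the game logic here
            accumulator -= step
            sim_steps += 1
    return sim_steps

# One second of wall time at 30fps and at 120fps both yield 60 sim steps:
print(simulate([1 / 30] * 30))    # 60
print(simulate([1 / 120] * 120))  # 60
```

The render loop can run at whatever the hardware manages; only the number of `update` calls per real second is fixed.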
 
Lack of something due to you yourself artificially removing it and then not seeing it doesn't prove it doesn't exist in the first place.

An example:
I have a deck of cards, and I remove the suits from the deck.
Because I don't see any kings in this deck of cards, there are no kings in any set of cards.

Doesn't make a whole lot of sense does it?

OK, I was not being clear earlier. Yes, I play Ryse at 720p. I have also played the game at 1080p at a friend's house for around an hour. No matter what resolution I have played it at, there is blur, but only with quick camera movements. That is why I don't think the blur is from upscaling.
I was trying to explain to the people blaming this blur on upscaling that they may be seeing it in screenshots or videos where the camera is in motion. If you actually own and play the game on a regular basis, you would understand what I am talking about.
It may very well be from upscaling, but if it is, why does it only occur when you move the camera in a drastic way? Also, how can I see this blur when I am playing at 720p if it is from upscaling?

Is that better? I'm just being honest from my experience playing the game. I do apologize for not making myself clear earlier.
 
That's not the norm though, surely?! You can't even begin coding a game without numerous tutorials telling you how to decouple I/O and rendering from the framerate! Any game reliant on the framerate for time-sensitive aspects is asking for trouble and poorly engineered.

You wish.
 
OK, I was not being clear earlier. Yes, I play Ryse at 720p. I have also played the game at 1080p at a friend's house for around an hour. No matter what resolution I have played it at, there is blur, but only with quick camera movements. That is why I don't think the blur is from upscaling.
The blur from upscaling is very subtle, unlike motion blur. It is definitely there (it can't not be there, as there's no way to upscale by a fractional amount without introducing blur, other than pixel resizing, which looks far worse), and it is equivalent to something like a 0.75-pixel-radius Gaussian, going by observation of tests relative to screenshots.
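A toy 1-D sketch (illustrative Python, nothing to do with the console's actual scaler) of why a fractional scale like 900p to 1080p (1.2x) can't avoid blur: most output samples fall between source pixels, so linear interpolation has to spread a pixel's intensity across its neighbours.

```python
def upscale_1d(src, out_len):
    """Naive 1-D linear interpolation (the core of bilinear scaling)."""
    n = len(src)
    out = []
    for i in range(out_len):
        x = i * (n - 1) / (out_len - 1)  # source coordinate for output i
        lo = int(x)
        hi = min(lo + 1, n - 1)
        t = x - lo
        out.append(src[lo] * (1 - t) + src[hi] * t)
    return out

row = [0.0] * 900
row[450] = 1.0              # one full-intensity pixel in a 900-wide row
up = upscale_1d(row, 1080)  # 900 -> 1080 is a fractional 1.2x scale
print(max(up))              # ~0.92: no output pixel keeps full intensity
```

The single bright pixel ends up smeared over several neighbouring output pixels, which at screen scale reads as exactly this kind of subtle softness.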
 
That's not the norm though, surely?! You can't even begin coding a game without numerous tutorials telling you how to decouple I/O and rendering from the framerate! Any game reliant on the framerate for time-sensitive aspects is asking for trouble and poorly engineered.

It almost never happens for a title that was developed on the PC first. But weird things like that happen from time to time with console ports, which are sometimes only tested in one environment at one set of settings.

Evidently for Dead Space, no one thought to test the port when you unlocked the frame rate.

Now, take a console-only title, where developers and their QA departments aren't used to testing more than one setting and may not have the budget to do so, and then add multiple settings. Then add in the additional difficulties if you are coding so close to the metal that your timings are so intricately tied together that changing how quickly one thing operates may affect the timing of other things.

I'm sure multiple developers here have experienced something similar to what MJP experienced.

And while rare for PC-only titles, as I mentioned above, it occasionally happens there too. :p

The infamous acceleration you gained from bunny hopping in Quake/Quake 2 was a side effect of higher frame rates: higher FPS meant faster and longer bunny jumps. In Quake 2, if your FPS was high enough you could grapple/bunny hop through windows that were too small for your character to fit through.
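To illustrate the general mechanism (this is not Quake's actual movement code, and the direction of the effect depends on the specific math: Quake got faster at high FPS, while this toy version gets slower), any term applied once per frame rather than scaled by the frame delta makes the outcome framerate-dependent.

```python
def speed_after_one_second(fps, accel=10.0, per_frame_friction=0.9):
    """Integrate simple movement where friction is applied per *frame*.

    Acceleration is correctly scaled by dt, but the damping factor is
    applied once per frame, so the final speed depends on the fps."""
    dt = 1.0 / fps
    v = 0.0
    for _ in range(int(fps)):      # one second of frames
        v += accel * dt            # dt-scaled: framerate-independent
        v *= per_frame_friction    # per-frame: framerate-DEpendent
    return v

print(speed_after_one_second(30))   # ~2.87
print(speed_after_one_second(125))  # ~0.72
```

Same nominal second of gameplay, wildly different speeds, purely because one multiply happens per frame instead of per unit of time.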

In EverQuest, EverQuest 2, and I believe Rift and maybe AoC, if your FPS was high enough (or was it low enough? I can't remember), combined with a little bit of lag, you could clip through some geometry in a predictable and easily reproducible way.
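One plausible mechanism for that kind of clipping (an illustrative sketch, not any of those engines' actual collision code) is tunnelling: with a naive per-frame overlap test and no swept check, a large enough step carries the mover straight past a thin wall between two samples.

```python
def blocked_by_wall(fps, speed=20.0, wall_x=10.03, wall_thickness=0.1):
    """Naive per-frame collision test: is the mover ever caught inside
    the wall, or does a large step carry it straight through?"""
    dt = 1.0 / fps
    x = 0.0
    while x < wall_x + 1.0:
        x += speed * dt
        # Point-in-wall check with no swept/continuous collision:
        if wall_x <= x <= wall_x + wall_thickness:
            return True   # caught: the wall stops us
    return False          # tunnelled clean through the wall

print(blocked_by_wall(fps=240))  # small steps: True (wall works)
print(blocked_by_wall(fps=10))   # big steps: False (clips through)
```

Lag has the same effect as low FPS here: it inflates the per-step distance, so positions are sampled too coarsely to ever land inside thin geometry.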

Regards,
SB
 