Digital Foundry Article Technical Discussion Archive [2015]

I was kind of wondering if they'd still have dips, but they don't, so I think the dynamic resolution is a good choice.

I wish the Battlefield games had dynamic resolution. I bet they could get PS4 up to 1080p at times and Xbox One up to 900p. Not suggesting that dynamic resolution is easy though.
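
For what it's worth, the basic idea behind dynamic resolution is straightforward: measure how long the GPU took last frame and nudge the render resolution toward whatever fits the frame budget. Below is a minimal sketch, assuming GPU cost scales roughly linearly with the resolution scale; the struct name, gain and thresholds are made up for illustration, not taken from any shipping engine.

```cpp
// A minimal sketch of a frame-time-driven dynamic resolution controller.
// Assumes GPU cost scales roughly linearly with the resolution scale;
// names, gain and thresholds are illustrative, not from any shipping engine.
#include <algorithm>
#include <cstdio>

struct DynResController {
    double targetMs = 16.6;  // 60 fps frame budget
    double scale    = 1.0;   // fraction of native resolution
    double minScale = 0.75;  // floor so IQ never collapses entirely
    double gain     = 0.5;   // how aggressively we chase the budget

    // Call once per frame with the previous frame's measured GPU time.
    void update(double gpuMs) {
        // Scale that would have hit the budget (cost assumed linear in scale),
        // then move only partway there to avoid oscillation.
        double ideal = scale * (targetMs / gpuMs);
        scale = std::clamp(scale + gain * (ideal - scale), minScale, 1.0);
    }

    int renderHeight(int nativeHeight) const {
        return static_cast<int>(nativeHeight * scale);
    }
};

int main() {
    DynResController ctrl;
    // Simulated GPU times: a heavy firefight, then a quieter scene.
    for (double ms : {16.0, 19.5, 21.0, 18.0, 15.0, 14.0}) {
        ctrl.update(ms);
        std::printf("gpu %.1f ms -> render at %dp\n", ms, ctrl.renderHeight(1080));
    }
}
```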
 

I'm starting to wonder what gamers (console gamers) consider to be the 'next generation' of gaming...

Is it Halo 5 with a butter-smooth framerate (60fps)... at the cost of IQ (sub-1080p resolution, low AF sampling, low-quality textures and shadows, etc.)?

Or

Is it Until Dawn with awesome graphics... at the cost of performance (lots of dips into the low 20fps range)?

From a PC gamer's perspective, both of these games would be seen as failures - one for poor IQ, the other for poor performance...

So my main question is this... which trade-offs are more forgivable than others?
 

Depends on the game, depends on the person. You'll find some people that want 60fps and some people that want 1080p, and it can vary depending on the title. For a twitch shooter, I want 60fps. For a single player game, I'm fine with 30fps and improved visuals. There's no way to boil it down to one answer.
 

Correct.

But my question was aimed more at this: are there certain IQ features that console games shouldn't trade away... regardless of genre or scope? Should AF be sacrificed for a certain frame-rate or resolution? Should sub-par shadowing and texturing be acceptable for a frame-rate or resolution target?

Or

Should there be a middle-ground standard of not sacrificing certain effects/IQ (AF, texture quality, shadowing, etc.) that one would expect to be an advantage over the previous generation - something a mid-range PC gamer wouldn't cringe at?

In other words, what are the default post-processing effects and IQ standards console gamers shouldn't be without - regardless?
 
Since all consoles are basically PCs, we might just as well give them all the same settings for performance. But the most important aspect is choosing the correct settings for the correct type of game. Basically, the speed of the gameplay and the need for fast response times dictate whether high frame rates should be prioritized over prettier graphics.
 
Personally, I prefer multiplayer games to run at 60 or higher - that's the main reason I play multiplayer games on PC, where I can reach 100+ fps in multiple MP-focused games. With that said, I'm completely fine with something like Destiny, where the framerate - and more importantly the frame delivery times - are consistent. That's the key word for me: if frame delivery is consistent, I can adapt to 30fps MP quite easily. I never felt there was something more to be explored in Destiny by going to 60, even though I'd play it on PC if it were available there. When it comes to single player, I usually prefer consistency over frame targets like 60 or 30; if it sits between 40-50 I can deal with an unlocked framerate (à la inFAMOUS First Light), but anything below that I prefer locked at 30.

tl;dr: SP 30 is good enough (depending on genre, of course), and 60 is a nice bonus if possible. MP 30 can be OK, but 60 is much better. Key factor in both: consistent frame delivery.

Of course, there's always an exception to the rule - for example Until Dawn: really rough performance in specific levels, but it never really hindered my enjoyment due to the nature of the gameplay.
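
To illustrate the "consistent frame delivery" point: two captures can share the same average frame time and still feel completely different, which is why percentile frame times matter more than average fps. A toy sketch with made-up numbers:

```cpp
// A toy sketch of why "consistent frame delivery" is about more than average
// fps: two captures with the same mean frame time, very different spreads.
// All numbers are made up for illustration.
#include <algorithm>
#include <cstdio>
#include <vector>

static void report(const char* name, std::vector<double> ms) {
    std::sort(ms.begin(), ms.end());
    double avg = 0;
    for (double m : ms) avg += m;
    avg /= ms.size();
    // 95th-percentile frame time is a common consistency proxy.
    double p95 = ms[ms.size() * 95 / 100];
    std::printf("%s: avg %.1f ms, p95 %.1f ms\n", name, avg, p95);
}

int main() {
    // Both average ~33 ms, i.e. "30 fps" on paper.
    report("steady", {33.3, 33.4, 33.3, 33.2, 33.4, 33.3});  // feels smooth
    report("uneven", {16.7, 50.0, 16.7, 50.0, 16.7, 50.0});  // judders badly
}
```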
 
In other words, what are the default post-processing effects and IQ standards console gamers shouldn't be without - regardless?

Short answer: let the devs and artists decide. As long as the game looks good, I don't really care which checkboxes are ticked. The dev team and artists combined will find the best way to get a good combination of visuals and performance. To me, nothing should be off limits.
 

I agree - there should be no forbidden techniques; it all depends on the project. I really dislike Quincunx AA for its nasty texture-smearing side effects, but if a title lacked fine texture detail (such as a Mario 64-style title), then the cheapness of Quincunx might free up cycles for another, more important effect.
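
For context on the smearing: Quincunx is commonly described as blending each pixel's own sample (weight 1/2) with four corner samples (1/8 each) that are shared with neighbouring pixels, so the filter kernel is wider than one pixel. A toy sketch with made-up sample values, using those commonly cited weights (not from any official spec), showing a hard texture edge bleeding into its neighbour:

```cpp
// A toy sketch of why a Quincunx-style resolve smears texture detail.
// Commonly cited weights: the pixel's own sample at 1/2 plus four corner
// samples at 1/8 each; the corners are shared with neighbouring pixels,
// so the kernel is wider than one pixel. Values below fake a hard
// black/white texture edge.
#include <cstdio>

int main() {
    // corners[y][x] sits on the corner grid; pixel (x, y) blends corners
    // (x, y), (x+1, y), (x, y+1), (x+1, y+1) with its own centre sample.
    double corners[3][4] = {{0, 0, 1, 1}, {0, 0, 1, 1}, {0, 0, 1, 1}};
    double centres[2][3] = {{0, 0, 1}, {0, 0, 1}};

    for (int y = 0; y < 2; ++y) {
        for (int x = 0; x < 3; ++x) {
            double out = 0.5 * centres[y][x]
                       + 0.125 * (corners[y][x] + corners[y][x + 1] +
                                  corners[y + 1][x] + corners[y + 1][x + 1]);
            std::printf("%.3f ", out);  // prints 0.000 0.250 1.000: the hard
        }                               // edge has bled into its neighbour
        std::printf("\n");
    }
}
```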

Personally, as someone who owns a PC and a PS4, I really dislike the low levels of AF in console titles, given that on my PC I'll happily sacrifice resolution for more on-screen effects at a smooth-ish 30. Then again, as someone who has always lagged behind the cutting edge on PC, I've gotten used to trading off (hell, I played Doom in a 40% window on a 386SX) and don't care that much. I do object to how few tweaking parameters are exposed to the end user on PC these days, though - half the fun of a title for me was fiddling with my own acceptable LoD settings etc. (I'm pretty sure I spent more time tweaking Gothic 3 than playing it).
 
I know that used to be the prevailing opinion of console gamers - has it now changed? Would they now like the option of having options?

I'd be happy with two options personally:

60fps - lower graphical settings/resolution.
30fps - higher graphical settings/resolution.

(I was also one of those former PC gamers who spent most of their time fiddling with settings - it's a fairly pointless exercise.)

Just to be clear: I don't care about being able to change individual settings - just the option of 30 or 60.

Personally, I think 343i have made good choices for Halo 5. It looks pretty, too.
 
Is it Halo 5 with butter-smooth 60fps at the cost of IQ, or Until Dawn with awesome graphics at the cost of performance? Which trade-offs are more forgivable than others?
The case of Halo 5 is similar to the case of Halo 3 on the 360. Gamers expected to be blown away but got a game that sacrificed IQ and detail for scale. People still loved it for the multiplayer and because it's Halo. I personally found it mediocre and disappointing in the visual department. ODST and Reach weren't that impressive either, although they were improved. I think people got used to not getting the most outstanding graphics from Halo. As long as it's good looking, maintains its art style, has a good story and awesome multiplayer, they won't mind. It won't be a mediocre-looking game, that's for sure, and its framerate/fluidity does compensate to some extent.

It will still look and play better than any COD game, and those have been selling like hotcakes despite their shortcomings.
 
The thing is, this game is actually quite pretty in terms of art and visuals. Some IQ issues are about the only complaint worth mentioning.
 
If they had a toggle for 30 vs 60Hz, I'd be OK with that, but ultimately the devs and artists would still decide what the compromises would be for each setting. That's the way it should be. I want games to look cohesive; I don't want a whole bunch of toggles that produce varying results. The studio is going to produce the best and most balanced image that suits their art style for a given frame time.
 
60fps - lower graphical settings/resolution.
30fps - higher graphical settings/resolution.
Scaling down graphics resolution doesn't help the CPU at all. Scaling down most graphics settings doesn't help the CPU either (AA, filtering, post-effect quality, lighting/ambient quality, etc.). If you are already using close to 100% of the CPU at 30 fps, it is not easy to scale up to 60 fps by decreasing graphics quality alone. You also need game logic and level design changes. The most demanding levels would likely require a drastic redesign, making the game play quite differently.

Releasing both 30 fps and 60 fps console versions of the same game would be a big extra cost to the whole team (level designers, level artists, lighting artists, QA, QC, etc.), not just the rendering team. As AAA game projects are already too expensive, I doubt this will happen.
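
A toy illustration of that point, with entirely made-up numbers: in an ideally pipelined frame, the frame time is bounded by whichever of the CPU and GPU takes longer, and resolution scaling only shrinks the GPU side.

```cpp
// Toy frame-budget arithmetic behind the point above, with made-up numbers.
// In an ideally pipelined frame, frame time is bounded by the slower of the
// CPU and GPU, and resolution scaling only shrinks the GPU side.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuMs = 25.0;         // logic/AI/physics: resolution-independent
    const double gpuMsFullRes = 30.0;  // GPU cost at the full pixel count

    for (double pixelFrac : {1.0, 0.75, 0.5}) {
        double gpuMs = gpuMsFullRes * pixelFrac;  // cost assumed ~ pixel count
        double frameMs = std::max(cpuMs, gpuMs);
        std::printf("pixels x%.2f: gpu %.1f ms, frame %.1f ms (~%.0f fps cap)\n",
                    pixelFrac, gpuMs, frameMs, 1000.0 / frameMs);
    }
    // Output bottoms out at 25 ms (~40 fps): no resolution drop reaches 60 fps
    // while the CPU side still costs 25 ms per frame.
}
```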
 
It presumably must work for some games, since we have quite a few that adjust resolution on the fly (id Tech 5 games, Halo 5, etc.). I guess none of those games push the CPU to 100%, then.

Also, why would so many games released on the two primary consoles show a resolution difference between them even though they're using what is essentially the same CPU? None of them are pushing the CPU very hard?

I imagine Halo 5 could offer a toggle between 1080p @ 30fps and variable resolution @ 60fps, since the base performance is actually 60fps. I do think they made the right decision with their game, though.

Correct me if I'm wrong, but if 60fps is the baseline, then scaling up for crisper IQ and higher resolution / settings should be completely feasible. It's scaling down that would be the issue / challenge.
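
A back-of-the-envelope sketch of that scaling-up intuition, with assumed numbers (a hypothetical GPU cost at 1080p, and GPU cost taken as proportional to pixel count): halving the frame rate doubles the GPU budget, which buys roughly the full pixel count plus spare time for richer settings.

```cpp
// Back-of-the-envelope sketch of the "scale up for a 30 fps mode" intuition.
// Assumed numbers; GPU cost taken as proportional to pixel count.
#include <cmath>
#include <cstdio>

int main() {
    const double gpuMsAt1080p = 20.0;  // hypothetical GPU cost at 1920x1080

    for (double budgetMs : {16.7, 33.3}) {            // 60 fps vs 30 fps budget
        // Fraction of 1080p's pixels the budget affords, capped at native.
        double pixelFrac = std::fmin(budgetMs / gpuMsAt1080p, 1.0);
        // Uniform scaling on both axes: height scales with sqrt(pixelFrac).
        int height = static_cast<int>(1080 * std::sqrt(pixelFrac));
        std::printf("budget %.1f ms -> up to ~%dp, %.1f ms spare for settings\n",
                    budgetMs, height, budgetMs - pixelFrac * gpuMsAt1080p);
    }
}
```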
 