Digital Foundry Article Technical Discussion Archive [2014]

Also the design of the game won't be limited to what the older gen could do. It will be built around the new gen's capabilities.

The game seems promising... hopefully the framerate and awkward NPC animation issues get resolved.
 
I thought I heard someone from Ubi saying something along the lines of "our first next gen only AC game" in one of the game's presentations during E3... Maybe there's another team handling the old gen version, with the main devs not worrying much about whether everything from the next gen game can carry over?
 
And another game defecates its way onto our screens at 20~50 fps, showing us just how little hope there is for the future.

But at least it's 1080p! Hooraaaaaaaay!!

Actually, it's quite the contrary. I do believe fluctuating framerates are the bright future of gaming... with the advent of adaptive-sync and G-Sync monitors. :rolleyes:

And such a resolution on a PC build unfortunately means absolutely nothing (for either console). Some recent Watch_Dogs trailers/gameplay snippets were running on a genuine PS4 at 1080p... and we all know how that turned out in the retail game.
 
Only if they eventually become built into everything for essentially free do I see those catching on. I'm not sure what the prospects of that are.

Adaptive-sync is supposed to be just that. G-Sync, though, isn't.

About the prospects, I fully expect the PS5 and XB2 to have adaptive-sync available by default for all their games. I'll be seriously disappointed if they don't. The PS4 and XB1 could almost have had it at launch.

But they'll probably need a DisplayPort output. That might be a problem, I admit, because they would still need an HDMI (2.0?) output for compatibility with current TVs, and HDMI + DisplayPort is more costly than HDMI alone. But I think in ~5 years the DisplayPort hardware needed for adaptive-sync will be much cheaper than it is now.
 
If the technology gains any traction they'll add support in a future HDMI standard.

I'd be sceptical about any extension to the HDMI spec that doesn't benefit TV or movies gaining wide adoption. Take the DisplayPort 1.2 spec: it was finalised in Dec 2009 and most panels on the market still aren't fully compliant. The ASIC market for displays is a ruthless 'lowest cost wins' market, and building support for variable display rates that will only benefit a subset of the console market is unlikely to appeal. If it does gain traction it may only be in the high-end sets. I do hope I'm wrong though.
 
I'd be sceptical about any extension to the HDMI spec that doesn't benefit TV or movies gaining wide adoption. Take the DisplayPort 1.2 spec: it was finalised in Dec 2009 and most panels on the market still aren't fully compliant.

There are quite a few things that make it into HDMI that have nothing to do with TV or movies, like Ethernet-over-HDMI for example, as well as ever-so-niche features like SACD streaming support. However, getting TV manufacturers to support it is something else. The best bet is probably Sony, who already make some of the lowest-latency TVs around (to appeal to gamers) as well as PlayStation branded displays.

The ASIC market for displays is a ruthless 'lowest cost wins' market, and building support for variable display rates that will only benefit a subset of the console market is unlikely to appeal. If it does gain traction it may only be in the high-end sets. I do hope I'm wrong though.

At the low end, yes, but some TV manufacturers like Panasonic, Samsung and Sony offer everything from mid-range to high to crazy high-end models. It matters less when you're passing that cost on to the consumer in a premium product.
 
Actually, it's quite the contrary. I do believe fluctuating framerates are the bright future of gaming... with the advent of adaptive-sync and G-Sync monitors. :rolleyes:

High and stable frame rates will still be better than fluctuating ones, even with adaptive sync. That just makes the negative impact of fluctuating frame rates smaller. 20~50 fps will still feel and play worse than 60 fps, but the most jarring, juddering shitness will be reduced or eliminated (which is entirely positive - there is no downside to adaptive sync in itself).

And neither of the current "next gen" consoles supports it, and neither do TVs. Almost every monitor in production or planned for this year doesn't either.

So that 20~50 fps game we look at today has - and always will have - a shit frame rate. So will the ones we see over the next few years.

Now adaptive resolution ... that has existed for a few years now. And it works. It works very very well. But no-one bothers with it.

So no, there is little hope for either the immediate or even medium term future.
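
To put a rough number on what adaptive sync would and wouldn't fix, here's a quick toy simulation (my own made-up frame times, and it assumes rendering isn't blocked waiting for the flip, i.e. triple buffering). With a fixed 60 Hz refresh, a frame that misses a vblank waits for the next one, so on-screen intervals snap between multiples of 16.7 ms - that's the judder. With adaptive sync the interval just tracks the render time: smoother, but a 45 ms frame is still a 45 ms frame.

Code:
// Toy simulation (numbers made up) comparing fixed 60 Hz vsync with adaptive
// sync for a game whose frame times wobble around the 20~50 fps range.
// Simplification: rendering is assumed not to stall waiting for the flip,
// so only the presentation side is modelled.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;                     // 16.7 ms per vblank
    const double renderMs[] = { 20, 25, 40, 48, 33, 22, 45 }; // fluctuating frame times

    double readyAt = 0.0, lastFixed = 0.0, lastAdaptive = 0.0;
    std::puts("render   fixed-60Hz interval   adaptive-sync interval");
    for (double r : renderMs) {
        readyAt += r;
        // Fixed refresh: the frame waits for the next vblank after it is ready,
        // so the on-screen interval is always a multiple of 16.7 ms (judder).
        double fixedFlip = std::ceil(readyAt / refresh) * refresh;
        // Adaptive sync: the display starts a refresh as soon as the frame is
        // ready, so the on-screen interval simply equals the render time.
        double adaptiveFlip = readyAt;
        std::printf("%4.0f ms       %5.1f ms               %5.1f ms\n",
                    r, fixedFlip - lastFixed, adaptiveFlip - lastAdaptive);
        lastFixed = fixedFlip;
        lastAdaptive = adaptiveFlip;
    }
    return 0;
}

The fixed-refresh column bounces between 16.7, 33.3 and 50 ms even when consecutive frames differ by only a few milliseconds; the adaptive column is free of that quantisation, but no faster.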
 
Not all frame rate hiccups are resolution related. Probably even fewer are for deferred rendering engines.

Obviously not all frame rate hiccups are rendering load related, but I'd wager that most still are (even with deferred renderers).

Adaptive resolution works wonders for brief periods where the rendering load spikes (e.g. alpha from asplodes) and is far less noticeable than tearing as a means of keeping frame rate up.

But I suppose if you make a game that slops around in the 20~50 fps range, you'd constantly be varying resolution by a large amount to try and keep it up around 50 (lol). Even with adaptive sync the game is going to sludge along with poor and inconsistent input lag at the bottom end.

Adaptive sync is great, but it doesn't magically make poor and inconsistent frame rates not poor and inconsistent. And the brand new consoles don't even support it anyway. So there goes the next 6 or 7 years...
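
And for reference, adaptive resolution really is this simple in principle. Here's a minimal heuristic sketch (the 30 fps budget, clamp range and damping factor are all just assumptions of mine, not anyone's shipping code): each frame, nudge the render scale toward whatever would have hit the GPU budget last frame, then upscale to native 1080p.

Code:
// Minimal dynamic-resolution heuristic sketch (illustrative assumptions only).
// Each frame, move the render scale toward whatever would have hit the GPU
// budget last frame, damped so a single spike doesn't cause a visible jump.
#include <algorithm>
#include <cstdio>

struct DynamicResolution {
    double targetFrameMs = 33.3;  // assumed budget: a solid 30 fps
    double scale = 1.0;           // fraction of native resolution per axis
    double minScale = 0.7;        // never drop below ~70% of native
    double maxScale = 1.0;

    void update(double gpuFrameMs) {
        double ideal = scale * (targetFrameMs / gpuFrameMs); // scale that would have fit
        scale = 0.8 * scale + 0.2 * ideal;                   // damped adjustment
        scale = std::clamp(scale, minScale, maxScale);
    }
};

int main() {
    DynamicResolution dr;
    // Simulated GPU times: a brief load spike (alpha-heavy explosion), then recovery.
    const double frames[] = { 30, 31, 45, 44, 36, 32, 30 };
    for (double ms : frames) {
        dr.update(ms);
        std::printf("gpu %4.1f ms -> render at %dx%d, upscale to 1920x1080\n",
                    ms, int(1920 * dr.scale), int(1080 * dr.scale));
    }
    return 0;
}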
 
...where the rendering load spikes (e.g. alpha from asplodes)
The next wave of games will include new techniques that provide a smoother frame rate. For example, a compute shader based particle system (bin+gather, no overdraw) is generally around 3x faster than a traditional pixel shader / alpha blending based particle system, but in the worst case (a nearby explosion fills the screen with extreme overdraw) the compute shader based system is over 20x faster. This will greatly reduce particle-related hiccups. GPU-driven rendering pipelines cut the draw call cost to almost zero, eliminating most of the fluctuation caused by varying visible object counts.

However, techniques like these require a complete rewrite of existing (CPU / pixel shader based) rendering engines, and are not at all compatible with last gen consoles or DX10 PCs. It will take some time.
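
As a rough illustration of the bin+gather idea (a deliberately simplified CPU-side sketch of the concept only - particles reduced to fixed-alpha screen-space quads, nothing like a real GPU implementation): the traditional scatter path does one framebuffer blend per particle per covered pixel, so the worst case explodes with overdraw, while binning particles into screen tiles and gathering per pixel writes each pixel exactly once no matter how many particles overlap it.

Code:
// Illustrative CPU-side sketch (simplified to fixed-alpha quads) contrasting a
// traditional alpha-blended "scatter" particle pass with the bin + gather approach:
// scatter does one framebuffer read-modify-write per particle per covered pixel
// (cost explodes with overdraw), bin + gather writes each pixel exactly once.
#include <cstdio>
#include <vector>

struct Particle { int x, y, size; float alpha; };

constexpr int W = 64, H = 64, TILE = 16;

// Traditional scatter: framebuffer traffic scales with overdraw.
long scatter(const std::vector<Particle>& ps, std::vector<float>& fb) {
    long writes = 0;
    for (const auto& p : ps)
        for (int y = p.y; y < p.y + p.size && y < H; ++y)
            for (int x = p.x; x < p.x + p.size && x < W; ++x) {
                fb[y * W + x] = fb[y * W + x] * (1 - p.alpha) + p.alpha;
                ++writes;                                   // one blend per layer
            }
    return writes;
}

// Bin + gather: bin particles into screen tiles, then resolve each pixel once.
long binGather(const std::vector<Particle>& ps, std::vector<float>& fb) {
    const int tilesX = W / TILE, tilesY = H / TILE;
    std::vector<std::vector<const Particle*>> bins(tilesX * tilesY);
    for (const auto& p : ps)                                // binning pass
        for (int ty = p.y / TILE; ty <= (p.y + p.size - 1) / TILE && ty < tilesY; ++ty)
            for (int tx = p.x / TILE; tx <= (p.x + p.size - 1) / TILE && tx < tilesX; ++tx)
                bins[ty * tilesX + tx].push_back(&p);

    long writes = 0;
    for (int y = 0; y < H; ++y)                             // gather pass
        for (int x = 0; x < W; ++x) {
            const auto& bin = bins[(y / TILE) * tilesX + x / TILE];
            if (bin.empty()) continue;                      // real code skips empty tiles
            float acc = fb[y * W + x];                      // blend in a "register"
            for (const Particle* p : bin)
                if (x >= p->x && x < p->x + p->size && y >= p->y && y < p->y + p->size)
                    acc = acc * (1 - p->alpha) + p->alpha;
            fb[y * W + x] = acc;                            // single write per pixel
            ++writes;
        }
    return writes;
}

int main() {
    // Worst case from above: a nearby explosion, heavy particle overlap on screen.
    std::vector<Particle> ps(50, {8, 8, 48, 0.1f});
    std::vector<float> a(W * H, 0.0f), b(W * H, 0.0f);
    std::printf("scatter framebuffer blends:    %ld\n", scatter(ps, a));
    std::printf("bin+gather framebuffer writes: %ld\n", binGather(ps, b));
    return 0;
}

On that worst-case input the scatter path does 115,200 framebuffer blends against 4,096 single writes for bin+gather - roughly the same order as the worst-case factor mentioned above, though real GPU numbers obviously depend on far more than write counts.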
 