Rendering of Watch_Dogs

To me it feels like this problem runs deeper than the size of the various buffers; it's more likely the result of renderers that weren't built around the next-gen systems and techniques. Neither WD nor (in particular) TF looks like a "proper" next-gen engine with linear lighting and physically based shading; they're more likely forward+ based approaches. So maybe the problem is submitting geometry twice, or some other inherent inefficiency that can't be helped without major architectural reworking and possibly asset modifications too.
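Just to make the "submitting geometry twice" point concrete, here's a toy sketch of the difference between a depth pre-pass + forward shading pass and a single-geometry-pass deferred renderer. All the function names are made-up stand-ins, not anything from an actual engine:

Code:
// Toy sketch only. In a forward+ style renderer the scene geometry is
// typically drawn twice (depth pre-pass + shading pass), while a classic
// deferred renderer draws it once into a G-buffer and then shades in
// screen space. All functions below are hypothetical stubs.
#include <cstdio>
#include <vector>

struct Mesh { const char* name; };

void drawDepthOnly(const Mesh& m)     { std::printf("depth pre-pass: %s\n", m.name); }
void drawForwardShaded(const Mesh& m) { std::printf("forward shade:  %s\n", m.name); }
void drawToGBuffer(const Mesh& m)     { std::printf("g-buffer fill:  %s\n", m.name); }
void deferredLightingPass()           { std::printf("full-screen lighting pass\n"); }

void forwardPlusFrame(const std::vector<Mesh>& scene) {
    for (const Mesh& m : scene) drawDepthOnly(m);       // 1st geometry submission
    // ...light culling into screen-space tiles would happen here...
    for (const Mesh& m : scene) drawForwardShaded(m);   // 2nd geometry submission
}

void deferredFrame(const std::vector<Mesh>& scene) {
    for (const Mesh& m : scene) drawToGBuffer(m);       // single geometry submission
    deferredLightingPass();                              // lighting decoupled from geometry
}

int main() {
    std::vector<Mesh> scene = { {"building"}, {"car"}, {"pedestrian"} };
    forwardPlusFrame(scene);
    deferredFrame(scene);
}

That extra geometry submission is the kind of inherent cost I mean: it's baked into the pipeline choice, not something a buffer-size tweak fixes.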

Yes, the Order on the other hand works pretty nicely without going deferred; but its scope is much more limited and the framerate isn't 60fps either.

"The Disrupt engine uses an aggressively multithreaded renderer, running on fully deferred physically based rendering pipeline with some technological twists to allow for more advanced effects."
That's a quote from this interview http://www.dsogaming.com/interviews...-global-illumination-city-size-modding-scene/
 
In that case, I might be wrong. But it still doesn't look like this is the most the new systems could do; some pretty big inefficiencies have to be there to explain the low resolutions.
 
Looks like I'm unable to edit my previous post, so I'll just say that their "fully deferred physically based rendering pipeline" might be a little misleading, since I don't think they have embraced the full physically based shading pipeline (see Crysis 3 in comparison to Ryse), but at least it looks like their engine is using deferred rendering.
Also, I'm off topic; it's probably better to move this to the game's OT thread.
 
Compare that to Infamous's impressive open-world engine, which is very dependent on GPU compute. Since Ubisoft apparently optimized their engine for Nvidia GPUs, they most certainly didn't use the GPU compute of those AMD GPUs.

I believe GPU compute was even used in Tomb Raider DE on both XB1 & PS4 to run TressFX 2. If Ubisoft runs all the physics simulation on the weak Jaguars, they might not have enough time left to run the graphics at a decent output (resolution)...

Even the facial animation in Infamous was processed by GPU compute, and 100% of Infamous's particle engine runs on asynchronous GPU compute. If all this work is given to the poor Jaguars on PS4/XB1...
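To illustrate the kind of work we're talking about (toy code, nothing from any actual engine): this is a per-particle update run on the CPU, and the point is that the same embarrassingly parallel loop body maps directly onto a compute-shader kernel, one thread per particle, that can be scheduled asynchronously alongside the graphics queue instead of eating Jaguar cycles.

Code:
// Toy illustration of work that can be moved to GPU compute.
// This version runs the per-particle update on the CPU; on the consoles
// the same loop body would be a compute kernel dispatched asynchronously
// next to the graphics workload.
#include <cstdio>
#include <vector>

struct Particle { float x, y, z; float vx, vy, vz; };

void updateParticlesCpu(std::vector<Particle>& ps, float dt) {
    const float gravity = -9.8f;
    for (Particle& p : ps) {            // on the GPU, this body is the compute kernel
        p.vy += gravity * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> ps(100000, Particle{0, 10, 0, 1, 0, 0});
    for (int frame = 0; frame < 3; ++frame)
        updateParticlesCpu(ps, 1.0f / 30.0f);
    std::printf("particle 0 after 3 frames: y = %.3f\n", ps[0].y);
}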
 
If the game is CPU limited, why the cut in resolution?

There has to be a bandwidth or fillrate issue here as well.
 
If the game is CPU limited, why the cut in resolution?

There has to be a bandwidth or fillrate issue here as well.
For PlayStation 4 there is an unbalanced trade-off: as the CPU's share of memory bandwidth grows, the GPU's available bandwidth shrinks by more than that amount. Somebody posted a chart somewhere in the last week or so.
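I don't have the chart handy, but the shape of the argument is something like this back-of-envelope sketch. The 176 GB/s peak is PS4's theoretical GDDR5 bandwidth; the penalty factor is just a placeholder I picked to show the disproportionate loss, not the measured curve:

Code:
// Back-of-envelope sketch of the CPU/GPU bandwidth trade-off.
// 176 GB/s is PS4's theoretical GDDR5 peak; the contention penalty
// factor below is an assumption for illustration, NOT the real data.
#include <cstdio>

int main() {
    const double totalBw = 176.0;   // GB/s, theoretical peak
    const double penalty = 2.5;     // assumed GB/s of GPU bandwidth lost per GB/s the CPU uses
    for (double cpuBw = 0.0; cpuBw <= 20.0; cpuBw += 5.0) {
        double gpuBw = totalBw - penalty * cpuBw;   // disproportionate loss
        std::printf("CPU uses %5.1f GB/s -> GPU left with roughly %6.1f GB/s\n", cpuBw, gpuBw);
    }
}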

In general I agree with your sentiments: Watch_Dogs looks "ok" but isn't blowing the doors off current gen (One/PS4) graphically; then again, it's a cross-gen title. I'm one of those wacky 'gamer' folks, so I'm interested in the game. As long as the graphics aren't bad, I'm good. I just hope the game is as well ;)

I'm optimistic. I've not read a remotely negative preview, and all the journalists who got to play the unrestricted 90-minute hands-on in March/April were gagging to play more. That sounds good to me.
 
For PlayStation 4 there is an unbalanced trade-off: as the CPU's share of memory bandwidth grows, the GPU's available bandwidth shrinks by more than that amount.

Interesting, but logical. However, in this case the X1 shouldn't be as heavily affected, especially because the CPU doesn't have access to the ESRAM memory bus. But apparently the pixel count wasn't pushed back as much on the X1 relative to 900p as it was on the PS4 relative to 1080p.

In general I agree with your sentiments: Watch_Dogs looks "ok" but isn't blowing the doors off current gen (One/PS4) graphically

I think it looks very good; I only noted that it's missing that "real", or rather hyper-real, look that a few games seem to have mastered. I've already mentioned the scene complexity, and some of the nightclub lighting was indeed very real.

It's just that becoming CPU limited shouldn't also constrain the resolution. So far, going sub-1080p on PS4 and sub-800p on X1 seemed to be required only in games with last-gen engine architectures.
 
If the game is CPU limited, why the cut in resolution?

There has to be a bandwidth or fillrate issue here as well.

Why do Ubisoft's CPU-bottlenecked PC games (like AC4) not run well even with big GPUs? Is there a bandwidth problem on PC rigs?

Didn't Sucker Punch tell us that the CPU was their main bottleneck on PS4? Didn't Respawn insinuate that the big framerate drops in Titanfall on XB1 were not because of the GPU/resolution but because of the CPU?

What about those multiplatform games running way worse in many cases on Wii U, which could only be explained by a weaker CPU or badly optimized CPU code?
 
Sounds like the GI is similar to (if not the same as) Far Cry 3's implementation - fairly heavy load IIRC.
 
Why do Ubisoft's CPU-bottlenecked PC games (like AC4) not run well even with big GPUs? Is there a bandwidth problem on PC rigs?

Didn't Sucker Punch tell us that the CPU was their main bottleneck on PS4? Didn't Respawn insinuate that the big framerate drops in Titanfall on XB1 were not because of the GPU/resolution but because of the CPU?

What about those multiplatform games running way worse in many cases on Wii U, which could only be explained by a weaker CPU or badly optimized CPU code?

It's because of dynamism (Morin).
 
Why do Ubisoft's CPU-bottlenecked PC games (like AC4) not run well even with big GPUs? Is there a bandwidth problem on PC rigs?

Didn't Sucker Punch tell us that the CPU was their main bottleneck on PS4? Didn't Respawn insinuate that the big framerate drops in Titanfall on XB1 were not because of the GPU/resolution but because of the CPU?

What about those multiplatform games running way worse in many cases on Wii U, which could only be explained by a weaker CPU or badly optimized CPU code?

It's a multicore issue. The crux of what Cerny and devs like Sucker Punch are getting at in terms of compute and multicore CPU usage is that compute simply MUST be used to make up for the weak CPU. SP began to hit their limits and they were already dipping into compute. Ubi is likely not using it for any game simulation, AI or whatever at all.
 
It's a multicore issue. The crux of what Cerny and devs like Sucker Punch are getting at in terms of compute and multicore CPU usage is that compute simply MUST be used to make up for the weak CPU. SP began to hit their limits and they were already dipping into compute. Ubi is likely not using it for any game simulation, AI or whatever at all.

With first parties, that will be solved once they start mastering the CPU registers (i.e. programming certain functions to the metal). After all, the Emotion Engine had 3.6 GFLOPS... You could run more than 10 copies of San Andreas, with its pedestrian pathfinding, on 6 Jaguar cores. This dynamism thing sounds like a joke, really. After all, I suppose the same dynamism will be in the Espresso, Xenon and Cell versions...
 
This dynamism thing sounds like a joke, really. After all, I suppose the same dynamism will be in the Espresso, Xenon and Cell versions...

I've not read the article, but I assume they are referring to the dynamics, or rather the interactivity, of things onscreen, be they vehicles or pedestrians, as well as the core mechanics of interacting with the city, i.e. your ability to interfere with electricity, water, gas, security systems, sanitation etc.

On last gen versions, they've said a couple of times that the world is less dense.

In a week and a half we'll know.
 
Interesting, but logical. However, in this case the X1 shouldn't be as heavily affected, especially because the CPU doesn't have access to the ESRAM memory bus. But apparently the pixel count wasn't pushed back as much on the X1 relative to 900p as it was on the PS4 relative to 1080p.



I think it looks very good; I only noted that it's missing that "real", or rather hyper-real, look that a few games seem to have mastered. I've already mentioned the scene complexity, and some of the nightclub lighting was indeed very real.

It's just that becoming CPU limited shouldn't also constrain the resolution. So far, going sub-1080p on PS4 and sub-800p on X1 seemed to be required only in games with last-gen engine architectures.

Oh, that reminds me, we can chalk up another wrong rumor to Thuway (or was it Famousmortimer?); he had Watch Dogs at 1080p on PS4 / 960p on X1. Of course he would probably just say "well, they changed it later".
 
Increasing a game's resolution also increases CPU temps; perhaps more work by the API, not just the potential for more distant objects to be rendered.

http://www.tomshardware.com/answers/id-1730220/increased-resolution-lead-increased-cpu-temp.html

There is a different load on the CPU at different resolutions on PC, yes. 1080p can actually present a heavier load to the CPU than, say, 720p. You could see this, for example, when a game runs at 90 FPS on both an HD 7770 and an HD 7970 at 720p: clearly it's CPU limited to 90 FPS, right? Crank it up to 1080p and it could conceivably run limited to 70 FPS on both vastly disparate GPUs. Once again CPU limited rather than GPU limited, but we can see the CPU load has changed.
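A toy model of what I mean (all the numbers are made up): the frame is paced by whichever of CPU or GPU is slower, and the CPU cost has a small resolution-dependent term, so the "CPU limited" cap itself moves when you change resolution.

Code:
// Toy frame-time model; every number here is a placeholder.
// GPU time scales with pixel count; CPU time is mostly resolution-independent
// but grows a little at higher resolutions, so a CPU-limited framerate cap
// can still drop when resolution goes up.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

double frameMs(double gpuMsPerMpix, double cpuBaseMs, double cpuPerMpixMs, double mpix) {
    double gpuMs = gpuMsPerMpix * mpix;
    double cpuMs = cpuBaseMs + cpuPerMpixMs * mpix;   // small resolution-dependent CPU term
    return std::max(gpuMs, cpuMs);                    // frame paced by the slower of the two
}

int main() {
    const double mpix720  = 1280.0 * 720.0  / 1e6;
    const double mpix1080 = 1920.0 * 1080.0 / 1e6;
    // a "small" and a "big" GPU, same CPU (placeholder costs in ms per megapixel)
    for (double gpuCost : {5.0, 2.0}) {
        double t720  = frameMs(gpuCost, 9.0, 1.5, mpix720);
        double t1080 = frameMs(gpuCost, 9.0, 1.5, mpix1080);
        std::printf("gpu cost %.1f ms/Mpix: 720p -> %.1f fps, 1080p -> %.1f fps\n",
                    gpuCost, 1000.0 / t720, 1000.0 / t1080);
    }
}

With these made-up numbers both GPUs sit at the same CPU-limited framerate at 720p and both drop to a lower, still CPU-limited framerate at 1080p, which is the pattern described above.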
 
There is a different load on the CPU at different resolutions on PC, yes. 1080p can actually present a heavier load to the CPU than, say, 720p. You could see this, for example, when a game runs at 90 FPS on both an HD 7770 and an HD 7970 at 720p: clearly it's CPU limited to 90 FPS, right? Crank it up to 1080p and it could conceivably run limited to 70 FPS on both vastly disparate GPUs. Once again CPU limited rather than GPU limited, but we can see the CPU load has changed.

Thanks, I'm aware of all that. Here's a better-worded question:
I'm more interested in specifically what part of the CPU workload from Skyrim's (Gamebryo engine) render pipeline and API is significantly impacted by increasing the resolution.
 
I still find this CPU-affecting-pixel-processing-power stuff very strange. It'd be nice to hear an actual dev comment on the issue.
 
In that case, I might be wrong. But it still doesn't look like this is the most the new systems could do; some pretty big inefficiencies have to be there to explain the low resolutions.

I'm starting to think this "dynamism" system (or systems) is using a lot, "A LOT", of CPU resources... leaving very little room for anything else. It sounds impressive if everything works correctly, but it's still taxing on the system.

http://blog.ubi.com/watch-dogs-disrupt-engine-multiplayer/

The surface layer of the Disrupt engine is focused on what Guay describes as “dynamism,” or the simulation systems within the game: “In our city we simulate the way people drive cars. The electricity is simulated. The water is simulated. The wind is simulated. Everything reacts to everything. Making all those systems talk to one another is where you get branching reactions.”

Take the rain, for example. When the sky starts to open up, civilians will pull out umbrellas. The lights reflect off wet surfaces. We can see the wind shifting the direction of the rain and blowing debris around. Even leaves and trash on the ground will begin to appear damp and weighted down by moisture. These small but significant details lend an unparalleled level of immersion to Watch Dogs.

Even the clothing comes to life in Watch Dogs. It boggles the mind to think about just how long was spent getting the simple act of Aiden putting his hands in his pockets to look just right. The wind pulling at a passerby’s clothing will cause them to tighten their jackets. “Everyone on the street should have clothing simulation,” Guay says. “We want to see it blow in the wind and move with them.”

These are merely the “details,” though. Something major like a car crash will create a widespread ripple effect. Civilians will get caught up in a traffic jam and start honking or even leave their cars to investigate. Others will be injured in the wreck. Onlookers will alert emergency response teams. It all combines to offer an unprecedented amount of realism in a videogame.
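Obviously none of us knows how Disrupt actually implements this, but the "everything reacts to everything" / branching-reactions idea they describe is roughly the kind of event-driven fan-out sketched below, and it's easy to see why it's CPU work: one trigger keeps spawning more simulation updates. All names here are invented for illustration.

Code:
// Not Disrupt engine code; just a toy of "branching reactions":
// systems subscribe to events and can emit new ones in response,
// so a single trigger fans out across systems on the CPU.
#include <cstdio>
#include <functional>
#include <map>
#include <queue>
#include <string>
#include <vector>

struct EventBus {
    std::map<std::string, std::vector<std::function<void(EventBus&)>>> handlers;
    std::queue<std::string> pending;

    void on(const std::string& evt, std::function<void(EventBus&)> fn) { handlers[evt].push_back(fn); }
    void emit(const std::string& evt) { pending.push(evt); }

    void pump() {                         // drain the queue; reactions can spawn reactions
        while (!pending.empty()) {
            std::string evt = pending.front(); pending.pop();
            std::printf("event: %s\n", evt.c_str());
            for (auto& fn : handlers[evt]) fn(*this);
        }
    }
};

int main() {
    EventBus bus;
    bus.on("car_crash",        [](EventBus& b) { b.emit("traffic_jam"); b.emit("civilian_injured"); });
    bus.on("traffic_jam",      [](EventBus& b) { b.emit("civilians_honk"); });
    bus.on("civilian_injured", [](EventBus& b) { b.emit("call_emergency_services"); });

    bus.emit("car_crash");   // one trigger branches into a chain of follow-up reactions
    bus.pump();
}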
 