Watch_Dogs by Ubisoft

They certainly didn't use GPU compute, Nvidia GPUs not being very good at it, and all physics, particles & animations must certainly rely on the weak consoles' CPUs then, explaining the very bad performance on both consoles.

It's really a pure PC game optimized for Nvidia GPUs and big CPUs ("if you have the CPU to match") ported to all consoles; it really shouldn't surprise us, as we know how Nvidia is affiliated with Watch_Dogs. It explains all the downgrades and why it's one of the worst-optimized games on both consoles. I think even on XB1 only Dead Rising 3 (launch title) runs worse than WDs.

Case closed.
Using compute is not the only available way to extract performance and visual fidelity out of consoles; only exclusives have exploited it to a moderate degree so far. Other multiplats didn't and they look just fine: BF4, NFS Rivals, Ghosts, Thief, Trials, etc.

Implying the downgrades are caused by not using compute is ridiculous; even the games that use compute don't overuse it, it's just an implementation in a couple of effects, nothing any decent PC GPU can't handle. Besides, we don't know if the game is using it or not.

The biggest reason for the downgrade is the old consoles, not some PC vendor.

Let's not over-egg the pudding, Ubisoft - for all their faults - promised nothing.

Nope, they promised to deliver next-gen visuals with the Disrupt engine, they consistently showed excellent footage up to mid-2013, and then they flopped; there is nothing that absolves them of what they did.

http://www.dsogaming.com/interviews...-global-illumination-city-size-modding-scene/
http://blog.ubi.com/watch-dogs-disrupt-engine-multiplayer/
 
Ubisoft also never stated the final resolution until recently. Previously there was no information, or "we are targeting 1080p" on PS4.

Actually 1080p was confirmed and there was a video of them stating so. I believe UBI is purposely capping the PS4 resolution for some odd reason... maybe to keep review sites focused on gameplay, rather than the 1080p/900p/720p battle. Releasing the patch later... but that's my tinfoil hat theory...

http://www.videogamer.com/ps4/watch_dogs/news/ubisoft_confirms_watch_dogs_is_1080p_on_ps4.html

The PlayStation 4 version of Watch Dogs will render at native 1080p, cinematics animation lead Lars Bonde has confirmed.
 
Using compute is not the only available way to extract performance and visual fidelity out of consoles; only exclusives have exploited it to a moderate degree so far. Other multiplats didn't and they look just fine: BF4, NFS Rivals, Ghosts, Thief, Trials, etc.

Implying the downgrades are caused by not using compute is ridiculous; even the games that use compute don't overuse it, it's just an implementation in a couple of effects, nothing any decent PC GPU can't handle. Besides, we don't know if the game is using it or not.

The biggest reason for the downgrade is the old consoles, not some PC vendor.



Nope, they promised to deliver next-gen visuals with the Disrupt engine, they consistently showed excellent footage up to mid-2013, and then they flopped; there is nothing that absolves them of what they did.

http://www.dsogaming.com/interviews...-global-illumination-city-size-modding-scene/
http://blog.ubi.com/watch-dogs-disrupt-engine-multiplayer/

Not sure about old gen, but the PC assets are all identical to the next-gen consoles for the most part. Perhaps it was the next-gen consoles that held back the high-end PC version. Maybe they found the next-gen consoles just couldn't hit the performance expected. They did, after all, have to drop the PS4 version to 900p, which suggests they were hitting some hardware limit. Perhaps memory bandwidth. Or, who knows, there may be a 1080p patch at some point.
 
Stolen from GAF. This is very telling... especially the RAM requirements. The PS4/XB1 may be more limited (graphics-wise) because of the memory reservations in place on the systems. I wouldn't be surprised if the consoles aren't hitting the medium settings all the time.

According to Morin, dual-core CPUs will not be able to run Watch_Dogs (so it will be interesting to see how our simulated dual-core system will behave). Not only that, but the game may not even boot if PC gamers are not equipped with 6GB of RAM. When a fan asked Morin whether the game would run on his PC system that was equipped with 4GB of RAM, Morin replied and said that he should climb it up to 6GB.

“The main issue is your ram. If you climb it up to 6 you should be able to run the game at low settings.“

Morin also claimed that a GTX670 will be able to run the game at Ultra settings, provided the CPU does not bottleneck the GPU. But what CPU is ideal for Ultra settings? Morin stated that every CPU that scores above 9000 points in the PassMark CPU benchmark will be able to run the game at max settings.

“As a ref: cpu above 9 000 -10 000 should run decently in Ultra with a 670 GPU“

Morin has also stated that users with CPUs similar to the Intel i5-3470 will be able to run the game at medium or high settings, as that CPU will bottleneck the game.
 
It's a multiplatform game being discussed by all sides with equal impartiality (it does kinda suck, especially next to the promo materials). Why drag console wars-ery into this? Are we to avoid ever criticising multiplatform visuals because they have platform-diplomatic-immunity? "I am cross-platform; you can't touch me!"

WD is a massive downgrade. Ubisoft over-promised and under-delivered. That's what's got people annoyed, irrespective of any platform. You're the only person here who can't separate discussion of the game from the console wars.

I'm not the one who starts the comparisons to Infamous or last gen games. And as I said before, it's perfectly fine to find faults in this game (there are many) but that's very different from engaging in the "GTA5 looks better" hyperbole.

If you guys wanna hear and see real bullshit, look up the 2005 E3 Oblivion videos. For a little perspective on pre-release exaggeration and hype machine engineering.
No need to go that far. Just look at DS2. From Software passed off a tech demo as the real game all the way until late January, and then, when consumers finally got the game in their hands and it looked and performed nothing like the tech demo, From's explanation was simply "things change".

At least when it comes to Watch_Dogs, we've been aware of how the game really looks for a year now. People expecting the game to jump back to the 2012 quality level have been unrealistic. But then again, Ubisoft is one of those companies that people love to hate...

Nope, they promised to deliver next-gen visuals with the Disrupt engine, they consistently showed excellent footage up to mid-2013, and then they flopped; there is nothing that absolves them of what they did.

http://www.dsogaming.com/interviews...-global-illumination-city-size-modding-scene/
http://blog.ubi.com/watch-dogs-disrupt-engine-multiplayer/

What exactly did they fail to deliver in regards to what's mentioned in those articles?
 
They certainly didn't use GPU compute, Nvidia GPUs not being very good at it,

That's not true at all, Nvidia GPUs are perfectly capable of using GPU compute. They may be way behind AMD in OpenCL but in DirectCompute they're pretty comparable. You've just got to make sure you use the architecture properly to get good performance out of it. And a developer optimising for Nvidia GPUs would obviously do so. Check out these benchmarks:

http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/15

With the exception of Lux, which Nvidia traditionally fares terribly in, the 670 shows greater-than-7850 (PS4) levels of performance here, as derived from the 7870 scores.
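For anyone wondering what "using DirectCompute" actually involves, here's a bare-bones, purely illustrative sketch of dispatching a compute shader through D3D11. It has nothing to do with Ubisoft's engine; the doubling kernel and buffer sizes are made up.

Code:
// Minimal DirectCompute dispatch (Windows, D3D11). Doubles 1024 floats on the GPU.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#include <vector>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

static const char* kCS = R"(
RWStructuredBuffer<float> data : register(u0);
[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID)
{
    data[id.x] = data[id.x] * 2.0f;   // trivial compute work
}
)";

int main()
{
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;

    // Compile the compute shader for shader model 5.0.
    ID3DBlob* blob = nullptr;
    if (FAILED(D3DCompile(kCS, strlen(kCS), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &blob, nullptr)))
        return 1;
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(), nullptr, &cs);

    // A structured buffer with a UAV so the shader can read and write it.
    std::vector<float> init(1024, 1.0f);
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = UINT(init.size() * sizeof(float));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(float);
    D3D11_SUBRESOURCE_DATA sd = { init.data() };
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, &sd, &buf);

    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(buf, nullptr, &uav);

    // Bind and dispatch: 1024 elements / 64 threads per group = 16 groups.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(16, 1, 1);
    ctx->Flush();

    std::printf("Dispatched a DirectCompute kernel on the default adapter.\n");
    return 0;
}

The point is simply that DirectCompute is a vendor-neutral D3D11 feature; whether it runs well on a given GPU comes down to how the kernel maps to that architecture, which is what the benchmarks above are measuring.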

and all physics, particles & animations must certainly rely on the weak consoles' CPUs then, explaining the very bad performance on both consoles.

Even if GPU compute isn't being used in the PC version (and given the high-end CPU requirements, why would they need to?) there's no reason why that would mean the console versions must lack it as well. Don't you expect specific optimisations for each of the consoles' architectures?

It explains all the downgrades and why it's one of the worst-optimized games on both consoles. I think even on XB1 only Dead Rising 3 (launch title) runs worse than WDs.

This seems like a rather large exaggeration to me.
 
Stolen from GAF. This is very telling... especially the RAM requirements. The PS4/XB1 may be more limited (graphics-wise) because of the memory reservations in place on the systems. I wouldn't be surprised if the consoles aren't hitting the medium settings all the time.

WTH, the game will be bottlenecked to medium or high settings on a 3.2GHz quad-core Ivy Bridge? What are they doing with all that power?? I can only assume insane draw call levels. Shame we won't see a Mantle path for this.

EDIT: 9,000-10,000 is a very high PassMark score, requiring a very beefy CPU. For reference, an i7-3820 only scores 9022. That's a 3.6GHz quad-core Ivy Bridge with Hyper-Threading.
 
That's not true at all, Nvidia GPUs are perfectly capable of using GPU compute. They may be way behind AMD in OpenCL but in DirectCompute they're pretty comparable. You've just got to make sure you use the architecture properly to get good performance out of it. And a developer optimising for Nvidia GPUs would obviously do so. Check out these benchmarks:
Exactly; actually NV is bad in OpenCL only because they dropped its driver support a long time ago. With DC they offer similar or better performance compared to AMD's equivalents. They also used DC in Ghosts (an NV-optimized game) to implement the fur simulation effects.

What exactly did they fail to deliver in regards to what's mentioned in those articles?
We support hundreds of dynamic lights, dozens of high resolution shadow casters, dynamic environment reflections and global illumination in a fully dynamic time of day environment. Our setup also allows for screen space effects that are usually not compatible with deferred rendering such as skin subsurface scattering, anisotropic specular highlights and hair shading.

WATCH_DOGS does support Global Illumination. We prioritized developing our Global Illumination technique since we believe that’s one of the key feature that differentiates between current-generation and next-generation games running in a fully dynamic world (i.e.: no pre-rendering possible). Our technique is custom made to fit all the requirements of our environment (urban city, vast exteriors, detailed interiors). At its core, it uses light probes that are baked at the highest quality using a vast cloud of computers.

Dynamic environment reflections, hundreds of shadow casters and Global Illumination my ***.
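For context on the "light probes baked at the highest quality" line from that interview: probe-based GI of that general sort usually means storing low-order spherical harmonics per probe offline and evaluating them per pixel or vertex at runtime. A rough, made-up sketch of that runtime evaluation (not Ubisoft's code; one colour channel only, Ramamoorthi & Hanrahan's closed form):

Code:
#include <cstdio>

struct Vec3 { float x, y, z; };

// Irradiance from 2nd-order SH coefficients (one colour channel).
// sh[0..8] follow the usual (l,m) order:
// (0,0),(1,-1),(1,0),(1,1),(2,-2),(2,-1),(2,0),(2,1),(2,2).
float EvalSHIrradiance(const float sh[9], Vec3 n)
{
    const float c1 = 0.429043f, c2 = 0.511664f,
                c3 = 0.743125f, c4 = 0.886227f, c5 = 0.247708f;
    return c4 * sh[0]
         + 2.0f * c2 * (sh[3] * n.x + sh[1] * n.y + sh[2] * n.z)
         + c1 * sh[8] * (n.x * n.x - n.y * n.y)
         + c3 * sh[6] * n.z * n.z - c5 * sh[6]
         + 2.0f * c1 * (sh[4] * n.x * n.y + sh[7] * n.x * n.z + sh[5] * n.y * n.z);
}

int main()
{
    // Made-up probe data standing in for coefficients a baking farm would produce.
    float probe[9] = { 1.0f, 0.0f, 0.3f, 0.0f, 0.0f, 0.0f, 0.1f, 0.0f, 0.0f };
    Vec3 up{ 0.0f, 0.0f, 1.0f };   // surface normal pointing up (+Z)
    std::printf("irradiance along +Z: %f\n", EvalSHIrradiance(probe, up));
    return 0;
}

Whether the result reads as convincing indirect lighting in the final game is exactly what's being argued about here; the technique itself is cheap enough at runtime that the visible quality mostly depends on probe density and how the baked data is blended.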
 
Perhaps they didn't promise, but the problem with showing a game too early and getting people excited (look at the opening posts in this thread) is that you set the expectations.

Expectations for PC owners, sure. Expectations for console owners? Not so much. Anything existing in a creative process that is shown early is going to be subject to change. Not everybody may understand this, but resolute ignorance of the creative process, especially in this day and age when developers and game directors are so accessible and willing to talk on the record about things, is less easy to forgive.

The game and gameplay are the same. The graphics on cheap consoles are not as good as PCs with graphics cards costing a sizeable chunk of that console. Shocker! ;)

Nope, they promised to deliver next-gen visuals with the Disrupt engine, they consistently showed excellent footage up to mid-2013, and then they flopped; there is nothing that absolves them of what they did.

I suggest that the reason they stopped showing PC footage was their marketing promotion deal with Sony. Feel free to rage about that if it makes you feel better.

What I take away from both of the interviews you linked is that you associate 'next generation' with graphics, whereas the dev team's emphasis is on gameplay. While they do talk graphics, and they mention wanting a full GI world with nothing pre-rendered, they talk a lot more about interactivity, density of the world, and seamless vast exteriors with detailed interiors - not to mention the important part about all of this scaling according to the platform.

"Watch Dogs is truly a next-gen game – not just in terms of offering cutting edge graphical performance on the next generation of consoles, but also when it comes to the gameplay, the immersion and the seamless online experience. And that’s due in large part to a great foundation: the Disrupt Engine."​

There's obviously no metric for "cutting edge graphical performance"; it's a meaningless and immeasurable claim, so congrats on being suckered by marketing. You're not the first.


Thanks, I hadn't seen this. Unfortunately the source video the article links to isn't accessible. I'm not wearing a tinfoil hat either, but I would not be surprised at a 1080p patch for PlayStation 4 within the first couple of weeks!
 
Exactly; actually NV is bad in OpenCL only because they dropped its driver support a long time ago. With DC they offer similar or better performance compared to AMD's equivalents. They also used DC in Ghosts (an NV-optimized game) to implement the fur simulation effects.

Dynamic environment reflections, hundreds of shadow casters and Global Illumination my ***.
1. I see reflections here: http://www.youtube.com/watch?v=o6WnHmRE0IY
2. They said "hundreds" of dynamic lights, but "dozens" of shadow casting lights.
3. There is indirect lighting. It's subtle but it's there.
 
1. I see reflections here: http://www.youtube.com/watch?v=o6WnHmRE0IY
2. They said "hundreds" of dynamic lights, but "dozens" of shadow casting lights.
3. There is indirect lighting. It's subtle but it's there.
1- Static reflections; water puddles only reflect certain street and car lights, and nothing else. True dynamic reflections, where characters, vehicles and particles are reflected, are not featured here.

2- Yes, dozens, and I don't see even a dozen shadow casters. Car lights don't cast shadows, and neither do muzzle flashes, explosions or even lightning. There is also a possibility that some flashlights don't cast them either, but we've yet to confirm that.

3- I don't see it, especially during daytime; it looks like the opposite, actually. Even direct lighting is lacking.

They even listed physically based rendering as a featured technique, what kind of crap is that?!!

There's obviously no metric for "cutting edge graphical performance"; it's a meaningless and immeasurable claim, so congrats on being suckered by marketing. You're not the first.
Exactly, marketing lies.

When you combine footage with excellent graphics and chit-chat about technological features in the engine, you are pretty much implying next-gen graphics. And when your results underdeliver, it only means you were lying.
 
When you combine footage with excellent graphics and chit-chat about technological features in the engine, you are pretty much implying next-gen graphics. And when your results underdeliver, it only means you were lying.
Just because you inferred something doesn't mean they implied it. They were very clear (in the interview you linked) about the graphical techniques that they considered important for the game, and these are present. Just because you expected more doesn't make them liars.

You need to develop a filter for marketing, or just read more carefully what folks actually state will be included in a product, or you stand to be disappointed a lot in life.
 
1- Static reflections; water puddles only reflect certain street and car lights, and nothing else. True dynamic reflections, where characters, vehicles and particles are reflected, are not featured here.
The interview says: "dynamic environment reflections", meaning environment reflections updated in realtime. This is shown in the cars while you drive through the city:

http://www.liveleak.com/view?i=2e9_1400371115

It's never stated that reflections are updated for every single object in the scene. In the other article it only says this about reflections: "The lights reflect off wet surfaces". That's true.
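As an aside, "dynamic environment reflections" of that sort are commonly done by re-rendering a small cube map around the player and sampling it on the car paint, often one face per frame to keep the cost bounded. A hypothetical sketch, assuming the face render targets and a scene-render callback already exist (all names here are made up, not from Watch_Dogs):

Code:
// Sketch of a round-robin dynamic reflection probe update in D3D11.
#include <d3d11.h>
#include <functional>

void UpdateReflectionProbe(ID3D11DeviceContext* ctx,
                           ID3D11RenderTargetView* faceRTVs[6],
                           const std::function<void(int face)>& RenderSceneFromCamera,
                           int frameIndex)
{
    // Updating one face per frame keeps the cost constant; the trade-off is
    // that reflections lag slightly behind the scene and only include whatever
    // the cheap "reflection pass" draws (often no characters or particles).
    int face = frameIndex % 6;
    ctx->OMSetRenderTargets(1, &faceRTVs[face], nullptr);
    RenderSceneFromCamera(face);   // draw a reduced version of the world into this face
}

That trade-off is also why a probe like this can look "dynamic" on cars while still missing characters, vehicles and particles, which is exactly the distinction being argued above.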

2- Yes, dozens, and I don't see even a dozen shadow casters. Car lights don't cast shadows, and neither do muzzle flashes, explosions or even lightning. There is also a possibility that some flashlights don't cast them either, but we've yet to confirm that.
We haven't seen the whole city yet, so perhaps some areas are true to that. Of course there's also the caveat that we don't know whether the numbers he's throwing out refer to what's rendered on screen or simply to the level design, in which case sure, there are dozens of shadow-casting lights throughout the map, they're just not rendered at the same time :LOL:

3- I don't see it, especially during daytime; it looks like the opposite, actually. Even direct lighting is lacking.
It's hard to see because the player wears a dark coat, but when using lighter clothing it's much more noticeable:

[Image: watch_dogs_beta_ps4-1l1usf.png]


Also, how is direct lighting lacking?

They even listed physically based rendering as a featured technique, what kind of crap is that?!!
Why is that crap? PBR is mostly related to surface shading (energy conservation and things like that) and content-authoring pipelines (proper albedo and specular values).
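To make the energy-conservation point concrete, here's a toy example (not the game's shader, just the textbook idea): the normalized Blinn-Phong factor keeps the total reflected energy roughly constant as the specular exponent rises, so highlights get tighter instead of simply dimmer.

Code:
// Toy illustration of "energy conservation" in a specular lobe:
// the classic normalized Blinn-Phong term (n + 8) / (8 * pi).
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Specular intensity for normal N, half-vector H and exponent n.
// Without the normalization term, raising n would make highlights lose
// energy; with it, the lobe narrows but its integral stays roughly constant.
float NormalizedBlinnPhong(Vec3 N, Vec3 H, float n)
{
    const float kPi = 3.14159265f;
    float norm = (n + 8.0f) / (8.0f * kPi);   // energy-conserving normalization
    return norm * std::pow(std::fmax(dot(N, H), 0.0f), n);
}

int main()
{
    Vec3 N{0.0f, 1.0f, 0.0f}, H{0.0f, 1.0f, 0.0f};
    const float exponents[] = { 16.0f, 64.0f, 256.0f };
    // The peak grows as the lobe narrows, keeping reflected energy consistent.
    for (float n : exponents)
        std::printf("exponent %6.0f -> peak %f\n", n, NormalizedBlinnPhong(N, H, n));
    return 0;
}

PBR in the broader sense is mostly this kind of bookkeeping plus authoring materials with plausible albedo/specular values, which is why it's a legitimate engine feature even if it isn't visually flashy on its own.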

Exactly, marketing lies.

When you combine footage with excellent graphics and chit-chat about technological features in the engine, you are pretty much implying next-gen graphics. And when your results underdeliver, it only means you were lying.
They've delivered on what they've been demoing this past year. If you expected more, that's on you.
 
WTH, the game will be bottlenecked to medium or high settings on a 3.2GHz quad-core Ivy Bridge? What are they doing with all that power?? I can only assume insane draw call levels. Shame we won't see a Mantle path for this.

EDIT: 9,000-10,000 is a very high PassMark score, requiring a very beefy CPU. For reference, an i7-3820 only scores 9022. That's a 3.6GHz quad-core Ivy Bridge with Hyper-Threading.


The 3820 is actually a Sandy Bridge-E, but yes: >9000 in PassMark is ridiculous as a CPU requirement for a game. The new consoles would probably get less than 3000.
 
The 3820 is actually a Sandy Bridge-E, but yes: >9000 in PassMark is ridiculous as a CPU requirement for a game. The new consoles would probably get less than 3000.
It's much easier to optimise for a CPU when you know the specifics of the instruction timings, cache sizes and memory performance. There are simply too many variations of these across all the Intel and AMD processors in gaming PCs. Plus they could also be using compute in places on consoles.

But yeah, the CPU requirements for the PC version are certainly eyebrow raising.

Having seen a ton of leaked PlayStation 3 footage, graphics aside, it's not obviously missing much except pedestrian and car density.
 
It's much easier to optimise for a CPU when you know the specifics of the instruction timings, cache sizes and memory performance. There are simply too many variations of these across all the Intel and AMD processors in gaming PCs.

I'd say it's much more likely to be API overhead. Why would optimisations to the things you mentioned not transfer back to PC CPUs with higher clock speeds, bigger caches, faster memory etc. when they are all using the same instruction sets?
 
I'd say it's much more likely to be API overhead. Why would optimisations to the things you mentioned not transfer back to PC CPUs with higher clock speeds, bigger caches, faster memory etc. when they are all using the same instruction sets?
But the time an instruction takes to complete can (and will) vary across processor architectures, as will the size of the cache and the cost of hitting DRAM.

There is a crazy amount of variation across x86 in just the last 3-4 years, and it's incredibly difficult to optimise for them all. On a console, you know exactly how many cycles each instruction takes, you can optimise routines to fit the cache, and you know the cost of addressing DRAM.
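A quick way to see how much the memory side matters: the exact same summation done with a cache-friendly versus a cache-hostile access pattern. This is only a generic illustration, and the gap will vary across all those x86 variants mentioned above.

Code:
// Same arithmetic, two access patterns: sequential vs strided over a 64 MB matrix.
#include <chrono>
#include <cstdio>
#include <vector>

int main()
{
    const size_t N = 4096;                       // 4096 x 4096 ints = 64 MB
    std::vector<int> m(N * N, 1);
    long long sum = 0;

    auto time_it = [&](bool rowMajor)
    {
        sum = 0;
        auto t0 = std::chrono::steady_clock::now();
        for (size_t i = 0; i < N; ++i)
            for (size_t j = 0; j < N; ++j)
                sum += rowMajor ? m[i * N + j]   // sequential: prefetcher-friendly
                                : m[j * N + i];  // strided: roughly a cache miss per access
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count();
    };

    std::printf("row-major:    %.1f ms\n", time_it(true));
    std::printf("column-major: %.1f ms (sum=%lld)\n", time_it(false), sum);
    return 0;
}

On a fixed console you can tune data layout against a known cache hierarchy once; on PC you're tuning against a moving target, which is the point being made here.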

Accepting that DirectX has a CPU overhead, but wow, that'd be some crazy overhead! ;)
 
Accepting that DirectX has a CPU overhead, but wow, that'd be some crazy overhead! ;)

I'd personally have thought it would be far more crazy to gain a 3x performance increase on one x86 architecture over another from low-level optimisation. We don't get anything close to that on the GPU side, which I would have thought would have more scope for that type of optimisation.

On the other hand, we already know the API is a huge overhead for PC games, especially in relation to draw calls. And a big, very dense open-world game like Watch Dogs is presumably going to be unusually heavy on draw calls.
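For reference, the usual mitigation on the PC side is batching: collapsing many per-object draw calls into one instanced call so the per-call driver/runtime cost is paid once. A hypothetical D3D11 sketch (the function and parameter names are made up, not from Watch Dogs; it assumes the shared mesh, shaders and per-instance data are already bound):

Code:
// Draw-call batching via instancing in D3D11.
#include <d3d11.h>

void DrawCityProps(ID3D11DeviceContext* ctx, UINT indexCountPerProp, UINT propCount)
{
    // Naive version: propCount separate draw calls, each paying the
    // driver/runtime overhead that adds up in dense open-world scenes.
    //   for (UINT i = 0; i < propCount; ++i)
    //       ctx->DrawIndexed(indexCountPerProp, 0, 0);

    // Instanced version: one call submits every copy; per-instance transforms
    // come from a vertex buffer declared with D3D11_INPUT_PER_INSTANCE_DATA.
    ctx->DrawIndexedInstanced(indexCountPerProp, propCount, 0, 0, 0);
}

Batching only helps where objects genuinely share geometry and materials, though, so a varied city street still ends up with a lot of unavoidable calls.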
 
On the other hand, we already know the API is a huge overhead for PC games, especially in relation to draw calls. And a big, very dense open-world game like Watch Dogs is presumably going to be unusually heavy on draw calls.
I was under the impression that DirectX was still limited in the multithreaded department, i.e. you can do it, but the game needs synchronisation between the calling threads, which I thought was a limiting factor. I.e., if there is still an issue with multithreading, why are quad-core CPUs the preferred baseline?

Happy to be corrected! :)
 
I was under the impression that DirectX was still limited in the multithreaded department, i.e. you can do it, but the game needs synchronisation between the calling threads, which I thought was a limiting factor. I.e., if there is still an issue with multithreading, why are quad-core CPUs the preferred baseline?

Happy to be corrected! :)

Games can scale across multiple threads, but as far as I'm aware the scaling isn't balanced across all cores, and so the game could still be bottlenecked by the primary thread. But of course that's not the only performance limitation of DX11.
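Roughly what that looks like in D3D11: worker threads record into deferred contexts in parallel, but the resulting command lists are still played back serially on the immediate context, which is why the main render thread can stay the bottleneck. A hypothetical sketch, assuming a device and immediate context already exist:

Code:
// Multithreaded command recording with D3D11 deferred contexts.
#include <d3d11.h>
#include <thread>
#include <vector>

void RecordAndSubmit(ID3D11Device* dev, ID3D11DeviceContext* immediate)
{
    const int kWorkers = 4;
    std::vector<ID3D11CommandList*> lists(kWorkers, nullptr);
    std::vector<std::thread> threads;

    for (int i = 0; i < kWorkers; ++i)
    {
        threads.emplace_back([&, i]
        {
            // Each worker records into its own deferred context in parallel.
            ID3D11DeviceContext* deferred = nullptr;
            dev->CreateDeferredContext(0, &deferred);

            // ... issue state changes and Draw*() calls on 'deferred' here ...

            // Bake the recorded calls into a command list for later playback.
            deferred->FinishCommandList(FALSE, &lists[i]);
            deferred->Release();
        });
    }
    for (auto& t : threads) t.join();

    // Playback still happens serially on the immediate context, so the
    // main render thread remains a potential bottleneck.
    for (auto* cl : lists)
    {
        immediate->ExecuteCommandList(cl, FALSE);
        cl->Release();
    }
}

So recording scales with cores, submission doesn't, which fits the "quad-core preferred but one thread still dominates" pattern discussed above.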
 