Watch Dogs.

I think the two main concerns are 1) it runs quite a bit better on NVIDIA, and 2) don't exceed your VRAM capacity. What I find crazier is that Wolfenstein (OpenGL, id Tech 5) apparently runs best on AMD. ;)

Anyway I'm sure there will be patches for WD, and new driver tweaks down the road too.
 
What is sufficient on the CPU side? I have a i7-3770 which I think should be enough.
 
What is sufficient on the CPU side? I have a i7-3770 which I think should be enough.
I haven't played it on anything other than a 4.3GHz 3570K, and it runs very well as long as the GPU settings are in check for my 6950. The GPU is definitely my bottleneck.
 
What is sufficient on the CPU side? I have a i7-3770 which I think should be enough.

More than enough. Someone posted a set of benchmarks in another of the many Watch Dogs threads covering a particularly heavy CPU sequence, and while it was indeed very heavy, the 2500K was still hitting above 30fps as a minimum framerate (I can't remember the average).

Still, it's crazy that any game can bog down a CPU that much. DX12 can't come fast enough.
 
I think the two main concerns are 1) it runs quite a bit better on NVIDIA, and 2) don't exceed your VRAM capacity. What I find crazier is that Wolfenstein (OpenGL, id Tech 5) apparently runs best on AMD. ;)

That doesn't make sense. They didn't implement the CUDA transcoding for Wolfenstein?
 
Also, NVIDIA's OpenGL drivers are supposed to be the gold standard in the industry. Maybe something the id Tech 5 engine is doing is very well suited to GCN.
 
btw Watch Dogs' engine is not all bad. It has very versatile resolution support.

Just put in any arbitrary numbers and Watch Dogs will run at the correct aspect ratio. Playing portrait? It's okay. Playing at the PSP resolution of 480x272 pixels? It's fine :D Awesome.

Too bad the GUI can't be rendered independently of the game resolution. It would be perfect if I could run the game at 480x480 (this gives me a rather good picture at 30fps) and the GUI at 1920x1080.
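The "any arbitrary resolution, correct aspect ratio" behavior described above is typically achieved with Hor+ FOV scaling: the vertical field of view is held fixed and the horizontal FOV is derived from the window's aspect ratio, so portrait, square, and ultrawide windows all stay proportioned. A minimal sketch of the idea (the 90° baseline HFOV and 16:9 baseline aspect are illustrative assumptions, not Watch Dogs' actual values):

```python
import math

def horplus_hfov(width, height, base_hfov_deg=90.0, base_aspect=16 / 9):
    """Hor+ scaling: keep the vertical FOV fixed and derive the
    horizontal FOV from the window's aspect ratio, so any arbitrary
    resolution (portrait, square, PSP-sized) keeps correct proportions."""
    # vertical FOV implied by the baseline horizontal FOV and aspect
    vfov = 2 * math.atan(math.tan(math.radians(base_hfov_deg) / 2) / base_aspect)
    # widen or narrow the horizontal FOV to match the actual aspect
    aspect = width / height
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * aspect))
```

At the baseline 16:9 this returns the baseline 90°; a square 480x480 window gets a narrower horizontal FOV rather than a stretched image, which matches the behavior the post describes.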
 
BTW NVIDIA drivers make me all warm and fuzzy inside. If I go to adjust a game's profile and it's not already in the list, I click Add and the last game I played is at the top of the list. So wonderful compared to the, eh, rather basic experience I get on my 7950. It wasn't always this way. NV has dramatically improved things in the last couple years.
 
BTW NVIDIA drivers make me all warm and fuzzy inside. If I go to adjust a game's profile and it's not already in the list, I click Add and the last game I played is at the top of the list. So wonderful compared to the, eh, rather basic experience I get on my 7950. It wasn't always this way. NV has dramatically improved things in the last couple years.
Yeah. Something I've noticed is that GeForce Experience suggests sane/useful settings even for my "old" 560 Ti, whereas that AMD Gaming Evolved (Raptr) program often recommends quite low-end settings for games on my faster 6950.

Eh careful I might start on a crazy tirade about AMD's support of older cards....
 
Eh careful I might start on a crazy tirade about AMD's support of older cards....

While you're on the subject, my even older GTX 260 gets very good support even in modern titles (that don't require DX11, obviously). It was a difficult choice at the time between the 260 and a 1GB 4770, but I'm confident I made the right choice.
 
Yeah, NV's driver support is white hot at the moment. The latest release implements some sort of shader caching system that reduces game load times. Pretty awesome.
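A driver shader cache like the one mentioned works roughly like a content-addressed disk cache: compiled binaries are keyed by a hash of the shader source, so the expensive compile only happens on a cache miss. A minimal sketch of the concept (the cache directory and `compile_fn` are hypothetical stand-ins, not NVIDIA's actual mechanism):

```python
import hashlib
import pathlib

CACHE_DIR = pathlib.Path("shader_cache")  # hypothetical on-disk cache location

def get_shader_binary(source: str, compile_fn) -> bytes:
    """Return a compiled shader binary, compiling only on a cache miss.
    compile_fn stands in for the driver's (slow) shader compiler."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    key = hashlib.sha256(source.encode()).hexdigest()
    path = CACHE_DIR / key
    if path.exists():                # cache hit: skip compilation entirely
        return path.read_bytes()
    binary = compile_fn(source)      # cache miss: compile once...
    path.write_bytes(binary)         # ...then persist for future loads
    return binary
```

On a second launch every lookup is a hit, which is where the reduced load times come from.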
 
hidden Watch Dogs config (referring to the Xbox One)

http://pastebin.com/LxK2aLB5

[image: isYVIFkt6ZqYl.jpg]

http://i3.minus.com/iLwctJ7vAIIlD.jpg
http://i7.minus.com/ibsp1JyzojivGU.jpg

http://www.dsogaming.com/screenshot...gen-possible-extra-pc-enhancements-uncovered/


edit:
Yeehaw, those guys on XENTAX have successfully extracted the WD files:
http://forum.xentax.com/viewtopic.php?f=10&t=11534&sid=b0858a7d89a6dfb93c9c0953bc1b0665&start=15

mods mods mods come faster mods :D


Edit2:

I can't verify that the hidden settings work because my tablet PC is too slow.. but looking at WD's memory, it does have "nextgen" and Durango configs.

[image: ABpCGOa.png]
 
If these settings turn out stable when enabled, will anyone still doubt that the PC version (and maybe even the newer gens?) was deliberately castrated to put the older-gen consoles in a better light for comparison?
 
So you think they didn't want to risk killing their old-gen sales if consumers saw the next-gen version looking so good and decided to wait until they picked up a new console? I can see Ubisoft doing something like that.
 
Old gens have a much larger userbase, so making the game look bad on them could hamper its sales.

We're probably looking at orders from the publisher: people who only care about their Excel tables and couldn't care less about long-run customer satisfaction or respecting the work of developers.
 
It's an interesting development and that first shot obviously looks spectacular but I think we need a lot more evidence (shots, video and benchmarks) before we can pin any hope on this.
 
Hmm, the shadowing in WD is weird..

This man casts a soft shadow because he's hit by light from a car far behind him, but he casts no shadow when hit by a car right beside him :/

In the screenshot it's hard to see, but in motion the dark patch moves with a foot-like shape.

[image: 61n0AWM.jpg]
 
The PC CPU demands are bizarre. I wonder why this sort of thing happens. The game runs on some very weak CPUs in the consoles. I haven't looked at how threaded it is yet...

AMD is a lot worse with CPU usage, it seems.

[image: LAVkEWp.png]


There's only a small gain from HT compared to NVIDIA, and a way higher impact from CPU clock speed:

http://www.pcgameshardware.de/Watch...s/Watch-Dogs-Test-GPU-CPU-Benchmarks-1122327/


I'm being forced to use low LOD because of my Sandy Bridge Core i3 + AMD graphics, and it still performs poorly. At least it keeps my GPU cool, so I can enable MSAA and some other things for "free".
 