Watch Dogs.

The PC version is being slammed pretty hard by users on Metacritic.

Disappointing graphics + problems with AMD cards + Uplay being down and not letting people download/play the game = a perfect storm on the PC side.
 
Wow it's getting trashed hard there. Performance, boring gameplay, glitches and poor graphics being the main complaints.
 
The screenshots on the actual Steam store page don't really seem to match what we're seeing. Example:

[Steam screenshot: ss_0ce6b9f9a11b038f9d1e5dcf12a5d9a9043e6c16.1920x1080.jpg]
 
Some of those comments are pretty stupid (on Metacritic). It really does annoy the hell out of me when some PC gamers whack everything up to maximum on their mid range graphics cards and then whine about performance. FFS just lower your graphics settings to reasonable levels and stop making PC gamers and the PC platform in general look bad!

For the record, this game runs fine on PC with console level hardware at console settings. The problem is PC gamers expecting too much from their hardware. Okay, admittedly the higher quality settings are horribly optimised too but when has that ever not been the case?
 
Yep, that "screenshot" on the Steam page makes the actual game really hard to swallow.


Some of those comments are pretty stupid (on Metacritic). It really does annoy the hell out of me when some PC gamers whack everything up to maximum on their mid range graphics cards and then whine about performance.

I do know that some Metacritic reviewers/voters are plain old trolls, but I believe a good number of critics are just comparing its performance with better-looking games in the genre (GTA IV, Sleeping Dogs), which makes it a legitimate complaint.

Then there are also AMD users who are aware of the GameWorks situation.
 
I do know that some Metacritic reviewers/voters are plain old trolls, but I believe a good number of critics are just comparing its performance with better-looking games in the genre (GTA IV, Sleeping Dogs), which makes it a legitimate complaint.

Then there are also AMD users who are aware of the GameWorks situation.

I think at the highest settings there's certainly a valid argument about poor graphics versus performance cost.

However, the general "performance is crap on my high-end PC" comments just add fuel to the fire for those who misrepresent the PC platform as poor-performing in general compared to consoles. The game actually performs very well relative to consoles at the same settings (GPU-wise, not CPU-wise), but if that goes ignored, people get scared away from the PC version in favour of the console version even if they have high-end GPUs, purely because they've "heard it runs bad on high-end PCs".
 
To be fair, Watch Dogs has crazy VRAM requirements, and people with high-end PCs automatically get crazy stutter because they assume their VRAM is enough.

Lower the textures to medium and it works fine again.

Then there's the problem of CPU and GPU underutilization in Watch Dogs. The workload is spread across cores, but in total it only reaches about 50% CPU usage. GPU usage usually hovers around 70%.


More developers should follow what Rockstar did in GTA IV: show an indicator in the graphics options menu of whether your PC's VRAM is enough for the selected settings.
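Such an indicator is basically a budget check: estimate the memory footprint of the chosen settings and compare it to the card's capacity. Here's a minimal sketch of the idea in Python; every size and pool number below is an illustrative placeholder, not a real figure from Watch Dogs or GTA IV:

```python
# Hypothetical GTA IV-style VRAM indicator: estimate the footprint of the
# selected settings and compare it to the card's capacity.
# All sizes are made-up placeholders for illustration only.

def estimate_vram_mb(resolution, texture_quality, msaa_samples):
    width, height = resolution
    # Render targets: back buffer + depth + a few G-buffer targets,
    # assumed 4 bytes per pixel each; MSAA multiplies per-sample storage.
    render_targets = 5
    framebuffer = width * height * 4 * render_targets * max(1, msaa_samples)
    # Illustrative texture-pool sizes per quality level, in MB.
    texture_pool_mb = {"low": 512, "medium": 1024, "high": 2048, "ultra": 3072}
    return framebuffer / (1024 * 1024) + texture_pool_mb[texture_quality]

def vram_indicator(card_vram_mb, resolution, texture_quality, msaa_samples):
    needed = estimate_vram_mb(resolution, texture_quality, msaa_samples)
    status = "OK" if needed <= card_vram_mb else "OVER BUDGET"
    return f"{needed:.0f} MB / {card_vram_mb} MB [{status}]"

print(vram_indicator(2048, (1920, 1080), "high", 1))  # 2GB card, high textures
```

With these made-up numbers, a 2GB card at 1080p with high textures would read as over budget, while the same settings on a 3GB card would show OK, which mirrors the thread's experience.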
 
The PC CPU demands are bizarre. I wonder why this sort of thing happens. The game runs on some very weak CPUs in the consoles. I haven't looked at how threaded it is yet...

As for the bitching that's everywhere... well every game gets this. Lots of loudly negative people in the world. I'm enjoying the game a lot. It's not the second coming of virtual life or whatever but it is a good time.

To be fair, Watch Dogs has crazy VRAM requirements, and people with high-end PCs automatically get crazy stutter because they assume their VRAM is enough.

Yeah, I've got my 6950 2GB running high textures but medium LOD. It stutters too much otherwise. You ideally want at least 3GB for this game, and I have no doubt that most PC gamers have no idea how to determine this. A 1GB card is going to be trouble.

Wolfenstein and Thief are similar to WD in their VRAM demands btw. 2GB is no longer high end and 1GB is a real problem.
 
The PC CPU demands are bizarre. I wonder why this sort of thing happens. The game runs on some very weak CPUs in the consoles.

API overhead, pure and simple. We've already been warned by developers (some on this board) that draw calls were going to become a serious problem for PCs this generation, and that's exactly what we're seeing. Luckily, Mantle and DX12 are here to save the day, but we're going to have to tough it out until they hit mainstream at the end of next year.

Regarding the memory requirements, it seems High is as good as it gets for 3GB cards, but even then resolution and AA need to be kept reasonable.

What resolution and AA are you running at? Do the stutters go away if you put LOD back to high but drop resolution to 900p and AA to post processing?
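Some back-of-the-envelope numbers on why dropping resolution and avoiding MSAA eases the memory pressure. This assumes a simplified model of flat 4-bytes-per-pixel render targets (real engines use varied formats and compression), but the proportions hold:

```python
# Simplified render-target memory model: 4 bytes per pixel, and MSAA
# stores one value per sample, multiplying per-target storage.
def target_bytes(width, height, bytes_per_pixel=4, samples=1):
    return width * height * bytes_per_pixel * samples

MB = 1024 * 1024
print(round(target_bytes(1920, 1080) / MB, 1))             # 7.9 MB per target at 1080p
print(round(target_bytes(1600, 900) / MB, 1))              # 5.5 MB at 900p (~30% less)
print(round(target_bytes(1920, 1080, samples=4) / MB, 1))  # 31.6 MB with 4x MSAA
```

Multiply by however many targets a deferred renderer keeps alive and the gap adds up fast. Post-process AA like the game's SMAA works on the already-resolved image, which is why it's so much cheaper in memory than MSAA.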
 
But the game refuses to use 100% CPU. It's weird. The only games like this were Titanfall, the Left 4 Dead series, and the Assassin's Creed series.

Games like GTA IV and the Battlefield series easily eat almost 100% CPU constantly.
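One possible explanation for that ceiling (my assumption, not anything the developers have confirmed) is a serial bottleneck: if the worker threads spend part of each frame waiting on one saturated thread, total utilization caps out well below 100% no matter how many cores you have. A toy calculation with made-up per-thread loads:

```python
# Toy model of a serial bottleneck: one fully busy thread feeds workers
# that idle while waiting on it, so each worker is only partially busy.
# The per-thread load figures below are hypothetical, for illustration.
def total_cpu_utilization(cores, bottleneck_load, worker_loads):
    busy_cores = bottleneck_load + sum(worker_loads)
    return busy_cores / cores * 100

# Hypothetical 8-core CPU: one pegged main thread, six workers ~55% busy.
print(round(total_cpu_utilization(8, 1.0, [0.55] * 6), 1))  # 53.8
```

That lands right around the ~50% total usage people are reporting, even though the work *looks* spread across cores in a monitor.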
 
If Metacritic users are being harsh, Steam users are being brutal:
http://steamcommunity.com/app/243470/reviews/

Keep in mind that only users who own the game can review it, and each review shows the number of hours spent in-game.

Here's the most interesting review I found, via Kotaku:

[Image: pqZQfc6.jpg]

:D
 
What resolution and AA are you running at? Do the stutters go away if you put LOD back to high but drop resolution to 900p and AA to post processing?
1920x1080 with temporal SMAA. High textures. Medium LOD, medium shadows, medium reflections, MHBAO, disabled DOF, disabled motion blur, High shaders, High Water. Vsync 1 and 3 GPU buffers. This runs quite nicely.

I haven't tried 1600x900. MSAA misses too much aliasing so am not using it.
 
API overhead, pure and simple. We've already been warned by developers (some on this board) that draw calls were going to become a serious problem for PCs this generation, and that's exactly what we're seeing. Luckily, Mantle and DX12 are here to save the day, but we're going to have to tough it out until they hit mainstream at the end of next year.

Regarding the memory requirements, it seems High is as good as it gets for 3GB cards, but even then resolution and AA need to be kept reasonable.

What resolution and AA are you running at? Do the stutters go away if you put LOD back to high but drop resolution to 900p and AA to post processing?

Methinks they could have done a lot more optimization on the DX11 version if they'd wanted to. You can pack quite a bit into a single draw call these days.

Regarding VRAM requirements, it's really unfortunate that NVIDIA is still selling "high end" cards with 2GB. The GTX760 and 770 should have been 4GB cards, but then NV would have had to make the 780 a 6GB card and I doubt they wanted to do that.
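On the "pack quite a bit into a single draw call" point: the standard trick is batching/instancing, where objects sharing the same mesh and material are submitted as one instanced draw instead of one call each. A toy illustration (the scene contents and names are made up, not from the actual game):

```python
from collections import defaultdict

# Naively, each object costs one draw call.
def naive_draw_calls(objects):
    return len(objects)

# Batched: group objects by (mesh, material) and issue one instanced
# draw call per group, passing the transforms as instance data.
def batched_draw_calls(objects):
    batches = defaultdict(list)
    for obj in objects:
        batches[(obj["mesh"], obj["material"])].append(obj["transform"])
    return len(batches)

# A hypothetical street scene: 500 identical lamp posts, 300 identical
# cars, and one unique hero mesh.
scene = ([{"mesh": "lamp", "material": "metal", "transform": i} for i in range(500)]
         + [{"mesh": "car", "material": "paint", "transform": i} for i in range(300)]
         + [{"mesh": "hero", "material": "coat", "transform": 0}])
print(naive_draw_calls(scene), "->", batched_draw_calls(scene))  # 801 -> 3
```

Open-world city scenes are full of repeated props like this, which is exactly why the DX11 draw-call budget didn't have to be the wall it apparently was here.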
 
This is exactly what I've been fearing this generation, and so far that's been the case with the latest releases. The developers just aren't bothering to really optimize for the PC, and they put out stupidly high CPU requirements to cover themselves. We're left with draw-call limitations on hardware that can do so much more than the consoles.

So once again PC gamers are going to be shafted until Mantle/DX12 are actually released AND developers start taking advantage of them. So another few years?
 