Digital Foundry Article Technical Discussion Archive [2014]

There's a difference between targeting the hardware too high and performance issues with the build. The former can't be fixed; the latter can be patched. If you can't understand the difference, or just insist on mixing them up, keep on trucking.
 
Hence I haven't had one issue or crash in PC gaming in years: none in 2012, none in 2013 and none so far in 2014.
I can't even begin to comprehend that. I can't think of any software, simple or complex, on any platform that I could use for years without seeing a single crash or issue. Maybe the timer on my oven.

Look in the support boards for any recent game. Ubisoft's Watch Dogs support board, for example, has posts tagged by platform, and I'd say a good 85% are PC problems. That's with a considerably smaller userbase too, no doubt.
 
I think that auto-resolved itself for me for the simple reason that I almost never play games at launch anymore; I simply don't have the time. I add them to my wishlist and get around to them eventually.
That's one way to solve it, and no doubt it works out cheaper as well!

I'd like to be there when you present this solution for PC launch game niggles to major publishers like Ubisoft, Activision and EA ;)
 
It seems the stuttering is unrelated to VRAM; even a 6GB Titan suffers the same problem: major stuttering at Ultra, and minor stuttering at High, even at Medium. Something in this game is broken.

AMD performance is far worse too.
http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off

EDIT: The idiots at Ubisoft seem to think the problem is the lack of unified memory on PC, as the game consumes more than 3GB for video data on consoles (at High settings, naturally). It's like they've never coded for PCs before!! Everybody knows PCs don't have unified memory, but developers work around that and deliver an even better experience!!

Also, at 720p and Ultra settings I never once saw the game exceed 2.9GB of VRAM, same with the 6GB Titans. So the issue is definitely not the RAM here; it is the streaming itself.

https://twitter.com/SebViard/status/472030606420115458
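
For anyone who wants to check the VRAM claim on their own card, here's a minimal sketch of a logger that polls nvidia-smi once a second while you play. It assumes an Nvidia GPU with nvidia-smi on the PATH and only reads the first GPU; it's purely illustrative, not any official tool.

```python
# Hedged sketch, not a real tool: poll nvidia-smi once a second and log
# VRAM usage, so you can check the "never exceeds 2.9GB at 720p/Ultra"
# observation yourself. Assumes an Nvidia GPU with nvidia-smi on the
# PATH; reads the first GPU only.
import subprocess
import time

def log_vram(interval_s: float = 1.0) -> None:
    while True:
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ]).decode()
        first_gpu = out.strip().splitlines()[0]  # one line per GPU
        used_mb, total_mb = (int(v) for v in first_gpu.split(","))
        print(f"VRAM: {used_mb} / {total_mb} MiB")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_vram()
```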

 
I think that auto-resolved itself for me for the simple reason that I almost never play games at launch anymore; I simply don't have the time. I add them to my wishlist and get around to them eventually. Hence I haven't had one issue or crash in PC gaming in years: none in 2012, none in 2013 and none so far in 2014.

We should also remember that it's not like consoles are trouble-free either. People like to pretend that current consoles are plug and play like the old 8-bit NES, but they aren't. While I have had zero issues or crashes in PC gaming in years, I read about all the issues people have with consoles on the forums, including game crashes, game lockups, consoles overheating, and on and on. People on console are also having to tolerate issues with their games; it's not like console gaming back in the day, where you plugged in the game and everything always worked. Those days are long gone.

Dubious.

I play Payday 2 and Starcraft II very often, and I still occasionally experience a glitch or crash. Never mind other titles like Skyrim or Fallout.

It's not a serious problem or anything, but overall I'd describe it as 'routine'. Every so often things just don't go right.

I've just accepted this as part of gaming on PC. Or doing anything on a PC, really; most stuff will just crash and burn inexplicably sometimes.

Which has, in spite of my otherwise eternal misgivings about most things Apple, made me appreciate OS X, which I use professionally (and have for years). Naturally I'm enthralled with Linux, and the gaming exodus we're seeing toward it has me very excited.
 
It seems the stuttering is unrelated to VRAM; even a 6GB Titan suffers the same problem: major stuttering at Ultra, and minor stuttering at High, even at Medium. Something in this game is broken.

AMD performance is far worse too.
http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off

EDIT: The idiots at Ubisoft seem to think the problem is the lack of unified memory on PC, as the game consumes more than 3GB for video data on consoles (at High settings, naturally). It's like they've never coded for PCs before!! Everybody knows PCs don't have unified memory, but developers work around that and deliver an even better experience!!

Also, at 720p and Ultra settings I never once saw the game exceed 2.9GB of VRAM, same with the 6GB Titans. So the issue is definitely not the RAM here; it is the streaming itself.

https://twitter.com/SebViard/status/472030606420115458

Yeah, if stuttering is occurring even on 6GB cards at 1080p with post-process AA, then the problem sounds like it's with the streaming after all. I stand corrected on that one. It does sound like the majority of the problem can still be mitigated by a careful choice of settings, though.

Personally, I'd be doing everything possible to lock the game at 30fps, given that 60 appears to be out of reach at decent settings. So on a high-end Nvidia GPU, perhaps run everything maxed out apart from textures, which you'd keep at High, and then pile on the AA until you sync at a solid 30fps (rather than jumping between 60 and 30). From the sounds of it, that eliminates all of the stutter.
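
For what it's worth, the logic of a 30fps cap is just a fixed frame-time budget. In practice you'd use the driver's half-refresh vsync or an external limiter like RTSS; the sketch below only illustrates the idea, with render_frame and should_quit as hypothetical callbacks standing in for the game's loop.

```python
# Illustration only: a 30fps lock is a fixed 33.3 ms budget per frame.
# render_frame and should_quit are hypothetical callbacks, not any real
# game's API.
import time

TARGET_FRAME_TIME = 1.0 / 30.0  # 33.3 ms budget per frame

def frame_limited_loop(render_frame, should_quit):
    deadline = time.perf_counter()
    while not should_quit():
        render_frame()
        deadline += TARGET_FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # sleep off the unused budget
        else:
            # A slow frame blew the budget: reset the deadline rather
            # than "catching up" with a burst of fast frames, since that
            # burst is exactly the judder a cap is meant to avoid.
            deadline = time.perf_counter()
```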
 
Yeah, if stuttering is occurring even on 6GB cards at 1080p with post-process AA, then the problem sounds like it's with the streaming after all. I stand corrected on that one. It does sound like the majority of the problem can still be mitigated by a careful choice of settings, though.

Personally, I'd be doing everything possible to lock the game at 30fps, given that 60 appears to be out of reach at decent settings. So on a high-end Nvidia GPU, perhaps run everything maxed out apart from textures, which you'd keep at High, and then pile on the AA until you sync at a solid 30fps (rather than jumping between 60 and 30). From the sounds of it, that eliminates all of the stutter.

Seems Ubisoft have messed up here, given that daft levels of VRAM still suffer stuttering, but I see your point about PC settings also. I believe this is a hangover from 8 years of up-ports, which could be easily whacked to 'Ultra' on most cards. The idea of cross-platform games that could test a PC is something that hasn't been true for a long time. Even Crysis 2 on the PC, with its Ultra settings, wasn't that big of a deal unless you're into concrete barriers with eleventy million tessellated polys. I suspect there shall be much gnashing of teeth as we all get used to high-end cards actually offering an IQ advantage at 1080p again (as opposed to being a thing for 1440p or 4K gaming).
 
Yeah, if stuttering is occurring even on 6GB cards at 1080p with post-process AA, then the problem sounds like it's with the streaming after all. I stand corrected on that one. It does sound like the majority of the problem can still be mitigated by a careful choice of settings, though.

Personally, I'd be doing everything possible to lock the game at 30fps, given that 60 appears to be out of reach at decent settings. So on a high-end Nvidia GPU, perhaps run everything maxed out apart from textures, which you'd keep at High, and then pile on the AA until you sync at a solid 30fps (rather than jumping between 60 and 30). From the sounds of it, that eliminates all of the stutter.

They should just fix the bloody game before they release it. Considering the supposed power surplus, the PC should have no issues running this game.
 
Dubious.

I play Payday 2 and Starcraft II very often, and I still occasionally experience a glitch or crash. Never mind other titles like Skyrim or Fallout.

It's not a serious problem or anything, but overall I'd describe it as 'routine'. Every so often things just don't go right.

I've just accepted this as part of gaming on PC. Or doing anything on a PC, really; most stuff will just crash and burn inexplicably sometimes.

Which has, in spite of my otherwise eternal misgivings about most things Apple, made me appreciate OS X, which I use professionally (and have for years). Naturally I'm enthralled with Linux, and the gaming exodus we're seeing toward it has me very excited.

Nothing dubious about it, man; that's my real-world experience: not one issue in years. The main differences are that I don't play launch games anymore, which probably has a lot to do with it; I don't use craptacular OSes like XP; I use Windows 8, which has been rock solid for me; I don't manually tweak ini settings in games to activate features that have never gone through QA; I don't do extreme overclocks and expect them to work perfectly; and I don't use crappy power supplies. The power supply is one of those things people often overlook, but a cheap one can have sinister random consequences. A no-name one used to give me random crashes and reboots, but only when using Vegas Pro software; replacing it with a Corsair Gold 850 some years ago made all the issues with Vegas Pro go away.

Incidentally, I've used OS X a decent amount; now that is a crappy OS! Display corruption when connecting multiple displays, lockups when printing, the OS crashing when pulling out the USB-to-Ethernet adapter, sound suddenly not working anymore and requiring a reboot, primitive driver support, and on and on; the problems were endless, and none of them occurred with Windows 8 installed on the same machine with Boot Camp. I already knew the "It Just Works" mantra for OS X was full of it just from using it myself, but it was doubly confirmed when my wife worked for Apple. Windows 8 on my same MacBook Pro with Boot Camp has been far more stable than OS X; in fact it's never crashed once, which I can't say for OS X, in spite of using it far less than Windows 8.


They should just fix the bloody game before they release it. Considering the supposed power surplus, the PC should have no issues running this game.

Given how much they had to downgrade the game, it seems like they must have run into various development issues and/or time constraints and just had to release the game to start recouping cash. It'll probably work fine by the time I get around to it, though :) Someone should try it with an SSD to see if that helps with the streaming issues. Now I'm going to get back to playing Remember Me... yeah, that's how far behind I am on games.
 
Yeah, if stuttering is occurring even on 6GB cards at 1080p with post-process AA, then the problem sounds like it's with the streaming after all. I stand corrected on that one. It does sound like the majority of the problem can still be mitigated by a careful choice of settings, though.
Yay, another opportunity to play with the settings and experiment! ;)

This is kind of my whole point. I don't want to do this. The PC argument is always 'you can have whatever performance you are willing to invest in' (and tweak), and while that's entirely true, and while I actually quite liked doing it when I was younger and single and had more time than I knew what to do with, I don't anymore. So spending time tweaking and testing settings, and experimenting with how far I can overclock the CPU and GPU, means less game time, which is already limited.

The answer for me is a console - I let the developers balance and tweak for a set piece of hardware. The answer for others may be different.
 
Yay, another opportunity to play with the settings and experiment! ;)

This is kind of my whole point. I don't want to do this. The PC argument is always 'you can have whatever performance you are willing to invest in' (and tweak), and while that's entirely true, and while I actually quite liked doing it when I was younger and single and had more time than I knew what to do with, I don't anymore. So spending time tweaking and testing settings, and experimenting with how far I can overclock the CPU and GPU, means less game time, which is already limited.

The answer for me is a console - I let the developers balance and tweak for a set piece of hardware. The answer for others may be different.

So much this. I used to run benchmarks for every setting, then for combinations of those settings (to see if any combination had extra detail that came for "free"). That was part of the fun then, though. Now it's not fun anymore.
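
For the curious, that old ritual boils down to a grid sweep over the settings space. A rough sketch, assuming a hypothetical run_benchmark(settings) helper that launches the game's built-in benchmark with those settings and returns an average fps (no real game exposes exactly this, and the settings space below is made up for illustration):

```python
# Rough sketch of the "benchmark every combination" ritual. The helper
# run_benchmark(settings) is hypothetical, and SETTINGS_SPACE is an
# invented example, not any real game's options.
import itertools

SETTINGS_SPACE = {
    "textures": ["medium", "high", "ultra"],
    "shadows": ["low", "high"],
    "aa": ["off", "fxaa", "4xmsaa"],
}

def sweep(run_benchmark):
    keys = list(SETTINGS_SPACE)
    results = []
    for combo in itertools.product(*(SETTINGS_SPACE[k] for k in keys)):
        settings = dict(zip(keys, combo))
        results.append((run_benchmark(settings), settings))
    # Sort by fps, highest first: combos near the top that carry higher
    # settings than their neighbours are the ones that come "free".
    results.sort(key=lambda r: r[0], reverse=True)
    return results
```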
 
Blah blah blah, there's nothing wrong with WD, blah blah blah, it's just that you don't have good-spec PCs, blah blah blah, it's because you are setting it to Ultra, blah blah blah.

They are actually working on a PC performance patch, though. The irony.

I hope they don't degrade IQ with that patch.
 
The answer for me is a console - I let the developers balance and tweak for a set piece of hardware. The answer for others may be different.

Yeah but then you have to deal with stuff like this:

http://wccftech.com/fix-watch-dogs-90-loading-screen-bug-ps4/

It's not like consoles are plug and play anymore like in the 90s; they have their share of issues now, with crashes, lockups, console-specific critical bugs, etc. Console gamers have become more tolerant of issues; it's no longer "insert cartridge, play".
 
Yeah but then you have to deal with stuff like this:

http://wccftech.com/fix-watch-dogs-90-loading-screen-bug-ps4/

It's not like consoles are plug and play anymore like in the 90s; they have their share of issues now, with crashes, lockups, console-specific critical bugs, etc. Console gamers have become more tolerant of issues; it's no longer "insert cartridge, play".
I don't think console gamers have become more tolerant. In the old days, QA assured you of virtually zero bugs because patches weren't an option. Now games can and do ship with bugs, and you've little choice but to put up with them, same as PC gamers. The alternative is to basically boycott gaming altogether, or at least wait a year before playing anything to ensure you get the bug-fixed version, which is a practice that'd kill the current gaming industry if everyone followed it!

I've had console freezes and PC freezes over the past few years; neither platform is perfect. I'd probably say Win 7 has crashed less than my PS3, though, as outside of Skype a while back (which could blue-screen Win 7), it's been very stable. But that's mostly because I can close individual frozen apps. I still get app crashes, but they don't take down the whole machine like a console game will.
 
Yeah but then you have to deal with stuff like this:

http://wccftech.com/fix-watch-dogs-90-loading-screen-bug-ps4/

It's not like consoles are plug and play anymore like in the 90s; they have their share of issues now, with crashes, lockups, console-specific critical bugs, etc. Console gamers have become more tolerant of issues; it's no longer "insert cartridge, play".

On the PC you have many of the same bugs, and more on top of those thanks to the huge number of different configurations.

That exact bug is as bad as they get: claim DLC content, and the game can't load anymore; back up your saves, delete and reinstall. I mean... wtf.
 
I don't think console gamers have become more tolerant. In the old days, QA assured you of virtually zero bugs because patches weren't an option. Now games can and do ship with bugs, and you've little choice but to put up with them, same as PC gamers. The alternative is to basically boycott gaming altogether, or at least wait a year before playing anything to ensure you get the bug-fixed version, which is a practice that'd kill the current gaming industry if everyone followed it!

I've had console freezes and PC freezes over the past few years; neither platform is perfect. I'd probably say Win 7 has crashed less than my PS3, though, as outside of Skype a while back (which could blue-screen Win 7), it's been very stable. But that's mostly because I can close individual frozen apps. I still get app crashes, but they don't take down the whole machine like a console game will.

It's not because of QA standards. It's because complexity in games has increased massively.
 
It sucks that day-one patches, along with another patch in the first month, have become the norm with consoles. It's like they've begun relying on them.

I feel lucky that I've had a near-flawless experience with consoles so far, for about 15 years, with hundreds of games purchased. I buy mostly first-party games, though. The worst buggy console game yet was Lair; the unpatched day-one release was really bad.

On PC we are used to the mandatory fiddling to get the balance between smooth and high quality, which is understandable because each person has a different rig, but it's not normal that the exact recommended specs require fiddling to get a smooth experience. I did have a few serious bugs with the EverQuest Next alpha, but that was fully expected; it was a freakin' alpha version. That isn't normal for a released game.
 