Deleted member 13524 (Guest)
I need to be clear on one thing here: to me, The Witcher 3 looks fantastic, and the only reason I'm probably not a day-one buyer is that I'm still having a great time with GTA V.
I'm not arguing that the game looks better or worse than what they showed at E3 2013. I don't think the 2013 comparisons are relevant, as it looks like practically the same great-looking game. The screenshots seem to have been taken at different times of day, so the different lighting situations aren't really comparable. This is definitely not the same situation we saw with Colonial Marines or Watch Dogs.
Maybe the problem is that, because of Gearbox and Ubisoft, many people are now very vigilant about demo-to-gold differences. It's a consumer reaction, and you'd do well to expect it on every AAA release for years to come. The Internet neither forgives nor forgets easily, and both developers and publishers should keep this in mind, for their own sake.
In my posts, I'm talking specifically about the general idea of developers shipping advanced IQ options that are impossible to run at decent framerates on the hardware available at release.

I think it's a good idea, generally. Still, having the option there (again, above the pre-determined Ultra preset) wouldn't hurt; there's a toy sketch of what I mean at the bottom of this post.

"Rate of streaming assets? You'd have to load in more (significantly more on a 180-degree turn), which would choke the streaming engine and need a solution to accommodate it, like more precaching."
There's a bottleneck in the streaming engine? Then the framerate will suffer whenever the player does a 180° turn, or the player will notice stuff popping up on the horizon. And then the most hardcore PC gamers will long for faster SSDs, or for humongous amounts of RAM so they can install the game on a RAM drive, just to push that slider to the max.
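To make the precaching idea concrete, here's a toy C++ sketch. Everything in it (the Asset struct, pickResidentSet, the budgets) is invented for illustration; no real engine works exactly like this. The idea: instead of keeping only what's in front of the camera resident, spend whatever memory budget is available on a full 360° ring of assets, prioritized by angular distance from the view direction, so a 180° turn is served from RAM instead of the disk.

```cpp
// Toy model of "more precaching": keep a full 360-degree ring of assets
// resident, prioritised by angular distance from the view direction, so a
// 180-degree turn is served from memory instead of choking the streamer.
// Asset, pickResidentSet and all values here are hypothetical.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Asset {
    const char* name;
    double bearingDeg; // direction of the asset relative to the player
    double sizeMiB;    // cost of keeping it resident
};

// Smallest absolute difference between two bearings, in [0, 180].
static double angularDistance(double a, double b) {
    double d = std::fabs(std::fmod(a - b, 360.0));
    return d > 180.0 ? 360.0 - d : d;
}

// Greedily fill the memory budget with the assets closest to the current
// view direction; leftover budget pre-caches what's behind the player.
static std::vector<Asset> pickResidentSet(std::vector<Asset> assets,
                                          double viewDeg, double budgetMiB) {
    std::sort(assets.begin(), assets.end(),
              [viewDeg](const Asset& x, const Asset& y) {
                  return angularDistance(x.bearingDeg, viewDeg) <
                         angularDistance(y.bearingDeg, viewDeg);
              });
    std::vector<Asset> resident;
    double used = 0.0;
    for (const Asset& a : assets) {
        if (used + a.sizeMiB <= budgetMiB) {
            resident.push_back(a);
            used += a.sizeMiB;
        }
    }
    return resident;
}

int main() {
    const std::vector<Asset> world = {
        {"castle_ahead",    0.0, 300.0},
        {"forest_left",    90.0, 200.0},
        {"village_behind", 180.0, 250.0},
        {"cliffs_right",   270.0, 150.0},
    };
    // A generous budget keeps the whole ring resident; a tight one drops
    // the assets behind the player first, and that's when a 180-degree
    // turn falls back to the disk.
    for (double budget : {1000.0, 500.0}) {
        std::printf("budget %.0f MiB:", budget);
        for (const Asset& a : pickResidentSet(world, 0.0, budget))
            std::printf(" %s", a.name);
        std::printf("\n");
    }
    return 0;
}
```

With a big enough budget the whole ring stays resident, which is exactly the headroom the RAM-drive crowd is paying for; shrink the budget and the assets behind the player are the first to drop out.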
I think PC gamers love advanced IQ options, impossible-to-meet settings, and the choice between more framerate and better looks. If they didn't, why would they spend so much money on hardware and so much time solving driver/OS/installation issues?
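And since I keep saying "options above the pre-determined Ultra preset", here's roughly what I mean, as a minimal sketch assuming a hypothetical preset table; all the names and numbers below are made up, not taken from The Witcher 3 or any other actual game:

```cpp
// Hypothetical sketch of presets that stop where today's hardware does,
// while the sliders deliberately keep going. All names and numbers are
// invented for illustration.
#include <cstdio>

struct Setting {
    const char* name;
    int preset[4]; // Low, Medium, High, Ultra
    int sliderMax; // intentionally beyond the Ultra preset
};

int main() {
    const Setting settings[] = {
        //  name               L    M     H    Ultra    max
        {"draw_distance_m", {800, 1500, 2500, 4000}, 12000},
        {"shadow_map_res",  {512, 1024, 2048, 4096}, 16384},
        {"foliage_density", { 25,   50,   75,  100},   400},
    };
    for (const Setting& s : settings) {
        std::printf("%-16s Ultra=%-5d slider max=%-6d (%.1fx Ultra)\n",
                    s.name, s.preset[3], s.sliderMax,
                    (double)s.sliderMax / s.preset[3]);
    }
    return 0;
}
```

The point of that design: picking Ultra never touches the extra headroom, so nobody gets an unplayable default. Only a user who deliberately drags a slider past the preset opts into settings that launch-day hardware can't run at decent framerates.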