Well, Ubisoft would have to upgrade Snowdrop, not UE. But yeah, I think we could see this from Ubisoft soon, although seeing it in 2019 is unlikely. Sadly, Ubi titles are now AMD sponsored.
> Sadly? At least it pushes their games towards open technologies and standards, something that couldn't be said when they were sponsored by NVIDIA.

Open standards usually mean just console settings that lack any additional visual flair. Or worse, the inclusion of a performance-slashing DX12 path without any benefit to IQ. Many people like having an option or two that adds an additional layer of visual realism to their PC game, even if these options are taxing.
> Or worse, the inclusion of a performance-slashing DX12 path...

Huh?!? The move from OpenGL / OpenCL to Vulkan gave me a 2x speedup (on both NV and AMD).
> Criticize the devs if you want, but not the introduction of low-level APIs - it was finally necessary.

I am not criticizing that. Of course it was necessary, but the fact that developers still implement these APIs with minimal effort and end up hurting performance on both NVIDIA and AMD hardware is frustrating. You still have DX12 slashing 20% off Battlefield's performance for nothing. The recent Resident Evil 2 is another example. It seems almost nobody is able to use DX12 to improve performance or even image quality. And if not for DXR, DX12 would be a complete and total bust five years after its introduction.
I don't see how a lack of PC extras (except twice the pixels and framerate)
> Personally I have no idea why it still happens that DX12 often ends up slower than 11.

DX12 almost always ends up slower when something is GPU-limited. Obviously, driver writers know how to use barriers much better, and there may be a broad range of optimizations applicable to different games (if these are integrated in the driver); with DX12, every developer has to do these themselves.
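To make that concrete, here is a minimal sketch (resource and variable names are made up for illustration) of the kind of explicit transition barrier a D3D12 developer has to record before sampling a texture that was just rendered to - bookkeeping the D3D11 driver used to infer from bind calls on its own:

```cpp
// Sketch only: assumes 'cmdList' is a recording ID3D12GraphicsCommandList and
// 'sceneColor' is an ID3D12Resource that was just used as a render target.
D3D12_RESOURCE_BARRIER barrier = {};
barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
barrier.Transition.pResource   = sceneColor;
barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
cmdList->ResourceBarrier(1, &barrier);

// Get the before/after states wrong, or forget the barrier entirely, and you get
// corruption or a device hang; issue it more often than needed and you stall the GPU.
```

In D3D11 this hazard tracking lived in the driver; in D3D12 it is the app's job, which is exactly where a rushed port tends to lose performance.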
Yeah, but I'm one of those who can't spot a difference, other than the fan becoming loud. Five SSAO modes? All of them look like crap, haha. Ultra or medium - the game still looks the same to me.
But what would be a 'huge difference' for you? (I'm probably too ignorant myself, but I'm interested in the perception of others...)
> DX12 almost always ends up slower when something is GPU-limited. Obviously, driver writers know how to use barriers much better, and there may be a broad range of optimizations applicable to different games (if these are integrated in the driver); with DX12, every developer has to do these themselves.

Besides, driver writers simply know new GPUs much better and care about performance on those new GPUs.
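One small example of the kind of optimization that used to live in the driver and now falls on the developer: batching several transitions into a single ResourceBarrier() call instead of issuing them one by one. A sketch under assumed names (the function and the G-buffer list are hypothetical):

```cpp
#include <d3d12.h>
#include <vector>

// Sketch: transition a set of render targets to shader-resource state with one
// batched ResourceBarrier() call, so the GPU takes at most one sync point
// instead of several back-to-back ones.
void TransitionGBufferForSampling(ID3D12GraphicsCommandList* cmdList,
                                  const std::vector<ID3D12Resource*>& gbufferTargets)
{
    std::vector<D3D12_RESOURCE_BARRIER> barriers;
    barriers.reserve(gbufferTargets.size());
    for (ID3D12Resource* tex : gbufferTargets)
    {
        D3D12_RESOURCE_BARRIER b = {};
        b.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
        b.Transition.pResource   = tex;
        b.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
        b.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
        b.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
        barriers.push_back(b);
    }
    cmdList->ResourceBarrier(static_cast<UINT>(barriers.size()), barriers.data());
}
```

Nothing exotic, but it is per-engine, per-pass work that a D3D11 driver team would have tuned for you behind the scenes.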
Quite obvious differences IMO. Of course it's personal, and one might not find this important at all in 64-player multiplayer matches, but we're comparing to the One X here, the most powerful of the mid-gen upgrades.
You don't only get framerate and resolution upgrades, but also higher settings in about all areas. There are many comparisons on YouTube for different titles.
Thanks, watched it and... it's the 'minor' stuff I expected. I really want proper GI and nothing else, so I never care. The only game in years that has impressed me at all is that path-traced Quake 2. I'm a loud critic of RT hardware, but I have to admit... finally real progress after so many years of 'nothing really happens'.
Currently, however, the only reason I prefer PC for gaming is mouse look. I'm envious of PS4 exclusives - but I could never play a game with a gamepad.
> So you say the DX11 advantage often comes from HW vendors' work on optimizing individual AAA games? (probably individually for each)

Not necessarily. Only if it's well funded. It costs a lot of money to have your driver teams constantly looking at the next game release and optimizing drivers for what the developers put in their code.
Very interesting - I had not realized this is no longer easy with low-level APIs.
> Not to mention that, of course, a lot of code shipped broken, and 12/Vulkan was a way to break free from the past.

How can you fix a game by doing more low-level work? This dev said the game code had been broken even with a high-level API; with DX12 / Vulkan there are 10x more chances to shoot yourself in the foot, and this is exactly why DX12 is in a bad state in many games.
> They could not keep up against Nvidia's investment in this space, and on the console side developers are already doing it, so I suspect they sought a way to move PC to this space as well and relieve themselves of the burden.

That's a good strategy to reverse the situation: now Nvidia struggles with all these low-level GCN optimizations in DX12 (which can decrease performance on Nvidia GPUs), and it can't fix them in the driver because the API is too low-level. No money wasted, console devs do the work - brilliant!
> How can you fix a game by doing more low-level work? This dev said the game code had been broken even with a high-level API; with DX12 / Vulkan there are 10x more chances to shoot yourself in the foot, and this is exactly why DX12 is in a bad state in many games.

I think the idea is that 12/Vulkan was a break away from DX9-11 and whatever legacy items were in there; from that perspective there is a benefit. A single API whose behaviours should be properly defined and supported, with no weird hacks needed to get an intended result.
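For what it's worth, here is one concrete example of the foot-gun being discussed: explicit CPU/GPU synchronization that the D3D11 driver handled for you. A minimal sketch, assuming a single frame in flight and made-up variable names - skip this wait and you reset a command allocator or overwrite a buffer the GPU is still reading:

```cpp
#include <d3d12.h>
#include <windows.h>

// Sketch: block the CPU until the GPU has finished all work submitted so far,
// before per-frame resources (command allocator, upload buffers, ...) are reused.
void WaitForGpu(ID3D12CommandQueue* queue, ID3D12Fence* fence,
                UINT64& fenceValue, HANDLE fenceEvent)
{
    const UINT64 valueToWaitFor = ++fenceValue;
    queue->Signal(fence, valueToWaitFor);          // GPU writes this value when it reaches the signal

    if (fence->GetCompletedValue() < valueToWaitFor)
    {
        fence->SetEventOnCompletion(valueToWaitFor, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE); // wait for the GPU to catch up
    }
    // Only now is it safe to Reset() the allocator or overwrite dynamic buffers.
    // In D3D11 the driver renamed/serialized these hazards behind your back.
}
```

Get that ordering wrong in one place and you see flickering, device-removed errors, or random hangs - exactly the class of bug a D3D11 driver silently prevented, and exactly where a hasty DX12 port goes wrong.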