> Math. Pixels per CU:
> (1920 * 1080) / 18 = 115200
> (1600 * 900) / 12 = 120000

Never looked at it that way before. That tells quite a story.
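
Worth sanity-checking that arithmetic; a minimal sketch, assuming the two configurations are meant to be PS4 (1080p, 18 CUs) and Xbox One (900p, 12 CUs), which the post doesn't name explicitly:

```python
# Pixels-per-CU arithmetic from the post above.
# Assumption: 1920x1080 on 18 CUs and 1600x900 on 12 CUs
# (presumably PS4 and Xbox One respectively).

configs = [
    ("1080p / 18 CUs", 1920, 1080, 18),
    ("900p / 12 CUs", 1600, 900, 12),
]

for label, w, h, cus in configs:
    pixels_per_cu = (w * h) / cus
    print(f"{label}: {pixels_per_cu:.0f} pixels per CU")

# Output:
# 1080p / 18 CUs: 115200 pixels per CU
# 900p / 12 CUs: 120000 pixels per CU
```

So the 900p machine actually pushes slightly more pixels per CU, which is the point the quoted post is making.
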
> Spawned off to stop overpowering the DF thread. I couldn't find the existing discussion. Didn't it have its own thread?

Encyclopedia Brown is closed for business.

> You could create a synthetic homemade test of AF bandwidth demand across a library of games: downclock the VRAM until you start seeing performance degradation with AF on at various levels, then again with AF off, and take the difference. Match settings as closely as possible to the console versions. It could be done very comprehensively if one wanted to spend a dozen hours testing variables, preferably on AMD 7800-series hardware. Then you'd start to get a general idea of the cost of AF.

I like this approach. It would take some time to do, but at least it's real data to work with.
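
A rough sketch of that test loop, purely illustrative: `set_memory_clock`, `set_anisotropic_filtering`, and `run_benchmark` are hypothetical placeholders for whatever overclocking tool and benchmark harness you'd actually drive, not real APIs.

```python
# Sketch of the proposed AF bandwidth test: sweep memory clocks,
# measure average fps with AF off and at each AF level, and report
# the gap. The three helpers are hypothetical stand-ins.

def set_memory_clock(mhz: int) -> None:
    """Placeholder: would call into a vendor overclocking tool."""

def set_anisotropic_filtering(level: int) -> None:
    """Placeholder: would force AF in the driver (0 = off)."""

def run_benchmark(game: str) -> float:
    """Placeholder: would run a timed demo and return average fps."""
    return 0.0  # dummy value so the sketch runs

GAMES = ["Tomb Raider"]          # extend to a whole library
MEM_CLOCKS = [1500, 1250, 1000]  # MHz, stepping down to starve bandwidth
AF_LEVELS = [0, 4, 8, 16]        # 0 = AF off

for game in GAMES:
    for clock in MEM_CLOCKS:
        set_memory_clock(clock)
        results = {af: None for af in AF_LEVELS}
        for af in AF_LEVELS:
            set_anisotropic_filtering(af)
            results[af] = run_benchmark(game)
        baseline = results[0]
        for af in AF_LEVELS[1:]:
            # Once this delta grows as the clock drops, AF's bandwidth
            # cost is no longer being hidden.
            print(f"{game} @ {clock} MHz, {af}x AF: "
                  f"{baseline - results[af]:.1f} fps slower than AF off")
```
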
> What they did with emulating the situation is interesting. Of course, even the bandwidth-starved APU showed very little performance loss with AF enabled. That said, it could be CPU limited due to DX11 and the AF could be coming for "free". Too bad they didn't test how CPU bound their tests were by upping the resolution.

My thoughts as well. None of the games were running on low-level drivers, so it's possible the driver overhead allowed for latency hiding and effectively free AF. The real question is what the impact will be once DX12 games are released.

> From my experience Tomb Raider is very easy on the CPU side and scales incredibly well with GPUs. I think that's one game we can write off as CPU limited in the comparison.

Pure speculation, but it would be funny if DX11 PC drivers had _always_ incorporated the cost of AF somehow, so that enabling or disabling it made virtually no difference.

> Pure speculation, but it would be funny if DX11 PC drivers had _always_ incorporated the cost of AF somehow, so that enabling or disabling it made virtually no difference.

Drivers simulating consumed bandwidth for disabled features? Ok...

And their conclusion: we're not sure.
Epic fail.

Instead of looking at it purely as a hardware issue, DF should have spent a little more time seeing whether certain developers have better (more up-to-date) SDKs/toolchains than others.

> What they did with emulating the situation is interesting. Of course, even the bandwidth-starved APU showed very little performance loss with AF enabled. That said, it could be CPU limited due to DX11 and the AF could be coming for "free". Too bad they didn't test how CPU bound their tests were by upping the resolution.

Too bad they didn't test with slower DDR3 memory either. If it is a memory-bandwidth issue and you're using an APU that is even slower than the Xbox One GPU, I would prefer to also use lower-clocked memory; DDR3-1600 or DDR3-1333 might show different behavior.
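
For the "how CPU bound were the tests" question, here's a minimal sketch of the resolution check the quoted post suggests, with `benchmark_at` again a hypothetical stand-in for an actual benchmark run:

```python
# Quick check for CPU-boundness: run the same scene at two
# resolutions and compare average fps. If fps barely moves when
# the pixel count jumps ~2.25x, the CPU (or driver overhead) is
# the bottleneck, and AF would naturally look "free".

def benchmark_at(width: int, height: int) -> float:
    """Placeholder: would run a timed demo at the given resolution."""
    return 0.0  # dummy value so the sketch runs

fps_low = benchmark_at(1280, 720)
fps_high = benchmark_at(1920, 1080)

if fps_low > 0:
    drop = (fps_low - fps_high) / fps_low
    print(f"fps dropped {drop:.0%} going from 720p to 1080p")
    print("likely CPU bound" if drop < 0.10 else "likely GPU bound")
```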