Or perhaps you should. Did you miss the part where they said the game crashes all the time?
Did you see the terrible scaling?
Well the Alpha only supports AFR for now, so a weak iGPU may not make that much of a difference if you have a medium-range desktop card or better.
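To put some rough numbers on that (the frame times below are made up for illustration, not from any review in this thread): with AFR each GPU renders every other frame, so a slow iGPU caps the pair no matter how fast the desktop card is.

```cpp
// Back-of-the-envelope AFR math with hypothetical frame times.
// With strict alternation both GPUs must deliver the same number of frames,
// so sustained throughput is two frames per "slow GPU" interval.
#include <algorithm>
#include <cstdio>

int main() {
    const double dgpu_ms = 10.0;  // hypothetical discrete GPU frame time
    const double igpu_ms = 40.0;  // hypothetical weak iGPU frame time

    const double single_fps = 1000.0 / dgpu_ms;                     // dGPU alone
    const double afr_fps    = 2000.0 / std::max(dgpu_ms, igpu_ms);  // AFR pair

    std::printf("dGPU alone: %.0f fps\n", single_fps);  // 100 fps
    std::printf("AFR pair:   %.0f fps\n", afr_fps);     //  50 fps -- a net loss
    return 0;
}
```

And even when the iGPU is only a bit slower, the uneven frame intervals are where the micro-stutter comes from.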
And I assume much less stable CPU overclocking and turbo.

Unreal Engine Explicit Multiadapter:
https://channel9.msdn.com/Blogs/DirectX-Developer-Blog/DirectX-12-Multiadapter-Unreal-Engine-4
This benchmark uses the Intel iGPU to do the post processing (the NVIDIA discrete GPU renders everything else). No AFR, so no micro-stuttering or other problems. Pretty much everybody has an Intel iGPU in their CPU right now. Getting that 11% perf boost is a nice addition.
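For anyone curious what the setup side of that split looks like, here is a minimal sketch assuming the Windows 10 SDK (my own illustration of standard DXGI/D3D12 adapter enumeration, not code from the Unreal demo): it creates an independent D3D12 device on each hardware adapter, which is the starting point before any work is shared between the dGPU and the iGPU.

```cpp
// Sketch: enumerate hardware adapters and create a D3D12 device on each.
// Explicit multiadapter builds on top of this, e.g. dGPU renders the scene,
// iGPU runs post-processing, with results shared via a cross-adapter heap
// (D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER). Error handling kept minimal.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"Adapter %u: %ls (%zu MB VRAM)\n", i, desc.Description,
                         (size_t)(desc.DedicatedVideoMemory >> 20));
            devices.push_back(device);
        }
    }

    std::wprintf(L"Created %zu D3D12 devices.\n", devices.size());
    return 0;
}
```

From there you record command lists on each device and synchronize them with fences, which is exactly where the "requires a lot from the developer" part comes in.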
The problem is that the feature is never likely to leave the Alpha stage. It's too complicated and requires too much from game developers and GPU driver writers, for too little gain.

The feature is not even Alpha yet, so I wouldn't read much into it.
Adaptive sync is not magic... it can't handle very high frequency refresh changes like you'd have in an asymmetric AFR scenario, and it won't solve any of the back-pressure/timing problems in these configs. Adaptive sync helps in cases where you are running a relatively consistent throughput that is below the monitor's refresh rate. It doesn't fundamentally alter the fact that you can't predict when a frame will hit the display, and you still have to use rolling averages based on GPU back-pressure to approximate that (rough sketch of that below).

Well, AFR on (not that much) asymmetric configurations may not be all that bad if you're using adaptive sync, for example.
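Apropos of the rolling-averages point above, here's a toy sketch of what that estimator looks like (FramePacer, the window size, and the sample frame times are all made up, not real driver code):

```cpp
// Toy rolling-average frame-time estimator, the kind of heuristic a driver or
// engine falls back on when it can't know exactly when a frame hits the display.
#include <cstdio>
#include <deque>
#include <initializer_list>
#include <numeric>

class FramePacer {
    std::deque<double> samples_;                // recent frame times in ms
    static constexpr std::size_t kWindow = 30;  // rolling window length
public:
    void AddFrameTime(double ms) {
        samples_.push_back(ms);
        if (samples_.size() > kWindow)
            samples_.pop_front();
    }
    // Predicted cost of the next frame; a real pacer would also weigh in
    // per-GPU queue depth (back-pressure), which this sketch ignores.
    double PredictedFrameTime() const {
        if (samples_.empty())
            return 16.7;  // assume ~60 Hz until there is data
        return std::accumulate(samples_.begin(), samples_.end(), 0.0) /
               static_cast<double>(samples_.size());
    }
};

int main() {
    FramePacer pacer;
    // Alternating fast/slow frames, like an asymmetric AFR pair would produce.
    for (double ms : {8.0, 14.0, 8.5, 15.0, 9.0, 13.5})
        pacer.AddFrameTime(ms);
    std::printf("predicted next frame: %.1f ms\n", pacer.PredictedFrameTime());
    return 0;
}
```

The average smooths the alternating times, but the actual frame deliveries still bounce around it, which is exactly the micro-stutter adaptive sync can't fully hide.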
You might actually have a point about the terrible scaling. Compare it with the single-GPU Guru3D review published today, which uses the original AOS benchmark at 2560x1440:
http://www.guru3d.com/articles_page...ce_gtx_980_ti_platinum_edition_review,24.html
I don't think AFR is worth investigating at all.
It would be exciting if they could find a way to leverage the IGP that goes unused in most gamers' machines, but that is even more difficult. Plus, many folks haven't upgraded their CPUs in years, for obvious reasons. My 3770K has an older IGP that doesn't support DX12, AFAIK.
This all seems a little stillborn to me.
It's things like these that make so many consumers suspicious of certain sites. I was wondering why they used a Fury Nano in those tests, so I went checking the other pages. On those pages they used a Fury X, so they had one. Maybe I missed something?