This is a "best-case" scenario, but I don't know how far the drivers have come for hybrid CrossFire between APUs and discrete GPUs.
Maybe it's more, maybe it's less, hence the "if".
Nonetheless, if the notebook is paired with 1600MHz DDR3, the jump in iGPU performance should be considerable, as should the hybrid CrossFire results.
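For reference, a back-of-the-envelope bandwidth comparison, as a small Python sketch. It assumes a dual-channel, 64-bit-per-channel memory interface and DDR3-1333 as the baseline; both are my assumptions, not anything stated above.

```python
# Rough dual-channel DDR3 bandwidth arithmetic (assumed config: two 64-bit channels).
def ddr3_bandwidth_gbps(transfer_rate_mt_s, channels=2, bus_width_bits=64):
    """Peak theoretical bandwidth in GB/s."""
    return transfer_rate_mt_s * 1e6 * channels * (bus_width_bits / 8) / 1e9

bw_1333 = ddr3_bandwidth_gbps(1333)   # ~21.3 GB/s
bw_1600 = ddr3_bandwidth_gbps(1600)   # ~25.6 GB/s
print(f"DDR3-1333: {bw_1333:.1f} GB/s, DDR3-1600: {bw_1600:.1f} GB/s "
      f"(+{(bw_1600 / bw_1333 - 1) * 100:.0f}%)")
```

With those assumptions the move to DDR3-1600 is roughly a 20% bandwidth increase, which is where the expected iGPU gain would come from.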
I am not talking about FPS figures read off at the end of a Fraps run, or the minimum FPS during one. I am talking about playability and how much of the improvement actually translates into better gameplay.
(This is from the August timeframe: BF:BC2, medium details, 2x MSAA, no HBAO.)
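To make "playability" a bit more concrete than a single average number, here's a minimal Python sketch that pulls the average FPS and the 99th-percentile frame time out of the same run. It assumes a Fraps per-frame log (the frametimes CSV with cumulative milliseconds per frame); the filename is just a placeholder.

```python
# Sketch: judge playability from a Fraps frametimes log rather than average FPS alone.
# Assumes a CSV with a header row and cumulative time in milliseconds in column 2.
import csv

def frame_stats(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(r[1]) for r in rows[1:]]            # cumulative ms per frame
    deltas = sorted(b - a for a, b in zip(times, times[1:]))  # per-frame times (ms)
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    p99_ms = deltas[int(0.99 * (len(deltas) - 1))]     # 99th-percentile frame time
    return avg_fps, p99_ms

avg, p99 = frame_stats("frametimes.csv")
print(f"average: {avg:.1f} fps, 99th percentile frame time: {p99:.1f} ms")
```

Two runs with the same average FPS can feel very different if one has a much worse 99th-percentile frame time, which is the gap between benchmark numbers and gameplay I'm getting at.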
Depends on your definition of "gaming". Mine: running benchmarks must be done within the game itself, not in a canned benchmark that's easy to identify and optimize for. So I'd at least exclude AvP, Battleforge and Far Cry 2 here, possibly more.

Not in gaming it isn't.
I am also not sure about running Crysis 2 and Shogun 2 in DX9, or about using a purely CPU-limited setting for the higher-end cards in StarCraft II by not enabling antialiasing in the drivers, which specifically provide that option.
The more pixels that need to be moved, the better for the Radeon and the worse for the GeForce. I think it has something to do with the GeForce's inability to export more than two pixels per clock per SM. While that won't hurt gameplay very much in my opinion, because only FPS peaks are capped to a lower rate, it definitely shows in the benchmarks.

At 1920x1200 the GTX 580 averages 11.88% faster than the HD 6970, at 2560x1600 only 8.49% faster ( http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_560_Ti_448_Cores/27.html - using TPU as the source, as they have the largest list of games tested).
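To put rough numbers on that export limit, a small Python sketch using the public spec figures (GTX 580: 16 SMs, 48 ROPs, 772 MHz core; HD 6970: 32 ROPs, 880 MHz) and assuming the two-pixels-per-clock-per-SM export rate described above. Treat it as an illustration, not a measured result.

```python
# Theoretical pixel throughput: SM export cap vs. ROP limit.
def gpixels_per_sec(pixels_per_clock, clock_mhz):
    return pixels_per_clock * clock_mhz * 1e6 / 1e9

sm_export_gtx580 = gpixels_per_sec(16 * 2, 772)   # ~24.7 Gpix/s - SM export cap
rop_limit_gtx580 = gpixels_per_sec(48, 772)       # ~37.1 Gpix/s - ROPs could do more
rop_limit_hd6970 = gpixels_per_sec(32, 880)       # ~28.2 Gpix/s

print(f"GTX 580 SM export cap: {sm_export_gtx580:.1f} Gpix/s "
      f"(ROP limit {rop_limit_gtx580:.1f}), HD 6970 ROP limit: {rop_limit_hd6970:.1f} Gpix/s")
```

Under those assumptions the GTX 580's SM export cap sits below both its own ROP limit and the HD 6970's, which would explain why the GeForce's lead shrinks as the pixel load grows at higher resolutions.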