Haswell vs Kaveri

Meh, it never has been any time that I've looked. You still end up with a better overall gaming system by getting a cheaper CPU with a discrete GPU. And you end up *way* better if you just spend an extra $50-100... at those low price points even $10 shows big increases.

If gaming is really the only targeted use, and performance/initial_price the only concern, without regard to power, cooling, noise, etc., then yes, APUs can still be unconvincing for desktops.

But if you want a cheap gaming rig that will still handle general purpose tasks well, have low idle power, etc., they make good choices.
 
Andrew Lauritzen said:
Meh, it never has been any time that I've looked.
Indeed. Even the best offerings in this regard have been rather lackluster. But things could change in the future...

An HD 7750 has 72 GB/s of bandwidth, so until APUs are able to catch up with that, they will continue to be second-class citizens.
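For context, the 72 GB/s figure falls straight out of the 7750's memory configuration (128-bit GDDR5 at 4.5 Gbps), and the same arithmetic shows why dual-channel DDR3 holds APUs back. The DDR3-1866 speed below is an assumed configuration for comparison, not a quote from the thread:

```python
def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers/sec."""
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000

# HD 7750: 128-bit bus, GDDR5 at 4500 MT/s (1125 MHz, quad-pumped)
hd7750 = bandwidth_gb_s(128, 4500)

# Typical APU: dual-channel (2 x 64-bit) DDR3-1866 -- assumed for illustration
apu = bandwidth_gb_s(128, 1866)

print(f"HD 7750:       {hd7750:.1f} GB/s")  # 72.0 GB/s
print(f"APU DDR3-1866: {apu:.1f} GB/s")     # 29.9 GB/s
```

So even a generous dual-channel DDR3 setup delivers well under half the bandwidth of a $100 discrete card.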
 
The enthusiast mindset just doesn't really understand it but the facts speak for themselves, look at this :-
I'm pretty positive that the people buying <$50 GPUs were doing so for reasons other than gaming. At the time integrated was barely suitable for even desktop work, and had no significant video encode/decode. These days desktop+video is no longer a compelling reason to go discrete.

If you want a step up from Trinity you'd buy an i3 3220 ($130) + a 6670 DDR5 ($85) as your entry point...
That setup would be *significantly* faster than the APU solution. Pretty sure you could cheap out on a ~$50 CPU (aren't there new ivy bridge celerons in that price range now?) and still end up faster in games at a similar price point.

I just don't really buy this strange market you're talking about though... I literally know no one who would buy a low-end desktop PC for gaming these days. The people who don't want to build a decent desktop gaming machine buy consoles or laptops.

So I maintain that this comparison is only really interesting on mobile parts. The 100+W desktop APU stuff is just irrelevant. You can certainly put the mobile stuff into non-laptop form factors (see NUC or similar) though.

AMD do suggest that Kaveri can be used as a dedicated GPGPU processor in combination with a discrete GPU for graphics in one of their HSA presentations though.
I'm sure they do, and certainly in theory any of the integrated cards can work directly with the host CPU in a variety of ways (some more than others). The issue is none of the APIs currently support this in any sort of natural way that is optimized for shared memory systems. They're all still based on big DMA buffers, OS/kernel-controlled memory and the assumption that CPU<->GPU interop always requires a copy of data. That sort of limits the performance you can get out of such a system...
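The copy tax described above is easy to model: if every CPU<->GPU handoff has to stage data through a DMA buffer, total time on a shared-memory APU is bounded by the copy rather than the compute. A toy model with illustrative (not measured) numbers:

```python
def effective_time(data_gb: float, mem_bw_gb_s: float, kernel_time_s: float,
                   requires_copy: bool) -> float:
    """Total time for one CPU->GPU dispatch over `data_gb` of input.

    With a copy-based API the data is copied once into a GPU-visible
    buffer before the kernel runs; with true shared memory the GPU
    reads the CPU's buffer in place and the copy cost disappears.
    """
    copy_time = data_gb / mem_bw_gb_s if requires_copy else 0.0
    return copy_time + kernel_time_s

# Assumed numbers: 1 GB working set, ~25 GB/s effective DDR3 copy
# bandwidth, a 50 ms kernel.
with_copy = effective_time(1.0, 25.0, 0.050, requires_copy=True)
zero_copy = effective_time(1.0, 25.0, 0.050, requires_copy=False)
print(f"copy-based API: {with_copy * 1000:.0f} ms")  # 90 ms
print(f"shared memory:  {zero_copy * 1000:.0f} ms")  # 50 ms
```

With those assumptions the mandatory copy nearly doubles the dispatch cost, which is exactly the overhead a shared-memory-aware API would eliminate.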

On the other hand, if you only have a Sandy Bridge or older CPU with no GPGPU-capable GPU on die, then you are stuck with software compute only, at lower performance levels than what would be available on the consoles, and thus you have to reduce quality/detail settings to maintain performance.
Heh, I think you're being a bit optimistic about next gen console performance levels, but certainly I agree that you'd want to be able to scale and move around the relevant work to where it runs the most efficiently.
 
The enthusiast mindset just doesn't really understand it but the facts speak for themselves, look at this :-

[Chart: discrete GPU sales share by price segment, 2008/2009]

You can see that 30% of the entire market in 2008/2009 was buying GPUs under $50. They could get much better performance for another $50, sure, but they're not paying it for one reason or another.

If you want a step up from Trinity you'd buy an i3 3220 ($130) + a 6670 DDR5 ($85) as your entry point... but you're spending $85 more than you would on a 5800K alone. If people were willing to pay that then there would be a lot more sales above the $50 mark with discrete GPUs.

As it happens they just want something that works playably for as cheap as they can, and they swallow down "QUAD CORE" and "GHz" way more than anything else at this price range. A 4.2 GHz quad core with 7660 graphics looks so much better than some 3.3 GHz i3 "dual core with HT" + 6670 graphics. Given the choice, 95% of people would take the 5800K at the same price, let alone $85 cheaper.
Interesting chart. Do you know if a similar chart exists for other years? I'm surprised there's not a peak near $200 and I wonder if there just wasn't a good $200 card that quarter.
I'm pretty positive that the people buying <$50 GPUs were doing so for reasons other than gaming. At the time integrated was barely suitable for even desktop work, and had no significant video encode/decode. These days desktop+video is no longer a compelling reason to go discrete.
I've heard a lot of what drives <$50 sales is psychology. People know integrated performance isn't great so they upgrade just to not have integrated graphics.
 
That setup would be *significantly* faster than the APU solution. Pretty sure you could cheap out on a ~$50 CPU (aren't there new ivy bridge celerons in that price range now?) and still end up faster in games at a similar price point.

Like I said though, it's the next step up. The 7660D is basically equal to the 6570 DDR3 and up to 6670 DDR3 in some games.

The problem with the Celerons has been mentioned by others a few times. The Techreport did an article fairly recently that showed that the dual cores simply weren't up to the task in many games.

I just don't really buy this strange market you're talking about though... I literally know no one who would buy a low-end desktop PC for gaming these days. The people who don't want to build a decent desktop gaming machine buy consoles or laptops.
It's not true. The vast majority of desktop PCs being sold are still at the low end. AMD still holds some 25% of the desktop market, and they definitely aren't selling much above the entry level.

Somebody might buy an APU and upgrade the graphics later too - it's not like it's so much slower than an i3 that you'd notice, and it basically costs the same so why not?
 
Interesting chart. Do you know if a similar chart exists for other years? I'm surprised there's not a peak near $200 and I wonder if there just wasn't a good $200 card that quarter.

I'll have a look but I think this is the chart that AMD used for their "sweet spot" strategy back then, which is how I found it while searching last night.
 
It could also be plain and boring Richland. Remember, AMD sent a Richland ES to the x264 devs a few months ago.
 
It could also be plain and boring Richland. Remember, AMD sent a Richland ES to the x264 devs a few months ago.

Given that Richland is a drop-in replacement, why would you send it on a non-production dev board? I think Kabini is the most likely chip. Given three months to launch they would have silicon about, but consumer boards might not be ready.
 
Well I'm confused, are there going to be two products called 'HD Graphics 5200': one for notebooks with cache and another for Ultrabooks without cache, but both with the same name?

That, or there's going to be "HD Graphics for Ultrabook 5200" and "HD Graphics for Notebooks 5200".
 
Well I'm confused, are there going to be two products called 'HD Graphics 5200': one for notebooks with cache and another for Ultrabooks without cache, but both with the same name?

This is what the driver says:

; HSW Classic
iHSWGT1D = "Intel(R) HD Graphics"
iHSWGT1M = "Intel(R) HD Graphics"
iHSWGT15D = "Intel(R) HD Graphics 4400"
iHSWGT2D = "Intel(R) HD Graphics 4600"
iHSWGT2M = "Intel(R) HD Graphics 4600"
; HSW ULT
iHSWGT1UT = "Intel(R) HD Graphics"
iHSWGT2UT = "Intel(R) HD Graphics Family"
iHSWGT3UT = "Intel(R) HD Graphics 5000"
iHSWGT3UT25W = "Intel(R) HD Graphics 5100"
iHSWGT2UX = "Intel(R) HD Graphics Family"
iHSWGT1ULX = "Intel(R) HD Graphics"
; HSW CRW
iHSWGT2CW = "Intel(R) HD Graphics 4600"
iHSWGT2CWDT = "Intel(R) HD Graphics 4600"
iHSWGT3CW = "Intel(R) HD Graphics 5200 with High Speed Memory"
iHSWGT3CWDT = "Intel(R) HD Graphics 5200 with High Speed Memory"
; HSW SRVR
iHSWSVGT1 = "Intel(R) HD Graphics"
iHSWSVGT2 = "Intel(R) HD Graphics P4600/P4700"
There is no HD5200 without eDRAM. Fudzilla said Ultrabooks won't come with onboard memory, so HD5200 for Ultrabooks makes no sense. From the driver it looks like Haswell ULT models with GT3 are named HD Graphics 5000, or HD Graphics 5100 for the 25W model.
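Incidentally, those driver strings are plain INF-style `key = "value"` pairs, so the SKU-to-name mapping can be pulled out mechanically. A quick sketch (the snippet embeds a few of the lines quoted above; the variable names are mine):

```python
import re

# A few entries copied from the driver INF quoted above
driver_lines = '''
iHSWGT3UT    = "Intel(R) HD Graphics 5000"
iHSWGT3UT25W = "Intel(R) HD Graphics 5100"
iHSWGT3CW    = "Intel(R) HD Graphics 5200 with High Speed Memory"
'''

# Map each SKU identifier to its marketing name
names = dict(re.findall(r'(\w+)\s*=\s*"([^"]+)"', driver_lines))

# Only the Crystal Well (CRW) GT3 entries carry the eDRAM qualifier
edram_skus = [sku for sku, name in names.items() if "High Speed Memory" in name]
print(edram_skus)  # ['iHSWGT3CW']
```

Which is consistent with the point above: the "5200" name only ever appears alongside the eDRAM qualifier, while the ULT GT3 parts get 5000/5100.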
 
HD5200 will always come with eDRAM then.
I don't think Intel would establish six different names for their new iGPU lineup, just to end up giving the same name to two iGPUs with different performance numbers.
 
They should make GDDR5 at GlobalFoundries, guaranteeing supply and using more of their agreed wafer purchases for 2013.
 