Haswell vs Kaveri

Intel "Haswell" GT3 Graphics Twice as Fast as "Ivy Bridge"

Sources close to the company have been claiming a significant, in fact 100 percent, performance lead of the Haswell iGPU over the previous-generation HD 4000 iGPU featured in today's Core "Ivy Bridge" chips. If true, Intel's graphics may have come perilously close to, or even caught up with, AMD's A-Series "Trinity" line of APUs, which feature the fastest integrated graphics processor ever made.
 
Dirt 3 isn't a very good title for the HD4000: there, the GT 650M can be as much as 5x faster.

I wonder why Intel chose this game to show it off. Maybe because this game's engine was an Achilles' heel for the HD4000's architecture, making it easier to get a larger performance bump?


Nonetheless, 2x the HD4000 should bring the HD4500's performance close to a GT 650M (10-20% away) in some scenarios, and it definitely nullifies the need for a lower-end mobile Cape Verde or GK107.
And looking at AMD's and nVidia's recent announcements for mobile GPUs, almost everything below the HD88xx series seems to be rendered useless when the GT3 is present, if the 2x overall performance bump is confirmed.
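
To spell out the arithmetic behind that "10-20% away" guess (the GT 650M ratios below are my own assumptions, not benchmark data):

Code:
# Back-of-the-envelope sketch: assume a GT 650M averages ~2.2-2.5x an
# HD 4000 outside of outliers like Dirt 3 (assumed ratios, not benchmarks).
hd4000 = 1.0                # normalized HD 4000 performance
gt3 = 2.0 * hd4000          # the rumored 2x Haswell GT3 bump
for gt650m_ratio in (2.2, 2.5):
    shortfall = 1.0 - gt3 / (gt650m_ratio * hd4000)
    print(f"vs a {gt650m_ratio}x GT 650M: GT3 trails by {shortfall:.0%}")
# -> trails by 9% and 20%, i.e. roughly the 10-20% gap mentioned above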
 
If the HD4000 sucks in that game and Haswell doesn't, it's a pretty good show-off, plus I can quite smell the driver optimisation when they demo a single game.

Regarding the announcement, remember it's the high-end variant nobody will get.
Almost everyone is stuck on an HD 2000, an HD 2500, or even the one that doesn't get the honor of being given a number (unless the HD4000 is particularly common on laptops).

And there was that story about OEMs being pissed. They may well opt for a Mars or GK208 GPU, one the consumer can identify.

If it's for an "ultrabook", then OEMs can be pissed about the high cost and a mandatory touch screen that's useless for gaming.
 
Yep.
Mobile Core i3
Mobile Core i5
Mobile Core i7


Every single one carries the HD4000 with 16 EUs, and only the ULV parts go below 1 GHz on maximum clocks.
I think all mobile CPUs will get the GT3 variant, with GT2 left for the high-end desktop chips where the GPU's maximum performance is less relevant, and GT1 will probably only show up in the rather obscure Pentium/Celeron G lines.
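
For a rough sense of what doubling would take, here's a peak-FLOPS sketch; the 16 FLOPS/EU/clock figure comes from Gen7's two 4-wide FMA-capable vector units per EU, and the clock is an assumed typical max turbo:

Code:
# Peak-throughput estimate for the HD4000 (Gen7), not a measurement.
eus = 16
flops_per_eu_clk = 16        # 2 x 4-wide vector units x 2 flops (FMA)
clock_ghz = 1.15             # assumed max turbo; varies by SKU
peak_gflops = eus * flops_per_eu_clk * clock_ghz
print(f"HD4000 peak: ~{peak_gflops:.0f} GFLOPS")   # ~294 GFLOPS
# A 2x GT3 would need roughly double that, via more EUs, higher
# clocks, more bandwidth (the rumored external memory), or all three.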
 
Alright.
But really, GT3 is a "special edition".
The HD4000 was cheap on all models (except the upcoming Pentiums); GT3 is not. Unless you can get GT3 without the external memory, but then it's not really GT3 anymore.
 
I haven't seen any info yet telling us that GT3 won't be the standard iGPU in the mobile chips.

It should be in Intel's best interest to nullify discrete GPUs in most price ranges and avoid giving Kaveri a chance.
 
In the high-CPU-performance range, I don't think there's any interest in sacrificing the CPU's performance and power budget for the integrated GPU when those chips will inevitably be coupled with a higher-performance "discrete" GPU. And if only CPU computational power is needed, there's no need for a powerful iGP. With a discrete GPU in the system, the iGP will at best only be there to drive the display for web / 2D / encode and decode work, and will be switched over to the discrete card when the need is there.
 
If the HD4000 sucks in that game


It's the other way around: the HD4000 runs pretty well in all EGO engine games, and Trinity does not enjoy as big a lead there as in many other games. In general, Codemasters' racing games have often been used in presentations by both AMD and Intel. There's a built-in benchmark demo, which is convenient, and the graphics also look nice. That's why both have used them often in the past.
 
I don't think there's any interest in sacrificing the CPU's performance and power budget for the integrated GPU

Things over at Intel have evolved to the point where the presence of an iGPU doesn't consume any noticeable power when it's not in use, so that's not a concern.
Moreover, AFAIK the HD4000 doesn't consume more power than the cut-down HD2500 in light-load scenarios.

It shouldn't be any different with the GT2/GT3 Haswell iGPUs.
 
I'd imagine that die area is as much of a concern for the top-end CPUs as power consumption, if not more. Granted, the difference in die area might not be large, but the dedicated RAM is going to consume a significant amount of package space whether it's on-die or off.

For a high-end desktop CPU, like an i7 and possibly even an i5, it just doesn't make sense to use GT3, IMO.

It makes a lot more sense on an i3 and possibly the lower variants of i5.

Regards,
SB
 
However, the tone and insinuation in that snippet indicate something different going on besides simply making the physical separation unnecessary.
Potentially, something like refusing to scale up the PCIe bandwidth of the desktop and mobile chips, or quietly exerting pressure to block attempts to extend PCIe to support coherency, would do the trick over time.
It looks like you're right.

I'm not entirely sure which is the best thread to put this in, but from SemiAccurate: "Intel slams the door on discrete GPUs."

I'm putting it here because, while the two free paragraphs don't explicitly mention any chips, the tags include "haswell" and "Crystalwell."
Okay, this article is now available to free subscribers.

Main points:
  • Broadwell will have four PCIe 2.0 lanes (20 Gbps; quick math below).
  • That will put a ceiling on discrete GPU performance, which might be lower than that of some Broadwell integrated graphics.
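
To put that number in context, here's the lane arithmetic (my own math from the PCIe 2.0 signaling rates, not from the article):

Code:
# PCIe 2.0 signals at 5 GT/s per lane and uses 8b/10b encoding.
lanes = 4
raw_gtps = 5.0 * lanes              # 20 GT/s raw -- the headline "20 Gbps"
usable_gbps = raw_gtps * 8 / 10     # 8b/10b overhead leaves 16 Gbps payload
usable_gBps = usable_gbps / 8       # ~2 GB/s each way
print(f"x{lanes} PCIe 2.0: {usable_gbps:.0f} Gbps = {usable_gBps:.0f} GB/s")
# For comparison, a standard x16 PCIe 2.0 slot offers ~8 GB/s each way.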
 
I think it is more probable that Intel slams the door on its own products and, with such decisions, makes AMD's life a lot easier. :D
 
I'm not entirely sure which is the best thread to put this in, but from SemiAccurate: "Intel slams the door on discrete GPUs."
The followup report to that one, "How Intel can slam the door on GPUs," is now available to free subscribers.

Main points:
  • Intel can essentially justify the bandwidth crippling to the FTC, while it will be hard for AMD and NVIDIA to make a case against Intel quickly (and by then the damage is done).
  • There's only one x4 PCIe 2.0 link; at best it can be split into four x1 links if necessary.
 
I don't remember appointing a special reporter tasked with keeping B3D up to speed with the latest and greatest from S|A, so may I kindly ask that we diminish the volume of the advertising? Once in a blue moon is probably fine, but being linked to each and every piece that goes up there is a bit extreme. If somebody wants to advertise their material on B3D, they can go through the normal channels (there's an email address dedicated to this).
 
No, it is not. Randomly made-up stuff that sometimes happens to be true is not relevant to any discussion, but it is tolerable in small quantities. When all one is contributing is links to such discussion and a brief breakdown (a bit like an RSS feed), I might take issue with it. As I said, there's a dedicated email for this sort of input.
 