Sandy Bridge preview

At an investor meeting earlier this year, Dadi Perlmutter said they expect Sandy Bridge's GPU to be an "order of magnitude" faster than the 65nm-generation graphics, which is the GMA 4500.

Looks like they could meet that mark: http://www.anandtech.com/show/3871/the-sandy-bridge-preview-three-wins-in-a-row/7

Before anyone replies, that's the "1 core" part. There will be a "2 core" part.
Well, he says it "appears" to be the "1 core" / 6 EU part. Personally I don't buy it.
Because G45 already had 10 EUs and Ironlake has 12. Given the architectural improvements, twice the performance of Ironlake with the same number of resources (they are also clocked higher) sounds doable, but with half the execution resources? And yes, those EUs are still comparable (there's already some Sandy Bridge support in the open-source Linux driver). (FWIW, AnandTech often says 10/12 shader units are not enough for G45/Ironlake, but that's not actually the case; those Intel IGPs have quite plenty of shader power, because an EU is physically 4-wide, so that's really 40/48 adds/muls (but no MAD...) per clock. An insufficient number of execution units isn't the reason they aren't performing too well.)
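
Rough back-of-the-envelope on that peak ALU math, in case anyone wants to check it (the 4-wide EU figure is as above; the clocks are approximate and just my assumption):

# Peak shader ALU throughput, assuming each EU is physically 4-wide
# (one add or mul per lane per clock, and no MAD, so no 2x FLOP counting).
def peak_gops(eus, lanes_per_eu, clock_mhz):
    return eus * lanes_per_eu * clock_mhz / 1000.0  # giga-ops per second

print("G45:      %4.1f Gops/s" % peak_gops(10, 4, 800))  # 10 EUs -> 40 ops/clock
print("Ironlake: %4.1f Gops/s" % peak_gops(12, 4, 733))  # 12 EUs -> 48 ops/clock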
I'm also a bit confused about why there should be parts with different graphics cores. Are those Sandy Bridge CPUs using multiple dies (so, for instance, one could be 2 cores / 6 EUs / 3MB L3 cache and the other 4 cores / 12 EUs / 6MB L3 cache)?
 
I thought it was because these new CPUs handle PCI Express differently with the integrated GPU, so the motherboard needs to be compatible with that and the current boards aren't... They don't want people installing the CPUs in motherboards that don't work with them, so to make sure of that they changed the pin arrangement.
 
Some i5s on the old platform definitely had integrated GPUs, so that doesn't seem to make much sense unless they totally changed how they're handled.
 
Damn it. Somebody kindly merge the threads, please? ;)

@Mczak: There's probably a LOT more that Intel didn't reveal that helps the GPU. Well, that's left for IDF in two weeks. They are calling this "Gen 6", so I want to know what significant changes there are. Ironlake was a bit unexpected too.
 
The real question is: couldn't they have put in one dummy pin to make it compatible with old LGA-1156 MBs?

The reason it's not compatible with LGA-1156 MBs is probably related to the way the pins are mapped, to the display output, and to the chipset.

The missing pin effectively acts as a clear indication of compatibility.
 
The reason it's not compatible with LGA-1156 MBs is probably related to the way the pins are mapped, to the display output, and to the chipset.

The missing pin effectively acts as a clear indication of compatibility.

IIRC, the display outputs were to be in the chipset/sb/ioh/whatever's-the-latest-name.
 
The reason it's not compatible with LGA-1156 MBs is probably related to the way the pins are mapped, to the display output, and to the chipset.
It's a total waste having display-out on the CPU itself, especially as the framebuffer is stored in main RAM anyway and not on the CPU die...

The missing pin effectively acts as a clear indication of compatibility.
The missing pin is just so Intel gets to sell everybody new motherboards all over again. They have a hard-on for changing the pinout slightly just so you HAVE to buy everything new from them one more time. They've done it a million times before.
 
The reason it's not compatible with LGA-1156 MBs is probably related to the way the pins are mapped, to the display output, and to the chipset.

The missing pin effectively acts as a clear indication of compatibility.

Or more likely because BCLK is now driven by the CPU, not the MB, so they needed to make it incompatible with older chipsets.
 
The missing pin is just so Intel gets to sell everybody new motherboards all over again. They have a hard-on for changing the pinout slightly just so you HAVE to buy everything new from them one more time. They've done it a million times before.

Got any stats showing the ratio of the number of CPUs and/or motherboards sold in PCs that get upgraded to the number of CPUs and/or motherboards sold in PCs to people who have no inclination whatsoever to upgrade?

Just curious.
 
twice the performance of Ironlake with the same number of resources (they are also clocked higher) sounds doable, but with half the execution resources?

An insufficient number of execution units isn't the reason they aren't performing too well.

Don't the two sentences contradict each other? They have upped the performance by 4x while only increasing the EUs by 50%. Plus you've got the Graphics Turbo Mode. We're still not clear on whether the 1 vs 2 cores distinction is just the EUs, or whether there's more to it as well.

One thing I agree with is that doubling the EUs from 6 to 12 can't double performance again, and that would be necessary to compete with Llano's GPU.
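
Just as a sanity check on those ratios (nothing official, only arithmetic on the numbers floating around this thread):

# Implied per-EU gain for a given overall speedup and change in EU count.
def per_eu_gain(perf_ratio, eu_ratio):
    return perf_ratio / eu_ratio

# ~2x Ironlake from 6 EUs instead of 12 would imply ~4x per EU (clock + architecture).
print(per_eu_gain(2.0, 6.0 / 12.0))   # 4.0
# Going 6 -> 12 EUs only doubles performance again if nothing else becomes the bottleneck.
print(per_eu_gain(2.0, 12.0 / 6.0))   # 1.0, i.e. it needs perfect scaling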
 
From p. 10 of the article comments:
Anand on 8/28 said:
Right now all desktop parts have 6 EUs, all mobile parts have 12 EUs. There are no exceptions on the mobile side, there may be exceptions on the desktop side but from the information I have (and the performance I saw) this wasn't one of those exceptions.
 
Twice the units at half the frequency can run at a lower voltage and hence have lower power consumption.
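
That's the usual dynamic-power argument: P scales roughly with C*V^2*f, doubling the units roughly doubles the switched capacitance, but halving the clock lets you drop the voltage. Quick sketch with made-up but plausible voltages:

# Dynamic power ~ switched capacitance (proportional to unit count) * V^2 * f.
# The voltages below are illustrative values I picked, not Intel's numbers.
def relative_dyn_power(units, voltage, freq):
    return units * voltage ** 2 * freq   # arbitrary units

narrow_fast = relative_dyn_power(units=6,  voltage=1.10, freq=1.0)   # 6 EUs, full clock
wide_slow   = relative_dyn_power(units=12, voltage=0.85, freq=0.5)   # 12 EUs, half clock

# Same aggregate throughput (12 * 0.5 == 6 * 1.0), but lower power for the wide design.
print(narrow_fast, wide_slow)   # ~7.26 vs ~4.34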
 
IGP performance is a lot more important in notebooks than on the desktop, where there are far more options for discrete graphics cards and heat/power consumption isn't as much of a concern.

Sure, but why wouldn't you enable all 12 units on desktops, since the whole APU can draw as much as 130W, while you're limited to 45W at most on notebooks?
 
Sure, but why wouldn't you enable all 12 units on desktops, since the whole APU can draw as much as 130W, while you're limited to 45W at most on notebooks?
Die size? Most notebook CPUs are dual-core, so Intel can integrate a bigger IGP.
 