Ivy Bridge GPU details

Results from several synthetic tests under OpenCL:

[Four screenshots: synthetic OpenCL benchmark results]
 
That's a question of human psychology; everyone has his/her own criteria for satisfaction.
If you ask me personally, then my own 6870 is the absolute minimum for satisfaction.
For desktop use? That's craziness. These modern CPU-integrated GPUs are as capable as some not-so-low-end discrete cards of recent years. Hell, I can be happy using Aero on a GMA 950 for most desktop stuff.
 
The GMA that comes with my wife's ancient Atom N270 netbook is absolutely sufficient to drive Win7 Aero Glass on my Dell U2711 at 2048x1152 (the maximum resolution over its analog VGA output). She tools around on that box all day without complaint.

Sure, the Atom processor itself is pretty gutless, but she uses the entire Office 2010 suite, Quicken, and Internet Exploder v9 without fuss. She knows that my personal laptop is 'faster', but she also doesn't know how to engage the ATI card, so it just runs on the Intel HD graphics built into the Arrandale i5. She only notices the speed increase when working with the RAW files from our DSLR, which actually isn't graphics-subsystem intensive; it's all CPU work.

Both of my parents, both of my step-parents, both of my brothers-in-law, and my parents-in-law all use Intel integrated graphics on their various laptops and desktops without issue or complaint. I would know, because they call me when it IS slow :) So, yeah, my anecdotal evidence > yours when it comes to REAL perceptions about video speed in day-to-day life.
 
I never tried any Intel integrated graphics prior to Sandy Bridge, but that level is quite sufficient for general OS stuff, even at 2560x1440. Ivy Bridge, being considerably faster, would just be bonus cake on top... :)

Haswell, if it is as speedy as rumored, would even be decent gaming material, especially if coupled with some fast DDR3, say ~2.3GHz. I'm really curious to see what Intel will cook up for future CPUs; they really seem to be on a roll.

Considering CPU performance has been plateauing for a good while now, it stands to reason that most of the extra transistors made available by smaller processes - and hence an increasing part of the power budget - will go to the graphics co-processor in future CPUs, and that is quite exciting IMO.
 
My old 8500 GT was entirely insufficient for the Windows Vista version of Aero. I turned off Aero and things sped way, way up.
 
Quick heads up - I made two corrections to my article, pertaining to the L1 texture cache and implementation of the URB/L3.

DK
 
My old 8500 GT was entirely sufficient for the Windows Vista version of Aero. I turned on Aero and things sped way, way up.
 
I know you're parodying me, but please don't say stuff that would horribly confuse folks. There's no way that Aero can speed up the UI in and of itself...

It's not just Windows' UI that's accelerated; there's a whole bunch of extra processing.
 
In my experience, turning on a similar feature in a Linux distro does lead to unacceptable CPU overhead, but there you have bad drivers and a number of extra layers (toolkit library, X11, compositing).
On Windows you have excellent drivers, and the driver model (WDDM) was actually changed just to run Aero, so it's smooth. I've had Aero run with all animations disabled btw, which is a bit ridiculous but works. I've then run Windows 7 in classic mode and had to use an old 3rd-party program to change the colors; well, it was just as fast :p, with all indexing, prefetching etc. disabled.
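
For anyone curious, here's a minimal C sketch of how a program can ask whether DWM composition (i.e. Aero) is active, using the stock dwmapi call DwmIsCompositionEnabled (Vista/Win7-era API; link dwmapi.lib or -ldwmapi):

    /* Query the Desktop Window Manager: is composition (Aero) on? */
    #include <stdio.h>
    #include <windows.h>
    #include <dwmapi.h>

    int main(void)
    {
        BOOL enabled = FALSE;
        if (SUCCEEDED(DwmIsCompositionEnabled(&enabled)))
            printf("DWM composition (Aero): %s\n", enabled ? "on" : "off");
        else
            printf("DWM not available (pre-Vista, or the service is stopped).\n");
        return 0;
    }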

The 8500 GT was powerful too, I've seen it. It ran Doom 3 crazy fast.
 
Why are you discussing the 8500 GT so thoroughly? Is Ivy Bridge graphics performance on par with it?

Well, yes, the 8500 GT is suitable for light gaming, but the experience is not very pleasant. ;)
 
Win7 Aero is pretty lightweight. It runs ok on oldies like Radeon 9600 and the GMA 950.

The 8500 GT is around the speed of a 6600 GT or 9800 Pro for 3D games, I think. I recently baked an 8500 GT to resurrect it and tried some games. It overclocks very well, but there's not much there to overclock lol.
 
I know you're parodying me, but please don't say stuff that would horribly confuse folks. There's no way that Aero can speed up the UI in and of itself...

It's not just Windows' UI that's accelerated; there's a whole bunch of extra processing.

Huh? I didn't write that! I think someone must have done it as a joke when I left my PC on.
 
7300 GT. Somehow the naming scheme inflated the digits, and you got the 8500 GT with almost the same performance.

Although the 7300GT was a surprisingly fast card for the x3xx segment (roughly equalling the 6600GT and not as far behind the 7600GS as its name would suggest).
 
So Ivy Bridge will match Trinity at 17W? I find that a very optimistic claim. We're looking at 25%-40% in favour of Trinity methinks.
 
Are there any benchmarks or GPU specs available for the 17W version of Trinity?

Just the 2355 Vantage score from the AMD slide, as far as I know. However, based on what we know about Ivy so far, that should be enough.

On AnandTech ( http://www.anandtech.com/show/5772/mobile-ivy-bridge-and-asus-n56vm-preview/6 ) the 45W quad Ivy basically draws with the slower 35W Llano, which also has slower memory. However, notice how ridiculously far ahead it is in Vantage (43%) in the same review.

The 45W Ivy on Anand scores 4401, and there is another review showing a 35W Ivy scoring 3321 - http://www.laptopreviews.com/hp-elitebook-8470p-review-intel-ivy-bridge-shines-2012-04. We can sorta extrapolate that the 17W Ivy performance will be ~2000. Let's be very generous and say 2500 - it's still only just ahead of the 17W Trinity. Remember what happened to the 43% lead in Vantage when it came to gaming?
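
For what it's worth, here's the naive version of that extrapolation in C - just a straight line through the two quoted TDP/score points, which is only a sketch and not necessarily how scores really scale:

    /* Straight-line fit through the (45W, 4401) and (35W, 3321)
       Vantage scores quoted above, read off at 17W. */
    #include <stdio.h>

    static double line_at(double x1, double y1, double x2, double y2, double x)
    {
        return y1 + (y2 - y1) / (x2 - x1) * (x - x1);
    }

    int main(void)
    {
        double est = line_at(45.0, 4401.0, 35.0, 3321.0, 17.0);
        printf("Straight-line 17W estimate: %.0f\n", est); /* prints ~1377 */
        return 0;
    }

Note that a strictly linear fit lands well below ~2000, so even that figure already assumes performance holds up better than linearly at low TDP.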

David's comments would probably have been true had it been a 17W Llano, but it's not; it's a 17W Trinity. I would expect it to average ~33% faster than the 17W Ivy in gaming.
 
We can sorta extrapolate that the 17W Ivy performance will be ~2000. Let's be very generous and say 2500 - it's still only just ahead of the 17W Trinity.
I'm not quite convinced this extrapolation makes much sense. A 17W Ivy can reach nearly the same GPU clock as the 35W one. It might not hold that clock as often due to power draw, but I wouldn't trust any extrapolated numbers.
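
To illustrate why: if the score tracks sustained GPU clock rather than TDP, the estimate moves a lot. A sketch in C - the max-turbo clocks below are approximate HD 4000 specs (they vary by SKU), and the sustained-clock fractions are pure assumptions for illustration:

    /* Hypothetical clock-scaling model: score ~ sustained GPU clock. */
    #include <stdio.h>

    int main(void)
    {
        double score_35w = 3321.0;         /* measured, from the review      */
        double clk_35w   = 1250.0 * 0.95;  /* assumed: near-constant turbo   */
        double clk_17w   = 1150.0 * 0.75;  /* assumed: throttles more often  */

        printf("Clock-scaled 17W estimate: %.0f\n",
               score_35w * clk_17w / clk_35w); /* prints ~2412 */
        return 0;
    }

Under those (made-up) sustain fractions you land near 2400 - much closer to the "generous" 2500 above than to the straight-line number, which is exactly why the extrapolation is shaky.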

David's comments were probably true had it been a 17W Llano, but it's not it's a 17W Trinity. I would expect an average of 33% faster for it vs 17W Ivy in gaming.
Might be happening, though I haven't really seen enough concrete information on those 17W Trinity chips (model numbers, shader count etc. are leaked for 35/65/100W Trinitys, but nothing on the 17W ones).
 