Trinity vs Ivy Bridge

They disappear if you clean it all the time, but seriously, who cleans a TV monitor twice a week? ..

You mean two times a week? If so, then I clean my display, keyboard, desk and laptop every day, sometimes a few times a day. Not with a liquid, though, just to remove the dust particles, of which there are plenty in a dry climate. ;)

:LOL:
 
Meanwhile, Sandy Bridge is still at OpenGL 3.1 instead of OpenGL 3.3. What could be the reason? Is it still Intel's unwillingness to support previous-gen hardware, or is my understanding that a DX10.1 card should be capable of OGL 3.3 wrong?
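In case anyone wants to see what their driver actually reports, here's a rough sketch in C (assuming GLFW 3 is installed; the build line is just my guess for a typical Linux setup, adjust for your platform). It only creates a hidden context and prints the version and renderer strings, nothing more:

/* Rough sketch: print the OpenGL version the installed driver exposes.
   Assumes GLFW 3; build with something like: gcc glver.c -lglfw -lGL */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Hidden window, just to get a context from the driver. */
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow *win = glfwCreateWindow(64, 64, "glver", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    printf("GL_VERSION : %s\n", glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}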
 
I am curious about how dedicated Intel is to supporting their recent IGPs. Their previous hardware was not supported well. They are bulletproof GUI accelerators, but you certainly can't count on the drivers for gaming.
 
It seems they are not that dedicated. They just dropped HD Graphics (1st gen) from first-rate driver support, and that is a two-year-old DX10-class GPU.
Usually they only support the two latest generations, with earlier ones only getting occasional bugfixes. You can't even count on them being bulletproof GUI accelerators, as some Flash bugs, for example, never got fixed (the latest official driver for the X3100 (GM965) is from 2009, and that is a DX10 GPU).
The good sign, however, is that the Win8 inbox drivers do include newer drivers for everything from the GMA 950 (945G) upwards. It is unknown whether non-inbox drivers will ever be available, though, and the inbox drivers include neither OpenGL support nor the control panel.
 
To be honest, previous gens were hopelessly slow, so it is better to stop developing drivers for them; only critical bugfix releases are necessary, if any. Even AMD will drop their HD 4000 series from the monthly driver release cycle. I bet it's similar to the X1800/X1900 series: they will drop it completely sooner rather than later.
 
Sandy Bridge CPUs number in the many millions and will still be in use a decade from now. There is a GPU in every model from Celeron to Core i7, barring oddities and socket 2011.

They should keep maintaining it, I think. They can easily drop the XP drivers, but they should do a good job on Vista/7/8 and make it run well on Windows 9.

First-gen Core i3, which didn't last that long on the market, they can drop.
If there's no decent driver support, in the end I will have to build or buy computers with NVIDIA ARM, I think; the only drawback is that they can't run x86 games.
NVIDIA might even port their proprietary blob Linux/Unix driver to ARM and maintain it for a decade :).
 
This is a pretty poor showing, being only about 19% faster in the graphics department than Ivy Bridge or Llano. I can't really say I am all that interested in it anymore :(.

On a second note, does anyone know how much Trinity is being held back by lack of bandwidth? It seems that quite a few benchmarks and games hit that fabled 50% faster mark, but a few others just fall flat.
 
It's 20% faster vs a 45W chip, and it's losing some games due to AMD's awful drivers; otherwise the overall gap would be huge. Being held back by the CPU isn't really helping it either.

Steamroller can't come fast enough - I really hope they've abandoned the Bulldozer core, because it's the wrong type of core for pushing graphics.
 
I wouldn't have expected this. The IVB dual-core draws 10 watts less, of course, but it also has two fewer cores. The GPU gap is closer than I thought.
 
You're looking at one of the worst-balanced chips in history. People thought SB was bad regarding the CPU/graphics balance, but this is even worse. Looking at AT's benchmarks you can see the actual potential of the 7660G in the Civ 5 benchmark, where it's basically twice as fast as the HD4000. For that to happen and for it to lose in other games is just ridiculous.

This Trinity should have been at least where the Sony Vaio in AT's benchmarks is, had it not been held back by a woefully inadequate CPU. The Bulldozer core is simply not good enough and is a massive limiter on the graphics.
 
I have seen several benchmarks in the 20 fps range; I doubt the CPU is a big bottleneck there. It could be a big issue for the 17W Trinity though, because it has only one module. Trinity has a GPU turbo, and it may be that the maximum frequency isn't applied often in games. Some much faster Civ 5 results are no surprise, since Llano showed the same behaviour.
 
Yep, it seems that no matter how good the iGPU gets in AMD's APUs, the CPU performance will keep hindering it.
That said, a Hybrid CrossFire Trinity+Turks scenario doesn't sound too good right now, as the CPU will be an even larger bottleneck.

Piledriver seems like a good effort at patching up Bulldozer, but it looks like the architecture just isn't competitive.
This seems to me like performing risky surgery with a long and painful recovery on a 92-year-old in order to extend his life by another 6 months... Is it worth it?

This doesn't bode well for AMD's roadmap, even less so if we take into consideration their "10-15% improvement in performance/watt every year" targets. That's simply not enough, considering Intel has 50-100% better performance/watt right now.


Oh well, after some 8 years it seems I'm going to purchase something with an Intel CPU after all...
 
I think Trinity looks very good considering that AMD is still using the same 32 nm process and the chip is pretty much the same size as Llano.

Compared to AMDs last generation (Llano):
+20% CPU performance
+20% GPU performance (gaming)
-15% power usage (normalized battery life)

I don't think we have seen gains this big (in performance per watt) from a new architecture since Pentium 4 -> Core 2. Trinity is a big improvement.
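Quick back-of-the-envelope on those numbers (my own arithmetic, not from any review): +20% performance at -15% power works out to roughly a 40% perf/watt gain in one generation on the same process:

/* Back-of-the-envelope perf/watt gain from the numbers above
   (+20% performance, -15% power) -- my own arithmetic. */
#include <stdio.h>

int main(void)
{
    double perf_ratio  = 1.20;   /* Trinity vs Llano, performance */
    double power_ratio = 0.85;   /* Trinity vs Llano, power draw  */

    /* perf/watt scales as performance divided by power */
    double gain = perf_ratio / power_ratio - 1.0;
    printf("perf/watt gain: ~%.0f%%\n", gain * 100.0);   /* prints ~41% */
    return 0;
}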
 
It's 20% faster vs a 45W chip, and it's losing some games due to AMD's awful drivers; otherwise the overall gap would be huge. Being held back by the CPU isn't really helping it either.
It's nothing to do with drivers. The TDP is shared between the GPU and CPU, and power will be steered to whichever element is most "hungry" in that app. You need to look at like-for-like APU power budgets.
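To illustrate what I mean (a toy model with made-up numbers, not how the real turbo/steering logic works): with one shared package budget, the hungrier block gets the bigger slice, so a CPU-heavy game eats into the GPU's headroom no matter what the drivers do:

/* Toy model of a shared APU power budget -- purely illustrative,
   made-up numbers; the real steering logic is far more involved. */
#include <stdio.h>

int main(void)
{
    double tdp        = 35.0;  /* package budget in watts (made up)        */
    double cpu_demand = 28.0;  /* what the CPU "wants" in a CPU-heavy game */
    double gpu_demand = 20.0;  /* what the GPU "wants" at full turbo       */

    /* Simple proportional steering: split the budget by demand. */
    double total = cpu_demand + gpu_demand;
    double cpu_w = tdp * cpu_demand / total;
    double gpu_w = tdp * gpu_demand / total;

    printf("CPU gets %.1f W, GPU gets %.1f W of the %.0f W budget\n",
           cpu_w, gpu_w, tdp);
    /* The hungrier block gets the larger share, leaving the GPU
       with less headroom regardless of driver quality. */
    return 0;
}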
 