Introduction to SGX

Lazy8s

Deferred Power -- http://mitrax.de/ -- published an article several months ago which outlines SGX, from its family members to its architecture and on to its end products.

It covers the configuration of the specific variants, the basics of PowerVR and Series5, and gives some insight into the processing performance of current end devices.

http://www.mitrax.de/?cont=artikel&aid=35

Note that it's written in German; Google Translate does a decent job.
 
Second SGX article

Thanks to Lazy8s for the link to the first article.

Here on my site Deferred Power -- www.mitrax.de -- you can find a new article: http://www.mitrax.de/?cont=artikel&aid=36 .

I tested the GMA500 on an MSI X320 netbook with both the Intel and the PowerVR driver.
There are two parts: the first shows the video capabilities, and the second shows some numbers for old games and other benchmarks.

OK, it is in German, but that's what we have Google Translate for, right? :)
 
I note that the article indicates that Pinetrail will have SGX in it. Although it's an absolutely perfect fit in my opinion, some Linux drivers I've seen suggested that the Pinetrail graphics would be a derivative of Intel's own graphics hardware, not IMG's. And the drivers I've seen that support both Poulsbo and Moorestown make no reference to Pinetrail.
 
There were some indications that Pineview would use SGX while Pinetrail wouldn't; does that seem plausible to you given what you saw in those drivers?

My understanding for a while has been that Pineview actually uses the same CPU/GPU/etc. SoC as Moorestown, but a different southbridge. Of course, that was before Intel revealed that Moorestown only supports a 32-bit LPDDR1/DDR2 interface, which seems hard to swallow in a netbook, although not impossible. We'll see; I'm really not sure anymore.
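(For some rough numbers, assuming say DDR2-800: a 32-bit interface works out to 4 bytes × 800 MT/s ≈ 3.2 GB/s, against the ≈ 6.4 GB/s of a typical 64-bit DDR2-800 netbook setup. The speed grade is just my assumption, but it shows why that spec is hard to swallow.)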
 

Pineview is supposed to be the CPU/GPU etc., and Pinetrail is the platform, i.e. Pineview + TigerPoint I/O chipset.

However, there are two Pinetrails being talked about, one for netbooks and one for nettops (which, as I recall, is basically a "desktop-lite" type of thing): Pinetrail-M and Pinetrail-D. I can't remember which one is which, and a dual-processor Pinetrail has also been mentioned in despatches.

I wish Intel would clear the whole thing up... their codenames thing has gotten to be just so confusing... and of course it would be great for IMG to have won a seat in some variation of it.

If you believe this article:
http://www.slashgear.com/knd-k1850-is-first-pine-trail-atom-nettop-1460403/

then a dual-core Pinetrail is actually out on the shelves in products.
 
Gah, been too long since I looked into those damn codenames. Anyway what I meant is:
Moorestown = CPU1 + SB1
Pinetrail-M = CPU1 + SB2
Pinetrail-D = CPU2 + SB2

Of course, I'm probably wrong. I'd just be VERY surprised if Intel bothered with taping-out four (!) different chips with nearly identical characteristics (Lincroft, Pinetrail-M, Pinetrail-D 1-core, Pinetrail-D 2-cores) - the real question is whether there are 2 or 3 of them IMO, with 4 not being very likely at all.
 
Wow, performance is horrible. You'd think it would be a bit better than that. I guess the 200 MHz clock speed isn't helping.

One word: drivers :cool:
That's something to blame Intel for; drivers are being supplied to Intel by Tungsten Graphics, while PowerVR has reference drivers for their IP that show far higher performance. Word has it that Tungsten isn't using the onboard firmware of the chip and it naturally falls back to software rendering.
 
Gah, been too long since I looked into those damn codenames. Anyway what I meant is:
Moorestown = CPU1 + SB1
Pinetrail-M = CPU1 + SB2
Pinetrail-D = CPU2 + SB2

Of course, I'm probably wrong. I'd just be VERY surprised if Intel bothered with taping-out four (!) different chips with nearly identical characteristics (Lincroft, Pinetrail-M, Pinetrail-D 1-core, Pinetrail-D 2-cores) - the real question is whether there are 2 or 3 of them IMO, with 4 not being very likely at all.

I know Fuad is often up the left when it comes to "scoops", but it appears his recent comment about Pinetrail using the GMA3150 is spot on.

A PC distributor in Canada has D410- and D510-based boards for sale (both are Pinetrail-based mini-ATX motherboards), and is citing the graphics on them as GMA3150; see lines 37 & 38 of the linked spreadsheet. This would leave Pinetrail-M as the only possible IMG-ed Pinetrail solution.

http://209.85.229.132/search?q=cache:G2TKeqskX5AJ:pub.supercom.ca/web/scomgw.nsf/(%24Att)/HUIS7WJSZA/%24File/Intel_Desktop_Board_Quick_Guide_(Oct09).xls%3FOpenElement+GMA3150&cd=19&hl=en&ct=clnk&gl=uk
 
And Intel's PR has the nerve to call IMG's IP inefficient when asked. How long exactly is that GenX crap going to plague us?
 
They say they "provide" them; not that they have them! :p
So when a customer signs a license agreement and says "we need a mature and comprehensive Windows Vista driver", they promise to deliver that on the contract and maybe consider starting work on it ;) (okay that's neither nice nor fair, but you get the point: it's easy to defend that kind of generic statement no matter the situation so I wouldn't take them as a serious indication of reality, one way or another. And I'm sure if Intel said they were willing to finance IMG to finish/polish/etc. their drivers, they'd gladly do it)
 
I get the funny feeling that IMG are pulling on Intel what Intel (through Tungsten) were pulling on their customers back in the day. I may be totally wrong, of course, but if I'm right - oh, the irony.
 
IMG may provide them; whether the customer (Intel, for example) decides to use them, or indeed asks for them to be provided, is another matter. Intel, like any big company, is full of politics.

I imagine, for example, that IMG supplied Apple with a lot of tech support for the iPhone drivers, and possibly provided the drivers that they are using, which appear to show more closely what MBX/SGX is capable of.
 
They say they "provide" them; not that they have them! :p
So when a customer signs a license agreement and says "we need a mature and comprehensive Windows Vista driver", they promise to deliver that on the contract and maybe consider starting work on it ;) (okay that's neither nice nor fair, but you get the point: it's easy to defend that kind of generic statement no matter the situation so I wouldn't take them as a serious indication of reality, one way or another. And I'm sure if Intel said they were willing to finance IMG to finish/polish/etc. their drivers, they'd gladly do it)

The story is more "we haven't been asked, thus we don't deliver (yet)". In hindsight, I'm willing to bet that if IMG sent Intel better drivers for free, Intel wouldn't use them. How long can you really beat a dead horse, or has Intel ever really cared about better drivers, even for their own IGPs, and I've missed something? Their execution on that matter has stunk for years now.
 
That's something to blame Intel for; drivers are being supplied to Intel by Tungsten Graphics, while PowerVR has reference drivers for their IP that show far higher performance.
Well, according to the article it really depends on the application. The article is incorrect that the Intel driver is not using TBDR at all. I suspect the reason archmark delivers those inflated scores with the PowerVR driver is that its detection of when it's really necessary to actually start rendering is a bit more clever, but it probably makes little difference in non-theoretical benchmarks (see the deferral sketch at the end of this post).

Word has it that Tungsten isn't using the onboard firmware of the chip and it naturally falls back to software rendering.
I think you're not understanding what the firmware is used for; in any case, this has nothing to do with "software rendering".

Also, the article states it's surprising to see hardware T&L being slower (or at least not faster) than software T&L. This is not surprising at all - why do you think Intel switches between sw/hw T&L on their i965-based chips (sure, the CPUs used there are faster, but so are the IGPs)... Also, hw T&L in D3D doesn't necessarily mean it's actually performed in hw and not just in the driver...
There seemed to be some other technical inaccuracies in the article, but I can't comment on that :).
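To make the TBDR/deferral point above a bit more concrete, here is a minimal, purely illustrative C++ sketch. The class and function names are made up and it is not based on the actual PowerVR or Tungsten driver code; the only idea it shows is that in a tile-based deferred renderer a draw call mostly bins geometry, and the expensive per-tile shading is postponed until something (a buffer swap, a read-back) forces the frame to be resolved. A synthetic test that rarely forces that resolve can therefore report numbers that say little about real rendering throughput.

```cpp
// Hypothetical illustration only - these names are made up and are not taken
// from the PowerVR or Tungsten driver code.
#include <vector>

struct Triangle { float x0, y0, x1, y1, x2, y2; };

class TileBasedRenderer {
public:
    // A "draw call" returns almost immediately: triangles are only sorted
    // into the bins of the screen tiles they touch. Nothing is shaded yet.
    void draw(const Triangle& t) {
        for (Tile& tile : tiles_)
            if (overlaps(t, tile))
                tile.bin.push_back(t);
        dirty_ = true;
    }

    // Only when the finished pixels are actually needed (buffer swap,
    // glReadPixels, ...) does the deferred pass run: per-tile hidden
    // surface removal and shading.
    void resolve() {
        if (!dirty_) return;                 // nothing pending - cheap no-op
        for (Tile& tile : tiles_) {
            shadeVisibleFragments(tile);     // the expensive part
            tile.bin.clear();
        }
        dirty_ = false;
    }

private:
    struct Tile { std::vector<Triangle> bin; };

    static bool overlaps(const Triangle&, const Tile&) { return true; } // stub
    static void shadeVisibleFragments(Tile&) {}                         // stub

    std::vector<Tile> tiles_ = std::vector<Tile>(64);  // e.g. an 8x8 tile grid
    bool dirty_ = false;
};

int main() {
    TileBasedRenderer r;
    for (int i = 0; i < 100000; ++i)
        r.draw({0.f, 0.f, 1.f, 0.f, 0.f, 1.f});  // "fast": just binning work
    r.resolve();  // the real rendering cost is paid here, e.g. at swap time
}
```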
 
Well, according to the article it really depends on the application. The article is incorrect that the Intel driver is not using TBDR at all. I suspect the reason archmark delivers those inflated scores with the PowerVR driver is that its detection of when it's really necessary to actually start rendering is a bit more clever, but it probably makes little difference in non-theoretical benchmarks.
I admit to not having fully read the article.
I think you're not understanding what the firmware is used for; in any case, this has nothing to do with "software rendering".
Yep, I should have been clearer on what I meant: the chip falls back to software vertex shading.
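On the sw/hw T&L point a couple of posts up, here is a minimal, generic D3D9 sketch of where that choice shows up in an application - ordinary sample-style code, nothing to do with the actual GMA500 drivers. The app picks a vertex-processing flag at device creation based on the reported caps, and, as mczak says, requesting "hardware" vertex processing only hands the work to the driver, which may still end up doing it on the CPU (the "falls back to software vertex shading" case).

```cpp
// Generic D3D9 sample-style code (not taken from the GMA500 drivers): the app
// asks the HAL device for its caps and then requests hardware or software
// vertex processing when creating the device.  Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // What does the HAL device claim to support?
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Typical app logic: ask for hardware vertex processing if the cap bit is set.
    DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                   ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                   : D3DCREATE_SOFTWARE_VERTEXPROCESSING;
    std::printf("Requesting %s vertex processing\n",
                (vp == D3DCREATE_HARDWARE_VERTEXPROCESSING) ? "hardware" : "software");

    // Note: even with the "hardware" flag, the work is merely handed to the
    // driver - which is still free to run the T&L on the CPU.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow    = GetConsoleWindow();  // any valid window works for a quick test

    IDirect3DDevice9* dev = nullptr;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                   pp.hDeviceWindow, vp, &pp, &dev);
    std::printf("CreateDevice returned 0x%08lX\n", static_cast<unsigned long>(hr));

    if (dev) dev->Release();
    d3d->Release();
    return 0;
}
```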
 