Wii U hardware discussion and investigation *rename

PowerVR SGX was also demoed in 2006.



I'm pretty sure there's no wattage info for the 3DS GPU publicly available, and also not enough firm info on its "processing power" to make the kind of claim you just made.

Also very low resolution compared to what?



Can you explain to me how a single-core 543 at 200MHz would walk all over the 268MHz PICA200-based GPU in the 3DS? I mean, apart from the usual "it's super duper programmable" argument.

A PowerVR SGX 543 certainly was not demoed in 2006, so don't talk rubbish.
I can't believe you are trying to argue that a PICA200, with OpenGL ES 1.1, would outperform the SGX in any scenario.
For a start, the Citadel demo runs on a 535 just fine. A 543 would be much more useful than a 535; something much more impressive could be designed for it.

Are you forgetting the 3DS runs at a pathetic resolution of 320x240? (Something like that; it's 3D as well, but still very, very low.) Could you see it still running the same lighting effects at 720p?

What about something like N.O.V.A. 3? That's got plenty of lighting and effects, and it runs at 720p on smartphones very well. Remember, smartphones, like PCs, are not the most efficient things to program for, with terrible bloat; developers don't spend a lot of time making these games and they sell for a few quid. Even so, something like Max Payne 3 or GTA 3 would never run on a 3DS, even with its minimal overhead and coding-to-the-metal advantage.

The processing power was made available when it was announced. I'm on my phone so I'm not going to start searching for it, but off the top of my head it was an OpenGL ES 1.1-level spec: around 40 million triangles per second, and a texture fill rate of just over 1000 Mtexels/s at 200MHz. Hardly groundbreaking; in fact, as I pointed out, apart from a couple of nice effects and a much better fill rate, it seems more of a competitor to the original PSP.
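Back-of-the-envelope sketch of where a "just over 1000" fill-rate figure could come from; the texels-per-clock value is my assumption for illustration, not a confirmed PICA200 spec. It also shows how the same design shifts between the 200MHz spec-sheet clock and the 268MHz clock mentioned earlier in the thread.

[CODE]
# Hypothetical sketch: peak texture fill rate scales linearly with clock speed.
# The 4-texels-per-clock figure is assumed purely for illustration.
def fill_rate_mtexels(texels_per_clock, clock_mhz):
    """Peak fill rate in Mtexels/s = texels written per clock * clock in MHz."""
    return texels_per_clock * clock_mhz

print(fill_rate_mtexels(4, 200))  # 800 Mtexels/s at the 200 MHz figure quoted above
print(fill_rate_mtexels(4, 268))  # 1072 Mtexels/s at the 268 MHz clock cited earlier in the thread
[/CODE]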

Of course I am talking about just a single SGX 543; they could have doubled it up or clocked it much higher. I just lowballed the core count and frequency to try to match Nintendo's stinginess.

Power consumption: now, I don't know about PICA specifically, but we do know that the 3DS carries low-clocked ARM11 processors, a low-resolution, moderately sized main screen and a smaller low-res bottom screen, no multi-touch(?), only 128MB of RAM, no Android bogging it down and no LTE. Battery life should last at least a day; it doesn't. Poor design whichever way you look at it.

It's not even that cheap or thin either.

Wuu: I think you have been drinking the Kool-Aid if you really think we are getting a DX11-class GPU... no chance. As mentioned by others, a 4670 or something would be capable of OpenCL, and yes, that would be more advanced than Xenos. I'm not sure about performance, but sure enough performance will be in the same ballpark, give or take some effects here and there.

The 4xxx-class GPU is directly related to Xenos; the shaders, everything, is very similar. Xenos is a DX9 GPU, but it also has more advanced features than that, going beyond the DX9 feature set with unified shaders, a weak tessellator and many others. It was almost a DX10 GPU in everything but name, and it set the standard from which DX10 and later ATI GPUs were designed thereafter, all the way until the HD 6970, or even GCN in some cases.



As others have stated, if the machine were running vastly more advanced hardware, that would have shown in demos already, as is the custom with every new console demo... except Nintendo's.
 
Wuu: I think you have been drinking the Kool-Aid if you really think we are getting a DX11-class GPU... no chance. As mentioned by others, a 4670 or something would be capable of OpenCL, and yes, that would be more advanced than Xenos. I'm not sure about performance, but sure enough performance will be in the same ballpark, give or take some effects here and there.
I hope it's DX11 just for the sake of you having to eat your words here :LOL:
The 4xxx-class GPU is directly related to Xenos; the shaders, everything, is very similar. Xenos is a DX9 GPU, but it also has more advanced features than that, going beyond the DX9 feature set with unified shaders, a weak tessellator and many others. It was almost a DX10 GPU in everything but name, and it set the standard from which DX10 and later ATI GPUs were designed thereafter, all the way until the HD 6970, or even GCN in some cases.
No, they're not. Xenos shaders are similar to R520/R580 vertex shaders if anything; R600 and onwards are much further from them, starting with the fact that Xenos is Vec4+Scalar while everything from R600 until GCN is VLIW5 (or VLIW4 in the case of Trinity and Cayman).
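A rough way to picture the difference, as a purely illustrative sketch (not taken from any ATI documentation): a Vec4+Scalar ALU co-issues one 4-wide vector op plus one scalar op each clock, while a VLIW5 unit issues up to five independent slots per clock that the compiler has to pack into a single instruction word.

[CODE]
# Illustrative only: per-clock peak FLOPs of the two ALU layouts,
# counting a multiply-add as 2 FLOPs (the usual marketing convention).

def vec4_scalar_flops_per_clock():
    # one vec4 MADD (4 lanes) plus one scalar MADD co-issued alongside it
    return (4 + 1) * 2

def vliw5_flops_per_clock():
    # five independent slots, each doing a MADD,
    # but only when the compiler manages to fill all five
    return 5 * 2

print(vec4_scalar_flops_per_clock())  # 10 FLOPs/clock per Xenos-style ALU
print(vliw5_flops_per_clock())        # 10 FLOPs/clock per R600-style ALU, at best
[/CODE]

The peak comes out the same per ALU either way; the practical difference is that Vec4+Scalar maps directly onto vec4 shader code, whereas VLIW5 depends on the compiler finding five independent operations to pack, so the two diverge in utilisation rather than in peak rate.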
 
What I said was that the overall architecture is very similar. Anand attested to this in his Northern Islands review, and Beyond3D also had a very good article about Xenos a few years ago citing the new architecture and the advantages of unified shaders. The design is very, very similar all the way up until VLIW4 Northern Islands. I never said it was a carbon copy, but it is the same lineage.
 
What I said was that the overall architecture is very similar. Anand attested to this in his Northern Islands review, and Beyond3D also had a very good article about Xenos a few years ago citing the new architecture and the advantages of unified shaders. The design is very, very similar all the way up until VLIW4 Northern Islands. I never said it was a carbon copy, but it is the same lineage.

"Same lineage" is limited pretty much to the fact they're all unified shaders, the shader units themselves are more related to R5xx-series than R6xx or newer, which also is clear from B3D's Xenos Demystified article (Vec4+Scalar, just like R5xx vertex shaders). They're not SM4.0 capable either even though they at least in parts surpass SM3.0 requirements
 
Well, I've heard many people more distinguished than myself say they are related. Here are some references...
http://en.wikipedia.org/wiki/R600_(ASIC)

To be fair, this article backs up what you have said regarding shaders, although it says that they are related and that R600 is an evolution rather than a revolution. It also mentions other similarities, such as the tessellation, amongst others...
http://www.bit-tech.net/hardware/graphics/2007/05/16/r600_ati_radeon_hd_2900_xt/6

"AMD's Cayman architecture marks their first departure from VLIW-5 since its introduction with the Xenos processor used in the Xbox 360, then shortly thereafter in the R600 equipped HD 2900 XT."
http://www.rage3d.com/reviews/video/amd_hd6970_hd6950_launch_review/index.php?p=2

I could go on, but we're off topic, lol. Basically, you're correct when you say some changes were made from Xenos through R600 to Cayman, but I was also correct in saying that Xenos was the start of something, and what followed was more refinement than a complete change. Anyway, an HD 4000-class GPU will obviously have more efficient shaders, amongst other improvements, but there may not be enough of them to make a big leap over the Xbox 360.

That's my take :)
 
Well, I've heard many people more distinguished than myself say they are related. Here are some references...
http://en.wikipedia.org/wiki/R600_(ASIC)
"is based on", just like you could say from pretty much any GPU out there that it's based on it's "predecessor" though Xenos isn't really really the predecessor, rather just earlier GPU, the fact they both have unified shaders and a tesselator is pretty much where it ends [excluding the typical features of any GPU out there]
To be fair, this article backs up what you have said regarding shaders, although it says that they are related and that R600 is an evolution rather than a revolution. It also mentions other similarities, such as the tessellation, amongst others...
http://www.bit-tech.net/hardware/graphics/2007/05/16/r600_ati_radeon_hd_2900_xt/6
You could see every GPU out there as an evolution of the previous one; you could see unified shaders as an evolution of separate shaders (as their instruction sets grew more and more similar anyway), etc.

"AMD's Cayman architecture marks their first departure from VLIW-5 since its introduction with the Xenos processor used in the Xbox 360, then shortly thereafter in the R600 equipped HD 2900 XT."
http://www.rage3d.com/reviews/video/amd_hd6970_hd6950_launch_review/index.php?p=2
They're wrong; Xenos is Vec4+Scalar, not VLIW5.
I could go on, but we're off topic, lol. Basically, you're correct when you say some changes were made from Xenos through R600 to Cayman, but I was also correct in saying that Xenos was the start of something, and what followed was more refinement than a complete change. Anyway, an HD 4000-class GPU will obviously have more efficient shaders, amongst other improvements, but there may not be enough of them to make a big leap over the Xbox 360.

That's my take :)
It was only the start of unified shaders (and the tessellator). The DX specs dictated that in DX10 the shaders must have unified feature sets anyway, so using completely unified shaders was the only logical choice. nVidia used unified shaders too, but that doesn't mean they're related to Xenos' Vec4+Scalar unified shaders or R600+'s VLIW5 ;)
 
Yes, you could say that about anything; technically we're all related to sea creatures if you go that far back, but that's being silly :)

I think what distinguishes this, though, is the fact that they are designed by the same company. If you discount any changes from one generation to another, then nothing would ever be related, would it? That's clearly not true. Bringing this back to Nintendo, you could say the GameCube's 'Flipper' is related to the Wii's 'Hollywood' chip... or are you saying different? :)
 
Does anyone else remember the PowerVR village benchmark? :)

What's impressive about Epic Citadel graphically, in comparison to 3DS games, is the resolution and the texture quality. The texture quality comes down to memory and the resolution mostly comes down to the screen. After all, the SGX 535's fillrate is less than half that of the GPU in the 3DS. Adding in extra hidden surface removal efficiency with a TBDR in a pretty perfect scenario like the village area of Epic Citadel will bring its effective fillrate up, but not to the point where it's well past that of the 3DS's GPU. Also, not all games can be designed to be the perfect scenario for your particular GPU (lots of opaque overdraw).
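To put a rough number on that "effective fillrate" point (my own illustrative arithmetic with made-up figures, not measured SGX 535 or PICA200 data): a TBDR only shades the pixels that survive hidden surface removal, so opaque overdraw effectively multiplies its throughput relative to an immediate-mode renderer.

[CODE]
# Illustrative only: both the raw fillrate and the overdraw factor are assumptions.
def effective_fillrate_tbdr(raw_mpix_s, opaque_overdraw):
    # a tile-based deferred renderer rejects hidden opaque pixels before shading,
    # so its effective rate scales with how much opaque overdraw the scene has
    return raw_mpix_s * opaque_overdraw

raw = 400          # assumed raw fillrate in Mpixels/s
overdraw = 2.5     # assumed average opaque overdraw in a near-ideal scene
print(effective_fillrate_tbdr(raw, overdraw))  # 1000 Mpixels/s effective, best case
[/CODE]

The multiplier only exists when the scene actually has that much opaque overdraw; take the overdraw away and the TBDR falls back to its raw rate, which is the caveat above.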

What's particularly unimpressive is the total lack of dynamic shadows or lighting. This seems to be a recurring trait of any game on single-core SGX GPUs. I've never seen a game on a single-core version of that GPU throw around even 3DS-level effects such as lighting, shadows, particles, etc.; even Epic just focused on high resolution and static everything. No doubt SGX has its strengths, but AFAICS it also has its weaknesses compared to a GPU like the PICA200. I find the whole assumption that it's just better in every way and would automatically walk all over the PICA200 quite silly.

I'd have loved to see a dual-core SGX 543 in the 3DS though, but it wasn't to be.

SGX543MP2 would've been too much (too big, too power-hungry for continuous use) for a handheld being mass-produced in Q4 2010, but I think we can all assume that even an underclocked Tegra 2 (to 600-750MHz CPU, 266MHz GPU?) would run circles around the 3DS' PICA200 + 2*ARM11 @ 268MHz combo.

At some point, it was rumoured that Nintendo was looking at Tegra 2 for the 3DS. For some reason, they rejected it and went with the pile of ... that's inside the handheld now.

It might have been able to play some downgraded PS360 titles @ 800*240 with reduced assets, or at least multiplatform titles from the PSP.

Hooray for Nintendo's hardware choices. Reminds me of Nokia's S60v5/Symbian^1 hardware choices. :idea:
And look how well that turned out.
 
Yes, you could say that about anything; technically we're all related to sea creatures if you go that far back, but that's being silly :)

I think what distinguishes this, though, is the fact that they are designed by the same company. If you discount any changes from one generation to another, then nothing would ever be related, would it? That's clearly not true. Bringing this back to Nintendo, you could say the GameCube's 'Flipper' is related to the Wii's 'Hollywood' chip... or are you saying different? :)

No, I'm not saying any different about Flipper and Hollywood, but Xenos vs. R600 is a completely different comparison.
R5xx vertex shaders are similar to Xenos' in a functional sense, with small upgrades here and there.
Xenos and R600 shaders are similar in the sense that they're both unified, but that's it. The shaders themselves are completely different: one is Vec4+Scalar, similar to the vertex shaders from R5xx; the other is VLIW5, completely different from any previous ATI/AMD chip.
 
Well, fair enough on the shaders being superscalar in R600, but the rest of the chip must also share design elements. You're not telling me they have only a unified-shader resemblance and a tessellator in name only? Seeing as R600 was incredibly late, I doubt very much that they designed Xenos completely as a one-off, then went back to the drawing board and redesigned a whole new architecture without carrying many of the ideas over. Are you?

The way I take it, Xenos was a stepping stone towards DX10, with many features in place, but also, as you point out, with an overhaul to the shaders and some other improvements. Basically, whilst not the same, they are related in more than name only.
 
Well, fair enough on the shaders being superscalar in R600, but the rest of the chip must also share design elements. You're not telling me they have only a unified-shader resemblance and a tessellator in name only? Seeing as R600 was incredibly late, I doubt very much that they designed Xenos completely as a one-off, then went back to the drawing board and redesigned a whole new architecture without carrying many of the ideas over. Are you?

The way I take it, Xenos was a stepping stone towards DX10, with many features in place, but also, as you point out, with an overhaul to the shaders and some other improvements. Basically, whilst not the same, they are related in more than name only.



AFAICT, the development of Xenos and R600 took different paths earlier than you might think.
Apart from the tessellator, Xenos' architecture seems to be as different from R600 as it is from nVidia's G80/G92.

I guess they realized Xenos' Vec4+Scalar architecture was better suited to a lighter and more power-efficient form factor, whereas VLIW5 proved better for more powerful PC usage.
For example, the OpenGL ES 2.0 Imageon chips were developed with the same Vec4+Scalar shader arrangement as Xenos.
The ATI Imageon Z460 has a single Vec4+Scalar ALU, where Xenos has 48 of those.
I also remember that this GPU at some point was called "mini-Xenos".
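Just to give a feel for the scale difference between the two, here is a hedged back-of-the-envelope calculation: the Z460 clock below is my assumption, while Xenos' 500 MHz is the commonly quoted figure.

[CODE]
# Rough GFLOPS comparison; a MADD counted as 2 FLOPs, (4+1) lanes per Vec4+Scalar ALU.
def gflops(num_alus, clock_mhz):
    return num_alus * (4 + 1) * 2 * clock_mhz / 1000.0

print(gflops(48, 500))   # ~240 GFLOPS for Xenos at its commonly quoted 500 MHz
print(gflops(1, 133))    # ~1.3 GFLOPS for a single ALU at an assumed 133 MHz mobile clock
[/CODE]

So even with the same ALU design, the mobile part would be roughly two orders of magnitude down on raw shader throughput.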


After Qualcomm bought the low-power Imageon GPU + video processor division from AMD, they renamed the branding to Adreno (which is an anagram of "Radeon"), so the Imageon Z460 became the Adreno 200, with its successors using the same naming scheme.
I don't know for sure, but the latest Adreno 320 may still be using Xenos' Vec4+Scalar arrangement, although this time with DX11 + Halti compliance.

It might also have to do with each architecture's origins. The large eDRAM in Xenos suggests it may have more roots in BitBoys than in ATI. Even though ATI only made the BitBoys purchase official in 2006 (a year after the X360 was released), there might have been previous cooperative work, and the sale might have been planned years before that.

For example, the GameCube had an ATI sticker on it, but AFAIK no one from ATI ever worked on Flipper during its development. That GPU was designed entirely by ArtX, which was bought by ATI right before the GameCube launched.


That said, Xenos might have been developed mostly by BitBoys, who were then purchased by ATI to do the Imageon chips (which use the same shader arrangement as Xenos), and then were bought again by Qualcomm and are now working on the Adreno 3xx/4xx.

Possible proof of that is the large eDRAM (introduced by BitBoys) in Xenos and the Vec4+Scalar shader arrangement still being used by the BitBoys (Adreno) division today, while ATI/AMD GPUs moved to VLIW5, then VLIW4 and then GCN.


Some people here know whether this is true or not, but I don't know if they'll say anything about it ;)
 
I really don't see how using eDRAM is a sign of any BitBoys relation. Just because they were bought later doesn't mean they were involved in any way. Microsoft wanted bandwidth, and eDRAM was a natural choice.
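For what it's worth, the bandwidth argument is easy to sketch. The workload assumptions below (overdraw, bytes touched per sample) are mine and purely illustrative; the 22.4 GB/s main-memory figure is the commonly quoted X360 spec.

[CODE]
# Rough framebuffer-traffic estimate for 720p with 4x MSAA, alpha blending and Z.
# All workload assumptions are illustrative, not measured.
pixels      = 1280 * 720
samples     = pixels * 4          # 4x MSAA
bytes_per_s = 16                  # read+write colour (blend) and read+write Z, 4 bytes each
overdraw    = 4                   # assumed average overdraw per frame
fps         = 60

gb_per_sec = samples * bytes_per_s * overdraw * fps / 1e9
print(round(gb_per_sec, 1))       # ~14.2 GB/s of framebuffer traffic alone,
                                  # against 22.4 GB/s of total GDDR3 bandwidth on the 360
[/CODE]

Keeping that traffic inside the eDRAM, where the ROPs have their own wide local path, leaves the external bus free for textures and the CPU, which is the "natural choice" point above.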
 
I really don't see how using eDRAM is a sign of any BitBoys relation. Just because they were bought later doesn't mean they were involved in any way. Microsoft wanted bandwidth, and eDRAM was a natural choice.

Were there other GPUs before Xenos, apart from Glaze3D, that used eDRAM?


So the BitBoys team inheriting Xenos' shader arrangement for their mobile GPUs, being purchased by ATI near the release of the X360, and Xenos using eDRAM are all just coincidences?
 
I guess they realized Xenos' Vec4+Scalar architecture was better suited to a lighter and more power-efficient form factor, whereas VLIW5 proved better for more powerful PC usage.

Was it that, or did VLIW5 give them the highest claimable peak FLOP rate in the run-up to a teraflop?

And now that they're beyond a teraflop, haven't they gone back to Vec4+Scalar for GCN?

By the way, that's not to say anything is bad about VLIW5 or their choice to use it.
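For reference, this is the sort of arithmetic behind those claimable peak numbers, using the HD 5870's commonly quoted figures as the worked example:

[CODE]
# Peak-FLOP arithmetic behind a VLIW5 marketing number (HD 5870 as the example).
vliw5_units    = 320      # 1600 "stream processors" / 5 slots per VLIW5 unit
flops_per_slot = 2        # multiply-add counted as two FLOPs
clock_ghz      = 0.85     # clock in GHz, so the product comes out in GFLOPS

peak_tflops = vliw5_units * 5 * flops_per_slot * clock_ghz / 1000
print(peak_tflops)        # 2.72 TFLOPS, assuming every fifth slot is filled every clock
[/CODE]

The catch is in that last comment: the headline number assumes perfect slot packing, which real shader code rarely achieves.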
 
Were there other GPUs before Xenos, apart from Glaze3D, that used eDRAM?

PS2 and Gamecube...

So the BitBoys team inheriting Xenos' shader arrangement for their mobile GPUs, being purchased by ATI near the release of the X360, and Xenos using eDRAM are all just coincidences?
What do you mean, inheriting their shader arrangement? Xenos' ALUs are basically the vertex shaders from R300 (vec4 + 1 scalar) with expanded capabilities to comply with unified shading and updated DX specs. How do you figure they would go and make a high-powered DX9+ GPU from BitBoys' mobile GPU architecture? That's ludicrous.

You're jumping to conclusions.
 
What about something like N.O.V.A. 3? That's got plenty of lighting and effects, and it runs at 720p on smartphones very well. Remember, smartphones, like PCs, are not the most efficient things to program for, with terrible bloat; developers don't spend a lot of time making these games and they sell for a few quid. Even so, something like Max Payne 3 or GTA 3 would never run on a 3DS, even with its minimal overhead and coding-to-the-metal advantage.
Why do you think that?
 
I thought Xenos was primarily developed by the team that ArtX turned into, or was that R300?

At least according to the article, the group that worked on it was separate from ArtX. There was the R400 prototype (2003), which I think was scrapped when they got ambitious with the feature set, and it later evolved into R500/C1.
 
GTA 3 barely runs on my (slightly overclocked) Samsung Galaxy 1... though I cannot compare the two, really... It does run... but in a lot of cases, it starts chugging along HEAVILY. And I guess sometimes it's the RAM/ROM (small RAM and slow-ish ROM) and sometimes the CPU/GPU in draw heavy scenes.
 
AFAICT, the development of Xenos and R600 took different paths earlier than you might think.
Apart from the tessellator, Xenos' architecture seems to be as different from R600 as it is from nVidia's G80/G92.

I guess they realized Xenos' Vec4+Scalar architecture was better suited to a lighter and more power-efficient form factor, whereas VLIW5 proved better for more powerful PC usage.
For example, the OpenGL ES 2.0 Imageon chips were developed with the same Vec4+Scalar shader arrangement as Xenos.
The ATI Imageon Z460 has a single Vec4+Scalar ALU, where Xenos has 48 of those.
I also remember that this GPU at some point was called "mini-Xenos".


After Qualcomm bought the low-power Imageon GPU + video processor division from AMD, they renamed the branding to Adreno (which is an anagram of "Radeon"), so the Imageon Z460 became the Adreno 200, with its successors using the same naming scheme.
I don't know for sure, but the latest Adreno 320 may still be using Xenos' Vec4+Scalar arrangement, although this time with DX11 + Halti compliance.

It might also have to do with each architecture's origins. The large eDRAM in Xenos suggests it may have more roots in BitBoys than in ATI. Even though ATI only made the BitBoys purchase official in 2006 (a year after the X360 was released), there might have been previous cooperative work, and the sale might have been planned years before that.

For example, the GameCube had an ATI sticker on it, but AFAIK no one from ATI ever worked on Flipper during its development. That GPU was designed entirely by ArtX, which was bought by ATI right before the GameCube launched.


That said, Xenos might have been developed mostly by BitBoys, who were then purchased by ATI to do the Imageon chips (which use the same shader arrangement as Xenos), and then were bought again by Qualcomm and are now working on the Adreno 3xx/4xx.

Possible proof of that is the large eDRAM (introduced by BitBoys) in Xenos and the Vec4+Scalar shader arrangement still being used by the BitBoys (Adreno) division today, while ATI/AMD GPUs moved to VLIW5, then VLIW4 and then GCN.


Some people here know whether this is true or not, but I don't know if they'll say anything about it ;)
Bitboys had no involvement in Xenos, and R600's development didn't start until Xenos was done. R600 started from the Xenos code base, though the shader core was gutted to add VLIW5, instruction caching, etc. The Imageon part that became Adreno also started from the Xenos code base and didn't share code with R600.

I thought Xenos was primarily developed by the team that ArtX turned into, or was that R300?
R300. Xenos was developed on the East Coast and in Canada.
 