Wii U hardware discussion and investigation

Can the PowerVR Rogue in the 2013 iPad potentially topple its rival, the Wii U? The bump is rumored to be almost 10x over the current A5X.

I'm pretty sure that's near impossible.
And the next-next iPad can only be faster than the Wii U if the console manages to undershoot even the lowest expectations we've seen so far.
 
Yes, but an AMD spokesperson said in an interview with Golem (a large German tech site) at E3 2011 that the GPU was not based on any existing Radeon chip.

And Jen-hsun and Kutaragi both claimed that the RSX was a custom part.
Seriously, how could an AMD spokesperson say anything else? "Yes, it's based on HDXYZ" would of course completely break all NDAs, and be a terrible blunder.

Being based on AMD Radeon tech isn't a problem, as it is arguably about as good as it gets for stationary console purposes. Why on earth should they spend major effort reinventing the wheel? It would be a waste of time, money and engineering resources. There are a few things that are unnecessary on consoles and can be cut from a desktop design to increase efficiency, and there is going to be a direct communication path to the CPU that is not PCI Express. So off the shelf, no, but of course it's going to be based on what they have. And that's a good thing.
 
So putting it all together, I could definitely see Nintendo shipping something like this:
CPU:
Three cores from the embedded line
OoO, no SMT
2MB of eDRAM (basically the L2 cache)
CPU clock set anywhere between, say, 1.5 and 2GHz
64-bit bus to 512MB of RAM
RAM would be DDR3, clock speed anywhere between 533 and 800MHz
Bandwidth to the main RAM anywhere between 8.5GB/s and 12.8GB/s
Between 10 and 15 watts

(I'm using the info from this page because I believe that if that kind of budget RAM is relevant to AMD's low-end GPUs, it is relevant to Nintendo too.)


GPU:

Caicos / HD 6450 / HD 6400M
160 stream processing units
8 texture units
16 Z/stencil ROP units
4 colour ROP units
64-bit bus to 256MB of VRAM
RAM would be GDDR5, clock speed 800MHz
Bandwidth to the VRAM would be 25.6GB/s
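The bandwidth figures in these specs fall straight out of bus width times effective transfer rate (DDR3 transfers twice per memory clock, GDDR5 four times). A quick sketch of the arithmetic:

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate.
# DDR3 is double-pumped (2 transfers/clock); GDDR5 is quad-pumped (4).

def peak_bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
    """Theoretical peak bandwidth in GB/s."""
    bytes_per_transfer = bus_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(peak_bandwidth_gbs(64, 533, 2))  # CPU, DDR3-1066 -> 8.528
print(peak_bandwidth_gbs(64, 800, 2))  # CPU, DDR3-1600 -> 12.8
print(peak_bandwidth_gbs(64, 800, 4))  # GPU, GDDR5 -> 25.6
```

These are theoretical peaks; sustained bandwidth in practice comes in well below them.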
------------------------

That's pretty much it. eDRAM is too complicated for rendering, especially as the Wii U controller ups the framebuffer requirement (two of them: the TV screen and the controller screen).
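To put a rough number on that extra framebuffer pressure, here is a sketch assuming a 1280x720 TV image and an 854x480 controller screen at 32bpp (the controller panel resolution is an assumption, not a confirmed spec):

```python
# Rough framebuffer footprint when rendering to two displays at once.

def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Size of one colour buffer at 32bpp."""
    return width * height * bytes_per_pixel

tv = framebuffer_bytes(1280, 720)  # 3,686,400 bytes, ~3.5 MB
pad = framebuffer_bytes(854, 480)  # 1,639,680 bytes, ~1.6 MB

# Double-buffered on both outputs, colour only (no Z):
total_mb = 2 * (tv + pad) / 2**20
print(round(total_mb, 1))  # -> 10.2
```

Around 10 MB of colour buffers alone, before Z and any MSAA, which is why a small eDRAM pool gets awkward once the controller screen is added.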

I know fans will try to kill me, but I believe it makes a lot of sense: 256MB may make up for the higher cost of GDDR5 vs DDR3, and Nintendo needs that bandwidth.
512MB freed from the framebuffer requirement should be an improvement over the PS3.
The system would consist of two tiny, cool chips; max TDP would be 35 watts. Passive cooling and a single fan in the box for the power supply, the chips, etc. should do the trick.
The cost, using numbers given by AMD and extrapolating for the CPU, would be north of $80 including the memory chips.

Trying to analyse/predict something without doing any research into what's already known on the subject is never a good idea.
 
And Jen-hsun and Kutaragi both claimed that the RSX was a custom part.
Seriously, how could an AMD spokesperson say anything else? "Yes, it's based on HDXYZ" would of course completely break all NDAs, and be a terrible blunder.

Being based on AMD Radeon tech isn't a problem, as it is arguably about as good as it gets for stationary console purposes. Why on earth should they spend major effort reinventing the wheel? It would be a waste of time, money and engineering resources. There are a few things that are unnecessary on consoles and can be cut from a desktop design to increase efficiency, and there is going to be a direct communication path to the CPU that is not PCI Express. So off the shelf, no, but of course it's going to be based on what they have. And that's a good thing.

Earlier in this thread, it was revealed that there is some fixed-function aspect as an aside to the standard unified shader architecture. Someone has, then, decided to "reinvent the wheel" and has put some heavy customization into a chip. This work was done from 2009-2011.
 
Earlier in this thread, it was revealed that there is some fixed-function aspect as an aside to the standard unified shader architecture. Someone has, then, decided to "reinvent the wheel" and has put some heavy customization into a chip. This work was done from 2009-2011.

Provided you actually believe that.

A broader issue is semantics: how much does a GPU have to change before you can legitimately call it "new"? But the statement that the Wii U GPU won't be based on an existing chip doesn't make sense unless AMD has designed a raytracing chip for the console. It's NDA talk.
 
Earlier in this thread, it was revealed that there is some fixed-function aspect as an aside to the standard unified shader architecture. Someone has, then, decided to "reinvent the wheel" and has put some heavy customization into a chip. This work was done from 2009-2011.

You probably mean "speculated" rather than "revealed".
 
And Jen-hsun and Kutaragi both claimed that the RSX was a custom part.
Seriously, how could an AMD spokesperson say anything else? "Yes, it's based on HDXYZ" would of course completely break all NDAs, and be a terrible blunder.

Being based on AMD Radeon tech isn't a problem, as it is arguably about as good as it gets for stationary console purposes. Why on earth should they spend major effort reinventing the wheel? It would be a waste of time, money and engineering resources. There are a few things that are unnecessary on consoles and can be cut from a desktop design to increase efficiency, and there is going to be a direct communication path to the CPU that is not PCI Express. So off the shelf, no, but of course it's going to be based on what they have. And that's a good thing.
Custom just means it's a unique ASIC and not the same chip as you'd buy for your PC. The word says nothing about how much was customized. So RSX was custom even if they did nothing other than change the memory controller.
 
Custom just means it's a unique ASIC and not the same chip as you'd buy for your PC. The word says nothing about how much was customized. So RSX was custom even if they did nothing other than change the memory controller.

Then again, it's still pretty much a G71 with lower memory bandwidth, and the performance of the desktop part is a good indicator of what the RSX can do.

So yeah, if it's basically a Turks with an ARM core and audio DSP we can still assume most of its capabilities will be equivalent to what a Turks can do (unless the CPU is doing lots of shader work like what happens in the PS3).
 
So yeah, if it's basically a Turks with an ARM core and audio DSP we can still assume most of its capabilities will be equivalent to what a Turks can do (unless the CPU is doing lots of shader work like what happens in the PS3).


Yes and no.

The audio DSP (if true) would give better sound and, IIRC from some slides of typical console CPU (Xenon?) workloads, could offload 8-10% of a console CPU, freeing it for visuals/physics/AI, whatever.

It can have a pretty standard GPU, but if it does have some fixed-function units or co-processors, one can only guess what they do/offload and what the console can do (just like the PS3, which can do much more than you'd ever guess from the RSX alone).

Personally I would be quite happy with some fixed function units.
 
Yes and no.

The audio DSP (if true) would give better sound and, IIRC from some slides of typical console CPU (Xenon?) workloads, could offload 8-10% of a console CPU, freeing it for visuals/physics/AI, whatever.

It can have a pretty standard GPU, but if it does have some fixed-function units or co-processors, one can only guess what they do/offload and what the console can do (just like the PS3, which can do much more than you'd ever guess from the RSX alone).

Personally I would be quite happy with some fixed function units.

I wasn't commenting on the fact that it may or may not have fixed function, custom parts in it.
I just said that if it's as "custom" as RSX was, then the difference from the desktop counterpart won't be very large.

So the fact that they said it's a custom part isn't 100% proof that the GPU is altered in a way that it would make a difference in performance-per-clock, compared to the laptop/desktop counterpart.
 
Perhaps I'm reading too much into the 4cm fan and the 45nm process, but given just how low I think Nintendo needs the BoM to be, and how limited the cooling will be, I'm just not expecting that much. I think it can get into the same ballpark as the 360 while drawing substantially less power, because they should be able to save loads of power on the CPU by clocking lower.
Devkits (E3 demo units in a black metal case) have two fans as far as I can tell - one 50mm or 60mm intake and one 40mm exhaust. It's possible the actual console uses a radial intake fan plus 40mm axial exhaust.
 
If it uses a radial fan, having another fan in-line with the flow would just act as an air brake. Radials have quite high static pressure; a second fan would not be necessary.
 
Custom just means it's a unique ASIC and not the same chip as you'd buy for your PC. The word says nothing about how much was customized. So RSX was custom even if they did nothing other than change the memory controller.

Yes.
The word "custom" carries an air of the exotic, performance and particular care - glamour if you wish. In that context it is understandable that you don't go out of your way to clarify that your "customization" consisted of cutting ROPs and memory bus in half. :)

So while saying that the Wii U GPU is a custom job is going to be true by definition, saying that it isn't based on other AMD products is guaranteed to be false; even ATI/AMD's desktop products show very large similarities between generations. They typically introduce a new series every year, and most building blocks of the chips are either unchanged or somewhat refined, just as it is with CPUs these days.
When some Hot Newness is introduced and the new stuff is presented and hyped, do the thought experiment of subtracting that from the whole and look at its shadow instead: all the things that were left as is.
That's not meant as negative criticism at all; evolution is a powerful force.
 
I'm pretty sure that's near impossible.
And the next-next iPad can only be faster than the Wii U if the console manages to undershoot even the lowest expectations we've seen so far.

The current SGX543 mobile chip designs date from 2008-2009, just with more cores. A lot of people seemingly don't know how big a deal Rogue will be. It will bring mobile graphics to parity with current-gen consoles: from tens of GFLOPS to hundreds. It will absolutely be "on par" with the Wii U next spring.

People will be shocked at the graphics it will push, and pundits will claim doom for consoles. There is no reason for Activision or EA not to put big titles there once Apple provides a new controller.
 
The current SGX543 mobile chip designs date from 2008-2009, just with more cores. A lot of people seemingly don't know how big a deal Rogue will be. It will bring mobile graphics to parity with current-gen consoles: from tens of GFLOPS to hundreds. It will absolutely be "on par" with the Wii U next spring.

People will be shocked at the graphics it will push, and pundits will claim doom for consoles. There is no reason for Activision or EA not to put big titles there once Apple provides a new controller.

I totally agree... Rogue is a quantum leap over the SGX543. Plus, let's not forget the A6X will have quad-channel LPDDR3, maybe 2GB of RAM, at least two Cortex-A15s? Also bearing in mind it's a TBDR, so an even better bandwidth budget. Like I said, if the Wii U doesn't arrive with at LEAST 2x the performance of the 360, it's dead after 12 months.

Why waste £300 on a box that's plugged into the wall, with a cheap tablet as its main selling point and out-of-date graphics, when you can just buy a new new iPad?? :p

EDIT: Use AirPlay to connect to the TV... job's a good 'un!
 
The current SGX543 mobile chip designs date from 2008-2009, just with more cores. A lot of people seemingly don't know how big a deal Rogue will be. It will bring mobile graphics to parity with current-gen consoles: from tens of GFLOPS to hundreds. It will absolutely be "on par" with the Wii U next spring.

People will be shocked at the graphics it will push, and pundits will claim doom for consoles. There is no reason for Activision or EA not to put big titles there once Apple provides a new controller.

Where are the numbers that back up Rogue's hundreds of gigaflops and (even more importantly) memory/bandwidth setup?
For reference - will people be saying the same thing when Microsoft introduces their tablet for the next xbox and the ipad 5 or 6 "beats" it?
 
Why waste £300 on a box that's plugged into the wall, with a cheap tablet as its main selling point and out-of-date graphics...
Because it'll have unique software, controls that don't require fingers to get in the way, and battery life longer than 2 hours when playing games.
 
If it uses a radial fan, having another fan in-line with the flow would just act as an air brake. Radials have quite high static pressure; a second fan would not be necessary.
I'm just guessing here, anyway. Maybe the actual system has no exhaust fan at all. All I know is that the devkits had two fans, and that the E3 demo units looked like they were designed for a much higher airflow compared to the Wii.


The current SGX543 mobile chip designs date from 2008-2009, just with more cores. A lot of people seemingly don't know how big a deal Rogue will be. It will bring mobile graphics to parity with current-gen consoles: from tens of GFLOPS to hundreds. It will absolutely be "on par" with the Wii U next spring.

People will be shocked at the graphics it will push, and pundits will claim doom for consoles. There is no reason for Activision or EA not to put big titles there once Apple provides a new controller.
Question: if PowerVR chips are that amazing, why aren't we using them in PCs anymore? Right, because they're not. They're damn nice for sub-1W systems, but that's pretty much it. Rogue won't change that.
 
The current SGX543 mobile chip designs date from 2008-2009, just with more cores. A lot of people seemingly don't know how big a deal Rogue will be. It will bring mobile graphics to parity with current-gen consoles: from tens of GFLOPS to hundreds. It will absolutely be "on par" with the Wii U next spring.

People will be shocked at the graphics it will push, and pundits will claim doom for consoles. There is no reason for Activision or EA not to put big titles there once Apple provides a new controller.

Rogue certainly looks like a great GPU and a generational step forward for mobile graphics. I think it'll be closer in power to next-gen console GPUs than current mobile GPUs are to Xenos/RSX. However, the bit I've highlighted above is quite an odd statement, because not only do you not know what kind of performance we'll see from Rogue in 2013 (it seems Rogue will start out at around 200 GFLOPS, which is below the 360/PS3 GPUs), but you also don't really know what the Wii U will be. Rumours and vague comments are not enough to compare graphics power.
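Peak figures like "200 GFLOPS" are just lane count times ops per lane per clock times frequency, with a multiply-add counted as two ops. A sketch for the chips being compared (the 625 MHz clock used for the HD 6450 is one of that part's shipping clocks, taken here as an assumption):

```python
# Theoretical peak shader throughput in GFLOPS.

def peak_gflops(alu_lanes, clock_mhz, flops_per_clock=2):
    """flops_per_clock=2 counts a multiply-add as two operations."""
    return alu_lanes * flops_per_clock * clock_mhz / 1e3

# Xenos (360): 48 shaders, each vec4+scalar = 240 lanes, at 500 MHz
print(peak_gflops(240, 500))  # -> 240.0
# The HD 6450 floated earlier: 160 SPUs at an assumed 625 MHz
print(peak_gflops(160, 625))  # -> 200.0
```

Theoretical FLOPS say nothing about bandwidth, fill rate or efficiency, so two chips with the same peak can perform very differently.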
 
Rogue certainly looks like a great GPU and a generational step forward for mobile graphics. I think it'll be closer in power to next-gen console GPUs than current mobile GPUs are to Xenos/RSX. However, the bit I've highlighted above is quite an odd statement, because not only do you not know what kind of performance we'll see from Rogue in 2013 (it seems Rogue will start out at around 200 GFLOPS, which is below the 360/PS3 GPUs), but you also don't really know what the Wii U will be. Rumours and vague comments are not enough to compare graphics power.

That's just how he is. Says outlandish things and then won't respond when you call him out on it.
 