Wii U hardware discussion and investigation

Status
Not open for further replies.
The 2 GB figure comes from people on GAF who don't understand the tech. If you look at the SDK leak, you have 1.5 GB of RAM plus 512 MB of flash for the OS; they added the flash to the RAM.

And yes there is a ton of wishful thinking on GAF.
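For what it's worth, the pool-mixing being described is just this arithmetic (figures taken from the SDK leak as quoted above):

```python
# The "2 GB" figure looks like the result of adding two separate pools
# from the SDK leak: main RAM and the flash reserved for the OS.
ram_gb = 1.5        # system RAM per the SDK leak
os_flash_gb = 0.5   # 512 MB of flash reserved for the OS

total = ram_gb + os_flash_gb
print(total)  # 2.0 -- but flash is storage, not RAM, so "2 GB of RAM" is wrong
```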

Care to elaborate on all that wishful thinking on GAF? Because from what I see it's quite down to earth post-E3.

About the memory, I don't see why developers would need 1.5 GB for debug tasks; that seems a bit too beefy. 1.5 GB seems confirmed, but 2 GB doesn't seem impossible.

But you surely seem to enjoy trying to downplay the Wii U as much as possible. What's the point? We already know it will be really far from PS720 spec-wise, so why do you need to keep at that unnecessary task?
 
So did the 512 MB of reserved RAM for the OS come from you as well? Or was your source talking about the reserved 512 MB of flash for the OS?

Nope. That came from Ideaman, I believe: Wii U will have 8 GB of flash for saves and downloads, and 512 MB of flash for the OS. Of the people I talked to about the system memory, Ideaman said he was told the retail unit would definitely have 2 GB. The other said it was definitely a target.
 
Care to elaborate on all that wishful thinking on GAF? Because from what I see it's quite down to earth post-E3.

About the memory, I don't see why developers would need 1.5 GB for debug tasks; that seems a bit too beefy. 1.5 GB seems confirmed, but 2 GB doesn't seem impossible.

But you surely seem to enjoy trying to downplay the Wii U as much as possible. What's the point? We already know it will be really far from PS720 spec-wise, so why do you need to keep at that unnecessary task?

Pretty much anything on GAF: GPGPU, next-gen graphics. And it was way worse before the E3 crash.

I'm not downplaying anything; I've been saying the same thing since I saw the case for the Wii U when it was announced. If you are going to call someone out, at least post something to back up what you are saying. What have I been downplaying?

But I do very much enjoy the tech talk, and I'm in the camp that the Wii U, like the Wii, doesn't have to be "so powerful" to be fun to play. The Wii U is doing its own thing; people focus a lot on specs and try to turn it into something it's not. There is a reason Nintendo doesn't even comment on specs.

And just like with the Wii before it, you have posters with a "ton of wishful thinking", like below.
Why do people think that 'Hollywood' will be less powerful or advanced than the Xbox2 ATI GPU?

* Work on the Revolution GPU started way before MS joined ATI.
* The Revolution will be released way after the Xbox2.

IMO the 'Hollywood' GPU will be at the very least as powerful and advanced as the Xbox2 GPU; having more time to design it and releasing next year point to that scenario.

Maybe graphics power will not mean anything next gen, but I'm sure a lot of people are going to be surprised by 'Hollywood'.

Oh, and I'm really pleased that the Revolution is backwards compatible; as a nice bonus we now know that, aside from the innovative controls, a standard controller will be there for sure.

Awesome news without doubt...

http://forum.beyond3d.com/showpost.php?p=426155&postcount=8

A lot of people were "surprised" by Hollywood. It's funny how much the Wii U has been just like the Wii. But that's not a bad thing; people seem to forget the Wii outsold everything last gen.
 
Pretty much anything on GAF: GPGPU, next-gen graphics. And it was way worse before the E3 crash.

I'm not downplaying anything; I've been saying the same thing since I saw the case for the Wii U when it was announced. If you are going to call someone out, at least post something to back up what you are saying. What have I been downplaying?

But I do very much enjoy the tech talk, and I'm in the camp that the Wii U, like the Wii, doesn't have to be "so powerful" to be fun to play. The Wii U is doing its own thing; people focus a lot on specs and try to turn it into something it's not. There is a reason Nintendo doesn't even comment on specs.

And just like with the Wii before it, you have posters with a "ton of wishful thinking", like below.


http://forum.beyond3d.com/showpost.php?p=426155&postcount=8

A lot of people were "surprised" by Hollywood. It's funny how much the Wii U has been just like the Wii. But that's not a bad thing; people seem to forget the Wii outsold everything last gen.

There's a reason people focus on specs: believe it or not, specs might be important to them, and may actually lead them to think a different machine will give them a superior experience. Or, in the case of the Wii U, they find that it's barely an upgrade over the PS360, decide the machine isn't worth their time or money, and wait for the next big thing.

And are things with the Wii U the same as with the Wii? Will the machine outsell everything on the market this go-round and leave Sony and MS to fight for second place? And is this (last) generation really over? It will be interesting to see how the 360 sells this holiday season compared to the Wii U.
 
I predict the Wii U will sell well, but the PS360 won't be far behind, as they are practically the same generation, just with many more games, better online, and the likelihood that all their friends are on the old systems.
...By the sounds of it, incredibly, the PS360 will have a better CPU... if that's even physically possible... I'm sure Nintendo will have broken some golden rule of physics... Nintendo laughs in the face of Moore's law and reverses it!! :)
 
Pretty much anything on GAF: GPGPU, next-gen graphics. And it was way worse before the E3 crash.

I'm not downplaying anything; I've been saying the same thing since I saw the case for the Wii U when it was announced. If you are going to call someone out, at least post something to back up what you are saying. What have I been downplaying?

But I do very much enjoy the tech talk, and I'm in the camp that the Wii U, like the Wii, doesn't have to be "so powerful" to be fun to play. The Wii U is doing its own thing; people focus a lot on specs and try to turn it into something it's not. There is a reason Nintendo doesn't even comment on specs.

* "Pretty much anything on GAF" doesn't seem too descriptive.
* The Wii U GPU has GPGPU support, 100%.
* The Wii U GPU has more advanced features than the current gen, 100%, because it's a much more modern architecture.

So I don't have to look up other posts: these are two clear examples of you trying to downplay it, even though the rough target specs we have confirm that both GPGPU support and a more advanced architecture are a fact.

Your attitude towards an E6760-like architecture or the possibility of 2 GB of memory is another two examples, but since those are "speculation" I'll put them aside; nothing disproves either at this point, and I don't mind whether they are true or not, but this thread is titled "A speculative look on the Wii U GPU" and both fit, and make sense, in those terms.

And just like with the Wii before it, you have posters with a "ton of wishful thinking", like below.

http://forum.beyond3d.com/showpost.php?p=426155&postcount=8

A lot of people were "surprised" by Hollywood. It's funny how much the Wii U has been just like the Wii. But that's not a bad thing; people seem to forget the Wii outsold everything last gen.

Hahahahaha, daaaamn, I got a good chuckle out of that. I feel honored that you felt the need to dig through my old posts to find a seven-year-old one about the Wii to disprove my point. I didn't remember that post, so that really made my day!

I think Nintendo started developing the GC successor with PS360-level specs in mind, but after seeing how expensive that was, and how incredibly weak Nintendo's position in the home market had become, they ditched that plan and came up with the Wii one. So I don't regret that old post; thanks again for rescuing it.

Oh, the Wii U really is a lot like the Wii in essence; they left the specs race again, but there are some key differences this time around.
 
Hahahahaha, daaaamn, I got a good chuckle out of that. I feel honored that you felt the need to dig through my old posts to find a seven-year-old one about the Wii to disprove my point. I didn't remember that post, so that really made my day!

You were right on the money on these two counts regarding Nintendo:

Maybe graphics power will not mean anything next gen, ...
People were quite content to own a Wii despite its lack of graphical power.
... but I'm sure a lot of people are going to be surprised by 'Hollywood'
Everyone was surprised by 'Hollywood'!

Revisiting these old threads shows how expectations are founded on current understanding extrapolated (Nintendo launching alongside PS3 would be DOA :oops:), and typically ignore a whole load of variables that we just couldn't/can't anticipate. It's very wrong to call it wishful thinking though, when there's not much info to go on. When we started to get the facts about Wii but some were holding out for physics processing units and the like, that was wishful thinking. While the info we have is still sketchy though, it's all a matter of probabilities between options, as you say. And considering the rumours can change with changing specs as the hardware is designed and redesigned over time prior to launch, anyone correctly predicting the final hardware early on would be reliant on little more than blind luck. ;)
 
Wii U is going to be weaker than the other next-gen consoles; sure, I see that, I can understand that line of thinking and agree with it, but there is seriously flawed logic going on in this thread.

Someone on this page took a post from 7 years ago about the Wii's GPU to support his stance that Nintendo is going to release another "weak" console. It's very clear how the Wii turned out. Nintendo saw how much it would cost to compete with Microsoft and Sony from a hardware perspective 8 years ago; they knew they would have to release an HD box that pumped out high-end graphics, but they also saw how well the GameCube sold with its "superior" hardware compared to the king of that generation, the PS2. So they came up with a different approach: the Wiimote. But realizing how much they had sunk into R&D for the Wiimote, they simply didn't want to spend the resources on new hardware; that is why the Wii is a 50% overclocked GameCube. We absolutely, 100%, know this didn't happen with the Wii U, and we know the Wii U isn't using Xenos or Xenon, so there is no point in saying they did the same with the 360. It simply isn't the case, and the logical Olympics someone has to go through to make that work in their head puts the conclusion before they even thought about the question.

So what is the Wii U? In 2009 Nintendo went to AMD and IBM. IBM at the time was working on the 476fp, which is basically an embedded chip based on the CPU found in the Wii; with some small tweaks it's completely compatible with the Wii's CPU. AMD at the time was developing the R800, but had just released the R700. Since Nintendo would be using an embedded chip (the size of the console sort of dictates what components they are going to use), they probably looked at AMD's embedded GPUs and decided to go with one of those; knowing that they couldn't get backwards compatibility out of a new chip, though, they had to pick something new (there is no reason they would not).

CPU: it's likely not a FLOPS monster like Cell or even Xenon; devs are having problems working with it, and it's likely clocked lower. Here is the thing, though: not all devs are in fact having problems with it. Gearbox has said that because the Wii U has launched so much later than the PS3/360, it has a really brilliant CPU. IBM tells us the CPU is PowerPC-based in some fashion, and we know it's hardware backwards compatible with the Wii, so the 476fp is a good BASE to work from. Since lherre from NeoGAF actually works with a Wii U dev kit and said it's SMT, we should assume he is right; a 3-core, 2-way SMT design makes a lot of sense when you consider how easily devs could port their games over.

GPU: likely an embedded chip. The E4690 was available at the time, so why was it not reported in the dev kits? Well, it makes zero sense for them to stick with an old architecture like the R700, and while it will be customized, they probably built the GPU off AMD's newest embedded GPU, which released in April 2011: of course I'm talking about the E6760, a 576 GFLOPs part. This is probably not the exact GPU they used, but it's likely very close, and performance-wise it's also very close to the GPU used in early dev kits, the HD 4850. It's a 35 W chip at 40 nm; it could come down a bit more if produced on a 32 nm process, and some good speculation has also suggested it could have the GameCube GPU, or at least part of it, along for the ride.
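As a sanity check, the 576 GFLOPs figure for the E6760 falls straight out of its public specs (480 stream processors at 600 MHz, two FLOPs per stream processor per clock for a multiply-add):

```python
# Peak single-precision throughput of the AMD E6760 embedded GPU.
stream_processors = 480     # per AMD's spec sheet
clock_mhz = 600
flops_per_sp_per_clock = 2  # one fused multiply-add counts as 2 FLOPs

peak_gflops = stream_processors * flops_per_sp_per_clock * clock_mhz / 1000
print(peak_gflops)  # 576.0
```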

I'm not saying these are the specs of the system, but we have reports of ~600 GFLOPs from the GPU and a CPU that is more advanced but lower-clocked than Xenon. We also have a smaller number of threads, which I would attribute either to the OS or to a thread for streaming to the tablet, as we have also heard that the CPU bogs down when you stream too much content to the controller.
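The thread-budget argument can be made concrete. Only the 3-core, 2-way-SMT layout comes from the rumours above; the specific reservations below are purely my assumption for illustration:

```python
# Rumoured layout: 3 cores, 2-way SMT -> 6 hardware threads, the same raw
# count as Xenon. Assume (illustratively, not confirmed) that the OS and
# GamePad streaming each pin one thread.
cores, smt_ways = 3, 2
hw_threads = cores * smt_ways

reserved = {"OS": 1, "controller streaming": 1}  # assumed reservations
game_threads = hw_threads - sum(reserved.values())
print(hw_threads, game_threads)  # 6 4
```

Under those assumptions a port would see only four game-usable threads against Xenon's six, which would line up with the complaints about ports.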
 
Sounds reasonable. Did you give any thought to the memory?

2 GB of RAM with a large amount used by the OS. I think they will go for at least 52 GB/s for the RAM; I'm not sure if that is possible with DDR3. I don't think GDDR5 is out of the question, especially if it's moderately clocked.
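A rough sketch of what it would take to hit 52 GB/s (my own illustrative configurations, nothing confirmed): moderately clocked GDDR5 on a 128-bit bus gets there, while commodity DDR3 on the same bus falls well short.

```python
def peak_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth = per-pin data rate x bus width (bits -> bytes)."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# GDDR5 at a moderate 3.25 Gbps effective per pin, 128-bit bus
print(peak_bandwidth_gb_s(3.25, 128))  # 52.0

# DDR3-1600 (1.6 Gbps per pin) on the same 128-bit bus
print(peak_bandwidth_gb_s(1.6, 128))   # 25.6
```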
 
2 GB of RAM with a large amount used by the OS. I think they will go for at least 52 GB/s for the RAM; I'm not sure if that is possible with DDR3. I don't think GDDR5 is out of the question, especially if it's moderately clocked.

Like your logic... so I'm thinking it's OoO and SMT? That would make it really hard for it to be weaker than Xenon, would it not?... It's certainly a very similar architecture to Xenon, so it isn't going to be hard to dev for? So in that scenario it should not in any way be weaker than a 7-year-old CPU... there shouldn't be a single complaint from ANY devs about CPU power.

...If they had to underclock it for TDP, then that is very short-sighted and, in my opinion, stupid of them... Setting a matchbox-sized target is all well and good, but you have to make sure it is more powerful than your last-generation competitors and also easily compatible for ports... incompetence from Nintendo if that is the truth.

Saying that, though, apart from the likely low CPU clocks... the rest of the system looks twice as powerful as the PS360... not sure about disc space or HDD options... but I would be very impressed with what you have written... as long as the CPU clocks improve...
 
Like your logic... so I'm thinking it's OoO and SMT? That would make it really hard for it to be weaker than Xenon, would it not?... It's certainly a very similar architecture to Xenon, so it isn't going to be hard to dev for? So in that scenario it should not in any way be weaker than a 7-year-old CPU... there shouldn't be a single complaint from ANY devs about CPU power.

...If they had to underclock it for TDP, then that is very short-sighted and, in my opinion, stupid of them... Setting a matchbox-sized target is all well and good, but you have to make sure it is more powerful than your last-generation competitors and also easily compatible for ports... incompetence from Nintendo if that is the truth.

Saying that, though, apart from the likely low CPU clocks... the rest of the system looks twice as powerful as the PS360... not sure about disc space or HDD options... but I would be very impressed with what you have written... as long as the CPU clocks improve...
Thanks. I think the reason devs are complaining is GFLOPS performance being lower, and also a thread being tied up with wireless data transfer to the controller: the more data streamed to the controller, the bigger the hit the CPU takes. Combine that with ports likely not using the DSP and you could tie up 2 or 3 threads, leaving fewer threads than Xenon. The 476fp maxed out at 2 GHz, so we will have to wait and see, but I would guess the clocks are within 400 MHz of that; it's just a wild guess.
 
Yes, someone said that a few pages back... so say it comes in at 1.6 GHz... 6 threads and hopefully a nice wallop of cache and bandwidth... but what about SIMD... would that processor have a SIMD unit? ...If it had a better SIMD engine than Xenon, then that would compensate, would it not?

So I'm thinking that for non-ports there is some kind of DSP for the controllers... to stop them sucking up CPU power... so first-party titles should have access to more power.

Kind of excited about this console... still think that CPU is a bottleneck...
 
It's very clear how the Wii turned out. Nintendo saw how much it would cost to compete with Microsoft and Sony from a hardware perspective 8 years ago; they knew they would have to release an HD box that pumped out high-end graphics, but they also saw how well the GameCube sold with its "superior" hardware compared to the king of that generation, the PS2. So they came up with a different approach: the Wiimote. But realizing how much they had sunk into R&D for the Wiimote, they simply didn't want to spend the resources on new hardware; that is why the Wii is a 50% overclocked GameCube.
I think that's an oversimplification. Unless we're missing something, Nintendo would have lost very little going with something like an ATi R300 series GPU, which was a couple of years old and small and cheap by then but would have offered much better visuals than Wii. It was clearly more profitable for Nintendo to go with Wii's final architecture, but they didn't have to be that cheap. That's the concern for Wii U, that Nintendo will make a choice that's excessively conservative.
2 GB of RAM with a large amount used by the OS. I think they will go for at least 52 GB/s for the RAM; I'm not sure if that is possible with DDR3. I don't think GDDR5 is out of the question, especially if it's moderately clocked.
Is there much reason to think Nintendo will have a meaty OS? They aren't pursuing the media hub as actively as MS and Sony. What percentage of RAM does the DS/3DS use for the OS?
Thanks. I think the reason devs are complaining is GFLOPS performance being lower, and also a thread being tied up with wireless data transfer to the controller: the more data streamed to the controller, the bigger the hit the CPU takes.
Why would the CPU be involved in wireless communication? As wireless video is such a huge part of Wii U, Nintendo should be using more efficient hardware for the job such as video encoding to reduce bandwidth.
 
I think that's an oversimplification. Unless we're missing something, Nintendo would have lost very little going with something like an ATi R300 series GPU, which was a couple of years old and small and cheap by then but would have offered much better visuals than Wii. It was clearly more profitable for Nintendo to go with Wii's final architecture, but they didn't have to be that cheap. That's the concern for Wii U, that Nintendo will make a choice that's excessively conservative.
Is there much reason to think Nintendo will have a meaty OS? They aren't pursuing the media hub as actively as MS and Sony. What percentage of RAM does the DS/3DS use for the OS?
Why would the CPU be involved in wireless communication? As wireless video is such a huge part of Wii U, Nintendo should be using more efficient hardware for the job such as video encoding to reduce bandwidth.

So originally the Wiimote was going to be an add-on for the GameCube; the reason they probably decided against that is that the GameCube was helpless, but if you look at the Wii as a revision, it's clear to see what they did. An R300 certainly would have changed this entire generation's layout, but they didn't use one. The reason for that is anyone's guess, but it's pretty clear it would have cost more money and more R&D budget and time. I assume those are the factors that gave us the Wii that launched.

Wii U is a bit more media-focused, and the OS is designed to be used from the tablet during gameplay; apps could also be accessible during that time. I'm just speculating, but feel free to disagree.

The reports we are getting right now say that the CPU takes a hit depending on how much is done on the tablet; I speculated on why that is. It does have a wireless video encoder as far as we know, but if the reports are true, how do we make that fit?
 