Wii U hardware discussion and investigation

If all you are doing is streaming video to the tablet, then that should have a more or less constant cost on whatever system is doing it. I'd expect a hit to bandwidth (though nothing major) and no real impact on the CPU, outside the cost of rendering the remote frame locally.
OS tasks could impact CPU usage, but that applies whether you're displaying on the tablet or not.

It's hard to imagine a vaguely modern CPU, even at the rumored low clock rate, having difficulty with 360 code unless it had very limited FP support. I was just really surprised by the Namco quote; my best guess is that Tekken isn't exactly a CPU hog.
 
Someone on this page took a post from 7 years ago about the Wii's GPU to support his stance that Nintendo is going to release another "weak" console. It's very clear how Wii turned out.

GPU: Likely an embedded chip. The E4690 was available at the time, so why was it not reported in the dev kits? It makes zero sense for them to stick with an old architecture like the R700, and while it will be customized, they probably built the GPU off AMD's newest embedded GPU, which released in April 2011: the E6760, a 576GFLOPs part. This is probably not the exact GPU they used, but it's likely very close, and performance-wise it's also very close to the GPU they used in early dev kits, the HD 4850. It's a 35W chip at 40nm, and that could come down a bit more if they produced it on a 32nm process. Some good speculation has also suggested that the GameCube GPU, or at least part of it, could be along for the ride.
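As a rough sanity check of the FLOP figures being thrown around in this thread, the peak numbers follow from shader count × clock × 2 (counting a multiply-add as two operations). The sketch below uses the commonly published specs for the E6760 (480 stream processors at 600MHz) and Xenos (48 vec4+scalar ALUs at 500MHz); it's back-of-the-envelope arithmetic, not anything confirmed about the Wii U itself.

```python
# Back-of-the-envelope peak-FLOPs check for the numbers quoted in this
# thread. Counts a fused multiply-add as 2 FLOPs per lane per clock.

def peak_gflops(lanes, clock_mhz, flops_per_lane_per_clock=2):
    """Theoretical single-precision peak in GFLOPs."""
    return lanes * clock_mhz * 1e6 * flops_per_lane_per_clock / 1e9

# E6760: 480 stream processors at 600 MHz (published AMD spec)
print(peak_gflops(480, 600))        # 576.0 GFLOPs

# Xenos (360): 48 ALUs, each vec4 + scalar = 240 MAD lanes, at 500 MHz
print(peak_gflops(48 * 5, 500))     # 240.0 GFLOPs
```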
No, that post was to point out wishful thinking, not to show that the Wii U is going to be weak.

from wiki:
Wishful thinking is the formation of beliefs and making decisions according to what might be pleasing to imagine instead of by appealing to evidence, rationality or reality.

It makes a lot of sense to stick with the R700 even when the R800 was out, if they got a good deal on the design and didn't want to spend a ton on R&D. We have nothing to support them ever changing the GPU design: from the first leak to the latest leaked SDK, everything matches the R700 feature set. I have looked for evidence that they changed the GPU core but have found none besides PR talk, like "it has 2012 bells and whistles" or whatever was said. AMD had already moved this design to 40nm and released one R700 card on 40nm.

Now, performance might be close to that design, but again that is the best case at this point and unlikely. Redwood Pro would most likely be closer to the GPU specs of the system. We had an AC3 dev say it was around 1.5x stronger than PS360 about a month ago, and I believe it will be somewhere in the 1.5x-2x range, given the R700 core and the TDP range of the console.

They said the CPU was on par and the GPU 1.5x PS360.
http://www.insidegamer.nl/nieuws/98237/ac3-cpu-wii-u-net-zo-krachtig-als-ps3-x360-gpu-is-sterker

This seems to line up with about every other report out there... pretty much the same story from everyone now.



Is there much reason to think Nintendo will have a meaty OS? They aren't pursuing the media-hub angle as actively as MS and Sony. What percentage of RAM do the DS/3DS use for the OS?

I still believe this is a misunderstanding. The Wii U has 512MB of flash memory reserved for the OS, and I think people confuse this with RAM; that's where the 512MB rumor came from.

The X360 reserves 32MB and the PS3 52MB for the OS. I don't know about the 3DS/DS or even the Wii.
 
Ninty's hardware decisions don't seem odd; it just seems to be all about keeping software development costs down. By being one step behind, they are trying to stay one step ahead, if that makes sense.
 
If all you are doing is streaming video to the tablet, then that should have a more or less constant cost on whatever system is doing it. I'd expect a hit to bandwidth (though nothing major) and no real impact on the CPU, outside the cost of rendering the remote frame locally.
OS tasks could impact CPU usage, but that applies whether you're displaying on the tablet or not.

It's hard to imagine a vaguely modern CPU, even at the rumored low clock rate, having difficulty with 360 code unless it had very limited FP support. I was just really surprised by the Namco quote; my best guess is that Tekken isn't exactly a CPU hog.

Maybe it lacks punch in its vector units? Isn't animation extremely vector math heavy?
 
Ninty's hardware decisions don't seem odd; it just seems to be all about keeping software development costs down. By being one step behind, they are trying to stay one step ahead, if that makes sense.



Yes, and that can clearly be seen in their stock price. Perhaps people are losing confidence that Nintendo will be able to sustain itself in the hardware market going forward. At this point I no longer want Nintendo to succeed on their own merits and games alone, but instead to have to rely on 3rd-party support to be as successful as the Wii was. At least that way the platform might have more than a handful of games I actually pick up and buy.
 
If all you are doing is streaming video to the tablet, then that should have a more or less constant cost on whatever system is doing it. I'd expect a hit to bandwidth (though nothing major) and no real impact on the CPU, outside the cost of rendering the remote frame locally.
OS tasks could impact CPU usage, but that applies whether you're displaying on the tablet or not.

If it's only video streaming to the tablet, maybe it would be best to treat the output video like a second monitor in mirror mode and use a dedicated video encoder chip for compression, so no CPU/GPU bandwidth is used for it? No high cost, and less complex for devs to handle?
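To put some rough numbers on why a dedicated encoder makes sense, here is a quick estimate of the raw versus compressed bitrate of the pad stream. The 854x480 @ 60fps resolution and the ~10 Mbit/s compressed target are assumptions for illustration only, not confirmed specs.

```python
# Rough estimate of the pad video stream, assuming an 854x480 screen
# at 60 fps and a guessed ~10 Mbit/s H.264-class target bitrate.

width, height, fps, bytes_per_pixel = 854, 480, 60, 3   # 24-bit colour

raw_mbit = width * height * bytes_per_pixel * 8 * fps / 1e6
target_mbit = 10.0                                       # assumed target

print(f"uncompressed stream: {raw_mbit:.0f} Mbit/s")     # ~590 Mbit/s
print(f"compression needed:  ~{raw_mbit / target_mbit:.0f}:1")
```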
 
No, that post was to point out wishful thinking, not to show that the Wii U is going to be weak.

It makes a lot of sense to stick with the R700 even when the R800 was out, if they got a good deal on the design and didn't want to spend a ton on R&D. We have nothing to support them ever changing the GPU design: from the first leak to the latest leaked SDK, everything matches the R700 feature set. I have looked for evidence that they changed the GPU core but have found none besides PR talk, like "it has 2012 bells and whistles" or whatever was said. AMD had already moved this design to 40nm and released one R700 card on 40nm.

Now, performance might be close to that design, but again that is the best case at this point and unlikely. Redwood Pro would most likely be closer to the GPU specs of the system. We had an AC3 dev say it was around 1.5x stronger than PS360 about a month ago, and I believe it will be somewhere in the 1.5x-2x range, given the R700 core and the TDP range of the console.

They said the CPU was on par and the GPU 1.5x PS360.
http://www.insidegamer.nl/nieuws/98237/ac3-cpu-wii-u-net-zo-krachtig-als-ps3-x360-gpu-is-sterker

This seems to line up with about every other report out there... pretty much the same story from everyone now.

I would hardly call it wishful thinking; no one knew what the Wii was at that time, and no sane speculation would have assumed an overclocked GameCube was the answer. Anyone assuming an overclocked 360 this time is using some form of wishful thinking.

"a good deal on parts?" an embedded GPU would be cheaper than any desktop redesigned part from R700. I do have a question here since you've sort of run into a dead end, what exactly is a r700 feature set? how is it different than r800 or r900? sm 4.1 instead of 5? DX10.1 instead of DX11? where did you get that info from? early dev kit spec sheets are not written in stone, they don't tell the future, only what is currently available. It's not even a target spec list, but actual hardware that was in early dev kits that were out just before the launch of the embedded chip a lot of people are currently speculating on.

The little we have seen from Wii U suggests at least one DX11 feature is being heavily used in games like Pikmin 3, ZombiU and P-100: depth of field. Sure, it can be faked even in the current gen, and to a lesser extent scripted into the Wii like in The Last Story (as lostinblue pointed out on GAF), but Pikmin 3's effect doesn't look like a tiled blur; it looks like the true depth of field we saw in things like the ladybug demo for R800, where anything, no matter where it is on screen, blurs automatically once it gets to a certain distance. While that can be faked, it should take a larger performance hit than it would under DX11, so why are all these devs using it unless it's relatively free?

I'm not sure why you are holding on to early dev kit specs. The "PR" about 2012 bells and whistles comes from a dev on NeoGAF who works with the Wii U, named Antonz, and Kotaku's article about a weak CPU a month or so ago still pointed to DX11 features but with DX9 performance (whatever that means, as the E-350 is a DX11 GPU with only 80GFLOPs and is highly outclassed by Xenos in terms of performance).

So what exactly about using a 576GFLOP GPU seems impossible to you? I don't think developers doing ports, for instance, are really going to have a good grasp of the actual hardware. Dumping their 360 code directly onto Wii U will work (Darksiders 2 was up and running after 5 weeks, so we pretty much know this is being done), but it will be highly inefficient; I could see devs not finding a lot of room to bump stuff up more than 50% if that were the case.
 
I know you have been disagreeing, but for me it has cast a positive light on the Wuu.

I said quite some pages back that I thought the sweet spot for the Wuu would be 50-100% more powerful in most areas while allowing easy ports, and what you have both written seems to point in this direction. Now, as long as the CPU really is the equivalent of Xenon, INCLUDING the vector units, combined with a more modern 500GFLOP GPU, likely more bandwidth/RAM, and also more disc space, then that system would trounce a PS360 on first-party titles, would it not?

If they pull that off, squashed into a matchbox, that reliable, with the innovative controllers, at a profit, then this is a clear master stroke from Ninty. I would certainly seriously consider buying one once my fears have been allayed.

One question: so we think it has 1GB of usable RAM for games. What about eDRAM? A nice blob of, say, 32MB of read/write eDRAM would round that console off nicely.
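For a sense of what a 32MB pool would buy, here is a rough framebuffer budget assuming 32-bit colour plus 32-bit depth/stencil per sample; the figures are illustrative only and not tied to any confirmed Wii U layout.

```python
# Rough framebuffer budget for a hypothetical 32 MB eDRAM pool,
# assuming 4 bytes of colour and 4 bytes of depth/stencil per sample.

def framebuffer_mib(width, height, samples=1, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 ** 2)

print(framebuffer_mib(1280, 720))       # ~7.0 MiB  (720p, no MSAA)
print(framebuffer_mib(1280, 720, 4))    # ~28.1 MiB (720p, 4x MSAA)
print(framebuffer_mib(1920, 1080))      # ~15.8 MiB (1080p, no MSAA)
```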
 
So originally the Wiimote was going to become an add-on for the GameCube. The reason they probably decided against it is that the GameCube was helpless, but if you look at the Wii as a revision, it is easy to see what they did. An R300 certainly would have changed this entire generation's layout, but they didn't go that way. The reason is anyone's guess, but it's pretty clear it would have cost more money and more R&D budget and time.
Depending on how late in the game that choice was made. As an off-the-shelf part they could have used, R&D could have been far less than designing a new GPU. We don't know the internal decision-making that went on during the Wii's development, but whatever it was, it could similarly apply to Wii U somehow, and we could see a much lower-performance part than we expect.

Wii U is a bit more media-focused, and the OS is designed to be used from the tablet during gameplay; apps could also be accessible during this time. I'm just speculating, but feel free to disagree.
Nintendo have been dragged into the media-hub world kicking and screaming, but they are going that way, although the lack of Blu-ray playback still shows they aren't serious about it. I can believe in a decent chunk reserved for something like web browsing, which could be streamed to the WuuPad, but I can't see Nintendo giving up a large chunk of resources to non-gaming tasks, as they intend to produce and sell game content. The raison d'être of their products is well established, and they hang extra value off them where they can do so cheaply, AFAICS.

The reports we are getting right now say that the CPU is being hit depending on how much is done on the tablet. I speculated why that is; it does have a wireless encoder as far as we know, but if the reports are true, how do we make it fit?
I don't know. I question the reports as a result. ;) Unless it wasn't accessing the tablet itself causing issues but rather what the responses were (e.g. tablet input tied to the game physics), I see no reason for the CPU to be heavily involved in just supporting the tablet. Even if it was being used for encoding video in the SDKs or something while waiting on final hardware, it shouldn't fluctuate with performance.
 
I know you have been disagreeing, but for me it has cast a positive light on the Wuu.

I said quite some pages back that I thought the sweet spot for the Wuu would be 50-100% more powerful in most areas while allowing easy ports, and what you have both written seems to point in this direction. Now, as long as the CPU really is the equivalent of Xenon, INCLUDING the vector units, combined with a more modern 500GFLOP GPU, likely more bandwidth/RAM, and also more disc space, then that system would trounce a PS360 on first-party titles, would it not?

If they pull that off, squashed into a matchbox, that reliable, with the innovative controllers, at a profit, then this is a clear master stroke from Ninty. I would certainly seriously consider buying one once my fears have been allayed.

One question: so we think it has 1GB of usable RAM for games. What about eDRAM? A nice blob of, say, 32MB of read/write eDRAM would round that console off nicely.

Yeah, 32MB has been the consistent rumor. I'm guessing, given the size of the case, it's likely set up like the 360's: an APU with the IBM and AMD parts on one chip, able to share the cache.

RAM: yes, at least 1GB. lherre, I believe, pointed to there being more, and the early dev kit specs say 3GB; accounting for debugging overhead, at least half of that should be in the retail system.
 
Yeah, 32MB has been the consistent rumor. I'm guessing, given the size of the case, it's likely set up like the 360's: an APU with the IBM and AMD parts on one chip, able to share the cache.

RAM: yes, at least 1GB. lherre, I believe, pointed to there being more, and the early dev kit specs say 3GB; accounting for debugging overhead, at least half of that should be in the retail system.

I didn't think it was possible to have 1.5GB with dual channel? Could just be me then :) Yeah, apart from the CPU puzzle... it's starting to look very balanced.
 
Depending on how late in the game that choice was made. As an off-the-shelf part they could have used, R&D could have been far less than designing a new GPU. We don't know the internal decision-making that went on during the Wii's development, but whatever it was, it could similarly apply to Wii U somehow, and we could see a much lower-performance part than we expect.

We are expecting a low-performance part, but something similar to Wii could not happen by any stretch of the imagination; Wii's GPU was Flipper overclocked by 50%. A modern GPU can't be made out of Flipper without extreme customization that would cost so much it's completely impossible, so it is all new hardware, likely customized from an off-the-shelf part. Using an embedded part isn't some sort of magical performance enhancer, and it's not a large jump over current-generation GPUs either; 576GFLOPs is nothing to get excited over. Trinity, for instance, has 384 shaders clocked as high as 686MHz, giving you 527GFLOPs, and that is a notebook part. The only reason the E6760 is getting so much attention is its power/performance ratio, which is a must given the size of the Wii U case, so an embedded GPU does make a great deal of sense.
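The Trinity figure above checks out with the same shader × clock × 2 arithmetic used earlier, and the perf-per-watt angle can be put in numbers too; the 35W E6760 board power is AMD's published figure, and this is rough comparison only, not a Wii U spec.

```python
# Same peak-FLOPs arithmetic as before, applied to the comparison above.

def peak_gflops(lanes, clock_mhz):
    return lanes * clock_mhz * 2 / 1000.0

trinity = peak_gflops(384, 686)      # ~527 GFLOPs (mobile Trinity GPU)
e6760 = peak_gflops(480, 600)        # 576 GFLOPs

# Perf-per-watt at the E6760's published 35 W board power:
print(trinity, e6760, e6760 / 35.0)  # ~16.5 GFLOPs per watt
```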

Nintendo have been dragged into the media-hub world kicking and screaming, but they are going that way, although the lack of Blu-ray playback still shows they aren't serious about it. I can believe in a decent chunk reserved for something like web browsing, which could be streamed to the WuuPad, but I can't see Nintendo giving up a large chunk of resources to non-gaming tasks, as they intend to produce and sell game content. The raison d'être of their products is well established, and they hang extra value off them where they can do so cheaply, AFAICS.
What they are getting for media is basically all I use. I don't believe Blu-ray is the future at all; digital and digital services will eventually win out anyway. As for the OS taking up less space, that would be good. I imagine it will eventually shrink anyway, but if they are allowing "apps" while running games, I can see them setting aside some amount of RAM for that. I'm still expecting Ideaman to be correct when he says 2GB inside the system, but even 1.5GB is going to be a large improvement over what we have today.

I don't know. I question the reports as a result. ;) Unless it wasn't accessing the tablet itself causing issues but rather what the responses were (e.g. tablet input tied to the game physics), I see no reason for the CPU to be heavily involved in just supporting the tablet. Even if it was being used for encoding video in the SDKs or something while waiting on final hardware, it shouldn't fluctuate with performance.

It could be a finalized-hardware problem; maybe they were using the CPU to stream the content because the wireless encoder wasn't ready. There is a faint rumor going around that Ubisoft has found more power in the CPU recently, over 10% after figuring it out a bit more; that sort of thing will be commonplace over the next year, I imagine.
 
I would hardly call it wishful thinking; no one knew what the Wii was at that time, and no sane speculation would have assumed an overclocked GameCube was the answer. Anyone assuming an overclocked 360 this time is using some form of wishful thinking.

"a good deal on parts?" an embedded GPU would be cheaper than any desktop redesigned part from R700. I do have a question here since you've sort of run into a dead end, what exactly is a r700 feature set? how is it different than r800 or r900? sm 4.1 instead of 5? DX10.1 instead of DX11? where did you get that info from? early dev kit spec sheets are not written in stone, they don't tell the future, only what is currently available. It's not even a target spec list, but actual hardware that was in early dev kits that were out just before the launch of the embedded chip a lot of people are currently speculating on.

The little we have seen from Wii U suggests at least one DX11 feature is being heavily used in games like Pikmin 3, ZombiU and P-100: depth of field. Sure, it can be faked even in the current gen, and to a lesser extent scripted into the Wii like in The Last Story (as lostinblue pointed out on GAF), but Pikmin 3's effect doesn't look like a tiled blur; it looks like the true depth of field we saw in things like the ladybug demo for R800, where anything, no matter where it is on screen, blurs automatically once it gets to a certain distance. While that can be faked, it should take a larger performance hit than it would under DX11, so why are all these devs using it unless it's relatively free?

I'm not sure why you are holding on to early dev kit specs. The "PR" about 2012 bells and whistles comes from a dev on NeoGAF who works with the Wii U, named Antonz, and Kotaku's article about a weak CPU a month or so ago still pointed to DX11 features but with DX9 performance (whatever that means, as the E-350 is a DX11 GPU with only 80GFLOPs and is highly outclassed by Xenos in terms of performance).

So what exactly about using a 576GFLOP GPU seems impossible to you? I don't think developers doing ports, for instance, are really going to have a good grasp of the actual hardware. Dumping their 360 code directly onto Wii U will work (Darksiders 2 was up and running after 5 weeks, so we pretty much know this is being done), but it will be highly inefficient; I could see devs not finding a lot of room to bump stuff up more than 50% if that were the case.
DoF is not a DX11 feature... it can be done on just about any hardware. How do we know it's free just because a couple of games are using it? Posters on GAF were saying that doesn't mean anything, yet you post it again...

Do you have anything to back up what you say? It seems to be a copy-paste from GAF. Why would that GPU design be cheaper when it's the latest design? I got my info from everything we know: the early specs and the leaked SDK. This "DX11" or "latest and greatest AMD GPU" talk is like the new "GPGPU", too funny.

Funny you keep bringing up the X360 when no one else in this thread has said anything about it. Ideaman on GAF called it an Xbox 360-plus right before E3; looks like he was right on the money. Ideaman has been right on every leak on GAF, btw...

I said unlikely about that GPU in my last post, not impossible. We just don't have one piece of proof that they ever moved off the R700. The thing that always stuck out to me was that the R800 was out when they started working with the R700; if they wanted a newer feature set, why start at R700 when R800 was out and supported all of it?
 
DoF is not a DX11 feature... it can be done on just about any hardware. How do we know it's free just because a couple of games are using it? Posters on GAF were saying that doesn't mean anything, yet you post it again...

Do you have anything to back up what you say? It seems to be a copy-paste from GAF. Why would that GPU design be cheaper when it's the latest design? I got my info from everything we know: the early specs and the leaked SDK. This "DX11" or "latest and greatest AMD GPU" talk is like the new "GPGPU", too funny.

Funny you keep bringing up the X360 when no one else in this thread has said anything about it. Ideaman on GAF called it an Xbox 360-plus right before E3; looks like he was right on the money. Ideaman has been right on every leak on GAF, btw...

I said unlikely about that GPU in my last post, not impossible. We just don't have one piece of proof that they ever moved off the R700. The thing that always stuck out to me was that the R800 was out when they started working with the R700; if they wanted a newer feature set, why start at R700 when R800 was out and supported all of it?

Here: https://graphics.stanford.edu/wikis...et&target=CS448s-10-10-depthOfFieldForWeb.pdf
It explains the difference between DoF before DX11 and what DX11's DoF is like (for one, it isn't done in the post-process).

Also, about the R800: are you sure it was out before Nintendo came to AMD? Remember, the R800 launched in late September of that year.

I've already covered why pointing to that SDK sheet is a bad idea. I think the 360's early spec sheet listed a 4GHz CPU? Obviously we all know things changed, and we have reports from devs working with the boxes that it's a modern GPU. I guess I'm saying your info is outdated: it has pretty much been confirmed by at least a few devs as accurate at the time, but it doesn't line up with the current GPU.
 
Here: https://graphics.stanford.edu/wikis...et&target=CS448s-10-10-depthOfFieldForWeb.pdf
It explains the difference between DoF before DX11 and what DX11's DoF is like (for one, it isn't done in the post-process).

Also, about the R800: are you sure it was out before Nintendo came to AMD? Remember, the R800 launched in late September of that year.

I've already covered why pointing to that SDK sheet is a bad idea. I think the 360's early spec sheet listed a 4GHz CPU? Obviously we all know things changed, and we have reports from devs working with the boxes that it's a modern GPU. I guess I'm saying your info is outdated: it has pretty much been confirmed by at least a few devs as accurate at the time, but it doesn't line up with the current GPU.

Yes, they have designs years before they go to market.

Now, with the X360 we knew the GPUs were changing; we had reports telling us about the new GPU coming to the dev kits. We have not had a single report saying anything like that here.
 
Pretty sure we have reports saying it was using an off-the-shelf part and was waiting on new hardware when it comes to the GPU, unless you think the Wii U still has an HD 4850 inside it.
 
The problem with even a 4850 is that the games (even ports like Batman) should unquestionably show a huge bump over PS360 with no effort at all (hell, just run them at 1080p). But they don't.

So, like the 2GB RAM claim, call me "skeptical". Of course, my position on the GPU has been on message boards for a couple of years, so nothing new there.
 
The little we have seen from Wii U suggests at least one DX11 feature is being heavily used in games like Pikmin 3, ZombiU and P-100: depth of field. Sure, it can be faked even in the current gen, and to a lesser extent scripted into the Wii like in The Last Story (as lostinblue pointed out on GAF), but Pikmin 3's effect doesn't look like a tiled blur; it looks like the true depth of field we saw in things like the ladybug demo for R800, where anything, no matter where it is on screen, blurs automatically once it gets to a certain distance. While that can be faked, it should take a larger performance hit than it would under DX11, so why are all these devs using it unless it's relatively free?

I don't recall seeing a depth of field algorithm this gen that wasn't actually based on depth and was just "faked" with some screen area stenciled off and blurred, or something. The presentation you linked doesn't imply that the techniques it promotes are more "real", either, just that they have less color bleeding and fewer artifacts than current solutions. I only browsed it quickly, but it sounded like it's still a post effect.

So basically, I wish people would stop calling DoF a "DX11 feature". Compute might offer improvements, but I don't see any evidence of fundamental differences. I'll admit I'm not a graphics programmer, but I'm quite sure I shipped a first-generation 360 game that had DoF which behaved the same way as in Pikmin. We just didn't send objects through it or animate it, because that looked dumb and wouldn't have been good for the game.
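For reference, the depth-driven behaviour described above boils down to computing a per-pixel blur amount from distance to the focal plane and then blurring by that amount in a post pass, which works on any programmable hardware. A minimal sketch of that blur-amount calculation; the parameter names are illustrative and not taken from any particular engine.

```python
# Minimal sketch of the depth-driven blur factor behind a typical
# post-process depth of field. Parameter names are illustrative.

def coc_radius(depth, focus_distance, focus_range, max_radius_px):
    """Blur radius in pixels for a fragment at 'depth'. Fragments more
    than focus_range away from the focal plane get the maximum blur."""
    t = min(abs(depth - focus_distance) / focus_range, 1.0)
    return t * max_radius_px

# Objects drift in and out of focus purely as a function of depth:
for z in (2.0, 5.0, 8.0, 20.0):
    print(z, coc_radius(z, focus_distance=5.0, focus_range=10.0,
                        max_radius_px=8.0))
```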
 
Do you have anything to back up what you say? It seems to be a copy-paste from GAF.
A copy/paste from GAF is fine here, as the discussion here is completely new and unique. Syferz is not holding one discussion in two places, with a post here having to add more than his post on GAF. If you can't make the distinction and discuss the topic in two places independently, then pick the one thread you want to follow more closely, thank you. GAF is only of interest here as a source for rumors. Any further discussion crossover will be met with official action.
 
I don't recall seeing a depth of field algorithm this gen that wasn't actually based on depth and was just "faked" with some screen area stenciled off and blurred, or something. The presentation you linked doesn't imply that the techniques it promotes are more "real", either, just that they have less color bleeding and fewer artifacts than current solutions. I only browsed it quickly, but it sounded like it's still a post effect.

So basically, I wish people would stop calling DoF a "DX11 feature". Compute might offer improvements, but I don't see any evidence of fundamental differences. I'll admit I'm not a graphics programmer, but I'm quite sure I shipped a first-generation 360 game that had DoF which behaved the same way as in Pikmin. We just didn't send objects through it or animate it, because that looked dumb and wouldn't have been good for the game.

Well, I'm not completely sure whether Wii U is using a "diffusion DoF" effect, but that is what it looks like. I could certainly be wrong about that, and hopefully someone on here knows for certain, because that should tell us whether the console does in fact have "DX11" capabilities, as developers have said.

Don't DX9-capable cards use something called a "Poisson disk blur"? Both are discussed in the PDF. I'm not coming from a large background in graphics programming, so I could be very wrong about all of this, but from what I've seen and read, Wii U's DoF looks much more advanced than what we have seen on DX9 cards, and DX10.1 cards used the same effect.

The one thing I can point to is how, in Pikmin 3, the DoF seems to be handled separately for different objects, and the PDF does point to that as a benefit of DX11's implementation of the effect.
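For comparison with the diffusion approach in the PDF, the gather-style blur associated with DX9-class hardware just averages a handful of fixed (e.g. Poisson-disk) taps, scaled by the per-pixel blur radius. A toy CPU-side sketch under those assumptions, with hand-picked offsets standing in for a real Poisson-disk set:

```python
import numpy as np

# Toy gather-style DoF pass using a small fixed kernel, the pre-DX11
# approach mentioned above. OFFSETS is a hand-made stand-in for a
# proper Poisson-disk set; 'coc' is a per-pixel blur radius such as
# the one sketched earlier in the thread.

OFFSETS = [(-0.94, -0.40), (0.94, -0.77), (-0.09, 0.93),
           (0.34, 0.29), (-0.81, 0.44), (0.55, -0.17)]

def gather_dof(image, coc):
    h, w = coc.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            r = coc[y, x]
            taps = [image[y, x]]
            for dx, dy in OFFSETS:
                sx = min(max(int(x + dx * r), 0), w - 1)
                sy = min(max(int(y + dy * r), 0), h - 1)
                taps.append(image[sy, sx])
            out[y, x] = np.mean(taps, axis=0)
    return out

img = np.random.rand(32, 32, 3).astype(np.float32)
coc = np.full((32, 32), 4.0, dtype=np.float32)   # constant 4-px blur
blurred = gather_dof(img, coc)
```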
 