Wii U hardware discussion and investigation

Dev kits usually have more memory than the final hardware, not less. Besides, that 1.5GB is split between system and graphics. I'm guessing homerdog has 896MB dedicated to graphics alone, and then I believe another 4GB of system memory. WiiU isn't going to come close to that - although the end results will probably be largely similar thanks to the console environment.

I certainly won't be surprised if the WiiU GPU borrows elements from R8xx and even R9xx, but in overall throughput the signs seem to point to something more modest than a 4850, both in cores and in clockspeed.

So again, raw-power-wise I don't expect it to match a GTX 260, but the end results will likely be comparable, or maybe even better, due to the console environment.
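
To put rough numbers on "raw power" (peak ALU throughput only, which is a crude yardstick, and the GTX 260's MUL co-issue is rarely fully usable in practice), here's a quick back-of-the-envelope in Python:

Code:
# Peak single-precision shader throughput, back-of-the-envelope only.
def peak_gflops(alus, clock_mhz, flops_per_clock):
    return alus * clock_mhz * flops_per_clock / 1000.0

gtx260 = peak_gflops(192, 1242, 3)  # original GTX 260: MAD+MUL = 3 FLOPs/clock -> ~715 GFLOPS
hd4850 = peak_gflops(800, 625, 2)   # HD 4850: MAD = 2 FLOPs/clock -> 1000 GFLOPS
print(f"GTX 260 ~{gtx260:.0f} GFLOPS vs HD 4850 ~{hd4850:.0f} GFLOPS")

Neither number says anything about real games, of course; it's just where the "more modest than a 4850" reading comes from.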

It sounds like we are debating two different things. I took him to be referring to the card only, and you sound like you're referring to his whole PC. I know dev kits have more memory so they can have the overhead before optimization, though that was the first kit and we (most of us, at least) don't know if that has changed since then. I was also looking at it from the perspective of Wii U going UMA, and that being more than what the 260 by itself would have.

I think we are probably on the same page about the expected ALU counts and clockspeed. I just think that after optimizations it should pass that level. But when I say that, I'm also not saying it will blow it away.
 
Looking back at the topic... B3D's article pretty much dismisses the chances of an 800sp GPU.
 
According to info given to a poster a while back (he posts here too), the R700 in the first dev kit was a 4830, so I've believed 640 might be the minimum target. My current hypothesis is the final GPU could be considered a "Cayman Jr" in that it uses VLIW4, AMD's 8th gen. tessellator, maybe two front end engines, etc., but has a clock and ALU count that's similar to an R700. That's something you can't properly simulate with an off-the-shelf part.
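
Purely to illustrate the arithmetic (the per-unit VLIW widths are the real architectural figures; the SIMD split for a "Cayman Jr" is my own invention), here's how a VLIW4 part could land on an R700-like ALU count:

Code:
# ALU count = SIMDs x units-per-SIMD x VLIW width. "cayman_jr" is hypothetical.
def alus(simds, units_per_simd, vliw_width):
    return simds * units_per_simd * vliw_width

rv770     = alus(10, 16, 5)  # HD 4850/4870: 800 ALUs, VLIW5
rv770le   = alus(8, 16, 5)   # HD 4830: 640 ALUs, VLIW5
cayman    = alus(24, 16, 4)  # HD 6970: 1536 ALUs, VLIW4
cayman_jr = alus(10, 16, 4)  # speculative "Cayman Jr": 640 ALUs, VLIW4
print(rv770, rv770le, cayman, cayman_jr)  # 800 640 1536 640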
 
My current hypothesis is the final GPU could be considered a "Cayman Jr" in that it uses VLIW4, AMD's 8th gen. tessellator, maybe two front end engines,

For Freud's Sake, it is no such thing. What we wrote may well be speculative in places, but it's not so speculative as to be horribly wrong by about two hardware generations in architecture and pretty much everything else.
 
What's that supposed to mean? If we're looking at it from that perspective, then Xbox 360 made about two gens' worth of jumps from its alpha kit to the final hardware. At the same time, how many commercially available cards had a unified shading architecture when the 360 was released? Nothing wrong with the idea IMO, since it's based on already available tech (from release date alone, Cayman will be at least 1.5 years old when Wii U releases). It's plausible considering Cayman's development would have been concurrent with, if not before, Wii U's GPU development. And the supposed benefits of VLIW4 sound like something you'd want for a console GPU. I base this idea on what AMD said in their press release. It's not like I'm saying they are going to put a 6950-like GPU in the console. Like I said, you can't replicate that with an off-the-shelf card. You couldn't replicate Xenos with off-the-shelf parts for similar reasons either.

We're all speculating so I don't see how you can take mine as saying yours is wrong. In fact looking back at the article (which I did enjoy reading), I don't really see the premise for you feeling that way.
 

Good post.

I thought his statements were feasible, but given Nintendo's conservatism and their constant desire to keep the bill of materials below the selling price where possible, I don't believe the same Xbox 360 approach happened. Nevertheless, I believe it's possible they are working on or customising something like a 4850 with respect to the tessellation process.
 
What's that supposed to mean? If we're looking at it from that perspective, then Xbox 360 made about two gens' worth of jumps from its alpha kit to the final hardware.

It's probably more like 1.5 from a technical perspective and 1 from a raw power perspective.

Not that I'm being picky or anything ;)
 
I think the first 360 kits were using a 9800 Pro, weren't they?

Wu dev kits may be more representative of the final chip(s) than the 9800 was of Xenos.
 
Simple: look at their thermal envelope for such a tiny device.

And why would the thermal envelope dictate whether or not the GPU has a current-gen tessellator and VLIW4 shaders?

AFAIK, the only performance figure he gave was the GTX 260. That's about the performance level of a Juniper, which already exists in 15" laptops.
Wii U's thermal envelope could withstand a Juniper, that's for sure.
 
X800/850 IIRC.

I've just done a quick search and found these references to 9800s (it's The Inquirer, so a pinch of salt and all that):

http://www.theinquirer.net/inquirer/news/1030308/xbox-sdk-released-cool-apple-power-mac-g5s

DeanoC also makes reference to a 9800 Pro in early Heavenly Sword development, but that's running in a Pentium 4 PC so I don't know if that was a "dev kit" as such:

http://forum.beyond3d.com/showpost.php?p=1065985&postcount=130

I'm going off topic here, but I'm really just thinking about how the early 360 kits had big differences from the final kits (even from the X800/850), whereas for Nintendo's WiiU the differences may not be quite so large. For the GC, Wii and N64, the hardware was ready some time before the release date, while MS cut things kind of close.
 
Wii U's thermal envelope could withstand a Juniper, that's for sure.

Not with the kind of cooling in the GC or Wii, or without making the kind of noise that's totally unacceptable for a cute little games console. A 24 W GPU would use more power than the GC/Wii drew at the wall, and dissipate more heat than the entire contents of the Wii or GC case.

Mobile parts aren't a good comparison anyway, as console vendors can't offload the rest of their working chips into desktop or OEM parts the way a GPU maker bins mobile silicon. Any working chip you throw away because it won't fit in your low-power envelope raises the cost of making your consoles.
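
To put numbers on the 24 W point above (the Wii wall figure is an approximate published measurement, so treat the ratio loosely):

Code:
# GPU TDP alone vs. whole-console power draw, rough comparison only.
wii_wall_watts = 18.0      # approx. entire Wii at the wall while gaming
mobile_gpu_tdp = 24.0      # the 24 W mobile-class GPU figure from above
print(f"{mobile_gpu_tdp / wii_wall_watts:.2f}x")  # ~1.33x

That's the GPU by itself out-dissipating the whole Wii, before you add a CPU, RAM, the optical drive and PSU losses.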
 
And why would the thermal envelope dictate whether or not the GPU has a current-gen tessellator and VLIW4 shaders?

AFAIK, the only performance figure he gave was the GTX 260. That's about the performance level of a Juniper, which already exists in 15" laptops.
Wii U's thermal envelope could withstand a Juniper, that's for sure.

Here's what he said:

it uses VLIW4, AMD's 8th gen. tessellator, maybe two front end engines, etc., but has a clock and ALU count that's similar to an R700.

There's simply no way to get that within the form factor, thermal, and acoustic constraints of a WiiU system. Seriously, 800 SPUs at 625 MHz? I can't imagine, with that much dreaming in his post, that he meant the lower end of the spectrum: 80 SPUs at 600 MHz.
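
For scale (peak MAD throughput only), 800 SPs at 625 MHz is literally HD 4850 territory, an order of magnitude above the low-end reading:

Code:
high = 800 * 2 * 625 / 1000  # 1000 GFLOPS - HD 4850-class
low  =  80 * 2 * 600 / 1000  # 96 GFLOPS - the 80 SP reading
print(high, low, high / low)  # 1000.0 96.0 ~10.4x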
 
Heinrich - Thanks. However, Nintendo said they didn't want to go head to head with Sony and MS after the GC, and chose to go underpowered with the Wii. That worked out great for them for most of this gen, but it caught up with them in the end. Conservatism won't be an issue, especially since they already said Wii U would cost more. The main problem affecting any hardware goals for Wii U is that Wii's underpowered direction is forcing them to move now. The comparison to the 360 was made to support my idea.

pj/function - Everything I had seen a while back indicated the G5 used as the alpha kit had a Radeon X800 (XT). In reviewing your "nitpickiness", pj, what I'm proposing is similar. :p

BRiT - I've told you about your "base power strictly by size" viewpoint. It's flawed IMO, yet you continue to run with it (on GAF as well, for those who don't know). I don't think the size is that relevant since, first, I'm not saying it's going to have something equivalent to a 150W+ GPU. Using VLIW4 would also help reduce transistors (I focused on 640, by the way). With the current case design it's bigger than the Wii, the second vent is bigger, and two smaller ones were added. It isn't compensating for an internal HD. The optical drive is more than likely not bulky, just like Wii's.

Separately, there is also the possibility of an SoP design happening, as MDX pointed out a while back (again looking strictly at the press releases, since those are the only official comments we have from IBM and AMD). And then you have the rumor about the 28nm GPU, which would most likely come from NEC, not TSMC, so there is the possibility that could happen as well.
 
pj/function - Everything I had seen a while back indicated the G5 used as the alpha kit had an X800 (XT). In reviewing your "nitpickiness", pj, what I'm proposing is similar. :p

I know X800/850 based cards were used in pre-final, G5 based 360 development kits, but I had thought there were 9800 based kits before that. There's some indication from the likes of The Inquirer that they did exist (they were talking about G5 and 9800 based dev kits some months before the x800 was released).

BRiT - I've told you about your "base power strictly by size" viewpoint. It's flawed IMO, yet you continue to run with it (on GAF as well, for those who don't know). I don't think the size is that relevant since, first, I'm not saying it's going to have something equivalent to a 150W+ GPU. Using VLIW4 would also help reduce transistors (I focused on 640, by the way). With the current case design it's bigger than the Wii, the second vent is bigger, and two smaller ones were added. It isn't compensating for an internal HD. The optical drive is more than likely not bulky, just like Wii's.

The intake vents and fan are about the same size on the Wuu case as they were on the GC. Unless that fan is spinning a lot faster than the GC's, and/or the heatsink is more expensive/sophisticated so as to be better at transferring heat from the chip(s?) to the air, you're looking at similar heat output from the chips.

I don't think Nintendo would want much (if any) more fan noise, so aside from small airflow improvements from better fan design that leaves you with only more expensive cooling to fall back on. I don't know how much extra mileage you could get from that, but I don't think Nintendo would want to spend a lot on it.

The Wu looks really interesting, but I'm not expecting a large step up from the PS360.
 
BRiT - I've told you about your "base power strictly by size" viewpoint. It's flawed IMO, yet you continue to run with it (on GAF as well, for those who don't know).

Surely you are confused. I do not post on GAF (nor would I ever).
 
I know X800/850 based cards were used in pre-final, G5 based 360 development kits, but I had thought there were 9800 based kits before that. There's some indication from the likes of The Inquirer that they did exist (they were talking about G5 and 9800 based dev kits some months before the x800 was released).



The intake vents and fan are about the same size on the Wuu case as they were on the GC. Unless that fan is spinning a lot faster than the GC's, and/or the heatsink is more expensive/sophisticated so as to be better at transferring heat from the chip(s?) to the air, you're looking at similar heat output from the chips.

I don't think Nintendo would want much (if any) more fan noise, so aside from small airflow improvements from better fan design that leaves you with only more expensive cooling to fall back on. I don't know how much extra mileage you could get from that, but I don't think Nintendo would want to spend a lot on it.

The Wu looks really interesting, but I'm not expecting a large step up from the PS360.

I did see a couple of mentions of the 9800 a week or so ago when I first looked at all this, but went with the X800 due to more references. So if you're saying that too, I'd believe there's merit to it.

The thing about that comparison is that we don't know the threshold of what the GC or Wii cooling could handle. Wii's TDP is comparable to some tablets, and it clearly has more space to work with. And like I mentioned, they added (so far) a couple of smaller vents.

Surely you are confused. I do not post on GAF (nor would I ever).

I'll take your word for it. There was a Mr. Brit who would say that also. My apologies. :smile:
 
With the current case design it's bigger than the Wii, the second vent is bigger, and two smaller ones were added. It isn't compensating for an internal HD. The optical drive is more than likely not bulky, just like Wii's.

Also, this system is designed to lie flat and not vertical. I can only guess they didn't allow for the other option, like the Wii, to control the flow of heat.
 