Wii U hardware discussion and investigation

Right, my point was that the GPU is 800 GFLOPS, has three times the RAM available to it, and a much faster CPU. Your GDDR5 comment makes no sense, by the way, as both the 360 and the Wii U lack it. I'm not even sure you understand how to properly run the logic in your head. I am the one on NeoGAF who originally posted the 176 GFLOPS figure, so I know how to be realistic, but you can't even come to terms with the clusters being far too big for 20 ALUs. Denser ALUs and lower clocks are about the only way this makes any sense.

I am starting to think you are another poster from NeoGAF who has just come here to sound more technically apt. I'm not sure it's working, though, as reading the posts properly and noticing that that PC has much more power than the Wii U would give you some pause before stating that it hurts my point. That build is clearly struggling with the game despite having over twice the power we are assuming the Wii U to have, or four times the power in your case. Even given the resolution change, you can't make up that much ground with only 352 GFLOPS and a much slower CPU.

The reality is, the only thing pointing to 160 SPs right now while completely dismissing 320 SPs is a group of fanatics who have clearly never ported games to different platforms and would like nothing more than for the Wii U to lack even the basic ability to outperform last-generation consoles. I've left you to your ridiculous theories of R600 integration and 160 SPs. From developer comments we have been told that it is R700, Matt from NeoGAF clarified it to me directly in the Latte thread, and quite a number of developers have said the GPU is about 50% better. Well, considering 320 is about 33% more SPs than 240, while 160 is about 33% fewer, I think it is premature to say the Wii U is 160 SPs and that 320 SPs is out of the question.

I think function makes a much better point than you, since he's talking about 6 or 7 ports which were all inferior to the 360, while you cherry-pick one game that doesn't even fit the bill. Why don't you try finding 3 games running on an HD 5550 or similar that run worse than on the 360?
 
The kits the first demos were running on reportedly had the GPU clocked at 400MHz. Therefore, both the Zelda and the Japanese Garden demo were running on a 128 GFLOPS GPU - if the system really only has 160 ALUs.

Thought that was running on a dev kit and not Wii U final hardware. If it's not final hardware, I don't see how you came up with those numbers from that little bit of info.
 
Thought that was running on a dev kit and not Wii U final hardware. If it's not final hardware, I don't see how you came up with those numbers from that little bit of info.
Why would a devkit have more ALUs?

The first generation CAT-DEV V10001 kits were apparently considerably weaker than the final hardware.
 
While I am not an engineer, I do understand what I am talking about; bottlenecks arise in many different places, for various reasons. Early ports built from 360 games show that the Wii U isn't a 360 and that it has different weaknesses and strengths; little else can be said, especially about rushed ports. But if you want to continue that line of reasoning, I don't think I have the ability to explain why you are wrong.
A lot depends on the implementation of the ports. Ports from PS2 to Xbox, for example, needed complete rewrites. But 'ports' from a midrange PC to a high-spec PC, or from an Android TF101 to a Nexus 10.1, are effectively the same code. The architectural difference between Wii U and PS360 is not on the same scale as between hardware of different generations, so the historical reference to porting issues is not valid. There will be some loss of efficiency in the porting, but going from a DX9 part to a DX10 part should present very few headaches to the devs. Only if they have to do crazy memory management to struggle with bottlenecks (possible) or something are they likely to fail to get more from a more-capable GPU.

There's also the case of Nintendo first party games. If cross platform games are getting 160 SP performance from a 320 SP part, surely Nintendo's first party games should be that much visibly superior? Where does the lack of distinction between first and third party games fit in with the 320 SP argument?
 
I think function makes a much better point than you, since he's talking about 6 or 7 ports which were all inferior to the 360, while you cherry-pick one game that doesn't even fit the bill. Why don't you try finding 3 games running on an HD 5550 or similar that run worse than on the 360?

My point was proven. It wasn't that the Wii U is or is not more capable than the 360; it was that the PC port of a 360 game squandered drastically better hardware to produce something that ran worse than on the 360, though at a higher resolution. That is the logic of my argument, and once you admit that it is sound, you have to throw out early ports as evidence of performance. Wsippel also said it was unscientific on the last page, so I am not alone in this thinking. Just because multiple examples exist doesn't mean I have to go find them; I think most of us can name quite a few bad ports that run worse than the 360 version, even on machines with more ALUs available to them.
 
Why would a devkit have more ALUs?

The first generation CAT-DEV V10001 kits were apparently considerably weaker than the final hardware.

Because they used stock AMD cards. That's how people came up with the crazy 1+ TFLOP performance figures, based on the GPU in the early Wii U dev kits.
 
My point was proven. It wasn't that the Wii U is or is not more capable than the 360; it was that the PC port of a 360 game squandered drastically better hardware to produce something that ran worse than on the 360, though at a higher resolution.
When Function presented his argument, he talked very explicitly about the overheads and costs of PC rendering.

Wsippel also said that it was unscientific on the last page, so I am not alone in this thinking.
Wsippel was prefacing his own statement as being unscientific.

Wsippel said:
This* is completely unscientific, but strictly looking at Razor's Edge, the reported slowdowns seem to have nothing to do with the GPU.
This* = "what I am about to say".
 
I'm reading this HD 5550 comparison a lot lately, but that (40nm) chip just "eats" too many watts to be considered in the same form, and I also don't really understand the many TDP references, because - while TDP is indeed related to power consumption, especially at full load - it's really not the same thing.
 
A lot depends on the implementation of the ports. Ports from PS2 to Xbox, for example, needed complete rewrites. But 'ports' from a midrange PC to a high-spec PC, or from an Android TF101 to a Nexus 10.1, are effectively the same code. The architectural difference between Wii U and PS360 is not on the same scale as between hardware of different generations, so the historical reference to porting issues is not valid. There will be some loss of efficiency in the porting, but going from a DX9 part to a DX10 part should present very few headaches to the devs. Only if they have to do crazy memory management to struggle with bottlenecks (possible) or something are they likely to fail to get more from a more-capable GPU.

There's also the case of Nintendo first party games. If cross platform games are getting 160 SP performance from a 320 SP part, surely Nintendo's first party games should be that much visibly superior? Where does the lack of distinction between first and third party games fit in with the 320 SP argument?

iOS to Android ports have problems running the same game even with very similar hardware inside the devices, and the PS3 and 360 often struggle with each other's games when a title is designed on one and brought over later. The Wii U is no different, and there are more examples: Linux games happen to run worse unless the drivers are right. You talk about DX as if these consoles are using it, when we both know they are not; the modifications to the GPU are very hard to nail down. Also, this last part of your argument is lacking. 360 launch games look nothing like current 360 games, and Nintendo is in the same position. Even if it were not, I find the idea that anyone can point at a game and say "oh, that game has x number of ALUs" completely beyond ridiculous. Which games has Nintendo brought to the platform that are supposed to show off this ability you are speaking about? Nintendo Land or NSMBU? Those are the only two I know of.
 
My point was proven. It wasn't that the Wii U is or is not more capable than the 360; it was that the PC port of a 360 game squandered drastically better hardware to produce something that ran worse than on the 360, though at a higher resolution. That is the logic of my argument, and once you admit that it is sound, you have to throw out early ports as evidence of performance. Wsippel also said it was unscientific on the last page, so I am not alone in this thinking. Just because multiple examples exist doesn't mean I have to go find them; I think most of us can name quite a few bad ports that run worse than the 360 version, even on machines with more ALUs available to them.

Still, his point remains valid: 6 or 7 game ports didn't show any advantage compared to the 360, and in fact most were inferior. Can you list 3-5 360 games that run worse on a 320 SP GPU? That would help your point.
 
Still, his point remains valid: 6 or 7 game ports didn't show any advantage compared to the 360, and in fact most were inferior. Can you list 3-5 360 games that run worse on a 320 SP GPU? That would help your point.

The problem is those games are not qualified; we don't know if it's a GPU bottleneck or a RAM or CPU bottleneck. Most of the slowdowns have to do with transparencies, which the 360 handles directly on the ROPs, correct? The problem with this sort of thinking is that it is simply too general and treats the GPU as the entire performance factor for both consoles.

I don't really understand why I even have to explain bottlenecks here; I am positive there are many posts on this forum about them. The fact that these ports all seem to have different problems, which the others don't share, points against drawing conclusions about any system's performance from them.
 
My point was proven. It wasn't that the Wii U is or is not more capable than the 360; it was that the PC port of a 360 game squandered drastically better hardware to produce something that ran worse than on the 360, though at a higher resolution. That is the logic of my argument, and once you admit that it is sound, you have to throw out early ports as evidence of performance. Wsippel also said it was unscientific on the last page, so I am not alone in this thinking. Just because multiple examples exist doesn't mean I have to go find them; I think most of us can name quite a few bad ports that run worse than the 360 version, even on machines with more ALUs available to them.

Okay, let's see the arguments presented:

- function shows that an HD5550, with a core configuration matching what people are speculating for Wii U (320SP, 16TMU, 8ROP, 550MHz) vastly outperforms an XBox 360, and that XBox 360 slightly outperforms Wii U in most cases (games from several developers)
- You show that an HD4670 (also 320:16:8) at 800MHz performs similarly to Wii U while pushing twice as many pixels.

And you think you're making a stronger argument than he is?? He's right, your argument is only strengthening his claim, not yours.

You say that the HD4670 has 800GFLOP/s, that's clearly wrong and makes me wonder if you know how to compute them (it's 320 * 2 * 0.8GHz = 512 GFLOP/s). You refer to the apparent performance of the video but it doesn't matter because the post makes it clear that FRAPS is ruining it, only the FPS figure in the description matters. You say that it "only has higher resolution" but most of a GPU's resource utilization scales with resolution. You say the PC has an advantage because it has a better CPU but if the game were consistently CPU limited it'd be running at a higher resolution. You say that the PC has an advantage due to having more RAM but that doesn't play into this in the slightest.

The only thing that makes this comparison tricky is the big difference in memory hierarchy. The video cards have something very different from Wii U, that much we know for sure. We don't know if Wii U's eDRAM is enough to make up the difference from having a very weak main RAM bandwidth. Even trying to guess at this would require more knowledge of how the eDRAM can be used. So it's possible that main RAM bandwidth is holding back performance in every case so far (even the games where Wii U does best still don't perform nearly as well as they would on a 320SP PC part) but that'd mean Nintendo didn't do a good job balancing the design.
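
For reference, all of the peak-FLOPS numbers being thrown around in this thread (176, 352 and 512 GFLOPS, plus the 128 GFLOPS dev kit figure) fall out of the same simple formula used above: shader processors × 2 FLOPs per cycle × clock. Below is a minimal sketch, assuming only the speculated SP counts and clocks quoted in this thread rather than any confirmed specs:

```python
# Theoretical peak shader throughput: SPs * 2 FLOPs per SP per cycle (MADD) * clock.
# The Wii U entries reflect the thread's speculation, not confirmed hardware specs.

def peak_gflops(shader_processors: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS for a VLIW5-era AMD part."""
    return shader_processors * 2 * clock_ghz

configs = [
    ("Wii U, 160 SP @ 550 MHz (speculated)", 160, 0.550),
    ("Wii U, 320 SP @ 550 MHz (speculated)", 320, 0.550),
    ("Early CAT-DEV kit @ 400 MHz, assuming 160 SP", 160, 0.400),
    ("Radeon HD 5550, 320 SP @ 550 MHz", 320, 0.550),
    ("Radeon HD 4670, 320 SP @ 800 MHz", 320, 0.800),
]

for name, sps, clock in configs:
    print(f"{name}: {peak_gflops(sps, clock):.0f} GFLOPS")
# -> 176, 352, 128, 352 and 512 GFLOPS respectively
```

None of this says anything about real-world throughput, of course; it is only the theoretical ceiling the thread keeps citing.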
 
This* = "what I am about to say".
Almost everything in this thread is unscientific. It's all conjecture and speculation, with a ton of confirmation bias thrown in for good measure. Some people claim that the GPU can't have 320 ALUs because most ports perform worse than on 360, I claim that in at least one case, the slowdown doesn't even appear to be caused by the GPU. So maybe the GPU isn't the culprit in BLOPS and Batman, either.
 
In 2012 however, I can't help wondering why you wouldn't just include 1 Broadway core and, say, 2 far more capable Bobcat cores. You might even be able to put the Bobcat cores on the GPU die (assuming 40 nm TSMC). Perhaps the cost of including both types would be too high though, and the need for BC won out.

Or perhaps Bobcat cores just aren't more capable, let alone far more capable, despite the assumption that Bobcat is a new design and Espresso a design from 1998 (not even Gekko is a design from 1998).

Almost everything in this thread is unscientific. It's all conjecture and speculation, with a ton of confirmation bias thrown in for good measure. Some people claim that the GPU can't have 320 ALUs because most ports perform worse than on 360, I claim that in at least one case, the slowdown doesn't even appear to be caused by the GPU. So maybe the GPU isn't the culprit in BLOPS and Batman, either.

It's also worth mentioning Trine 2, a game which runs with better shader quality and at a higher resolution on Wii U than on the 360 or PS3.
 
The problem is those games are not qualified; we don't know if it's a GPU bottleneck or a RAM or CPU bottleneck. Most of the slowdowns have to do with transparencies, which the 360 handles directly on the ROPs, correct? The problem with this sort of thinking is that it is simply too general and treats the GPU as the entire performance factor for both consoles.

I don't really understand why I even have to explain bottlenecks here; I am positive there are many posts on this forum about them. The fact that these ports all seem to have different problems, which the others don't share, points against drawing conclusions about any system's performance from them.

We understand bottlenecks, but for so many games not to show any kind of improvement is mind-boggling, and that's why many people here agree with his theory, which makes more sense than Nintendo's engineers being incompetent.
 
We understand bottlenecks, but for so many games not to show any kind of improvement is mind-boggling, and that's why many people here agree with his theory, which makes more sense than Nintendo's engineers being incompetent.

Or the games are limited by being developed for the 360, unless you think these ports unlocked all of the Wii U's power, because that is basically the discussion here: that the Wii U is performing a bit worse than the 360 because it simply is a bit worse than the 360 at running 360 games. Like I said, it only really shows that the Wii U isn't a 360; if you understand bottlenecks, you should be able to understand what I mean when I say that.

As some have pointed out, Trine 2 runs better than its PS3 and 360 counterparts, and the developers have gone on record saying that they couldn't do some of the things they are doing on the Wii U on the PS3/360.

I mostly started posting again because the seeming acceptance of 160 SPs is bizarre to me, given that we have seen so little of the Wii U and it could go either way. If the measurements of the die really supported 160 SPs, I'd easily accept it, but the clusters are bigger than R800's 20 SP clusters, and IIRC R700 has smaller ALUs than R800. Of course, even that doesn't make up the whole difference, or there would be no debate and we would just say, yep, that is 320 ALUs.
 
I'm not sure what the argument is about.

From the end results I think we can draw some conclusions.

If the Wii U is really 320 SP, the number of ports that show it not exceeding 360 performance indicates that somewhere along the pipeline there's a bottleneck preventing the Wii U from being all that capable.
This effectively points to a big fat engineering fail on Nintendo's end, as it's a hardware issue that neither first-party nor third-party devs seem able to solve.

If it's 160 SP, then the numbers generally add up much better and are more in line with general understanding, and the Wii U is just a, well, relatively weak machine that's pretty efficient at ~30W.

I don't think professionals can be so spectacularly bad, so I agree with the 160 SP argument.

Either way the Wii U will still be considered underpowered. Arguing that it's 320 SP probably ends up opening more cans of worms than one would like.
 
We understand bottlenecks, but for so many games not to show any kind of improvement is mind-boggling, and that's why many people here agree with his theory, which makes more sense than Nintendo's engineers being incompetent.

It's just new hardware, which needs time. I would be really surprised if the GPU lacked the juice to do more than what the 360 is doing graphically, even if it's not much better after all (probably that's the case, imho).
I think one of the Factor 5 magicians (or perhaps someone from the God of War team? can't really recall, sorry) said that you need about five years with a modern console to be able to get the most out of it.
The best minds of the entire industry have been hacking and torturing the 360 and the PS3 for almost a decade now, so I think it's really unfair to expect such a new system to just perform like those out of nowhere with games that were designed with a stronger CPU and a weaker GPU in mind.
 
Okay, let's see the arguments presented:

- function shows that an HD5550, with a core configuration matching what people are speculating for Wii U (320SP, 16TMU, 8ROP, 550MHz) vastly outperforms an XBox 360, and that XBox 360 slightly outperforms Wii U in most cases (games from several developers)
- You show that an HD4670 (also 320:16:8) at 800MHz performs similarly to Wii U while pushing twice as many pixels.

And you think you're making a stronger argument than he is?? He's right, your argument is only strengthening his claim, not yours.

You say that the HD4670 has 800GFLOP/s, that's clearly wrong and makes me wonder if you know how to compute them (it's 320 * 2 * 0.8GHz = 512 GFLOP/s). You refer to the apparent performance of the video but it doesn't matter because the post makes it clear that FRAPS is ruining it, only the FPS figure in the description matters. You say that it "only has higher resolution" but most of a GPU's resource utilization scales with resolution. You say the PC has an advantage because it has a better CPU but if the game were consistently CPU limited it'd be running at a higher resolution. You say that the PC has an advantage due to having more RAM but that doesn't play into this in the slightest.

The only thing that makes this comparison tricky is the big difference in memory hierarchy. The video cards have something very different from Wii U, that much we know for sure. We don't know if Wii U's eDRAM is enough to make up the difference from having a very weak main RAM bandwidth. Even trying to guess at this would require more knowledge of how the eDRAM can be used. So it's possible that main RAM bandwidth is holding back performance in every case so far (even the games where Wii U does best still don't perform nearly as well as they would on a 320SP PC part) but that'd mean Nintendo didn't do a good job balancing the design.

You basically saved (Function) from needing to type anything.
 