Wii U hardware discussion and investigation

Some ideas are so wrong and tiresome that after a while, we'd rather just not discuss them. The Wii discussions got pretty pathetic at times, with such arguments as the names of the chips telling us they'd be pretty potent, and a spurious PR comment proving the machine had a new, secret physics-processing unit. There are plenty of people arguing about the technology because they want it to be better, and not because they want to understand the nature of the machine, which is never a healthy starting point for a scientific investigation.


Of course, and those sorts of posts aren't helpful either. It's just that the post he was referencing had some valid points, which, even if incorrect, at least deserved to be discussed with the OP. He wasn't being outlandish; he was trying to be reasonable in saying the CPU isn't "7 year old tech", for various reasons. That's a fair question to raise, even if it doesn't mean the machine will actually show any improvement later down the line, or even if it's inaccurate. The guy was hypothesising that the OoOE on the CPU could alleviate the lower clock speed (which was discussed before) and that maybe this was Nintendo's reasoning for underclocking it so much. I don't think it will alleviate the lack of clock speed, as most developers aren't going to have the incentive or time to take advantage of things like that (assuming it even would make a difference) when their objective is to get the game in working order on the system ASAP so it can go on sale at launch. And that's fair enough. It's Nintendo's fault for not seeing that that would happen, and for not just giving developers more horsepower to 'plough through' any problems they have with porting their games.

Although, as I think someone pointed out, the compiler used should mean an IoE-optimized piece of code works just fine in OoOE (if not, that's a huge problem), so it likely won't have been too much of an issue.

All of the above could have been explained to the guy, instead of just "lol-ing" it off as garbage. It was in a speculation thread after all, and those with knowledge should correct/inform those without.


On that subject: I'd like some clarification, as I don't know enough about it and I'm trying to figure out what Nintendo were thinking when they left the clock speed of the CPU so low whilst coming out and saying the machine was designed with receiving cross-platform ports (from systems with relatively powerful CPUs) in mind...

- Can you drop code optimized for IoE onto an OoOE CPU and expect it to run no probs?
- Is it the compiler that sorts this out, and is there room for further optimisation?
- Is it likely these early ports were pushing the CPU to its limits?

All serious questions - not loaded in any way, as I'm not clever enough to answer them on my own :)
 
What games?

Because nothing I have seen supports your statement.


It's a minefield of a subject. "Better looking" and "better performing" are two different things.

From what I've read, BLOPS 2 looks better than the PS3 version (and as good as the 360 version) at first glance. See these quotes from the start of DF's article.

All of this means that the new version gets the same 880x720 native resolution, with the same sharp upscaling filter employed on the Xbox 360 release. There's even 2x multi-sampling anti-aliasing (MSAA) - the first time we've seen hardware AA deployed in-game on a Wii U title (admittedly, it's early days there).

....Pretty much everything we said about the image quality on Xbox 360 in the original Face-Off applies here with the Wii U version - it's a remarkably close conversion, from a visual perspective at least. While the PC game provides the benchmark, Treyarch's upscaling algorithms work just as well as they do on Xbox 360, providing a good, clean presentation - quite remarkable bearing in mind just how extensive the horizontal upscale actually is. Despite the removal of the additional blurring added in the 1.02 patch, both Wii U and 360 are still a class apart from the more compromised PlayStation 3 version. One minor difference concerns v-sync - the Wii U version has no tearing whatsoever, while the PS3 and 360 games occasionally tear right at the top of the screen (this is basically unnoticeable during gameplay though).

So far so good...
BUT! The problem is that the framerate is lower on average than the PS3 version's (by some 9fps), which in turn is lower than the 360 version's. And in some sections it drops well below the others (although it exceeds the PS3 sometimes too). So overall, that game has been given a thumbs down.

Now, to many, the fps part is the important bit. But many others might be happy with it looking the same or better while not running quite as smoothly.

Things like this are why we're seeing some say that some of the ports look better, while others are saying they're worse.
 
I think this is just semantic damage control. How a game moves is part of how it looks - you can't even see important game elements like animation and the visual representation of simulation without movement.

Anyway, Black Ops 2 looks worse on the WiiU than on the Xbox 360, at least if these screenshots are from a final version of the game:

http://forum.beyond3d.com/showpost.php?p=1680416&postcount=1832

And then there's The Batman's shadows too (+ bonus frame rate problems):

http://forum.beyond3d.com/showpost.php?p=1680158&postcount=1830

And there's also Darksiders 2 with its missing trees (and longer load times):

http://www.youtube.com/watch?v=5nha4XiXnSg

Maintaining the same resolution and scene complexity is a tainted victory if your frame rate has to go to crap to do it. Considering the small amount of silicon (from mature processes) and tiny power consumption of the WiiU, what it's managing to do is quite impressive. From a gamer's perspective though, for a new system running multi-platform games, what it's putting on screen certainly isn't.
 
I'm still discounting some of these issues as launch-game generation shortcomings, which is VERY COMMON with new hardware. Yes, wuu is disappointingly underpowered, but if devs had had more experience with the system, been able to spend more time and effort on porting the game (limited now, due to the fact wuu had zero installed base, so money return on effort spent is an issue), or possibly if wuu development software and documentation had been better, then some of the problems in these games would have been lessened or non-existent.

*shrug* Speculation? Sure. I just don't see the point in totally dismissing the wuu before it's even had a chance to prove itself for real. Let's wait a couple of months. Aliens: Colonial Marines, for example, has been hyped to be good on wuu; maybe Raven or Gearbox or whoever is porting the game has overhyped it, I dunno. Maybe it really will be the best version. The game's coming out in March next year, I think, and then we'll know the truth (about this game, anyway. :))
 
Since BLOPS comparisons have been abundant over the last page or so, here is footage so that people reading can compare for themselves, as well as the internet allows: WiiU vs. 360.
Frankly, the fundamental question is answered.
Q. Can the WiiU accept ports of multiplatform titles produced mainly targeting PS360?
A. Yes. It has already happened.

So can we please move on?

- Can you drop code optimized for IoE onto an OoOE CPU and expect it to run no probs?
- Is it the compiler that sorts this out, and is there room for further optimisation?
- Is it likely these early ports were pushing the CPU to its limits?

All serious questions - not loaded in any way, as I'm not clever enough to answer them on my own :)

A1. Yes. It's the other way around that may present problems. OoOE helps alleviate the impact of badly scheduled code, so migrating code that runs OK on an OoOE design to an in-order one can yield unexpected performance issues.
A2. Compilers are never perfect, but unless there are great changes under the hood, I'd assume that even the WiiU one wasn't glaringly immature in terms of code quality. Bugs, on the other hand... The greater issue is how you organize and access your data structures, but that is mostly not in the hands of the compiler.
A3. Difficult to answer for anyone who hasn't profiled the code (i.e. anyone but the actual programmers). In all likelihood, if the code is reasonably balanced, there will be instances where the CPU is limiting, but it would be very strange if that were the case all the time, as it would correspondingly mean that all other resources were underutilized all the time. Even for ports, I find that hard to believe; I trust the guys working in the trenches wouldn't leave it like that, even when hounded by project leaders frothing at the mouth due to impending deadlines. :)
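To make A1 and A2 a bit more concrete, here's a minimal, generic C sketch (my own illustration, not anything from an actual Wii U codebase). The first loop is one long dependency chain through the accumulator, so an in-order core stalls for the full FP-add latency every iteration unless the compiler schedules around it; the second exposes two independent chains whose latencies can overlap. OoOE hardware tends to find that overlap on its own at run time; an in-order core relies on the compiler (or programmer) arranging the code this way.

```c
#include <stddef.h>
#include <stdio.h>

/* One long dependency chain: every add waits on the previous result,
   so each iteration pays the full FP-add latency on an in-order core. */
double sum_chained(const double *a, size_t n) {
    double acc = 0.0;
    for (size_t i = 0; i < n; i++)
        acc += a[i];              /* next iteration depends on acc */
    return acc;
}

/* Two independent accumulators: their adds can overlap in the pipeline.
   This is the kind of schedule a compiler targeting in-order hardware
   tries to produce, and that OoOE hardware approximates dynamically. */
double sum_unrolled(const double *a, size_t n) {
    double acc0 = 0.0, acc1 = 0.0;
    size_t i = 0;
    for (; i + 1 < n; i += 2) {
        acc0 += a[i];             /* chain 0 */
        acc1 += a[i + 1];         /* chain 1, independent of chain 0 */
    }
    if (i < n) acc0 += a[i];      /* odd leftover element */
    return acc0 + acc1;
}

int main(void) {
    double a[8] = { 1, 2, 3, 4, 5, 6, 7, 8 };
    printf("%f %f\n", sum_chained(a, 8), sum_unrolled(a, 8));
    return 0;
}
```

(Per A2, note that neither version helps if `a` is laid out so every access misses cache - data organization stays the programmer's problem on any core.)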
 
Since BLOPS comparisons have been abundant over the last page or so, here is footage so that people reading can compare for themselves, as well as the internet allows: WiiU vs. 360.
Frankly, the fundamental question is answered.
Q. Can the WiiU accept ports of multiplatform titles produced mainly targeting PS360?
A. Yes. It has already happened.

So can we please move on?



A1. Yes. It's the other way around that may present problems. OoOE helps alleviate the impact of badly scheduled code, so migrating code that runs OK on an OoOE design to an in-order one can yield unexpected performance issues.
A2. Compilers are never perfect, but unless there are great changes under the hood, I'd assume that even the WiiU one wasn't glaringly immature in terms of code quality. Bugs, on the other hand... The greater issue is how you organize and access your data structures, but that is mostly not in the hands of the compiler.
A3. Difficult to answer for anyone who hasn't profiled the code (i.e. anyone but the actual programmers). In all likelihood, if the code is reasonably balanced, there will be instances where the CPU is limiting, but it would be very strange if that were the case all the time, as it would correspondingly mean that all other resources were underutilized all the time. Even for ports, I find that hard to believe; I trust the guys working in the trenches wouldn't leave it like that, even when hounded by project leaders frothing at the mouth due to impending deadlines. :)

I see, thank you very much for the concise answers. It makes a bit more sense now :)


Back to the BLOPS thing:

Maintaining the same resolution and scene complexity is a tainted victory if your frame rate has to go to crap to do it. Considering the small amount of silicon (from mature processes) and tiny power consumption of the WiiU, what it's managing to do is quite impressive. From a gamer's perspective though, for a new system running multi-platform games, what it's putting on screen certainly isn't.

I agree, it's a hollow victory if the game's IQ looks the same as or better than the other versions' while the game's performance (and therefore how you interpret the visuals) is compromised.

Those two BLOPS2 screenshots you linked certainly don't paint the WiiU version in a nice light, do they. I'd suggest watching the comparison video though, as it's not quite as cut and dried as those screenshots would imply. Imo, at least. It still doesn't change the framerate issues, which, as you say, undermine any other perceived gain.
 
Frankly, the fundamental question is answered.
Q. Can the WiiU accept ports of multiplatform titles produced mainly targeting PS360?
A. Yes. It has already happened.

So can we please move on?
That was never in doubt. The question is what GPU is in there, or at least what's the level of capability of the GPU. A lack of significant improvement implies a lack of significant power, given the architectural designs we know of.

What we really need are eDRAM details and shader details. That'd be enough to get an understanding of what the GPU is and how it compares in terms of performance, performance/watt, and the alternatives Nintendo could have gone with.
 
Since BLOPS comparisons have been abundant over the last page or so, here is footage so that people reading can compare for themselves, as well as the internet allows: WiiU vs. 360.
Frankly, the fundamental question is answered.
Q. Can the WiiU accept ports of multiplatform titles produced mainly targeting PS360?
A. Yes. It has already happened.

So can we please move on?

Actually I think the BLOPS Digital Foundry test is worth dwelling on as it might tell us something about the GPU / CPU.

The game shows no signs of tearing, which means vsync. And given that the game spends some big chunks of time at exactly 30fps (two vblank intervals per frame), it probably isn't using triple buffering either (probably for input latency reasons). On the other hand, there are points in the game where the frame rate is neither 60 nor 30 fps, so there is some kind of buffering going on, though possibly in terms of the CPU part of updates.

So, it may be that where you see the game drop to 30 fps, the GPU is choking (how badly, you can't tell if it's vsync + double buffered), and where it's between 30 and 60, the CPU is banging its head. Or at least, it seems reasonable that it might be the case.
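To illustrate the vsync + double-buffering arithmetic behind this (a generic sketch, not based on any actual Wii U API or DF measurement data): with double buffering, the swap has to wait for the next vblank after rendering finishes, so any frame time even slightly over one 60 Hz interval snaps the displayed rate down to 30, then 20, then 15 fps.

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;   /* one vblank interval */
    const double gpu_times_ms[] = { 15.0, 17.0, 25.0, 34.0 };
    for (int i = 0; i < 4; i++) {
        /* Double buffering delays the swap to the next vblank boundary
           after the frame is done, quantizing the display rate. */
        int intervals = (int)ceil(gpu_times_ms[i] / refresh_ms);
        printf("GPU frame %.1f ms -> shown every %d vblank(s) = %.1f fps\n",
               gpu_times_ms[i], intervals, 60.0 / intervals);
    }
    return 0;
}
```

A 17 ms frame thus displays at 30 fps just like a 25 ms one, which is why a vsynced, double-buffered game can hide how badly the GPU is actually missing its budget.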

Anyone: are there any WiiU games that show signs of tearing? (The DF article on Mass Effect 3 doesn't mention it, but does show some solid 20 fps areas in the analysis.) If not, it may be that Nintendo haven't made soft vsync available to developers. If that's the case, allowing it might let the WiiU avoid some of its most grievous frame rate plunges.

Could someone from DF find out if this is the case? It might tell us a lot (or at least something).

A3. Difficult to answer for anyone who hasn't profiled the code (i.e. anyone but the actual programmers). In all likelihood, if the code is reasonably balanced, there will be instances where the CPU is limiting, but it would be very strange if that were the case all the time, as it would correspondingly mean that all other resources were underutilized all the time. Even for ports, I find that hard to believe; I trust the guys working in the trenches wouldn't leave it like that, even when hounded by project leaders frothing at the mouth due to impending deadlines. :)

This is why I'm interested in looking at the DF frame rate analysis for various games. If there's anything in the above speculation, then it'd seem both the CPU and GPU are indeed getting tested. If there was a chunk of GPU power lying around you'd expect developers to make like a PC gamer and bump up the res.
 
I agree, it's a hollow victory if the game's IQ looks the same as or better than the other versions' while the game's performance (and therefore how you interpret the visuals) is compromised.

Those two BLOPS2 screenshots you linked certainly don't paint the WiiU version in a nice light, do they. I'd suggest watching the comparison video though, as it's not quite as cut and dried as those screenshots would imply. Imo, at least. It still doesn't change the framerate issues, which, as you say, undermine any other perceived gain.

There are things that could be done to minimise some of the frame rate issues, which is baffling. Some kind of controlled tearing would probably lift the frame rate somewhat on those 30fps bits, and there's always the option of dropping the resolution a little (most people wouldn't notice, especially with decent scaling).

Likewise, in cases where there are awful jaggy shadows, I've seen it speculated (I think by AlStrong and sebbi) that one reason might be needing to fit them in the eDRAM alongside the frame and Z buffers. Dropping the resolution to gain some time, and outputting a larger shadow buffer to main RAM, might allow that to be worked around too.
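For a rough sense of the numbers involved, here's a back-of-envelope sketch. It assumes the rumoured 32 MB of eDRAM plus a 720p target with 32-bit colour and 32-bit depth and a single 2K 32-bit shadow map; none of these figures are confirmed, they're just plausible placeholders for the squeeze being speculated about.

```c
#include <stdio.h>

int main(void) {
    const double MB = 1024.0 * 1024.0;
    double color  = 1280.0 * 720 * 4;     /* 32bpp colour target      */
    double depth  = 1280.0 * 720 * 4;     /* 32-bit depth/stencil     */
    double shadow = 2048.0 * 2048 * 4;    /* one 2K 32-bit shadow map */
    printf("color %.1f MB + depth %.1f MB + shadow %.1f MB = %.1f MB\n",
           color / MB, depth / MB, shadow / MB,
           (color + depth + shadow) / MB);
    /* ~23 MB already with one shadow map; 2x MSAA roughly doubles the
       colour and depth footprints (~30 MB total), and cascaded shadow
       maps multiply the shadow cost - hence the pressure to shrink
       shadow buffers or spill them to main RAM. */
    return 0;
}
```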

It's likely that WiiU developers will find (or be allowed by management to make) more comfortable sets of compromises over time, even if outright performance doesn't increase a lot.
 
Funny, because your posts on NeoGAF state that you feel the WiiU is more powerful, and that the reason the ports don't run well is a "different philosophy" and trying to "brute force" "old methods of game pipelines", and other such BS.

You also claim that people who think the WiiU is only around as powerful as the 360 have "a hidden agenda"!

My nick on NeoGAF is XtremeXpider; I didn't write any of those quoted statements ;)
 
Actually I think the BLOPS Digital Foundry test is worth dwelling on as it might tell us something about the GPU / CPU.

I have no issues with your analysis, per se. It's interesting in its own right.
But we can't know why the frame rates vary as they do.

(For example, we have no way of knowing if the game even uses the eDRAM at all. If the WiiU GPU is a bit stronger in terms of ALUs and a bit weaker in terms of main memory bandwidth, and they port the game over and it runs acceptably, will the group in charge of the port mess with what's not broken? Particularly when there are bugs to squash, and they need to come up with uses for the second screen, which in turn needs to be tested, and... For the CPU, just consider that they've rolled their own sound code for consistency across different platforms. Some code monkey who can't defend himself gets the thankless task of trying to rewrite the underlying SIMD code to something that utilizes Broadway-style twinned FP. That mostly goes OK, but the system chokes a bit when tasked with too much at the same time; then again, that doesn't occur too often in the game, so it's deemed acceptable.)

I'm not suggesting either of the two have occurred in this case; I'm just emphasizing that we don't know what's happening on the inside - we can only observe the overall net result. And I felt that comparing launch ports was destined to drag the discussion into platform wars and the "looks better" morass.

I tend to agree with Shifty on this one - in order to evaluate the design architecturally, we need shader and data path details. Someone leak the PDFs, please. And I'm still interested in the rendering technique angle - if you were making a game for the platform ab initio, how would you take advantage of the system, and how would that differ from how it would be done on, for instance, the PS3/360 or PC? Would it be visually detectable? (For instance, if achieving A is comparatively cheap and B is comparatively expensive, what would be the visual result of games using A and not B?)
 
Call Of Duty 2: Xbox 360 launch title
[screenshot]

Call Of Duty Black Ops 2: Xbox 360, ~7 years later
[screenshot]

Call Of Duty Black Ops 2: Wii U launch title
[screenshot]

Would that not be a fairer comparison?
 
I'm waiting to read that it has a puny 80 SP VLIW5 design. :devilish: ;)
Still doesn't change the fact that the system is underwhelming; at the resolution the game is rendered, I would bet that some salvage Llano/Trinity parts are competitive, and more than that.
But let's be optimistic: the system will catch up with the 360 and maybe slightly outdo it after a few years.
 
Why would that be fairer? As I've already tried to explain, a lot of the knowledge gained on the X360/PS3 is not platform-specific:
- multithreaded programming and algorithm / data structure design
- advanced rendering theory like HDR and gamma correct linear lighting pipeline, energy preserving shaders, cascaded shadow maps, various lighting techniques, streaming etc.
- advanced content creation methods and software tools - there's been 7+ years of R&D invested just into these, including off the shelf software and internally developed custom tools

All of this was completely new and unknown to studios developing X360 launch games, but it's routine today. The only, only difference is the Wii U architecture, and if Nintendo was stupid enough not to capitalize on 7 years of experience gained on other platforms, then they deserve all the negative consequences.
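To make one item on that list concrete - the gamma-correct linear lighting pipeline - here's a minimal, platform-agnostic sketch (my own illustration, using the common 2.2 power approximation rather than the exact piecewise sRGB curve): decode sRGB inputs to linear light, do the shading math there, then encode back for display. The point is that this know-how transfers to any hardware.

```c
#include <math.h>
#include <stdio.h>

/* Approximate sRGB <-> linear conversion via a plain 2.2 gamma curve. */
static double srgb_to_linear(double c) { return pow(c, 2.2); }
static double linear_to_srgb(double c) { return pow(c, 1.0 / 2.2); }

int main(void) {
    double albedo_srgb = 0.5;   /* mid-grey texture value (encoded)  */
    double light = 0.5;         /* light at 50% intensity (linear)   */

    /* Wrong (gamma-space) shading: multiply encoded values directly. */
    double wrong = albedo_srgb * light;

    /* Correct: shade in linear space, then encode the result. */
    double right = linear_to_srgb(srgb_to_linear(albedo_srgb) * light);

    printf("gamma-space: %.3f  linear-space: %.3f\n", wrong, right);
    return 0;
}
```

The gamma-space result comes out noticeably darker (0.250 vs roughly 0.365 here), which is the classic over-darkening artifact that gamma-correct pipelines were adopted to fix.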
 
Would that not be a fairer comparison?

I think it's fair to say at this point that it would be unrealistic to think software for the Wii U won't improve over time, or that what we see now is what we can expect for the lifetime of the system.

I don't think in 2005 anyone would have expected to see the visuals (and even performance) on the 360/PS3 that we are seeing currently.
 
I think it's fair to say at this point that it would be unrealistic to think software for the Wii U won't improve over time, or that what we see now is what we can expect for the lifetime of the system.

I don't think in 2005 anyone would have expected to see the visuals (and even performance) on the 360/PS3 that we are seeing currently.

Yes it will improve, but it's not reasonable to expect the same degree of improvement seen over the PS360 lifespan, because programmable shaders are no longer new and much of the learning has been done.

Things like the inability to run at a higher resolution are also unlikely to change: if a game were heavily CPU-limited, upping the resolution would be as simple as changing the setup code, so the fact that ports haven't done so suggests there's little GPU headroom either.

IMO, just look at the power draw, look at the die sizes, and assume Nintendo doesn't have any magic; that's probably the clearest picture we will get unless the docs leak.
 
Actually the visual jump in the images posted by almighty isn't that big... but then again they're both poorly chosen and at 60 fps there just isn't that much room to improve anyway.
 
I have no issues with your analysis, per se. It's interesting in its own right.
But we can't know why the frame rates vary as they do.
Yes, you can. There are clear indicators of bottlenecks in games. Nothing Nintendo does can change the fact that, e.g., higher resolution requires more pixel shader power, or that more intelligent entities on screen require more CPU power.

For example, we have no way of knowing if the game even uses the eDRAM at all.
I'm sure they don't render straight to a memory-mapped file. That would be absurd. Same goes for rendering to a render target that's not in embedded DRAM.

If the WiiU GPU is a bit stronger in terms of ALUs and a bit weaker in terms of main memory bandwidth, and they port the game over and it runs acceptably, will the group in charge of the port mess with what's not broken? Particularly when there are bugs to squash, and they need to come up with uses for the second screen, which in turn needs to be tested, and...
Their toolchain definitely uses one of the proven compilers, and it's not as if this is the only, I don't know, 1024-bit architecture (in terms of general-purpose register sizes) ever made; you can tweak stuff like this. Most of what developers know about the PS360 applies to the Wii U as well. It's not like every console is a new, completely different toy with new rules and stuff. If you know how to play Risk, you can play Super Duper Risk in Space as well. Wii U to PS360 is not like the shift from Tic-Tac-Toe to Go.
 