Wii U hardware discussion and investigation *rename

You brought up a good point. The early rumors stated the Wii U was 50% more powerful than what we have now... my question is: exactly how powerful is 50% more?
Technically, it'd mean rendering exactly the same scene and assets at a 50% higher framerate or resolution or a mix of the two, although other features (e.g. alternative lighting models) would count as an improvement without being quantifiable. If you're using the GPU efficiently though, higher framerate/resolution should be the most apparent advantage.
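To put rough numbers on it, here's a back-of-envelope sketch (assuming performance scales linearly with pixel throughput, which real workloads only approximate):

Code:
# Back-of-envelope: what 50% more GPU throughput buys, assuming
# performance scales linearly with pixels drawn per second.
base_w, base_h, base_fps = 1280, 720, 30
base_throughput = base_w * base_h * base_fps   # pixels per second

boosted = base_throughput * 1.5

# Spend it all on framerate at the same resolution:
print(boosted / (base_w * base_h))             # 45.0 fps

# Or spend it all on resolution at the same framerate:
scale = 1.5 ** 0.5                             # per-axis scale factor
print(base_w * scale, base_h * scale)          # ~1568 x ~882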

Does he there basically confirm the Wii U is 50% more powerful than what we have, and if so, again: how powerful is 50% more powerful?
That's PR fire-fighting, like Kutaragi saying PS3 has remarkable potential waiting to be tapped, yet it's in the same performance ball-park as the XB360 and not magically more powerful. Never go to corporate executives or press releases to gain an understanding of a device. We have much better reference material - games showing no obvious improvements that would come with much more (50% more) powerful hardware, tiny dies, leaks and trustworthy rumours, hacker investigations revealing clocks. We could really do with eDRAM specifics and a final nailing of the GPU class, but we have a pretty good picture now, and there's not much that can change that picture significantly. The only real shocker would be if the eDRAM implementation is both very fast and 'smart' like the XB360's, but that should be apparent in launch titles getting 'free AA', which they're not.
 
No, he is saying that their "improvements" still have "50% more optimization to go". Whatever that may mean.

Yeah, I was wondering and never got a decent answer. I know people hate the numbers game and random multipliers, but it's something to think about, and if anyone should know, he should.
 
Yeah, I was wondering and never got a decent answer. I know people hate the numbers game and random multipliers, but it's something to think about, and if anyone should know, he should.

It is a new system with new architecture, so what he is saying is a given. What the others are saying, though, is to not make it more than what it is. We have a rough idea of what the Wii U should be capable of, so that part is not a big mystery. The bigger question is: what will devs do with that power? We will have to wait and see on that.

If the Wii U has any sort of "hidden power", it would not be raw power but the special features and customizations of the GPU. What realistic expectations should we have of that?
 
Are you purposely being obtuse, or are you just consistently missing the point and logic we are trying to bring to the thread? I understand what you are saying, but I am trying to show you how all of what you say doesn't add up.


How about you answer my question, what exactly do you think devs will accomplish with the Wii-U? Do you honestly believe it will keep up with the other next gen consoles?

I answered in another post, but yes.


What you should be asking yourself is if the Wii-U is so capable as you assume, why do devs need to reach the full potential of the system to outclass the PS360? If the leap in performance were there to begin with, devs wouldn't have to dig that deep to find it.

I never said the Wii-U has to be maxed out to outclass the PS360. I just said give the def


This isn't going to happen and is nothing more than a pipedream for Nintendo fans. Why would developers be asking MS and Sony for more power if they plan to lead development on the Wii-U? Why would MS and Sony even invest in new consoles if they thought next gen development would lead on the Wii-U?
The reason is this. In some cases there will be Wii U games that will be exclusives. Some developers would prefer to port a game rather than build it from the ground up for the new MS and Sony consoles. Also, rather than building, say, three versions of the same game from the ground up, developers may prefer to build from the one they are most familiar with, which would be the Wii U. Like in art: if you have the basic framework of something drawn, it is a lot easier to just color it in than to do all of the drawing and the coloring too.



No need to be so defensive. We can't compare the console market to the handheld market because Nintendo has had a history of dominating that market, not so with home consoles.

The reason I brought it up had to do with expectations. What most people expected didn't happen. That was my point.

You have no idea how well or poorly the Wii-U will sell.
I say you have no idea of how well the Wii-U will sell.

Beyond power, there's also the architecture that needs to be considered. Regardless of how close the PS3 and 360 seem on the surface, each system has strengths and bottlenecks that differ in some big ways. Besides, while we all do it at some point, you can't just measure the "power" of a system in such a general way. There are many things to consider when you look at how a system is expected to perform. So just because the PS3 and 360 produce similar results, that doesn't mean porting would be easy. That kind of question shows a level of naivety on your part.

For someone trying to disprove my point, you sound like me trying to prove it lol. You hit the nail on the hammer there. Although hitting the hammer on the nail seems to make better literal sense, figuratively that's it right there.

Please don't stray off-topic. Just to clarify, though, ERP was talking about the success of third-party games and not the system itself. Time will tell if Nintendo will be able to satisfy third-party publishers enough for good support, but we already know that a lot of publishers will be betting on Microsoft's next console, and the PS4 to a lesser extent, at this time.

Straying off topic? Hmmm, let's see. I shall look above. Console Forum, check. Console Technology, check. Wii U hardware discussion and investigation, check.
My statements relate to the Wii U hardware. I may appear to be off track at times, but all of it relates to my initial discussions of the Wii U hardware.



The issue with your comparison of the PS3 vs Xbox 360 is that the PS3 is NOT several times stronger in every way than the Xbox 360. In fact, the PS3 has a weaker GPU (half the triangle setup, etc), so some GPU tasks have to be offloaded to its Cell processor to make up for it if you're doing Xbox 360 -> PS3 ports. Its split memory compared to the 360's unified memory apparently gave devs some issues too.

Every console probably has some advantage over another in some technical way no matter how small or obscure. I mean overall.

For the Wii U, it appears to have the opposite problem from the PS3 in some ways. From what we know, its CPU is clocked slower and does not have as many GFLOPS as either current-gen CPU. The processor, however, is probably more efficient in some tasks, and the GPU has been stated to be overall stronger. We are still not sure how much stronger it is at this time, but it is safe to assume that it is not an order-of-magnitude difference from the 360's GPU.

The specs for the PS4/Durango have not been confirmed, but they will likely be over 8x more powerful than current-gen overall, and probably at least several times stronger in almost everything than the previous systems (though probably not as theoretically powerful as Cell in some ways). That means that both will be at least several times more powerful than the Wii U and should be capable of outperforming it in graphics by brute force alone.

Now.. how much better some games will look may be arguable due to diminishing returns, budget, and personal tastes. For early ports, we may see enhanced 360 games at higher resolutions and framerates, or simply ports of the PC versions. The point is that they should be superior on a technical level.
I explained more about my stance in another post. I am trying to keep up with all of the replies lol.

Looking at games on a technical level is different from judging how appealing they look. Mario Galaxy, for example, is a generation behind games for HD systems in many technical ways, but looking at it from another perspective will give you various opinions.

The people at Beyond3D generally discuss things at a technical level, so that is something to consider when you make statements that you may not be able to back up with technical knowledge.

That's fine.

You are completely out of your depth here. You do not have the technological understanding to contribute to this thread. By all means hang around and learn and ask smart questions, but please don't derail this thread with off-topic and confused posts.



I am not out of my depth as I never claimed to be a 20-year development vet. I just use scientific principles and the technical knowledge I do possess. Just because you may not understand what I say does not mean all people do not. Maybe some things I say, or rather how I put them, are beyond your scope. I see many others agreeing with things I say. It also doesn't mean I am wrong. If you want to understand a philosophy different from yours, it is best to try to think like the people who hold it; otherwise you will think like yourself and not learn anything. All of my comments have related to what this thread is categorized as. Just because I seemingly change course, it all relates to my initial posts.

You are using an open-ended phrase. What is 'anything'? Is Wii U only showing 10% of its capabilities, or 50%, or 90%? When you look at the architecture as we know it, there's nothing complicated to using Wii U, so there's no reason to think usage will be well below capabilities. Yes, 3rd parties may be being cheap, but clearly not everyone is, including Nintendo. Trine 2, for example, is showing improvements. There are some improvements that don't need understanding or effort but just performance, like MSAA, and the lack of such features tells us a decent bit about the capabilities of Wii U (not much above PS360). Your other post gives a better point of reference:
By sharper I guess you mean higher resolution. That's a matter of RAM, ROPs, etc., and not developer skill. If there isn't higher resolution in launch titles, there's not likely to be a big improvement there. There will be sharper textures as a result of more RAM. Lighting and shading, yes, they'll be somewhat improved as it's DX10 versus DX9, although probably not a massive improvement. DX9 is pretty capable. So yes, Wii U will look better in the end, as I've said before. It's just not going to be massively better - the laws of physics tell us this. The small, low-wattage parts in Wii U will not be comparable to large, hot parts in the next-gen consoles. There's no need to wait on Wii U games to learn that.
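To make the RAM/ROPs point concrete, here's the sort of framebuffer arithmetic involved (a sketch assuming the rumoured 32 MB of eDRAM and everyday RGBA8/D24S8 formats):

Code:
# Would a 720p 4xMSAA framebuffer fit in a rumoured 32 MB of eDRAM?
width, height, msaa = 1280, 720, 4
bytes_color, bytes_depth = 4, 4     # RGBA8 colour, D24S8 depth, per sample

color = width * height * msaa * bytes_color
depth = width * height * msaa * bytes_depth
print((color + depth) / 2**20)      # ~28.1 MB -- it would fit in 32 MB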

For the purposes of this thread, Wii U's launch titles are adequate to tell us there isn't a massive amount of performance in there, which would be obvious in launch titles on cross-platform engines (UE3) managing higher framerates and better IQ as a result of the potent GPU. We have a ball-park performance metric, die sizes, rumours/leaks about chip names and flavours... Wii U's launch titles don't bring anything more to the discussion, unless they are exhibiting a particular effect like higher quality DOF.

It's too early to determine the true extent of the Wii U's power. There are many factors: the shorter pipelines, the eDRAM, the MCM, GPGPU, the low wattage, the customizations to the CPU and GPU. If the PS3 and Xbox 360 were of the same design, I probably would agree with you. Knowing certain numbers (this many GHz, that much RAM, this many transistors) does not mean you know what they mean in the context of performance. One thing I know as far as CPUs and speed go is this: why have we not hit the 20 GHz mark by now after hitting 1 GHz over a decade ago? Heat. That is the main enemy of achieving those speeds, hence the need for multiple cores to get around that enemy. Who's to say that the Wii U CPU has not achieved a balance of greater performance by bringing the heat down that much while seemingly looking ordinary? How many of you, when GTA 3 came out, thought you would be able to play it on your cell phone now? Look at the efficiency of cell phones, now with quad-core processors. They do HD video and are heading toward HD gaming, and do they have a heatsink with a fan that sounds like a helicopter? I think not. If one does, get a refund pronto!
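To be fair, the heat point is quantifiable: dynamic power scales roughly as P = C·V²·f, and voltage must rise with frequency, so power grows faster than linearly with clock speed. A toy sketch of why two slower cores beat one fast one (idealised scaling, nothing Wii U specific):

Code:
# Toy model of dynamic CPU power: P = C * V^2 * f, with V scaling
# roughly linearly with frequency (an idealisation).
def power(freq, base_freq=1.0, base_v=1.0, capacitance=1.0):
    v = base_v * (freq / base_freq)
    return capacitance * v**2 * freq    # ~cubic in frequency

print(power(2.0))       # one core at 2x clock: ~8x the power
print(2 * power(1.0))   # two cores at base clock: ~2x power, ~2x throughput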
 
I am not out of my depth as I never claimed to be a 20-year development vet. I just use scientific principles and the technical knowledge I do possess.
Which is inadequate to contribute to the technical investigation.

All of my comments have related to what this thread is categorized as.
Language barrier aside, your remarks about coding troubles are too wishy-washy and generic to be going anywhere. Prior to your and artstyledev's appearance in this thread, we were merrily discussing CPU architecture and GPU architecture and system bandwidth and all sorts of real subject matter, and I want to return to this topic.
 
Which is inadequate to contribute to the technical investigation.

Language barrier aside, your remarks about coding troubles are too wishy-washy and generic to be going anywhere. Prior to your and artstyledev's appearance in this thread, we were merrily discussing CPU architecture and GPU architecture and system bandwidth and all sorts of real subject matter, and I want to return to this topic.


I beg to differ. I seriously doubt that many techies on here, if any, are devs for the Wii U, have done some type of engineering for it, or are going under its hood and doing further diagnostics and examination with software. Since that is the case, how much technical investigating is going on? In addition, the thread title also says hardware discussion, and isn't that what is going on here?

As far as language goes, whether something is put in simple terms or more intricately, the main thing is that what is said is understood. Either way you put it, if that is not achieved then how it is said doesn't really matter. As far as coding goes, I simply meant you can't power your way to code. The more powerful a system is, the more efficiently it will run the code, but as the old saying goes, you can lead a horse to water but you can't make him drink. Just like you can't take an English-only-speaking person, repeatedly beat them with a baseball bat, and magically make them speak Romanian.


I believe there is a lot yet to be learned about the Wii U's power. If, in a few years, developers learn the ins and outs of it and begin to build more games from the ground up with similar results, then I would say you are right. So unless that happens, I won't concede without more definitive evidence.
 
theizzeee, there are devs and people who have worked in the business in one form or another for years on this forum. While they aren't the majority, you really might want to take the time to learn who they are.

Meanwhile, you have a lot to learn. Processors are governed by a lot of things, but power/heat tends to be the main driver; in anything with a decent design, a 100-watt system will crush a 25- or 50-watt system, etc.
 
In addition, the thread title also says hardware discussion, and isn't that what is going on here?

Technical discussion, that's not the sort that you've been contributing. In comparison to the real contributions, yours have been more on the level of "oh, the hardware comes in a white or black box and I'm sure it'll be really powerful!". So please give it a rest.
 
I believe there is a lot yet to be learned about the Wii U's power.
I know you do, but that's not based on anything substantial. It's a belief based on hope, not technical understanding. No matter how much experience the devs have with the hardware, there's only so much they are going to be able to accomplish with a 40 watt system, tiny tri-core 1.25 GHz, probably weak-sauce OoOE, PPC CPU, and a 550 MHz, probably RV770 class 150 mm^2 GPU including eDRAM*. The parameters of their operation leave a bit of margin for unknown reveals, but something like really fast BW in the eDRAM should be apparent in the launch games. That is obvious to most with a decent bit of system understanding. You don't need specific Wii U experience; 20+ years of following console and graphic tech developments, or even just a little while learning how different machines have worked over the years, is enough to tell us that 3rd party devs should be able to enable free MSAA on their 128+ GBps eDRAM if it existed.
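As a rough sanity check on why fast eDRAM would show up as near-free MSAA, here's a crude estimate of the ROP traffic involved (the overdraw figure is just an assumed average):

Code:
# Crude ROP bandwidth estimate for 720p60 with 4xMSAA.
pixels = 1280 * 720
samples, overdraw, fps = 4, 4, 60   # overdraw is an assumed average
rw_bytes = (4 + 4) * 2              # read+write of 4 B colour and 4 B depth

traffic = pixels * samples * overdraw * fps * rw_bytes
print(traffic / 1e9)  # ~14.2 GB/s: more than the whole rumoured 12.8 GB/s
                      # main bus, but trivial for 128+ GB/s eDRAM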

There is no magic or special sauce to Wii U. It's not like GC or PS2 with novel architectures that needed new ways of thinking to master any extra performance. It's tech 3rd parties understand very well from having worked with similar hardware the past 7 years, and the only substantial limitations at this point beyond hardware will be APIs. e.g. It's remotely possible that there's a truckload of BW in eDRAM but the reason we aren't seeing its clear application is the dev environment not being up to snuff yet. The console was clearly launched in Beta flavour and there may be a situation where using Nintendo's compulsory systems (if they are compulsory) means no AA until the systems are updated.

What there is no doubt about whatsoever is that next-gen consoles will look better than the best Wii U has to offer even if they don't launch for two years, assuming the rumours for them are accurate. The alternative isn't possible, just as impossible as PS2 games looking better than PS3 launch titles which really were a poor use of the PS3 hardware. Efficiency and optimisation only gets you so far. The hardware gap between PS4 and Wii U will be a significant leap (as others say, to not have a significant leap would mean PS4 games looking worse than PS3, meaning Sony fail to compete with their old system, which has never happened in a console release ever).

* And these speculated specs are the purpose of this thread: to determine what exactly they are, not how well the machine will or won't compete with Sony and MS's next machines. Hence, if you haven't anything to contribute regarding identifying the specifics of the GPU or the eDRAM features or BW, please refrain from posting.
 
What kind of operations could the Wii U GPU hypothetically be good at given the constraints of memory bandwidth?

The system has very low memory bandwidth, so either the functions must have very low bandwidth requirements OR they must fit into a very small slice of eDRAM, assuming that the eDRAM is flexible enough for developers to use as they see fit. So can someone with more knowledge tell us what kind of operations could be employed which fit the performance profile of the Wii U?

The only example I can really think of for techniques that take advantage of low-memory, high-compute environments is modern 3D rendering on mobile devices. Don't mobile devices like the iPad have poor external memory bandwidth compared to their compute performance? If so, couldn't the techniques used on mobile devices point to possible paths for optimising the Wii U? Perhaps these techniques have signatures which could be spotted if one knew what one was looking for.
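One way to formalise "very low bandwidth requirements" is arithmetic intensity: FLOPs of work per byte fetched from main memory. Using one rumoured configuration (160 ALUs at 550 MHz and 12.8 GB/s of DDR3, none of it confirmed):

Code:
# Arithmetic intensity needed to keep a rumoured Wii U GPU busy from RAM.
alus, clock = 160, 550e6       # rumoured ALU count and clock (unconfirmed)
flops = alus * 2 * clock       # 2 ops per ALU per clock (multiply-add)
mem_bw = 12.8e9                # rumoured main memory bandwidth, bytes/s

print(flops / 1e9)             # 176 GFLOPS
print(flops / mem_bw)          # ~13.8 FLOPs per byte: math-heavy work only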
 
The more powerful a system is, the more efficiently it will run the code
No, or at least not necessarily in terms of power or transistors. In terms of time, sure, but that's not the first thing that comes to mind. I don't say a Ferrari is more efficient at getting to 60mph than a Dacia. Heck, a dual-core Pentium 4 is more powerful than a single-core Pentium 4, but I'm not sure I'd say it's more efficient.

Unless by powerful you meant out-of-order vs in-order, and by code you meant unoptimized code. While the WiiU is OoO vs. the PS360, the somewhat unimpressive, quick-n-dirty ports we've seen presumably aren't unoptimized in general.

I believe there is a lot yet to be learned about the Wii U's power.
Yes.

But arguing that it's packing a miracle in its 30W power envelope is the stretch that a lot of us are scoffing at.

If, in a few years, developers learn the ins and outs of it and begin to build more games from the ground up with similar results, then I would say you are right. So unless that happens, I won't concede without more definitive evidence.
And yet what definitive evidence are you basing your optimism on? That's the crux of the resistance that you're seeing.
 
I know you do, but that's not based on anything substantial. It's a belief based on hope, not technical understanding. No matter how much experience the devs have with the hardware, there's only so much they are going to be able to accomplish with a 40 watt system, tiny tri-core 1.25 GHz, probably weak-sauce OoOE, PPC CPU, and a 550 MHz, probably RV770 class 150 mm^2 GPU including eDRAM*. The parameters of their operation leave a bit of margin for unknown reveals, but something like really fast BW in the eDRAM should be apparent in the launch games. That is obvious to most with a decent bit of system understanding. You don't need specific Wii U experience; 20+ years of following console and graphic tech developments, or even just a little while learning how different machines have worked over the years, is enough to tell us that 3rd party devs should be able to enable free MSAA on their 128+ GBps eDRAM if it existed.
Hmm.. I don't think "weaksauce" is the correct word for the Wii U's CPU. It is definitely weaker in SIMD tasks than 360's CPU and Cell, but Broadway was surprisingly strong clock-to-clock with them in some things like general code. It is also clocked high considering its short pipeline. Each core is clocked around 70% higher than Broadway and has larger asymmetric caches built from eDRAM. It is probably roughly 6x Broadway in performance (5x raw power, + 33% more efficient per core?), and ~9x Gekko.
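For reference, the naive arithmetic behind an estimate like that looks something like this (clocks per the hacker leaks; the IPC gain is a pure guess, and real-world scaling would be lower):

Code:
# Naive scaling estimate for Espresso vs Broadway/Gekko.
espresso, broadway, gekko = 1.24e9, 729e6, 486e6   # reported clocks, Hz
cores, ipc_gain = 3, 1.33                          # IPC gain is a guess

print(cores * (espresso / broadway) * ipc_gain)    # ~6.8x Broadway
print(cores * (espresso / gekko) * ipc_gain)       # ~10x Gekko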

Were there any additions to Broadway from Gekko besides clocking the processor 50% higher?

There is no magic or special sauce to Wii U. It's not like GC or PS2 with novel architectures that needed new ways of thinking to master any extra performance. It's tech 3rd parties understand very well from having worked with similar hardware the past 7 years, and the only substantial limitations at this point beyond hardware will be APIs. e.g. It's remotely possible that there's a truckload of BW in eDRAM but the reason we aren't seeing its clear application is the dev environment not being up to snuff yet. The console was clearly launched in Beta flavour and there may be a situation where using Nintendo's compulsory systems (if they are compulsory) means no AA until the systems are updated.
Matt from neogaf (who does seem to know some things) did claim that there were some things still locked in the "final" SKU. Time will tell.

What there is no doubt about whatsoever is that next-gen consoles will look better than the best Wii U has to offer even if they don't launch for two years, assuming the rumours for them are accurate. The alternative isn't possible, just as impossible as PS2 games looking better than PS3 launch titles which really were a poor use of the PS3 hardware. Efficiency and optimisation only gets you so far. The hardware gap between PS4 and Wii U will be a significant leap (as others say, to not have a significant leap would mean PS4 games looking worse than PS3, meaning Sony fail to compete with their old system, which has never happened in a console release ever).
I believe it should be clarified we are indeed talking at a technical level. Something "looking" better is subjective and can lead to subject derailing. :smile:
 
Hmm.. I don't think "weaksauce" is the correct word for the Wii U's CPU.

Shifty was talking about the OoOE capability of the WiiU's CPU with his weaksauce comment. i.e., although it is an OoOE design, that element of the design is very simplistic and ineffective compared with modern OoOE designs, e.g. x86.
 
It is probably roughly 6x Broadway in performance (5x raw power, + 33% more efficient per core?), and ~9x Gekko.

A 33% IPC improvement, why? That kind of improvement is a huge deal. It usually means either a very different-looking core or fixing things that were very broken. While it's possible, there's no reason to assume it off the bat.

The 3x from core count increase is something you simply aren't going to actually get in anything. Would be curious to hear some estimates on how much 3 cores helps you. It'd do well in applications that were previously bandwidth limited; for all of the criticism Wii U's bandwidth has gotten, it's at least a lot better than the Wii's CPU-available bandwidth fed through a pretty slow bus.

Assuming that the reports of uarch and clock speed are roughly accurate, the big question in my mind is system memory latency (and to a lesser extent L2 latency, perhaps). It's DDR3 instead of GDDR3 or 5 like in the other consoles, so its absolute latency should be much lower. But if it's still going through the GPU, that'll be a hit. Still, it has the potential to be under half what it is on Xbox 360, which could be a big benefit. But Nintendo has to be using a competent memory controller for this.
 
Shifty was talking about the OoOE capability of the WiiU's CPU with his weaksauce comment. i.e, although it is an OoOE design, that element of the design is very simplistic and ineffective compared with modern OoOE designs, e.g. x86.
Yep, those commas are there for a reason. ;)

The 3x from core count increase is something you simply aren't going to actually get in anything. Would be curious to hear some estimates on how much 3 cores helps you.
A modern multithreaded engine should get close to 3x performance versus a single core - it's literally 3 single core processors, and as long as they aren't dependent on each other and working on independent workloads, they can run full tilt.
 
Hello everyone. As you can see from my post count and join date, I'm a newbie both here and in terms of actual technical knowledge. The reason I joined was to hopefully contribute to the discussion a little and learn much more about this kind of stuff. Hope I don't sound too out of touch with reality :LOL:



It's DDR3 instead of GDDR3 or 5 like in the other consoles, so its absolute latency should be much lower. But if it's still going through the GPU, that'll be a hit. Still, it has the potential to be under half what it is on Xbox 360, which could be a big benefit. But Nintendo has to be using a competent memory controller for this.

I've tried to research this a lot to no avail really, so maybe you can give me an answer: what is the general difference between DDR3 RAM and GDDR3 RAM? I've been going under the assumption that GDDR3 RAM like in the 360 and PS3 is better suited to GPU functions and DDR3 is a much more general-purpose RAM used in normal off-the-shelf computers.

But what actually separates them? And is there an advantage to Nintendo using DDR3 over GDDR3 besides the supposed cost savings? Or is that what it was plainly about?
 
Yep, those commas are there for a reason. ;)

A modern multithreaded engine should get close to 3x performance versus a single core - it's literally 3 single core processors, and as long as they aren't dependent on each other and working on independent workloads, they can run full tilt.

Heh.. I figured something was up with my interpretation, but I think I just wanted a reason to discuss the CPU some more.

How much will the bigger cache improve the CPU's performance? The core with 2MB of L2 cache is an interesting design choice.
 
A modern multithreaded engine should get close to 3x performance versus a single core - it's literally 3 single core processors, and as long as they aren't dependent on each other and working on independent workloads, they can run full tilt.

I don't need to be told it's literally 3 single core processors, please don't insult my intelligence :p You're making it sound as if threads are either dependent or they aren't, and you get either full parallel utilization or full serial utilization.

Go run some of your favorite PC games and log CPU utilization over some period of time. You will be hard pressed to find a game which sustains the same utilization over 3 or more threads. Game threading just doesn't tend to scale that evenly, even for fairly synchronously threaded loads. And 3 threads is kind of an awkward way to split loads for synchronous stuff.

Plus you have to account for various overheads: cores competing for shared resources, accessing concurrency structures, and general overhead in making algorithms more parallel friendly (for instance reproducing work on multiple threads).
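Amdahl's law puts rough numbers on those overheads: even with most of the frame parallelised, three cores fall well short of 3x (illustrative fractions, not measurements):

Code:
# Amdahl's law: speedup on n cores when a fraction p of the work scales.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.8, 0.9, 0.95):
    print(p, round(speedup(p, 3), 2))   # 2.14x, 2.5x, 2.73x -- never 3x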

Inuhanyou said:
I've tried to research this a lot to no avail really, so maybe you can give me an answer: what is the general difference between DDR3 RAM and GDDR3 RAM? I've been going under the assumption that GDDR3 RAM like in the 360 and PS3 is better suited to GPU functions and DDR3 is a much more general-purpose RAM used in normal off-the-shelf computers.

But what actually separates them? And is there an advantage to Nintendo using DDR3 over GDDR3 besides the supposed cost savings? Or is that what it was plainly about?

GDDR3 is derived from DDR2 and GDDR5 from DDR3. In a nutshell, it's optimized to achieve higher frequencies (and thus provide more bandwidth) at the expense of worse latency. When MS used one unified pool of GDDR3, they were favoring GPU performance (particularly texture fetches) over CPU performance (a really high L2 cache miss penalty), although they may have thought that streaming loads on the CPU would dominate and benefit from the improved bandwidth too.
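The bandwidth half of that trade-off is simple arithmetic: bus width times effective transfer rate. Plugging in the 360's known memory setup and the commonly rumoured Wii U one:

Code:
# Peak bandwidth = bytes per transfer * transfers per second.
def bandwidth_gbs(bus_bits, mega_transfers):
    return (bus_bits / 8) * mega_transfers * 1e6 / 1e9

print(bandwidth_gbs(128, 1400))  # Xbox 360 GDDR3 (700 MHz DDR): 22.4 GB/s
print(bandwidth_gbs(64, 1600))   # rumoured Wii U DDR3-1600: 12.8 GB/s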
 
theizzeee, there are devs and people who have worked in the business in one form or another for years on this forum. While they aren't the majority, you really might want to take the time to learn who they are.

Meanwhile, you have a lot to learn. Processors are governed by a lot of things, but power/heat tends to be the main driver; in anything with a decent design, a 100-watt system will crush a 25- or 50-watt system, etc.


I see there are many misunderstandings about me in this forum. I never claimed there were not people here who have worked in various technical fields. That is something I have already known. My point was that few if any here have firsthand knowledge of what is under the Wii U's hood in an official capacity. So there is a lot more speculating than investigating going on here. I am not talking about people commenting on their experiences, giving opinions from their professional backgrounds, or saying something about a teardown. At the end of the day, although those are very useful at times, nothing beats firsthand knowledge.

I will beg to differ about processors, though. The main thing that determines a CPU's power is efficiency: doing the most while exerting the least amount of energy. Heat is the major roadblock in that respect. One thing that holds back innovation is the belief that there is only one way of doing things. Holding on to one philosophy without being open to others that may be just as good as yours closes the door to many new possibilities.
 