Wii U hardware discussion and investigation *rename

Is DDR4 really so slow as to be an issue? I had heard it was inherently twice as fast as DDR3 memory. Would that not help depending on the clocks and amount/speed of eDRAM?

It isn't inherently faster, given the clock speeds of early parts and their affordability.
Compare DDR2 PC2-8500 and DDR3 PC3-8500: a somewhat lopsided case, but the bandwidth is the same, and the faster memory is actually the DDR2 because of its lower latencies.
Here, with DDR3 having had a long run, maturity, and high clock speeds at affordable prices, we might see DDR3-2133 competing with DDR4-2133.
Of course faster DDR4 will be available, maybe from the start (at least DDR4-2400), and then it will ramp up.
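As a back-of-envelope illustration, here's a quick sketch of peak bandwidth versus first-word latency. The module speeds and CAS latencies are typical retail figures assumed for illustration, nothing tied to any of these consoles:

```python
# Back-of-envelope comparison of peak bandwidth and first-word latency
# across same-rated DDR generations. Module ratings and CAS latencies
# below are illustrative assumptions, not confirmed console specs.

def peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits=64):
    """Peak bandwidth in GB/s: transfer rate (MT/s) x bus width (bytes)."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

def cas_latency_ns(transfer_rate_mts, cas_cycles):
    """CAS latency in ns: cycles / I/O clock (half the transfer rate)."""
    io_clock_hz = transfer_rate_mts * 1e6 / 2
    return cas_cycles / io_clock_hz * 1e9

modules = [
    ("DDR2-1066 (PC2-8500), CL5", 1066, 5),
    ("DDR3-1066 (PC3-8500), CL7", 1066, 7),
    ("DDR3-2133, CL11", 2133, 11),
    ("DDR4-2133, CL15", 2133, 15),
]

for name, mts, cl in modules:
    print(f"{name}: {peak_bandwidth_gbs(mts):.1f} GB/s peak, "
          f"{cas_latency_ns(mts, cl):.1f} ns CAS latency")
```

Same rated bandwidth, but the older generation wins on latency until clocks ramp up, which is exactly the point being made above.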

If you order something like 10 million chips to be delivered before late 2013 for a launch, I don't know what you can get, though. The new Xbox could even use fast DDR3; the CPU won't complain about the lower latency. (It's more something I don't rule out than something I predict, mind you :p)
 
Trine 2 isn't a technically intensive game to begin with, in any real fashion. Considering that the Wii U version is coming out after the other versions, it makes sense to improve it in some ways. And if you're working only on optimizing the Wii U SKU, the strengths of the platform will show as well.
Huh? The devs already had experience with PS3 and Trine. You really want to suggest that Trine 2 was a weak implementation on PS3 and they just put in a bit more effort on the Wii U port?

It has a more mature graphical feature set and shader model, and much more eDRAM.
If the bandwidth is severely reduced by comparison (ROP BW is almost certainly far lower), that eDRAM isn't going to help achieve more.
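To put rough numbers on the ROP bandwidth point, a quick sketch; the Wii U ROP count and clock here are pure assumptions, and only the 360's widely documented 256 GB/s internal eDRAM figure is solid:

```python
# Rough peak write traffic generated by the ROPs. The Wii U figures
# are assumptions for illustration; the 360 numbers are public specs.

def rop_write_gbs(rops, clock_mhz, bytes_per_pixel):
    """GB/s if every ROP retires one pixel per clock."""
    return rops * clock_mhz * 1e6 * bytes_per_pixel / 1e9

# Assume 4 bytes of colour plus 4 bytes of depth/stencil per pixel.
xenos = rop_write_gbs(rops=8, clock_mhz=500, bytes_per_pixel=8)
wiiu = rop_write_gbs(rops=8, clock_mhz=550, bytes_per_pixel=8)  # guess

print(f"Xenos ROP demand: ~{xenos:.0f} GB/s "
      "(fed by 256 GB/s inside the 360's eDRAM daughter die)")
print(f"Assumed Wii U ROP demand: ~{wiiu:.0f} GB/s; with a much "
      "narrower eDRAM bus, the ROPs stall rather than gain anything")
```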

Don't know what Nintendo was thinking, assuming that a large amount of eDRAM bandwidth would actually make up for all the other bottlenecks in the console, but people like Shifty may have the knowledge to give that answer.
Indeed I do - Nintendo are crazy! :p
 
It isn't inherently faster, given the clock speeds of early parts and their affordability.
Compare DDR2 PC2-8500 and DDR3 PC3-8500: a somewhat lopsided case, but the bandwidth is the same, and the faster memory is actually the DDR2 because of its lower latencies.
Here, with DDR3 having had a long run, maturity, and high clock speeds at affordable prices, we might see DDR3-2133 competing with DDR4-2133.
Of course faster DDR4 will be available, maybe from the start (at least DDR4-2400), and then it will ramp up.

If you order something like 10 million chips to be delivered before late 2013 for a launch, I don't know what you can get, though. The new Xbox could even use fast DDR3; the CPU won't complain about the lower latency. (It's more something I don't rule out than something I predict, mind you :p)

I see...thanks for the info.

So Microsoft really could go either way. I had heard that DDR4 was poised to become much cheaper and more affordable in the coming years. I guess it's a 50/50 toss-up as to what they actually use. Do you think the choice between DDR3 and DDR4 has any effect on how much they might put in there?

Obviously Nintendo didn't really care and put in two sticks of DDR3 because it was probably the cheapest and most affordable option at the time... but for Microsoft or Sony, who want to look a few years down the road and stay on top of future-proofing, it's a valid question.
 
Huh? The devs already had experience with PS3 and Trine. You really want to suggest that Trine 2 was a weak implementation on PS3 and they just put in a bit more effort on the Wii U port?

Of course not; that's not what I was insinuating at all.

My main point was that the original Trine 2 was already out, and since it was not a demanding game in terms of hardware, the improvements made for the Wii U's later release aren't some magical indicator that the Wii U's capabilities are far beyond what we've already seen, nor that it's a matter of "lazy devs" that more games aren't coming out superior on Wii U.
 
Huh? The devs already had experience with PS3 and Trine. You really want to suggest that Trine 2 was a weak implementation on PS3 and they just put in a bit more effort on the Wii U port?

It is clear they did.

Trine 2 on PS3 is even poor enough to have a version of FXAA that was applied to the HUD.

Just because they had worked on the PS3 once before is hardly proof that they are skilled with it.
 
I remember one of the devs on Trine 2 saying that the more advanced physics system in the Wii U version was not a matter of the hardware needing to be more powerful or advanced, but simply of it becoming available after the PS3/360 launched, so they implemented it.
 
I remember one of the devs on Trine 2 saying that the more advanced physics system in the Wii U version was not a matter of the hardware needing to be more powerful or advanced, but simply of it becoming available after the PS3/360 launched, so they implemented it.

"Advanced" is too strong a word; I am pretty sure they said it was just a slightly newer version of PhysX.
 
I thought about it and, yeah, you guys are right. The GPU is definitely a significantly cut-down RV730 at 55nm. :/

So, this thing is definitively much, much weaker than the 360, right?

In case you are not being sarcastic: nearly everyone here is in agreement that the Wii U is, overall, a more powerful system than the 360. Launch titles are already roughly on par in performance, and it will get better with time. It is best not to expect such a giant leap in performance from launch to later games as we saw on the 360. It is also best not to expect Wii U games to ever surpass even launch 720/PS4 games at a technical level.

Wait. What about Trine 2? What's the explanation for that one? Could the PS3 and 360 versions have been intentionally nerfed? Or was there something different about that one?
There are several reasons:

1) Frozenbyte is a relatively small company and the devs designed their own engine. The advantage is that it would be easier for them to adjust and optimize their engine for a new system.

2) Trine is GPU-heavy and uses deferred shading. The Wii U seems to be designed to work well in that type of environment.
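For a sense of why a big eDRAM pool suits deferred shading, here's a rough G-buffer footprint calculation; the render-target layout is a generic assumption, not Frozenbyte's actual format:

```python
# Rough G-buffer footprint for deferred shading at 720p. The layout
# (four 32-bit colour targets plus depth) is a generic assumption,
# not Frozenbyte's actual render-target setup.

width, height = 1280, 720
bytes_per_target = 4      # e.g. RGBA8 or equivalent 32-bit formats
num_colour_targets = 4    # albedo, normals, material params, etc.
depth_bytes = 4           # 32-bit depth/stencil

gbuffer_mb = width * height * (num_colour_targets * bytes_per_target
                               + depth_bytes) / (1024 ** 2)
print(f"720p G-buffer: ~{gbuffer_mb:.1f} MB")
# ~17.6 MB: fits whole in the Wii U's reported 32 MB eDRAM pool,
# whereas the 360's 10 MB eDRAM forces tiling or a slimmer G-buffer.
```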
 
We can assume that, at least, the 720 and PS4 will have the ability to do 1080p native for the average game out of the box.

That's much more than we can say for the Wii U already; no amount of optimization is going to fix that.

The Wii U at its best will probably surpass 360's visuals, but not to any kind of exceptional degree.

I'm not including the PS3 in that, because that console is such a variable once a game optimized for offloading GPU processes onto Cell comes into play, compared to the 360's relatively conventional CPU and GPU setup. Take a game like GOW3, The Last of Us, Beyond or Killzone 2/3 and compare it to Wii U games even in five years' time; we'll see how it holds up.
 
We can assume that, at least, the 720 and PS4 will have the ability to do 1080p native for the average game out of the box.

That's much more than we can say for the Wii U already; no amount of optimization is going to fix that.

The Wii U at its best will probably surpass 360's visuals, but not to any kind of exceptional degree.

I'm not including the PS3 in that, because that console is such a variable once a game optimized for offloading GPU processes onto Cell comes into play, compared to the 360's relatively conventional CPU and GPU setup. Take a game like GOW3, The Last of Us, Beyond or Killzone 2/3 and compare it to Wii U games even in five years' time; we'll see how it holds up.

In terms of actual visuals, one unknown factor may be the customizations and additional features added to the GPU. Its base is reported to be something in the R700 series, which is DX10.1-equivalent. According to at least one source, the GPU supports some features beyond the SM4 equivalent... but then again, so did Xenos. It will be interesting to see if we ever find out what modifications were done.
 
In case you are not being sarcastic: nearly everyone here is in agreement that the Wii U is, overall, a more powerful system than the 360.
"Nearly everyone", according to what metrics? I have come to no such overall consensus, at least amongst knowledgeable people. "Overall" more powerful is a deceptively broad label, of what we KNOW, for sure, is that only the feature-set of the GPU itself is obviously more powerful. Well, advanced really.

The GPU's performance may be higher as well, but factors regarding system bandwidth, and possibly eDRAM bandwidth as well, put that assessment somewhat in doubt.

CPU performance is doubtful as well. Integer performance may be somewhat better; it's hard to say for sure due to the lack of hyperthreading and the suspected low clock speed of the CPU. We do know that wuu's float performance is incredibly weak compared to Xenon, and Cell in particular. The low system bandwidth may not affect the CPU as much, though, as CPUs tend to work out of their caches ~95-98% of the time on average. This is especially true if wuu's caches are in the 1-2MB/core range, as has long been rumoured, but again, the exact nature of the hardware is still incredibly opaque after all this time, which is in and of itself quite incredible really.
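To put numbers on that cache argument, here's a simple average-memory-access-time sketch; the hit rates and latencies are illustrative assumptions, not measured Wii U figures:

```python
# Average memory access time (AMAT) sketch: why a high cache hit rate
# insulates a CPU from slow main memory. All numbers are illustrative
# assumptions, not measured Wii U figures.

def amat_ns(hit_time_ns, miss_rate, miss_penalty_ns):
    """AMAT = hit time + miss rate x miss penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

hit_time = 2.0        # assumed cache hit latency in ns
miss_penalty = 100.0  # assumed DDR3 round trip in ns

for hit_rate in (0.95, 0.98):
    t = amat_ns(hit_time, 1 - hit_rate, miss_penalty)
    print(f"hit rate {hit_rate:.0%}: AMAT ~{t:.1f} ns")
# At 98% hits the average access costs ~4 ns; even a slow external
# bus only hurts the few percent of accesses that actually miss.
```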
 
With the Wii launching in 2006, we had it fully deciphered in maybe 2003/2004* but didn't realise that until 2006.

*Depending on when we fully understood GameCube.

Har har... I knew what you meant before the asterisk, but okay :p

Literally two GameCubes duct-taped together... that's new-age Nintendo for you.

Knowing what we know now, I wonder what kind of conditions the Bird and Zelda demos were running under to achieve those results. Obviously a demo in a closed environment is going to look a lot different from an actual game, but they still impressed me plenty; they made the FFVII PS3 demo look aged.
 
Obviously a demo in a closed environment is going to look a lot different from an actual game, but they still impressed me plenty; they made the FFVII PS3 demo look aged.
The PS3 FFVII demo was a realtime rendition of a CGI sequence from 1997. The CGI achievable in 1997 was obviously limited and not going to make the best use of modern realtime techniques.
 
I thought about it and, yeah, you guys are right. The GPU is definitely a significantly cut-down RV730 at 55nm. :/

No-one said it was definitely that.

So, this thing is definitively much, much weaker than the 360, right?

And definitely no-one said this.

Wait. What about Trine 2? What's the explanation for that one? Could the PS3 and 360 versions have been intentionally nerfed? Or was there something different about that one?

The most likely explanation is the one that's already been given and that you've ignored.

You should try putting together an argument instead of just stamping your foot. A good starting point might be: "why isn't the Wii U even demonstrating performance equivalent to an RV730 in the PC space?"
 
The PS3 FFVII demo was a realtime rendition of a CGI sequence from 1997. The CGI achievable in 1997 was obviously limited and not going to make the best use of modern realtime techniques.

The PS3 tech demo was more like using assets from Advent Children or Crisis Core.

[Image: FF7 PS1 CGI]

[Image: Advent Children]

[Image: FF7 PS3 tech demo]


But we don't know if that demo would still run on a real PS3; that was dev-kit stuff.
 
^ They obviously weren't assets from Advent Children; that's just ridiculous. Hell, a GTX 680 is only just now lifting assets directly from Visual Works CGI.

They would have had to have a super crazy render farm going to even attempt that sort of thing. And since they were attempting to give some sort of barometer for the PS3 (hence, tech demo), I doubt they were using more than one machine.

Again though, in a closed environment like a tech demo, anything can be prettied up.


And again, with the Luminous engine... I wonder if they could actually replicate the entirety of Advent Children by themselves, without the Visual Works studio, on just a GTX 680. That's completely insane, even if AC is old at this point.
 
The PS3 tech demo was more like using assets from Advent Children or Crisis Core.
You're right. Just looking up the original FFVII, it was nothing like the PS3 remake, and when you consider the timeline for CGI, that's understandable. It's much more like Advent Children (in appearance, for those confused by me saying the tech demo was a CGI recreation, which I now realise it wasn't!).

But we don't know if that demo would still run on a real PS3; that was dev-kit stuff.
IIRC it was running on PS3 hardware. It doesn't matter much, though. It was an early demo by SE, who had no experience with modern shaders, which shows; so it's hardly a reference point for the PS3 to be compared against the reference point of Nintendo's demo. We also had demos like the Alfred Molina head and the rubber ducks, also suitably removed from Nintendo's bird demo to be a useful comparison.
 