Wii U hardware discussion and investigation

What on earth makes you think Colonial Marines has been cancelled, or that it "might" be the final "notable" 3rd party game? Seems like you're just making up shite based on nothing at all to fit your own bias and fannishness.
 
What on earth makes you think Colonial Marines has been cancelled, or that it "might" be the final "notable" 3rd party game? Seems like you're just making up shite based on nothing at all to fit your own bias and fannishness.

There was a rumor of it being "indefinitely postponed", and the Wii U's sales are so abysmal that I can't see any third party investing in it six months from now unless there's a historic turnaround. I guess that last statement might have been a bit much, though at the current rate they're going to miss their revised forecast by about a million. Besides, with the game being the technical mess it is, I don't feel that the game should be used to judge any console.
 
So... has NFS:MW even been mentioned here, or has it only come up because of the extra RAM?

The RAM aspect was my viewpoint, but we did have a debate in here about that.

Higher-resolution textures, lighting and draw distance are all enabled by a larger amount of RAM.
 
Higher-resolution textures, lighting and draw distance are all enabled by a larger amount of RAM.

That applies to just the textures.

From the demo I think I noticed a lot of surfaces reflecting the surrounding area, not just lights. This seems to be missing in the 360 version. Worth discussing?
 
^ Arguing for what we may or may not see at E3 seems more like conjecture than anything we've been talking about.


Never said it would; I was referring to the high-resolution textures. Although the GPU being more powerful would contribute to that to a degree, as I've always assumed a 300-350 GFLOPS GPU. But I'd like to stress that I have no "interest" in making the Wii U out to be anything more than the information we've gleaned so far.
I have a Wii U, and I like playing games on it. But I've been stating for a while that there's no magic in this setup. Anything Criterion has gotten out of it has been through careful optimization of the elements we've always known are there.

We've always known that 1GB of RAM should bring theoretically higher resolutions, and that the GPU, being a theoretical 320-SPU part, should perform much better than the old-ass Xenos and RSX setups. My questions have been about why that has not been the case outside of one or two exceptions.
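For a rough sense of scale, here is a minimal paper-spec sketch in C of what that 320-SPU assumption implies against the known Xenos layout. The 320 SPs and 550MHz clock are this thread's working rumors, not confirmed figures, and peak FLOPS say nothing about real-world performance.

Code:
#include <stdio.h>

int main(void) {
    /* Rumored Latte figures: 320 SPs x 2 FLOPs (one MADD) x 550 MHz */
    printf("Latte peak: %.0f GFLOPS\n", 320 * 2 * 550e6 / 1e9);    /* 352 */
    /* Known Xenos figures: 48 ALUs x 5 lanes (vec4+scalar) x 2 FLOPs x 500 MHz */
    printf("Xenos peak: %.0f GFLOPS\n", 48 * 5 * 2 * 500e6 / 1e9); /* 240 */
    return 0;
}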
We should also take care to remember that mirroring images incurs no performance penalty. Only when the two screens show different things does the performance of the game start to be impacted, which is why the GamePad usually only shows lightweight material. It's not a bullet point proving the Wii U is hiding anything spectacular.


Also, the 360 has a tessellation unit, so I'm not sure of your angle in regards to that. You mean the Wii U is in a more capable position to use its tessellation unit? That makes more sense.
I won't argue against how powerful the Wii U is in proportion to the 360/PS3; they all have their places against each other.

As for exotics:
I call the Cell an exotic piece of hardware. We know that the Wii U's GPU is based off the R700 series of GPUs, with custom arrays bolted onto it. That's not exotic to me.


Yes, I meant to say a more modern tessellation unit. AFAIK, Latte is rumored to house a fixed-function unit (a la the 360), but one that doesn't require explicit vertex fetches; rendering cracks have also been further reduced. When I speak of software at E3, it is certainly not conjecture. There are only two examples because proper documentation was not available before November, and Frozenbyte simply spent more time optimizing their engine to better exploit aspects of the Wii U hardware.

Nintendo's most recent SDK is still immature. Toolchains, developer support contracts, compilers, and support for the various languages that allow access to its unique PPC-based architecture are still very much in their infancy. So I'm sure you can understand this, as well as its direct implications for multi-platform software development. Rayman Legends should further showcase these enhancements in September, especially since the Wii U is its lead platform.

Again, we are discussing ported software with non-proprietary engines, in all likelihood using the 360 as the base platform. It's akin to looking at the PS3 ME3 port vs. the 360's and thinking, "Is this all the system is capable of?" That is not at all an accurate capability metric when set against the software output of Naughty Dog or Santa Monica Studios. The titles I have knowledge of were built from the ground up, using proprietary engines written specifically to exploit the Wii U's hardware strengths.

Who here is making the Wii U out to be more than what it is? I'm in total agreement with you regarding the theoretical specs. I am well aware that mirroring the screen incurs no performance penalty; I merely brought this system aspect up. Although it is a bit disingenuous to label mirroring that allows for seamless and instantaneous changes to the game world (as in NFS), as well as assisted car control (drifting etc.), a simple mirror a la Tekken TT. Regardless, I believe Black Ops II was simply a starting point, and these asymmetric gameplay experiences will only be refined further (certainly for 3rd parties, as Nintendo's API matures). Also, all game scenarios will vary in complexity and implementation, so yes, it is indeed a bullet point for the system.


I also recognise the Wii U's weaker central processor, albeit not as inept as previously thought; I stated as much here a very long time ago (despite the chip being part of a very GPU-centric design). That is why I qualified the statement by saying "overall more capable system." It was not meant to be a point-by-point comparative evaluation of power vs. the current HD twins, as all systems have their deficiencies. Exotic, or rather unorthodox, designs can vary by degrees (yes, the PS3's Cell architecture had slipped my mind). And I found your description of Latte as an R700 derivative with custom arrays "bolted on", as if an afterthought, quite telling; it's a tad more complex than your oversimplified GPU analysis. Anyhow, I hope the words of David Helgason, CEO of Unity, ring true: http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html


Gaming Blend: While the Wii U doesn't specifically use DirectX functionality, will the Unity Engine for the Wii U allow for DirectX 11 equivalent functionality in regards to shaders, soft and self shadowing as well as potential scalability for Shader 5.0 or higher?

Helgason: Yeah. We'll do a... We'll make it potentially possible to do so.


The RAM aspect was my viewpoint, but we did have a debate in here about that.

Higher-resolution textures, lighting and draw distance are all enabled by a larger amount of RAM.

Again with the magical RAM enabling all effects. I believe you have been corrected twice now, excluding myself. Why you persist along this incorrect line of thought is puzzling; your agenda, perhaps?
 
Yes, I meant to say a more modern tessellation unit. AFAIK, Latte is rumored to house a fixed-function unit (a la the 360), but one that doesn't require explicit vertex fetches; rendering cracks have also been further reduced.
Very interesting. Perhaps this was the feature of AMD's Gen-2 tessellation unit for the R700 series that was not used outside of a demo, or it may be enhancements of the older tessellation unit that AMD improved just for Latte.
 
Thread title key English words: Wii U, hardware

Things not in thread title: MyLife purchases, PlayStation, speculation about non-hardware
 
OK, I was thinking: if we assume that we're homing in on the rough ballpark specs of the Wii U, as this article presumes, especially with regards to the power of the Wii U's GPU being roughly a cut-down version of the HD 4670:

...being analysed, but the core fundamentals are now seemingly beyond doubt. The Wii U GPU core features 320 stream processors married up with 16 texture mapping units and featuring 8 ROPs. After the Wii U's initial reveal at E3 2011, our take on the hardware was more reserved than most. "We reckon it probably has more in common with the Radeon HD 4650/4670 as opposed to anything more exotic," we said at the time. "The 320 stream processors on those chips would have more than enough power to support 360 and PS3 level visuals, especially in a closed-box system."

It was ballpark speculation at the time based on what we had eyeballed at the event, but the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz.
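Taking the article's claimed figures at face value (16 TMUs and 8 ROPs at 550MHz, all unconfirmed), here is a quick sketch of what they imply on paper next to the desktop HD 4670's published 32 TMUs, 8 ROPs and 750MHz core:

Code:
#include <stdio.h>

int main(void) {
    /* Article's claimed Latte figures: 16 TMUs, 8 ROPs @ 550 MHz */
    printf("Latte texel fill: %.1f GTexels/s\n", 16 * 550e6 / 1e9); /*  8.8 */
    printf("Latte pixel fill: %.1f GPixels/s\n",  8 * 550e6 / 1e9); /*  4.4 */
    /* Desktop HD 4670 published specs: 32 TMUs, 8 ROPs @ 750 MHz */
    printf("4670  texel fill: %.1f GTexels/s\n", 32 * 750e6 / 1e9); /* 24.0 */
    printf("4670  pixel fill: %.1f GPixels/s\n",  8 * 750e6 / 1e9); /*  6.0 */
    return 0;
}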

It's easy to know what the capabilities of the graphics card are. I remember being able to play Batman: Arkham Asylum at a decent 30 frames per second at 1280x800 (roughly 720p) with a few stutters later in the game. What about the CPU, though? What could we compare that with to get an idea of its actual capabilities? On paper it looks like an archaic design straight from 1998, when Pentium IIs were the mainstream CPUs, but I'm pretty sure IBM would not use a design that is over 14 years old, ignoring all the progress in CPU tech in those years. It may have much in common with that design, but surely it has been improved a lot? If we compared it to, say, one Athlon core in a dual-core Athlon, then assuming all three cores in the Wii U's CPU were fully utilized, it would be roughly equivalent to a 2.0GHz dual-core Athlon. I'm no processor designer, though, so I have no idea whether it's significantly better or worse than that.
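A minimal sketch of that aggregate-clock comparison, assuming the ~1.24GHz Espresso clock reported around this time (not an official figure) and ignoring IPC differences entirely, which in reality dominate any such comparison:

Code:
#include <stdio.h>

int main(void) {
    /* Reported Espresso clock: ~1.24 GHz across 3 cores. Aggregate clock
       is a crude stand-in for throughput; the IPC gap between a 750-class
       core and an Athlon core is deliberately ignored here. */
    printf("Espresso aggregate:  %.1f GHz\n", 3 * 1.24); /* ~3.7 */
    printf("Athlon X2 aggregate: %.1f GHz\n", 2 * 2.0);  /*  4.0 */
    return 0;
}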
 
OK, I was thinking: if we assume that we're homing in on the rough ballpark specs of the Wii U, as this article presumes, especially with regards to the power of the Wii U's GPU being roughly a cut-down version of the HD 4670.
We probably shouldn't assume that. The DF article is worthless. "We told you so," yet nothing they wrote makes sense. The GPU doesn't look like an HD 4670 at all, cut down or otherwise. The shader clusters make no sense, what they've identified as TMUs doesn't really look like TMUs, and we have no idea where the ROPs are, let alone how many there are. The only blocks we can identify without any doubt are the shader clusters, and neither their size nor the number of register files matches what you'd expect from an R700.

And while the CPU definitely is a member of the 750 family, Nintendo 750s are no 1998 designs to begin with. They were always quite different, with several new features and a heavily extended instruction set. Despite their age, they seem to keep up extremely well with more modern designs.
 
Despite their age, they seem to keep up extremely well with more modern designs.
...Compared to what? PowerPC PPE designs as in Cell and Xenos, possibly; on integer workloads, anyway, and likely because of the OoOE support. Any recent x86 is going to stomp all over the G3 derivative that is in the GC/Wii/Wuu simply by virtue of being over ten years more recent in design.

The Wuu CPU's Achilles' heel is its total lack of SIMD support, of course. Unless Nintendo snuck something to that effect in there literally without telling anyone about it, that is... It's incomprehensible how they could order a CPU without SIMD in this day and age. Unbelievable!
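To make concrete what's at stake, here is a minimal C sketch of the kind of FMADD loop where SIMD pays off. Scalar code retires one multiply-add per instruction; an AltiVec-style vec4 FMADD retires four per instruction. The Gekko line's paired singles, if one counts those, sit in between at two wide.

Code:
#include <stddef.h>

/* Scalar version: one FMADD's worth of work (2 FLOPs) per iteration. */
void saxpy_scalar(float *y, const float *x, float a, size_t n) {
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

/* What a vec4 FMADD unit buys: four elements per instruction. Written
   here as a plain unrolled loop standing in for AltiVec/SSE intrinsics
   so it stays portable; real code would use vector registers. */
void saxpy_vec4_shape(float *y, const float *x, float a, size_t n) {
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {   /* one vec4 FMADD's worth per pass */
        y[i+0] = a * x[i+0] + y[i+0];
        y[i+1] = a * x[i+1] + y[i+1];
        y[i+2] = a * x[i+2] + y[i+2];
        y[i+3] = a * x[i+3] + y[i+3];
    }
    for (; i < n; i++)             /* scalar tail */
        y[i] = a * x[i] + y[i];
}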
 
I love splitting hairs :D

For me the lack of integer SIMD would really hurt, but I may be in the minority on this one.. integer SIMD is practically all I've used..
 
...Compared to what? PowerPC PPE designs as in Cell and Xenos, possibly; on integer workloads, anyway, and likely because of the OoOE support. Any recent x86 is going to stomp all over the G3 derivative that is in the GC/Wii/Wuu simply by virtue of being over ten years more recent in design.


What about old x86? :)
The Pentium 3 was nice. Dual P3 rigs definitively killed the dinosaur workstations (SPARC, MIPS, Alpha), plus it competed nicely as a single CPU against the Gekko in the console space.
If you duct-taped three Pentium 3s together, they would probably compete against the Wuu CPU.
 
...Compared to what?
Compared to some recent x86 netbook CPUs.

The Wuu CPU's Achilles' heel is its total lack of SIMD support, of course. Unless Nintendo snuck something to that effect in there literally without telling anyone about it, that is... It's incomprehensible how they could order a CPU without SIMD in this day and age. Unbelievable!
True. Just imagine what a sad bunch those recent netbook x86 CPUs would've been if those ancient G3s had the slightest semblance of a SIMD unit. *shudders at the thought*
 
True. Just imagine what a sad bunch those recent netbook x86 CPUs would've been if those ancient G3s had the slightest semblance of a SIMD unit. *shudders at the thought*

Sad bunch? I think a 2GHz Atom can easily hold its own against a <1.25GHz G3 with "real" SIMD added...
 
In FP SIMD? Well, we'd have to decide what the "real" SIMD added to the G3 would be. Are you OK with AltiVec?

You wouldn't just have to define the ISA; you'd also have to define the execution units of the processor and the problem at hand...

For instance, if it's FMADD-heavy work and the G3 has one vec4 FMADD unit, the Atom could probably keep up; if it has two vec4 FMADD units, then maybe not.
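Rough peak figures for that scenario, treating everything as sustained FMADD or mul+add issue (generous for both chips); the G3 clock and unit counts are purely the hypotheticals under discussion:

Code:
#include <stdio.h>

int main(void) {
    /* Hypothetical G3 + AltiVec-style vec4 FMADD at 1.25 GHz:
       4 lanes x 2 FLOPs per FMADD, per unit, per clock. */
    printf("G3, 1 vec4 FMADD unit:  %.0f GFLOPS\n", 1.25 * 4 * 2 * 1); /* 10 */
    printf("G3, 2 vec4 FMADD units: %.0f GFLOPS\n", 1.25 * 4 * 2 * 2); /* 20 */
    /* 2 GHz Atom, assuming it could sustain one 4-wide mul plus one
       4-wide add per clock (generous; no FMA, real issue rates lower). */
    printf("Atom, mul+add/clock:    %.0f GFLOPS\n", 2.0 * 4 * 2);      /* 16 */
    return 0;
}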
 
You wouldn't just have to define the ISA; you'd also have to define the execution units of the processor and the problem at hand...

For instance, if it's FMADD-heavy work and the G3 has one vec4 FMADD unit, the Atom could probably keep up; if it has two vec4 FMADD units, then maybe not.
I have access to a 7447A, which should be as close to the Atom on paper as it gets: the 7447's SIMD cluster is entirely in-order, including permutes, with a sole FMADD unit (actually, a sole FP SIMD unit).
 