Wii U hardware discussion and investigation *rename

@XpiderMX: But this is a technical thread on a technical board. Unless you can back up this 2x thought, what you're doing is a straight path to ban. And believe it or not - you'd have someone with convincing arguments backing up your claim by now if it was at all in the realms of possibility.
 
@XpiderMX: But this is a technical thread on a technical board. Unless you can back up this 2x thought, what you're doing is a straight path to ban. And believe it or not - you'd have someone with convincing arguments backing up your claim by now if it was at all in the realms of possibility.

Ok, thanks for the advice, I will edit the "2x" statement.
 
I've read some people on NeoGAF saying things like "when developers build 'GPU-centric' games, Wii U will show its next-gen face"...

What does "GPU-centric" mean? Can devs stop using the CPU and start using the GPU for general-purpose code?

All this looks like Nintendo fans' words (IMO).

The idea that the Wii U has some great GPU performance advantage over Xenos needs to be dropped. Very low-performance computers run games like COD and ME better than the Wii U does, on top of Windows and with no optimizations for that specific hardware.

And the idea that the PS3/360 have great CPUs is flawed. In general-purpose power they were matched by PCs at launch, though of course developers try to exploit other strengths. Yes, even a 4-core Jaguar would likely bring multiple times the general-purpose power of the Wii U CPU.
 
Sounds like a lot of wishful thinking to me.
If it can't run games at higher resolution now, I doubt it ever will; about the easiest thing in the world to change is the resolution you're running at.
Resolution also isn't impacted by CPU performance.

Could ports be better on WiiU?
Probably, but I don't know how much better.

Can WiiU compete with 720/PS4 if they constitute a significant performance improvement? That I very much doubt.

Will WiiU be the lead platform for next gen games? I just don't see it.

There's a lot of magical thinking going on at NeoGAF regarding the WiiU. Every day there is a new great white hope that will save the anemic hardware. If it's not GPGPU, it's eDRAM or low latency or secret Nintendo voodoo in the hardware, and now it's being extended to an imaginary shift in development practices.
 
Do we have any more power usage numbers besides 34 watts?

There's a lot of magical thinking going on at NeoGAF regarding the WiiU. Every day there is a new great white hope that will save the anemic hardware. If it's not GPGPU, it's eDRAM or low latency or secret Nintendo voodoo in the hardware, and now it's being extended to an imaginary shift in development practices.

Yes it is very funny.
 
How is it possible that we don't have any GPU details yet?

Nintendo is keeping hush-hush because of the negative connotations of their hardware not being up to par to carry the system, and Nintendo itself, into the next few years, especially as the PS4 and Xbox 720 arrive on the scene.

Sooner or later someone will put the Wii U's processors under the electron microscope so we can get a real good look at the architecture, at least from a macro view, which is enough to count cores and SIMDs for the GPU.
 
There's a lot of magical thinking going on at NeoGAF regarding the WiiU. Every day there is a new great white hope that will save the anemic hardware. If it's not GPGPU, it's eDRAM or low latency or secret Nintendo voodoo in the hardware, and now it's being extended to an imaginary shift in development practices.

Just take a look at this:

Basically we are seeing Xbox 360 games being cut and pasted onto the Wii U with very little optimizing, not taking advantage of the very important OoOE CPU that would deliver a lot better results if developers wanted to take the time to use it. Which they don't, and I totally understand the reasons why, since it's just not cost-effective for them and the game industry to spend a lot of money on a new console with a small user base.

Points to take from this:

  • The Wii U CPU is modern, not old 7-year-old tech
  • The Wii U CPU is not being used in these ports the way it was designed to be used
  • Third parties are happy porting over Xbox 360 games with minimal optimization and no improvements to graphics, since doing more requires money and work they can't afford.
 
Just take a look at this:



Bringing recent NeoGAF posts over here won't be very informative.
The most recent few pages are full of wishful thinking and empty claims.

I do hope somebody would just electron microscope the damn GPU and tell us what exactly it is.
 
Bringing recent NeoGAF posts over here won't be very informative.
The most recent few pages are full of wishful thinking and empty claims.

I do hope somebody would just electron microscope the damn GPU and tell us what exactly it is.

You're right, sorry, I will not quote NeoGAF posts :oops: It's just that the last statements look like the ZeldaInformer article.
 
There's a lot of magical thinking going on at NeoGAF regarding the WiiU. Every day there is a new great white hope that will save the anemic hardware. If it's not GPGPU, it's eDRAM or low latency or secret Nintendo voodoo in the hardware, and now it's being extended to an imaginary shift in development practices.

Thing is, even with all these people going around arguing one side or the other, I've yet to see anyone state it with any sort of complete information backing up their argument. That is, no one has brought forward a collection of facts and figures showing how RAM bandwidth, CPU clocks, possible interconnect speeds in the MCM, eDRAM speed and GPGPU features would fit together and produce a particular result on screen.

We haven't even really had many people with extensive graphics programming knowledge come out and say "If Wii U has X in its GPU/eDRAM/whatever, combined with the starved bandwidth it will produce Y".

In short, until we get more information, all we can really say is "Games look like 360 ones with a bit less performance."
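
For illustration, here's the sort of back-of-envelope figuring such an argument would at least have to start from. Every number below is a rumour or a plain assumption (the 12.8 GB/s comes from the commonly rumoured 64-bit DDR3-1600 main memory), so treat it as a sketch of the method, not a conclusion:

```cpp
// Back-of-envelope only: every figure is rumoured or assumed, not a
// confirmed Wii U spec. It just shows the kind of arithmetic an actual
// "facts and figures" argument would need.
#include <cstdio>

int main() {
    const double ramGBs     = 12.8;                  // rumoured main memory bandwidth
    const double fps        = 60.0;                  // target frame rate
    const double target720p = 1280.0 * 720.0 * 4.0;  // one 720p RGBA8 colour buffer

    const double bytesPerFrame = ramGBs * 1e9 / fps;
    std::printf("Main RAM budget per 60 fps frame: %.0f MB\n", bytesPerFrame / 1e6);
    std::printf("One 720p RGBA8 target is %.1f MB -> ~%.0f full passes per frame\n",
                target720p / 1e6, bytesPerFrame / target720p);
    // Everything else (textures, geometry, CPU traffic, any GPGPU work)
    // has to fit in that same budget or live in the eDRAM and caches.
    return 0;
}
```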
 
Just take a look at this:



Just fanboy rubbish.

Thing is, even with all these people going around arguing one side or the other, I've yet to see anyone state it with any sort of complete information backing up their argument. That is, no one has brought forward a collection of facts and figures showing how RAM bandwidth, CPU clocks, possible interconnect speeds in the MCM, eDRAM speed and GPGPU features would fit together and produce a particular result on screen.

We haven't even really had many people with extensive graphics programming knowledge come out and say "If Wii U has X in its GPU/eDRAM/whatever, combined with the starved bandwidth it will produce Y".

In short, until we get more information, all we can really say is "Games look like 360 ones with a bit less performance."

Funny, because your posts on NeoGAF state that you feel the WiiU is more powerful, and that the reason ports don't run well is a "different philosophy" and devs trying to "brute force" "old methods of game pipelines" and other such BS.

You also claim that people that think that the WiiU is only around as powerful as the 360 have "a hidden agenda"!
 
Just take a look at this:


To be fair you've not quoted the rest of his post. Regardless of whether his conclusions are correct, it's not really fair to quote them without the info he's trying to back them up with.

And again, discuss it with him; explain to him why he's wrong if he's wrong. This attitude of "omg lol @ this post" doesn't really help inform anyone. Especially when bringing it from one forum to another :)

Edit: no doubt I'll be lambasted for being a "Nintendo fanboy" again. I'm not agreeing, I just don't like to see people's opinions disregarded without explanation. Not that it needs explanation here, which is why you shouldn't have brought it over here ;)
 
Just take a look at this:
We can't discuss other forum topics here. Not least because a lot of posts about the Wii U are 'fanboy firefighting'. Just review the old Wii discussions. Even after the facts were known and the games were seen, there were people believing there was a physics processing unit in there and all sorts of Nintendo magic like an amazing new rendering technique that streams 3D scenes like video, or whatever the hell that was.

An important reference point for any high-level evaluation of hardware is the laws of thermodynamics and friends. A CPU that small cannot have the power of much larger CPUs without some radical new alien technology. Anyone claiming that devs just aren't using it effectively is burying their head in the sand, especially when the info we get on its make-up shows it lacking high-throughput features and devs are telling us it's no good. Likewise, a GPU that small, running that cool, cannot have any amazing magical powers unless we're seeing some super new technology. Low-latency RAM is still going to be epically slow compared to working from caches, so its use isn't going to magically imbue Wuu with working space as fast as high bandwidth would.

What's happening with these hopeful posts is that people who don't really understand are looking at what's different in Wuu versus PS360 and believing those differences can make the system more powerful if only developers would write for it. Like Cell supposedly making PS3 all-powerful, with the only reason it doesn't render XB360 games at 1080p60 being that devs aren't good enough for it.

Games that make the best use of the GPU will look better on Wuu, I expect, with higher-quality effects and shaders. We might see some better texturing and object detail, maybe. But GPGPU can only take away from that graphical prowess: the GPU will have less left over to render the graphics, even if developers are willing to struggle with GPGPU code. And a lot will also depend on Nintendo's tools, which we know little about.
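
To put a rough number on the latency point (generic textbook figures here, not measured Wii U numbers): on a dependent, pointer-chasing access pattern you fetch one cache line per round trip, so effective throughput is just line size divided by latency, and even "low latency" DRAM ends up an order of magnitude behind on-chip caches:

```cpp
// Illustrative only: generic latency figures, not measured Wii U numbers.
// Dependent (pointer-chasing) reads fetch one cache line per round trip,
// so effective throughput = line size / latency.
#include <cstdio>

int main() {
    const double lineBytes = 64.0;  // typical cache line size (assumed)
    struct Case { const char* what; double latency; };
    const Case cases[] = {
        { "on-chip cache hit (~5 ns)",     5e-9 },
        { "\"low latency\" DRAM (~40 ns)", 40e-9 },
        { "ordinary DRAM (~100 ns)",       100e-9 },
    };
    for (const Case& c : cases)
        std::printf("%-30s -> %6.2f GB/s on dependent reads\n",
                    c.what, lineBytes / c.latency / 1e9);
    return 0;
}
```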
 
Edit: no doubt I'll be lambasted for being a "Nintendo fanboy" again. I'm not agreeing, I just don't like to see people's opinions disregarded without explanation.
Some ideas are so wrong and tiresome that after a while, we'd rather just not discuss them. The Wii discussions got pretty pathetic at times, with such arguments as the names of the chips telling us they'd be pretty potent, and a spurious PR comment proving the machine had a new, secret physics-processing unit. There are plenty of people arguing about the technology because they want it to be better, and not because they want to understand the nature of the machine, which is never a healthy starting point for a scientific investigation.
 
With all the talk regarding GPGPU... do we have any confirmed information about the GPU architecture yet? Is it a DX11-compatible GPU?

The Radeon HD 5000 series was the first AMD/ATI chip that supported true GPU compute (synchronization barriers, atomics, thread block shared memory with random R/W access, etc). Microsoft also created downlevel ComputeShader support for older hardware (CS 4.0), but it has no extra features compared to DX10 pixel shaders. In CS 4.0, threads cannot cooperate by any means (no barriers/atomics, no access to the same shared memory regions). It doesn't allow any new algorithms that PS 4.0 is not capable of handling.
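
For what it's worth, on PC this split is directly visible in the API: full compute needs feature level 11_0, while the downlevel CS 4.x path on DX10-class parts is optional and has to be queried. A minimal sketch, assuming a Windows/D3D11 environment (link against d3d11.lib; error handling elided):

```cpp
// Query whether the GPU offers full CS 5.0 or only the optional
// downlevel CS 4.x path that Microsoft added for DX10-class hardware.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* dev = nullptr;
    D3D_FEATURE_LEVEL fl;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &dev, &fl, nullptr)))
        return 1;

    if (fl >= D3D_FEATURE_LEVEL_11_0) {
        // Full compute: barriers, atomics, random R/W thread block shared memory.
        std::printf("CS 5.0 supported (feature level 11_0+)\n");
    } else {
        // DX10-class part: CS 4.x support is optional, so ask the driver.
        D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
        dev->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                 &opts, sizeof(opts));
        std::printf("Downlevel CS 4.x: %s\n",
                    opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x
                        ? "yes" : "no");
    }
    dev->Release();
    return 0;
}
```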
 
Not one port shows "better" graphics than the current-gen version; even BLOPS2 is sub-HD with the same visuals as the 360 version but worse performance.

Maybe you should try reading the thread a little more carefully; there have been comparisons done showing better-looking multiplatform games on Wii U.
 