Wii U hardware discussion and investigation *rename

"It's not up to the same level as the PS3 or the 360," said one developer speaking to GamesIndustry.biz. "It doesn't produce graphics as well as the PS3 or the 360. There aren't as many shaders, it's not as capable," said another.

This was my favorite Wii U comment.

Funny you keep saying things about the X360 when no one else in this thread has said anything. Ideaman on GAF called it an Xbox 360 plus right before E3. Looks like he was right on the money. Ideaman has been right on every leak on GAF so far, btw...

You have a bad habit of twisting posts.
 
Funny you keep saying things about the X360 when no one else in this thread has said anything. Ideaman on GAF called it an Xbox 360 plus right before E3. Looks like he was right on the money. Ideaman has been right on every leak on GAF so far, btw...

Could you please link me this post?

The only thing I remember from Ideaman was that according to his sources you could say it looks twice as good as the XB360 even though it has to render the second screen.
 
The "there aren't as many shaders" comment seems like the most far-off from all the others that I've seen to date.
And as we've pointed out earlier, it's coming from an "unamed developer"..

Everything else points to a more powerful GPU and less powerful CPU. And the fact that it's accusing to have "less shaders" makes even less sense.
At 40nm, the only AMD desktop GPUs that are slower than a Xenos consume something like 15W max.
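For anyone wondering where that comparison comes from, here's a rough back-of-the-envelope using the commonly quoted figures (Xenos: 48 unified ALUs at 500 MHz; Cedar / Radeon HD 5450 as AMD's smallest 40nm desktop part: 80 stream processors at 650 MHz). These are approximations, not measurements:

```python
# Back-of-the-envelope shader throughput from commonly quoted specs.
# Each ALU lane is assumed to retire one multiply-add (2 FLOPs) per clock.

def gflops(alus, lanes_per_alu, clock_mhz, flops_per_lane=2):
    """Peak programmable-shader GFLOPS."""
    return alus * lanes_per_alu * flops_per_lane * clock_mhz / 1000.0

# Xenos (Xbox 360): 48 ALUs, each vec4 + scalar (5 lanes), 500 MHz.
xenos = gflops(48, 5, 500)   # ~240 GFLOPS

# Cedar / Radeon HD 5450, AMD's smallest 40nm desktop GPU:
# 80 scalar stream processors, 650 MHz, low-power (~15-20 W class) board.
cedar = gflops(80, 1, 650)   # ~104 GFLOPS

print(f"Xenos ~{xenos:.0f} GFLOPS, Cedar ~{cedar:.0f} GFLOPS")
# Anything in AMD's 40nm desktop line that undercuts Xenos is already in
# that low-power class, which is the point being made above.
```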
 
The "there aren't as many shaders" comment seems like the most far-off from all the others that I've seen to date.
And as we've pointed out earlier, it's coming from an "unamed developer"..

Everything else points to a more powerful GPU and less powerful CPU. And the fact that it's accusing to have "less shaders" makes even less sense.
At 40nm, the only AMD desktop GPUs that are slower than a Xenos consume something like 15W max.

This. The only way those anonymous comments are correct is if they're comparing it to the 720/PS4. But either way, saying "not enough shaders" was hilarious and gave the 'source' away as fraudulent, imo. There's no way a developer would use that terminology.
 
Not all developers understand the technicalities of the hardware they're developing for, and not all developers are native English speakers.

It seems unlikely that the WiiU has fewer shaders than the Xbox 360 (and it could still be more powerful with fewer), but this is Nintendo we're talking about.
 
Not all developers understand the technicalities of the hardware they're developing for, and not all developers are native English speakers.

It seems unlikely that the WiiU has fewer shaders than the Xbox 360 (and it could still be more powerful with fewer), but this is Nintendo we're talking about.

Fair point. But I would still argue that it's a bizarre term to use. If it's true, then I would assume it's based on an impression of the system rather than on actual figures, as I doubt devs would be given details of the number of shaders, would they? Is that something they need to know, or is it all based on benchmarking and the sort of generalised spec sheet that we saw around E3?
 
Fair point. But I would still argue that it's a bizarre term to use. If it's true, then I would assume it's based on an impression of the system rather than on actual figures, as I doubt devs would be given details of the number of shaders, would they? Is that something they need to know, or is it all based on benchmarking and the sort of generalised spec sheet that we saw around E3?

Yeah I agree it sounds bizarre, but it could be the kind of thing an artist who isn't a native English speaker might say when trying to get something across, and most developers aren't actually programmers. Then again, it could just be another internet fake.

I would expect a graphics programmer to know how many shaders the final hardware will have (or to be able to work it out), but then again maybe pre-final hardware and/or an API mean Nintendo didn't need to let these details out until recently.
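To illustrate the "work it out" part: even without a spec sheet, an ALU-bound benchmark plus the known clock speed pins down the shader count fairly well. A minimal sketch with made-up numbers (this isn't any particular developer's method, just the arithmetic):

```python
# Hypothetical: infer shader ALU count from a measured ALU-bound benchmark.
# Assumes each ALU retires one multiply-add (2 FLOPs) per clock, which holds
# for the unified-shader parts being discussed here.

def estimate_alu_count(measured_gflops, clock_mhz, flops_per_alu_per_clock=2):
    return measured_gflops * 1000.0 / (clock_mhz * flops_per_alu_per_clock)

# Example with made-up numbers: a shader sustaining ~220 GFLOPS on a 550 MHz
# part implies roughly 200 ALUs (220000 / (550 * 2) = 200).
print(estimate_alu_count(220, 550))   # -> 200.0
```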
 
Most of the people who get to talk to the press aren't engineers; it's almost always producers and occasionally designers, neither of whom are particularly technical.
Both have a tendency to grab onto buzzwords they hear, so the terminology wouldn't surprise me.
The part about it actually being weaker than the 360/PS3 would.
 
From what we've seen so far, ports of existing games are clearly not automatically better in terms of framerate, resolution or AA. I presume there will be room for improvement, but I'm worried the optical-drive / no-HDD combo and the demands of the second screen may be a weak spot that the extra RAM won't easily compensate for, at least at first. Eventually the console will probably get its FXAA or whatever and multiplatform games will compare decently, but I don't expect much. Nintendo hardware has yet to positively surprise me in that regard.
 
Not too hopeful given the source, but apparently Batman and AC3 will run at 1080p native:

http://m.gamefaqs.com/boards/631516-wii-u/63483225

That would be expected if the GPU really were anywhere near 500 GFLOPS, as I've been saying.
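Quick sanity check on that: 1080p is 2.25x the pixels of 720p, while ~500 GFLOPS would be a bit over 2x the ~240 GFLOPS usually quoted for Xenos. Note the 500 GFLOPS figure is the speculation above, not a confirmed spec:

```python
# Pixel-count vs shader-throughput ratios behind the 1080p claim.
pixels_720p  = 1280 * 720      #   921,600
pixels_1080p = 1920 * 1080     # 2,073,600
xenos_gflops = 240             # commonly cited Xbox 360 figure
wiiu_gflops  = 500             # speculation from the post above, not a spec

print(pixels_1080p / pixels_720p)    # 2.25x the pixels
print(wiiu_gflops / xenos_gflops)    # ~2.08x the raw shader throughput
```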

As you say, the source is incredibly flimsy though, especially since PR hacks typically don't know the difference between 1080p output support and native rendering.

But this obviously seems like a case for the B3D pixel counters :p

Come to think of it, didn't quaz and others already confirm that all the Wii U games shown at E3 were 720p? Is it even possible to change the rendering resolution of a game in a short time?
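For reference, the pixel counting mentioned here works because an upscaled framebuffer turns aliased edges into stair-steps whose run length matches the upscale factor. A minimal sketch of the idea, with hypothetical hand-measured step lengths:

```python
# Estimate a game's native render width from an upscaled screenshot by
# measuring the run length of stair-steps along a nearly-horizontal aliased
# edge. The step lengths below are hypothetical hand-measured values.

def estimate_native_width(step_lengths_px, output_width=1920):
    scale = sum(step_lengths_px) / len(step_lengths_px)  # average upscale factor
    return output_width / scale

# e.g. steps averaging ~1.5 px point at a 1280-wide (720p) framebuffer:
print(estimate_native_width([1, 2, 1, 2, 2, 1]))  # ~1280
```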

Edit: even on GAF they're not taking this seriously.
 
Now it's things like that that get me excited again. Once again this paints a picture of a worthwhile upgrade over the Xbox 360: Assassin's Creed 3 running at 1080p native, with all the same goodies as the PS360 versions and without dropping below 30fps, sounds like an awesome experience.
 
Well, the poster also claims a frame rate above 30, although he's not quite certain whether it's 60. If 1080p/60fps AC3 with potentially improved textures could happen, it could be a significant step up (texture improvements aren't mentioned, but RAM-wise they should be possible).
 
Where does it say the controller will work as a remote with the console turned off?
I already told you it was announced at E3 this year. Go back and watch either the Tuesday conference or Iwata's pre-show video - I can't remember which one showed this particular detail off. It was one of those two anyway; you'll find it if you go look, in case you don't believe me. :p

Not that it's much of a stretch to believe there's graphics display capability in the pad anyhow. Microcontrollers with graphics built in are dirt cheap and quite capable these days.
 
I would have to try to find it as it was in a video around E3, but I believe Ubisoft is on record saying that all the (console) versions will be at 720p.

Could you please link me this post?

The only thing I remember from Ideaman was that according to his sources you could say it looks twice as good as the XB360 even though it has to render the second screen.

http://www.neogaf.com/forum/showpost.php?p=38572993&postcount=444

As you can see KB left off a couple of plus signs and ignored IM talking about heavy usage of the controller happening at the same time.

Not all developers understand the technicalities of the hardware they're developing for, and not all developers are native English speakers.

It seems unlikely that the WiiU has fewer shaders than the Xbox 360 (and it could still be more powerful with fewer), but this is Nintendo we're talking about.

Even not being a native English speaker wouldn't explain them saying it's not as capable as the PS360.
 
Even not being a native English speaker wouldn't explain them saying it's not as capable as the PS360.

If all they've heard from the tech guys is that they can't get their ports to run properly - which looking at the date of the quote might be reasonable - then it'd be understandable if they passed on to GI.biz some twisted version of this.

There's a perception that everyone in the games industry reads GPU reviews and wanks over spec sheets. Most of them don't, but they understand when their project is running into problems.
 
If all they've heard from the tech guys is that they can't get their ports to run properly - which looking at the date of the quote might be reasonable - then it'd be understandable if they passed on to GI.biz some twisted version of this.

There's a perception that everyone in the games industry reads GPU reviews and wanks over spec sheets. Most of them don't, but they understand when their project is running into problems.

I'm not talking about the shaders part. And if they've heard that from the tech guys, then that's not a language problem.
 