Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
That's just how he is. Says outlandish things and then won't respond when you call him out on it.
Well, standard wisdom would be: let's wait for final products.
There is not that much magic in the silicon world, and the sane attitude is to wait for products using this tech and see what performance they deliver at what power budget.

There is no disputing that they might reach 200 GFLOPS, but at what power? With Windows 8 running on ARM there might be ARM-based netbooks in the pipe, maybe ultrabooks too. Those devices' power budgets have nothing to do with today's tablets or phones. PowerVR seems clever enough to consider this market (maybe even a bit higher), and that's not the issue.

It doesn't hurt to wait for facts instead of coming close to personal attacks, and the issue is the pretty loose usage of the word "mobile". Are netbooks, ultrabooks, or even today's Windows tablets "mobile" products? I would not say so; they go with power budgets an order of magnitude higher than "mobile devices".

Who is outlandish: the one who wants concrete implementations of the tech, or the one who takes for granted an increase in power efficiency of one order of magnitude by going from 45/40 nm lithography to 32/28 nm?
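For what it's worth, a quick back-of-envelope check of why a 10x efficiency jump from that shrink is a stretch (ideal scaling only; my own illustrative numbers, real nodes scale worse than this):

```python
# Ideal (Dennard-style) scaling sketch: a 40nm -> 28nm shrink gives
# roughly a (40/28)^2 density/efficiency factor, nowhere near 10x.
old_node = 40.0   # nm, outgoing process
new_node = 28.0   # nm, incoming process
ideal_factor = (old_node / new_node) ** 2
print(f"{ideal_factor:.2f}x")   # 2.04x, an order of magnitude short of 10x
```

Even granting the ideal ~2x from the shrink, the remaining ~5x would have to come from architecture alone.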
 
Last edited by a moderator:
Question: If PowerVR chips are that amazing, why aren't we using them in PCs anymore? Right, because they're not. They're damn nice for sub 1W systems, but that's pretty much it. Rogue won't change that.

Is this a serious question? The point is Wii U will release 7 years after Xbox 360, and now PowerVR mobile graphics technology can deliver the same quality of performance with low power consumption. Yes, Wii U might be slightly better, but still, for most consumers it's the same. Even X720

http://www.gamesindustry.biz/articles/digitalfoundry-new-ipad-evolution-of-ios

"Next-gen mobile tech, slated to arrive next year, finally sees graphics power catch up with - and perhaps even exceed - the capabilities of the Xbox 360 and PlayStation 3. The iPad 4 could conceivably become a target platform for AAA development."

Oh boy. Looks like DigitalFoundry agrees
 
Rogue certainly looks like a great GPU and a generational step forward for mobile graphics. I think it'll be closer in power to next gen console GPU's than current mobile GPU's are to Xenos/RSX. However the bit I've highlighted above is quite an odd statement. Because not only do you not know what kind of performance we'll see from Rogue in 2013 (It seems Rogue will start out at around 200Gflops, which is below 360/PS3 GPU's) but you also don't really know what WiiU will be. Rumours and vague comments are not enough to compare graphics power.

Care to explain such? Seems contradictory.

If you are talking about degrees, there are current mobile GPUs at nearly 30 GFLOPS (roughly 15% as fast). Looking at a 200 GFLOPS mobile chip means you need a console in the 1400 GFLOPS range for a relatively equal gap. But that isn't the only factor, e.g. bandwidth: compared to console/PC memory architectures, mobiles, with their very strict power requirements and small form factors, are very limited to whatever low-hanging fruit passes down to the mobile space.
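The ratio arithmetic above can be sketched out like so (using the rough figures from the post, not measured specs):

```python
# Back-of-envelope check of the mobile-vs-console GFLOPS gap argument.
current_mobile = 30.0     # GFLOPS, today's mobile GPUs (rough figure)
current_console = 200.0   # GFLOPS, rough 360/PS3-class GPU figure
ratio = current_mobile / current_console   # fraction of console throughput

next_mobile = 200.0       # GFLOPS, first Rogue parts
# Console throughput needed for next-gen mobile to sit at the same fraction:
equivalent_console = next_mobile / ratio
print(f"{ratio:.0%}")                # 15%
print(f"{equivalent_console:.0f}")   # 1333 GFLOPS, i.e. the ~1400 range quoted
```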

Which goes back to your own comment: "Rumours and vague comments are not enough to compare graphics power". While you qualify your comment with "I think" instead of the assertive "absolutely" (tsk tsk to those placing so much faith in rumored specs, especially when some of the underpowered WiiU rumors mention it has really high FLOPS but not-so-high real performance, i.e. a 500 GFLOPS WiiU could be architecturally better than a 200 GFLOPS Rogue, in which case the whole "absolutely" is really "not even close"), I think (I know, dangerous thinking) you made a very similar equation in regards to Nintendo's competitors. ;)

Not going off of rumors, but as industry trends go, mobile has improved dramatically from the beginning of the 2000s to today. A lot of low-hanging fruit has already been captured by current mobile hardware--even though they are older architectures--compared to what was available in 2005. Due to process limitations, area limitations, TDP limitations, form factor limitations, BOM considerations, etc., the pace of mobile advancement will be more in line with that of the PC.
 
Oh boy. Looks like DigitalFoundry agrees
They also let out that shitty shoe-polishing article about Kepler.
I'll tell you something: Granmaster did a really good job, but there are people in the embedded tech forum who work for the aforementioned company(ies).

I would be surprised if they came here to say that Rogue will come close to 200 GFLOPS within the ~1 Watt power budget of an iPad (and that's with the CPU and the RAM). Actually, even if they did, I would be surprised if it turned into a product (as in, it would take a ginormous chip, or the highest-binned parts at low clocks, etc., which won't happen in a real product). One can dream though.

PowerVR have a great product with Rogue that might allow them to address both the mobile market and the ARM/Windows 8 devices (with higher power characteristics). They might have quite an edge in power efficiency (and in what they do with low bandwidth) vs AMD, NV and Intel.
 
Well, standard wisdom would be: let's wait for final products.
There is not that much magic in the silicon world, and the sane attitude is to wait for products using this tech and see what performance they deliver at what power budget.

There is no disputing that they might reach 200 GFLOPS, but at what power? With Windows 8 running on ARM there might be ARM-based netbooks in the pipe, maybe ultrabooks too. Those devices' power budgets have nothing to do with today's tablets or phones. PowerVR seems clever enough to consider this market (maybe even a bit higher), and that's not the issue.

It doesn't hurt to wait for facts instead of coming close to personal attacks, and the issue is the pretty loose usage of the word "mobile". Are netbooks, ultrabooks, or even today's Windows tablets "mobile" products? I would not say so; they go with power budgets an order of magnitude higher than "mobile devices".

Who is outlandish: the one who wants concrete implementations of the tech, or the one who takes for granted an increase in power efficiency of one order of magnitude by going from 45/40 nm lithography to 32/28 nm?

If you've seen the posting history in this thread, you would know what I'm talking about. I wouldn't consider that anywhere close to personal either. :smile:
 
Looking back on my post, "inept" was too strong a word to use regarding developers. Perhaps it was tainted by a biased source. (this particular one was developing a proprietary engine) Although my statement regarding middleware optimization was correct, & still is. Progress is advancing rapidly, & has been made even since my last post in fact: a rather recent March update to the SDK (ver. 2.03), which in all probability is linked to the upcoming V5 development kit. As it stands, afaik the most popular middleware engines are running optimally with all features intact. (not enough shaders would render this impossible)

What did not occur to me was why was/is Nintendo testing & optimizing these various engines, adjusting & tweaking hw specifications, etc.? From its inception Nintendo was touting the Wii U's ease of portability; why the change? The most obvious answer was 3rd party engine compatibility, though it had to be more than that. I spoke of the Nintendo "footprint," & it could indeed have been larger than I initially thought.

Were 3rd parties having trouble initially with the CPU, perhaps? Instruction sets are typically tailored, or highly customized, by the console vendor based upon their particular performance needs. IIRC, the GC's Gekko had an additional set of 50 instructions over the PowerPC 750 on which it was based. (as well as stripping away non-essential features) Nintendo is again going with IBM's PowerPC architecture. The rumored OoOE tri-core CPU with 2-way SMT appears legitimate. Why this seems unbelievable has me bewildered, & questioning posters' incredulity. Is this based upon the very cheaply produced, severely underpowered Wii? (which btw was a first on the Nintendo home console front) I would have to assume so. The 32MB of embedded RAM is not as cost prohibitive as many of you might believe.

Every console CPU & GPU obviously behaves differently; the Xenon has a highly customised variant of the VMX AltiVec unit. Alas, I think that the PS3's PPE & SPEs are irrelevant in this discussion, with the exception of highlighting notably custom platform architecture. Why? Because Nintendo used the 360's development environment model as a baseline of sorts: the 360's ease of development, ease of PC portability, generally superior versions of multi-platform software, ease of middleware engine adaptability, etc. (as well as seeking developer input from close 3rd parties) Thus we could assume that the Wii U's CPU could be incorporating a more modern VSX AltiVec unit. Though due to its customisation, instruction sets and data formatting would also have to change. Optimized code for the 360 may run sub-par on the Wii U, (or stall) & this problem is also compounded by its communication/relationship with the GPU. (which is also heavily customized) While by no means a quantum leap over Xenon computationally, its efficiency will be where the CPU differentiates itself. (a penchant of Nintendo's)

It's a developer learning curve, that's all, one which will be beneficial to native engines much more so than ported ones. The performance will scale up considerably on proprietary engines, I've been told. (much like the PS3, though even more capable) Nintendo did design this platform attempting to be extremely port friendly, though its own software still drove & dictated the initial design & feature set. I've known of the target system specs for some time, & they are indeed accurate. I just do not know what alterations have been made. (as some have) The "target" specifications are what Nintendo told 3rd parties to expect from finalized hardware.

As far as the lighting is concerned, I tried to describe having some fixed functionality in parallel to programmable shaders. What this yields is stable, predictable performance. This is especially important when rendering a separate viewpoint on the DRC (display remote controller) from what is shown on the main screen. (differing geometry, lighting, shader effects, etc.) I attempted to describe what I was told; I hope I didn't lose any aspects or understanding in translation. I referenced the GC's architecture because there seemed to be parallels. There is definitely a DSP; I read a page back where there was some confusion regarding its inclusion. Why is anyone bringing up the iPad? I do not want to even enter into that nonsensical debate; they occupy completely different hemispheres.
 
What is your source for these things? You know the exact SDK version (v2.03) and the most up-to-date dev kit model (V5), and you know that all the most popular middleware is running optimally with no disabled features on the WiiU dev kits, etc., so I assume you either work with the dev kits yourself, know someone who does and is freely passing you info, or are culling this from the net, in which case your tone about which rumors are valid and which are not doesn't make any sense. Since you seem absolutely confident in what the WiiU has inside, maybe you could give more specifics about the CPU and GPU. I mean, if you can tell us the exact SDK version number, surely some data on the current hardware wouldn't be hard to come by?
 
I'm interested in hearing more about this DSP and what functions it assists the CPU or GPU with. Not that it's all that surprising; if the thing has a set task and does it super fast, then good for it. Does the DSP assist in the lighting part of the equation? Rendering from two different viewpoints can't be that difficult, but maybe a DSP can speed it up by a large margin?
 
I'm interested in hearing more about this DSP and what functions it assists the CPU or GPU with. Not that it's all that surprising; if the thing has a set task and does it super fast, then good for it. Does the DSP assist in the lighting part of the equation? Rendering from two different viewpoints can't be that difficult, but maybe a DSP can speed it up by a large margin?

Audio DSP
 
I'm interested in hearing more about this DSP and what functions it assists the CPU or GPU with. Not that it's all that surprising; if the thing has a set task and does it super fast, then good for it. Does the DSP assist in the lighting part of the equation? Rendering from two different viewpoints can't be that difficult, but maybe a DSP can speed it up by a large margin?

Me too.
 
Care to explain such? Seems contradictory.

If you are talking about degrees, there are current mobile GPUs at nearly 30 GFLOPS (roughly 15% as fast). Looking at a 200 GFLOPS mobile chip means you need a console in the 1400 GFLOPS range for a relatively equal gap. But that isn't the only factor, e.g. bandwidth: compared to console/PC memory architectures, mobiles, with their very strict power requirements and small form factors, are very limited to whatever low-hanging fruit passes down to the mobile space.

Which goes back to your own comment: "Rumours and vague comments are not enough to compare graphics power". While you qualify your comment with "I think" instead of the assertive "absolutely" (tsk tsk to those placing so much faith in rumored specs, especially when some of the underpowered WiiU rumors mention it has really high FLOPS but not-so-high real performance, i.e. a 500 GFLOPS WiiU could be architecturally better than a 200 GFLOPS Rogue, in which case the whole "absolutely" is really "not even close"), I think (I know, dangerous thinking) you made a very similar equation in regards to Nintendo's competitors. ;)

Not going off of rumors, but as industry trends go, mobile has improved dramatically from the beginning of the 2000s to today. A lot of low-hanging fruit has already been captured by current mobile hardware--even though they are older architectures--compared to what was available in 2005. Due to process limitations, area limitations, TDP limitations, form factor limitations, BOM considerations, etc., the pace of mobile advancement will be more in line with that of the PC.

No contradiction; I'm just not expecting any next-gen console to have as much of an increase in power as Rogue will have over previous mobile GPUs. I won't claim that will definitely be the case, it's just my opinion at the present time. Also, I didn't say that would be the case in Rogue's first iteration (notice I said it would start at 200 GFLOPS, so I was referring to its first iteration there).

I'm expecting Rogue's later iterations to be closer to XBox3/WiiU/PS4 than current mobile GPUs are to 360/PS3; I wouldn't be surprised if that happened. Is that really very similar to "Rogue will absolutely be 'on par' with Wii U next spring"? :D
 
Because it'll have unique software, controls that don't require fingers to get in the way, and battery life longer than 2 hours when playing games.
We will be getting to the point where only the software will be better... wireless charging on tablets is not too far away. The point is, hardware-wise it will be equalled within 12 months... software-wise, no it won't... but with W8 tablets around the corner...
 
We will be getting to the point where only the software will be better... wireless charging on tablets is not too far away. The point is, hardware-wise it will be equalled within 12 months... software-wise, no it won't... but with W8 tablets around the corner...

Really? What exactly are you expecting from the first implementation of Rogue in a tablet? Remember, tablets have a TDP of something like 2W for the entire system.
 
The new iPad is closer to 5W, isn't it? 10-hour battery life (8 with LTE) and a 45 watt-hour battery. Not that I agree with his assessment; it'll be a long time before tablets are there.
 
I knew the iPad 2 was about 2W (2.75W?), but the iPad 3 does seem closer to 5W after a quick look on the net (a 43 Wh battery and somewhere between 8.5 and 9.5 hours of battery life).
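Those battery figures give the average power draw directly (a rough estimate, assuming the quoted capacity and runtime, and ignoring charging losses):

```python
# Average system power = battery capacity (Wh) / runtime (h),
# using the figures quoted in the posts above.
battery_wh = 43.0   # claimed iPad 3 battery capacity, watt-hours
runtime_h = 9.0     # midpoint of the 8.5-9.5 hour estimates
avg_power_w = battery_wh / runtime_h
print(f"{avg_power_w:.1f} W")   # 4.8 W, i.e. "closer to 5W"
```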

Still as you say, tablets are years away from the kind of performance he's talking about.
 
We will be getting to the point where only the software will be better... wireless charging on tablets is not too far away
Is that for real? Because I remember talk of methanol fuel cells for mobiles around the PSP's launch, and I fully expected the old battery to be replaced, but despite working prototypes no one has released an actual product. I imagine wireless power will arrive after fuel cells (ignoring that it existed in the 19th century).
The point is, hardware-wise it will be equalled within 12 months... software-wise, no it won't... but with W8 tablets around the corner...
Maybe hardware-wise Wuu will be equalled - we don't yet know what is going into Wuu, or how well Rogue performs in actual products. Software-wise Wuu has no equal, in the same way Wii hasn't. Sony haven't managed to sell 80 million Moves on the strength of their software. And a tablet without controls will not offer the experience of Wuu, full stop. COD on a Wuu tablet or PSV will be a much superior experience to COD on a simple touchscreen, whether iOS, Android or Windows.

In fact, summarising your argument: no one will want Wuu because its unknown hardware is beaten by unreleased tablets based on unproven hardware powered by a non-existent technology running an uncertain OS. Can't say that's the strongest argument I've ever heard. ;)
 
Really? What exactly are you expecting from the first implementation of Rogue in a tablet? Remember, tablets have a TDP of something like 2W for the entire system.
Assuming Wii U is less powerful than the 360... then it is very possible that the A6X would beat it... just by looking at some of the predictions a page or two back, both the ROPs and TMUs will be fairly similar... shader power is going to be a lot higher on Rogue...

Apple used quad-channel 32-bit memory... assuming they stick with 32-bit and change to LPDDR3, bandwidth would be in the ballpark... as well as it being a TBDR...
Likely 2GB RAM? 2-4 Cortex A15 cores @ 1.5-2.0 GHz? Seriously, it's possible.

There is nothing stopping what I have mentioned from happening... at least technology-wise. Snap-on thumbsticks à la 3DS? AirPlay to TV? Either wireless charging or just plug the charger in? Wii U looks like it will be wired, and in any case it would have the same battery as the iPad regardless?

Software-wise, of course you would expect Ninty to blow iPad games into the weeds... however, although highly unlikely, what if Apple activates OpenGL 4.x on iOS and enables larger downloads/more expensive games? Maybe the meeting with Valve is to activate Steam on iOS?

Just to confirm, I DON'T expect this to happen quite like that, but I DO expect the gap between the iPad and current consoles to close substantially by spring next year...
The point I'm making is that by Christmas next year Ninty will have the REAL next gen to contend with, and likely graphically intense tablets that will be matching it... not the same niche they had with the Wii, is all I'm saying.

EDIT:
In fact, summarising your argument: no one will want Wuu because its unknown hardware is beaten by unreleased tablets based on unproven hardware powered by a non-existent technology running an uncertain OS. Can't say that's the strongest argument I've ever heard. ;)
I'm not talking about unreleased hardware... next spring is all I'm talking about... Wii U may have just 12 months in the limelight, and of course I haven't mentioned the fact that the PS3 and 360 will still be selling, likely for less.
 