Fact: Nintendo to release HD console + controllers with built-in screen late 2012

The real question in my mind is still compatibility with the key technology that engines will be designed around. What would the significant differences be between two consoles that are both more powerful and more efficient with current and future rendering technology, and a console that may not be compatible with some of those design choices? I just wonder whether something like tessellation might make it difficult to port models backwards to older hardware.

As I said in my previous post, the feature set of the Wii2/Café won't be far off from that of the PS4/X1080, so engines could be ported more easily this time around. What couldn't be matched, of course, is the level of detail the PS4/X1080 could afford.

Tessellation could very well be part of the Café GPU; a less advanced version of what can be found in current DX11 GPUs is already in the RV7x0 silicon. And it's definitely a possibility that AMD includes a more recent version of their tessellation engine in the Café's GPU.

And in the worst-case scenario, where the GPU doesn't support tessellation at all, it wouldn't be a deal breaker for ports. It would just mean less detailed models, but that would already be the case given the supposed difference in sheer computational power and memory. In fact, PS4/X1080 games using tessellation would mean less work for the third parties doing the port: they'd just ship the base models, or a lower tessellation level, on the Café version (depending on whether Café's GPU supports tessellation).
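To make that fallback concrete, here's a minimal sketch of how a port could pick a tessellation level per platform, dropping back to the artists' base meshes where the GPU lacks hardware tessellation. Everything here is illustrative; the function name and the performance tiers are assumptions, not any real console SDK.

```python
# Hypothetical sketch: per-platform geometry detail selection for a port.
# Nothing here is a real console API; names and tiers are illustrative.

def choose_detail(supports_tessellation, perf_tier):
    """Return the tessellation factor to use: 1 means 'just render the
    base mesh', higher values mean finer hardware subdivision."""
    if not supports_tessellation:
        return 1                      # fall back to the base model
    # Scale subdivision with the machine's rough performance tier,
    # capped at a sane maximum factor.
    return min(16, 2 ** perf_tier)

print(choose_detail(True, 4))    # high-end target: factor 16
print(choose_detail(True, 1))    # weaker target: factor 2
print(choose_detail(False, 4))   # no hw tessellation: base mesh (1)
```

The point is that the same content pipeline serves every target; only the per-platform factor changes.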
 
RudeCurve said:
Actually it was a mini-DVD drive...
It is unnecessary "corrections" like this that boggle my mind. Was it a DVD drive? No. Did it have the capacity of a DVD? No.

And since you apparently failed to read the link you provided: "The physical size of a Nintendo GameCube Game Disc is that of a miniDVD, and the Wii Optical Disc is the size of a DVD. The discs are of a proprietary format;"
 
Just because the GPU design will be more recent doesn't mean it won't be similarly underpowered. If they only slap 80 R700-era shaders on a die with those three PPE/Xenon-level PowerPC cores and connect it to 512 MB of DDR3, being DX10.1 won't matter all that much, since it will still be incredibly slow for a 2012 device.

Every indication we have is that it will be competitive with the 360 and PS3. By 2013 if Sony and Microsoft launch new consoles with 8 or 16 times as much RAM, modern 8, 12 or 16 core CPUs and DX11+ GPUs with 2000 shaders, that will leave Nintendo a full generation behind. Again.

Well, obviously, if they released a slow chip based on any architecture then it'd be slow. All I'm trying to say is that an R7xx-based GPU doesn't have to be poor in a console in 2012 (if it's even using an R7xx-based chip). Obviously the final result depends on how they use the architecture. Having said that, plenty of sources have claimed it's more powerful, or even significantly more powerful, than the PS3.

Since Hollywood's architecture is Flipper's, it was already 7 years old in 2006 when the Wii launched; the architecture dates back to 1999, when the gates were cranked out. Now in 2011 it's 12 years old. By the time the Nintendo Café / Wii 2 / Wii HD launches in late 2012, the Flipper/Hollywood architecture will be, incredibly, 13 years old.

If the rumors are true, then the Wii's successor's R7xx-based GPU will be 4 years old. Not as old as Hollywood was, but still old. I'd really like to see the latest tessellation on Nintendo's GPU, the kind DX11 has. I am aware that all AMD/ATI GPUs have tessellation, starting with Xenos and actually going all the way back to the R200 / Radeon 8500 (TruForm), but I'd hope to see the sort that only showed up in DX11-supporting GPUs.

Fair enough, Hollywood was around 7 years old on release, not 9; I misremembered some dates. :) I still don't think the situation would be comparable to the Wii's (as long as we don't just assume it's going to be an extremely gimped R7xx for the sake of it). Hollywood's biggest problem by far was the massive difference in architecture. An R7xx-based GPU would no doubt be quite a bit slower than the GPUs in a 2013 console from MS or Sony, but is there really a chance that it'd be as foreign in architecture as Hollywood was compared to Xenos?
 
Why can't we assume it'll have custom features on the GPU? I mean, as far as I can tell, neither the 360 nor the PS3 used a simple off-the-shelf part. They had their own custom features added on to a regular GPU.

It can and will; we just don't know what they'll be. I'd certainly expect added eDRAM, but who knows what else. That's if it's even based on R7xx. There seems to be some confusion now over whether the French site even had any of that info or whether it was guesswork (with the only hard piece of info being the PowerPC-based CPU).

The real question in my mind is still compatibility with the key technology that engines will be designed around. What would the significant differences be between two consoles that are both more powerful and more efficient with current and future rendering technology, and a console that may not be compatible with some of those design choices? I just wonder whether something like tessellation might make it difficult to port models backwards to older hardware.

R7xx has hardware tessellation units, doesn't it? Even though DX10 doesn't currently expose them (does DX10.1?), that isn't really important for a console, since the console's API could expose the feature.
 
As I said in my previous post, the feature set of the Wii2/Café won't be far off from that of the PS4/X1080, so engines could be ported more easily this time around. What couldn't be matched, of course, is the level of detail the PS4/X1080 could afford.

Isn't that too much of a "long shot" assumption?
What if the PS4 and X720 are strongly based on OpenCL programming, with something between Larrabee and Fermi-like autonomous GPGPUs that make extensive use of thousands of ALUs with big L2 caches, used for everything from 3D rendering to (much more) advanced physics, AI, etc.?
Let me remind you that 2013 is already the timeframe for Maxwell, the first Project Denver GPU from nVidia, which is supposedly an autonomous GPGPU assisted by (or composed of, we still don't know) ARM cores.

If the next consoles force a new programming model on developers, one based much more on massively parallel computing (which would even make sense, since that's where we're headed anyway), what chances would a 3-core CPU + DX10 GPU have for "off-the-shelf" compatibility?
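As a toy illustration of that gap (plain Python, purely illustrative, not real OpenCL): an engine written around a massively parallel model expresses its work as per-element kernels launched across thousands of ALUs, which a narrow CPU can only replay as a serial loop.

```python
# Toy illustration of the data-parallel model: the per-element "kernel"
# a GPGPU-centric engine would launch across thousands of ALUs at once.

def integrate(pos, vel, dt):
    # One particle's position update; a GPU runs one instance per element.
    return pos + vel * dt

positions  = [0.0, 1.0, 2.0]
velocities = [1.0, 1.0, 1.0]

# On a wide GPGPU this is a single parallel dispatch over every particle;
# on a narrow 3-core CPU the same dispatch degenerates into a loop.
new_positions = [integrate(p, v, 0.5) for p, v in zip(positions, velocities)]
print(new_positions)  # [0.5, 1.5, 2.5]
```

The kernel itself is trivially portable; what doesn't port back is an engine design that assumes thousands of such instances run at once.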
 
Isn't that too much of a "long shot" assumption?
What if the PS4 and X720 are strongly based on OpenCL programming, with something between Larrabee and Fermi-like autonomous GPGPUs that make extensive use of thousands of ALUs with big L2 caches, used for everything from 3D rendering to (much more) advanced physics, AI, etc.?
Let me remind you that 2013 is already the timeframe for Maxwell, the first Project Denver GPU from nVidia, which is supposedly an autonomous GPGPU assisted by (or composed of, we still don't know) ARM cores.

If the next consoles force a new programming model on developers, one based much more on massively parallel computing (which would even make sense, since that's where we're headed anyway), what chances would a 3-core CPU + DX10 GPU have for "off-the-shelf" compatibility?

Do you think MS or Sony will wait 2 or more years to release an answer to the Wii 2, though? A GPU thought to be coming out in late 2013 certainly won't be in a 2013 console.
 
Do you think MS or Sony will wait 2 or more years to release an answer to the Wii 2, though?

Both Sony and Microsoft have publicly stated that their current consoles would last until 2015. Even if their successors come out earlier than that, it won't be more than a year before, which puts a release somewhere in 2014.

Besides, if the Wii 2 comes out as currently rumoured, I don't see why they would rush the release of their next-gen consoles.
If there's no threat of the Wii 2 stealing away impossible-to-port 3rd party AAA titles, why wouldn't they wait a bit more to have access to technology that completely crushes the Wii 2 in performance?
Let's not forget that Nintendo sells consoles that are made to make profits from the hardware at day one, whereas both Microsoft and Sony sell their consoles at a loss for (at least) the first year.


A GPU thought to be coming out in late 2013 certainly won't be in a 2013 console.
Following my previous statement, where I said I honestly doubt Microsoft and Sony will rush their successors to a 2013 release: Xenos was the first unified-shader GPU in the world to be on the shelves for end consumers.
It wouldn't be the first time that a console gets a state-of-the-art component (technology-wise) at release time.
 
Isn't that too much of a "long shot" assumption?
What if the PS4 and X720 are strongly based on OpenCL programming, with something between Larrabee and Fermi-like autonomous GPGPUs that make extensive use of thousands of ALUs with big L2 caches, used for everything from 3D rendering to (much more) advanced physics, AI, etc.?
I'm sure you'd agree that the use of an API such as OpenCL, DirectCompute, Direct3D, OpenGL, etc. doesn't exclude feature-for-feature compatibility in hardware.

I never said I expected easy transitions for game engines running on monster GPUs with 1000+ latest-generation ALUs down to a GPU with a quarter that amount from three generations back. It will obviously be an issue. Hence my suggestion that some third-party engines be designed with Café as the base platform, and then upgraded for the other two and the PC.

The only way Café would be entirely left in the dust with regard to technological advances is if something like Larrabee, or the very early original PS3 design, makes it into the PS4/X1080. In other words: many entirely programmable cores, plus some texture units and ROPs. But I honestly don't see that coming to pass for 2013. Even Maxwell isn't there yet as far as being comparable to Larrabee.

For the PS4/X1080, I expect high-end GPU architectures with a reasonable TDP (RRoD and YLoD were not worth it for MS and Sony) and a reasonable transistor count, but nothing exotic. I still expect a multi-core CPU plus a programmable-shader GPU.

By the way, since the topic of APIs was mentioned, and since shameless promotion for the site isn't banned, Beyond3D is currently working on an article on the future of APIs for GPUs. So check that front page (yes, there's one!), folks!
 
Both Sony and Microsoft have publicly stated that their current consoles would last until 2015. Even if their successors come out earlier than that, it won't be more than a year before, which puts a release somewhere in 2014.

Besides, if the Wii 2 comes out as currently rumoured, I don't see why they would rush the release of their next-gen consoles.
If there's no threat of the Wii 2 stealing away impossible-to-port 3rd party AAA titles, why wouldn't they wait a bit more to have access to technology that completely crushes the Wii 2 in performance?
Let's not forget that Nintendo sells consoles that are made to make profits from the hardware at day one, whereas both Microsoft and Sony sell their consoles at a loss for (at least) the first year.

Because if the Wii2 is more powerful than the PS3/360 (which seems very likely; only the level of improvement seems to be in question), 2 years on the market by itself would potentially be enough time to gain the kind of support that would make it the primary system for a lot of developers. You may not think it's likely, but it's possible, and it's something MS and Sony would need to think about and monitor.

If they do release in 2014, then I can't see Nintendo being too worried about the improved hardware. They'll be top dogs in graphics for 2 years, and when the other two release they won't be in any worse a state graphically than they were with the Wii. This time they could potentially have all the support, plus a legitimate excuse for why their hardware doesn't stand up to the other two: it was released years earlier.

Following my previous statement, where I said I honestly doubt Microsoft and Sony will rush their successors to a 2013 release: Xenos was the first unified-shader GPU in the world to be on the shelves for end consumers.
It wouldn't be the first time that a console gets a state-of-the-art component (technology-wise) at release time.

It's possible, I suppose, if it's really scheduled to arrive in 2013 as hardware rather than just a paper launch; not that it matters if you believe in a 2014 launch, though.
 
On a somewhat depressing note, even if Nintendo reveals something, that in no way implies that they will reveal any hardware details about the new console. The 3DS is on the market, and interested bystanders are still pretty much completely in the dark as far as its hardware capabilities are concerned.
And they never made the innards of the Wii official.

On the other hand, it makes speculation completely safe, and even after release anyone will be able to make outrageous claims as to its (lack of) capabilities and no one will be able to offer solid evidence to the contrary. So I guess there is a silver lining to every cloud. (*cough*)
 
The only way Café would be entirely left in the dust with regard to technological advances is if something like Larrabee, or the very early original PS3 design, makes it into the PS4/X1080. In other words: many entirely programmable cores, plus some texture units and ROPs. But I honestly don't see that coming to pass for 2013. Even Maxwell isn't there yet as far as being comparable to Larrabee.

Because if the Wii2 were significantly more powerful than the PS3/360 and had 2 years on the market by itself, it could gain the kind of support that would make it the primary system for a lot of developers. Similar to how most developers still made their multi-platform games primarily for the PS2 and only ported them to the much more capable Xbox and GC.


Here's why we disagree: I'm making the assumption that the Wii 2 will not be substantially more powerful than PS3 and X360, because the rumours state the following developer comments:
graphics capabilities "roughly equal to those of the Xbox 360", performance "over the Xbox 360, but just a notch"



Do we agree that if this is the case, then there's no risk of developers making the Wii 2 their primary system? Why would they? The other two combined make a user base of ~100 million gamers.

That said, with the Wii 2 launching in late 2012, I don't believe Microsoft and Sony would release the new consoles earlier than Christmas 2013 -> Spring 2014, and that puts them on time to get a Maxwell-based system.
Why stay with the "Tick" if you can afford to wait for the "Tock"?
 
Here's why we disagree: I'm making the assumption that the Wii 2 will not be substantially more powerful than PS3 and X360, because the rumours state the following developer comments:



Do we agree that if this is the case, then there's no risk of developers making the Wii 2 their primary system? Why would they? The other two combined make a user base of ~100 million gamers.

That said, with the Wii 2 launching in late 2012, I don't believe Microsoft and Sony would release the new consoles earlier than Christmas 2013 -> Spring 2014, and that puts them on time to get a Maxwell-based system.

IF it were true that it was no more powerful than the 360, then yes, I'd agree it would be very unlikely to become the dominant platform (unless the controller etc. really is something amazing).

But claims from sources vary quite a bit, from equal to more powerful to significantly more powerful, so we can't really assume that those specific comments are the correct ones.
 
Nintendo is going to have to prove they deserve to be the target platform before any developers make that shift. I doubt any shift would happen before all 3 consoles hit the market.
 
Nintendo is going to have to prove they deserve to be the target platform before any developers make that shift. I doubt any shift would happen before all 3 consoles hit the market.

Of course they'll have to prove it. But we're talking about a situation where the Wii2 (or whatever it's called) releases in 2012 and the other two come 2 years later; that's a long time for devs to wait and a long time for Nintendo to prove themselves unopposed.
 
Hypothetical question: how powerful would the Wii2 be if it used an HD4770?

The reason I ask is that when the card first came out, I pegged it as the most likely card for Nintendo to use if they wanted performance and low power consumption. The HD4xxx cards all had pretty lousy power consumption until it was fixed in the 5xxx series, so when the rumor said it was using something from the R7xx line, the only card that makes sense to me is the 4770.
 
Do we agree that if this is the case, then there's no risk of developers making the Wii 2 their primary system? Why would they? The other two combined make a user base of ~100 million gamers.

It's clear that if we see a Café GPU with 240 RV7x0 ALUs and 1 GB of RAM, while the PS4/X1080 boast a GPU with 1000+ current-gen ALUs and 4+ GB of RAM, the odds of the big third-party games on the Nintendo console being equivalent to the Sony and MS versions are slim to none. So yes, we agree on that point.

But I still think it would have a better chance of seeing a "decent" (a relative and subjective term) version of the game than nothing at all, which is currently the case with the Wii.

That said, with the Wii 2 launching in late 2012, I don't believe Microsoft and Sony would release the new consoles earlier than Christmas 2013 -> Spring 2014, and that puts them on time to get a Maxwell-based system.
Why stay with the "Tick" if you can afford to wait for the "Tock"?

It really depends on what Sony and MS want to do. They obviously want a good deal as far as IP licensing goes. And both MS and Sony know from firsthand experience that NV plays hard: MS was stuck with fixed costs for most of the life of the first Xbox, and Sony was offered a portfolio of current-gen IPs only; they didn't put the NV80 up for licensing.

So, other than Maxwell, which, like all big architecture refreshes these days, will end up late and under-performing in its first iteration, what other big game changers could Sony or MS choose in the next year (to be ready in one or two years)? A full Intel CPU + Larrabee 2? That wouldn't be wise IP-wise, since Intel wouldn't license the hardware at all. It wouldn't be cheap or small, nor would it run cool. And it's unproven technology.

The only "exotic" path I could see MS or Sony take would be something like AMD Fusion, or their own ARM-based solution plus a desktop-grade PVR solution for the graphics (yes, they have that in the works). But that wouldn't be super high-end, just highly efficient.
 
Of course they'll have to prove it. But we're talking about a situation where the Wii2 (or whatever it's called) releases in 2012 and the other two come 2 years later; that's a long time for devs to wait and a long time for Nintendo to prove themselves unopposed.

They won't be unopposed (and I seriously doubt they'll get more than a 12-month head start). And it isn't really that long when you consider it'll be a while (a year) before the install base is big enough for them to target.
 
Hypothetical question: how powerful would the Wii2 be if it used an HD4770?


Given the original 750 MHz clock and sufficient memory bandwidth (128-bit GDDR5)?
In graphics capabilities, I'd say 3-4x the Xbox 360, or more.

Still nowhere near the current difference between the Wii and the other 7th gen consoles, though.
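For what it's worth, a back-of-the-envelope peak-FLOPS comparison lands in the same ballpark. The specs below are commonly cited figures for the retail parts (assumptions, not anything confirmed for a console variant): 640 stream processors at 750 MHz for the HD 4770, versus 48 five-wide shader units at 500 MHz for Xenos, counting a multiply-add as 2 FLOPs per ALU per clock.

```python
# Back-of-the-envelope peak-FLOPS comparison using commonly cited
# (assumed, not console-confirmed) specs for HD 4770 and Xenos.

def gflops(alus, flops_per_alu_per_clock, clock_ghz):
    # Peak GFLOPS = ALU count * FLOPs per ALU per clock * clock in GHz
    return alus * flops_per_alu_per_clock * clock_ghz

hd4770 = gflops(640, 2, 0.75)     # 640 SPs, MADD = 2 FLOPs, 750 MHz
xenos  = gflops(48 * 5, 2, 0.5)   # 48 vec4+scalar units, 500 MHz

print(hd4770, xenos, hd4770 / xenos)  # 960.0 240.0 4.0
```

Peak FLOPS ignores bandwidth and real-world efficiency, so the 4x is a ceiling rather than a measured gap.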
 
But I still think it would have a better chance of seeing a "decent" (a relative and subjective term) version of the game than nothing at all, which is currently the case with the Wii.

The Wii is an anomaly in terms of performance relative to its contemporaries.
But I don't think the power of the console will be the main determinant here; the business angle probably is. The GameCube was perceived as a failure, and when the Wii came along, pretty much nobody believed it would be as successful as it turned out to be. Once it was apparent that it was a hit, it was very late to start targeting it for AAA titles, given the development time of such projects. For the big publishers, the Wii has largely been a missed opportunity.
I doubt they would want to see the next Nintendo console pass them by as well. They'll be there from the start on this one, even if cautiously. By the time other console manufacturers launch, the success or failure of the Wii2 will be relatively easy to judge, and the publishers will go with what makes financial sense.
 
just saw this rumor in the massiveGAF thread

According to an Ubisoft employee (who wished to stay anonymous), the publisher is pushing all their development teams to cash in with a bunch of 360 ports to match the March 2012 launch. "Even with the extra horsepower, especially in the graphical department, we aren't allowed to put in extra textures or revise bugged physics that occurred in the original." He also stated that the 3D features the system offers are revolutionary and stunning, because you don't need any glasses or a 3DTV to experience it! The controller with built-in screen and camera plays a huge part in that hologram-like 3D effect. "With the optional motion control and the right horsepower, the only thing I can say is, Nintendo did it right this time!"


Wow if true.
Just like EGM said 15+ years ago, there would one day be a Nintendo Holodeck 256. Lol, I take this with a massive mountain of salt, of course, but I've always thought holograms would be the next step beyond even virtual reality. The little kid in me just got really excited.
 