Fact: Nintendo to release HD console + controllers with built-in screen late 2012

Lack of RAM is the only real reason why all PS3 games can't be played in Remote Play mode with the PSP. Pirates can already access the cfg/ini files of pirated games and enable their Remote Play capabilities. From what I hear, some games aren't playable, but the majority are.
 
Funny, I don't recall the GC having a DVD drive....

Mini-disc man! :D

That's a good question: what format will Nintendo use?

They always stay away from the standard formats MS and Sony use. It could be some modified Blu-ray, with the console unable to play movies (like the Wii and the DVD format). Or they'll copy the 360 in every way and go with DVD9: an awful move, but... it's Nintendo :???:
 
Hopefully Nintendo uses a faster Blu-ray drive than PS3 does, and has plenty of cheap slow RAM used as a buffer, like GameCube's "ARAM". I hate loading times.

As for the GPU, I hope for heaven's sake that it's on par with the RV770 in every single aspect (even if it's not a shrunk RV770 itself). I'm thinking in terms of everything from FLOPS to ROPs to memory bus width. Maybe, though, a 128-bit bus would be okay if it has faster GDDR5 memory plus a large chunk of incredibly fast embedded RAM.

I'd love to see what EAD Tokyo (Mario Galaxy team) could do with a Nintendo console that is two orders of magnitude (100x) stronger than the technology in Wii, which dates back to the late 1990s.
 
If it's an R7xx-based GPU (modified, obviously) then I don't see a problem. It reminds me of what they did with the GC (a modified 3-year-old GPU), although this time I can't see DX11 (or even DX12, if it's out by then) being as important a leap over DX10 as DX8 was over DX7.

Part of their problem with the current generation was having non-standard technology. If they continue with the more obsolete hardware designs, then they risk literally being left behind on the technology curve again.
 
Hopefully Nintendo uses a faster Blu-ray drive than PS3 does, and has plenty of cheap slow RAM used as a buffer, like GameCube's "ARAM". I hate loading times.
On a relative scale, if it's as slow as that RAM was on the GC, it would end up as underused as it was on the GC. Most developers used it as sound memory and not much more.

Not to mention that it would complicate the memory topology of the console, which wouldn't be developer-friendly these days. If they want to add more memory chips to the motherboard design, it would at least be wiser to use more of the same system RAM chips they're already using.

Anyway, as you know, load times are really a software design issue. You can start a game with only a minimum of data while the rest streams into RAM in the background. You don't have to force the player to watch tons of logos/intros only to then make them stand still in front of a load screen, waiting for the whole RAM to be filled with data.
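
Something like this, as a bare-bones sketch (the asset names and load_asset_blocking() are obviously made up for illustration):

Code:
#include <atomic>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

std::queue<std::string> pending;   // assets still waiting on disc
std::mutex pending_mutex;

void load_asset_blocking(const std::string& name) { /* slow disc read */ }

void streaming_thread() {
    for (;;) {
        std::string next;
        {
            std::lock_guard<std::mutex> lock(pending_mutex);
            if (pending.empty()) return;
            next = pending.front();
            pending.pop();
        }
        load_asset_blocking(next);  // happens off the main thread
    }
}

int main() {
    load_asset_blocking("first_playable_area");  // block only on this

    pending.push("far_level_geometry");          // everything else...
    pending.push("cutscene_audio");
    std::thread streamer(streaming_thread);      // ...streams in behind

    // while (running) { update(); render(); }   // game starts right away

    streamer.join();
}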

As for the GPU, I hope for heaven's sake that it's on par with the RV770 in every single aspect (even if it's not a shrunk RV770 itself). I'm thinking in terms of everything from FLOPS to ROPs to memory bus width. Maybe, though, a 128-bit bus would be okay if it has faster GDDR5 memory plus a large chunk of incredibly fast embedded RAM.
The rumour on 01.net talks about the R700 series and a single chip, which leaves us with a broad range of chips to look at as a basis for the caffeinated GPU. It goes from the RV710 (only 80 SPs, so we can forget about that one), used in the low-end HD 4350, to the RV790 (800 SPs), a tweaked RV770 used in the high-end HD 4890.

And since this would be a custom chip, we can't really extrapolate performance estimates from the architecture alone. We can, though, estimate its capabilities and its features:
http://www.beyond3d.com/resources/chip/133
http://www.beyond3d.com/content/reviews/52/

No matter what its SP configuration ends up being, it should be a huge jump, feature-wise, from the DX7-era Hollywood.
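
For a rough frame of reference, the usual back-of-the-envelope figure for these chips is SPs x 2 FLOPs (one MADD per clock) x clock. The 500MHz console clock below is purely my assumption, not from the rumour:

Code:
#include <cstdio>

int main() {
    const double mhz = 500.0;                  // assumed clock, not from the rumour
    const int configs[] = {80, 320, 640, 800}; // RV710 ... RV790 SP counts
    for (int sps : configs)
        std::printf("%3d SPs @ %.0f MHz -> %4.0f GFLOPS\n",
                    sps, mhz, sps * 2.0 * mhz / 1000.0);
    // Xenos, for reference: 48 vec4+scalar ALUs @ 500 MHz ~= 240 GFLOPS.
    return 0;
}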
 
Part of their problem with the current generation was having non-standard technology. If they continue with the more obsolete hardware designs, then they risk literally being left behind on the technology curve again.

Nintendo's problem this generation was using a slightly upgraded 9-year-old DX7 GPU in an era when DX10 was the standard. The rumour here is a 3-year-old (in 2012) DX10.1 GPU in an era when DX11 will be the standard. I don't see how that's any kind of continuation of using obsolete hardware. Like I said, this would be more comparable to the GameCube's situation, which was a very competitive console for its time graphically.
 
To get around the issue of an expensive touch-screen controller, couldn't they just sell it separately as an accessory? So you get a Wii 2 with some kind of current-style/updated Wii controller, and for an extra $100, say, you can buy a controller with a screen if you want to play your games remotely.
Just because people are seeing both of these products being developed side by side doesn't mean they have to be sold as one package.
 
Nintendo's problem this generation was using a slightly upgraded 9-year-old DX7 GPU in an era when DX10 was the standard. The rumour here is a 3-year-old (in 2012) DX10.1 GPU in an era when DX11 will be the standard. I don't see how that's any kind of continuation of using obsolete hardware. Like I said, this would be more comparable to the GameCube's situation, which was a very competitive console for its time graphically.

Although, since Hollywood's architecture is Flipper, it would've been 7 years old in 2006 when the Wii launched; the architecture dates back to 1999, when the gates were cranked out. Now, in 2011, it's 12 years old. By the time Nintendo Café / Wii 2 / Wii HD launches in late 2012, the Flipper/Hollywood architecture will be, incredibly, 13 years old.

If the rumors are true, then the Rx7xx-based GPU in the Wii's successor will be 4 years old. Not as old as Hollywood was, but still old. I'd really like to see the latest tessellation on Nintendo's GPU, the kind that DX11 has. I'm aware that all AMD/ATI GPUs have had tessellation, starting with Xenos and actually going all the way back to the R200 / Radeon 8500 (TruForm), but I'd hope to see the sort that only showed up in DX11-class GPUs.
 
I don't see the point of choosing an RV770 over Juniper, or an RV730 over Redwood.
Shrinking them down to 40nm would have a cost, surely higher than DirectX 11 licensing.
The most interesting chip out of the R700 series is the RV740. It's already at 40nm, and it sits between Juniper and Redwood in both size and performance.
The RV740 is considerably faster than Xenos... probably 3 to 4 times faster.
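
That 3-to-4x range lines up with the raw peak numbers, at least. These are desktop reference clocks; a console version would presumably be clocked lower, which is where the low end of the range comes from:

Code:
#include <cstdio>

int main() {
    // Xenos: 48 vec4+scalar ALUs x (5 x 2 FLOPs) x 0.5 GHz = 240 GFLOPS
    const double xenos = 48 * 5 * 2 * 0.5;
    // RV740 (HD 4770): 640 SPs x 2 FLOPs x 0.75 GHz = 960 GFLOPS
    const double rv740 = 640 * 2 * 0.75;
    std::printf("RV740 / Xenos = %.1fx peak\n", rv740 / xenos);  // ~4.0x
    return 0;
}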
 
Nintendo's problem this generation was using a slightly upgraded 9-year-old DX7 GPU in an era when DX10 was the standard. The rumour here is a 3-year-old (in 2012) DX10.1 GPU in an era when DX11 will be the standard. I don't see how that's any kind of continuation of using obsolete hardware. Like I said, this would be more comparable to the GameCube's situation, which was a very competitive console for its time graphically.

Just because the GPU design will be more recent doesn't mean it won't be similarly underpowered. If they only slap 80 R700-era shaders on a die with those three PPE/Xenon-level PowerPC cores and connect it to 512 MB of DDR3, being DX10.1 won't matter all that much, since it will still be incredibly slow for a 2012 device.

Every indication we have is that it will be competitive with the 360 and PS3. If, by 2013, Sony and Microsoft launch new consoles with 8 or 16 times as much RAM, modern 8-, 12- or 16-core CPUs, and DX11+ GPUs with 2000 shaders, that will leave Nintendo a full generation behind. Again.
 
Just because the GPU design will be more recent doesn't mean it won't be similarly underpowered. If they only slap 80 R700-era shaders on a die with those three PPE/Xenon-level PowerPC cores and connect it to 512 MB of DDR3, being DX10.1 won't matter all that much, since it will still be incredibly slow for a 2012 device.

There's no way the Wii 2 / Wii HD / Project Café will have 80 R700-era shaders. That would be weaker than Xenos, I think. Xenos has 48 beefier shaders, the equivalent of 192 R700 shaders, from what I understand from reading the massive GAF thread.

Every indication we have is that it will be competitive with the 360 and PS3. If, by 2013, Sony and Microsoft launch new consoles with 8 or 16 times as much RAM, modern 8-, 12- or 16-core CPUs, and DX11+ GPUs with 2000 shaders, that will leave Nintendo a full generation behind. Again.

I'm expecting something like the RV770 (not the R700, aka the 4870 X2), which should provide around 5x the performance of Xenos. Whether that comes in the form of an RV770 shrunk down to 32nm/28nm, a Fusion APU, or something entirely new is another matter.
 
I fully expect a single-chip system. I think we're probably looking at 160 or 320 shaders, which is the level you see on the Llano APUs.
 
Rumours are all pointing to Nintendo once again putting its efforts into the controller stuff.

That said, I expect the new console to be fairly mediocre in terms of raw performance (or awfully weak by 2012 PC standards).
The word in the latest rumours over at NeoGAF is a notch over the X360, which sounds pretty weak to me.


The good news is that X360-level performance should come dirt-cheap in 2012, so despite the rumoured exquisite controllers, it's possible the new console will launch at a cheap-ish price (~250€ with one super-duper controller). I imagine a Fusion APU with 3 cores and an "old" 64-80 VLIW5 shader GPU could be had for as little as ~75€ by mid-2012.
Rumours are pointing to a 3-core PowerPC, but a Fusion APU makes a lot more sense to me, especially if you think the "Nintendo way".

The bad news is that less than a year later there will be new consoles from Microsoft and Sony setting new benchmarks for performance, so there goes AAA 3rd-party multi-platform compatibility, down the toilet again.



BTW, rumours are saying the 6" screen isn't multi-touch capable. Damn Nintendo, getting greedy in the wrong places again!
 
I think we're probably looking at 160 or 320 shaders, which is the level you see on the Llano APUs.
I imagine a Fusion APU with 3 cores and an "old" 64-80 VLIW5 shader GPU could be had for as little as ~75€ by mid-2012.

320 to 400 RV7x0-level ALUs @ say 500MHz would put it head and shoulders above the X360, which is much more than "a notch". Then again, we only have vague second-hand opinions on the expected overall "power" of the machine. And since opinions are highly subjective, that's hardly enough for us to go by and set a proper frame of reference for the GPU.

If the source of that 01.net rumour has access to a devkit/docs, and thus this "notch above X360" has any credence, then we should expect something like 240 ALUs, based on the RV7x0 architecture having 80 ALUs per cluster.

If the source just bases its prediction on the overall picture Nintendo (or others) gave of what to expect from the GPU, then a 4-cluster setup (thus 320 ALUs) is a possibility.
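
For what it's worth, here's that cluster arithmetic spelled out (the 500MHz clock is the hypothetical from above, nothing confirmed):

Code:
#include <cstdio>

int main() {
    const double mhz = 500.0;  // hypothetical clock, as above
    for (int clusters = 2; clusters <= 5; ++clusters) {
        const int alus = clusters * 80;  // 80 ALUs per RV7x0 cluster
        std::printf("%d clusters = %3d ALUs -> %3.0f GFLOPS\n",
                    clusters, alus, alus * 2.0 * mhz / 1000.0);
    }
    // 3 clusters already matches Xenos's ~240 GFLOPS on paper, and the
    // newer ALUs do more useful work per FLOP, hence "a notch above".
    return 0;
}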

BTW, rumours are saying the 6" screen isn't multi-touch capable. Damn Nintendo, getting greedy in the wrong places again!
You'd need a hard coating for a capacitive (multi-touch) screen, which would make the gamepad heavier with a 6" screen... and obviously much more expensive than just resorting to a simple resistive screen. And in this case, price would be a much wider issue than just Nintendo being cheapskates, since you'd have to factor in that the gamepads couldn't be too expensive, or else that would reduce sales of supplementary gamepads for the Wii 2/Café.

The bad news is that less than a year later there will be new consoles from Microsoft and Sony setting new benchmarks for performance, so there goes AAA 3rd-party multi-platform compatibility, down the toilet again.
If it is indeed just a ~3GHz 3-core POWER CPU, with an RV7x0-series GPU of 320 ALUs at best and ~1GB of RAM, then it would indeed make multi-platform support an uneasy task for 3rd parties once again.

With that said, one of the biggest issues the Wii had compared to the other two, as far as multi-platform support went, was that it was very different technology-wise. The PS360 had multi-gigahertz multi-core CPUs and fully programmable GPUs with extensive shader support, whereas the Wii was stuck with a ~700MHz single-core CPU and a GPU that supported fixed-function TnL and some colour-combining effects.

The PS4/X1080 won't be that different technologically from the Wii 2/Café; they'll just be a lot faster/more capable. In other words, it's possible to imagine creating a game engine around the Café's capabilities and then porting it up to the PS4/X1080 without it having to be fundamentally different (like it had to be between the Wii and PS360, if you wanted to make use of the PS360's capabilities). The Café version would obviously have to cut many more corners: lower-resolution textures, fewer PCF samples, lower LOD, etc. But the overall game could be similar.
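
As a toy illustration of that kind of scaling (every name and number below is hypothetical, just "same engine, fewer corners"):

Code:
// Same engine, two tiers; the Café tier just cuts corners everywhere.
struct QualityTier {
    int   texture_size;     // max texture dimension
    int   shadow_pcf_taps;  // PCF samples for shadow filtering
    float lod_bias;         // positive = switch to coarser models sooner
};

const QualityTier kCafeTier = { 1024,  4, 1.0f };
const QualityTier kPS4Tier  = { 4096, 16, 0.0f };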

Of course, if the game relies heavily on its aesthetics or on the sheer number of objects displayed on screen, and if the developers/publishers aren't willing to sacrifice the image quality too much, it would sadly rule out a Café version.

Personally, I think the biggest reason publishers/developers would refuse to port their big "core" games to the platform would be entirely related to the perceived success of "core" games on the platform early on. If, for some reason (founded or not), publishers decide that the Café is not a viable platform for some games, it will be a case of self-fulfilling prophecy: no or few core games > no audience cultivated > no success for the rare core games > no more core games.
 
Nintendo's problem this generation was using a slightly upgraded 9-year-old DX7 GPU in an era when DX10 was the standard. The rumour here is a 3-year-old (in 2012) DX10.1 GPU in an era when DX11 will be the standard. I don't see how that's any kind of continuation of using obsolete hardware. Like I said, this would be more comparable to the GameCube's situation, which was a very competitive console for its time graphically.

Why can't we assume it'll have custom features on the GPU? I mean, as far as I can tell, neither the 360 nor the PS3 used a simple off-the-shelf part. They had their own custom features added on to a regular GPU.
 
Nintendo's problem this generation was using a slightly upgraded 9-year-old DX7 GPU in an era when DX10 was the standard. The rumour here is a 3-year-old (in 2012) DX10.1 GPU in an era when DX11 will be the standard. I don't see how that's any kind of continuation of using obsolete hardware. Like I said, this would be more comparable to the GameCube's situation, which was a very competitive console for its time graphically.

The real question in my mind is still compatibility with the key technology that engines will be designed around. What would the significant differences be if two consoles are both more powerful and more efficient with current and future rendering technology, compared to a console that may not be compatible with some of those design choices? I just wonder if something like using tessellation might make it difficult to port models backwards to older hardware.
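
One common workaround, sketched below (not anyone's confirmed pipeline): author a coarse control mesh, and at build time also bake a denser static version for hardware without tessellation support.

Code:
struct Mesh { /* vertex/index buffers */ };

const Mesh* pick_mesh(bool hw_tessellation,
                      const Mesh* coarse, const Mesh* baked_dense) {
    // DX11-class GPU: feed the coarse mesh to the hull/domain shaders.
    if (hw_tessellation) return coarse;
    // Older GPU: render the pre-baked dense mesh instead. Costs memory
    // and loses adaptive detail, but the source art stays shared.
    return baked_dense;
}

So it complicates the asset pipeline more than it outright blocks a port.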
 