Fact: Nintendo to release HD console + controllers with built-in screen late 2012

These specs are so blatantly fake. 512MB for the RAM? 1024MB for the VRAM? And eDRAM on top of that?
It's a bad joke...

On top of that, XDR2 and GDDR5. :LOL: XDR2 is much more useful as a unified memory pool. The 16x data rate and insane bandwidth would hardly be used up by just the CPU.
 
These specs are so blatantly fake. 512MB for the RAM? 1024MB for the VRAM? And eDRAM on top of that?
It's a bad joke...

A new-gen FlexIO from Rambus could allow the CPU to access the GDDR5 without much latency. But nonetheless, AFAIK the RSX ends up using the XDR quite a lot, while the Cell hardly ever uses the GDDR3 in the PS3.

That said, it's not much of a long shot to assume that, in a gaming console without much OS memory demand, where you're targeting a 1920*1080 main display plus four 960*540 controller displays, you'll eventually need quite a bit more graphics memory than system memory, IMO.
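The arithmetic behind that target, as a quick sketch (the resolutions here are just the rumored ones, nothing more):

```python
# Pixel-count sketch for the rumored setup: one 1920x1080 TV image
# plus four 960x540 controller images.

def pixels(w, h):
    return w * h

main = pixels(1920, 1080)           # 2,073,600 pixels
controllers = 4 * pixels(960, 540)  # 4 x 518,400 = 2,073,600 pixels
total = main + controllers

print(main, controllers, total)
# The four controller screens together add exactly one extra 1080p
# frame's worth of pixels, doubling the budget versus a single TV.
```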


Yes, that's true, but do you really think Nintendo was able to completely re-implement the entire DirectX API, with all of the nuances used by game engines?

Of course the "setup, recompile, done" was an exaggeration, but if the Nintendo Stream\Feel is able to top each and every spec of the Microsoft console, then ports would require very little effort from developers. They could actually be lazy enough not to optimize almost anything for the new console.

Again, the console was supposedly developed in close contact with X360 and PC developers. Even if the eDRAM doesn't make much sense if we're thinking about building a balanced system, that objective may not be Nintendo's main purpose for the new system.
If the purpose is to attract AAA developers from the X360, then sticking some eDRAM in there could be a demand from those developers.
Besides, the RV770 has ~960M transistors. The additional 16MB of eDRAM would be what? Another 80-100M?
And how many transistors would it lose by switching to a 128-bit memory controller? ~30M?


At 32nm, a ~1000M-transistor custom RV770 with 16MB of eDRAM and a narrower 128-bit memory bus would still take less than half the die space of the original RV770 (<100mm^2?).
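A back-of-the-envelope check of that die-size guess, assuming ideal quadratic area scaling from 55nm to 32nm (real processes scale worse than this, so treat it as an optimistic bound):

```python
# Rough die-area estimate for the hypothetical custom RV770.
# Assumes ideal (quadratic) area scaling with feature size and that
# area scales linearly with transistor count -- both simplifications.

RV770_AREA_MM2 = 260.0   # RV770 at 55 nm
RV770_XTORS   = 956e6    # ~956M transistors
CUSTOM_XTORS  = 1000e6   # +16MB eDRAM, minus half the memory controllers

shrink = (32.0 / 55.0) ** 2                      # area factor, 55 -> 32 nm
area = RV770_AREA_MM2 * shrink * (CUSTOM_XTORS / RV770_XTORS)
print(f"{area:.0f} mm^2")   # comes out well under half the original die
```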



For one, I think the chance that Project Cafe isn't a unified memory architecture is basically zero.

I also think the chances are slim, but not that close to zero. Nintendo has never used a unified memory pool in a home console before.



Well, it doesn't look bad, aside from the fact that they'd never go for a mechanical HDD. Wouldn't the ED-RAM be too small if they're targeting 1800 by 1000 est for the four screens, in addition to up to 1920 by 1080 for the main TV screen?

I'm not sure where you got the "1800*1000" for the controller screens. It's supposed to be 1920*1080 divided into 4 equal rectangles, so 960*540 for each of the 6" screens.
Regarding the HDD, it's a developer platform and not the home console itself. I wouldn't be surprised if these had a hard drive too, instead of the 512MB of flash memory.
 
A new-gen FlexIO from Rambus could allow the CPU to access the GDDR5 without much latency
GDDR itself already has quite a bit of added latency compared to regular DDR. On top of that, the memory controller would sit off the CPU die, in the GPU, further increasing latency. IIRC the XB360 has several times worse memory latency than the P4 had.
 
Again, the console was supposedly developed in close contact with X360 and PC developers. Even if the eDRAM doesn't make much sense if we're thinking about building a balanced system, that objective may not be Nintendo's main purpose for the new system.
If the purpose is to attract AAA developers from the X360, then sticking some eDRAM in there could be a demand from those developers.
Besides, the RV770 has ~960M transistors. The additional 16MB of eDRAM would be what? Another 80-100M?
And how many transistors would it lose by switching to a 128-bit memory controller? ~30M?


At 32nm, a ~1000M-transistor custom RV770 with 16MB of eDRAM and a narrower 128-bit memory bus would still take less than half the die space of the original RV770 (<100mm^2?).

Nintendo has used MoSys 1T-SRAM for the last two generations: http://www.mosys.com/consumerGraph.php
The Wii had 24MB in the ATI Hollywood MCM, and that was on 90nm CMOS. So I don't think they will have any problems with embedded memory. ;)
 
Again, the console was supposedly developed in close contact with X360 and PC developers. Even if the eDRAM doesn't make much sense if we're thinking about building a balanced system, that objective may not be Nintendo's main purpose for the new system.
If the purpose is to attract AAA developers from the X360, then sticking some eDRAM in there could be a demand from those developers.
Besides, the RV770 has ~960M transistors. The additional 16MB of eDRAM would be what? Another 80-100M?
And how many transistors would it lose by switching to a 128-bit memory controller? ~30M?


At 32nm, a ~1000M-transistor custom RV770 with 16MB of eDRAM and a narrower 128-bit memory bus would still take less than half the die space of the original RV770 (<100mm^2?).

I think you're missing the point of ED-RAM here. It isn't specifically just about portability from the Xbox 360. The reason ED-RAM makes sense is power efficiency and board cost/complexity for a given level of performance.

When it takes more power to move data than to actually do the calculations, anything that localises that data better will improve efficiency, performance, or both.

So if they were going to go with a decent amount of ED-RAM in their console, they could follow Microsoft's lead and use a single memory bus for the whole system, just as we saw with the Xbox 360. They could even go with a relatively basic DDR3/4 memory interface with something paltry like ~30GB/s of bandwidth. That would scale nicely against what the Xbox 360 has, for a slightly higher level of performance.
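Where a "~30GB/s" figure could come from, as a quick sketch; the DDR3-1866 speed grade here is just an assumed example, not anything from the leak:

```python
# Peak bandwidth of a plain DDR interface: bus width in bytes times
# transfer rate. DDR3-1866 on a 128-bit bus is one plausible combination.

BUS_BITS = 128
TRANSFER_RATE = 1866e6          # transfers per second (DDR3-1866)

bandwidth_gbs = BUS_BITS / 8 * TRANSFER_RATE / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")   # ~29.9 GB/s
# For comparison, the Xbox 360's 128-bit GDDR3 at 1400 MT/s
# works out to 22.4 GB/s by the same formula.
```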

I'm not sure where you got the "1800*1000" for the controller screens. It's supposed to be 1920*1080 divided into 4 equal rectangles, so 960*540 for each of the 6" screens.
Regarding the HDD, it's a developer platform and not the home console itself. I wouldn't be surprised if these had a hard drive too, instead of the 512MB of flash memory.

900 by 500 is a common resolution and form factor for a screen that size. I just went with that.
 
Anyone know a company that has started to manufacture XDR2? Rambus is fabless.

The GPU is really underwhelming if true.
 
Anyone know a company that has started to manufacture XDR2? Rambus is fabless.

The GPU is really underwhelming if true.

What kind of GPU are you expecting?! :eek:

The pic is pretty clearly fake, but IF it were real, that would be a very powerful GPU, Radeon HD4890 level. Far more powerful than anyone expected until recently, and still more powerful than most people are expecting now. In fact, it's the kind of performance you said you were hoping for earlier in this thread.
 
I think you're missing the point of ED-RAM here. It isn't specifically just about portability from the Xbox 360. The reason ED-RAM makes sense is power efficiency and board cost/complexity for a given level of performance.
When it takes more power to move data than to actually do the calculations, anything that localises that data better will improve efficiency, performance, or both.

Please read my post correctly.
I'm not missing the point of using eDRAM when creating a system with balanced components and efficiency in mind.

I'm suggesting that in this case, eDRAM might be there just to achieve easier ports from X360, as that's been one of the major highlights when presenting the console to developers:
[image: nintendoprojectcafe4.jpg]






So if they were going to go with a decent amount of ED-RAM in their console, they could follow Microsoft's lead and use a single memory bus for the whole system, just as we saw with the Xbox 360. They could even go with a relatively basic DDR3/4 memory interface with something paltry like ~30GB/s of bandwidth. That would scale nicely against what the Xbox 360 has, for a slightly higher level of performance.

Caches aside, Nintendo used 3 different memory pools in the Wii (eDRAM + 1T-SRAM + GDDR3) and the GameCube (eDRAM + 1T-SRAM + SDRAM).
One may criticize the performance levels of the last console, but no one has criticized the per-watt and per-transistor efficiency of either.



900 by 500 is a common resolution and form factor for a screen that size. I just went with that.

It's supposedly 960*540:
[image: nintendoprojectcafe3.jpg]




Anyone know a company that has started to manufacture XDR2? Rambus is fabless.

The GPU is really underwhelming if true.
A GPU 6-7x more powerful than the one in the X360 is "underwhelming"?!

On the contrary, this is "too good to be true", for a console that's supposed to last ~6 years.
 
What kind of GPU are you expecting?! :eek:

The pic is pretty clearly fake, but IF it were real, that would be a very powerful GPU, Radeon HD4890 level. Far more powerful than anyone expected until recently, and still more powerful than most people are expecting now. In fact, it's the kind of performance you said you were hoping for earlier in this thread.

I was sort of hoping there would be two RV770s in there; after all, it'll need to render the 1080p screen and up to four extra screens on the controllers. So two 1080p streams. That's quite a jump from the sub-720p we're getting this gen. Though I guess there will be some form of scaler to handle everything.
 
If the stream feature is true, then I'd only expect to see reasonably simple graphics (maps etc.) being pushed to the controllers while the main TV is in use for gaming, with full-quality graphics being usable on the controllers when the TV isn't in use for gaming. In that scenario, that GPU would be more than enough to put the 360/PS3 to shame.

If the system can do what you're suggesting, then I hope it's only an option for developers. I'd hate to see every developer having to limit the graphics in their games just to allow for the possibility of 5 people all playing at once on their own screens in one room.
 
If the stream feature is true, then I'd only expect to see reasonably simple graphics (maps etc.) being pushed to the controllers while the main TV is in use for gaming, with full-quality graphics being usable on the controllers when the TV isn't in use for gaming. In that scenario, that GPU would be more than enough to put the 360/PS3 to shame.
Agreed.
Besides, it doesn't even make much sense to invest in high-quality 3D rendering for both the controller and the TV screen at the same time, as the players wouldn't always be staring at both.



If the system can do what you're suggesting, then I hope it's only an option for developers. I'd hate to see every developer having to limit the graphics in their games just to allow for the possibility of 5 people all playing at once on their own screens in one room.

Funny thing is: if the console really were to have an RV770 with a dedicated 1GB of GDDR5 on a 128-bit bus, it would still render two 1080p screens faster than an X360 could render one 720p screen.
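A rough sanity check of that claim using published peak shader throughput (Xenos ~240 GFLOPS, RV770/HD4870 ~1.2 TFLOPS); peak FLOPS ignore bandwidth, ROPs and everything else, so this is only a ballpark:

```python
# Peak shader FLOPS available per pixel: a crude proxy for how much
# work each rendered pixel can get. Published peak figures only.

XENOS_GFLOPS = 240.0    # Xbox 360 GPU, peak programmable shader FLOPS
RV770_GFLOPS = 1200.0   # Radeon HD 4870 (RV770), peak

x360_pixels = 1280 * 720          # one 720p frame
cafe_pixels = 2 * (1920 * 1080)   # two 1080p frames

x360_budget = XENOS_GFLOPS * 1e9 / x360_pixels
cafe_budget = RV770_GFLOPS * 1e9 / cafe_pixels
print(cafe_budget / x360_budget)  # >1: more shader power per pixel,
# even while pushing 4.5x the pixels
```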
 
Quarter-1080p splitscreen is a fabulous idea that, ridiculously, hasn't seen any traction this gen. The idea of local coop with four local quarter-1080p screens sounds good. As Teasy says, don't expect the equivalent of two complete, full 3D images being rendered. Like on the DS, one screen would likely be used for gameplay and the other for information. E.g. Mario Kart would have gameplay on the handheld and commentary or a map on the TV. Dragon Quest multiplayer would have gameplay on the same screen, all four players cooperating, with local skills and inventory on the controller so it doesn't interrupt the game for the other players.
 
I guess with 4 controllers you will only ever need to render no more than 4 images. I doubt there will be an option for running a game on the main screen while streaming a game to the other 3 controllers.

But assuming devs are forced to make a game playable on all 4 controllers at the same time, would that really be such a big issue? Would it be very different from, let's say, games that currently allow 4-player splitscreen? The screen is much smaller, the resolution lower, and you can lower the image quality.

Then again, I still wonder if Nintendo is really going to put so much time/effort/money into a feature that, while sounding really cool, isn't going to be used that much in reality. How many families out there only have 1 TV? In the case of families, I'd wager that most consoles are probably hooked up to a second TV.

Also, who's going to buy a home console just to end up gaming on a small screen? Might as well buy a handheld then.
 
I'm suggesting that in this case, eDRAM might be there just to achieve easier ports from X360
From what I understand, eDRAM doesn't really do much besides holding a (mostly) write-only render target with added "free" MSAA. With enough VRAM bandwidth, those can be emulated trivially.
 
What's the point of this? Are they releasing a console or a set of four handhelds?
Local coop may be a nice gimmick, but that screen makes the controller clumsy. If I were in the market for a *console*, this would be a huge turn-off.
 
The controller screen will probably just be a cheap, low-resolution display, mainly for 2D inventory or map graphics. I can't imagine them putting a quarter-HD display of decent quality on the controllers; the 4 controllers would cost more than the console itself. :?: That's just silly.
 
I guess with 4 controllers you will only ever need to render no more than 4 images. I doubt there will be an option for running a game on the main screen while streaming a game to the other 3 controllers.

But assuming devs are forced to make a game playable on all 4 controllers at the same time, would that really be such a big issue? Would it be very different from, let's say, games that currently allow 4-player splitscreen? The screen is much smaller, the resolution lower, and you can lower the image quality.

Then again, I still wonder if Nintendo is really going to put so much time/effort/money into a feature that, while sounding really cool, isn't going to be used that much in reality. How many families out there only have 1 TV? In the case of families, I'd wager that most consoles are probably hooked up to a second TV.

Also, who's going to buy a home console just to end up gaming on a small screen? Might as well buy a handheld then.

Most families will have more than one TV, but those TVs aren't easily movable anywhere in the house like a controller could be.

With your last question you seem to be assuming that anyone who buys the console will intend to play on the controller all of the time, which is just weird. People will buy it to play on a TV, and sometimes on the controller when it suits them.

If you and your family want a home console and you want the ability to play four player with each other in the home then which is the better option? Spending $300 on a home console and $200-$300 on 3 extra controllers or spending $300 on a home console and $1000 on 4 handhelds?

Funny thing is: if the console were to really have a RV770 with dedicated 1GB GDDR5 using a 128-bit bus, it would still render two 1080p screens faster than a X360 could render one 720p screen.

Yeah I agree, I'd love to see what it could do with one 1080p scene though :)
 
From what I understand, eDRAM doesn't really do much besides holding a (mostly) write-only render target with added "free" MSAA. With enough VRAM bandwidth, those can be emulated trivially.
How exactly would they emulate the 223GB/s of framebuffer bandwidth?
With a 512-bit GDDR5 memory system? I don't really think that's realistic...
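To put numbers on it, here's a quick sketch of the bus width you'd need to match that figure in external GDDR5; the ~4 Gbps per-pin rate is an assumption based on typical parts of the time:

```python
# Bus width needed to hit a target bandwidth with external GDDR5:
# pins = target bytes/s * 8 bits / per-pin data rate.

TARGET_GBS = 223.0            # eDRAM framebuffer bandwidth quoted above
GDDR5_GBPS_PER_PIN = 4.0      # assumed effective data rate per pin

pins_needed = TARGET_GBS * 8 / GDDR5_GBPS_PER_PIN
print(f"{pins_needed:.0f}-bit bus")   # ~446 bits, i.e. a 512-bit
# interface, with all the PCB routing and chip cost that implies,
# just to break even with the embedded pool
```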



What's the point of this? Are they releasing a console or a set of four handhelds?
Local coop may be a nice gimmick, but that screen makes the controller clumsy. If I were in the market for a *console*, this would be a huge turn-off.

You've seen the controller???
Wow, you know more than all of us together! Cool thing ;)


The controller screen will probably just be a cheap, low-resolution display, mainly for 2D inventory or map graphics. I can't imagine them putting a quarter-HD display of decent quality on the controllers; the 4 controllers would cost more than the console itself. :?: That's just silly.

I (re)posted some pretty images on the last page. Check them out.
 