Predict: The Next Generation Console Tech

Don't know if anyone posted this yet, but

http://www-03.ibm.com/press/us/en/pressrelease/34683.wss

The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

IBM plans to produce millions of chips for Nintendo featuring IBM Silicon on Insulator (SOI) technology at 45 nanometers (45 billionths of a meter). The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.
 
I think some geeks just can't control themselves and have a hard-on for FLOPS talk... kinda like those people who have a hard-on for SSD storage and 100-core 6GHz processors... not happy until the wet-dream specs reach epic alternate-reality proportions... tri-winning.


As I say... it's just my 2 cents...

A lot of arrogance in your post ...


I will never treat another user with disrespect the way you did with your arrogance; I think a more sensible answer serves the pursuit of knowledge. It is clear that peak FLOPS as a metric cannot be the only way to measure the power or capacity of a CPU or GPU in the 3D universe, but it can certainly show the relative capacity to keep the execution units, with appropriate software, processing as constantly and efficiently as possible.

Anyway, forget Top500.org, Linpack, FFT (IBM uses this..), Whetstone, etc., since apparently they're no longer good for anything ;)
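
To make the peak-vs-sustained distinction concrete, here's a minimal C sketch (nothing to do with IBM's actual benchmarks; the peak figure is a made-up placeholder) that times a multiply-add loop and reports what fraction of an assumed peak it actually sustains:

```c
/* Minimal sketch: sustained vs. peak FLOPS.
 * PEAK_GFLOPS is a hypothetical placeholder, not any real chip's spec. */
#include <stdio.h>
#include <time.h>

#define N (1 << 22)          /* 4M elements */
#define REPS 32
#define PEAK_GFLOPS 100.0    /* assumed theoretical peak */

static float a[N], b[N], c[N];

int main(void) {
    for (int i = 0; i < N; i++) { a[i] = 1.0f; b[i] = 2.0f; c[i] = 0.0f; }

    clock_t t0 = clock();
    for (int r = 0; r < REPS; r++)
        for (int i = 0; i < N; i++)
            c[i] = a[i] * b[i] + c[i];   /* 2 flops per element */
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    double gflops = 2.0 * N * REPS / secs / 1e9;
    printf("sustained %.2f GFLOPS = %.1f%% of assumed %.0f GFLOPS peak (c[0]=%f)\n",
           gflops, 100.0 * gflops / PEAK_GFLOPS, PEAK_GFLOPS, c[0]);
    return 0;
}
```

The gap between that sustained number and the paper peak is exactly what Linpack-style benchmarks try to characterize.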
 
My contention wasn't that efficiency and effective design were being ignored in the discussion but that some of the examples given were anything but representative of that.
 
Sounds like they should stick with IBM, and follow up with their R&D. Let's see if they can bring Watson into PS3/4. :devilish:
I think system architecture and software library maturity may be more important than the CPU/GPU choice at this point.

Binary compatibility with ARM will be broken/incomplete anyway if Sony continues to use SPUs.

... and so they (IBM) did use it to market Wii U:
http://www.engadget.com/2011/06/07/ibm-puts-watsons-brains-in-nintendo-wii-u/

IBM tells us that within the Wii U there's a 45nm custom chip with "a lot" of embedded DRAM (shown above). It's a silicon on insulator design and packs the same processor technology found in Watson, the supercomputer that bested a couple of meatbags on Jeopardy awhile back.

;-)
 
My contention wasn't that efficiency and effective design were being ignored in the discussion but that some of the examples given were anything but representative of that.



I just have to thank you for the information, and as I said, I fully agree with you.

But let's get back to the coming console... how do you look at the possibility of the Wii U / Project Cafe using an R700/RV770(?) GPU with a "lot" (32MB or even more?) of eDRAM?
 

I think Engadget's conclusion is dubious. I'm sure they were fed some generic marketing line from IBM saying, "Wii U uses our POWER technology just like Watson!" but I doubt we are meant to believe it literally uses a Power7 chip.

Given the physical dimensions of the base unit actually shown, I think we can pretty quickly discount any design that could fairly be called significantly more powerful than a PS3/360. I think the original leak is probably pretty accurate and we're looking at a modest 3-core POWER design of some kind. I also think the system is so small that we must be looking at a single-chip design with some number of those R700-based shaders (between 160-320 is my guess) integrated Fusion-style, along with a chunk of embedded DRAM as L3 or VRAM. Texture quality in the games they've shown also doesn't point to much more RAM than the current HD systems.
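
For scale, the back-of-the-envelope peak math for those shader counts is simple. A quick C sketch, assuming R700-class ALUs doing one multiply-add (2 flops) per clock; the 500 MHz clock is purely a guess, not a known spec:

```c
/* Back-of-the-envelope peak GFLOPS for the shader-count guesses above.
 * R700-class ALUs do one multiply-add (2 flops) per clock;
 * the 500 MHz clock is an assumption for illustration only. */
#include <stdio.h>

int main(void) {
    const double clock_ghz = 0.5;              /* assumed clock */
    const int shaders[] = { 160, 240, 320 };

    for (int i = 0; i < 3; i++)
        printf("%3d shaders @ %.0f MHz -> %3.0f GFLOPS peak\n",
               shaders[i], clock_ghz * 1000.0, shaders[i] * 2.0 * clock_ghz);
    /* For reference, Xenos in the 360 is usually quoted around 240 GFLOPS. */
    return 0;
}
```

So even the high end of that shader range only lands in the same rough ballpark as the current HD consoles, which fits the "not significantly more powerful" read.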
 
I just have to thank you for the information, and as I said, I fully agree with you.

But let's get back to the coming console... how do you look at the possibility of the Wii U / Project Cafe using an R700/RV770(?) GPU with a "lot" (32MB or even more?) of eDRAM?

I'd say it has to have 770 features or newer (but not as many SPUs); how the graphics get onto the controllers must be a large factor.
 
The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

45nm custom chip with "a lot" of embedded DRAM

The eDRAM might not be on die. On-die eDRAM is supposed to add significant cost.
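
Rough area math gives a feel for that cost. A hedged C sketch: the ~0.067 um^2 cell is IBM's published 45nm eDRAM figure as best I recall, and the 55% array efficiency is my guess to cover sense amps, decoders and redundancy:

```c
/* Very rough die-area estimate for on-die eDRAM at 45nm.
 * cell_um2 is IBM's published 45nm eDRAM cell size as I remember it;
 * array_eff is an assumed overhead factor, not a known figure. */
#include <stdio.h>

int main(void) {
    const double cell_um2  = 0.067;    /* IBM 45nm eDRAM cell, approx. */
    const double array_eff = 0.55;     /* assumed array efficiency */
    const double mbytes    = 32.0;

    double bits = mbytes * 1024 * 1024 * 8;
    double mm2  = bits * cell_um2 / array_eff / 1e6;   /* um^2 -> mm^2 */
    printf("%.0f MB eDRAM ~ %.0f mm^2 at 45nm (very rough)\n", mbytes, mm2);
    return 0;
}
```

That works out to something like 30+ mm^2 for 32MB, which is real money on a console SoC, though still cheaper than a separate daughter die plus packaging might be.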
 
I'd say it has to have 770 features or newer (but not as many SPUs); how the graphics get onto the controllers must be a large factor.


You could be right; maybe a custom RV770 * or even Redwood ** (DX11), because there are displays on the controllers (4 or more? What resolution is each one?).


* http://www.guru3d.com/article/ati-radeon-hd-4670-review/2

Radeon HD 5550/6390
** http://www.guru3d.com/news/radeon-hd-5550-rebranded-to-hd-6390-in-russia/
HD 5570
http://www.anandtech.com/show/2935/1
 
You could be right; maybe a custom RV770 * or even Redwood ** (DX11), because there are displays on the controllers (4 or more? What resolution is each one?).
It has been said only one of those controllers with a screen can be attached to the Wii U at a time.
 
Perhaps it's a shared memory hierarchy, so that both the GPU and CPU access the same eDRAM (L3/framebuffer).

Wow, that could get very expensive. But if Ninty went with the "fake" specs of 512 MB XDR2 system memory and 1 GB of GDDR5 VRAM, eDRAM could be important for easing pressure on main memory, since there will be no HDD (seriously, Nintendo?). Whatever the case, I would like to see where Ninty's cost of goods rests right now. Not having a real HDD just seems foolish. But perhaps forcing users to use SD cards will be beneficial. I just hope the system can deal with not having an HDD to cache data. Even still, I'm fully expecting Ninty to release a USB HDD later on. Even a 32 GB SD card wouldn't be much for gamers to go on, especially if DLC and VC stuff is important to the new system's strategy.
 
Why is not having an HDD a problem? It will have internal storage, so there is always the possibility of part of that memory being reserved for a game cache. As for DLC/VC, it doesn't matter either, because apparently you can hook up any SD card or USB HDD you want and play games from there. So I really don't see the point in having a large and relatively expensive HDD in there when even 16 GB of flash memory would do the trick.
 
Why is not having an HDD a problem? It will have internal storage, so there is always the possibility of part of that memory being reserved for a game cache. As for DLC/VC, it doesn't matter either, because apparently you can hook up any SD card or USB HDD you want and play games from there. So I really don't see the point in having a large and relatively expensive HDD in there when even 16 GB of flash memory would do the trick.

Well, if the system has support for an external mass storage device, then that is not too problematic to me. Nintendo will need to ship the system either with a "starter" SD card or with some on-system flash memory if they expect developers to use such storage as a data cache. It was a bit of a fiasco on the 360 early on, since devs had to worry about systems with and without HDDs. I'm glad that developers eventually said "fudge it" and HDDs ended up being required for some games. I guess shipping with 4 GB of flash memory wouldn't be too expensive? The system needs to store its OS and updates on something anyway, unless they are hellbent on keeping the OS on its own separate silicon to protect it from piracy. Seems like a plausible proposition.
 
Power7's L3 is eDRAM

a) Look at the prices of Power7-based systems. There is a reason others aren't using it.

b) CPUs run latency-sensitive workloads, so enhancing bandwidth for such an architecture instead of the GPU is counterintuitive.

c) I am hoping it is a single-chip system with eDRAM mainly intended for the GPU. That would be cool. :)
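
Simple framebuffer math shows why (c) and the 32MB figure floating around would fit together. A quick C sketch, assuming typical 32-bit color and 32-bit depth/stencil formats (not known specs):

```c
/* How much eDRAM a GPU-side framebuffer would want at various targets.
 * The 32-bit color and 32-bit depth/stencil formats are common
 * assumptions, not confirmed Wii U specs. */
#include <stdio.h>

static double fb_mbytes(int w, int h, int msaa) {
    const int bpp_color = 4, bpp_depth = 4;       /* bytes per sample */
    double bytes = (double)w * h * msaa * (bpp_color + bpp_depth);
    return bytes / (1024.0 * 1024.0);
}

int main(void) {
    printf("720p  no AA : %5.1f MB\n", fb_mbytes(1280, 720, 1));
    printf("720p  4xMSAA: %5.1f MB\n", fb_mbytes(1280, 720, 4));
    printf("1080p no AA : %5.1f MB\n", fb_mbytes(1920, 1080, 1));
    printf("1080p 4xMSAA: %5.1f MB\n", fb_mbytes(1920, 1080, 4));
    return 0;
}
```

By that math, 32MB comfortably holds a 720p target with 4xMSAA, while 1080p with MSAA would need tiling, much like Xenos' 10MB did at 720p.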
 
a) Look at the prices of Power7-based systems. There is a reason others aren't using it.

b) CPUs run latency-sensitive workloads, so enhancing bandwidth for such an architecture instead of the GPU is counterintuitive.

c) I am hoping it is a single-chip system with eDRAM mainly intended for the GPU. That would be cool. :)
I don't think IBM uses eDRAM to work around bandwidth constraints. It seems to be about cache density in the case of the Power7 and a low-power cache in the PowerPC A2.
Here's a presentation about the PowerPC A2 where they explain what they do with the eDRAM in the L2.
 