Predict: The Next Generation Console Tech

Idea for next-gen power consumption: according to the AnandTech and Guru3D GTX 690 reviews, it consumes well under 300W during gaming (260-270W across both reviews), and that's clocked at 915MHz with four gigabytes of GDDR5 which, of course, is mirrored, but still consuming power.

-Underclock the core, shader, and memory clocks further
-Halve the RAM
-Account for board power consumption, etc.

I can see a 200W, 3072-core part fitting inside a console if the thermal design of said console is raised to 250-300W. The desktop GTX 680 or 670 is already rumored to be in the next high-end notebooks, though the core will be binned and of course underclocked.

With 18+ months before launch and the 20nm process taping out at TSMC/GF in Q4'12, I don't believe it's unrealistic to look at what's currently on the market, both in terms of PC hardware and technology, to make very close predictions about the hardware and technology that we'll see in a 2013 console.
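
If you run the numbers, dynamic power scales roughly with frequency times voltage squared, so an underclock plus undervolt buys a lot of headroom. A quick back-of-envelope sketch in Python; every factor below is my own assumption rather than a measured figure:

```python
# Back-of-envelope scaling: dynamic power ~ frequency * voltage^2.
# Every number below is an assumption for illustration, not a measurement.

gtx690_gaming_watts = 265.0   # midpoint of the 260-270W review figures

freq_scale = 0.85             # assume ~15% underclock on core/shader/memory
volt_scale = 0.92             # assume a matching ~8% undervolt

scaled = gtx690_gaming_watts * freq_scale * volt_scale ** 2
ram_savings = 15.0            # guess for halving the mirrored 4GB of GDDR5
board_savings = 10.0          # guess for a trimmed console board (VRMs, I/O)

estimate = scaled - ram_savings - board_savings
print(f"Rough console GPU power: {estimate:.0f}W")   # ~166W
```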
 
Is there a possibility of an APU with stacked memory as a 3D IC? It could be interesting to have a 'next-gen' APU with 1GB or more of fast stacked RAM. Make it two for better heat dissipation and yields, connect them to 8GB of SSD-like storage for streaming, include at least a 250GB HDD, and there's your PS4.

I don't think they'll throw some low-power crap at us; their CTO sounded far more ambitious in a (not so) recent interview. AMD and Sony can go completely custom: they can make whatever APU configuration they want and design it not to exceed 100W, for example. It shouldn't be a problem to have two 100W chips in a console.
 
For that, they'll need a memory standard for stacking, and I've only seen them mention the low-power Wide I/O, which is targeting phones/tablets; it's not exactly fast at 12.8GB/s per chip @ 512-bit. Hopefully a higher-power version will arrive just in time ;)

EDIT: I spoke too soon:
http://eda360insider.wordpress.com/...ram-to-1-tbitsec-through-standards-evolution/
"Wide I/O JEDEC standards are in the works that target Tbit/sec data rates for high-performance requirements. These new standards will also explicitly support 2.5D and 3D memory stacks."
 
Gaffer put a 1000MHz 7850 in his X51

[image]

http://www.neogaf.com/forum/showthread.php?p=37538289#post37538289
 
If MS or Sony go with a high-end CPU and GPU next gen and about 4GB of GDDR5, will that be enough to achieve CGI-quality graphics in real time (aka Toy Story)?
 
If MS or Sony go with a high-end CPU and GPU next gen and about 4GB of GDDR5, will that be enough to achieve CGI-quality graphics in real time (aka Toy Story)?

No. You need this:
[image: render farm]

[image: Weta Digital's server racks]


With a cooling system like this:
[image: Weta Digital's water-cooled servers]


To achieve CGI quality. And it still wouldn't be in realtime.
 
Gaffer put a 1000MHz 7850 in his X51

Has he tried running it at full load for 36 hours? IIRC, that's what MS does for part of the random QA, but I'd be more curious to know the temperatures and fan speed/noise for the system at that point, not if the machine fails. Personally, I can be gaming for 8-12 hours straight (on weekends).
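
For anyone wanting to try that kind of burn-in at home, logging temperatures under a sustained load is trivial to rig up. A minimal sketch for Linux, assuming the usual sysfs thermal path (it varies by machine, and you'd run the actual load, a game or FurMark, separately):

```python
import time

# Minimal burn-in temperature logger for Linux; run your load separately.
# The sysfs path below is the common default but varies between machines.
THERMAL = "/sys/class/thermal/thermal_zone0/temp"
DURATION_S = 36 * 60 * 60     # the 36-hour figure from the post above
INTERVAL_S = 60

start = time.time()
while time.time() - start < DURATION_S:
    with open(THERMAL) as f:
        celsius = int(f.read().strip()) / 1000.0   # sysfs uses millidegrees
    print(f"{time.time() - start:8.0f}s  {celsius:5.1f}C")
    time.sleep(INTERVAL_S)
```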
 
No. You need this:
To achieve CGI quality. And it still wouldn't be in realtime.
It's an interesting question, what could be achieved in realtime. Toy Story was very simplistic. Detail would have to be reduced, and we probably wouldn't get the IQ of CGI, but the general feel might be obtainable.

If you think about it, texture detail could be megatextured, so no problems there. Lighting is mostly baked or simplistic brute force (lots of placed light sources); a high-end GPU could use a simple GI estimation for a similar look. It's poly counts that I think would pose the most trouble, but maybe with displacement maps and tessellation the models would work just fine?
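
To illustrate the displacement idea: keep a coarse base mesh, let the hardware tessellate it, and offset each generated vertex along its normal by a height read from a texture, so silhouette detail doesn't need dense source geometry. A toy CPU-side sketch; the grid size and height function are invented for illustration:

```python
import numpy as np

# Toy displacement mapping: tessellate one coarse quad into an N x N grid,
# then push vertices along the surface normal by a sampled height.
# A real GPU would do this in the tessellation/domain shader stage.
N = 64                                   # tessellation factor (assumed)
u, v = np.meshgrid(np.linspace(0, 1, N), np.linspace(0, 1, N))

# Stand-in "displacement map"; any grayscale texture would do here.
height = 0.1 * np.sin(u * 8 * np.pi) * np.cos(v * 8 * np.pi)

# The quad is flat with normal (0, 0, 1), so displacement is just +Z.
verts = np.stack([u, v, height], axis=-1).reshape(-1, 3)
print(f"{verts.shape[0]} vertices from one coarse quad, "
      f"peak displacement {np.abs(height).max():.3f}")
```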

However, no game is going to want to look like everything's made out of plastic (except a Toy Story 1 game). I doubt anyone will try to recreate Toy Story in realtime when graphics features have moved on so far.
 
Three memory pools!?
Ooooh cool :smile: Maybe they added a large, low-speed scratch area?
I'm curious about the box enclosing the memory with the GPU; could that be the system LSI? They did use the terms internal versus external.
 
Ooooh cool :smile: Maybe they added a large, low-speed scratch area?
I'm curious about the box enclosing the memory with the GPU; could that be the system LSI? They did use the terms internal versus external.

According to the patent document, the VRAM serves as a storage buffer for polygons and textures. Of course, framebuffers are textures too, and the document also cites the TV and terminal images being stored in VRAM.

The internal main memory does indeed appear to be a general purpose pool, handling data from the optical drive (program data, textures, audio for the DSP) and from the controller.


edit: it actually does sound like the Wii setup, doesn't it? :p
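
To make the three-pool reading concrete, here's a toy map of what the patent assigns where. The VRAM and internal-pool contents come from the document; the external pool's scratch role is just the earlier guess in this thread, and every capacity is invented:

```python
# Toy model of the three pools as the patent describes them. All sizes
# are invented; the document doesn't give any capacities.
pools = {
    "vram": {
        "size_mb": 32,
        "holds": ["polygons", "textures", "TV framebuffer",
                  "terminal framebuffer"],
    },
    "internal_main": {
        "size_mb": 64,
        "holds": ["program data (optical drive)", "audio for the DSP",
                  "controller input"],
    },
    "external_main": {
        "size_mb": 512,
        "holds": ["bulk assets / scratch (a guess)"],
    },
}

for name, pool in pools.items():
    print(f"{name:13s} {pool['size_mb']:4d}MB  {', '.join(pool['holds'])}")
```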
 
Has he tried running it at full load for 36 hours? IIRC, that's what MS does for part of the random QA, but I'd be more curious to know the temperatures and fan speed/noise for the system at that point, not if the machine fails. Personally, I can be gaming for 8-12 hours straight (on weekends).

Well, of course he hasn't. I hate to give ammunition to the haters, but he did mention his CPU was running hot: 55C and 85C at load, I think. Heck, I tried overclocking my Q6600 the other day and it shut off at 70C. I didn't know CPUs could go above ~70C.

But he seems like a total newb, so there's little telling. Some commenters were wondering if he applied his CPU thermal paste correctly, for example, since he'd never done it before (and for some reason he redid the thermal paste manually).

Heck, for that matter, measuring CPU temps with software is an inaccurate science, so he may simply be using the wrong program too.

Anyway, I doubt the change in video card would affect things that much, so I'd have to wonder if other X51 CPUs run hot. I assume not, since no GAF commenters brought it up.

Thinking about it seriously, if you dropped a Pitcairn/7870 in a console, you would probably have to underclock it to 800MHz.
 
A new revision of the Wii with support for the Wii U controller? The console in those diagrams is a 1:1 Wii and doesn't resemble those proto-Wii Us at all.

Maybe they just want to protect the controller atm, and they don't want to disclose Wii U hardware yet, so they just used the Wii for patent purposes. Is that feasible/legal? Anyone here well versed in patent law? So many questions. :oops:

Anyway, the patent also shows a smaller terminal unit:
[image: patent diagram of the smaller terminal unit]
 
Well, of course he hasn't. I hate to give ammunition to the haters, but he did mention his CPU was running hot: 55C and 85C at load, I think. Heck, I tried overclocking my Q6600 the other day and it shut off at 70C. I didn't know CPUs could go above ~70C.

I think ~95C is the thermal cutoff on Sandy Bridge i7s. I run mine at 4.7GHz, which at full load is ~73C and still considered 'safe' in the overclocking community.
 