Predict: The Next Generation Console Tech

BTW, look at how versatile the PowerPC A2 architecture seems to be:
http://www.hpcwire.com/hpcwire/2011-08-22/ibm_specs_out_blue_gene_q_chip.html
http://www.theregister.co.uk/2011/08/22/ibm_bluegene_q_chip/

18 cores, 1.6 GHz, 55 W, 360 mm^2 and 205 GFLOPS DP at 45nm.

An 8-core design at 32nm seems perfect for a next-generation console! 32 threads, tiny, and not power hungry.

Oh wow, IBM are still using their 45nm SOI process for their latest supercomputer stuff. Maybe the WiiU isn't just using it because it's mature and cheap. Are IBM actually mass producing anything on 32nm yet?
 
Honestly, with smartphones already having 1GB of RAM, and 2GB and 4GB just a matter of the near future, you'd think that consoles would have to demonstrate a discernible advantage if they're to last. And remember, they will come out 3 years from now... it would be embarrassing if a high-end smartphone had more room for game data than a new console at launch.

Of course RAM isn't everything; processing power is still #1, but after processing power, RAM is the first bottleneck. My point is that halving (or worse) the RAM size to save some tiny fraction of the overall cost is a very bad idea.

At least the PS Vita engineers understood this when they fought off the (management's?) idea of cutting the RAM from 512MB to 256MB.

EDIT: fixed sentence structure.

Actually, I'm expecting them (or at least the next Xbox) in late 2013, which means the spec freeze is less than 2 years away.
And you can also see that the PS Vita, which is made for games, has less memory than a high-end general-purpose smartphone.
 
When you mention WDDM overhead are you sure you're not confusing games with CUDA applications? ;)

http://blogs.msdn.com/b/e7/archive/2009/04/25/engineering-windows-7-for-graphics-performance.aspx

Microsoft said:
For Windows 7, we have worked closely with our Graphics IHV partners, helping them improve the WDDM drivers’ gaming performance with specific changes to how Windows 7 works under the hood, while maintaining the same driver model and compatibility. Our continued investments in performance tools has helped us and our IHV partners track down and analyze various gaming performance bottlenecks and fix them in subsequent driver releases. The fundamentals of the Windows Display Driver Model remain unchanged in Windows 7. Some policies around GPU scheduling and memory management were changed to enable better performance in certain scenarios.
 
Hmm, first 3D, then motion and now VR? Looks like they're straining to come up with gimmicks for people to buy new consoles.
 
At least the PS Vita engineers understood this when they fought off the (management's?) idea of cutting the RAM from 512MB to 256MB.
That's unfounded speculation. Sony have gone on record as saying there never was any such intention.

As for smartphone RAM, that's different, as they keep loads of apps open at the same time. Same as a PC - you need lots of local storage for active tasks. But when using one task at a time, a gigabyte is a huge amount of data, especially factoring in compression, which is used as much for performance gains as for increasing available space.

4GB would serve next-gen very well. 2GB of very fast RAM would probably do as well, coupled with some decent storage solutions (think a fast flash-storage disk cache). 8GB would probably be an excess, though it could be considered for multitasking rather than threatening developers with scary costs to fill 8GB with content. If 4GB isn't deemed enough, it's more likely we'd have something like 4GB of fast RAM plus some smaller, faster VRAM, or eDRAM for the framebuffer ops that cost a lot, maybe. Massive clumps of slow DDR3 won't be much benefit.

And you also identify why keeping RAM parity with PCs is a fool's errand - because the processing power will become outdated compared to a standard PC config within a couple of years, and that'll impact final visuals as much as a lack of RAM. Spending money on extra RAM that won't let you hold your own against PCs will be money down the drain.
 
Well, put those massive clumps of slow DDR3 RAM behind a 256-bit memory controller, and even at 1600MHz that gives a good 51.2GB/s. Put in a 512-bit memory controller and you can have 16GB of DDR3 at 102.4GB/s for a relatively low long-term cost...
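For the curious, the arithmetic behind those figures (a quick sketch; the clocks and bus widths are just the ones mentioned above):

```c
/* Peak theoretical DDR3 bandwidth: transfer rate (MT/s) x bus width (bytes).
 * DDR3-1600 runs at 1600 MT/s (800 MHz I/O clock, double data rate). */
#include <stdio.h>

static double peak_bandwidth_gbs(double mega_transfers_per_s, int bus_width_bits)
{
    return mega_transfers_per_s * (bus_width_bits / 8.0) / 1000.0;
}

int main(void)
{
    printf("DDR3-1600, 128-bit: %6.1f GB/s\n", peak_bandwidth_gbs(1600, 128)); /*  25.6 */
    printf("DDR3-1600, 256-bit: %6.1f GB/s\n", peak_bandwidth_gbs(1600, 256)); /*  51.2 */
    printf("DDR3-1600, 512-bit: %6.1f GB/s\n", peak_bandwidth_gbs(1600, 512)); /* 102.4 */
    return 0;
}
```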
 
4GB DDR3 and 1GB GDDR5, both 128-bit.

That would work well enough: DDR3 bandwidth is ample for the CPU, and next-gen GPUs will support a unified address space, so split pools are starting to make quite a lot of sense.

Memory can always store texture data, level content, higher-quality sound effects and music, and will always get filled through caching, laziness or formerly unreasonable waste even if the developers can't manage to fill it deliberately.

You could waste 1GB of memory on hungry AI algorithms, in a way you can't even afford (for now) on PC.
 

It seems reasonable to me, because we would have a balance between cost, space and room for future graphics development (shaders, textures, or whatever): roughly 25-27GB/s from 128-bit DDR3-1600 as main RAM, coupled with 128-bit VRAM as well (60-70GB/s?), with enough size and speed for 1080p.

(But tablets, cell phones, etc. could maybe reach this in less than 2 years...)
 
It would give you a nice amount of memory and bandwidth, but you'd ultimately be stuck with two 128-bit buses and a minimum of 8 memory chips (plus more complex board designs and a larger minimum configuration when it comes to die shrinks/combining). That could put it at a significant cost disadvantage against a system with a single 128-bit bus.

I really like the 360's design; it's just lacking a few more megabytes of embedded memory. I'd like to see an evolution of that approach for the next Xbox. If the PS4 used PowerVR graphics, it could possibly make do with just a 128-bit bus. All very shrink- and cost-reduction-friendly, in theory.
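To put rough numbers on those "few more megabytes", here's a sketch of what colour + depth cost at various resolutions (assuming 32-bit colour and 32-bit depth/stencil with no compression, which is my simplification rather than the 360's exact formats):

```c
/* Rough eDRAM sizing: bytes needed for colour + depth/stencil at a given
 * resolution and MSAA level, assuming 32-bit colour and 32-bit
 * depth/stencil and no compression. 10MB forces tiling well before 1080p. */
#include <stdio.h>

static double framebuffer_mb(int width, int height, int msaa_samples)
{
    const int bytes_per_sample = 4 /* colour */ + 4 /* depth/stencil */;
    double bytes = (double)width * height * msaa_samples * bytes_per_sample;
    return bytes / (1024.0 * 1024.0);
}

int main(void)
{
    printf("720p,  no AA: %5.1f MB\n", framebuffer_mb(1280, 720, 1));   /* ~7.0  */
    printf("720p,  4xAA:  %5.1f MB\n", framebuffer_mb(1280, 720, 4));   /* ~28.1 */
    printf("1080p, no AA: %5.1f MB\n", framebuffer_mb(1920, 1080, 1));  /* ~15.8 */
    printf("1080p, 4xAA:  %5.1f MB\n", framebuffer_mb(1920, 1080, 4));  /* ~63.3 */
    return 0;
}
```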
 
Oh, and then this.

Any app I like can have 99%+ CPU time and way more than 480MB RAM that nothing else dares touch while I'm busy with it. It's called thread priority. If necessary, a thread with high priority can put almost everything else on hold and kick other stuff out of the RAM into the paging file on the HDD.

And, miraculously, when the app is done, all the other stuff comes off the HDD back into RAM and works quite nicely without interruption or restarting.

The "almost all the GPU" is just plain funny. When I'm running a game, I *know* that this game has 100% of the GPU. No "almost" about it :)
function already debunked your assertions, but I would like to point out that the Windows Task Manager does not include task-switching overhead, so that "99%" is misleading. Since it's a pre-emptive OS, processes do get checked to see if they have work to do, and that involves clearing the CPU pipeline, switching the registers, and other process-switching work. Most system processes will have a higher priority than any user process, so no matter how high you set your game, it still has to share the CPU with dozens of other processes and drivers. On the driver side, the generalization that WDDM and other driver models provide adds another layer of inefficiency that doesn't exist on the consoles. User-mode/kernel-mode switching is horribly expensive, and adds even more inefficiency.

Your PC is hugely more powerful than a console; that isn't a surprise. People were building PCs that could outperform the console even when the console launched. That was the point of my post: the consoles shouldn't be thought of as "cutting edge" so much as "as cutting edge as costs allow". Pretty much every PC game nowadays looks and plays better than on the consoles; I'm not arguing that isn't the case. What I'm saying is that with essentially 8-10x the "power", the games aren't 8x better, even ones made purely for the PC, and that has to do with the inefficiencies of the PC architecture itself and the requirement of supporting a heterogeneous environment that doesn't allow for as much tuning.
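To make the priority point concrete, the trick being described amounts to roughly this on Windows (a minimal sketch using the standard Win32 calls; even at these settings the game still shares the CPU with higher-priority system threads, drivers and DPCs, which is exactly the inefficiency described above):

```c
/* Minimal sketch of raising process/thread priority on Windows. Even at
 * HIGH_PRIORITY_CLASS the game still competes with kernel threads, drivers
 * and other system work - it never truly owns 100% of the CPU. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Raise the whole process above normal desktop applications. */
    if (!SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS))
        fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());

    /* Raise the main (game/render) thread within the process. */
    if (!SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_HIGHEST))
        fprintf(stderr, "SetThreadPriority failed: %lu\n", GetLastError());

    /* ... game loop would run here ... */
    return 0;
}
```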
 
I suppose there's a legacy mentality regarding consoles, where they originally were better than PCs because they had custom hardware designed for throwing around sprites and scrolling backgrounds that the PC couldn't compete with. That was of course more power through specialisation rather than anything clever, and a 286 PC could whip a Z80-based console hollow in the processing stakes even if its games sucked. Come 3D, though, progress in the PC space has meant consoles, built around the same principles of rendering polygons, cannot compete.

There's possibly a cost advantage, especially with loss-leading hardware, where at launch the consoles have given you more game for your money than a more expensive PC of the time could deliver. The cost of a PS3 at its launch got you Uncharted 2 visuals, whereas I dunno what that same £300 could have bought in a PC, but I'm certain it wouldn't be comparable. Still, the days of consoles being flagship hardware are over, especially with replaceable PC graphics cards allowing very affordable upgrades and accelerating the pace of graphics hardware.

Consoles are now the place to get original experiences. EyeToy, SingStar, Wii and Kinect were born on the console, and future progress in game styles and genres will be happening there and on tablets, not on Windows. Those who only care about the numbers should stick to PC gaming. ;)
 
Win7 is a bad OS for Real Time Systems (for a whole host of reasons).
Linux quite honestly isn't very much better in most of its variations.

Consoles are primarily designed to run a single real time application, on a single hardware configuration, that places entirely different constraints on what in the PC space would be considered drivers, and on the application. As a result console titles will in general get more out of the same hardware.

Shifty is right: in terms of pure hardware, consoles haven't been superior in any real way to cutting-edge PCs since the PC space started driving the 3D graphics market.

But then I don't believe that consoles are competing with cutting-edge PCs.
 
Win7 is a bad OS for Real Time Systems (for a whole host of reasons).
Linux quite honestly isn't very much better in most of its variations.

You do not really want real time for games; you want maximum throughput with minimal latency for ONE app.
 
Games are basically soft real-time systems.
i.e. you need the underlying "OS" to behave predictably, or at least within some known constraints.
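To illustrate what "soft" means here: each frame has a budget, and missing it only costs a dropped frame rather than a hard failure. A minimal sketch (standard C11; the 60Hz budget and frame count are just illustrative):

```c
/* Soft real-time frame loop sketch: measure each frame against a 16.7ms
 * (60 Hz) budget and count missed deadlines. A miss is a dropped frame,
 * not a system failure - that is the "soft" part. */
#include <stdio.h>
#include <time.h>

static double now_ms(void)
{
    struct timespec ts;
    timespec_get(&ts, TIME_UTC);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

int main(void)
{
    const double budget_ms = 1000.0 / 60.0;
    int missed = 0;

    for (int frame = 0; frame < 600; ++frame) {
        double start = now_ms();

        /* simulate + render would go here */

        if (now_ms() - start > budget_ms)
            ++missed;  /* deadline missed: visible hitch, nothing worse */
        /* a real loop would now wait (vsync/sleep) for the next frame */
    }

    printf("missed %d of 600 frame deadlines\n", missed);
    return 0;
}
```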
 
Interview: id Software's John Carmack

DC: If you could give Sony and Microsoft each a wish list of what you’d like to see in each of their respective next-gen platforms…
JC:

So one of the most important things I would say is a unified virtual 64-bit address space, across both the GPU and the CPU. Not a partition space, like the PS3. Also, a full 64-bit space with virtualization on the hardware units - that would be a large improvement. There aren’t any twitchy graphics features that I really want; we want lots of bandwidth, and lots of cores. There’s going to be a heterogeneous environment here, and it’s pretty obvious at this point that we will have some form of CPU cores and GPU cores. We were thinking that it might be like a pure play Intel Larabee or something along that line, which would be interesting, but it seems clear at this point that we will have a combination of general purpose cores and GPU-oriented cores, which are getting flexible enough that you can do most of the things that you would do on a CPU. But there are still plenty of things that are much better done with a traditional CPU core, debugger and development environment. I will be a little surprised if there’s any radical departure from that. I hope neither of them mess that up in some fundamental way. I’m very interested to see what the next gen consoles look like, if they’re even going to have optical media or if they try to strike out without it. Those are the types of big decisions that I wouldn’t want to be in the position of making because they’re billion dollar effects. But this generation, I know most executives were surprised at what the attach rate was on this current generation of consoles.

Rest of the interview:
http://www.tomsguide.com/us/Interview-John-Carmack-id-Software,review-1686-3.html
 
When he says a unified virtual 64-bit memory space, does he mean using all the memory available (i.e. HDD, SD cards, etc.) directly as one memory unit?
 
HDD/SD cards are storage, not memory. Even though you can do things like mmap.

And even then, that's not what he's talking about.
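For context, mmap only makes storage addressable as if it were memory - the data still has to be paged in off the device on demand. A minimal POSIX sketch (the file name is hypothetical):

```c
/* Map a file into the address space and read a byte through a pointer.
 * The bytes still live on storage and are paged in on demand, which is
 * why mmap'd storage is not the same thing as RAM. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    const char *path = "assets.bin";               /* hypothetical data file */
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    unsigned char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { perror("mmap"); return 1; }

    printf("first byte: 0x%02x\n", data[0]);       /* page fault pulls it off storage */

    munmap(data, st.st_size);
    close(fd);
    return 0;
}
```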
 