Predict: The Next Generation Console Tech

Question is though, what exactly can they do that's new with PSMotion and Natal out? Unless they have thought control, their options seem limited.

It seems you haven't heard of the horse bag yet. :LOL:

In all seriousness, who knows? There's plenty of options for innovation still. They're just not all that obvious, for apparent reasons, if you see what I mean.
 
My apologies Shifty... I forgot to add my bit on the end for my own Wii-NEXT prediction:

Wii-NEXT
CPU: Xbox 360 level or slightly more powerful Quad Core beastie
GPU: Maybe something a little more powerful than the current 360 GPU
Control Method: A combined "Wii-Mote/Motion+"-Mote that gives 1:1 control and maybe adds some other new quirky innovation, without all the current Wii-Mote limitations.

Assuming they drop the archaic fixed-function architecture for their next console, I don't see the next Wii's GPU being anywhere near that weak. The very lowest they could go would be ATI's current integrated graphics part, and come 2012/13 that's already going to be on a level significantly above Xenos, one would assume.

Integrated graphics are only really one major revision away from being in the same sort of performance bracket as console GPU technology; the 3300/9400, for example, were a huge leap over the previous generation. So the two or three major revisions we can expect should produce some very interesting results.

It's not crazy to expect something around the level of the RV730 (but with an upgraded DX11-level architecture/shaders, of course) to be the standard for integrated graphics come the release of Nintendo's successor (in fact I'd say that's a pretty conservative estimate!). Strap 1GB of GDDR5 and a quad-core Xenon (or a dual-core OoO PowerPC chip?) to that, and you're going to be able to produce games on a level well above current console games, despite the hardware costing peanuts.

Why exactly would Nintendo insist on ATI creating a part below their very lowest performance chip? A chip already tailored for low power consumption and low-cost mass production? That sounds like the cheapest way to go to me, and it'd still provide one of the single biggest leaps between generations, plus be plenty powerful for a very robust software emulation solution for backwards compatibility as well.

Since the actual technology and architecture will be very similar to what's being deployed in the Xbox3 (just on a much smaller scale), downscaling a 1080p/60Hz game may be as simple as targeting 720p/30Hz with lower AA/AF, lower texture resolution, less precise lighting algorithms and a harsher LOD/draw distance. The average consumer's lack of interest in PC gaming proves they don't mind putting up with that sort of thing, and it'd be a much better solution for multiplatform development than what the Wii has forced. As long as the Wii2 is targeted from the beginning, it should be more than possible to create a decent version of your game for all three platforms this way. Nintendo benefits from more third-party support, and they still get to promote their low-cost/low-power-consumption/small-box philosophy.
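Just to put some rough numbers on the downscaling idea: dropping from 1080p/60Hz to 720p/30Hz alone cuts the per-second pixel load by 4.5x, before any AA/texture/LOD reductions even come into it. Illustrative arithmetic only, not a claim about any real hardware:

```python
# Back-of-envelope pixel budget for a 1080p60 -> 720p30 downscale.
def pixels_per_second(width, height, fps):
    return width * height * fps

hd_load = pixels_per_second(1920, 1080, 60)    # hypothetical Xbox3 target
wii2_load = pixels_per_second(1280, 720, 30)   # hypothetical Wii2 target
print(hd_load / wii2_load)  # 4.5
```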

I just don't see the benefits of any other approach. This is already pretty much the lowest of the low cost-per-unit wise, power consumption is low, the R&D budget is basically non-existent, and they don't have to forfeit a bunch of third-party support either. Everybody wins.


Just look at what ATI's current (well, actually over a year old) integrated GPU can manage at the sort of resolutions that are common on today's consoles (see the 790GX):

[benchmark chart: Half-Life 2: Episode Two]


[benchmark chart: Enemy Territory: Quake Wars]


It has built-in hardware scaling, so rendering at 1024x768 will produce very decent results on today's HDTVs. In 3-4 years we should be looking at a 4x increase in performance at the very least, and that's quite the leap above Wii.
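For what it's worth, that "4x in 3-4 years" figure is consistent with GPU performance doubling roughly every 18-24 months (an assumption here, not measured data):

```python
# Rough sanity check on the "4x in 3-4 years" estimate.
def perf_multiplier(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

print(perf_multiplier(3, 1.5))  # 4.0 with 18-month doublings over 3 years
print(perf_multiplier(4, 2.0))  # 4.0 even with slower 24-month doublings
```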
 
Also, for comparison's sake, here are graphics cards that are supposedly the equivalent of the PS3 and XB360, or at the very least in close proximity:

[benchmark chart]
 
I would like to see a next-gen architecture built around a many-core CPU (just about a given really), where the GPU design is much more focused on fillrate and pixel pushing than anything else.

Providing better APIs (kind of like Edge, but tighter/cleaner/more flexible) out of the gate for doing things like vertex and geometry processing via the CPU cores/SPUs/stream processors would also really help get us on our feet in the early stages. And if they can sort out a GPU setup that provides order-independent transparency, that'll be a big win too IMO.
 
This sounds like undoing the benefits of the last 4 years of GPU development (unification is a big deal), or maybe you're wishing for a kind of über PlayStation 2. Maybe we could see that in an alternate history where Bitboys and 3dfx are competing for the crown (wasn't it the former that had complete vertex flexibility at a very early stage?).
Throw in a 64x T-buffer/M-buffer that can enable stochastic sampling and advanced temporal anti-aliasing. Well, why not :).


I can't really believe in a 1-teraflop Cell for now. A console with a quarter teraflop of CPU and 3 teraflops of GPU might be better than a more complex one with 1 teraflop worth of SPUs + a 1.5-teraflop GPU (I assume a similar putative power budget, i.e. bullshit numbers).
The real many-core stuff would be for the generation after that, I believe, and would consist of a single chip (Larrabee-like, or a kind of Larrabee/GPU hybrid, with ROPs?).

of course my opinion is not worth much
 
It's also entirely CPU limited, with driver issues for the GeForce 7s (except mysteriously for that "G70 7900GTX", which suggests a mismatched driver).

At least the 790GX makes an impressive showing above; but such a chipset, or a better one, will suffer from a lack of bandwidth (assuming a stock desktop IGP getting it over HyperTransport?). I remember most GameCube and Wii games running at 60fps, and I think that's better for party games.
But I imagine something similar to your whole idea for a 2012/13 Wii: an AMD Fusion processor with DDR4.
 
A quick port will leave you very CPU limited on the PS3, and even on the X360; see Quake 4. Those were both engines made with a single-core PC in mind, and in the worst case and most derogatory way possible, a console CPU core could be considered like a 3.2GHz Atom. With an AMD K8 @ 2.3GHz and a 7600GT I felt very CPU limited. The Source engine is of the CPU-heavy kind; it reminds me of the Quake 3 engine: Left 4 Dead runs decently on Core 2 Duo laptops with weak GPUs.
 
This sounds like undoing the benefits of the last 4 years of GPU development (unification is a big deal), or maybe you are wishing for a kind of über PlayStation 2.

I'm talking about a *console* where more GPU silicon is dedicated to pushing as many pixels as possible, very fast. As far as the interface is concerned, it needn't move away from the shader models we see today, but with extended functionality, simplified transparency management (on the user side), and only basic vertex processing, since that can all be handled on the CPU's stream processing cores (à la Cell).

I just don't see the point of wasting GPU silicon on vertex processing or generalized cores for flexible load balancing when we'll likely have tens of CPU cores with massively wide vector units that would be more than capable of handling this kind of work, not to mention giving users the flexibility of tailoring their vertex processing to whatever software model they choose.

I just think it would be the most effective means of providing the most bang for your buck on a hardware level.
 
It's also entirely CPU limited, with driver issues for the GeForce 7s (except mysteriously for that "G70 7900GTX", which suggests a mismatched driver).

At least the 790GX makes an impressive showing above; but such a chipset, or a better one, will suffer from a lack of bandwidth (assuming a stock desktop IGP getting it over HyperTransport?). I remember most GameCube and Wii games running at 60fps, and I think that's better for party games.
But I imagine something similar to your whole idea for a 2012/13 Wii: an AMD Fusion processor with DDR4.

Well, there's no reason they couldn't have a small amount of fast on-chip memory for the framebuffer (no reason to ditch the eDRAM; it's proven a good solution for closed-box platforms). In fact it could be a pretty elegant solution, considering it would overcome the major performance limitation of current integrated graphics solutions.

Considering you get twice the bandwidth per clock with GDDR5 compared to DDR3, even a 64-bit bus to main memory would provide the same bandwidth as the current consoles have, and you'd probably be able to up the memory clocks quite a bit compared to the 360 as well. Such a solution could be as much as 50% better off than Xenos, which has proved itself mighty capable at 720p (which would be the target resolution). No doubt having twice the RAM is going to be a huge bonus as well. We're talking super-low-end parts here, but things are still looking good performance-wise IMO. Adding eDRAM to an off-the-shelf integrated GPU for 720p with 2xMSAA without tiling surely isn't going to require a bank-breaking R&D budget.
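The "64-bit GDDR5 matches the current consoles" claim checks out on paper: GDDR5 transfers 4 bits per pin per command clock versus GDDR3/DDR3's 2, so at the 360's commonly quoted 700MHz memory clock a 64-bit GDDR5 bus equals the 360's 128-bit GDDR3 bus. The clock figures below are purely illustrative:

```python
# Peak theoretical bandwidth = bus width x clock x transfers per clock.
def peak_bandwidth_gbs(bus_width_bits, clock_mhz, bits_per_pin_per_clock):
    bits_per_second = bus_width_bits * clock_mhz * 1e6 * bits_per_pin_per_clock
    return bits_per_second / 8 / 1e9  # bits/s -> GB/s

x360 = peak_bandwidth_gbs(128, 700, 2)  # Xbox 360: 128-bit GDDR3 @ 700MHz
wii2 = peak_bandwidth_gbs(64, 700, 4)   # hypothetical 64-bit GDDR5 @ 700MHz
print(x360, wii2)  # 22.4 GB/s each
```

And upping that hypothetical GDDR5 clock by 50% is exactly where the "50% better off than Xenos" figure comes from.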

Honestly, moving away from the specific implementation possibilities, the point is more to look at the projected performance of ATI's lowest-end parts. People underestimate what these things are capable of at console-level resolutions and IQ settings, and there's just no real benefit in investing R&D into creating something even lower end; these things are already optimised for low power consumption and low-cost mass manufacturing. The fact that AMD's very lowest-end 2012 parts should be significantly above what the current consoles are capable of means Nintendo's console should see a really nice increase in performance. There's just no way forward with their old fixed-function architecture.

I definitely do like the sound of a single-package "Fusion" CPU/GPU. Something like a 2GHz Athlon II X2 and a half-teraflop-region DX11 integrated GPU with 1GB of GDDR5 sounds like a really nice 720p system. Costs should be super low, the technology very mature, and power consumption very low. These new Athlon II CPUs pack quite a punch considering their low transistor count. That may indeed be cheaper than getting a separate CPU and GPU from IBM and ATI respectively.
 
I'm reading up on IBM's unveiling of their Power7 architecture. It looks like they managed to use eDRAM as L3 cache. It's a huge density win.
It could really help many-core architectures like Larrabee, as I guess L2 cache size makes quite a dent in the compute/memory ratio of the chip.
 
Looks like Rich Hilleman, the chief creative officer of EA, shares my sentiment that we will likely see a PS3.5 and a 360 ver. 2 before we see a PS4 and 720 (link).

Feel free to go ahead and tell him why he is wrong.
 
OK, add some more memory, keep the CPU and GPU architecture but up their specs in some cost-efficient manner.

I don't see the benefit of fracturing the userbase, to be honest. I'd imagine most devs would simply cater to the lowest common denominator anyway, and the extra improvements in the PS3.5 and Xbox 540 would go underused.

I think the guy from EA might even be referring to an Xbox 360 slim and PS3 slim repackaged and relaunched with the motion controllers. Or even a 360 slim and an even slimmer PS3 (maybe when they get RSX and Cell down to 32nm).

It would annoy me if they put out a PS3 and 360 with a tad more memory and an improved GPU. Plus, I'm not sure game devs would be all too happy about that either.
 
What sort of performance boost would people expect from an Xbox 360.5? I'll trot out my usual example of why I don't think the concept really works: the 360 version of GTA IV runs at a 20% resolution increase over the PS3 version, and a lot of the time with a 20% frame rate boost. But it is not a 20% better game. At what point do the numbers make a difference? 50%? 100%?
 
2GB of RAM would make a significant difference IMO. But I don't see it happening. The hardware buyers aren't interested IMO. Last gen carried on well past its technical best-before date. I certainly don't want to upgrade until I get a real upgrade. I guess if I could shell out £50-100 for a PS3 with uniform 60fps everything, I'd do it, but that's not a realistic option. And certainly not £200-300!
 
@Prophecy2k, grandmaster and Shifty Geezer

I don't think it matters how many X% better games look, because as grandmaster points out, we're approaching diminishing returns and people don't really care that much.

It will basically be all about marketing: add some dirt-cheap memory, a new motion controller, maybe a few extra shaders to the GPU, and voilà, you've got a console refresh that can live another five years.
Developers will be happy, as they can use the same code and just upgrade some assets and add some effects. The motion controller will be new, but hey, they've already got the Wii to support, so that work would be done anyway.

To reiterate, it's mostly about marketing. The EA guy confessed that EA spends twice as much on marketing a title as on its development budget. That hints that it's not the quality of a product that is most crucial for its success. Sony and MS will point to a number of improvements, and people will just be happy to know they're buying a better product at roughly the same price, and that's it.
 
It will basically be all about marketing: add some dirt-cheap memory, a new motion controller, maybe a few extra shaders to the GPU, and voilà, you've got a console refresh that can live another five years.
I'm not sure extra shader ALUs or a higher-clocked rendition of Xenos would make as much difference as a "fixed" Xenon.
 