Predict: The Next Generation Console Tech

Status
Not open for further replies.
I'm sorry, but I'm still seeing no evidence for these claims. Console manufacturers have more than just perf/mm and perf/watt to consider. Microsoft went with Intel for the first Xbox and didn't regret it at all from a power point of view. It was the lack of IP ownership that hurt them.
And that's the way it is with Intel. That's their deal. It has problems now as it did then.

Again, what's the evidence for this? What's the metric you are using to measure performance? How relevant is that to a games console? I find it very hard to believe that a PPC of similar power draw to, say, an ARM CPU would perform as well. And similarly, I'd like to see the PPC that's the same size as a Sandy Bridge part offering anywhere near the performance.
That's easy. Power7. Taking the process (45nm vs 32nm) into account, I don't think many would argue x86 superiority very vigorously.

Because I think most people agree the PPC cores in Xenon and Cell suck pretty badly.
Compared to the single core P4s that Intel had to offer at the time of shopping around? At what price? Given the state of competition at the point of decision making, I think it is pretty clear that both Xenon and Cell are quite performant, given their respective constraints. Comparing them to later products, even neglecting IP ownership limitations, doesn't make sense. Ask the developers here if they would prefer to change back to an old P4 for performance reasons. I don't think you will see many hands raised.
 
Anyway, you don't need graphs to see the truth: ALL console manufacturers used IBM PPC last gen, and ALL consoles will use it again this gen. Outside of desktop, where other things come into play such as software, POWERPC makes total sense; else why would they all use it?

From a performance per watt perspective, and performance per mm of die, POWERPC dominates; I think only MIPS beats it, along with this new UPU uarch from ICUBE.
Someone should tell Apple, since their stated reason for moving from PPC to Intel was that Intel was producing about twice the perf/watt. Also proven by Anandtech in their G5 -> Core Duo comparison. Both have improved significantly since then, and I don't know who is currently in the lead (for once my google... er.. bing-fu was insufficient to the task)
 
Someone should tell Apple, since their stated reason for moving from PPC to Intel was that Intel was producing about twice the perf/watt. Also proven by Anandtech in their G5 -> Core Duo comparison. Both have improved significantly since then, and I don't know who is currently in the lead (for once my google... er.. bing-fu was insufficient to the task)

Apple didn't switch to Intel until the "Core" generation of products.
 
Apple didn't switch to Intel until the "Core" generation of products.
I got the impression French Toast was implying POWER was _still_ the best perf/watt.

Sure, at the time of the 360's release the PPC was an excellent choice, although they had to give up things like 4GHz and OOO promised by IBM at the start of the project (according to some accounts).
 
Though unlikely, what about a 22nm dual-core Intel IVB or SB?

Dual-core, four threads. You get far superior performance per watt and flexibility.

A Core i3 2100 uses ~30 W under full load, sans the iGPU.
 
Wouldn't the decision between a GPU-heavy/CPU-light design and the moderate CPU you ask for come down to the projected console life?
Dunno. If I was making the decision, I'd worry more about launch period and whether a very GPGPU-centric design could get any decent content. If it's too hard to write to from scratch, your platform will be killed by poop software. It makes more sense to have another generation of normal coding with GPGPU resources that devs can gradually learn to use, so next-gen it's natural for them.

Of course my opinions are being influenced by a doubt over the value of a long term console any more. I can envisage shorter life machines, released with lower specs, sold for lower at launch, and upgraded with forwards compatibility, like iOS.
 
Shifty Geezer said:
Of course my opinions are being influenced by a doubt over the value of a long term console any more. I can envisage shorter life machines, released with lower specs, sold for lower at launch, and upgraded with forwards compatibility, like iOS.

So you've changed your mind about the importance of backwards compatibility, then? Or is it just this next generation transition you're concerned about?
 
http://www.gamesindustry.biz/articles/digitalfoundry-in-theory-can-wii-u-offer-next-gen-power

Digitalfoundry causing global outrage on the Internets with Wii U predictions.

But they are right. There is no way the Wii U comes anywhere close to the 360S's TDP without sounding like a hoover, worse than the launch 360. I predict very low clocks.

They aren't right at all. Not only does the article ignore the fact that the WiiU won't have a full-size optical drive or hard drive (which take up well over 50% of the 360S design), but whoever wrote it doesn't seem to understand that technology improves over time. The 360 is a 2004/2005 design; in 2011/2012 a system can be built in a smaller case, at a lower cost, and still be significantly more powerful.

Technology sites seem to be going down the drain at the moment. I was under the impression that digital foundry was supposed to be a top technology site, obviously not because that article is horrendous.
 
Technology sites seem to be going down the drain at the moment. I was under the impression that digital foundry was supposed to be a top technology site, obviously not because that article is horrendous.

Are you still going to be saying that when, inevitably, the Wii U turns up 0-30% more powerful than the PS360? Eh, well, I'm sure you'll just say "well, I was right on the principles, Nintendo just didn't take advantage".

I feel like anti-sigging your sig: "GC didn't really have all 3 because it wasn't very powerful". Okay, to be less trollish: in truth it wasn't particularly small nor particularly powerful, a mix of average size and power.

Heck, I just did a little googling, and the slim PS2 had less than 1/3 the volume of the GC.
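For what it's worth, the volume comparison can be sanity-checked against commonly cited case dimensions. The figures below are approximations from spec sheets, not from this thread, and they put the ratio at roughly a third:

```python
# Rough volume check using commonly cited case dimensions in mm.
# These figures are approximations, not confirmed measurements.
ps2_slim = (230, 152, 28)    # slim PS2 (SCPH-70000 series)
gamecube = (150, 161, 110)   # GameCube (DOL-001)

def litres(dims_mm):
    w, d, h = dims_mm
    return w * d * h / 1_000_000  # mm^3 -> litres

ratio = litres(ps2_slim) / litres(gamecube)
print(f"PS2 slim: {litres(ps2_slim):.2f} L, "
      f"GameCube: {litres(gamecube):.2f} L, ratio: {ratio:.2f}")
```

That comes out around 0.98 L vs 2.66 L, so "roughly a third" is about right for these assumed dimensions.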
 
Dunno. If I was making the decision, I'd worry more about launch period and whether a very GPGPU-centric design could get any decent content. If it's too hard to write to from scratch, your platform will be killed by poop software. It makes more sense to have another generation of normal coding with GPGPU resources that devs can gradually learn to use, so next-gen it's natural for them.

Of course my opinions are being influenced by a doubt over the value of a long term console any more. I can envisage shorter life machines, released with lower specs, sold for lower at launch, and upgraded with forwards compatibility, like iOS.

Perhaps, but in that case, with all devs at launch building launch games that use the GPU 100% for rendering and graphics, won't taking away from that resource to do GPGPU make their games look worse? Especially if the overall silicon budget for both chips is as meagre as the rumours that are flying about. Surely the GPUs will be choked up trying to push all the graphical effects that devs intend to meet their grand visions for their "next-gen" games. Surely they'll simply continue pushing graphical effects with the GPU, and use the CPU for all their general purpose game code throughout the life of the console, since graphics have been shown to sell games, not anything that could conceivably be done by GPGPU.

I personally only see a very limited use for GPGPU in next gen consoles. I just can't see how it would be worth the effort for most developers.

However, I'd like to be wrong. And I'd actually like to see a greater emphasis on physics and simulation in next-gen games in general, whether that's made possible by GPGPU or otherwise. I'd very much like a next-gen Red Faction Guerrilla, or a GTA with fully destructible environments.
 
They aren't right at all. Not only does the article ignore the fact that the WiiU won't have a full-size optical drive or hard drive (which take up well over 50% of the 360S design), but whoever wrote it doesn't seem to understand that technology improves over time. The 360 is a 2004/2005 design; in 2011/2012 a system can be built in a smaller case, at a lower cost, and still be significantly more powerful.

Even if you double performance per watt, once you take into account the reduction in wattage, it's going to be tough putting something several times faster than the 360 in there.

Anyone got a measure of the 360S fan size? All I can find are pictures.
 
Right, it appears the 360S has a 92mm fan (or possibly 80mm, but probably 92mm). Fan area alone is likely more than 3 times that of the ~5cm fan the WiiU appears to have. The airflow difference at the same rpm is probably higher still.

I don't know enough about cooling to say a lot, but I think Nintendo are going to need a seriously huge increase in perf/watt (like several times) if they want to use the current WiiU setup while having several times the processing power of the 360 (or whatever the current rumour is). Or am I getting something terribly wrong here?
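The fan-area claim above is easy to check by treating each fan as a simple disc of the stated diameter (the ~5cm WiiU figure is this thread's estimate, not a confirmed spec):

```python
import math

def fan_area_cm2(diameter_mm):
    """Swept area of a fan, approximated as a disc of the given diameter."""
    r_cm = diameter_mm / 2 / 10  # mm -> cm radius
    return math.pi * r_cm ** 2

xbox360s = fan_area_cm2(92)  # 92 mm fan (possibly 80 mm)
wiiu     = fan_area_cm2(50)  # ~5 cm fan, as estimated above

print(f"360S: {xbox360s:.1f} cm^2, WiiU: {wiiu:.1f} cm^2, "
      f"ratio: {xbox360s / wiiu:.2f}")
```

The ratio is (92/50)² ≈ 3.4, which matches the "more than 3 times" estimate; with an 80mm fan it would drop to about 2.6.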
 
I personally only see a very limited use for GPGPU in next gen consoles. I just can't see how it would be worth the effort for most developers.
You can use GPGPU for rendering as well. On a PC, certain rendering techniques can be done much more efficiently with DirectCompute than through the DirectX graphics pipeline. I'm not sure how relevant that is to consoles, as they won't be making use of DirectX, but I can imagine that having a reasonably GPGPU-friendly GPU in a console can help you get better graphics.
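To make the compute-for-rendering point concrete, here is a toy sketch of tiled light culling, one of the techniques often cited as a DirectCompute win over the classic graphics pipeline. This is plain Python purely to show the data flow; the tile size and light format are illustrative assumptions, and a real GPU version would run one compute thread group per tile with per-tile depth bounds:

```python
# Toy sketch of tiled light culling: split the screen into tiles and build a
# per-tile list of the lights that can affect it, so the shading pass only
# loops over nearby lights instead of every light in the scene.
TILE = 16  # pixels per tile side (illustrative choice)

def cull_lights(width, height, lights):
    """lights: list of (x, y, radius) in screen space (a big simplification)."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    grid = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (lx, ly, r) in enumerate(lights):
        # Conservative bound: every tile overlapped by the light's bounding box.
        x0, x1 = max(0, (lx - r) // TILE), min(tiles_x - 1, (lx + r) // TILE)
        y0, y1 = max(0, (ly - r) // TILE), min(tiles_y - 1, (ly + r) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                grid[ty][tx].append(idx)
    return grid

grid = cull_lights(64, 64, [(8, 8, 10), (60, 60, 4)])
print(grid[0][0])  # lights affecting the top-left tile -> [0]
```

The payoff is that a scene with hundreds of lights only pays for the handful that actually touch each tile, which is awkward to express in the traditional pixel pipeline but natural in compute.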
 
So you've changed your mind about the importance of backwards compatibility, then?
No. I've always said BC for the next gen, running this gen's games, is unimportant, or rather of lower value and concern than other aspects of the console. Next-gen, everything is going to be about the software platform IMO, and so forwards compatibility between the next consoles and future hardware in the same families will be important. But I expect that to come from software layers and not hardware compatibility, such that MS Live stuff and PSN stuff will run on a variety of mobile and tablet configurations or who knows what else. Where true AAA games hit the hardware, I reckon forwards/backwards compatibility could be ignored, as that stuff will be readily replaced with the new AAA titles. The hardcore gamer may be happy to buy new hardware every 3 years at $250 with better specs that runs the next version of their favourite franchise, rather than pay $500 every 7 years. Dunno. It works for Apple!
 
Perhaps, but in that case with all devs at launch building launch games that use the GPU 100% for rendering and graphics, won't taking away from that resource to do GPGPU make their games look worse?
Dunno. There's only so much you can do regards pixel shaders and the like. Also some GPGPU stuff can be piggybacked on the graphics work as I understand it, although I'm very out of touch with GPGPU code. I'd be surprised if next-gen the GPU gets saturated from day one and there's nothing spare for other workloads - that'd be a first, but then the GPUs will be known quantities unlike all previous generations where developers have had to basically start from scratch for every new machine.
 
No. I've always said BC for the next gen, running this gen's games, is unimportant, or rather of lower value and concern than other aspects of the console. Next-gen, everything is going to be about the software platform IMO, and so forwards compatibility between the next consoles and future hardware in the same families will be important. But I expect that to come from software layers and not hardware compatibility, such that MS Live stuff and PSN stuff will run on a variety of mobile and tablet configurations or who knows what else. Where true AAA games hit the hardware, I reckon forwards/backwards compatibility could be ignored, as that stuff will be readily replaced with the new AAA titles. The hardcore gamer may be happy to buy new hardware every 3 years at $250 with better specs that runs the next version of their favourite franchise, rather than pay $500 every 7 years. Dunno. It works for Apple!

Many people switched to consoles to get away from an upgrade cycle. And there's the catch where the first version of your $250 hardware looks like crap for 2 or 3 years because the competition launched something twice as fast, and you wind up in a situation where no one bought your hardware, so you have no money for the follow-up. And I don't recall the iPhone launching at half the price of the competition.
 
That's easy. Power7. Taking the process (45nm vs 32nm) into account, I don't think many would argue x86 superiority very vigorously.

So you believe there's a Power7 CPU out there that would be a faster gaming CPU than, say, a 2700K? Bearing in mind the 2700K is 200 million transistors smaller and includes an integrated GPU.

Compared to the single core P4s that Intel had to offer at the time of shopping around? At what price? Given the state of competition at the point of decision making, I think it is pretty clear that both Xenon and Cell are quite performant, given their respective constraints.

At the time of launch (and for quite a while before), the fastest and most efficient x86 architecture was the dual core Athlon X2. It wouldn't have been as cheap for MS/Sony as the custom PPC solutions they went with but that's a business decision, not a technical limitation. That's the whole point, PPC is and was the right decision because it's cheaper in the long run, not because it's overall more powerful or more performant per watt or transistor.

Comparing them to later products, even neglecting IP ownership limitations, doesn't make sense. Ask the developers here if they would prefer to change back to an old P4 for performance reasons. I don't think you will see many hands raised.

Choosing the least efficient x86 architecture isn't exactly fair. What about asking if an Athlon X2 would have been preferable over Xenon for performance reasons? And even if Xenon is still the preference of some, can anyone really claim that it dominates as was suggested? And what evidence is there to suggest that domination, even if it existed, would carry over to a custom PPC built today compared to modern x86 processors?
 
Perhaps, but in that case with all devs at launch building launch games that use the GPU 100% for rendering and graphics, won't taking away from that resource to do GPGPU make their games look worse? Especially if the overall silicon budget for both chips is as meagre as the rumours that are flying about.

Including a costly, power-hungry discrete CPU that causes the final GPU design to be gimped will also make the games look worse. It's a matter of what will ultimately be the most efficient solution.
 
The hardcore gamer may be happy to buy new hardware every 3 years at $250 with better specs that runs the next version of their favourite franchise rather than pay $500 every 7 years. Dunno. It works for Apple!

I would have to consider that a stretch. From a consumer's point of view, they are used to certain time periods for their consoles. You could well see a lot of people take a wait-and-see approach if MS (or whoever) takes that route, or, if it was announced from the start, a lot of people could simply decide to wait for the next one. My first thought would be "Sega Saturn". 3 years seems an awfully short time for a developer to make back their investment as well.
 