Predict: The Next Generation Console Tech

Given that you can buy a whole PC that goes way beyond the current consoles for far less than $999, I think by 2013/2014 you could easily build a system way beyond the current consoles for $400. You could even do that right now.
 
That's what I said: "However, they can be operated in clamshell mode. This means at this point the maximum is 8x 2Gb = 2GB of RAM."

So on a 128-bit bus, with the use of clamshell, you can go to a maximum of 2GB; where does the 4GB come from?
I hope there's something I'm missing, because I really want it to be a bigger number XD
 
Yeah, the Dreamcast was a solid performer; it just died from lack of support from developers.
Aaaand from Sega bleeding money in court over that 3dfx class-action suit...
I guess one thing led to another.

But the Dreamcast wasn't nearly as dislocated (performance-wise) from its competitors as the Wii is right now.

Don't be shy, tell us what these things are.
I think both the PS3 and the X360 came out as very unbalanced systems. A 3-core, dual-threaded 3.2GHz CPU, or Cell, would be considered acceptable even today. A half-bandwidth G70, or an RV570 (the closest there has ever been to a Xenos on the desktop?), would not.


PC games and benchmarks tell us that you have to spend a whole lot more on the GPU than the CPU if you want a balanced system.
My bet would be that next-gen consoles will give a lot more power to the GPU side and a much smaller bump on the CPU side, especially if computing on the GPU turns out to be a viable option.



4GB comes with a 256-bit bus.
I will say again that a 192-bit bus with 3GB could very well turn out to be a good compromise between performance, memory amount and complexity.
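To put numbers on the clamshell question above, here's a quick back-of-envelope sketch in Python. It assumes 2Gb GDDR5 chips with a 32-bit interface that drops to 16 bits per chip in clamshell mode (the chip density and interface widths are assumptions matching the figures quoted earlier, not a statement about any specific console design):

```python
# Back-of-envelope: max GDDR5 capacity per bus width, assuming 2Gb chips
# with a 32-bit interface that narrows to 16 bits per chip in clamshell mode.
CHIP_DENSITY_GB = 2 / 8      # a 2-gigabit chip holds 0.25 gigabytes
CHIP_IO_BITS = 32            # normal mode: one chip per 32 bits of bus
CLAMSHELL_IO_BITS = 16       # clamshell: two chips share each 32-bit channel

for bus_bits in (128, 192, 256):
    normal = (bus_bits // CHIP_IO_BITS) * CHIP_DENSITY_GB
    clamshell = (bus_bits // CLAMSHELL_IO_BITS) * CHIP_DENSITY_GB
    print(f"{bus_bits:>3}-bit bus: {normal:g} GB normal, {clamshell:g} GB clamshell")

# 128-bit bus: 1 GB normal, 2 GB clamshell
# 192-bit bus: 1.5 GB normal, 3 GB clamshell  <- the compromise suggested above
# 256-bit bus: 2 GB normal, 4 GB clamshell
```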
 
Given that you can buy a whole PC that goes way beyond the current consoles for far less than $999, I think by 2013/2014 you could easily build a system way beyond the current consoles for $400. You could even do that right now.

Indeed. And we remember that, according to iSuppli, the PS3's bill of materials reached USD $840 at launch, using a G70/71.

Being optimistic, perhaps by launching a little later (six to ten months, with large-scale production) it could have come with a custom G80 and still kept the bill of materials under USD $999.


So maybe we'll see consoles in 2014 with "high power levels" comparable to USD $999 PCs (knowing that the reach and customization of consoles usually exceed PCs built from the same materials).

However, that leaves a question: will the M$ and Sony consoles launch priced at USD $600?
 
I think both the PS3 and the X360 came out as very unbalanced systems. A 3-core, dual-threaded 3.2GHz CPU, or Cell, would be considered acceptable even today. A half-bandwidth G70, or an RV570 (the closest there has ever been to a Xenos on the desktop?), would not.
What other option was there?! How could any console in 2005 put in a GPU that wouldn't look dated now? The only reason tri-core CPUs don't look bad is because the consoles were 'ahead of the game', as it were, and CPU development hasn't seen the same explosive progression as GPUs. But there was literally no other choice. These consoles bled MS and Sony money as it was. There's no GPU option that'd be comparable. There's nothing they could have invented and included, at any cost (dual SLI'd G70s from 2005 won't hold a candle to the latest GPUs), so your comparison doesn't make any sense.

On the flip side, these consoles are still managing to turn out good looking games on $200 boxes and they'll be getting cheaper. They've also provided game developers with stable markets and multiple tiered options for games from downloads up to full disc titles. They've put up a pretty good fight against piracy too. All in all I don't see what other realistic expectations could be had.
 
In the PC space you still need a good CPU. Let's pit a Pentium 4 at 3GHz with a 7800GTX against a fast Core 2 Duo with a GeForce 7600GT or something.

The first system will be vastly superior in lots of games (say, Far Cry at very high IQ and res), but try running something more recent and you'll get a useless slideshow. The second system will get way more framerate, albeit at a low resolution.

On a console or a cheap PC you will be starved for memory bandwidth: a fast CPU doesn't need much bandwidth, but a fast GPU needs huge bandwidth.
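To illustrate the asymmetry, here's a rough Python sketch. All the figures in it are illustrative assumptions for the era being discussed (a mid-2000s GPU with 8 ROPs at 500MHz), not measurements of any particular part:

```python
# Rough illustration of why a fast GPU is bandwidth-hungry while a fast
# CPU mostly isn't. All figures are illustrative assumptions.

# CPU: even a fast core streaming data flat out sustains a few GB/s.
cpu_bw = 3.2e9                     # ~3 GB/s of sustained streaming

# GPU: peak ROP fill rate times framebuffer traffic per pixel.
fill_rate = 4e9                    # 4 Gpixels/s (e.g. 8 ROPs @ 500 MHz)
bytes_per_pixel = 4 + 4            # 32-bit color + 32-bit Z
rw_factor = 2                      # blending/Z-testing reads AND writes
gpu_bw = fill_rate * bytes_per_pixel * rw_factor

print(f"CPU streaming demand: ~{cpu_bw / 1e9:.0f} GB/s")
print(f"GPU framebuffer demand at peak fill: ~{gpu_bw / 1e9:.0f} GB/s")
# -> ~3 GB/s vs ~64 GB/s (which is why the 360 put its framebuffer in eDRAM)
```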

As a nitpick, console CPUs have weak single-thread performance, so there's room for a huge improvement there. That could buy better AI and more 60fps games.
 
What other option was there?! How could any console in 2005 put in a GPU that wouldn't look dated now? The only reason tri-core CPUs don't look bad is because the consoles were 'ahead of the game', as it were, and CPU development hasn't seen the same explosive progression as GPUs. But there was literally no other choice. These consoles bled MS and Sony money as it was. There's no GPU option that'd be comparable. There's nothing they could have invented and included, at any cost (dual SLI'd G70s from 2005 won't hold a candle to the latest GPUs), so your comparison doesn't make any sense.

I don't think you interpreted my post correctly.
I never said they should have chosen newer architectures that would look less dated today. Of course that would've been impossible, as both ATI and nVidia churn out new, more efficient/feature-rich architectures as fast as they can, and by 2004-2005 it wouldn't have been realistic to choose anything with a feature-set beyond SM3.0 (which proved to be quite a good place to land, as DX10/SM4 was mostly ignored by game developers).

What I meant is that, if you're a console maker like Nintendo, MS or Sony, you're not as constrained in terms of execution units for your GPU as a regular consumer.
You have a budget and a heat+power envelope, from where you decide what components go in there.
You don't think that a "Xenos B" with 64 shaders, 24 TMUs and 16 ROPs, plus a "Xenon B" with 2 dual-threaded cores @ 2.5GHz, could have produced a lot better visuals than the current system?
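For a sense of scale, here's a quick Python comparison of theoretical throughput. The "Xenos B" configuration is the hypothetical one above; the real Xenos figures (48 ALUs, 16 TMUs, 8 ROPs at 500MHz) and the vec4+scalar MADD assumption are the commonly cited ones, and both parts are assumed to run at the same clock:

```python
# Theoretical throughput: real Xenos vs the hypothetical "Xenos B" above,
# both assumed at Xenos' 500 MHz core clock.
CLOCK_GHZ = 0.5
FLOPS_PER_ALU = (4 + 1) * 2   # vec4 + scalar lanes, MADD = 2 flops/lane/clock

def throughput(alus, tmus, rops):
    return (alus * FLOPS_PER_ALU * CLOCK_GHZ,  # GFLOPS
            tmus * CLOCK_GHZ,                  # Gtexels/s
            rops * CLOCK_GHZ)                  # Gpixels/s

configs = {"Xenos": (48, 16, 8), "Xenos B (hypothetical)": (64, 24, 16)}
for name, cfg in configs.items():
    gflops, gtex, gpix = throughput(*cfg)
    print(f"{name}: {gflops:g} GFLOPS, {gtex:g} Gtex/s, {gpix:g} Gpix/s")

# Xenos: 240 GFLOPS, 8 Gtex/s, 4 Gpix/s
# Xenos B (hypothetical): 320 GFLOPS, 12 Gtex/s, 8 Gpix/s
```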


I know Cell development was a technological investment made in the early 2000s, but the same goes for the rest of the PS3.
Once you have to turn to your "CPU" to do GPU stuff, it's because you've bottlenecked your system in the wrong spots.


Sure, get the game developers to spend hundreds of thousands of code-monkey-hours figuring out work-arounds for the aforementioned bottlenecks and they'll do wonders, but in the meanwhile you've spent tens of millions on a game, and if it doesn't sell millions of copies to earn that investment back, there goes the developer (into bankruptcy).

On the flip side, these consoles are still managing to turn out good looking games on $200 boxes and they'll be getting cheaper.
Never questioned that, from a consumer point of view, at least.




They've also provided game developers with stable markets and multiple tiered options for games from downloads up to full disc titles.
I'll question that. Not even getting a good-looking game out with positive reviews guarantees market success.



They've put up a pretty good fight against piracy too.
The PS3, yes. The X360, not really.



All in all I don't see what other realistic expectations could be had.
Neither do I, but just because I wasn't there to see what other choices could have been made.
 
You don't think that a "Xenos B" with 64 shaders, 24 TMUs and 16 ROPs, plus a "Xenon B" with 2 dual-threaded cores @ 2.5GHz, could have produced a lot better visuals than the current system?
You're assuming that a bigger Xenos was an option. Xenos was already pretty hot, and we know how that ended.
It's a given that one could have done better than either the PS3 or the 360 within the same power/thermal envelope and silicon budget, especially considering that all the most successful engines have moved to deferred rendering.
Personally (though that's not really a hardware choice), now that it's all said and done, I would have preferred it if the manufacturers had only enforced 576p (DVD resolution) + AA on devs and invested a bit more in the upscaler. But most won't agree with me on the matter; I believe neither of the systems really had what it takes for HD (I kind of hate that term, as from a PC player's POV it means nothing...).
 
You're assuming that a bigger Xenos was an option.
You're right. I don't know.
AFAIK the R500 was, at some point, supposed to launch as ATI's highest-end GPU for late 2005 / early 2006, and it was scrapped because driver development for the eDRAM would have been a total mess.
Xenos has ~330M transistors including the eDRAM. The R580, launched a couple of months after the X360, had ~380M transistors at a 625MHz core clock.
So I do know that a "bigger" GPU at 90nm would have been possible while maintaining the core clock.



Xenos was already pretty hot, and we know how that ended.
But AFAIK the cooler was always shared with the CPU. A hotter GPU and a cooler CPU (as I've suggested) might have ended up the same way.
 
What I meant is that, if you're a console maker like Nintendo, MS or Sony, you're not as constrained in terms of execution units for your GPU as a regular consumer.
You have a budget and a heat+power envelope, from where you decide what components go in there.
You don't think that a "Xenos B" with 64 shaders, 24 TMUs and 16 ROPs, plus a "Xenon B" with 2 dual-threaded cores @ 2.5GHz, could have produced a lot better visuals than the current system?


I know Cell development was a technological investment made in the early 2000s, but the same goes for the rest of the PS3.
Once you have to turn to your "CPU" to do GPU stuff, it's because you've bottlenecked your system in the wrong spots.

Xenos was comparable to the highest-end GPUs of late 2005 when the 360 launched, excepting the toned-down core clock, which is a necessity in consoles where heat is such a consideration. There was nothing much "above" it to step up to if they had wanted. Unless they wanted to go dual-GPU, which I suppose is possible, but that would have introduced a lot of heat, power and cost issues.

I think you're misguided though, since we're all focusing on the number of cores. Look at transistor count: Xenon is a paltry 165 million according to wiki, whereas a 4-core Sandy Bridge has 995 million transistors. If you just look at core count or clock speed, you'll be misled. Sandy Bridge has more than 6x as many transistors, which should give you a truer picture of the performance improvement of current CPUs, imo.
 
Xenos was comparable to the highest-end GPUs of late 2005 when the 360 launched, excepting the toned-down core clock, which is a necessity in consoles where heat is such a consideration. There was nothing much "above" it to step up to if they had wanted. Unless they wanted to go dual-GPU, which I suppose is possible, but that would have introduced a lot of heat, power and cost issues.
Xenos might have been comparable to the highest-end GPUs (although it halved their memory bandwidth and shared it with the CPU), but Xenon wasn't.
The CPU was well above 2005 standards. Back then, we had either super-expensive Pentium Ds (non-hyperthreaded) @ ~2.8GHz or equally expensive Athlon 64 X2s @ 2.2GHz.


I think you're misguided though, since we're all focusing on the number of cores. Look at transistor count: Xenon is a paltry 165 million according to wiki, whereas a 4-core Sandy Bridge has 995 million transistors. If you just look at core count or clock speed, you'll be misled. Sandy Bridge has more than 6x as many transistors, which should give you a truer picture of the performance improvement of current CPUs, imo.
It wasn't a low transistor count for 2005 (only a bit lower than dual-core K8s and Netbursts, due to small cache sizes), but with it clocked @ 3.2GHz where similar-sized x86 CPUs ran at ~2.4GHz in 2005, I'm pretty sure they spent some "extra credits" on the CPU in heat and power consumption (and maybe even factory binning).
 
I didn't say it was a low count for 2005.

I'm just saying that if you look at it as "triple-core 3.2GHz", that sounds nice even by today's standards, but I think in truth today's beefy quad cores are way ahead, and the transistor count gives a truer picture. You were the one saying the Xcpu is still good by today's standards...

I imagine the CPU was about where the GPU was in 2005: reasonably close to (but a notch below) the state of the art. I don't see the imbalance, though.

The clock speed being high was probably due partly to the in-order design. As I understand it, the Xcpu was designed to trade off some ease of programming for some additional brute force. You can't really compare two totally different architectures by clock speed, any more than the old Netburst Pentiums that scaled to 3.6GHz could be compared to the slower-clocked but higher-performing Athlons of the time: you would often see an Athlon at 2GHz being faster than a 3GHz Pentium, and the like. I don't imagine the Xcpu is very strong clock for clock.
 
The clock speed being high was probably due partly to the in-order design. As I understand it, the Xcpu was designed to trade off some ease of programming for some additional brute force. You can't really compare two totally different architectures by clock speed, any more than the old Netburst Pentiums that scaled to 3.6GHz could be compared to the slower-clocked but higher-performing Athlons of the time: you would often see an Athlon at 2GHz being faster than a 3GHz Pentium, and the like. I don't imagine the Xcpu is very strong clock for clock.

Well, the analogy to the Pentium 4 is probably not too far off, from what I recall (not just the Lost Planet Framework article, but from another dev a long while ago on these forums). The way Xenon executes/switches threads makes it more akin to six 1.6GHz cores, if I understood correctly, but I mean, it's all approximate. It's the fact that it switches threads every cycle.
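For anyone unfamiliar with the idea, here's a toy sketch in Python of that fine-grained thread switching. It's purely an illustration of the arithmetic behind the "six ~1.6GHz threads" approximation, not a Xenon simulator:

```python
# Toy round-robin issue model of fine-grained multithreading: the core
# alternates hardware threads every cycle, so each of two threads sees
# only half the issue slots.
def issue_slots(core_cycles, threads=2):
    slots = [0] * threads
    for cycle in range(core_cycles):
        slots[cycle % threads] += 1   # a different thread issues each cycle
    return slots

# Scale a short simulation up to one second at 3.2 GHz:
sim = issue_slots(32)
scale = 3.2e9 / 32
print([f"~{s * scale / 1e9:.1f}G slots/s" for s in sim])
# -> ['~1.6G slots/s', '~1.6G slots/s'] per core; times Xenon's three
#    cores, that's the six ~1.6GHz hardware threads mentioned above.
```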

Anyways...
 
Xenos might have been comparable to the highest-end GPUs (although it halved their memory bandwidth and shared it with the CPU), but Xenon wasn't.
The CPU was well above 2005 standards. Back then, we had either super-expensive Pentium Ds (non-hyperthreaded) @ ~2.8GHz or equally expensive Athlon 64 X2s @ 2.2GHz.

This isn't really true. Xenon has 165M transistors, while the 512K version of the Athlon X2 used 221M. That's a fairly big difference.

Core counts and clock speeds tell you very little when comparing completely different architectures. For example, the 2.4GHz AX2 was faster than the 3.4GHz Pentium D, which was at that clock speed around the time of the 360's launch.

If both were optimised properly and used to their advantages, the Athlon X2, and even the Pentium D, would crush Xenon. It's a very basic, very narrow in-order processor, not at all comparable to the far more complex x86 processors of the time, and utterly unrelatable to today's quads.
 
The discussion seems to have moved on, but as to the retail price / BOM: if the cost is over $400, I, an early adopter who has owned a console from each generation within a few months of launch, will not buy it. Period, full stop, end of story. If I have to wait a year for the price to fall to $400, I may not buy it then. The very idea angers me just sitting here. I highly doubt you will see a $500+ launch and a $700-800 BOM. That just seems like a recipe for disaster.
 
The discussion seems to have moved on, but as to the retail price / BOM: if the cost is over $400, I, an early adopter who has owned a console from each generation within a few months of launch, will not buy it. Period, full stop, end of story. If I have to wait a year for the price to fall to $400, I may not buy it then. The very idea angers me just sitting here. I highly doubt you will see a $500+ launch and a $700-800 BOM. That just seems like a recipe for disaster.
Usually, early adopters are comfortable with a price like $500, for the same reason people early-adopt most things: they want the best before most get to experience it, and they pay a premium to do so. What you are saying goes against that commonality. Why?
 
Usually, early adopters are comfortable with a price like $500, for the same reason people early-adopt most things: they want the best before most get to experience it, and they pay a premium to do so. What you are saying goes against that commonality. Why?

Simply put, that is just too much. I've paid the premium in the past. I bent a fair bit to pay $400 for the 360 the fall it came out, and it wasn't that quick a decision for me. Throw in a game and a second controller and it was $500 + tax. If the prices go up again, I would probably just resign myself to waiting a good long while. I'm guessing at my motivations, but it's not just the cost of the console itself. This is the first generation where there wasn't a single game (so far) that I considered worth the price of admission by itself. I think my disappointment with the games, and the higher prices for the games, is probably part of the equation.
 