Predict: The Next Generation Console Tech

Your list also implies that the average level of tech performance has proven to be acceptable.

Err, what? All those devices are essentially bleeding edge. Unless you're talking strictly about games, which is not the main reason people buy any of those (even then, they actually are bleeding edge for mobile, as Chef pointed out!).

Apple in particular surprises me with how much they push graphics and performance, probably to their credit, when "good enough" seems like it would suffice for iDevices. The iPhone 4S may sport an ancient 3.5" screen, but at the time of its release the GPU and chipset were absolutely cutting edge, trumping all Android phones, as all iPhones have.
 
Yep. That suggests, if you have something new to prop your device up with like Kinect, that you can add value and maintain sales. The thing with all the devices you are listing is that they are a year old. No-one's buying 4-year-old phone tech for $400+. Will the consumer be willing to spend $400 on a 3-year-old console? Of course, if MS can sell like iThing, 50 million a year for two years, then they should go for it. However, the market for core gamers wanting the best is all of...30 million? You'd have 30 million wanting a powerful console. To everyone else, you'd either need a lower price or a box with value elsewhere. Like Kinect 2, or amazing services, or something, which is where some of us are suggesting the console companies invest their efforts rather than in the hardware.

360 has really not seen widespread price cuts this generation at all. That's something I think should be taken into account. It's basically dropped only $100 in 7 years. At launch there was a 360 Core for $299 and a 360 with a 20GB hard drive for $399. Today there's a 360 Core for $199 and a 360 with a hard drive for $299, and that's leaving Kinect completely out of it. If you're going to blame Kinect, you can't explain why the non-Kinect SKUs are still selling at those prices.

That is why I'm not so huge on the price reduction thing. Sure it exists, but this gen has definitely slowed the price slashing on consoles. PS3 arguably too; it's still selling for a hefty $249 in year 6. What was PS2 after 6 years? I'm guessing $149 at most.
 
360 has really not seen widespread price cuts this generation at all. ...

That is why I'm not so huge on the price reduction thing. ...

Exactly.

I'm not seeing how a reduced rate of node reduction will be a roadblock for technical advance.

Ex: Starting at $400 with a $400 BOM.

Price reduction of only $100 and we're on the same path as xb360.

However, somehow, somewhere, someone will manage to produce a node or two lower than 28nm over the lifecycle of xb720. ;)
 
Ehh, on AMD, I just disagree with the inclusion of EDRAM in the 360. That's probably debatable but I think on balance it harmed the system. That was an AMD idea.
So you think ATI went to Microsoft and said "look we've never used edram before, but we'd like to pitch you a product with edram even though the work to implement it will take engineering time away from our core PC products."? I'm pretty sure edram was Microsoft's idea.
 
So you think ATI went to Microsoft and said "look we've never used edram before, but we'd like to pitch you a product with edram even though the work to implement it will take engineering time away from our core PC products."? I'm pretty sure edram was Microsoft's idea.

Well, I am not 100% sure, but I believe it may have been in the B3D Xenos article or somewhere else that AMD talked of it as if it was theirs. The AMD engineer(s) talked of the idea coming to them when they stepped outside the box and looked at the Xenos design specifically as a console: they basically realized there was no need to target any resolution higher than 720p, so the chip could be designed around 720p, and you ended up with the 10MB of EDRAM.

I've just never liked the EDRAM, but I could be wrong and either way it probably wasn't a huge performance blow on balance.
 
... They basically realized there was no need to target any resolution higher than 720p, so the chip could be designed around 720p, and you ended up with the 10MB of EDRAM.
Edram was sized to 10MB for 480p with 4x AA and a small render target, though by the time it shipped the target was 720p with tiling for AA.
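
As a back-of-the-envelope check of that sizing (assuming 4 bytes of colour plus 4 bytes of depth/stencil per sample; illustrative numbers, not official ones):

```python
# Rough framebuffer sizing against the 10MB of EDRAM.
# Assumption: 4 bytes colour + 4 bytes depth/stencil per sample.
def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 ** 2)

print(framebuffer_mb(640, 480, 4))    # 480p with 4x AA  -> ~9.4 MB, fits in 10MB
print(framebuffer_mb(1280, 720, 1))   # 720p, no AA      -> ~7.0 MB, still fits
print(framebuffer_mb(1280, 720, 2))   # 720p with 2x AA  -> ~14.1 MB, needs tiling
```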
 
Says who? I specifically remember the anecdote in my post so ATI was at least saying it was designed around 720P (rather than like a PC part that had to deal with all resolutions).

Though you're right, a little odd they didn't leave room for AA, but then again even 10MB was probably pretty envelope pushing for that time.
 
That's probably debatable but I think on balance it harmed the system.

Even if they were inclined to use a wider memory bus, there'd probably only be enough die perimeter for another 64-bit channel, and then you'd get into weirdness with the DRAM connections and number of chips. They'd have to design around a mismatched bus width with 8x16-bit GDDR3 or add 50% more memory... the latter of which would have been insane at the tail end of 2005... and then only for 33.6GB/s BW.
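
For reference, a rough bandwidth calculation (assuming the 360's 700MHz GDDR3 and the extra channel running at the same speed):

```python
# GDDR3 bandwidth = effective transfer rate (MT/s) x bus width in bytes.
# Assumes 700 MHz GDDR3 (1400 MT/s effective), as used in the 360.
def bandwidth_gbs(bus_bits, mt_per_s=1400):
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gbs(128))  # stock 128-bit bus            -> 22.4 GB/s
print(bandwidth_gbs(192))  # with an extra 64-bit channel -> 33.6 GB/s
```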
 
Says who? I specifically remember the anecdote in my post, so ATI was at least saying it was designed around 720P (rather than like a PC part that had to deal with all resolutions). ...
I worked on the project and know it to be fact. By launch Microsoft convinced themselves 10MB + tiling was good enough, but 480p was the target when the number was first arrived upon. So after launch someone could claim either without lying. It's just a different perspective.

I'm not trying to say nothing was designed with 720p in mind. Obviously tiling was. :smile:

I don't know for sure the origin of the edram being added to the design so I'm not claiming that to be fact.
 
Even if they were inclined to use a wider memory bus, there'd probably only be enough die perimeter for another 64-bit channel, and then you'd get into weirdness with the DRAM connections and number of chips. They'd have to design around a mismatched bus width with 8x16-bit GDDR3 or add 50% more memory... the latter of which would have been insane at the tail end of 2005... and then only for 33.6GB/s BW.

I look at it compared to PS3. Currently you have a box very much on par with PS3.

If they'd gone with split pools like Sony and ditched the EDRAM, they could hypothetically have added a lot more shaders, say from 48 to 64, or even more. Since RSX and Xenos are roughly equal now, in that scenario Xenos would be clearly superior.

You lose something too, sure, but PS3 does well enough with the split system. It's more difficult to program for, but most of that is down to Cell imo. Dealing with two pools instead of a unified one isn't ideal, but again it isn't killing PS3. You lose some bandwidth, but again you can't end up any worse than PS3 there.

I would make that trade for a clear edge in shader power, as I don't think the 360 has any clear edge (+/-10%) over PS3 as of now.
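
To put a rough number on that shader bump (assuming the commonly cited Xenos figures of 500MHz and 10 flops per ALU per clock; purely hypothetical, of course):

```python
# Very rough ALU-throughput comparison, using the commonly cited Xenos figures
# (500 MHz, vec4 + scalar MADD per ALU = 10 flops per clock).
def gflops(alus, clock_mhz=500, flops_per_clock=10):
    return alus * flops_per_clock * clock_mhz / 1000

print(gflops(48))               # stock Xenos           -> 240 GFLOPS
print(gflops(64))               # hypothetical 64 ALUs  -> 320 GFLOPS
print(gflops(64) / gflops(48))  # ~1.33x, i.e. a third more shader throughput
```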
 
360 has really not seen widespread price cuts this generation at all. ...
That is why I'm not so huge on the price reduction thing. Sure it exists, but this gen has definitely slowed the price slashing on consoles. PS3 arguably too; it's still selling for a hefty $249 in year 6. What was PS2 after 6 years? I'm guessing $149 at most.
That's true, and a consideration for the BOM. Perhaps the market will happily stick to a statically priced, statically spec'd box for several years? But remember that where PS2 had dropped its price, it had also doubled its sales. Had MS or Sony dropped to $200, perhaps they'd be looking at more like 80-100 million content consumers on their platform instead of the 60 million they have? Looking at it that way, an inability to drop below $300 because you can't price reduce could limit the platform's numbers to half of what they'd otherwise be, and as all the profit for these boxes comes from content rather than hardware sales, getting the most people possible to own one is key.

Oh, this is the wrong thread for this. Um, I predict that the machines will cost money to make and include both processors and RAM. ;)
 
On the edram issue, I think neither Microsoft nor Sony has any incentive to go that way again; it didn't work as intended for PS2 or for Xbox 360. I believe Sony are happy they got rid of edram for the PS3; it was THE BEST HARDWARE FINANCIAL DECISION they took for PS3.

These are the unacceptable drawbacks of edram, which far exceed the advantages:

1 - You can never have enough edram at an acceptable cost: 4 MB was insufficient for PS2, and 10 MB was insufficient for Xbox 360.

2 - It is very difficult to decrease the cost of edram over time.

3 - Edram doesn't allow flexibility in terms of targeted resolutions. (That's why a lot of graphically complex PS3 games ran at higher-than-720p resolutions (GT5, Motorstorm Apocalypse, Wipeout, Full Auto 2), unlike Xbox 360 games.)
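
To put rough numbers on point 3 (same illustrative 8-bytes-per-sample assumption as any back-of-the-envelope framebuffer estimate): the number of tiles grows quickly with resolution, which is part of why higher-than-720p targets were unattractive on the 360.

```python
import math

# Rough tile-count estimate against Xenos' 10MB of EDRAM.
# Assumption: 4 bytes colour + 4 bytes depth/stencil per sample;
# real tiling had extra alignment constraints, so this is illustrative only.
EDRAM_MB = 10

def tiles_needed(width, height, samples, bytes_per_sample=8):
    fb_mb = width * height * samples * bytes_per_sample / (1024 ** 2)
    return math.ceil(fb_mb / EDRAM_MB)

print(tiles_needed(1280, 720, 1))   # 720p, no AA   -> 1 tile
print(tiles_needed(1280, 720, 2))   # 720p, 2x AA   -> 2 tiles
print(tiles_needed(1920, 1080, 2))  # 1080p, 2x AA  -> 4 tiles
```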
 
Reading about the Kaveri rumors and connecting the dots with the Tsuruta-san interview about the next-generation system, I am very sure that we are going to see a version of the AMD Kaveri using a TSV + interposer in the PlayStation 4's guts. I am more inclined to think of a dual Kaveri + TSV + interposer configuration than anything else.
 
Also, putting PS2 and 360 eDRAM into one basket is... a limited view at best. For PS2, it was intended for much more than what the 360 could do with its edram. Given that PS2 was much slower, it was VERY much needed to achieve a lot of the effects shown, especially in later games.
 
If they'd gone with split pools like Sony and ditched the EDRAM, they could hypothetically have added a lot more shaders, say from 48 to 64, or even more. Since RSX and Xenos are roughly equal now, in that scenario Xenos would be clearly superior.

And hotter... and just add to the yield issues circa 2005...
 
Just been playing with an old 9600GT that I have, pretty impressive to be honest.

Crysis at native 1280x720, everything on high with shader, post-processing and game effects on very high, ran at 30-40fps average with dips into the 20-25 region during explosions.

Fair to say that if this was in a console Crysis at everything very high would be quite easy to achieve.

Metro 2033 at native 1280x720 with medium settings was also very playable.

How does a 6670 compare to a 9600GT?

Never mind... found a few charts..

Looking at a 6670 in review charts, it seems to offer around half the performance on average of a 1GB Nvidia GTX 460.

Then looking at GTX 460 performance in review charts, roughly half of a GTX 460 is a 9800GT, offering 52% of its performance, with a 9600GT offering 44% of its performance.

So a 6670 should offer a good 13-15% more performance than a 9600GT, and offer performance equal to or slightly faster than an 8800/9800GT.

The integrated APU these things are rumored to have has 400 shaders compared to the 480 in a 6670; that's roughly a 17% drop.

So all in all, with the clock speeds, you'll be looking at roughly 8800/9800GT SLI with DX11 capability; 8800/9800GT SLI, and thus 6670 + 6550 CrossFire, in a console should be around GTX 460 level.
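
Spelling out that chain of ratios (all review-chart averages, so treat them as very rough):

```python
# Rough relative-performance chain from the review-chart averages above
# (GTX 460 = 1.0; all figures are approximate).
gtx460 = 1.00
hd6670 = 0.50   # "around half" of a GTX 460
gt9800 = 0.52
gt9600 = 0.44

print(hd6670 / gt9600)  # ~1.14 -> the 6670 is ~13-15% faster than a 9600GT
print(hd6670 / gt9800)  # ~0.96 -> roughly on par with an 8800/9800GT

# Scaling for the rumored 400-shader APU vs the 480-shader 6670:
apu = hd6670 * (400 / 480)
print(apu)              # ~0.42 of a GTX 460, before clock-speed differences
print(hd6670 + apu)     # naive additive total ~0.92 -> "around GTX 460 level"
```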
 
So all in all, with the clock speeds... 6670 + 6550 CrossFire in a console should be around GTX 460 level.

:???: do not want :(
 
The more I read about Hybrid Memory Cube, the more I am convinced it's possible and necessary to have it in one of Orbis / Durango. Considering MS joined the consortium a few weeks ago, and the rumors of Durango having 4-8 GB of RAM, the chances are good.
 