Predict: The Next Generation Console Tech

When Larrabee got canceled I was hoping Project Denver would be done in time for one of the next-gen consoles, but oh well. No exotic hardware this time, sadly.
 
I'm doubtful AMD can keep pace with Nvidia.

They're keeping pace very well right now, with the fastest single GPU available (the 7970 GHz Edition), or at least a tie.

Besides, it's probably not about the max power you can push in a box; if you wanted to do that and one vendor was way behind, you could just cram two of their GPUs in. More likely you will be choosing a mid-range GPU, so who leads at the top becomes irrelevant.

I'm of the opinion AMD is the better choice for a console because they have a raw flops edge (one I presume will be exploited better on a console). Although it's been shrinking of late, it still exists.
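To put rough numbers on that flops edge, here's a back-of-the-envelope comparison. The shader counts and clocks are the public figures for the two current flagships, and the result is theoretical peak, not delivered performance:

```python
# Theoretical peak single-precision throughput: shaders x 2 ops (FMA/MAD) x clock.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

# Advertised figures for the 2012 flagships.
print(peak_gflops(2048, 1.05))   # Radeon HD 7970 GHz Edition -> ~4301 GFLOPS
print(peak_gflops(1536, 1.006))  # GeForce GTX 680 (base clock) -> ~3090 GFLOPS
```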
 
I'm just saying Nvidia has large engineering teams working on next-gen GPU technology. I have no idea what MS will do... but I do think the situation is more dynamic than just expecting an AMD "Mutiny on the Bounty" GPU. Then you have IBM working on brand-new POWER8 cores. There is more than one valid choice for console tech. Bleeding edge does help sell Xbox Live. I'm doubtful AMD can keep pace with Nvidia.

The one thing I'm fairly certain about is that both Sony and Microsoft will use touchpad tech in their gamepads. It works well on the Vita, and I think it will turn out to be a big mistake by Nintendo not to include it in the Wii U.

Well, I would say the opposite: Nvidia seems to face more and more trouble keeping pace with AMD.
They are late, and late, and late again.
Where is their 78xx killer?
What about compute performance? I don't buy the Kool-Aid I read here and there; Nvidia had to make trade-offs to keep up, though it turned out well for gaming performance.
Their lead in the high end (lost for now) has never been this small.
Etc.

Then there is the mobile space: Tegra is neat, but Nvidia's ability to put together a proper CPU is not in the same ballpark as AMD's.
 

Agreed... but AMD really has to pull its finger out in the mobile space. Jaguar needs to be good.

I would like to see something like a quad-core POWER7 with 4-way SMT and 256-bit SIMD. I would much rather have that than many Jaguar cores; single-thread performance would only be average in that scenario.
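Whether a few fat cores beat many Jaguar cores really comes down to how parallel the workload is, which Amdahl's law captures. A quick sketch; the 2x per-core advantage and the core counts are made-up illustrative numbers, not rumored specs:

```python
# Amdahl's law: speedup over a single slim core.
def speedup(p, n):
    """p = parallel fraction of the work, n = number of cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative assumption: one fat core is worth 2x a Jaguar core.
# Compare 4 fat cores against 16 Jaguar cores.
for p in (0.80, 0.90, 0.99):
    fat  = 2.0 * speedup(p, 4)    # 4 fat cores, 2x per-core advantage
    slim = 1.0 * speedup(p, 16)   # 16 slim cores
    print(f"p={p:.2f}  fat={fat:.1f}x  slim={slim:.1f}x")

# p=0.80: fat wins (5.0x vs 4.0x); p=0.90: near tie; p=0.99: slim wins.
```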
 
Would it be cheaper/better for a console to have two slower GPUs rather than one single fast GPU?
I don't see how that could possibly be the case. Two GPUs would require additional software complexity, more developer effort and headache to utilize efficiently, they would require more pins and take up more board space than a single GPU (unless they're really really tiny chips), so you'd end up with a physically bigger product. You'd also need lots more pins and additional space on the board for a second set of memory components as well (unless again you go with really low-performance hardware)...

Of course, you'd need a more complex cooling solution also to handle two GPUs instead of just one. So all-round more complexity, more of various stuff, and with electronics as with most everything else (except at all-you-can-eat buffets), more = more expensive. :) Assembly of a dual-GPU product would cost more, and so on.

Kewl idea perhaps, but it's probably not anything we'll ever see in reality, unless someone starts another Kickstarter-funded project. :D
 

What about something like an APU and a discrete GPU?
 
I disagree... a dual 3rd-gen flagship APU (4-core CPU + 7950 equivalent) with a custom interconnect would be ideal. The multi-chip solution would give a greater overall surface area, which is easier to cool. Because it is a fixed platform to program for, the issues associated with CrossFire and SLI are a moot point. Regardless, I believe Microsoft won't take any chances, and they proved last gen that their custom GPU solution was the better choice. I see a bright future for gaming...
 
What about a 4-core Haswell (AVX2) with 40+ EUs and eDRAM/L4, with most of the uncore not needed for a console removed? I know this won't happen, but if it did, would devs still prefer an AMD APU?
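For rough peak math on that hypothetical part: Haswell does 32 single-precision flops per cycle per core (two 256-bit FMA ports), and an Intel Gen EU is usually quoted at 16 flops per cycle. The clocks below are my guesses:

```python
# Peak single-precision throughput, counting an FMA as 2 ops.
def cpu_gflops(cores, ghz):
    # Haswell: 2 x 256-bit FMA ports = 2 * 8 floats * 2 ops = 32 flops/cycle/core
    return cores * 32 * ghz

def igpu_gflops(eus, ghz):
    # Intel Gen EU: commonly quoted at 16 flops/cycle (2 x 4-wide FMA units)
    return eus * 16 * ghz

print(cpu_gflops(4, 3.0))    # 4 Haswell cores @ 3.0 GHz (guess) -> 384 GFLOPS
print(igpu_gflops(40, 1.0))  # 40 EUs @ 1.0 GHz (guess)          -> 640 GFLOPS
# ~1 TFLOPS combined -- well short of a Pitcairn-class discrete GPU.
```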
 
I disagree... a dual 3rd-gen flagship APU (4-core CPU + 7950 equivalent) with a custom interconnect would be ideal. The multi-chip solution would give a greater overall surface area, which is easier to cool. Because it is a fixed platform to program for, the issues associated with CrossFire and SLI are a moot point. Regardless, I believe Microsoft won't take any chances, and they proved last gen that their custom GPU solution was the better choice. I see a bright future for gaming...

Is this really true?

I suppose you wouldn't be limited, like SLI and CrossFire are, to only split-frame and alternate-frame rendering; rather, you could implement some sort of composite frame rendering technique (like what that LucidLogix thing of yesteryear was meant to do), but I would imagine you'd have to have a uniform memory architecture.

Still, though, wouldn't you run into memory bandwidth issues? And how would you deal with latency when you want to move data between GPU cores, for example?

I don't think a dual-APU solution would be anywhere near as efficient as a single monolithic chip of the same transistor budget, no matter how good your software layers are.
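To illustrate the kind of work-splitting headache involved, here's a toy sketch of the two classic multi-GPU splits. The functions are invented for illustration and ignore the inter-GPU copies, synchronization, and temporal dependencies that make this hard in practice:

```python
# Toy illustration of the two classic multi-GPU work splits.

def alternate_frame(frames, num_gpus):
    """AFR: whole frames round-robin across GPUs. Scales well, but any
    pass that reads the previous frame forces an inter-GPU transfer."""
    return {f: f % num_gpus for f in frames}

def split_frame(height, num_gpus):
    """SFR: each GPU renders a horizontal band of the same frame. Geometry
    work is duplicated on every GPU, so scaling is worse than AFR."""
    band = height // num_gpus
    return [(g * band, (g + 1) * band) for g in range(num_gpus)]

print(alternate_frame(range(6), 2))  # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
print(split_frame(1080, 2))          # [(0, 540), (540, 1080)]
```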
 
Would it be cheaper/better for a console to have two slower GPUs rather than one single fast GPU?

Not with traditional GPUs. But I think doing something similar to what Xilinx is doing with their 28nm Virtex FPGAs could be cheaper, depending on performance targets.

To get around yield problems and the difficulty of fabbing a larger chip, they fab smaller "slices" of their FPGA at 28nm and connect them using a 65nm interposer layer. They've been doing this for a while now. Granted, those FPGAs are low-volume and very expensive, but I think GPU manufacturers will eventually go the same way.

Instead of fabbing a giant GPU, fab small clusters of compute units and place them on an interposer layer along with the memory. For this to be cost-effective, though, I think they would have to be targeting super-high performance. Let's say the equivalent of a 400mm² GPU made up of several ~100mm² GPU slices.
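The yield argument is easy to put numbers on with the standard Poisson die-yield model; the defect density below is an assumed figure, not foundry data:

```python
import math

def die_yield(area_mm2, defects_per_mm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

d = 0.005  # defects/mm^2 (0.5 per cm^2) -- assumed for a young 28nm process
big       = die_yield(400, d)   # one 400 mm^2 die    -> ~13.5%
small     = die_yield(100, d)   # one 100 mm^2 slice  -> ~60.7%
four_good = small ** 4          # four good slices    -> ~13.5%
print(big, small, four_good)
# The raw probabilities come out the same, but with slices you only scrap
# 100 mm^2 per defect and can bin known-good dies together, which is where
# the real cost win comes from.
```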

For something of equivalent performance to Cape Verde/Pitcairn (as we expect for the consoles today), I don't think it makes sense. Maybe we'll see a single GPU plus memory on an interposer, though.
 
What about a 4-core Haswell (AVX2) with 40+ EUs and eDRAM/L4, with most of the uncore not needed for a console removed? I know this won't happen, but if it did, would devs still prefer an AMD APU?

I'm not sure it's the devs who would complain; Sony and MS would. Buying from Intel means you're stuck using their fabs and they control your cost. Buying from AMD (fabless now), you're getting IP that can be fabbed at any place like TSMC, GF, etc. (anyone supporting the necessary process).
 

If we're talking about owning the IP of a graphics chip, this is potentially true.
Anything involving an AMD CPU is not going to offer the same kind of control over either the fab or costs.
AMD can't give away the IP for an x86 CPU, so a console maker is going to buy the chips from AMD. Absent some directive about the manufacturer in the contract, AMD is going to pick the fab, and it will add its margin on top of what the fab charges.
 
As I'm not a tech expert, I tend to review what other people with more knowledge than me think or have said. I've been reading lherre's posts, and in early November he said we would be pleasantly surprised by BOTH the PS4 and Xbox 3. In the same post he said the early PS4 specs were 10x PS3, but he expected this number to rise. Afterwards he said he had been told the Xbox 3 is even more of a beast.

This last sentence is apparently where bgassassin disagrees, as bg thinks lherre said this simply because of the RAM in the Xbox 3.

http://www.elotrolado.net/hilo_nint...sual_1693082_s120?hilit=xbox next#p1727022860
 
Then there is the mobile space: Tegra is neat, but Nvidia's ability to put together a proper CPU is not in the same ballpark as AMD's.


As I understand things, Nvidia has ex-Intel CPU engineers who joined them after Stexar folded.


Microsoft providing OS support for ARM is helping Nvidia, and I can easily see Nvidia desiring a strong relationship with Microsoft going forward.
 
As I understand things, Nvidia has ex-Intel CPU engineers who joined them after Stexar folded.
So far they've only used standard licensed ARM cores, nothing they have cooked up on their own.

Microsoft providing OS support for ARM is helping Nvidia, and I can easily see Nvidia desiring a strong relationship with Microsoft going forward.
Considering how they burned all their bridges with MS over the Xbox, do you see it as a realistic scenario that they've actually managed to rebuild them?
 
Kaotik said:
So far they've only used standard licensed ARM cores, nothing they have cooked up on their own.
They have publicly announced they are working on their own custom ARM design. It isn't a matter of if but when.
 
...

Considering how they burned all their bridges with MS over the Xbox, do you see it as a realistic scenario that they've actually managed to rebuild them?

Not a particularly compelling argument. In business at that level, companies tend not to hold grudges. There's too much money on the line. You take the best deal, make the best product, and put any past differences aside. Hell, some companies sue each other while they're still doing business together. If MS can get a better deal with Nvidia than with AMD, they'll go for it.
 
Found this on pastebin:

http://pastebin.com/5giPQP2r

The second round of PS4 dev kits seems to have an AMD 3870K APU with a Radeon 6850, plus a discrete Radeon 7900 GPU, and the RAM has been raised to 4 GB.

I've also read on pastebin that the Xbox 3 will arrive with a six-core IBM CPU, 2 GPUs, and 4 GB of RAM. However, the first rumor seems more realistic than the Xbox one.
 