When Larrabee got canceled I was hoping Project Denver would be done in time for one of the next-gen consoles, but oh well. No exotic hardware this time, sadly.
I'm just saying Nvidia has large engineering teams working on next-gen GPU technology. I have no idea what MS will do... but I do think the situation is more dynamic than just expecting an AMD "Mutiny on the Bounty" GPU. Then you have IBM working on brand new POWER8 cores. There is more than one valid choice for console tech. Bleeding edge does help sell Xbox Live. I'm doubtful AMD can keep pace with Nvidia.
The one thing I'm fairly certain about is that both Sony and Microsoft will use touchpad tech in their gamepads. It works well on the Vita, and I think not including it in the Wii U will turn out to be a big mistake by Nintendo.
Well, I would say the opposite: Nvidia seems to face more and more trouble keeping pace with AMD.
They are late and late and late again.
Where is their 78xx killer?
What about compute performance? I don't buy the Kool-Aid I read here and there; Nvidia had to make trade-offs to keep up, though it turned out well for gaming performance.
Their lead in the high end (lost for now) has never been this tiny.
Etc.
Then there is the mobile space: Tegra is neat, but Nvidia's ability to put together a proper CPU is not in the same ballpark as AMD's.
Would it be cheaper/better for a console to have two slower GPUs rather than one single fast GPU?
I don't see how that could possibly be the case. Two GPUs would require additional software complexity, more developer effort and headache to utilize efficiently, they would require more pins and take up more board space than a single GPU (unless they're really really tiny chips), so you'd end up with a physically bigger product. You'd also need lots more pins and additional space on the board for a second set of memory components as well (unless again you go with really low-performance hardware)...
Of course, you'd need a more complex cooling solution also to handle two GPUs instead of just one. So all-round more complexity, more of various stuff, and with electronics as with most everything else (except at all-you-can-eat buffets), more = more expensive. Assembly of a dual-GPU product would cost more, and so on.
Kewl idea perhaps, but it's probably not anything we'll ever see in reality, unless someone starts another kickstarter-funded project.
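To make the "additional software complexity" point a bit more concrete, here is a rough sketch of an alternate-frame split across two GPUs. Everything in it is made up for illustration (Gpu, gpu_wait and gpu_submit are hypothetical stand-ins, not any real console or driver API); it just shows the shape of the bookkeeping a second GPU drags in.

// Toy alternate-frame-rendering loop; all names are hypothetical stand-ins,
// not any real console or driver API.
#include <cstdio>

struct Gpu { int id; };

void gpu_wait(Gpu&) { /* block until this GPU's previous frame has finished */ }
void gpu_submit(Gpu& gpu, int frame) {
    std::printf("GPU %d renders frame %d\n", gpu.id, frame);
}

int main() {
    Gpu gpus[2] = {{0}, {1}};
    for (int frame = 0; frame < 8; ++frame) {
        Gpu& target = gpus[frame % 2];   // ping-pong frames between the two chips
        gpu_wait(target);                // its buffers from two frames ago must be free
        // Anything produced last frame (shadow maps, reprojection history,
        // GPU particles...) now lives on the other chip and has to be copied
        // across or regenerated; that's the extra per-title developer effort.
        gpu_submit(target, frame);
    }
    return 0;
}

None of that dance exists with one fast GPU, which is exactly the point above.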
I disagree... A dual third-gen flagship APU (4-core CPU + 7950-equivalent GPU) with a custom interconnect would be ideal... the multi-chip solution would give a greater overall surface area, which is easier to cool... and because it is a fixed platform, the issues associated with CrossFire and SLI are a moot point... Regardless, I believe Microsoft won't take any chances, and they proved last gen that their custom GPU solution was the better choice... I see a bright future for gaming...
What about a 4-core Haswell (AVX2) with 40+ EUs, eDRAM/L4, and most of the uncore not needed for a console removed? I know this won't happen, but if it did, would devs still prefer an AMD APU?
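For anyone wondering why AVX2 keeps getting mentioned as the attraction there: a minimal sketch, assuming a Haswell-class chip with AVX2+FMA (built with something like -mavx2 -mfma). This isn't console code, just what an eight-wide fused multiply-add looks like to game developers.

// out = a*b + c across eight floats in one instruction (Haswell-class AVX2+FMA).
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    alignas(32) float c[8] = {1, 1, 1, 1, 1, 1, 1, 1};
    alignas(32) float out[8];

    __m256 va = _mm256_load_ps(a);
    __m256 vb = _mm256_load_ps(b);
    __m256 vc = _mm256_load_ps(c);
    _mm256_store_ps(out, _mm256_fmadd_ps(va, vb, vc)); // 8 lanes at once

    for (float f : out) std::printf("%g ", f);   // prints 9 15 19 21 21 19 15 9
    std::printf("\n");
    return 0;
}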
I'm not sure it's the devs that would complain; Sony and MS would. Buying from Intel means you're stuck using their fabs and they control your cost. Buying from AMD (fabless now), you're getting IP that can be fabbed anywhere, like TSMC, GF, etc. (anyone supporting the necessary process).
Then there is the mobile space: Tegra is neat, but Nvidia's ability to put together a proper CPU is not in the same ballpark as AMD's.
So far they've only used standard licensed ARM cores, nothing they would have cooked up on their own.
As I understand things, Nvidia has ex-Intel CPU engineers who joined them after Stexar folded.
Considering how they burned all the bridges to MS with the Xbox, do you see it as a realistic scenario that they've actually managed to rebuild them?
Microsoft providing OS support for ARM is helping Nvidia, and I can easily see Nvidia wanting a strong relationship with Microsoft going forward.
They have publicly announced they are working on their own custom ARM design. It isn't a matter of if but when.