Predict: The Next Generation Console Tech

Status
Not open for further replies.
7670 sounds better though :p

I am petitioning that if this is the GPU MS and/or Sony chooses, there be a concerted effort to correct and educate: if it is indeed a "7670" then we should all refer to it as a "Slightly Revised AMD Radeon 5670, an architecture from 2009."

Why? The Radeon 7670 is a re-badge of the entry-level Turks 6670 from 2/2011, which itself was a minor upgrade of the Evergreen 5670 released 1/2010 (the Evergreen architecture launching in 9/2009).

Looking at a potential 2013 launch window, it is funny how this supposed GPU is getting spun as "7xxx series! Southern Islands! Cutting edge technology, folks!" when it is in fact a 4-year-old architecture and an entry-level design.

7xxx may sound better for those hopeful (or at least closing their eyes!), but the AMD 5670 released for under $100 (GPU, memory, PCB, ports, etc.) in 1/2010 with a whopping 104mm^2 die and a max TDP of 65W on 40nm (the lower-binned 5570 maxed at 39W).

The "7670" was a very cheap card in January 2010. A console that includes such isn't doing so because of budget restraint but because they are chasing a totally different console philosophy.
 
I don't think they are. Especially MS. Sony maybe, because they might go cheaper and because they apparently would have an APU too.

It's too early; I'm not believing most of these rumors. Didn't supposed insider bgassasin also hint the PS4 APU+7670 rumors were wrong, according to what he'd heard?
 
Your viewpoints are perfect and I fully agree, but sometimes I question whether the APU will in fact succeed. Although the idea of marrying CPU + GPU + memory controller etc. on the same die is excellent (and there is much more to it, I know), perhaps I'm wrong here... I have the impression that today, and even next year, APUs are still very incipient and too ambitious to become efficient enough to beat the current paradigm of separate, single CPUs and GPUs.

Another interesting point that you touched on was the chance of bugs (similar to the FDIV divider bug in the Intel Pentium, or even worse, damaging memory accesses etc.) in these new processors coming from AMD... and imagine if something similar occurred in next-gen consoles on the production line?

I personally would prefer they used CPUs and GPUs that had already been tested and approved, customized if needed (something extra in the SIMDs, die shrinks to 28nm, PC-only features disabled or removed, etc.), for the next-gen consoles. That could be very interesting... my "dream console" is something like a quad-core Athlon II + Radeon HD 5850 (on paper... almost 2.1 TFLOPS).
The big advantage I see for Sony with a SoC is cost.
Especially if they were to go with a pretty much off-the-shelf Kaveri.
They already reduce (supposedly) the R&D by going with AMD, which provides a complete CPU+GPU solution.
Prior APUs were not tempting, but Kaveri is another matter. By the way, I'm impressed that they plan such a jump in GPU power while still keeping the TDP at 100W.

I can't answer you on the benefits of an APU vs. two discrete parts of the same power. I've read here and there (like pretty much everybody can) that low-latency communication between the CPU and the GPU could have positive effects on performance, but I can't tell you more. For all I know it could be a misunderstanding on my side or theirs.
I mean, there are priorities among jobs submitted to the GPU, but as I see a GPU, it could be a while before you get the result. Still, it would help if there were a way to save the round trip to memory (the GPU writing to it, and then the CPU, when informed that the data is there, reading it back from RAM...), but will Kaveri do that? (Will the GPU L2 be coherent with the cores' caches? I would say no.)
Intel seems to be able to pull that off, to the benefit of their driver team (...don't laugh :LOL:).

There are benefits in power consumption: you avoid duplicating the memory controller, for example. But it's not free; you have to share bandwidth.
I believe that in the PC world AMD's efforts are held back by the memory type; APUs are bandwidth starved.
You have only one chip to test, etc.; it makes sense.

How much does an Athlon II X4 consume? I would say quite a lot; let's say it has a TDP of 65 watts.
Then you have the HD 5850; it consumes a lot, I guess. An HD 7850 may be a better base (or a 6850 on the 40nm lithography). That's plenty of watts. That's a GPU with a 256-bit bus. That's two memory pools. Etc.

Say a fairer (but hypothetical) comparison would be an "Athlon III" X4 at the same speed but using 32nm lithography. Let's say it consumes 45 watts.
Then you have a 7770, or something with more CUs enabled but clocked lower, so its power consumption is in between the 7750 and the 7770 (45 and 75 watts); so let's say 60 watts.
The whole thing consumes 105 watts, and there are 4 chips of GDDR5 and 4 chips of DDR3 consuming extra power. You need a somewhat more complex board. You need two cooling solutions, etc.
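For what it's worth, the hypothetical discrete-parts budget above adds up like this (all TDP figures are this post's own guesses, not measured numbers):

```python
# Rough power-budget tally for the hypothetical discrete CPU + GPU combo.
# Both TDP figures are guesses from the post, not measured numbers.
cpu_tdp_w = 45   # hypothetical 32nm "Athlon III X4"
gpu_tdp_w = 60   # down-clocked 7770-class part, between the 7750 (45 W) and 7770 (75 W)

total_w = cpu_tdp_w + gpu_tdp_w
print(total_w)   # 105 -- before the GDDR5/DDR3 chips, VRMs, and two coolers
```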

Now, a Kaveri with GDDR5 would come close to that but most likely cost less. I believe there is no magic: the GDDR5 might push up Kaveri's power consumption, so there might be trade-offs in CPU speed or in memory-controller speed; they may want to leave the GPU clock untouched, though who knows (and it's not like 10% would change the big figure).
It might perform worse, but from a cost POV it may lead. Maybe not an awesome difference, but multiply it over millions of units.

As for the bug risk, I may be overly dark, as the early Phenom incident is old now, and Steamroller may either tweak the K10 architecture or be the Bulldozer cores v3; in either case AMD would not be in unknown territory. It's still clearly a risk, though, with no one knowing when in 2013 Kaveri releases.

Honestly, Kaveri is not an amazing proposal, but played right at the right price (so pretty cheap) it may do Sony a lot of good, especially if they (/the rumors) are right about missing 2013.
Kaveri will keep up with PC games for a long while, no matter what MS aims at, be it the moon.

EDIT

WRT the HD 6670 / 7670: by AMD's own numbers the HD 5670 costs $75, so it's cheap, but not that much more so; margins are terrible in that price range.
Edit 2: not AMD's number, but it looks like a serious estimate. Link might come tomorrow.
 
Since it was posted in one of the GAF threads, I don't see an issue now with saying there's a reason why I was "adamant" about an "underclocked Pitcairn" primarily with PS4 in previous posts.
 
Hey, can you give me 3-5 succinct insider data points I can put into my sig to see if they come to pass :devilish:

Ps- Don't toy with us... some of us are so desperate we would jump at an underclocked Pitcairn :p Maybe our expectations are being set really low (100mm^2, really?! A 2009 GPU?!) so when we get more we feel like it is the best thing since sliced cheese :p
 
I'm sure after the PS4 rumors we've had, this post will make more sense. I had some others older than that, but this one is more specific about the PS4 specs.

http://forum.beyond3d.com/showthread.php?p=1608866#post1608866

I mean things can change, but IGN's rumor is a big change from what was given last year.
 
Yes, things can change. Nice that you heard more about a Pitcairn-class GPU.

This is an abbreviated prediction I made a couple weeks ago.

Xbox3

- 6-core "Xenon" 3.5Ghz
- 16-20 Compute Units (1024-1280 ALUs) 800+Mhz
- 2GB GDDR5


PS4

- 4 or 6-core (AMD-based) 3.2Ghz
- 16-20 Compute Units (1024-1280 ALUs) 800+Mhz
- 2GB GDDR5
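As a sanity check on those figures (assuming GCN-style CUs: 64 ALUs per CU, 2 FLOPs per ALU per clock via FMA), the CU counts do map onto the quoted ALU range, and imply roughly this peak throughput:

```python
# Sanity check of the CU/ALU figures in the prediction above, assuming
# GCN-style CUs: 64 ALUs per CU, 2 FLOPs per ALU per clock (FMA).
def gcn_peak_gflops(cus, clock_ghz):
    alus = cus * 64
    return alus * 2 * clock_ghz

# 16 and 20 CUs give the quoted 1024-1280 ALU range:
print(16 * 64, 20 * 64)           # 1024 1280
# Peak throughput at the quoted 800 MHz floor:
print(gcn_peak_gflops(16, 0.8))   # 1638.4 GFLOPS
print(gcn_peak_gflops(20, 0.8))   # 2048.0 GFLOPS
```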

The PS4, especially if it has an APU (A8-3850), would be interesting, although 6 real cores and no on-die GPU would probably be better.

For MS's sake "Xenon" better mean "something other than Xenon," because as some developers put it, 3 Xenon cores barely stack up to the IPC of 1 modern CPU. 18 months from now that style of CPU is going to be really, really ugly. And it doesn't make a lot of sense when IBM has some better solutions (470S/FP, A2); even a stripped-down Power7 at 4 cores would trash a 6-core Xenon. Xenon wasn't even the architecture MS wanted in 2005; it's hard to envision a revised version in 2013, 8 years later, making many very happy.
 
So just so you guys know, serious estimates seem to put the HD 6850 BOM at ~$110.
I'm going to give the link tomorrow.
That's the BOM, so it includes the board, but it doesn't include what one would have to pay AMD to use their tech.

A low-clocked Pitcairn implies a 256-bit bus; if there is to be a shrink that will be bothersome, especially as Pitcairn is a tiny chip (barely over 200 sq. mm).

So altogether I would put the odds of Sony using that kind of pricey off-the-shelf parts in the PS4 as really low.
 
PS4 summary: 4-cores (Steamroller), 18 CUs, 2GB GDDR5

The Xbox 3 "specs" were just working off the limited info we had last year. Still trying to get legitimate info on it. Surprisingly that's been the tough one for me to get info on. That's a major reason why I think MS has been trolling with these rumors.
 
By the way, I did my homework and searched about those Steamroller cores.
If wiki is to be trusted (not the best source, but useful), that's indeed Bulldozer V3.
They are said to be about parallelism, in wiki parlance.
I guess that means a widening of the SIMDs, matching Intel CPUs with respect to SIMD width per core.
I would bet a Steamroller module may embark two 8-wide SIMDs.
That would be welcome; it would put a 2-module design @2.5GHz at 196 GFLOPS.
That leaves around ("north of") 800 GFLOPS for the GPU to achieve the 1 TFLOPS AMD claims to reach with Kaveri. Less if the CPU is clocked higher.

At 800MHz and 8 CUs we get:
8*64*2*800, so 819 GFLOPS.

Translating into VLIW5 FLOPS, that's +25%: around the TFLOPS.
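Spelling out that arithmetic (assuming the GCN peak formula of CUs × 64 lanes × 2 FLOPs per clock, and the post's own +25% VLIW5 translation factor):

```python
# The GPU arithmetic above: GCN peak = CUs * 64 lanes * 2 FLOPs (FMA) * clock,
# plus this post's own +25% factor to translate into VLIW5-equivalent FLOPS.
cus = 8
clock_ghz = 0.8

gcn_gflops = cus * 64 * 2 * clock_ghz   # 819.2 GFLOPS
vliw5_equiv_gflops = gcn_gflops * 1.25  # ~1024 GFLOPS, "around the TFLOPS"

print(gcn_gflops, vliw5_equiv_gflops)   # 819.2 1024.0
```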
It could make sense why we heard about a 6670 running at 1GHz: it would be a closer match to the upcoming product than a stock HD 6670.
If there is an A8 and this in the dev kits, it could be because the A8 lets devs experiment with closer-to-the-metal access to the APU, whereas the overclocked HD 6670 is there to emulate the capabilities of the final product, which the A8 falls short of.

End of my rant; time to sleep.
 
An AMD 6850 was released 10/2010 on the 40nm process node. Retailing at $179 it is a 255mm^2 chip (127W max TDP) with a 256bit bus to 1GB of GDDR5 memory.

I am not sure I would agree that $110 for the BOM is "pricey," as a video card includes the GPU chip, memory, PCB with PCIe bus, various smaller chips, fan and casing, and ports for DVI, HDMI, mini DisplayPort, etc. A console is going to need to enlarge a few of those features (e.g. PCB, fans), but you are mainly adding a CPU, industrial design, an optical drive, possibly a HDD, networking (WiFi, Bluetooth, etc.), USB ports, etc. An expensive CPU would be a mistake, and that GPU setup is not expensive (just look at what was shipping when the 360 shipped); nor does it require much of a CPU. The rest is affordable/anticipated costs.

Also consider that as 28nm ramps up, high-end production moves to 20nm, and memory densities improve, it is hard to imagine that a GPU released in October 2010 at $179 could not see SIGNIFICANT cost reductions 3 years later in October 2013. In fact the 7850 (which is already seeing its retail price drop from the $249 MSRP) has 2GB of GDDR5, and based on densities and the smaller die size (212mm^2 vs. 255mm^2), it is likely that as 28nm matures it will be cheaper than the 6850 was at launch.
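As a rough illustration of why the smaller die matters, here is a crude wafer-area over die-area estimate (it ignores edge loss and defect yield, so real counts are lower; only the relative gap matters):

```python
import math

# Crude candidate-dies-per-300mm-wafer estimate: wafer area / die area.
# Ignores edge loss and defect yield, so real counts are lower; the point
# is only the relative gap between a 255 mm^2 die and a 212 mm^2 die.
def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_mm2)

hd6850 = dies_per_wafer(255)
hd7850 = dies_per_wafer(212)
print(hd6850, hd7850)   # 277 333 -- ~20% more candidates per wafer for the smaller die
```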
 
I personally offended a number of MS folks last generation with my signature and lengthy rants about how there is no such thing as an MS rumor, only MS leaks. Surprisingly no one has PM'd me this generation yet with hot leads like last time :LOL:

I think someone was a little upset that people were passing the Xbox 360 block diagram around 2 years before the launch. But it made for sooo much fun. The various, unsubstantiated, contradictory rumors + fear that all the console makers are turning out Wii-rehashes isn't as fun as, "OMGWTFBBQ Cell broadband 1TFLOPs on 90nm before Intel!!!111 Free Cyborg inside."

At least this time we have developers like Crytek, DICE, Epic, etc. speaking up quite vocally about wanting a lot of cores (dozens even), beefy 2TFLOPS+ GPUs, 8GB of memory, etc.

But I take that as a bad sign: they probably were hearing whispers of 4 cores, 1 TFLOPS GPUs, and 2GB of memory and were disappointed.
 
The only thing a leak of a true developer kit would tell us is who makes the CPU and who makes the GPU. There is no reason to think or believe that a developer kit is anything but the minimum spec that could be assembled with the CPU and GPU parts.

And something I thought about: with current graphics cards and CPUs in the PC world using every trick in the book to save power and heat, why not copy those tricks in the console world?

You take a beefy CPU and add a beefy GPU. Both would run very hot at maximum usage. You add a TDP/watt limit that the developers have to abide by. So games that require a lot of CPU can do that, and games that rely on the GPU can do that. You simply have a 250-watt power budget and you let the developers control where to use it.
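A minimal sketch of that shared-budget idea (all names and numbers here are hypothetical, just to illustrate one way a 250-watt cap could be split per title):

```python
# Minimal sketch of a shared 250 W budget that a game splits between CPU
# and GPU. All names and numbers are hypothetical illustrations.
TOTAL_BUDGET_W = 250.0

def allocate_power(cpu_request_w, gpu_request_w):
    """Grant both requests, scaling them down proportionally if they
    would exceed the shared budget."""
    total = cpu_request_w + gpu_request_w
    if total <= TOTAL_BUDGET_W:
        return cpu_request_w, gpu_request_w
    scale = TOTAL_BUDGET_W / total
    return cpu_request_w * scale, gpu_request_w * scale

# A GPU-heavy frame: the combined 280 W request is scaled to fit 250 W.
cpu_w, gpu_w = allocate_power(60, 220)
print(round(cpu_w, 1), round(gpu_w, 1))   # 53.6 196.4
```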

Far out?
 
4 cores and 2GB actually fits with that first pastebin rumor.

The PS4 will feature a 4-core 32-bit processor.
the PS4 will be shipped with 2GB.


But I'm afraid BG's cred is going to take a tumble when it's inevitably confirmed Wii U is PS360 level only...:p
 
I missed this thing earlier:
In this comparison, do you recall if the bandwidth and latency differences between Cell<->RSX and i7<->PCIe<->GPU were factored in?
Sorry but no, I can't remember it :(
I only have a vague memory of seeing a table with i7, cell and some GPU in there with the time spent per-frame on doing the FXAA resolve.
 
My bet is on a Vishera-derived CPU + a Pitcairn successor + 4GB of unified RAM for the PSNext. It will incorporate some APU-ish functionality so they can basically turn the thing into a SoC later in the lifecycle.
 
My take is:

PS4
AMD Kaveri APU+6/8 SPUs >> 192bit BUS >> 2-4GB XDR2(?)

X720
4 core IBM OoO CPU+7750 level discrete GPU+24/32MB EDRAM >> 128bit BUS >> 2-4 GB RAM
 
I think that there are multiple ways of thinking about this, mostly based on economics.

I don't think either MS or Sony wants to produce designs which are stupidly expensive to make; they lost too much money last gen that way. They don't want very hot designs either, since that cost MS a billion last time.

However, they must come out of the gate with a very powerful design which will last 5 years before becoming hopelessly outdated.

I think the best way to do this is relatively normal GPUs and loads of CPU power via many-core/many-thread CPUs running at moderate frequency. You have to keep the TDP down but need a huge amount of threaded performance. In many ways I am surprised they haven't gone for ARM-based designs with many cores. We will see.
 
Very tangentially related at best but, they've been discussing a new Sony presentation at GAF with this slide included http://www.sony.net/SonyInfo/IR/financial/fr/viewer/strategy/2012/


It seems to show a high operating profit projected in FY2014, which is the year ending March 2015...

FY11=April 2011-March 2012
FY12=April 2012-March 2013
FY13=April 2013-March 2014
FY14=April 2014-March 2015

Seems like pretty break-even PS4 hardware is in the plans? If you're looking at a late-2013 PS4 release, they expect the year beginning 5-6 months after the PS4's release to be nicely profitable. Even with a profitable PS3 helping in there, that's pretty aggressive.
 