Predict: The Next Generation Console Tech

And with this config, will it be possible to achieve games with Samaritan-level graphics, without quality compromises, at 720p 30fps?


Epic's estimates:
Last week Epic Games CEO Tim Sweeney told DICE 2012 attendees that the Unreal Engine 3 tech demo released last year, called "Samaritan," required 2.5 teraFLOPS to run at 1920 x 1080, 30 frames per second, with 48 operations per pixel (that rig was a monster in size too). By comparison, Microsoft's Xbox 360 is only capable of 0.25 teraFLOPS, meaning Microsoft would need to build a new console at least ten times more powerful to run the UE3 demo smoothly.

Current rumored PS4 specs have ~half of that.
 
And with this config, will it be possible to achieve games with Samaritan-level graphics, without quality compromises, at 720p 30fps?

Epic said Samaritan needs 2.5 TFLOPS for 1080p 30fps.
HD 6550D + HD 6670 would do a total of ~1.25 TFLOPS for 720p, so if we think in terms of "FLOPS per pixel," then yes.

Of course, getting those two GPUs to perform like the sum of their individual performances would be a challenge. Perhaps it could work through an approach like Lucid's Hydra - different GPUs rendering different objects.
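The "FLOPS per pixel" reasoning above can be sanity-checked in a few lines. This is a rough sketch: it assumes the required FLOPS scale linearly with pixel count, which ignores fixed per-frame costs.

```python
# Back-of-envelope check of the "FLOPS per pixel" argument.
# 2.5 TFLOPS for Samaritan at 1080p30 is Epic's quoted figure.
samaritan_tflops = 2.5
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_720p = 1280 * 720     #   921,600 pixels

# If required FLOPS scale linearly with pixel count (a rough assumption),
# the same per-pixel workload at 720p needs:
tflops_720p = samaritan_tflops * px_720p / px_1080p
print(f"{tflops_720p:.2f} TFLOPS")  # -> 1.11 TFLOPS
```

By that yardstick, ~1.25 TFLOPS even has a little headroom over the ~1.11 TFLOPS the straight scaling suggests.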
 
acert93: Nice cost breakdown, and it kinda puts things in perspective. To get that manufactured in 2013 you would be looking at $200-250... including Kinect 2... everything... the parts would be bought for cheaper than wholesale... likely AMD wouldn't be making much profit at all... no middlemen...

I really hope this is not true... I won't bother at all with these crappy specs...
You would think their hardware budget would be closer to $500, all in, for 2013.
 
Here's a question: have we ever gotten close to predicting the PS3's and 360's final specs one year before their official reveal? Could both MS and Sony still be screwing around with us, or with each other, at this point? These specs are just downright disgustingly weak, as if it's almost abnormal. I understand they need to save cost, but not to this level of stinginess.
 
Interesting: an AMD A8-3850 is $120 at retail, and an AMD Radeon HD 6670 1GB GDDR5 model can be found for $70. 4GB of memory isn't even $20. A 12x Blu-ray player can be found for under $60 at retail. I have seen 64GB SSDs get down to about $70. An FM1 motherboard is going to run $55. Pick up a generic crap case with stock PSU for $20. That is $415 at retail.
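For what it's worth, the retail figures above do sum as claimed (prices as quoted in the post, not actual console BOM costs):

```python
# Summing the retail prices quoted above.
retail_parts = {
    "AMD A8-3850 APU": 120,
    "Radeon HD 6670 1GB GDDR5": 70,
    "4GB RAM": 20,
    "12x Blu-ray drive": 60,
    "64GB SSD": 70,
    "FM1 motherboard": 55,
    "case + stock PSU": 20,
}
total = sum(retail_parts.values())
print(f"${total}")  # -> $415
```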

Put this into perspective for a console: that $415 is after the retailer takes a healthy cut (consoles usually have thin margins, with software having a bigger margin at retail), as well as a cut for certain manufacturers (e.g. Sapphire for graphics, Gigabyte for the MB; i.e. they purchase the components from AMD, Samsung, etc. and assemble the product, market it, distribute it, and then take their own cut). Then there are redundancies, as the video card alone is almost a complete MB, and the MB itself has a lot of excess bells and whistles that would be cut for a console. Further, they aren't going to use an SSD but will bulk-contract a single-platter HDD that costs less than $30. So two things grab me from this.

The first is that as of right now, early 2012, an AMD A8-3850 plus Radeon HD 6670 system looks like it is easily a sub-$300 console to manufacture. Once you cut out all the hands this sort of retail hardware passes through - the extra layers of marketing, distribution, and management - and factor in a console-specific design with hefty supply-contract discounts, it is inconceivable that by fall 2013 or later this sort of console could not be made extremely cheaply.

The second is that if, truly, both MS and Sony went this route, consumers could easily, within 18 months (fall 2013), assemble (or probably just plain buy prefab) a PC in the $400-$500 price range with a better CPU, a GPU on the order of 2x-3x faster, twice as much RAM, an SSD, and so forth - which would mean console ports running better all generation long on a launch-day PC than on the consoles.

I guess you could argue that Kinect 2 / Move 2 + EyeToy 3 are going to eat up a huge chunk of the console budget, and you would possibly miss out on some of that. Anyways, I just find the rumored specs for a 2013+ platform launch shockingly lower than I was anticipating, even with more constrained budgets in view. If this all comes to pass, I wonder if AMD made a really hard sell on their APUs, because they know that getting every console game developer to basically do R&D into maximizing their APUs helps them against Intel and Nvidia.

I don't believe you can just research off-the-shelf parts as if they're representative of what goes into a console.

X1900XTX - $499
MS Xbox360 $299

360 launched several months before the 1900XTX.

If they decide to take a loss, then I don't see how a 6670 fits into their plan. Even if they decide to break even, a Pitcairn will still fit the bill given its size.
 
Here's a question: have we ever gotten close to predicting the PS3's and 360's final specs one year before their official reveal? Could both MS and Sony still be screwing around with us, or with each other, at this point? These specs are just downright disgustingly weak, as if it's almost abnormal. I understand they need to save cost, but not to this level of stinginess.

Agree totally... the 'leaks' get worse the nearer we get... it's absurd to think next gen will only be 2x more powerful than current gen... some 8 years later?? :???:
 
Come on guys, the IGN leak, if true, could give us a hint about the hardware configuration of the console, but it's stupid to think Sony is going to put a 6670 in a console that will be released in 2013.
 
I don't believe you can just research off-the-shelf parts as if they're representative of what goes into a console.

X1900XTX - $499
MS Xbox360 $299

360 launched several months before the 1900XTX.

If they decide to take a loss, then I don't see how a 6670 fits into their plan. Even if they decide to break even, a Pitcairn will still fit the bill given its size.

If I understood his post correctly, he doesn't. I think the point was that if off-the-shelf parts at retail can almost cost as much as a console, imagine how cheap it actually gets inside an actual console, with all the price reductions you get in that format...
 
Come on guys, the IGN leak, if true, could give us a hint about the hardware configuration of the console, but it's stupid to think Sony is going to put a 6670 in a console that will be released in 2013.

I'd like to think that they won't.

I'm inclined to think that Charlie's S|A article on Sony's intent with PS4 is more along the lines of what is true:

http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/

So in the end, we close with a simple thought: the PlayStation 4 is almost undoubtedly an x86 part with AMD graphics too. That is only the very beginning though. If Sony can back up the boasting with real silicon, and the packaging elves can make it in quantity, it should be a game changer, pun intended. Sony is aiming for the moon just like they did for the PS3. Let's hope they come closer to the mark this time; game developers could sure use the power. - S|A

I'm thinking that perhaps IGN is either just fishing for hits, or that their source (whoever he/she is) may have mixed up the PS4 devkit with the Xbox720 one, given the curiously suspect similarities between the given specs and those next-gen xbox rumours.
 
Come on guys, the IGN leak, if true, could give us a hint about the hardware configuration of the console, but it's stupid to think Sony is going to put a 6670 in a console that will be released in 2013.
It was also stupid to believe Nintendo could release a console in 2006 that was just 2x GC. What seems stupid to techheads like us may not seem so stupid to business folk and system-designers knowing something we don't about their product.
 
I'd like to think that they won't.

I'm inclined to think that Charlie's S|A article on Sony's intent with PS4 is more along the lines of what is true:

http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/



I'm thinking that perhaps IGN is either just fishing for hits, or that their source (whoever he/she is) may have mixed up the PS4 devkit with the Xbox720 one, given the curiously suspect similarities between the given specs and those next-gen xbox rumours.
Why do you still think the next Xbox will use a 6670?
 
If I understood his post correctly, he doesn't. I think the point was that if off-the-shelf parts at retail can almost cost as much as a console, imagine how cheap it actually gets inside an actual console, with all the price reductions you get in that format...

Correct, that was the point I was attempting to communicate.

It was also stupid to believe Nintendo could release a console in 2006 that was just 2x GC. What seems stupid to techheads like us may not seem so stupid to business folk and system-designers knowing something we don't about their product.

Pretty much. All those who refuse to believe Sony or MS would put a 6670-class GPU in a console in 2013/2014 probably have not heard all the posturing about red ink, new ways of doing business, shifts in strategy, etc. Nintendo has, minimally, given hope that a console maker could push cheaper, unsubsidized hardware and not only make money but lead the market. Whether others can replicate that is a totally different story.

I don't care if it is MS or Sony, but if one goes with 6670 class hardware as a console fan I really hope the other packs in more "kit" and charges $100 more. I am willing to pay $100 more for 2x (plus) performance. The cost of doubling RAM and moving from ~100mm^2 GPU to 200mm^2 GPU isn't even close to $100. I think if one of the companies has the balls for such a move it would be rewarded with a windfall of sales. But that is just my prediction. I also think once you start looking at shrinks the larger chip console will see bigger cost savings as those smaller chips are not going to see solid cost reduction due to memory interfaces and such.
 
Well, not taking the late Nvidia and GF slides too lightly, I sadly wonder about the opportunity for significant cost reductions through shrinks.

One thing is that if there is an HD 6670 in a next-gen console, that will be a 40nm part in 2013, which says a lot about the state of foundry advancement. If there is truth to those rumors, it shows that MS and Sony have been forced to take that reality into account (and Intel is not an option).
---------------
To move elsewhere, I find it weird that the noise we were hearing about Durango seems to actually relate to the PS4. If there is any truth to these rumors, I believe MS and Sony will end up pretty close in performance.
With regard to Durango and the statement "it is like two PCs stuck together," I can't help but think that is the perfect description of what I said some pages ago (or further back in time): two SoCs in a head-to-head configuration. But which SoC? Kaveri? That starts to be a significant silicon and power budget.

I also wonder about the noise about developers asking the console makers to push the specs further. In a custom design that is pretty much vain; there is not much you can change without significant redesign and costs - basically slight clock-speed increases and RAM amount and speed. I was dismissive of an AMD CPU powering the next Xbox, but if MS were to honor such requests, that implies off-the-shelf hardware, as it can be changed pretty much "on the fly." So IBM would be out of the picture.

Off-the-shelf has implications for the RAM used, how chips are connected, etc.

So what do we have that has been announced? Trinity and Kaveri. The top SKUs go as such:

Trinity:
2 Piledriver modules / 4 cores
2 MB of L2 per module
3.8 GHz
6 SIMDs => 96 VLIW4 units, i.e. 384 ALUs (Northern Islands architecture)
@ 800 MHz / 614 GFLOPS
TDP 100 watts (for what it is worth)
32nm SOI

Kaveri:
2 Steamroller modules / 4 cores
the rest is unknown as far as the CPU is concerned
8 CUs => 512 ALUs (GCN architecture)
900 MHz / 921 GFLOPS
TDP 100 watts
28nm TSMC process

I expect Trinity to be about the same as Llano as far as die size is concerned, so no significant difference in cost.
I expect Kaveri to be a tad smaller than Llano/Trinity - maybe just south of 200 mm². The wafers could be a bit cheaper too.
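The FLOPS figures in the two lists above can be rederived from the ALU counts and clocks (a quick sketch: peak single precision, counting an FMA as two ops per clock):

```python
# Peak single-precision throughput: ALUs x 2 ops/clock (FMA) x clock (GHz).
def peak_gflops(alus, ghz):
    return alus * 2 * ghz

# Trinity: 6 SIMDs of VLIW4 units -> 96 x 4 = 384 ALUs at 800 MHz.
trinity = peak_gflops(96 * 4, 0.8)
# Kaveri: 8 GCN CUs of 64 ALUs each -> 512 ALUs at 900 MHz.
kaveri = peak_gflops(8 * 64, 0.9)
print(int(trinity), int(kaveri))  # -> 614 921
```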

Two SoCs head to head would be 200 watts; I would say that's unlikely.
A look at the mobile side of things would let one think there is room to tweak clock speeds.

Still, it's quite tricky to significantly outdo (on raw numbers alone - GCN is more efficient, and there should be improvements to the CPU cores and memory controller) what Sony could come up with using this path.
It could depend on where the devs asked MS to improve: RAM amount? RAM speed? GPU compute power? CPU power?

If it's RAM amount and CPU, two SoCs could be the way. SoCs (off the shelf) use DDR3, and even the fastest DDR3 is significantly cheaper than GDDR5, according to AMD's own numbers.

Either way, MS may do the same as Sony - APU + GPU - just with better parts. I would discard any GPU using a 256-bit-wide bus, so no Pitcairn or Barts for that matter.
Such a system could be Kaveri (2 modules, 8 CUs) and an HD 77xx (or their heirs at the time).

If the launch is in 2013 it may be a bit early for Kaveri (AMD could be late - not out of the ordinary lately...). Without Kaveri, I cannot see MS shipping a system that pairs a Northern Islands or Evergreen based APU with a GCN GPU; that would be a huge overhead, as the compiler does not generate the same code for Evergreen (VLIW5), Northern Islands (VLIW4), and GCN (scalar). That smells like a headache.
The same is true with Trinity: the only GPUs that use the same architecture are too high-end.

Altogether, it could be a reason why Sony would assert that MS won't make it for 2013: they may not have the matched APU and GPU ready for launch (the issue being the APU).

Two SoCs won't allow significantly more than the hypothetical Sony system (on raw GPU performance).
Actually, MS (using off-the-shelf parts) might not be in a position to do so at all if they launch in 2013.
MS is stuck with an Evergreen design if they are to launch in 2013.
The best they could come up with (raising the power budget) would be, imo:
A8-3850 (same as Sony, with possible clock tweaks, but that applies to Sony too)
HD 6770

That clearly won't make a night-and-day difference, but it will clearly do better (more FLOPS, twice as many ROPs). But whatever is possible on MS's system will be doable on Sony's too. There will be an impact on system TDP, noise, price, etc.

-----------------------

The speed at which foundries are pushing out new processes, and the associated (skyrocketing...) costs, are clearly impacting consoles. As I see it, Sony's system may underwhelm some people, but I can't see MS providing something so much better that it would be an "enabler" rather than just "somewhat more."

The best case I see is trading GPU for CPU throughput, aka two Trinitys head to head with 4GB of RAM (or more, though that's unlikely) - and Sony can raise the RAM amount, that's easily done. So mostly an 8-core vs 4-core CPU battle.

-----------------------

That's all I can tell based on off-the-shelf parts; custom designs are different altogether but can't be significantly altered "on demand."
 
And of course you can about triple that with console optimization.

Not really. The highest estimate I've ever seen from a developer (Carmack) was double, and that was almost certainly referring to DX9. PCs will be much more efficient running baseline DX11+ code by the time these consoles launch.

That said, I share your view that this is nowhere near as dire as some are thinking. You could be talking real-world results in line with a 6970 or even higher on the PC. Not exactly amazing, but still a massive jump over the current consoles.
 
Agree totally... the 'leaks' get worse the nearer we get... it's absurd to think next gen will only be 2x more powerful than current gen... some 8 years later?? :???:

It's nowhere near that small a gap. If this is true, then I'd expect the extra GPU is there just to simulate the greater power of APUs at the time of launch. So they're looking at an APU with roughly the power of 880 SPs at 1GHz. Throw in the greater efficiency of the new architecture and that's easily 10x the GPU power of the current generation. Add to that DX11+ features vs DX9+, 4-8x the RAM, and a vastly more capable CPU, and you've got a massive leap over last gen, at least as large as the current gen was over the gen before. And all at a very reasonable cost.

Sounds good to me!
 
Not really. The highest estimate I've ever seen from a developer (Carmack) was double and that was almost certainly referring to DX9. PC's will be much more efficient running baseline DX11+ code by the time these consoles launch.

That said, I share your view that this is no-where near as dire as some are thinking. You could be talking real world results in line with a 6970 or even higher on the PC. Not exactly amazing but still a massive jump over the current consoles.
The nice thing about having two GPUs is that you can render many different render targets at many different resolutions - and not only spatial resolutions, temporal too (like 60 vs 30) :)

Some asynchronous rendering, akin to triple-buffered rendering: say one GPU fills the G-buffer and then the other finishes the job with it. The former can start working on the next frame.

Plenty of the tricks we saw the PS3 achieve could be pushed way further.
Imagine a trick like occlusion, often done through a render target/mask on the SPUs - imagine how fast a proper GPU would run through something like that. Velocity buffers, etc.

Plenty of the stuff done by SPUs will get done in no time by an APU.

Clearly I like this setup, as it is somehow an heir of the PS3 design. You can see it just as an extra GPU, but GPUs can do much more than push pixels :)

With proper binning, and thus tile-based rendering, tiles could be dispatched to the two GPUs a bit like in Larrabee, achieving something way better than AFR.

The APU has ROPs too; it could handle shadows, for example, in parallel (cf. how ROP hardware has been used to do so this gen).

I can definitely see the same forward-thinking developers that came up with great tricks this gen making good use of such a design. As a plus, it's the same architecture for the plain GPU and for the "no matter what you use it for" APU GPU.
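The tile-dispatch idea can be sketched as a simple checkerboard split between the two GPUs. This is purely illustrative Python, not a real graphics API; `assign_tiles` and its tile size are made up for the example.

```python
# Toy sketch of dispatching screen tiles across two GPUs in a
# checkerboard pattern, which tends to balance load better than
# splitting the screen into two halves (names here are illustrative).
def assign_tiles(width, height, tile=64, num_gpus=2):
    """Return {gpu_index: [(x, y), ...]} tile origins per GPU."""
    buckets = {g: [] for g in range(num_gpus)}
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            gpu = ((tx // tile) + (ty // tile)) % num_gpus
            buckets[gpu].append((tx, ty))
    return buckets

tiles = assign_tiles(1280, 720)
print(len(tiles[0]), len(tiles[1]))  # -> 120 120
```

Unlike AFR, both GPUs work on the same frame, so there is no extra latency; the hard part (skipped here) is merging the results and handling work that crosses tile boundaries.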
 
It was also stupid to believe Nintendo could release a console in 2006 that was just 2x GC. What seems stupid to techheads like us may not seem so stupid to business folk and system-designers knowing something we don't about their product.

It could be, but that doesn't seem to fit with the approach they took with Vita. I mean, Cell-like projects are dead for Sony, but that doesn't mean they are going to stop betting on powerful hardware if the costs are reasonable.

From the PS4 rumour post...


My guess is that the current dev kit may be Llano + 7670, so that PS devs can learn to utilize and optimize for the APU and the shared memory between the CPU and GPU.

If you look at the performance of a 7670 (768 Gigaflops) and the 3850 (480 gigaflops), combined they can account for 1248 gigaflops and that's in the ballpark (on the high side) of what AMD is projecting for a 2013 APU's integrated GPU.

So my guess is that the final PS4 APU will have the equivalent GPU performance (in terms of gigaflops) of the 3850 and 7670, hopefully with more modern shaders than VLIW5.

It also makes no sense to me why Sony would actually use a Llano chip when Trinity is about to release with better power, performance, and rumored much better yields on 32nm.
 
It could be, but that doesn't seem to fit with the approach they took with Vita. I mean, Cell-like projects are dead for Sony, but that doesn't mean they are going to stop betting on powerful hardware if the costs are reasonable.
I wasn't saying otherwise. We just have precedent proving that what looks like the logical cost/performance target to us doesn't mean the system designers will agree. They may go over, or under. They are designing far more than just a collection of hardware components, as we're trying to do. They are designing a whole platform with a view to x number of years of software sales and content sales and whatever long-term vision they have. That might include radical changes such as short-life consoles with fast upgrade cycles, or upgradeable hardware, or cheap hardware with fancy gimmicks, or goodness knows what else. But there are always logical justifications for any given spec of hardware when considered as a business.
 
If you look at the performance of a 7670 (768 Gigaflops) and the 3850 (480 gigaflops), combined they can account for 1248 gigaflops and that's in the ballpark (on the high side) of what AMD is projecting for a 2013 APU's integrated GPU.

Yeah, their off-the-shelf Kaveri will have a GPU in the Southern Islands low-end 7750/7770 range.

And why not use a Southern Islands chip like the 7750 in the devkit now? Kotaku claimed it will be Southern Islands, so I don't fully trust IGN's reporting on both the Sony and MS consoles when they are claiming both basically have a 6670.
 