Predict: The Next Generation Console Tech

Well, if it's a 6870 or 6950 and in any way indicative of the performance of the final system, then a lot of speculation has been correct, imo.

According to a lot of benchmarks, the performance order is 7770 < 6850 < 6950 < 7850. A lot of the speculation has been on a Cape Verde or an underclocked Pitcairn. I bet that depending on yields and what clock they can squeeze out at their power target, the final performance will likely land in that range. Seems reasonable.

On the other hand, if that's a 7950, then we're in the next tier of performance.

Oh, and as far as flop counts go, the 7850 is only about ~1.7 Tflops vs the 6950's 2.2 Tflops, yet it performs as well if not better all around.
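
Just to put rough numbers on that (a quick Python sketch; the shader counts and reference clocks below are the commonly quoted desktop figures, which is an assumption on my part, not something pulled from the kit):

[code]
# Theoretical single-precision throughput = shaders * 2 ops (FMA) * clock (GHz).
# Shader counts and clocks are the usual desktop reference figures (assumed).
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

cards = {
    "HD 7850 (GCN)":   (1024, 0.86),  # -> ~1761 GFLOPS
    "HD 6950 (VLIW4)": (1408, 0.80),  # -> ~2253 GFLOPS
}

for name, (sp, clk) in cards.items():
    print(f"{name}: {peak_gflops(sp, clk):.0f} GFLOPS")
[/code]

So on paper the 6950 has roughly a 25% flops advantage, yet per the benchmarks above it doesn't come out ahead.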
 
How would a 600 GFLOPS part from the 4xxx series compare to a 1.8 TFLOPS part from the 7xxx series? That's a 3x difference in terms of flops, but would the real-world gap be closer to that, or more like, say, a 6x gap?

If what I think is true, that could explain why lherre said that the difference between the Wii U and PS4/XB3 GPUs is bigger than Dreamcast to Xbox 1.
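
To make the question concrete (a toy Python sketch; the per-flop efficiency multiplier is purely an assumption for illustration, not a measured figure):

[code]
# Raw flops ratio vs. a hypothetical "effective" ratio once per-flop
# architectural gains are factored in. The multiplier is made up.
old_gflops = 600      # 4xxx-class part from the question
new_gflops = 1800     # 7xxx-class part from the question

raw_ratio = new_gflops / old_gflops          # 3.0x on paper
per_flop_gain = 2.0                          # assumed architectural gain per flop
effective_ratio = raw_ratio * per_flop_gain  # ~6x in this made-up scenario

print(f"raw: {raw_ratio:.1f}x, effective (assumed): {effective_ratio:.1f}x")
[/code]

So whether the gap looks like 3x or 6x comes down entirely to that per-flop multiplier, which nobody can pin down from spec sheets alone.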
 
I thought about the 6790, but there is definitely no red cover on the dev kit that I can see. If they have an HD 6870 or 6950, then I think they're definitely shooting above 2 Tflops.
 
Well, if it's a 6870 or 6950 and in any way indicative of the performance of the final system, then a lot of speculation has been correct, imo.
...

This would all seem to reiterate (to me) that which console is more powerful is still debatable and may never truly be known even after a few years of development.


7970 Durango beast confirmed :p



The 6790 has 800 SPs, so that's not too bad... however, that would probably put it more in league with Cape Verde.

Awesome detective work ITT though.

Yeah, its power is part of why I considered it an outside contender. But with customization of the GPU, where I personally expect a greater emphasis on GPU compute, a 6950 makes more sense as a comparable performance target.
 
Well, DaE did keep saying things about his dev kit like "it's a nice gaming PC" or the like. He didn't really make it seem like a piece of crap, like I would kind of expect a 7750- or 7770-equipped kit to be. For some reason that's in my head now. Although he also warned not to expect absolute top of the line.

It's good that Microsoft will never change and will always be the leakiest; we should get good Durango info throughout.
 
If MS delivers an Xbox 3 with a GPU over 2 TFLOPs of programmable shader performance, I will personally drive south to bkillian's house and give him one of my world-famous orange vanilla cheesecakes and offer to grill him Alaskan salmon. (If it has a 3 TFLOPs GPU, I will put the console in a box, bake the cheesecake, then take the console out of the box, throw the salmon on its lid, and grill it in the living room, George Foreman style!)

PS: Poor AlStrong only got a cheapo thrown-together cheese stick. Them's the breaks!
 
If MS delivers an Xbox 3 with a GPU over 2 TFLOPs of programmable shader performance, I will personally drive south to bkillian's house and give him one of my world-famous orange vanilla cheesecakes and offer to grill him Alaskan salmon.
...

Bookmarked.

In addition, make sure to invite me too.
 
What about, say, an 88xx or 87xx part? Is it too late to include that in a console?

Timing is only an issue if they didn't start thinking about it until now. The 360 launched with sufficiently advanced hardware; I don't see a reason they need to be digging around the archives for a part this gen (provided they had sufficient foresight).
 
Great detective work guys.

So despite all the apprehension, it looks like the GPU is closer to 2 TFLOPS and not the 1 TFLOPS bgassasin was suggesting from his source.

I never believed MS would launch a console significantly weaker than the competition and just let Sony take over the core audience (especially when Sony's the one bleeding red).

Given what we know of the Orbis specs:
http://www.vgleaks.com/world-exclusive-ps4-in-deep-first-specs/
Does the 720 look like it will be the more capable machine, given it has more memory and more CPU cores with a similar GPU?
 
So despite all the apprehension, it looks like the GPU is closer to 2 TFLOPS and not the 1 TFLOPS bgassasin was suggesting from his source.
...

My source said 1+ TF. Neither I nor he ever said it was 1 TF.
 
So despite all the apprehension, it looks like the GPU is closer to 2 TFLOPS and not the 1 TFLOPS bgassasin was suggesting from his source.
?

It's really difficult to conclude a whole lot, other than that neither the 6870 nor the 6950 is trash.

For example, a 6870 has 1120 SPs, yet it's easily outpowered in PC benchmarks by the 1024-SP HD 7850.

However, I've always wondered how much improved architecture matters versus raw FLOPS in a console. Part of me thinks console software can extract whatever raw flops exist, so an advanced architecture won't be a factor, or nearly as much of one, as on PC.

If my thinking is right, you'd basically rather have a VLIW4 or VLIW5 part with more flops per area in a console than a GCN part. But I could be wrong too.
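
Here's roughly what I mean, as a toy sketch (the utilization percentages are pure guesses for illustration, not measurements):

[code]
# "Effective" throughput = peak flops * achieved utilization.
# All utilization numbers below are illustrative guesses.
def effective_tflops(peak_tflops, utilization):
    return peak_tflops * utilization

vliw_console = effective_tflops(2.25, 0.80)  # VLIW4-class part, hand-tuned console code
gcn_console  = effective_tflops(1.76, 0.85)  # GCN-class part, console code
vliw_pc      = effective_tflops(2.25, 0.60)  # same VLIW4 part behind a generic PC driver

print(f"VLIW console: {vliw_console:.2f} TF, GCN console: {gcn_console:.2f} TF, VLIW PC: {vliw_pc:.2f} TF")
[/code]

Under those made-up numbers the fatter VLIW part comes out ahead in a console but not on PC, which is exactly the open question.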

But I kind of doubt MS is going with a VLIW4/5 part in the final anyway. Did they send these kits out before Southern Islands was available? I thought they went out in June or something, according to rumor? But I'm not sure at all.

I will reveal that back in June, a "trusted GAF source" who shall remain nameless :p confirmed to me in a PM that "weaker GPU, more RAM" was a correct basic summary of the Durango vs. Orbis situation at the time. That same source also added the caveat that the Durango GPU was "not finished".

I think 6870 vs 7850 in the Durango/Orbis dev kits would likely fit that bill?
 
My source said 1+ TF. Neither I nor he ever said it was 1 TF.

Yes, but everyone was taking that to mean nearer to 1, not 'slightly less powerful' than the PS4's 1.8 TF GPU.

Ok, so he did respond to that part; I couldn't remember. And I still think he's basing that on the CPU and memory, at least for now. Tying that in with the last question: when I was first told about PS4's GPU it was said to be "~2 TFLOPs", which I eventually learned was ~1.8 TFLOPs. When I first heard about Xbox 3's, I was told "1+ TFLOPs". So like I told the person, that doesn't sound very impressive compared to when I first heard about PS4's GPU (different people, by the way). And he said they were still working on the GPU for Xbox 3. That was about a month ago. I don't have an update on PS4's progress right now.

And yeah, I think the PS4 is pretty much set at whatever they plan it to be, other than determining the memory amount. I see Sony keeping tighter control over costs.

I think the problem was you were told the TFLOPS for each console's GPU by two different people, and your Durango source understated it as 1+ TFLOPS when really he should have said around 2.

It's really difficult to conclude a whole lot, other than that neither the 6870 nor the 6950 is trash.
...
I think 6870 vs 7850 in the Durango/Orbis dev kits would likely fit that bill?

Yeah, but we can infer quite a bit given we know both systems use AMD GPUs (and probably CPUs too), especially since the leaked Orbis specs are supposed to be accurate.

I think we can expect the GPUs to be pretty similar performance-wise? (At least with the caveat that that's going by what's in the current dev kits, and barring any major changes to the final parts.)

And given we know the 720 will have more RAM, the only big question mark still remaining is the Durango CPU, which has gone from PowerPC to Intel and now seems to be an AMD chip. Besides it probably having a lot of cores, we don't know much about how it compares to the quad-core 3.2 GHz Steamroller currently in Orbis.
 
Great detective work guys.
Yep, I would never have suffered through going over all those GPU pictures myself.
So despite all the apprehension, it looks like the GPU is closer to 2 TFLOPS and not the 1 TFLOPS bgassasin was suggesting from his source.
I would not draw any conclusions just yet from an alpha dev kit. Though it seems MSFT is aiming high, let's not forget that those cards consume a lot of power.
The HD 5870, HD 6870 and HD 6950 definitely offer almost the same performance; that's using the gross-average method.
FLOPS don't tell the whole picture, or at least one has to be careful with them.
You can compare the HD 6870 to the HD 5870: the former achieves ~90% of the latter's performance (no AA, 4x AA, 8x AA) with far fewer GFLOPS (2012 vs 2720). The gap in power consumption is telling too: one peaks at ~228 watts, the other at ~150 watts.
The HD 6950 is not a straight comparison as far as FLOPS are concerned, yet it performs almost exactly the same.
Out of those three cards, the one that offers the best perf per watt is also most likely the best in perf per dollar (of BOM).
Anyway, I would expect them to use the most up-to-date card in those alpha dev kits, which is the Radeon HD 6950.
I don't expect MS to use anything not based on GCN for a system releasing in late 2013.

The thing is, the HD 7850 outperforms all three of those cards. It's even more pronounced if you use a TechReport-style methodology (IMO most sites should not just adopt TechReport's take, but give the average and the spread /OT). And that's from a card "only" (so to speak...) worth 1761 GFLOPS.
It also consumes less power than the aforementioned cards.

The thing is, FLOPS are by far not the most telling metric for defining a GPU's performance.
If there were only one theoretical metric to guesstimate performance with, it would be:
Fillrate with blending.
It's a bit sad that after all those years, and the FLOPS nonsense of last gen, people are still hell-bent (with no objective reason) on FLOPS. People never learn, I guess.
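
For reference, the theoretical version of that metric is just ROPs x core clock (blending with wider formats then typically runs at some fraction of it); using the usual reference clocks, which again is an assumption on my part:

[code]
# Theoretical pixel fillrate = ROPs * core clock (GHz) -> Gpixels/s.
# Blend rate for wider formats is typically a fraction of this.
def fillrate_gpix(rops, clock_ghz):
    return rops * clock_ghz

print("HD 6870:", fillrate_gpix(32, 0.90), "Gpixel/s")  # 28.8
print("HD 6950:", fillrate_gpix(32, 0.80), "Gpixel/s")  # 25.6
print("HD 7850:", fillrate_gpix(32, 0.86), "Gpixel/s")  # ~27.5
[/code]

Which is a far tighter spread than the FLOPS figures would suggest, and fits the benchmark picture above much better.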

Overall I expect MS to undershoot even the HD 7850 in raw arithmetic throughput, but to remain really close in the ROP throughput department, and, if there is embedded memory in the system, possibly to outperform it in real-world applications (better-fed ROPs).
Pretty much what I would call a "Bartsization" (in reference to the jump from the HD 58xx to the 68xx) of Pitcairn. Along with a slightly slower clock (akin to the HD 7750) and fewer ALUs, that should get the power consumption down, as IMHO 105 watts is still on the high side.

I never believed MS would launch a console significantly weaker than the competition and just let Sony take over the core audience (especially when Sony's the one bleeding red).
Well, I hope otherwise. I hope Sony ships a core system for cheap even if it's less powerful, though I have little hope, as Sony has been acting dumbly for a while now...
Anyway, that's an empty statement, as FLOPS don't let you rank GPUs in real-world applications.
Given what we know of the Orbis specs:
http://www.vgleaks.com/world-exclusive-ps4-in-deep-first-specs/
Does the 720 look like it will be the more capable machine, given it has more memory and more CPU cores with a similar GPU?
Without more specific info on the CPU types and their clock speeds, it's tough to say (for Durango I mean, as your link assumes a quad-core Steamroller @ 3.2 GHz for Sony).
On the GPU side it's not clear either.
On the memory side MSFT seems set to have a clear edge (that's if Sony is stuck with 2GB).
But honestly, this early, that kind of guess is irrelevant and more of a bad omen for the upcoming FB wars that will plague the interweb next year... :LOL:
 
Yes, but everyone was taking that to mean nearer to 1, not 'slightly less powerful' than the PS4's 1.8 TF GPU.

I think the problem was you were told the TFLOPS for each console's GPU by two different people, and your Durango source understated it as 1+ TFLOPS when really he should have said around 2.

That I agree with, because as you see, I even told him that didn't sound impressive. But yeah, in hindsight he was probably playing it safe, since at the time they probably didn't know the exact performance, nor bothered to look at it as closely as we would. :LOL:

...
And given we know the 720 will have more RAM, the only big question mark still remaining is the Durango CPU, which has gone from PowerPC to Intel and now seems to be an AMD chip. Besides it probably having a lot of cores, we don't know much about how it compares to the quad-core 3.2 GHz Steamroller currently in Orbis.

Well, Intel was supposedly never in the picture. And both consoles are supposedly using Jaguar cores now. But to me, this so far does point to similar performance.
 
My source said 1+ TF. Neither I nor he ever said it was 1 TF.
Well, those alpha kits are not a definitive indication of how many metaphysical FLOPS the final GPU is going to push.
For all it's worth, your source could be right without there being a significant disparity in real-world performance.
(Barts achieved 90% of the HD 5870's performance with only 74% of the arithmetic throughput, 90% of the bandwidth, and ROPs running a bit faster.)
Mostly the same applies if you compare an HD 5850 to an HD 6850.
I would not be surprised if the HD 8xxx series follows the same pattern as the jump from the HD 58xx to the HD 68xx, with mostly the same results, though moving from the HD 79xx to the HD 88xx, as with a Pitcairn without embedded memory, you would have to deal with less memory bandwidth if they can no longer fit a 256-bit bus on the chip.
OT: In my opinion AMD should Bartsize Pitcairn, feed it with a 192-bit bus, and provide a really competitive HD 87xx part. /OT
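
Quick back-of-the-envelope on that Barts comparison, using only the ratios quoted above:

[code]
# Barts vs. Cypress, using the ratios quoted in the post above.
perf_ratio  = 0.90   # ~90% of HD 5870 performance
flops_ratio = 0.74   # with ~74% of the arithmetic throughput

perf_per_flop_gain = perf_ratio / flops_ratio   # ~1.22x more performance per theoretical flop
print(f"~{perf_per_flop_gain:.2f}x perf per theoretical flop for the 'Bartsized' part")
[/code]

That's the kind of per-flop gain that could let a "1+ TF" figure and a ~1.8 TF figure end up a lot closer in practice than the raw numbers imply.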
 
I still think Durango and PS4 will be very close.
BTW, is it possible MS is testing 22nm for the AMD GPU, and that's why we got news about "low yields" for Durango in early September?
 
BTW, is it possible MS is testing 22nm for the AMD GPU, and that's why we got news about "low yields" for Durango in early September?
No way that 22nm will make it into those systems.
 