NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

I wouldn't hold my breath on either system making it portable. Even shrunk to 14nm these systems would still be pulling quite a few watts before a display.

This is a 2013 tablet using a 28nm Temash.
The Temash has a quad-core Jaguar probably at the same 1.6GHz, and probably something like a 2 CU GCN GPU.

Yet you believe that a 14nm SoC composed of an 8-core Jaguar + 12CU GCN GPU and lower power memory wouldn't fit into the same form factor?
I think you're being overly pessimistic.

Razer has already put out a tablet with an Ivy Bridge and a GK107. I wouldn't be surprised if the total transistor count is approximately the same between Durango and the Razer Edge.


Yes. Look at the 360. The shrink from 90nm to 45nm halved its power (being generous), down to about 100W.

Durango is at least 120W. You've got to reduce that by a factor of at least 10 to start talking about tablets.

Both the 360 and the PS3 used very high clocks. Even in 2013 we haven't seen mobile CPUs going over 2GHz, and even the most hand-picked 22nm 17W dual-core Ivy Bridge only boosts to 3.2GHz for a limited time, and only in single-core operation.
 
Yeah, for mobile gaming Sony took a look at the landscape and started two tracks:
* PS Mobile for third-party devices
* Vita for native games. According to a few devs, downporting PS3 games to Vita seems easy. I expect PS4 to be the same, but truthfully I think the games need to be optimized for mobile play somewhat. Something like Demon's Souls would be terrible since you can't pause the game. I would love a custom Demon's Souls title catered to mobile devices nonetheless.

Who knows, maybe all games in the future will have short progress segments to simplify porting to mobile devices. Great for busy people (Yesh!)
 
This is a 2013 tablet using a 28nm Temash.
The Temash has a quad-core Jaguar probably at the same 1.6GHz, and probably something like a 2 CU GCN GPU.

Yet you believe that a 14nm SoC composed of an 8-core Jaguar + 12CU GCN GPU and lower power memory wouldn't fit into the same form factor?
I think you're being overly pessimistic.

Razer has already put out a tablet with an Ivy Bridge and a GK107. I wouldn't be surprised if the total transistor count is approximately the same between Durango and the Razer Edge.




Both the 360 and the PS3 used very high clocks. Even in 2013 we haven't seen mobile CPUs going over 2GHz, and even the most hand-picked 22nm 17W dual-core Ivy Bridge only boosts to 3.2GHz for a limited time, and only in single-core operation.

A perfect shrink to 14nm would quarter the power. A quarter of 120-150W, plus powering a display? Is it going to come with a backpack for a battery?
 
And you can't simply divide the power like that. Stuff not resident on the SoC won't decrease linearly. Certain interfaces require a fixed amount of power, not everything will shrink, etc.
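Just to illustrate why the division isn't that simple (a back-of-envelope sketch; the SoC/non-SoC split and the scaling factors below are made-up assumptions for illustration, not leaked figures):

```python
# Back-of-envelope: what a node shrink does to total system power.
# All numbers here are illustrative assumptions, not measured or leaked figures.

def shrunk_system_power(soc_power_w, fixed_power_w, logic_scale):
    # Only the SoC logic scales with the process shrink; memory interfaces,
    # I/O, storage, fans and PSU losses stay roughly constant.
    return soc_power_w * logic_scale + fixed_power_w

# Hypothetical split of a ~120 W console: 90 W SoC, 30 W everything else.
print(shrunk_system_power(90, 30, 0.25))  # 52.5 W with a "perfect" quartering
print(shrunk_system_power(90, 30, 0.50))  # 75.0 W with more realistic scaling
# Either result is still roughly an order of magnitude above a tablet's
# power budget before you even light up a display.
```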
 
Because it can help you get closer to your peak performance. You can get the very best GPU, and if you put help alongside it, it will perform even better than it would on its own.

No it doesn't. A CPU with GPU-like processors helping out a GPU leads to better GPU performance. It doesn't mean it leads to a performance level greater than if those GPU-like processors actually sat on the GPU.
 
No it doesn't. A CPU with GPU-like processors helping out a GPU leads to better GPU performance. It doesn't mean it leads to a performance level greater than if those GPU-like processors actually sat on the GPU.

He is correct. There are developer slides (PhyreEngine) showing a 3x improvement over running on RSX alone for some techniques. The sum was greater than each part on its own.
 
A perfect shrink to 14nm would quarter the power. A quarter of 120-150W, plus powering a display? Is it going to come with a backpack for a battery?

And where exactly is there any rumour saying the Durango consumes a constant 120W?
I'd be surprised if it consumes much more than half of that.
 
Prepare to be surprised.

I'm not the one being surprised.

The Mobile Cape Verde @ 800MHz (10 CUs) consumes 40W (with GDDR5 included). A 40nm dual-core Brazos @ 1.7GHz consumes 18W (with an iGPU, but it's the closest we have to a Jaguar).

What you're saying is that the Durango will consume as much as a Mobile Cape Verde + 6 Brazos 2.0 dual-core APUs, which amounts to a 10-CU GCN GPU @ 800MHz/28nm, 12 Brazos cores @ 1.7GHz/40nm, and the equivalent of a 480-shader VLIW5 GPU @ 550MHz/40nm.

I'm sorry but I find that a bit ridiculous.
Yes, the SDKs might even come with a 120W power brick with alpha hardware, but the final unit? Nope.
 
He is correct. There are developer slides (PhyreEngine) showing a 3x improvement over running on RSX alone for some techniques. The sum was greater than each part on its own.

That doesn't indicate the sum being greater than its parts. That would indicate something done very poorly on the GPU being optimized to work a lot better on the SPEs.
 
That doesn't indicate the sum being greater than its parts. That would indicate something done very poorly on the GPU being optimized to work a lot better on the SPEs.

Not to mention that model, for particular workloads, may be true of RSX and Cell, but not on other GPU/CPU combinations.
 
I'm not the one being surprised.

The Mobile Cape Verde @ 800MHz (10 CUs) consumes 40W (with GDDR5 included). A 40nm dual-core Brazos @ 1.7GHz consumes 18W (with an iGPU, but it's the closest we have to a Jaguar).

What you're saying is that the Durango will consume as much as a Mobile Cape Verde + 6 Brazos 2.0 dual-core APUs, which amounts to a 10-CU GCN GPU @ 800MHz/28nm, 12 Brazos cores @ 1.7GHz/40nm, and the equivalent of a 480-shader VLIW5 GPU @ 550MHz/40nm.

I'm sorry but I find that a bit ridiculous.
Yes, the SDKs might even come with a 120W power brick with alpha hardware, but the final unit? Nope.

Can we please stop comparing the GPU to specially binned mobile parts?

The desktop 7750 is a 55W* card. The 7870M is 45W, not 40W. Durango has 20% more CUs. In addition to the CPU, it also has DMEs, scalers, 32MB of ESRAM, etc., and probably an ARM security core, going by the rumors. It's going to be at least 120W.
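Putting rough numbers on that kind of budget (a sketch only; every figure below is either taken from this thread or is a labelled assumption, not a disclosed Durango spec):

```python
# Rough system power budget in the spirit of the post above.
# Figures come from the thread or are explicitly assumed for illustration.

budget_w = {
    "GPU: 7870M-class 45 W scaled by ~20% more CUs":  45 * 1.2,  # ~54 W
    "CPU: 8 Jaguar cores (assumed)":                  25,
    "ESRAM, DMEs, scalers, ARM core (assumed)":       10,
    "DDR3, I/O, storage, fan (assumed)":              15,
    "Kinect (18 W figure cited later in the thread)": 18,
}
dc_total = sum(budget_w.values())      # ~122 W at the rails
psu_efficiency = 0.85                  # assumed wall-to-DC conversion loss
print(f"DC total: ~{dc_total:.0f} W, at the wall: ~{dc_total / psu_efficiency:.0f} W")
```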
 
And where exactly is there any rumour saying the Durango consumes a constant 120W?
I'd be surprised if it consumes much more than half of that.

Are you referring to the chip or the entire system?
I'm thinking 120W for the system is going to be close to the mark. Remember it's probably going to have Kinect and what not. Kinect is consuming 18W currently if I'm not mistaken.
 
Numbers, or it never happened.

Latencies are important, but if you re-read my post, I point out that, based on the leak, the actual disclosed specs all pretty much support Durango being a 7770-class GPU.

This isn't to say it won't be faster: it is a closed box with a unique design, of course it will be.

This isn't to say it won't produce better-looking games than a comparable PC: lower overhead and targeted specs, as well as exploiting platform-specific features (instead of a general API), will go a long way.

And having 102GB/s of bandwidth on-die will be a big win for those able to manage 32MB.

But I think it is well past time, until something substantial is disclosed, to stop grasping at "special sauce this" and "latencies that."

What special sauce? <crickets>

What are the latencies? <crickets>

If you asked my opinion, I think Durango's 12 CUs actually could run 7850-quality graphics/performance in the launch window when you factor in targeting a closed platform, a thinner API, ESRAM, etc. So I am not knocking Durango's abilities. But in the same breath, quoting people like Proelite saying it will mimic a 2.5 TFLOPs GPU (yeah, at 720p?) ignores the fact that I bet Orbis in the same situation is going to look better than a 7870 when the hardware inside is less than a 7850, for all the same reasons.

But I don't see people clamoring, "Secret sauce is going to make Orbis' effective FLOPs skyrocket because of an unknown ingredient!"
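For a sense of what "able to manage 32MB" means in practice, here's a quick capacity check (the render-target layout below is a hypothetical example, not any actual title's setup):

```python
# How much of a 32 MiB ESRAM budget a simple deferred G-buffer would use.
# The target layout is a hypothetical example, not a real title's setup.

def rt_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

ESRAM = 32 * 1024 * 1024   # 32 MiB on-die, ~102 GB/s per the rumoured specs

for w, h in [(1280, 720), (1920, 1080)]:
    targets = [
        rt_bytes(w, h, 4),   # albedo, RGBA8
        rt_bytes(w, h, 4),   # normals, RGBA8
        rt_bytes(w, h, 4),   # depth, D32
        rt_bytes(w, h, 8),   # HDR lighting, FP16
    ]
    total = sum(targets)
    print(f"{w}x{h}: {total / 2**20:.1f} MiB of {ESRAM / 2**20:.0f} MiB")

# 1280x720  -> ~17.6 MiB: fits with room to spare.
# 1920x1080 -> ~39.6 MiB: already over budget, so targets have to be tiled
#              or spilled to main RAM - that's the management cost.
```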


Completely agree.

If Durango is said to be efficient and have low latency based on nothing, Orbis should be the same. After all, it's wishful thinking that the one with the slower RAM and smaller bandwidth will be ultra-efficient while the other, with good bandwidth and faster RAM, will not be the same or close.

I also don't get how 32MB of eSRAM will make up for 16 fewer ROPs and 6 fewer CUs, 4 of which are said to be specialized for compute, and other things as well.

I just don't see it.
 
I definitely prefer Orbis over Durango, and you probably do too, but that doesn't mean that Durango will fail.

Agree

That is something that strikes me too: people who think that because Orbis is more powerful (debatable, on paper) Durango will fail. It's like this generation never happened, as if power is really what helped MS win two generations in a row.

I'd say I don't see how Durango can make up for the gap with the current spec leaks, but I think saying or thinking that it will fail because it's not as powerful as Orbis is a joke, and it shows how little people learned from this generation, which the weakest console won.
 
Agree

That is something that strikes me too: people who think that because Orbis is more powerful (debatable, on paper) Durango will fail. It's like this generation never happened, as if power is really what helped MS win two generations in a row.

I'd say I don't see how Durango can make up for the gap with the current spec leaks, but I think saying or thinking that it will fail because it's not as powerful as Orbis is a joke, and it shows how little people learned from this generation, which the weakest console won.

You have to remember where MS was at launch with xbox and xbox360.

They were clawing tooth and nail to get into the console biz. Sony was dominant with ps1 and even more so with ps2. But having multiplat superiority (for the most part) helped them gain a foothold in what was once a dominated market.

I don't foresee a lopsided failure as in WiiU, but they will certainly feel pressure in areas they had secured last gen. Specifically, I think they will need to react in their online offering as the specs on hand will lead to multiplats looking better on orbis which will lead to more peer purchases on orbis which will lead to less desire/need to pay for xblg for multiplayer (which was the dominant reason for xblg purchases).



I will add one caveat to my "doom and gloom" scenario for MS - we still don't know what GPU is in the box. If the GPU is PowerVR Tile based, all bets are off.
 
You have to remember where MS was at launch with xbox and xbox360.

They were clawing tooth and nail to get into the console biz. Sony was dominant with ps1 and even more so with ps2. But having multiplat superiority (for the most part) helped them gain a foothold in what was once a dominated market.

I don't foresee a lopsided failure as in WiiU, but they will certainly feel pressure in areas they had secured last gen. Specifically, I think they will need to react in their online offering as the specs on hand will lead to multiplats looking better on orbis which will lead to more peer purchases on orbis which will lead to less desire/need to pay for xblg for multiplayer (which was the dominant reason for xblg purchases).



I will add one caveat to my "doom and gloom" scenario for MS - we still don't know what GPU is in the box. If the GPU is PowerVR Tile based, all bets are off.

The rumours have it as an AMD GCN variant, something like Cape Verde. That's going by VGLeaks. Is there any reason you'd have to believe that part of the rumour is not true? Why believe the Orbis rumours to be true if they are coming from the same source?
 
I'd bet they explored the idea for a while, but the 2nd GPU was always going to be part of an ARM-based SoC for low-power operations and managing the overlays and notifications. I'm sure they gave up on the idea when they decided it would be too costly to integrate with the rest of the design relative to any advantages over AMD's modern power-saving techniques.

Of course they explored the idea - it was in the leaked roadmap.

They decided it was a bad idea early on though, much before the kits came out.
 
Does the triangles/vertices per second rate matter? Like 1.6 billion triangles/s for the Liverpool GPU. What is the triangle rate of the Durango GPU, and how do both compare to last gen and top-end GPUs like the GTX 680 and 7970?

It determines the system's ability to draw polygons, so more can lead to more detailed 3D character models and scenery. Of course, the use of tessellation would make it less important than it was in previous generations.

For comparison, the systems you mentioned have the following setup rates (triangles/s):

PS3: 0.25 billion
Xbox 360: 0.5 billion
7970 GE: 2.1 billion
GTX 680: 4 billion
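To put those peak rates in per-frame terms (a quick sketch; the Liverpool figure is simply the 1.6 billion/s quoted in the question, and real games never sustain peak setup rates):

```python
# Peak triangles per frame implied by the setup rates discussed above.
# These are theoretical peaks; real workloads sit well below them.

rates_tri_per_s = {
    "PS3": 0.25e9,
    "Xbox 360": 0.5e9,
    "Liverpool (per the question)": 1.6e9,
    "7970 GE": 2.1e9,
    "GTX 680": 4.0e9,
}

for fps in (30, 60):
    print(f"--- at {fps} fps ---")
    for name, rate in rates_tri_per_s.items():
        print(f"  {name}: {rate / fps / 1e6:.0f} million triangles/frame")
```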
 
You have to remember where MS was at launch with xbox and xbox360.

They were clawing tooth and nail to get into the console biz. Sony was dominant with ps1 and even more so with ps2. But having multiplat superiority (for the most part) helped them gain a foothold in what was once a dominated market.

I don't foresee a lopsided failure as in WiiU, but they will certainly feel pressure in areas they had secured last gen. Specifically, I think they will need to react in their online offering as the specs on hand will lead to multiplats looking better on orbis which will lead to more peer purchases on orbis which will lead to less desire/need to pay for xblg for multiplayer (which was the dominant reason for xblg purchases).



I will add one caveat to my "doom and gloom" scenario for MS - we still don't know what GPU is in the box. If the GPU is PowerVR Tile based, all bets are off.

Having multiplatform superiority had nothing to do with how MS performed this gen.

Sony screwed up with a $600 console; if multiplatforms were actually the reason, the Wii would not have won.

Not to mention that RROD also sold several million more 360s. I have gone through 2, and all the friends and relatives I know who had a 360 have also had more than 1, some as many as 3.
 