NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

These articles say basically the same thing as the EDGE and VGLeaks articles. So you mean parts of what came to the surface over the last two months are totally wrong, and almost every gaming site on the web is chasing a mirage?



I guess "same CPU" means "same CPU". Why would it mean something else? He's a native speaker of the English language (I'm not btw) so I assume he definitely knows how to put his thoughts to paper.

Re-read vgleaks: 8 Jaguar cores for PS4, 8 x64 cores for Durango. I think the DF article is based on vgleaks, and I guess he is assuming both CPUs are the same based on the "8-core CPU" line.

They are even using the "Jaguar" codename for Orbis:
http://www.vgleaks.com/wp-content/uploads/2013/01/apu-600x248.jpg

And they are not using Jaguar for Durango:
http://www.vgleaks.com/wp-content/uploads/2013/01/durango_arq1.jpg
 
I guess "same CPU" means "same CPU". Why would it mean something else? He's a native speaker of the English language (I'm not btw) so I assume he definitely knows how to put his thoughts to paper.

I dunno, read some of his face offs? :LOL:

I'm merely giving him the benefit of the doubt, that or his sources didn't even know anything about Durango's CPU besides the fact they're both using 8 core jaguar parts (Or Orbis' for that matter), so they're "the same" in that regard. Many people say 360's CPU cores are the same as PS3's PPE despite the fact there are some differences between the two.
 
100GFLOPS CPU vs 200GFLOPS CPU doesn't sound "the same", does it? We're talking about a performance difference of 100%.

Yeah, but if all they saw was 'based on Jaguar', 1.6 GHz, and 8 cores...it would lead most ppl to simply assume they use the exact same thing. What was the first rumor that talked about Durango's CPU being a Jaguar derivative? Anyone know off hand?
 
Yeah, but if all they saw was 'based on Jaguar', 1.6 GHz, and 8 cores...it would lead most ppl to simply assume they use the exact same thing. What was the first rumor that talked about Durango's CPU being a Jaguar derivative? Anyone know off hand?
There is no valid documentation they could have seen that would have said "based on Jaguar".
 
100GFLOPS CPU vs 200GFLOPS CPU doesn't sound "the same", does it? We're talking about a performance difference of 100%.



And Jaguar is not an x86-64 CPU?

I'm not saying that. I mean they are not using the Jaguar name for Durango; they are using a "generic" x64. Why not use Jaguar instead?
 
They are even using the "Jaguar" codename for Orbis:
http://www.vgleaks.com/wp-content/uploads/2013/01/apu-600x248.jpg

And they are not using Jaguar for Durango:
http://www.vgleaks.com/wp-content/uploads/2013/01/durango_arq1.jpg

So what can it be then? The 64-bit extension means AMD or Intel, but not IBM, not ARM. Since Durango uses a GCN GPU, Intel makes no sense at all if MS wants the advantages of HSA. If we assume it's HSA then it can only be Steamroller or Jaguar, since GCN is 28nm. But a 1.6GHz Steamroller again makes no sense at all. It's a Jaguar or it's not HSA, easy as that. Or it's a 32nm HSA based on (edit) Trinity and VLIW, which would be a huge disadvantage compared to Orbis.
 
None of the AMD cards max out their ROPs; there are always more than enough. Durango is weird: it doesn't have enough ROPs to max out its 170GB/s. How's that supposed to be called an efficient design?

Some memory bandwidth is going to be needed to feed the CPU and some will also be used by the DMEs.
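
A back-of-envelope check on that claim (Python; the 800 MHz GPU clock is the rumoured figure, and the plain-colour/blended pixel costs are simplifying assumptions, not anything from the leaks):

Code:
# Can 16 ROPs even consume ~170 GB/s? Rough fill-bandwidth estimate.
GPU_CLOCK_HZ = 800e6   # rumoured Durango GPU clock (assumption)
ROPS = 16
BYTES_PER_PIXEL = 4    # plain 32-bit colour write, no Z, no MSAA

peak_fill = ROPS * GPU_CLOCK_HZ           # 12.8e9 pixels/s
write_bw = peak_fill * BYTES_PER_PIXEL    # colour writes only
blend_bw = write_bw * 2                   # alpha blending: read + write

print(f"colour writes: {write_bw / 1e9:.1f} GB/s")   # ~51 GB/s
print(f"with blending: {blend_bw / 1e9:.1f} GB/s")   # ~102 GB/s
# Both sit well below the ~170 GB/s aggregate figure, which is
# the mismatch being pointed out above.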
 
100GFLOPS CPU vs 200GFLOPS CPU doesn't sound "the same", does it? We're talking about a performance difference of 100%.

And again we're just now getting murmurs of the CPUs being pretty different. Assuming Leadbetter wrote his article based on VGLeaks info which didn't note said discrepancy, why would one expect Leadbetter to note it? Sure the guy is intelligent but he's no psychic. He can only go by what information he's given at the time, much like B3D. If you go back a month most people weren't even talking about the CPUs because we didn't have much information on them and others thought they were the same based on rumors that had been gathered up till that point.
 
So what can it be then? The 64-bit extension means AMD or Intel, but not IBM, not ARM. Since Durango uses a GCN GPU, Intel makes no sense at all if MS wants the advantages of HSA. If we assume it's HSA then it can only be Steamroller or Jaguar, since GCN is 28nm. But a 1.6GHz Steamroller makes no sense at all. It's a Jaguar or it's not HSA, easy as that. Or it's a 32nm HSA based on Llano and VLIW, which would be a huge disadvantage compared to Orbis.

Maybe it's a very customized Jaguar CPU and they are not using the same codename? Maybe both are using the same CPU, but vgleaks is clear: for Orbis they use direct names, Steamroller first and Jaguar later; for Durango they have been using x64 from the first leak.

And I'm not talking about IBM or Intel. I mean an AMD CPU, maybe a custom Jaguar with another codename.
 
What evidence? Please lay out the arguments and supporting evidence/logic for me that forces rational thinkers to conclude what you have here. All signs point to MS's specs being settled well prior to Sony's, so I'm not sure how you've convinced yourself that MS reacted to some Orbis specs somehow.

That's easy. What do these specialized parts do? Let's take a look:

ESRAM- provides a high-bandwidth pool of memory, because the DDR3 main memory bandwidth would be cripplingly slow for the targeted performance. Orbis uses high-bandwidth main memory that exceeds Durango's aggregate bandwidth. + may have latency advantages for certain workloads; - very small pool, and significant bandwidth overhead is required to copy data back and forth for those benefits to be exploited.

DMEs- manage the movement of data between the two memory pools while offering on-the-fly data decompression. Since Orbis has only one pool of memory it doesn't need specialized units for managing that, and it already possesses DMAs, so shader resources aren't wasted on memory operations. + can easily move data around a lot; - you have to move data around a lot.

Everything else that has been posited as some kind of "special sauce" for Durango is either replicated by definition on the GCN design of Orbis' GPU, or through its own specialized audio, decompression and output hardware.

And I never said MS reacted to Sony. You imagined that part, which says more about your biases than mine.

I said these design choices were required for good efficiency given Microsoft's choice to use relatively slow DDR3 for the main memory bus. A large, single pool of high bandwidth memory, with adequate low latency L2 SRAM cache, is the ideal configuration for a high performance console design. It's why you see lots of GDDR5 on high end GPUs with significant SRAM caches at different levels. If you took a normal 1.2TF GCN GPU and only gave it 68GBps of memory bandwidth it had to share with the CPU, it would be crippled. So MS added stuff to mitigate the problems that created. Those solutions don't take Durango above Orbis in efficiency, they bring it back in line with a design like Orbis or a high end GPU. Nothing more.
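
A toy model of that copy-overhead argument (Python; the 68 + 102.4 GB/s split comes from the leaked figures, the copy fractions are invented for illustration and this is not how the hardware actually accounts bandwidth):

Code:
# Hypothetical: how much of the ~170 GB/s aggregate survives once you
# spend bandwidth copying data between DDR3 and ESRAM?
DDR3_BW = 68.0     # GB/s (leaked figure)
ESRAM_BW = 102.4   # GB/s (leaked figure)
AGGREGATE = DDR3_BW + ESRAM_BW

def effective_bw(copy_fraction):
    # Every byte shuffled between pools costs a read on one bus
    # and a write on the other.
    copied = copy_fraction * min(DDR3_BW, ESRAM_BW)
    return AGGREGATE - 2 * copied

for f in (0.0, 0.10, 0.25):
    print(f"{f:.0%} of the smaller bus spent copying -> "
          f"~{effective_bw(f):.0f} GB/s left for real work")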

Please explain your reasoning for me because your posts here and at GAF come off as very defensive at times and I'd rather not bother wading into some back and forth unless you are willing to share your logic openly. Just saying. ;)

Yes, I must seem very defensive given I'm constantly trying to correct exactly the kind of misconceptions and magical thinking that is running rampant right now from people like you. Here's a question: if you think Durango is more efficient than Orbis in design, how about you explain how that is possible when it has fewer resources to work with and more limitations at basically every turn?

Eh, we already know that the hardware in Orbis is going to be less efficient in some ways than the hardware in Durango.

Just look at the 32 versus 16 ROPs. Those 32 ROPs, for example, aren't likely going to be fully utilized with the limited bandwidth and CUs it has compared to something like a Radeon 79xx. The 16 ROPs, on the other hand, will be able to make better use of their available bandwidth and resources. If I were to take a hand-waving guess, 20-24 ROPs might be ideal for highest efficiency given the resources available in Orbis. But ROPs can't be scaled with that fine-grained a granularity. It's either 16 or 32, nothing in between.

In other words, the 16 ROPs in Durango will have higher utilization (thus higher efficiency) than the 32 ROPs in Orbis in most cases.

I don't even think that is accurate. It's too narrow a view. I guess technically if Durango is constantly saturating its ROPs and Orbis never can, we might say that constitutes higher "efficiency". But we should be looking at the entire design. If Orbis can never be fill limited because it always has more ROPs than it needs, but Durango CAN become fill limited, that implies greater efficiency overall for Orbis. Maybe Durango strikes a better balance in terms of ratio, or even die space committed to RBEs, but that's not the same as being more efficient when running actual games.
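
One way to make the "fill limited" point concrete (Python; the clock, ROP counts and bandwidths are all from the rumours, and blended 32-bit pixels costing 8 bytes each is a simplifying assumption):

Code:
# Which runs out first: ROP throughput or memory bandwidth?
CLOCK_HZ = 800e6              # rumoured GPU clock for both designs
BYTES_PER_BLENDED_PIXEL = 8   # 32-bit colour, blending = read + write

def limits(rops, bw_gbs):
    rop_limit = rops * CLOCK_HZ                          # pixels/s the ROPs retire
    bw_limit = bw_gbs * 1e9 / BYTES_PER_BLENDED_PIXEL    # pixels/s the bus can feed
    return rop_limit, bw_limit

for name, rops, bw in (("Orbis", 32, 176.0), ("Durango", 16, 170.4)):
    rop_l, bw_l = limits(rops, bw)
    bound = "bandwidth" if bw_l < rop_l else "fill (ROP)"
    print(f"{name}: ROPs {rop_l/1e9:.1f} Gpix/s vs bus {bw_l/1e9:.1f} Gpix/s "
          f"-> {bound} limited")

Under those assumptions Orbis hits the bandwidth wall before its ROPs saturate, while Durango saturates its ROPs first, which is exactly the utilization-versus-fill-limit trade-off being argued here.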

I'm not saying that. I mean they are not using the Jaguar name for Durango; they are using a "generic" x64. Why not use Jaguar instead?

MS seemingly took great pains to avoid using standard AMD nomenclature for basically every single part of the GPU in these leaks, why would we think it strange when they also obfuscate the CPU's name? The fact that they don't specify that they are Jaguar cores doesn't mean they must be something else.
 
Toshiba is a partner of Sony on the PlayStation brand... ;)
No, they are not.
In fact Toshiba even used Cell in some of their TVs.
Cell is the product of an IBM, Sony and Toshiba collaboration. That's as far as it goes.
Trying to negate Sony's advantage in the hardware department is silly.
No, claiming Sony can include hardware while at the same time saving money is silly. That isn't how the world works - not even wearing fanboy rose-tinted glasses.
 
They are billions in debt, yet they buy Gaikai for close to 400 million, just for gaming purposes.

This is an investment. Sony may be in debt, but they still sell 50 billion of hardware yearly; the investment is nothing if the reward is there. Funny, MS has money to burn, yet will Durango have a top-of-the-line GPU like last gen? Is this a signal of weak MS pockets? No, it's risk vs reward; the PS3's $800+ cost-to-manufacture fiasco will not happen again.
Why do you keep bringing up Microsoft? Microsoft are not a factor; Sony owe a lot of money. Buying Gaikai was a gamble. They took it. They're still in debt. Maybe it will work, maybe it won't, but you talk about it like it's a done deal. Like Sony have won the next generation.

What are you drinking over there? Seriously?
 
Everyone here knows Sony's financials. However, if you don't invest, you don't get returns. If Sony's business thinking is like yours, why add a touch panel on the controller? Or a speaker? Why use 18 CU when they could use 16? You have to get the right set of features, and Sony's prior success with casual camera gaming is something they can capitalise on again, plus compete with their major rival.
Touch panel? Cheap and easy to integrate not to mention better for navigating interfaces. Speaker? Cheap and easy to integrate. 18 CUs? Cheap and easy to integrate.
Business is more than looking at the bank balance and shying away from anything expensive.
Business is also about not throwing money away on unnecessary things. The Wii sold a crap-load of consoles. What didn't sell was a crap-load of games - at least from mainstream developers. Consumer demand for motion-control games is far from assured.
They've done augmented reality on PS2 (EyeToy), PSP (Invizimals), PS3 (EyePet) and Vita. It's pretty obvious they value AR. We've also learned they are supposed to be adding 3D cameras for depth perception.
Capturing the user and depth perception are one thing, but you said 3D augmented reality. That is capturing a person in 3D for replication on a 3D TV. If Sony is wasting money on such a niche requirement, they really have lost it.
If they are adding stereoscopic cameras and a mic array (also see the Kinect thread about the difficulty of having microphones in the case suffering from fan noise issues), adding these on an external USB device will add very little extra cost on top of the BOM for the components.
What fan noise? You've not seen the console! Noise-cancelling technology is also cheap. And no, a sensor bar is a product in itself. You need to manufacture the whole thing - which is a heck of a lot more expensive than building components into something else. I don't know if you've ever been involved in the production of a product, but producing add-ons like this is expensive. You have to design and test them, certify them against safety standards, have them separately manufactured and shipped around for inclusion in the box.
 
So two Xbox fans mention twice the FLOPS and the rest of you grasp it like it's gospel; it reeks of desperation.

I doubt there will be any difference in the CPUs other than the name. And even if it were twice as powerful, it would be some 40 GFLOPS down because of the OS.

I know people want to believe Sony are reserving cores and CUs for the OS, but that is wishful thinking on their part, hoping Orbis comes out weaker than it looks.
 
ESRAM- provides a high-bandwidth pool of memory, because the DDR3 main memory bandwidth would be cripplingly slow for the targeted performance.

DMEs- manage the movement of data between the two memory pools while offering on-the-fly data decompression.

Your arguments here rely extensively on your view that Durango was designed first and foremost to be a media box, which requires lots of RAM, which would need to be DDR3 to stay affordable, which then requires extra hardware to boost effective bandwidth just to perform up to snuff with the baseline GPU it is supposedly based off of, which is weaker than the one in Orbis. I don't necessarily disagree with most of this logic so much as I'm not sure we can take the initial premise at face value at this point. That is an assumption on your part and since everything else relies on that, it's a pretty hefty one worth vetting.

Yours isn't the only plausible interpretation of how things might fit together as I understand it. Isn't it also possible that MS talked to major graphics gurus like Sweeney and Carmack et al who told them they wanted to see graphics architectures built around virtualized assets through and through? The existing stuff on PC's (like the 7970) sounds like its PRT support is actually somewhat limited. AMD themselves warned against trying to leverage it too much due to RAM size constraints being too low in many PC's (scalability would be the issue here). MS wouldn't have that concern on a console though. So is it not also possible that MS saw that approach as a solution to both challenges?

I mean, they can do the cheap DDR3 and the 8GB is now a boon for performance. The bandwidth issue in that scenario isn't all that troublesome as virtualized assets don't require a lot of bandwidth to get the same visual output. The drawback is managing all the tiles and whatnot, right? Correct me on this stuff because I honestly dunno, just going from what I've read. For all we know the DME's and eSRAM are there to help with all of that (hence the extra functionality of the DME's over normal DMA's). Virtualized assets also don't require as much processing, especially in terms of AF, right? If I'm remembering this correctly, that may well suggest they found ways to produce what they felt was a competitive platform for housing next gen game engines with a highly specialized architecture built around a mid-range GPU.

So it seems possible that they didn't start from the pov that they needed to gimp the graphics architecture and thus had to bolster its performance with extra kit so much as trying to make a setup built around leveraging virtualized assets that simply don't need as much bandwidth/processing to yield the same visual results.
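
For a sense of scale on that virtualized-assets argument, a toy comparison (Python; the tile size, texture count and visible-tile count are all made-up illustrative numbers, not from any leak):

Code:
# Virtual texturing: only the tiles the camera actually samples need to
# be resident, instead of whole textures plus their mip chains.
TILE_BYTES = 128 * 128 * 4       # one 128x128 RGBA8 tile

def full_chain_bytes(dim):
    # Base level plus full mip chain (~4/3 of the base level).
    return int(dim * dim * 4 * 4 / 3)

unique_4k_textures = 200         # hypothetical scene
visible_tiles = 5000             # hypothetical screen-space tile demand

naive = unique_4k_textures * full_chain_bytes(4096)
virtual = visible_tiles * TILE_BYTES

print(f"everything resident: {naive / 2**30:.1f} GiB")   # ~16.7 GiB
print(f"visible tiles only:  {virtual / 2**20:.1f} MiB") # ~312.5 MiB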

Everything else that has been posited as some kind of "special sauce" for Durango is either replicated by definition on the GCN design of Orbis' GPU, or through its own specialized audio, decompression and output hardware.
So the Durango's standalone GPU wouldn't have any of those features built into it stock? Or are we to believe they removed those features and pushed them outside the standalone GPU to spread out the processing or what? Maybe for heat concerns? And if so, that still would leave the GPU with less burden to shoulder, no?

Btw, using loaded terms to mock ppl who might disagree with your conclusions isn't helpful in the slightest. Ppl with an agenda seem eager to use terms like 'special sauce' as an excuse to mock those who felt there was more to consider in comparing things than the standalone GPU specs on paper. I'd argue that over the past month those people would have seen their contention validated significantly. It seems that at the very least there is still some uncertainty about how these machines will perform graphically relative to one another amongst those who aren't ready to stick a flag in the ground just yet.

And I never said MS reacted to Sony. You imagined that part, which says more about your biases than mine.
Whoa there tiger, you most certainly implied that by asserting that MS was only including the extra kit to lessen the purported performance gulf between Durango and Orbis. Like I said, you can come off as defensive. :???:

Yes, I must seem very defensive given I'm constantly trying to correct exactly the kind of misconceptions and magical thinking that is running rampant right now from people like you. Here's a question: if you think Durango is more efficient than Orbis in design, how about you explain how that is possible when it has fewer resources to work with and more limitations at basically every turn?
Read this aloud to yourself. Take a minute to do that and note the tone of your post here. It's one thing to feign frustration. It's another to present your views in a wholly defensive light, which is how this post reads.

I am certain you know more about this stuff than I do, but I already gave you a scenario reliant on leveraging virtualized assets in this here post. So by the time you read this paragraph surely you can see that and respond to that conjecture of mine. It was discussed at some length here in the past and the consensus seemed much less pessimistic and sure of itself compared to your posts. Would a setup leveraging virtualized assets not be more efficient in terms of bandwidth and overall processing? Pretty confident that was the motivating force behind virtual textures in the first place, no?

I don't even think that is accurate. It's too narrow a view. I guess technically if Durango is constantly saturating its ROPs and Orbis never can, we might say that constitutes higher "efficiency". But we should be looking at the entire design. If Orbis can never be fill limited because it always has more ROPs than it needs, but Durango CAN become fill limited, that implies greater efficiency overall for Orbis.
That's not any definition of 'efficiency' I've ever heard of. That sounds more like Orbis would be more powerful in that scenario while Durango would be more efficient.

MS seemingly took great pains to avoid using standard AMD nomenclature for basically every single part of the GPU in these leaks, why would we think it strange when they also obfuscate the CPU's name? The fact that they don't specify that they are Jaguar cores doesn't mean they must be something else.
Are you saying bgassassin is lying? Wrong? Trolling us? Has bad intel? What are YOU assuming in the process? Honestly bkillian's reply to my question there seems to strongly imply it's not Jaguar at all.

And why are you suggesting that the stuff like the display planes and DME's are the same thing as their analogs on previous platforms? Pretty sure I read ppl here noting differences in their capabilities. Or did you mean eSRAM, which isn't common in PC's as it is? :?:

It's interesting that your first instinct here seems to involve painting MS as going out of their way to mislead ppl in specs they never meant to see leaked in the first place.
 
Your arguments here rely extensively on your view that Durango was designed first and foremost to be a media box, which requires lots of RAM, which would need to be DDR3 to stay affordable, which then requires extra hardware to boost effective bandwidth just to perform up to snuff with the baseline GPU it is supposedly based off of, which is weaker than the one in Orbis. I don't necessarily disagree with most of this logic so much as I'm not sure we can take the initial premise at face value at this point. That is an assumption on your part and since everything else relies on that, it's a pretty hefty one worth vetting.

My argument relies entirely on the reported specs. If they aren't accurate nothing any of us are discussing matters.

Whoa there tiger, you most certainly implied that by asserting that MS was only including the extra kit to lessen the purported performance gulf between Durango and Orbis.

Of course. Without the extra bits the gulf would be far greater than the difference "on paper" for the reasons I enumerated.

I am certain you know more about this stuff than I do, but I already gave you a scenario reliant on leveraging virtualized assets in this here post. So by the time you read this paragraph surely you can see that and respond to that conjecture of mine. It was discussed at some length here in the past and the consensus seemed much less pessimistic and sure of itself compared to your posts. Would a setup leveraging virtualized assets not be more efficient in terms of bandwidth and overall processing? Pretty confident that was the motivating force behind virtual textures in the first place, no?

Virtualizing assets may indeed be the best way to get near peak performance out of Durango. I've never disputed that. The problem is you still think that means Durango will perform better in that scenario relative to Orbis, but you're wrong. It will be a good way to get near peak performance out of Orbis, too. So if both are using virtualized assets (which on Orbis does not require as much hardware assistance thanks to its unified memory) and are thus operating near their peak, then Orbis should have a real-world performance advantage approximating its theoretical advantages in shader power, fillrate, bandwidth, etc.

It's interesting that your first instinct here seems to involve painting MS as going out of their way to mislead ppl in specs they never meant to see leaked in the first place.

My first instinct is to point out that lack of evidence does not constitute proof of anything. It's interesting that your first instinct is to accuse everyone else of exhibiting confirmation bias.
 
Your arguments here rely extensively on your view that Durango was designed first and foremost to be a media box, which requires lots of RAM, which would need to be DDR3 to stay affordable, which then requires extra hardware to boost effective bandwidth just to perform up to snuff with the baseline GPU it is supposedly based off of, which is weaker than the one in Orbis. I don't necessarily disagree with most of this logic so much as I'm not sure we can take the initial premise at face value at this point. That is an assumption on your part and since everything else relies on that, it's a pretty hefty one worth vetting.

Yours isn't the only plausible interpretation of how things might fit together as I understand it. Isn't it also possible that MS talked to major graphics gurus like Sweeney and Carmack et al who told them they wanted to see graphics architectures built around virtualized assets through and through? The existing stuff on PC's (like the 7970) sounds like its PRT support is actually somewhat limited. AMD themselves warned against trying to leverage it too much due to RAM size constraints being too low in many PC's (scalability would be the issue here). MS wouldn't have that concern on a console though. So is it not also possible that MS saw that approach as a solution to both challenges?

I mean, they can do the cheap DDR3 and the 8GB is now a boon for performance. The bandwidth issue in that scenario isn't all that troublesome as virtualized assets don't require a lot of bandwidth to get the same visual output. The drawback is managing all the tiles and whatnot, right? Correct me on this stuff because I honestly dunno, just going from what I've read. For all we know the DME's and eSRAM are there to help with all of that (hence the extra functionality of the DME's over normal DMA's). Virtualized assets also don't require as much processing, especially in terms of AF, right? If I'm remembering this correctly, that may well suggest they found ways to produce what they felt was a competitive platform for housing next gen game engines with a highly specialized architecture built around a mid-range GPU.

So it seems possible that they didn't start from the pov that they needed to gimp the graphics architecture and thus had to bolster its performance with extra kit so much as trying to make a setup built around leveraging virtualized assets that simply don't need as much bandwidth/processing to yield the same visual results.

So the Durango's standalone GPU wouldn't have any of those features built into it stock? Or are we to believe they removed those features and pushed them outside the standalone GPU to spread out the processing or what? Maybe for heat concerns? And if so, that still would leave the GPU with less burden to shoulder, no?

Btw, using loaded terms to mock ppl who might disagree with your conclusions isn't helpful in the slightest. Ppl with an agenda seem eager to use terms like 'special sauce' as an excuse to mock those who felt there was more to consider in comparing things than the standalone GPU specs on paper. I'd argue that over the past month those people would have seen their contention validated significantly. It seems that at the very least there is still some uncertainty about how these machines will perform graphically relative to one another amongst those who aren't ready to stick a flag in the ground just yet.

Whoa there tiger, you most certainly implied that by asserting that MS was only including the extra kit to lessen the purported performance gulf between Durango and Orbis. Like I said, you can come off as defensive. :???:

Read this aloud to yourself. Take a minute to do that and note the tone of your post here. It's one thing to feign frustration. It's another to present your views in a wholly defensive light, which is how this post reads.

I am certain you know more about this stuff than I do, but I already gave you a scenario reliant on leveraging virtualized assets in this here post. So by the time you read this paragraph surely you can see that and respond to that conjecture of mine. It was discussed at some length here in the past and the consensus seemed much less pessimistic and sure of itself compared to your posts. Would a setup leveraging virtualized assets not be more efficient in terms of bandwidth and overall processing? Pretty confident that was the motivating force behind virtual textures in the first place, no?

That's not any definition of 'efficiency' I've ever heard of. That sounds more like Orbis would be more powerful in that scenario while Durango would be more efficient.

Are you saying bgassassin is lying? Wrong? Trolling us? Has bad intel? What are YOU assuming in the process? Honestly bkillian's reply to my question there seems to strongly imply it's not Jaguar at all.

And why are you suggesting that the stuff like the display planes and DME's are the same thing as their analogs on previous platforms? Pretty sure I read ppl here noting differences in their capabilities. Or did you mean eSRAM, which isn't common in PC's as it is? :?:

It's interesting that your first instinct here seems to involve painting MS as going out of their way to mislead ppl in specs they never meant to see leaked in the first place.

Be honest with yourself for a second. These companies have a certain silicon budget and thermal envelope they are trying to make the most of. That being the case, Sony and Microsoft seem to have slightly different approaches this upcoming generation. Sony seems to be placing their bets on the core gamers first and foremost, hoping that these early adopters will set the tone for platform of choice. It seems Microsoft is trying a different strategy by trying to appeal to the Apple fans and casuals of the world first and foremost (i.e., mass appeal from the get-go). I have no doubt in my mind that Microsoft want to retain the core audience they've garnered over the years and obviously are trying to stay competitive in terms of power, but it's not their paramount concern this gen. The "entertainment" console rebranding they've adopted recently speaks volumes about MS's new approach. It may turn out very well for them at Sony and Nintendo's expense.

But don't kid yourself with the "special sauce" stuff. Relative efficiency advantages can only close the gap so far against a significant FLOPS deficit when comparing apples to apples in terms of CPU and GPU architectures (which we are here, for the most part). The multimedia and new interface-interactivity aspects of their machine necessitate a large volume of RAM (8 GB) at the expense of RAM performance that would be useful for gaming situations. These are the realities of the silicon and thermal constraints they have to work within, given the unique goals for their console.

Both will be game consoles that provide an array of other types of consumable media. MS will just deemphasize games as the central focus this time around in their marketing and services strategy while Sony will likely emphasize games as the central focus of theirs.
 
My argument relies entirely on the reported specs. If they aren't accurate nothing any of us are discussing matters.

Your argument assumes that those specs are to be interpreted in a very particular context that is based on the premise I described. Do you have specific evidence that supports your conclusion that MS started out engineering a weak system and only added that stuff to plug in the performance holes (as opposed to those parts playing very crucial roles in leveraging virtualized assets at the hardware level)? If not then just admit you're basing your argument on an assumption. Nothing wrong with that, I just want to know the context your premises are couched in beforehand.

Of course. Without the extra bits the gulf would be far greater than the difference "on paper" for the reasons I enumerated.

That argument does nothing to establish your premise, which was concerned with the motivation for making the graphics architecture that way in the first place. All this comment here tells me is that you assume the standalone GPU specs tell the entire story and as such any and all additional hardware accelerators are to be viewed as patchwork instead of playing important roles in a well considered and thoroughly engineered design agenda. I'm not at all convinced at this point that we can make the assumption you are making in your premise.

The problem is you still think that means Durango will perform better in that scenario relative to Orbis, but you're wrong.

This is another assertion...please back it up with evidence/logic. I am eager to learn more about virtualized assets, but I am not sure I recall many arguing that Orbis' GPU would automatically handle that approach on par with a setup built very specifically to leverage it. Why do you assert this as fact? Just because Orbis' GPU is supposedly based on a 7970 which has PRT support in its hardware? Or is there something more substantial? :?:

My first instinct is to point out that lack of evidence does not constitute proof of anything. It's interesting that your first instinct is to accuse everyone else of exhibiting confirmation bias.

This is a rhetorical gimmick. Just because you can mirror the structure of my comment there doesn't mean your argument is actually countering my concern. As has been pointed out to you, your entire perspective is couched in an assumption you've yet to justify. Just because extra kit in Durango improves the system (duh?) doesn't suggest they only included it because they were trying to close some performance gap with a machine they knew nothing about at the time. :rolleyes:
 
Maybe it's a very customized Jaguar CPU and they are not using the same codename? Maybe both are using the same CPU, but vgleaks is clear: for Orbis they use direct names, Steamroller first and Jaguar later; for Durango they have been using x64 from the first leak.

And I'm not talking about IBM or Intel. I mean an AMD CPU, maybe a custom Jaguar with another codename.

Enlighten us, because the specs from vgleaks do not point to different CPUs.

They're basically the same spec reworded, with some things missing from Orbis and some missing from Durango. The Orbis specs are more detailed, but nevertheless.

To make your life simple, here are the vgleaks specs:



Durango
- x64 Architecture
- 8 CPU cores running at 1.6 gigahertz (GHz)
- each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache
- each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache
- each core has one fully independent hardware thread with no shared execution resources
- each hardware thread can issue two instructions per clock

Orbis

  • Orbis contains eight Jaguar cores at 1.6 Ghz, arranged as two “clusters”
  • Each cluster contains 4 cores and a shared 2MB L2 cache
  • 256-bit SIMD operations, 128-bit SIMD ALU
  • SSE up to SSE4, as well as Advanced Vector Extensions (AVX)
  • One hardware thread per core
  • Decodes, executes and retires at up to two instructions/cycle
  • Out of order execution
  • Per-core dedicated L1-I and L1-D cache (32KB each)
  • Two pipes per core yield 12.8 GFlops performance
  • 102.4 GFlops for system
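
As a sanity check, the 12.8 and 102.4 GFLOPS lines fall straight out of the rest of that spec; the only assumption below is that one 128-bit SIMD pipe does 4 single-precision FLOPs per cycle, which is the usual way those figures are derived:

Code:
# Deriving the quoted Jaguar peak-FLOPS figures from the spec above.
CORES = 8
CLOCK_GHZ = 1.6
PIPES_PER_CORE = 2       # "two pipes per core"
FP32_PER_PIPE = 4        # one 128-bit SIMD op = 4 single-precision FLOPs

per_core = CLOCK_GHZ * PIPES_PER_CORE * FP32_PER_PIPE   # 12.8 GFLOPS
system = per_core * CORES                               # 102.4 GFLOPS
print(f"per core: {per_core:.1f} GFLOPS, system: {system:.1f} GFLOPS")

# A hypothetical CPU with true 256-bit pipes would double FP32_PER_PIPE
# to 8, giving ~204.8 GFLOPS, presumably the source of the "200 GFLOPS
# vs 100 GFLOPS" comparison earlier in the thread.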
 