Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
Both need access to Kinect and voice data.
It would be a massive waste to have that replicated across two memory pools.

Probably oversimplifying, but why wouldn't the OS APU handle all the Kinect and voice data processing and just pass it on to the application as "Human Interface Device" input data (i.e. like another controller)? It would be time-sensitive, but it wouldn't be that bandwidth-intensive, would it?
 
The point I'm trying to make is that custom solutions that don't need to rely on standard PCI-E interconnects and conventional APIs could have an easier time dealing with it. It can't be totally ruled out.

I'm not sure if I've run across an architectural description of where the software/hardware scheduling transition is made for IMG multiprocessor GPUs, but my google-fu seems weak here.

Being a tiled architecture from the outset and being implemented on the same die is a big differentiator from the architectural history of AMD's Xfire implementations. I'm not sure where the MP scheduler fits in the hierarchy relative to how AMD structures things with its command processor and ACEs.
The legacy of living behind PCIe and having very independent contexts doesn't disappear with the bus, and the diagrams given don't give much hint of a massive change in that regard.
 
Yes, but don't these systems share the same RAM pool?

Define share. They have access to the same RAM pool, but it's non-uniform in performance. Physically there are separate RAM pools connected to separate memory controllers on separate dies. Logically it's a unified address space and cache is coherent as well.

No reason why a multi-SoC console couldn't work the same way. You'd want to take some care to properly allocate the memory to avoid heavily accessing it from the wrong side but this isn't as hard as it sounds.
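The allocation care described above can be sketched as a toy model: a hypothetical two-die system where each allocation has a "home" pool, and accesses from the other die pay an extra hop across the interconnect. The class, function names, and latency numbers below are all illustrative assumptions, not figures for any real console.

```python
# Toy model of NUMA-aware allocation in a two-die system: place each
# allocation in the pool local to the die expected to access it most.
# All names and numbers are made up for illustration.

LOCAL_LATENCY_NS = 80    # assumed access time to the local pool
REMOTE_LATENCY_NS = 140  # assumed cost including the die-to-die hop

class Allocation:
    def __init__(self, size, home_node):
        self.size = size
        self.home_node = home_node

def access_latency(alloc, accessing_node):
    """Latency depends on whether the access crosses the interconnect."""
    if accessing_node == alloc.home_node:
        return LOCAL_LATENCY_NS
    return REMOTE_LATENCY_NS

# Render targets live on the GPU die, game-sim data on the CPU die.
framebuffer = Allocation(32 * 1024 * 1024, home_node=1)
physics_state = Allocation(4 * 1024 * 1024, home_node=0)

print(access_latency(framebuffer, 1))  # local access
print(access_latency(framebuffer, 0))  # remote access pays the hop
```

The point is only that placement is a policy decision made once at allocation time, which is why "take some care allocating" is cheaper than it sounds.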
 
There's a need for high-bandwidth communication between the APUs. The need for coordination between two separate graphics systems is going to add complexity, and latencies at the level of workflow control are harder to hide.

What amount of bandwidth are we talking about?

PCIe 3.0 delivers an 8 GT/s bit rate, and PCIe 4.0 is 16 GT/s. Would that be enough?
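For reference, those transfer rates convert to effective bandwidth like this (PCIe 3.0 uses 128b/130b encoding; these are spec numbers for a x16 link, not measurements of any console design):

```python
# Effective one-direction bandwidth of a PCIe link, from spec numbers.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding; PCIe 4.0
# doubles the transfer rate at the same encoding.

def pcie_bandwidth_gbs(gt_per_s, lanes, encoding=128 / 130):
    """Return effective bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return gt_per_s * lanes * encoding / 8  # 8 bits per byte

print(round(pcie_bandwidth_gbs(8, 16), 2))   # PCIe 3.0 x16 -> ~15.75 GB/s
print(round(pcie_bandwidth_gbs(16, 16), 2))  # PCIe 4.0 x16 -> ~31.51 GB/s
```

So a full x16 Gen3 link is on the order of 16 GB/s each way, which is the figure to weigh against inter-GPU traffic.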




The same reason why one game might scale 70% with an SLI rig while the next has zero scaling, or why there are multiple game patches or driver releases for stability problems.
Cheaper silicon doesn't save money if it adds uncertainty as to whether the system will or will not lock up.
Multi-GPU functionality does not have a history of robust implementation, and the GPUs themselves are not currently structured to take it seriously. The rumors concerning Durango haven't made the case that they've significantly changed that.
Crossfire has been steadily improving in the last few years. It should also be easier for developers if they target Crossfire from the get-go instead of designing for a single GPU, as I would assume is the case with most PC games.


In the absence of interposer integration, and even then, it is not more power-efficient to pass scads of data over external wires. Some of that data is going to be command processor or front-end communication between GPUs, which is data traffic that does not exist in a unified design. Or there is minimal coordination, and the platform is inconsistent or unreliable.


Perhaps this is what the data movers are for.

Only if the dual GPUs are abstracted away. If the state and command systems of the graphics chips are exposed to software, it will be a source of trouble for existing code if you suddenly take half of it away.


Weren't we hearing a lot about Sony letting the devs go closer to the metal and MS having them use DX?

Coding for DX would be like coding for the PC, would it not? The game wouldn't actually care what hardware it was running on, so changing out hardware wouldn't be a big problem.



Also, how much would 8 GB of DDR3 cost? Would 16 GB cost about the same as 8 GB of GDDR5? Perhaps that is the real reason they went with DDR3.


(I'm just speculating; it would be a bit insane if MS went with such a console.)
 
http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit

HT was integrated into their APUs in 2011. While googling around, I found that AMD's next interconnect is called Freedom Fabric, which they plan to make an industry standard across x86, graphics, and ARM cores.

This guy?

[Attached image: amd-freedom-fabric.jpg]


That seems different from HT.
 
Someone tell me, why are we taking this rumour seriously?
Misterxmedia etc., who have always spouted this multi-APU bs, never get a hearing here, but when some random with zero credentials makes the exact same multi-APU nonsense his first post, we start treating it like a new insider leak?

The multi-APU rumour is not corroborated anywhere outside the alexcteam, misterxmedia, Jeff Rigby camp. In fact, we might as well believe the final GPU is as powerful as a 7970/680, since both Proelite and aegies (who are actual insiders) have said as much.

And if you haven't noticed the OP of that rumour has not posted anything since, typical hit and run - he's probably rather amused he got B3D to spend 8 pages discussing a rumour they've previously written off as utter bs. Oh and the name, CliffordB? Really? If that's not a giveaway I don't know what is.
 
I don't know that anyone is taking it that seriously. There really isn't anything else to discuss, so why not?

As for his lack of posting: hard to post when you're banned. He did have the March 6 date before it was leaked elsewhere, AFAIK. Though I'm not sure we'll hear much from an investor meeting.
 
Someone said that date was leaked before he posted.

I doubt we will hear anything from it, aside from several 'insiders' registering to confirm this guy's rumour.
 
He's not banned; MikeR is banned, not BClifford. (The BClifford thing is an obvious nod to the MikeR = Mark Rein conflation, hence a big giveaway.)

Can someone confirm whether he was indeed the first to post the March 6 date? Otherwise it's just a case of throwing some glitter in among the chicken feed.
 
What I failed to add were the points made by others about how this increases cost substantially without really offering any guaranteed benefits. Think about it as an upsell proposition: it's hard to ask $100 or more for "up to" 70% faster, particularly of the majority of folks buying it as an item of consumer electronics rather than as a computer. If I pay $100 more for model A over model B, it had damn well better give me a guaranteed boost, or else I'm going to feel cheated, particularly if game X is "only" 20% faster and that game is Madden or CoD. Too much cost for too little gain.

Of course until there are any official specs released I have my crow in the deep freeze, just in case :D

Too much cost for too little gain? I don't know. There are a lot of people who don't feel that way about their SLI or Crossfire setups, which is all this would essentially be.

What boost is or isn't provided will be entirely up to the individual goals for a developer's game. They need not get some boost in graphics performance measured in a certain percentage; that's entirely the wrong way to look at it.

It's simply a fixed amount of extra rendering muscle to complement the main GPU in doing whatever the developer decides they want it to do. How beneficial those extra resources will be is entirely dependent on how the developer designs their game. For example, this could come down to nothing more than a really nice-looking shadow effect at a certain quality level.

Maybe the fully optimized main GPU can't accomplish this on its own in concert with everything else the game does visually, but with the extra graphics power provided by the secondary GPU, you go from an unstable frame rate with this feature in busier scenes to a more solid 28-30 fps during those same scenes, and perhaps you can get away with making them just a little bit busier. That's all. We aren't talking about some massive boost in overall performance, or double the frame rate.
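The frame-rate scenario sketched above is just frame-budget arithmetic. All the millisecond figures below are invented for illustration; the point is that offloading even one pass can bring a busy frame back under the 30 fps budget.

```python
# Back-of-envelope frame-budget arithmetic: a busy frame that overruns
# the 30 fps budget on one GPU fits again once a single pass (e.g. a
# shadow pass) is absorbed by a secondary GPU. Numbers are illustrative.

TARGET_FPS = 30
budget_ms = 1000 / TARGET_FPS       # ~33.33 ms per frame at 30 fps

main_gpu_ms = 36.0                  # assumed busy-scene cost on one GPU
offloaded_pass_ms = 6.0             # assumed cost of the offloaded pass

remaining_ms = main_gpu_ms - offloaded_pass_ms
print(remaining_ms <= budget_ms)    # frame now makes the 30 fps target
```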

We're simply looking at a nice and reasonable way to give console developers some much-needed hardware help in optimizing the performance of their games. Developers often say they would be far better served spending less time simply trying to get their games to run or perform as they need, which would give them more time to work on other important aspects, such as gameplay. If, thanks to a console-style SLI setup of sorts, a developer doesn't have to spend nearly as much time optimizing on the single GPU to meet their performance goals, isn't that a win for developers and games?

You could say this might make them lazy, but I don't think so. There will also be instances where they push even further until they see no worthwhile gain from the extra rendering performance provided by the secondary gpu.

It's almost like we're talking about SLI- or Crossfire-style setups as if they were experimental tech. These things are well documented. They wouldn't be unpredictable for game developers in a closed environment.

Still, until we get more solid info on this, it probably makes no sense to really dwell on it any further.
 
On a closed box with a proper API there would be no reason to compare it to a Crossfire setup, which is essentially AFR-only, I believe.
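For anyone unfamiliar with the term: alternate-frame rendering (AFR) just round-robins whole frames across GPUs, which is why scaling depends so heavily on inter-frame dependencies. A minimal sketch of the assignment rule:

```python
# Minimal sketch of alternate-frame rendering (AFR): whole frames are
# distributed round-robin across GPUs. Purely illustrative; a real
# driver also has to manage cross-frame resource dependencies.

def afr_assign(frame_index, gpu_count=2):
    """AFR gives frame N to GPU (N mod gpu_count)."""
    return frame_index % gpu_count

schedule = [afr_assign(f) for f in range(6)]
print(schedule)  # alternating GPU assignment per frame
```

A closed box with its own API could instead split work *within* a frame (e.g. one pass per GPU), which is exactly why the Crossfire comparison is too narrow.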
 
Someone tell me, why are we taking this rumour seriously?

Because one of Microsoft's designs was a multi-SoC design?

They could have rolled over stuff from the Yukon design.

I thought people laughed at misterxmedia because the dude was saying 5 TFLOP console.
 