News & Rumors: Xbox One (codename Durango)

Sorry, but the concept is stupid. Perhaps there were people pushing for it, but the reality is that MS was never going to abandon that large a percentage of their user base when it came down to it.

I don't buy that some forum warriors in an uproar changed anyone's mind.
As that's impossible to prove until maybe Dean Takahashi writes a new book, it's perhaps best to just let this one lie. Whether or not MS were going to release a "needs internet connection" console, it would appear that we are now going to get a standard console that can play entertainment content offline. I guess MS sent the memo, so after the 21st, the people free to talk about the ideas made public are all reading from the same sheet.

The finger pointing of who was right and wrong in guessing this move can wait until there's certain proof (for those who think that's a valuable discussion and like to be the guy who's right on the internet).
 
if the very same gpu from AMD
How do you know it's the very same GPU? It LOOKS like the very same GPU, you say? Why does it look like the very same GPU, because of the 12 CUs? That's laughable, because it's not! It's got an on-die framebuffer, move engines, and display planes which don't exist in any current AMD desktop GPU. Who knows what else differs under the hood! To say that because desktop GPU X clocks to Y just fine, this one has to as well is patently ridiculous. There are way too many unknowns to make such an unfounded, speculative assertion.
 
How do you know it's the very same GPU? It LOOKS like the very same GPU, you say? Why does it look like the very same GPU, because of the 12 CUs? That's laughable, because it's not! It's got an on-die framebuffer, move engines, and display planes which don't exist in any current AMD desktop GPU. Who knows what else differs under the hood! To say that because desktop GPU X clocks to Y just fine, this one has to as well is patently ridiculous. There are way too many unknowns to make such an unfounded, speculative assertion.

If some of you laugh, like to be aggressive, and talk about what's ridiculous but never stop to talk and think, we'll never reach a common point.
When you talk about an on-die framebuffer, are you talking about the ESRAM? Because it should NOT be a framebuffer and should be a separate block (as the eDRAM on 360 was, and still is, a separate block). And yes, the 12 CU block should be the exact 12 CU block from the 7790: exact, same type, same technology, all the same.
And if you don't like speculative discussions, you're in the wrong thread. You are not even sure of the presence of move engines and display planes; nothing is proved or certain. So if you don't believe the CU block is a GCN1 12 CU block but you believe all the other rumored and speculative things, that's a bit strange to me. Even more, a 1 GHz 12 CU block should run cooler than a 0.8 GHz fat 18 CU block
 
As that's impossible to prove until maybe Dean Takahashi writes a new book, it's perhaps best to just let this one lie. Whether or not MS were going to release a "needs internet connection" console, it would appear that we are now going to get a standard console that can play entertainment content offline. I guess MS sent the memo, so after the 21st, the people free to talk about the ideas made public are all reading from the same sheet.

The finger pointing of who was right and wrong in guessing this move can wait until there's certain proof (for those who think that's a valuable discussion and like to be the guy who's right on the internet).

I agree, if it was never Microsoft's intention then there really isn't going to be a way to prove that one way or another. MS would defend or deny the truth, and as an issue it really won't matter to gamers in the aftermath. They don't have to worry about it now, so it will quickly fade from the conversation.

But I always viewed this rumor as being particularly extreme. It would have been a shot in the foot of MS's own bottom line. And no matter how many times it was repeated, it just never made sense. It would be like an election candidate deciding: "Eh, I don't need the 21 million people who voted for me in the last election. We got the 49 million voters that we really want" :LOL:

I think this is only the beginning, some of these other rumors may start crashing along the rocks in the same manner soon. And we still have two weeks left to go.
 
But I always viewed this rumor as being particularly extreme. It would have been a shot in the foot of MS's own bottom line. And no matter how many times it was repeated, it just never made sense. It would be like an election candidate deciding: "Eh, I don't need the 21 million people who voted for me in the last election. We got the 49 million voters that we really want" :LOL:
That's not quite the comparison, though. It's more like the election candidate seeing he might lose 10% of previous voters but become more attractive (online services, developers/publishers) to twice as many potential voters. There's no way to quantify the negative impact always-on gaming would have, so we're all just making numbers up in support of our own POV, but I seriously doubt 100% of existing XB fans would reject the platform.
 
Part of me is a bit dismayed at this decision actually. I kind of wanted to see what kind of cool things might happen if an internet connection could always be counted on.

Mostly though, I probably agree with not requiring always online.
 
Yes, but what I don't know for sure is whether, in this case, increasing memory amount but not memory bandwidth will help.
Would adding 4GB of slow DDR3 unbalance the architecture or not?
Would the CPU's 69GB/s of bandwidth still be good enough to access 9GB instead of the current 5GB?

I can't answer these questions with 100% certainty.

It's 69GB/s plus the ESRAM, which is likely not an insignificant help (the 360 this gen had one 128-bit bus, yet typically had fewer BW problems than the PS3 with two).

Also, supposedly devs complained a lot about the Durango OS reserves (take this as wink-wink confirmed). Now, those reserves are said to be 3GB of RAM and two cores. Given the Shape audio block, and the fact that at least one core would be expected to be reserved, I doubt they were griping that much over one small CPU core. Nope, they were griping about the RAM holdback. Which means they wanted more than 5GB of RAM, at least. Which means they obviously think they can use it.
 
Part of me is a bit dismayed at this decision actually. I kind of wanted to see what kind of cool things might happen if an internet connection could always be counted on.

Mostly though, I probably agree with not requiring always online.
Devs can still target always-on in their games, as long as it's advertised as such ("this game requires an internet connection to play" icon on the box). And features can be enabled on the assumption that the console is connected, with those features dropped when it isn't. Assuming cross-platform games, devs would have to factor that in anyway; only Durango exclusives could really go 100% online, cloud universe, no offline play, and that's probably not going to be many games at all. So I doubt it's much of a loss or a gain either way, truth be told.
 
Even more, a 1 GHz 12 CU block should run cooler than a 0.8 GHz fat 18 CU block

Power per unit area matters much more than total heat dissipated when you're talking about temperature.

With everything else kept constant, 18 CUs @ 0.8 GHz is much more likely the better option than 12 CUs @ 1 GHz where temperature is concerned. Unless, of course, you're trying to fry something, in which case 12 CUs @ 1 GHz would be the better option.
 
It's 69GB/s plus the ESRAM, which is likely not an insignificant help (the 360 this gen had one 128-bit bus, yet typically had fewer BW problems than the PS3 with two).

Also, supposedly devs complained a lot about the Durango OS reserves (take this as wink-wink confirmed). Now, those reserves are said to be 3GB of RAM and two cores. Given the Shape audio block, and the fact that at least one core would be expected to be reserved, I doubt they were griping that much over one small CPU core. Nope, they were griping about the RAM holdback. Which means they wanted more than 5GB of RAM, at least. Which means they obviously think they can use it.

The ESRAM bandwidth is not shared with the CPU.
DRAM bandwidth is limited to 69GB/s and is shared between the CPU, GPU, display scan-out, the move engines, and the audio system.

ESRAM bandwidth (102GB/s) is shared by the GPU and the move engines.

MS can increase RAM to 12GB, which would give 9GB to games, but the CPU would still read/write it at 69GB/s.
The CPU has low bandwidth and would then have to access an even larger memory; I'm not sure that wouldn't reduce the percentage of memory it can actually make use of.
 
Yes, but what I don't know for sure is whether, in this case, increasing memory amount but not memory bandwidth will help.
Would adding 4GB of slow DDR3 unbalance the architecture or not?
Would the CPU's 69GB/s of bandwidth still be good enough to access 9GB instead of the current 5GB?

I can't answer these questions with 100% certainty.

The CPU bandwidth to main memory should be about the same as in the PS4, but one system would have more available memory with a smaller OS footprint.

More bandwidth isn't necessary for more RAM to improve things, up to a point. The major bottleneck is how fast you can copy stuff into memory from the HDD/BD disc.

Also, if there's enough of your game in memory that streaming the periphery from a HDD makes for a seamless experience, there's no point in having/using more. More would be useful for switching quickly between apps and games without waiting for them to load each time.
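To put rough numbers on that streaming point, here's a back-of-envelope sketch. The drive speeds below are my own illustrative assumptions, not anything from the leaks:

```python
# Back-of-envelope: why source-media transfer speed, not RAM size, is the
# practical ceiling. Drive speeds are assumed figures for illustration only.

hdd_mb_per_s = 100          # assumed sustained 2.5" HDD read speed
bd_mb_per_s = 27            # assumed ~6x Blu-ray drive read speed
extra_ram_gb = 4            # the rumored extra 4 GB of DDR3

extra_ram_mb = extra_ram_gb * 1024
fill_from_hdd_s = extra_ram_mb / hdd_mb_per_s   # time to fill it from HDD
fill_from_bd_s = extra_ram_mb / bd_mb_per_s     # time to fill it from disc

print(f"Filling {extra_ram_gb} GB from HDD: {fill_from_hdd_s:.0f} s")
print(f"Filling {extra_ram_gb} GB from Blu-ray: {fill_from_bd_s:.0f} s")
```

So even under generous assumptions, filling the extra 4 GB takes tens of seconds from a HDD and minutes from disc, which is why extra RAM mostly helps with caching and app switching rather than raw in-game detail.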
 
when you talk about an on-die framebuffer, are you talking about the ESRAM? Because it should NOT be a framebuffer and should be a separate block (as the eDRAM on 360 was, and still is, a separate block)
It's not a separate block; it's going to be integrated into the die itself. And yeah, it'll have the framebuffer (and/or various render targets: Z, G-buffer, shadow buffers, etc.) stored on it, of course it will. It offers the highest bandwidth of all memory in the system, and rasterization is the largest consumer of it. Texturing is also high, but 32MB is too small to fit all the textures for a scene these days. It's what makes sense.

and yes, the 12 CU block should be the exact 12 CU block from the 7790: exact, same type, same technology, all the same.
"Should", yeah, just like the Wii U's GPU should have the exact same shader units as the Radeon 4000 line, since that's what it's based on (except it doesn't). So how can you be so sure? How many console GPUs have you designed in order to know this?

In any case, the shader block isn't the only component of the GPU, of course, and just because the shaders may be capable of clocking at 1+ GHz doesn't mean the rest of the logic on the GPU does. AMD GPUs use one single core clock; likely the eDRAM (SRAM, whatever) runs at core clock too, as well as any other custom logic designed specifically for Durango. Is all of that going to be capable of reliably clocking 20% faster than the suggested speed? Unknown, yet you seem very certain for some reason, which is strange since there are literally no facts to back that up.

The higher you clock things, the lower functional yields are going to be, and trying to "solve" it hobbyist-overclocker fashion by whacking up the volts isn't going to be a reliable solution for a company like Microsoft that will sell millions of units. They've already gone through one red-ring fiasco that cost them multiple billions of dollars; I'm confident they never want a repeat of that. (Also, bunches of original Xboxes caught fire and so on, due to poor-quality power supplies and whatnot.)

And if you don't like speculative discussions, you're in the wrong thread.
However, you're not speculating; you're throwing out rigid assertions without any basis in fact, which is something different from speculation.

Even more, a 1 GHz 12 CU block should run cooler than a 0.8 GHz fat 18 CU block
Depends on the voltages you need to pump through the chip. If the 18 CU chip runs at 1V @ 800MHz and the 12 CU chip at 1.175V @ 1GHz, it's not so clear-cut anymore. Anyway, hot spots on a chip are generally more of an issue than the chip's total heat load, and as you ramp the volts, heat quickly ramps up.
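For what it's worth, a quick back-of-envelope using the standard dynamic-power relation P ∝ N_CU · V² · f, with the hypothetical voltages above. All figures are illustrative, not real silicon data:

```python
# Rough dynamic-power comparison using P proportional to N_CU * V^2 * f.
# Voltages are the hypothetical figures from the post, not real specs.

def relative_power(n_cu, volts, ghz):
    """Relative dynamic power in arbitrary units (capacitance per CU assumed equal)."""
    return n_cu * volts ** 2 * ghz

p_18cu = relative_power(18, 1.0, 0.8)    # wide-and-slow config
p_12cu = relative_power(12, 1.175, 1.0)  # narrow-and-fast config

print(f"18 CU @ 0.8 GHz, 1.000 V: {p_18cu:.2f}")
print(f"12 CU @ 1.0 GHz, 1.175 V: {p_12cu:.2f}")
# The 12 CU part burns ~15% MORE power here, concentrated in roughly 2/3 the
# shader area, so its power density (and hot-spot temperature) is worse.
print(f"Ratio (12 CU / 18 CU): {p_12cu / p_18cu:.2f}")
```

Under these assumed voltages, the narrow-and-fast chip draws more total power over less area, which is exactly the hot-spot problem described above.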
 
The first final devkits (with final silicon) reached developers in March. I don't think there was any time to re-engineer the APU to increase the core count or the CUs. If they decided to improve the specs, it's one of the following options:

- More DDR3
- A switch to higher-frequency DDR4 (3200?)
- A slight increase in clocks (15-20%)

The first option may be the most impressive on the marketing side, but it's probably the least effective for developers: with the current rumored bandwidth you can access at most 5 GB per frame, which incidentally is also the rumored RAM available to devs.
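For anyone wondering where that ~5 GB-per-frame figure comes from, it falls straight out of the rumored bandwidths at 30 fps (this sketch assumes each byte is touched only once per frame, which is optimistic):

```python
# Back-of-envelope: data touchable per frame = aggregate bandwidth / frame rate.
# Bandwidth figures are the rumored ones discussed in this thread.

ddr3_gb_s = 69    # rumored DDR3 main-memory bandwidth
esram_gb_s = 102  # rumored ESRAM bandwidth
fps = 30

aggregate_gb_s = ddr3_gb_s + esram_gb_s   # combined peak bandwidth
gb_per_frame = aggregate_gb_s / fps       # GB the GPU can touch in one 33 ms frame

print(f"Aggregate bandwidth: {aggregate_gb_s} GB/s")
print(f"Touchable per frame at {fps} fps: {gb_per_frame:.1f} GB")
```

So at peak you can only sweep through about 5-6 GB per 30 fps frame anyway; RAM beyond that mostly serves as a cache rather than per-frame working set.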

The second would be the best improvement for real-world performance, raising the BW to system RAM to 100 GB/s and achieving an aggregate BW of 200 GB/s for the GPU.

The third would likely require re-engineering/a respin, and it's the most expensive solution: a change in TDP means a redesign of the packaging, the motherboard components, the cooling solution, and the power section. If not combined with an increase in system BW, it would stress the limited BW to RAM even more.

Yes, MS could combine more than one option and provide us with a considerably more powerful console, at a considerably higher cost to them. But if they do, I expect lots of heads to roll: how can you plan a performance target so wrong in the beginning? (So wrong that you feel the need to change it afterwards.) But it wouldn't be the first time MS has made huge mistakes in the direction its management takes (see Windows 8, Zune, Kin).
 
Part of me is a bit dismayed at this decision actually. I kind of wanted to see what kind of cool things might happen if an internet connection could always be counted on.

Mostly though, I probably agree with not requiring always online.

My thoughts summed up nicely.

But what I find interesting is that if bkilian actually left Microsoft because of its original always-online plans, then the decision to change at the last minute might be pretty upsetting, or at least make you regret not sticking it out. Or he might feel a bit vindicated. I bet there would be all kinds of emotions. I know I would.

I agree with Shifty, I'd like to see a book from Dean Takahashi. I bet all the politics & dealings behind the scenes would be interesting.

Tommy McClain
 
The CPU bandwidth to main memory should be about the same as in the PS4, but one system would have more available memory with a smaller OS footprint.

PS4 CPU shared bandwidth is 176GB/s for 7-7.5GB of memory available to games.
Durango's CPU shared bandwidth is at 69GB/s for 5GB available to games, 9GB if they add 4GB.
 
PS4 CPU shared bandwidth is 176GB/s for 7-7.5GB of memory available to games.
Durango's CPU shared bandwidth is at 69GB/s for 5GB available to games, 9GB if they add 4GB.

It's not the same.

It should be ~20GB/s for both, the rest left for the GPU.
 
9GB if they add 4GB.
Why on earth would they do that? I swear, this must be the silliest wishful-thinking rumor I've heard this generation. First, they'd need to double the number of chips on their PCB, which would complicate signalling and potentially mess with RAM timings (two loads on the bit lines instead of just one). They'd need to order separate-capacity chips, making their supply chain more complex, if not for the remainder of the console's life then at least until whatever strange day they decide to simply up the memory capacity to 16GB, if/when such RAM ICs become available (probably when they revise the hardware to use DDR4 instead). Price goes up in a major way as well. Not to mention what might happen with RAM banking/interleave with an asymmetrical memory pool installed. That is, if the memory controller on the chip even supports an asymmetrical loadout in the first place...

This dumb rumor is a perfect example of Occam's razor. Devs want more lebensraum to grow into, so what should MS do: jump through 15 hoops to stick 12GB into every Durango and make their machine more expensive with little benefit to show the buyer, or simply cut down on the amount of reserved RAM?

...I know which bet I'd put my money on, for sure.

*Edit: third option: MS tells devs to shut up and take it...
 