Xbox : What should MS do next? *spawn

For an Xbox Two, I would keep the 32MB of ESRAM and the same 8-core CPU setup, and focus solely on higher clocks.

The next node shrink (14/16nm FinFET) should provide significant headroom for a potential upclock of both CPU and GPU. If they could hit ~1.25 GHz on the GPU they'd be in parity territory with PS4.

I think the trick would be to minimise the work needed by devs to target the different specs. If you could abstract away the differences so that it's basically: get a game to run at 720/900p on Xbox 1 and it will run at 1080p on Xbox 1.5, then I think it's doable.

You'd probably want more of that scratchpad to accommodate the larger rendertargets. :p
 
ESRAM provides two features for performance - BW and lower latency. BW can be accommodated with whatever RAM XBtoo has, so I don't see that being at all an issue. The only possible concern is if the low latency (if it's that low) gets relied on for some performance. But even if so, larger caches and higher clocks may well mitigate that. So as long as devs aren't hitting the metal, and with DX12 they shouldn't be, I see no problem implementing a BC XBtoo.
 
There has been no indication that ESRAM is a benefit. Why add that to Xbox Two?
Cache in any environment requiring repeat access is a proven benefit. If you removed the ESRAM from Xbox One, leaving it with just the 68 GB/s of DDR3 bandwidth, you'd notice. Xbox One to PlayStation 4 comparisons would be a bloodbath :yep2:

Xbox One is much better off for the ESRAM.
 
Cache in any environment requiring repeat access is a proven benefit. If you removed the ESRAM from Xbox One, leaving it with just the 68 GB/s of DDR3 bandwidth, you'd notice. Xbox One to PlayStation 4 comparisons would be a bloodbath :yep2:

Xbox One is much better off for the ESRAM.

I guess the question is whether some form of ESRAM still has a benefit if you switch to GDDR5 or a faster DDR4 for an Xbox Two.
 
There has been no indication that ESRAM is a benefit. Why add that to Xbox Two?
Compatibility. Making XBToo work with XBOne games would mean directing code that's looking for ESRAM to something suitable. The obvious solution is ESRAM, although I reckon it can be worked around and plain old RAM will suffice. Highly efficient code could end up looking for fast, low latency data that may not be a nice fit for a different memory topology, or such code might never appear.

ESRAM really was a bad move. Had MS gone with more straightforward hardware like PS4's, FC/BC would have been a shoo-in and they could have run XB as a DirectX platform as they originally envisioned. Could have even added super-easy portability to PC: one codebase, PC + consoles covered.
 
I guess the question is whether some form of ESRAM still has a benefit if you switch to GDDR5 or a faster DDR4 for an Xbox Two.
If it's the same bandwidth as now then it's less impactful, but perhaps still useful in those rare use cases where you achieve the critical read+write behaviour needed to burst to ~200 GB/s.

However, if they scale up the bandwidth (as I would expect them to with new hardware) then sure!
 
I guess the question is whether some form of ESRAM still has a benefit if you switch to GDDR5 or a faster DDR4 for an Xbox Two.

If it was free, sure, but it takes additional die real estate and increases fab costs, or reduces real estate for other microarchitecture such as a larger GPU. But sure, if cost, tradeoffs, and yields weren't an issue, ESRAM would be great, since it'd still be added bandwidth, and that is never a negative.

Cerny was contemplating an eDRAM cache + 128-bit-wide GDDR5 interface.

Xbox 360 ran on GDDR3 + eDRAM.

" http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
Digital Foundry: Why go for ESRAM rather than eDRAM? You had a lot of success with this on Xbox 360.

Nick Baker: It's just a matter of who has the technology available to do eDRAM on a single die.

Digital Foundry: So you didn't want to go for a daughter die as you did with Xbox 360?

Nick Baker: No, we wanted a single processor, like I said. If there'd been a different time frame or technology options we could maybe have had a different technology there but for the product in the timeframe, ESRAM was the best choice.
"
 
Well, the great thing about RAM is it's just a memory address. You could in theory switch to a different memory and pick a chunk to allocate as "ESRAM" for the VM that runs games.
 
Cerny was contemplating an eDRAM cache + 128-bit-wide GDDR5 interface.
Yeah, at 1,088 GB/s!

[Image: slide from Mark Cerny's "Road to PS4" talk at Gamelab 2013]


If ESRAM2 in Xbox Two had crazy high bandwidth like that, then you could probably always find novel ways to make the hardware dance that are virtually impossible on conventional hardware.
 
ESRAM provides two features for performance - BW and lower latency. BW can be accommodated with whatever RAM XBtoo has, so I don't see that being at all an issue. The only possible concern is if the low latency (if it's that low) gets relied on for some performance. But even if so, larger caches and higher clocks may well mitigate that. So as long as devs aren't hitting the metal, and with DX12 they shouldn't be, I see no problem implementing a BC XBtoo.

Running Xbox One games on an Xbox Two could be handled by the faster BW of the Xbox Two. I'm not sure it would work for the reverse: Xbox Two games running at lower settings on an Xbox One. I think you have to design for a fixed embedded RAM to handle that case. Otherwise it's not the iPhone model.
 
The only possible concern is if the low latency (if it's that low) gets relied on for some performance. But even if so, larger caches and higher clocks may well mitigate that. So as long as devs aren't hitting the metal, and with DX12 they shouldn't be, I see no problem implementing a BC XBtoo.

Maybe latency was never intended to be a factor? I guess the 'no statement' about it might be evidence of that.

Digital Foundry: There's been some discussion online about low-latency memory access on ESRAM. My understanding of graphics technology is that you forego latency and you go wide, you parallelise over however many compute units are available. Does low latency here materially affect GPU performance?

Nick Baker: You're right. GPUs are less latency sensitive. We've not really made any statements about latency.
 
Running Xbox One games on an Xbox Two could be handled by the faster BW of the Xbox Two. I'm not sure it would work for the reverse: Xbox Two games running at lower settings on an Xbox One.

This is principally why I don't think it'll happen: two consoles to support with different but compatible(ish) hardware. You're constraining the following generation of hardware to the last to maintain some semblance of backwards compatibility.

If you're too radically divorced from the last generation's hardware, you're basically asking devs to support a fourth console build. :nope:
 
I don't believe the latency has any significant benefit, but it doesn't need to be significant to impact BC. You only need a few stalls throwing out timing for a piece of code to keel over. Depending on how low level the devs get, lower latency might be an issue with just portioning a chunk of RAM off as virtual ESRAM.
 
ESRAM really was a bad move. Had Ms gone with a more straightforward HW like PS4, FC/BC would have been a shoe-in and they could run XB as a DirectX platform as they originally envisioned. Could have even added super-easy portability to PC - one codebase, PC+consoles covered.

This is my view also.
 
I don't believe the latency has any significant benefit, but it doesn't need to be significant to impact BC. You only need a few stalls throwing out timing for a piece of code to keel over. Depending on how low level the devs get, lower latency might be an issue with just portioning a chunk of RAM off as virtual ESRAM.

IIRC, a year ago 3dilettante suggested that the ROPs might benefit from the potentially reduced latency ESRAM might provide.

Then IIRC a few months ago a developer was interviewed on GamingBolt saying that ESRAM could improve ROP performance.
 
ESRAM won't seem like a bad move when/if it allows them to transition to DDR4. At that point they'll have a cost advantage and even lower power consumption per MB/s of bandwidth.

MS could have made a more powerful system while continuing to use ESRAM.

If it was a consideration when MS was designing the system and the API, performance fluctuations coming from changing memory won't prevent backwards compatibility of a future product. In this sense, it's the same as a GDDR5-based platform.

Even if ESRAM were required for BC reasons, a 14/16 nm future product could fit 32MB of cache in well under 40 mm^2 and offer significant performance increases for that product too.
 
ESRAM won't seem like a bad move when/if it allows them to transition to DDR4. At that point they'll have a cost advantage and even lower power consumption per MB/s of bandwidth.

Is it possible to get a primer on why DDR4, and the pros/cons with respect to Xbox One?
 
Over the next 3-5 years it will become cheaper than DDR3 and replace it. It will have higher densities, faster speeds, and lower power consumption versus DDR3.

For X1, it will allow them to go to smaller bus with fewer chips and consume less power. All of these should result in a lower cost console.

In theory DDR4 could stay higher cost than DDR3, but I doubt that. I bet adoption will snowball next year when high-end mobile devices adopt it.
 
Ya. The most compelling revision that DDR4 makes possible for armchair console designers like myself is a 4 x 2GB configuration.

Small, simple motherboard, wouldn't use much juice, and would sit well with a small, efficient 16 nm APU. You could probably fit it into something not much bigger than the Wii U, even with optical drive.

Or in MS's case, put it in a bizarrely large and ugly case with a huge external power brick.
 