That would be entirely up to the design targets of the implementation. Bandwidth is more a function of clocks and the total width of the read and write paths. Unless you really care about sub-ns latencies (and at these capacities and distances on the die you don't), there is no conclusion to draw from the type of memory cells employed. To think otherwise is to forget about IBM's POWER7 L3 cache.
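To put rough numbers behind the clocks-times-width point, here is a minimal sketch. The 128-byte path at 800 MHz is my assumption of the commonly quoted XBO eSRAM configuration, and the second line is just a hypothetical narrower bus for contrast; nothing here is from an official spec.

[code]
#include <cstdio>

// Peak bandwidth is just clock rate times the total width of the read/write
// paths; the memory cell type (SRAM vs eDRAM) doesn't enter into it.
constexpr double peak_bandwidth_gbps(double clock_mhz, double path_width_bytes) {
    return clock_mhz * 1e6 * path_width_bytes / 1e9;  // bytes/s -> GB/s
}

int main() {
    // Assumed figure: a 1024-bit (128-byte) combined read/write path at 800 MHz.
    std::printf("Wide on-die array:  %.1f GB/s\n", peak_bandwidth_gbps(800.0, 128.0));
    // Same formula, hypothetical narrow off-die bus for comparison.
    std::printf("Narrow off-die bus: %.1f GB/s\n", peak_bandwidth_gbps(2000.0, 16.0));
    return 0;
}
[/code]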
What measures differ by 10x?
I'm failing to understand the purpose of the constrained state if you cannot interact with the game. What is the point of leaving it running? Why not just suspend it instead? It kind of kills the idea of multitasking as well, if this is the state your game is put into when you run Skype at the same time, for example. I'd have to see an explanation of the usefulness of this.
The terminated state is nice. Basically a VM snapshot, I guess. You just resume, like picking up a movie where you left off. No menus to jump through.
Worse you might be acting as the multiplayer server, and you don't want to DC everyone else just because you answered a call.
1) Running: The game is loaded in memory and is fully running. The game has full access to the reserved system resources, which are six CPU cores, 90 percent of GPU processing power, and 5 GB of memory. The game is rendering full-screen and the user can interact with it.
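From the state descriptions being thrown around, this sounds like a small state machine with a resource budget attached to each state. A purely illustrative sketch of how I picture it; only the Running numbers come from the quoted description above, the Constrained row is a guess, and the Suspended/Terminated behaviour is my reading of the thread, not a confirmed spec.

[code]
#include <cstdint>

// Title lifecycle states as described in this thread.
enum class TitleState { Running, Constrained, Suspended, Terminated };

// Resource budget granted to the title in a given state.
struct Budget {
    int           cpu_cores;     // exclusive CPU cores
    double        gpu_share;     // fraction of GPU time
    std::uint64_t memory_bytes;  // reserved memory
};

constexpr std::uint64_t GiB = 1024ull * 1024 * 1024;

// Only the Running row matches the quoted description; the Constrained row is a
// placeholder, and the Suspended/Terminated rows are guesses about the model.
Budget budget_for(TitleState s) {
    switch (s) {
        case TitleState::Running:     return {6, 0.90, 5 * GiB};
        case TitleState::Constrained: return {4, 0.45, 5 * GiB};  // unknown, made up
        case TitleState::Suspended:   return {0, 0.00, 5 * GiB};  // kept in RAM?
        case TitleState::Terminated:  return {0, 0.00, 0};        // snapshot on disk?
    }
    return {0, 0.0, 0};
}
[/code]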
But an order of magnitude is 10x. Also, wasn't the BW to eDRAM in Xbox 360 32 GB/s? So less than 1/3 compared to the SRAM here.

I am having trouble keeping up with this thread. It is growing fast.
In regard to your question: by 'orders of magnitude' I meant the scale of a certain amount, determined not by powers of ten but by multiples of a percentage, more concretely 10%.
In that case the eDRAM of the X360 is an order of magnitude faster (bandwidth-wise, I am not talking about latency) than the eSRAM on XBO.
More concretely, around 150% faster, if my maths aren't wrong; that's why I mentioned an order of magnitude.
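For anyone following along, this is the arithmetic I assume both sides are using, with the commonly quoted figures: 256 GB/s internal to the 360's eDRAM die, 32 GB/s on the GPU link to that die, 102.4 GB/s for the XBO eSRAM. Treat the numbers as my assumption, not something confirmed in this thread.

[code]
#include <cstdio>

int main() {
    // Commonly quoted peak figures in GB/s; treat these as assumptions.
    const double x360_edram_internal = 256.0;  // ROPs <-> eDRAM on the daughter die
    const double x360_gpu_link       = 32.0;   // GPU <-> daughter die link
    const double xbo_esram           = 102.4;  // XBO eSRAM peak

    // "150% faster": internal eDRAM bandwidth relative to the eSRAM.
    std::printf("eDRAM internal vs eSRAM: +%.0f%%\n",
                (x360_edram_internal - xbo_esram) / xbo_esram * 100.0);

    // "Less than 1/3": the 32 GB/s GPU link relative to the eSRAM.
    std::printf("GPU link vs eSRAM: %.2fx\n", x360_gpu_link / xbo_esram);
    return 0;
}
[/code]

So both statements can be true at once, depending on which of the two 360 figures you compare against.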
These differences, and the emergent realities of eSRAM vs eDRAM, have forced a rearchitecting of the entire console.
Suppose you got a call from someone important, and you decided to expand the Skype call fullscreen.
You don't want to be kicked off a multiplayer server just because you decided to answer a Skype call, do you?
Exactly. There is a huge difference between a 200-400 MHz bump on the CPU or a 200 MHz bump on the GPU portion of the APU and a 3-4x increase in clock speeds.
We don't know the tipping point at which yields go to hell as clock speeds rise.
The 360 also had a GPU reservation, somewhere around the 10% mark, might have been a bit less. Developers had no problem getting close to the metal there.
If the OS does any sort of notification or overlay, it has to reserve some GPU time. The reservation doesn't need to be constant, but there has to be a maximum.
Sure, but doesn't the game get paused when that happens on XBox 360? So it wouldn't take away from its total GPU budget or interfere with its ability to access the hardware directly.
If that happens without pauses on XBox One, it seems like the perfect kind of thing to use those hardware overlays for, and not mess with the GPU at all. Even on XBox 360 it's not like it had to use the GPU to render the menus and stuff (since the GPU doesn't manage the framebuffer anyway); they could probably have done that in software too.
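For anyone wondering what hardware overlays buy you here: the system UI can sit on its own plane and be blended with the game's framebuffer at scanout, so none of the game's GPU time or render targets get touched. A toy sketch of that per-pixel blend with made-up types, not any actual console API:

[code]
#include <cstddef>
#include <cstdint>
#include <vector>

// Toy RGBA8 pixel; real scanout hardware works on streams of these per line.
struct Pixel { std::uint8_t r, g, b, a; };

// "Source over" blend of the overlay plane onto the game plane, which is
// conceptually what a display controller does at scanout time. Note that it
// needs no GPU work and no access to the game's render targets.
void compose_planes(const std::vector<Pixel>& game,
                    const std::vector<Pixel>& overlay,
                    std::vector<Pixel>& out) {
    out.resize(game.size());
    for (std::size_t i = 0; i < game.size(); ++i) {
        const Pixel& g = game[i];
        const Pixel& o = overlay[i];
        const unsigned a = o.a;  // overlay alpha, 0..255
        out[i].r = static_cast<std::uint8_t>((o.r * a + g.r * (255 - a)) / 255);
        out[i].g = static_cast<std::uint8_t>((o.g * a + g.g * (255 - a)) / 255);
        out[i].b = static_cast<std::uint8_t>((o.b * a + g.b * (255 - a)) / 255);
        out[i].a = 255;
    }
}
[/code]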
The 10% GPU reservation is pretty bad news. Not just because of the lost GPU time itself but because that means what developers were already warning about is probably true.. more API abstraction than on PS4. You can't get very close to the GPU when you don't even have exclusive access to it.
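To put the lost GPU time in frame-budget terms, a quick back-of-the-envelope sketch; the 10% figure is the reservation discussed above, the frame rates are just examples, and I'm assuming the worst case where the system takes its full slice every frame.

[code]
#include <cstdio>

int main() {
    const double gpu_reservation = 0.10;   // system reservation discussed above
    const double frame_rates[]   = {30.0, 60.0};

    for (double fps : frame_rates) {
        const double frame_ms = 1000.0 / fps;
        // Worst case: the system uses its maximum slice every single frame.
        const double game_ms  = frame_ms * (1.0 - gpu_reservation);
        std::printf("%2.0f fps: %.2f ms per frame, %.2f ms left for the game "
                    "(%.2f ms ceded)\n",
                    fps, frame_ms, game_ms, frame_ms - game_ms);
    }
    return 0;
}
[/code]

At 60 fps that works out to roughly 1.7 ms per frame ceded to the system in the worst case.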
I maintain that if they're going the VM route they should simply open another VM when acting as the multiplayer server.
And I thought they were already promising dedicated servers in the cloud?
But of course that doesn't mean you can't act as a server anyway, so it's probably a moot point.