I think MS will end its conference with its first set of native DX12-based games, and that they'll be sure to let everyone know which games are built from the ground up on DX12, while everything shown before that point is still leveraging DX11.2+.
I'm not sure what that means, though; I just have a feeling it will happen. Since it's a predictions thread, I'll do this only once and showcase how little I actually know.
"Ridiculous Predication/Hopes and Dreams Hat On"
There's a part of me that believes Nvidia & AMD needed a solution that would increase performance and power efficiency without increasing die size or shrinking the process further than it already has. Maybe they're running into a wall soon, with all sorts of physics-based problems starting to come up.
So the push is to completely saturate the GPU: make it a game about combating heat, providing enough bandwidth, and providing the tools to ensure the data is always in the right place at the right time so the GPU stays constantly busy and never stalls. Maybe DX12 assists in that; DX12 is all about hardware saturation.
According to this paper from another thread (http://research.cs.wisc.edu/multifacet/papers/micro13_hsc.pdf), it takes 700 GB/s of bandwidth to avoid being the bottleneck for 32 CUs. Yet AMD released the 290X, which has 44 CUs and only 320 GB/s of available bandwidth. I can't ignore the obvious fact that simple math shows that's only enough bandwidth to saturate roughly 15 CUs by the paper's figures; in other words, the CUs on the 290X are running at only about a third of full saturation.
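Here's my "simple math", for what it's worth. This is just my own linear scaling of the paper's 700 GB/s for 32 CUs figure; the paper itself may not intend it to scale this way:

```python
# Back-of-the-envelope, assuming the paper's figure scales linearly per CU (my assumption).
PAPER_BW_GBPS = 700.0   # GB/s the paper says is needed to keep 32 CUs fed
PAPER_CUS = 32
bw_per_cu = PAPER_BW_GBPS / PAPER_CUS      # ~21.9 GB/s per CU

r290x_bw = 320.0        # GB/s (512-bit bus at 5 Gbps)
r290x_cus = 44
cus_fed = r290x_bw / bw_per_cu             # ~14.6 CUs' worth of bandwidth

print(f"~{bw_per_cu:.1f} GB/s per CU; 320 GB/s feeds ~{cus_fed:.1f} of the 290X's {r290x_cus} CUs")
print(f"that is roughly {cus_fed / r290x_cus:.0%} saturation")
```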
Coincidentally, the X1 has 12 CUs and, according to MS, total system bandwidth of 192 + 68 GB/s, which by the same math is almost exactly the amount needed to just barely saturate 12 CUs. Factor in the 3 GB of RAM and the Kinect GPU reservation that are held back, along with the constant talk of a 'fully balanced system' that looked at all sorts of bottlenecks, and the numbers really do seem to be "balanced".
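And the same napkin math for the X1's quoted figures (again, purely my linear-scaling assumption, using MS's own peak bandwidth numbers):

```python
# Same linear-scaling assumption, applied to the Xbox One figures MS quoted.
bw_per_cu = 700.0 / 32          # ~21.9 GB/s per CU, from the paper above
x1_cus = 12
x1_bw = 192.0 + 68.0            # ESRAM + DDR3 peak bandwidth, per MS

needed = x1_cus * bw_per_cu     # ~262.5 GB/s
print(f"12 CUs would want ~{needed:.1f} GB/s; the X1 quotes {x1_bw:.0f} GB/s total")
```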
I recall reading developers mention that Mantle-like APIs would push the GPU much harder than before. With so many more draw calls going in than ever, we could see modern-day GPUs overheat because their cooling solutions were not designed for fully saturated GPUs.
BKillian mentioned that ESRAM was always part of the design and had nothing to do with compensating for the lack of GDDR5. I think maybe it's because they wanted to centralize where all the heat was going: with 80% of the heavy lifting done on a single chip, the cooling solution becomes simpler. Just cool that chip hard, and the off-die RAM chips can be cooled passively. The X1 has a massive heatsink and fan that you'd only find in a semi-enthusiast gaming system, and that ultimately plays a role in supporting that future. For some reason I recall this quote from DF:
Our sources concur that the February/March dev kits were indeed very loud, but it wasn't due to over-heating - quite the opposite in fact. The thermal control algorithm - which monitors the heat output of the major chips on the motherboard and adjusts fan speed accordingly - simply wasn't implemented in the developing OS, and so to avoid damaging the hardware, the fans were set to 100 per cent all the time. This was resolved by a software update back in March that brought Xbox One to its current stealth-like state, and our understanding is that final development hardware - which is a complete match for retail silicon - started rolling out to developers early in July and remains extremely quiet.
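Purely for illustration, the kind of thermal control loop DF describes probably looks something like this in spirit. Every name, sensor, and threshold here is made up by me; the real algorithm is MS's and unknown:

```python
import random
import time

# Toy sketch of a "monitor temps, adjust fan" loop like the one DF describes.
# read_soc_temp() and set_fan_duty() are hypothetical stand-ins, not real APIs.

TARGET_C = 70.0   # hypothetical temperature where the fan starts ramping
MAX_C = 90.0      # hypothetical temperature where the fan runs flat out

def read_soc_temp() -> float:
    """Stand-in for reading the APU temperature sensor."""
    return random.uniform(40.0, 95.0)

def set_fan_duty(duty: float) -> None:
    """Stand-in for writing the fan PWM duty cycle (0.0 to 1.0)."""
    print(f"fan duty -> {duty:.0%}")

def fan_duty_for(temp_c: float) -> float:
    """Ramp the fan linearly between near-silent and 100% across the two thresholds."""
    if temp_c <= TARGET_C:
        return 0.2
    if temp_c >= MAX_C:
        return 1.0   # the early dev kits' "100 per cent all the time" fallback
    return 0.2 + 0.8 * (temp_c - TARGET_C) / (MAX_C - TARGET_C)

if __name__ == "__main__":
    for _ in range(5):
        set_fan_duty(fan_duty_for(read_soc_temp()))
        time.sleep(1.0)
```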
Since I've had my X1, I've never once heard the fan, ever. I'm curious to know what 100% sounds like.
TL;DR: the X1's secret sauce is that massive heatsink.