It's hard to believe that memory is such a problem for developers. In that DF article about the Wii U, the developer said memory bandwidth was not an issue on the Wii U, and that console has severely limited bandwidth to main memory: 12.8 GB/s, compared to the Xbox One's 68 GB/s. I would assume the 32 MB of eSRAM works in a similar manner to the Wii U's eDRAM, freeing up the bandwidth taken by the framebuffers. Not to mention the X1's GPU is far more advanced than the Wii U's, so I would think it has an even better texture cache, but perhaps that's not the case.

When you look at the difference in fidelity between COD: Ghosts on X1 and the current-gen consoles, it's hard to fathom that memory bandwidth is the reason the game can't be rendered in 1080p on X1. Anandtech did an analysis of the GameCube GPU back in the day and noted that its 2 MB of on-board memory for the Z-buffer saved tons of memory bandwidth; by the same logic, the 32 MB of eSRAM should free up enough bandwidth to make the 68 GB/s to the main DDR3 memory sufficient.

Perhaps the additional shader performance ramps up memory bandwidth requirements significantly, but so far it seems like the Xbox One is underperforming for a closed-box system. Its spec sheet would lead you to believe it should have little to no trouble running games like Ghosts and Battlefield in 1080p, but obviously this is not the case.
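If you want to put rough numbers on that, here's a back-of-envelope sketch in Python. The figures are purely illustrative assumptions on my part (32-bit color and depth buffers, 3x average overdraw, counting each read-modify-write as two accesses), not anything measured on real hardware:

```python
# Back-of-envelope estimate of render-target bandwidth at 1080p/60,
# to illustrate how much traffic a 32 MB on-chip buffer could absorb.
# All constants below are illustrative assumptions, not measured numbers.

WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_COLOR = 4          # assumed 32-bit RGBA color buffer
BYTES_DEPTH = 4          # assumed 32-bit depth/stencil buffer
OVERDRAW = 3.0           # assumed average writes per pixel per frame
GB = 1e9

pixels = WIDTH * HEIGHT

# Bytes touched per frame: color + depth, scaled by overdraw,
# counting each read-modify-write as two accesses.
per_frame = pixels * (BYTES_COLOR + BYTES_DEPTH) * OVERDRAW * 2

bandwidth = per_frame * FPS / GB
print(f"Render-target traffic: ~{bandwidth:.1f} GB/s")

# Does a single 1080p color + depth target fit in 32 MB of on-chip memory?
footprint_mb = pixels * (BYTES_COLOR + BYTES_DEPTH) / 2**20
print(f"Color+depth footprint: ~{footprint_mb:.1f} MB of 32 MB")
```

On those assumptions you get roughly 6 GB/s of buffer traffic absorbed by the eSRAM, and a simple color+depth target (~16 MB) fits comfortably in 32 MB. The numbers climb fast with MSAA or multiple render targets, which is where the tradeoffs get less obvious, but it gives a sense of the scale of what on-chip memory is offloading.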
In the Wii U's case, bandwidth apparently wasn't a problem because other issues (the CPU) were an even worse bottleneck than the bandwidth.
If you gave the Wii U a much more capable CPU, bandwidth would almost certainly become an issue, but then the better CPU would also give it a much more commanding edge over the PS3/360 (while still leaving it far behind the XB1 or PS4), unlike now, so in practice it wouldn't really matter that much.