Working under the assumption that the Eurogamer article is mostly correct, with the possible exception of exact clocks, amount of memory, and number of enabled cores (all of which could easily change to adapt to yields)....
PS4
The real reason to get excited about a PS4 is what Sony as a company does with the OS and system libraries as a platform, and what this enables 1st party studios to do when they make PS4-only games. If PS4 has a real-time OS with libGCM-style low-level access to the GPU, then PS4 1st party games will be years ahead of the PC, simply because that opens up what is possible on the GPU. Note this won't happen right away at launch, but once developers tool up for the platform, this will be the case. As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but cannot do because Microsoft and IHVs won't provide low-level GPU access in PC APIs. One simple example: draw calls on PC easily have 10x to 100x the overhead of a console with a libGCM-style API....
I could continue here, but I won't; by now you get the picture. Launch titles will likely be DX11 ports, so perhaps not much better than what could be done on PC. However, if Sony provides the real-time OS with a libGCM v2 for GCN, then one or two years out, 1st party devs and Sony's internal teams like the ICE team will have had long enough to build up tech that really leverages the platform.
I'm excited for what this platform will provide for PS4-only 1st party titles and developers who still have the balls to do a non-portable game this next round....
Xbox720
Working here on the assumption that the Eurogamer article is close to correct. On this platform I'd be concerned about memory bandwidth: only DDR3 for system/GPU memory, paired with 32MB of "ESRAM", sounds troubling.... If this GPU is pre-GCN with a serious performance gap to the PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.
My guess is that the real reason for 8GB of memory is that this box is a DVR which actually runs "Windows" (which requires a GB or two or three of "overhead"), but, like Windows RT (Windows on ARM), only exposes a non-desktop UI to the user. There are a bunch of reasons they might ditch the real-time console OS; one is that not providing low-level access to developers might enable a faster refresh of backwards-compatible hardware. In theory the developer just targets the box as if it were a special DX11 "PC", with a few extra changes like hints for which surfaces should go in ESRAM; then on the next refresh of the hardware, all prior games just get better FPS or resolution or AA. Of course if they do that, then it is just another PC, only lower performance, with all the latency baggage and none of the low-level magic that makes 1st party games stand out and sell the platform.
Fast GDDR5 will be the desired option for developers. All the interesting cases for good anti-aliasing require a large amount of bandwidth and RAM; a tiny 32MB chunk of ESRAM will not fit that need, even for forward rendering at 1080p. I think some developers could hit 1080p@60fps with the rumored Orbis specs even with good AA. My personal project is targeting 1080p@60fps with great AA on a 560 Ti, which is a little slower than the rumored Orbis specs. There is no way my engine would hit that target on the rumored 720 specs.

Ultimately on Orbis I'd guess devs target 1080p@30fps (with some motion blur), leverage the lower-latency OS stack, and scan out at 60fps (double-scanning frames) to provide a really great lower-latency experience. Maybe the same title on the 720 would render at 720p@30fps, and maybe Microsoft is dedicating a few CPU hardware threads to the GPU driver stack to remove the latency problem (assuming this is a "Windows" OS under the covers).