And with an upscaler, that can even mean 4K upscaled output.
If MS designed the Xbox One with 720p in mind, why can Forza 5 and other games run at 1080p?
And is CoD an intensive game on the hardware?
Careful, there were people on these very forums who were calling Halo 3 the best-looking console game on any platform at the time. You're all forgetting that the 360's launch title was wallguy.
Launch titles are never an indication of any console's capability.
Why is anyone at fault? Let's say the blame is half and half, split between IW and MS.
Not only launch timeframes, but also a pretty much fixed deadline. But maybe it's harder than I imagine, especially considering the launch timeframe.
It would be interesting to know why various tradeoffs were chosen and what buffers were put where, but I'd guess we'll never know.
Yep, I agree, but as you can see I put most of this down to timeframes and it being the first generation of games, where multi-platform games especially need to be a quick port, just to get them up and running reasonably and out on time.

I thought we'd hear more talk of dynamic framebuffers because of the extra display plane and the per-plane scaling. At least that's what was implied by the leaks. Maybe working with dynamic framebuffers is not that easy.
The engines in use for these early games aren't designed around the limited size of the ESRAM, and the prevalence of deferred renderers probably isn't helping, which likely explains some of the deficit.
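To put rough numbers on why deferred renderers strain a 32 MiB ESRAM, here's a back-of-the-envelope sketch in Python. The 20 bytes/pixel G-buffer layout (four 32-bit targets plus 32-bit depth) is a hypothetical example of a typical deferred setup, not any specific engine's format:

```python
# Hypothetical sketch: does a deferred G-buffer fit in the Xbox One's
# 32 MiB ESRAM, and how many tiling passes would it need if not?

ESRAM_BYTES = 32 * 1024 * 1024

def target_bytes(width, height, bytes_per_pixel):
    """Total size of a render-target set at the given resolution."""
    return width * height * bytes_per_pixel

def tiles_needed(width, height, bytes_per_pixel, budget=ESRAM_BYTES):
    """Minimum number of equal strips so each strip fits the budget."""
    total = target_bytes(width, height, bytes_per_pixel)
    return -(-total // budget)  # ceiling division

# Assumed layout: four 32-bit targets + 32-bit depth = 20 bytes/pixel.
gbuffer_1080p = target_bytes(1920, 1080, 20)  # ~39.6 MiB: doesn't fit in 32 MiB
gbuffer_720p = target_bytes(1280, 720, 20)    # ~17.6 MiB: fits comfortably

print(gbuffer_1080p, tiles_needed(1920, 1080, 20))  # 1080p needs 2 strips
print(gbuffer_720p, tiles_needed(1280, 720, 20))    # 720p fits in 1
```

Under these assumptions a fat G-buffer fits whole at 720p but has to be tiled (or slimmed down, or partially spilled to DDR3) at 1080p, which is consistent with engines not yet designed around the ESRAM struggling at the higher resolution.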
Forgetting any comparison with the PS4 or Steambox or any other console, MS would be really inept not to design a console with 1080p as the target output. 1080p is increasingly becoming the norm for TVs. It's releasing in 2013 and is expected to be an 8+ year device.

I didn't say they designed the Xbox One with 720p in mind... but given all the effort they obviously put into making their display planes capable of individual hardware scaling, they probably at least didn't design the Xbox One around the idea of games usually hitting 1080p (native)...
They clearly made very sure that combining 1080p HUDs with lower-res game renderings is really easy to achieve ...
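As a toy illustration of that idea (a hypothetical model in Python, not the actual display-plane hardware or API): the game plane renders at 720p, gets scaled per-plane to 1080p, and a native-1080p HUD plane is blended over it at output time:

```python
# Hypothetical model of per-plane scaling + composition (not a real XB1 API).

def nearest_neighbor_scale(plane, out_w, out_h):
    """Scale a 2D plane (list of rows) to out_w x out_h,
    roughly as a per-plane hardware scaler might."""
    in_h, in_w = len(plane), len(plane[0])
    return [[plane[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)] for y in range(out_h)]

def composite(game_plane, hud_plane):
    """Blend the HUD plane over the game plane; None = transparent HUD pixel."""
    return [[hud if hud is not None else game
             for game, hud in zip(g_row, h_row)]
            for g_row, h_row in zip(game_plane, hud_plane)]

# The game renders at a low internal resolution...
game_720 = [[(x + y) % 256 for x in range(1280)] for y in range(720)]
# ...while the HUD renders natively at the output resolution
# (a 40-pixel opaque bar at the top; everything else transparent).
hud_1080 = [[255 if y < 40 else None for x in range(1920)] for y in range(1080)]

game_1080 = nearest_neighbor_scale(game_720, 1920, 1080)  # per-plane upscale
frame = composite(game_1080, hud_1080)                    # output-time blend
```

The point of the separate planes is that only the game layer ever gets scaled; the HUD stays pixel-perfect at 1080p regardless of the internal render resolution.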
But deferred renderers were very popular this current gen... shouldn't Microsoft have made sure its machine would be adept at that technique and future-proofed it? Didn't they see where things are going?
Forgetting any comparison with the PS4 or Steambox or any other console, MS would be really inept not to design a console with 1080p as the target output. 1080p is increasingly becoming the norm for TVs. It's releasing in 2013 and is expected to be an 8+ year device.
Again, forgetting what the PS4 does, outputting a game such as COD at essentially the same resolution as the 360 is a major technical failure. Whether it has an effect on sales for the average gamer, we'll see soon; it's just forum snark right now. The big question is: is this a temporary software/firmware/tools problem, or will we be seeing COD in 2017 at 720p?
We constructed virtualisation in such a way that it doesn't have any overhead cost for graphics other than for interrupts. We've contrived to do everything we can to avoid interrupts... We only do two per frame. We had to make significant changes in the hardware and the software to accomplish this. We have hardware overlays where we give two layers to the title and one layer to the system and the title can render completely asynchronously and have them presented completely asynchronously to what's going on system-side.
They had to make compromises somewhere with the inclusion of Kinect in order to keep the retail price palatable.
But surely the biggest change would be for the former PS3 devs. They are looking at a complete change of architecture, more PC-like sure, but a completely different paradigm of design and implementation.
The XB1, on the other hand, is the logical extension of the 360 architecture. Different CPU, but the paradigm driving the use of the ESRAM is essentially the same as the 360's EDRAM: you just have more available storage and different timings for cache use, etc.
Surely getting the XB1 up to speed should be quite easy, unless all you are doing is a straightforward brute-force implementation that saturates the memory bandwidth with screen and buffer traffic. In that case the XB1 is going to suffer massively. But I can't believe the ESRAM is just being ignored.
Considering the similarity of features between BF4 and Ghosts, what can be so very different as to cause such a big downgrade?