News & Rumors: Xbox One (codename Durango)

Status
Not open for further replies.
Yes, Microsoft made it really easy for developers to work smarter not harder.
 
Just to be clear about this: I personally think that whatever the tech behind XBOX One's upscaler is, it does a pretty good job.

The difference between the 720p vs. 900p BF4 footage was WAY less obvious than I would have expected.

While I do prefer the PS4, I really don't think that the difference in (native) resolution in some of the third party launch games is THAT big a deal ...
 
And CoD is an intensive game on the hardware?

Well, having early access and complete support from MS helped quite a bit, and only needing to render what goes on in front of a car helps as well. Think of a visual cone spinning around a large world through 360 degrees (plus up and down) vs. a visual cone moving along a fixed track, almost always pointing in the direction you are moving. On a fixed circuit you can be pretty sure of what is coming next and plan for it, unlike an FPS whirling around in an ever-moving, exploding world.
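To make that concrete, here's a toy 2D version of the idea (the angular test, the track samples, and the scenery coordinates are all invented for illustration, nothing from any real engine): on a fixed circuit the camera heading at every track position is known ahead of time, so the visible set per segment can be precomputed offline, whereas an FPS camera can point anywhere at any moment.

```python
import math

def visible_objects(camera_pos, camera_dir, objects, fov_deg=90.0):
    """Return the objects inside a simple 2D view cone (toy frustum test)."""
    half_fov = math.radians(fov_deg) / 2.0
    visible = []
    for pos in objects:
        dx, dy = pos[0] - camera_pos[0], pos[1] - camera_pos[1]
        angle = math.atan2(dy, dx)
        # Smallest signed angular difference between object and camera heading.
        diff = abs((angle - camera_dir + math.pi) % (2 * math.pi) - math.pi)
        if diff <= half_fov:
            visible.append(pos)
    return visible

# On a fixed track the camera heading at each sample is known in advance,
# so the potentially visible set per segment can be baked offline:
track = [((0.0, 0.0), 0.0), ((10.0, 0.0), 0.1)]   # (position, heading) samples
scenery = [(5.0, 1.0), (-5.0, 0.0), (12.0, 2.0)]  # static world objects
pvs_per_segment = [visible_objects(p, d, scenery) for p, d in track]
```

An FPS can't bake this per-frame set the same way because the heading is player-driven, which is the point being made above.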
 
I would think the more concerning thing right now for MS is the fact that the difference between 720p and 1080p is closer to 100%, much larger than the ALU deficit, and closer to the ROP deficit.
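Back-of-the-envelope with the commonly quoted public specs (the TFLOPS and ROP figures below are the widely reported numbers, not anything official):

```python
# Pixel-count gap between the two resolutions vs. the quoted hardware gaps.
pixels_1080p = 1920 * 1080
pixels_720p = 1280 * 720
res_gap = pixels_1080p / pixels_720p - 1.0   # 1.25 -> a 125% pixel deficit

# Commonly quoted figures: PS4 ~1.84 TFLOPS / 32 ROPs, XB1 ~1.31 TFLOPS / 16 ROPs.
alu_gap = 1.84 / 1.31 - 1.0   # ~0.40 -> roughly a 40% ALU deficit
rop_gap = 32 / 16 - 1.0       # 1.0  -> a 100% ROP deficit

print(f"resolution gap: {res_gap:.0%}, ALU gap: {alu_gap:.0%}, ROP gap: {rop_gap:.0%}")
```

So the observed resolution gap overshoots the ALU gap by a wide margin and sits much nearer the ROP gap, which is what makes the ROP count (or something else entirely, like tools) the suspect.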

The engines in use for these early games aren't designed around the limited size of the ESRAM, and the prevalence of deferred renderers probably isn't helping that, which likely explains some of the deficit, but I would be concerned that the virtualization of the GPU is introducing significant overhead, or the limited ROPs are an issue.

Having never worked on an XB1, I would assume that juggling the ~100 MB of render targets most modern games use wouldn't be a huge problem, and that you should be able to get 80% of the way to optimum relatively easily, but maybe it's harder than I imagine, especially considering the launch timeframe.

It would be interesting to know why various tradeoffs were chosen and what buffers were put where, but I'd guess we'll never know.
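For a rough sense of why the ESRAM size bites a deferred renderer, here's a made-up but typical G-buffer layout tallied against 32 MB (no shipped engine uses exactly this layout; the buffer set is an assumption for illustration):

```python
def rt_bytes(width, height, bytes_per_pixel):
    """Size of one render target in bytes."""
    return width * height * bytes_per_pixel

def frame_budget(w, h):
    """Hypothetical deferred-renderer working set: four 32-bit G-buffer
    planes, a 32-bit depth/stencil buffer, and a 64-bit HDR light buffer."""
    return (4 * rt_bytes(w, h, 4)      # G-buffer: albedo, normals, etc.
            + rt_bytes(w, h, 4)        # depth/stencil
            + rt_bytes(w, h, 8))       # HDR light accumulation (FP16 RGBA)

ESRAM = 32 * 1024 * 1024
for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    mb = frame_budget(w, h) / (1024 * 1024)
    fits = "fits" if frame_budget(w, h) <= ESRAM else "doesn't fit"
    print(f"{w}x{h}: {mb:.1f} MB ({fits} in 32 MB ESRAM)")
```

With this (invented) layout, only the 720p working set squeezes under 32 MB, while 1080p needs well over 50 MB; real engines can of course spill some targets to main RAM or tile, which is exactly the non-trivial juggling being discussed.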
 
I thought we'd hear more talk of dynamic framebuffers because of the extra display plane and the per plane scaling. At least that's what was implied from the leaks. Maybe working with dynamic framebuffers is not that easy.
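The control loop for a dynamic framebuffer can be pretty simple in principle; a minimal sketch (the thresholds, step size, and function name are invented for illustration, not any real API):

```python
def adjust_render_width(frame_ms, width, target_ms=16.6,
                        min_width=1280, max_width=1920, step=64):
    """Nudge the horizontal render resolution toward the frame-time budget.

    The idea behind the display planes: the hardware scaler upscales
    whatever the game rendered to the fixed 1920x1080 output, so only
    the 3D scene changes resolution, not the HUD plane.
    """
    if frame_ms > target_ms and width > min_width:
        return width - step      # over budget: render fewer pixels
    if frame_ms < target_ms * 0.9 and width < max_width:
        return width + step      # comfortably under budget: sharpen up
    return width
```

The hard part in practice is presumably not this loop but keeping ESRAM allocations and every resolution-dependent buffer valid as the size changes mid-game, which may be why nobody shipped it at launch.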
 
You are all forgetting that a 360 launch title gave us Wallguy.
Launch titles are never an indication of any console's capability.

Careful, there were people on these very forums who were calling Halo 3 the best-looking console game on any platform at the time :LOL:
 
Let's say the blame is half and half, on IW and MS.
Why is anyone at fault?
Not much different than launch games in the past especially cross platform ones, which is more prevalent now.
Two new consoles coming out at exactly the same time, third-party devs, limited time frames, hardware and tools in a constant state of flux, hardware moving from alpha to beta to final.
but maybe it's harder than I imagine, especially considering the launch timeframe.

It would be interesting to know why various tradeoffs were chosen and what buffers were put where, but I'd guess we'll never know.
Not only launch timeframes but also pretty much a fixed deadline.
Drive Club could move, but that's not a multi platform release.
I thought we'd hear more talk of dynamic framebuffers because of the extra display plane and the per plane scaling. At least that's what was implied from the leaks. Maybe working with dynamic framebuffers is not that easy.
Yep, I agree, but as you can see I put most of this down to time frames and being first gen of games, where especially multi plat games need to be a quick port to just get them up and running reasonably and out on time.

What would really be nice would be devs talking about it, but as said, doubt we'll be hearing about it anytime soon, never know though......
 
The engines in use for these early games aren't designed around the limited size of the ESRAM, and the prevalence of deferred renderers probably isn't helping that, which likely explains some of the deficit

But deferred renderers were very popular this current gen ... didn't Microsoft make sure its machine would be adept at that technique and future-proof it? Didn't they see where things were going?
 
I didn't say they designed XBOX One with 720p in mind ... but given all the effort they obviously put into making their display planes capable of individual hardware scaling, they probably at least didn't design XBOX One around the idea of games usually hitting 1080p (native) ...

They clearly made very sure that combining 1080p HUDs with lower-res game renderings is really easy to achieve ...
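A toy software version of that two-plane composite, with nearest-neighbour scaling standing in for the hardware scaler (all names and the tiny grids are invented; the real planes are composited in hardware, not in code like this):

```python
def scale_nearest(plane, out_w, out_h):
    """Nearest-neighbour upscale of a 2D grid (stand-in for the hw scaler)."""
    in_h, in_w = len(plane), len(plane[0])
    return [[plane[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)] for y in range(out_h)]

def composite(game_plane, hud_plane, out_w, out_h, transparent=None):
    """Scale the game plane to output size, then overlay the native-res HUD."""
    scaled = scale_nearest(game_plane, out_w, out_h)
    return [[hud_plane[y][x] if hud_plane[y][x] is not transparent
             else scaled[y][x]
             for x in range(out_w)] for y in range(out_h)]

# A 2x2 "720p" game image upscaled under a 4x4 "1080p" HUD with one opaque pixel.
game = [["g1", "g2"], ["g3", "g4"]]
hud = [[None] * 4 for _ in range(4)]
hud[0][0] = "H"
frame = composite(game, hud, 4, 4)
```

The takeaway is the same as the post above: the HUD stays crisp at native output resolution no matter how far the game plane's render resolution drops.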
Forgetting any comparison with the PS4 or Steambox or any other console, MS would really be inept not to design a console with 1080p as the target output. 1080p is increasingly becoming the norm for TVs. It's releasing in 2013 and is expected to be an 8-year+ device.

Again, forgetting what the PS4 does, outputting a game such as COD at essentially the same resolution as the 360 is a major technical failure. Whether it has an effect on sales for the average gamer, we'll see soon. It's just forum snark right now. The big question is: is it a temporary software/firmware/tools problem, or will we be seeing COD in 2017 at 720p?
 
But deferred renderers were very popular this current gen ... didn't Microsoft make sure its machine would be adept at that technique and future-proof it? Didn't they see where things were going?

It's probably more the case that there is a only so much ESRAM you can put in the chip.

Developers will eventually tailor engines to the limitations of the hardware, and I don't mean by that they will do less, but rather they will structure data and order rendering operations to maximize what's there.

If, however, the core issue is the virtualization of the GPU or the ROP count, it's unlikely things will improve.
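One concrete form of that tailoring is splitting large render targets into strips that fit the small fast memory and processing them one at a time; a generic sketch of the arithmetic (nothing XB1-specific, the 8 MB slice is an arbitrary example budget):

```python
def tiles_for_target(width, height, bpp, budget_bytes):
    """Split a render target into horizontal strips that each fit a budget."""
    row_bytes = width * bpp
    rows_per_tile = max(1, budget_bytes // row_bytes)
    tiles = []
    y = 0
    while y < height:
        h = min(rows_per_tile, height - y)
        tiles.append((y, h))   # (first row, strip height)
        y += h
    return tiles

# A 64-bit-per-pixel HDR target at 1080p against, say, an 8 MB slice of ESRAM:
strips = tiles_for_target(1920, 1080, 8, 8 * 1024 * 1024)
```

The cost is that every pass touching the target has to be re-ordered to run per-strip, which is exactly the "structure data and order rendering operations" work described above, and why it doesn't happen in a launch window.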
 
Forgetting any comparison with the PS4 or Steambox or any other console, MS would really be inept not to design a console with 1080p as the target output. 1080p is increasingly becoming the norm for TVs. It's releasing in 2013 and is expected to be an 8-year+ device.

Again, forgetting what the PS4 does, outputting a game such as COD at essentially the same resolution as the 360 is a major technical failure. Whether it has an effect on sales for the average gamer, we'll see soon. It's just forum snark right now. The big question is: is it a temporary software/firmware/tools problem, or will we be seeing COD in 2017 at 720p?

They had to make compromises somewhere with the inclusion of Kinect in order to keep the retail price palatable.
 
In the grand scheme of things, the money will be made off of the 360 and PS3 versions for these games. These are cross-gen ports, and to me this shows on both next consoles.

I do think this shows that the X1 is lacking in brute force (which we knew), and will require more work to get the same results. I just don't think either BF4 or CoD is utilizing the X1 to the fullest degree, and there could be issues on MS's side that hampered all of this. So the other console was better able to handle a straight port with horsepower over a unique sum of parts.

Maybe the X1 won't be harder to use; it will just take more time to optimize the results. If Ryse or Forza looked like BF4 when compared to the 360 version, then I would think MS really screwed up badly somewhere. For now I give both consoles time (let's see the second-wave games), and I will maintain that they both could have used a better GPU (and CPU). ;p

CryEngine and UE4 should yield better results, and I think anything cross-gen is going to be lacking. AC4 will be the same as BF and CoD, along with Rivals, I bet.

 
But surely the biggest change would be for the former PS3 devs. They are looking at a complete change of architecture: more PC-like, sure, but a completely different paradigm of design and implementation.

The XB, on the other hand, is the logical extension of the 360 architecture. Different CPU but the paradigm driving the use of the ESRAM is essentially the same as the 360's EDRAM. You just have more available storage and different timings for cache use etc.

Surely getting the XB1 up to speed should be quite easy unless all you are doing is a straightforward brute force implementation that saturates the memory bandwidth with screen and buffer support. In this case the XB1 is going to suffer massively. But I can't believe that the ESRAM is just being ignored.

Considering the similarity of features between BF4 and Ghosts what can be so very different to cause such a big downgrade?
 
Didn't they say that GPU virtualization doesn't have overhead?

We constructed virtualisation in such a way that it doesn't have any overhead cost for graphics other than for interrupts. We've contrived to do everything we can to avoid interrupts... We only do two per frame. We had to make significant changes in the hardware and the software to accomplish this. We have hardware overlays where we give two layers to the title and one layer to the system and the title can render completely asynchronously and have them presented completely asynchronously to what's going on system-side.
 
They had to make compromises somewhere with the inclusion of Kinect in order to keep the retail price palatable.

Kinect already accounts for the $100 markup. Hardware for hardware, they were on the same footing as Sony. I mean with 360 and PS3 there were resolution differences but it was 720p vs 640p or something fairly unnoticeable. 1080 vs 720 is massive. As I and others have said, we'll revisit in Dec 2014 to see if this really is a colossal technical failure or a firmware/tools setback.
 
But surely the biggest change would be for the former PS3 devs. They are looking at a complete change of architecture: more PC-like, sure, but a completely different paradigm of design and implementation.

The XB, on the other hand, is the logical extension of the 360 architecture. Different CPU but the paradigm driving the use of the ESRAM is essentially the same as the 360's EDRAM. You just have more available storage and different timings for cache use etc.

Surely getting the XB1 up to speed should be quite easy unless all you are doing is a straightforward brute force implementation that saturates the memory bandwidth with screen and buffer support. In this case the XB1 is going to suffer massively. But I can't believe that the ESRAM is just being ignored.

Considering the similarity of features between BF4 and Ghosts what can be so very different to cause such a big downgrade?

Given the similarities with PC CPUs and GPUs, wouldn't they be porting the PC engine to the XBO instead of the 360 one? The XBO is x86 and GCN; why port a PowerPC/Xenos engine to the XBO?
 