Xbox One (Durango) Technical hardware investigation

I'm not sure, but maybe MS will release devices that look like a Sega 32X or Sega Saturn cartridge... something with a very low cost (about the price of a game) to boost processing power.

Yeah, I don't think that's gonna happen. I don't even really think it is needed. It would be much more feasible to boost the GPU and CPU clock speeds if they are really that concerned with extra power. Most system add-ons in the past have been for extra RAM, and both next-gen consoles have plenty of memory.
 
Not technically possible, I think; you'd need a super fast interconnect to the rest of the system, and where would you get that? Plus it would introduce all sorts of issues with programming.

If you were going to do that, I think they'd be better off pursuing the cloud anyway (according to that leaker, Forza Horizon 2 will use the cloud for a dynamic weather system?). If you're going to do processing elsewhere and then offload pre-computed results to a local console over a thin pipe, the cloud would be better for that.

The One is not THAT far behind anyway despite all the hubbub. The difference onscreen is arguably small. I'd go with what's there.

Boosting clocks after release is a whole other kettle of fish for any console and opens up a giant set of other issues, which is why it's never happened (except maybe on the PSP, though I don't know the details of how that worked). The bottom line is that every console must be able to play every game ever released on the system just as well as any other unit, or you open up all kinds of issues and make a drastic change to the console model.
 
AFAIK every PSP was tested at max clock, and the increase was planned from the start as a way to artificially create a second generation of games while initially boosting battery life.
Being out against the DS, it was competing on technology only with itself.
 
AFAIK the problem with the clock on the PSP was related to the backlash over battery life. A lower frequency kept the play time up. If you go back and read the launch reviews you will see a lot of focus on battery life. It was pretty stupid. And we have no idea what clock the PS4 has been tested against; it might be that Sony is playing it safe, and as I said before, they don't really have to care.
 
Totally agree that a further upclock isn't necessary. I think people are making a lot of drama out of the X1's supposed lack of power. One question though: since each X1 game ships with the game OS on the disc, is it possible for future games to run at different GPU clock speeds, making the clock speed dynamic between specific versions of future OSes?
I'm sure this may depend on whether the system can handle the higher thermals, but I'm really just wondering if it is actually possible.
 
Well, the difference on screen is not small compared to the difference there was (one way or the other) between the 360 and the PS3.
If the difference is small, why raise the option of dynamic clocking? It is a bit nonsensical. The issue is not power but relative power wrt the competition and the system price. Going further down that line, one may also question the system's performance wrt the silicon invested.
Actually, looking at Kaveri+DDR3+DirectX against both the PS4 and the XB1, or at how well a Bonaire fares in another DF entry, I'm not convinced by the choices of either MSFT or Sony. Sony gets a free pass in this regard, as at least they are leading in the performance metric.

By the way, it seems there is something wrong with Kabini wrt dynamic clocking, as it did not make it into the first shipping SoC.
Kaveri also shows that dynamic clocking between GCN and Steamroller cores is still not optimal, and it makes sense: unlike at Intel, and however loud AMD is about APUs, the CPU and GPU are not designed hand in hand.
 
Xbox One’s eSRAM Too Small to Output Games At 1080p But Will Catch up to PS4 – Rebellion Games

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on. It’s a bit like the reverse of the PS3. PS3 was harder to program for than the Xbox 360. Now it seems like everything has reversed but it doesn’t mean it’s far less powerful – it’s just a pain in the ass to start with. We are on fine ground now but the first few months were hell.”

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines. The Xbox One is a bit more multimedia, a bit more hub-centric so its a bit more complex. There’s stuff you can and can’t do because it’s a sort of multimedia hub. PS4 doesn’t have that. PS4 is just a games machine.”

The PS4 is thus more of a gaming machine in its core focus. “Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up. But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

http://gamingbolt.com/xbox-ones-esr...080p-but-will-catch-up-to-ps4-rebellion-games
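The "too small to output 1080p" complaint is easy to check with back-of-the-envelope arithmetic. Below is a sketch of how quickly a 1080p deferred G-buffer outgrows 32MB; the render-target layout and per-target formats are my own illustrative assumptions, not anything Rebellion has described.

```python
# Rough size check: a hypothetical 1080p deferred G-buffer vs 32 MB of eSRAM.
# The target list below is an assumed, typical layout for illustration only.

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB of embedded RAM

def buffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

w, h = 1920, 1080
targets = {
    "albedo (RGBA8)":        buffer_bytes(w, h, 4),
    "normals (RGBA8)":       buffer_bytes(w, h, 4),
    "material (RGBA8)":      buffer_bytes(w, h, 4),
    "HDR light (RGBA16F)":   buffer_bytes(w, h, 8),
    "depth/stencil (D32S8)": buffer_bytes(w, h, 5),  # logical size; often padded
}

total = sum(targets.values())
print(f"G-buffer total: {total / 2**20:.1f} MB of {ESRAM_BYTES / 2**20:.0f} MB")
for name, size in targets.items():
    print(f"  {name}: {size / 2**20:.1f} MB")
```

With those assumed formats the working set comes to roughly 49MB, about 1.5x the eSRAM, which is exactly the "just a little bit too small" situation Bolcato describes and why the chunking/tiling tricks come up.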
 
That is something that bothers me about MSFT's design choices. For all the bad things we heard about the WiiU hardware from devs, and the badmouthing found on various forums (which included my viper tongue... lol), it turns out that the memory set-up (at large, the eDRAM plus the 64-bit bus to the slow DDR3) doesn't prove to be much of an issue. It does the job, keeping in mind Nintendo's performance targets for the system.

Now on MSFT's side, the huge investment in eSRAM did not save them the investment in both a 256-bit bus and fast DDR3 memory. Worse, it turns out that 32MB proves to be a bit short, or at least not comfortable to work with.

From a technical pov it is wasteful (this also applies, to a lesser extent, to the PS4). I'm far from convinced that a system embarking 4 Jaguar cores @2GHz and a highly clocked 8 CU / 16 ROP GPU, linked to really fast GDDR5 through a 128-bit bus (so 4GB only), would take the back seat as far as gaming performance is concerned. It might not be as power efficient (though both the PS4 and XB1 suck here; the power consumption while in menus is dreadful...), but it is a far stretch from melting the mobo.
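The bandwidth trade-off behind that hypothetical slimmer system is simple arithmetic: peak bandwidth is bus width in bytes times the effective data rate. The sketch below uses the publicly known 256-bit figures for the shipping consoles plus an assumed 5.5 Gbps GDDR5 speed for the imagined 128-bit variant.

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# The 128-bit configuration is hypothetical, for illustration.

def bandwidth_gb_s(bus_width_bits, effective_rate_mts):
    """Peak bandwidth in GB/s from bus width and effective MT/s."""
    return (bus_width_bits / 8) * effective_rate_mts * 1e6 / 1e9

ps4 = bandwidth_gb_s(256, 5500)       # 256-bit GDDR5 @ 5.5 Gbps -> ~176 GB/s
xb1_ddr3 = bandwidth_gb_s(256, 2133)  # 256-bit DDR3-2133 -> ~68 GB/s (+ eSRAM)
slim = bandwidth_gb_s(128, 5500)      # hypothetical 128-bit GDDR5 -> ~88 GB/s

print(f"256-bit GDDR5: {ps4:.0f} GB/s")
print(f"256-bit DDR3:  {xb1_ddr3:.0f} GB/s")
print(f"128-bit GDDR5: {slim:.0f} GB/s")
```

So a 128-bit GDDR5 setup would still comfortably out-run the XB1's DDR3 pool on raw bandwidth, which is the point of the comparison, at the cost of capping capacity at 4GB with the chip densities of the time.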

It is something that actually bothers me with both systems: we are asked for money for the sake of a lot of marketing bullet points, features that should prove less and less relevant as $100 HDMI sticks (maybe less soon, and there are cheaper things like Chromecast) can turn any TV into a "smart" one, and I don't believe that either MSFT or Sony can touch what the Android or iOS environments offer.

This is not an attack on MSFT, as it also applies (again, to a lesser extent) to Sony, but mostly the next-gen experience could have come from systems sold as replacements for the previous SKUs, by which I mean within the same price bracket (slightly above, but nothing that significant).
 
Anyone got any comment on this? When he says 'much faster SDK', what is that about? New drivers? Or the rumoured reduction in the OS reservation? (Or both...?)

"Despite the challenges, Rebellion Games are working closely with Microsoft and Sony while developing Sniper Elite 3. They are currently targeting 60fps for both the console versions. It is also interesting to note that Bolcato reveals that a new SDK update is on its way that will improve performance on the Xbox One. What that update will exactly do is still currently unknown or may be it’s the driver update that we reported about a month ago. Stay tuned for more details on Sniper Elite 3, including gameplay mechanics and the differences between current and next gen versions."
 
The difference on screen only exists in 3 or 4 games out of over a dozen multiplats. In fact, I would say the situation is exactly the same as it was last gen, only in reverse. The PS3 cost $200 more than the Xbox 360, yet the PS3 had a weaker GPU that used older tech than the 360's, and the PS3 often had lower res than the 360 in multiplats. That did not mean the PS3 had a weak GPU, just like the X1 GPU is not weak; it's just not equal to the PS4's.

That's the problem I have with this whole idea that the Xbox One's price point is a dooming, unfair factor. The value for the extra cash comes from the Kinect, the HDMI-in and the user interface. Both next-gen consoles have advantages, and they have both been designed fairly well despite their mid-range-PC-like performance. There is no way they could have included much more GPU power while keeping the price and power consumption at reasonable levels. This gen is no different from any previous gen in the fact that all systems have different graphical performance. I just don't get all the internet uproar over unequal performance. It's how it has always been, and the big 3 are all still in business.
 
Didn't the MS guys say the framebuffer could overflow into main RAM in the DF interview? Maybe the SDK has optimised that process, or improved compression techniques for the framebuffer.

One thing though about that interview quoted above: the guy from Rebellion says the eSRAM is to blame for the X1 not having many games in 1080p. That's not really true; there are more games running native 1080p on the X1 than games running at 900p and 720p combined. In fact, only 5 games out of over 20 don't run at 1080p.
 
Along with the fact that Forza 5 is running at 1080p AND 60fps. So not only is it possible to fit a 1080p image in the framebuffer, it is also possible to move it in and out quick enough to achieve 60fps.

It does seem from interviews that Turn 10 may have had more time with the hardware than any other studio, having helped shape its design. Hopefully some of the techniques they have learnt can be shared not only with 1st and 2nd party studios, but with 3rd parties as well.
 
Only 5 games out of over 20 not at 1080p? Uh: AC4, BF4, CoD:G, Tomb Raider, Dead Rising 3, Ryse, Killer Instinct, and Powerstar Golf. That is 8 off the top of my head. Not sure about Zoo Tycoon, Zumba Fitness or Just Dance.
 
Turn 10 did solid work on F5, but racing games do represent the best-case scenario for high frame rates, as there are relatively few polys to push at 1080/60. F5 uses sprite spectators and some pretty low-res, low-poly structures off track, and when racing none of these things matter that much. The reason for the focus on multiplats is that they are apples-to-apples comparisons, just like last gen. I just hope that the current ports on XB1 are more like that awful Splinter Cell port on PS3 at launch, rather than canaries in the coal mine.

SDK stuff on a console means faster libraries, better linkers and compilers, and possibly also drivers (although that's more OS territory). Any of these things can make for faster multiplats, much like the MLAA that superseded the awful quincunx solution that was prevalent on PS3 for ages.
 
We know Powerstar Golf is sub-1080p??

Tomb Raider is 1080p during gameplay, so I think that counts as 1080p.

Ironically, the One has the only retail exclusive at 1080p locked 60fps (Forza 5), so the eSRAM must not be a total disaster.
 
You can fit any game into 1080p60 on any HD console as long as you pick your graphics targets to fit. Ergo, the existence of a 1080p60 title doesn't equate to a capable platform. The better statement is that, for the given graphical targets the devs are aiming for, they are finding the eSRAM a bottleneck at the moment.
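The "pick your targets" trade-off is just pixel and frame-time arithmetic. A quick illustrative sketch of why dropping resolution or frame rate buys back so much budget:

```python
# Per-frame pixel counts at common next-gen resolutions, and the
# frame-time budget at 60 vs 30 fps. Purely illustrative arithmetic.

def pixels(w, h):
    return w * h

res = {
    "1080p": pixels(1920, 1080),
    "900p":  pixels(1600, 900),
    "720p":  pixels(1280, 720),
}

for name, px in res.items():
    print(f"{name}: {px:,} pixels ({px / res['1080p']:.0%} of 1080p)")

# At 60 fps every frame must finish in ~16.7 ms; at 30 fps you get double.
for fps in (60, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

Dropping from 1080p to 900p cuts the shaded pixels to about 69% and 720p to about 44%, which is why a dev can always hit 1080p60 by scaling back everything else instead.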
 
So not only is it possible to fit a 1080p image in the framebuffer, it is also possible to move it in and out quick enough to achieve 60fps.

Move it in and out?

I wonder if tiling will once again be a solution to the limited amount of embedded RAM.
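The tiling idea is straightforward to sketch: split the render target into strips small enough that each strip's buffers fit in the fast embedded RAM, then process strips one at a time. The strip budget and bytes-per-pixel below are assumed values for illustration, not how any actual XB1 title does it.

```python
# Minimal tiling sketch: cover a 1080p frame with horizontal strips
# whose combined render targets fit inside a 32 MB fast-RAM budget.
# BYTES_PER_PIXEL is an assumed total across all targets.

ESRAM_BYTES = 32 * 1024 * 1024
W, H = 1920, 1080
BYTES_PER_PIXEL = 25  # assumed sum over all render targets

def strip_height(budget, width, bpp):
    """Tallest horizontal strip whose buffers fit in the budget."""
    return budget // (width * bpp)

def tiles(width, height, budget, bpp):
    """Yield (y0, y1) row ranges covering the frame, strip by strip."""
    step = strip_height(budget, width, bpp)
    for y0 in range(0, height, step):
        yield (y0, min(y0 + step, height))

strips = list(tiles(W, H, ESRAM_BYTES, BYTES_PER_PIXEL))
print(f"strip height: {strip_height(ESRAM_BYTES, W, BYTES_PER_PIXEL)} rows")
print(f"{len(strips)} strips to cover {H} rows")
```

Under these assumptions two strips cover the frame, which matches the Rebellion quote about doing it "in chunks"; the real cost is that geometry overlapping a strip boundary has to be handled per strip, as anyone who tiled on the 360's eDRAM will remember.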
 