Freeing up the Kinect Reservation *spawn*

I keep noticing the title of this thread is kind of an inadvertent pun

Get it, "freeing", "reservation"? Freeing the reservation?

I don't see that freeing up the resources used by Kinect is going to make a lot of difference to anything. The ~50% deficit it has relative to the PS4 is still going to be ~50% no matter how many resources get freed. It may make the console's non-gaming features slightly snappier (see what I did there) but not a lot else.

It's better than if they don't do it, though. And evidently they need all the help they can get to reach 1080p.

As I've said a few times, you can probably liken it to another hidden ~50 MHz upclock in effect (~7% performance vs the 800 MHz baseline). That's nice imo.
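
A quick back-of-the-envelope sketch of that equivalence in Python (purely illustrative; the 853 MHz figure is the X1's shipped GPU clock, while the 6% "partial return" fraction is my own assumption, not anything MS stated):

```python
# Back-of-the-envelope: express a freed GPU time-slice as an "effective
# upclock", assuming throughput scales linearly with clock and time slice.
BASE_CLOCK_MHZ = 853  # Xbox One's shipped GPU clock (up from 800 MHz)

def effective_upclock_mhz(freed_fraction):
    """MHz-equivalent of getting `freed_fraction` of GPU time back."""
    return BASE_CLOCK_MHZ * freed_fraction

for freed in (0.06, 0.10):  # assumed partial return vs. the full "up to 10%"
    mhz = effective_upclock_mhz(freed)
    print(f"{freed:.0%} freed ≈ +{mhz:.0f} MHz ≈ {mhz / 800:.1%} vs the 800 MHz baseline")
```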

Also, if Sony doesn't feel the need to counter because they're already rolling in excess flops, then it can help comparatively even more (and again, MS needs it). Total conjecture here, but ERP posted a few months ago that the PS4 has a similar OS GPU reservation to the X1. And if you doubt that, just use common sense: at one time the PS4 was supposed to have a much smaller reservation in every area, and like dominoes those claims predictably fell as we learned more (2 CPU cores, 3+GB RAM, etc.).

I'm also wondering if any RAM is possibly being freed up here too. Long ago bkilian, I believe, said there was 1GB of the X1's RAM (out of the 3GB reserved) that was held in limbo. So: 5GB for games, 2GB for the OS, and 1GB in complete "limbo". Supposedly this could have been given back to games one day. I never heard anything more about it, or what happened to the limbo RAM, if anything. It would be nice to get to 6GB. I need to go look at the MS quotes about freeing up OS resources to see how generic they are, or whether they are specific to the GPU.
 
The Xbox One performance update is now confirmed to be Kinect related. Loads of updated comments given to Eurogamer:

http://www.eurogamer.net/articles/2014-06-04-xbox-one-dev-kits-receive-more-gpu-bandwidth

UPDATE 10.25pm: Microsoft has confirmed that this boost in performance is in fact due to Kinect being stripped from the package. When asked if the two were related, a Microsoft spokesperson sent Eurogamer the following response:

"Yes, the additional resources allow access to up to 10 per cent additional GPU performance. We're committed to giving developers new tools and flexibility to make their Xbox One games even better by giving them the option to use the GPU reserve in whatever way is best for them and their games."

And will this mean more games will hit the 1080p 60fps benchmark that's all the rage these days, I asked?

"Xbox One games look beautiful and have rich gameplay and platform features. How developers choose to access the extra GPU performance for their games will be up to them. We have started working with a number of developers on how they can best take advantage of these changes. We will have more to share in the future."

Microsoft also offered the following statement on the dev kit update in general:

“Just as we're committed to making ongoing system updates for our fans to enjoy new features of Xbox One, we're also committed to giving developers new tools and flexibility to make their Xbox One games even better. In June we're releasing a new SDK making it possible for developers to access additional GPU resources previously reserved for Kinect and system functions. The team is continually calibrating the system to determine how we can give developers more capabilities. With this SDK, we will include new options for how developers can use the system reserve as well as more flexibility in our natural user interface reserve (voice and gesture). We'll continue to work closely with developers to help them bring great games to Xbox One.”

So I guess now we have a hard timeline on when devs got the Kinect reserve back. June. Now to wait for the results.
 

If some of the GPU BW has been reserved previously, isn't this confirmation that some of the esram itself was (and possibly still is) being reserved also?

With so little esram for those buffers, could this be shrunk back at some point too?
 
I thought I remembered some people here claiming some ESRAM was reserved. Whether that was accurate, why they thought so, or where those posts are, I have no idea.

The reserved GPU bandwidth could have been all DDR3, though. Nothing confirms whether ESRAM was or wasn't involved.
 
I'm under the impression that the major difference is that the skeletal motion tracking can now be turned off, which is pretty much where the 10% is coming from.
 
If some of the GPU BW has been reserved previously, isn't this confirmation that some of the esram itself was (and possibly still is) being reserved also?

With so little esram for those buffers, could this be shrunk back at some point too?

I'm doubting whether this will enable games with a higher resolution framebuffer. Someone at NeoGAF did some math, and with deferred rendering 792p only just fits in ESRAM:

http://www.neogaf.com/forum/showthread.php?t=819298

792p gives you:

(32 MB × 1024 KB/MB × 1024 B/KB) / (1408 × 792 pixels) ≈ 30.1 bytes per pixel

while it seems that games like Infamous SS already used 40 bytes per pixel. I also remember that the XB1 has "just" 16 ROPs.

So where does this 10% come from? 10% more shaders? 10% more DDR3 bandwidth? An option to put the framebuffer in DDR3 instead of ESRAM?
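
For easy checking, the same budget arithmetic in Python, extended to a few other resolutions (a sketch; it assumes every render target must stay resident in the 32 MB at once, which real engines don't strictly have to do):

```python
# Same budget arithmetic: bytes per pixel available if the whole render
# target set must fit in the Xbox One's 32 MB of ESRAM at once.
ESRAM_BYTES = 32 * 1024 * 1024

RESOLUTIONS = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {ESRAM_BYTES / (w * h):.1f} bytes per pixel")
# 792p comes out to ~30.1 B/px, short of the ~40 B/px a fat deferred
# G-buffer (as reportedly used by Infamous SS) would want.
```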
 

They will free up the 10% GPU time slice that was being used for Kinect (mainly skeleton tracking) and other system multitasking features like Snap. If they give the Kinect-reserved resources back to developers, then I think developers will get more coherent bandwidth between CPU and GPU (which should be important) as well as most of that 10% GPU time slice. They also mentioned that some audio resources reserved for Kinect (DSPs, for example) will become accessible to developers too.
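
In per-frame terms, that "up to 10 per cent" works out as follows (rough Python arithmetic, assuming the reserve scales with frame time):

```python
# Rough per-frame view of the "up to 10 per cent" GPU reserve.
RESERVE = 0.10
for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: {frame_ms:.2f} ms frame, ~{RESERVE * frame_ms:.2f} ms of GPU time reclaimed")
```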

Do developers really need to keep everything in eSRAM at the same time?
 

There are already plenty of games, most in fact, above 792p on X1, right up to plenty of native 1080p games.

You may be correct that it might not be a sea change, but every bit can help.

Still, after all this time there doesn't seem to be a clear view of what's primarily limiting X1 resolution in some cases. At least I have not heard a consistent theme from devs.
 
Do developers really need to keep everything in eSRAM at the same time?

No, it would be a waste of resources (bandwidth) to use the ESRAM only for the render target. And don't forget 1080p was possible on the Xbox 360, too (and its eDRAM gave the developer much less bandwidth). But putting the render target in ESRAM is the fastest way (from a developer's perspective) to get some benefit from it. This should change in the future. Currently most games are multi-gen ports, so the ESRAM only gets used in whatever way takes the least time and work to implement (the quick win).
In the long run it should be used for multiple things. You just don't need all of the render target's data in the fast memory the whole time. The ESRAM is fast enough, and small enough, to move things in and out (while they're not needed) without hurting bandwidth too much, so it can be used for other tasks.
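
A toy sketch of that idea in Python: greedily keep the most bandwidth-hungry targets in ESRAM and spill the rest to DDR3. Every name, size, and weight below is a hypothetical example, not a measurement from any real title:

```python
# Toy greedy placement: keep the most bandwidth-hungry render targets in
# ESRAM, spill the rest to DDR3. All figures are hypothetical.
ESRAM_MB = 32

# (name, size in MB, relative bandwidth demand)
targets = [
    ("depth",          8, 10),
    ("color",          8,  9),
    ("gbuffer_normal", 8,  6),
    ("gbuffer_misc",   8,  4),
    ("shadow_map",    16,  3),
]

remaining = ESRAM_MB
for name, size_mb, _bw in sorted(targets, key=lambda t: -t[2]):
    if size_mb <= remaining:
        remaining -= size_mb
        print(f"{name}: ESRAM ({remaining} MB left)")
    else:
        print(f"{name}: DDR3")
```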

Well, 792p is mostly just a result of targeting 60fps (Forza managed 1080p60, but it is a first-party title and also used some prebaked stuff, precisely because 16 ms is not much time to render a frame).
Some developers have also stated that 1080p at 30fps is easier to reach than 720p at 60fps. It is the per-frame time budget that limits things here. And most (or at least much) of that budget is consumed by the current DirectX API.
792p is just (like I described) the quick-win strategy for using ESRAM bandwidth.

There are not that many 1080p 60fps games on the market for the current gen at all, but most 60fps games (if they are stable around 60; I don't mean indie games) ship at a reduced resolution.
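
Some quick Python arithmetic on why that is (the point being that 720p60 actually pushes fewer pixels per second than 1080p30, so the difficulty is the halved per-frame time budget, not raw pixel count):

```python
# Raw pixel throughput vs. per-frame time budget for the two targets.
modes = [("1080p30", 1920, 1080, 30), ("720p60", 1280, 720, 60)]
for name, w, h, fps in modes:
    print(f"{name}: {w * h * fps / 1e6:.1f} Mpix/s, {1000 / fps:.1f} ms per frame")
# 1080p30 pushes ~62 Mpix/s vs ~55 for 720p60, yet 720p60 leaves only
# 16.7 ms per frame for fixed costs like API/CPU submission.
```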
 
This 10% GPU boost will realistically mean that games running at 720p/900p, 792p/900p, or 900p/1080p (XB1/PS4) will now have framerates and effects as good as on the PS4, which is good news.

That wasn't the case for many multiplatform games: Battlefield 4, Watch_Dogs, Trials, Assassin's Creed 4, where on top of the resolution gap, the XB1 versions also had worse framerates, more screen tearing, or less AO than the PS4 versions.
 
This 10% GPU boost will realistically mean that games running at 720p/900p, 792p/900p, or 900p/1080p (XB1/PS4) will now have framerates and effects as good as on the PS4, which is good news.

That wasn't the case for many multiplatform games: Battlefield 4, Watch_Dogs, Trials, Assassin's Creed 4, where on top of the resolution gap, the XB1 versions also had worse framerates, more screen tearing, or less AO than the PS4 versions.

Indeed, that makes more sense.
 
Well, 792p is mostly just a result of targeting 60fps (Forza managed 1080p60, but it is a first-party title and also used some prebaked stuff, precisely because 16 ms is not much time to render a frame).
Some developers have also stated that 1080p at 30fps is easier to reach than 720p at 60fps. It is the per-frame time budget that limits things here. And most (or at least much) of that budget is consumed by the current DirectX API.
792p is just (like I described) the quick-win strategy for using ESRAM bandwidth.

There are not that many 1080p 60fps games on the market for the current gen at all, but most 60fps games (if they are stable around 60; I don't mean indie games) ship at a reduced resolution.

Titanfall is 792p with 2xMSAA, with less optimization than Forza 5, and it uses prebaked stuff too. I mean, many other games have some prebaked stuff. What's the difference between CoD, Titanfall, and Forza 5? Correct me if I'm wrong, but Forza 5 is the only AAA game with a constant 60fps @ 1080p on current-gen consoles. So it isn't fair to downplay it for various reasons.
 
Titanfall is 792p with 2xMSAA, with less optimization than Forza 5, and it uses prebaked stuff too. I mean, many other games have some prebaked stuff. What's the difference between CoD, Titanfall, and Forza 5? Correct me if I'm wrong, but Forza 5 is the only AAA game with a constant 60fps @ 1080p on current-gen consoles. So it isn't fair to downplay it for various reasons.


I don't think COD has prebaked shadows, since they have different LODs and some objects move, requiring them to be dynamic. I doubt comparing Forza to Ghosts (or even BF) is fair. Those games are definitely more dynamic.
 
Titanfall is 792p with 2xMSAA, with less optimization than Forza 5, and it uses prebaked stuff too. I mean, many other games have some prebaked stuff. What's the difference between CoD, Titanfall, and Forza 5? Correct me if I'm wrong, but Forza 5 is the only AAA game with a constant 60fps @ 1080p on current-gen consoles. So it isn't fair to downplay it for various reasons.

You are wrong.

MGS5 on PS4 is 1080p at a locked 60fps, in a semi-open world, with some decent dynamic lighting and good AA.

Resogun too, but apparently people suddenly don't count 'indies' as proper games...
 
Titanfall is 792p with 2xMSAA, with less optimization than Forza 5, and it uses prebaked stuff too. I mean, many other games have some prebaked stuff. What's the difference between CoD, Titanfall, and Forza 5? Correct me if I'm wrong, but Forza 5 is the only AAA game with a constant 60fps @ 1080p on current-gen consoles. So it isn't fair to downplay it for various reasons.

I know, I only wanted to mention it before someone else complains that I didn't mention it uses prebaked stuff. ;)
And yes, some things were downgraded at the last minute, but that was probably down to the SDK and the fixed launch date.
Forza 5 is a gorgeous-looking game (best when seen live) and yes, it is the only rock-stable 60fps 1080p AAA game on consoles so far.

You are wrong.

MGS5 on PS4 is 1080p at a locked 60fps, in a semi-open world, with some decent dynamic lighting and good AA.

Resogun too, but apparently people suddenly don't count 'indies' as proper games...

Sorry, but Resogun doesn't look that great. Yeah, the particles are somewhat cool, but every other aspect of the game is not that GPU intensive (textures, lighting, ...). And that's true for most indies. There is almost nothing on screen that would justify high GPU usage. But that is what indies are. I don't expect them to be 100% optimized for the hardware, because they are sold cheap and so must be cheap to produce.
 
Sorry, but Resogun doesn't look that great. Yeah, the particles are somewhat cool, but every other aspect of the game is not that GPU intensive (textures, lighting, ...).


Couldn't the same argument be used against Forza? Regardless of how good the game looks (and it definitely does look good), it's simply not rendering with the same level of effects as other games.
 
I don't think COD has prebaked shadows, since they have different LODs and some objects move, requiring them to be dynamic. I doubt comparing Forza to Ghosts (or even BF) is fair. Those games are definitely more dynamic.

Cars and other objects on the circuit have dynamic shadows, and none of the games you named are 1080p with a stable frame rate like FM5. If FM5 always runs at 60fps (without a single frame drop), it's because it is always rendering faster than 60fps: its minimum frame rate is 60fps, while the minimum frame rate of those games is around 40-50fps. And FM5 has its own problems to solve (for example, a physics engine running at 360fps).
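
For scale, some quick Python arithmetic on that 360 Hz physics figure:

```python
# Fixed-rate physics vs. render rate: 360 Hz physics under a 60 fps renderer.
PHYSICS_HZ, RENDER_FPS = 360, 60
print(f"{PHYSICS_HZ // RENDER_FPS} physics ticks per rendered frame; "
      f"{1000 / PHYSICS_HZ:.2f} ms per tick inside a {1000 / RENDER_FPS:.2f} ms frame")
```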

Couldn't the same argument be used against Forza? Regardless of how good the game looks (and it definitely does look good), it's simply not rendering with the same level of effects as other games.

As I said, FM5 is a racing game with its own problems to solve. It's not as if everyone can make a racing sim with FM5's visual fidelity that runs at a locked 60fps @ 1080p (especially on the XB1).

You are wrong.

MGS5 on PS4 is 1080p at a locked 60fps, in a semi-open world, with some decent dynamic lighting and good AA.

Resogun too, but apparently people suddenly don't count 'indies' as proper games...

Yes, it's locked at 60fps at 1080p, but it has low-poly models (on par with the last-gen versions) and mediocre textures. It's a semi-open-world game and has dynamic shadows/lighting, but MGS5:GZ doesn't really seem like a current-gen title to me; it's a launch cross-gen game.

I can't understand the reasons behind comparing Resogun and Strider to FM5.

@Allandor
My bad. ;)
 
I don't think COD has prebaked shadows since they have different LODs and some objects move requiring them to be dynamic. I doubt comparing Forza to Ghosts (even BF) is fair. Those games are definitely more dynamic.

....OK? Do you realize that prebaked radiosity gives you more bang for the buck? And are you suggesting that COD Ghosts is fully dynamically lit, without any prebaked lighting? (It would be quite a feat if it is, though I have my doubts.)
 