Codename Vejle: Xbox 360 S combined CPU/GPU

I am sure someone posted a link to developers saying how they squeezed more out of the 360 by coding to the metal. I'll see if I can find the link(s). It would sound really strange if devs could not bypass the DX9-esque 360 API, yet still say that you can get closer to the HW/metal with consoles than with a PC. Eh?
 

There's a difference between getting closer to the hardware than a PC allows and "going to the metal", which is in itself rather ambiguous.
 
How many games actually use real Deferred Rendering on consoles? Two (KZ2 and Trials HD)? Three?
I would not be surprised if the answer is instead "plenty" ;) (Insomniac uses deferred renderers, Remedy for Alan Wake, Crytek, Rockstar, etc.).

But back to the topic: it's a shame we'll never find out what performance improvements a unified chip might have offered over the old design; it would have been interesting to look into going forward :) (We're left waiting for the Ontario release :( ).
 
A lot more than you think! ;) But seriously, the point is that moving forward, MRT usage is going to increase.

Hm, I hope I understand you. But current "light" DRs like CE3 don't use big G-buffers (Crysis 2: two 32-bit RTs). Is there a limitation that all RTs have to be the same size?
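As a rough illustration of why G-buffer size matters, here is a back-of-the-envelope sketch of the memory footprint of a thin vs. a fat MRT layout at 720p. The layouts and byte counts are illustrative assumptions, not confirmed engine configurations:

```python
# Back-of-the-envelope G-buffer footprint (illustrative assumptions only).

def gbuffer_bytes(width, height, bytes_per_pixel_per_rt):
    """Total memory for one MRT pass; every RT shares the same dimensions."""
    return sum(width * height * bpp for bpp in bytes_per_pixel_per_rt)

W, H = 1280, 720  # typical console-era render resolution

# "Light" layout in the CE3 spirit: two 32-bit (4-byte) targets.
thin = gbuffer_bytes(W, H, [4, 4])

# A "fat" deferred-shading layout: four 32-bit targets
# (e.g. albedo, normals, material params, depth/misc).
fat = gbuffer_bytes(W, H, [4, 4, 4, 4])

print(thin / (1024 * 1024))  # 7.03125 MiB
print(fat / (1024 * 1024))   # 14.0625 MiB
```

On a console with limited fast memory (the 360's 10 MB eDRAM being the obvious case), the difference between those two footprints is exactly what drives engines toward thinner layouts.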
 
I don't think so... I am not sure how one would define a render target, but in Killzone 2 the DoF buffer is only quarter resolution. Blending different resolutions together is quite simple too, as can be seen in many games: Killzone again, Alan Wake, and many others.
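A minimal sketch of what mixing resolutions involves, assuming simple nearest-neighbour upsampling (a real engine would filter on the GPU); the quarter-resolution buffer here is a stand-in for something like the KZ2 DoF buffer:

```python
# Nearest-neighbour upsample of a quarter-res buffer (half width, half
# height) back to full resolution, before blending. Illustrative only.

def upsample_nearest(buf):
    """Expand an (h x w) grid to (2h x 2w) by repeating each texel."""
    out = []
    for row in buf:
        expanded = [texel for texel in row for _ in range(2)]
        out.append(expanded)
        out.append(list(expanded))
    return out

quarter = [[1, 2],
           [3, 4]]                 # 2x2 "DoF" buffer
full = upsample_nearest(quarter)   # 4x4 result
print(full)  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```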
 
kagemaru said:
then where do reports such as this come from?
I wouldn't know, but IIRC this article was already debated on B3D back when it was first posted. The irony is that early in this generation, the PS3 was the most restricted console in terms of HW access (though they have mostly caught up since).

Arnold Beckenbauer said:
Is there a limitation, that all RTs have to have the same size?
Only those that belong to the same MRT pass.
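The constraint above can be sketched with a hypothetical validation helper (not a real graphics-API call): targets bound in one MRT pass must match in size, while separate passes are free to differ:

```python
# Hypothetical check mirroring the MRT rule: all render targets bound
# simultaneously in one pass must share dimensions.

def validate_mrt_pass(targets):
    """targets: list of (width, height) tuples bound in a single MRT pass."""
    if not targets:
        return True
    w, h = targets[0]
    return all((tw, th) == (w, h) for tw, th in targets)

gbuffer_pass = [(1280, 720), (1280, 720)]  # OK: same size
mixed_pass   = [(1280, 720), (640, 360)]   # not allowed in one pass
assert validate_mrt_pass(gbuffer_pass)
assert not validate_mrt_pass(mixed_pass)
# A quarter-res DoF buffer is fine -- it just lives in its own pass:
assert validate_mrt_pass([(640, 360)])
```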
 

Interesting, thanks for the heads up.
 
But if you are a developer you will be acquainted with TCR 12, which states that you can only address elements of the 360 hardware through Microsoft APIs, as you can see from slide 53 of this presentation.

And they have good reasons for this.

The big one would be that, for their next console (the Xbox 720 or whatever they call it), backwards compatibility would be much easier to implement. It's an added bonus for developers who want the game ported to the PC.
 
How much does it hurt how the games look, though, not being able to "code to the metal"? Any ideas? Though I'm sure it's OT now. And Faf even said something about the PS3 being even worse in this regard?

And why does Sony not have a similar rule, if it supposedly has these slam-dunk advantages?
 
I would not be surprised if the answer is instead "plenty" ;) (Insomniac uses deferred renderers, Remedy for Alan Wake, Crytek, Rockstar, etc.).

But back to the topic: it's a shame we'll never find out what performance improvements a unified chip might have offered over the old design; it would have been interesting to look into going forward :) (We're left waiting for the Ontario release :( ).

Not sure about Insomniac and Remedy, but Crytek and Rockstar use Deferred Lighting (aka Light Pre-Pass), which may be what the poster before you referred to as "not real Deferred Shading".

Can anyone explain why the "front-side bus emulation" was needed? What exactly would break? I doubt anyone makes their games depend on such low-level timing, but who knows?
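To make the Deferred Shading vs. Light Pre-Pass distinction above concrete, here is the pass structure of the two approaches sketched as plain lists (simplified, illustrative naming; real engines vary):

```python
# Pass structure of the two deferred approaches (illustrative sketch).

deferred_shading = [
    "geometry: write full G-buffer (albedo, normals, depth, material)",
    "lighting: screen-space pass reads G-buffer, outputs final colour",
]

deferred_lighting = [  # aka Light Pre-Pass
    "geometry #1: write thin buffer (normals + depth only)",
    "lighting: accumulate lights into a light buffer",
    "geometry #2: re-render scene, combine materials with light buffer",
]

# Trade-off: Light Pre-Pass pays for a second geometry pass but keeps the
# G-buffer small, which is attractive on bandwidth-limited consoles.
print(len(deferred_shading), len(deferred_lighting))  # 2 3
```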
 

What makes it even more bizarre for me is that Microsoft already supports a wide range of performance profiles for their games. You can run a 360 without a HDD entirely, let titles cache to the drive, or even install them to the HDD (and that's a HDD much faster than the one featured in launch units), which can have a big impact on performance/stuttering/streaming/load times/etc. in certain games.

It just seems like a waste of silicon to me for something that could actually have encouraged a few enthusiasts to "upgrade". I'm sure they have their reasons, and if it was proven to break compatibility in some key titles then fair enough, but it doesn't stop me from wanting to find out what the effect of the integration would have been without the FSB emulation hardware.
 
Microsoft wouldn't make much money from enthusiasts upgrading; they want you to buy more games. Making everything match existing hardware is all about risk reduction. They want to ensure games don't suddenly have weird bugs. I suspect most games would run fine with reduced CPU-GPU latency, but the five people who would upgrade for what's likely a very small performance enhancement aren't worth the risk.
 
I don't think the supposed increase in performance from not emulating the exact FSB bandwidth/latency is that interesting. But I'm sure they wouldn't have spent time and silicon on this emulator if things would work without it, so I'm curious: what is it that would break existing games?
 
How much does it hurt how the games look, though, not being able to "code to the metal"? Any ideas? Though I'm sure it's OT now. And Faf even said something about the PS3 being even worse in this regard?

And why does Sony not have a similar rule, if it supposedly has these slam-dunk advantages?

He stated PS3 was more restrictive at the beginning, not currently. :)

As to the second question: the only major piece of hardware that could be coded to the metal for which Sony wouldn't have access or a license to the IP is the Nvidia GPU. And since many of the things GPUs traditionally do can be done on the SPUs, it's easier to just push developers toward optimizing for the SPUs than toward coding to the metal of a less capable GPU. And in the process, there's no need to worry about getting Nvidia's approval for any potential BC through emulation of Nvidia-specific hardware features.

Regards,
SB
 
I have to wonder if Sony cares about BC at all at this point. Their new strategy seems to be to sell old games as HD remixes (the God of War Collection) or maybe as DLC.
 