How Do You Suppose PS3 Will Provide Backwards Compatibility

Squeak said:
what does that tell us about the 32GB/s eDRAM in Xenos?
1.) That's BW TO the eDRAM. Internally, for the work it does, it's got 256 GB/s. PS2's 48 GB/s BW isn't enough for what Xenos has to do next-gen. 2.) That's so totally off topic :oops: . I'll pretend I didn't reply and I'm really talking about PS3's BC solution, which I believe uses magic, perhaps small gremlins and fairies.
 
Shifty Geezer said:
1.) That's BW TO the eDRAM. Internally, for the work it does, it's got 256 GB/s. PS2's 48 GB/s BW isn't enough for what Xenos has to do next-gen. 2.)
But that's "only" for AA and stencil. AFAICS 10MB is just enough for Z and the backbuffer, so if you want to do alpha blending or render to texture, you have to go through system memory or the bus to the VPU.
 
Pcostabel said:
OTOH, the GS is a very simple rasterizer: the RSX will have no problem emulating it and adding enhancements like AA and higher res
I disagree; simple though GS may be, the way it gets used in games is not simple to emulate at all. Unless RSX comes with certain GS extensions (in regard to address modes), it's very much in question whether it's possible to do fast enough.
As for AA/resolution enhancements - unless Sony follows up on MS's idea of game profiles, forget about it; it's just not going to happen.


Squeak said:
If RSX's framebuffer compression is good enough to give the equivalent of 48GB/s of bandwidth, what does that tell us about the 32GB/s eDRAM in Xenos? Is it silicon well spent, or would it have been better to put some (less area-consuming) bandwidth-saving logic on the main die?
You don't need to think of it in such brute force manner - SDTV framebuffers are relatively small and the better optimized the game, the more closely it follows page buffer coherency.

IMO something around a 128KB renderbuffer cache would trivially serve the bandwidth requirements of typical GS rendering patterns 90% of the time, even with much slower external memory than what RSX has available. And regular texture bandwidth is a non-issue altogether.
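A back-of-envelope calculation shows why the raw numbers support this: an SDTV framebuffer is small, and even with generous overdraw the resulting traffic is a fraction of a modern GPU's bandwidth. All figures below are illustrative assumptions, not measured numbers.

```python
# Rough framebuffer traffic estimate for a typical PS2-era game.
# Resolution, overdraw and access counts are illustrative assumptions.

WIDTH, HEIGHT = 640, 448      # common PS2 SDTV render resolution
BPP = 4                       # 32-bit colour
FPS = 60
OVERDRAW = 4                  # assumed average overdraw per pixel

frame_bytes = WIDTH * HEIGHT * BPP
# colour read+write plus Z read+write per shaded pixel, times overdraw
traffic_per_frame = frame_bytes * OVERDRAW * 4
traffic_per_second = traffic_per_frame * FPS

print(frame_bytes / 1024, "KB per buffer")            # ~1120 KB
print(traffic_per_second / 2**30, "GB/s raw traffic")  # ~1 GB/s
```

Roughly 1 GB/s of raw framebuffer traffic, before any caching, against the tens of GB/s a next-gen GPU has available - which is why a small on-chip cache exploiting page coherency would cover the common case.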

However, IMO there are two real problems with GS emulation.
One is that you need to emulate its addressing schemes in detail - the majority of effects in PS2 games are coded around the specific behaviour of memory accesses and addressing, and you won't get away with any kind of HLE for that.
The other problem is the few cases of games that ping-pong between texture & render buffer a lot (sometimes on every rendered primitive, using the same memory addresses for texture and render buffer) - which will kill bandwidth on any GPU without unified eDRAM (meaning a Xenos-like setup would be useless to assist with this as well). The only way I see around this issue is hoping that the emulation of the rest of the rendering would be faster than the real GS by a large enough amount to compensate for the slower rendering whenever these kinds of rendering packets are encountered over the course of a frame.
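To make the addressing point concrete: GS VRAM is laid out in pages and blocks rather than in scanlines, so a pixel's byte address depends on the tiling, and two buffers can legally alias the same bytes. A toy sketch of the kind of translation a "GS extension" (or an emulator) would have to perform - the tile sizes here are illustrative, not the real GS constants:

```python
# Toy block-tiled address translation. PAGE/BLOCK sizes are illustrative
# stand-ins for the GS's real, format-dependent layout constants.

PAGE_W, PAGE_H = 64, 32      # pixels per page (illustrative)
BLOCK_W, BLOCK_H = 8, 8      # pixels per block (illustrative)

def tiled_address(x, y, fb_width):
    """Map a (x, y) pixel to its offset in a block-tiled layout."""
    pages_per_row = fb_width // PAGE_W
    page = (y // PAGE_H) * pages_per_row + (x // PAGE_W)
    bx, by = (x % PAGE_W) // BLOCK_W, (y % PAGE_H) // BLOCK_H
    block = by * (PAGE_W // BLOCK_W) + bx
    offset = (y % BLOCK_H) * BLOCK_W + (x % BLOCK_W)
    blocks_per_page = (PAGE_W // BLOCK_W) * (PAGE_H // BLOCK_H)
    return (page * blocks_per_page + block) * (BLOCK_W * BLOCK_H) + offset

print(tiled_address(8, 0, 640))  # 64 - a linear layout would give 8
```

Games that compute addresses by hand (e.g. to read a render target back as a texture in a different pixel format) depend on exactly this layout, which is why HLE that only tracks "logical" buffers falls apart.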

Anyway, one thing I'm certain about is that if they ended up sticking a full GS chip in there, it will NOT be part of RSX die.


Bohdy said:
What is more salient to me (at least) is how MS is managing to emulate the Xbox in real time. Sure, they have licensed the NV2A hardware for the GPU side of things, but emulating the 733MHz P3 must be a nightmare, with multiple cores providing seemingly little benefit.
Are they actually going to use NV2A hardware in there? I was under the impression that it'll be some sort of high-level emulation, so basically whenever a game does low-level stuff that isn't quite within DX spec, they'll code it into one of those game profiles (hence the initial low compatibility rate).

The P3 emulation is an interesting question though; if they were really sticking with static recompiles then you could well be looking at a separate "emulation profile" for each game - which, at least to me, sounds like there'll never be anything close to full library compatibility.
 
Fafalada said:
I disagree; simple though GS may be, the way it gets used in games is not simple to emulate at all. Unless RSX comes with certain GS extensions (in regard to address modes), it's very much in question whether it's possible to do fast enough.

I know, but a simple hardware address decoder should take care of that. I don't think you'd need to put the whole GS+4MB VRAM in there.

As for AA/resolution enhancements - unless Sony follows up on MS's idea of game profiles, forget about it; it's just not going to happen.

Why not? The enhancements on PS2 can be switched on or off by the user; there is no need to have game profiles. Scaling everything to 1080p should be pretty much transparent to the game, and so should FSAA.

However, IMO there are two real problems with GS emulation.
One is that you need to emulate its addressing schemes in detail - the majority of effects in PS2 games are coded around the specific behaviour of memory accesses and addressing, and you won't get away with any kind of HLE for that.

I agree, this will probably need some hardware support.

The other problem is the few cases of games that ping-pong between texture & render buffer a lot (sometimes on every rendered primitive, using the same memory addresses for texture and render buffer) - which will kill bandwidth on any GPU without unified eDRAM (meaning a Xenos-like setup would be useless to assist with this as well). The only way I see around this issue is hoping that the emulation of the rest of the rendering would be faster than the real GS by a large enough amount to compensate for the slower rendering whenever these kinds of rendering packets are encountered over the course of a frame.

Not sure I follow. Why would rendering to texture kill bandwidth? I'm not familiar with nVidia architecture, but why would it matter if you render to a texture or to the frame buffer?

The P3 emulation is an interesting question though; if they were really sticking with static recompiles then you could well be looking at a separate "emulation profile" for each game - which, at least to me, sounds like there'll never be anything close to full library compatibility.

That's why I think they have to include the EE in there. Emulating it will inevitably result in limited compatibility, which is clearly not what Sony has in mind.
 
Pcostabel said:
I know, but a simple hardware address decoder should take care of that. I don't think you'd need to put the whole GS+4MB VRAM in there.
I agree, and I am still hoping they can do something like that, the benefit of RSX advanced texture filtering would already improve PS2 games a lot, even without any extra tricks. This would also still go along with the comments about "software+hardware" solution.

Why not? The enhancements on PS2 can be switched on or off by the user; there is no need to have game profiles. Scaling everything to 1080p should be pretty much transparent to the game, and so should FSAA.
From the hw perspective there's no concept of what is a frame buffer and what is a texture, let alone which buffers can be antialiased and which can't. And then you also have games where rendering buffers move around the memory mid-frame, making the situation even messier.
Unless you know the exact configuration of memory used in a specific game in advance, the emulator has no way of knowing what should/can be scaled or antialiased. The only universal approach would be to scale up the entire "virtual" embedded memory, but that would also mean those GS addressing extensions would get more complicated, and multisampling AA is still flat out of the question.

Game profiles are the only way to give emulator the info about where screen buffers are and which parts are safe to mess around with.
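A sketch of what such a game-profile lookup might amount to. Everything here is hypothetical - the title IDs, addresses, and fields are made up for illustration, not anything Sony specified:

```python
# Hypothetical game-profile database: per-title hints telling the
# emulator where the screen buffers live and what is safe to enhance.
# Title IDs and addresses are invented for this sketch.

PROFILES = {
    "SCES-50000": {                 # made-up title ID
        "backbuffer": 0x00000,      # buffer base address (illustrative)
        "zbuffer":    0x46000,
        "safe_to_upscale": True,
    },
}

def enhancements_for(title_id):
    """Pick an emulation mode based on the profile DB, if any."""
    profile = PROFILES.get(title_id)
    if profile is None:
        return {"mode": "standard-ps2"}     # no entry: plain emulation
    if profile["safe_to_upscale"]:
        return {"mode": "enhanced", "backbuffer": profile["backbuffer"]}
    return {"mode": "standard-ps2"}
```

The point of the structure is exactly what's argued above: without an entry, the emulator has no idea which addresses are "the screen", so it must fall back to bit-accurate behaviour.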

Not sure I follow. Why would rendering to texture kill bandwidth?
PC GPUs use small onboard caches to keep traffic with main memory to a minimum and help hide access latencies. Things like flushing the texture cache on every rendered polygon are a pretty safe way to bring any current PC GPU to its knees.
Switching render targets is afaik also an expensive and not GPU-friendly operation if done very often - and again, on PS2 we sometimes do this on a per-primitive basis because it has no real penalty to speak of.
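To get a feel for the scale of that penalty, here is a toy cost model. Every cycle count below is invented purely for illustration - the point is only the shape of the comparison, not the actual magnitudes:

```python
# Toy cost model: per-primitive render-target switches on a cache-based
# GPU vs GS-style unified eDRAM. All cycle counts are invented.

PRIMS = 10_000                 # primitives drawn per frame
DRAW_COST = 200                # assumed cycles to draw one primitive
SWITCH_COST_CACHED = 5_000     # assumed cycles: flush caches, drain pipe
SWITCH_COST_EDRAM = 0          # GS: "no real penalty to speak of"

def frame_cycles(switch_cost):
    # pathological case: one render-target switch per primitive
    return PRIMS * (DRAW_COST + switch_cost)

ratio = frame_cycles(SWITCH_COST_CACHED) / frame_cycles(SWITCH_COST_EDRAM)
print(ratio)  # 26.0 - an order of magnitude lost to switching alone
```

Even with made-up numbers, the structure shows why the "hope the rest of the frame is fast enough to compensate" argument above is the only real escape hatch.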
 
Fafalada said:
From the hw perspective there's no concept of what is a frame buffer and what is a texture, let alone which buffers can be antialiased and which can't. And then you also have games where rendering buffers move around the memory mid-frame, making the situation even messier.
Unless you know the exact configuration of memory used in a specific game in advance, the emulator has no way of knowing what should/can be scaled or antialiased. The only universal approach would be to scale up the entire "virtual" embedded memory, but that would also mean those GS addressing extensions would get more complicated, and multisampling AA is still flat out of the question.

Game profiles are the only way to give emulator the info about where screen buffers are and which parts are safe to mess around with.

Well, it is a pretty safe assumption that whatever is displayed is the frame buffer :)
It should be trivial to figure out where the buffers are from the CRT settings. This doesn't have to work with every game, but how many games are moving the framebuffers around anyway? Personally, I'd be happy if I could play GT4 at 1080p even without AA. If I can choose a setting to do that, even if it screws up things like DOF and other full-screen effects, it's a choice I'd like to have.

PC GPUs use small onboard caches to keep traffic with main memory to a minimum and help hide access latencies. Things like flushing the texture cache on every rendered polygon are a pretty safe way to bring any current PC GPU to its knees.
Switching render targets is afaik also an expensive and not GPU-friendly operation if done very often - and again, on PS2 we sometimes do this on a per-primitive basis because it has no real penalty to speak of.

I see, thanks for the explanation.
 
Fafalada said:
Are they actually going to use NV2A hardware in there? I was under impression that it'll be some sort of highlevel emulation, so basically whenever a game does lowlevel stuff that isn't quite within DX spec, they'll code it into one of those game profiles (hence the initial low compatibility rate).

There were reports that MS licensed nVidia tech for Xbox BC a while back, and I interpreted this as a hardware solution, but I may be mistaken.

Fafalada said:
The P3 emulation is an interesting question though, if they were really sticking with static recompiles then you could well be looking at separate "emulation profile" for each game - which at least to me would sound like there'll never be anything close to full library compatibility.

It does seem that way to me, as I don't know of any other option for emulating it. I haven't heard of any common emulators that work at full speed with much less than a 10:1 speed ratio, and especially none that can gain speed linearly with multiple CPUs. Emulation has traditionally been a very serial task, afaik.
 
OT: About X360 BC

Bohdy said:
There were reports that MS licensed nVidia tech for Xbox BC a while back, and I interpreted this as a hardware solution, but I may be mistaken.
This deal gives MS access to low-level information about the NV2A architecture and also gives them the right to use it (in a software emulator).
It's now too late for MS or ATI to implement any NV2A-related silicon in Xenos.

Also, it seems that MS decided to go the recompilation route for the .XBEs, so it's clearly a case-by-case kind of emulation, making any hardwired emulation of the GPU a non-issue (or at least not that important a one).

And about the main topic: I have to say I'm more interested in how the PS3 will handle PSone emulation.
Because in this case there's clearly room for a lot of visual enhancement (resolution, FSAA, filtering). Will they go dynarec + HLE for the graphics, or will they opt for R3K + HLE?
 
Vysez said:
I have to say I'm more interested in how the PS3 will handle PSone emulation
I can't imagine that it would handle it in a specialised way at all.. it makes sense just to emulate PS2. The PS2's emulation of PS1 titles just drops out of it, in the same way as the emulation of other PS2 applications.

Of course, there's nothing to stop a native rebuild of the PS1 emulator from taking place, but given the time constraints SCEI are under (for both completing the PS2 emulator and testing PS1/PS2 titles), I seriously doubt they'd risk not being ready for launch by adding yet more complexity to the emulation setup.

Like others in this thread, I also believe that PS2 emulation will not add any additional gloss to a title (resolution increases, antialiasing, better texture filtering etc)... having worked on PS2 quite extensively, I can't see how SCEI would be able to distinguish GPGPU-like work and 'normal' rendering from each other. As far as the h/w is concerned, it's just pushing pixels without any associated context.

And game-specific profiles are out of the question.. SCEI have *no idea at all* what internal tricks each individual title is performing.. they perform only minor analysis of executables prior to submission (primarily to check that devs aren't calling OS functions that they shouldn't...). Knowing which bits of executables are using which areas of VRAM (dynamically allocated, in many cases), for specific display purposes, is beyond their capability.
 
Pcostabel said:
Well, it is a pretty safe assumption that whatever is displayed is the frame buffer
Unfortunately not - what the CRTC reads is the front buffer, and in 99% of PS2 titles that's just a copy of the backbuffer after all rendering has finished. (Some games may add some postprocessing to the frontbuffer too, but either way, the bulk of the rendering happens in the backbuffer.)
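A minimal sketch of that frame flow (tiny stand-in buffers, not real GS state) shows why watching the display address alone doesn't help - every draw touches the backbuffer, and the scanned-out front buffer only ever sees a finished copy:

```python
# Typical PS2-style double buffering: render into `back`, then copy to
# `front` for the CRTC to scan out. Buffers are tiny stand-ins.

back, front = bytearray(16), bytearray(16)

def render_frame():
    for i in range(len(back)):    # "rendering": all writes hit back
        back[i] = i
    front[:] = back               # end-of-frame copy to the front buffer

render_frame()
# The CRTC sees `front`, but the rendering the emulator would need to
# enhance (AA, upscaling) happened entirely in `back`.
```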

Arnie Pie said:
I can't imagine that it would handle it in a specialised way at all.. it makes sense just to emulate PS2. The PS2's emulation of PS1 titles just drops out of it
Well, I'd imagine that the SPU and R3000 emulation modules will be shared between the two, but if we have a virtual GS there's no need to extend it for PS1 compatibility when you could emulate the PS1 GPU explicitly.
There's of course always a chance they'd just use a real GS in there.
So I do hope the PS1 GPU gets emulated, if for no other reason than that I'd prefer PS1 titles in progressive. Interlacing in titles that run in hires vertical gives me a splitting headache (there was no concept of flicker filters back then).

And game-specific profiles are out of the question.. SCEI have *no idea at all* what internal tricks each individual title is performing..
Running the title through the PA, it's not all that difficult to see where most of the buffers are and what they're doing. Make an online database of game profiles that gets updated over time - start with only 1st party titles if necessary (there are plenty of hits there; the GT series alone in true HD would be a big deal for many people out there).
If a game has an entry in the DB, the option for HD modes gets highlighted when you stick the DVD in; if not, you get standard PS2.
 
BlueTsunami said:
Microsoft did license some NVidia tech, but apparently it was low-level stuff (not sure exactly what).

Here's the link to a thread that talked about this issue....

Nvidia Licenses Technology to Microsoft for Backwards Compatiblity

It's pure software; they'll be using the same shader replacement tricks graphics companies have been using to 'cheat' in benchmarks. They must have had to license the nVidia-specific instructions to do it legally.
 
Fafalada said:
Running the title through PA it's not all that difficult to see where most of the buffers are and what they're doing.

You seriously think that SCEI would do this? Even given the fact that a very large number of titles use a dynamic VRAM allocation scheme (where the addresses of the buffers are not fixed)? I don't know if you're a developer, but if you are, then surely you know how long it takes to analyze a single frame *accurately* on a PA, right? A long time.. and that's for a title you're working on. Analyzing a single frame of someone else's title (ie without a map file and the source code) and deriving some kind of game-specific meaning from the data is a much more daunting prospect. And that's not including the headaches associated with each frame being different! Or with code being in overlays! It's madness!

SCEI have committed to backwards compatibility on day one. I have no doubt that's what they'll provide (whether completely in software, or as a mix of hardware/software, remains to be seen), but thinking they're going to automagically enhance PS2 (or PS1) titles to look 'HD-like' on PS3 is crazy. PS2 games will look like they're running on PS2. Nothing more. There is no financial gain in this kind of enhancement exercise. SCEI want people to buy PS3 games (with an increased per-unit license fee) for PS3. Not to play their old PS2 games.

Sure, BC looks good on a list of features, but it doesn't increase SCEI's bottom line.
 
If it is deemed necessary to include a complete GS, it seems an awful waste not to be able to leverage the 4MB eDRAM in some way.
35 million transistors is still quite a chunk to have lying around idle when it could be used for texture buffering or other stuff. Although fitting the whole thing onto the RSX die could be a problem, and integrating it for something useful an even bigger one.

About enhancing PS2 games: Even if the emulator doesn't know frame from texture and therefore can't do super sampling or multisampling, it knows polygons. How about edge AA?
Anisotropic and trilinear should "just" be a matter of making the EE emulator tag the geometry that needs to be filtered.
 
Squeak said:
About enhancing PS2 games: Even if the emulator doesn't know frame from texture and therefore can't do super sampling or multisampling, it knows polygons.
You could do SSAA in 99% of the cases, I'd say. MSAA, by its nature, would be impossible.
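The reason SSAA is the tractable one: rendering at a higher resolution and box-filtering down needs no knowledge of which buffer is the screen, whereas MSAA must resolve coverage per buffer. A minimal sketch of what the SSAA downsample amounts to (grayscale values stand in for pixels):

```python
# Minimal 2x supersampling resolve: render at double resolution,
# then box-filter each 2x2 quad down to one output pixel.

def downsample_2x(img):
    """img: list of rows of grayscale values at 2x resolution."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w)] for y in range(h)]

hi_res = [[0, 0, 4, 4],
          [0, 0, 4, 4],
          [8, 8, 0, 0],
          [8, 8, 0, 0]]
print(downsample_2x(hi_res))  # [[0.0, 4.0], [8.0, 0.0]]
```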
 
Arnie Pie said:
I don't know if you're a developer, but if you are, then surely you know how long it takes to analyze a single frame *accurately* on a PA, right?
We don't need to get 'that' accurate - for the most part you just need to map certain address ranges over certain time intervals and then pretend they are different dimensions at runtime. You only need to know the locations of the backbuffer and any offscreen targets that share its ZBuffer - most of those won't move much, if at all.
It's not trivial - but there's a lot of PS2 software out there that wouldn't make it 'that' difficult either.
And I wasn't suggesting they'd have to go about this in a completely black-box approach - that's why I said it could be an added feature for only 1st party titles at first. Like the PS2 enhancements to PS1 titles, it would be optional.

There is no financial gain in this kind of enhancement exercise. SCEI want people to buy PS3 games (with an increased per-unit license fee) for PS3. Not to play their old PS2 games.
Well, I should point out that I was discussing this from an academic point of view - as I posted earlier in the thread, I do NOT expect any resolution/AA enhancements to happen with PS3. That said, RSX emulating GS could relatively easily add enhanced texture filtering (something that plagues PS2 games much more than the lack of "AA" or resolution).

But if we are to debate the economic perspective - there's some room to argue that enhanced PS2 content could be a good selling point early on, especially if it pertains to first party hits such as GT that won't have a PS3 version for at least a year after the machine is out.
Kutaragi has already spoken about 'enhancing' DVD content through fancy upscaling processing; this isn't really any different.

Squeak said:
Although fitting the whole thing onto the RSX die could be a problem, and integrating it for something useful an even bigger one.
The GS die on 90nm is estimated at around 40-45mm2; adding that to a chip that's possibly already over 200mm2 would probably have a noticeable effect on yields.
 
An additional emulation problem will be the need to upres PS1 and PS2 games. Sony can hardly expect people to unplug their HD panels and drag out an old portable TV every time they want to play an old game. If upresing does happen, I wonder whether any other IQ improvements will be made along with it. It could bring a new lease of life to some old PS1 games.
 
Fafalada said:
It's not trivial - but there's a lot of PS2 software out there that wouldn't make it 'that' difficult either.
I guess we'll have to agree to disagree, but I do think it's more difficult than you believe. I've personally done weird stuff with the GS in terms of frame buffer management (and texture management as a whole, which includes off-screen render buffers and dynamic texture uploads to the upper 8 bits of the active display area).. And I know many other developers who do the same kinds of things. I would expect even the big 1st party titles such as GT4 to do similarly weird things to get features like 1080i support working in the limited VRAM space available.

Fafalada said:
That said, RSX emulating GS could relatively easily add enhanced texture filtering (something that plagues PS2 games much more than the lack of "AA" or resolution)
Well, it's not so much the texture filtering that plagues PS2 games; it's the lack of any hardware-assisted mipmapping. It's up to application code to select a mip level for an object (or a strip, if you've got the cycles spare), whereas the PC approach (which I assume will be in RSX) is to automatically determine the mip levels based on derivatives computed during rasterisation. I guess it's possible to handle this in emulation (assuming they don't just slap a GS in the box!), but then what do you do if the application is choosing its mip level for a reason.. maybe it's not supposed to be a function of the derivatives at all.
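For reference, a sketch of what that derivative-based selection amounts to - a simplified version of the usual GPU rule (mip level ~ log2 of the pixel's texel footprint), not RSX's actual hardware path:

```python
# Simplified derivative-based mip selection: the level is roughly
# log2 of the largest screen-space texel gradient of the pixel.

import math

def auto_mip_level(dudx, dvdx, dudy, dvdy, max_level):
    """Pick a mip level from texture-coordinate derivatives (texels/pixel)."""
    footprint = max(math.hypot(dudx, dvdx), math.hypot(dudy, dvdy))
    level = math.log2(max(footprint, 1.0))
    return min(max(int(level), 0), max_level)

print(auto_mip_level(1.0, 0.0, 0.0, 1.0, 6))  # 0: viewed 1:1, base level
print(auto_mip_level(4.0, 0.0, 0.0, 4.0, 6))  # 2: minified 4x
```

Which also illustrates the objection: a game picking its own level (say, deliberately forcing level 0 for a detail pass) produces a different image than this rule, and an emulator can't tell the two intents apart.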

Ok, so maybe your proposed database of 1st party games is something that applies here too..? You know, kind of like "if a specific DMA chain from a specific memory address, on a specific frame number from bootup, is going through VU1 with a specific hardware state enabled, and the specific textures have their mipmaps located at known addresses then turn on hardware mipmapping". Never.. There are way too many variables to make this a feasible solution.. *waaaay* too many.

Fafalada said:
Kutaragi already spoke about 'enhancing' DVD content through fancy upscaling processing, this isn't really any different.
Come on.. this is completely different! Upscaling DVD players have been around for ages (although, strangely, not from Sony :|).. as such it would be stupid to not handle this. And for this it's technically a very simple thing to do.
 