Squeak said: what does that tell us about the 32Gb/s EDRAM in Xenos?
1.) That's BW TO the eDRAM. Internally, for the work it does, it's got 256 GB/s. PS2's 48 GB/s BW isn't enough for what Xenos has to do next-gen.
2.) That's so totally off topic. I'll pretend I didn't reply and that I'm really talking about PS3's BC solution, which I believe uses magic, perhaps small gremlins and fairies.
Shifty Geezer said: 1.) That's BW TO the eDRAM. Internally, for the work it does, it's got 256 GB/s. PS2's 48 GB/s BW isn't enough for what Xenos has to do next-gen. 2.)
But that's "only" for AA and stencil. AFAICS 10MB is just enough for Z and the backbuffer, so if you want to do alpha blending or render to texture, you have to go through system memory or the bus to the VPU.
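To put numbers on the "just enough for Z and backbuffer" claim, here is a quick back-of-envelope check (assuming 720p with 32-bit colour and 32-bit Z, a simple linear layout, and 10 MiB of eDRAM):

```python
def fb_bytes(width, height, bpp_color=4, bpp_z=4, samples=1):
    """Colour + depth storage for one render target (linear layout assumed)."""
    return width * height * samples * (bpp_color + bpp_z)

EDRAM = 10 * 1024 * 1024  # Xenos has 10MB of eDRAM

plain = fb_bytes(1280, 720)             # 7,372,800 bytes: fits
msaa2 = fb_bytes(1280, 720, samples=2)  # 14,745,600 bytes: does not fit
print(plain < EDRAM, msaa2 < EDRAM)     # True False
```

So a single 720p colour+Z target fits with room to spare, but multisampling, or any extra targets for render-to-texture, immediately push past the 10MB and back out over the bus, which is the point being made above.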
Pcostabel said: OTOH, the GS is a very simple rasterizer: the RSX will have no problem emulating it and adding enhancements like AA and higher res
I disagree; simple though the GS may be, the way it gets used in games is not simple to emulate at all. Unless RSX comes with certain GS extensions (with regard to address modes), it's very much in question whether it's possible to do fast enough.
Squeak said: If RSX's framebuffer compression is good enough to give the equivalent of 48Gb/s bandwidth, what does that tell us about the 32Gb/s EDRAM in Xenos? Is it silicon well spent, or would it have been better to put some (less area-consuming) bandwidth-saving logic on the main die?
You don't need to think of it in such a brute-force manner: SDTV framebuffers are relatively small, and the better optimized the game, the more closely it follows page-buffer coherency.
Bohdy said: What is more salient to me (at least) is how MS is managing to emulate the Xbox in real time. Sure, they have licensed the NV2A hardware for the GPU side of things, but emulating the 700MHz P3 must be a nightmare, with multiple cores providing seemingly little benefit.
Are they actually going to use NV2A hardware in there? I was under the impression that it'll be some sort of high-level emulation, so basically whenever a game does low-level stuff that isn't quite within the DX spec, they'll code it into one of those game profiles (hence the initial low compatibility rate).
Fafalada said: I disagree; simple though the GS may be, the way it gets used in games is not simple to emulate at all. Unless RSX comes with certain GS extensions (with regard to address modes), it's very much in question whether it's possible to do fast enough.
As for AA/resolution enhancements: unless Sony follows up on MS's idea of game profiles, forget about it; it's just not going to happen.
However, IMO there are two real problems with GS emulation.
One is that you need to emulate its addressing schemes in detail: the majority of effects in PS2 games are coded around specific behaviour of memory accesses and addressing, and you'll not get away with any kind of HLE for that.
The other problem is the few cases of games that ping-pong between texture and render buffer a lot (sometimes on every rendered primitive, using the same memory addresses for texture and render buffer), which will kill bandwidth on any GPU without unified eDRAM (meaning a Xenos-like setup would be useless to assist here too). The only way I see around this issue is hoping that the emulation of the rest of the rendering is faster than the real GS by a large enough margin to compensate for the slower rendering whenever these kinds of rendering packets are encountered over the course of a frame.
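To make the ping-pong problem concrete, here is a toy model of the cost involved; the addresses, packet format and flush rule are invented for illustration (real GPUs track this at page or cache-line granularity), but it shows how the per-primitive pattern forces constant invalidation:

```python
def count_flushes(packets):
    """Count forced texture-cache flushes in a toy GPU model.

    Each packet is (texture_addr, render_addr). A flush is needed
    whenever a packet samples from the address that was just rendered to.
    """
    flushes = 0
    last_render = None
    for tex, rt in packets:
        if tex == last_render:
            flushes += 1
        last_render = rt
    return flushes

# Ping-pong: every primitive reads the buffer the previous one wrote.
ping_pong = [(0x0, 0x1000), (0x1000, 0x0)] * 100
# Batched: many primitives into one target before it is ever sampled.
batched = [(0x2000, 0x1000)] * 199 + [(0x1000, 0x0)]

print(count_flushes(ping_pong))  # 199
print(count_flushes(batched))    # 1
```

Same number of primitives in both workloads, but the ping-pong ordering flushes on essentially every one, which is the behaviour that's nearly free on the GS and punishing everywhere else.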
The P3 emulation is an interesting question though; if they were really sticking with static recompiles then you could well be looking at a separate "emulation profile" for each game, which, at least to me, suggests there'll never be anything close to full library compatibility.
Pcostabel said: I know, but a simple hardware address decoder should take care of that. I don't think you'd need to put the whole GS + 4MB VRAM in there.
I agree, and I am still hoping they can do something like that; the benefit of RSX's advanced texture filtering would already improve PS2 games a lot, even without any extra tricks. This would also still fit with the comments about a "software+hardware" solution.
Why not? The enhancements on PS2 can be switched on or off by the user; there is no need to have game profiles. Scaling everything to 1080p should be pretty much transparent to the game, and so should FSAA.
From the hw perspective there's no concept of what is a frame buffer and what is a texture, let alone which buffers can be antialiased and which cannot. And then you also have games where rendering buffers move around memory in mid-frame, making the situation even messier.
Not sure I follow. Why would rendering to texture kill bandwidth?
PC GPUs use small onboard caches to keep traffic with main memory to a minimum and help hide access latencies. Things like flushing the texture cache on every rendered polygon are a pretty safe way to bring any current PC GPU to its knees.
Fafalada said: From the hw perspective there's no concept of what is a frame buffer and what is a texture, let alone which buffers can be antialiased and which cannot. And then you also have games where rendering buffers move around memory in mid-frame, making the situation even messier.
Unless you know the exact configuration of memory used in a specific game in advance, the emulator has no way of knowing what should/can be scaled or antialiased. The only universal approach would be to scale up the entire "virtual" embedded memory, but that would also mean those GS addressing extensions would get more complicated, and multisampling AA is still flat out of the question.
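As a sketch of why scaling the whole virtual memory complicates the addressing extensions, here is a hypothetical remap of a pixel address in an original buffer to its position in a 2x-scaled one, assuming a plain linear 32-bit layout (the real GS uses swizzled block/page layouts, which is exactly where it gets messy):

```python
def scale_addr(addr, base, width, scale):
    """Map a linear 32-bit-pixel address to the top-left sample of the
    corresponding block in a scale-x enlarged buffer. Linear layout only;
    the GS's swizzled pages would need a far more involved translation."""
    offset = (addr - base) // 4              # pixel index in the original
    x, y = offset % width, offset // width   # original coordinates
    sx, sy = x * scale, y * scale            # scaled coordinates
    return base + (sy * width * scale + sx) * 4

# Pixel (0, 1) of a 640-wide buffer lands at row 2 of the 1280-wide one.
print(scale_addr(2560, 0, 640, 2))  # 10240
```

Every address the game computes by hand would need this translation applied on the fly, and in both directions once the game reads its own buffers back.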
Game profiles are the only way to give the emulator the info about where screen buffers are and which parts are safe to mess around with.
PC GPUs use small onboard caches to keep traffic with main memory to a minimum and help hide access latencies. Things like flushing the texture cache on every rendered polygon are a pretty safe way to bring any current PC GPU to its knees.
Switching render targets is AFAIK also an expensive and not GPU-friendly operation if done very often, and again, on PS2 we sometimes do this on a per-primitive basis because it has no real penalty to speak of.
Fafalada said: Are they actually going to use NV2A hardware in there? I was under the impression that it'll be some sort of high-level emulation, so basically whenever a game does low-level stuff that isn't quite within the DX spec, they'll code it into one of those game profiles (hence the initial low compatibility rate).
Fafalada said: The P3 emulation is an interesting question though; if they were really sticking with static recompiles then you could well be looking at a separate "emulation profile" for each game, which, at least to me, suggests there'll never be anything close to full library compatibility.
Bohdy said: There were reports that MS licensed nVidia tech for Xbox BC a while back, and I interpreted this as a hardware solution, but I may be mistaken.
This deal gives MS access to low-level information about the NV2A architecture and also gives them the right to use it (in a software emulator).
Vysez said: I'm, I have to say, more interested in how the PS3 will handle the PSone emulation
I can't imagine that it would handle it in a specialised way at all; it makes sense just to emulate the PS2. The PS2's emulation of PS1 titles just drops out of it, in the same way as the emulation of other PS2 applications.
Pcostabel said: Well, it is a pretty safe assumption that whatever is displayed is the frame buffer
Unfortunately not: what the CRTC reads is the front buffer, and in 99% of PS2 titles that's just a copy of the backbuffer after all rendering has finished. (Some games may add some postprocessing to the frontbuffer too, but either way, the bulk of the rendering happens in the backbuffer.)
Arnie Pie said: I can't imagine that it would handle it in a specialised way at all.. it makes sense just to emulate PS2. The PS2's emulation of PS1 titles just drops out of it
Well, I'd imagine that the SPU and R3000 emulation modules will be shared between the two, but if we have a virtual GS there's no need to extend it for PS1 compatibility when you could emulate the PS1 GPU explicitly.
And game-specific profiles are out of the question.. SCEI have *no idea at all* what internal tricks each individual title is performing..
Running the title through PA, it's not all that difficult to see where most of the buffers are and what they're doing. Make an online database of game profiles that gets updated over time; start with only 1st-party titles if necessary (there are plenty of hits there; the GT series alone in true HD would be a big deal for many people out there).
BlueTsunami said: Microsoft did license some NVidia tech, but apparently it was low-level stuff (not sure exactly what).
Here's the link to a thread that talked about this issue:
Nvidia Licenses Technology to Microsoft for Backwards Compatiblity
Fafalada said:Running the title through PA it's not all that difficult to see where most of the buffers are and what they're doing.
Squeak said: About enhancing PS2 games: even if the emulator doesn't know frame from texture and therefore can't do supersampling or multisampling, it knows polygons.
You could do SSAA in 99% of the cases, I'd say. MSAA, by its nature, would be impossible.
Arnie Pie said: I don't know if you're a developer, but if you are, then surely you know how long it takes to analyze a single frame *accurately* on a PA, right?
We don't need to get 'that' accurate: for the most part you just need to map certain address ranges over certain time intervals and then pretend they are different dimensions at runtime. You only need to know the locations of the backbuffer and any offscreen targets that share its ZBuffer; most of those won't move much, if at all.
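A game profile along those lines could be as small as a table of address ranges plus the part of the frame during which each range acts as a given buffer. The sketch below is entirely hypothetical (addresses, times and role names are invented), but it shows how little data such a profile actually needs:

```python
# Hypothetical per-game profile: which address ranges act as which
# buffer during which part of the frame. A real profile would be
# derived from PA analysis of the title.
PROFILE = [
    # (start_addr, end_addr, active_from, active_to, role)
    (0x000000, 0x0E1000, 0.0, 1.0, "backbuffer"),
    (0x0E1000, 0x1C2000, 0.0, 1.0, "zbuffer"),
    (0x1C2000, 0x200000, 0.2, 0.6, "offscreen_target"),
]

def classify(addr, t):
    """Return the role of an address at normalized frame time t."""
    for start, end, t0, t1, role in PROFILE:
        if start <= addr < end and t0 <= t <= t1:
            return role
    return "unknown"

print(classify(0x1000, 0.5))    # backbuffer
print(classify(0x1C3000, 0.4))  # offscreen_target
```

Anything the profile classifies as a screen buffer is then fair game for scaling or filtering; anything "unknown" gets emulated verbatim.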
There is no financial gain in this kind of enhancement exercise. SCEI want people to buy PS3 games (with an increased per-unit license fee) for PS3, not to play their old PS2 games.
Well, I should point out that I was discussing this from an academic point of view; as I posted earlier in the thread, I do not expect any resolution/AA enhancements to happen with PS3. That said, RSX emulating the GS could relatively easily add enhanced texture filtering (something that plagues PS2 games much more than the lack of "AA" or resolution).
Squeak said: Although fitting the whole thing onto the RSX die could be a problem, and the integration for something useful an even bigger one.
The GS die at 90nm is estimated at around 40-45mm2; adding that to a chip that's possibly already over 200mm2 would probably have a noticeable effect on yields.
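For a rough sense of that yield cost, the classic Poisson model Y = exp(-A*D0) gives the fraction of good dies as a function of area; the defect density below is an illustrative guess, not a real 90nm figure:

```python
import math

def poisson_yield(area_mm2, d0_per_cm2):
    """Poisson die-yield model: Y = exp(-A * D0), with area in cm^2."""
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

base = poisson_yield(200, 0.5)      # a ~200mm2 die on its own
with_gs = poisson_yield(245, 0.5)   # the same die plus ~45mm2 of GS
print(f"{base:.3f} {with_gs:.3f}")  # 0.368 0.294
```

At that (assumed) defect density, bolting the GS onto the die cuts the number of good dies per wafer by roughly a fifth, before even counting the extra area per die.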
Fafalada said: It's not trivial - but there's a lot of PS2 software out there that wouldn't make it 'that' difficult either.
I guess we'll have to agree to disagree, but I do think it's more difficult than you believe. I've personally done weird stuff with the GS in terms of framebuffer management (and texture management as a whole, which includes off-screen render buffers and dynamic texture uploads to the upper 8 bits of the active display area), and I know many other developers who do the same kinds of things. I would expect even big 1st-party titles such as GT4 to do similarly weird things to get features like 1080i support working in the limited VRAM space available.
Fafalada said: That said, RSX emulating the GS could relatively easily add enhanced texture filtering (something that plagues PS2 games much more than the lack of "AA" or resolution)
Well, it's not so much the texture filtering that plagues PS2 games as the lack of any hardware-assisted mipmapping. It's up to application code to select a mip level for an object (or a strip, if you've got the cycles spare), whereas the PC approach (which I assume will be in RSX) is to automatically determine the mip level from derivatives computed during rasterisation. I guess it's possible to handle this in emulation (assuming they don't just slap a GS in the box!), but then what to do if the application is choosing its mip level for a reason.. maybe it's not supposed to be a function of the derivatives that selects it.
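The derivative-based selection being described is the standard OpenGL/Direct3D rule: take log2 of the larger screen-space texel footprint. A minimal sketch of it (scalar, ignoring anisotropy and LOD bias):

```python
import math

def mip_level(dudx, dvdx, dudy, dvdy, tex_size):
    """Pick a LOD from screen-space UV derivatives, OpenGL-style:
    rho is the larger of the two texel footprints, LOD = log2(rho).
    The GS has nothing like this; the application chooses the level."""
    rho = max(math.hypot(dudx * tex_size, dvdx * tex_size),
              math.hypot(dudy * tex_size, dvdy * tex_size))
    return max(0.0, math.log2(rho))

# A 256-texel texture minified 4x in screen space selects mip level 2.
print(mip_level(4 / 256, 0.0, 0.0, 4 / 256, 256))  # 2.0
```

Which is exactly the problem raised above: silently imposing this rule on a game that picks its levels by hand, for its own reasons, could change the game's output.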
Fafalada said: Kutaragi already spoke about 'enhancing' DVD content through fancy upscaling processing; this isn't really any different.
Come on.. this is completely different! Upscaling DVD players have been around for ages (although, strangely, not from Sony :|), so it would be stupid not to handle this. And for this it's technically a very simple thing to do.