EU PS3 = Software PS2 Emulation

I'm still not convinced that it's possible. The eDRAM at least would need to be physically in the box somewhere IMO.
As PS2 lacked any compression, 4:1 lossless compression at 18 GB/s GDDR would be more than enough to cover required BW.

I'm afraid the cannibalised EuroPS3 won't tell us much. I expect a hole where the EE+GS was, and whatever hardware might be helping BC will be hidden among the other chips. A little 'e'DRAM die, not embedded in anything, doesn't seem likely IMO.
 
As PS2 lacked any compression, 4:1 lossless compression at 18 GB/s GDDR would be more than enough to cover required BW.
There's no lossless compression for textures. Besides, PS2 uses paletted textures, which can be considered a form of compression as well. RSX does, however, use bandwidth more efficiently: it has bigger caches and features like early Z rejection.
What's more of a concern is how the GS is used in games; a "PC" GPU has trouble handling that.

I predict that Sony will have per-game patches (or call them profiles if you want) to remove parts of code that break or slow down emulation.
 
There's no lossless compression for textures. Besides, PS2 uses paletted textures, which can be considered a form of compression as well. RSX does, however, use bandwidth more efficiently: it has bigger caches and features like early Z rejection.
What's more of a concern is how the GS is used in games; a "PC" GPU has trouble handling that.

NVIDIA GPUs outright hate paletted textures - to the extent that at one point they were enabled in the drivers, and a later revision disabled them. I would guess an emulator running on the RSX, which has access to vastly more memory, would uncompress them to R5G6B5 or DXTn, depending on the type of texture. (And a table of "what compression to use for what texture" is one of the things that might be supplied in a compatibility patch.)
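(For illustration only, a minimal sketch of that expansion step - assuming an 8-bit index map and a 256-entry RGB palette, with all names here being hypothetical, not from any real emulator:)

```python
import numpy as np

def expand_clut_to_rgb565(indices: np.ndarray, palette: np.ndarray) -> np.ndarray:
    """Expand an 8-bit indexed texture to packed 16-bit R5G6B5.

    indices: (H, W) uint8 array of palette indices.
    palette: (256, 3) uint8 array of RGB palette entries.
    """
    rgb = palette[indices]                        # (H, W, 3) palette lookup
    r = (rgb[..., 0].astype(np.uint16) >> 3) << 11  # keep top 5 bits of red
    g = (rgb[..., 1].astype(np.uint16) >> 2) << 5   # top 6 bits of green
    b = rgb[..., 2].astype(np.uint16) >> 3          # top 5 bits of blue
    return r | g | b

# toy data: 2x2 index map using a palette with one red entry
pal = np.zeros((256, 3), dtype=np.uint8)
pal[1] = [255, 0, 0]
tex = np.array([[0, 1], [1, 0]], dtype=np.uint8)
out = expand_clut_to_rgb565(tex, pal)
```

A one-off expansion like this trades memory for speed, which is exactly the trade the post suggests the RSX could afford.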
 
I would guess an emulator running on the RSX, which has access to vastly more memory, would uncompress them to R5G6B5 or DXTn, depending on the type of texture.
That would be a waste of effort. Simply bind the palette as a 1-dimensional texture and issue a dependent texture read in whatever shader is in use.
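(The palette-as-1D-texture trick boils down to two chained fetches; here's a rough CPU-side sketch in Python standing in for the shader, with point sampling of the index map - function names are made up for illustration:)

```python
import numpy as np

def dependent_palette_read(index_tex, palette_tex, u, v):
    """Emulate the two-step shader lookup: point-sample the 8-bit index
    texture at (u, v), then use the fetched index as the coordinate into
    a 1-D palette texture -- the 'dependent' texture read."""
    h, w = index_tex.shape
    x = min(int(u * w), w - 1)   # nearest-neighbour sample of the index map
    y = min(int(v * h), h - 1)
    idx = index_tex[y, x]
    return palette_tex[idx]      # second fetch depends on the first's result

# toy data: 2x2 index map, 1-D palette with black and white entries
palette = np.array([[0, 0, 0], [255, 255, 255]], dtype=np.uint8)
indices = np.array([[0, 1], [1, 0]], dtype=np.uint8)
texel = dependent_palette_read(indices, palette, 0.9, 0.1)  # top-right texel
```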

Dean
 
Whats more of a concern is how the GS is used in games, a "PC" GPU has trouble handling that.
Yes. In previous discussions of PS3 emulating PS2, the biggest concerns raised weren't over eDRAM's bandwidth but over the activities of the GS, which is just plain odd and doesn't map well to conventional GPUs.
 
What I expected, then :)

PS3 will definitely drop in price before Xmas. This and the 65nm shrink only strengthen the theory. Besides, we all know that my theories and predictions are fact.
The only reason I don't think they'll drop the price is that it would anger early PS3 owners. I think the first price cut will come, at the earliest, a year after launch.
 
lol, the news has already mutated:

London (England) - In an effort to scale back the prohibitive cost of the PS3, Sony Computer Entertainment Europe (SCEE) is removing a couple features for the official launch next month. Hardware cut-backs will result in the removal of motion sensitive technology and will limit the backward compatibility of the system.

http://www.tomshardware.co.uk/2007/02/23/euro_ps3/

:oops: :rolleyes:
 
NVIDIA GPUs outright hate paletted textures - to the extend that at some point they were enabled in the drivers, and they disabled them in a later revision.

Yeah, the original Unreal, developed mainly for 3dfx cards and Glide, had 8-bit palettized textures too, and it wasn't running well on any Nvidia card I had, not even on GF4s...
 
The things people write! ;)

Indeed, indeed! :smile:

It will be very interesting to see what happens when Sony PR-machinery starts to gear up in March.
I think they have positioned themselves right by giving the bad news first, so some of the negative buzz will fade in time before the launch, when they will release positive news.
 
That would be a waste of effort. Simply bind the palette as a 1-dimensional texture and issue a dependent texture read in whatever shader is in use.
Not sure if it's a good idea, as you would need to sample 4 texels from the original palettized texture, issue 4 point-filtered texture reads from the palette texture and then perform bilinear filtering using shader math. It's better just to store a texture as an uncompressed one imho, even though..in the big scheme of things..for some obscure convoluted reason the first idea might be better.
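(To see why this costs real shader work, here's what those 4 point reads plus the blend look like, sketched on the CPU in Python - the clamp-to-edge addressing and texel-centre convention are assumptions, and the function name is made up:)

```python
import numpy as np

def bilinear_clut_sample(index_tex, palette, u, v):
    """Manual bilinear filter over a paletted texture: fetch the 4
    neighbouring indices, do 4 point-filtered palette lookups, then
    blend the results with 'shader math'."""
    h, w = index_tex.shape
    fx, fy = u * w - 0.5, v * h - 0.5       # texel-centre convention
    x0, y0 = int(np.floor(fx)), int(np.floor(fy))
    tx, ty = fx - x0, fy - y0               # bilinear weights

    def fetch(x, y):                        # clamp-to-edge addressing
        x = min(max(x, 0), w - 1)
        y = min(max(y, 0), h - 1)
        return palette[index_tex[y, x]].astype(np.float64)

    c00, c10 = fetch(x0, y0), fetch(x0 + 1, y0)
    c01, c11 = fetch(x0, y0 + 1), fetch(x0 + 1, y0 + 1)
    top = c00 * (1 - tx) + c10 * tx
    bottom = c01 * (1 - tx) + c11 * tx
    return top * (1 - ty) + bottom * ty

# 2x1 index map: one black texel, one white texel
palette = np.array([[0, 0, 0], [255, 255, 255]], dtype=np.uint8)
indices = np.array([[0, 1]], dtype=np.uint8)
mid = bilinear_clut_sample(indices, palette, 0.5, 0.5)  # halfway between them
```

Five fetches and a pile of ALU per sample, versus one fetch with hardware filtering on a pre-expanded texture - which is the trade-off being argued here.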
 
(And a table of "what compression to use for what texture" is one of the things that might be supplied in a compatibility patch.)
Why would you need something like that? Just store your textures as uncompressed; you have all the bandwidth you need to sample them, it's a non-issue.
Palettized textures, imho, are a minor problem to solve if you wanna write a GS emulator :)
 
It's better just to store a texture as an uncompressed one imho
Hehe.. I would have thought storing as uncompressed would bring all sorts of horrible synchronisation issues up, specifically as on PS2 VRAM layout is normally rather dynamic (that's the understatement of the year, I guess!). You'd need to track modifications to palettes (and to the palettised textures) - potentially mid-frame - and rebuild your uncompressed texture based on what you suggest. Of course, you'd have to do this for any mipmaps that may be configured too. And surely any rebuild would require the use of the shader processing you described?
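(One way to picture the tracking Dean describes: a cache of expanded textures keyed by the (texture, palette) pair, invalidated whenever emulated VRAM writes touch either source. Everything below - class and method names included - is a hypothetical sketch, not any real emulator's design:)

```python
class ClutTextureCache:
    """Cache of de-palettized textures, rebuilt when the source index
    data or the palette is modified (potentially mid-frame)."""

    def __init__(self):
        self.cache = {}      # (tex_id, pal_id, tex_ver, pal_ver) -> expanded texture
        self.versions = {}   # resource id -> modification counter

    def touch(self, resource_id):
        """Call whenever an emulated VRAM write hits a texture or palette."""
        self.versions[resource_id] = self.versions.get(resource_id, 0) + 1

    def get(self, tex_id, pal_id, expand):
        """Return the expanded texture, rebuilding only when stale.
        A real emulator would also evict old versions and redo mipmaps."""
        key = (tex_id, pal_id,
               self.versions.get(tex_id, 0), self.versions.get(pal_id, 0))
        if key not in self.cache:
            self.cache[key] = expand(tex_id, pal_id)
        return self.cache[key]

# usage: count how often the expensive expansion actually runs
calls = []
def expand(tex_id, pal_id):
    calls.append((tex_id, pal_id))
    return f"expanded-{tex_id}-{pal_id}"

cache = ClutTextureCache()
cache.get("tex0", "pal0", expand)
cache.get("tex0", "pal0", expand)   # cache hit: no rebuild
cache.touch("pal0")                 # emulated write to the palette
cache.get("tex0", "pal0", expand)   # stale: rebuilt
```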

Cheers,
Dean
 
nAo said:
for some obscure convoluted reason the first idea might be better.
Convoluted or common?
Can you think of any title that 'doesn't' use paletted addressing for postprocessing math, not to mention stuff like multiple palettes per texture, etc.?
Of course - some of those sequences could be identified and replaced by more "sensible" draw commands, but hoping to emulate the GS without a shader fallback for CLUT, you'd quite literally be stuck pissing against the wind, IMHO.

Laa-Yosh said:
and it wasn't running well on any Nvidia card I had, not even of GF4s...
Might have something to do with the fact that virtually no NVidia card had hardware support for palettes. When Unreal came out, TNTs were stuck using 16/32-bit maps, so 3dfx had a major advantage there.
 
18 GB/s ?
Sorry, is it 20 GB/s? The only official figure we had was 22.4 IIRC, which was before the clock decrease.
anyway..don't forget the 'other' bus :)
For a direct emulator, wouldn't the latency in reaching XDR be quite a shock to software that's been using eDRAM just a stone's throw from the logic? And at full system BW, 45 GB/s for PS3, you're at less than PS2's total BW. Compressed data between RSX and VRAM would offer an easy solution. Perhaps 2:1 compression is possible for framebuffer BW, with textures provided from XDR?
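(The back-of-the-envelope arithmetic behind this, using the commonly quoted ballpark figures rather than official specs:)

```python
# Rough bandwidth numbers (GB/s) - public ballpark figures, not official specs
GS_EDRAM_BW = 48.0    # PS2 GS embedded DRAM
RSX_GDDR_BW = 20.8    # after the reported clock decrease (22.4 before)
XDR_BW      = 25.6    # Cell's XDR pool, the 'other' bus

# Uncompressed, even both PS3 buses combined fall short of the GS figure:
total_ps3 = RSX_GDDR_BW + XDR_BW       # ~46.4 GB/s

# With 2:1 framebuffer compression on the GDDR side and textures
# served from XDR, effective bandwidth clears it comfortably:
effective = RSX_GDDR_BW * 2 + XDR_BW   # ~67.2 GB/s
```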
 
Hehe.. I would have thought storing as uncompressed would bring all sorts of horrible synchronisation issues up, specifically as on PS2 VRAM layout is normally rather dynamic (that's the understatement of the year, I guess!). You'd need to track modifications to palettes (and to the palettised textures) - potentially mid-frame - and rebuild your uncompressed texture based on what you suggest. Of course, you'd have to do this for any mipmaps that may be configured too. And surely any rebuild would require the use of the shader processing you described?

Cheers,
Dean

Given how nAo and Faf coded for PlayStation 2 (and other crazy ones too :p), I think that for quite a long time the GS emulator was hitting Error Code 0x0DCDAA (made up), which matches the database entry "Error Code 0x0DCDAA: Emulator curling up and crying".
 
Hehe.. I would have thought storing as uncompressed would bring all sorts of horrible synchronisation issues up, specifically as on PS2 VRAM layout is normally rather dynamic (that's the understatement of the year, I guess!). You'd need to track modifications to palettes (and to the palettised textures) - potentially mid-frame - and rebuild your uncompressed texture based on what you suggest. Of course, you'd have to do this for any mipmaps that may be configured too. And surely any rebuild would require the use of the shader processing you described?

Cheers,
Dean
:LOL: :LOL: :LOL:
Oh man I'm half-dying here laughing. That really made my day, thanks! :D
 