It may sound silly, but it's possible (regarding the RSX).

@scatteh316: I don't think this is about how the graphics will look on final hardware - obviously RSX will achieve higher quality than the GTX simply because it's a closed system.

But I do think that Dave's demo serves a purpose. Now, as others have mentioned, PS3 just isn't going to be set up in a traditional PC environment whatsoever, but still, assuming RSX is a tweaked G70 (I honestly think it's more than that at this point, but that's neither here nor there), Dave's test does give us the potential RSX's points of weakness and its bandwidth restrictions.
 
Jawed said:
What's more interesting to see is that the AA and HDR hits at 1280p are in the region of 10%, give or take. With a bit of luck they can stay in that region and frame rates can hold up at around 60fps.
Can you run that by me again? At 1280x1024 the AA hits are ~27% and ~33%, and HDR is a 27% hit for Splinter Cell and a 41% hit for Far Cry.

The regular 7800 manages just a 5% AA hit for Far Cry (22-point differential) and 25% (8-point diff) for Splinter Cell. The HDR hit was 20% (21-point differential) for Far Cry and 23% (4-point diff) for Splinter Cell.
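
For anyone checking the arithmetic: the "hit" is just the percentage frame-rate drop when a feature is enabled, and the "differential" is the gap, in points, between the low-bandwidth card's hit and the stock 7800's. A quick C sketch (the frame rates below are made-up placeholders, not Dave's actual benchmark numbers) shows how those figures fall out:

    /* Illustrative only: placeholder frame rates, not measured data. */
    #include <stdio.h>

    /* Percentage drop when a feature (AA or HDR) is switched on. */
    static double hit_pct(double fps_off, double fps_on)
    {
        return (fps_off - fps_on) / fps_off * 100.0;
    }

    int main(void)
    {
        double low_bw  = hit_pct(60.0, 44.0);  /* ~27% hit */
        double full_bw = hit_pct(60.0, 57.0);  /* ~5% hit  */

        printf("low-bandwidth hit: %.0f%%\n", low_bw);
        printf("full-speed hit:    %.0f%%\n", full_bw);
        printf("differential:      %.0f points\n", low_bw - full_bw);
        return 0;
    }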
 
I was referring to the hit versus full-speed memory, favouring SC:CT, and taking cognisance of the 45-70fps averages with HDR or 4xAA/8xAF.

In other words, the low-bandwidth G70 isn't wallowing at 25-40fps, so I don't think RSX is in danger of being a dog at 1280p.

I do think that with Cell doing vertex and geometry shading, the sheer pressure on GDDR3 will be relatively limited.

Perhaps this time I'll remember this conclusion :rolleyes:

Jawed
 
That comparison is interesting, but it could be way off base based on what others have shared here.

RSX has double the bandwidth listed in that comparison, as it also has full access to CELL's main memory. At the very least we could expect vertex data to come from CELL, thus saving a lot of RSX's main bandwidth.
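
A rough back-of-envelope on that vertex-data point (the scene figures below are assumptions for illustration, and 22.4 GB/s is RSX's announced GDDR3 bandwidth): even a fairly heavy vertex load only eats a single-digit percentage of the GDDR3 pool, but every bit of it sourced from CELL's side is bandwidth the pixel side gets back.

    /* Back-of-envelope sketch: assumed scene figures, not measurements. */
    #include <stdio.h>

    int main(void)
    {
        double verts_per_frame = 1.0e6;   /* assumed scene complexity   */
        double bytes_per_vert  = 32.0;    /* pos + normal + UV, assumed */
        double fps             = 60.0;
        double gddr3_bw        = 22.4e9;  /* RSX's announced 22.4 GB/s  */

        double vert_bw = verts_per_frame * bytes_per_vert * fps;
        printf("vertex traffic: %.2f GB/s (%.1f%% of GDDR3)\n",
               vert_bw / 1e9, vert_bw / gddr3_bw * 100.0);
        return 0;
    }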

Come on, HDR is over-rated. Most people couldn't tell whether it's on or off, unless some kind of bloom lighting is used for the sky. Sure, we hardcore gamers might be more sensitive to it, but the market is decided by people who wouldn't be able to tell the difference.

One could even argue that anything over 2xAA is over-rated too. It's simply not going to make a difference in the next-generation wars, and yet so many people here will argue to the death for 4xAA and HDR. Have fun.

To me, the quality of the game is more important: for example, whether Metal Gear Solid 4 is a fun game or not. It will look great even if it doesn't have HDR or 4xAA.
 
And this isn't a reference to the PS3; it's an approximation of the PS3's GPU, and its benchmarks show the performance penalties associated with the decreased bandwidth and with effects like 4xAA and HDR lighting when running PC games.

Different GPU usage patterns (for software native to the console) will also mean different performance considerations.
 
Josh378 said:
No one has answered my question...what's the advantage of having a raytracer in hardware in a GPU?

-Josh378
Certain algorithms are easier to implement and look better with ray tracing. That's why artists use ray tracing for some elements of their non-realtime scenes. There are still data-structure issues with realtime ray tracing and dynamic scenes, so it doesn't make sense for GPUs yet. Maybe that will change someday.
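
To make that dynamic-scene problem concrete: ray tracers lean on an acceleration structure (a BVH, kd-tree, etc.), and the moment geometry animates, that structure has to be refitted or rebuilt before a single ray can be traced. A toy C sketch of the refit step (hypothetical types and node layout, for illustration only):

    /* Toy BVH refit: hypothetical layout, illustration only. */
    typedef struct { float min[3], max[3]; } AABB;
    typedef struct {
        AABB bounds;
        int  left, right;   /* child indices, -1 for a leaf */
        int  tri;           /* triangle index if leaf       */
    } BVHNode;

    /* Grow box a to enclose box b. */
    static void aabb_merge(AABB *a, const AABB *b)
    {
        for (int i = 0; i < 3; ++i) {
            if (b->min[i] < a->min[i]) a->min[i] = b->min[i];
            if (b->max[i] > a->max[i]) a->max[i] = b->max[i];
        }
    }

    /* Bottom-up refit after triangles move. Nodes are assumed stored so
     * children always follow their parent, letting us walk in reverse.
     * Refitting is cheap but degrades tree quality, so heavily dynamic
     * scenes eventually force a full, expensive rebuild - the cost that
     * keeps realtime ray tracing impractical on today's GPUs. */
    void bvh_refit(BVHNode *nodes, int count, const AABB *tri_bounds)
    {
        for (int i = count - 1; i >= 0; --i) {
            BVHNode *n = &nodes[i];
            if (n->left < 0) {
                n->bounds = tri_bounds[n->tri];   /* leaf: triangle box */
            } else {
                n->bounds = nodes[n->left].bounds;
                aabb_merge(&n->bounds, &nodes[n->right].bounds);
            }
        }
    }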
 