G70 Benchmarks @500/350

scooby_dooby said:
In other words, if all the textures etc are the same, wouldn't a developer be able to make a much nicer looking game at 720p, since all the rendering etc would come at a much lower cost to system performance? Freeing up more available power?
My knowledge of realtime rendering isn't sufficient to comment on this. I'm not sure how the resources saved can be reallocated. e.g. Does the BW needed for 1080p mean less available for models = less detailed models? I don't know how much BW different aspects consume. I don't think you could do much more with 720p without AA than with AA such that it was an obvious improvement. Already there'll be n hundred enemies around (see HS).

Aiming for 720p with AA is no different to aiming for 1080p without AA, and if you render to the latter, you get the former when you downsample. If you were to render to 720p without AA, you'd free up BW for other stuff, but I don't think it'd make any visible difference whereas the drop in IQ would be noticeable.

Though Xenos is locked to 720p (1080i) by MS's choosing, that's to enforce a minimum of AA. They could also open it up to 1080p output and give the option to drop AA for the higher res. I really can't see adding 1080p as an option in any way a bad thing.
 
Though Xenos is locked to 720p (1080i) by MS's choosing, that's to enforce a minimum of AA. They could also open it up to 1080p output and give the option to drop AA for the higher res. I really can't see adding 1080p as an option in any way a bad thing.


Ignoring the bandwidth issue for a second, 1080P is 2.25x the number of pixels of 720P; this means less than half as many shader instructions/pixel.

If you're using MSAA, you only shade at the same number of pixels as the target resolution; AA doesn't affect it.
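
ERP's numbers are easy to sanity-check. A minimal sketch (the shading throughput below is a made-up placeholder, not a G70 or RSX spec):

```python
# Pixel counts at the two target resolutions.
PIXELS_720P = 1280 * 720     # 921,600
PIXELS_1080P = 1920 * 1080   # 2,073,600

print(PIXELS_1080P / PIXELS_720P)  # 2.25 -- 1080p has 2.25x the pixels

# At a fixed fragment throughput and frame rate, the per-pixel
# instruction budget scales inversely with the pixel count.
SHADER_OPS_PER_SEC = 10e9  # hypothetical round number, not a real spec
FPS = 60
for name, pixels in (("720p", PIXELS_720P), ("1080p", PIXELS_1080P)):
    print(name, round(SHADER_OPS_PER_SEC / (pixels * FPS)), "ops/pixel/frame")
```

So 1080p leaves you 1/2.25 ≈ 44% of the per-pixel budget, and, as ERP says, MSAA doesn't change the shaded pixel count, because each pixel's shader result is shared across its subsamples.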
 
Titanio said:
In the case of multiplatform console games, the latter is often true of course, but these aren't even multiplatform next-generation console games. Benches of multiplatform (console-only) games might be of some use - they're aiming at a small set of configurations - but these games here have been designed to work with a much much wider set of possible configurations.

Actually, if you want to focus in that manner on the games in question, Splinter Cell is a big cross-platform title, so this is likely to continue in the future. They have already stated that the reason for including a Shader 3.0 path was the next-generation consoles, so the next version is likely to use the same engine.
 
DaveBaumann said:
Titanio said:
In the case of multiplatform console games, the latter is often true of course, but these aren't even multiplatform next-generation console games. Benches of multiplatform (console-only) games might be of some use - they're aiming at a small set of configurations - but these games here have been designed to work with a much much wider set of possible configurations.

Actually, if you want to focus in that manner on the games in question, Splinter Cell is a big cross-platform title, so this is likely to continue in the future. They have already stated that the reason for including a Shader 3.0 path was the next-generation consoles, so the next version is likely to use the same engine.

I was talking about multiplatform console games - console-only, as mentioned above. A multiplatform game with a PC version is little better, if not worse - well, depending on what platform was driving development, at least. If it was PC, certainly it's at least as bad for this kind of analysis.
 
Inane_Dork said:
That's exactly the situation in which Dave's test is most meaningful
I disagree, that's exactly the situation in which Dave's test shows its limits, because it just ignores more than half of RSX's bandwidth :eek:

ERP said:
Ignoring the bandwidth issue for a second, 1080P is 2.25x the number of pixels of 720P; this means less than half as many shader instructions/pixel.

If you're using MSAA, you only shade at the same number of pixels as the target resolution; AA doesn't affect it.
You're right, but superior textures and shading quality have a price ;)
 
I disagree, that's exactly the situation in which Dave's test shows its limits, because it just ignores more than half of RSX's bandwidth

How so? The RSX will never have the full bandwidth to the XDR RAM.

These benchmarks should be taken as a worst case, or as a firm foundation for what to expect from the RSX.
 
From what I can gather from all the materials I've read, the PS3 will do just fine with AA, HDR and the like at 720p. All the benchmarks and info I've read here seem to jibe with this. Correct?

It seems then that the only reason the discussion on the BW (or possible lack thereof) in the RSX in regards to AA, HDR, etc. implementation continues is if we're running at 1080p.

Isn't it likely then that almost all devs will, for at least '06, '07 and possibly '08, simply stick to using 720p with all the benefits, or use 1080p but with less AA (1x or 2x only), since the majority of users will have the 1080p output downsampled? Is this a correct assumption?

If it is, then we don't need to worry about 1080p outputs with full AA or whatever until a few years from now, when devs will have a much better handle on the machine.
 
Say Cell is procedurally generating something and feeding it to the RSX. As far as I know that doesn't require it to hit the XDR, but bandwidth is still used, and it cuts down on the RSX's internal bandwidth requirements.
 
Oh my god, I don't know how many times I have to repeat it :)
RSX DOES NOT NEED TO MAKE ACCESSES TO XDR RAM IN ORDER TO USE FLEXIO BANDWIDTH.
There are other memory-like resources on CELL CPU that RSX can exploit (and I'm not talking about procedural stuff, but just plain standard rendering)
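
For concreteness, here is a sketch of nAo's point using the peak bandwidth figures Sony quoted at E3 2005 (treat the direction labels as the commonly cited ones):

```python
# Commonly quoted PS3 peak bandwidth figures (GB/s), from Sony's
# E3 2005 material -- theoretical peaks, not sustained numbers.
GDDR3_LOCAL  = 22.4  # RSX <-> its local GDDR3
FLEXIO_READ  = 15.0  # RSX reading over FlexIO from the Cell side
FLEXIO_WRITE = 20.0  # RSX writing over FlexIO to the Cell side
XDR_MAIN     = 25.6  # Cell <-> XDR main memory

print("RSX aggregate peak:", GDDR3_LOCAL + FLEXIO_READ + FLEXIO_WRITE, "GB/s")

# nAo's point: the FlexIO figures are the capacity of the link itself.
# Traffic over it can be served by Cell-side resources (e.g. SPE local
# stores), so RSX's use of FlexIO isn't capped by the XDR bus alone.
```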
 
How much memory is there inside Cell?

How much memory is consumed by a 1920x1080 framebuffer with 4xMSAA?

Just curious as to how Cell would be used to implement a backbuffer...

Jawed
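
Taking Jawed's second question first, the raw arithmetic (assuming 32-bit colour and 32-bit Z/stencil per sample, and no compression):

```python
# Uncompressed size of a 1920x1080 render target with 4xMSAA.
W, H, SAMPLES = 1920, 1080, 4
BYTES_COLOUR, BYTES_Z = 4, 4  # 32-bit colour + 32-bit Z/stencil

size = W * H * SAMPLES * (BYTES_COLOUR + BYTES_Z)
print(round(size / 2**20, 1), "MiB")  # ~63.3 MiB

# For scale: Cell's seven usable SPEs have 256 KiB of local store
# each (~1.75 MiB total), plus the PPE's 512 KiB L2.
```

So the full buffer dwarfs Cell's on-chip memory, which is why any Cell-side scheme has to work a tile at a time.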
 
From what I can gather from all the materials I've read, the PS3 will do just fine with AA, HDR and the like at 720p. All the benchmarks and info I've read here seem to jibe with this. Correct?
You'd be wrong. There is no graphics chip on the market that can do HDR + AA. All we really have are the G70 and NV40, which can do HDR, and both take huge hits with HDR on at 1024x768.

Say Cell is procedurally generating something and feeding it to the RSX. As far as I know that doesn't require it to hit the XDR, but bandwidth is still used, and it cuts down on the RSX's internal bandwidth requirements.
It could, but is it going to be able to generate enough textures for each scene?

Oh my god, I don't know how many times I have to repeat it
RSX DOES NOT NEED TO MAKE ACCESSES TO XDR RAM IN ORDER TO USE FLEXIO BANDWIDTH.
There are other memory-like resources on CELL CPU that RSX can exploit (and I'm not talking about procedural stuff, but just plain standard rendering)

How is this going to help with the buffers? The FlexIO won't help at all unless you use the cache to do that, but then again you're taking the wind out of Cell to feed the RSX.

Going to take a huge amount of tiles to fit the buffers into that cache, and of course you're going to be taking the wind out of Cell's sails. Those caches were made to keep the Cell CPU fed. Taking those away to feed the RSX will greatly reduce the performance of the Cell chip.

The fixation on 1080p with 4xMSAA is becoming tiresome.

How about 720p with 4x FSAA and FP16 HDR?
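
The same arithmetic for jvd's suggestion, again assuming uncompressed buffers (whether a given GPU can actually multisample an FP16 target is a separate question; this is just the footprint):

```python
# Uncompressed size of a 1280x720 target with 4x AA and FP16 HDR colour.
W, H, SAMPLES = 1280, 720, 4
BYTES_FP16_RGBA, BYTES_Z = 8, 4  # FP16 RGBA colour + 32-bit Z/stencil

size = W * H * SAMPLES * (BYTES_FP16_RGBA + BYTES_Z)
print(round(size / 2**20, 1), "MiB")  # ~42.2 MiB
```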
 
nAo said:
Just curious as to how Cell would be used to implement a backbuffer...
tiling

And you are still talking about sacrificing the CELL CPU (if possible in real world scenarios) so you can make up a feature the 360 gets for free.

This does not play to the PS3's strengths and you are talking about lobotomizing the CELL. What good is the extra bandwidth if you are going to be CPU limited?

What is the point of a powerful CPU if it has to babysit GPU tasks all day?

Seems pretty clear to me that PS3 developers would be better off spending their time playing to their strengths.
 
jvd said:
Going to take a huge amount of tiles to fit the buffers into that cache, and of course you're going to be taking the wind out of Cell's sails. Those caches were made to keep the Cell CPU fed. Taking those away to feed the RSX will greatly reduce the performance of the Cell chip.

No it won't - it doesn't reduce Cell performance, but it reduces the headroom for other tasks. In the end it's up to the dev to balance that as they wish.

How about 720p with 4x FSAA and FP16 HDR?

More relevant ;)

And you are still talking about sacrificing the CELL CPU (if possible in real world scenarios) so you can make up a feature the 360 gets for free.

Relative to what could have been without the eDRAM, we can't say it's getting it for free. It's less free than it is a decision that's been made for you and that you don't have to worry about. How can we say where RSX will stand vs Xenos without AA etc. and then with AA etc.? Any hits it takes are relative to itself, not Xenos.
 
No it won't - it doesn't reduce Cell performance, but it reduces the headroom for other tasks. In the end it's up to the dev to balance that as they wish.
Of course it will. The cache is there to feed the CPU. Use a tweaker and shut off the L2 cache on your desktop PC, and then cry at how horribly slow it is. If you're using the cache to tile for the GPU, there won't be cache for the SPUs, and they will have to go to the XDR RAM, which will be orders of magnitude slower than hitting that cache.
 
jvd said:
No it won't - it doesn't reduce Cell performance, but it reduces the headroom for other tasks. In the end it's up to the dev to balance that as they wish.
Of course it will. The cache is there to feed the CPU. Use a tweaker and shut off the L2 cache on your desktop PC, and then cry at how horribly slow it is. If you're using the cache to tile for the GPU, there won't be cache for the SPUs, and they will have to go to the XDR RAM, which will be orders of magnitude slower than hitting that cache.

My apologies, I was assuming we were just using local SRAM here.
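
To put rough numbers on the tiling idea: a sketch of how a 1080p 4xMSAA target might be carved into SPE-sized tiles (the tile size and byte counts here are illustrative, not from any SDK):

```python
import math

# Illustrative tile maths for resolving a 1080p 4xMSAA target on SPEs.
W, H, SAMPLES = 1920, 1080, 4
BYTES_PER_SAMPLE = 8          # 32-bit colour + 32-bit Z per sample
LOCAL_STORE = 256 * 1024      # bytes per SPE; code and data share it

TILE = 64                     # 64x64-pixel screen-space tiles
tile_bytes = TILE * TILE * SAMPLES * BYTES_PER_SAMPLE
tiles = math.ceil(W / TILE) * math.ceil(H / TILE)

print(tile_bytes // 1024, "KiB per tile")  # 128 KiB -- tight in 256 KiB
print(tiles, "tiles per frame")            # 30 * 17 = 510
```

Whether 510 tile resolves per frame is a good use of SPE time is exactly the trade-off being argued over here.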
 
Titanio said:
I was talking about multiplatform console games - console-only, as mentioned above. A multiplatform game with a PC version is little better, if not worse - well, depending on what platform was driving development, at least. If it was PC, certainly it's at least as bad for this kind of analysis.

So, anything that's been seen using the UE3 engine isn't going to be particularly good at utilising next-gen console hardware?
 