G70 Benchmarks @500/350

Robbz said:
The thing I was a bit unsure about was if you actually can decompress such things properly in the vertex shader.
Decompression of these kinds of simple quantized formats has been part of the VS specification since VS1.0, and was already made standard fare by architectures predating vertex shaders by 2 years.
It's also pretty much mandatory to use if you want to get anywhere near optimal performance on any current console architecture.

Still, it's a pity that one has to do this "extra" work because of memory BW considerations. It'd be a lot nicer if the BW were "enough" to sustain the maximum precision.
And it'd be even nicer if we had infinite memory, bandwidth, and computational power. Sadly there's no such thing as a free lunch in realtime processing - there will always be limitations.

I was just not convinced that it would be a good thing for realtime tessellated stuff.
PS2 automatically compresses output vertex data for you in this kind of fashion (it's part of the bus arbitration circuit), and we're talking about a 6-year-old architecture. Obviously it can't be all that hard to do, can it? ;)

Btw, as far as base storage goes, scalar/vector quantization is what we used as a baseline for the last 5 years; I am expecting to do much more than that this time around.
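
For readers who haven't met the technique: here is a minimal sketch of the kind of scalar quantization being discussed (my own illustration in C, not code from any of the platforms or engines mentioned). Positions are stored as 16-bit integers relative to a per-mesh range, and the shader/VU code reverses the mapping with a single multiply-add.

Code:
#include <stdint.h>

/* Hypothetical example: pack each float coordinate into a signed 16-bit value
   relative to the mesh's bounding range, then undo it with one multiply-add
   (the kind of work a vertex shader or VU program folds into its first op). */
typedef struct { float scale, offset; } QuantRange;   /* range = [offset, offset + scale] */

static int16_t quantize(float v, QuantRange r)
{
    float t = (v - r.offset) / r.scale;            /* 0..1 across the range      */
    return (int16_t)(t * 65535.0f - 32768.0f);     /* map onto [-32768, 32767]   */
}

static float dequantize(int16_t q, QuantRange r)
{
    /* one multiply-add per component, which is all the decompression costs */
    return ((float)q + 32768.0f) * (r.scale / 65535.0f) + r.offset;
}

Stored this way, a position shrinks from 12 bytes (3 floats) to 6, and normals/UVs typically compress even further with 8- or 16-bit formats.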
 
nAo said:
XBOX and PS2 decompress data in vertex shaders all the time in current games ;)

For the PS2 I knew that you could. What I didn't know was that you could do it on the NV2A.

nAo said:
What do you prefer, more precision almost no one can be aware of or potentially more 'stuff' to display?

I'd rather have both! :D
Seriously, if one gains from compressing then yes, more stuff for me! :)
In the SW renderer I'm working on, I use SSE-style fp16 for the backbuffer, as it saves enough BW to justify the extra work.
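
For illustration, a minimal sketch of the float-to-half packing such an fp16 backbuffer needs (my own simplified C version: truncating rounding, no NaN/denormal handling, and not taken from the renderer being discussed):

Code:
#include <stdint.h>
#include <string.h>

/* Simplified float -> fp16 conversion: tiny values flush to zero and
   overflow clamps to the half "infinity" bit pattern. */
static uint16_t float_to_half(float f)
{
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);                   /* reinterpret the float's bits */

    uint32_t sign = (bits >> 16) & 0x8000u;                      /* sign bit        */
    int32_t  exp  = (int32_t)((bits >> 23) & 0xFFu) - 127 + 15;  /* rebias exponent */
    uint32_t mant = (bits >> 13) & 0x3FFu;            /* keep top 10 mantissa bits   */

    if (exp <= 0)  return (uint16_t)sign;             /* flush to +/- zero           */
    if (exp >= 31) return (uint16_t)(sign | 0x7C00u); /* clamp to +/- infinity       */
    return (uint16_t)(sign | ((uint32_t)exp << 10) | mant);
}

Halving each colour channel from 32 to 16 bits halves the backbuffer's read/write traffic, which is where the bandwidth saving comes from.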

So, this means that my whole post should be ignored. What a pity, I spent several minutes writing it! :D

But I still think that the geometry BW load should at least be considered, even if it's less than I originally depicted.
 
Fafalada said:
Decompression of these kinds of simple quantized formats has been part of the VS specification since VS1.0, and was already made standard fare by architectures predating vertex shaders by 2 years.
It's also pretty much mandatory to use if you want to get anywhere near optimal performance on any current console architecture.

Ok, sorry for my lack of knowledge. I haven't done any real VS/PS work yet as I mainly work on SW rasterizers.

Fafalada said:
And it'd be even nicer if we had infinite memory, bandwidth, and computational power. Sadly there's no such thing as a free lunch in realtime processing - there will always be limitations.

I can only agree. Luckily, the new consoles appear to have far fewer limitations than anything existing today.

Fafalada said:
PS2 automatically compresses output vertex data for you in this kind of fashion (it's part of the bus arbitration circuit), and we're talking about a 6-year-old architecture. Obviously it can't be all that hard to do, can it? ;)

That's cool. And I agree, it can't be that hard if you have the proper HW/ISA to assist. ;)
 
Personally, I just wondered what the costs for AA and HDR would be on G70 at these bandwidth levels, if RSX is similar to or the same as G70. What I took out of this is what the costs are for HDR and AA in current-generation titles when dealing with these levels of bandwidth. On the whole, with the "framebuffer pixel complexities" of the titles here, the cost of 4x FSAA isn't too bad. If future applications have similar levels of framebuffer operations to these two, then there may be cases where developers won't need to make many special considerations as far as FSAA usage is concerned; they may get away with treating a part with this level of bandwidth as a straight graphics processor at 720p and still get FSAA around the 60 FPS mark, without having to take any special processing configurations into account (which may be useful for cross-platform titles).

However, where performance drops significantly below 60 FPS, I'm thinking that shifting various elements of the processing requirements between graphics and CPU RAM may become more of a necessity. Should it become the generally accepted norm for Cell to process the geometry, then vertex buffer bandwidth use on the graphics side is likely to be lower.

Of course, another thing to consider is that where the performance at 500/350 is lower than at 430/600, the part is bandwidth-limited rather than shader-limited, so effectively the shader complexity per pixel can be raised for free.
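
As a back-of-envelope illustration of what the raw framebuffer traffic looks like at these kinds of settings, here is a rough estimate (my own figures and assumptions, not measurements from G70/RSX; it counts only one colour+Z write per sample and ignores the colour/Z compression, blending reads, overdraw and texture traffic that dominate real workloads):

Code:
#include <stdio.h>

/* Raw colour + Z write traffic per second for a given resolution,
   AA sample count and colour format, assuming one write per sample. */
static double fb_gbytes_per_sec(int w, int h, int samples,
                                int colour_bytes, int z_bytes, int fps)
{
    double bytes_per_frame = (double)w * h * samples * (colour_bytes + z_bytes);
    return bytes_per_frame * fps / 1e9;
}

int main(void)
{
    printf("720p  1x  8-bit colour : %.1f GB/s\n", fb_gbytes_per_sec(1280,  720, 1, 4, 4, 60)); /* ~0.4 */
    printf("720p  4x  8-bit colour : %.1f GB/s\n", fb_gbytes_per_sec(1280,  720, 4, 4, 4, 60)); /* ~1.8 */
    printf("720p  4x  fp16 colour  : %.1f GB/s\n", fb_gbytes_per_sec(1280,  720, 4, 8, 4, 60)); /* ~2.7 */
    printf("1080p 4x  fp16 colour  : %.1f GB/s\n", fb_gbytes_per_sec(1920, 1080, 4, 8, 4, 60)); /* ~6.0 */
    return 0;
}

These write-only numbers are small on their own; it's the reads for Z-testing and blending, the overdraw, and the texture and vertex data sharing the same bus that turn AA and HDR into a real bandwidth question.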
 
cho said:
Titanio said:
Shameless, Dave.

:oops: :?

:rolleyes: why ?

Sorry if that came across rudely; it was an early-morning post. I just think it's silly to be encouraging people to debate this in the context of RSX: PC games in a PC system on a (relatively underclocked) PC card. The tech in the GPU is similar, of course, but the rest of the variables are going to differ widely.

Maybe it's useful in some respects, I don't know.
 
I would say that this is the complexity of a year-old game; OK, it is PC, but it is reasonable to assume that FSAA @ 1080p will not happen, and then again it is not so necessary. All in all, Xbox 2 game devs have it much more straightforward than PS3 game devs. Overall, I would think devs will go for whatever offers them the best tradeoff on PS3, while on X2 they will know the goal and won't need to think about it too much.
 
If Sony had not emphasized 1080p, I think the AA issue would be a moot point.


I can't believe anyone EVER believed in the first place that they would be playing PS3 games at 1080p. Sony will match whatever Microshaft is doing and have all their games run at 720p.

AGAIN, WHY SPEND TIME AND MONEY DESIGNING A GAME FROM THE GROUND UP TO RUN AT 1080P WHEN LESS THAN .01% OF HDTVs EVEN ACCEPT A 1080P SIGNAL!!!!!!!!!!!


OK, I understand now, but still, come on people, 1080p is an extremely high resolution. Do you really expect a $300 game console to output at that resolution?
 
c0_re said:
WHEN LESS THAN .01% OF HDTVs EVEN ACCEPT A 1080P SIGNAL
Because if your GPU doesn't support MSAA on FP render targets, you'd better supersample your frame buffer: 720p screen owners would get a downscaled, and thus antialiased, image.
 
Because those running at lower resolutions still benefit from AA through downsampling. 1080p with no AA gives those with 1080p displays very crisp imagery (plus small jaggies, depending on the size of the TV); 720p/i displays get 1.5x SSAA; 480p/i displays get roughly 4x SSAA - which'll be very nice, as I'll still be using a UK PAL interlaced TV!
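
As a rough sketch of the arithmetic behind those factors (my own figures, assuming a 1920x1080 render and common display modes; note that the per-axis factor and the total number of samples per output pixel are different things):

Code:
#include <stdio.h>

/* How many samples of a 1920x1080 render land on each pixel of a
   lower-resolution display: per-axis scale factor and total samples. */
static void downscale(const char *name, int dw, int dh)
{
    double sx = 1920.0 / dw, sy = 1080.0 / dh;
    printf("%s: %.2fx by %.2fx per axis, %.2f samples per output pixel\n",
           name, sx, sy, sx * sy);
}

int main(void)
{
    downscale("720p (1280x720)", 1280, 720);  /* 1.50 by 1.50, 2.25 samples */
    downscale("480p (640x480)",   640, 480);  /* 3.00 by 2.25, 6.75 samples */
    return 0;
}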
 
Looking at past history, I think it's safe to say that very, very few games will be rendered at the maximum 1080p.

On XBOX, less than 2% of games developed used the maximum 720p.

Also, the market for 1080p will be even smaller than the market for 720p was last gen, since 1080p sets are extremely expensive, which gives devs even less incentive to spend the time and money developing for 1080p.

I don't understand what factual basis you guys are using to predict wide support of the maximum available resolution, especially when there are no standards whatsoever from Sony.

History and common sense suggest this won't be the case.

I think a much more realistic scenario is mostly 480p and 720p games. The majority of devs don't seem to push resolutions unless they have to, and Sony isn't forcing them.

P.S. I'm not a designer, but it doesn't seem to make sense to me to render at a higher resolution than your target audience can view, just to get some "free AA." Wouldn't it be better, i.e. less costly in development time, to aim for a lower resolution and try to incorporate AA the traditional way? It seems the extra effort required to go 1080p, plus the extra bandwidth required, would make this not worthwhile simply to get free AA.

Correct me if I'm way off here.
 
c0_re said:
Well, so what? The PS3 is more than capable of 720p with AA and HSR both fully enabled, right?!?

If you're playing a game released in 2004, yeah... i.e. FarCry or SplinterCell.

But what about the games released in 2006 and 2007?
 
Developers, publishers and game company execs will be among the early adopters of 65-inch SED panels displaying 1080p. :D

Plus games rendered at 1080p will have benefits for people who only have 1080i or 720p displays.

That's assuming they don't have to make severe tradeoffs at the higher resolution.

But that's not to say there will in fact be a lot of 1080p games.

Who knows how the respective markets will develop?

If nothing else, Blu-Ray movies will make buying 1080p displays worth it for some people.
 
scooby_dooby said:
c0_re said:
Well, so what? The PS3 is more than capable of 720p with AA and HSR both fully enabled, right?!?

If you're playing a game released in 2004, yeah... i.e. FarCry or SplinterCell.

But what about the games released in 2006 and 2007?

Comparing PC games to games built around one system may not be very fair. That's one of the things that's fundamentally wrong about this whole thing in the first place!
 
scooby_dooby said:
p.s. I'm not a designer, but it doesn't seem to make sense to me to render at a higher resolution than your target audience can view, just to get some "free AA." Wouldn't it be better, i.e. less costly in development time, to aim for a lower resolution and try and incorporate AA the traditional way??
No, there's no reason not to aim high. The cost of high-res rendering is in processing, not development. You don't need higher polycounts or texture definitions to be able to render at 1080p vs. 720p. The only real difference is in the creation of 2D artwork like HUDs, but there's no time advantage in drawing 2D artwork for 720p over 1080p either. I expect most artwork is drawn large and scaled down for the game. In-game 2D graphics can be scaled for the display.

Really, outputting 1080p or 720p is but a switch, just like on the PC. Choose the output resolution and the graphics are rendered to fit.
 
Shifty, I understand that, but by going to a higher resolution than necessary, doesn't that severely cut back on the amount of power/bandwidth available?

For example, if a game was going to use HDR, it would take a MUCH larger performance hit at 1080p than it would at 720p. Therefore, many things would have to be scaled back to make up for the performance cost of using 1080p.

In other words, if all the textures etc. are the same, wouldn't a developer be able to make a much nicer-looking game at 720p, since all the rendering would come at a much lower cost to system performance, freeing up more available power?

I believe you when you say there's not a lot of extra work to switch from 720p to 1080p, but what about all the work that will be required because of the massive bandwidth requirements/performance losses that 1080p creates?

On a PC you can switch resolutions, and sometimes when you go too high the game runs like crap, right? On the closed PS3 they will need to make sure all games run well at 1080p. Doesn't this mean they would have to use fewer effects, less AA, and simpler graphics than if they designed for 720p from the get-go?

Let me know if I'm not making sense.
 
Titanio said:
Comparing PC games to games built around one system may not be very fair. That's one of the things that's fundamentally wrong about this whole thing in the first place!

a.) All games are "designed around one system", none are "designed around the general capabilities of next generation systems and tweaked to each"?

b.) We are not "comparing"; we are taking a look to get a feel for where tweaking/tailoring to the specific system requirements may need to occur, given the graphics bandwidth available to RSX and assuming it has the same compression capabilities.
 
nAo said:
Dunno if you got my point, I've just said you don't need to 'steal' XDR bw in order to make some use of the FlexIO bw.
That's exactly the situation in which Dave's test is most meaningful. You're right, of course, but a major next-gen GPU should have more like double the RAM bandwidth the RSX will have. 22 GB/s would never be enough to power PC games on a 550 MHz G70, at any rate. 44 GB/s would be closer.
 
DaveBaumann said:
a.) All games are "designed around one system", none are "designed around the general capabilities of next generation systems and tweaked to each"?

In the case of multiplatform console games, the latter is often true of course, but these aren't even multiplatform next-generation console games. Benches of multiplatform (console-only) games might be of some use - they're aiming at a small set of configurations - but these games here have been designed to work with a much, much wider set of possible configurations.

Still, really the best way to gauge technical capability or potential from games is to look at games designed solely for one system. Multiplatform games don't often provide an accurate reflection of one system's capability. I know it's impossible to make such benches now, if ever, but it should be borne in mind.

DaveBaumann said:
b.) We are not "comparing"; we are taking a look to get a feel for where tweaking/tailoring to the specific system requirements may need to occur, given the graphics bandwidth available to RSX and assuming it has the same compression capabilities.

Fair enough, but I've little doubt that some reading this thread aren't appreciating it for what it is in the way you intended and others here are, but are looking at it as something else. For them this may be very misleading, and some, unfortunately, are very willing to be misled.
 