Bandwidth required for 720p 4xMSAA?

Imperial measurements are still a standard, even if replaced by a more modern one. A foot is a foot whether you're measuring string or velocity (though there is some variation in weights :rolleyes: ). Metric (SI) is a proper standard and a very well thought out one, conceived over 200 years ago, long after the Elizabethan standardization of imperial measures with the creation of official weights and sizes.

50 Hz, 240 Volts is a standard (though it's local to individual countries). We don't need a 120 V plug socket for computers and a 240 V socket for TVs and a 50 V plug for charging your mobile. You also don't need a 2 pin plug for your TV, a 3 pin for your computer, and a 5 pin for your mobile charger. It varies from country to country, but within a country you know that if you buy an electrical device, it'll work.

PAL analogue broadcasts are a standard. They didn't introduce 3 varieties of transmission and build different TV sets with different levels of support for each option.

MIDI is a standard. The hex values to play a middle C are the same across electronic instruments. This replaced a miserable mix of proprietary interfacing protocols.

There are some good standards. USB has replaced half a dozen connectors and I'm a big fan of it. Gone are the stupid days of having the same wires connecting to different pins in different shaped connectors for no good reason other than to be awkward.

What's so ridiculous is that as far back as the 1500s people were aware of the need for standards. In this wonderful communications age people are too busy competing to sit down and sort themselves out. Why the hell did DVD writeables come in - and + formats? Do we need two different flavours? No, but the hardware producers didn't want to cooperate with each other.

I guess in the case of TVs one could argue that support for two formats is needed for services with different bandwidth: 720p is needed where there isn't the BW for 1080p. Maybe there is a reason?

I'd better stop ranting. I'm feeling political at the moment. The UK Government (probably not the Scots, who don't have the same chumps in office as the English and the poor Welsh lumbered with our mind-numbingly dumb laws) have now banned people from letting their cat out at night, made it a law to provide entertainment for your cat, and you have to provide a litter tray even when you've a perfectly adequate garden, with something like 10 regulations governing how your cat takes a crap. Some standards we can do without. :devilish:
 
Shifty Geezer said:
Imperial measurements are still a standard, even if replaced by a more modern one. A foot is a foot whether you're measuring string or velocity (though there is some variation in weights :rolleyes: ). Metric (SI) is a proper standard and a very well thought out one, conceived over 200 years ago, long after the Elizabethan standardization of imperial measures with the creation of official weights and sizes.

50 Hz, 240 Volts is a standard (though it's local to individual countries). We don't need a 120 V plug socket for computers and a 240 V socket for TVs and a 50 V plug for charging your mobile. You also don't need a 2 pin plug for your TV, a 3 pin for your computer, and a 5 pin for your mobile charger. It varies from country to country, but within a country you know that if you buy an electrical device, it'll work.

PAL analogue broadcasts are a standard. They didn't introduce 3 varieties of transmission and build different TV sets with different levels of support for each option.

I did say "humanity" as in worldwide. Take those "standards" to a worldwide level and they're not much of a standard anymore.

The only thing saving this is that PC standards seem to be the same all over the world. AVI is AVI in Japan as it is in Greece, for example.

What's so ridiculous is that as far back as the 1500s people were aware of the need for standards. In this wonderful communications age people are too busy competing to sit down and sort themselves out. Why the hell did DVD writeables come in - and + formats? Do we need two different flavours? No, but the hardware producers didn't want to cooperate with each other.

Yeah, that's the problem I was talking about. Getting different companies to stick to one standard will be tough when each company tries to slit the other companies' throats and make as much money as they can.

Hell, because of all this mess, we still have the 50Hz/60Hz problem even with the new HD resolutions! Though I must admit, it's much better than before since all HDTVs in Europe support both... STILL...!!! ;)
 
So with all these haxx calculations, did you come up with anything?

Would you say that the GDDR3 bandwidth would not be enough for 720p with 4xAA at 60fps?
I mean, even if it did take all the GDDR3 bandwidth, the PS3 would still have the XDR bandwidth left for the CPU+GPU, eh?

But if we look at the 6600GT, which has around 16 GB/s (btw, how is the RSX memory speed quoted: 700 MHz actual and 1400 MHz DDR effective, or 350 MHz actual and 700 MHz effective?), the 6600GT gets:

[Benchmark chart: UT2004, 1280x1024, 4xAA]


http://www.vr-zone.com.sg/?i=1684&s=1
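
On the clock question in passing above: here's a rough sanity check, assuming 128-bit memory buses for both cards and the pre-launch 700 MHz GDDR3 figure for RSX (a sketch, not a spec sheet):

# Peak GDDR3 bandwidth: bus width x 2 transfers per clock (DDR)
def gddr3_peak_gbs(bus_bits, clock_mhz):
    effective_mtps = clock_mhz * 2            # DDR doubles the data rate
    return bus_bits / 8 * effective_mtps * 1e6 / 1e9

print(gddr3_peak_gbs(128, 500))   # 6600GT: 500 MHz actual -> 16.0 GB/s
print(gddr3_peak_gbs(128, 700))   # RSX:    700 MHz actual -> 22.4 GB/s

That would make 700 MHz the actual clock and 1400 MT/s the effective rate, since that's the combination that yields the quoted 22.4 GB/s.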

And these only have 128 MB VRAM. Wouldn't it be safe to say that 128 MB and 16 GB/s should be enough for the framebuffer? Heck, I think 128 MB is way more than enough for a single frame :p And it's 95 fps on just 16 GB/s.
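
For the capacity side, a quick back-of-envelope, assuming 32-bit colour and 32-bit Z/stencil with no compression (only a rough model of what the hardware actually stores):

# Framebuffer footprint at 720p with 4xMSAA, uncompressed
w, h, samples = 1280, 720, 4
bytes_colour, bytes_z = 4, 4                   # 32-bit colour, 32-bit Z/stencil

msaa_colour = w * h * samples * bytes_colour   # multisampled colour buffer
msaa_depth  = w * h * samples * bytes_z        # multisampled depth/stencil
resolved    = w * h * bytes_colour             # resolved front buffer

print((msaa_colour + msaa_depth + resolved) / 2**20)   # ~31.6 MB

So even uncompressed, a 720p 4xMSAA framebuffer is around 30 MB, nowhere near 128 MB; it's the bandwidth, not the capacity, that gets eaten.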
 
weaksauce said:
And these only have 128 MB VRAM. Wouldn't it be safe to say that 128 MB and 16 GB/s should be enough for the framebuffer?

It all depends on what you're trying to render.

To be pedantic, all you can conclude from that chart is that Unreal Tournament 2004 running on a PC at 1280x1024@4xAA/8xAF has sufficient bandwidth for that particular benchmark to reach 95 fps.

Which may or may not have anything to do with next-generation games running rendering loads optimized for a console platform. ;)
 
aaaaa00 said:
It all depends on what you're trying to render.

To be pedantic, all you can conclude from that chart is that Unreal Tournament 2004 running on a PC at 1280x1024@4xAA/8xAF has sufficient bandwidth for that particular benchmark to reach 95 fps.

Which may or may not have anything to do with next-generation games running rendering loads optimized for a console platform. ;)

Why does it matter so much what you render? I chose to show UT2K4 because I think it has lots of colours and produces some quality images. How can it get worse?
And why does it matter if it's PC or not? Consoles get better optimization, yeah, but I'm just talking about GDDR3 bandwidth, which should be quite the same except the PS3's is faster.
 
ERP said:
You will be able to slow a game down with lots of transparent OD on RSX, so devs will likely limit the amount of transparent OD.

So if the GS' strength is OD and particle effects, what is RSX' strength?
 
So if you only get ~1/2 of peak bandwidth in the real world (11.2 GB/s for RSX), and 4xMSAA requires 4x the bandwidth, whatever it may be, is it accurate to say that when using 4xMSAA devs will have ~3 GB/s of bandwidth available for everything else? i.e. resolution x OD x fps x (bytes per colour value + bytes per Z value)?

Do you think we'll see 2xAA used predominantly, since it gives them twice the usable bandwidth (5.6 GB/s) to do whatever they want? Or am I just waaaaaay off here...
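
Plugging hypothetical numbers into that formula, with the 4x sample multiplier included, 32-bit colour + 32-bit Z, an assumed average overdraw of 3, and compression and blending re-reads ignored entirely:

# Raw framebuffer write traffic from the formula above (hypothetical inputs)
w, h, fps, samples = 1280, 720, 60, 4
overdraw = 3                       # assumed average overdraw
bytes_per_sample = 4 + 4           # 32-bit colour + 32-bit Z

print(w * h * samples * overdraw * fps * bytes_per_sample / 1e9)   # ~5.3 GB/s

Even that crude estimate blows past a ~3 GB/s budget, so the real answer presumably hinges on how much colour/Z compression claws back.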
 
You're reading far too much into hokey math here.
Console devs don't work like that; you don't finish your game and then turn on AA.
You do some benchmarks, look at image quality, decide what you think the right set of trade-offs is, and make everything else work.

These bandwidth figures tell you very little other than that frame buffer and Z compression actually work.
 