G70 Benchmarks @500/350

Maybe I'm going to starve some SPEs, or maybe I'm not, you can't know in advance.
Maybe. It would suck to code a game and find out you're starving Cell when you're done coding.

No offense Jvd, but I'm not sure you're completely grasping what I'm talking about. Can you please elaborate on how I'm going to limit 'the power of Cell'? I mean, it's time to get some detail here.
(and..IT'S NOT A CACHE!)
Cell was designed with an optimal level of cache and SRAM needed to do the tasks it's intended to do. But by limiting Cell's access to that cache and RAM, you're going to limit the full power of Cell. Of course it's never going to reach its theoretical peak. However, by earmarking this space for tiling for the RSX, you're going to limit the potential of the Cell processor.

I don't care what Sony says, maybe you do, I don't.
C'mon, I just gave an example of an application that does not need 100% of your local store to run fast.
But now you're limiting what you're going to do with the CPU to make up for a lack of something in the GPU.

Emh..I'm not talking about AA at all; I don't know how you got this idea.
So why are you in a thread about FSAA and HDR and the G70/RSX if you're not talking about either of these?

This right here tells me there is no reason to continue talking about this with you. You obviously have no idea what the thread is discussing and just felt like jumping in and adding your two cents to a discussion you didn't bother to read.
 
Maybe. It would suck to code a game and find out you're starving Cell when you're done coding.

Then you're not done... Or you are done and it was your intention to utilize the particular amount of resources on Cell that you chose... *Or* you fux0red up...

Cell was designed with an optimal level of cache and SRAM needed to do the tasks it's intended to do. But by limiting Cell's access to that cache and RAM, you're going to limit the full power of Cell. Of course it's never going to reach its theoretical peak. However, by earmarking this space for tiling for the RSX, you're going to limit the potential of the Cell processor.

Did you ever stop to consider that stuffing tiles into SRAM would be using Cell to its potential? There's no *one way* to use the system; the whole point of the design is to be flexible enough that you can try different methods depending on your needs.

But now you're limiting what you're going to do with the CPU to make up for a lack of something in the GPU.

How am I limiting? Be *VERY* specific! For all you know I could be rendering ridiculously large buffers, and just using one or more SPEs to run a sinc or lanczos filter over the tile before spitting out to the front buffer... It would certainly be amusing to explore...
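For the curious, a Lanczos-filtered downsample is just a windowed-sinc weighted sum over source pixels. A minimal 1D sketch (not SPE code, just the maths; function names are mine for illustration):

```python
import math

def lanczos(x, a=2):
    """Lanczos-windowed sinc kernel, nonzero on (-a, a)."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def downsample_1d(samples, factor, a=2):
    """Shrink a row of pixel values by an integer factor using Lanczos weights."""
    out = []
    for i in range(len(samples) // factor):
        center = (i + 0.5) * factor - 0.5   # source-space centre of the dest pixel
        acc = total = 0.0
        for j in range(int(math.floor(center - a * factor)),
                       int(math.ceil(center + a * factor)) + 1):
            w = lanczos((j - center) / factor, a)
            acc += w * samples[min(max(j, 0), len(samples) - 1)]  # clamp at edges
            total += w
        out.append(acc / total)  # normalise so weights sum to 1
    return out
```

A 2D version would just run this separably over rows and then columns of each tile.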

So why are you in a thread about FSAA and HDR and the G70/RSX if you're not talking about either of these?

This right here tells me there is no reason to continue talking about this with you. You obviously have no idea what the thread is discussing and just felt like jumping in and adding your two cents to a discussion you didn't bother to read.

The thread is titled "G70 Benchmarks @500/350", you can take the AA and/or HDR with it if you want.... The only constraint with regards to resolution that folks really have to concern themselves with is simply: more pixels == shorter shaders per-pixel (and that affects both systems).
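As a rough illustration of that constraint (the clock, pipe count, and fps figures below are placeholders for the sake of arithmetic, not RSX specs):

```python
def shader_cycles_per_pixel(clock_mhz, pixel_pipes, width, height, fps, overdraw=1.0):
    """Rough per-pixel shader cycle budget: total pipe-cycles per second
    divided by pixels shaded per second. A toy model, ignoring stalls."""
    cycles_available = clock_mhz * 1e6 * pixel_pipes
    pixels_shaded = width * height * fps * overdraw
    return cycles_available / pixels_shaded

# Hypothetical 550 MHz part with 24 pixel pipes at 60 fps:
budget_720p = shader_cycles_per_pixel(550, 24, 1280, 720, 60)    # ~239 cycles/pixel
budget_1080p = shader_cycles_per_pixel(550, 24, 1920, 1080, 60)  # ~106 cycles/pixel
```

The 1080p budget is exactly 2.25x smaller than the 720p one, because that is the ratio of pixel counts; hence "more pixels == shorter shaders per-pixel" on any GPU.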
 
The thread is titled "G70 Benchmarks @500/350", you can take the AA and/or HDR with it if you want.... The only constraint with regards to resolution that folks really have to concern themselves with is simply: more pixels == shorter shaders per-pixel (and that affects both systems).
Apparently you haven't been paying attention to the discussion in the thread either. It's all about HDR and FSAA and the G70/RSX.

That is what the discussion has been about. Coming in and changing what you're talking about, to try to make what you're saying relevant to the conversation after realising that nothing you're talking about has to do with what was being discussed, is a horrible excuse.
 
The only problem of course is that those aren't the numbers of G70. 99.6 to 81.3... are. And as I said earlier:

Except I'm talking specifically about the G70 in RSX's clothing, not the extra numbers Dave threw in to give us a reference. This is what I've always been talking about.

That is why I mention it being a lower res than 720p.

Yet even using your numbers, from a card that will have almost twice the bandwidth of the RSX, it still doesn't have HDR + FSAA, which is what most of us want at 720p since the other console will be able to do this.
 
you were talking about rsx? Oh I see....
No, it was in reply to the person saying:


From what I can gather on all the materials I've read is that the PS3 will do just fine with AA, HDR and the like at 720p. All the benchmarks and info I've read here seem to jive with this. Correct?

In which case my reply is correct, as the only HDR benchmarks we've seen are from NV40 and G70 cards, neither of which can do HDR + FSAA. So what he said is incorrect in regards to the benchmarks, and we still don't know if the RSX can do both at the same time.

Really, could you provide some performance numbers? Is that fp10 or fp16? What is the % difference between the two precisions? What is the performance difference between no AA and 4x AA with HDR, or is it still free?


Here are the frame-buffer sizes for these HDTV resolutions and 640x480 with a colour depth of 32-bit (which will cover both the standard integer 32-bit format and the FP10) and a 32-bit Z/stencil buffer. Naturally, the sizes will increase if a higher Z-Buffer depth or a higher bit colour depth is used:

Then the chart lists
Code:
(MB)        No FSAA   2x FSAA   4x FSAA
640x480         2.3       4.7       9.4
1280x720        7.0      14.0      28.1
1920x540        7.9      15.8      31.6
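For what it's worth, the chart's numbers can be reproduced with a quick script, assuming 32-bit colour plus 32-bit Z/stencil per sample (the function name here is just for illustration):

```python
def framebuffer_mb(width, height, samples, bytes_color=4, bytes_z=4):
    """Colour + Z/stencil buffer size in MB for a given multisample count,
    assuming every sample stores a full colour and Z/stencil value."""
    return width * height * samples * (bytes_color + bytes_z) / (1024 * 1024)

# Reproduce the chart's rows for no FSAA, 2x, and 4x:
for w, h in [(640, 480), (1280, 720), (1920, 540)]:
    row = [framebuffer_mb(w, h, s) for s in (1, 2, 4)]
    print(f"{w}x{h}: " + "  ".join(f"{mb:.1f}" for mb in row))
```

So the 28.1 MB figure for 4x FSAA at 720p is just 1280 x 720 x 4 samples x 8 bytes.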

Seems to imply to me that with FP10, 4x FSAA at 720p fits into 28.1 MB, which doesn't quite fill up 3 tiles (3 tiles would be 30 MB). So in bandwidth terms it appears to be free.

Since Xenos is set up to give 4x FSAA for free both bandwidth- and fillrate-wise (or with up to a 5% decrease in performance), and from how I'm reading it (though I could be wrong) FP10 will be free bandwidth-wise, we only have to worry about GPU power. Looking at the NV40 and G70 benchmarks, FP16 HDR seems to be bandwidth limited, not GPU limited.

So I don't see why it can't do FP10 with FSAA with minimal performance hits.


The question has two parts:

1) Can the RSX do both HDR and FSAA together? (If not, question two is moot.)

2) Can it do both without taking a huge hit?
 
It does look like this will be the case, but I don't know why anyone would consider fp10 HDR.

Can you list why you think no one would consider FP10 HDR?

While it won't be ideal for all games, it sounds like a very good compromise between image quality and performance.

1) Can the RSX do both HDR and FSAA together?
I haven't seen any reason to believe it can.

Neither have I. So on the X360 you have a choice between FP10 + 4x FSAA at 720p, which seems free or a very small hit, or FP16 with a bigger hit and 4x FSAA.

Or the PS3, where you have to decide between FSAA or HDR.
 
I'll assume you meant wouldn't... mathematics.
Then again, FP16 would be the same compared to FP32, and FP32 the same compared to FP64.

At each level there are trade-offs. Looking at the hits in games that aren't CPU bound, HDR takes a huge loss at the upper resolutions. This is FP16. So ideally, FP10 introduced on an NV40 would have meant playable framerates at higher resolutions than what we ended up with.

On the flip side, on the RSX, FP16 will be usable in many more games than FP32.



Oh I think it could be a nice upgrade vs. what we've had previously (doubly so since it works with AA), but I would say it is disingenuous to call it HDR.
As above, what you're saying is akin to claiming FP16 HDR shouldn't be called HDR because FP32 would look better.

I don't see the point, especially if in most cases the images will look close to the same.
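For reference, the FP10 format is usually described as an unsigned mini-float with 3 exponent bits and 7 mantissa bits; assuming that layout and an exponent bias of 3, its whole range tops out just under 32, versus 65504 for IEEE FP16. That gap is the substance of the "is it really HDR?" argument. A quick decoder sketch (layout and bias are assumptions about the format, not a confirmed spec):

```python
def decode_fp10_7e3(bits, bias=3):
    """Decode an assumed unsigned 7e3 mini-float: 3 exponent bits,
    7 mantissa bits, no sign bit."""
    exp = (bits >> 7) & 0x7
    man = bits & 0x7F
    if exp == 0:                                # denormal range near zero
        return (man / 128) * 2.0 ** (1 - bias)
    return (1 + man / 128) * 2.0 ** (exp - bias)

fp10_max = decode_fp10_7e3(0x3FF)   # all 10 bits set -> largest value, 31.875
fp16_max = 65504.0                  # IEEE half-precision maximum
```

So FP10 buys a dynamic range of roughly [0, 32) at 7 bits of precision, which is far more than 8-bit integer but far less than FP16.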
 
Inane_Dork said:
22 GB/s would never be enough to power PC games on a 550 MHz G70
Why is everyone freaking stuck in the mindset - "all software workloads are identical (PC-type) and the only variable is so called 'power' "? If that were the case PS2 and DC would still be stuck with ~Unreal1 graphics at <30fps.

Dave said:
So, anything thats been seen using the UE3 engine isn't going to be particularly good on utilising next gen console hardware?
No, not relative to native solutions.
 
jvd said:
game and find out you're starving Cell when you're done coding.
It happens; game developers, like everyone, make mistakes all the time.
But a good game developer, before taking a specific route, runs tests, makes experiments, and so on..

Cell was designed with an optimal level of cache and SRAM needed to do the tasks it's intended to do. But by limiting Cell's access to that cache and RAM, you're going to limit the full power of Cell.
I'm going to limit it only on tasks that actually need a full local store, full bandwidth, etc.
Of course it's never going to reach its theoretical peak. However, by earmarking this space for tiling for the RSX, you're going to limit the potential of the Cell processor.
then my 3D engine will suck and I will get fired, aren't you happy? ;)
But now you're limiting what you're going to do with the CPU to make up for a lack of something in the GPU.
I'm not going to repeat why this is not the case for the tenth time.

So why are you in a thread about FSAA and HDR and the G70/RSX if you're not talking about either of these?
What?! I'm in a thread which started with, in my opinion, a meaningless test, and I'm trying to explain why I believe that test doesn't deserve to be taken seriously.
This right here tells me there is no reason to continue talking about this with you. You obviously have no idea what the thread is discussing and just felt like jumping in and adding your two cents to a discussion you didn't bother to read.
Yeah..I'm the one who has no idea :) too bad you're the one that avoided all my questions on specific technical details :rolleyes:
 
Fafalada said:
Inane_Dork said:
22 GB/s would never be enough to power PC games on a 550 MHz G70
Why is everyone freaking stuck in the mindset - "all software workloads are identical (PC-type) and the only variable is so called 'power' "? If that were the case PS2 and DC would still be stuck with ~Unreal1 graphics at <30fps.
Maybe if you were following the path of the conversation, you wouldn't be so upset. nAo was talking about the RSX not using XDR bandwidth. I thought he meant streaming mesh data over Flex I/O and all textures and render targets staying in the VRAM. It turns out he was talking about keeping the framebuffer in the SPE's LS and tiling the whole thing, but I didn't know that at the time.

Now then, my point is that the RSX would most likely be bandwidth bound at 22 GB/s. Of course, in such a case, developers would go with longer shaders and such because they can do so "freely," but that doesn't mean 22 GB/s is a good match to the RSX.

Long story short, you're barking up the wrong tree.
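To put a rough number on "bandwidth bound": a back-of-the-envelope peak ROP demand, using hypothetical G70-class figures (clock, ROP count, and bytes-per-pixel here are illustrative assumptions, not confirmed RSX specs):

```python
def rop_bandwidth_gbps(clock_mhz, rops, bytes_per_pixel):
    """Upper-bound bytes/s the ROPs could demand if they never stalled.
    Real demand is lower (texture cache hits, compression, idle cycles)."""
    return clock_mhz * 1e6 * rops * bytes_per_pixel / 1e9

# 550 MHz, 16 ROPs, ~8 bytes/pixel (4 B colour write + 4 B Z traffic):
demand = rop_bandwidth_gbps(550, 16, 8)   # ~70 GB/s peak demand
```

Even allowing for generous framebuffer compression, a peak demand around 70 GB/s against a 22 GB/s bus suggests fill-heavy workloads would indeed sit on the bandwidth wall, which is the point being made above.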
 
Late last year the talk (rumor) was that NVidia's new SoundStorm solution would actually be integrated into their next generation of video cards, supposedly to be powered by the GPU.

Well, looks like that didn't come to pass and that's too bad, because I've always been a big SoundStorm fan. Maybe it will someday soon though.
 
ninelven said:
jvd said:
as above what your saying is akin to claiming hdr fp 16 shouldn't be called hdr because fp 32 would look better.
No, that isn't what I was saying at all. The point of HDR is to "accurately represent the wide range of intensity levels found in real scenes, ranging from direct sunlight to the deepest shadows." key word: real.

Of which FP16 wouldn't accurately represent it either, as it is also limited, as is FP32.

Which, going by what you said, means none should be called HDR.
 
ninelven said:
jvd said:
Of which FP16 wouldn't accurately represent it either, as it is also limited, as is FP32.

Which, going by what you said, means none should be called HDR.

Really, where did I say that? I must have missed that part...

:rolleyes: Obviously one is working within the constraints of current monitor technology.

To quote a moonite: "I'm doing it harder than I ever have before."


 