Beyond3D's GT200 GPU and Architecture Analysis

Setup limitations are especially relevant at lower resolutions. GT200 may let you play Crysis at twice the resolution of your old card at the same FPS, but at the same resolution you won't get close to a 100% increase in FPS. A lot of people would like the latter.

Good point that at the same resolution, one may not get close to a 100% increase in FPS. But it would be interesting to see what would happen at the same resolution with very high levels of AA (i.e. 8xAA or 16x CSAA). Would there be anything stopping GT200 from getting close to a 100% increase in FPS at lower resolutions with very high levels of AA?
 
At any resolution, with any level of AA, it will be hard to get the increase you'd expect from the shader/pixel rate increase unless you have a very low poly count. So the answer is no. Let me try to make the difference a bit clearer:

Say card A has twice the BW, shaders, and texture units of card B, but the same setup rate. I'm saying that A will get the same framerate as B at double the resolution, not that A is twice as fast as B at the doubled resolution. Similarly, A is not twice as fast as B at the original resolution with lots of AA.
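A minimal sketch of that argument, assuming a toy two-stage model where the frame is limited by whichever of triangle setup or pixel work takes longer (real GPUs overlap these stages, and all the rates below are hypothetical, picked only to illustrate the scaling):

```python
# Toy frame-time model: frame time is set by the slower of triangle
# setup and pixel (shader/bandwidth) work. All numbers are hypothetical.

def frame_time(tris, pixels, setup_rate, pixel_rate):
    """Frame time in seconds for a two-stage pipelined GPU."""
    return max(tris / setup_rate, pixels / pixel_rate)

TRIS = 1_800_000          # triangles per frame (hypothetical)
RES = 1920 * 1200         # original resolution, in pixels
SETUP = 300e6             # setup rate in tris/s, same for both cards

# Card B is the baseline; card A has twice the pixel/shader throughput.
t_b      = frame_time(TRIS, RES,     SETUP, pixel_rate=230e6)
t_a_same = frame_time(TRIS, RES,     SETUP, pixel_rate=460e6)
t_a_2x   = frame_time(TRIS, 2 * RES, SETUP, pixel_rate=460e6)

print(f"A vs B at the same resolution: {t_b / t_a_same:.2f}x")  # ~1.67x, not 2x
print(f"A at 2x res vs B at original:  {t_b / t_a_2x:.2f}x")    # ~1.00x
```

With a very low poly count the setup term stops mattering and the full 2x shows up, which is exactly the caveat above.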
 

Makes sense, thanks!
 
http://www.ixbt.com/video3/gt200-part2.shtml

[Graph: Geometry Shader performance results from the iXBT article]


Geometry Shader performance is tremendously improved compared to the older G8x/G9x GPUs. Still waiting for the English article at Digit-Life.
 
Sweet, I love their articles. Thanks for the heads up. Don't really need the English version, as I only look at the graphs :D

In their tests, G80 has generally clobbered R6xx in DX10 tests, although the results vary wildly and sometimes show the opposite. Looks like GS performance is more consistent now: about 2.4 times that of RV670 in each test. Maybe three times once you account for the difference in base clock?
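A quick back-of-envelope check of that clock-normalised guess, assuming the reference core clocks (602 MHz for the GTX 280, 775 MHz for the HD 3870/RV670) and glossing over the fact that G8x-style parts clock their shader clusters separately:

```python
# Normalising the measured GS ratio by core clock. The clocks are the
# reference specs I'm assuming, not figures from the iXBT article.
measured_ratio = 2.4     # GT200 / RV670 GS throughput in the tests
clk_gt200 = 602e6        # GTX 280 core clock, Hz
clk_rv670 = 775e6        # HD 3870 core clock, Hz

per_clock = measured_ratio * (clk_rv670 / clk_gt200)
print(f"per-clock GS advantage: ~{per_clock:.1f}x")   # ~3.1x
```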
 
Although Tweaktown believe Nvidia has dropped the ball, I think Nvidia has a slight breather over AMD, as R700 is only expected to be released in August (if not later). I expect Nvidia to release a GX2-290 (or do we know what the 290 is already?) sometime around then.

US
 

There will not be a GT200-based GX2: it is too large, too hot, and too power hungry. While not technically impossible, it is so unfeasible as to be virtually impossible.
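A rough power-budget sketch of the "too power hungry" part, assuming the GTX 280's ~236 W TDP and the PCI Express add-in-card ceiling of 300 W (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin):

```python
# Back-of-envelope power budget for a hypothetical dual-GT200 board.
# Assumed figures: GTX 280 TDP ~236 W; PCIe power sources as below.
GTX280_TDP_W = 236
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

needed = 2 * GTX280_TDP_W
available = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"~{needed} W needed vs {available} W available")  # 472 W vs 300 W
```

Even with binned chips and lowered clocks, as on the 9800 GX2, closing a gap that size without a die shrink would be a tall order.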
 
The Medusa demo actually makes good use of the geometry shader. And the performance difference between the G92/G80 cards and the GTX 260/280 cards is pretty significant.

Chris
 
What about a GT200b GX2? :smile:


GT200b won't be enough of a shrink to allow for a GT200b GX2 card with two GPUs.

GT200 probably needs a full shrink (to 45 nm) and a redesign to allow a GX2.


The 9800 GX2 was not two G80s on a card; it was two reduced G92s:
2x 256-bit bus, 2x 16 ROPs.

If a dual-G80 GX2 card had been produced, it would've been 2x 384-bit bus, 2x 24 ROPs.

We may never see a dual-GT200 card.
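For what those bus widths mean in bandwidth terms, here is the arithmetic, assuming ~2.0 Gbps effective GDDR3 on both configurations (that matches the 9800 GX2's memory spec as I recall it; the dual-G80 line is for the hypothetical card above):

```python
# Per-GPU memory bandwidth = bus width (bytes) x effective data rate.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 2.0))   # 9800 GX2, per G92: 64.0 GB/s
print(bandwidth_gb_s(384, 2.0))   # dual-G80 (hypothetical): 96.0 GB/s
```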
 

I ran the Medusa demo last night. The technical bits might be interesting, but it's the end result that generates the wow factor, and it's lacking in that respect IMO. What's the use of all these technological advances if the images they produce aren't getting any better?
 
In what sense? The demo was designed in many ways to show off a combination of DX10 effects all running together: the human head demo, Adrianne, and geometry shader generation, all in a real-time environment with more than one thing going on. That's what impressed me about it. And I think that's the point of the demo.

I'm biased toward the era and theme ((Medusa is so mythological, and I like that)), but I found the demo very impressive in that respect.

/shrug
 

Well, that kinda supports my point. You're impressed because you know all the technical stuff that's going on behind the scenes. But I didn't find the actual images rendered to be that great. Nothing really screamed next-gen to me.
 
Question: did Adrianne impress you? Because it didn't impress me. I think the thing I like about the demo is that it puts a lot of effects together and makes them useful in real time. Something that might actually be close to feasible in a game in a year and a half. When I first saw Adrianne, I didn't see that quality of rendering coming to games anytime soon.

Chris
 
Nah, Adrianne was most certainly not impressive. In terms of what we will see in games in a year and a half, I really hope the visuals are a bit better than the Medusa demo. Maybe it's just weak art assets or the cartoony color palette that are spoiling it for me.
 

Technically impressive, but lacking aesthetically is how I would describe NV's demos over the years. ATi's got both going on, and has since at least the R300 days.
 
I would use MGS4 as a great example. The engine isn't as technically impressive as Killzone 2's, for example, but the care taken by Kojima's art direction over every texel, polygon, and camera position makes it look far more impressive than engines with technically more impressive shaders and algorithms.
 
Just having a pretty demo that doesn't really show off technology isn't that impressive either, mind. I don't think either IHV is really there yet.
 