When on Tuesday does the G70 NDA expire?

And I was so used to the spring + fall "silly period" schedule every year. All this out-of-sync scheduling between ATi and nVidia is throwing a wrench in the works. Yup, just what we needed, year-round speculation ahoy. :p
 
I'm a bit disappointed at seeing 8xS (FSAA) take such a huge performance hit.

http://www.beyond3d.com/previews/nvidia/g70/index.php?p=19

If Nvidia intends to show that the G70 is better than anything ATI can offer, then they should be concentrating on this area.

http://www.beyond3d.com/reviews/ati/r480/index.php?p=12

Having a look at FSAA 4xAA and 16xAF .. ATI's X850 XT PE whoops Nvidia G70's ass.

Even with optimal filtering (I take it it's CP controlled), the ATI is faster.

For a new card .. I would've expected better.
 
Jawed said:
Just needs to be about 20% faster than the X850 XT PE. The games the 7800GTX shows the biggest gains in tend to be the games where the 6800U was well behind.

If ATI's behind on transparent AA then they're buggered, but the performance part looks like it'll be easy.

Jawed

You think so? Take another look.
 
Unknown Soldier said:
Having a look at FSAA 4xAA and 16xAF .. ATI's X850 XT PE whoops Nvidia G70's ass.

Even with optimal filtering (I take it it's CP controlled), the ATI is faster.

For a new card .. I would've expected better.

Relying on a single benchmark, and a single test within that benchmark (GT2) is not a reliable way to compare the two cards.
 
_xxx_ said:
Jawed said:
Just needs to be about 20% faster than the X850 XT PE. The games the 7800GTX shows the biggest gains in tend to be the games where the 6800U was well behind.

If ATI's behind on transparent AA then they're buggered, but the performance part looks like it'll be easy.

Jawed

You think so? Take another look.

I've already disputed this conclusion once in this thread, with respect to BF2.

Jawed
 
DemoCoder said:
Unknown Soldier said:
Having a look at FSAA 4xAA and 16xAF .. ATI's X850 XT PE whoops Nvidia G70's ass.

Even with optimal filtering (I take it it's CP controlled), the ATI is faster.

For a new card .. I would've expected better.

Relying on a single benchmark, and a single test within that benchmark (GT2) is not a reliable way to compare the two cards.

Ok .. noted. Except that:

http://www.beyond3d.com/previews/nvidia/g70/index.php?p=12
Code:
Far Cry, 4x FSAA & 8x AF (FPS)   640x480   800x600   1024x768   1280x1024   1600x1200
7800 GTX                            80.3      80.2       80.1        77.2        62.1

http://www.beyond3d.com/reviews/ati/r480/index.php?p=9
Code:
X850 XT PE (FPS)                 640x480   800x600   1024x768   1280x1024   1600x1200
4x FSAA + 8x AF                    108.1     108.3      106.1        87.4        67.0

ATI still whooping G70 ass
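For what it's worth, a quick sketch of the gap implied by those two tables (FPS values copied from the quoted Beyond3D pages; the variable names are my own):

```python
# Percent lead of the X850 XT PE over the 7800 GTX in the quoted
# Far Cry numbers (4x FSAA + 8x AF), per resolution.
resolutions = ["640x480", "800x600", "1024x768", "1280x1024", "1600x1200"]
gtx_7800   = [80.3, 80.2, 80.1, 77.2, 62.1]     # Beyond3D G70 preview, p.12
x850_xt_pe = [108.1, 108.3, 106.1, 87.4, 67.0]  # Beyond3D R480 review, p.9

for res, g, x in zip(resolutions, gtx_7800, x850_xt_pe):
    print(f"{res:>9}: X850 XT PE leads by {100 * (x / g - 1):.0f}%")
```

The lead shrinks from roughly 35% at 640x480 to roughly 8% at 1600x1200, i.e. the gap is largest exactly where the GTX numbers sit flat at ~80fps.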
 
Actually .. having a look at it like this .. is something wrong with the way Davey benchmarked, since the 7800 GTX can't go over 80fps at low res?

Or is it driver issues?

US
 
Well.... color me very impressed. Like DC, I'm most impressed with the alpha AA ability. I have a few games - of which Guild Wars is an example - that just suck with MSAA....... this would be a greatly appreciated feature for those in the camp of better IQ. I'm also glad nVidia has finally gone with gamma correction. I also appreciate the power savings and the single-slot cooling. This is a big step forward for nVidia, to me at least. I do have a few disappointments though.......

1) Price....need I say more?
2) Lack of 6x or higher MSAA. I use this a lot ATM.... and with the power of the G70 it seems a waste not to have included more FSAA options.
3) Lack of the ability to use HDR with FSAA. This is a real biggie.

But, don't mistake my complaints as dissing the 7800GTX. A most impressive piece of hardware. I may have to get one of these....... ;)
 
Unknown Soldier said:
Actually .. having a look at it like this .. is something wrong with the way Davey benchmarked, since the 7800 GTX can't go over 80fps at low res?

Or is it driver issues?

US

Maybe he forgot to turn off vsync (or couldn't do it)?
 
digitalwanderer said:
compres said:
I mean, what's the point of releasing xfire now? They won't beat the 7800 SLI.
Do we know that for sure, and will a Crossfire R520 beat a 7800sli?

I was referring to R420's xfire. I have not seen any R520 xfire benches or anything to be asking any questions about that.
 
martrox said:
3) Lack of the ability to use HDR with FSAA. This is a real biggie.

But, don't mistake my complaints as dissing the 7800GTX. A most impressive piece of hardware. I may have to get one of these....... ;)

I think the 7600 model will be more impressive price/perf :)


IMHO, TSAA is more important than AA+HDR since it works on a lot more titles, and it is unlikely someone will write a game that only works with HDR backbuffers. Especially due to the deployed base of R3xx/R4xx, games are going to provide lots of fallbacks. HDR+AA is a nice feature, don't get me wrong, I've been asking for it for a long time, but given the bandwidth on these cards, and the structure of the market, it isn't as important as it is on the XB360. The real conclusion to draw here is that the RSX/PS3 won't have it, which is bad for Sony.

However, if by the time of WGF2.0, a card doesn't have AA HDR, then I think it is a real problem.
 
http://hardocp.com/article.html?art=Nzg0LDM=

Moving down you will see that the GeForce 7800 GTX works on pixels in quads of 4 with 6 quads total making a total of 24 pixel-pipelines or, more accurately, 24 pixels-per-clock. These feed down into the ROP system, which is made up of 16 ROP pipelines. Note that the GeForce 6800 Ultra has the same number of ROP pipelines.

In terms of fillrate, the GeForce 6800 Ultra at 400MHz and 16 pixels-per-clock gives it 6.4 GigaPixels/Sec. The GeForce 7800 GTX at 430MHz and 24 pixels-per-clock gives it 10.3 GigaPixels/Sec., which is a 61% increase in raw fillrate.

This is still very confusing to a complete idiot like me :oops:

Even though the G70 ~ GeForce 7800 has 24 pixel pipelines, it only has 16 ROPs, so how can the fillrate be 10.3 GP/sec?

Isn't the real fillrate 6.8 GP/sec (16 ROPs, 430 MHz)?

Or is it: internal pixel processing at 10.3 billion pixels/sec,
and outputted/displayed pixels at 6.8 billion pixels/sec?
 
Yep, I raised that same issue with Brent/Kyle over in the [H] forums and they basically shot me down and said that's how they do it over there.

As Dave points out in his review, pixel fillrate is 6.8 Gigapixels / second. Texel fillrate is 10.3 Gigatexels / second.
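The arithmetic behind both figures is just units x clock. A minimal sketch (constants are from the articles quoted above; the function name is my own):

```python
CLOCK_MHZ = 430    # 7800 GTX core clock
PIXEL_PIPES = 24   # pixel pipelines (6 quads of 4)
ROPS = 16          # ROP pipelines

def rate_gigas(units: int, clock_mhz: int) -> float:
    """units * clock, in billions of operations per second (GT/s or GP/s)."""
    return units * clock_mhz * 1e6 / 1e9

texel_rate = rate_gigas(PIXEL_PIPES, CLOCK_MHZ)  # ~10.3 GT/s: the figure [H] calls "fillrate"
pixel_rate = rate_gigas(ROPS, CLOCK_MHZ)         # ~6.88 GP/s: ROP-limited (rounded to 6.8 above)
print(f"texel: {texel_rate:.2f} GT/s, pixel: {pixel_rate:.2f} GP/s")
```

In other words, [H]'s 10.3 counts what the 24 pipelines can shade per clock (texels), while only 16 pixels per clock can actually reach the framebuffer through the ROPs.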
 