When on Tuesday does the G70 NDA expire?

I can't say I'm too impressed, but it's certainly a huge leap (performance-wise) from what I'm currently using. If the R520 isn't a bit more exciting, I will miss this generation as well as the last one. I can survive with a 9800 for now at least :)

It doesn't seem like it was that long ago that the last generation of cards came out, yet it has been a year... odd. Despite the increase in performance, the apparent lack of new features has meant that games look mostly the same on my r3xx card as they do on an r4xx or a G7x, blurring the difference between recent generations. The leap between my GF4 and my 9800 was quite large, as it enabled a considerable number of effects that weren't practically possible previously.

It seems that my 9800 can do most of the things a G70 can do, albeit slower and with some increased hassle to the programmer. HDR is, to some extent, an exception to this (Valve will support it on r3/4xx-based cards, I think), but in the games where it is supported in this generation, it doesn't appear to be all that good, certainly not up to the same standard as UE3 games.

To me, better AA and AF isn't a good enough reason to buy a new graphics card considering their price. If I can play UE3 games with all or the most important shader effects on, but at 800x600, I will. Although that's under the presumption that I will be in the same financial state then as I am now.
 
Unknown Soldier said:
In one test??
I was looking only at the general PS test.
Even if it's slower in some tests, it's faster than a 6800U in most of them.
 
So what should I be looking at for branching performance? Are we inferring from a particular game(s)? Do we need an explicit branching benchmark?
 
geo said:
So what should I be looking at for branching performance? Are we inferring from a particular game(s)? Do we need an explicit branching benchmark?
Why should you be looking for something in particular? Dynamic branching sucks as usual (per the tradition NV's NV30 ignited), and static branching is a no-go anyway...
 
Some interesting snippets from nVIDIA's GeForce 7800 GTX FAQ:

Q: With the introduction of the GeForce 7800 GTX, does this mean you are discontinuing the GeForce 6 series?
No. The GeForce 7800 GTX will be NVIDIA's new flagship GPU, replacing the GeForce 6800 Ultra at the top of our product line. The other GeForce 6 Series GPUs, including the GeForce 6800 GT and Standard, will remain in their respective segments.

Q: How long will the GeForce 6 series be available for purchase?
The GeForce 6 Series GPUs will remain in their respective segments, and will be offered through the end of this year.

Q: Is there a GeForce 7800 Ultra?
GeForce 7800 GTX is our top of the line. We had so much positive response for the GT SKU (single slot, great performance) that we wanted to build upon that success and brand. GeForce 7800 GTX takes NVIDIA's high end performance offering to the next level.

Q: Will there be a 512MB version?
While the GPU can support up to 512MB of RAM, it's up to our customers and consumer demand to bring one to market.

Q: Will you use HSI to make AGP-based GeForce 7800 boards?
We are not disclosing plans for AGP at this time, but that is certainly possible with the flexibility of our HSI chip.

I'm a bit surprised that NV haven't released or at least announced a vanilla 7800 yet with disabled quads/ROPs -- perhaps there is still significant NV4x inventory within the channel?

Cheers,


BrynS
 
alexsok said:
geo said:
So what should I be looking at for branching performance? Are we inferring from a particular game(s)? Do we need an explicit branching benchmark?
Why should you be looking for something in particular? Dynamic branching sucks as usual (per the tradition NV's NV30 ignited), and static branching is a no-go anyway...

Anand:

We have also learned that the penalty for branching in the pixel shaders is much less than in previous hardware.

I thought it'd be nice to test the assertion. It seems to be the future as much as the other beef-ups they did on the PS.
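To see why the branching penalty varies between chips, here's a toy model (my own sketch, with made-up instruction costs, not anything from the reviews): SIMD pixel pipelines shade pixels in fixed-size batches, and if any pixel in a batch takes the expensive side of a dynamic branch, the whole batch pays for it. A benchmark would effectively be measuring how small that batch granularity is.

```python
# Hypothetical sketch: why dynamic branching cost depends on the GPU's
# branch granularity. Pixels are shaded in fixed-size batches; if any pixel
# in a batch takes the expensive path, the whole batch executes it.
# The cheap/expensive instruction counts below are illustrative only.

def shading_cost(mask, batch_size, cheap=4, expensive=40):
    """Total instruction cost of shading `mask` (True = expensive path)
    when pixels are grouped into batches of `batch_size`."""
    total = 0
    for i in range(0, len(mask), batch_size):
        batch = mask[i:i + batch_size]
        # A batch pays the expensive path if any of its pixels needs it.
        total += len(batch) * (expensive if any(batch) else cheap)
    return total

# A scanline where only the first 16 of 1024 pixels need the expensive path.
mask = [i < 16 for i in range(1024)]

coarse = shading_cost(mask, batch_size=1024)  # huge batches: everyone pays
fine   = shading_cost(mask, batch_size=64)    # smaller granularity
ideal  = shading_cost(mask, batch_size=1)     # perfect per-pixel branching
```

With coarse batching the whole scanline runs the expensive path; with 64-pixel batches only one batch does, which is already close to the per-pixel ideal. That gap is what an explicit branching benchmark would expose.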
 
After reading a few of the reviews I think it (G70) is freaking cool. I like what they have done to the shader architecture, and the performance really shows compared to the 6800. I also like the transparency AA modes.

IMO this is the best, most well-rounded, well-thought-out card they (Nvidia) have produced since the GF2 GTS.

Actually, I would say that they have finally produced their best piece of hardware in all areas, hands down.
 
Hellbinder said:
After reading a few of the reviews I think it (G70) is freaking cool...
Has your account been hacked? :LOL:

I haven't read all the reviews yet, but so far G70 seems like a very competent part, although AA with fp blending/HDR would have been the cherry on top of a very inviting cake!

Cheers,


BrynS
 
BrynS said:
Hellbinder said:
After reading a few of the reviews I think it (G70) is freaking cool...
Has your account been hacked? :LOL:

I haven't read all the reviews yet, but so far G70 seems like a very competent part, although AA with fp blending/HDR would have been the cherry on top of a very inviting cake!

Cheers,


BrynS

:LOL: :LOL:
People sometimes change their minds, right?

Btw, is there any way to hack those new AA modes for use on the NV4x series? :devilish:
 
I just looked at the Chinese review.

Unless I am crazy, it looks like *basically* in shader-intensive games an X800XL beats it or is within 10 FPS of it in most cases. The exception is Doom 3.

It's obvious to me that ATi has a superior shader core for 90% of today's games, even with their current tech. And that's not even getting into what the R520 is going to do.

I can't wait to see the reviews that compare a fully loaded X850XT to the 7800. Nvidia may win only a small handful of benchmark comparisons.

I predict it's going to be a long, long day for Nvidia in the near future ;)


Hellbinder said:
After reading a few of the reviews I think it (G70) is freaking cool. I like what they have done to the shader architecture, and the performance really shows compared to the 6800. I also like the transparency AA modes.

IMO this is the best, most well-rounded, well-thought-out card they (Nvidia) have produced since the GF2 GTS.

Actually, I would say that they have finally produced their best piece of hardware in all areas, hands down.

See? I told you to reserve some judgement until you saw the whole picture mate :p
 
I'm bored.

The only excitement comes from the transparency AA modes. That's really excellent. Sticking to a rotated grid and not making gamma correction the default is rather lame, though.

The increased ALU capability is showing absolutely no benefit in any game benchmark that I can find.

I hope ATI does the transparent AA thing, too. And I hope they go to an 8xMSAA mode and launch with a 512MB card.

It looks like we'll really have to wait till R580 before there's a truly exciting increase in performance. R520 isn't going to have to do much to compete with 7800GTX.

Jawed
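For anyone wondering why transparency AA is the standout feature: ordinary MSAA only smooths polygon edges, while alpha-tested textures (fences, foliage) are a binary keep/kill decision made once per pixel, so their edges stay hard. A toy model (my own illustration with a made-up alpha channel, not NVIDIA's implementation) shows what supersampling the alpha test buys:

```python
# Illustrative sketch: why alpha-tested edges alias under plain MSAA and
# how transparency supersampling smooths them. The alpha function and the
# pixel geometry below are invented for the example.

def alpha(x):
    """Toy alpha channel: a soft edge ramping up around x = 0.5."""
    return max(0.0, min(1.0, (x - 0.25) / 0.5))

def pixel_coverage(x0, samples, threshold=0.5, width=0.1):
    """Fraction of sub-pixel alpha-test samples that pass inside a pixel
    of the given width starting at x0."""
    passed = sum(alpha(x0 + (s + 0.5) / samples * width) > threshold
                 for s in range(samples))
    return passed / samples

one_sample  = pixel_coverage(0.45, 1)  # plain alpha test: all-or-nothing
four_sample = pixel_coverage(0.45, 4)  # transparency SSAA: graded coverage
```

With one shading sample the edge pixel is fully transparent or fully opaque; with four alpha-test samples it gets a fractional coverage, which is exactly the gradient that makes fences and foliage stop shimmering.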
 
Kombatant said:
I just looked at the Chinese review.

Unless I am crazy, it looks like *basically* in shader-intensive games an X800XL beats it or is within 10 FPS of it in most cases. The exception is Doom 3.

It's obvious to me that ATi has a superior shader core for 90% of today's games, even with their current tech. And that's not even getting into what the R520 is going to do.

I can't wait to see the reviews that compare a fully loaded X850XT to the 7800. Nvidia may win only a small handful of benchmark comparisons.

I predict it's going to be a long, long day for Nvidia in the near future ;)


Hellbinder said:
After reading a few of the reviews I think it (G70) is freaking cool. I like what they have done to the shader architecture, and the performance really shows compared to the 6800. I also like the transparency AA modes.

IMO this is the best, most well-rounded, well-thought-out card they (Nvidia) have produced since the GF2 GTS.

Actually, I would say that they have finally produced their best piece of hardware in all areas, hands down.

See? I told you to reserve some judgement until you saw the whole picture mate :p

That's what the Chinese review made it look like; obviously my comments were based off that, and obviously they have changed now. I think it's a nice piece of hardware.

It still doesn't change the fact that Nvidia is going to be in for a long day in a little while. That one is not going to change, especially now that I have seen the full results.
 
Jawed said:
I'm bored.

The only excitement comes from the transparency AA modes. That's really excellent. Sticking to a rotated grid and not making gamma correction the default is rather lame, though.

What's up with the gamma thing anyway? Is there a performance cost there? I can't imagine why you'd not have it on all the time (or at the very least by default) if there weren't. Did you see (maybe I missed it) whether Wavey said he did his tests with that setting on or off?

But I totally agree on the transparency AA -- they got that one right, both in doing it and in the control panel options.
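On the gamma question, a quick sketch of what "gamma-corrected" AA resolve actually means may help (my own illustration, assuming a simple power-law gamma of 2.2 rather than the exact sRGB curve): framebuffer values are gamma-encoded, so averaging the stored values directly is not the same as averaging the light they represent, and edge pixels come out too dark.

```python
# Sketch of a gamma-correct AA downfilter vs a naive one. The gamma value
# and the edge-pixel sample pattern are illustrative assumptions.

GAMMA = 2.2

def resolve_naive(samples):
    """Average the stored (gamma-encoded) values directly."""
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples):
    """Decode to linear light, average, then re-encode."""
    linear = [s ** GAMMA for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / GAMMA)

# A 50%-covered edge pixel: half white samples, half black samples.
edge = [1.0, 1.0, 0.0, 0.0]
naive   = resolve_naive(edge)          # stores 0.5 -> displays darker than half
correct = resolve_gamma_correct(edge)  # stores ~0.73 -> displays at half intensity
```

The naive resolve writes 0.5, which a gamma-2.2 display shows at well under half intensity, so AA gradients on bright edges look too dark; the corrected resolve writes roughly 0.73, which displays as true half intensity. The extra decode/encode per sample is the plausible performance cost.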
 
Hellbinder said:
It still doesn't change the fact that Nvidia is going to be in for a long day in a little while. That one is not going to change, especially now that I have seen the full results.

I see you're still not following my advice.. ok then :)
 