When on Tuesday does the G70 NDA expire?

What struck me about G70 is how much I *don't* want one. It's a 24-pipe NV40 that still has 16 ROPs, which was pretty much what was expected, and while it's got some nice features (actual video support is nice and transparency AA is pants-wettingly nice), I don't think anyone in their right mind would call it the Best Thing Since Sliced Bread. For one, we don't have any apps that really need it (except maybe Far Cry with HDR, but then we still can't have AA, so that's a damned-if-you-do, damned-if-you-don't thing, or Chaos Theory, but still...)

I dunno. Feels kinda like the GF3 to me. Wonder if we'll see NV move to .09u with the refresh and ramp up clocks ridiculously. What I'm really curious about is how the $200-300 price point cards in this generation will perform...
 
So the G70 has come and gone. It seems like only yesterday we were all speculating on what it would be... :p

Well, I am not terribly impressed (yet). Maybe that is because I have not seen it compared in the right situation/benchmark. However, I have two gripes about the product. One minor gripe and one huge one. Let's begin with the little one.

1) What's up with changing the name from Ultra to GTX? This can only lead to confusion and I can't see anything good coming from it.

2) Price. This is a biggie. At $599 ($600 to you and me) it has bumped up the price bracket for the top of the range. This angers me a little because Nvidia (and ATI et al.) had been holding pretty steady at $500 for quite some time. This board doesn't seem to offer any quantum leap deserving of a pricing shift (other than perhaps a weakened dollar). What's more, boards selling at this price point used to come with a new GPU and newer, faster memory. This time Nvidia has stuck with the same memory it used for the 6800 Ultra. The price of that memory should have dropped since the 6800 Ultra's introduction, so if they could offer it at $500 then, they should certainly be able to keep prices steady now (any increase in GPU cost should have been offset by the falling memory prices).

Generally speaking, I am disappointed that there seems to be no indication of a move to 512MB as the standard memory capacity. I think this is really important. No matter how cool shader effects are, I would like to see higher-quality and higher-quantity content in games. I really hope this "shader daze" won't last too long so we can get on with putting some real meat in there.

The lack of a 512MB 7800 GTX, the lack of games that can really use the GTX, and the acceptable performance of my 6800 Ultra will make me wait for the next (refresh) generation before buying in. Hopefully R520 isn't too far away, and I am hoping we'll see a real push to get 512MB on the top cards so developers can begin seriously designing for that capacity.

Oh, another thing. I am not in this position myself, but I couldn't help wondering as I read the various reviews how owners of a single 6800 Ultra/GT PCI-e card would react to this. Will they ditch their 6800s for 7800s or will they slap in another 6800 for SLI? This must play havoc with product placement/pricing.
 
The Baron said:
What struck me about G70 is how much I *don't* want one. It's a 24-pipe NV40 that still has 16 ROPs, which was pretty much what was expected, and while it's got some nice features (actual video support is nice and transparency AA is pants-wettingly nice), I don't think anyone in their right mind would call it the Best Thing Since Sliced Bread. For one, we don't have any apps that really need it (except maybe Far Cry with HDR, but then we still can't have AA, so that's a damned-if-you-do, damned-if-you-don't thing, or Chaos Theory, but still...)

I dunno. Feels kinda like the GF3 to me. Wonder if we'll see NV move to .09u with the refresh and ramp up clocks ridiculously. What I'm really curious about is how the $200-300 price point cards in this generation will perform...

I agree.

I think it's a very nice card to upgrade to from a 9700 or below, but given the price tag, the performance it brings compared to the X800/GF6800 just doesn't cut it.
 
dsw said:
Nvidia just put this up. It answers a lot of questions that people have been asking.

http://www.nvidia.com/page/geforce_7800_faq.html

Sure, it's a brand-new "revolutionary" architecture built from the ground up, hehe, consider me fooled... NOT!!! It reads more like a marketing history log on ICQ, with each PR guy searching for a louder grunt or another gaudy word that is a far cry from the reality we have today.
 
alexsok said:
dsw said:
Nvidia just put this up. It answers a lot of questions that people have been asking.

http://www.nvidia.com/page/geforce_7800_faq.html

Sure, it's a brand-new "revolutionary" architecture built from the ground up, hehe, consider me fooled... NOT!!! It reads more like a marketing history log on ICQ, with each PR guy searching for a louder grunt or another gaudy word that is a far cry from the reality we have today.

If you look a little more closely, they give info about AGP availability (it's not happening right now), the GTX's place in the lineup (it replaces the 6800 Ultra as the new high-end offering), and the status of the midrange 6-series cards (they will remain on the market through the end of the year).
 
Actually, the one review I read (on iXBT) pretty much had that material covered, so nothing new for me there... besides, that flashy, utterly ridiculous first line left a bitter taste in my mouth right at the outset... I despise PR marketing machines!!
 
I feel the same way. There seems to be little reason for me to upgrade from my 6800GT. The only thing that would entice me is the FSAA, but is that worth $550?

Let's hope that ATI can come out with something that will make us want to upgrade; otherwise I don't see myself making a purchase for another year (which will be two years after I bought my 6800GT). It's the same thing with CPUs. I'm used to upgrading everything about once a year, but I'm more than happy with the 3GHz P4 that I got last year and overclocked to 3.4.

I don't know. I just don't see a reason to upgrade this year like I have in all past years.
 
Well, if they ship a GT version of the 7800 or a 7600GTX, I'll buy one just for TSAA. IMHO, that feature alone makes it worth it. The difference in IQ is night and day.


I think too many people are looking at WGF 2.0 and the XB360 and hoping that their next card purchase can be some sort of next-generation architecture, but it will be a long time before you can buy a Xenos-style architecture for your PC, and even longer before both your OS (Longhorn) and your games will support it. Hence, for the next 1-1.5 years, you're going to be forced to buy what are essentially small modifications of existing architectures.
 
Kombatant said:
Jawed said:
R520 isn't going to have to do much to compete with 7800GTX.

Excuse my Engrish, but what does that mean? :)

Just needs to be about 20% faster than X850XTPE. The games 7800GTX shows the biggest gains in tend to be the games where 6800U was well behind.

If ATI's behind on transparent AA then they're buggered, but the performance part looks like it'll be easy.

Jawed
 
wireframe said:
1) What's up with changing the name from Ultra to GTX? This can only lead to confusion and I can't see anythign good coming from it.

Their FAQ seems to point at a marketing answer -- everyone knows (and I think just about everyone here would agree) that the GT was the cool card to have from a price/performance POV last time. So now this one is a GT Xtreme! Wow!! It is only confusing if an Ultra doesn't show up eventually. ;) This does not, however, say comforting things about an Ultra's pricing. :(
 
digitalwanderer said:
dsw said:
Nvidia just put this up. It answers a lot of questions that people have been asking.

http://www.nvidia.com/page/geforce_7800_faq.html
nVidia said:
Twice the performance over the GeForce 6800 Ultra
I think that's the thing people are having trouble with; from all the reviews it doesn't really seem to be twice as powerful as a 6800 Ultra.

The games aren't exactly highlighting its technical potential. Many are bottlenecked at the CPU, and obviously none are designed specifically for the card. I wouldn't look to benchmarks of current games for an accurate reflection of its potential technical performance, which is what they're referring to there.

With future games, where there's the opportunity to shift the processing load back onto the GPU, you'll see it pull away more, as you'd expect.

So yeah, you're not seeing 2x the performance in most games today, but that doesn't mean it can't be 2x the performance.
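
To put a rough number on that CPU-bottleneck argument, here's a back-of-the-envelope sketch (my own illustration with made-up frame times, not figures from any review): when only part of the frame time is GPU-limited, doubling GPU throughput falls well short of doubling the frame rate.

```c
#include <stdio.h>

/* Amdahl-style toy model of a CPU-bottlenecked game. The frame
 * times below are invented for illustration; a faster GPU only
 * shrinks the GPU-limited portion of the frame. */
int main(void)
{
    double cpu_ms  = 10.0;   /* hypothetical CPU work per frame   */
    double gpu_ms  = 6.0;    /* hypothetical GPU work per frame   */
    double speedup = 2.0;    /* the "twice the performance" claim */

    double before = cpu_ms + gpu_ms;           /* 16 ms -> 62.5 fps */
    double after  = cpu_ms + gpu_ms / speedup; /* 13 ms -> 76.9 fps */

    printf("before: %.1f fps\n", 1000.0 / before);
    printf("after:  %.1f fps\n", 1000.0 / after);
    /* A 2x GPU buys only ~23% more fps in this CPU-heavy case;
     * the gap widens as the GPU's share of the frame grows. */
    return 0;
}
```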
 
DemoCoder said:
The one option that tempts me to buy this card is TSAA. Damn, it looks good, and it fixes the single biggest problem in games today (or at least in the two games I play: HL2/CSS and BF2). Moreover, the technique being used is cheaper than supersampling the entire screen, and can eventually be used to fix other AA problems that multisampling can't fix, like aliasing introduced by shaders, such as the floor specular shaders in HL2.

Kudos to NVidia for offering supersampling options.

I agree, this is the single killer feature. [H] reported that supersampling is buggy in HL2. Not sure about other sites - fingers crossed it's an easy fix.

Jawed
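
For anyone wondering why multisampling can't fix alpha-test aliasing in the first place, here's a toy C sketch of the general idea (my own illustration, not NVIDIA's actual hardware or driver logic): MSAA runs the shader, and thus the binary alpha test, once per pixel and replicates that verdict across all coverage samples, while transparency supersampling re-evaluates the test at each sub-sample position, but only for surfaces flagged as alpha-tested, which is why it's cheaper than full-screen supersampling.

```c
#include <stdio.h>

/* Illustrative sketch only -- not NVIDIA's actual hardware/driver
 * code. Models one pixel that an alpha-tested texture edge cuts
 * through (think fences, foliage, chain link). */

typedef struct { float x, y; } Vec2;

/* Toy "texture": opaque left of x = 0.5, transparent right of it. */
static float sample_alpha(Vec2 uv) { return uv.x < 0.5f ? 1.0f : 0.0f; }

/* Nudge the lookup toward one of 4 sub-sample positions. */
static Vec2 sample_offset(Vec2 uv, int s)
{
    Vec2 r = { uv.x + (s % 2 ? 0.25f : -0.25f) * 0.1f,
               uv.y + (s / 2 ? 0.25f : -0.25f) * 0.1f };
    return r;
}

/* Plain MSAA: the shader (and its alpha test) runs once per pixel,
 * so all coverage samples share one keep/discard verdict. */
static int msaa_coverage(Vec2 uv, int samples)
{
    int keep = sample_alpha(uv) > 0.5f;
    return keep ? (1 << samples) - 1 : 0;   /* all-or-nothing mask */
}

/* Transparency supersampling: the alpha test is re-run per sample,
 * but only for alpha-tested surfaces, so it costs far less than
 * supersampling the entire screen. */
static int tsaa_coverage(Vec2 uv, int samples)
{
    int mask = 0;
    for (int s = 0; s < samples; s++)
        if (sample_alpha(sample_offset(uv, s)) > 0.5f)
            mask |= 1 << s;                 /* per-sample verdict */
    return mask;
}

int main(void)
{
    Vec2 px = { 0.51f, 0.5f };  /* pixel center just past the edge  */
    printf("MSAA mask: %x\n", msaa_coverage(px, 4)); /* 0: hard edge */
    printf("TSAA mask: %x\n", tsaa_coverage(px, 4)); /* 5: partial   */
    return 0;
}
```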
 
Titanio said:
The games aren't exactly highlighting its technical potential. Many are bottlenecked at the CPU, and obviously none are designed specifically for the card. I wouldn't look to benchmarks of current games for an accurate reflection of its potential technical performance, which is what they're referring to there.

With future games, where there's the opportunity to shift the processing load back onto the GPU, you'll see it pull away more, as you'd expect.

So yeah, you're not seeing 2x the performance in most games today, but that doesn't mean it can't be 2x the performance.
You misunderstand me; I'm not saying it doesn't have twice the power, I'm saying it doesn't appear to have twice the power from all the reviews so far.

I'm not knocking the G70; I haven't really read enough about it yet to pass a verdict either way, and even after reading I probably won't... I prefer to actually see 'em in use and hear about them from real users before I really form an opinion.

But right now I can tell ya that the viddy community is a tad underwhelmed by the G70's debut compared to nVidia's claims about it; that's not my opinion of the G70 but my opinion of the community reaction to the G70 so far.

This may just be a fantastic card that there simply isn't any software out there yet to take advantage of, but I can't say for certain yet.

The only thing I know for sure is that I really love their new AA, that alone makes this generation from nVidia a win for me. 8)
 
tEd said:
From the raw shader tests I often see a 50-120% gain over the GF6 Ultra. Of course I can't predict exactly what R520 will offer, but R520 first has to reach that 50-120% gain too.

The raw shader tests in various reviews don't correspond with the performance improvement of the 7800GTX over the 6800U, though. Those raw performance tests are even worse than 3DMark for assessing the gaming performance of cards.

50-100% improvement according to the raw tests translates into 20-50% in games.

Well, I expect at least a 50% clock advantage for R520, but with 16 pipelines the big question is how much the ALU power per pipeline will increase compared to R420. I guess it's around 33-50%.

In a way I hope ATI doesn't go the "superscalar" route, using two equal ALUs per pipeline - but it seems inevitable to me. It's not working for 7800GTX so I don't see how it's going to work for R520. Unless 7800GTX is still being held back by register read bandwidth... Or maybe it's the first ALU's texture duties that are putting the kibosh on it. ARGH.

Jawed
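
As a back-of-the-envelope check on those numbers, relative ALU throughput scales roughly as pipelines x clock x ALU work per pipe per clock. In the sketch below the X850 XT PE's 540MHz/16 pipes and the 7800 GTX's 430MHz/24 pipes are the shipping parts' clocks, but the per-pipe ALU factors and everything about R520 are just the speculation from this thread, not confirmed specs.

```c
#include <stdio.h>

/* Crude relative shader-throughput model: pipes * clock * ALU factor.
 * Clocks are the shipping parts' core clocks; the ALU factors and all
 * R520 numbers are this thread's speculation, not confirmed specs. */
static double throughput(int pipes, double clock_mhz, double alu_per_pipe)
{
    return pipes * clock_mhz * alu_per_pipe;
}

int main(void)
{
    double r420 = throughput(16, 540.0, 1.0);  /* X850 XT PE baseline  */
    double g70  = throughput(24, 430.0, 1.5);  /* 7800 GTX, if the 2nd
                                                  ALU adds ~50% per pipe */

    /* Hypothetical R520: 16 pipes, +50% clock, +33-50% ALU per pipe */
    double r520_lo = throughput(16, 540.0 * 1.5, 1.33);
    double r520_hi = throughput(16, 540.0 * 1.5, 1.50);

    printf("G70  vs R420: %.2fx\n", g70 / r420);       /* ~1.79x      */
    printf("R520 vs R420: %.2fx-%.2fx\n",
           r520_lo / r420, r520_hi / r420);            /* ~1.99-2.25x */
    return 0;
}
```

By this crude measure the speculated R520 lands in the same ballpark as G70, for whatever a paper model is worth.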
 
digitalwanderer said:
This may just be a fantastic card that there simply isn't any software out there yet to take advantage of, but I can't say for certain yet.

So I'll just keep banging my drum -- people would be a lot more eager to buy this card [y'hear me, IHVs?!] if they could put some of that excess power to use in their current favorite games by ramping up (in the control panel, damnit!) AA and AF without taking any fps hit (or a minimal one; or at least not dropping below whatever their personal favorite magical number is). "Eye Candy FOR FREE!" Yippee!! Why is this so hard for them to understand?
 
Jawed said:
Kombatant said:
Jawed said:
R520 isn't going to have to do much to compete with 7800GTX.

Excuse my Engrish, but what does that mean? :)

Just needs to be about 20% faster than X850XTPE. The games 7800GTX shows the biggest gains in tend to be the games where 6800U was well behind.

If ATI's behind on transparent AA then they're buggered, but the performance part looks like it'll be easy.

Jawed

Come on, 20%? :rolleyes: I feel the ATI force is still strong on this bulletin board ;)

[attached image: bf21024.gif]


I read a ton of reviews today, and at eye-candy settings it's mostly ~1.4-1.6x faster than an X850XTPE.
 