Next NV High-end

Uttar said:
When I say G7x, it's because I'm unsure whether the proper codename is G73, G74, or G75. I have yet to hear or read anything from anyone that's more than guesswork on this subject. Don't expect G7x before Q1 2006, most likely "late Q1 2006" or even early Q2 in fact. Mid-February is the absolute minimum.

I hope they call it "G75", just because. . .well, just because I like a little traditional sanity to hang onto. :smile:

$199: GeForce 7600 (12PS)
$299: GeForce 7600 Ultra (16PS)

Hmm!! 16 you say? That's "breaking news" isn't it? One could conjure a bit with that. . .

Thanks as always, a tiny drop o' affection left in your rep account. . .
 
Nah, too busy with other stuff (last year of high school, working on a kinda-next-gen graphics/networking engine, etc. etc.) - heck, I shouldn't even be wasting this much time on this forum :p That, and I don't have my sources anymore, I only get tidbits here and there nowadays :smile:

Geo: I don't think that's really breaking news; in fact, I remember another reliable person hinting at it on this very forum. I think something quite significant that I hadn't noticed before getting the necessary info together is that unlike in, say, the FX era, NVIDIA isn't trying to fit all market segments into the GF7. This is just speculation, but I would assume this to be beneficial to NVIDIA, as it gives the lineup more of a "high-end" aura than otherwise; there won't be anything similar to the X1300LE in there, for a while at least (4PS GF7200?). Obviously, they could have named their integrated chipset GeForce 7000/7050 or whatever, but they chose not to. Unlike in the case of the GFFX, GF3 and GF1, their old product line isn't going to get "phased out" quite as fast.

Another reason for this might be SLI: they have to make sure there remains some availability of their past chips for a slightly longer time, including the 6600, which doesn't seem to have a real spot in the product line anymore. Clearly, I'd say the X1300LE will have to fight the 6200 non-TC, the X1300 the 7200, and the X1600 will have to compete with the G71, and possibly the high-end G72 for the low-end model. It's gonna be a tough race, and the specs alone can't give us a clear winner (yet, let's revisit that statement post-R520 launch).


Uttar
 
I must have missed the hint, or didn't take it seriously enough. It's always nice to get the more direct, flat statement anyway, when possible (as of course sometimes it is not). ;)

Obviously (to me, at least) NV is better positioned to leave some of the current line-up in place, particularly down the range. They are already SM3 parts, after all, and G7x doesn't bring all *that* much new to the party feature-wise to make folks hold their nose at NV4x.

ATI on the other hand. . .I suspect they'll try to transition by price point (the big gaps in the Hexus prices point at it, for one), but I also think that once folks get a look at R5xx parts, not too many who lean ATI-wards will be all that eager to buy R4xx parts any more, even short-term, other than at "wow, what a steal" prices. And with R580 apparently right around the corner. . .maybe even sooner than G75 (ah, screw it, I'm using "G75" from now on and daring somebody to correct me with the real name if they like!), hopefully the ATI transition from SM2 to SM3 over the entire lineup will be relatively short-lived.
 
I'm not sure about codenames myself, but there might be some truth in this info even if the codenames are misplaced ;)

And I don't expect any G70 refreshes this year. I even doubt that there will be an Ultra variant.
 
DegustatoR said:
And I don't expect any G70 refreshes this year. I even doubt that there will be an Ultra variant.
Well, whatever they name it is up to them. For all I know they might decide to keep the GTX sticker on it, although I doubt that heavily. All I know with relatively good reliability is that, unless the R520 is a flop, NVIDIA is going to release another high-end G70 variant with quite different clockspeeds for the different configurable parts of the chip.

Uttar
 
DegustatoR said:
And I don't expect any G70 refreshes this year. I even doubt that there will be an Ultra variant.
Well, there certainly won't be any significant changes this year, but I think a higher-clocked part wouldn't be out of the question.
 
It really depends on what (and _when_) ATI's gonna offer with the R520XT. If it won't be much faster than the G70 GTX (or if it will be faster right to the point where the G70 can't compete with it at any clockspeed), then I suppose NV will do something like the 'X850 silence' thing again and go with a 90nm 32PS G7x in spring '06 instead of introducing an overclocked G70 before Christmas...
 
What's the point in increasing the clocks, really? Almost all vendors ship their cards at between 450-500MHz, and many with memory at 1300MHz+. A 7900GTX @ 90nm, 32/10, with memory at 1600MHz should be perfect to finish off this series, then launch NV50 or G80 in late '06.
 
overclocked said:
What's the point in increasing the clocks, really? Almost all vendors ship their cards at between 450-500MHz, and many with memory at 1300MHz+. A 7900GTX @ 90nm, 32/10, with memory at 1600MHz should be perfect to finish off this series, then launch NV50 or G80 in late '06.
But vendors are still selling their GTXs with the same layout. nVidia may produce a different reference board layout for the Ultra that has better power regulation and less capacitance in the memory bus, allowing for higher clockspeeds all around. Combined with improved yields on the GTX parts, this could lead to a 20% increase over stock speeds, and better overclockability for enthusiasts.
 
Chalnoth said:
nVidia may produce a different reference board layout for the Ultra that has better power regulation and less capacitance in the memory bus, allowing for higher clockspeeds all around. Combined with improved yields on the GTX parts, this could lead to a 20% increase over stock speeds, and better overclockability for enthusiasts.
I think the problem is a power surge in the flux capacitors caused by errant tachyons.

-FUDie
 
Chalnoth said:
But vendors are still selling their GTXs with the same layout. nVidia may produce a different reference board layout for the Ultra that has better power regulation and less capacitance in the memory bus, allowing for higher clockspeeds all around. Combined with improved yields on the GTX parts, this could lead to a 20% increase over stock speeds, and better overclockability for enthusiasts.

Well, that's one direction, especially with the memory bus, but the current layout seems to handle "stable" speeds of around 1350MHz, and that's with 600MHz-rated memory.
So it's not that far-fetched that it would handle 700MHz 1.4ns memory at stock.

But my current understanding is that ATI is using 1.2ns-rated memory models, so it would likely require a different layout anyway, as I think nVidia will go for the same effective speeds. So I agree with you.
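
(Side note on the rated-vs-effective arithmetic above, since ns figures and MHz figures get mixed freely in these threads: the ns rating is the chip's cycle time, the clock is its inverse, and DDR doubles that for the effective rate. A quick Python sketch, assuming the 7800 GTX's 256-bit bus; the specific ratings are just the ones mentioned here.)

```python
# Rough GDDR3 arithmetic: the ns rating is the cycle time, so clock = 1/ns,
# and double-data-rate transfers give twice that as the "effective" speed.
# The 256-bit bus matches the 7800 GTX; the ratings are from this thread.

BUS_BITS = 256

def stats(ns_rating):
    clock_mhz = 1000.0 / ns_rating        # e.g. 1.4ns -> ~714MHz
    effective_mhz = clock_mhz * 2         # DDR: two transfers per clock
    bandwidth_gb_s = effective_mhz * 1e6 * (BUS_BITS // 8) / 1e9
    return clock_mhz, effective_mhz, bandwidth_gb_s

for ns in (1.6, 1.4, 1.2):
    clk, eff, bw = stats(ns)
    print(f"{ns}ns: ~{clk:.0f}MHz clock, ~{eff:.0f}MHz effective, ~{bw:.1f} GB/s")
```

So the jump from today's 1200MHz-effective stock speed to 1.2ns parts would be roughly 40% more bandwidth on the same bus width.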
 
I'm pretty sure the next high-end card is going to be a 512MB card, preferably right after the R520XT release. They cannot afford to have a high-end product line without a 512MB version. The 8x.00 ForceWare drivers, with the ability to mix and match cards with different memory configurations in SLI, are probably the first step.
So, 512MB on the current G70 or on a 90nm version? The question remains.

edit: and then came the Inq with this message: http://www.theinq.com/?article=26595
A GTX with 512MB, questions answered.
 
Blah, Fuad just guessing yet again. No reason to consider that any evidence for or against a 512MB 7800 GTX. But given past history, a 512MB GTX is likely.
 
Chalnoth said:
Blah, Fuad just guessing yet again. No reason to consider that any evidence for or against a 512MB 7800 GTX. But given past history, a 512MB GTX is likely.

I don't think it has anything to do with history. Nvidia needs a 512MB high-end card NOW. I thought the current GTX PCB had room for more RAM on both sides? Didn't Dave comment on the layout a while back?
 
Chalnoth said:
Blah, Fuad just guessing yet again. No reason to consider that any evidence for or against a 512MB 7800 GTX. But given past history, a 512MB GTX is likely.
Yeah, the board shouldn't need a redesign seeing as it already has the traces (and pads) for another 256MB of GDDR3.
 
trinibwoy said:
I don't think it has anything to do with history. Nvidia needs a 512MB high-end card NOW. I thought the current GTX PCB had room for more RAM on both sides? Didn't Dave comment on the layout a while back?
Why does nVidia need a 512MB card now?

And by past history, I mean that nVidia has very frequently released an "update" to a high-end card with more memory.
 
Chalnoth said:
Why does nVidia need a 512MB card now?

And by past history, I mean that nVidia has very frequently released an "update" to a high-end card with more memory.

Given the pricing of the 512MB X1800XT, they can't afford to leave their flagship with "last generation" framebuffer space. And we all know how heavily marketing-driven Nvidia has been in the past. Not to mention the possibility that the extra memory proves genuinely valuable in upcoming titles.

Of course, all of this is assuming that ATi can provide their flagship in volume at <= MSRP pricing.
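
Just to put rough numbers on "framebuffer space" (back-of-the-envelope only; the resolutions, 32-bit color, and 24/8 depth/stencil are my own assumptions, not anything from either vendor):

```python
# Back-of-the-envelope render-target sizing. Assumes 32-bit color, 24-bit
# depth + 8-bit stencil, a multisampled back buffer storing one color and
# one depth value per sample, plus a resolved front buffer.

def framebuffer_mb(width, height, aa_samples):
    color_bpp = 4                                    # 32-bit color
    depth_bpp = 4                                    # 24-bit depth + 8-bit stencil
    msaa = width * height * aa_samples * (color_bpp + depth_bpp)
    front = width * height * color_bpp               # resolved front buffer
    return (msaa + front) / 2**20

for w, h, aa in ((1280, 1024, 4), (1600, 1200, 4), (2048, 1536, 4)):
    print(f"{w}x{h} {aa}xAA: ~{framebuffer_mb(w, h, aa):.0f} MB of render targets")
```

Even 2048x1536 with 4xAA only comes to ~110MB of render targets, so most of what a 512MB card buys you would be headroom for textures and render-to-texture surfaces rather than the framebuffer proper.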
 