Next NV High-end

Ailuros said:
One shouldn't forget that ATI still has a large delay and, for the time being, lackluster availability for the ultra-high-end and mainstream parts of the market on its back.

Judging from the recent reactions of NV's PR department, I can't sense that they feel particularly threatened right now. Is this whole "Ultra" expectation mostly because a model with such a name is theoretically missing, or because of Fuad's occasionally senseless ramblings?
Well, the R520 had to go through two revisions and was thus delayed almost 4 months. I think the primary reason for no XT availability is that they only managed to fix the problem recently and are now building up inventory.

As for the reasoning behind an Ultra (the GTX's clocks being low compared to what it is capable of, it being bandwidth starved, the single-slot cooler, and only 256 MB of memory), I think the thought has crossed nvidia's mind. Especially now that the R520 actually seems to have headroom of its own in terms of clocks, plus efficiency improvements through the memory controller and future driver updates.
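A quick back-of-envelope makes the "bandwidth starved" point concrete. The specs below are the commonly published stock figures for the 6800 Ultra and 7800 GTX, and the bytes-per-texel framing is just a rough illustration, not an official metric:

```python
# Rough "bandwidth per unit of fillrate" comparison, using published
# stock specs. The bytes-per-texel framing is only an illustration of
# the "bandwidth starved" point, not an official metric.

def bytes_per_texel(pipes, core_mhz, bus_bits, mem_mhz_effective):
    texel_rate = pipes * core_mhz * 1e6                  # texels/s (1 TMU per pipe)
    bandwidth = bus_bits / 8 * mem_mhz_effective * 1e6   # bytes/s
    return bandwidth / texel_rate

print("6800 Ultra:", round(bytes_per_texel(16, 400, 256, 1100), 2), "bytes/texel")
print("7800 GTX:  ", round(bytes_per_texel(24, 430, 256, 1200), 2), "bytes/texel")
# ~5.5 vs ~3.7 bytes of bandwidth per texel: the GTX grew fillrate far
# faster than bandwidth, which is why faster memory on an "Ultra" is tempting.
```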
 
ANova said:
Well, the R520 had to go through two revisions and was thus delayed almost 4 months. I think the primary reason for no XT availability is that they only managed to fix the problem recently and are now building up inventory.

As for the reasoning behind an Ultra (the GTX's clocks being low compared to what it is capable of, it being bandwidth starved, the single-slot cooler, and only 256 MB of memory), I think the thought has crossed nvidia's mind. Especially now that the R520 actually seems to have headroom of its own in terms of clocks, plus efficiency improvements through the memory controller and future driver updates.

I'm taking a far wider perspective on this one. Take the release date of the G70 and the possible release date of a 90nm refresh; I don't know if another intermediate refresh between those two actually makes sense, and that is what I have, or had, in mind (the exception being larger framebuffers).
 
So...

G73 instead of NV43
G72 instead of NV45

G74 instead of NV44?
G71 instead of G70? (90nm die shrink?)
G75 instead of G71? :) (90nm + 32PP/10VS?)

Too many codenames... :D
 
If they are going towards a reference design for SLI-on-one-board, so that it essentially becomes "the top-end SKU", one might wonder if they really are going to bolt on more quads.

The DigiTimes piece specifically said "smaller", though I suspect even an 8-quad 90nm G7x would be smaller than the GTX.
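For a rough sanity check on "smaller even with more quads", the sketch below assumes a roughly 330 mm^2 G70, an ideal (90/110)^2 optical shrink, and a guessed 25% area growth for two extra quads; none of these figures come from the DigiTimes piece:

```python
# Back-of-envelope die-area estimate for a hypothetical 8-quad 90nm G7x.
# Assumptions (not from the thread): G70 on 110nm is roughly 330 mm^2,
# an ideal optical shrink scales area by (90/110)^2, and going from
# 6 to 8 quads grows the die by ~25% (quads are only part of the die,
# so this is a guess, not a measurement).

G70_AREA_110NM = 330.0                 # mm^2, approximate published figure
SHRINK_FACTOR = (90.0 / 110.0) ** 2    # ideal area scaling, ~0.67
EXTRA_QUAD_GROWTH = 1.25               # assumed growth for 2 extra quads

estimated_area = G70_AREA_110NM * SHRINK_FACTOR * EXTRA_QUAD_GROWTH
print(f"Estimated 8-quad 90nm die: {estimated_area:.0f} mm^2 "
      f"vs ~{G70_AREA_110NM:.0f} mm^2 for the 110nm GTX")
# -> roughly 275 mm^2, i.e. still smaller than the current GTX die,
#    consistent with the "smaller" claim even with more quads.
```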
 
Overvolting can have other ramifications, the most obvious being that it can change the MTBF. While an overvolting test can produce nice increases in clocks, it doesn't test other elements such as this, which a manufacturer does have to take into account.
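As a rough illustration of why that matters to a manufacturer, here is a minimal sketch using a textbook-style exponential voltage-acceleration model; the gamma value and the baseline MTBF are made-up illustrative numbers, not anything from NVIDIA:

```python
import math

# Illustrative sketch of how overvolting can eat into MTBF.
# One common textbook simplification treats failure acceleration as
# roughly exponential in the voltage increase: AF = exp(gamma * delta_V).
# Both gamma and the baseline MTBF below are made-up illustrative values.

GAMMA = 8.0                  # assumed voltage-acceleration factor per volt
BASE_MTBF_HOURS = 100_000    # assumed rated MTBF at stock voltage

def derated_mtbf(base_mtbf, overvolt):
    """Rough MTBF after raising the core voltage by `overvolt` volts."""
    acceleration = math.exp(GAMMA * overvolt)
    return base_mtbf / acceleration

for dv in (0.0, 0.1, 0.2, 0.3):
    print(f"+{dv:.1f} V -> ~{derated_mtbf(BASE_MTBF_HOURS, dv):,.0f} h")
# Even a modest bump compounds quickly, which is the kind of trade-off
# an overclocking test on one board never captures.
```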
 
trinibwoy said:
Seems like they could easily bring out a significantly higher clocked Ultra on 110nm if they wanted to (and possibly throw their power consumption advantage in the trash)

http://www.vr-zone.com/?i=2825&s=7

The performance seems to scale badly on the GeForce 7800 GTX. It's nearing Radeon X1800 XT clocks and, even with 2 extra quads, still only beats it by so little. Of course, the extra 256 MB of RAM on the X1800 XT may help it.

I am almost certain (hopefully) that the GeForce 7800 GTX would scale better in games at those clock speeds.
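As a rough illustration of the raw-throughput picture behind that scaling argument: the stock X1800 XT specs below are the published ones, while the overclocked GTX clocks are only an assumption of what "nearing X1800XT clocks" might mean, not the figures from the VR-Zone test:

```python
# Sketch of the raw-throughput picture behind "scales badly".
# Stock specs are the published ones; the overclocked GTX core/memory
# clocks are an assumption, not VR-Zone's numbers.

def texel_rate(pipes, core_mhz):       # Gtexels/s, one TMU per pipe
    return pipes * core_mhz / 1000.0

def bandwidth(bus_bits, mem_mhz_eff):  # GB/s
    return bus_bits / 8 * mem_mhz_eff / 1000.0

gtx_oc  = (texel_rate(24, 600), bandwidth(256, 1400))  # assumed OC clocks
x1800xt = (texel_rate(16, 625), bandwidth(256, 1500))  # stock X1800 XT

print(f"OC GTX (assumed): {gtx_oc[0]:.1f} Gtex/s, {gtx_oc[1]:.1f} GB/s")
print(f"X1800 XT (stock): {x1800xt[0]:.1f} Gtex/s, {x1800xt[1]:.1f} GB/s")
# A large texel-rate edge but a bandwidth deficit (and half the memory)
# is consistent with core clock increases not translating into
# proportional gains in that test.
```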
 
Dave Baumann said:
Mean Time Between Failures.

Ohhh...

That's definitely a problem, but it's essentially what all oc'ers have been doing for years, so I don't think that'll be a showstopper for most buyers.

But again, I don't believe nV will do that; it would be plain stupid IMHO.
 
Dave Baumann said:
Overvolting can have other ramifications, the most obvious being that it can change the MTBF. While an overvolting test can produce nice increases in clocks, it doesn't test other elements such as this, which a manufacturer does have to take into account.

Isn't that true for any piece of hardware though? Only Nvidia knows if the voltage on the GTX is the maximum possible before MTBF drops to unacceptable levels. What if the GTX is currently undervolted?
 
trinibwoy said:
Isn't that true for any piece of hardware though? Only Nvidia knows if the voltage on the GTX is the maximum possible before MTBF drops to unacceptable levels. What if the GTX is currently undervolted?

Typical hardware like this is usually designed to run in a certain manner... it's highly unlikely that they have much headroom built in as a fail-safe against what ATI might do (fail-safe meaning they designed the GTX with the ability to ramp up clocks/voltage to respond to competitors' products). It's not impossible for them to do this... just very unlikely.
 
Pressure said:
The performance seems to scale badly on the GeForce 7800 GTX. It's nearing Radeon X1800 XT clocks and, even with 2 extra quads, still only beats it by so little. Of course, the extra 256 MB of RAM on the X1800 XT may help it.

I am almost certain (hopefully) that the GeForce 7800 GTX would scale better in games at those clock speeds.

If the increase in memory speed is more or less in line with the core frequency increase, then most likely yes.
 
_xxx_ said:
it's essentially what all oc'ers have been doing for years
Most OCers simply increase the clock rate, not the core voltage, no? Increasing voltages typically involves hardware mods (like Xbit or iXBT are wont to do) or possibly flashing BIOSes, which I don't believe most OCers do. Maybe things have changed, though. They may well change if ATI's X1000-series OCing tool, with its voltage control, is released.

Hmmm, just read that VR-Zone story. The clock gains from the voltage increases don't seem worthwhile. The gains are decent at 1.5V and less so at 1.6V, but the last two .1V increases don't seem worth it. Then again, people who buy top-end video cards probably don't keep them beyond two years, and I expect even the 1.8V mod would last for at least that long.
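One quick way to see that diminishing-returns pattern is to compute the marginal clock gain per voltage step; the (voltage, clock) pairs below are placeholder values chosen just to show the calculation, not the actual figures from the VR-Zone article:

```python
# Marginal clock gain per 0.1 V step. The (voltage, clock) pairs are
# hypothetical placeholders, NOT the VR-Zone results.

steps = [(1.4, 490), (1.5, 550), (1.6, 580), (1.7, 595), (1.8, 605)]

for (v0, c0), (v1, c1) in zip(steps, steps[1:]):
    print(f"{v0:.1f} V -> {v1:.1f} V: +{c1 - c0} MHz")
# If the per-step gain keeps shrinking like this, the last couple of
# 0.1 V bumps buy very little extra clock for a lot of extra stress.
```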
 
jb said:
Typical hardware like this is usually designed to run in a certain manner... it's highly unlikely that they have much headroom built in as a fail-safe against what ATI might do (fail-safe meaning they designed the GTX with the ability to ramp up clocks/voltage to respond to competitors' products). It's not impossible for them to do this... just very unlikely.

I'm not sure what you're basing this on exactly. How can you be certain that Nvidia designed the G70 with the performance of the GTX SKU in mind?
 