NVIDIA's Last Minute Effort - 6850

Well, if the rumors are true and IBM only made their 10,000 NV40s, and TSMC is really busy, and the "man who should step down" says low-k is really, really scary, then I would go with FSG at 11....
 
Well, if this rumor is true, will we see something like the 5800 => 5900 transition? A new chip three months later?

Frankly speaking, the X800XT is superior to the 6800U, but the 6800U is still a good chip (unlike the 5800U). It would be a waste from my point of view.
 
Welcome to the forums, OM. Now let's get to it. ;)

Operation Mindcrime said:
What I have a hard time understanding is why aren't the ATI fanbois mad? By ATI's own admission, the X800XTP is really just a tweaked, optimized and refined R300. Anyone who paid $400-$500 for a 9800XT now has to pay another $400-$500 to get the card that the 9800XT should have been in the first place... :oops:
Why aren't Willamette owners angry at Intel, or GF1 owners at nV? I guess b/c they know tech progresses rapidly.

This raises the question of what a refreshed 6800U on low-k might be capable of...
The real question, IMO, is whether nV can get to grips with low-k at all.

Anyway, I think the X800XT looks better at $500, and the 6800GT may look slightly better than X800P (taking into account SSAA, less texture shimmer [per THG], SM3.0 [I'm ever an optimist], and OGL performance) at $400.
 
Doomtrooper said:
John Carmack does, which is kind of like a big ' :LOL: ' since OpenGL is supposedly his favorite API, yet these types of threads show the truth... no?

http://www.beyond3d.com/forum/viewtopic.php?t=11712

You have to remember that Carmack has an extremely "Carmack-centric" view. Everything he talks about is focussed on whatever engine he is working on. Given that Nvidia actively and aggressively support whatever Carmack needs in their drivers (and even their hardware now), it's no surprise that Carmack has a heavy preference for Nvidia.
 
Bjorn said:
karlotta said:
there wont be a low-k 6800u

How about low-k NV45 then ?

That's the general assumption. However, when you think about it, if yields are bad because of NV40's die size, then you have to question what effect low-k will have - the process is likely to be more expensive in the first place, and it's not going to help yields (especially given NVIDIA's trepidation over it). Using 110nm would be an option to get the die size down and improve the yield purely by virtue of more die per wafer.
 
techreport


Unfortunately, the 61.11 beta drivers we received this past Friday night didn't behave as expected. We ticked the checkbox to disable trilinear optimizations, but our image quality tests showed that the driver didn't disable all trilinear optimizations in DirectX games. I did have time to check at least one OpenGL app, and trilinear optimizations were definitely disabled there. We will show you the image quality impact of the 61.11 drivers' odd behavior in the IQ section of the review.

NVIDIA's Tony Tamasi confirmed for us that this behavior is a bug in the 61.11 drivers' control panel, and says it will be fixed...

NV releases a beta driver for hardware sites to benchmark with, and it mysteriously disables the trilinear-optimization disabler in some games. This kind of NV garbage is amazing. A control panel bug?!
 