The Official G84/G86 Rumours & Speculation Thread

Yeah the 8600GTS won't be winning many fans with those kinds of scores. Getting beaten down by 7950GT and X1950 Pro is pretty embarrassing. It'll be making a beeline for $125-$150 post launch if this performance holds up.
 
You know, looking at those scores reminded me all the more why the 6600GT was one of the best cards ever. Nvidia hasn't measured up to its own mark here.
 
This:



Coupled with this, taken from the latest FW 101.70 drivers (notice the order of each G80 card, and how the 8800 Ultra sits below the 8800 GTS with a higher device ID; the GTX has the lowest so far):
////////////////////////////////////////////////////////////////////
// 0190 - NVIDIA G80-400 ?
// 0191 - NVIDIA GeForce 8800 GTX
// 0192 - NVIDIA G80-200 ?
// 0193 - NVIDIA GeForce 8800 GTS
// 0194 - NVIDIA GeForce 8800 Ultra
// 0197 - NVIDIA G80-600 ?
////////////////////////////////////////////////////////////////////
Leads me to believe that the "8800 Ultra" may actually be an even cheaper version of the G80 processor, perhaps with only 64 SPs (but keeping the 320-bit bus), or 96 SPs and a 256-bit bus.

Of course, this is just a guess, since lately "Ultra" has been a synonym for "cheaper" rather than "faster" in Nvidia's vocabulary (650i Ultra, NF4 Ultra, etc.).

Maybe we were led to believe that the 8600GTS isn't the bridge between the 8600 series and the 8800 series.

Then there were the rumours about a possible 8600 Ultra. So possibly nVIDIA is just waiting for the RV630XT, launching its fastest midrange card when ATI releases theirs.

The framebuffer size, I think, also bottlenecks the 8600GTS as you raise the resolution with AA/AF (as seen with the 8800GTS 320MB vs 640MB). So maybe we'll be seeing a 512MB configuration of this card, possibly a bit faster than what the previews are showing us.
 
512MB? I don't think that's the bottleneck; it seems like bandwidth.

Well, framebuffer AND bandwidth. Possibly even the ROPs.
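
As a rough sketch of how quickly AA inflates framebuffer use (back-of-the-envelope only, assuming 32-bit colour and 32-bit depth/stencil stored per sample; real hardware adds compression and padding):

def framebuffer_mib(width, height, aa_samples):
    # colour (4 bytes) + depth/stencil (4 bytes) stored per AA sample
    bytes_per_sample = 4 + 4
    return width * height * aa_samples * bytes_per_sample / 1024 ** 2

for width, height in ((1280, 1024), (1600, 1200)):
    for aa in (1, 4):
        print(f"{width}x{height} {aa}xAA: {framebuffer_mib(width, height, aa):.0f} MiB")

1600x1200 at 4xAA already wants ~59 MiB for one colour+Z surface, before textures and the rest of the swap chain, so the smaller framebuffer runs out of room fast at high resolutions.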

Did nVIDIA make the wrong move with G84? Or make it too unbalanced, just like G72 with its 64-bit memory interface?
 
I don't know yet if they made a wrong move; it all depends on the rest of the lineup. If there are cards between the 8600 GTS and 8800 GTS, they might be in OK shape. If not, AMD can pounce in this area.
 
Yeah the 8600GTS won't be winning many fans with those kinds of scores. Getting beaten down by 7950GT and X1950 Pro is pretty embarrassing. It'll be making a beeline for $125-$150 post launch if this performance holds up.

I don't see why they couldn't price it at $125-150 and still keep good margins. It's got the same bus width as the 6600 GT, a card from two generations back. And it's got a small die, only around 25% larger than its predecessor, if I'm not mistaken.
 
This:



Coupled with this, taken from the latest FW 101.70 drivers (notice the order of each G80 card, and how the 8800 Ultra sits below the 8800 GTS with a higher device ID; the GTX has the lowest so far):
////////////////////////////////////////////////////////////////////
// 0190 - NVIDIA G80-400 ?
// 0191 - NVIDIA GeForce 8800 GTX
// 0192 - NVIDIA G80-200 ?
// 0193 - NVIDIA GeForce 8800 GTS
// 0194 - NVIDIA GeForce 8800 Ultra
// 0197 - NVIDIA G80-600 ?
////////////////////////////////////////////////////////////////////
Leads me to believe that the "8800 Ultra" may actually be an even cheaper version of the G80 processor, perhaps with only 64 SPs (but keeping the 320-bit bus), or 96 SPs and a 256-bit bus.

Of course, this is just a guess, since lately "Ultra" has been a synonym for "cheaper" rather than "faster" in Nvidia's vocabulary (650i Ultra, NF4 Ultra, etc.).


In a previous post in the G80 thread I posted this table:
http://drivers.softpedia.com/get/GRAPHICS-BOARD/NVIDIA/NVIDIA-ForceWare-10058-Beta.shtml said:
NVIDIA_G80.DEV_0190.1 = "NVIDIA G80-400"
NVIDIA_G80.DEV_0191.1 = "NVIDIA GeForce 8800 GTX"
NVIDIA_G80.DEV_0192.1 = "NVIDIA G80-200"
NVIDIA_G80.DEV_0193.1 = "NVIDIA GeForce 8800 GTS"
NVIDIA_G80.DEV_0194.1 = "NVIDIA G80-450"
NVIDIA_G80.DEV_0197.1 = "NVIDIA G80-600"
NVIDIA_G80.DEV_019D.1 = "NVIDIA G80-875"

If I recall correctly, the GTX was called NVIDIA G80-300 and the GTS was called NVIDIA G80-100 before they got their final names. This leads me to believe that the Ultra is a higher-end version.
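
For anyone who wants to poke at a driver package themselves, something like this would pull those strings out of the INF and sort them by device ID (a hypothetical sketch; the file is usually called nv_disp.inf, but names vary per package):

import re

# Match lines like: NVIDIA_G80.DEV_0194.1 = "NVIDIA GeForce 8800 Ultra"
pattern = re.compile(r'NVIDIA_G80\.DEV_([0-9A-Fa-f]{4})\.1\s*=\s*"([^"]+)"')

entries = []
with open("nv_disp.inf") as f:  # path/filename is an assumption
    for line in f:
        m = pattern.search(line)
        if m:
            entries.append((int(m.group(1), 16), m.group(2)))

for dev_id, name in sorted(entries):
    print(f"{dev_id:04X}  {name}")

Sorted that way, the GTX (0191) comes before the GTS (0193), which comes before the Ultra (0194), which is exactly the order the driver listing shows.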
 
Where a card sits in the .inf has no bearing at all on its performance.
 
Why the hell did NVIDIA make the 8600 with a 128-bit bus?! Reviewers have started discarding 1024x768 in tests, so there's surely a lot of bandwidth bottlenecking. I think it's time to make midrange cards with 256-bit buses; the first 256-bit bus appeared in late 2002 on R300, so I guess it's time to move it down to the lower-range cards. It's taking more than four years to reach the midrange! WTF
 
Why the hell did NVIDIA make the 8600 with a 128-bit bus?! Reviewers have started discarding 1024x768 in tests, so there's surely a lot of bandwidth bottlenecking. I think it's time to make midrange cards with 256-bit buses; the first 256-bit bus appeared in late 2002 on R300, so I guess it's time to move it down to the lower-range cards. It's taking more than four years to reach the midrange! WTF

Actually, earlier in 2002, on the Matrox Parhelia.
 
According to this, the G84 (8600 GT/8600 GTS) may actually be using just 32 scalar processors, not 64 as previously thought.

This may reinforce the theory of a bridge product between the 8600 GTS and the 8800 GTS, but we shall see.
 
According to this, the G84 (8600 GT/8600 GTS) may actually be using just 32 scalar processors, not 64 as previously thought.

This may reinforce the theory of a bridge product between the 8600 GTS and the 8800 GTS, but we shall see.

What is that site saying? That an old version of Everest says 64 and the new one says 32? I'd think that incorrect reporting in the new Everest is just as likely as in the old.
 
And one more thing: 256-bit is now very cheap to produce. We have already seen plenty of cards with 256-bit buses at a $200 price tag (GF 5900XT, 6800GS, 7900GS). So please, someone tell me what is holding them back from doing it now!
 
Maybe it has to do with volume and EOL prices? The x600GT parts from Nvidia usually hit ~ $100 at EOL and sell a lot more during their lifetime. Maybe the cost of a 256-bit PCB would put too much additional pressure on margins.
 
I'd be willing to bet that if they get their drivers working properly in Vista, the 8600 and 8400 are going to be huge with Dell, HP and the like. They'll move the 8800GTS 320MB down to fill the price gap after the 8800 Ultra is released. I don't see a huge problem with this, but unless they make a 256-bit SKU of the G80, there will still be a pretty big performance gap around the $200 range. Making a 256-bit SKU of a 700M-transistor chip seems nuts, though.


edit: Really, I was hoping they'd go with a 192-bit SKU somewhere. That seems like it would have been a smart move to keep costs down while adding a great deal of bandwidth; their architecture seems to lend itself to fine granularity in the bandwidth department.
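
For a sense of scale, here's what bus width alone buys you (a quick sketch; the 1GHz GDDR3 clock, 2GHz effective, is just an assumption for illustration):

# Bandwidth in GB/s for a DDR memory interface at various bus widths
MEM_CLOCK_MHZ = 1000  # assumed 1 GHz GDDR3

for bus_bits in (128, 192, 256):
    gbs = MEM_CLOCK_MHZ * 2 * (bus_bits / 8) / 1000
    print(f"{bus_bits}-bit: {gbs:.0f} GB/s")

That's 32, 48 and 64 GB/s respectively, so a 192-bit SKU at the same memory clock would carry 1.5x the bandwidth of a 128-bit card.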
 
I don't think a 256-bit G80 chip is that absurd, though I doubt we'll see one soon. What I think we're looking for is a "Performance" card to bridge the gap between the 8600 GTS and 8800 GTS. Maybe an 8900 GS or 8800 GT, down-clocked with 256-bit memory.

It'd be interesting if a card based on the G80 architecture had a 256-bit memory bus with 256/512MB of VRAM. I wonder how that'd stack up against an 8800GTS, performance-wise and cost-wise.
 
This is basically a 7600GT replacement, right? So I'd have been surprised if they went with more than 128-bit, though I agree it seems about time to move to 192-bit or so for this price range. 128-bit just makes so much more sense to NV in terms of $: bulk OEM sales, mobile packaging, and I guess RAM pricing (1GHz GDDR4 probably still commands a premium). Maybe they're also trying to squeeze more money out of this, as it's the first round of G8x and their first chance to recoup some of their investment. Pity most of us aren't NV, but I doubt they're targeting us forum snobs with the 8600. :)

I wonder if this huge gulf between the initial enthusiast and mainstream cards is due to manufacturing constraints, i.e. whether TSMC and the like simply lack the capacity to manufacture an IHV's full line-up at peak (launch) demand. We might be stuck waiting for the second wave of a generation for the $250-500 SRP "mid-range" enthusiast class, a la RV570, when they can scale back (or rejigger for higher yields) high-end manufacturing to squeeze in what I'd call the mid-range (I'm ignoring the budget cards).
 
So this overclocked 8600GTS has a
  • bandwidth of 36.64GB/s
  • with 8 ROPs clocked at 745MHz, i.e. a fillrate of 5960MP/s
  • AA fillrate 23840MP/s
  • zixel rate of 47680MP/s
  • presumably 16 TMUs for a bilinear rate of 11920MT/s
  • trilinear rate of 23840MT/s
Compared with X1950Pro, which has:
  • 44.16GB/s bandwidth
  • fillrate of 6900MP/s
  • AA fillrate of 13800MP/s
  • and a zixel rate of 6900MP/s (or is it 2x that?)
  • bilinear rate of 10400MT/s
  • trilinear rate of 10400MT/s
So in percentages, GTS v PRO:
  • bandwidth = 83%
  • fillrate = 86%
  • AA fillrate = 173%
  • zixel rate = 691% or 346%
  • bilinear rate = 115%
  • trilinear rate = 229%
If it's only 32 SPs that's about 96GFLOPs, versus (ignoring 40-odd GFLOPs of VS) ~250GFLOPs :oops: Otherwise 64 SPs means 192GFLOPs. Give that a 33% boost (due to the scalar architecture) for ~256GFLOPs. For what it's worth...
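
If anyone wants to check that arithmetic, here's roughly how it falls out (the unit counts and the ~1.5GHz shader clock are just the rumoured/preview figures, so treat this as back-of-the-envelope only):

def bandwidth_gbs(mem_clock_mhz, bus_bits):
    # DDR memory transfers twice per clock; bus_bits/8 bytes per transfer
    return mem_clock_mhz * 2 * (bus_bits / 8) / 1000

def gflops_madd(num_sps, shader_clock_mhz):
    # scalar SPs, one MADD (2 flops) per SP per clock
    return num_sps * shader_clock_mhz * 2 / 1000

# Overclocked 8600 GTS: 745 MHz core, 1145 MHz GDDR3 on a 128-bit bus, 8 ROPs
print(bandwidth_gbs(1145, 128))  # 36.64 GB/s
print(8 * 745)                   # 5960 MP/s fillrate (x4 for AA, x8 for zixels)
print(16 * 745)                  # 11920 MT/s bilinear, assuming 16 TMUs
print(gflops_madd(32, 1500))     # 96 GFLOPs if it's only 32 SPs
print(gflops_madd(64, 1500))     # 192 GFLOPs if it's 64 SPs

# X1950 Pro: 575 MHz core, 690 MHz GDDR3 on a 256-bit bus, 12 ROPs
print(bandwidth_gbs(690, 256))   # 44.16 GB/s
print(12 * 575)                  # 6900 MP/s fillrate (x2 for AA)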

If it's really only 32 SPs, then that really could hurt texturing-heavy code, on top of the bandwidth shortfall.

Shame Hardware Zone didn't check out a harsh area in Oblivion.

Drivers are probably too far out of whack.

It seems pretty unlikely to me that RV630XT will run its memory much faster (if at all), since 1145MHz GDDR3 is very much in the region of the slowest GDDR4.

Jawed
 
It seems pretty unlikely to me that RV630XT will run its memory much faster (if at all), since 1145MHz GDDR3 is very much in the region of the slowest GDDR4.

Jawed

Yeah, I think that's quite a pickle. I wonder if AMD did (or will) rethink switching to GDDR4 for certain parts, or instead use faster GDDR4 or slower GDDR3 than is optimal cost-wise to separate products.

Do you happen to have any idea of the pricing of 1GHz GDDR3 vs. 1.1GHz GDDR4?

I've always been curious about that, considering they appear to perform approximately the same (OC-wise) at default voltages, which are only 0.1V apart.
 