The Official G84/G86 Rumours & Speculation Thread

So the default clocks shown by ATI tool in Shamino's piece are not default?

He probably had access to a factory-overclocked model, which won't be an uncommon sight, and on top of that he still managed a further overclock.
If the core is really that good clockspeed-wise, I bet most manufacturers will have at least one "extreme" model ready really soon, unlike what happened with the 8800s.
 
Yeah, Coolaler showed us it tops out around 800 on the stock cooler and stock volts. We've seen models ranging from 675 to 720, or in this case 745. Regardless, around 800MHz sounds like where it'll go, and factory-overclocked cards will be clocked below that.

That being said, that has to be the easiest v-mod to 1.5v I've ever seen in my life. Yay! I don't have to buy a buddy a 6-pack to get some volts (because my hands, they be shaky.)

I was waiting for RV630XT with the intention of it becoming a physics card later in the year, but that looks pretty damn impressive. I too will wait the obligatory month to see how RV630 fares, but it hit 7K with mods I can do (and with Shammy only using 2.4 and 3.2GHz CPU clocks)... for ~$200 it's looking good. It can actually be on par with last gen's high-end... with a little help.
 
Yeah, it definitely should help by lowering voltage, as I was reminded privately. I wouldn't say it's pure software though, power states might also require some hardware I guess.
Sure, it does require hardware support, but to clarify what I meant... you already need to be able to run lower clocks for debug so most of the hardware support should be a given. Then the only issue is making it dynamic. That's why I'm thinking it's mostly a software issue.
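To illustrate the "mostly software" argument above: if the hardware can already run at multiple clock levels, the dynamic part is just the driver picking a state based on load. A minimal sketch, with entirely hypothetical state names and thresholds (nothing here is NVIDIA's actual scheme):

```python
# Hypothetical driver-side dynamic clocking: the hardware supports several
# clock levels already (e.g. for debug); "making it dynamic" is choosing one
# at runtime. All state names, clocks, and thresholds are illustrative.
CLOCK_STATES_MHZ = {"2d": 200, "low_3d": 400, "full_3d": 675}

def select_clock(gpu_utilization: float) -> int:
    """Pick a core clock (MHz) from the predefined states by utilization."""
    if gpu_utilization < 0.10:
        return CLOCK_STATES_MHZ["2d"]       # desktop / idle
    elif gpu_utilization < 0.60:
        return CLOCK_STATES_MHZ["low_3d"]   # light 3D load
    return CLOCK_STATES_MHZ["full_3d"]      # flat out
```

The hardware-side requirement is only that transitions between these states are glitch-free, which is the part that can't be patched in software.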
 
The starting point is 675MHz, not 745.

It's still darn impressive, given that the stock core clock of the 8600GTS is 675MHz compared to last gen's 560MHz (7600GT). A big jump in my opinion, and on top of that DX10, 64 unified shaders (rumour), and single-slot cooling.

This card sounds really nice. Wonder how the GT fares in OCing.
 
[attached image: 15.jpg]

http://www.vr-zone.com/?i=4875&s=1
;)
 
Wonder how the GT fares in OCing.

I imagine the core will do well, unless they turn the volts down (which is a definite possibility... but that's what mods are for). I suppose that is the important thing. The killer, though, iirc, is the 1.4ns RAM, topping out at maybe <1600MHz for a total of <25.6GB/s. Why won't that stuff just die already? It's been used on the mid-range for what seems like forever, and it's never done much better than its speed bin, unlike the GDDR3 that came before and after it; they've always seemed to bin the hell out of it. :p

That top-of-the-line GDDR3 (1.0ns, 2000MHz) sounds pretty exciting though, as the highest bin always seems to have substantially more wiggle room, and this seems to be no exception, even though shortly before this GDDR3 seemed almost EOL. 10-25%+ overclocking sure helps a card with a 128-bit bus keep its usefulness next to the big boys with substantially more bandwidth. 35-40GB/s+? That almost earns its title as half an 8800GTX.
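The bandwidth figures above follow from straightforward arithmetic: peak bandwidth is the effective data rate times the bus width in bytes. A quick sketch (the clock numbers are the estimates from the post, not official specs):

```python
def mem_bandwidth_gbps(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# 1.4ns GDDR3 at ~1600MHz effective on a 128-bit bus:
#   1600e6 transfers/s * 16 bytes = 25.6 GB/s
# 1.0ns GDDR3 at 2000MHz stock gives 32 GB/s; a 10-25% overclock
# (2200-2500MHz effective) lands in the 35-40 GB/s range quoted above.
```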

Also, in reality, isn't GDDR3 at said speed faster than GDDR4 at the same speed because of the way the bits are divided (or something)? It's been a while since I researched the workings of memory... Wonder if that's a reason ATi seems to be refocusing on a GDDR3 R600... Perhaps GDDR3 yields are better (and cheaper) than they expected?

Curious to see how the 512MB version does, and where it's priced. This might actually be a card in a performance bracket where 512MB is worth getting... if it's not priced too close to the 8800GTS.

Looking forward to that good ol' B3D review, gentlemen :D

Edit: I actually focused more on this pic:

[attached image: 17.jpg]


Sane V-mod that might actually work with the stock cooler...and the crazy extra voltage didn't seem to help that much.
 
Also, in reality, isn't GDDR3 at said speed faster than GDDR4 at the same speed because of the way the bits are divided (or something)?
Nope, since in both cases the quoted speeds are effective data rates. GDDR3 and GDDR4 clock the DRAM core differently, as you remember, but the effective frequency is still the same measure for both.
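To put numbers on that: both memory types transfer data twice per I/O clock; what differs is the internal prefetch depth (4n for GDDR3, 8n for GDDR4), which only changes how slowly the DRAM core itself runs relative to the pins. A small sketch, assuming those standard prefetch depths:

```python
# Why "same effective speed" means same bandwidth for GDDR3 vs GDDR4.
# Both signal DDR (two transfers per I/O clock); prefetch depth (4n GDDR3,
# 8n GDDR4) only determines the internal core clock, not the pin data rate.
def clocks_from_effective(effective_mhz: float, prefetch: int):
    io_clock = effective_mhz / 2           # DDR: two transfers per I/O clock
    core_clock = effective_mhz / prefetch  # deeper prefetch -> slower core
    return io_clock, core_clock

# At 2000MHz effective: GDDR3's core runs at 500MHz, GDDR4's at 250MHz,
# but the pins move data at the same 2000 MT/s either way.
```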
 
You'd expect higher latency with GDDR4, though, but GPUs don't care much. CPUs want low latency/low bandwidth (that's why DDR2 fared worse than DDR1 at the beginning); GPUs are pleased with high latency/high bandwidth.
 
Latest from The Inquirer:

http://www.theinquirer.net/default.aspx?article=38654

WE GOT UPDATES on the possible delay of G84 and G86. It looks like the April 17th date is on, but if they are going to launch with parts available on that day, you might be better off avoiding them.

The problem, as we understand it, is that the 2D modes are not clocked down to where they should be. NV has a 2D clock that is a lot lower than the 3D clock, which saves battery power and, in general, makes things run cooler and quieter.

The bug, we are told, prevents them from clocking 2D down to a level lower than 3D, increasing power substantially. Basically, when the parts should be clocked down in 2D, where they spend most of their life, they instead run flat out.
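The power cost of that bug can be ballparked with the usual dynamic-power relation, P ∝ C·V²·f: staying at the 3D clock costs at least linearly in frequency, and more if the 2D state would also have dropped voltage. A rough sketch with entirely hypothetical clock/voltage figures:

```python
# Rough dynamic-power comparison (P ~ C * V^2 * f). The reference 675MHz/1.3V
# and the 200MHz/1.1V 2D state below are illustrative numbers, not specs.
def relative_power(freq_mhz: float, volts: float,
                   ref_freq: float = 675.0, ref_volts: float = 1.3) -> float:
    """Dynamic power relative to the reference clock/voltage state."""
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

# A hypothetical 200MHz/1.1V 2D state draws ~0.21x the dynamic power of
# running flat out at 675MHz/1.3V -- nearly a 5x difference on the desktop.
```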
 
Same thing again, just as with G80, most info about the cards is out in the open way before launch. NV should ask DAAMIT for advice. :smile:
 
You'd expect higher latency with GDDR4, though, but GPUs don't care much. CPUs want low latency/low bandwidth (that's why DDR2 fared worse than DDR1 at the beginning); GPUs are pleased with high latency/high bandwidth.
Well, higher latency does require more GPU die area, as you need bigger FIFO buffers, and have to have more pixels in flight to hide the latency for read-modify-write operations (blending).
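The buffering cost scales as the classic bandwidth-delay product: to keep the bus busy you need latency times bandwidth bytes in flight, which is what the bigger FIFOs pay for. A sketch with illustrative numbers (the 400ns figure is a made-up round-trip, not a measured GDDR4 latency):

```python
# Bandwidth-delay product: data that must be in flight (buffered in FIFOs,
# tracked as outstanding pixels) to hide memory latency at full bandwidth.
def bytes_in_flight(latency_ns: float, bandwidth_gbps: float) -> float:
    """ns * GB/s conveniently cancels to plain bytes."""
    return latency_ns * bandwidth_gbps

# e.g. a hypothetical 400ns round-trip at 25.6 GB/s -> ~10KB that the GPU
# must keep in flight; higher GDDR4 latency grows this (and the die area).
```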
 