The Official G84/G86 Rumours & Speculation Thread

Not really; check the SM2.0 and SM3.0 scores.



And wait for retail cards; a cherry-picked sample doesn't say much about OC headroom.

Why do they test a mainstream VGA with a quad-core CPU in 3DMark06, where the CPU counts for so much in the overall score?

Well, he did test it against a 7900 GTX in the same system, so that levels it out on the CPU front...
It's not likely that a top 90nm G71 can OC as well as an 80nm G84.
http://img11.picsplace.to/img10/24/DSC00046.JPG

7900 GTX results (stock)
http://img11.picsplace.to/img10/24/79266.JPG



Besides, an 800MHz core doesn't look anything like a cherry-picked product.
I'm willing to bet most manufacturers won't settle for the default 675MHz, and boys like XFX, eVGA and BFG will likely sell cards clocked at the 800MHz level, perhaps even with the default single-slot cooler from Nvidia.
Note that, even at 800MHz, the core stays below 60 °C.
 
Yeah, but it comes close, considering the price difference.

The significantly different technical and physical properties (128bit bus, 80nm, single-slot, power consumption, etc) compared to the 8800 GTS (320bit bus, 90nm, double-slot, higher power draw, higher price) would make it appear to be much less powerful than it really is.
Isn't the 8600 GTS supposed to be priced at the higher end of the mainstream segment ($249)? Besides, if you follow that logic, the 320MB GTS is untouchable since it "comes close" to the GTX. :p

Also, the 320MB GTS scores close to 9000, so the performance parity between that and the G84 doesn't surprise me.
 
Isn't the 8600 GTS supposed to be priced at the higher end of the mainstream segment ($249)? Besides, if you follow that logic, the 320MB GTS is untouchable since it "comes close" to the GTX. :p

Also, the 320MB GTS scores close to 9000, so the performance parity between that and the G84 doesn't surprise me.

At first, of course.
But it's not likely that it will stay that way (much like the 7600 GT), because it's much cheaper to produce.
The 8800 GTS doesn't benefit from such cheap amenities as an 80nm core, a 128bit bus and a simple PCB/cooler.

I'd be willing to bet that the 8800 GTS will disappear, replaced by a full G80 GTX core and a "cut in half" G80 GS (still retaining a wider bus than the 8600 GTS).
The Quadro FX 4600 proved that the shorter 8800 GTS PCB can indeed be used as an 8800 GTX if necessary, but a core redesign/shrink may be in order to cut power consumption to sub-150W levels.
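For a rough sense of what such a shrink alone would be worth, here is a back-of-the-envelope sketch assuming a straight 90nm -> 80nm optical shrink (it ignores yield, pad-limited dies and everything else; the bigger savings for G84 are obviously the much smaller die, the narrower bus and the simpler PCB):

# Die area scales roughly with the square of the linear feature size.
# Assumes a straight 90nm -> 80nm optical shrink; ignores yield and pad-limited effects.
area_ratio = (80 / 90.0) ** 2       # ~0.79, i.e. roughly a 21% smaller die
dies_per_wafer = 1 / area_ratio     # ~1.27x more candidate dies per wafer
print(round(area_ratio, 2), round(dies_per_wafer, 2))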
 
Well, he did test it against a 7900 GTX in the same system, so that levels it out on the CPU front...
It's not likely that a top 90nm G71 can OC as well as an 80nm G84.
http://img11.picsplace.to/img10/24/DSC00046.JPG

7900 GTX results (stock)
http://img11.picsplace.to/img10/24/79266.JPG



Besides, an 800MHz core doesn't look anything like a cherry-picked product.
I'm willing to bet most manufacturers won't settle for the default 675MHz, and boys like XFX, eVGA and BFG will likely sell cards clocked at the 800MHz level, perhaps even with the default single-slot cooler from Nvidia.
Note that, even at 800MHz, the core stays below 60 °C.

You wrote 7900 GTX and X1950 XT; the X1950 XT scores more than 2500 in the SM3.0 test.
The 8600 GTS SM2.0 score is impressive, but the SM3.0 one isn't really; a 128bit bus, even with very highly clocked memory, doesn't look like enough for HDR. (This is only one synthetic benchmark, I'll wait for the real tests :smile:)
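Rough bandwidth numbers (just a sketch; the 8600 GTS figure assumes the rumoured ~2GHz effective GDDR3 on the 128bit bus, the other two use stock retail specs):

# Peak memory bandwidth = bus width in bytes x effective data rate
def bandwidth_gb_s(bus_bits, effective_mt_s):
    return bus_bits / 8 * effective_mt_s / 1000.0  # GB/s

print(bandwidth_gb_s(128, 2000))  # 8600 GTS (rumoured): ~32 GB/s
print(bandwidth_gb_s(256, 1600))  # 7900 GTX (stock): 51.2 GB/s
print(bandwidth_gb_s(320, 1600))  # 8800 GTS (stock): 64 GB/s

So even with very fast GDDR3 it would only have about 60% of a 7900 GTX's bandwidth, which would fit the weaker SM3.0/HDR showing.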

About the temperature: was the picture captured after 3DMark finished?
 
You wrote 7900 GTX and X1950 XT; the X1950 XT scores more than 2500 in the SM3.0 test.
The 8600 GTS SM2.0 score is impressive, but the SM3.0 one isn't really; a 128bit bus, even with very highly clocked memory, doesn't look like enough for HDR. (This is only one synthetic benchmark, I'll wait for the real tests :smile:)

About the temperature: was the picture captured after 3DMark finished?

Maybe you're mistaking it for an X1950 XTX.
The X1950 XT I referenced is a slower-clocked model, with only 256MB.

The temperature... yes, I suppose so.
Otherwise, why place the nTune screen there, just to show core and memory speeds?


Remember that these are scores on very, very early drivers, compared against mature cards.
And neither of the other two supports DX10. ;)
 
Yeah, but it comes close, considering the price difference.

I wouldn't go that far. It required a pretty mighty overclock to get there and still didn't make it all the way. It won't be threatening the 8800 GTS, but if that type of OC is common, the 8600 GTS will be a very, very popular card.

That pic of the 7900GTX compared to the 8600GTS really proves that a picture is worth a thousand words. Technology really is advancing at a fantastic pace - it's kinda hard to appreciate that sometimes when you're living in it :)
 
Maybe you're mistaking it for an X1950 XTX.
The X1950 XT I referenced is a slower-clocked model, with only 256MB.

The temperature... yes, I suppose so.
Otherwise, why place the nTune screen there, just to show core and memory speeds?

Yes, I know what the XT is; I remember more than 2500 points in the 3DMark06 SM3.0 test, but I can't find any X1950 XT score with a quad-core CPU now :cry:

I checked the picture again; it looks like the GPU stayed at ~57-60 °C for the whole test. That's nice with an 800MHz GPU clock and the single-slot reference cooler, but for the whole picture we need to know the dB level too.
 
That pic of the 7900GTX compared to the 8600GTS really proves that a picture is worth a thousand words. Technology really is advancing at a fantastic pace - it's kinda hard to appreciate that sometimes when you're living in it :)

Wait for the real tests before you say this; 3DMark06 doesn't say much about real in-game performance :smile:
 
Since there was discussion of the X1950 XT 3DMark score, I decided to run it real quick in case anyone was interested. My X1950 XT at stock speeds on a 3.2GHz Core 2 Duo got these scores:

3DMark Score: 6750
SM 2.0 Score: 2497
SM 3.0 Score: 2869
CPU Score: 2800
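For anyone wondering how much the CPU actually counts: this is the weighting Futuremark published for the 3DMark06 overall score, as far as I remember it (just a sketch, so take it with a grain of salt), and plugging in the numbers above does reproduce the 6750 total:

# 3DMark06 overall score: the graphics score is the mean of the SM2.0 and SM3.0/HDR scores,
# and the total is a weighted harmonic mean of that and the CPU score.
def total_3dmark06(sm2, sm3, cpu):
    gs = 0.5 * (sm2 + sm3)
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)

print(round(total_3dmark06(2497, 2869, 2800)))  # ~6750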
 
Since there was discussion of the X1950 XT 3DMark score, I decided to run it real quick in case anyone was interested. My X1950 XT at stock speeds on a 3.2GHz Core 2 Duo got these scores:

3DMark Score: 6750
SM 2.0 Score: 2497
SM 3.0 Score: 2869
CPU Score: 2800

Thx, was the test run at default clocks?
 
Wait for the real tests before you say this; 3DMark06 doesn't say much about real in-game performance :smile:

Doesn't really matter. You will soon be getting for ~$200 what you paid ~$500 for a year ago. Well, actually, we are already getting that in the X1950 XT today.
 
At first, of course.
But it's not likely that it will stay that way (much like the 7600 GT), because it's much cheaper to produce.
The 8800 GTS doesn't benefit from such cheap amenities as an 80nm core, a 128bit bus and a simple PCB/cooler.
It doesn't, and that's what makes it such a bargain at its current price.

That pic of the 7900GTX compared to the 8600GTS really proves that a picture is worth a thousand words. Technology really is advancing at a fantastic pace - it's kinda hard to appreciate that sometimes when you're living in it :)
:)

I got the same feeling when I read about G73 (a smaller chip outpacing a bigger one). It remains to be seen whether G84 can repeat it.
 
The G84/86 lineup is looking very promising. With the ATI/AMD delays, 2007 is looking to be Nvidia's year.

Then again, unless the RV630 is a top-notch performer. But by the looks of the RV630 XT (dual-slot cooling, seemingly quite big for a midrange card, the power consumption figures, etc.), it might provide a tough challenge to the 8600 GTS.
 
Then again, unless the RV630 is a top-notch performer. But by the looks of the RV630 XT (dual-slot cooling, seemingly quite big for a midrange card, the power consumption figures, etc.), it might provide a tough challenge to the 8600 GTS.

No one has confirmed that the card shown at CeBIT is an RV630 XT.
The PCB looks similar to the X1950 Pro PCB; I still say it's too long for 128bit and doesn't look like a $200 mainstream card. I bet this is a 256bit card, but not any RV630 version, maybe an R600 XL/Pro/GT/GTO/GTO2, who knows what name AMD wants to use now :smile:
 
Then again, unless the RV630 is a top-notch performer. But by the looks of the RV630 XT (dual-slot cooling, seemingly quite big for a midrange card, the power consumption figures, etc.), it might provide a tough challenge to the 8600 GTS.

The whole lineup from ATI looks to be a power hog. Whatever happened to the company that used to rule the laptop GPU industry? Nvidia has totally displaced them there.
 
I'd say it's looking rather efficient based on what you're getting. Their 65nm parts should do rather well, and I'm still under the impression that the 'marginal' performance difference we appear to have between R600 and an 8800GTX is going to disappear as soon as AA/AF get turned on. And it doesn't look like R600 will be using much more power than a GTX. For all we know, ATI has been clocking them conservatively and they're mad overclockers.

I bet the performance gap really widens if you try to watch something in 1080P while playing a game. :runaway:
 
Thx, was the test run at default clocks?

Here is 3DMark06 running at my CPU's stock speed. I have also put in the links for the OC and stock runs. I hope they work, as I have never posted a 3DMark link before. The run I posted yesterday was at stock GPU speed but with an over-clocked CPU, in case I was not clear. Anyway, here are the scores at stock CPU speed.

3DMark Score 6075 3DMarks
SM 2.0 Score 2401 Marks
SM 3.0 Score 2727 Marks
CPU Score 1874 Marks


Here are the links:

CPU OC, GPU Stock: http://service.futuremark.com/orb/resultanalyzer.jsp?projectType=14&XLID=0&UID=8525872

Stock CPU, GPU Stock: http://service.futuremark.com/orb/resultanalyzer.jsp?projectType=14&XLID=0&UID=8537603
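If the weighting sketch posted earlier is right, these numbers check out too: 0.5*(2401+2727) = 2564 for graphics, and 2.5 / ((1.7/2564 + 0.3/1874)/2) ≈ 6074, essentially the 6075 above. Roughly two thirds of the ~675-point drop versus the overclocked-CPU run comes from the lower CPU score alone, the rest from the slightly lower graphics tests.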
 