AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks: 1 vote (0.6%)
  • Within a month: 5 votes (3.2%)
  • Within a couple of months: 28 votes (18.1%)
  • Very late this year: 52 votes (33.5%)
  • Not until next year: 69 votes (44.5%)

  • Total voters: 155
  • Poll closed.
Looking from another perspective, comparing it to the HD 3870, which was faster than the HD 3850, which was faster than the HD 4670, which was terribly limited by memory BW, but let's leave that for another thread :)

Compared to the HD 3870 it has 25% more ALUs and TMUs, the same Z-only rate, half the ROPs, and about the same BW, so 25% more performance at the same clock? :)

Or comparing to the HD 4770, which is also limited by memory BW: 37.5% fewer ALUs and TMUs, half the ROPs and Z rate; add a few % if Redwood isn't memory-limited, and then... 25% less performance?

What are your guesses, guys?
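
For what it's worth, a quick sanity check of those percentages in Python, using the commonly cited shader counts for RV670 (HD 3870) and RV740 (HD 4770) and the 400 SPs quoted for Redwood in this thread; treat the Redwood figure as a rumor, not a confirmed spec:

Code:
# Unit-count arithmetic behind the comparisons above.
rv670_sps = 320    # HD 3870
rv740_sps = 640    # HD 4770
redwood_sps = 400  # rumored Redwood count

print(redwood_sps / rv670_sps)  # 1.25  -> "25% more" than HD 3870
print(redwood_sps / rv740_sps)  # 0.625 -> "37.5% fewer" than HD 4770

(The TMU counts, 16/32/20, give the same ratios, so the ALU and TMU scaling move together here.)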
 
Looking from another perspective, comparing it to the HD 3870, which was faster than the HD 3850, which was faster than the HD 4670, which was terribly limited by memory BW, but let's leave that for another thread :)
I don't agree with that... 3870 > 4670 > 3850.
There were some situations where the 3850 pushed past the 4670, just as there were situations where the 4670 was nipping at the heels of the 3870, mainly with AA on.
 
Looking from another perspective, comparing it to the HD 3870, which was faster than the HD 3850, which was faster than the HD 4670, which was terribly limited by memory BW, but let's leave that for another thread :)
Actually, with newer drivers/benchmarks the HD 4670 isn't really slower than the HD 3850.
I dunno, I'd expect it to be faster than the HD 3870, but not by that much.
I think the interesting comparison will really be against GT240.
 
Wow.

Hmm..
(<50W) Mobility 5870 G5 vs GTS 350M SLI
(<40W) Mobility 5850 G5 -> GTS 360M
(<30W) Mobility 5850 G3/5770 G5 -> GTS 350M
(<25W) Mobility 5830 G3/5750 G5 -> GTS 335M (330/325M is a goner)
(<20W) Mobility 5650 -> ... previously 30W Mobility 4670/5165 (basically TDP vs cost)
(<15W) Mobility 5470 G5/G3 -> G 310/305M G3 (5470 G3 loses, but should edge out with G5 against G310)


1. Mob5730 has faster clocks than Mob5750, but is G3-only and actually uses more power.
2. Mob5850 has the vari-clock bullsh*t again. I'm taking the fastest config though, amounting to a 625 MHz core.
3. nVidia removed any TDP reference numbers they had earlier, so I'm referring to the un-rebranded line (2XXM) they showed everyone earlier on, as it's unlikely they'd bin out a whole series of miracles in this time period.
 
Besides, I expect Nvidia will phase out production of the 9600 GT soon and replace it completely with the slightly-slower-on-average GT240 GDDR5.

I hope they do that soon enough, then. I'll get either a 9600 GT or a GT240 GDDR5; both have about the same value to me: one a bit faster, the other a bit lower power.

But right now the GT240 GDDR3 is priced the same as a low-power edition 9600 GT, and the GT240 GDDR5 is more expensive. That makes both a no-go.
For now, I consider the GT240 and GT220 for what they are: aimed at OEMs and laptops before the consumer.

But it's a tough call with the AMD 5600 series. With an Nvidia card I get better software all around (Linux driver, CUDA, and nHancer); with a 5670 I get excellent supersampling for older games and light stuff.
That's how my choice boils down; maybe this time I'll go with AMD after a GeForce 4/6/7/8 streak.
 
They'll no doubt phase out the GT240 too, in favor of a GT340 or GT350 using the same chip at the same or a tad higher clocks.
 
That'll surely take off... thermally :LOL:

Yeah.

No.
 
That's with a big ole dual-slot cooler though. I would be very surprised if Nvidia reacts at all. Weren't GT240 clocks set to come in just under the 75 W limit? Unless they can get voltages down as the process matures, I don't see how a clock bump would be imminent. Assuming they care in the first place.
 
It's interesting that a 69 W TDP card (GT240) consumes more than a 96 W TDP card (9600 GT)...
 
Arguably, Xbit does the most accurate power-consumption testing of all the review sites, and they came to a very different result.

http://www.xbitlabs.com/articles/video/display/gf-gt240-1gb_4.html#sect0

The only drawback to the current Xbit method of determining power consumption is that it doesn't measure power provided through the PCIe slot itself. Looking at the board they have set up between the PSU and the power rails, it doesn't appear to have anything intercepting power provided to the motherboard, which distributes power to the PCIe slots.

And even if I'm wrong about how it's set up, I'm not sure how accurately they could determine how much of the power going to the motherboard is consumed by the PCIe slot versus other components drawing power through the motherboard.

That could affect different cards differently, depending on whether they draw power from the slot at all and, if so, how much.

Regards,
SB
 
I'm not sure how their method has any drawbacks compared to other sites that simply stick a meter at the wall outlet and compare total system consumption. The article clearly outlines the separate measurements they do for slot- and cable-provided power. Even if the slot values come from the mobo and not the slot directly, it's still going to be far more accurate than what anyone else is doing, no?

PS: given how explicitly they mention separating GPU and CPU power, and their specific PCIe labels, I'm inclined to think they are pulling those numbers from the slot.
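
For what it's worth, here's a minimal sketch of how per-rail readings would combine into board power, assuming you can log voltage and current on each supply path. The rail names and sample figures are illustrative assumptions, not Xbit's actual instrumentation or data:

Code:
# Total board power from per-rail measurements (P = V * I per rail).
# Rail list and readings are illustrative, not Xbit's actual setup.
rails = {
    # PCIe slot supplies both 12 V and 3.3 V (75 W max combined)
    "slot_12V":  (12.0, 2.1),   # (volts, amps)
    "slot_3V3":  (3.3,  0.4),
    # external 6-pin PCIe cable, 12 V only (75 W max)
    "cable_12V": (12.0, 1.8),
}

slot_w = sum(v * a for name, (v, a) in rails.items() if name.startswith("slot"))
total_w = sum(v * a for v, a in rails.values())
print(f"slot: {slot_w:.1f} W, total: {total_w:.1f} W")

A wall-outlet meter can't make this split: it sees the whole system behind PSU efficiency losses, so the slot/cable breakdown is lost.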
 