The Official RV630/RV610 Rumours & Speculation Thread

Status
Not open for further replies.
But the X1950P pretty much stomps the 8600 and 2600 in HH's test, and the 8800GTS does come out looking like a kick-ass card, but 320MB is still too little for its speed. So many trade-offs...

There are a lot of people out there with 19" 1280*1024 LCD monitors, and the 320 MB GTS seems to handle that resolution rather well. And the price is certainly right compared to, e.g., the 8600 GTS.

Back on topic though, prices for the 2400-2600 XT are starting to show up at komplett.se. The 2600 XT GDDR3 is only a couple of $ more than the 8500 GT, and the GDDR4 is roughly at 8600 GT level. They surely don't seem to be moneymakers for AMD at the current prices/performance.
 
ATI now has a 390 million transistor part in the HD 2600 XT that would get absolutely destroyed in any game benchmark by the 140M-transistor X850 XT.

Nice.

At least you couldn't use the 140M X850 XT for HD video playback on a weak CPU... just wanted to add one more point for the other folks...
 
Umm, don't the features add performance/transistor (when used by games)? Otherwise why have them? It would be more efficient, then, to run the graphics on super bulked-up SM2 hardware, or whatever.

HDR-related features (MSAA on FP targets, FP filtering) hardly add performance/transistor. Doesn't mean that it's not a good idea to have them.

Edit: and then there's the UVD of course.
 
Umm, don't the features add performance/transistor (when used by games)? Otherwise why have them?

Oh, there are lots of reasons, for instance:
* Raising the barrier to entry for newcomers to the market.
* Allowing new graphical effects.
* Adding buzzwords to drive the consumer upgrade cycle.
* Allowing the IHVs to pursue other markets with their products and technology.
 
These cards are a disaster. You should see the posts on HardOCP. Performance is just awful: half the 8600 GTS's in some cases, and that's a terrible card itself.

I hope for ATI's sake that these cards are broken as per the rumors, because if they knowingly planned for this level of performance, I just don't know what to say.

4 ROPs? Really?

Oh well, supposedly the ultra-low-power one will let them sell something to OEMs.


Umm, don't the features add performance/transistor (when used by games)? Otherwise why have them? It would be more efficient, then, to run the graphics on super bulked-up SM2 hardware, or whatever.

OK, there are issues. Some of them IMHO can be addressed in future driver releases (especially the cases where the 2600 XT performs on par with the 1650 XT); others are simply down to bottlenecks (I suppose FEAR is one of those cases), mainly in the ROP department. But I don't see these cards as a failure. Yes, they don't overtake the 8600 GTS, but in many cases they are at the same level (in a very few cases better) and are priced right, which is very important in the mainstream market.
 
OK, there are issues. Some of them IMHO can be addressed in future driver releases (especially the cases where the 2600 XT performs on par with the 1650 XT); others are simply down to bottlenecks (I suppose FEAR is one of those cases), mainly in the ROP department. But I don't see these cards as a failure. Yes, they don't overtake the 8600 GTS, but in many cases they are at the same level (in a very few cases better) and are priced right, which is very important in the mainstream market.

I think users are bored with the "future drivers will help" thing. As you can see, two official drivers have been released for the HD 2900 XT and nothing exciting has happened; it's still slower than the X1950 XTX in some games.
 
Looks like the TSMC 65G 65nm process has high leakage; the HD 2600 XT consumes the same power as a factory-overclocked 8600 GT. Link

390 million wasted transistors, zero overclocking headroom (the core is already pushed to the limit), the GDDR4 version is a phantom card, the cards have zero performance with filters enabled, and average performance is... (I can't find the right word).
A total disaster. I can't believe ATI's engineers designed RV630 like this, so either it's broken or I have no idea what's going on.

This is the worst nightmare that could happen from a user's perspective :cry:
 
MSRP prices:
$149 Radeon HD 2600 XT GDDR4
$129 Radeon HD 2600 XT GDDR3
$99 Radeon HD 2600 Pro DDR2
$79 ATI Radeon HD 2400 XT
$59 ATI Radeon HD 2400 Pro

Official slide from 6 weeks ago:
 
At least you couldn't use the 140M X850 XT for HD video playback on a weak CPU... just wanted to add one more point for the other folks...
BUT YOU CAN'T USE the 390M X2600 for HD video playback on a weak CPU either.
Just check the B3D preview :p
When will this be fixed?
 
I think users are bored with the "future drivers will help" thing. As you can see, two official drivers have been released for the HD 2900 XT and nothing exciting has happened; it's still slower than the X1950 XTX in some games.

Or maybe people should start to realise that the X2900 cards were designed with DX10 in mind, not DX9. When shader-intensive DX10 games hit the market, there won't be any more discussion about this.
 
Or maybe people should start to realise that the X2900 cards were designed with DX10 in mind, not DX9. When shader-intensive DX10 games hit the market, there won't be any more discussion about this.
When shader-intensive DX10 games hit the market, there will be refreshes of all current DX10 cards... and I'd bet the refresh parts will be faster.
So? Do you buy a "DX10 card" that struggles to get good fps in current, non-shader-intensive games, hoping that somehow the more demanding the game, the better the fps?
 
When shader-intensive DX10 games hit the market, there will be refreshes of all current DX10 cards... and I'd bet the refresh parts will be faster.
So? Do you buy a "DX10 card" that struggles to get good fps in current, non-shader-intensive games, hoping that somehow the more demanding the game, the better the fps?

As far as I know, the X2900 XT does not struggle to get good fps in current non-shader-intensive games. Of course there are going to be faster cards on the market when shader-intensive DX10 games arrive. However, most people buy graphics cards and keep them for at least two years.
 
As far as I know, the X2900 XT does not struggle to get good fps in current non-shader-intensive games. Of course there are going to be faster cards on the market when shader-intensive DX10 games arrive. However, most people buy graphics cards and keep them for at least two years.
Well, this thread is called:
"The Official RV630/RV610 Rumours & Speculation Thread"
:rolleyes:
 
Or maybe people should start to realise that the X2900 cards were designed with DX10 in mind, not DX9. When shader-intensive DX10 games hit the market, there won't be any more discussion about this.

The HD 2900 XT is another story; it has much more performance than these cut-cut-cut-down RV630 parts.

DX10-patched games already don't have playable performance at mainstream resolutions Link Link; who cares about 6 vs. 12 fps? Still unplayable.
By the time real DX10 games come out, the HD 2900/GF 8800 will already be mainstream.
 