The Official RV630/RV610 Rumours & Speculation Thread

Status
Not open for further replies.
I expect HD2600XT GDDR3 > GF 8600 GT in selected games like:

NFS:Carbon
CoH
X3:Reunion
(modern games... lots of shader stuff...)

in other cases, a 'disaster'...

bye
 
If pconline tested the PRO version with GDDR3 (600/700), then the XT-GDDR3 (800/700) won't be a disaster. The GF8600GT is, according to this review, 20.6% faster (on average, games only). The XT-GDDR3 will use a 33% faster core than the PRO. In bandwidth-unlimited scenarios the XT-GDDR3 can theoretically be faster than the GF8600GT (on average). In bandwidth-limited scenarios the XT-GDDR3 will possibly be slower, but the XT-GDDR4 will shine.
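As a rough sanity check, the clock arithmetic in that post works out like this (a minimal sketch; the clock figures and the 20.6% review delta are the numbers quoted in the post, not independently verified):

```python
# Core/memory clocks (MHz) as quoted in the post above.
pro_core, pro_mem = 600, 700   # HD2600 PRO GDDR3
xt_core, xt_mem = 800, 700     # HD2600 XT GDDR3

core_speedup = xt_core / pro_core - 1   # ~0.333 -> the "33% faster core"
mem_speedup = xt_mem / pro_mem - 1      # 0.0 -> identical GDDR3 bandwidth

review_gap = 0.206  # GF8600GT's average lead over the PRO in the cited review

# In a game that scales mostly with core clock (bandwidth-unlimited),
# a ~33% core bump more than covers the 20.6% review gap; with zero
# extra memory bandwidth, bandwidth-limited cases gain nothing.
print(round(core_speedup, 3), mem_speedup, core_speedup > review_gap)
```

This is only a clock-ratio bound, of course; real scaling depends on where each game actually bottlenecks.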
 
How do you come to believe that this is going to be a disaster? The first tested XT was a PRO. The real XT has a faster core and, with GDDR4, more bandwidth, so why should such a card be second to the 8600GTS?
 
I find it baffling that ATI still can't find their groove in the mainstream segment. And it's even worse considering the process advantage.

It's very predictable as long as they stick with those damned ratios... the instant HD2900 came out, I knew they would not have exceptional mid-range parts. It was obvious. Some people were saying "oh well, the high end isn't that important; if ATI nails the mid-range they'll be better off", and back then I was thinking: guess what, they're not going to nail the mid-range, how could they? You immediately know they're going to castrate the TMUs/ROPs, more than likely to 8 or fewer, which are the real limiter on the card; it's only a question of how badly. When your high-end part has a bad ratio, your mid-range parts will have a bad ratio as well. This is two gens in a row for ATI.

Nvidia doesn't really have an excuse for their lackluster mid-range, though. If Nvidia were really managed as well as people claim, there would be nothing holding them back from dominating that segment, yet they didn't, with the disappointing 8600GT. Although I think you can make a case for Nvidia that the 320MB 8800GTS is encroaching on the mid-range anyway, at ~$280.

I mean, the HD2900XT is barely above mid-range anyway. There's no way to cut it down and get good performance, unless they left all 16 TMUs and ROPs in (as well as a decent number of shaders, of course), and we know they're not going to do that. The only way they could really get a kickass mid-range part is 16 TMU/ROP; 12 would give them something competitive, but not blow the doors off. Just like last time around.

The ratios totally killed them last time, as they aimed way too low initially (X1600XT); then it took them a long time to recover, up the ROP/TMU count to something reasonable, and get decent mid-range parts out. By then it was mostly too late.
 
ATi can't sell their product. Nvidia, even when faced with a performance and IQ hit, manages to outsell them or stay competitive. Why? Because Nvidia can sell their products! They make the average consumer think that in some way they want that feature! It was SM3 with the GeForce 6 series; it was power consumption with the GeForce 7. Nvidia is a much (much, much) better company when it comes to business; even if their products were bad (which they're far from...), I have a funny feeling it wouldn't be a total runaway for ATi.
 
ATI is now AMD, so things have changed, and the better multimedia features of the RV630 seem to have given them the lead in the OEM market. I find it hard to believe that people discount the RV630 so badly; it is an awesome multimedia card, and it should perform more or less equally to the 8600 while consuming less power and being cheaper. I think ATI nailed the mid-range part this time.
 
The AIW R420 is also an awesome multimedia part, but you won't see me going out and buying it, for obvious reasons. I wouldn't call these chips (from both companies) mid-range; this is all low-end, realistically.
 
Great, so AMD makes 5% of users happy with multimedia features, and 95% will be disappointed.
 
Looks like the HD2350 is coming.
[chart: 47aver.jpg]

Now AMD has a low-low-low-low-end card too.
 
The potential DX10 "mainstream" (this time low-end) user who wants to play games hasn't bought anything yet, because they're waiting for AMD's answer.

Ah, so you're saying they suck as gaming parts? Well, we'll see, I guess. The NDA still hasn't lifted, so we can't get a broad sense of the results on that point.

I think you're pretty far off, though, if you think 95% of mainstream PC purchasers are gamers. If that were true, and you were right about those parts being bad for gaming, then there wouldn't be a good OEM response. Particularly for things like laptops, I'd bet there are more movie watchers than gamers.
 
Due to misread NDA info, we published a review of HD 2600XT & HD 2400XT graphics cards. The review will be put offline until the expiry of the NDA. Sorry for the inconvenience.
Link

"Incident" :LOL:

BTW, they tested with an Intel QX6700 on XP with 2GB RAM. The HD2600XT 512MB GDDR4 scores 9302 in 3DMark05 and 1788/2268 (SM2.0/SM3.0) in 3DMark06; the HD2400XT 256MB GDDR3 scores 4951 in 3DMark05 and 770/897 (SM2.0/SM3.0) in 3DMark06.
 
Now that's an odd choice in that Xpertvision card;
That the HDMI port is sitting there means either that it carries no audio (since there's no SPDIF in), or that the HD15 plug is there in place of TV-Out. Since we know the standard boards have 1x DVI with HDMI capabilities, surely they'd have two if they could? So the HDMI is either in place of that DVI, or in place of TV-Out (and thus without sound).


Meh, I can't even make up what I'm trying to say from that post myself, but hopefully someone gets the idea
 
It's up to the vendor, but I would suspect that the soldered-down HDMI output would take priority and the audio from the GPU would be routed to that (in other words, audio would not be routed to the DVI, as there is no need for an HDMI adapter in this case).

One of those three outputs would also have to be running in clone mode with another as well.
 