So, do we know anything about RV670 yet?

Hmm, that suggests that under 3DMark AA doesn't take as big a hit as it does on R600, yet I've seen other results which show the 3870 slower than the 2900XT with AA in DX10 games (which may be the narrower bus coming into play).
 
Nice...

For those who play the numbers game, the part "8800GT was a tad faster in Crysis, but only by two frames" translates into "the 8800GT is 13% faster in Crysis."

Going by the current performance driver results (there's a bug), you have to knock off about 10%, and then the cards are equal.
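To make that 13% figure concrete: the review's exact Crysis framerates weren't quoted above, so the numbers in this sketch are purely illustrative, but they show how a small absolute gap turns into a double-digit percentage at low framerates.

```python
# Quick check of the "two frames = 13%" framing. The review's exact
# Crysis framerates weren't quoted, so these values are illustrative:
# any pair of results two frames apart at around 15 fps gives a
# similar-looking percentage.
hd3870_fps = 15.0    # hypothetical HD 3870 result
gf8800gt_fps = 17.0  # hypothetical 8800 GT result, two frames higher

advantage_pct = (gf8800gt_fps - hd3870_fps) / hd3870_fps * 100
print(f"8800 GT lead: {gf8800gt_fps - hd3870_fps:.0f} fps = {advantage_pct:.1f}%")
# -> 8800 GT lead: 2 fps = 13.3%
```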
 

Well, for Nvidia a 256-bit memory bus on the GF8800GT is enough to beat the old GF8800GTS with its 320-bit bus, so I don't see any reason why a 256-bit bus would hold the ATI Radeon 3870 back.
 
No, the driver after that should fix it, but it's not yet proven that it does so without a speed hit.
And what if the "bug fixed" driver results in increased performance?

No increase in RBEs, so a TF (esp. AF) hit?
 

You need to remember that these are different architectures; one may need (or want) more memory bandwidth than the other.
 
How much you wanna bet those scores are for 0xAF, and hence 100% useless?
 
You hit the nail on the head Turtle....

The HD3800s all of a sudden make running more than one card a realistic option.

When I researched my motherboard I sort of laughed that it had two PCIe slots. What a waste... but like many features, it just got bundled in. A lot of new motherboards sold this late summer and fall have two PCIe slots, and I'd bet most were P35/X38 boards with dual CrossFire slots rather than 680i boards with dual SLI slots.

Budget-minded P35 aside, all the review sites for the last 4-5 months have been reviewing and pimping P35 this, X38 that. There are massive numbers of these boards out there, budget and high end. Low-cost, good-performing midrange cards from both Nvidia (8800GT) and AMD (HD3800) all of a sudden give people who could never justify an $800-1200 dual-video-card purchase a new option.

With the HD3800 just about to be released, there's starting to be talk of going CrossFire; with the 8800GT I haven't heard talk of SLI, even though that would be great. It comes back to the boards newly purchased in the last while being Intel-based, and as such there's no option for SLI.

As to why I wouldn't just get an 8800GTX... well, it's not money. I run this system with 8GB of memory and 1.25TB of hard drive. Not everyone plays stuff that requires a 2900XT or an 8800GTS/GTX. Playing MMOs with my HD2600XT has been more than adequate; however, as everyone here knows, the HD2600XT and 8600GTS are just a bit underpowered for mid-range parts. Even in MMOs, which don't demand much, I can see a slowdown at the highest settings. An 8800GT or HD3800 is just what is needed.

I think I represent a group of computer gamers who would rather not spend on the top end. I am over 50, maintain a menagerie of computers for my family, and have five computers to upgrade. I buy new for some and hand down to others. I am also building a Windows Home Server this week on top of the regular requirements. I look at the HD3800 as something I can buy one or two of and pass on when required. I might get to run CrossFire for a month or two, but I bet I'll end up donating one of the cards to my wife or one of my kids. The 8800GT and the HD3800 family are a major change in that they hit the performance and price point right on the head for a lot of people.

I was a lurker on this site for years before I signed up. Excellent discussions, and not too trolly. :smile:
 
My experience is anecdotal, so it should be evaluated as such, but the majority of people I know view multi-card setups as something rather exotic.

From what I've seen of chatter on the internet (a limited sample and I don't frequent a lot of the boards where members like to pimp their rigs), AMD crossfire is frequently mentioned in the context "you can use crossfire to equal Nvidia's single-card offering".

That's not exactly resounding praise for AMD, but I guess you have to get kudos where you can.


I'm interested in the numbers for this scheme.
Is it better for Nvidia to sell a very expensive, high-margin Ultra, or can AMD do better on volume, selling many lower-margin RV670s?

On one hand, it is preferable to have a single-chip or single-card solution, if possible. It seems clear that for one manufacturer, the limit of possibility is a little closer.
After all, why get two cards for 1.8X scalability if you can buy a single card that is twice as capable and offers by default 1X scalability?

Then again, it is possible that Nvidia's having more difficulty playing the manufacturing game with larger die sizes.

Which manufacturer is better off depends on just where each sits on the manufacturing/integration cost curve.

Perhaps AMD is better served by better groundwork for multi-card solutions, though in my admittedly conservative opinion, Crossfire or SLI are at least in part another way for the IHVs' drivers to louse things up.


Just like with multicore CPUs, you don't go for multiples unless you can't get any further with one.
It's a pain in the ass for AMD, then, that apparently someone else can.
If Nvidia can keep manufacturing issues at bay, it can keep an edge.
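To put rough numbers on the die-size side of that argument, here's a back-of-the-envelope sketch using the usual dies-per-wafer approximation and a simple Poisson yield model. The wafer size, defect density, and die areas below are illustrative assumptions, not actual RV670 or G80/G92 figures.

```python
import math

# Back-of-the-envelope look at why smaller dies are easier to manufacture.
# Uses a common dies-per-wafer approximation and an exponential (Poisson)
# yield model. All inputs are assumptions for illustration only.
WAFER_DIAMETER_MM = 300.0
DEFECTS_PER_MM2 = 0.002  # assumed defect density

def gross_dies(die_area_mm2: float) -> float:
    """Approximate die candidates per wafer, with a simple edge-loss term."""
    d = WAFER_DIAMETER_MM
    return (math.pi * (d / 2) ** 2) / die_area_mm2 - (math.pi * d) / math.sqrt(2 * die_area_mm2)

def good_dies(die_area_mm2: float) -> float:
    """Gross candidates scaled by Poisson yield: exp(-defect_density * area)."""
    return gross_dies(die_area_mm2) * math.exp(-DEFECTS_PER_MM2 * die_area_mm2)

for label, area in [("small die (~190 mm^2)", 190.0), ("large die (~480 mm^2)", 480.0)]:
    print(f"{label}: ~{good_dies(area):.0f} good dies per wafer")
# With these assumptions the small die yields roughly five times as many
# good chips per wafer, which is the volume/margin trade-off being debated.
```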
 
Multiple-card setups are exotic. As you said, people who have (or had) them are in the minority.

They are expensive in that you need two cards. But I think there is a factor that is different this time. For a long time, to get a dual-card setup you needed IDENTICAL cards, so you had to decide and put up all the money at once.

The few CrossFire tests that have been shown with mixed cards create a new scenario: buy one card now and add another later.

The only places where I have seen comments about CrossFire being a new option are forums discussing the new benchmarks being touted for the HD3800s. All of a sudden a couple of people admit they're thinking of CrossFire with the HD3800, and a few agreements follow. People who used to get dual cards would just buy them and let everyone know afterwards how uber their systems were.

However, talk is cheap... we will see how many people actually go dual cards in a few weeks.
 
I'm taking a wait and see approach on the mismatched card crossfire concept.
That's another layer of possible headaches and another way software or a driver update could cause problems.
 
Yep, drivers will be more complicated.

The only thing I will say is that it seems to me this is the future of PCs. We have dual cores and quad cores now on the CPU front.

The crystal ball seems to indicate that multiple GPUs are on the cards for both AMD and Nvidia. We have all this talk of the HD3870 X2 and G92 x 2 coming, and also triple and quad cards in the near/mid future. So far we have one set of drivers for both single and multiple GPUs; with the new focus, hopefully they get it right, or even we single-card users may suffer.
 
Multichip was, is, and will be niche. Mainstream is single-chip. Multicore CPUs went to the masses when the cores were integrated into one package. The crystal ball indicates that Fusion will be mainstream (x64 + graphics), again an integrated solution. Yes, there is the option of additional external chips, but again, that won't be mainstream.

Same for Intel: Larrabee + CPU, integrated.

Maybe I have an overly restrictive definition of what "mainstream" means? :LOL:
 
If you go by Intel's quad-cores (two dual-core dies in one package), multichip products are at price points that seem too low to be niche.

Multi-socket is still niche for the desktop, but anything within the same package and under the same heatspreader makes no difference to the user anyway.
 
I agree mainstream would be single chip.

However, I get the feeling that if you want faster, it will be multiples in the future. Maybe the performance champs will be multi-chip from now on. It's got to be cheaper: make one or two generic chips, and if you need the fastest, put two chips on a PCB, or go CrossFire or SLI.
 