Quad-CrossFire is coming



Sapphire has a Quad-CrossFire demo up at their booth using two X1950 PRO Dual cards. There is another demo rig without a graphics card, probably reserved for the R600 later.
 


Looks to me like the drivers are almost here. But now, who is the king: SLI GF8800 Ultra or Quad-CrossFire Radeon 3870X2? :D
 
Last time, Nvidia failed with quad-SLI on the GF7950GX2 :(

You can't really compare 2 x G71 in Windows XP/DirectX 9.0c to 2 x G92 in Windows Vista/DirectX 10.
Three-way SLI already demonstrated that G8x, and especially the DX10 API, allow for much better scaling of multi-GPU performance, although it obviously can't touch the per-GPU efficiency of a single-card setup.

Also, the HD3870X2 is just as dependent on driver quality as the 9800 GX2, IMHO.
There are still plenty of games where a single 8800 Ultra easily beats it, and in fact driver latency combined with poor optimization on any given title sometimes actually makes it slower than a single HD3870 (!).


But, quite honestly, multi-GPU is not my cup of tea. It's like "cheating" to buy time in the market until the true next high-end part arrives.
I can understand the need for it by developers modeling next generation game performance levels, though.
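
To put that scaling point in rough numbers, here's a minimal back-of-the-envelope sketch; the per-GPU efficiency figures are purely illustrative assumptions, not measured data:

```python
# Toy model of AFR-style multi-GPU scaling: each extra GPU adds less
# than a full card's throughput because of synchronization and driver
# overhead. The efficiency figures are illustrative assumptions only.

def effective_speedup(n_gpus, per_gpu_efficiency):
    """Sum the diminishing contribution of the first n_gpus cards."""
    return sum(per_gpu_efficiency[:n_gpus])

# Hypothetical: 1st GPU contributes 100%, 2nd 80%, 3rd 60%, 4th 45%.
efficiency = [1.00, 0.80, 0.60, 0.45]

for n in range(1, 5):
    s = effective_speedup(n, efficiency)
    print(f"{n} GPU(s): {s:.2f}x one card, {s / n:.0%} per-GPU efficiency")
```

With numbers like these, four GPUs land well short of 4x, which is exactly the per-GPU efficiency gap mentioned above.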
 
You can't really compare 2 x G71 in Windows XP/DirectX 9.0c to 2 x G92 in Windows Vista/DirectX 10.
Three-way SLI already demonstrated that G8x, and especially the DX10 API, allow for much better scaling of multi-GPU performance, although it obviously can't touch the per-GPU efficiency of a single-card setup.

Also, the HD3870X2 is just as dependent on driver quality as the 9800 GX2, IMHO.
There are still plenty of games where a single 8800 Ultra easily beats it, and in fact driver latency combined with poor optimization sometimes actually makes it slower than a single HD3870 (!).

True....

This is ATI's first try at producing a quad-GPU setup, and it would be Nvidia's second attempt at quad-GPU.

I'll wait and see :D
 
But, quite honestly, multi-GPU is not my cup of tea. It's like "cheating" to buy time in the market until the true next high-end part arrives.
I can understand the need for it by developers modeling next generation game performance levels, though.

Yes, and I still think multi-GPU is a waste of time; but I do like quad-CPU :)
 
Multi-card (not multi-GPU) is a waste of money and energy, because when all those cards become obsolete, you'll have to trash all of them.
Quad-GPU is like buying four processors for a four-socket motherboard.
No! Even worse! Because those cards are useful only for gaming, while a CPU can do many more things.
 
Multi-card (not multi-GPU) is a waste of money and energy, because when all those cards become obsolete, you'll have to trash all of them.
Quad-GPU is like buying four processors for a four-socket motherboard.
No! Even worse! Because those cards are useful only for gaming, while a CPU can do many more things.

OK... how does this work? If I have two boards and they become obsolete, I cannot use them anymore... because they go poof? They cease functioning? Whilst with just one board I can go on and on and on, just like the Energizer Bunny? Is this something fairies are doing?
 
I think what he meant was that you can't replace them individually.


As for the CPU being more useful, most people don't benefit much (if at all) from even dual cores (regular office work, internet surfing, etc.); very few can use a quad core or more.
 
I think what he meant was that you can't replace them individually.


As for the CPU being more useful, most people don't benefit much (if at all) from even dual cores (regular office work, internet surfing, etc.); very few can use a quad core or more.

I still fail to see the relevance of that in this context; perhaps I'm being dense. So the cards are obsolete... what does it matter whether you have one or two obsolete cards? In both scenarios you can either a) sell them, b) reuse them in a lesser PC, or c) throw them away. What's inherently more evil WRT having two cards?

I agree with the "most ppl" comment... but for some (me :)), having a speedy quad is a godsend. Fast media encoding while still being able to do something else, and quite a few other perks, made it a superb upgrade in my case. But that's entirely OT.
 
I don't know; it depends on how each potential consumer looks at things. Cost/value is different for everyone, with too many variables to make a blanket judgment.

The appealing thing may be the choice and ability to do Tri or Quad CrossFire. This platform isn't for everyone and, one may imagine, isn't designed to be. However, from time to time the dreaded "waste of time, waste of energy, waste of money" line is offered by the consumer mind-set that this type of choice may not appeal to. Nothing new; you hear it from time to time with anything new that costs a lot at first. That's the beauty of forums and mind-sets, and it's all good.

Curious to see how this type of platform performs and matures over time.
 
I think what he meant was that you can't replace them individually.


As for the CPU being more useful, most people don't benefit much (if at all) from even dual cores (regular office work, internet surfing, etc.); very few can use a quad core or more.

I'd take issue with that; if you've tried using a single-core processor lately, it is pretty painful if you are used to dual- or quad-core chips.
 
I don't know; it depends on how each potential consumer looks at things. Cost/value is different for everyone, with too many variables to make a blanket judgment.

The appealing thing may be the choice and ability to do Tri or Quad CrossFire. This platform isn't for everyone and, one may imagine, isn't designed to be. However, from time to time the dreaded "waste of time, waste of energy, waste of money" line is offered by the consumer mind-set that this type of choice may not appeal to. Nothing new; you hear it from time to time with anything new that costs a lot at first. That's the beauty of forums and mind-sets, and it's all good.

Curious to see how this type of platform performs and matures over time.

Well put. For instance, I think it'll be interesting to see how both a single and an SLI'd pair (2x) of 9800GTXs compare to a single and a Quadfire pair (2x) of 3870X2s, as they are (or rather will be) priced equal to each other... $400 apiece.

Or, to look at it another way, one 9800GX2 vs. a 3870X2 + 3870 in tri-fire, as both equal a $600 solution.

Let it be known I feel bad for the reviewers who will have to test 1-4 cards for each SKU, minus the dual-GPU cards, where they will need to test 1-2 cards per SKU. It sounds like a lot of work, but frankly it's needed for value comparison. Things just got a whole lot more complicated.
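
Just to illustrate how that test matrix balloons, a quick sketch; the SKU names and counts below are hypothetical placeholders, not an actual review lineup:

```python
# Rough count of the configurations a reviewer now faces per benchmark.
# The SKU lists and card counts below are hypothetical placeholders.
single_gpu_skus = ["SKU-A", "SKU-B", "SKU-C"]             # test 1-4 cards each
dual_gpu_skus = ["SKU-D (dual-GPU)", "SKU-E (dual-GPU)"]  # test 1-2 cards each

configs = [(sku, n) for sku in single_gpu_skus for n in range(1, 5)]
configs += [(sku, n) for sku in dual_gpu_skus for n in range(1, 3)]

print(f"{len(configs)} hardware configurations")  # 16 with these placeholders
print(f"x 10 games x 4 resolutions = {len(configs) * 10 * 4} benchmark runs")
```

And that's before adding AA/AF settings to the mix.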

I'd take issue with that; if you've tried using a single-core processor lately, it is pretty painful if you are used to dual- or quad-core chips.

Tell me about it. When I sold my setup and went back to a single-core A64 in preparation for a massive June-ish upgrade, I wholly underestimated how painful it would be.
 
I have yet to see an instance where this type of configuration is remotely worth purchasing. It's a great thing to fantasize about, though!

I would expect this to perhaps become useful, maybe, if you have some massive 3000xsomething display to power. Never mind that games can't go that high, so I just don't know yet if I'd personally ever bother.

Anandtech recently tested ATI's CrossFire in 3x mode. That didn't produce diddly over 2x until you got past 1920x1200, at least.
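
That pattern is what you'd expect if the CPU is the bottleneck at lower resolutions. A toy frame-time model of the idea (all timings invented for illustration, nothing to do with Anand's actual data):

```python
# Toy frame-time model: frame rate is limited by whichever is slower,
# the CPU or the GPUs. At low resolutions the CPU dominates, so extra
# GPUs change nothing; at high resolutions GPU time grows and multi-GPU
# finally pays off. All numbers here are invented for illustration.

CPU_MS = 10.0  # assumed per-frame CPU cost, resolution-independent

def fps(resolution_pixels, n_gpus, ms_per_megapixel=12.0, efficiency=0.85):
    gpu_ms = resolution_pixels / 1e6 * ms_per_megapixel
    # Each extra GPU contributes 'efficiency' of a full card (AFR-style).
    gpu_ms /= 1 + efficiency * (n_gpus - 1)
    return 1000.0 / max(CPU_MS, gpu_ms)

for name, pixels in [("1280x1024", 1280 * 1024),
                     ("1920x1200", 1920 * 1200),
                     ("2560x1600", 2560 * 1600)]:
    print(name, [f"{fps(pixels, n):.0f} fps" for n in (1, 2, 3)])
```

Once the GPU time per frame drops below the CPU time, extra cards stop showing up in the average FPS, which is why the third GPU only separates from the second at the highest resolutions in this model.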
 
I have yet to see an instance where this type of configuration is remotely worth purchasing. It's a great thing to fantasize about, though!

I would expect this to perhaps become useful, maybe, if you have some massive 3000xsomething display to power. Never mind that games can't go that high, so I just don't know yet if I'd personally ever bother.

Anandtech recently tested ATI's CrossFire in 3x mode. That didn't produce diddly over 2x until you got past 1920x1200, at least.

It's quite worth noting that Anand's tests were done with beta drivers that still had a multitude of issues. So if you're going to fling mud, at least fling it when the numbers come from production drivers and not some beta-testing build. I can completely swamp a pair of individual 3870s in CrossFire by enabling 8xAA and 16xAF in a few choice games at 1680x1050. And that's in games where CF is giving me a substantial speedup...

Kinda like referencing Vista performance benchmarks when it was still in Release Candidate status.
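
For a sense of why 8xAA alone can swamp the cards, here's the raw framebuffer arithmetic at 1680x1050 (simplified: it ignores the color/Z compression real hardware uses, so treat it as an upper bound on raw storage):

```python
# Back-of-the-envelope framebuffer memory for 8x MSAA at 1680x1050.
# Assumes 4 bytes color (RGBA8) + 4 bytes depth/stencil (D24S8) per
# sample and no framebuffer compression.

width, height, samples = 1680, 1050, 8
bytes_per_sample = 4 + 4  # color + depth/stencil

msaa_buffer = width * height * samples * bytes_per_sample
resolve_buffer = width * height * 4  # resolved single-sample backbuffer

print(f"8x MSAA render target: {msaa_buffer / 2**20:.0f} MiB")   # ~108 MiB
print(f"Resolve target:        {resolve_buffer / 2**20:.1f} MiB")
```

16xAF, by contrast, mostly costs texture-sampling bandwidth rather than memory footprint, so the two settings stress the cards in different ways.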
 
It's quite worth noting that Anand's tests were done with beta drivers that still had a multitude of issues. So if you're going to fling mud, at least fling it when the numbers come from production drivers and not some beta-testing build. I can completely swamp a pair of individual 3870s in CrossFire by enabling 8xAA and 16xAF in a few choice games at 1680x1050. And that's in games where CF is giving me a substantial speedup...

Kinda like referencing Vista performance benchmarks when it was still in Release Candidate status.

Crysis performance is likeable with the latest 8.47 RC3s in Quad CF... even in DX10 Very High. There are still some rough edges, certainly, but it's surprisingly good. Too bad the PowerPlay downclocking bug is still present... I think this has messed up both reviews and user experiences.
 