Crossfire Info

Jawed said:
It seems that unmatched quad capabilities are turned off, e.g. a 4 quad master working with a 3 quad slave will be reduced to a 3 quad master.

So bang goes all those hours spent speculating over asymmetric quad supertiling schemes. :oops:

Note - I don't understand German - this is from the translation :LOL:

Jawed

Bummer :(
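To put a number on what that costs you, here's a toy sketch of the pairing rule quoted above. The quad reduction is from the article; matching the lower clock is my assumption (based on the "downclocked" remark later in this thread), and the MHz figures are illustrative.

```python
# Toy model of the CrossFire pairing rule quoted above: the pair runs
# at the weaker card's spec. Quad reduction is from the article; clock
# matching is an assumption, and the clock figures are illustrative.

def effective_pair(master_quads, slave_quads, master_mhz, slave_mhz):
    return min(master_quads, slave_quads), min(master_mhz, slave_mhz)

# 4-quad X850 XT master paired with a 3-quad X850 Pro slave
print(effective_pair(4, 3, 520, 507))  # -> (3, 507)
```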

DSC said:
Anyone noticed the price for the X850 Crossfire Edition card as suggested by ATI? $549.... Sander wasn't wrong after all. :LOL: :LOL:

And notice you can't use an X850 CE with an X800 non-CE card, or vice versa. So much for the myth of Crossfire being more flexible. :LOL:

SLI: can't use 6800 Ultra with 6800 GT or vanilla
Crossfire: can use X850XT master with X850 Pro or vanilla

DemoCoder said:
Didn't some people hack Nvidia SLI to work on non-SLI motherboards by putting the second card in a PCIe x1 slot? Seems that it is not fundamentally limited to nForce MBs, but this is merely an NV driver decision to force people to buy nForce. A future driver rev could make NVidia SLI work on other chipsets. From an abstract standpoint, I don't really see why the MB chipset needs to provide any support at all beyond the PCIe spec itself.

I know that there was a hack with early DFI boards to make Ultra chips run as SLI; this was because they had two x16-size slots anyway for their own weird-ass 16+2 configuration thing, and the early Ultra chips from nVidia were just SLI chips with a couple of bridges cut. Apply the old graphite pencil trick and bam, SLI. I imagine nVidia have since moved these bridges to a more inaccessible position, i.e. inside the chip, but I have no evidence for this. I haven't seen reports of plugging cards into x1 slots, and would be interested in more info. Either way, if part of the hack involves closing said bridges, then it would imply that SLI-specific hardware is still being utilised in these cases.
 
DSC said:
It is recommended that Catalyst A.I. be enabled in the 3D settings

Why do I get a feeling that if A.I. is disabled, all the profiles stored inside will not work... so much for the claims of not needing profiles... :LOL: :LOL:

There's a difference between requiring profiles for any boost at all and needing profiles only for the best performance.



Although their description of Scissor mode for OpenGL games says, I think, that it works in most games, so maybe they won't have boosts in all OpenGL games without profiles. And if Nvidia can't manage it in Chronicles of Riddick, surely ATI should fail as well, since Scissor mode is basically SFR, isn't it?
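Since Scissor is basically SFR, here's a rough sketch of the split, for the avoidance of doubt. The fixed 50/50 split and the function name are mine; real drivers move the split line to balance load between the cards.

```python
# Rough sketch of Scissor/SFR: the frame is cut into two horizontal
# bands, one per GPU. A fixed 50/50 split is a simplification; drivers
# typically rebalance the split line based on each card's load.

def scissor_rects(width, height, split=0.5):
    cut = int(height * split)
    top = (0, 0, width, cut)           # GPU 0 renders the top band
    bottom = (0, cut, width, height)   # GPU 1 renders the bottom band
    return top, bottom

print(scissor_rects(1600, 1200))  # ((0, 0, 1600, 600), (0, 600, 1600, 1200))
```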
 
Enthusiasts will be disappointed to hear that, as things currently stand, they have no control over the different CrossFire rendering modes. Catalyst A.I. specifies which game runs in which render mode. If the game is not known to the driver, the default mode (SuperTiling for Direct3D, Scissor for OpenGL) is selected automatically. Only the SuperAA mode can be freely selected by the user.

So you can't choose SuperTiling, Scissor or AFR yourself; you get whatever Catalyst A.I. is programmed with. If it isn't programmed for a game, you get SuperTiling in D3D and Scissor in OGL.

If you switch on SuperAA then obviously these other modes become irrelevant.
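Presumably the selection logic boils down to something like this sketch. The names and the profile lookup are hypothetical, not ATI's actual driver code; only the fallback behaviour is from the article.

```python
# Sketch of the Catalyst A.I. mode selection described above.

def pick_render_mode(game, api, profiles, super_aa=False):
    if super_aa:
        return "SuperAA"            # the only user-selectable mode
    if game in profiles:
        return profiles[game]       # Catalyst A.I. knows this game
    # Unknown game: fall back to the per-API default.
    return "SuperTiling" if api == "Direct3D" else "Scissor"

print(pick_render_mode("SomeNewGame", "OpenGL", {}))  # -> Scissor
```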

Jawed
 
Jawed said:
Enthusiasts will be disappointed to hear that, as things currently stand, they have no control over the different CrossFire rendering modes. Catalyst A.I. specifies which game runs in which render mode. If the game is not known to the driver, the default mode (SuperTiling for Direct3D, Scissor for OpenGL) is selected automatically. Only the SuperAA mode can be freely selected by the user.

So you can't choose SuperTiling, Scissor or AFR yourself; you get whatever Catalyst A.I. is programmed with. If it isn't programmed for a game, you get SuperTiling in D3D and Scissor in OGL.

If you switch on SuperAA then obviously these other modes become irrelevant.

Jawed

In its current stage, anyway, yes.
 
DSC said:
Anyone noticed the price for the X850 Crossfire Edition card as suggested by ATI? $549.... Sander wasn't wrong after all. :LOL: :LOL:

And notice you can't use an X850 CE with an X800 non-CE card, or vice versa. So much for the myth of Crossfire being more flexible. :LOL:

Have you noticed that nVidia can't mix a GF6800GT with a GF6800U, or even a GF6800, at all? And that you need the same BIOS revision and preferably the same vendor? :LOL:

And have you noticed that buying an X800XL for $249 with another X800XL (they can be found below $299) will be cheaper than two GF6800GTs? :)

Did you notice that SLi only has two rendering modes, which don't even work out of the box in games that aren't officially supported in the drivers? Have you noticed that SLi isn't able to improve image quality at all? And did you notice that it will most probably be possible to get CrossFire to work on SLi motherboards? Wow, you're so right... compared to SLi, CrossFire looks so pale. ;)
 
DSC said:
Why do I get a feeling that if A.I. is disabled, all the profiles stored inside will not work... so much for the claims of not needing profiles... :LOL: :LOL:

Turn AI off and you'll likely default to Supertile in D3D.
 
Anyway, with R520 a few months away, which enthusiasts are seriously going to buy Crossfire for X800 or X850? Who's going to double their investment in end-of-the-line cards?

And if NVidia releases G70 at the same time as Crossfire, surely its performance is going to make Crossfire look pointless. Particularly in SLI.

SuperAA is about as compelling as it gets. I wonder if it'll be possible to screen-grab 14xAA? Wouldn't be surprised if you can't, lol.
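My guess as to why a grab might fail: each card renders with its own offset sample pattern and the master's compositing engine averages the two frames downstream of the framebuffer, so neither card ever holds the final 14x image. The combine itself would be something like this sketch (mine, not ATI's implementation):

```python
# Guess at the SuperAA combine: average two already-antialiased frames
# pixel by pixel. If this happens in the compositing engine after
# scan-out, a framebuffer screen grab only sees one card's samples.

def super_aa_blend(frame_a, frame_b):
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

print(super_aa_blend([[0.0, 1.0]], [[1.0, 1.0]]))  # -> [[0.5, 1.0]]
```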

Well I suppose 14xAA will be a weapon in the war with the console boys, "I can play 1920x1080 with 14xAA on my 60" HDTV" etc.

Anyway, I hope R520 brings an 8xAA mode for single cards...

Jawed
 
Y'know, I want to be wrong on this. I really, really do. But I just have this growing ugly feeling that this is prima facie evidence of a great lack of confidence in R520 as a performer on the part of ATI. I can't think of any other rational explanation for why you'd be coming out with a $550 SKU this late in the R4xx life cycle. I'm more than half expecting at this point to hear that R520 has been canned and R580 will come out instead.

If this isn't the case, why aren't they launching this with R520 instead?
 
DemoCoder said:
A future driver rev could make NVidia SLI work on other chipsets. From an abstract standpoint, I don't really see why the MB chipset needs to provide any support at all beyond the PCIe spec itself.

VIA have already demonstrated two 6600 GTs working in SLI mode on their own chipset, with hacked nForce drivers that were on the web. Supposedly there is some logic in the chipsets that assists in arbitration of the PCI Express lanes, but personally I think that's posturing BS (from all around) - how it's set up is most likely controlled purely in software. I doubt ATI's northbridges for CrossFire even differ from the initial chipsets made available.
 
CJ, there is no fundamental reason why SLI can't support additional AA modes, other than needing to implement them in a driver, and probably no fundamental reason besides software why NV cards can't be made to work on other MBs. It would be trivial to implement the 8xS mode, as well as improve its quality over single-card 8xS by subpixel displacement.
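For what subpixel displacement buys you, a toy example (the base pattern and offsets here are made up):

```python
# Toy illustration of subpixel displacement: both cards run the same AA
# pattern, but one card's copy is shifted by half a pixel, so the merged
# result covers twice as many distinct sample positions.

BASE = [(0.25, 0.25), (0.75, 0.75)]   # a made-up 2-sample pattern

def shifted(pattern, dx, dy):
    return [((x + dx) % 1.0, (y + dy) % 1.0) for x, y in pattern]

card0 = BASE
card1 = shifted(BASE, 0.5, 0.0)       # the second card's displaced copy
print(sorted(set(card0 + card1)))     # four distinct positions
```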

I would look forward to future driver updates that enable this functionality once there is a sufficient installed base of competing chipsets and GPUs. Nvidia has been able to rest on its laurels for the last year in terms of price and feature set because of its exclusivity, but competition will change that.

ATI's entry into this market is good all around. NVidia needs a kick in the butt.
 
geo said:
Y'know, I want to be wrong on this. I really, really do. But I just have this growing ugly feeling that this is prima facie evidence of a great lack of confidence in R520 as a performer on the part of ATI. I can't think of any other rational explanation for why you'd be coming out with a $550 SKU this late in the R4xx life cycle. I'm more than half expecting at this point to hear that R520 has been canned and R580 will come out instead.

If this isn't the case, why aren't they launching this with R520 instead?

The master card has additional chips compared to the normal version, which makes it more expensive.

As for availability of R520, I don't think it will be much later than G70. G70 may launch earlier, but that doesn't mean it will also be available earlier.

We'll find out eventually
 
Jawed said:
Geo, I wouldn't be surprised if R520 gets canned for R580 either.

Tis all very strange.

Jawed

Of course, we're both thinking of the strangely disappearing posts on what R580 is compared to R520 that strongly suggested it was something more than the typical clock/memory bump refresh. Maybe ATI backs into their previous "launch midrange first on new process" mode by repositioning R520 (maybe with somewhat slower memory) as the new midrange?
 
Charmaka said:
You're sure that SLI would work on any multi-16x mobo with a driver tweak or similar?
As I'm not working at nVidia, there's no way to be sure, but all indicators point in this direction. We've seen SLi, when used with early, less restrictive drivers, working on Tumwater (x16 + x4), Grantsdale (x16 + x4), NF4 SLi (x8 + x8), NF4 Ultra (x16 + x2) and NF Pro (x16 + x16).

There's probably some kind of lower PCIe lane-count threshold for decent performance, I'm guessing 4 lanes minimum, but other than that, it seems like it's all drivers. (i.e. no 'SLi magic sauce' required)
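For scale, PCIe 1.x moves 250 MB/s per lane per direction (2.5 GT/s with 8b/10b encoding), so the guessed thresholds work out as below. This is just arithmetic, not a claim about where SLI actually breaks down:

```python
# PCIe 1.x bandwidth per link width: 250 MB/s per lane per direction.
LANE_MBPS = 250

for lanes in (1, 2, 4, 8, 16):
    print(f"x{lanes:<2}: {lanes * LANE_MBPS:>4} MB/s per direction")
```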
 
Charmaka said:
SLI: can't use 6800 Ultra with 6800 GT or vanilla
Crossfire: can use X850XT master with X850 Pro or vanilla
But what's the point of buying a $549 card that is then throttled (one quad off and downclocked) when run alongside that $349/399(?) Pro?

Flexible? Yep. Sensible? Hell no!
 
CJ said:
And have you noticed that buying an X800XL for $249 with another X800XL (they can be found below $299) will be cheaper than two GF6800GTs? :)
Have you noticed that it won't work in Crossfire mode? According to Tom's, there are three Masters as of now: the X850 XT PE-based one and two X800-based ones.

Now, that X800 XL of yours can only be paired with either of the latter two, forcing it to shut off one of its quads. You'll essentially end up with two X800s, having overpaid dearly on both (one because it's really an X800 XL, the other because it's a precious Master).

PS: My personal 'favourite' right now is X800 Master + X800 XT PE slave (poor thing). :LOL:
 