Crossfire limitation

JoshMST said:
In many ways your post leaves me a bit confused, as I agree with most of what you say. My point with profiles is that ATI says, "We won't have profiles" but Cat AI will be deciding which titles get which rendering method. I don't think there is anything inherently good or bad with profiles or AI, it is just the way these companies do the job. It is interesting to read that you haven't been able to disable Cat AI. If there is one person that knows 3d graphics, it is definitely you.
This is what NVIDIA were trying to impress on people at their editors' day, except I'm not sure people had much buy-in to it. The very fact that there are 3 different advertised rendering modes and only two APIs should tell everyone that profiling in some cases is a requirement, otherwise there frankly wouldn't be any point in providing 3 different modes.

NVIDIA appear to be stuck on the issue of profile/no-profiles, but what ATI were trying to impress is that the default for Crossfire will be on, whereas the default for SLI when it was initially launched would be off, unless there is a game profile. Exactly how much that is an issue by the time Crossfire is actually released now is an entirely different question, but the argument behind profiling is a bit of a red herring IMO.

Many of the points you mention are actually mentioned in the Crossfire FAQ

http://www.ati.com/technology/crossfire/faq.html
http://www.ati.com/technology/crossfire/faq1.html

[Edit] - Disabling Cat AI is easy enough; what I haven't figured out is how to change the Crossfire rendering mode.
 
Frankly, even after following the whole thread for its entire 4 pages I still haven't understood what is what, so feel free to call me dumb.

If I connect my 21" CRT to a Crossfire system (via a DVI dongle obviously) will I get the supported refresh rates by the monitor in 1600 and beyond or will I be limited to 60Hz?
 
Dave Baumann said:
NVIDIA appear to be stuck on the issue of profile/no-profiles, but what ATI were trying to impress is that the default for Crossfire will be on, whereas the default for SLI when it was initially launched would be off, unless there is a game profile. Exactly how much that is an issue by the time Crossfire is actually released now is an entirely different question, but the argument behind profiling is a bit of a red herring IMO.

Many of the points you mention are actually mentioned in the Crossfire FAQ

http://www.ati.com/technology/crossfire/faq.html
http://www.ati.com/technology/crossfire/faq1.html

Frankly, the PR from both sides is a mess IMO. "Red" or "green" herring, I'd have to take both sides' stuff and filter out what is true and what isn't, and what's an exaggeration and what isn't.

From the second page of that FAQ:

Competitive solutions only work on a limited number of games that are profiled in the driver. New games, older games, lesser known games, and even some current popular titles are not supported, and the end user sees no benefit with this system when running these applications.

Meaning the SLI user cannot profile a game himself if it isn't profiled?

While I do understand what you mean with the "default" on CrossFire, I would still think that users would prefer to have the chance to alter rendering modes, with the driver being completely transparent about multi-GPU rendering modes. If it's just hidden somewhere in the driver and you haven't figured it out yet, that's another story, yet I still have to wonder why it can't be made so simple that everyone can find it, if it can be altered.
 
Ailuros said:
Frankly, even after following the whole thread for its entire 4 pages I still haven't understood what is what, so feel free to call me dumb.

If I connect my 21" CRT to a Crossfire system (via a DVI dongle obviously) will I get the supported refresh rates by the monitor in 1600 and beyond or will I be limited to 60Hz?
People are confusing refresh rates and frame rates (much like Fuad confused bus widths and chip density, and I confused division with imagination). If Xfire is limited in its communication by the bandwidth of the SI TMDS chips (16x12@60Hz, or 19x12@~60Hz with reduced blanking intervals), then that's a frame rate limitation in some Xfire modes.

But people seem to think this high-res @ 60Hz frame rate limitation means they won't be able to drive their CRTs at a more comfortable refresh rate, which is apparently incorrect, as CRTs are driven by a RAMDAC that is probably a smidge faster than 165MHz--or else all the X8x0 owners driving analog CRTs via DVI-to-DB15 adapters would have made something of a fuss.

I don't believe Xfire changes the way analog monitors are driven (read: supported refresh rates via RAMDAC); it's just that the inter-GPU communication is limited by the TMDS transmitter and receiver team's 165MHz bandwidth. But, as I pointed out, AnandTech's Xfire review showed a 136fps framerate in HL2 at 16x12, so either it was running in AFR (and thus has the headroom of double a single DVI-spec TMDS transmitter, or 16x12@120Hz), Derek's FPS detector was detecting screen refreshes rather than legitimate frame buffer updates, or the framerate isn't as obviously limited.
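For anyone who wants to sanity-check that 165MHz figure, here's a rough back-of-the-envelope sketch. The total (blanked) timings are the standard VESA numbers as best I recall them, not necessarily what the SI chips actually use, so treat the exact MHz values as approximate:

```python
# Back-of-the-envelope pixel clocks versus a single-link (165MHz) TMDS link.
# Total (blanked) timings are the VESA figures as I remember them; treat
# them as approximate rather than authoritative.
TMDS_LIMIT_MHZ = 165.0

modes = {
    "1600x1200@60 (standard blanking)": (2160, 1250, 60),
    "1920x1200@60 (reduced blanking)":  (2080, 1235, 60),
    "1600x1200@85 (standard blanking)": (2160, 1250, 85),
}

for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6  # pixel clock incl. blanking
    verdict = "fits" if clock_mhz <= TMDS_LIMIT_MHZ else "exceeds"
    print(f"{name}: ~{clock_mhz:.1f}MHz -> {verdict} a 165MHz link")
```

Which is why 16x12@60 (and 19x12@60 with reduced blanking) squeaks through the link while anything faster doesn't, even though the RAMDAC driving the CRT itself has far more headroom.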

Dave, some of us took the time to think. :p

Josh, I'm off to read your article.
 
Pete said:
But people seem to think this high-res @ 60Hz frame rate limitation means they won't be able to drive their CRTs at a more comfortable refresh rate,
No, it's entirely correct. Running the CRT at a different refresh rate than the one at which the compositing chip is receiving images would require that this chip have lots of memory (1600x1200 would require about 16MB, 2048x1536 about 24MB), and would add latency. This chip does not appear to have any memory of its own, so it's just going to be receiving the signals from both chips and outputting them directly as it receives them.

This directly means that the chip will only be able to run a CRT at 60Hz for 1600x1200, unless the TMDS is updated for the soon-to-be-shipped cards.
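Those figures work out to roughly two full 32-bit frames' worth of memory; here's the quick arithmetic (the double-buffering assumption is just one way of reading it, not something ATI has confirmed):

```python
# Rough check of the figures above, assuming the compositing chip would need
# roughly two full 32-bit (4 bytes/pixel) frames buffered to decouple its
# input rate from the CRT's refresh rate. The double-buffering assumption
# is mine, not ATI's.
BYTES_PER_PIXEL = 4
FRAMES_BUFFERED = 2

for width, height in [(1600, 1200), (2048, 1536)]:
    megabytes = width * height * BYTES_PER_PIXEL * FRAMES_BUFFERED / (1024 ** 2)
    print(f"{width}x{height}: ~{megabytes:.0f}MB of buffer memory")
# -> roughly 15MB and 24MB, in the same ballpark as the figures above
```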
 
Pete said:
People are confusing refresh rates and frame rates (much like Fuad confused bus widths and chip density, and I confused division with imagination). If Xfire is limited in its communication by the bandwidth of the SI TMDS chips (16x12@60Hz, or 19x12@~60Hz with reduced blanking intervals), then that's a frame rate limitation in some Xfire modes.

But people seem to think this high-res @ 60Hz frame rate limitation means they won't be able to drive their CRTs at a more comfortable refresh rate, which is apparently incorrect, as CRTs are driven by a RAMDAC that is probably a smidge faster than 165MHz--or else all the X8x0 owners driving analog CRTs via DVI-to-DB15 adapters would have made something of a fuss.

While I appreciate the detailed analysis, I only asked for a yes or no answer, because you're only confusing me further.

Frankly I never really cared about such details up to now, hence my complete lack of knowledge/understanding of the whole story.
 
Ailuros said:
While I appreciate the detailed analysis, I only asked for a yes or no answer, because you're only confusing me further.

Frankly I never really cared about such details up to now, hence my complete lack of knowledge/understanding of the whole story.
:LOL:
 
I was hoping Anandtech's preview used a CRT, but noooo:
Not being able to use a good CRT monitor, we had to resort to our 1600x1200 LCD for testing, limiting our max resolution.
Awfully convenient, no? :devilish:

Chal, forgive my density, but how then does the compositor handle frame rates lower than the 60Hz refresh rate: by repeating the previous frame until it has enough data for a new one? For that matter, I'd like to know how it handles fps higher than Hz. Does it discard frames, continually combine successive ones until the display is ready, ...? Does SuperAA indicate that the Compositing Engine has enough memory to combine at least two full frames into one? Don't feel compelled to answer, as I'm asking too many Qs that I'm sure will be answered in a week or two.

Ail, sorry. Short answer: dunno. :) I guess yes (and Dave seems to argue this), but Chal makes a case for no that I don't know enough about to refute.
 
Ailuros said:
If I connect my 21" CRT to a Crossfire system (via a DVI dongle obviously) will I get the supported refresh rates by the monitor in 1600 and beyond or will I be limited to 60Hz?
Evidently 60Hz.

Ailuros said:
While I do understand what you mean with the "default" on CrossFire, I would still think that users would prefer to have the chance to alter rendering modes, with the driver being completely transparent about multi-GPU rendering modes. If it's just hidden somewhere in the driver and you haven't figured it out yet, that's another story, yet I still have to wonder why it can't be made so simple that everyone can find it, if it can be altered.
The question is, if it works as it's advertised to, then what actual need is there to alter the settings? I can see both points of view - some end users may want to tweak, but on the flip side there's no real need to tweak if it's doing the necessary job, and this is kind of in keeping with ATI's previous methods and statements.

(Perhaps, though, end users do want to tweak their apps because there is such a dearth of actually decent PC software out there right now that they want to fill their time experimenting with their graphics cards, which is what makes all of this business a little absurd - for the cost of two high-end graphics boards and a new platform for it, end users could have bought a next-gen console and 10-15 games, and they would probably be getting a far greater selection of games to choose from in short order.)
 
Dave Baumann said:
for the cost of two high-end graphics boards and a new platform for it, end users could have bought a next-gen console and 10-15 games, and they would probably be getting a far greater selection of games to choose from in short order.
So, if we are going to be so picky, we should just become console gamers?
 
radeonic2 said:

As funny as it may seem I didn't even know if I'd be able to maintain my refresh rates on the CRT while using a DVI dongle on the G70. Obviously all the GPUs I used up to now had an analog connector and there wasn't much reason for me to bother with DVI ports. For any LCD I ever used I used DVI.
 
Dave Baumann said:
Evidently 60Hz.

Then I can wholeheartedly understand all objections.


The question is, if it works as it's advertised to, then what actual need is there to alter the settings? I can see both points of view - some end users may want to tweak, but on the flip side there's no real need to tweak if it's doing the necessary job, and this is kind of in keeping with ATI's previous methods and statements.

I can see a multi-GPU system making real sense in an enthusiast's (or even a professional's) hands, but not for somebody who just needs a plug-and-play GPU.

Question: assume a game has an AFR profile and it scales performance by a good percentage; if the usual GPU performance is good enough for me, but I want higher AA sample densities instead, can I enable those with AFR too? (Yes, that's an equally honest question, and no, I don't know...)

(Perhaps, though, end users do want to tweak their apps because there is such a dearth of actually decent PC software out there right now that they want to fill their time experimenting with their graphics cards, which is what makes all of this business a little absurd - for the cost of two high-end graphics boards and a new platform for it, end users could have bought a next-gen console and 10-15 games, and they would probably be getting a far greater selection of games to choose from in short order.)

That's like saying that there is some sort of absurdity on ATI's side in deciding to invest in multi-GPU platforms. I severely doubt that either IHV targets just the PC ultra-high-end market. IMHO ATI didn't want to leave NVIDIA standing alone with multi-GPU solutions in the professional market either, but I could be entirely wrong about that too, of course.
 
Ailuros said:
I can see a multi-GPU system making real sense in an enthusiast's (or even a professional's) hands, but not for somebody who just needs a plug-and-play GPU.
But again, what actual need is there to go changing things if you are already getting a decent performance increase?

Question: assume a game has an AFR profile and it scales performance by a good percentage; if the usual GPU performance is good enough for me, but I want higher AA sample densities instead, can I enable those with AFR too? (Yes, that's an equally honest question, and no, I don't know...)
I'm not sure I quite understand the question, but I'll have a go anyway. I think this is actually addressed by ATI's FAQ - up to 6x FSAA you'll be rendering in dual graphics mode, with the rendering type that goes with that particular title or the default for your board configuration; go beyond 6x and you'll be using SuperAA. In essence this is very similar to SLI (although with Crossfire there is a line where you go over from dual graphics rendering to SuperAA, while SLI AA overlaps with the 8xS normal rendering).
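To put that dividing line into rough pseudocode (purely illustrative, based only on the description above - the function and mode handling are not how ATI's driver actually works):

```python
# Illustrative only: the 6x dividing line described above, not how ATI's
# driver actually implements it. Mode names are just labels.
def pick_crossfire_mode(requested_aa, title_mode="SuperTiling"):
    """Return the rendering mode for a title at a given AA level."""
    if requested_aa <= 6:
        # Normal dual-graphics rendering: whatever mode the title's
        # profile (or the board-configuration default) calls for.
        return title_mode
    # Beyond 6x, both GPUs render the full frame with offset sample
    # positions and the compositing engine blends them (SuperAA).
    return "SuperAA"

print(pick_crossfire_mode(4, "AFR"))   # -> AFR
print(pick_crossfire_mode(12, "AFR"))  # -> SuperAA
```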
 
Dave Baumann said:
But again, what actual need is there to go changing things if you are already getting a decent performance increase?

The scenario that occurs to me on the above is: an ISV releases a patch that breaks the mode the "profile" uses... and now you either have to reinstall the game without the patch (and maybe there was something in that patch you really wanted), or you are screwed until ATI addresses it in the driver.
 
Dave Baumann said:
But again, what actual need is there to go changing things if you are already getting a decent performance increase?

I simply cannot understand yet how this is supposed to work. If there are game profiles I can't imagine how ATI has managed to include profiles for each and every single game out there, no matter how old or "rare".

What exactly speaks against having game profiles, whether on single- or multi-GPU systems? I just can't follow the reasoning for why this should be a problem.


I'm not sure I quite understand the question, but I'll have a go anyway. I think this is actually addressed by ATI's FAQ - up to 6x FSAA you'll be rendering in dual graphics mode, with the rendering type that goes with that particular title or the default for your board configuration; go beyond 6x and you'll be using SuperAA. In essence this is very similar to SLI (although with Crossfire there is a line where you go over from dual graphics rendering to SuperAA, while SLI AA overlaps with the 8xS normal rendering).

The answer is clear enough though ;) I was just worried that the user wouldn't be able to enable SuperAA even if a game is set for AFR for instance :)
 
Ailuros said:
If there are game profiles I can't imagine how ATI has managed to include profiles for each and every single game out there, no matter how old or "rare".

Eh? That's exactly what they haven't done. They have provided a default "on" position for Crossfire for all games (in D3D, for instance, on 16-pipe cards the default is SuperTiling); the profiles are app-specific alternate modes for cases where the default position doesn't work (or where another mode provides a better increase) - so the profile should be the exception rather than the norm (like the app-specific stuff in Cat AI in the first place).
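In other words, something conceptually like this, where the profile table is the exception list and everything else falls through to the default (the names and entries are made up for illustration; this isn't ATI's actual logic):

```python
# Conceptual sketch only: profiles as exceptions layered on a default "on"
# mode, as described above. The profile entries are invented examples, not
# ATI's actual profile list or driver logic.
DEFAULT_D3D_MODE = "SuperTiling"  # the 16-pipe D3D default mentioned above

profiles = {
    # executable -> alternate mode, used only where the default doesn't
    # work or where another mode scales better
    "some_title.exe": "AFR",
    "another_title.exe": "Scissor",
}

def crossfire_mode(app_exe):
    """Profile hit -> alternate mode; otherwise the default applies."""
    return profiles.get(app_exe, DEFAULT_D3D_MODE)

print(crossfire_mode("some_title.exe"))       # -> AFR (profiled exception)
print(crossfire_mode("unprofiled_game.exe"))  # -> SuperTiling (default on)
```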
 
....the profiles are app-specific alternate modes for cases where the default position doesn't work (or where another mode provides a better increase) - so the profile should be the exception rather than the norm...

And how exactly is it guaranteed that all games will benefit from (or at least work with) the default mode? Driver teams have trouble detecting all possible bugs or quirks across all applications, and I just can't understand how ideal modes can be guaranteed for all cases.
 
Ailuros said:
As funny as it may seem I didn't even know if I'd be able to maintain my refresh rates on the CRT while using a DVI dongle on the G70. Obviously all the GPUs I used up to now had an analog connector and there wasn't much reason for me to bother with DVI ports. For any LCD I ever used I used DVI.
Ail, radeonic was laughing at the smackdown you gave my answer. :)
 
The implication is that the "ideal" mode has only been selected for the ones with profiles. One hopes that the default mode (i.e. for the great mass of titles) is a max-compatibility mode with something close to 100% reliability, rather than a max-performance mode that may or may not work. Then you can fiddle from there on those non-profiled apps. At least that's how I'm interpreting it.
 
Here's a thread I posted on HardOCP about what Release 80 is expected to bring to the table for SLI.

http://hardforum.com/showthread.php?t=954094

Just thought that should be added for contrast in relation to CrossFire.

My opinion is that ATI had better release it "now", as in yesterday, because the longer they wait, the more features SLI adds to bring it to parity. At one point CrossFire was going to have some features that nVIDIA's SLI didn't, like mixing cards from different vendors or even with different framebuffer sizes.

If they wait, they'll be facing a competing product that's even closer to parity, or possibly one with even more features.

They've got to sell as many as possible to recoup their R&D costs.
 