Crossfire limitation

Ailuros said:
If they don't, they deserve to be shot heh...



Typical 19" CRT stretch content usually beyond a 1024 height. If you know the exact measurements of the viewable area of the monitor and it's dot pitch size, it's easy to figure out where it doesn't stretch.

If you have a 0.25mm dot pitch, for instance, and want to reach a vertical resolution of 1536, you'd need a viewable screen height of 384mm. A 21" CRT is usually around 305mm ;)
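
To put rough numbers on that (a quick Python sketch; it assumes one pixel per phosphor triad, which is optimistic since dot pitch is usually quoted diagonally):

```python
# Back-of-the-envelope: viewable size needed so each pixel gets at
# least one phosphor triad, given the dot pitch in mm.
def min_viewable_mm(pixels, dot_pitch_mm=0.25):
    return pixels * dot_pitch_mm

print(min_viewable_mm(1536))  # 384.0 mm needed for 1536 lines
print(min_viewable_mm(1024))  # 256.0 mm -- a typical CRT's height handles 1024 fine
```
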
The point is the 60Hz eye strain.
I don't care if it doesn't improve the detail; I care about the work area and, for games, the reduction of jaggies.
 
radeonic2 said:
The point is the 60Hz eye strain.
I don't care if it doesn't improve the detail; I care about the work area and, for games, the reduction of jaggies.
Yup, which for me would mean a maximum resolution of 1280x1024 (which should work at 85Hz, I think) for this solution. But I already game at that res in most of my games anyway (well, I choose 1280x960, because that's actually a normal 4:3 aspect ratio...).
 
radeonic2 said:
The point is the 60Hz eye strain.
I don't care if it doesn't improve the detail; I care about the work area and, for games, the reduction of jaggies.

The 60Hz limitation (if true) is a major problem for CRT monitors.

Oh, and to further nitpick: higher resolutions don't reduce jaggies, they make 'em smaller *runs for his life*.
 
Chalnoth said:
Yup, which for me would mean a maximum resolution of 1280x1024 (which should work at 85Hz, I think) for this solution. But I already game at that res in most of my games anyway (well, I choose 1280x960, because that's actually a normal 4:3 aspect ratio...).
I used to game at the odd res of 1200x900 @ 75Hz when I still had my Radeon 8500 :smile:
The CRT I use has a .26 dot pitch, so it's not very sharp. There's a nice smaller CRT at school that's really sharp.
Ailuros said:
The 60Hz limitation (if true) is a major problem for CRT monitors.

Oh, and to further nitpick: higher resolutions don't reduce jaggies, they make 'em smaller *runs for his life*.
That's what I said :???:
You won't be further nitpicking, because I was saying that for CRT owners it's a major concern, a deal breaker for larger monitors.
Smaller = less perceived jaggies, btw.
 
IMO the whole Crossfire/SLI thing is dumb anyway. It made sense back when two Voodoos were a little over $400. Now one card costs that.

Then you have to spend all this crazy freaking money when the upgraded card, who knows, maybe even the refresh, will equal or pass you in performance.

It makes no sense at all unless you gain the ability to game at, say, 1280x1024 with *extreme* levels of AA.

Still, can anyone justify $1200 for a pair of 6800 Ultras, and 200 bucks for the mobo, all to get its pants beaten off by a GTX a short time later?
 
So the limitation is that the DVI only goes up to 1600x1200? I thought Dell's 2405 went up to 1920x1200 on a single DVI link. What am I missing?
 
Karma Police said:
So the limitation is that the DVI only goes up to 1600x1200? I thought Dell's 2405 went up to 1920x1200 on a single DVI link. What am I missing?

I think it uses reduced blanking to make it scale higher. Anyway, one solution is to use the SiI 1171, since I think it's pin-compatible with the 1161 and has enough bandwidth to get to 1920x1200@60Hz, which should be a lot better than the current limitation. There may be some TMDS encoding changes needed to make it compatible with the 225MHz bandwidth of the SiI 1171.
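
For the curious, a rough sanity check of the pixel clocks involved (the blanking figures are approximate CVT/CVT-RB values, not exact timings; 165MHz is the single-link DVI spec limit, 225MHz the figure quoted for the SiI 1171):

```python
# Approximate single-link pixel clocks (MHz) for 1920x1200@60.
def pixel_clock_mhz(h, v, hz, h_blank, v_blank):
    return (h + h_blank) * (v + v_blank) * hz / 1e6

# Standard CVT blanking: ~193.6 MHz -- over the 165 MHz single-link limit.
print(pixel_clock_mhz(1920, 1200, 60, 672, 45))
# Reduced blanking: ~154.1 MHz -- fits, which is how the 2405 gets away with it.
print(pixel_clock_mhz(1920, 1200, 60, 160, 35))
```
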
 
The point is that it would be dumb to include a dual-link TMDS on the Crossfire master board, because the limitation is not on the master board; the limitation is that all current Radeons have at most a single-link TMDS output, which limits the maximum output that the slave can present to the master.
 
Hellbinder said:
Still, can anyone justify $1200 for a pair of 6800 Ultras, and 200 bucks for the mobo, all to get its pants beaten off by a GTX a short time later?
If you call over a year a short time...you always pay more if you want high performance earlier. And the 6800 Ultra SLI still often beats a single 7800 GTX. I'm not currently in a financial position where it'd be close to worth it, but buying a dual 6800 Ultra setup was perhaps the best-motivated SLI purchase at the launch of the product.
 
Chalnoth said:
If you call over a year a short time...you always pay more if you want high performance earlier. And the 6800 Ultra SLI still often beats a single 7800 GTX.

Exactly. There is always a market among those who don't care about money. ATI first ignored it, now wants to be a part of it, and finally, after a year or more, they seem to have designed an inadequate (better to say inferior) product.
 
Druga Runda said:
Exactly. There is always a market among those who don't care about money. ATI first ignored it, now wants to be a part of it, and finally, after a year or more, they seem to have designed an inadequate (better to say inferior) product.
Well, we'll see how it does when it's finally released. There's no reason this TMDS issue needs to last very long, for instance, but it does mean that it won't make any sense for people to upgrade their current ATI hardware to Crossfire, which would seem to be a pretty major blow. I'd claim that the status of the TMDS on ATI's soon-to-be-launched cards is still up in the air, so it may make sense to, for example, use Crossfire with the R520.

If the R520 has the same issue, however, that would seem to make Crossfire with that product rather pointless (since just one R520, if it's at similar performance to the GTX, could probably do 1280x960 with massive anti-aliasing anyway).
 
Ailuros (& many others) said:
The 60Hz limitation (if true) is a major problem for CRT monitors.
Seems to me this limitation would be for the framerate, not the monitor's refresh rate.*

60Hz is obviously not an issue with LCDs, but is the RAMDAC used for CRTs still on the GPU (and ~400MHz), or is it in the DVI-to-DB15 adapter? I'm guessing the former, with the analog signal passed through those extra DVI-I pins, separate from the DVI-D ones. Either way, DVI + adapter has got to allow for more than 60Hz at 16x12 on a CRT, or you'd think some owners of high-end, dual-DVI nV boards and 22" CRTs would be pissed and vocal.

-----------
* If so, then why did Derek hit 136fps in HL2 at 16x12? Even AFR mode at DVI spec should mean a 16x12@60Hz limit. Hmmm, unless they used reduced blanking, which means 16x12@72Hz --> @144Hz for AFR (only every other frame is thrown to the compositing chip via TMDS), and thus is within the DVI-spec'ed 165MHz TMDS limit.

Mebbe ATI OC'ed their SiI chip to the 225MHz limit listed in that SiI whitepaper?

Edit: Hmmm, Chal, I haven't looked into the compositing chip. I'm just assuming this SiI chip is used for inter-GPU communication, and that's the big deal. I'll trust you while I look into it. :)
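
Playing with the numbers in that footnote (same caveat as before: the blanking figures are approximations):

```python
def pixel_clock_mhz(h, v, hz, h_blank, v_blank):
    return (h + h_blank) * (v + v_blank) * hz / 1e6

# 1600x1200@60 with standard blanking: exactly 162 MHz, just under the
# 165 MHz single-link TMDS limit -- hence the 16x12@60Hz ceiling.
print(pixel_clock_mhz(1600, 1200, 60, 560, 50))
# 1600x1200@72 with reduced blanking: ~156.5 MHz, still within spec; with
# AFR only the slave's half crosses the link, so 72Hz over the link could
# support ~144fps total once the master contributes the other half.
print(pixel_clock_mhz(1600, 1200, 72, 160, 35))
```
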
 
One more thing I'd like to quickly reiterate:
Alternate frame rendering shouldn't be nearly as limited as the other modes, provided ATI can fudge it so that the cards will output at strange refresh rates like 37.5Hz or 42.5Hz (which, when combined, will result in 75Hz and 85Hz, respectively). But the resolution will still be limited to 1600x1200, which a single R520 again should do just fine.
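
Toy arithmetic for the above (assuming the compositor simply interleaves the two streams, which is my reading, not something ATI has confirmed):

```python
# With AFR, each card pushes only every other frame over its single-link
# TMDS, so the per-card link clock is computed at half the target refresh.
def per_card_clock_mhz(h_total, v_total, target_hz, num_gpus=2):
    return h_total * v_total * (target_hz / num_gpus) / 1e6

# 1600x1200 with standard blanking (2160x1250 total) at an 85Hz target:
# ~114.8 MHz per card, comfortably under the 165 MHz single-link limit.
print(per_card_clock_mhz(2160, 1250, 85))
```
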
 
Pete said:
This limitation seems to be for the framerate, not the monitor's refresh rate.
I thought that at first, but the problem is that the slave card thinks it is rendering normally, and since it will output at 60Hz intervals (at 1600x1200) and the compositing chip doesn't appear to have its own memory, it seems extremely hard to believe that ATI could produce a master card that would run at a refresh rate of 85Hz in such a scenario.
 
geo said:
Don't you think this would be much more likely to be true if they had to take public responsibility for it? That maybe we'd get useful, but non-inflammatory, documents doing this?
Maybe, but any such official statement from one of these companies would be seen as inflammatory, since its purpose is to point out flaws in another's product.

And besides, if one company turns out to be wrong (whether willfully or not), they could get sued much more easily if they announce these flaws officially.
 
WRT the refresh rate, why are CRTs going to be disadvantaged by a TMDS? This is for digital displays, is it not? Surely the analogue refresh rate would be dictated by the RAMDAC?
 
Dave Baumann said:
WRT the refresh rate, why are CRTs going to be disadvantaged by a TMDS? This is for digital displays, is it not? Surely the analogue refresh rate would be dictated by the RAMDAC?
Who is this in response to?

Because for a Crossfire solution, it seems unlikely to me that one could run the two cards at different refresh rates.
 
This is perhaps a stupid question, but... couldn't you use the dual-DVI output of the slave card to get around this issue?

WRT IHVs telling on each other and keeping it true: IMHO that's where B3D et al. come in. If you get an anonymous hot tip, investigate it until you have independent confirmation. This will force the companies to only "report" what are genuinely important shortcuts the others are taking.
 
The RAMDAC for a CRT takes the framebuffer and runs freely with it - irrespective of the rate at which frames arrive in the framebuffer.

In CrossFire there appears to be a 1600x1200/1920x1080-ish @ 60Hz limitation for framebuffer data.

Framebuffer update rate does not equal CRT refresh rate.

But anyone with a 2560x1600 or 2048x1536 display is so out of luck.
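
A crude way to picture that decoupling (a made-up simulation, not how any driver actually works):

```python
# The RAMDAC rescans whatever is in the framebuffer at the monitor's
# refresh rate; new frames land at the (slower) framebuffer update rate.
refresh_hz, update_hz = 85.0, 60.0
for scan in range(10):
    t = scan / refresh_hz           # time of this scanout
    frame = int(t * update_hz)      # latest frame delivered by then
    print(f"scanout {scan}: shows frame {frame}")
# Some frames are shown twice: the CRT still refreshes at 85Hz even
# though the framebuffer only updates at 60Hz.
```
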

Jawed
 
Chalnoth said:
Who is this in response to?
That was in general, because there appears to be very little thinking going on.

Chalnoth said:
Because for a Crossfire solution, it seems unlikely to me that one could run the two cards at different refresh rates.
Only one card is producing a display image, and rather than the graphics core doing it, it's now done by an external display device. In a 3D scene the game is rarely running at the refresh rate, so some rendered images are missed and some are repeatedly resent - this is the same with Crossfire. The Crossfire compositing device is now effectively the final framebuffer for the image; regardless of the rate at which it is being sent images from the slave card, full images are still being presented to the RAMDAC or TMDS, so again they will be repeated.
 