Tech Report: CrossFire DUAL SLAVE!

stevem said:
Does the SiI1161/2 I/O pair on the master mean that composite res is limited to 1600x1200?

It's a dual-link DVI, AFAIK.

As for the master/slave solution, if they do decide to keep the composite device then I'm wondering if a separate board, like the Alienware solution (but with internal connections), might not be a better option - board vendors only need to carry one extra SKU (the composite board), users only pay the extra cost if they actually want it, and it allows better matching of board performance without worrying about different vendor boards.
 
That would be one way of using that PCI Express x1 slot that often sits sandwiched between two graphics cards!

But then it also raises the idea of putting the compositor on the motherboard. If a special motherboard is required just to have two physical PCI Express x16 slots, why not put the compositor there, too.

Sure, that reduces mobo compatibility...

Jawed
 
Jawed said:
But then it also raises the idea of putting the compositor on the motherboard. If a special motherboard is required just to have two physical PCI Express x16 slots, why not put the compositor there, too.

Well, this goes back to one of the original thoughts about ATI's solution - if it is the case that the chip can achieve all the composite functions (including supertile), then for motherboards that use ATI's integrated graphics you can just use that to composite the data.

Of course, motherboard solutions are too limiting in other ways.
 
Their FAQ seemed to say that the compositing chip isn't going away anytime soon:

http://www.ati.com/products/crossfire/faq.html

Only now it is gone (I believe I quoted it on another thread a day or so after release)! Interesting. Well, it said that it would allow them to do other cool things with next generation hardware. Probably hinting at the stereoscopic support you mention.

Here's a semi-bizarre notion --can they somehow get the compositing chip *on the dongle* (or other connector bridge, internal or external)? That would be a neat solution, and not cost you extra transistors on every card, whether it is ever used for CrossFire or not. . .
 
DaveBaumann said:
stevem said:
Does the SiI1161/2 I/O pair on the master mean that composite res is limited to 1600x1200?

It's a dual-link DVI, AFAIK.

As for the master/slave solution, if they do decide to keep the composite device then I'm wondering if a separate board, like the Alienware solution (but with internal connections), might not be a better option - board vendors only need to carry one extra SKU (the composite board), users only pay the extra cost if they actually want it, and it allows better matching of board performance without worrying about different vendor boards.

I think this hits the nail on the head. If you want to pair different boards together you will need the master/slave solution, but if you have two matching boards from a particular vendor you may be able to do everything slave/slave.
 
DaveBaumann said:
It's a dual-link DVI, AFAIK.
Hmm. AFAICS, once the two R4x0 output images hit the compositor, the only way out from the FPGA is via DVI (SiI1162) or RAMDAC (AD7123). SiI1161 input from the slave R4x0 may not pose a problem as it's half a frame, but both SiI devices are limited to 1600x1200. They'll need e.g. SiI178 for dual link or SiI1172 (225MHz), unless the master R4x0 TMDS can also be accessed. Of course it may not be a final rev board yet & I'm sure they've considered this.
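The arithmetic behind stevem's worry is straightforward to sketch. A single-link TMDS transmitter in that class tops out around a 165 MHz pixel clock, and the required clock for a mode is the total pixels per frame (active plus blanking) times the refresh rate. A minimal sketch, using standard VESA frame totals (the specific totals below are the commonly published ones, not anything from the thread):

```python
# Rough check: which modes fit under a ~165 MHz single-link TMDS clock
# (the class of limit being discussed for the SiI parts on the master)?
# Totals include blanking, per the usual VESA timings.

SINGLE_LINK_LIMIT_MHZ = 165.0

# mode name -> (h_total, v_total, refresh_hz), totals including blanking
modes = {
    "1280x1024@85": (1728, 1072, 85),
    "1600x1200@60": (2160, 1250, 60),
    "2048x1536@60": (2800, 1589, 60),
}

for name, (h_total, v_total, refresh) in modes.items():
    clock_mhz = h_total * v_total * refresh / 1e6
    verdict = "fits" if clock_mhz <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
    print(f"{name}: ~{clock_mhz:.1f} MHz -> {verdict}")
```

1600x1200@60 lands at ~162 MHz, just under the limit, which is why it shows up as the ceiling; 2048x1536@60 needs roughly 267 MHz and is out of reach without dual link.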

geo said:
Here's a semi-bizarre notion --can they somehow get the compositing chip *on the dongle* (or other connector bridge, internal or external)?
I can't see why an external compositing block would be a problem. It can then be used for current & future gen boards as a separate add-on. FPGA programming via drivers may not be as easy or quick, although IIRC config is stored on EEPROM.
 
Output from each of the boards is actually a full frame (certainly in tiled mode, possibly in scissor as well), but parts of the image that either board didn't render are just blank.
 
So it appears that CrossFire is limited to a maximum resolution of 1600x1200.

Whereas SLI is limited to 2048x1536.

Hmm...

Jawed
 
trinibwoy said:
Why is it limited to 1600x1200? Sorry if it was explained before and I missed it.

Well, it's a guess. But Dave has just said that the slave card writes a full frame via DVI. As I understand it, DVI is limited to 1600x1200. That's why you need 2x DVI ports to get the Apple 30" LCD to work (2560x1600, isn't it?).

So if the slave is writing a full frame then it's limited to 1600x1200. If, instead, it was writing supertiles, or "a scissored half", the DVI bandwidth could have been used non-linearly, i.e. the total resolution after compositing could be more than 1600x1200.

Anyway, this is based on my guess...

Jawed
 
Oh, I thought DVI was just a digital interface with a peak bandwidth that you can use however you want. Why is DVI limited to 1600x1200? Is it because that's the max resolution that the available bandwidth can handle for some maximum refresh rate?
 
DaveBaumann said:
Output from each of the boards is actually a full frame (certainly in tiled mode, possibly in scissor as well), but parts of the image that either board didn't render are just blank.
Yeah, I was being generous regarding input from slave. I meant to say if it's half a frame. Final image composition & output is more of an issue. I wonder whether it's all bogus & will indeed be dropped...?:)

Scissor with vertical split could also be useful (easier than supertiling?) for dual displays - if possible. The current dongle precludes this with two boards. Presumably you'd need a four card setup with two masters.

trinibwoy said:
Oh, I thought DVI was just a digital interface with a peak bandwidth that you can use however you want. Why is DVI limited to 1600x1200? Is it because that's the max resolution that the available bandwidth can handle for some maximum refresh rate?
Yep. Not all transmitters are created equal.
 
Yeah, it's down to bandwidth. No different from a bus. Earlier incarnations of DVI were even more limited.

It seems the dual-link DVI port is actually a double-DVI port in one connector (dual link uses all 24 digital pins, versus 18 for single link, I think). I misinterpreted the connection between the 30" display and the graphics card as requiring two cables :oops:

Jawed
 
Jawed said:
So it appears that CrossFire is limited to a maximum resolution of 1600x1200.

Whereas SLI is limited to 2048x1536.

Hmm...

Jawed
And not just that, wouldn't you also be limited to 60 Hz @ 1600x1200 transferred from the slave to the master, effectively capping the framerate at 60 fps for scissor, tile and SuperAA modes? (and for 1280x960/1024 the cap would be 85 fps)

I don't think those limitations will sit well with the monitor crowd.
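The refresh caps mentioned above follow from the same transmitter-clock arithmetic: since the slave's frame crosses the link inside a normal DVI video timing, the ceiling is the TMDS clock limit divided by the total pixels per frame including blanking. A rough sketch, assuming a 165 MHz single-link limit and the usual VESA totals:

```python
# Max refresh (and hence composited fps) the slave link could sustain,
# assuming a ~165 MHz single-link TMDS limit and VESA totals w/ blanking.

TMDS_LIMIT_HZ = 165e6

timings = {
    "1600x1200": 2160 * 1250,  # total pixels per frame incl. blanking
    "1280x1024": 1728 * 1072,
}
for mode, total in timings.items():
    print(f"{mode}: max ~{TMDS_LIMIT_HZ / total:.0f} Hz over the link")
```

That works out to roughly 61 Hz for 1600x1200 (so the standard 60 Hz mode is the practical cap) and ~89 Hz for 1280x1024 (so the standard 85 Hz mode), matching the 60/85 fps caps worried about above.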
 
trinibwoy said:
Oh, I thought DVI was just a digital interface with a peak bandwidth that you can use however you want. Why is DVI limited to 1600x1200? Is it because that's the max resolution that the available bandwidth can handle for some maximum refresh rate?

My 2405FPW is 1920x1200, and hooked thru DVI.
 
Any idea what the effective frame rate is, Fallguy?

All very curious this. I use a CRT so I'm in the dark really.

Jawed
 
The frame rate won't be capped; the display rate might be. People tend to notice the display rate less than they notice the response rate.
 
I wouldn't be surprised if the master gets its video sync from the compositor chip, so the master runs in sync with the slave. Running the two chips in sync will save a lot of memory on the compositor.
If that's the case, it would be hard to put the compositor chip on the cable, or on a separate board using two regular "slave" boards.

You could of course do the sync over PCIe. But since that interface isn't made to keep things in sync at that fine a level (I believe there are rather deep buffers in the interface), they'd need a lot more buffering on the compositor chip.
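The memory saving from genlocking the two GPUs is easy to estimate: a synced compositor only needs a small FIFO of scanlines, while a free-running one must be able to hold a whole slave frame. A back-of-envelope sketch (the 8-line FIFO depth is an assumption for illustration, not a known figure):

```python
# Back-of-envelope: compositor buffering with vs. without scanout sync,
# at 1600x1200, 32-bit pixels. The 8-scanline FIFO depth is an assumed,
# illustrative value.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1600, 1200, 4

synced_lines = 8  # hypothetical small FIFO when master and slave are genlocked
synced_bytes = WIDTH * synced_lines * BYTES_PER_PIXEL
unsynced_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # worst case: a full frame

print(f"synced:   {synced_bytes / 1024:.0f} KiB")          # 50 KiB
print(f"unsynced: {unsynced_bytes / (1024 * 1024):.1f} MiB")  # 7.3 MiB
```

A few tens of KiB fits comfortably in an FPGA's on-chip RAM; several MiB generally would not, which is why losing sync makes the "compositor on a cable" idea much harder.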

SuperAA can still work, because that mode probably copies the slave frame into a texture on the master card, so it already wastes that memory.

This may also be one of the reasons ATI likes the compositor chip more. I think they could do all the modes on the GPU, but it would involve copying the frame to a texture that is then blended in the shaders. It would eat PCIe bandwidth, memory on the master, and shader power.
 
Xmas said:
The switching card is neither against the spec nor patented by NVidia, nor required for SLI, nor limited to SLI. So please tell me, in what way is this switching card proprietary?

And could you tell me why you think switching at the core logic level is less expensive?

I thought I already mentioned the fact that even "SLI-capable" advertising on the part of mboard manufacturers requires that a licensing fee be paid to nVidia. That immediately makes it more expensive. As to whether auto-switching is cheaper, two things:

(1) Auto-config, from RAM to graphics slots to PCI slots to IDE controllers and so on, has been commonplace for a long, long time. Generally, BIOS-controlled autoconfig is a feature customers want and prefer over using manual switches and jumpers for configuration (and the manufacturers have responded). Economies of scale have taken care of any price differential that may at one time have existed.

(2) Since the dual-slot PCIe x16 spec originated with Intel as opposed to nV, can you name any dual-slot PCIe x16 mboard manufacturer who is not only paying nV for SLI support but who is also paying Intel for the use of the dual-slot PCIe x16 spec (just curious)? If not, I rest my case...;)


Hm, you seem to have missed the joke...

Yup...:D
 