Synchronization issues with SLI and CrossFire

All participants in the thread will please take an incense/beer/flower-sniffing/Crysis/GTA4 break to mellow out, before resuming the conversation at about two notches less intensity.

--Yr Friendly Mod Team
 
Just curious, how did Voodoo 2/5 scan-line interleave work? It sounds like an implementation of split-frame rendering to me. Did it suffer from similar problems? Was the performance scaling adequate? I don't believe it would have suffered from this stuttering problem anyway. On that note, how did the ATI Rage Fury Maxx work? AFR? Or did it use something similar to 3dfx's SLI?
 
Just curious, how did Voodoo 2/5 scan-line interleave work? It sounds like an implementation of split-frame rendering to me. Did it suffer from similar problems? Was the performance scaling adequate? I don't believe it would have suffered from this stuttering problem anyway. On that note, how did the ATI Rage Fury Maxx work? AFR? Or did it use something similar to 3dfx's SLI?

To my recollection it just plain worked. Each card spat out a line of pixels at a time, to be sent to the 2D card for output.

Rage Fury Maxx used AFR, IIRC.
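For anyone fuzzy on the difference between the two schemes being discussed, here is a minimal sketch. The function names and counts are purely illustrative, not from any real driver:

```python
# Hypothetical sketch of the two work-splitting schemes discussed above.

def voodoo2_sli_owner(scanline: int) -> int:
    """Voodoo2-style scan-line interleave: even/odd lines alternate cards."""
    return scanline % 2

def rage_maxx_owner(frame: int) -> int:
    """Rage Fury Maxx-style AFR: whole frames alternate between the two GPUs."""
    return frame % 2

# Lines 0..3 of one frame go to cards 0,1,0,1 under scan-line interleave:
print([voodoo2_sli_owner(line) for line in range(4)])  # [0, 1, 0, 1]
# Frames 0..3 go to cards 0,1,0,1 under AFR:
print([rage_maxx_owner(f) for f in range(4)])          # [0, 1, 0, 1]
```

Same round-robin math, but interleave splits within a frame while AFR splits across frames, which is where the frame-pacing issues come from.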
 
To my recollection it just plain worked. Each card spat out a line of pixels at a time, to be sent to the 2D card for output.

Rage Fury Maxx used AFR, IIRC.

In 3D mode on a Voodoo 1/2 the 2D card is completely disconnected from the monitor.

And on the Voodoo2, all geometry and setup was done on the CPU and loaded to both cards. That makes it a lot easier to deal with, since the cards are receiving a finished, static frame to render.
 
Just curious, how did Voodoo 2/5 scan-line interleave work? It sounds like an implementation of split-frame rendering to me. Did it suffer from similar problems? Was the performance scaling adequate? I don't believe it would have suffered from this stuttering problem anyway. On that note, how did the ATI Rage Fury Maxx work? AFR? Or did it use something similar to 3dfx's SLI?

To compare the solutions today is really unrealistic. All the Voodoo2 had to contend with was fill-rate problems. It had no shaders, render-to-texture, or geometry engine, and no games that took more than one or two rendering passes.

3dfx's approach wouldn't work any better than SFR or tiling does today. And it wasn't even a digital signal, so it was subject to signal degradation.
 
To compare the solutions today is really unrealistic. All the Voodoo2 had to contend with was fill-rate problems. It had no shaders, render-to-texture, or geometry engine, and no games that took more than one or two rendering passes.

3dfx's approach wouldn't work any better than SFR or tiling does today. And it wasn't even a digital signal, so it was subject to signal degradation.

Voodoo5 was digital, AFAIR.
 
To compare the solutions today is really unrealistic. All the Voodoo2 had to contend with was fill-rate problems. It had no shaders, render-to-texture, or geometry engine, and no games that took more than one or two rendering passes.

3dfx's approach wouldn't work any better than SFR or tiling does today. And it wasn't even a digital signal, so it was subject to signal degradation.

OK, cool, thanks for the info. So basically 3dfx's SLI would not work in today's environment due to a general increase in complexity. On that note, does anybody know the differences between the V5's SLI implementation and the V2's? I know the V2 passed analogue data from one card to the other, with each card rendering alternate scan lines, but how did the V5 work? If it was digital, as MTDE suggested, was it simply an implementation of SFR? Or was it tile-based?

Anyway, does anybody have the motivation to make a video of the micro-stuttering issue? I'd really like to see it with my own eyes so I can decide whether it's worth worrying about. I'm sure many others would also benefit from a video illustrating the problem.
 
OK, cool, thanks for the info. So basically 3dfx's SLI would not work in today's environment due to a general increase in complexity. On that note, does anybody know the differences between the V5's SLI implementation and the V2's? I know the V2 passed analogue data from one card to the other, with each card rendering alternate scan lines, but how did the V5 work? If it was digital, as MTDE suggested, was it simply an implementation of SFR? Or was it tile-based?

AFAIR it used multiple bands (I think they were 16 pixels wide or something like that).
 
OK, cool, thanks for the info. So basically 3dfx's SLI would not work in today's environment due to a general increase in complexity. On that note, does anybody know the differences between the V5's SLI implementation and the V2's? I know the V2 passed analogue data from one card to the other, with each card rendering alternate scan lines, but how did the V5 work? If it was digital, as MTDE suggested, was it simply an implementation of SFR? Or was it tile-based?

B3D to the rescue: http://www.beyond3d.com/content/articles/67/4
 
In V5 SLI each chip could work on a group of up to 128 horizontal lines. The lines went completely across the screen; a single line was never divided into sections the way Super-Tiling would do.

This group-of-lines processing improved cache/memory hit rates and improved the load balance between the chips, compared to the every-other-line scheme the V2 used.
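As a rough illustration of the band scheme described above: the screen is cut into full-width horizontal bands and the bands are dealt out to the chips round-robin. The 32-line band height below is an assumption for illustration only; per the post above, the real hardware could use bands of up to 128 lines:

```python
# Sketch of Voodoo5-style band SLI: each full-width band of scanlines
# belongs to one chip, dealt out round-robin. Band height is illustrative.

def v5_band_owner(scanline: int, band_height: int = 32, num_chips: int = 2) -> int:
    """Return which chip owns a scanline under band-based SLI."""
    return (scanline // band_height) % num_chips

# With 32-line bands and 2 chips: lines 0-31 -> chip 0, 32-63 -> chip 1, ...
print([v5_band_owner(line) for line in (0, 31, 32, 63, 64)])  # [0, 0, 1, 1, 0]
```

Coarser bands mean each chip walks contiguous memory for many lines in a row, which is where the cache-hit improvement over every-other-line interleave comes from.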
 
To my recollection it just plain worked. Each card spat out a line of pixels at a time, to be sent to the 2D card for output.

Rage Fury Maxx used AFR, IIRC.

Does anyone really remember it? I had a Rage Fury 32, which was a decent GPU, if slow, with good IQ compared to the first GeForce SDR (SGRAM), which I also had. The MAXX was basically two of the Rage 32 GPUs, right? I watched it in action before deciding NOT to buy it; I wanted it at first. I remember it as being buggy as hell; the ONLY one I remember liking it was [nV]Rollo, and I suspect that was only to make fun of it, though he liked weird technical stuff too. :p
Now that was really micro-stutter, and I remember an occasional vertical roll and some nasty missed syncs as well. When it worked it was OK, but ATi never supported Win2K as promised; even so I forgave ATi .. until the 2900 XT and then those awful-tasting 'sandwiches' I could not stomach anymore. =P
 
Anyway, does anybody have the motivation to make a video of the micro-stuttering issue? I'd really like to see it with my own eyes so I can decide whether it's worth worrying about. I'm sure many others would also benefit from a video illustrating the problem.

Maybe I'm missing something, but wouldn't you need a video camera and video playback with several times the frame rate of what you're recording in order to actually capture the variations in displayed frame rate? I suppose it helps that the camera doesn't take instantaneous samples, so the micro-stuttering might show up as variations in blurriness or something.
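Short of filming the screen, the effect also shows up directly in frame-time logs: a plain FPS counter hides it because alternating short/long AFR frames average out to a "good" number. A minimal sketch with made-up example frame times:

```python
# Why average FPS hides AFR micro-stutter: the frame times below are
# invented example data showing the alternating short/long AFR pattern.

frame_times_ms = [10, 30, 10, 30, 10, 30]

avg = sum(frame_times_ms) / len(frame_times_ms)
fps = 1000 / avg
# Simple unevenness metric: mean absolute difference between adjacent frames.
jitter = sum(abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])) \
         / (len(frame_times_ms) - 1)

print(f"avg frame time {avg:.1f} ms -> {fps:.0f} fps, jitter {jitter:.1f} ms")
# -> avg frame time 20.0 ms -> 50 fps, jitter 20.0 ms
```

A smooth single-card run at the same 50 fps would show near-zero jitter, which is why frame-time plots reveal what an FPS counter (or a low-speed camera) cannot.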
 
Maybe I'm missing something, but wouldn't you need a video camera and video playback with several times the frame rate of what you're recording in order to actually capture the variations in displayed frame rate? I suppose it helps that the camera doesn't take instantaneous samples, so the micro-stuttering might show up as variations in blurriness or something.

http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2178762

http://www.pcgameshardware.de/?article_id=631668

http://www.forum-3dcenter.org/vbulletin/showthread.php?t=371844

There is a video linked in the second one, but not everyone sees it. :p
 
The question now becomes: how much longer can they keep pushing monolithic dies?

Monolithic dies aren't the only alternative to CF and SLI. They could use a multi-core shared-cache design, or a very high-speed interconnect between two discrete dies on a single package. Multiple GPUs talking over an external bus, as CF/SLI do, isn't the only solution.
 