PCI Express SLI by Alienware!

Aha, found some news articles about Alienware's old custom SLI-like method, called PGC:

http://www.sharkygames.com/hardware/articles/metabyte_pgp_update/b.shtml said:
From what Sharky Extreme's sources have learned, system integrator Alienware will be the first and only company who produces a PGP configuration with two video cards in tandem, based on the Metabyte driver technology.
<<...snip...snip...>>
Sources close to the two parties have told Sharky Extreme that Alienware paid Metabyte somewhere in the area of one to two million dollars for the exclusive technology licensing of the PGP driverset. What the PG driverset package includes isn't exactly known, nor are the different ways that the Metabyte drivers can be utilized at this time. (Multi-Monitor Setups come to mind as a possibility as well as the native PGP function)

http://www.shugashack.com/archives/m8_m2.htm said:
Looks like the Alienware guys are gonna be showing off a TNT2 Ultra and a V3 2000 paired up in PGC mode at E3 next week according to GameSpot.

So how much are gamers expected to pay for this lightning-fast package? While official pricing hasn't been made available yet, Alex Aguila, Alienware's national sales director, told GameSpot News that a US$300 price point, plus or minus a few dollars, is an accurate estimation. A final amount will be decided upon at a later date, however.

GameSpot News has also learned that Alienware will soon offer a downloadable driver, which will let its PGC Voodoo3 2000 boards be used in conjunction with any nVidia TNT2-powered AGP video card. Again, no price has been decided on for this download, but Alienware hinted at a possible US$20 figure.

http://66.96.254.245/news/archive/may99.htm said:
SharkyExtreme takes a look at Alienware's PGC SLI technology. Here's a clip from the article:

Alienware Dual Voodoo3-2000 PGC (2 x 16MB PCI, 143MHz)
Intel P3-500
Quake2 Timedemo1-
1600x1200x16bpp: 63.5 fps

That's an amazing figure, especially when you consider that the fastest video card that we've tested this year (the Matrox G400 MAX) turned in a score of 38.3 fps in the same test with the same CPU.
 
If it's just dual PCI-E and some crummy driver doing the work, what's to stop someone like Nvidia/ATi from doing a driver and asking the AIB vendors they deal with to support dual PCI-E?

Or do most motherboard vendors view the Alienware solution as too high-end?

I think this could only be a win-win situation for ATi/Nvidia. The high-end segment is small, I know, but a good chunk of that segment might spend on another GFX card if it means a lot of performance.

WOW.. Imagine 2x X800XT PE's 8)
 
BRiT said:
:rolleyes:

Not to be a doubter, but way back when, there was talk of AlienWare doing the same thing using GeForce-256s and the like. Guess what happened? NOTHING. NADA. ZILCH. VAPORWARE. They showed it off and demoed it to sites like FiringSquad / Sharky's Extreme. The alleged way it worked was by splitting the screen in half, with one card rendering the top half and the other card rendering the bottom half.

I'll believe it when I see it.
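For what it's worth, the split-frame scheme described above (one card rendering the top half of the screen, the other the bottom) is easy to picture. Here's a minimal sketch of how a driver might carve a frame into horizontal bands, one per card; all names and the dict layout are purely illustrative, not anything from Metabyte's or Alienware's actual driver:

```python
def split_frame(width: int, height: int, num_gpus: int = 2):
    """Divide a frame into horizontal bands, one per GPU."""
    band = height // num_gpus
    regions = []
    for i in range(num_gpus):
        top = i * band
        # the last GPU picks up any leftover rows if height
        # doesn't divide evenly
        bottom = height if i == num_gpus - 1 else top + band
        regions.append({"gpu": i, "x": 0, "y": top,
                        "width": width, "height": bottom - top})
    return regions

# A 1600x1200 frame on two cards: GPU 0 gets rows 0-599,
# GPU 1 gets rows 600-1199.
print(split_frame(1600, 1200))
```

The hard part isn't the geometry, of course; it's load balancing (the bottom half of a typical game scene is far more expensive to draw than the sky) and recombining the two halves every frame.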

So with an X800XT and 6800U, one will have TAA and one won't, and one will have graphics corruptions/errors and one won't, and one will run at a different FPS than the other?

This is bound to work out well. :LOL:
 
To be honest this whole concept of using different graphics cards in conjunction to render a single application (game etc) seems flawed and impossible.
 
no_way said:
Dual only? Let's just go the extra mile and make a quadro config possible from the get-go.

LOL i'd like that too of course ;)

But let's be practical. Let's just get 2x out there. Does anyone else think this could slow down product cycles?

I can see the phrase now: need more speed? Just get another card!

Anyway, yes, I do think mixing cards from different vendors in one system is flawed. For simplicity, Alienware should stick to one vendor's cards per system.
 
no_way said:
demonic said:
If it's just dual PCI-E and some crummy driver doing the work.
Dual only? Let's just go the extra mile and make a quadro config possible from the get-go.

Quad PCI-E variant Volari Duos and double twin-core Opterons! Only needs liquid nitrogen for cooling, and 700W of power...
 
Tahir said:
To be honest this whole concept of using different graphics cards in conjunction to render a single application (game etc) seems flawed and impossible.

Yes, but wouldn't it be great for IQ comparisons? :D

ATi on the top, nVidia on the bottom. Wonder if you would get the choice of AA modes, or it would just do half and half.
 
Tahir said:
To be honest this whole concept of using different graphics cards in conjunction to render a single application (game etc) seems flawed and impossible.

Hehe, so what happens when you pair a DX8 and a DX9 board? Does the upper half of the screen render with PS 2.0 while the lower only uses 1.1? ;)
 
AlphaWolf said:
Tahir said:
To be honest this whole concept of using different graphics cards in conjunction to render a single application (game etc) seems flawed and impossible.

Yes, but wouldn't it be great for IQ comparisons? :D

ATi on the top, nVidia on the bottom. Wonder if you would get the choice of AA modes, or it would just do half and half.

Surely it'd be better to have nVidia on the top and ATI on the bottom - even nVidia couldn't mess up rendering the sky? :LOL:
 
Ante P said:
Hehe, so what happens when you pair a DX8 and a DX9 board? Does the upper half of the screen render with PS 2.0 while the lower only uses 1.1?

You mean like X800 vs GF 6800 in Far Cry? :LOL:

 
Ante P said:
Tahir said:
To be honest this whole concept of using different graphics cards in conjunction to render a single application (game etc) seems flawed and impossible.

Hehe, so what happens when you pair a DX8 and a DX9 board? Does the upper half of the screen render with PS 2.0 while the lower only uses 1.1? ;)

Well, if the DX9 card was a 6800 and the game was FarCry, then it probably wouldn't make any difference! :LOL:
 
digitalwanderer said:
I'm just wondering what in the hell you'd need two next gen "the video card is no longer the bottleneck" powered graphic cards for right now. :|

This could also be huge in the semi-pro / pro 3D market, considering most apps are, or will be, offering a way to render certain elements of your scene using OpenGL / DX9+ shaders 8) Some sequences could probably be rendered completely on a gfx card alone (thinking of kids' TV series, which normally aren't the most complicated (or entertaining) 3D scenes in the world...)
 
This is very sweet indeed! :D

For those who are worried about dual molexes etc., I wouldn't worry too much. Don't forget PCI-E provides more power to the GPU as well (~75W from the slot), so I would imagine these cards only needing one molex. Though the mobo is going to require quite a bit more power to support it...
 
Dual+ Video Cards

If this is actually monitor spanning and not top/bottom splitting, I will get one.
Especially if they expand it out to 3 cards; I've always wanted to run a triple monitor rig at decent speed.

Penty
 
Re: Dual+ Video Cards

It seems like there's all sorts of issues that could arise when you're going beyond the driver's reach with a software layer. I'll believe it when I see it in action.

BTW, hi everybody. I've been a long-time lurker here and I think it's just about the best graphics forum / site there is.
 