PCI Express SLI by Alienware!

Workstation not desktop chipset...

DaveBaumann said:
It said they are using Intel chipsets - at the moment Grantsdale and Alderwood don't support it (AFAIK) and I don't think they would be giving early samples of newer chipsets - I'd be surprised to see Intel going this route at all (at least, just yet).

The Inquirer says that it's based on the Tumwater workstation chipset for Xeons rather than the desktop chipsets you name. I don't know if Tumwater's PCI-E specifications are known yet, but it may not need a switch.

http://www.theinquirer.net/?article=15982
 
They just showed it on TechTV's The Screen Savers from E3. They talked briefly with the Alienware guy about it and showed how it worked. Most of the stuff was covered already in this thread (how it's split, the balancing, the hub, etc.). The guy also said that the PC was running an 800-watt PSU, which it needed. The cards were PCIe 5900s (not 5950s), and he also showed it switching between the two halves while running Q3. It all seemed pretty seamless to me.
 
Can someone please clarify something from a recent FAQ Alienware has put up, to allay my fears... http://www.alienware.com/alx_pages/main_content.aspx

Does the PCI-Express standard allow anyone to do this?
PCI-Express will not allow this out of the box. Alienware is using an exclusive software solution as well as a video merger hub. Both solutions are patent pending and were developed in-house by Alienware. In addition, Alienware has developed a dual PCI-Express graphics slot motherboard (X2). This motherboard is exclusive to Alienware and is also patent pending. To Alienware’s knowledge, no other motherboard currently supports dual PCI-Express graphics slots. The X2 also utilizes all available PCI-Express lanes provided by the Tumwater chipset, supports DDR2-400 memory and includes 6 ports for high-performance SATA RAID 0, 1, and 0+1 configurations. The X2 includes SATA on a 66MHz PCI-X bus, Gigabit Ethernet, and 5.1 Dolby Digital audio. It will also be dual-processor capable.

Someone tell me that Alienware haven't patented dual PCI-E, and that someone could in theory do it without licensing issues? o_O
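
For what it's worth, here's a toy illustration of what that "video merger hub" conceptually does, i.e. stitching the two half-frames back into one output image. This is purely my own sketch, nothing from Alienware's patent-pending hardware; in the real product the merging presumably happens in dedicated hardware on the video outputs, not in software.

```python
# Toy illustration (mine, not Alienware's): what a "video merger hub"
# conceptually does -- take the half-frames produced by two cards and
# stitch them into a single output image.

def merge_halves(top_half, bottom_half):
    """Concatenate two lists of scanlines into one frame."""
    assert len(top_half[0]) == len(bottom_half[0]), "scanline widths must match"
    return top_half + bottom_half

# Toy 4x4 "frame": GPU 0 renders the top two scanlines, GPU 1 the bottom two.
gpu0 = [[0, 0, 0, 0], [1, 1, 1, 1]]
gpu1 = [[2, 2, 2, 2], [3, 3, 3, 3]]
for scanline in merge_halves(gpu0, gpu1):
    print(scanline)
```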
 
demonic said:
Someone tell me that Alienware haven't patented dual PCI-E, and that someone could in theory do it without licensing issues? o_O

From what I've understood of it, they haven't patented dual PEG solutions as such, but rather a dual PEG solution which makes use of special software to split the screen in half horizontally.

So if you were to create software which did the same, only vertically, it should be OK...
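
Something like this, conceptually; a minimal sketch (mine, not anything from a patent filing) of how a driver might carve the screen into two regions, horizontally or vertically, and nudge the split line around for load balancing:

```python
# Minimal sketch (mine, not from any patent filing): carve the framebuffer
# into two regions for two GPUs, with a movable split line for load balancing.

def split_screen(width, height, balance=0.5, horizontal=True):
    """Return (x, y, w, h) rectangles for GPU 0 and GPU 1.

    balance    -- fraction of the screen given to GPU 0 (0.0-1.0), nudged
                  each frame toward whichever card finished later.
    horizontal -- True for a top/bottom split (what Alienware reportedly
                  does), False for a left/right split.
    """
    if horizontal:
        cut = int(height * balance)
        return (0, 0, width, cut), (0, cut, width, height - cut)
    else:
        cut = int(width * balance)
        return (0, 0, cut, height), (cut, 0, width - cut, height)

# Example: 1600x1200, GPU 0 currently handling 60% of the frame.
top, bottom = split_screen(1600, 1200, balance=0.6)
print(top, bottom)   # (0, 0, 1600, 720) (0, 720, 1600, 480)
```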

Besides, if they had patented all dual PEG solutions, it shouldn't be too hard to find prior art and get it declared null & void :)
 
Re: Workstation not desktop chipset...

Rob M said:
The Inquirer says that it's based on the Tumwater workstation chipset for Xeons rather than the desktop chipsets you name. I don't know if Tumwater's PCI-E specifications are known yet, but it may not need a switch.

Although there's scant information on the Tumwater chipset at the moment, at this point I think the chip linked to earlier would possibly be a better solution. Unless Tumwater has enough PCIe lanes in the northbridge to house two PEG16X slots, which I doubt since the northbridge usually caters specifically for graphics rather than all PCIe devices, it may be the case that the other 16 lanes are coming from the southbridge. The connection from the southbridge to the northbridge may be lower bandwidth than using a device that takes a single PEG16X interface directly from the northbridge and splits the bandwidth over another two sets of 16X lanes.
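
Some back-of-envelope numbers on why that matters. The figures below are my assumptions, not Tumwater specs: PCIe 1.x signalling at 250 MB/s per lane per direction, and a hub-link style northbridge/southbridge interconnect of roughly 1 GB/s.

```python
# Back-of-envelope numbers only.  Assumptions (mine, not Tumwater specs):
# PCIe 1.x signalling at 250 MB/s per lane per direction, and a hub-link
# style northbridge/southbridge interconnect of roughly 1 GB/s.

PCIE1_LANE_MBPS = 250                        # MB/s per lane, each direction

def pcie_link_bw(lanes):
    return lanes * PCIE1_LANE_MBPS / 1000.0  # GB/s per direction

peg16_direct   = pcie_link_bw(16)            # x16 straight off the northbridge
switched_share = pcie_link_bw(16) / 2        # worst case per card if a switch
                                             # fans one x16 out to two slots
sb_uplink      = 1.0                         # assumed southbridge uplink ceiling

print(f"x16 slot off the northbridge:            {peg16_direct:.1f} GB/s")
print(f"shared x16 behind a switch (worst case): {switched_share:.1f} GB/s per card")
print(f"x16 slot behind the southbridge uplink:  {sb_uplink:.1f} GB/s ceiling")
```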
 
Re: Workstation not desktop chipset...

DaveBaumann said:
Rob M said:
The Inquirer says that it's based on the Tumwater workstation chipset for Xeons rather than the desktop chipsets you name. I don't know if Tumwater's PCI-E specifications are known yet, but it may not need a switch.

Although there's scant information on the Tumwater chipset at the moment, at this point I think the chip linked to earlier would possibly be a better solution. Unless Tumwater has enough PCIe lanes in the northbridge to house two PEG16X slots, which I doubt since the northbridge usually caters specifically for graphics rather than all PCIe devices, it may be the case that the other 16 lanes are coming from the southbridge. The connection from the southbridge to the northbridge may be lower bandwidth than using a device that takes a single PEG16X interface directly from the northbridge and splits the bandwidth over another two sets of 16X lanes.

E7525/Tumwater only supports a single 16X graphics connection.

Processor choice is weak, given that it's a DP Nocona Xeon chipset (90nm, 1MB L2, Prescott pretty much).

They might do the other 16X connection via a CSA-like expansion bridge, just for PCI-Express, but I doubt it.

Cost for the bridge is $100 in volume and the 3.4 Xeon at launch will be $700 (near $900 for the 3.6). It'll be a pricy setup, that's for sure.

Rys
 
>>It'll be a pricy setup, that's for sure<<

Custom motherboard, 800-watt PSU, dual CPUs, dual video cards...

Pricy Set-up????? :LOL: :LOL: :LOL:

That's Alienware laughing. This system will be in the stratosphere as far as pricing is concerned.
 
joe emo said:
Pricy Set-up????? :LOL: :LOL: :LOL:

Agreed, and what a complete waste!

Just give me a dual PCI-E board, the PCI gfx card that assembles the pictures, and the software, and let me buy 2x PCI-E gfx cards from either ATi or Nvidia.

Much much simpler solution.

Hell, Alienware could sell it as a bundle. I'm sure if it is competitively priced, people will go for it.

Shame we can't start a campaign for people to boycott it, so Alienware does something different ;)
 
I never did understand the kind of people who bought the GF2U or its modern-day brethren, let alone such ridiculous systems... even if you DO have the money, what's the point? Hell, it's not even about something as lame as 3DMark bragging rights; a semi-decent HCOC system would leave such "factory racers" in the dust anyway.
 
I don't get the point.

Such a setup only increases fillrate and bandwidth, right? As games start using shaders more heavily, don't they become LESS dependent on fillrate and bandwidth?
 
HolySmoke said:
I don't get the point.

Such a setup only increases fillrate and bandwidth, right? As games start using shaders more heavily, don't they become LESS dependent on fillrate and bandwidth?
Dual graphics cards increase shader processing speed as well.
 
Are you sure?

I thought the same rule applied as with SLI'd Voodoo2s: that each card would need to do the same shader calculations independently of the other, i.e. both cards doing the same calculations at the same time?
 
Voodoo 2 didn't support shaders, so I think you're confusing terminology. I'm positive that pixel shading won't be duplicated. Vertex shading probably depends on the implementation. Both cards might need to operate on the same textures, though, so there is potential duplication in memory bandwidth. I'm curious how they handle the vertex processing, but we probably won't know until we see the patent.
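
A rough way to think about it, under my own assumptions rather than anything from the patent: pixel shading scales with the share of the screen each card draws, while vertex shading (and texture uploads) may have to be duplicated on both cards.

```python
# Rough model, with assumptions that are mine rather than from any patent:
# pixel shading scales with the share of the screen a card draws, while
# vertex shading (and texture uploads) may have to be duplicated on both.

def per_gpu_load(pixel_work, vertex_work, screen_share, duplicate_vertices=True):
    """Fraction of a single-card frame's work that one GPU ends up doing."""
    pixels = pixel_work * screen_share
    vertices = vertex_work if duplicate_vertices else vertex_work * screen_share
    return pixels + vertices

# A frame that's 80% pixel-limited and 20% vertex-limited, split 50/50:
print(per_gpu_load(0.8, 0.2, 0.5))                           # 0.6 -> ~1.67x speedup
print(per_gpu_load(0.8, 0.2, 0.5, duplicate_vertices=False)) # 0.5 -> ~2.0x speedup
```

So the more pixel-shader-limited the frame, the closer a split-frame setup gets to a 2x speedup, even if the geometry work ends up being done twice.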
 
They will probably use the 12-pipeline GeForce 6800.

A 12-pipeline GeForce 6800 seems like a good solution because you only need one Molex connector for each one.
 