Nvidia's Hybrid SLI

Nvidia presented Hybrid SLI at its analyst day, held on 06/20/07.
Presentation slides and webcast
It is basically a concept that provides two benefits to the consumer. In one mode, a motherboard GPU* is combined with an add-in card: both GPUs render together, as in normal SLI, for increased performance. In the other mode, the add-in card can be shut off completely, e.g. while browsing the web, for lower power usage and heat output.
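The two modes described above can be sketched as a toy policy function. All names, fields, and the trigger condition here are hypothetical illustrations of the concept, not Nvidia's actual API or mode names:

```python
# Toy sketch of the two Hybrid SLI operating modes described above.
# Field names and the trigger condition are hypothetical -- this is
# an illustration of the concept, not Nvidia's driver interface.

def select_mode(demanding_3d_workload: bool) -> dict:
    """Pick an operating mode for a motherboard GPU + add-in GPU platform."""
    if demanding_3d_workload:
        # Performance mode: motherboard GPU and add-in GPU render
        # together, as in normal SLI.
        return {"mgpu": "on", "dgpu": "on", "mode": "performance"}
    # Power-save mode: the add-in card is shut off completely;
    # the motherboard GPU alone drives the display.
    return {"mgpu": "on", "dgpu": "off", "mode": "power-save"}

print(select_mode(True))   # gaming -> both GPUs active
print(select_mode(False))  # web browsing -> discrete card shut off
```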

X-bit labs has written a small news piece with a few selected quotes from the presentation.
http://www.xbitlabs.com/news/video/display/20070625083756.html

Thoughts, ideas? Please discuss.

*Nvidia seems keen on using the term "motherboard GPU" instead of IGP for their coming DX10 products. From what I understood, they want to differentiate their solution because it will perform better than existing IGP solutions.
 
*Nvidia seems keen on using the term "motherboard GPU" instead of IGP for their coming DX10 products. From what I understood, they want to differentiate their solution because it will perform better than existing IGP solutions.
That's not new. With their previous IGP solutions for AMD, instead of going for the nForce name they branded them as "GeForce 6150 + MCP xxx".
 
It's a very interesting way to fight competing platformization with platformization of their own: if they manage to hold a dominant position in the GPU business, any discrete-GPU system *not* bundling NVIDIA's chipsets with their GPUs will have an inherent disadvantage. Basically, this makes any potential bundling by Intel and AMD much less attractive, unless the system doesn't boast a discrete GPU, or they have highly competitive GPUs of their own to bundle with their chipsets.

There is another very interesting thing that this implies for NVIDIA's short-term chipset roadmap. MCP72 is their next high-end chipset, but unsurprisingly it's IGP-less. However, there's a catch... The highest-end variant will now have an IGP. Consider this:
http://xtreview.com/addcomment-id-2510-view-MCP72-and-MCP65-new-nvidia-chipset.html

These specs are correct as far as I can tell, with one major exception: MCP72XE. In the past, NVIDIA paired their high-end single-chip southbridges with a northbridge (without an integrated IGP for Intel, or with a redundant IGP for AMD) but now they've got the possibility to pair it with a single-chip IGP, possibly in both markets eventually. For AMD platforms, this means pairing up MCP72 and MCP78. This allows a number of other goodies, including more USB, SATA, Ethernet, etc. ports than otherwise. It's also potentially cheaper than 2xMCP72 for Quad FX...

The only problem really is for Intel platforms. If you want a high-performance memory controller, either you'd need a three-chip solution (MCP72+MCP78+C55/Next-Gen Memory Controller) or you need your high-performance memory controller integrated into MCP79 (wouldn't that be a little bit overkill for the low-end market however?), or you need another chip to act as a single-chip southbridge+memory controller for the Intel market. None of these solutions seem quite ideal to me...

So that's not a bad roadmap in my eyes. There are some nice things on AMD's and Intel's roadmaps that don't seem to be on NVIDIA's, though, unless I missed them (more native NAND support for AMD, 10Gbit Ethernet for Intel, etc. - although the latter could be on a server-only companion chip, I guess). So let's not exaggerate things either...

Overall, I really do like the idea of making GPUs part of nearly every single chipset going forward; that's an excellent way to fight the market becoming a commodity via Moore's Law. It's also an excellent way for both NVIDIA and AMD to gain an intrinsic advantage over other chipset providers with inferior or no discrete GPUs. Yet another reason for Intel to want to enter the discrete market, I guess...
 
I notice they're saying that an upcoming IGP will be as fast as a GeForce 8400 at 1/5 the power draw(!). Regardless of Hybrid SLI, that seems like a decent (almost too good) IGP bump from the current generation.

Other than that, I find the power reduction a good thing (100-ish watts on the high end adds up after a while). That is disregarding the slightly intellectually dishonest graph (they wouldn't be listing the power requirement of the discrete cards at full tilt against the IGP at idle, would they?).
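To get a feel for how fast "100-ish watts adds up", here is a rough back-of-envelope calculation. The idle-draw figures, daily hours, and electricity price are all assumptions picked for illustration, not measured data:

```python
# Back-of-envelope for the idle power savings discussed above.
# All input figures are assumptions, not measurements.

idle_draw_dgpu_w = 100      # assumed idle draw of a high-end discrete card (W)
idle_draw_igp_w = 10        # assumed idle draw of the IGP path (W)
idle_hours_per_day = 8      # assumed non-gaming desktop use per day
price_per_kwh = 0.15        # assumed electricity price (USD/kWh)

saved_w = idle_draw_dgpu_w - idle_draw_igp_w
saved_kwh_per_year = saved_w * idle_hours_per_day * 365 / 1000
saved_usd_per_year = saved_kwh_per_year * price_per_kwh

print(f"{saved_kwh_per_year:.0f} kWh/year saved, ~${saved_usd_per_year:.0f}/year")
# -> 263 kWh/year saved, ~$39/year
```

Even with conservative assumptions, shutting the discrete card off when it isn't needed pays for itself in electricity alone.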
 
I notice they're saying that an upcoming IGP will be as fast as a GeForce 8400 at 1/5 the power draw(!). Regardless of Hybrid SLI, that seems like a decent (almost too good) IGP bump from the current generation.
Memory and PCIe can probably account for a lot - between an IGP and a discrete solution there's one less memory bus and 16 fewer PCIe lanes to deal with, for starters.
 
Yeah, that should definitely explain it in terms of power (I wonder how big a part just the DDR2 chips are) - and in terms of performance, I guess that just implies that MCP78's config would be very near G86's. Considering that MCP78 is AFAIK manufactured on 65nm (heck, it could even be 55nm, but I doubt that), I don't think that's unbelievable at all. I wonder whether it's more similar to G86 or G98, heh, not that it really matters or that anyone will ever care. It will be marketed as a GeForce 8, but MCP61 was marketed as a GeForce 6 too, after all...
 
Finally! I remember I started a thread about the power usage problem with current graphics cards, and I posted the idea of Hybrid SLI. It is nice to see things I want become a reality :).

And if all of their future chipsets will have an mGPU, that probably explains why they don't put the newest VP in their high-end graphics cards: all mGPUs will have a VP inside them.

According to the slide, all output will be routed through the mGPU. So today I could have a low-power system with the mGPU connected to my monitor, and if I suddenly want to play games, all I have to do is plug a higher-end dGPU into my system.

Is it really that simple? I am wondering how it works.
Do I need special game support?
Will there be games that don't work at all with Hybrid SLI (i.e. where I would only be using my mGPU's power)?
Will future graphics cards drop the NVIO and output connectors like HDMI or DVI (just to save cost)?
And in the case where only the mGPU is working, could I have my dGPU doing GPGPU work, like Tesla?

Most of the time people choose an Intel chipset for an Intel system. I think if Hybrid SLI works as advertised, it could finally break that deadlock.

Now I can only wish Apple would use this technology ><
 
Yup, it certainly shows how long this kind of idea has been around (and at both NVIDIA and AMD, apparently: see Jawed's linked patent too).

If you think about it, the same concept could be applied to processors: single-chip southbridge+...+CPU, and add a discrete CPU to that if the user needs it. Or you could have CPUs on the discrete GPU, and make the socket's CPU optional. There are a number of problems with these solutions, however, including where the memory controllers are, how many you want, and if/how they communicate.

So I'm not convinced at all it makes sense, but you'd expect it to be the kind of thing NVIDIA is thinking about right now, and that AMD and Intel want to position themselves to prevent it if possible. At the minimum, it does seem that NV realizes they cannot compete in that market on AMD and Intel's terms...
 
Is the IGP going to have dedicated memory? If it has to use system memory, couldn't that potentially hurt performance when running in SLI mode with the add-in card? The IGP would be fighting with the CPU, etc. over system bandwidth, which might hurt more than the IGP helps.
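To put numbers on the contention concern above: an IGP of that era shares the system's DDR2 bus with the CPU. A quick sketch of the total pool they would be fighting over, assuming a dual-channel DDR2-800 configuration (the memory type is an assumption; adjust for the actual platform):

```python
# Shared system memory bandwidth available to CPU + IGP together,
# assuming dual-channel DDR2-800 (64-bit channels). The memory
# configuration is an assumption for illustration.

transfer_rate_mt_s = 800   # DDR2-800: 800 megatransfers/second
channel_width_bytes = 8    # 64-bit channel = 8 bytes per transfer
channels = 2               # dual-channel

per_channel_gb_s = transfer_rate_mt_s * channel_width_bytes / 1000  # decimal GB/s
total_gb_s = per_channel_gb_s * channels

print(f"shared pool: {total_gb_s:.1f} GB/s for CPU and IGP combined")
# -> shared pool: 12.8 GB/s for CPU and IGP combined
```

For comparison, even a low-end discrete card has its own dedicated memory bus, so every byte the IGP reads from system RAM is a byte the CPU can't.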
 
Is this Nvidia MCP73 one of the aforementioned technologies? Apparently, DigiTimes reported that "Nvidia MCP 73 chipset launch remains unclear".

Digitimes.com said:
Nvidia MCP 73 chipset launch remains unclear
Monica Chen, Taipei; Joseph Tsai, DIGITIMES [Tuesday 26 June 2007]


Nvidia's first Intel platform IGP (integrated graphics processor) chipset GeForce 7050 nForce 630i (MCP 73) was originally scheduled to launch in May this year, but with the chipset still in the pre-production stages, a shipping date still remains unclear, according to sources at motherboard makers.

Nvidia plans to send MCP 73 engineering samples to motherboard makers for testing in the next few days, according to the sources, who pointed out that if Nvidia can release a final version before the end of June, then the MCP 73 chipset can be expected to launch in August. However, if the chipset still remains in testing, not only would the chipset miss the peak sales season, motherboard makers could also lose interest.

Nvidia's MCP 73 adopts an 80nm process, and has a built-in GeForce 7050 graphics core, which supports DirectX 9.0c, Shader Model 3.0 and HDMI/HDCP. However its single-channel memory support is a major disadvantage, added the sources.
 
This Hybrid SLI concept is very interesting for the huge power savings you get by turning off the high-end GPU card; it's greatly needed. I wonder, though, whether you'll need a GPU from the same (broad) generation, such as a G8x IGP with a G8x/G9x GPU. I don't even think about mixing GPU vendors. Vista's "only one graphics driver can run" limitation is one of the reasons to worry about that.
That GeForce 7050 IGP (a rebranded 6100) might not be able to work with a G8x, but a G8x IGP is coming soon enough, and/or I might be completely wrong (maybe Vista is smart enough to switch drivers when switching GPUs).


And I also hope we can get low-end full-ATX IGP mobos; they are already rare. I don't know if it'll get worse, as full ATX moves to the high end, or better, as an IGP becomes so much more useful to everyone (most often people who pay attention to the bill). I can imagine my PC running from such a windmill. Enter gaming mode, with a higher CPU clock and the GPU enabled, when you're ready to suck all the power :)
 
Let's not forget the possibility of the IGP being used to handle physics (in a given mode) whereas the discrete GPU(s) handle the eye candy... :D
 
This is one thing that niggles in the back of my mind every time I think about multi-GPU: how much traction has effects physics on the GPU actually had since that big push over a year ago now?

How many shipping titles do we know of that use GPU-based effects physics (I'm thinking of Havok FX)?
 
You know, I really want the head of Alfredo Garcia. Or whoever came up with the idea of doing GPU Physics nearly exclusively through Havok FX. And I also want the head of whoever came up with the idea of telling people that it was more efficient to do it on a separate board than on the same one, unless limited by the amount of video memory.

The whole GPU Physics thing has been so horribly mismanaged that I truly am at a loss for words regarding the whole thing. The fact that it was made nearly exclusive to a presumably expensive add-on for an expensive middleware solution is downright ridiculous. I know Havok did some of the R&D, but FFS, come on... This is laughable at best.

Although my current signature ("you can't compete with free.") is not related to GPU Physics, I feel that it also applies just about perfectly to its core problem.
 
We know the current external graphics solutions from ASUS and MSI all need the monitor connected to the external graphics card.

Would Hybrid SLI, in theory, allow us to simply plug in an external graphics card and power up our graphics system without plugging and unplugging the monitor cable?
 
My laptop (Sony SZ5) has an early version of this already. You can use the onboard Intel graphics, or flick a switch, reboot, and you have Go 7400 graphics. Not as slick, but the basis of the same idea.

It seems to me that if you could switch this on and off without rebooting, then you would really be onto something good. Not to mention the performance benefits.
 
It seems to me that if you could switch this on and off without rebooting, then you would really be onto something good. Not to mention the performance benefits.

That's basically the same as PowerXpress, which has been talked about for AMD's Puma platform. The issue is that with Vista you can only have one graphics driver loaded, so switching between one and the other on the fly is only possible if the same driver can operate all the devices you are switching between.
 
On the other hand, Vista can load display drivers on the fly without rebooting (at least new drivers for an existing card).
 