Hybrid SLI dying?

nicolasb

Regular
FUDzilla: http://www.fudzilla.com/index.php?option=com_content&task=view&id=11684&Itemid=1

Nevertheless, several sources have confirmed that the company's latest discrete cards, the GeForce GTX 285 and GeForce GTX 295, do not support any form of Hybrid SLI. Add the fact that Nvidia hasn't made plans for any new desktop chipsets, and this suggests that the technology is reaching an inevitable end in the retail market.

This seems a shame, if true. From a power-usage perspective I liked the idea of being able to completely switch off the video card while in 2D mode.
 
Hybrid Power might be more trouble than it's worth with GT200 since it already has incredible idle power characteristics.
 
I think Hybrid Power was dying almost from the start. Did they ever even get an Intel board that supported it?

I wanted hybrid PhysX too, but that didn't work out so well either.
 
Lenovo's T400 or T500 (can't remember which) uses a hybrid power setup -- Intel IGP and an 8300 or something like that. We have several of them at the office...
 
And Dell's latest 13" laptop, the Studio XPS 1340, uses both Hybrid Power and Hybrid SLI. It's got an Nvidia 9400M integrated chipset, as well as a discrete 9200M GS. You can turn off the 9200M GS to save power, or turn it on and run it in SLI for more performance.
 
Notebook switchable graphics is a little different -- not necessarily in concept, but in many cases in implementation. Lenovo ThinkPads are using switchable graphics with ATI Radeons as well.
 
I'm actually looking into this. I've done all my testing on the Intel system and didn't really run any tests with this on my AMD Phenom system, so if this is the case I wouldn't have been immediately aware of it. And it wasn't mentioned. I'll see if I can get some answers.
 
So if you have a 2nd card for PhysX, is it running all the time?

I'll take that as a question -- apologies in advance for the long-winded answer. :p

For PhysX, there are basically a few ways it can be set up. If you have PhysX enabled:

2 cards, SLI disabled:

First card renders.

Second card is automatically assigned to PhysX.

2 cards, SLI enabled:

First card renders and handles PhysX tasks.

Second card renders. This assignment is automatic.

2 identical cards in SLI + a PhysX card (non-identical):

GPUs 1 and 3 are assigned to rendering (these two must be identical).

GPU 2 is assigned to PhysX via the Control Panel (manual).


3-way SLI and Quad SLI:

This area is a little more complicated.

With Quad SLI you can stick a dedicated PhysX GPU in the middle, like a 9800 GTX, and set it manually. Nvidia is working on a way to set 1 GPU for PhysX and 3 for rendering, or 2 for PhysX and 2 for rendering, but it's not a feature yet. Currently Quad SLI could use some more flexibility.

With 3-way, you can use a single SLI bridge (1st and 3rd cards) for standard SLI, and I believe the 2nd GPU will then be assigned to PhysX. However, I'm not entirely sure about this and it's something I'm still looking into. 3-way could also use a little more flexibility.
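
For the curious, the automatic assignment boils down to the driver enumerating CUDA-capable GPUs and steering physics work to one of them, since GPU PhysX runs on CUDA. Below is a minimal sketch of that idea using the CUDA runtime API; the "device 0 is the render GPU" heuristic is my assumption for illustration, not how the driver actually decides:

Code:
// Sketch: enumerate CUDA-capable GPUs and pick a secondary one for
// physics-style compute work. The heuristic of treating device 0 as
// the display/render GPU is an assumption for illustration only.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices found.\n");
        return 1;
    }

    int physxDevice = 0;                // fall back to the only GPU
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("GPU %d: %s\n", i, prop.name);
        if (i > 0) physxDevice = i;     // prefer a non-primary GPU
    }

    // Subsequent compute launches from this thread target that GPU.
    cudaSetDevice(physxDevice);
    std::printf("Physics work directed to GPU %d\n", physxDevice);
    return 0;
}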
 
No, I mean if, for example, I have a GTX 280 for rendering and a 9600 GT for PhysX, and I'm not playing a PhysX-enabled game, then the 9600 GT will be switched off -- fan not spinning, no power going to it -- yes?
 
No. The card will remain powered. Any 2D power-saving tech will stay kicked in, however.
 
Oh well, it's a start ;)

Makes me wonder how far a PCI-E interconnect can "go" in terms of deep-sleep states. Or better yet, can you force individual devices into a low-power state (ACPI's per-device D-states, as opposed to the system-wide S3/S4) without the whole system being asleep? I think Vista's power management allows some level of this (per-device sleep states), but I wonder how granular it can be...

I also wonder if you can shut down individual PCI-E slots on a given motherboard, or whether it's an 'all or nothing' approach instead. I guess that would be up to the chipset manufacturer and/or the board implementing it.
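
For what it's worth, per-device power control does exist at the OS level. As a rough illustration of the granularity, here's a sketch that enables runtime power management for a single PCI-E device on Linux via sysfs (the bus address is a made-up example, and whether the slot actually loses power is up to the board and chipset):

Code:
// Sketch: enable runtime power management for one PCI-E device on Linux.
// Writing "auto" to power/control lets the kernel drop just that device
// into a low-power D-state when idle; "on" pins it at full power.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    const std::string device = "0000:01:00.0";  // hypothetical PCI-E address
    const std::string path =
        "/sys/bus/pci/devices/" + device + "/power/control";

    std::ofstream control(path);
    if (!control) {
        std::cerr << "Cannot open " << path << " (need root?)\n";
        return 1;
    }
    control << "auto";  // allow the kernel to suspend this one device
    std::cout << "Runtime PM enabled for " << device << "\n";
    return 0;
}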
 
I have a setup like the one described above (GTX 280 for rendering, 8800 GTS for PhysX), and I have to say it's kind of disappointing in terms of power usage and heat. Both GPUs sit at essentially 60-65 degrees at idle (closed case, stock fans at stock settings), and the 8800 GTS behaves the same as if it were the primary GPU. The only difference is that it doesn't get as hot under PhysX load as it would under rendering load (a few degrees lower).

Considering how little PhysX actually gets used, it seems really inefficient to leave the card in there as a discrete PPU. I hope Nvidia finds some way to remedy this.
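
If anyone wants to put hard numbers on that instead of eyeballing temps, something like the sketch below works. It uses NVML, Nvidia's monitoring library (which is newer than this thread, so take it purely as illustration); power readings aren't supported on every board, hence the error check:

Code:
// Sketch: report temperature and power draw for every Nvidia GPU in the
// system via NVML. Power readings are not available on all boards.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);
    for (unsigned int i = 0; i < count; ++i) {
        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS) continue;

        char name[96];
        nvmlDeviceGetName(dev, name, sizeof(name));

        unsigned int tempC = 0, milliwatts = 0;
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);
        if (nvmlDeviceGetPowerUsage(dev, &milliwatts) == NVML_SUCCESS)
            std::printf("GPU %u (%s): %u C, %.1f W\n",
                        i, name, tempC, milliwatts / 1000.0);
        else
            std::printf("GPU %u (%s): %u C, power n/a\n", i, name, tempC);
    }
    nvmlShutdown();
    return 0;
}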
 
That's why my 8800 GTS is in a cupboard rather than in my PC doing PhysX. I may get a second-hand 8400 GS to do PhysX.
 
After talking to Nvidia and doing some of my own tests with the 780a, it would seem the above is true, in the sense that Nvidia may be working to employ Hybrid Power specifically for notebooks, but not for the desktop. This could be due to the extra requirements of implementing Hybrid Power, and the compatibility issues that arose when certain GPU AIBs didn't implement it correctly on the board (it does/did require special hardware).

It seems Nvidia's current direction is to make GPU idle more attractive over time, improving further on GT200's idling.

In retrospect, I was really excited about Hybrid Power when it came out. But now that the newer GPUs have such strong idle power characteristics, it's become less of an issue. However, I'm a little let down by all this because I had high hopes for an Nvidia/Intel implementation of Hybrid Power. Sadly, it seems only a few GPUs support Hybrid Power, and half of them have GT200 power-saving features anyway.
 
Isn't onboard video power consumption generally still quite a bit lower than pretty much any dedicated video card at idle? Even low-end discrete video cards, I believe, generally idle higher than most IGPs.

It's a shame if they're dropping this. It was one of the things I really admired Nvidia for attempting, even though they limited it to working only with relatively high-end cards.

I've constantly been hoping something would come out from Nvidia, ATI or Intel that would allow powering down/off any discrete video card when it's not required.

Regards,
SB
 
Yes. A GTX 280, for instance, draws about 25 watts at idle, while onboard is even less than that. It's mostly useful to SLI users, but it was a neat technology. I liked it especially back when we didn't have the power-saving tech the GT200 series has. It made a considerable difference on my 9800 GX2s in 2D mode. Though in comparison, the GTX 295's idle power is much lower than the 9800 GX2's was in 2D.
 
That's why my 8800 GTS is in a cupboard rather than in my PC doing PhysX. I may get a second-hand 8400 GS to do PhysX.

Don't do it; an 8400 GS will do more harm than good. Pretty much anything short of a 9600 GT will end up decreasing performance rather than improving it when paired with a more powerful GPU.

ChrisRay said:
It seems Nvidia's current direction is to make GPU idle more attractive over time, improving further on GT200's idling.

Is it just me, or did they break the downclocking in recent drivers for the 280? My 280 now runs at full clocks all the time, and I made no modifications to it. It's a little worrisome to see it idling at 65 degrees C.
 
The only change to clock throttling occurs when 3D games are running: the dynamic clock adjustment based on load was shut off there. Idle downclocking itself hasn't changed.
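
An easy way to check whether the idle downclock is actually engaging is to read the current clocks directly. Here's a minimal sketch, again with NVML (same caveat that the library is newer than this thread): if the graphics clock is still at its full 3D value while the desktop is idle, throttling isn't kicking in.

Code:
// Sketch: print the current graphics and memory clocks for GPU 0 so you
// can see whether the card has dropped to its 2D/idle clocks.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        unsigned int core = 0, mem = 0;
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &core);
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &mem);
        std::printf("Core: %u MHz, Memory: %u MHz\n", core, mem);
    }
    nvmlShutdown();
    return 0;
}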
 