Nvidia GT300 core: Speculation

That post was full of crazy incoherent language.

Still, what we had was called Hybrid SLI, for power savings; it was only available on AMD chips (with an Nv board) and it died an early death. Maybe in the future it will rear its head again, preferably with OS support, so that it could deal with AMD<-->Intel<-->Nvidia combinations without it mattering.

It's HybridPower, and it's not so much a chipset thing as that board vendors had to cough up for the SLI license to use that feature. NV's official stance on the matter is that GT200 is so power-efficient you couldn't distinguish it from their 9300. HybridPower isn't available on anything but G8x/G9x. It's weird, though, how they keep supporting it in the mobile sector, but there's no GT2xx there anyway.
There's a nice thread on how to enable it on notebooks that might not officially support it: http://forum.notebookreview.com/showthread.php?t=342947

You mind if I use this quote in my signature? :cool:

No, go ahead (no SLI fees here ;) )
 
It's HybridPower, and it's not so much a chipset thing as that board vendors had to cough up for the SLI license to use that feature. [...]

I want it for my desktop though, anyway. And I want it to not be SLI-related, but to work with an IGP or a discrete card of any make, at my whim. I hope that does happen in the future. Heck, with OpenCL, if Havok and PhysX both get ported, that could turn out to be quite a boon to both AMD and Nvidia (until Intel gets a decent IGP).
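To make the vendor-agnostic point concrete, here's a minimal sketch (assuming the real pyopencl bindings are installed) of how OpenCL host code sees whatever compute devices happen to be present, regardless of make:

```python
# Minimal sketch: OpenCL host code enumerates every platform and device
# present (AMD, Nvidia, or an Intel IGP) without caring about the vendor.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        # name and vendor are standard OpenCL device queries
        print(f"{platform.name}: {device.name} ({device.vendor})")
```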
 
Does your PhysX reduce idle power? Does angle-independent AF have a 16x power-savings mode? Oh wait, nV marketing does not condone the use of the Power of 3 in a discussion about power savings.

Wow, you have an anti-nVidia complex.

I'm just saying that I want the idle power to be as low as possible, without compromising graphics performance (i.e. performance when the card isn't idle, but doing heavy processing).
So basically I want my next card to be faster and better than my current one, but use less power when idle.

If I just wanted power consumption to be as low as possible and didn't care about performance, I'd get an IGP, not a videocard.
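For illustration, here's a toy sketch of the kind of idle governor I mean; read_gpu_load() and set_power_state() are hypothetical stand-ins for whatever hooks a driver would actually expose:

```python
import time

IDLE_THRESHOLD = 0.05    # below 5% GPU utilisation counts as idle
IDLE_GRACE_SECS = 10     # stay at full clocks this long before dropping

def idle_governor(read_gpu_load, set_power_state):
    """Toy governor: full clocks under load, minimum clocks once truly idle."""
    idle_since = None
    while True:
        load = read_gpu_load()               # hypothetical driver hook, 0.0..1.0
        if load > IDLE_THRESHOLD:
            idle_since = None
            set_power_state("performance")   # full clocks/voltage for 3D work
        elif idle_since is None:
            idle_since = time.time()
        elif time.time() - idle_since > IDLE_GRACE_SECS:
            set_power_state("idle")          # 2D clocks, lowest voltage
        time.sleep(1)
```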
 
Seriously, Aero is a no-go on my GMA500. I mean, it works, but veeeery slooooowly. So, sweet spot would be somewhere in between yours (X3100) and mine (GMA500).

Dumb question: is the GMA500 running Aero in hardware or in software?
 
It's HybridPower, and it's not so much a chipset thing as that board vendors had to cough up for the SLI license to use that feature. [...]

Last I heard was that Hybrid power was abandoned because it couldn't be made to work 100% stable without OS support.
So we'll have to see if it will be supported in Windows 7.

But I still find the idea rather half-hearted. Why have two GPUs if you could just have one GPU that can scale down properly?
 
Last I heard was that Hybrid power was abandoned because it couldn't be made to work 100% stable without OS support. [...] Why have two GPUs if you could just have one GPU that can scale down properly?

Tweakers.net March 2009

Now, how come it works nicely on a notebook and not on a desktop? It still runs the same OS. ATI's half-hearted attempt (ATI XpressPower) is only supported on the M780G chipset and only switches graphics depending on whether it's on AC or DC power.
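As a sketch, the policy described above boils down to something this trivial (the function is a hypothetical illustration, not any real driver API):

```python
def pick_gpu(on_ac_power: bool) -> str:
    # XpressPower as described: the power source alone decides the GPU;
    # actual GPU load plays no part, which is why it feels half-hearted.
    return "discrete" if on_ac_power else "igp"

assert pick_gpu(on_ac_power=True) == "discrete"
assert pick_gpu(on_ac_power=False) == "igp"
```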

I don't think we will ever see that, especially not with the advent of GDDR5 and the issues that seem to stem from it. I'd still go for disabling the add-in card altogether since you're not only toning down the GPU, but also the memory and the fan etc.
 
Now, how come it works nicely on a notebook and not on a desktop?

Notebooks don't have such diverse hardware as desktops do.
Notebooks that support it are probably built according to some nVidia reference design.

I'd still go for disabling the add-in card altogether since you're not only toning down the GPU, but also the memory and the fan etc.

Perhaps you missed my post where I said that they should turn off most of the memory on a videocard, leaving only about 128 MB or so for 2D/idle mode. I find that a more elegant solution than having two GPUs in one system.
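As a back-of-the-envelope sketch of that idea (the layout and all names here are assumptions for illustration; no driver exposes anything like this today):

```python
# Assumed layout: 1 GiB of card memory in 8 x 128 MiB partitions.
TOTAL_VRAM_MIB = 1024
PARTITION_MIB = 128

def idle_memory_plan(keep_mib=128):
    """Evict everything into the first partition(s), power-gate the rest."""
    partitions = TOTAL_VRAM_MIB // PARTITION_MIB
    keep = max(1, keep_mib // PARTITION_MIB)   # partitions kept alive for 2D
    return {"powered": keep, "gated": partitions - keep}

print(idle_memory_plan())   # -> {'powered': 1, 'gated': 7}
```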
 
Notebooks don't have such diverse hardware as desktops do.
Notebooks that support it are probably built according to some nVidia reference design.

Support for it is built into the GPU and enabled by the SLI license. It also shouldn't matter on the cards, because it also works on MXM/AXIOM modules. On nVidia's page about Hybrid SLI, they take no responsibility and forward you to the computer manufacturers, as it is up to them to enable the feature.
 
GT300, 512-bit, super-duper memory controller. Delay after delay. NV30 and R600 can finally have a threesome.

Why do you consider it delayed? GT2xx isn't that old and DX11 is a good few months off yet. Plus the fastest thing on the market right now is GT2xx based. I don't see any motivation for NV to release GT3xx in the near future.

Besides, if it performs on par with those specs, i.e. twice as fast as a GTX285, then I don't think anyone will be lumping it with NV30 or R600 any time soon. Not unless ATI comes up with a miracle in the form of R800 being at least 2x faster than a 4890, which the current rumored specs don't suggest.
 
I don't see any motivation for NV to release GT3xx in the near future.

Well, ATi offering DX11 parts would be a good motivation :)
Unless of course their performance isn't better than current GT2xx-based parts. But I don't expect that (or is that wishful thinking?).
 
Why do you consider it delayed? GT2xx isn't that old and DX11 is a good few months off yet. [...]

Both AMD and nVidia gunned for a mid-2009 release. I'm talking about discussions back in early 2006, where there were famous quotes like:
retsam said:
i think g80 will have some sort of integrated physics engine.

Talk of insane numbers like 3 TFLOPS on GT300 is slowly becoming a reality. Then the 55nm parts started slipping (the 65nm GT200, from my POV, is still a backup solution), with the 55nm GT200 having a six-month delay. Now the 40nm G92 derivatives are delayed by over half a year as well. In my opinion GT200 was a prototype for the plan they had with GT300: a tripled GT200 chip on 40nm. I think that didn't work out well and some major redesign has taken place over the past year; hence my view of "a delay".
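For reference, the "tripled GT200" arithmetic checks out against the rumored figure: a GTX 285 does 240 SPs at a 1476 MHz shader clock with 3 FLOPs per clock (dual-issued MAD + MUL), and tripling that lands right around 3 TFLOPS:

```python
# GTX 285 peak single precision: 240 SPs x 1.476 GHz x 3 FLOPs per clock;
# tripled, it lands near the rumored GT300 number.
sps, shader_ghz, flops_per_clock = 240, 1.476, 3
gtx285_tflops = sps * shader_ghz * flops_per_clock / 1000.0
print(f"GTX 285: {gtx285_tflops:.2f} TFLOPS")      # ~1.06
print(f"Tripled: {3 * gtx285_tflops:.2f} TFLOPS")  # ~3.19
```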
 
Until GT300 is later than November, I don't think it can be described as late.

RV740 is anything up to around six months late. It's not unreasonable to expect 40nm GT2xx to have been affected, since it seems TSMC is having problems.

Jawed
 
How would I know? Intel's driver policy regarding the GMA500 is non-existent. SW Aero on an Atom would be even more abysmal, I imagine.

SGX does 2D via 3D, AFAIK. Think harder ;)
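For the curious, "2D via 3D" just means a 2D blit gets drawn as two textured triangles pushed through the 3D pipeline. A minimal sketch of the geometry setup only; the actual API calls that would consume these vertices are omitted:

```python
def quad_for_blit(x, y, w, h):
    # Six (position, uv) vertices: two triangles covering the destination
    # rectangle, with texture coordinates running 0..1 across the source.
    return [
        ((x,     y    ), (0.0, 0.0)),
        ((x + w, y    ), (1.0, 0.0)),
        ((x,     y + h), (0.0, 1.0)),
        ((x + w, y    ), (1.0, 0.0)),
        ((x + w, y + h), (1.0, 1.0)),
        ((x,     y + h), (0.0, 1.0)),
    ]

print(quad_for_blit(10, 10, 640, 480)[0])  # first vertex of the blit quad
```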

---------------------------------------------------------------------------------
As for some folks wanting to add funky signatures: sure, go ahead; they're always funny to read, as most of them are usually 180 degrees on the wrong foot.
 