GFFX Reborn? - HardOCP

I am hijacking the thread slightly - apologies. Sound cancellation requires high-quality microphones and signal-in equipment - a series of fast analog-to-digital and digital-to-analog converters (ADCs and DACs). You have to separate the signal into frequency ranges, then you have to have a phase inverter - simplicity itself - and then you must amplify the result to the correct level. Then you must monitor all of this with fuzzy logic to keep it stable and optimally tuned, or else you end up far worse off than when you started.
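
To make the chain concrete, here's a minimal sketch in Python (numpy only): split off the target frequency bands, invert the phase, and scale to the amplifier level. The function name, band ranges, and block-at-a-time processing are all invented for illustration - a real system does this in dedicated DSP hardware.

```python
import numpy as np

def anti_noise_block(mic_samples, rate, bands=((20, 200), (200, 2000)), gain=1.0):
    """Toy anti-noise generator: band-split, phase-invert, amplify."""
    spectrum = np.fft.rfft(mic_samples)
    freqs = np.fft.rfftfreq(len(mic_samples), d=1.0 / rate)

    # Separate the signal into frequency ranges: keep only the target
    # bands, zero out everything else.
    mask = np.zeros(freqs.shape, dtype=bool)
    for lo, hi in bands:
        mask |= (freqs >= lo) & (freqs <= hi)
    spectrum[~mask] = 0.0

    # Phase inversion is just negation; then scale to the output level.
    return -np.fft.irfft(spectrum, n=len(mic_samples)) * gain
```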

The major challenge is the lag between hearing a sound and projecting its inverse. You have to really analyse the sound, decompose it into a series of component sine waves, and predict where they will be in 1/1,000 of a second's time (or however fast your converters and total system work), and then project that sound - otherwise you make the system worse.
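
A toy illustration of that prediction step, assuming the noise really is a sum of steady sine components: take the strongest FFT bins and advance each one's phase by the system latency. The function name and the 8-component cutoff are made up.

```python
import numpy as np

def predict_ahead(samples, rate, latency_s=0.001, n_components=8):
    """Estimate the waveform latency_s into the future by advancing the
    phase of its strongest sine components (only valid for steady,
    periodic noise - exactly the limitation described above)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

    # Pick the strongest components, skipping the DC bin.
    strongest = np.argsort(np.abs(spectrum[1:]))[-n_components:] + 1

    # Evaluate each component on a time axis shifted by the latency.
    t = np.arange(len(samples)) / rate + latency_s
    predicted = np.zeros(len(samples))
    for k in strongest:
        amplitude = 2.0 * np.abs(spectrum[k]) / len(samples)
        phase = np.angle(spectrum[k])
        predicted += amplitude * np.cos(2 * np.pi * freqs[k] * t + phase)
    return predicted
```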

It's the prediction system that is hard - hence why it only works on regular, recurring sounds that have a distinctive pattern and a regular start and end (so you can determine when to pump out the inverse sound). Secondly, it's tricky to measure the new sound field once you put out your inverse wave and determine whether you are making things better or worse - in the worst case you end up constantly doubling the noise you are making yourself, a horrible runaway feedback loop. You need a good fuzzy logic system here to prevent this.
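
A crude stand-in for that stabilising logic (not real fuzzy logic, just a bang-bang rule with invented step sizes): compare the residual noise against doing nothing, back off quickly when the anti-noise is making things worse, and advance slowly when it helps.

```python
def adapt_gain(gain, residual_rms, baseline_rms, step=0.05):
    """Nudge the anti-noise gain based on the measured residual.

    residual_rms: loudness measured with cancellation running
    baseline_rms: loudness measured with cancellation off
    """
    if residual_rms > baseline_rms:
        # We're adding noise, not cancelling it - retreat fast to stop
        # the runaway feedback loop described above.
        return max(0.0, gain - 2 * step)
    # We're helping - creep toward full cancellation.
    return min(1.0, gain + step)
```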

PS I wish all good high-end sound cards could do this, and that it was implemented on every single high-end motherboard. It is simply hard - not impossible - to do well.
 
Typedef Enum said:
More importantly, it seemingly doesn't address the G.E. 701-C effect (that would be a turbine engine) that you get when you actually use the thing for its main purpose/mission in life...cranking out pixels. The lame statements made by both nVidia and others, such as "Hey, just put on headphones!" or "I play with the sound on" blah blah blah don't really bode well at all, IMHO.

Yeah, this really comes across kind of flat, IMO. This is all well and good if all you play are games where the sound is constantly blaring, like a racing game or maybe a shooter like UT2003. But what about when you're playing a game like System Shock 2 or NOLF, where it's not always so loud (and where you actually benefit from being able to hear subtle sounds)? It really seems to me that the loud fan could ruin the atmosphere of games like that. Still, no noise in 2D is good.

Doomtrooper said:
Sound is very localized so directional positioning is also very important.

Depends on the frequency.

And also just on your hearing in general. ;) But, anyway, I feel the idea that low frequencies are not directional is sheer BS. It's not as noticeable, but even in the middle of the room you can still tell which corner the sub is sitting in. It's not so obvious that it ruins the experience of movies with LFE, etc, but you can recognize where the sound is coming from. Although, I'm sure room acoustics has a lot to do with it as well.
 
Nagorak said:
And also just on your hearing in general. ;) But, anyway, I feel the idea that low frequencies are not directional is sheer BS. It's not as noticeable, but even in the middle of the room you can still tell which corner the sub is sitting in. It's not so obvious that it ruins the experience of movies with LFE, etc, but you can recognize where the sound is coming from. Although, I'm sure room acoustics has a lot to do with it as well.

Frequencies below around 50 Hz are pretty non-directional; however, the signal may contain harmonics, or the sub may be distorting, both of which will localise the sound source.
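
As a rough illustration of why that matters, here's a sketch (hypothetical function, numpy only) measuring how much of a sub's output energy sits above a cutoff - the harmonics and distortion products up there are what give the ear something to localise:

```python
import numpy as np

def energy_above_cutoff(samples, rate, cutoff_hz=50.0):
    """Fraction of the signal's energy above cutoff_hz."""
    power = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    total = power.sum()
    return power[freqs > cutoff_hz].sum() / total if total > 0 else 0.0
```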
 
SanGreal said:
I don't know if anyone caught Kyle's appearance on TechTV tonight, but he mentions that the FlowFX spins up for 3dsmax. While the card isn't targeted at workstations, I imagine this might pose an issue for some people...

Just imagine what this card will do when it meets Longhorn and its 3D GUI...
 
Ollo said:
Just imagine what this card will do when it meets Longhorn and its 3D GUI...

Well, I wonder if there'll be a Mac version. OS X Jaguar will already have that issue.

It's quite fortunate that video/DVD processing isn't achieved via the shaders as well!
 
Well, it seems like it's a software issue as to whether or not the FX Flow needs to be turned on for 3D rendering. Thus, it seems apparent that for "low demand" 3D processing it should be relatively easy to just run the core at a low speed instead of turning on the FX Flow. This doesn't seem like it would be too hard for 3D GUIs or video decoding.
 
Well, it seems like it's a software issue as to whether or not the FX Flow needs to be turned on for 3D rendering.

The control may be 'software', but whether it's on at all may be a hardware issue. So far we know it turns off for 2D operations – the VGA portion of the chip is still entirely separate from the 3D pipeline and relatively tiny (especially when you are dealing with chips this size). Given how little of the hardware is being used, I'd guess that passive cooling is all that's necessary.

However, if the 3D portion of the chip is used at all then the heat build-up will be quite significant, so I'd guess the fan will spin up. There may be a possibility of saying "this is a low-demand 3D application, hence run a low-speed core and a low-speed fan" - but then of course you are into application-specific coding.
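
For what it's worth, the policy Dave is describing would only take a few lines of driver logic. A hedged sketch - the load threshold, clock figures, and every name here are guesses, not anything nVidia has documented:

```python
# Hypothetical driver-side policy: pick a core clock and fan state from
# what the chip is being asked to do. All thresholds are invented.
CLOCK_2D_MHZ = 300
CLOCK_FULL_3D_MHZ = 500

def pick_power_state(using_3d_pipeline, gpu_load):
    if not using_3d_pipeline:
        return CLOCK_2D_MHZ, "fan_off"        # 2D only: passive cooling
    if gpu_load < 0.25:
        return CLOCK_2D_MHZ, "fan_low"        # low-demand 3D (GUI, video)
    return CLOCK_FULL_3D_MHZ, "fan_high"      # games: full clock and fan
```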
 
Volenti said:
Frequencies below around 50 Hz are pretty non-directional; however, the signal may contain harmonics, or the sub may be distorting, both of which will localise the sound source.

Few subs are limited to 50 Hz and below, though; the LFE crossover is usually set to 80 Hz, or even higher depending on your setup. That's probably the reason for any directionality, I suppose.
 
DaveBaumann said:
However, if the 3D portion of the chip is used at all then the heat build-up will be quite significant, so I'd guess the fan will spin up. There may be a possibility of saying "this is a low-demand 3D application, hence run a low-speed core and a low-speed fan" - but then of course you are into application-specific coding.
I'm not sure application-specific coding is what's required. If nVidia produces codec interfaces for hardware video acceleration, then those interfaces could tell the hardware to clock down (if necessary) and run the instructions through the 3D core anyway.

It would only be unfeasible to have automatic clock throttling if the application (the Windows GUI, for example) just uses a standard 3D programming interface, which gives the driver no hint about how demanding the workload is.

As a side note, however, it would be nice for these cards to have an option to run in a quiet mode when running any 3D app.
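
The codec-interface idea might look something like this sketch (every name here, the driver methods included, is hypothetical - nothing like this is confirmed to exist in nVidia's drivers):

```python
class HwAcceleratedCodec:
    """Video decode path that knows it's light work for the 3D core."""

    def __init__(self, driver):
        self.driver = driver

    def decode_frame(self, bitstream):
        # Decoding is far less demanding than a game, so ask the driver
        # for the low clock (and the quiet fan) before using the 3D core.
        self.driver.request_clock("low")
        return self.driver.run_on_3d_core(bitstream)
```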
 
DaveBaumann said:
It's quite fortunate that video/DVD processing isn't achieved via the shaders as well!

That doesn't necessarily mean the fan won't kick into high gear when playing video. I'm sure video decoding still puts a significant load on the chip, even if it's not as high as 3D processing. As far as I've seen, no reviewer has even tried playing a DVD on GeForce FX yet.
 
DaveBaumann said:
The control may be 'software', but whether it's on at all may be a hardware issue. So far we know it turns off for 2D operations – the VGA portion of the chip is still entirely separate from the 3D pipeline and relatively tiny (especially when you are dealing with chips this size). Given how little of the hardware is being used, I'd guess that passive cooling is all that's necessary.

However, if the 3D portion of the chip is used at all then the heat build-up will be quite significant, so I'd guess the fan will spin up. There may be a possibility of saying "this is a low-demand 3D application, hence run a low-speed core and a low-speed fan" - but then of course you are into application-specific coding.

Exactly...! What surprised me about this product was not the fan noise--I expected that. It was the fact that the chip didn't run at 500MHz all the time and simply spin up the fan for 3D operations. Seeing it actually bump the MHz down to 300 to run 2D was a shock and an indicator of a whole lot of negative stuff which I won't go into here.

I would hope that at 300MHz the chip would not be overvolted and overclocked to the degree that the massive heatsink plating surrounding it would be insufficient to dissipate the heat generated at that speed doing nothing but 2D operations. Thus I was shocked again to see that nVidia was initially running the fan on low even at 300MHz, even for things as comparatively trivial in workload as 2D operations. Now that they've "fixed" the fan to remain off at 300MHz, though, a couple of questions arise.

One of the features of the hairdryer fan that makes it essential to the reference design is its ability to remove heat from the user's case. Obviously, even at 300MHz and running nothing but 2D operations, nVidia felt the fan should remain on and keep moving heat from the RAM and GPU out of the case, and it's only because of the almost universal rejection of the hairdryer that nVidia has decided to turn it off while the chip is doing 2D at 300MHz. It doesn't appear as if anything whatsoever is being done with the heat now, though, except that this arrangement allows a much greater percentage of it (the fan never removed all of it) to remain inside the case instead of being evacuated. So end users will have to make sure their case cooling is sufficient to cope with the extra load, which probably means beefing up their case fans. Or, I suppose, they could do what Kyle over at [H] does, and run it with the side cover off.

It looks like the speeds for this fan are OFF, LOW, and HIGH, with nothing in between that would allow the fan speed to be adjusted when running 2D at 300MHz so that some heat might still be evacuated. It would have been nice to see them incorporate a rheostat for variable-speed control of the fan, which makes me wonder why they didn't. I'm guessing that at 500MHz nothing short of the fan's maximum rpm is sufficient to evacuate enough heat, so they probably felt there was little sense in adding a feature that would only have increased the cost. However, if they plan to sell this thing, I think they should not fix the 300MHz 2D fan behavior, but at least allow the user to choose whether he wants the fan off or on low speed.
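
To make the contrast concrete, here's a toy comparison of the shipped three-state control against the rheostat-style control being asked for (the temperature setpoints are invented):

```python
def fan_three_state(temp_c):
    """The OFF/LOW/HIGH behaviour as shipped (thresholds made up)."""
    if temp_c < 50:
        return 0.0   # OFF
    if temp_c < 75:
        return 0.5   # LOW
    return 1.0       # HIGH

def fan_variable(temp_c, t_min=40.0, t_max=80.0):
    """Rheostat-style control: duty scales linearly with temperature."""
    duty = (temp_c - t_min) / (t_max - t_min)
    return min(1.0, max(0.0, duty))
```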
 
One (of many) of the drawbacks of the FlowFX cooling solution is its inherent weakness at passive cooling. It's an all-or-nothing design.

nVidia had better set the fan to run for several seconds after a 3D application shuts down, or there will be overheating issues.
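
Something as simple as this run-on rule would cover it (the 10-second figure and fan interface are invented, and a real driver would use a timer rather than blocking):

```python
import time

RUN_ON_SECONDS = 10  # guess: long enough to dump the heat left in the sink

def on_3d_app_exit(set_fan_speed):
    """Keep the fan running briefly after 3D work stops, so residual
    heat doesn't build up once airflow disappears."""
    time.sleep(RUN_ON_SECONDS)
    set_fan_speed("off")
```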
 
Seeing it actually bump the MHz down to 300 to run 2D was a shock and an indicator of a whole lot of negative stuff which I won't go into here.

Maybe it's just me working in an embedded world where power saving is king, but I'm surprised that all chips DON'T downclock in 2D. I certainly don't see a chip doing this as "an indicator of a whole lot of negative stuff".

Obviously, even at 300MHz and running nothing but 2D operations, nVidia felt the fan should remain on and keep moving heat from the RAM and GPU out of the case, and it's only because of the almost universal rejection of the hairdryer that nVidia has decided to turn it off while the chip is doing 2D at 300MHz.
Really? You're sure about this? The fact that leaving the thing on all the time requires zero work, whereas controlling it requires extra work, has nothing to do with it?

Personally, I'd like every company out there to have fans that only operate when needed. My power supply, my CPU, my GPU, and now even my northbridge have fans. I'd prefer they all sit quiet unless required.
 
RussSchultz said:
Personally, I'd like every company out there to have fans that only operate when needed. My power supply, my CPU, my GPU, and now even my northbridge have fans. I'd prefer they all sit quiet unless required.
As a point of interest, the Asus A7N8X has a CPU fan speed control. A few other motherboards probably do as well.
 
RussSchultz said:
Maybe it's just me working in an embedded world where power saving is king, but I'm surprised that all chips DON'T downclock in 2D. I certainly don't see a chip doing this as "an indicator of a whole lot of negative stuff".

Man, you've really got the rose-colored glasses on here, don't you? *chuckle* I would have thought that since the card is obviously *not* designed with power saving in mind (dissipating ~75W running a 3D game), arguments like these would be rather silly. Dial me up when nVidia decides to ship 500MHz versions of this chip in laptops. *chuckle* Yeah, maybe it is just you--because there sure as heck is nothing "embedded" here...;)

Why should they *downclock* chips if they are consuming minimal amounts of power and dissipating minimal amounts of heat at their standard clock speed? Ever heard of the theory of moving to smaller production processes and the voltage and heat-dissipation advantages which supposedly result? (Which makes this all the more ironic.) To put it another way, how would people react if their server or workstation CPUs "downclocked" all the time? I do not think they would share your opinion about "efficiency." I know I wouldn't.

Now, if you are prepared to advance evidence that nv30 is an embedded chip designed primarily for the laptop market--hey, I'm all ears....;)

Really? You're sure about this? The fact that leaving the thing on all the time requires zero work, whereas controlling it requires extra work, has nothing to do with it?

Where do I begin in listing all of the things wrong with this remark?

First, how on earth do you figure that controlling fan speed from low to high is any more or less work than controlling it from off to high? They weren't leaving it "on" all the time in the same state--they were changing speeds all the time! Good grief...;)

And of course the fact that less heat is being evacuated from the case with the fan off is not debatable--nor is the fact that nVidia initially shipped it with the fan "on" all the time but constantly changing speeds. You'd have to crank the rose-colored tint up really high not to understand why they decided to switch it off for 2D as opposed to their original decision to leave it on at low speed. Noise--not heat.

Personally, I'd like every company out there to have fans that only operate when needed. My power supply, my CPU, my GPU, and now even my northbridge have fans. I'd prefer they all sit quiet unless required.

Personally, I'd prefer that my 3D chip be designed at .13 microns so that it wouldn't need a Dustbuster and so that we wouldn't be having this discussion. But obviously neither of us is going to get what he wants here--at least I'm sure not getting it from nVidia.
 
I'm sorry I even got into another one of these tit-for-tat arguments.

You're the person saying that downclocking when the higher clock speed isn't required is an indication of serious problems. I'll say from professional experience that it is no indication at all, irrespective of your rose-colored accusations.

Now, if you are prepared to advance evidence that nv30 is an embedded chip designed primarily for the laptop market--hey, I'm all ears....

It's obviously not designed for low-power (i.e. embedded) systems, but that doesn't mean that scaling back the frequency when it's not required is a bad thing.

To put it another way, how would people react if their server or workstation CPUs "downclocked" all the time?
I hate to break it to you, but all modern CPUs enter a low-power state when idle. (OK, "all" might be pushing it, as you might find one to prove me wrong.)
 
Maybe it's just me working in an embedded world where power saving is king, but I'm surprised that all chips DON'T downclock in 2D. I certainly don't see a chip doing this as "an indicator of a whole lot of negative stuff".

...that doesn't mean that scaling back the frequency when it's not required is a bad thing.

The fact that it downclocks when going into "2D Mode" is not a negative in and of itself...nor do I think WaltC is arguing that.

The fact that it dissipates 75 watts of heat and requires dustbuster cooling in "normal operation" is a bad thing. In other words, because the solution is so noisy and hot, it basically requires clock throttling to be a viable product. That's *why* it's a negative. It's not that the throttling is negative; it's the fact that the chip requires it. (Or do you think the GeForce FX 5800 Ultra would sell to ANYONE if it didn't clock down in 2D operations?)
 
Then say "hey, it's loud and needs a big fan and that's bad", not "it reduces its speed when not required and that's an indication of bad things".
 
Or, don't interpret a statement that says "X is an indication of bad things" as "X is a bad thing", and then argue against that wrong interpretation.
 