ATI RV740 review/preview

That's not how I see it. Furmark is just a normal 3D app, using only the normal 3D API, not some "driver backdoor", to create artificial load. And in fact there's no guarantee that Furmark really produces the maximum possible load.
That said, some interesting numbers here (german):
http://ht4u.net/reviews/2009/leistungsaufnahme_grafikkarten_games/
In short, the difference in power consumption between games and Furmark is WAY higher on AMD cards than NVidia cards. There must be quite some units idling around on AMD cards in (current) games...

It's definitely NOT a "normal 3D app"; drawing a furry donut wouldn't be that stressful if it were just a "normal 3D app".
 
In short, the difference in power consumption between games and Furmark is WAY higher on AMD cards than NVidia cards. There must be quite some units idling around on AMD cards in (current) games...
Isn't the performance of NVidia relative to ATI significantly lower in Furmark than in games?

Jawed
 
Isn't the performance of NVidia relative to ATI significantly lower in Furmark than in games?

Jawed
Yes (depending on settings a HD4850 outperforms a GTX 285). And the higher performance in Furmark seems to directly translate into higher power draw. So quite some parts of the AMD chips must be idling in games (presumably parts of the shader core).
 
Yes (depending on settings a HD4850 outperforms a GTX 285). And the higher performance in Furmark seems to directly translate into higher power draw.
So when HD4850 is outperforming GTX285 what's the power draw of each?

Jawed
 
It's definitely NOT a "normal 3D app"; drawing a furry donut wouldn't be that stressful if it were just a "normal 3D app".
Any insights as to what makes it "non-normal"? FYI, Furmark is older than both the HD 4000 and HD 3000 series, and thus cannot have been maliciously optimized to drive Radeons past their specs.

btw, drawing a furry full-screen quad/cube produces even more load.
 
So when HD4850 is outperforming GTX285 what's the power draw of each?
A good question. The ht4u article doesn't state the Furmark settings (or at least I didn't see them). The numbers where the HD4850 outperforms the GTX 285 were from the neoseeker article above - enabling AA is what makes AMD chips faster relative to nvidia, but I'm not sure whether that causes higher or lower power draw, and that article doesn't measure power consumption (of the cards - and not with Furmark anyway). So I don't know...
 
It's definitely NOT a "normal 3D app"; drawing a furry donut wouldn't be that stressful if it were just a "normal 3D app".

There are always going to be worst-case scenarios. Manufacturers had better make sure their cards survive Furmark without any app detection, or those cards shouldn't be sold.
 
Have you never seen the red line on the rev meter?!
Furmark = running the engine past the red line.

Very true and a very good analogy.

And ATI has done something similar to what some manufacturers have done with their cars.

ATI has put in a limiter so that when running Furmark, it caps how fast the card is allowed to go.

Similar to how some cars now have rev limiters to prevent the less-than-1% of drivers from revving their car beyond its "normal operating conditions".

Although in the case of ATI cards, people are still able to bypass the limiter if they want.
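Purely as an illustration of how crude that kind of limiter can be - this is NOT ATI's actual driver code, every name and clock value below is made up - an executable-name check is about as simple as it sounds, which is also why renaming the binary gets around it:

```python
# Hypothetical sketch of application-detection throttling.
# Nothing here comes from ATI's driver; names and numbers are invented.

import os

THROTTLE_LIST = {"furmark.exe"}   # hypothetical "power virus" blacklist

STOCK_CORE_MHZ = 750              # roughly an HD 4870 stock core clock
CAPPED_CORE_MHZ = 550             # arbitrary illustrative cap


def core_clock_for(process_name: str) -> int:
    """Return the core clock the (hypothetical) driver would allow for this process."""
    exe = os.path.basename(process_name).lower()
    if exe in THROTTLE_LIST:
        return CAPPED_CORE_MHZ    # detected app: clamp the clock
    return STOCK_CORE_MHZ         # everything else runs at stock


if __name__ == "__main__":
    print(core_clock_for("crysis.exe"))    # 750
    print(core_clock_for("furmark.exe"))   # 550
    print(core_clock_for("renamed.exe"))   # 750 - a renamed binary slips through
```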

So quite some parts of the AMD chips must be idling in games (presumably parts of the shader core).

All GPUs will have parts idling no matter what scenario you run them in. And all CPUs will also spend a significant amount of time idling, especially as you start adding more cores.

Also, it's quite possible that the ALUs might hit peak rates when running a game; however, no game made now or in the future will put as sustained a load on those units as Furmark does. At some point something else will have to be done that doesn't require 100% utilization of that portion of the chip.
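To put a toy number on that "sustained load" point - every wattage here is invented purely for illustration, not measured - average board power scales with how much of the time the shader core is actually pinned, which is exactly what separates a bursty game from something like Furmark:

```python
# Toy duty-cycle estimate. All wattages are invented for illustration only.

LIGHT_LOAD_W = 60.0   # hypothetical board power with the shader core lightly used
PINNED_W = 160.0      # hypothetical board power with the shader core pinned


def average_power(duty_cycle: float) -> float:
    """Average board power when the chip is fully loaded 'duty_cycle' of the time."""
    return LIGHT_LOAD_W + duty_cycle * (PINNED_W - LIGHT_LOAD_W)


print(average_power(1.0))   # Furmark-style sustained load: 160 W
print(average_power(0.5))   # bursty game-style load:       110 W
```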

It's all about removing/minimizing bottlenecks for certain aspects/cases of 3D rendering.

Nvidia tends to focus on removing/minimizing the texturing bottleneck. ATI tends to focus on removing/minimizing the ALU bottleneck.

Oversimplified and overgeneralized, yes. But it's just where that focus was.
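Rough back-of-envelope on that ALU-versus-texturing split, using rounded public launch specs for an HD 4870 and a GTX 280 (MAD-only accounting for GT200, so the exact GFLOPS figure is arguable):

```python
# Flops per bilinear texel from rounded launch specs (MAD counted as 2 flops).

def ratios(alus, shader_ghz, tmus, core_ghz):
    gflops = alus * 2 * shader_ghz      # MAD = 2 flops per ALU per clock
    gtexels = tmus * core_ghz           # 1 bilinear texel per TMU per clock
    return gflops, gtexels, gflops / gtexels


for name, spec in {
    "HD 4870 (RV770)": (800, 0.750, 40, 0.750),
    "GTX 280 (GT200)": (240, 1.296, 80, 0.602),
}.items():
    gflops, gtexels, r = ratios(*spec)
    print(f"{name}: {gflops:.0f} GFLOPS, {gtexels:.1f} Gtexels/s, "
          f"{r:.0f} flops per texel")
```

Roughly 40 flops per texel on RV770 versus ~13 on GT200, which is the kind of imbalance an ALU-heavy torture test like Furmark would expose.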

BTW - you do realize that since you aren't constantly revving your car's engine over the redline to the max it can do, your engine is "idling" away much of its potential, right? ;)

Just because something CAN do something by design, doesn't mean that it's healthy for it to be doing it ALL the time.

Heck, even PSUs that are designed to perform over spec (say, drawing 2-3 amps over what a rail is rated for) are not designed to sustain that overload constantly.

Regards,
SB
 
Very true and a very good analogy.

Just one you happen to agree with.

And ATI has done something similar to some manufacturers with their cars.

ATI has put in a limiter such that when running furmark, it will limit how fast the card is allowed to go.

That should be a general purpose mechanism. A limiter on furmark in particular is more like a written admission of a broken chip.
 
It is possible that the GPU scheme is general purpose, just inadequate.

The stuff I saw didn't have excessive temperatures, just excessive power draw.
Perhaps the scheme is thermally based, which assumes that out of spec power draw leads to overly high temps.

If the cooler is effective enough, perhaps the threshold wasn't reached.
This doesn't rule out the possibility of ineffective or incorrectly calibrated thermal readings.

Schemes that have clock management that checks on current draw are a more recent development, and only a few designs come to mind.
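For what a workload-agnostic version of that could look like - purely a sketch, with every sensor call, threshold and clock step made up, not any vendor's actual scheme - the idea is just a feedback loop over measured board power and/or temperature instead of an app check:

```python
# Sketch of a workload-agnostic limiter: throttle on what the sensors say,
# not on which .exe is running. All functions/thresholds are hypothetical.

import random
import time

POWER_LIMIT_W = 110.0                    # hypothetical board power budget
TEMP_LIMIT_C = 95.0                      # hypothetical thermal ceiling
CLOCK_STEPS_MHZ = [750, 700, 650, 600, 550]


def read_board_power_w() -> float:
    """Stand-in for real telemetry: pretend a Furmark-like load, ~100-160 W."""
    return random.uniform(100.0, 160.0)


def read_gpu_temp_c() -> float:
    """Stand-in for a real thermal diode reading."""
    return random.uniform(70.0, 90.0)


def set_core_clock(mhz: int) -> None:
    print(f"core clock -> {mhz} MHz")    # stand-in for real clock control


def limiter_loop(iterations: int = 20) -> None:
    step = 0                             # index into CLOCK_STEPS_MHZ, 0 = stock
    for _ in range(iterations):
        over_budget = (read_board_power_w() > POWER_LIMIT_W
                       or read_gpu_temp_c() > TEMP_LIMIT_C)
        if over_budget and step < len(CLOCK_STEPS_MHZ) - 1:
            step += 1                    # shed one clock step
        elif not over_budget and step > 0:
            step -= 1                    # recover a step when there's headroom
        set_core_clock(CLOCK_STEPS_MHZ[step])
        time.sleep(0.01)                 # re-evaluate every 10 ms


if __name__ == "__main__":
    limiter_loop()
```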
 
Yep, pretty much. The car analogy is a poor one.

Perhaps a Heavy Machine Gun (HMG) would be a better choice for an analogy, then. An HMG is designed to fire many bullets very quickly. However, if you sustain the maximum rate of fire over a long period of time, the gun barrel will overheat and cause irreparable damage to the HMG, even going so far as to cause an explosive backfire which could seriously injure the user.

Even a relatively slow-firing cannon has the same problem.

Or perhaps water heaters before the pressure relief valve was invented.

The real world is rife with examples of things that are designed to operate with a non-sustainable maximum usage.

It allows you to use it for maximum benefit when needed but not at all times.

When a game comes out that causes this behaviour then I'll panic. But as is, it's a non-issue...

Regards,
SB
 
Hmm, wonder what the chimp's fur here does to temperatures:

http://developer.amd.com/documentation/videos/pages/ATIRadeon9800Real-Time.aspx

Do the drivers limit performance?

Jawed

I ran that demo a couple of months ago and the performance was higher than on the HD3800 series, so I think it's not limited. That was on the 8.10 or 8.11 driver, so I will retest it, this time on a CF config with 2 HD4870s.

I will compare total system power consumption between Chimp Demo and Furmark.
Who is betting on the Chimp Demo having much lower GPU utilization than Furmark? I will guess 30-50W less.
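If the meter can log samples, the comparison is just an average per run. A minimal sketch, assuming two plain-text logs with one watt reading per line (the file names and format are my own invention):

```python
# Average wall power per run, then the delta between the two workloads.
# Assumes one watt reading per line; log names/format are hypothetical.

def average_watts(path: str) -> float:
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    return sum(samples) / len(samples)


chimp = average_watts("chimp_demo_watts.log")   # hypothetical log file
furmark = average_watts("furmark_watts.log")    # hypothetical log file
print(f"Chimp Demo: {chimp:.0f} W, Furmark: {furmark:.0f} W, "
      f"delta: {furmark - chimp:.0f} W")
```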
 
I ran that demo a couple of months ago and the performance was higher than on the HD3800 series, so I think it's not limited. That was on the 8.10 or 8.11 driver, so I will retest it, this time on a CF config with 2 HD4870s.

I will compare total system power consumption between Chimp Demo and Furmark.
Who is betting on the Chimp Demo having much lower GPU utilization than Furmark? I will guess 30-50W less.

total or per card :)
 