What were the GPU temps at that time?
I wouldn't know whether any GPU thermal safeguard would think to also check the MOSFET temperatures.
The real world is rife with examples of things that are designed to operate with a non-sustainable maximum usage.
It allows you to use it for maximum benefit when needed but not at all times.
When a game comes out that causes this behaviour, then I'll panic. But as it is, it's a non-issue...
Regards,
SB
MOSFETs aren't checked, nor do they have a thermal threshold.
OTOH, I haven't heard anything about the normal operating temperatures for them.
I 100% agree with those assertions.

FYI, FurMark is older than both the HD 4000 and HD 3000 series, and thus cannot have been maliciously optimized to drive Radeons past their specs.
http://www.ozone3d.net/benchmarks/fur/ said:
What is FurMark?
FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that's why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card.
...
Xtreme Burning
Xtreme Burning is a mode where the workload of the GPU is maximal. In this mode, the donut is fixed and is displayed from the front side, which offers the largest surface. In this mode, the GPU quickly becomes very hot.
...
Version 1.6.5 - February 4, 2009
Version 1.6.1 - January 30, 2009
Version 1.6.0 - January 7, 2009
Version 1.5.0 - November 23, 2008
Version 1.4.0 - June 23, 2008
...
Version 1.0.0 - August 20, 2007
release notes for 1.5.0 said:
New: added a postprocessing pass in Stability Test mode to make the test more intensive.
release notes for 1.4.0 said:
New: added an extreme burning mode for stability test.
Damn, I first read "artificial intensification" as "artificial insemination" - sig-worthy if only it were true.
Does the Chimp demo have vsync on?
Allowing for 86% PSU efficiency, I make that 120W extra power draw per GPU by Furmark, compared with desktop. I expect someone will correct my maths though...
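In code form, here's the back-of-envelope I'm doing. Note the two wall readings and the GPU count below are placeholders I've made up for illustration (picked so the result lands at ~120W); only the 86% efficiency figure is from above:

    # Back-of-envelope for FurMark's extra GPU power draw vs. desktop idle.
    # The two wall readings and the GPU count are HYPOTHETICAL placeholders;
    # only the 86% PSU efficiency comes from the post above.
    PSU_EFFICIENCY = 0.86    # fraction of wall power that reaches the components
    NUM_GPUS = 2             # assumption: dual-GPU (CrossFire-style) setup

    wall_idle_w = 200.0      # placeholder: wall draw at desktop, watts
    wall_furmark_w = 479.0   # placeholder: wall draw under FurMark, watts

    # Scale the wall-side delta by PSU efficiency to get component-side watts,
    # then split it across the GPUs (assuming they account for the whole delta).
    extra_per_gpu_w = (wall_furmark_w - wall_idle_w) * PSU_EFFICIENCY / NUM_GPUS
    print(f"~{extra_per_gpu_w:.0f}W extra per GPU")  # ~120W with these inputs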
Jawed
Now, if you were a pretender to Dave Baumann's old crown, you'd investigate versus a single HD4870 and do a complete sweep across resolutions for a nice fillrate-normalised power graph.
Furmark loading the CPU may not even be stressing the CPU in any meaningful way. I haven't investigated this much, but it seems to me that any kind of GPU-limited workload gets the CPU core driving the GPU running at 100% - I can't help thinking that's a "busy waiting" type of workload. Theoretically not exactly stressing the CPU. Dunno really, though. Also might be an ATI thing...
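To illustrate what I mean by "busy waiting" - this is just a toy sketch, nothing to do with the actual driver code - a thread spin-polling a flag shows a core at 100% even though it does no useful work, whereas a blocking wait would leave the core idle:

    # Toy sketch of "busy waiting" vs. a blocking wait. NOT actual driver
    # code - just why a spin-poll pegs a core at 100% with no useful work.
    import threading
    import time

    gpu_done = threading.Event()

    def busy_wait():
        # Spin-poll: this core reads as ~100% busy until the "GPU" finishes.
        while not gpu_done.is_set():
            pass  # burn cycles re-checking a flag

    def blocking_wait():
        # Blocking alternative: the thread sleeps in the kernel, near-zero CPU.
        gpu_done.wait()

    if __name__ == "__main__":
        t = threading.Thread(target=busy_wait)
        t.start()
        time.sleep(2.0)   # pretend the GPU is busy for two seconds
        gpu_done.set()    # "GPU" signals completion
        t.join()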
Jawed
Gulp. Well, at some point, maybe.
Problem is, I have a job to do which takes 50-70h a week.
Yup, somehow enabling AA on Radeon boards in FM drops the load, while it's kind of the opposite case for the GeForces.

Worth noting that Furmark at 1280x1024 with 4xAA resulted in lower power consumption - 550W peak.