AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 GPU lineup?

  • Within 1 or 2 weeks: 1 vote (0.6%)
  • Within a month: 5 votes (3.2%)
  • Within a couple of months: 28 votes (18.1%)
  • Very late this year: 52 votes (33.5%)
  • Not until next year: 69 votes (44.5%)

  • Total voters: 155
  • Poll closed.
I've been running a 3-monitor Eyefinity setup for nearly 3 weeks now, and the bezels are not an issue. Your brain just adjusts to them. Even my mismatched monitor sizes (25.5" + 27" + 25.5") aren't a problem.

Btw, Eyefinity is gaming bliss. The added peripheral vision adds so much immersion to gaming. I'm surprised there's so little talk about it here with this supposed hardcore 3D audience.
Well I haven't used it personally, but man I can't see ever becoming used to such a separation between the screens.

One major issue I saw from the demos, though, was the aspect ratio: you would have a normal-AR image on the main screen, but the two at the left/right would be significantly warped. Do the games you're playing on it actually adjust the AR, or offer to do so manually? (Something I wish more games did without tying it to resolution; kudos to the Riddick developers, for example.)
 
Well I haven't used it personally, but man I can't see ever becoming used to such a separation between the screens.
This is one of those "see it to believe it" things, but as mentioned, once you start playing with it you get over the bezels/separation. The reason is that a 3x1 landscape configuration doesn't alter what you look at (at least in FPS, third-person and racing titles); you are not constantly looking around the monitors. Your head/eyes stay focused on the central panel, while the additional panels are there to encompass your peripheral vision, and you react to that.

One major issue I saw from the demos, though, was the aspect ratio: you would have a normal-AR image on the main screen, but the two at the left/right would be significantly warped. Do the games you're playing on it actually adjust the AR, or offer to do so manually? (Something I wish more games did without tying it to resolution; kudos to the Riddick developers, for example.)
This is title dependent: some will stretch, but others will actually render natively to that aspect ratio, which means more polygons and objects rendered than on 4:3, 16:10 or 16:9 single panels. Naturally we are working with game developers so they take note of the aspect ratios that Eyefinity can produce. One of the recent updates to the Unigine benchmark was to support Eyefinity, and the early demos of Dirt 2 already render native Eyefinity aspect ratios.
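As a rough illustration of why native wide-aspect rendering means more geometry, here is a minimal sketch. It assumes three identical 1920x1200 panels purely as an example (not anyone's actual setup) and computes the combined resolution and aspect ratio of a 3x1 landscape group versus a single panel.

```python
# Rough illustration (not from the thread): combined resolution and aspect
# ratio of a 3x1 Eyefinity group vs. a single panel. The 1920x1200 panels
# are an assumed example.

def eyefinity_group(width, height, columns=3, rows=1):
    """Combined resolution and aspect ratio of a grid of identical panels."""
    total_w, total_h = width * columns, height * rows
    return total_w, total_h, total_w / total_h

single_ar = 1920 / 1200                        # one 16:10 panel, AR 1.60
w, h, group_ar = eyefinity_group(1920, 1200)   # 3x1 landscape group

print(f"group resolution: {w}x{h}, aspect ratio {group_ar:.2f}:1")
print(f"horizontal scene extent vs. one panel: {group_ar / single_ar:.1f}x")
# A title rendering natively to ~4.80:1 draws roughly three times the
# horizontal scene extent, hence the extra polygons/objects mentioned above.
```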
 
I just like OLED. I have it on the Zune HD and I will never go back. If there were affordable OLED monitors at twice the size, I would buy them in a heartbeat. They look so amazing.


Though it sounds like you can run Eyefinity without same-size monitors? Do they all have to be the same resolution?
 
They can be different sizes and resolutions; they can even be rotated differently, i.e. one portrait alongside two landscape.
 
Ah, cool. I have a 24-inch Sceptre that's 1920x1200 and a 22-inch Westinghouse that's 1680x1050. I wanted to get an OLED 24-inch as my main, but they are only 1920x1080. I'm happy now that I can use all 3 when I get a 5870 next year.
 
In what way could anyone see that kind of power draw in non-real-world applications? I.e., what makes any application non-real-world?

I suppose for people with jobs that require Furmark to get any work done? Er...

OK, maybe for people that like to play a game called Furmark? Er...

OK, so for video card hypochondriacs it may be a good way to validate their worries that their card might blow up when actually doing something with it. :p

Regards,
SB
 
I am guessing it's so they would know the max possible draw, for some peace of mind. You never know when a program might work those graphics cards as hard as Furmark does. And you never know what might happen in the near future... maybe we will be playing a game with characters made entirely out of fur... LOL... I hope someone makes a mod for a game with that in mind.

Also, I don't think there's anything wrong with testing a graphics card to find out what the maximum possible power draw could be. For thermodynamics' sake.
 
I am guessing it's so they would know the max possible draw, for some peace of mind. You never know when a program might work those graphics cards as hard as Furmark does. And you never know what might happen in the near future... maybe we will be playing a game with characters made entirely out of fur... LOL... I hope someone makes a mod for a game with that in mind.

Also, I don't think there's anything wrong with testing a graphics card to find out what the maximum possible power draw could be. For thermodynamics' sake.

Hehe, not only all characters made out of fur, but all buildings, geometry, terrain, etc... :) It would certainly be a funny, if impractical, application.

And considering Furmark consumes about 50-100% more power than the most strenuous real-world application or game (even 3DMark, the benchmarker's game :D)... Uh, yeah, I think there's a lot of overhead there.

Regards,
SB
 
And considering Furmark consumes about 50-100% more power than the most strenuous real-world application or game (even 3DMark, the benchmarker's game :D)...
This is greatly exaggerated. From the numbers posted above, you can see FurMark "only" increases power draw from about 150W to 200W for an HD 5870 compared to a "real" real-world application, which is about 33% more. All Radeons show a similar increase, whereas for the GeForces (particularly the GT200b-based ones) it is much less.
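For reference, the arithmetic behind that 33% figure, using only the wattages quoted above (the 150 W and 200 W values are the approximate HD 5870 numbers from the post; nothing here is new measurement data):

```python
# Arithmetic only: 150 W / 200 W are the approximate HD 5870 figures quoted above.
game_load_w = 150.0   # approx. draw in a demanding "real" game (quoted above)
furmark_w   = 200.0   # approx. draw under FurMark (quoted above)

increase = (furmark_w - game_load_w) / game_load_w
print(f"FurMark draws {increase:.0%} more than the game load")  # ~33%

# For comparison, the earlier "50-100% more" claim would imply:
for extra in (0.5, 1.0):
    print(f"{extra:.0%} more would be about {game_load_w * (1 + extra):.0f} W")
```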
 
This is greatly exaggerated. From the numbers posted above, you can see FurMark "only" increases power draw from about 150W to 200W for an HD 5870 compared to a "real" real-world application, which is about 33% more. All Radeons show a similar increase, whereas for the GeForces (particularly the GT200b-based ones) it is much less.

Do you have any framerates in relation to that? Please?
 
Do you have any framerates in relation to that? Please?
You mean for FurMark? I know I saw some article somewhere which also looked at FurMark performance apart from power usage, to get some idea of efficiency, but I can't remember where that was...
Too bad there are no Bioshock fps numbers in that article either, but according to http://ht4u.net/reviews/2009/leistungsaufnahme_grafikkarten_games/index2.php they test power usage at 1680x1050, highest details but no AA, so I guess you could find some numbers elsewhere.

Btw, also according to that article, power consumption in FurMark is indeed more than 33% higher if you consider average power consumption instead of maximum (of course FurMark's load is constant, but games' loads are not).
 
Since no one runs FurMark as a framerate benchmark, it's ridiculous to only show power draw and leave FPS out. FurMark then becomes an (un)realistic display of power consumption with no reference.

And how exactly does that make it non-real-world? Does everything without a (however meaningless) score not count? No one plays 3DMark either; it only has an abstract score. Better? No one can play Unigine either.

Or is it because FurMark showed something that some people would rather have left undiscovered? That, at least, is my assumption.

Do you have any framerates in relation to that? Please?

HD 4770: 24
HD 4870 1G: 51
HD 5770 1G: 35
HD 5850 1G: 59
HD 5870 1G: 72
(all at 1280x1024 fullscreen, no AA/AF, extreme burning mode)

Doesn't seem very shader-limited, does it? Maybe one should see if it scales better with available bandwidth?
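One quick, back-of-the-envelope way to check that suspicion is to normalize the FPS figures above by each card's theoretical shader throughput and memory bandwidth. The sketch below does that; the GFLOPS and GB/s values are the usual reference-card specs quoted from memory and should be treated as approximate assumptions, not data from the thread.

```python
# FurMark FPS from the post above, normalized by theoretical shader
# throughput and memory bandwidth. The GFLOPS / GB/s figures are
# reference-card specs quoted from memory -- treat them as approximations.
cards = {
    #            FPS  GFLOPS   GB/s
    "HD 4770": ( 24,    960,  51.2),
    "HD 4870": ( 51,   1200, 115.2),
    "HD 5770": ( 35,   1360,  76.8),
    "HD 5850": ( 59,   2088, 128.0),
    "HD 5870": ( 72,   2720, 153.6),
}

print(f"{'card':9}{'fps/TFLOP':>11}{'fps per 100 GB/s':>18}")
for name, (fps, gflops, bw) in cards.items():
    print(f"{name:9}{fps / (gflops / 1000):11.1f}{fps / (bw / 100):18.1f}")

# With these (assumed) specs, fps per 100 GB/s comes out nearly constant
# (~44-47) across all five cards, while fps/TFLOP makes the HD 4870 a big
# outlier -- consistent with the suggestion that FurMark here scales with
# bandwidth rather than shader throughput.
```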
 
Hmm, it would be interesting to discover why it forces the graphics card to draw so much power. I would expect such behaviour from code that is limited by ALU+TMU performance, not by bandwidth.
 
I am wondering about that too. It doesn't seem to be raw ALU performance, nor does it seem to be interpolator-limited.
 
HD 4770: 24
HD 4870 1G: 51
HD 5770 1G: 35
HD 5850 1G: 59
HD 5870 1G: 72
(all at 1280x1024 fullscreen, no AA/AF, extreme burning mode)
Doesn't seem very shader-limited, does it? Maybe one should see if it scales better with available bandwidth?
5770@ 700/1100: 31
5770@ 825/1100: 32
5770@ 950/1100: 33
5770@ 700/1400: 31
5770@ 825/1400: 36
5770@ 950/1400: 39

Hmm... looks okay balanced; it's just strange that the 4870 is so fast then:
5770@750/1100: 33
5770@750/1445: 35
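To make the "looks okay balanced" reading explicit, here is a small sketch that turns the clock sweep above into percentage scaling figures; the FPS numbers are copied straight from the post, only the comparison script itself is new.

```python
# FurMark FPS from the 5770 clock sweep above, keyed by (core MHz, mem MHz).
fps = {
    (700, 1100): 31, (825, 1100): 32, (950, 1100): 33,
    (700, 1400): 31, (825, 1400): 36, (950, 1400): 39,
}

def gain(a, b):
    """Percentage FPS gain going from clock configuration a to b."""
    return (fps[b] - fps[a]) / fps[a] * 100

# +36% core clock at a fixed memory clock:
print(f"core 700->950 @ mem 1100: +{gain((700, 1100), (950, 1100)):.0f}% fps")  # ~+6%
print(f"core 700->950 @ mem 1400: +{gain((700, 1400), (950, 1400)):.0f}% fps")  # ~+26%
# +27% memory clock at a fixed core clock:
print(f"mem 1100->1400 @ core 700: +{gain((700, 1100), (700, 1400)):.0f}% fps")  # ~+0%
print(f"mem 1100->1400 @ core 950: +{gain((950, 1100), (950, 1400)):.0f}% fps")  # ~+18%

# At the lower 700 MHz core clock the extra bandwidth buys nothing (core-bound),
# while at 950 MHz the card clearly wants the bandwidth -- i.e. the two are
# reasonably balanced, as suggested above.
```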
 
The 4870 doesn't have any hardware to regulate power consumption by rogue power-virus programs/code.

For the same reason you won't see power draw go through the roof with RV870, you won't see performance go through the roof either.

Regards,
SB
 
The 4870 doesn't have any hardware to regulate power consumption by rogue power-virus programs/code.

For the same reason you won't see power draw go through the roof with RV870, you won't see performance go through the roof either.
I don't think that's what's happening. If that were the case, you'd see higher performance with better cooling or lower GPU voltage.
 
I don't think that's what's happening. If that were the case, you'd see higher performance with better cooling or lower GPU voltage.

How do you know you wouldn't? And why would better cooling mean less power draw? I think I'm confused. Are you referring to the 5770 or the 4870 with the above statement?

The 4870 doesn't have any hardware to regulate power consumption by rogue power-virus programs/code.

For the same reason you won't see power draw go through the roof with RV870, you won't see performance go through the roof either.

Regards,
SB

Don't ATI's drivers detect and handicap FurMark on any hardware (and it still handily bests Nvidia solutions, even then)?
 
How do you know you wouldn't? And why would better cooling mean less power draw? I think I'm confused. Are you referring to the 5770 or the 4870 with the above statement?
The 5xxx series. Better cooling does indeed mean less power consumption. I always thought this would only make a pretty marginal difference, but that is actually not the case; the difference is quite dramatic. You can see that for instance here: http://ht4u.net/reviews/2009/amd_radeon_hd5000_fertigungsschwankungen/index5.php (that article was about showing differences between one chip and another, but you can see what a large difference it makes when the chip stays cooler).
You are right, though: I don't know that this isn't somehow hardware-limited. That was just a guess based on the fact that the percentage difference between Bioshock and FurMark was similar on the 5870 and the 4890, though it's possible Bioshock at those settings creates less GPU load (percentage-wise) on the 5870 than on the 4890 (it could be CPU-limited, for instance). Someone would need to test it to be sure, unless that power-draw-limiting logic is crude and doesn't actually have anything to do with power draw, but is based only on some units-busy indicators, in which case no amount of cooling, undervolting or downclocking would change anything...
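Purely to illustrate the distinction being drawn here between a limiter keyed off actual power draw and one keyed only off units-busy counters, a toy sketch follows. Nothing in it is documented RV8xx behaviour; the function names, thresholds and wattages are made up for the example.

```python
# Toy illustration only -- the real RV8xx mechanism is not publicly documented.
# Two hypothetical ways a GPU could decide to throttle a "power virus":

def throttle_on_power(measured_watts, limit_watts=190):
    """Throttle only when estimated/measured board power exceeds a limit.
    Lower voltage or better cooling (less leakage) lowers measured_watts,
    so it also raises the effective performance ceiling."""
    return measured_watts > limit_watts

def throttle_on_activity(alu_busy_fraction, busy_limit=0.95):
    """Throttle whenever the ALUs stay busier than some threshold.
    This ignores actual power draw, so cooling, undervolting or
    downclocking would not change when the throttle kicks in."""
    return alu_busy_fraction > busy_limit

# Example: a cooler, undervolted card drawing 170 W at 97% ALU occupancy.
print(throttle_on_power(170))      # False -> free to run faster
print(throttle_on_activity(0.97))  # True  -> throttled regardless
```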

Don't ATI's drivers detect and handicap FurMark on any hardware (and it still handily bests Nvidia solutions, even then)?
No, not on the HD 5xxx: http://ht4u.net/reviews/2009/amd_ati_radeon_hd5870_part2/index5.php
And no, with the driver limit the HD 48xx no longer beats the GTX 275 (they actually have FurMark scores there!).
 