HD 4870 X2 (R700) review thread

Discussion in '3D Hardware, Software & Output Devices' started by willardjuice, Jul 14, 2008.

  1. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
     Higher resolution does little on its own to diminish the need for AA, because DPI usually stays the same. The only time you see DPI rise is when a smaller monitor carries the same resolution as a larger one (1920x1200 on a 23" panel versus a 24" panel, for example), and even that effect is negligible with regard to aliasing.

    Now if someday LCD panels come out that support high DPI (think at least 3840x2400 on a 24" screen) that are reasonably priced and reasonably responsive for the consumer space, then you might have an argument there for a lesser need of AA. Although considering how much GPU power you'd need for that, you'd probably still be better off running at 1920x1200 with 8xAA on a 24" screen, at least for edges.
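     The pixel-density arithmetic behind that comparison is simple. As a quick sketch (the panel sizes and resolutions are the ones from the discussion, nothing else is assumed):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A common 24" 1920x1200 panel vs. the hypothetical high-DPI 24" panel above:
print(round(ppi(1920, 1200, 24), 1))  # ~94.3 PPI
print(round(ppi(3840, 2400, 24), 1))  # ~188.7 PPI
```

     Note that doubling the linear density (94 to 189 PPI) means pushing four times as many pixels, which is why rendering at the lower resolution with 8xAA tends to be the cheaper route for edge quality.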

     Increasing the number of polygons is only going to increase the need for AA, since you now have far more edges with far more jaggies crawling around the screen.

     Texture quality won't affect it in any way either.

     Lighting CONTRAST could affect it, I suppose: if EVERYTHING were uniformly dark and of a similar color, jaggies would be less noticeable, but they would then be even more jarring when you came upon a high-contrast area. Look at Doom 3 as an example of this.

    AA (in my opinion) is still the single largest contributor to immersion in a game.

     Using texture quality as an example: Bioshock has absolutely horrible, extremely low-resolution textures, yet the game can still be immersive... if there are no jaggies. The fact that it's also almost uniformly dark with very few high-contrast areas also helps minimize the influence of jaggies if you can't get AA to work. But there are still quite a few areas where the lack of AA will immediately drag me out of the game, even if I somehow managed to stop noticing it in a darker area. Thankfully, it's possible to run Bioshock with AA.

    Regards,
    SB
     
  2. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
     I must say, it kinda sucked when I couldn't get the jaggies to disappear in the first Serious Sam game using the highest res my monitor supports (1280x1024, sadly) and 16xQ CSAA. I would have thought that'd do it. Guess I'll have to try supersampling and hope the performance doesn't drop too much...
     
  3. suryad

    Veteran

    Joined:
    Aug 20, 2004
    Messages:
    2,479
    Likes Received:
    16
     I must agree with I.S.T. there. I had Assassin's Creed running at 2560x1600 with max AF and I think 4x AA (don't remember exactly), and it looked stunning on my 30-inch. The performance was way better than I had expected, but the jagged shadows drove ME NUTS! My god, is it really that hard to make the jaggies go away?

    Disclaimer: The last question is a result of my lack of knowledge on the AC engine etc etc.
     
  4. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile

    OK, I confess I'm baffled as to why you didn't understand what I said...;) By "cheapness" I meant the <$300 markets, which is the market for consoles. Many people spend more for a 3d card than that, so I'd say "cheap" is right on the money. "You get what you pay for," etc. The reason why console devs have by comparison "limited" resources to tap is precisely because the console costs much less than even some single 3d cards cost alone--the limited resources exist because of the economics that dictate what console resources can be. Hopefully, that's clear. When you say "consoles are vastly more powerful than the last generation" I *hope* you are talking about the last generation of console--which would be true. But when you are talking about fluid 3d-gaming at 1920x1200 or higher, we aren't talking about consoles, are we?

    As I said, developers have two markets, not one. They have to cater to the console market and to the PC market, and the demands of both markets are so different that many companies either do console versions or PC versions but not both.

     Again, AQ (art quality) is an entirely different subject from IQ, because IQ is the province of the 3d gpu, whereas AQ is the province of the game developer. It is entirely possible to have great AQ and rotten IQ, or vice versa, and indeed when looking at many console games versus their PC counterparts, and even at games which have no counterpart on the other platform, the difference is clear. People who pay more do indeed expect more. That's a well-established marketing reality that predates this discussion by centuries...;)
     
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
     Sure, but you're implying that if consoles cost much more than they do now and sported much more powerful hardware, developers would leverage that additional power to enable AA. I'm merely pointing out that the leap in performance from last-generation consoles didn't bring this to pass, so what makes you think that even more powerful and expensive console hardware would? Is there some arbitrary ceiling on AQ that you think console devs would be happy with, at which point they would divert resources to "IQ"?

     Well, creating a subjective (and arbitrary) distinction between AQ and IQ doesn't really change the argument. Using your terms, all I'm saying is that AQ improvements (art, character and environment design, etc.) have a bigger impact than IQ improvements (adding AA). No matter how much "IQ" you add to an older game with poor AQ, there's very little you can do to improve the experience that way. However, there is unlimited potential for improving AQ, and I suspect that this is why developers have focused their resources in that area.

    Yep, but how is that relevant to what we're discussing here....?
     
  6. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
     trini: sure, not every game this gen on 360/PS3 supports AA, but you must admit the proportion is much greater than in the previous generation...
     
  7. SirPauly

    Regular

    Joined:
    Feb 16, 2002
    Messages:
    491
    Likes Received:
    14
     It just looks better with AA enabled for many; it really can't get simpler than that for me. Aliasing hurts immersion, which is why IHVs try to offer the feature and to force it on when possible.

     When I see new graphic effects, or the cool factor of new shadows, shading or lighting, I'm amazed at their potential for immersion... but after some time the aliasing creeps into the mindset and the thought arrives: "Man, if only there were no aliasing, or less aliasing -- this would look so much more real!"

     They're all important moving forward: shadows, lighting, tessellation, physics, AA, filtering, shaders. However, AA has been a constant on the PC scene for many, many years and is sort of a foundational feature, like filtering -- you just expect to be able to use it.
     
  8. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Don't think anybody said AA makes things look worse....

     It really hasn't, though. AA today is still a nice-to-have additional feature that isn't even supported by some prominent game engines on the PC. It is a far cry from a standard feature like texture filtering. It's a luxury today, just as it always has been. One day developers will have full control over AA and apply it liberally, but we aren't there yet.

    Yep, we're definitely making progress. Maybe new engines like ID Tech 5 will free up enough bandwidth to allow for more titles supporting AA to come on the scene but we'll see.....
     
  9. SirPauly

    Regular

    Joined:
    Feb 16, 2002
    Messages:
    491
    Likes Received:
    14
     It may be a luxury to certain developers, but how is it a luxury for the PC gamer overall?

     For 150 dollars today, an end-user can add 8x AA with a lot of performance for the price point. Products like the GeForce3 Ti 200 at 199 dollars in 2001 opened the door to AA for the mid-range, and things have only improved since then. Thousands upon thousands of PC games can be enjoyed with AA... there is only a small number that can't.
     
  10. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    more frequently on the 360 than the PS3
     
  11. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    No, I wasn't implying that at all...;) I was stating that devs already write software for far more expensive environments--which is what I mean by "PC"--it's on the PC that a single 3d card can cost as much or more than a console, and it's on the PC that resources are far more abundant. They are writing for two markets--consoles, and the much more powerful, and costly, PC. Often developers write for one or the other, and only sometimes do they write the same games for both. And just as often, the limitations of the console-written game are not addressed properly in the PC port, and the PC player is disappointed--whereas often the console player is as happy as a clam--because that's all *he* expects.


    There's nothing subjective about it. Whatever a given game's "AQ" might be, the IQ capabilities of the gpus will be the same. For instance, a game with crummy graphics won't anti-alias any better than a game with splendid graphics or vice-versa--assuming both versions allow the gpu to anti-alias at all.

    I think it's a mistake to think that AQ outweighs IQ because I think they are completely separate categories and to lump them together as "IQ" precludes a proper analysis of either.


    Because you said that you didn't understand what I meant by "cheapness"...;)
     
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
     But it should be. Anisotropic filtering came out after AA was first made usable on cards, and yet there are still games that give an option for AF (thank goodness) but no option for AA (boo) within the game itself.

     Since DX10, game devs have had full control over AA in-game. And yet very few DX10 games even offer in-game AA options, although I "think" the number of titles where you do have the option is increasing.

    And with DX10.1 devs have even MORE control over AA. And yet we have one IHV actively discouraging the use of DX10.1.

    Here's the thing, and both sides will argue this until they are blue in the face.

     One camp will always argue that art assets, special effects, etc. make or break whether a game is considered good looking, immersive, wonderful or whatever, and that AA only enhances this.

     The other camp will always argue that without AA no game truly looks good, because they spend the majority of their time focused on the jaggies crawling all over the screen. Not because they want to, but because the dang things are everywhere and they cannot ignore them. Basically, without AA, what would otherwise be a good-looking game becomes just another piece of bleh, whatever...

     This was most forcefully brought to mind when a friend convinced me to play EVE Online recently. I happily went and installed the Premium graphics version, thinking that would be the best-looking version of the game.

     However, on running the game, I couldn't get past the absolutely HORRIBLE aliasing that is everywhere you look. Ships are aliased, space stations are aliased (and extremely badly), planets are aliased, asteroids are aliased, etc. So I uninstalled it and went to the Classic (read: OLD graphics) version, and the game looks SOOOOO much better since I can run it with AA. It actually is a pretty darn fun game, if a bit overwhelming.

     Sure, I'm missing out on all the special lighting effects, better textures, etc. But the game looks about 500% better with the old, dated art assets due solely to the fact that a jaggy isn't crawling itself onto my face (that's how it feels). Only occasionally do I still think about how nice the Premium version would probably look if it had AA. But it doesn't. And as a result, the classic old art assets and old special effects (or lack thereof) look much better.

     When it comes down to it, there are just people who can tolerate aliasing and artifacts in exchange for new art assets and special effects, and then there are people who cannot tolerate aliasing no matter what.

     I still have yet to play GRAW or GRAW 2, even though Ghost Recon (up to that point) was one of my all-time favorite series. I still play the older Ghost Recon games, but I doubt I'll ever play GRAW or GRAW 2. Drivers might or might not be able to force AA in them now (I honestly have no idea), but the first experience (without AA) has already ruined those games for me. Basically, 50 dollars down the drain on GRAW; I wasn't silly enough to buy GRAW 2 after that mistake.

    Regards,
    SB
     
  13. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Yep, console gamers aren't used to having AA and are ostensibly more jaggy tolerant but I don't see how that ties back to my earlier comment. The cost comparison really doesn't hold up as console hardware is heavily subsidized and I don't think you can simply use PC games as a yardstick for what console devs would do if they had faster hardware today.

     What's certain is that on the next generation of consoles AA is going to be a much smaller part of the overall workload and should be relatively free on the hardware of the time, so there'll be no excuse for not enabling it. But today it still takes a good chunk of performance, ~15% even on RV770 for 4x. Depending on how things turn out, that's a lot of horsepower that could be put to use elsewhere.
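     As a rough sketch of where that cost comes from: multisampling multiplies the per-pixel color and depth storage, and the bandwidth to touch it. The byte counts below are illustrative assumptions (32-bit color plus 32-bit depth, no compression), not RV770 specifics; real GPUs compress these surfaces heavily, so treat the figures as an upper bound:

```python
def msaa_framebuffer_mib(width: int, height: int, samples: int,
                         bytes_per_sample: int = 8) -> float:
    """Uncompressed color+depth storage for an MSAA render target.
    bytes_per_sample assumes 32-bit color + 32-bit depth per sample."""
    return width * height * samples * bytes_per_sample / 2**20

print(round(msaa_framebuffer_mib(1920, 1200, 1), 1))  # ~17.6 MiB without AA
print(round(msaa_framebuffer_mib(1920, 1200, 4), 1))  # ~70.3 MiB with 4x MSAA
```

     A 4x jump in render-target footprint translating into only a ~15% frame-rate hit is exactly what sample compression and dedicated resolve hardware buy you.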

    All I care about is the final result. If AQ does more to improve that final result than IQ does then that's all that matters.

    It's simple really. Some people would rather add AA to this:


     [image]

    And others would prefer this with no AA:

     [image]
     
  14. Chris123234

    Regular

    Joined:
    Jan 22, 2003
    Messages:
    306
    Likes Received:
    0
     That is a horrible comparison. You can't say in one sentence that 4x AA is only a 15% hit in performance and then go on to compare HL2 with a game that has 10000% uglier art assets.
     
  15. Poro

    Newcomer

    Joined:
    Apr 13, 2008
    Messages:
    15
    Likes Received:
    0
     Sorry to break up your fierce discussion about AA, but I have a couple of questions about R700. I am buying a new computer, and currently I'm getting the PowerColor 4870 1GB as my GPU. I plan to get a 24" display in October, and I was wondering if I should get the HD 4870 X2 rather than the 4870 1GB. The resolution, 1920x1200, is quite big, so I would believe the X2 would help my performance quite a bit.

     So, is the microstuttering fixed in R700? Here is a test by the Finnish overclocking and hardware review site Muropaketti:
     http://plaza.fi/muropaketti/artikkelit/naytonohjaimet/ati-radeon-hd-4870-x2-r700,1

     Since most of you can't understand Finnish, I'll translate part of the microstuttering test here.

     My apologies if there are a few mistakes. So it seems like the microstuttering is fixed? 3-millisecond variations shouldn't show up much?

     Is Crossfire generally supported by games? From what I've seen of Crysis results, the X2 doesn't seem to have much advantage over a regular 4870, but in Call of Duty the difference is over 100 FPS.
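     For what it's worth, the kind of measurement such reviews do can be sketched like this (the timestamps below are made up for illustration, not Muropaketti's data): take frame-completion timestamps, derive per-frame times, and look at how much consecutive frame times differ. A few milliseconds of spread is even pacing; tens of milliseconds is visible micro-stutter.

```python
def frame_times_ms(timestamps_ms):
    """Per-frame render times from successive frame-completion timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_spread(times):
    """Largest difference between consecutive frame times -- a crude
    micro-stutter indicator. Near 0 ms means even pacing."""
    return max(abs(b - a) for a, b in zip(times, times[1:]))

# Evenly paced (the "fixed" case: ~1-3 ms variation at ~30 fps)
even = frame_times_ms([0, 33, 65, 98, 130])
print(pacing_spread(even))  # 1 ms

# Classic AFR stutter: frames alternate short/long
stutter = frame_times_ms([0, 10, 60, 70, 120])
print(pacing_spread(stutter))  # 40 ms
```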
     
    #155 Poro, Aug 3, 2008
    Last edited by a moderator: Aug 4, 2008
  16. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
     Micro-stuttering isn't fixed; it's only a real issue when one's frame rate is very low. Obviously the R700 is the fastest (single-board) multi-GPU solution, so it seems "fixed" simply because it's the fastest: it doesn't drop to very low frame rates as often as other multi-GPU solutions, and in turn the effects of micro-stuttering are less obvious.

     Having said all of that, it can be solved one way or another (i.e., dropping frames, etc.), and one can enable VSync (without triple buffering) to vastly cut down on micro-stuttering's effects. As you can tell from my sig, IMO it's a fairly insignificant problem.
     
  17. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    I don't see how you could've arrived at this conclusion, having actually read Poro's translation of Muropaketti's R700 testing.

    The frame distribution has been leveled out, thus micro-stuttering has been solved (at least for the games tested). Oh, and if you don't think micro-stuttering would be visible going from ~20ms per frame to ~50ms per frame, you've never used an AFR solution.
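     To illustrate with assumed numbers why the FPS counter hides this: AFR frames alternating at ~20 ms and ~50 ms average out to a respectable counter reading, while the eye locks onto the slow-frame cadence.

```python
def avg_fps(frame_times_ms):
    """Average frame rate as an FPS counter would report it."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

alternating = [20, 50] * 30  # AFR pair: one fast GPU frame, then one slow one
print(round(avg_fps(alternating), 1))     # 28.6 fps on the counter...
print(round(1000 / max(alternating), 1))  # ...but a 20.0 fps worst-case cadence
```

     This gap between reported and perceived frame rate is exactly why frame-time plots, rather than FPS averages, are needed to judge multi-GPU solutions.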
     
  18. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
     Man, you just love to argue, don't you? :razz:

     Just go to that one German website that started this whole mess. They have a video showing micro-stuttering still present on the R700. It's there, just not as relevant as before, for the reasons already discussed.
     
  19. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,455
    Likes Received:
    471
  20. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
     They completely agree with what wj said. They showed Crysis at very low FPS, where the stutter is still visible. Same goes for HL2 -- who plays it at 22 fps? Seriously?

     Find PLAYABLE conditions and then report whether microstutter is still present/visible/noticeable. Hell... they even FAIL to mention the motherboard used, and that's surely something that has an effect on the stutter.
     