Oblivion: Shader or texture heavy?

Discussion in 'PC Gaming' started by horvendile, Feb 10, 2006.

  1. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    Someone should have asked this before, but I can't find it.
    Based on what we know now, is it possible to guess whether Oblivion will stress shader or texture capabilities of video cards the most?
    As I'm bragging about all over the forums, I'm in the slow process of buying a new computer and may have to choose video card before Oblivion is released. Right now I'm looking at midrange cards, where two obvious candidates are the X1600XT (SM3, shader strong) and the X800XL (SM2, texture strong).
     
  2. Tim

    Tim
    Regular

    Joined:
    Mar 28, 2003
    Messages:
    875
    Likes Received:
    5
    Location:
    Denmark
The X800XL is not shader-weak when it comes to performance: it has 16 shader processors versus 12 for the X1600XT. Its only weaknesses are the lack of SM3, FP16 blending, Fetch4, etc. Oblivion will use SM3 features to increase performance, so the X1600XT might get an edge in performance in spite of having fewer shader units. You can read about Oblivion's use of SM3 here:

    http://beyond3d.com/interviews/oblivion/index.php?p=02

    There are also other candidates, like the X800GTO, especially the models with 16 pipes such as the Sapphire GTO2 (which might require a BIOS flash to enable more than 12 pipes). It is also very likely that the GeForce 6800GS will beat both the X1600XT and the X800XL in Oblivion.
     
  3. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    Very good point, which I tend to forget.
     
  4. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    ...on the other hand, shouldn't 12 SM3 shaders at 590 MHz be faster than 16 SM2 shaders at 400 MHz, if SM3 is mainly used for performance?
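    That units-times-clock comparison can be put in a quick back-of-envelope. This is a minimal sketch using only the unit counts and clocks quoted in this thread; real performance also depends on work done per unit per clock, texture rate and memory bandwidth, so treat it as an upper-bound comparison only:

```python
# Rough back-of-envelope: shader units x core clock as a naive proxy
# for peak pixel-shader throughput. Figures are the ones quoted in
# this thread; per-clock efficiency, texture rate and bandwidth are
# deliberately ignored.

def shader_unit_cycles(units: int, clock_mhz: int) -> float:
    """Peak shader unit-cycles per second, in billions."""
    return units * clock_mhz * 1e6 / 1e9

x1600xt = shader_unit_cycles(12, 590)  # 12 SM3 units @ 590 MHz
x800xl = shader_unit_cycles(16, 400)   # 16 SM2 units @ 400 MHz

print(f"X1600XT: {x1600xt:.2f}B unit-cycles/s")  # 7.08
print(f"X800XL:  {x800xl:.2f}B unit-cycles/s")   # 6.40
```

    So on that naive metric the X1600XT does come out slightly ahead, which is exactly why the answer hinges on how much work each architecture gets done per clock.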

    I notice that I digress from my original question, but if anyone dares take an educated guess, it still stands!
     
  5. LeChuck

    Newcomer

    Joined:
    Jul 24, 2005
    Messages:
    1
    Likes Received:
    0
    Tim, why do you think it's likely that the 6800GS will beat the X1600XT? You've got the same number of shader "processors", but the X1600XT is clocked almost 40% higher. On top of that, the X1600 has much better support for branching (at least in theory), and I wouldn't be surprised to see FP10 HDR support in Oblivion.
     
  6. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,533
    Likes Received:
    139
    I think that was an overall statement... the 6800GS should be a better card due to its higher available memory bandwidth and higher number of ROPs. I don't think Oblivion will be the kind of game where math is so intensive that it completely overrides all other aspects of an architecture. And as a sidenote, does anyone else have the feeling that they dumbed down the graphics from what they were initially showing?
     
  7. Schaden

    Newcomer

    Joined:
    Jul 16, 2004
    Messages:
    117
    Likes Received:
    0

    They have drastically cut back on the dynamic shadows since E3, which is disappointing. If you've seen the recent German video, the graphics do not look as great as you'd expect, nowhere near as nice as the first screenshots. However, Bethesda claims the build in the video wasn't running with high quality settings and all the features enabled.

    I too would go with a 6800GS. The X1600XT might be close, I don't know. Definitely stick with an SM3-capable card. All of the press and interviews have compared the X360 graphics to the PC using "the latest SM3 video cards", so I think there may be some compromises with only SM2.
     
  8. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    Well, that is the question. The 6800GS is a little on the expensive side for me. The X1600XT and the X800XL with relevant cooling cost almost exactly the same, and going to the X850XT would only be a small stretch. If I go for the 6800GS however, I may as well take the 7800GT, and in no time at all I will have decided to buy the X1900XT.

    The X1600XT is "modern", supporting SM3 and its grandmother. It is, however, distinctly slow on the texture side. The X850XT is one fast card for the price, but only SM2 and so forth.
    In AoE3 for example, which I also want to play, the X1600XT supports everything but is too slow to be a good choice. The same may be true for Oblivion. A fast X850XT may be a better choice if not too much eye-candy is lost.

    Really, what we (I) want to know is:
    Will SM3 only bring speed in Oblivion, or will it actually look different too? Didn't the B3D interview indicate that it was mainly used for speed?

    Also, do we actually know whether HDR+AA will be possible on the X800 series? It will surely be on the X1600XT, but again, I fear it will be too slow.
     
  9. Sobek

    Sobek Locally Operating
    Veteran

    Joined:
    Dec 17, 2004
    Messages:
    1,774
    Likes Received:
    18
    Location:
    QLD, Australia
    I think that, while feasible, the performance hit will be too large on earlier x8xx series cards to warrant including it. Perhaps a later patch will provide 'experimental support' or somesuch, but I doubt that'll be an option in the retail game.
     
  10. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Huh? I'm seeing the 6800GS as being slightly cheaper than the X1600XT.

    Regardless, though, with ATI now having SM3 parts available, there is absolutely no reason whatsoever to purchase an X800XL. It doesn't support SM3. It doesn't even support HDR at all (well, in the most common implementations of it), let alone AA+HDR.

    But the 6800GS should be cheaper than an X1600XT, and I believe it performs a bit better.
     
  11. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    I'd say the 6800GS out performs the X1600XT in all cases. I have a 6800GS currently, pretty nice card and a good overclocker. The fan is decently quiet, nothing to write home about though. A good card overall, just make sure you set it for high quality in the drivers if you want the best IQ.
     
  12. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,592
    Likes Received:
    1,508
    Er, technically not cheaper (not sure it's even possible given the sheer PCB size and doubled RAM traces), but for the extra $5-15 the GS seems to be a more balanced part, at least in current games. Who knows if the X1600XT's extra shader power (OK, not in terms of MADDs, but otherwise it's 12 @ 425MHz vs. 12 @ 590MHz) will help in future ones, or if it'll be handicapped by its texture units. I forget exactly how the X800XL compares to the 6800GS, but I'm guessing they're close, and at this point SM3 should tip you to the GS.

    At least, not cheaper in the USA according to PriceGrabber, where a Sapphire X1600XT is $165 @ ZZF and an eVGA 6800GS is $170AR @ ZZF or NewEgg.

    But I'm guessing you're not shopping in the USA, horvendile?
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    As for X800XL vs. 6800GS, bear in mind that the 6800GS is pretty much identical in performance to the 6800GT, which did usually beat the X800XL.
     
  14. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,592
    Likes Received:
    1,508
    Hmmm, I don't quite remember it usually winning, or at least not with AA. I glanced at TR's 6800GS review and Xbit's last roundup and see that it's typically tied or else trades leads with the XL.

    I do remember the GS is faster than the GT in some unexpected situations. Xbit's 6800GS review does show that without AA it stomps an XL more often than not.

    So I'd still choose a GS given equal prices, but that's a theoretical call for SM3 and HDR over 6xAA and perhaps less texture shimmer (and for equal prices, which may not be equal if you're not in the US). Not so theoretical is the apparently higher performance in quite a few situations (mostly non-AA). And even though I'm still leaning against SLI in this price bracket (see how the 50% pricier 7800GT compares to a single 6800GS in AoE3, CoD2, FEAR), the option can't hurt.

    But we've already seen pics of a 7600GS/GT, so I'd wait to see how that shakes out. It may be only 128-bit, but 12 slightly tweaked pipes clocked maybe 50% higher would be nothing to sneeze at.
     
  15. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    I wasn't really intending this thread to be about which graphics card I should buy, but hey, with so many people willing to help me, why not?

    Short background: I'm too old, lazy and clumsy to build my own computer. I want to specify it exactly, but not do the actual building. Thus, I want to buy everything from the same store. Furthermore, I want a relatively quiet computer. The store I have selected is not particularly cheap, but nor is it expensive (for being in Sweden), and they specialize in building quiet computers. This means that standard cooling of the video card will not do. Most of all I want a heatpipe solution, but failing that I will settle for a Zalman or an Arctic Cooling solution.
    Prices are in SEK, with 7.7 SEK for a US dollar.
    Relevant prices, including VAT, at the time of writing are:

    Radeon X1600XT with heatpipe: 2440 SEK, 317 US dollars
    Radeon X800XL with Zalman cooling: 2480 SEK, 322 US dollars
    Radeon X850XT with Arctic Cooling, er, cooling: 2830 SEK, 368 US dollars
    GeForce 6800GS with separate cooling: About 3300 SEK, 425 US dollars
    GeForce 7800 GT with Arctic Cooling: 3705 SEK, 481 US dollars
    Radeon X1800XL with separate cooling: About 4600 SEK, 600 US dollars
    Radeon X1900XT with separate cooling: About 6200 SEK, 800 US dollars

    Where I write "separate cooling", a separate cooling solution will have to be bought and mounted, the cost for this is about 500 SEK (65 US dollars). If these prices are way above US prices, well, that's my reality.
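    The SEK-to-USD figures above can be sanity-checked in a couple of lines. A minimal sketch, using the 7.7 SEK/USD rate stated above and only the cards with exact SEK prices, rounded to whole dollars:

```python
# Sanity check of the SEK -> USD conversions above, using the
# 7.7 SEK/USD rate stated in the post, rounded to whole dollars.

SEK_PER_USD = 7.7

prices_sek = {
    "X1600XT (heatpipe)": 2440,
    "X800XL (Zalman)": 2480,
    "X850XT (Arctic Cooling)": 2830,
    "7800GT (Arctic Cooling)": 3705,
}

for card, sek in prices_sek.items():
    print(f"{card}: {sek} SEK = {round(sek / SEK_PER_USD)} USD")
# X1600XT: 317, X800XL: 322, X850XT: 368, 7800GT: 481
```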

    The X1900XT is way too expensive. I just won't pay that amount of money for a video card until I'm filthy rich.
    I could possibly pay the price of the X1800XL... for the X1900XT, and perhaps for an upcoming X900XL. But not for an X1800XL. If prices went down to 7800GT levels, I might shell out for an X1800XL.
    But, as mentioned in an earlier thread, I was almost - the key word being almost - content with buying a somewhat cheaper card and upgrading in a year or so (which is shorter than the normal lifespan of my video cards). Were that to be my choice, the X850XT seemed a reasonable upper limit.
    The 6800GS falls somewhere between. It is not cheap and it is not the latest and greatest.

    So, what about the 7800GT? Well, there is this thing about AA. I am given to understand that ATI generally provides nicer-looking AA. Furthermore, isn't it impossible to get HDR and AA at the same time with the GT? Shame for such an expensive card.
    Speaking of HDR+AA, in Oblivion, I assume it's going to work with the X1000 series (though I suspect that in practice, the X1600XT will be too slow). Do we know how it stands for X800 series and GeForce series?

    For performance figures, I look to the couple of months old but very useful X-bit labs survey (mentioned by Pete above):
    http://www.xbitlabs.com/articles/video/display/games-2005.html
    By the way: AoE3 is also important for me, and yes, I know that the X800 series does not support SM3. I can probably live with that in AoE3, but it would be kind of a bummer if I missed out on the eye candy in Oblivion.
     
  16. horvendile

    Regular

    Joined:
    Jun 26, 2002
    Messages:
    418
    Likes Received:
    2
    Location:
    Sweden
    PS:
    Perhaps I should put myself in wait mode, waiting for the G71 to be announced (in February, right?) and hope for prices to go down. The bad part of that is that, well, waiting is boring and I really want a less noisy computer as soon as possible.
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    The first part is no longer true, at the same number of samples. ATI does offer the additional 6x mode, but nVidia now supports things like gamma correct AA with the GF7 (caveat: I don't think a satisfactory comparison has yet been done), and has more options for transparency AA.

    No and no. No HDR at all for the X800 series. No AA with HDR for the GeForce7 series (unless you're willing to let the software do supersampling, which is really slow).
     
  18. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    I don't remember whether or not I posted this tidbit here, but Pete Hines did say that SLI support is in, but he wasn't sure whether CrossFire would work OOTB with Oblivion. Something to consider for those debating upgrades for this game (and who might be in the oh-so-faddish niche market for dual-card graphics solutions <g>).
     
  19. Altcon

    Regular

    Joined:
    Sep 19, 2003
    Messages:
    357
    Likes Received:
    1
    Location:
    side A
    I think Bethesda sort of answered that question with the latest preview, running on an X1900XTX...
     
  20. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
    link :???:
     