R520 = Disappointment

Discussion in 'Architecture and Products' started by Hellbinder, Oct 5, 2005.

  1. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Well, I honestly don't see the 512 MB XT falling in price competitively with the 256 MB 7800 GTX. The 256 MB version of the XT would be far more likely.
     
  2. Junkstyle

    Newcomer

    Joined:
    Sep 18, 2005
    Messages:
    158
    Likes Received:
    1
    Yeah, it's ironic that Nvidia gave up their algorithm for ATI's, and now ATI is going to the (angle-independent?) one. But doesn't ATI do it for a fraction of the performance hit that Nvidia used to take?

    Well, in the long run they both will eventually do it.
     
    #162 Junkstyle, Oct 6, 2005
    Last edited by a moderator: Oct 6, 2005
  3. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    I dunno, I'd want to see performance numbers that I haven't seen yet before I started yelling "hypocrisy" on this issue. Both relative hit and absolute FPS with it enabled. Context is all, and it seems to me that ATI has been pretty consistent with the "useable features" theme over the last several years, meaning the performance is there to do it at playable framerates, not just "for show".
     
  4. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism
    Yes, but you're missing a key thing here: Nvidia's AF quality has effectively worsened over its last two cores. From the FX to the NV40 to the G70, it has been downgrading in quality. ATI's AF quality has not done this from core to core, and they continue that with this one.
     
  5. croc_mak

    Newcomer

    Joined:
    Mar 26, 2002
    Messages:
    46
    Likes Received:
    0

    Bingo..


    Beats me that several "knowledgeable reviewers" benchmarked using Nvidia's default AF quality - which is a piece of s&^t IMHO
     
  6. Banko

    Newcomer

    Joined:
    Sep 7, 2005
    Messages:
    6
    Likes Received:
    0
    That is because ATi has used angle-dependent AF algorithms with every single core, while NVidia didn't use one on the FX; when people didn't complain about ATi's AF, they decided to do the same. Now ATi does what NVidia used to do.
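    To make the "angle-dependent" point concrete, here is a minimal sketch of the idea. The function name and thresholds are hypothetical, not any vendor's actual hardware logic: surfaces near the screen axes get the full anisotropy degree, while surfaces near 45-degree diagonals are clamped to a much lower one, which is what produced the flower-shaped AF test patterns of the era.

    ```python
    import math

    def max_aniso_for_angle(angle_deg, base_aniso=16):
        """Hypothetical angle-dependent AF clamp: full anisotropy at the
        primary axes (0, 90, 180, 270 degrees), reduced anisotropy near
        the 45-degree diagonals."""
        # Distance (in degrees) from the nearest 45-degree diagonal.
        from_diagonal = abs(angle_deg % 90 - 45)
        # Scale from 2x at the diagonal up to base_aniso at the axes.
        scale = from_diagonal / 45.0
        level = 2 * (base_aniso / 2) ** scale
        # Clamp to the nearest power-of-two AF level, as hardware exposes.
        return 2 ** round(math.log2(max(2, min(base_aniso, level))))

    # Axis-aligned surfaces get full 16x; 45-degree surfaces get only 2x.
    print(max_aniso_for_angle(0))   # 16
    print(max_aniso_for_angle(45))  # 2
    ```

    An angle-independent implementation would simply return `base_aniso` for every angle, which costs more texture samples on the diagonal surfaces but filters them equally well.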
     
  7. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    Maybe because they couldn't tell the difference but can now? People don't care when it helps performance and doesn't hurt IQ; they do care when it hurts IQ, though.
     
  8. Bob

    Bob
    Regular

    Joined:
    Apr 22, 2004
    Messages:
    424
    Likes Received:
    47
    How is G70's AF worse than NV40's?
     
  9. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
  10. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    Did we just hear the bell letting loose every single gamer out there to grab that new graphics card ATi made, the one the game devs like?

    In all honesty, it's stuff like this that brings the majority of the people I deal with into my shop. People eat it up when they know developers love a new card. I do too, and this is great news. Hopefully we'll see wide acceptance of all the new features on the X1k cards.
     
  11. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    You .. just .. shot .. yourself .. in .. the .. foot. ;)

    Amazingly accurate, yeah. :roll:
     
  12. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    didn't Carmack prefer NV3x over R3x0 for development? food for thought.

    (I am in no way, shape, or form equating R520 with NV30.)
     
  13. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism

    Shimmering was and still is far more pronounced. Nvidia has released drivers to correct it: first it was borked at the NV40 launch and then fixed via drivers to standards acceptable to most, and then the G70 launched plagued with the exact same problem all over again. Even after recent drivers I can still notice a difference in an IQ comparison from NV40 to G70, so the problem still very much exists, and it's far more pronounced in some titles. The worst part with angle-dependent AF is that you simply cannot get rid of the shimmering; it's always there, however minuscule or pronounced, and it will always cause a problem any time you use that type of AF in games. If Nvidia keeps on this path, which I doubt, it wouldn't shock me to see the problem arise for a third time on their next core.
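    The under-sampling behind that shimmering can be sketched with the standard texture-footprint math. The function `required_aniso` and its simplifications are illustrative, not any GPU's actual implementation: it estimates how elongated a pixel's texture footprint is from the texture-coordinate derivatives, which is roughly the number of AF taps needed to sample it without aliasing.

    ```python
    import math

    def required_aniso(dudx, dvdx, dudy, dvdy):
        """Estimate the anisotropy ratio of a pixel's texture footprint
        (major axis over minor axis) from its texture-coordinate
        screen-space derivatives."""
        px = math.hypot(dudx, dvdx)  # footprint extent along screen x
        py = math.hypot(dudy, dvdy)  # footprint extent along screen y
        return max(px, py) / max(min(px, py), 1e-9)

    # A floor seen at a grazing angle: the footprint is 12x longer in one
    # direction, so ~12 taps are needed. If an angle-dependent clamp only
    # grants 2x here, the texture is under-sampled and crawls ("shimmers")
    # as the camera moves.
    print(required_aniso(12.0, 0.0, 0.0, 1.0))  # 12.0
    ```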
     
  14. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Re my point above on performance with HQ AF, I'm still trying to catch up on reviews. . .but I just ran across this at [H] re XL on HL2:

     
  15. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    Isn't it going to vary completely from game to game (i.e., it depends entirely on how many angles are getting AF that wouldn't otherwise get as much)?
     
  16. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Sure, but has "tiebreaker" value, no? Much like NV40 got tiebreakers for features. Didn't we just have this conversation, in fact? :razz:
     
  17. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
  18. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    True, but in all honesty, I think he loves OpenGL more than anything and that makes him love Nvidia cards.

    I like new stuff. I was very happy with the NV40 cards because of some of the forward-thinking features they had, and I'm really happy with R5x0 and its forward-thinking features. I enjoy features that are useful now, and in the future when the other guy isn't offering them. Hehe, that's why I bought a 6800, and now I'm going to buy an X1800XL... when I know they OC well. Lol.
     
    #178 Skrying, Oct 6, 2005
    Last edited by a moderator: Oct 6, 2005
  19. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    Keep in mind that NV3x had fewer restrictions on the theoretical use of its PS units (e.g., longer instruction counts, silly things that were part of ps_2_a, etc.); it's just that using them for anything but development was, uh, dumb.
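    For context on the ps_2_a point, here is a rough comparison of the DirectX 9 pixel-shader profile limits, recalled from the D3D9 documentation of the time; treat the exact numbers as approximate rather than authoritative.

    ```python
    # Baseline ps_2_0 versus the NV3x-targeted ps_2_a profile (approximate,
    # from memory of the DirectX 9 docs).
    PS_PROFILES = {
        "ps_2_0": {
            "max_instructions": 96,  # 64 arithmetic + 32 texture
            "extras": [],
        },
        "ps_2_a": {
            "max_instructions": 512,
            "extras": ["arbitrary swizzles", "gradient ops (dsx/dsy)",
                       "predication", "unlimited dependent reads"],
        },
    }

    # ps_2_a allows far longer programs than baseline ps_2_0 -- the
    # "fewer restrictions" point above -- even if actually running them
    # on NV3x was, uh, dumb.
    print(PS_PROFILES["ps_2_a"]["max_instructions"] //
          PS_PROFILES["ps_2_0"]["max_instructions"])  # 5
    ```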
     
  20. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Mebbee. I went with what I had. I know you'll believe me when I say I'd have come back here with an opposite indicator, too. So it's somewhat better than anecdotal (because Brent is a trained observer), but I'd agree it's not conclusive.
     