No 6800 Ultra Extreme, still the same 6800 Ultra

Discussion in 'Architecture and Products' started by ultragpu, May 10, 2004.

Thread Status:
Not open for further replies.
  1. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    I, and I imagine a few others, would think it was the same situation. End of discussion really.
     
  2. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Well, I'm still holding out hope that nVidia will be able to significantly improve shader performance very shortly, and thus overcome the performance gap.
     
  3. Sage

    Sage 13 short of a dozen
    Regular

    Joined:
    Aug 22, 2002
    Messages:
    935
    Likes Received:
    15
    Location:
    Southern Methodist University
No. The base will be SM2. Why? Because the vast majority of DX9 cards in people's systems will be SM2. And, actually, most of them aren't even capable of performing DX8 shaders (most "DX9" cards actually in systems are FX 5200s). Even if ATi did have an SM3 part out now, it wouldn't change that.

    And, for the record, I am going to be buying a 6800... yes, I'm buying it for SM3. I don't expect it to make a huge difference, but I do want to be able to turn on those few extra features that will be SM3-only (probably mostly VS3, actually). I am disappointed that ATi does not have VS3 support, but the lack of PS3 is not a letdown in the least. However, it's definitely not going to be the end of the world as you like to portray it.
     
  4. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    That sounds way too much like the old "just wait until the real drivers are out for nVidia's card and you'll really see what it can do!" magic-driver line, Chal.

    Hold off buying it until they come out with the magic drivers would be my best advice if that's what you're feeling. ;)
     
  5. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Except I didn't say that, Digi.
     
  6. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    It hasn't been that so far, and from all I've ever learned about humans, it never will be. At the end of the day, every single argument is utterly pointless, but that hasn't stopped people from debating whatever topic. This just happens to be one that's been around too many times and shows no sign of ever coming to an end, is all.
     
  7. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    End of discussion for me... I don't see a point in going in circles.
    But you baited me and I am in the discussion again.
    Therefore you win. :p

    I disagree that every single argument is utterly pointless. Most are, some are not. In some arguments, as a spectator or active participant, you learn something new. However, most arguments revolve around the symptom and not the cause. What is the root of the problem? I believe it is simply the fact that no one likes to be proven wrong, even when they know they are. Massive blow to the ego and all that, old chum... ;)

    The reason it won't come to an end is that it's natural to have a difference of opinion. IMHO, of course.
     
  8. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    Hot damn. At the risk of ending up looking as if "winning" were of any importance to me with my post... this one actually made me laugh out loud with happiness. Hell if I know why; I never would have thought such a lousy day could possibly end with a smile.

    Maybe all is not completely pointless after all. Thus you win. :p 'nite all.
     
  9. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    This argumentation is just absolutely f'in ridiculous. nVidia released the original GF3 with 1.1 support in the spring of '01. It released FX hardware in the summer of '03, yet because of "mistakes" high-profile developers like Newell stated that they'd treat the lineup as DX8 parts. It's now summer of '04 and nVidia is close to releasing parts that can viably run software beyond 1.1.

    Essentially, between the spring of '01 and the summer of '04 we're looking at marginal tech improvements coupled with speed refreshes and, in your own words, a rather large "mistake" that did little, if anything, to advance the use of technology by developers. That's over three entire years, which in this industry is quite a bit of time!

    So your point is. . . .? <aside from twisting anything you can think of as damning against ATI>
     
  10. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
The GF4 was not a revolutionary architecture by any means. Like many have already said, it increased performance over the GF3 Ti line, and very little else. Why would ATI support SM3 when SM2 still has plenty of headroom for development? Do you think the three or so games that are currently out have put an end to SM2's lifespan? If you do, you're sadly mistaken. Companies don't just drop everything overnight for newer technology; business doesn't work like that. It's a gradual process that takes time, and when that time comes the next generation will be upon us (speaking like a preacher, aren't I). SM3 is an intermediate step because all it offers are improvements to the DirectX 9 standard. Just because a patch may be available for a game that adds a few features and optimizes code to run slightly faster doesn't mean it is a necessity to run the game.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Right. And when was it released? About one year after the GF3 line. It's been nearly two years since ATI released the Radeon 9700 Pro.

    This has always been the case at the juncture of a new architecture. Additionally, it's once SM2 really takes off that we'll start to see noticeable differences between SM2 and SM3 (as it's easier to extend to SM3 once SM2 is fully supported).

    I don't think I ever claimed anything like that. But high-end processors should be ahead of the technology curve, not riding it. You can't make games without hardware. I think it'd just be wrong for ATI to ride on nVidia's coattails, keeping the highest-performing products by not improving technology, and then not add in SM3 until nVidia's low-end SM3 hardware is relatively common, and games start to make noticeable use of SM3.
     
  12. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    What Chalnoth said. Ruby (an ATi demo) is proof of that.

    It would have been interesting to see NV30 as originally intended, on a Low-K process and clocked around 700 MHz.

    PS1.1 succeeded over 1.4 simply because it was an industry wide standard both on and off of the PC and set a new baseline.

    SM2.0 is just a waypoint on the DX9 map leading to SM3.0. In fact I believe it was initially quite unimportant until nVidia walked out of DX9 discussions and were subsequently unable to get fully to SM3.0 with NV30.
     
  13. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
Unlikely that it could possibly scale to 700 MHz. The announced speed of 500 MHz was its targeted clock (although they wouldn't have gone for the coolers they did).

    Wrong. SM3.0 is a waypoint to 4.0; it's a teaser. Whatever has the widespread support consensus is what's most important, and I doubt you'll find Intel supporting SM3.0.
     
  14. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    radar1200gs said:
    Why is SM2.0 only a waypoint? It looks to me like it's the lowest common denominator and the floating-point shader baseline. Until we see SM3.0 hardware in quantity, which won't happen with the 6800 series since they are all relatively high-end parts, the vast majority of the installed base, ATI or Nvidia, will have SM2.0 support when it comes to floating-point shaders.

    Also, your statement about SM2.0 being unimportant "until Nvidia walked out of DX9 discussions" indicates that you have a very Nvidia-centric viewpoint. However, even had they stayed and NV30 managed full SM3.0 support, the FX5200 (which is still Nvidia's best-selling DX9 part) would still have provided at best severely limited usable DX9 support.
     
  15. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
The timeframe doesn't matter; the point was that nVidia has done the same in the past. The only thing that matters is that SM2 is just now starting to be utilized. There simply is no reason to jump ahead by supporting SM3 atm.

    No, we won't see noticeable differences, as SM3 is a minimal upgrade.

    There is nothing wrong with it; it's called being smart. ATI isn't forcing nVidia to support SM3; they chose to go that path. ATI doesn't believe there is any benefit to it at this point in time. That is their prerogative, a conclusion they came to after extensive research. If nVidia had decided not to support SM3, that wouldn't have meant the end of progress in technology; both companies would have supported SM3 in their next product cycles.
     
  16. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    Low-k doesn't give that big of a performance boost, first of all, and have we all forgotten about the God-awful 128-bit memory bus on NV30?
     
  17. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Well, giving the NV30 the benefit of the doubt: I have seen a 500 MHz NV30 perform better than 400/850-clocked NV35s, and in some cases faster than the 450/850 Ultra cards.

    Multitexturing isn't quite as bandwidth-limited as single texturing, and the multitexturing fill rate was there. It's hard to say how shader-bound ops would have been bottlenecked by the memory config, but in my experience with my card, the huge memory improvement of my FX 5900 hasn't really provided a real benefit in my shader-bound titles.

    The NV30's real problem was, IMO, the integer units still on the core, which hurt its FP performance.
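
    ChrisRay's multitexturing observation can be sanity-checked with some quick arithmetic. A rough sketch, assuming the commonly reported 4x2 (pipelines x TMUs) configuration for both NV30 and NV35; this is peak theoretical fill rate only, not measured throughput:

```python
# Peak theoretical texel fill rate in MTexels/s: pipelines x TMUs x core MHz.
# 4x2 configs and clocks are the commonly reported figures for NV30
# (FX 5800 Ultra, 500 MHz) and NV35 (FX 5900 Ultra, 450 MHz).
def fillrate_mtexels(pipelines: int, tmus_per_pipe: int, core_mhz: int) -> int:
    return pipelines * tmus_per_pipe * core_mhz

nv30 = fillrate_mtexels(4, 2, 500)  # FX 5800 Ultra
nv35 = fillrate_mtexels(4, 2, 450)  # FX 5900 Ultra
print(nv30, nv35)  # 4000 3600
```

    On paper, the higher-clocked NV30 leads on multitexturing fill rate despite its narrower memory bus, which is consistent with the behavior described above.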
     
  18. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
Who's doing Low-K for the performance boost? You do it to lower signal interference at high frequency and enable faster clock speeds (edit: through a lowering of the total heat budget, in NV30's case).

    The 128-bit bus really doesn't matter if you can run at twice the speed of your competitor (you effectively have his 256-bit bus anyway).
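
    The "effectively have his 256-bit bus" claim is easy to put numbers on. A rough sketch using the published memory clocks of the two parts in question (effective DDR transfer rates; real-world bus efficiency is ignored):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) x effective MT/s.
# Published figures: FX 5800 Ultra ran a 128-bit bus at 500 MHz DDR
# (1000 MT/s effective); Radeon 9700 Pro ran 256-bit at 310 MHz DDR (620 MT/s).
def bandwidth_gbs(bus_bits: int, effective_mts: int) -> float:
    return bus_bits / 8 * effective_mts / 1000

nv30 = bandwidth_gbs(128, 1000)  # 16.0 GB/s
r300 = bandwidth_gbs(256, 620)   # 19.84 GB/s
print(nv30, r300)
```

    A 128-bit bus does need exactly double the clock to match a 256-bit one; in practice NV30's memory ran at roughly 1.6x the 9700 Pro's effective rate, so it still trailed on peak bandwidth.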
     
  19. Sage

    Sage 13 short of a dozen
    Regular

    Joined:
    Aug 22, 2002
    Messages:
    935
    Likes Received:
    15
    Location:
    Southern Methodist University
    Uhhhh... correct me if I'm wrong, but aren't those two generally tied together?
     
  20. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    There are other aspects of Low-K that can help boost performance; I suggest you read some of the links I and others have posted on the forum concerning Low-K.
     
