R420 might only support pixel shader 2.0

Discussion in 'Pre-release GPU Speculation' started by rwolf, Feb 22, 2004.

  1. T2k

    T2k
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,004
    Likes Received:
    0
    Location:
    The Slope & TriBeCa (NYC)
  2. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    What makes you think that? You wanted a score, you got a score. You wanted a link, you got a link.
    You could probably find a proper review given sufficient time, but it doesn't look like there are too many out there with 3dmark03 results.
     
  3. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
Yes. P.s 2.0 capable games are finally coming out in the next few months, and it will most likely be a year before we see games that make use of p.s 3.0. So p.s 2.0 and image quality are much more important to me.

It's why I went from my 8500 to my Ti 4600 to the 9700 Pro.

I was going to get a 9800 Pro in September, but I skipped it when I found out Half-Life 2 wasn't coming.

So even if the NV40 gets 5fps more at the same settings as the R420, if the R420 has better image quality, it's R420 all the way.

2D is also very important to me, as I do a ton of video editing too.
     
  4. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
Performance almost isn't important - compatibility is. The 845G and 865G are hardly good-performing DX7 chips, after all, but we've talked to so many developers and often asked about upcoming feature utilisation, and all too often we get the reply "we have to have a DX7 path because of all those Intel integrated chipsets - get Intel to upgrade that".
     
  5. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
But how will it help if they have to add a DX8 path just because the DX9 features are too slow?

I seem to remember that Valve isn't too happy about the FX 5200 and even the FX 5600.
Maybe they're counting on most users being stupid. "DX9 capable = wow, now I can run all DX9 games."
     
  6. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    You don't add a DX7 codepath to run acceptably on EG2 - you add it to run at all. It's not the speed that's forcing you to write it, it's the lack of features. With EG3, developers will be able to just code in DX9 and know that it will run. They won't worry much about performance though, just like they don't worry now about speed on current integrated graphics.
     
  7. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    OK, you're really nitpicking here. Not only did the G400 come out well before that, but that was only a matter of a couple of months. EMBM was always part of DX7. NVidia just ignored it, as it isn't easy or cheap to implement, and performance was their main goal.

    Listen, I never said EMBM is superior to ps. Of course not, EMBM is a subset of ps (the texbem instruction). What I'm saying is that people are using texm3x3spec instead of texbem, which is mathematically and visually incorrect for flat surfaces like water. That instruction should be used for small objects which reflect in all directions. In fact, that's pretty much the only substantial difference between the original Radeon and the GF3 (pixel shader feature-wise). The Radeon couldn't do a per pixel matrix multiply and follow it with a cube map lookup. Even Carmack was commenting on this about the Radeon, and how dependent texturing was not quite flexible enough.

    The fact is many pixel shaders in use today can easily be represented by multitexturing, but it's just a pain to program. You can use EMBM for floor/wall/water reflections, fur rendering, offset mapping (or whatever you really want to call it), and many other things. ps 1.0-1.3 really just cleaned up multitexturing and added bumpy cubemap lookups.
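For the record, the texbem operation described above boils down to very little arithmetic: a signed (du, dv) pair read from the bump map is run through a 2x2 bump-environment matrix and added to the environment map's texture coordinates before the fetch. A minimal sketch in Python - sample_env and the matrix values here are illustrative stand-ins, not any real API:

```python
# Sketch of DX6-style EMBM (the texbem operation): perturb an
# environment-map lookup by a bump delta transformed through a
# 2x2 "bump environment" matrix. sample_env stands in for the
# hardware's texture fetch at the perturbed coordinates.
def texbem(sample_env, u, v, du, dv, m):
    # m = ((m00, m01), (m10, m11)), set per stage by the app to
    # scale/rotate the strength of the bump perturbation
    u2 = u + m[0][0] * du + m[0][1] * dv
    v2 = v + m[1][0] * du + m[1][1] * dv
    return sample_env(u2, v2)

# With an identity matrix, the bump delta shifts the lookup directly.
flat = lambda u, v: (round(u, 3), round(v, 3))
print(texbem(flat, 0.5, 0.5, 0.1, -0.2, ((1.0, 0.0), (0.0, 1.0))))
# prints (0.6, 0.3)
```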

    You're just proving my point and invalidating yours. The GF4MX was "significantly outclassing a part that has much better technology for the future". Whether it was drivers, performance, the NVidia name, or whatever, that's why you would buy it.

    You may be right, but my point is that they have impeded technology too, and you said that wasn't so.

    NV30-34 are fairly slow in DX8 pixel shaders as well, so don't blame this on DX9 again. By flooding the market with NV30-34, developers have really stopped trying with pixel shaders.
     
  8. ZoinKs!

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    782
    Likes Received:
    13
    Location:
    Waiting for Oblivion
There's not necessarily a speed difference between DX8 and DX9 paths on the same hardware. IIRC from the HL2 presentation, the R3x0 got the same fps whether running the DX9 or DX8 path. (However, there's an image quality difference.)

    If the shading hardware is slow in dx 9, perhaps it'll also be slow in dx 8. If it's fast in dx 8 shaders, perhaps it'll be fast in dx 9. Only time will tell... but in any case, it'll help ensure ps 2.0 as the new standard. A nice step forward, imo.
     
  9. volt

    Regular

    Joined:
    Oct 22, 2002
    Messages:
    365
    Likes Received:
    3
Well, I have nothing against GeForce4 Ti cards. They are still among the best cards for the buck. Try the Far Cry demo, for example; with 1.1 (1.3?) shaders, it's fast and looks darn good.
     
  10. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    No, not necessarily. But we know that there's at least one card that shows a big difference :)

    And it's selling pretty well also.

If it's reasonably fast in DX9 then I'd call it a nice step forward. If it's another FX 5200 class chip then I'm not so sure.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
[quote]OK, you're really nitpicking here. Not only did the G400 come out well before that, but that was only a matter of a couple of months. EMBM was always part of DX7. NVidia just ignored it, as it isn't easy or cheap to implement, and performance was their main goal.[/quote]
Once again, the Radeon came out after the GeForce2. The GeForce2 was a refresh part, and I don't understand why it should be expected that it would add significantly new features. If nVidia was going to add EMBM for that generation, they would have done it for the GeForce 256, which was released very, very close to the G400. I don't see why nVidia could be expected to add EMBM to the GeForce 256, especially given that features are typically solidified ~2 years before final release of the product.

    More like it's extremely slow and precision issues creep up very quickly.

    Yes, the GeForce4 MX is a black mark on nVidia's record. But it is a totally different situation than the one I was attempting to convey. Here's the situation:

If company X decided to, they could stop advancing any sort of programmability or other features, and merely go for a product that is as fast as possible and looks as good as possible while supporting only features in use today. If this company succeeded, it would be possible to produce a part significantly faster than the same part sold by company Y that has more advanced features. If this happened, given the way that many review sites rate products, we'd probably end up with many people preferring the lower-technology part.

In other words, what I am worried about is the heavy competition in the 3D graphics market leading to products that answer short-term wants at the expense of what will happen in the long term, leading to stagnation in technology. This is one reason, for example, that I am extremely glad that 3dfx did not succeed with the Voodoo5. I hope you can see how the GeForce4 MX is a totally different kind of bad from the one I outlined above.

    Fortunately, we seem to be approaching the end of rapid advancement of features for 3D graphics anyway, so there's not much danger of this happening any longer. My only worry is that we won't see good higher-order surfaces support for a while.
     
  12. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    Since when are we talking about what a company is expected to do? Do you even remember your original post that started this? I'll remind you of your sole statement that this discussion is based on:
    Remember now? What I'm saying is the Geforce and Geforce2 fit this description very well, and I've given you plenty of justification.

    I said multitexturing, not multipassing. Carmack said it himself - the GF3 pixel shaders are nothing more than glorified register combiners. Pixel shader 1.0 pretty much just made it easier to write, that's all. With 4 texture multitexturing and a bunch of blending stages, you are very close to GF3 functionality, and it would perform the same.
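To illustrate the multitexturing point, here is a hedged software sketch (not any real API) of a fixed-function blend-stage cascade: each stage combines a running color with a texture sample using a simple blend op, which is roughly what ps 1.0-1.3 re-expresses as shader instructions:

```python
# Sketch of a fixed-function multitexture cascade: each stage blends
# the running color with a texture sample. The op names and sample
# values are illustrative, not a real graphics API.
def run_cascade(stages, base):
    color = base
    for op, tex in stages:
        if op == "modulate":      # e.g. diffuse * lightmap
            color = tuple(c * t for c, t in zip(color, tex))
        elif op == "add":         # e.g. + a glow/specular map, saturated
            color = tuple(min(1.0, c + t) for c, t in zip(color, tex))
    return color

# Classic GLQuake-style diffuse * lightmap, plus an additive glow pass.
out = run_cascade(
    [("modulate", (0.5, 0.5, 0.5)),   # lightmap sample
     ("add", (0.2, 0.0, 0.0))],       # glow map sample
    base=(1.0, 0.8, 0.6),             # diffuse texture sample
)
print(out)  # prints (0.7, 0.4, 0.3)
```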

    Precision issues? What are you talking about? Why do you have such an obsession with precision? Multitexturing is all done in one pass, all the math is internal, and precision issues are no different.

    What's funny here is that the GF FX value parts all perform better when things aren't in a pixel shader.

That sounds like pretty much the same situation to me. The GF4 MX was a short-term answer that people are still buying. It led to low adoption of pixel shaders among developers, and EMBM was buried (although the GF2 MX nearly killed it already), thus leading to a stagnation in technology. We're still surrounded by games using simple GLQuake-style diffuse+lightmap multitexturing, though dot3 bumpmapping is becoming commonplace now. And, unlike the Voodoo5, it did succeed. Totally different? Hell no.
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    There's just no way. The GeForce was released nearly a full year before the Radeon. Of course it was going to have lesser technology! If there was any fault here, it was ATI's, because they did not put out a part that had high performance or good drivers. nVidia was not in a position to release a new architecture at the time of the NV15. There are realistic expectations, and there are unrealistic ones.

    1) That's a software issue. I was talking about hardware.
    2) EMBM never would have caught on anyway. It wasn't killed by NV1x hardware. It was killed by NV2x/R2xx hardware.
3) EMBM certainly wouldn't have changed the scenario you outlined. EMBM was never used as anything but a gimmick, and this would not have changed.
     
  14. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada

    Developers didn't think it was a gimmick 2 years ago...

    Games featuring EMBM

Ace of Angels™
Aquarius™
Battlezone II: Combat Commander™
Battle Isle: The Andosia War
BITM
Carmageddon®: TDR 2000™
Colin McRae Rally 2
Descent 3™
Descent 3™: Mercenary
Destroyer Command™
Drakan™
Dungeon Keeper™ 2
Echelon®
Echelon®: Wind Warriors
Expendable™
F1 World Grand Prix
Far Gate
Fur Fighters
Hard Truck II™
Hired Team™ Gold
Hired Team™: Trial
Incoming Forces
Parkan: Iron Strategy™
Ka-52 Team Alligator™
Kyodai
Off Road: Redneck Racing
Offshore2000: Pro Surf Tour
Planet Heat
PowerRender™ engine V 3.0*
Private Wars™
Rollcage® Stage II
Silent Hunter II
Silent Space
Silex engine
Slave Zero™
Speed Busters™
Spirit of Speed 1937
Sub Command
Jugular® Street Luge Racing
Totaled™
Warm Up
Wild Metal Country™*
3DMark™ 2000, 2001
Actor
Atlantica
Blitz: Disc Arena
LithTech 2 Engine*
Warrior Kings

    Not bad looking water effects for 3 years ago...

[image: screenshot of EMBM water effects]
     
  15. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
Gimmick: a feature added to a game to invoke a "that looks cool" reaction while not adding anything substantial to the game in question (this is the way in which I used the term).

    I don't want a specific feature applied to a specific thing in the game world that "looks cool." I want a game world that looks cool. Piecemeal effects like EMBM are gimmicks, in my use of the term.
     
  16. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
Yes, because pixel shader water has added so much to games other than "wow, that water looks good."

Sure, instead of milking the GeForce product line they could have made a part featuring future tech. Hell, current tech.

But let's use your logic.

It's alright for nvidia not to support features because they were continuing to sell old tech with increased clock speeds as top-of-the-line parts.

But it won't be okay for ATI if they don't support p.s 3.0 or fp32, even though they too will be offering a top-of-the-line part that is merely increased clock speeds and performance enhancements of a 2-year-old product?

It's what nvidia did.

Hell, it's what nvidia is known for:

GeForce 2, GeForce 3 Ti, GeForce 4 Ti.
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
Sarcasm? Yes, I consider pixel shader usage to date to have been pretty much nothing more than gimmicks. Still, I have to admit that I have not played some of the more recent PS 2.0 games, although I don't have very high expectations for them, either.

    So you expect them to have had a new architecture out within six months of the previous one? Nobody has done that. It would have cost way too much money.

    a) There has been enough time to develop a new architecture.
    b) If nVidia releases a PS 3.0 part at about the same time frame, then it will certainly be bad.
     
  18. Fred da Roza

    Newcomer

    Joined:
    May 6, 2003
    Messages:
    178
    Likes Received:
    2
    Why should ATI support PS 3.0 to further perpetuate (in your words) more gimmicks?
     
  19. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
Just because many of these features have been used as gimmicks in the past doesn't mean they'll be used as gimmicks in the future.

PS 3.0 really won't add much in terms of new visual effects (over PS 2.x). What it will do is make certain shaders perform better, meaning a part that supports PS 3.0 should increase in performance as games add PS 3.0 shaders (which, I expect, would happen very quickly, as the change from PS 2.0 to PS 3.0 requires relatively little change to the engine... what is unknown is whether current PS 2.0 shaders will see a performance difference under PS 3.0... that may require a move to more complex shaders).
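One widely cited mechanism behind that performance claim is dynamic flow control: PS 2.0-level hardware typically evaluates both sides of a conditional and selects a result, while PS 3.0 can branch and skip the untaken side entirely. A hedged sketch in Python - the function names and costs are illustrative only, not real shader code:

```python
# Illustrative sketch of why PS 3.0 dynamic branching can help: under
# PS 2.0-style selection both code paths run and one result is picked
# (cmp/lrp style), while a PS 3.0-style branch skips the untaken path.
calls = {"expensive": 0}

def expensive_lighting():
    calls["expensive"] += 1       # stands in for many ALU instructions
    return (1.0, 0.9, 0.8)

def cheap_ambient():
    return (0.1, 0.1, 0.1)

def shade_ps2_style(in_shadow):
    lit = expensive_lighting()    # always evaluated
    amb = cheap_ambient()         # always evaluated
    return amb if in_shadow else lit   # select between the two

def shade_ps3_style(in_shadow):
    if in_shadow:                 # dynamic branch: heavy path skipped
        return cheap_ambient()
    return expensive_lighting()

shade_ps2_style(True)             # expensive path still ran
shade_ps3_style(True)             # expensive path skipped
print(calls["expensive"])         # prints 1
```

Both versions return the same color; only the amount of work done for shadowed pixels differs.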

    VS 3.0 is the big jump this time, with texture addressing in the shaders, and it will likely take a while to fully support this new feature as a consequence.

    Anyway, all of these features can certainly be used as gimmicks, but they also lend themselves to producing a world that has the same amount of TLC applied to every surface, not just a few specific ones like water or hair. When you apply such a limited, specific effect to a part of the world, you create a separation in visual fidelity that ruins the whole effect. PS 3.0, as PS 2.0 was before it, will be a tool for forward-thinking developers to produce a unified world that looks truly better.

    As an example of what I term not a gimmick, I was extremely excited by John Carmack's description of how the DOOM3 rendering is to be done. Another similar thing was the increase in polycount from Unreal/Unreal Tournament to Unreal Tournament 2003 (and other games using the engine). The increased polycount really transformed the game world.
     
  20. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada

