New dynamic branching demo

Discussion in 'Architecture and Products' started by Humus, Jul 1, 2004.

  1. Drak

    Newcomer

    Joined:
    May 16, 2004
    Messages:
    71
    Likes Received:
    0
    That's what I was afraid of. First, the technique needs a render target that's the size of the front/back buffer to store the variable. Will the GPU allow reading from the render target and writing to it if more "if-then-else" statements occur after the first one? Would you need ping-pong buffers?

    And an extra texture lookup would be required to read the variables back in the next passes. That's going to cost bandwidth unless the variables are zero a lot of the time. And if you need to compute more than four variables, it's MRT time.

    But I guess on hardware that does not support dynamic branching, beggars cannot be choosers. Simple emulated "if-then-else" is better than nothing, as you've already shown. I think techniques may already exist before anyone has even thought about the problem, but it's the people who really popularise a technique who eventually get remembered.
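
    The multi-pass scheme described above can be sketched in plain Python. This is a hypothetical illustration, not Humus's actual code: lists stand in for render targets and pixels, and all function names are made up.

```python
def pass_condition(pixels, cond):
    # Pass 1: evaluate the "if" condition and store a per-pixel selector
    # in a render target (the "variable" that later passes read back).
    return [1 if cond(p) else 0 for p in pixels]

def pass_branch(pixels, selector, taken, shade):
    # Later pass: shade only the pixels whose stored selector matches
    # `taken`; on hardware, a stencil or alpha test rejects the rest.
    return [shade(p) if s == taken else None
            for p, s in zip(pixels, selector)]

def composite(then_out, else_out):
    # Merge the two branch outputs back into one framebuffer.
    return [a if a is not None else b for a, b in zip(then_out, else_out)]

# "if (p > 0.5) out = 2*p; else out = 0.5*p;" done in three passes:
pixels = [0.2, 0.8, 0.5]
sel = pass_condition(pixels, lambda p: p > 0.5)
hi = pass_branch(pixels, sel, 1, lambda p: 2.0 * p)
lo = pass_branch(pixels, sel, 0, lambda p: 0.5 * p)
out = composite(hi, lo)
```

    Each additional "if-then-else" needs its selector written in an earlier pass than the one that reads it, which is exactly where the read-while-write / ping-pong question comes from.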
     
  2. Proforma

    Banned

    Joined:
    Feb 23, 2004
    Messages:
    86
    Likes Received:
    0
    Re: New poster trying to provide some FUD

    Oh crap, I thought you could run a GeForce 6800 Ultra on a good 350-watt power supply, just like the many people who have tested the card have, and just like ATI's latest video cards. :roll:

    Well, Intel won't have desktop 64-bit power until late 2005. AMD already has 64-bit CPUs out, but not many people have them compared to 32-bit chips. It's a hardcore minority (not a market for many current OEMs yet). You need a 64-bit OS to take advantage of this, and the drivers are still far too young.

    See the world of PCI Express video cards this year; there sure are a ton of them. AGP is obsolete this year, and they should stop selling AGP cards, because of course nobody is going to buy them this year. :roll:

    I am not talking about events that happened two years ago, I am talking about the present and the future. I don't see Nvidia doomed at all.

    As you can see from when Nvidia lost what it had to begin with, things can change quickly, as they did with the Radeon 9700 versus Nvidia; but I feel ATI is making some of the same mistakes.

    An objective person does see things differently from a room of ATI fanboys.

    When Nvidia goes out of business, please let me know. In fact, when ATI has a product on the shelves that's more than a snore compared to what I already own, please tell me.
     
  3. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Re: New poster trying to provide some FUD

    In the next 6-12 months? Sure, maybe they'll reduce pipes, power, heat, and MHz, and you can pretend that its low performance is relevant to the discussion and that ATI won't be doing the same. :roll:

    We already know that low-k needs a new design; it's not like Nvidia will be able to just decide to start making NV40 on low-k like that.
     
  4. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    Re: New poster trying to provide some FUD

    Your own words show that you are not as unbiased as you claim. You don't think twice the performance of your current card is significant? What about long shader support? New compressed texture format?

    -FUDie
     
  5. Proforma

    Banned

    Joined:
    Feb 23, 2004
    Messages:
    86
    Likes Received:
    0
    Re: New poster trying to provide some FUD

    Working with TSMC and IBM is worth something, I think. Maybe I am just stupid and don't know much, but I figure they must have some options.

    (ie they don't put all their eggs in one basket)
     
  6. Proforma

    Banned

    Joined:
    Feb 23, 2004
    Messages:
    86
    Likes Received:
    0
    Re: New poster trying to provide some FUD

    No, that's nice, but not as nice as almost all of that (minus the compressed texture format), plus 128-bit color and full-on Shader 3.0 support.

    I develop software for a living and also for my own use.

    If I am going to spend 400-500 dollars on a video card, it had best have the latest technology in there already, speaking of DX/OGL.

    I wanted Shader 3.0 for developing geometry instancing, and ATI is totally ignoring the latest technology from DirectX, not just once but, according to sources on this forum, again later this year.

    Quoting from Dave's review...
    "This part is certainly not revolutionary, it's hardly even evolutionary, but more of a refinement on R300’s weak points, placed on a more advanced process with double the number of pipelines and effectively we have an R300 on steroids!"
     
  7. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I fail to see why ATI not supporting these features at the moment is an issue to your "development" as you can develop with a 6800.
     
  8. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Latest tech from DX? Where were you when the R3x0 and NV3x platforms were released? Why weren't you complaining about SM 3.0 support then?
     
  9. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Re: New poster trying to provide some FUD

    Fine, but the rest of the NV4x line aren't the NV40, are they?
     
  10. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Re: New poster trying to provide some FUD

    Isn't the NV45 just the NV40 with the bridge chip on package?
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Perhaps, but the rest aren't.
     
  12. pat777

    Newcomer

    Joined:
    May 19, 2004
    Messages:
    230
    Likes Received:
    0
    Re: New poster trying to provide some FUD

    OEMs use mid-range cards far more than high-end cards. I'm sure a 6800NU or another NV4x(not NV40, NV45, nor NV48) will do well for OEMs.
     
  13. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    This thread is getting ridiculous.

    NV40 dynamic branching has a cost. Nvidia themselves say the best use is branching on coherent per-vertex attributes or, as in the Nalu demo, using a simple texture lookup to choose a shader path. Humus is showing how to make a PS 2.0 path handle this case of shader selection, nothing more, nothing less. You can let your imagination go wild if you want.

    If dynamic branching were so universally useful, why wouldn't they use it in a more creative way in the Nalu demo? If NV40 were so good at dynamic branching, why would it be slower than this method? It's not a "useless hack that will never find its way into games".

    Humus, pocketmoon66, and other coders, maybe even Chalnoth: let's start a thread in the architecture/coding forum so we don't have to bother with all this utterly useless bickering.
    We can talk about this technique in conjunction with stencil shadowing, NV40 performance in all three modes, stencil early-out restrictions, when it would be more or less useful, etc.

    Sorry for the big font, but it's really hard to find the constructive posts.
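
    The coherence point above can be illustrated with a toy cost model (entirely an assumed model, not measured NV40 behavior): with PS 2.0-style predication every pixel pays for both sides of a branch, while dynamic branching lets a coherent block of pixels pay only for the path it actually takes.

```python
def cost_predicated(pixels, cost_then, cost_else):
    # No real branching: every pixel executes both paths and selects.
    return len(pixels) * (cost_then + cost_else)

def cost_dynamic(pixels, cond, cost_then, cost_else, block=4):
    # Dynamic branching on blocks of pixels: a divergent block pays for
    # both paths, a coherent block pays only for the path it takes.
    total = 0
    for i in range(0, len(pixels), block):
        blk = pixels[i:i + block]
        then_taken = any(cond(p) for p in blk)
        else_taken = any(not cond(p) for p in blk)
        total += len(blk) * (cost_then * then_taken + cost_else * else_taken)
    return total

cond = lambda p: p > 0.5
coherent = [0.0] * 8        # every pixel takes the else path
mixed = [0.0, 1.0] * 4      # worst case: every block diverges
```

    With a then-cost of 10 and an else-cost of 2, the coherent case costs 16 under dynamic branching versus 96 under predication, while the fully divergent case costs 96 either way. Branching on coherent per-vertex attributes is exactly the situation where the dynamic path wins.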
     
  14. OICAspork

    Newcomer

    Joined:
    May 9, 2003
    Messages:
    210
    Likes Received:
    0
    Location:
    Nara, The Land of the Rising Sun
     
  15. Proforma

    Banned

    Joined:
    Feb 23, 2004
    Messages:
    86
    Likes Received:
    0
    programming

    Any serious program needs some kind of branching; without it, you might as well toss it out for anything flexible and important.

    I can't stand how people with intelligence can just say "branching isn't needed, it's a useless feature". Tell that to Tim Sweeney.

    Saying Shader 3.0 is useless is just fanboy talk from people who probably think it's an Nvidia feature and not a feature of DirectX 9.0c, which ATI should have had in the R420 and in their video chip by the end of the year.

    It's not that I can't stand ATI; I want them to produce competitive products. But these outrageous fanboys spread FUD by acting like Shader Model 3.0 is a useless feature made by Nvidia that will only work on their cards. It's not a marketing ploy but an actual feature of a Microsoft API that ATI seems to neglect lately.

    Loops and conditionals are important in programming languages; you have a very crappy language without them. You can do all the workarounds you want, but they are still hacky workarounds.

    Yes, it's true: using branching *might* have a cost. However, the cost without it is far worse.

    "If shaders are usefull why don't they make a demo of 'Final Fantasy - The Spirits Within' instead of Nalu." - Sounds like some statement the author above would make.
     
  16. Proforma

    Banned

    Joined:
    Feb 23, 2004
    Messages:
    86
    Likes Received:
    0
     
  17. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    I don't have any games that take advantage of SM 3.0. :(
    Where can I buy some now?
     
  18. DSN2K

    Newcomer

    Joined:
    Oct 4, 2003
    Messages:
    146
    Likes Received:
    3
    History is repeating itself, funnily enough, and you guys are playing your parts just as well. :lol:
     
  19. reever

    Newcomer

    Joined:
    May 17, 2003
    Messages:
    131
    Likes Received:
    0
    Yeah, it's amazing that gaming, 3D rendering, and 3D workstations flourished without any branching or loops.

    Or more like Nvidia all over again, or every company :roll:

    That depends on who you ask, because some people can't stand to blame things on incompetence and bad decisions, and would rather point a finger at a third party.

    Unless they reorganize their R&D teams and actually get R&D funds from the company making the console, like Nintendo or Microsoft; perhaps you should read up on the terms of their contract.

    That's why companies have things called CEOs and managers; they aren't just blobs of R&D teams with no direction.

    Sorta like how Nvidia neglected PS 2.0? But I guess that didn't hold back the industry, right?

    I'm sorry, but how long has this entire industry lived on without these features?
     
  20. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    At least nVidia didn't "neglect" PS 2.0. They just failed to anticipate the performance problems their architecture would have.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.