my GOD i think i figured something about NV30....

Discussion in 'Architecture and Products' started by Sage, Nov 19, 2002.

  1. Vince

    Veteran

    Joined:
    Apr 9, 2002
    Messages:
    2,158
    Likes Received:
    7
    That's so... Um, depressing :wink:
     
  2. MikeC

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    194
    Likes Received:
    0
     
  3. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Why would you have expected them to go the exotic route? Since we know for a fact that this chip was designed with the help of 3dfx engineers, a look at the product history of both companies plainly spells out "ramrod rendering" (i.e., brute force--with some HZ stuff thrown in there). They aren't too interested in experimenting with what could become dead ends (although they surely do so as part of their R&D); rather, I think the company's main focus is to take what they know works, improve it, and implement it--along with some other features. nVidia said at the 3dfx acquisition that it wasn't interested in the GigaPixel tech, and by implication everything associated with it, so I wasn't a bit surprised.

    What does surprise me at this stage is the monstrous fans they're having to use to cool the buggers, which apparently are to be stock configurations, at least for the top-of-the-line nv30. It looks to me like they feel it's necessary to overclock this beast right out of the starting gate. As .09 microns is probably at least a year away for nVidia, I'm not sure this is a good portent to be seeing so early in the chip's life cycle (unless the fans fall off in the shipping products when yields improve--if that happens, as it probably will; improved yields, I mean--I don't know about the fans).
     
  4. Sage

    Sage 13 short of a dozen
    Regular

    Joined:
    Aug 22, 2002
    Messages:
    935
    Likes Received:
    15
    Location:
    Southern Methodist University
    As for nVidia being beyond the Rampage... in some ways yes, but in some ways no. And as for the speed of the Rampage, it would have given a GF4 Ti a VERY hard-earned victory and would still be superior in some ways. A lot of people think of the Rampage project as being just like all the other 3Dfx chips, but it was not; the Rampage project was totally separate from the Voodoo project. It wasn't something to scoff at, that's for sure.
     
  5. DadUM

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    55
    Likes Received:
    0
    Will this be the transcription from your questions today? You hinted that you asked someone at the demo a bunch of questions, but haven't seemed to let the answers out yet. It will be interesting to see you spill the beans.
     
  6. megadrive0088

    Regular

    Joined:
    Jul 23, 2002
    Messages:
    700
    Likes Received:
    0
    Sage, would you say that NV30 might not be as good as Rampage in some areas?

    Or just not better in some, while being better in others?
    (hope I wrote that clearly enough to understand)
     
  7. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I suspect that might be an earlier revision - when did you receive it?
     
  8. T2k

    T2k
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,004
    Likes Received:
    0
    Location:
    The Slope & TriBeCa (NYC)
    :roll:
     
  9. Vince

    Veteran

    Joined:
    Apr 9, 2002
    Messages:
    2,158
    Likes Received:
    7
    Because nVidia has many very intelligent people who realise that computational and bandwidth efficiency could provide a large advantage in the coming generations, when you're doing heavy shading, which is computationally expensive (think raytracing in the shader), and your architecture is just pissing it away. I mean, Ned Greene was preaching efficiency like 2 years ago.

    Both companies relied on the same core architecture for their SKUs until the NV30/CineFX was announced. I would assume that when starting over, you could examine how the 3D graphics realm has changed over the past half decade and move from there.

    I never said 3dfx or GP - where do people get this? I blatantly stated that I didn't expect a region-based deferred renderer (ie. 3dfx's Mosaic)... how much clearer can that get?

    Yet many things can be done outside of a tiling/chunking/et al. form of architecture. I was hoping for this. Look at Talisman - reuse the temporal coherency (IIRC) between frames... the list is endless.
     
  10. Vince

    Veteran

    Joined:
    Apr 9, 2002
    Messages:
    2,158
    Likes Received:
    7
    I hate this icon with a passion, but it's gotta be done... :roll:

    I'm going to assume you're joking and not pull a :roll: en masse - Kristoff style. lol
     
  11. Deflection

    Newcomer

    Joined:
    Jul 16, 2002
    Messages:
    66
    Likes Received:
    0
    Ban that icon please!
     
  12. Bambers

    Regular

    Joined:
    Mar 25, 2002
    Messages:
    781
    Likes Received:
    0
    Location:
    Bristol, UK
    :lol:

    What if ATi were to say effective memory bandwidth? With colour compression and 24:1 Z compression at 6xFSAA, that gives you well over 300GB/s of effective bandwidth :D

    btw what does iirc mean? I've never been able to work it out.
     
  13. Colourless

    Colourless Monochrome wench
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,274
    Likes Received:
    30
    Location:
    Somewhere in outback South Australia
    IIRC stands for If I Recall/Remember Correctly
     
  14. Nupraptor

    Newcomer

    Joined:
    Nov 11, 2002
    Messages:
    79
    Likes Received:
    0
    IIRC = If I Recall Correctly

    And I do have to agree: their "effective bandwidth" marketing ploy sounded kind of impressive at first, but now I'm really having my doubts. I'd say it's closer to half the number they're quoting, in most circumstances.
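    The gap between the "well over 300GB/s" headline figure and the "closer to half" estimate above comes down to how you weight the compression ratios. A minimal sketch of both calculations, where the raw bandwidth and the per-traffic-class ratios are illustrative assumptions rather than vendor specifications:

```python
# Illustrative sketch of "effective bandwidth" arithmetic.
# All numbers here are assumptions for the example, not vendor specs.

RAW_BANDWIDTH_GBS = 19.8  # assumed raw memory bandwidth, GB/s


def marketing_effective(raw_gbs, best_case_ratio):
    """Headline-style figure: pretend every byte compresses at the peak ratio."""
    return raw_gbs * best_case_ratio


def mixed_traffic_effective(raw_gbs, traffic):
    """More conservative figure: weight each traffic class by its own ratio.

    `traffic` maps a class name to (fraction_of_bytes, compression_ratio);
    the achievable speed-up is the harmonic mean of the per-class ratios,
    so a single incompressible class drags the total down sharply.
    """
    compressed_fraction = sum(frac / ratio for frac, ratio in traffic.values())
    return raw_gbs / compressed_fraction


# Peak 24:1 Z compression applied to everything gives the big number:
print(marketing_effective(RAW_BANDWIDTH_GBS, 24))  # ~475 GB/s

# Assume half the bytes are Z (24:1 peak) and half are colour (4:1 assumed):
realistic = mixed_traffic_effective(
    RAW_BANDWIDTH_GBS,
    {"z": (0.5, 24.0), "colour": (0.5, 4.0)},
)
print(realistic)  # ~136 GB/s, far below the headline figure
```

    The point of the sketch is only that quoting the single best-case ratio for all traffic inflates the number; mixing in any less-compressible traffic pulls the effective figure down fast.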
     
  15. MikeC

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    194
    Likes Received:
    0
    Last Friday, the 15th, from Brian Burke.
     
  16. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
    Tell me, was there any kind of bandwidth-saving technology in Rampage? Because if not, then it would have had a seriously hard time competing with a more efficient design like the GF3, relying only on brute force (an SLI dual-chip, dual-memory-bus design). As far as I've seen, there was nothing like early-Z, Z-compression, framebuffer compression or such in that chip, and thus it must have been a very inefficient card. The only thing that seemed to be well ahead of the GF3 and Radeon 8500 was the pixel shader - but without the speed it wouldn't have been of much use.

    That kind of myth-making is exactly what nVidia is trying to capitalize on with this whole "we have 3dfx tech in GeForce FX" thing. Are you already convinced to get such an artifact of 3D graphics? ;)
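    Of the bandwidth-saving techniques named above, early-Z is the easiest to sketch: test the depth of each fragment before running the expensive pixel shader, so occluded fragments cost neither shading work nor colour-buffer writes. A minimal toy model (the 1-D buffer and the fragment lists are made up for illustration):

```python
# Toy sketch of early-Z rejection: depth is tested BEFORE shading, so
# fragments hidden behind already-drawn geometry are discarded for free.
# Uses a 1-D framebuffer for brevity; smaller z means closer to the eye.

def rasterize(fragments, depth_buffer, color_buffer, shade):
    """Process (x, z) fragments with an early depth test.

    Returns the number of fragments that were actually shaded, i.e. the
    shading and colour-write cost the scheme could not avoid.
    """
    shaded = 0
    for x, z in fragments:
        if z >= depth_buffer[x]:       # early-Z: occluded, reject here
            continue
        depth_buffer[x] = z            # depth test passed: update Z...
        color_buffer[x] = shade(x, z)  # ...and only now pay for shading
        shaded += 1
    return shaded


depth = [1.0] * 4  # initialised to the far plane
color = [0] * 4

# Two overlapping surfaces drawn front-to-back; the back one is fully hidden.
front = [(i, 0.2) for i in range(4)]
back = [(i, 0.8) for i in range(4)]
n = rasterize(front + back, depth, color, shade=lambda x, z: 0xFFFFFF)
print(n)  # 4: all back-surface fragments rejected without shading
```

    Real hardware does this per tile with compressed hierarchical Z rather than per pixel, but the saving is the same in kind: rejected fragments never touch the shader or the colour buffer.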
     
  17. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,563
    Likes Received:
    171
    Location:
    In the Island of Sodor, where the steam trains lie
    Oh the delicious irony.
    Really, that long ago?
     