nV40 info w/ benchmarks

Discussion in 'Architecture and Products' started by nelg, Jan 18, 2004.

  1. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    Who on earth knows? The only thing I know for sure is to never take "shipping dates" at face value, no matter who the distributor.
     
  2. elroy

    Regular

    Joined:
    Jan 29, 2003
    Messages:
    269
    Likes Received:
    1
The thing is, unless nVidia have significantly improved their efficiency (better occlusion culling, etc.), they are going to need a higher clockspeed than ATi, IMO. I think they will need at least 500 MHz out of it to be competitive, going on current rumours.
     
  3. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    Remember that it has 175 million transistors and nV have seen fit to give it 50GB/sec+ memory bandwidth. Current rumours are that it can work on 4 quads in certain situations. That would seem to suggest that it's an 8x2/16x0 design (in the same way that NV35 can be thought of as 4x2/8x0) and may have approximately twice the pixel throughput of NV35, per clock. I've heard that PS performance is already well above current parts, even on the A0 samples. VS shows less of an improvement; maybe they have just incorporated a single, extra VS unit.
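A quick back-of-the-envelope check on those rumours. Taking the "8x2/16x0" configuration, elroy's guessed ~500 MHz target, and the "50GB/sec+" bandwidth figure from this thread (all unconfirmed), a sketch of the bandwidth-per-pixel budget looks like:

```python
# Back-of-envelope: does ~50 GB/s fit a 16-pipe part?
# All numbers are the rumoured figures from this thread, not confirmed specs.

pixels_per_clock = 16      # rumoured 8x2 / 16x0 configuration
core_clock_hz = 500e6      # elroy's guessed ~500 MHz target
bandwidth_bytes = 50e9     # "50GB/sec+" rumoured memory bandwidth

# Theoretical peak pixel fillrate, pixels per second
fillrate = pixels_per_clock * core_clock_hz

# Bytes of memory traffic available per drawn pixel at peak fillrate
bytes_per_pixel = bandwidth_bytes / fillrate

print(f"fillrate: {fillrate / 1e9:.1f} Gpix/s")                # 8.0 Gpix/s
print(f"bandwidth budget: {bytes_per_pixel:.2f} bytes/pixel")  # 6.25 bytes/pixel
```

About 6 bytes per pixel at peak fillrate is roughly a 32-bit colour write plus Z traffic, which is consistent with the idea that the bandwidth is sized for a monster base fillrate rather than sitting idle.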

    They might ramp with A1; it's apparently not as clock-limited as they thought it might have been.

    MuFu.
     
  4. retsam

    Newcomer

    Joined:
    Apr 29, 2003
    Messages:
    32
    Likes Received:
    0
MuFu, man, that's some good info you got there. Wow.


    rets
     
  5. elroy

    Regular

    Joined:
    Jan 29, 2003
    Messages:
    269
    Likes Received:
    1
Are you saying that it will be difficult for them to hit high clockspeeds because it has 175 million transistors, or that it will have a high clockspeed because it has heaps of memory bandwidth? Or that the clockspeed doesn't have a lot to do with the bandwidth (and that the higher bandwidth will be used for better FSAA, AF, etc.? I guess this is a given either way.)

    8x2/16x0 design would be sweet! Also, are you saying that it has significantly higher PS performance than R360 or NV35/38 when you talk about it?

    So many questions - maybe I should just wait until they release it! :)
     
  6. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
Because of the clock - it won't be any more "difficult"; it just means they won't be setting clockspeed targets as high as they might if it were, say, another NV35 revision. I was implying that in certain situations it can probably use all that bandwidth - i.e. it has a monster base fillrate and/or viable, higher FSAA modes.

Compared to NV35/NV38. Someone suggested to me today that they still cannot match ATi's DX9 performance, even with this more mature "CineFX" architecture. I wouldn't be surprised if it's quicker than R420 in all but the most shader-centric DX9 titles. D3 too, of course - hard to imagine them not kicking ATi's ass in that, since it's potentially so much of a trump card for them (and they have a head start because of the lower precision path).

Nah. It's much more fun to get all worked up about it and then be utterly disappointed when it turns out to be a flaming pile.

    Hmm... 3am. Time to start revising for this morning's exam. :roll:

    MuFu.
     
  7. elroy

    Regular

    Joined:
    Jan 29, 2003
    Messages:
    269
    Likes Received:
    1
    Thanks for all the info MuFu! Now, slowly step away from the keyboard, and get some study done!
     
  8. Lezmaka

    Regular

    Joined:
    Oct 30, 2002
    Messages:
    398
    Likes Received:
    2
    Nah, there will be plenty for these guys to try and figure out.
     
  9. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
I don't think so on most accounts, except the 4-quads possibility under certain conditions, but that's just me :roll:
     
  10. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    A bit of a historical question.

    Before the NV30 launch, were there any signs that it was going to suck hard? Or was it all "Holy Crap NV30 Is Going To Rock My Fucking Socks Off!" beforehand and then the complete and utter disappointment?

    And the last series of questions--I conjecture that NV40 will remove supersampling and ordered grid multisampling and replace it with some form of RGMS across the board. Modes will probably go to 8x, maybe 12x or 16x depending on how much memory bandwidth there is. Do we know what kind of AA it uses for sure yet? Is it gamma corrected? Any sign of stochastic AA?
     
  11. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Not for me. I attended the NVidia launch event at COMDEX. During their presentation, they had this animated movie showing Geforce and 3DFX combining, giving the impression that the FX used 3DFX tech.

    I was almost giddy with energy sitting there. I was like, holy shit, did they implement a Gigapixel tiler and manage to keep it secret?

    By the end of the conference, the disillusionment started (someone asked if it was a tiler, answer: no). No new AA modes. No DM. No PPP.

    And the demos kinda sucked. A few months before, I had been blown away at ATI Mojo Day in San Francisco.

In other words, all indications I had were that the NV30 was supposed to rule. Each week afterwards, the news got worse. Oh, you mean FP32 doesn't run at the same speed as FP24? You mean FP16 won't give a 2x speed boost, but instead just gives enough speed to keep up?

NV30 has to be the greatest disappointment in the history of GFX, at least from my standpoint.
     
  12. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
They'd be mighty stupid if they kept the same AA algorithm as before, but I somehow don't expect fundamental changes.

    I don't see why they should remove the hybrid modes entirely, they most certainly can have their uses too, especially 8xS for extremely CPU limited cases or resolution limited games.

If we're talking about 2x, 4x, 6x, 8x etc. MSAA modes though, it's always better that the pattern is sparse, i.e. N samples spread over an N*N grid. I don't see much use for, e.g., a 16x OGMS mode if the lower modes are sparsely sampled, but that's just me.

    The memory footprint requirements for 8x sample AA are already quite large.
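The footprint point is easy to put numbers on. A rough sketch, assuming 4 bytes of colour plus 4 bytes of depth/stencil per sample and no framebuffer compression (real chips compress, so treat this as an upper bound):

```python
# Rough multisample framebuffer footprint, assuming 4-byte colour +
# 4-byte depth/stencil per sample and no compression (an upper bound).
def msaa_footprint_mb(width, height, samples, bytes_per_sample=8):
    """Return the multisampled colour+Z buffer size in MiB."""
    return width * height * samples * bytes_per_sample / 2**20

# 8x AA at 1600x1200 without compression:
print(msaa_footprint_mb(1600, 1200, 8))  # ~117 MB
```

Well over 100 MB for a single 8x buffer at 1600x1200 is why uncompressed high-sample modes were impractical on 2004-era 128-256 MB cards, and why colour/Z compression matters so much for high AA modes.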
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I don't think there'd be any reason to support 16x OGMS if sparse sampling modes are supported at lower sampling rates.

The "NxN" grid is only a construct to visualize how the sample positions are chosen. The hardware itself won't choose samples in such a way (the R3xx, for example, selects arbitrary sample positions; you can choose your own sample positions in Linux).
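The sparse-vs-ordered distinction can be sketched in a few lines. The positions below are illustrative, not any specific chip's pattern: with 4 samples on a 4x4 sub-pixel grid, an ordered 2x2 grid yields only 2 distinct horizontal and vertical offsets, while a sparse (rotated-grid-style) pattern yields 4, which is what gives sparse patterns more intensity steps on near-vertical and near-horizontal edges.

```python
# Ordered-grid vs sparse sample positions for 4x AA, on a 4x4 sub-pixel grid.
# Positions are illustrative, not any specific chip's pattern.

ordered_4x = [(1, 1), (3, 1), (1, 3), (3, 3)]   # 2x2 ordered grid
sparse_4x  = [(0, 1), (2, 0), (1, 3), (3, 2)]   # rotated-grid style

def edge_resolution(samples):
    # Count distinct x and y offsets: each distinct offset is one more
    # intensity level on a near-vertical / near-horizontal edge.
    xs = {x for x, _ in samples}
    ys = {y for _, y in samples}
    return len(xs), len(ys)

print(edge_resolution(ordered_4x))  # (2, 2)
print(edge_resolution(sparse_4x))   # (4, 4)
```

Same sample count, same storage cost, but the sparse pattern doubles the effective edge resolution in both axes - which is why a sparse 4x mode can look better than a 16x ordered grid is worth.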
     
  14. bloodbob

    bloodbob Trollipop
    Veteran

    Joined:
    May 23, 2003
    Messages:
    1,630
    Likes Received:
    27
    Location:
    Australia
I guess that is the big question: can they take that stronghold away from ATI? If they can't, then I can see ATI having a strong position in the 3D card market for a long time to come.

I think we can all agree that the NV30 wasn't what it should have been; if, clock for clock, the card had had the same performance as the NV35, things would be a lot different today. I guess let's see if NV40 makes the same mistakes again (prolly won't).
     
  15. Miksu

    Regular

    Joined:
    Mar 9, 2003
    Messages:
    997
    Likes Received:
    10
    Location:
    Finland
Now that you've made every ATi fan out there cry, do you have any good news for them?
     
  16. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
This is exactly the sort of rumour/info/PR that was being spread (no doubt with Nvidia at its source) when NV30 was coming out. In fact, for the best part of half a year before NV30 came out. Look what happened there - we were all taken for fools by Nvidia.

Of course, info like this should be tempered by the fact that ATI are thinking in the same way - they want to make another blindingly fast card that kicks everything else in the teeth. ATI want to consolidate their new position and show the R3x0 wasn't just a fluke. There's no reason to think R420 isn't going to give NV40 a run for its money, and if past performance is anything to go on, IMO R420 will be a better product for playing games.
     
  17. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
I'm far more interested in seeing what the NV40 turns out to be. I suspect the R420 will be more of the same... maybe with a few extra bits here and there, with some tweaking and upgrades, but nothing really new. After all, the pressure is on nVidia to come up with something good; ATI already have it, and the battlefield hasn't changed in the slightest (we're still in a world of DX, with DX9 becoming more and more relevant with every passing day).
     
  18. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
I thought that we'd learned to never judge anything by past performance. GF4 -> NV30, for example.
     
  19. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
I'm taking the view that products are in an evolutionary cycle, rather than a revolutionary one. Both companies have had these new chips in design for quite some time, probably longer than the recent events over the last year that have brought us to the market as we have it today. I don't expect truly revolutionary products until NV50/R500.

    Based on that idea, ATI is starting from the very impressive R3x0, while Nvidia is starting from the lacklustre NV3x. I think Nvidia will have the biggest problems, as they have the biggest "problem baggage" to take with them. More importantly, Nvidia have not shown the least ability to change their attitude to business and design in the face of the changed marketplace. Quite the opposite in fact - Nvidia don't see graphics as their core business, only want to put out one driver a year, and only see themselves competing against Intel in the low end market.

So I think Nvidia will be outpaced again by the hungrier and more technically focussed ATI. Nvidia is spending time on cheats and PR BS that would be better spent on making better chips. ATI is just concentrating on making better products, without all the scrambling and catch-up that Nvidia is having to spend time on.

    I guess we'll have to see where everything washes out in the next couple of months, but my personal opinion is that I still see ATI doing the right things, and I still see Nvidia doing the wrong things - I can't see why that will be different a couple of months from now.
     
  20. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    But leading up to the R3xx generation, NVIDIA had the fantastically popular NV2x series while ATI had the much-maligned R200. Look how that shifted completely.
     