Two new (conflicting) Rumors

Discussion in 'Architecture and Products' started by DemoCoder, Sep 21, 2002.

  1. NV25

    Newcomer

    Joined:
    Feb 13, 2002
    Messages:
    18
    Likes Received:
    0
    Aside from a few driver problems, it's clear that ATi has the advantage right now.

    Whatever NV30 brings to the table it's far too late to steal ATi's thunder. 9700 is here NOW, NV30 is still the basis of rumors and speculation. And any exotic solution NV30 can use (DDRII, 0.13 micron) can also be used by ATi.

The problem for nVidia is that ATi squeezed their performance out of "inferior" tech. Nvidia has always relied on cutting-edge technology to give them an advantage, especially in the 3dfx days, but now it's coming back to haunt them. I don't think NV30 will be *that* much faster than 9700, and even if it does outperform it, the delay in production now means that ATi is one step ahead and can respond with their own refresh far earlier than Nvidia.

    For nVidia's sake NV30 must be clearly superior to 9700 or they've got some major catch-up to do...
     
  2. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
well that was always the justification at the time - you must get a TnL card, it's more future-proof than the V5. If anybody argued that the TnL on the GF1 was effectively useless at 10x7 and above because the card was fillrate limited - well, you were just a fanboy.

BTW this is not an argument against progress and the need to push tech into the hands of the developers so the consumer gets the graphics they demand, but whilst developers can now start developing on a DX9 board, I doubt I will see DX9 games in the lifetime of my first DX9 card purchase.
     
  3. darkblu

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,642
    Likes Received:
    22
    good for you that you're so easily amused, bigus. still, pity you didn't get my point. of course nobody at that time knew gf1 would be capable of running d3 in particular (how you read my post into that is beyond my comprehension), but a plain look at vsa's feature set sufficed to show that it lacked anything which could take the yet-another-voodoo past the circa-'96 lighting/shading models. whereas the nv10's feature set showed potential. it "accidentally" happened so that that potential was sufficient for doom3 (3 years after the part's release), alas at minimalistic resolutions & framerates <shrug>

    oh, and i'm so happy about those guys who don't have issues buying a $300 vga each year, and not expecting to be able to run titles released a year from then - i admit i can't afford such a pattern.. neither would i need to, but that's another matter.
     
  4. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
But that implies that delaying NV30 means delaying NV35 (or any other refresh) too. I don't think that is necessarily the case.
     
  5. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    I'd still rather have a GF1SDR than a Voodoo4 right now. (if you want to compare bandwidth)

    I'd still rather have a GF1SDR than a TNT2Ultra or V3-3500 (if you want to compare products out at the same time the GF1SDR was)

    I'd still rather have GF2DDR than a V5 right now (if you want to compare products out at the same time) at comparable price points.

    But yes, if you had a TNT2U, it would be stupid to have upgraded to GF1SDR.

    Yes, either a TNT2U or a GF1SDR is somewhat useless today, and moreso going forward.
     
  6. darkblu

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,642
    Likes Received:
    22
    :roll:

    joe, is it that you really can't distinguish simple (factological) support evidences from genuine reasoning, or you just haven't had your morning coffee yet?

    <irrelevant remark>
    lack or presence of logic is in the eye of the beholder.
    </irrelevant remark>

    at last something which actually is to the point of the discussion at hand. and i absolutely agree with you. now, what you need to realize is that there exist people who have a slightly different idea of the expected lifespan of their videocard purchases.

    there's nothing foolish if you seek to maximize the lifespan of your purchases. now, if a videocard turns to comply with titles 1 year from its release - fine. if the card turns to barely comply with titles 100 years from then - even better, as this means the card may turn to adequately comply with titles 50 years past its release date. what exactly is not clear here?

    read my reply to bigus to see what i think.
     
  7. Bigus Dickus

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    943
    Likes Received:
    16
    Talk about misunderstanding posts and reading into them things which are not there...

    On the whole, I agree with your argument, and yes... people have different expectations for the life of a videocard. For me it's 12 to 18 months. For others it stretches to 24 or even 36 months. But to use an example of a ~four year disparity between product launch and game launch is just pushing it a bit far, IMO. That was why I found your example humorous, and irrelevant. If anyone buys any videocard expecting or even hoping it will play a game four years down the road, they will be sorely disappointed.
     
  8. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    1) I don't drink coffee
    2) Using a 4 year time frame as some "factological support evidence" is just funny.

    What is not clear, is that we STARTED talking specifically about Voodoo5 vs. GeForceSDR...and how you implied that the GeForce SDR is a "better buy" because it is more "future proof" as "factologically evidenced" by Doom3.

    I was debunking that SPECIFIC opinion.

    And now you're generalizing about "future proofness," which of course no one disagrees with in theory.

    I don't know who suggested spending $300 each year. Again, with the specific case, we're talking about a $300 purchase, and it lasting well over 3 years.

    If you can't afford to spend roughly $50-100 per year on average on a video card upgrade, then you simply won't keep pace with the industry.

    Again, when you look at whatever "new" features you are talking about, you also have to consider the following:

    1) When would those features be used
    2) When they are used, will the card itself be powerful enough to run with them?

    So yes, you can always say "more features is better." But the VSA vs. GeForce argument in question is not that simple, because the GeForce was not a SUPERSET of the VSA. The VSA had quality AA...which could be applied to all games: immediate support.

    So, the argument was "which is better... more features that might be used by the time I need to upgrade, features that when used might not be fast enough to leave on... or a card with some other feature that improves every game I own right out of the box?"

    Everyone needs to make that decision for himself.

    Now, the nVidia faithful back then always said "support for new features is coming sooner rather than later." And the 3dfx faithful said "support will come later rather than sooner."

    And, in my opinion, I would have to say the 3dfx faithful ended up being more accurate.
     
  9. Nagorak

    Regular

    Joined:
    Jun 20, 2002
    Messages:
    854
    Likes Received:
    0
    I have to disagree with this to some extent... It could also mean that the architecture was designed to run in 32 bit with 16 bit only as a secondary consideration. Although the 16 bit implementation may have been poor (on the original Radeon for example) that was likely due to design. At the time hardware sites were still stupidly testing performance in 16 bit (why?) so it made the Radeon look bad. Now most sites don't even bother testing in 16 bit... Did any sites even check the Radeon 9700's 16 bit performance?

    Frankly I think ATi made the right choice by giving 16 bit the boot. The bottom line is 16 bit is horribly ugly, and the only reason to run in it is if you have a really old card that just can't cut it in 32 bit in newer games, or if you're playing a really old game that only runs in 16 bit.

    I'd also like to point out in regards to these two posts that by the time Doom3 comes out you'll be able to buy a R9000 for <=$50. So it's fairly unlikely someone is going to spend $50 on Doom3 to run it in 320*240 resolution. If money is that much of a concern you could even spend $75 and get the game and a Radeon 7500 for $25 (or probably GF2 Pro/Ultra equivalent?).
     
  10. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    I don't fault anyone for not optimizing 16-bit today. But it will be a while before we have the bandwidth luxury of always rendering everything at 128-bit. Thus, just like in the days of expensive 32-bit rendering, having an optimized 64-bit fallback is nice.

    There may come a time when 128-bit rendering effectively runs fast enough that we don't care to drop down to the 64-bit mode, but it will probably take a few generations.

    Hell, in the first next-gen cards, the vast majority of rendering will probably still be 32-bit and only selected special effects will switch to higher precision.
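
    The precision trade-off described above can be sketched numerically. The following toy Python sketch (my own illustration, not anything from the thread; the pass count and per-pass contribution are made-up values) mimics a 16-bit-per-channel float buffer by round-tripping a running sum through IEEE 754 half precision via `struct`'s `'e'` format, and compares it against full-precision accumulation:

    ```python
    import struct

    def to_fp16(x: float) -> float:
        # Round-trip through IEEE 754 half precision (struct format 'e').
        return struct.unpack('e', struct.pack('e', x))[0]

    def accumulate(passes: int, contribution: float, fp16: bool) -> float:
        # Sum many small per-pass contributions; optionally quantize the
        # running total to a 16-bit float after each pass, as a 64-bit
        # (FP16 per channel) framebuffer would.
        total = 0.0
        for _ in range(passes):
            total += contribution
            if fp16:
                total = to_fp16(total)
        return total

    exact = accumulate(1000, 0.001, fp16=False)  # essentially 1.0
    half = accumulate(1000, 0.001, fp16=True)    # drifts noticeably below 1.0
    ```

    The drift appears because once the running total grows, half precision can no longer represent the small per-pass increment exactly; that rounding loss on every pass is the kind of error a higher-precision mode avoids.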
     
  11. flf

    flf
    Newcomer

    Joined:
    Feb 15, 2002
    Messages:
    214
    Likes Received:
    5
    Uh, no. I think you're confusing me with Derek.

    Odd you should say that, as I am not quite sure why you took my original comments so personally in the first place. I never directed my remarks at you until you personally responded to them, and even then I am merely remarking that the opinion that "competition is good" is not universally true.
     
  12. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    I agree perhaps I take them the wrong way, but you did quote me then say my arguments made no sense? Then told me I needed coddling? I dunno why but I kind of took that as being directed at me :)
     
  13. flf

    flf
    Newcomer

    Joined:
    Feb 15, 2002
    Messages:
    214
    Likes Received:
    5
    1. Upon reviewing my own posts, I'm not quite sure why I put "your arguments make no sense" rather than "your arguments depend completely upon the point of view of either consumer or business." Chalk it up to bad form on my part.

    2. The coddling was sarcasm in response to your sarcasm. Unfortunately (or fortunately, depending upon your view), I don't use smilies, so the intent was lost. Chalk it up as my mistake also.
     
  14. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    phew the net can be a bad place for misunderstandings.
     
  15. Guest

    Guest Guest

    If you think 16 bit is horribly ugly, I suggest you buy new glasses. And only comparing 16 to 32 isn't really fair either, because depending on what other features you use, that really makes a difference...
     
  16. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    Actually, 3dfx's "22bit-effective" 16bit mode was rather beautiful. PowerVR's 16bit mode was beautiful as well. It was ATI's and Nvidia's 16bit mode that was rather ugly and hideous. Not all 16bit modes were created equally.

    --|BRiT|
     
  17. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Only if you weren't doing multipass. The 22-bit filter could not protect against hideous multipass artifacts.
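
    A toy model of why an output-stage filter can't save multipass (my own sketch, using a 5-bit channel as in RGB565's red/blue; not actual VSA-100 or GeForce behavior): if the framebuffer is quantized after every blend pass, small per-pass contributions can be rounded away entirely, and no post-filter can recover information that was never stored.

    ```python
    def quantize5(x: float) -> float:
        # Snap a [0, 1] channel value to one of the 32 levels of a
        # 5-bit framebuffer channel (e.g. red or blue in RGB565).
        return round(x * 31) / 31

    def multipass_blend(base: float, layers, low_precision: bool) -> float:
        # Additively blend layers over a base value, writing the framebuffer
        # (and thus quantizing) after every pass, as multipass rendering does.
        fb = quantize5(base) if low_precision else base
        for layer in layers:
            fb = min(fb + layer, 1.0)
            if low_precision:
                fb = quantize5(fb)
        return fb

    layers = [0.011] * 20  # twenty faint additive passes (made-up values)
    full = multipass_blend(0.1, layers, low_precision=False)  # brightens toward ~0.32
    low = multipass_blend(0.1, layers, low_precision=True)    # every pass rounds away
    ```

    In the low-precision run, each 0.011 contribution is smaller than half a quantization step, so the stored value never advances; at full precision the same twenty passes accumulate normally.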
     
  18. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    Perhaps, but it still looked far better than ATI's or NVIDIA's 16bit modes in all the games out at that time.

    --|BRiT|
     
  19. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
    Joe + DemoCoder: Really interesting discussion here! 8)

    While I don't think that nVidia will screw up royally with a 128 bit bus without some semi-HSR, I do think that they have put most of their focus on computational power (they still need a lot of bandwidth for textures/data for these hefty shaders). This might suggest that NV30 could be really fast in VS/PS-intense games, while it may not do so well in the games of today with FSAA/AF, as you point out.

    nVidia have, however, always offered faster performance with a new generation since they know full well that people aren't stupid enough to buy a $400 card just to look at an increase in 3dMark 2002. But OTOH I do sense that ATI have struck a better balance between speed and future features.

    In two months we will be having some hefty discussions here! (I cannot wait) :wink:
     
  20. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    and far worse than ANYONE'S 32bit implementation.
    What's your point? That not all 16bit implementations sucked as badly?
    Ok, so what?
    None of them are as good as 32bit...
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.