GeFX canned?

Discussion in 'Architecture and Products' started by Nebuchadnezzar, Feb 5, 2003.

  1. MikeC

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    194
    Likes Received:
    0
  2. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I remember S3 for doing the first viable texture compression--the original UT even included a special S3-compressed texture pack. AGP texturing, however, never proved itself in any 3D game I am aware of, and today is completely overshadowed by 3D cards with their own much faster onboard local ram buses. I do remember a lot of silly people running synthetic benchmarks with 50mb and 100mb single files which they then said "proved" the value of AGP texturing for 3D gaming. Problem is at the time you could barely find a single file in a 3D game over 1 megabyte, let alone 50mbs....*chuckle* And of course today that nonsense is at last behind us forever (hopefully.)

    Although, there are still some slow folks around who actually expected 3D games to show a performance spike when they moved to AGP x8 from AGP x4--I even read one site which was "disappointed" (and obviously baffled) because there was no difference in performance that they could chart. It seems like you could explain to people over and over again how the onboard texturing on a 9700P is 20x faster than AGP x4 and 10x faster than AGP x8, and they *still* expected to see a performance difference when moving to AGP x8.

    AGP texturing was clearly nothing but a marketing gimmick spread by ignorant people at the time who couldn't grasp the point that 3dfx tried to communicate: why use AGP texturing at all when your onboard ram bus is many times faster? Always seemed pretty simple to me--even at the time of the Savage3D. There were those of us who got it then, and those who didn't. I would hope that today the vast superiority of the local ram bus for texturing in 3D games over that of the AGP bus is abundantly apparent.

    Sigh...even at the time a person could expound on how AGP texturing was developed by Intel when vram for video cards was $50-$75 a megabyte, if not higher, and for a 3D scenario in which the most onboard ram you'd see on a 3D card was 8 megabytes. The only real AGP-texturing 3D card ever made that I'm aware of was the Intel i7xx series, which featured no more than 8 megs of onboard ram and a heavy dependence on AGP texturing. The card was beaten soundly by 3dfx products of the time which didn't use AGP texturing at all, and eventually Intel folded up shop and got out of the 3D graphics chip business completely. IE, the only "real" AGP-texturing 3D card ever made was a dog and a complete flop--utterly non-competitive with 3D solutions that did not rely on AGP texturing at all.
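
    The rough arithmetic behind the "20x faster than AGP x4" figure above can be sanity-checked; the sketch below uses the Radeon 9700 Pro's published memory specs and nominal AGP transfer rates, and the exact ratios are illustrative:

```python
# Illustrative bandwidth arithmetic; figures are nominal published specs.
agp_base = 266 * 10**6      # AGP 1x: ~266 MB/s
agp_4x = 4 * agp_base       # ~1.06 GB/s
agp_8x = 8 * agp_base       # ~2.13 GB/s

# Radeon 9700 Pro: 256-bit bus, ~310 MHz DDR memory
bus_bytes = 256 // 8                # 32 bytes per transfer
local_bw = bus_bytes * 310e6 * 2    # DDR: two transfers per clock, ~19.8 GB/s

print(local_bw / agp_4x)  # ~18.6, roughly the "20x" in the post
print(local_bw / agp_8x)  # ~9.3, roughly the "10x"
```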
     
  3. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Well, I think you should click on the pic... and then you'll find that it's a GF FX Ultra in pre-buy ;)
    http://www.compusa.com/promos/geforcefx/default.asp?cm_ven=pny&cm_cat=link&cm_pla=ros&cm_ite=gfx

     
  4. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
  5. Morris Ital

    Newcomer

    Joined:
    Jan 22, 2003
    Messages:
    28
    Likes Received:
    0
    Location:
    UK
    You can get it from

    http://www.komplett.co.uk/k/kl.asp?AvdID=1&CatID=24&GrpID=1&t=278&l=2

    as well
     
  6. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
  7. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Ah, my sweet 427 Chevy--427 cubic inches of *displacement* with which I could burn a swath of rubber for a quarter mile, and which got all of 8 miles to the gallon...;) Hopefully, it's now returned to some junkyard somewhere and is finally and forever at peace...;) (Probably recycled for the metal 20 years ago...;))

    Yes, the hairdryer sound was quite something...the heat, the size...it's just amazing to think that, with the 9700P having shipped back in September, the 5800 Ultra was what nVidia threw most of its resources into and actually presented to the market as a "competitor."

    I really think they found the market nowhere near as gullible as it used to be. And of course it didn't hurt at all that the standard to which it was compared was the 9700P.

    As to admitting defeat, nVidia already has. Here's the link:

    http://www.bayarea.com/mld/mercurynews/business/5093187.htm

    Here's the appropriate text:

    Huang is confident that Nvidia will end up back on top. ``Tiger Woods doesn't win every day. We don't deny that ATI has a wonderful product and it took the performance lead from us. But if they think they're going to hold onto it, they're smoking something hallucinogenic.''

    I applaud Huang for being honest--it's so much nicer than trying to maintain a fiction everyone sees through.
     
  8. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
  9. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    When I say "time allowed", I'm talking about "when they recognized that the nv30's 128-bit bus was not enough".
    IMO:
    The shortest time this would be is the time before the R300 launch that they realized it would have a 256-bit bus (I'm pretty sure that would be pretty early last year at the latest).
    The longest time is from the beginning of the design of the nv35, if they would have banked on it being a good way to improve performance (this seems pretty likely to me).

    When they started focusing efforts on "overclocking" the GF FX to Ultra levels might be an indicator of the latest possible date for the former.

    I'm not talking about all of a sudden just now deciding to "tack on" a 256-bit bus.

    I try to stay away from the term "need to be designed" without any real idea for nVidia's capability to adapt. I'd also say the board design for such a part is something else they would have started at the same time as they recognized a 256-bit bus might be a good idea (which could conceivably have been much earlier than the above dates).

    It would certainly seem to me that considering such possibilities as early as possible is the most efficient approach to execution.
     
  10. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    Benchmarks are still showing the GF FX Ultra "beating" the 9700 Pro.

    This gives customers fond of the brand name reason to assume the nv35 will "beat" the R350, and is enough to "contest" ATI's performance crown.

    This still leaves the most dedicated customers (pre-order customers) with cards to brag about to consumers who would not have purchased one in any case, and leaves reviewers to benchmark them for those consumers in the same fashion.

    Finally, this prevents the negative impact (having power supply problems, being annoyed by the noise) from being driven home to a large number of consumers, and keeps them as abstract as "something they heard about" when looking at benchmark results that will still be associated with a purchasable product (the "GeForce FX" brand name). It remains to be seen whether print magazines will facilitate this (well, if your opinion of print magazine reviews leaves room for doubt).

    Hmm...I see what has happened as a "'terminal' volume scale back", not a "cancel it altogether". The Ultra parts will be rare, not non-existent.

    We've covered the list of problems enough, I think. :lol:

    I'm not so sure it would be nVidia being concerned with that, but more like nVidia being concerned about OEMs being concerned or dealing with such. Not being the only big kid on the block means you get away with a lot less.
     
  11. THe_KELRaTH

    Regular

    Joined:
    Dec 9, 2002
    Messages:
    471
    Likes Received:
    0
    Location:
    Surrey Heath UK
    Well, let's hope some reviewers hang onto the NV30 Ultra so that it can be tested in comparison with the R300 core under 3DMark2003.
    If it follows the same trend as the ShaderMark tests, then just switching to 256-bit won't help much.
     
  12. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Seriously, though...now that the news is out that the product has been cancelled, who'd want to buy the noisy, hot thing...? I mean, if nVidia has such little confidence in it that they are withdrawing it from the market before it ships, why should a consumer pre-order it?

    I think ATI would be wise to press ahead while it has the advantage. The more distance it can put between its products and nVidia's, the better. The worst thing a company can do, as has been proven by more than one company, is to withhold its viable technology from the market when it is able to ship it. For instance, it would have been far better for nVidia to have shipped the nv30 non-Ultras last year ASAP, and to plunge immediately into nv35 development, than to do what it did. ATI needs to do exactly as nVidia did and pace itself by itself, and let the "chips" fall where they may. nVidia didn't "wait" on 3dfx or ATI to ship products prior to nv30--it shipped them according to its own internal timetable and didn't much worry about what the other guys were shipping. This is the attitude ATI should adopt, I think. I hope they do.

    I think the "slightly improved" fan version is nothing but the original fan tuned to turn completely off when not running a 3D game, and at a slightly slower speed when running on high.

    No doubt the fan would prove the Achilles' heel of the design, not only from a marketing standpoint--many would find it distasteful enough to overlook the product entirely, despite any other assumed superlatives it might have had--but also from a warranty standpoint, as the fan would be the likeliest component to fail and provoke an immediate RMA. This is no ordinary fan which a consumer himself could easily and cheaply replace if needed.

    A concern I had about the fan from the start is how an end user would keep it clean. Kyle at [H] so far has been the only one I've seen comment on this practical aspect, and he mentioned the use of canned air. My opinion is that you'd have to remove the card from your system, unscrew the plastic fan housing, and then blow or wipe the accumulated dust and dirt off, reassemble and reinsert it into your system. The question would be of how often this would have to be done, of course. The heat-pipe baffling inside the fan housing makes a perfect trap for lint, dust, and dirt particles. If it gets stopped up, the card overheats, the clock-throttle comes on, and your MHz drop to 300--until you clean the fan.

    Even with a clean and properly operating fan there were certain objections to the clock-throttling mechanism by people who looked at it. Anand said that when he tried a modest overclock the clock-throttle activated and dropped to ~300MHz, right in the middle of the game. So as predicted, the overclocking potential of the Ultra is nil, since the card was already overvolted and overclocked right from the factory--the clock-throttle was necessary in case of fan failure, or in case of general overheating of the chip. Some of the artifacts I saw in the [H] review which were attributed to the drivers looked suspiciously like artifacts I have seen in the past when trying to overclock a 3D chip too high, resulting from overheating.
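
    The fail-safe behavior described above amounts to a simple rule. A minimal sketch, assuming hypothetical names and thresholds (this is not NVIDIA's actual firmware logic; only the 500/300 MHz clocks come from the reviews discussed here):

```python
# Hypothetical sketch of the fail-safe clock throttle described above.
# Thresholds and function names are illustrative assumptions.
NORMAL_MHZ = 500       # GeForce FX 5800 Ultra rated core clock
THROTTLED_MHZ = 300    # fallback clock reviewers observed
TEMP_LIMIT_C = 100     # assumed thermal trip point

def core_clock(temp_c: float, fan_ok: bool) -> int:
    """Drop to the safe clock on overheating or fan failure."""
    if not fan_ok or temp_c > TEMP_LIMIT_C:
        return THROTTLED_MHZ
    return NORMAL_MHZ

print(core_clock(85, True))    # normal operation: 500
print(core_clock(105, True))   # overheating: throttled to 300
print(core_clock(85, False))   # fan failure: throttled to 300
```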

    At any rate, the thermal and electrical problems inherent in the design undoubtedly complicated manufacturing of the product and added to the expense, and I think in the end someone at nVidia just simply "woke up" and realized this was a choice of cutting your losses now, or continuing on ahead and risking millions of dollars more in losses. The company did the right thing, IMO, both for consumers and for its shareholders.
     
  13. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Heh-Heh....personally, I keep hoping reviewers will drop 3D Mark entirely and simply stick to benchmarking games....I think that's much more valuable information.
     
  14. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    No arguments that today the balance is hugely tilted against AGP texturing.

    Back in the Savage3D days it was a big bonus. Since it only had 64-bit SDRAM it could saturate the VRAM bus completely with just 16-bit Z + colour at 1 pixel/clock.

    Therefore, since it had enough latency comp to absorb completely the AGP cache fetch, it was almost always faster to texture from AGP because it was 'free' extra bandwidth.

    Now it is very different, because with DDR and 256-bit buses there is 8x more bandwidth available on the VRAM bus and HyperZ and colour compression crank the effective bandwidth up even more. In contrast, AGP has got only 4X more bandwidth and the latency compensation requirements are way up so getting good AGP performance costs more $$$ than it used to.
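
    The "8x more" versus "only 4x more" comparison above can be checked per clock; the sketch holds clock speed fixed and compares only bus width and transfer modes (clock-speed gains and HyperZ-style compression would widen the local-bus advantage further):

```python
# Per-clock comparison matching the "8x vs 4x" figures above.
# Widths only; clock gains and compression are deliberately excluded.
old_local_bytes_per_clk = 64 // 8          # 64-bit SDRAM: 8 bytes/clock
new_local_bytes_per_clk = (256 // 8) * 2   # 256-bit DDR: 64 bytes/clock
print(new_local_bytes_per_clk / old_local_bytes_per_clk)  # 8.0x VRAM growth

old_agp_mode = 2   # AGP 2x in the Savage3D era
new_agp_mode = 8   # AGP 8x in 2003
print(new_agp_mode / old_agp_mode)  # 4.0x AGP growth
```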
     
  15. FrgMstr

    Newcomer

    Joined:
    Jun 26, 2002
    Messages:
    223
    Likes Received:
    4
    Location:
    Lucas, TX
    I can respect that. :)
     
  16. surfhurleydude

    Newcomer

    Joined:
    Jan 25, 2003
    Messages:
    176
    Likes Received:
    0
    I believe it's totally unfair to Futuremark that you are talking so much trash about 3DMark2003 like that. If it's anything like it's supposed to be, it'll be more stressful on the GPU instead of a combination of things like 3DMark2001... It should give a good indication of a card's strengths and weaknesses in GPU-limited situations.
     
  17. FrgMstr

    Newcomer

    Joined:
    Jun 26, 2002
    Messages:
    223
    Likes Received:
    4
    Location:
    Lucas, TX
    Going to be hard to get confirmation from NVIDIA PR when they do not know it themselves.

    And yes, I asked TypeDef to stop linking us, as he was copying and pasting the entire contents of our news posts. Certainly you have posted many links of ours in the past weeks; I don't know why all of a sudden you would stop posting links now.

    Maybe you are trying to be nice to NVIDIA and get that fansite card once again?
     
  18. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
    Huh? Are you saying that it is the OEMs that are going to dump the Ultra? Or are you saying that nVidia PR won't come forward about this until later (on Monday)?
     
  19. THe_KELRaTH

    Regular

    Joined:
    Dec 9, 2002
    Messages:
    471
    Likes Received:
    0
    Location:
    Surrey Heath UK
    Kyle, have you tried to run the ATI DX9 demos on the NV30 yet?
     
  20. FrgMstr

    Newcomer

    Joined:
    Jun 26, 2002
    Messages:
    223
    Likes Received:
    4
    Location:
    Lucas, TX
    We got our information on this through some VERY ODD channels, but still reliable. I do not think NV PR was aware of this issue when we posted it.
     