GeFX canned?

Discussion in 'Architecture and Products' started by Nebuchadnezzar, Feb 5, 2003.

  1. Mark

    Mark aka Ratchet
    Regular

    Joined:
    Apr 12, 2002
    Messages:
    604
    Likes Received:
    33
    Location:
    Newfoundland, Canada
    I realize that, but you'd have to be a pretty hardcore fan-boy to choose a GFFX non-Ultra over a 9700 Pro, wouldn't you? I mean, the non-Ultra has literally nothing going for it compared to the 9700 Pro - definitely no speed advantage, no image quality advantage, and (for a guesstimate) no price advantage... I call it like I see it, and what I see is a DOA nV30 non-Ultra. As far as I'm concerned, the nV30 is no more.

    Having said that, nVidia must also have realized that the non-Ultra was in absolutely no position to go head-to-head with the 9700P when they decided to scrap the GFFX Ultra. I can't possibly see a company cut their own throat like that - logic says that they have to have something waiting in the wings... what it is, I have no clue...
     
  2. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    How about better Linux support?
     
  3. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I have to tell you I think that's a rather illogical assumption you make at the end there...the nv30 Ultra was the "something" they had hoped to make competitive with the 9700P. Isn't that obvious? If there had been anything else, do you really think they'd have gone to the extremes of even proposing something like the Dustbuster solution in the first place?

    I don't. I believe if they had anything else they would've announced *that* instead, and scrapped the dustbuster from the beginning. Besides, for the past six months nVidia has had nothing remotely competitive with the 9700P to offer--nothing. I can't see what makes it so imperative "now" versus then. The simple truth is they couldn't do it then and they can't do it now.

    Let's see...how many years would you say the ATI architectures ran behind the nVidia architectures? Do you think somehow that nVidia is "immune" to being behind, or has some sort of supernatural dispensation to protect the company from it? If so, it's already failed because nVidia has been behind for the past six months. Categorically behind.

    I see no "logical reason" why nVidia even has to catch up anytime soon, much less start winning the race anytime soon.
     
  4. Sherlock

    Newcomer

    Joined:
    Jul 29, 2002
    Messages:
    25
    Likes Received:
    0
    Location:
    Colorado, USA
    What a waste...think of all those countless engineering hours, all the cunning PR hype, and giving a big bonus to the guy who came up with the dustbuster solution...all down the drain.
     
  5. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    linux support?
    who cares? :lol:
     
  6. Mark

    Mark aka Ratchet
    Regular

    Joined:
    Apr 12, 2002
    Messages:
    604
    Likes Received:
    33
    Location:
    Newfoundland, Canada
    heh, now you're starting to grasp.
     
  7. Mark

    Mark aka Ratchet
    Regular

    Joined:
    Apr 12, 2002
    Messages:
    604
    Likes Received:
    33
    Location:
    Newfoundland, Canada
    ... but would they have ditched the Ultra if they didn't have something to take its place? I doubt it. Like I said, they'd be cutting their own throat to simply drop the Ultra with nothing to carry them through to the NV35. That's my point. Whatever it is, they might not be ready to announce yet, which is why we haven't heard anything...

    Then again, maybe it's like BRiT said - maybe they can't produce the GFFX Ultra reliably - in which case they'd have no choice but to scrap the design... woe is nVidia...
     
  8. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    If they would be losing money on each Ultra produced, then yes, it would be smart (maybe) to ditch it - even without another product coming soon.
     
  9. Mark

    Mark aka Ratchet
    Regular

    Joined:
    Apr 12, 2002
    Messages:
    604
    Likes Received:
    33
    Location:
    Newfoundland, Canada
    That's a good point. It's not like they're selling consoles and have game sales to prop them up.
     
  10. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    It's sort of funny seeing how this news was posted on NvNews' front page but was recently removed. Seems like someone is in denial. :roll:
     
  11. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Right...and if nVidia was forced to sell it to the OEMs and make guarantees to them - and other things, the sum total of which meant that nVidia's OEMs weren't willing to pay what nVidia needed them to pay for nVidia to turn a profit - that would be it.

    That's the thing--it's the pressure of the competition from ATI--because the OEMs know they do not have to pay nVidia what it demands to sell a successful, competitive high-end 3D product that is in high demand. If they have to, they can go to ATI and get a better deal. You can be sure that ATI's OEM pricing in the market is also affecting what nVidia's OEMs are having to pay for the plain nv30 chips--it's certainly not what it would be if nVidia had no competition from ATI in this market. All of this in turn drives down the price to consumers.

    What I've been trying to get across here is the idea that nVidia has been beaten--last product cycle, this product cycle, too, and it may continue this way for some time in the future. ATI has a fantastic architecture in the R300, very similar to nVidia's former TNT2-based architecture which it enlarged and expanded virtually all the way through the GF4. When companies move to new architectures the rule books fly out of the window--current performance is no longer based on past successes because the company no longer is standing on the bulwark of the older, successful architecture.

    Now it's ATI with a fantastic architecture to build on for the next few years. It will be interesting indeed to see how nVidia responds. All of it though will be good for consumers. I for one am delighted that ATI "woke up" and decided to become competitive in the 3D chip marketplace--not just because I think the 9700P is the best 3D card I've ever owned, but because when a single company gains control of a market segment unchallenged, innovation is often the first thing to go. I think innovation is something we can look forward to now for a long time to come. *chuckle* "3D Wars" haven't been this much fun since 3dfx, Lord rest their pudgy souls...;)
     
  12. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    *chuckle* Maybe [H] is having a good time pulling a fast one on all of us...;) Ah, well, if so....it's certainly kept things interesting.... :D
     
  13. X-Reaper

    Newcomer

    Joined:
    Jul 28, 2002
    Messages:
    9
    Likes Received:
    0
    It's really looking more to be true now. PNY has updated their website.
    They removed the ULTRA name and lowered the specs.

    http://www.pny.com/home/products/Vcard_fx.cfm

    GeForce FX Specifications
    CineFX Engine providing cinematic-quality special effects
    400MHz core clock
    400MHz memory clock (800MHz effective memory clock)
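
    For what it's worth, the "800MHz effective" figure in that listing is just the usual DDR doubling of the base memory clock; a minimal sketch of the arithmetic:

```python
# DDR memory transfers data on both the rising and falling clock edges,
# so the marketed "effective" clock is twice the base memory clock.
base_clock_mhz = 400
effective_clock_mhz = base_clock_mhz * 2
print(effective_clock_mhz)  # 800
```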
     
  14. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    Well, I'm in a fair bit of amazement that such rumors are the majority of what they've managed to deliver successfully, and that feeling has served to numb any annoyance, I guess. :-?

    Eh? No, they do have a chip that competes fairly well. It just isn't feasible to release it as a product. Also, the non-Ultra still competes, just not very successfully from the standpoint of those who picture nVidia as the performance leader. The Ultra has served its purpose of providing benchmarks that can be used to show the "GeForce" at an advantage to the "Radeon".

    Hmm...well, yes there are, but why are you telling me? I'm the guy who was lambasting people for making assumptions after E3 that it was preposterous that the R300 couldn't be faster than the nv30.

    But on to your reasons...

    Do you think any of this is news? Well, I'm not sure what you mean by "too many heatsinks" but I'm not particularly curious... ;)
    This isn't the nvnews forums, Walt, nor Rage3D; you don't have to keep pointing out things like this when no one is contesting them (at least not at such length). All I was commenting on (read the text again) was specifically that I don't see any reason at all to assume the nv35 is necessarily delayed because the nv30 was so late...hence terms like "lends validity to the rumors".

    Specifically, I think this leaves room for the nv35 to come out before fall (and presumably the R400...I don't think ATi has a great deal of reason to hurry the launch of that even if they could...I think the R350 is likely to compete well enough with the nv35), and opportunity to re-associate nVidia and the GeForce, for whatever amount of time, with the concept of "performance leadership".

    This does leave technical issues to be worked out, and I don't have the confidence in nVidia engineers that I would have in ATI engineers at this point, but even just adding a 256-bit bus would help the GF FX catch up quite a bit, even before considering the other ideas the engineers may have in mind.

    Repeating myself...given the prior hints of the nv35 being the focus of intensive "debugging", it seems likely, in my opinion, that this info about cancelling the 5800 Ultra parts strengthens the likelihood of rumors of a May/June launch schedule. If you disagree with this, a brief reply like that at the end could have sufficed....

    Nowhere do I indicate that I disagree that the 5800 Ultra is a flawed part, and I've mentioned the flaws prior. I don't mention them again because they've been mentioned quite a few times already....

    Yes, yes....similar outlooks have been well established. For my part, that is why I was using terms like "sane"...the Ultra just strikes me as a computer OEM dud.

    ? Yes, I saw it before too...? This first half was a waste of time, IMO. :-?

    I think you make a good point, and I tend to agree. See above with my later post about the memory clock speed.

    Well, I've discussed this before...clocking the RAM near the same frequency as the core with a 128-bit bus is more limiting than clocking near the same frequency with a 256-bit bus. With each card having roughly the same fillrate, this indicates a situation where the GF FX architecture is much more likely to "choke", as I termed it, and I think makes it more likely to get greater returns from increasing RAM clock frequency (assuming there are no issues with such a memory clock disparity between core and RAM...I assume nVidia has their interface well in order).
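
    To put rough numbers on the bus-width point: peak bandwidth is the effective (DDR) clock times the bus width in bytes. The clock figures below are the commonly cited retail specs for the two cards, used purely as an illustrative assumption:

```python
# Peak memory bandwidth = effective (DDR) clock * bus width in bytes.
def peak_bandwidth_gbs(base_clock_mhz, bus_width_bits, data_rate=2):
    effective_hz = base_clock_mhz * 1e6 * data_rate
    return effective_hz * (bus_width_bits / 8) / 1e9

# GeForce FX 5800 Ultra: 500 MHz DDR-II on a 128-bit bus.
print(peak_bandwidth_gbs(500, 128))  # 16.0 GB/s
# Radeon 9700 Pro: ~310 MHz DDR on a 256-bit bus.
print(peak_bandwidth_gbs(310, 256))  # ~19.8 GB/s
```

    So even with a markedly lower memory clock, the 256-bit bus comes out ahead, which is the sense in which the 128-bit design is the more "choked" of the two.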

    But I agree the returns in performance are likely not to be deemed worth it for the increased cost, though I don't have any definite idea of the cost difference.

    I'm also not convinced that the RAM on the GF FX is best considered to be "DDR-II" in regards to latencies. But that's another discussion (no, really... we've had that discussion in another thread...).

    Now this is a brief statement of disagreement. I still don't know why you felt the majority of the first half of your reply was necessary.

    To reply briefly in turn, I also don't think the nv35 will successfully compete with the R400, and I think nVidia has been focusing on getting the nv35 ready as soon as possible for quite a while (since the 9700 launch at least). I think it is the best glass of lemonade they can make from the situation, and I think they are preparing it as fast as they can.

    I'm not going to try and dissuade you at all about your belief on when they are going to deliver it (because I don't have any strong opinion that it is wrong), but I do take issue with your idea of "nv35 can't arrive soon because the nv30 just arrived."
     
  15. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Yes, you're right--I find this particularly persuasive since I had visited the page earlier and saw the Ultra FX specs. I'm sold. Thanks for the link.
     
  16. Mulciber

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    413
    Likes Received:
    0
    Location:
    Houston
    I think you're mystifying complicated engineering quite a bit there, aren't you Walt? *chuckle* This topic and the term "3dfx technology" are getting quite old. The engineers from former 3dfx were absorbed by a larger body of engineers, and the buck stops there. Anything they had been working on 2 years ago is now obsolete. Asking if "their approach" was used in the design of the NV30 is just silly - of course it was, since nVidia and 3dfx both used immediate-mode rendering designs. As far as RGAA goes, nVidia apparently couldn't make it work without a huge tradeoff in performance. It did work quite well in the VSA architecture with two chips, but consumers expect better than a 50% drop in performance at 2x AA these days. Either they couldn't make it work, or they thought they had something better and failed. Anything that a T-buffer could have done can now be done with multiple buffers. Exactly what kind of information are you expecting them to provide? Some sweeping statement like "this particular transistor was designed with 'Mofo tech' in mind"? That's not going to happen.

    I found this particularly amusing

    Next time you go into a frothing tirade (though no one was disagreeing with you), you might find that heat dissipates and doesn't displace. And how many heatsinks is too many exactly? I see 2 on the GFFX, but I recall there also being 3 on the GF2 Ultra, 1 for the chip and 2 for the memory.
     
  17. Colourless

    Colourless Monochrome wench
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,274
    Likes Received:
    30
    Location:
    Somewhere in outback South Australia
    Damn you Kyle... damn you....

    I *almost* want to make this trade....

    But after much consideration, I decided there is no chance. I just will not downgrade my FSAA quality 'that' much.
     
  18. Hellbinder

    Banned

    Joined:
    Feb 8, 2002
    Messages:
    1,444
    Likes Received:
    12
    demalion
    I simply can't believe I'm reading stuff like this. Seriously.

    The Ultra has *served its purpose*??? What a bunch of nonsense. So now it's an accepted practice for a company to overclock a product so far that they can't release it like that, and somehow it counts as a real product? Well guess what, that crap does not fly. ATI could have done the same freaking thing to the R300 any time they wanted.
     
  19. Mulciber

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    413
    Likes Received:
    0
    Location:
    Houston
    No one said it was ethical.... :roll:
     
  20. Hellbinder

    Banned

    Joined:
    Feb 8, 2002
    Messages:
    1,444
    Likes Received:
    12
    I know, I edited my comment...

    But it continually irritates me that people will pass it off, and not get upset about this kind of conduct.
     