WaltC said:
demalion said:
1) There is no reason I've heard so far to believe the nv30 effort has delayed the nv35, and I'd tend to think nVidia would have had to be rather foolish to change their plans to have that result when they hit the nv30 snags. I tend to believe that the nv35 delay would primarily be determined by the marketplace, given the original intended launch schedule...as we've guessed during the nv30 delay period, focus on preparing the nv35 was likely a priority.
In this light, killing the 5800 Ultra seems very sane if they've recently reached the conclusion that they can bring it to market soon enough. I think it lends validity to the May(?)/June launch rumors with the nv35 (as much before the launch of the R400 as possible), and indicates that the issues (I think) Mufu has hinted at are on-track for resolution.
Aren't you just about sick and tired of "rumors" when it comes to nVidia and its products?
Well, I'm fairly amazed that rumors are the majority of what they've managed to deliver successfully, and that feeling has numbed any annoyance, I guess.
That's about all it's been for the past 8-9 months out of nVidia--rumors and speculation galore--and at the end the stark and naked truth:
nVidia has no chip with which to compete with the R300 from ATI.
Eh? No, they do have a chip that competes fairly well. It just isn't feasible to release it as a product. Also, the non-Ultra still competes, just not very successfully from the standpoint of those who picture nVidia as the performance leader. The Ultra has served its purpose of providing benchmarks that can be used to show the "GeForce" at an advantage to the "Radeon".
There were plenty of good, solid reasons to cancel the GF FX Ultra which have absolutely nothing whatever to do with "nv35" (*chuckle* That's so funny--after hearing "nv30,nv30,nv30,nv30".......ad infinitum for the past six months--and how it would "wipe the floor" with the puny R300 from ATI.)
Hmm...well, yes there are, but why are you telling me? I'm the guy who, after E3, was lambasting people for assuming the R300 couldn't possibly be faster than the nv30.
But on to your reasons...
How about:
Walt's list of GF FX Ultra flaws
Do you think any of this is news? Well, I'm not sure what you mean by "too many heatsinks" but I'm not particularly curious...
This isn't the nvnews forums, Walt, nor Rage3D; you don't have to keep pointing out things like this when no one is contesting them (at least not at such length). All I was commenting on (read the text again) was specifically that I don't see any reason at all to assume the nv35 is necessarily delayed because the nv30 was so late...hence terms like "lends validity to the rumors".
Specifically, I think this leaves room for the nv35 to come out before fall (and presumably the R400...I don't think ATi has a great deal of reason to hurry the launch of that even if they could...I think the R350 is likely to compete well enough with the nv35), and opportunity to re-associate nVidia and the GeForce, for whatever amount of time, with the concept of "performance leadership".
This does leave technical issues to be worked out, and I don't have the confidence in nVidia engineers that I would with ATI engineers at this point, but even just adding a 256-bit bus would help the GF FX catch up quite a bit, even before considering the other ideas the engineers may have in mind.
Repeating myself...given the prior hints of the nv35 being the focus of intensive "debugging", it seems likely, in my opinion, that this info about cancelling the 5800 Ultra parts strengthens the likelihood of rumors of a May/June launch schedule. If you disagree with this, a brief reply like that at the end could have sufficed....
Nowhere do I indicate that I disagree that the 5800 Ultra is a flawed part, and I've mentioned the flaws prior. I don't mention them again because they've been mentioned quite a few times already....
Those are just in-your-face, undeniable observations each of which by itself might be reason enough to cancel the product. Taken in total they are overwhelming.
Yes, yes....similar outlooks have been well established. For my part, that is why I was using terms like "sane"...the Ultra just strikes me as a computer OEM dud.
more reasons that I thought were discussed adequately, and I don't see my post contesting.
That's enough. As you can see there are plenty of reasons to cancel this product, none of which are remotely connected to the mythical "nv35." (Ever heard the story about the boy who cried 'wolf'?)
? Yes, I saw it before too...? This first half was a waste of time, IMO.
2) 500/800 doesn't sound very sane to me...but 400/"1000" does.
(a) I'd expect heat issues with "DDR II" RAM to be less severe than those for the GPU, and yields at 400 MHz sound profitable.
(b) Bandwidth is the GF FX's primary problem.
(c) Not having read Kyle's comment yet, it seems like a plausible minor miscommunication.
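To make the clock-pair talk concrete, here is a rough sketch of the raw bandwidth arithmetic, assuming the second number in "core/mem" is the effective DDR data rate in MHz, a 128-bit bus on the GF FX, and the 9700 Pro's public 256-bit/620 MHz effective figures for comparison:

```python
# Rough peak-memory-bandwidth arithmetic for the clock pairs under discussion.
# Assumption: the second number in "core/mem" is the effective DDR data rate
# in MHz, and the GF FX bus is 128 bits wide.

def bandwidth_gbs(bus_bits: int, data_rate_mhz: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return bus_bits / 8 * data_rate_mhz * 1e6 / 1e9

for label, bits, rate in [
    ("GF FX, 128-bit @ 800 MHz effective ", 128, 800),
    ("GF FX, 128-bit @ 1000 MHz effective", 128, 1000),
    ("9700 Pro, 256-bit @ 620 MHz effective", 256, 620),
]:
    print(f"{label}: {bandwidth_gbs(bits, rate):.1f} GB/s")
# -> 12.8 GB/s, 16.0 GB/s, and 19.8 GB/s respectively
```

Which is why "1000" matters so much more than the core clock here: the 800 MHz figure leaves the card a fifth short of even its own Ultra sibling's bandwidth, let alone the 9700 Pro's.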
400/800 is the real deal. Maybe nVidia will up the spec for the non-Ultras. But at a selling price of $300, don't count on it. nVidia already knows the non-Ultras won't compete with R300, and certainly not with R350. Why throw good money after bad? Let it compete with the $300 ATI cards....which *chuckle* will probably be the R300 anyway--just as soon as ATI ships the R350.....
I think you make a good point, and I tend to agree. See above with my later post about the memory clock speed.
BTW, I disagree about your bandwidth statement. Even with virtually *the same* raw physical bandwidth as a 9700P, the Ultra was much slower per clock, and that had nothing to do with bandwidth at all.
Well, I've discussed this before...clocking the RAM near the same frequency as the core with a 128-bit bus is more limiting than doing so with a 256-bit bus. With each card having roughly the same fillrate, this indicates a situation where the GF FX architecture is much more likely to "choke", as I termed it, and I think makes it more likely to get greater returns from increasing the RAM clock frequency (assuming there are no issues with such a clock disparity between core and RAM...I assume nVidia has their interface well in order).
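The per-clock "choke" point above can be sketched numerically. This is a simplification, not a model of either memory controller: it just takes the public clock specs (5800 Ultra at 500 core/500 DDR, 9700 Pro at 325 core/310 DDR) and computes how many bytes of DDR traffic each core clock can draw on:

```python
# Simplified sketch: bytes of peak memory traffic available per core clock
# when RAM and core run at roughly the same frequency. Clock figures are
# the public 5800 Ultra and 9700 Pro specs; all else is an assumption.

def bytes_per_core_clock(bus_bits: int, mem_mhz: int, core_mhz: int) -> float:
    """DDR transfers twice per memory clock; scale by the mem/core ratio."""
    return bus_bits / 8 * 2 * mem_mhz / core_mhz

nv30 = bytes_per_core_clock(128, 500, 500)  # 5800 Ultra
r300 = bytes_per_core_clock(256, 310, 325)  # 9700 Pro
print(f"5800 Ultra: {nv30:.0f} bytes/core clock")  # 32
print(f"9700 Pro:   {r300:.0f} bytes/core clock")  # ~61
```

With similar fillrate per clock, the narrow bus leaves the GF FX with roughly half the per-clock bandwidth, which is why raising the RAM clock (rather than the core clock) is where I'd expect the greater returns.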
But I agree the returns in performance are likely not to be deemed worth the increased cost, though I don't have any definite idea of the cost difference.
I will say, however, that as the latencies in DDR II seem to be a fair amount higher than those in DDR I, and as nVidia's nv30 bus didn't seem optimal to me in the sense of even being very good at 128 bits wide, you might get by with a broad interpretation of "bandwidth" here--certainly not anything that putting slightly faster DDR II RAM on the non-Ultra would solve. I think the GF FX non-Ultra might make good competition for the 9500 Pro, maybe.
I'm also not convinced that the RAM on the GF FX is best considered to be "DDR-II" in regards to latencies. But that's another discussion (no, really... we've had that discussion in another thread...).
I think you'll have a wait on the nv35 though, I really do, as I think it will take significantly more than a 256-bit bus to make it competitive with R350/R400, specifically R400--and that will be a much greater challenge than R300 was, I believe.
Now this is a brief statement of disagreement. I still don't know why you felt the majority of the first half of your reply was necessary.
To reply briefly in turn, I also don't think the nv35 will successfully compete with the R400, and I think nVidia has been focusing on getting the nv35 ready as soon as possible for quite a while (since the 9700 launch at least). I think it is the best glass of lemonade they can make from the situation, and I think they are preparing it as fast as they can.
I'm not going to try and dissuade you at all about your belief on when they are going to deliver it (because I don't have any strong opinion that it is wrong), but I do find issue with your idea of "nv35 can't arrive soon because the nv30 just arrived."