Speedbin of nv35?

I personally think Nvidia is to blame for the heightened scrutiny of their products. Even if the cheating issue didn't exist, the NV3x series would surely get a lot of attention because it seems like a non-standard part. It doesn't appear to have TMUs in the traditional sense of the word. It uses various precisions for different PS operations. It really is a very unique product, and this is the type of place that will analyze that break from the mainstream to death. We will try to figure out whether it was a good or bad decision. I think some people reaching the conclusion that Nvidia made a mistake is being construed as anti-Nvidia, but it's not.
Then if you heap the current benchmark fiasco on top of it, Nvidia has placed themselves under the magnifying glass.
 
indio said:
I personally think Nvidia is to blame for the heightened scrutiny of their products. Even if the cheating issue didn't exist, the NV3x series would surely get a lot of attention because it seems like a non-standard part. It doesn't appear to have TMUs in the traditional sense of the word. It uses various precisions for different PS operations. It really is a very unique product, and this is the type of place that will analyze that break from the mainstream to death. We will try to figure out whether it was a good or bad decision. I think some people reaching the conclusion that Nvidia made a mistake is being construed as anti-Nvidia, but it's not.
Then if you heap the current benchmark fiasco on top of it, Nvidia has placed themselves under the magnifying glass.

Yea, and the problem really is that nVidia isn't forthcoming with nearly as much general information as it should be. Instead, they do things like advertise nv30 as 8x1. Then later, they send out cryptic little messages that force people to read between the lines and draw assumptions--something that has to be done because nVidia doesn't explain itself well enough to be understood in even a general sense. Much of what is "known" about certain aspects of nv3x isn't known because of what nVidia has said about it--it's merely the result of people trying to make sense of the bits and pieces they throw out along with insufficient explanation. It's as though they try to steer people toward certain conclusions while enjoying the luxury of never stating those conclusions directly themselves. To tell you the truth, though, I've never known nVidia to behave any differently.
 
Hmm, well, I don't mind my current job :) And yes, I do frequent the Rage3D boards sometimes, though that does not mean that my current employer or I are switching ships ;)
 
Back on topic, DDR2 is doable for the NV35, though not quite needed right now. If it becomes necessary to use DDR2, it will be done. My opinion is that its advantages over traditional DDR are quite significant in terms of headroom for clock-speed growth.
 
I thought that the main reason for ATI using DDR II was to keep memory voltage down to a reasonable level...
 
It seems the main reason was that they got it "cheap" from Samsung. Maybe it also gives them experience with DDR2, though I don't think they'll be using it in future cards--I thought GDDR was the next thing.
 