Speed-binning the NV35?

I wonder whether you think it will soon be possible for Nvidia to speed-bin the NV35 and pair it with faster memory.
DDR2 was so expensive on the NV30, but it now seems to be getting cheaper and faster; how fast would DDR2 be in about three months?
Also, what sort of clock increase do you think a good respin of the NV35 could deliver?
 
Until I can see it on shelves, the NV35 is still vaporware for me. It may or may not see the same volume of sales as the 5800U... so I'm not going to worry about speed-binning before it's actually possible to buy it. :) Once people are actually able to buy it, we'll soon see overclocking results, which should give us some indication of how much of a speed increase Nvidia could potentially manage.

And the high-speed DDR2 looks cheap because Nvidia dumped stocks of it when they all-but-in-name canned the 5800U.
 
Don't waste your breath trying to have a meaningful discussion about nVidia cards @ Beyond3d.

This is fanATIc land, and the only time nVidia gets discussed here is to deride it.

Warning -- if I see another useless post like this from you again, you're banned. -- Reverend
 
Oh, dear god... Why isn't the banning policy more aggressive here? Some people just won't ever learn (not talking about you, CorwinB; your reply was okay IMO).

I believe it is worth noting that the NV35 core *does* support DDR2.
It's just that DDR2 is still more expensive, currently produces more heat, and is less efficient clock-for-clock.

Second-generation GDDR2 should fix some of those problems (mostly the heat), so as soon as that's available, it's possible nVidia might be interested in making a refresh. At TSMC, of course, since IBM won't be used before the NV40, I believe.


Uttar
 
radar1200gs said:
Don't waste your breath trying to have a meaningful discussion about nVidia cards @ Beyond3d.

This is fanATIc land, and the only time nVidia gets discussed here is to deride it.

Baseless comment. Go look around in the threads that don't have 3DMark or nVidia in the title.
 
radar1200gs said:
Don't waste your breath trying to have a meaningful discussion about nVidia cards @ Beyond3d.

What a load of rubbish! Would you mind very much stating what it was you didn't find meaningful so far in the thread?

Idle speculation about speed-binning a product which isn't even on sale yet (and will more likely than not stay that way for weeks) is pointless, something Corwin already pointed out. Too bad you had to take it personally or something, because he IS right, you know.

This is fanATIc land, and the only time nVidia gets discussed here is to deride it.

If you truly believe that, perhaps you'd feel more at home on the NvNews forums or such instead.


*G*
 
overclocked, the NV35 is already speed-binned, just like the NV15/GF2, NV20/GF3, NV25/GF4, and NV17/GF4MX--heck, every nVidia card since the Riva (remember the ZX? The TNT2 Ultra?). At the top is the 256MB 5900U, below that is the regular and slightly slower 128MB 5900, and bringing up the rear is the 128MB 5900 value part.
 
As far as I can see, nVidia only used DDR2 with the 5800U to offset the bandwidth disadvantage of its 128-bit bus. Now that they're going to a 256-bit bus, I think sticking with DDR1 is better all the way around. I'm not impressed with ATi's use of DDR2 in its 256MB 9800P--I'm already clocking my DDR1 9800P's RAM faster than the default speed of the DDR2 256MB card.
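The bus-width argument is easy to check with quick arithmetic. A minimal sketch (the clock figures in the comments are approximate retail specs I'm assuming, not numbers from this thread): peak memory bandwidth is simply the bus width in bytes times the effective data rate.

```python
def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float, pumps: int = 2) -> float:
    """Peak theoretical memory bandwidth in GB/s (decimal).

    DDR and DDR2 both transfer data twice per clock (pumps=2);
    bus_bits / 8 gives the bus width in bytes.
    """
    return bus_bits / 8 * mem_clock_mhz * pumps / 1000

# Approximate shipping clocks (assumed for illustration):
print(bandwidth_gb_s(128, 500))  # 5800U: 128-bit DDR2 @ 500 MHz -> 16.0 GB/s
print(bandwidth_gb_s(256, 425))  # 5900U: 256-bit DDR  @ 425 MHz -> 27.2 GB/s
print(bandwidth_gb_s(256, 340))  # 9800 Pro: 256-bit DDR @ 340 MHz -> 21.76 GB/s
```

A 256-bit bus at modest DDR clocks comfortably outruns a 128-bit bus even with fast DDR2, which is exactly the point being made above.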
 
WaltC said:
As far as I can see, nVidia only used DDR2 with the 5800U to offset the bandwidth disadvantage of its 128-bit bus. Now that they're going to a 256-bit bus, I think sticking with DDR1 is better all the way around. I'm not impressed with ATi's use of DDR2 in its 256MB 9800P--I'm already clocking my DDR1 9800P's RAM faster than the default speed of the DDR2 256MB card.
On the 256 MB 9800 Pro isn't the voltage simply set too low for the memory to be used at its full potential?
 
Ostsol said:
On the 256 MB 9800 Pro isn't the voltage simply set too low for the memory to be used at its full potential?

I think if they increased the voltage to get more out of the RAM, they'd probably run into thermal problems with the cooling they're using. IMO, they got a good deal on the stuff and see it primarily as a marketing bullet point for the limited-production 256MB card.

In running my own clocking experiments with the DDR1 9800P, I got brave enough to boost the core to 452.8MHz and for fun ran 3DMark03 (build 330) a few times for an aggregate score of ~5830. The interesting thing is that the RAM on the card seems to max out at 365MHz for reliable operation, yet at 1024x768 I saw score gains when bumping the core from 438MHz to 445MHz, and again from 445 to 452. So even at 365MHz the bandwidth doesn't appear to be exhausted at those core clocks. At least, it looks that way...
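One way to read that result is to compare peak memory bytes available per pixel of theoretical fillrate at each core clock. A rough sketch (the 365MHz DDR figure is from the post above; the 256-bit bus and 8-pipeline configuration are the 9800 Pro's known specs):

```python
def bytes_per_pixel(core_mhz: float, mem_mhz: float,
                    bus_bits: int = 256, pipes: int = 8) -> float:
    """Peak memory bytes available per peak pixel drawn."""
    fillrate = core_mhz * 1e6 * pipes              # pixels/s (theoretical peak)
    bandwidth = bus_bits / 8 * mem_mhz * 2 * 1e6   # bytes/s (DDR, 2x per clock)
    return bandwidth / fillrate

for core in (438, 445, 452):
    print(core, round(bytes_per_pixel(core, 365), 2))
# 438 -> 6.67, 445 -> 6.56, 452 -> 6.46
```

Going from 438 to 452MHz only shaves the ratio by about 3%, and well over six bytes of bandwidth remain per peak pixel, so it's plausible that modest core bumps keep paying off, consistent with the scores reported above.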
 
WaltC said:
I think if they increased the voltage to get more out of the RAM, they'd probably run into thermal problems with the cooling they're using. IMO, they got a good deal on the stuff and see it primarily as a marketing bullet point for the limited-production 256MB card.

You're probably right. According to Tom's, the DDR II on the 9800 is already running rather hot:

In addition, the card generates an extremely high amount of heat. This is a peculiarity of the DDR II memory, from which NVIDIA's NV30 (alias GeForce FX 5800 Ultra) also suffers.
 
I'm in agreement with radar1200gs' words; the mods/users here seem to actually enjoy spreading anti-nVidia propaganda, well, when they're not too busy bashing Kyle and Co.

http://www.3dgpu.com/phpbb/viewtopic.php?t=5145

The pendulum used to be somewhat centered and the discussions here were insightful to read, but all we are left with now are fanboyish jibes and pom-pom waving.
Talk has turned away from 3D tech and shifted to topics I'd normally associate with Rage3D, but that can't be helped, I guess, especially when the majority of your new user base hails from there.
Anyone who has been a long-time reader (and lurker, in my case) of B3D cannot deny the change that has taken place here, and I feel this can only be of further detriment to the site. :(

It was good while it lasted, but I feel my time is now better spent elsewhere... Que Sera, Sera


CitizenC



Beyond3D Bias meter: ATI [ X - - - | - - - - ] NVIDIA.
 
Gotta love the "I'm a long-time lurker" thingie. That's a trademark of folks who have never heard of a site but don't want to look trollish when they register just to trash it (hint: you failed).

I remember (and perhaps you do, if you truly are a long-time lurker) the time when the place was labeled "3dfx-biased", and even earlier "PowerVR-biased", by "long-time lurkers" who registered just to post that. Some things never change.

The only bias I've consistently noticed for the majority of B3D posters is

Beyond3D Bias meter: truth [ X - - - | - - - - ] BS PR

Too bad one IHV consistently resorts to PR, lies, dishonest tactics, cheating... in order to secure its market share. That specific IHV, whatever the qualities of its products, is bound to look bad around here.

"Long time lurker", now that you have done your little trolling tour, perhaps you could leave this "propaganda" place alone, and go back to where you came from ?
 
CitizenC said:
I'm in agreement with radar1200gs words, the mods/users here seem to actually enjoy spreading the Anti Nvidia propaganda, well when they're not too busy bashing kyle and Co.
Point out one instance where the mods have spread "anti-nVidia propaganda." You can't stop forum users from doing so, and we surely have a few fans on either side, but the B3D mods and editors have been even-handed throughout, IMO. Discovering nVidia was cheating and investigating it is NOT the same thing as "anti-nVidia propaganda."
 
WaltC,

I don't think you can fault ATI's implementation of DDR2 here. As you can plainly see on the board, it features two BGA devices in series per data/address line. This will of course put the memory subsystem under greater stress than a single device would. Traces will be considerably longer (maybe 2x as long), and having two loads on a line also complicates things quite a bit at those speeds. ...Or so at least the experts say, and I'm not going to argue with them! :D


*G*
 
Though I don't find Beyond3D to be anti-nV, I do consider this forum to be a tad on the ATI side of things. I think that's explainable, considering that many users here came from Rage3D. Keep in mind that I don't see this as a bad thing, necessarily ;)
 
Testiculus Giganticus said:
Though I don't find Beyond3D to be anti-nV, I do consider this forum to be a tad on the ATI side of things. I think that's explainable, considering that many users here came from Rage3D. Keep in mind that I don't see this as a bad thing, necessarily ;)

That's an interesting comment coming from you, TG... Why would you be apprised of specifics re: Rage3D forum users, unless you kept an eye on their forums? Does this give an indication of your employer's future direction...? ;)
 
I'd have to agree with CitizenC, and I don't have any particular bias toward ATI or Nvidia (I've owned multiple cards from both sides).
 
This is not really bias at all. Some people might favour a side a little, but overall I think most people here "call a spade a spade", and some people who don't quite understand the issues take it as bashing. Any company that engages in dodgy, unethical, or morally wrong practices gets bashed just the same here; some people just can't handle that and call it fanboyism. Yawn. Get over it.

And larger companies in a slightly more dominant market position generally resort to things a little dodgy, and so will cop a little more flak.
Big surprise; it happens everywhere.

This site is not fanboyish, and I doubt it ever will be.
HardOCP is a blatant, raving fanboi site if ever there was one.
 