NVIDIA GT200 Rumours & Speculation Thread

Status
Not open for further replies.
I wasn't keeping track of hardware back then, but their marketing practices don't seem to have gotten any better since. Killing the VIA K8T900 by blackmailing mobo makers is a shining example...


VIA sucks anyways so why should anyone care.
 
It certainly does hurt Nvidia cards if the game runs in a less than optimal way without DX10.1. If adding DX10.1 support gives a minor performance boost, they might do it; but if it adds significant features or turns the game from being unplayably slow to playable, they won't bother; they'll downgrade it until it can run comfortably in DX10 instead.

There's also the development effort to consider. Why bother investing a lot of development work in features that only a tiny fraction of your target audience can benefit from?

No, it doesn't make Nvidia cards perform worse; it only makes some other cards perform better.
 
10.1 will work fine on NV's 10.0 hardware, but on 10.1 hardware it'll run faster and/or look better.
Any screenshots showing improved AA? [H]'s review stated that, other than the broken AF on AMD's cards, they noticed only a lighting/contrast difference between Nvidia's and AMD's IQ.
 
VIA sucks anyways so why should anyone care.


Every vendor sucks at one point or another. I forgave ATI for the ATI Rage Pro being such a piece of shit, everyone seems to have forgiven Nvidia for the GeForce FX, and Intel for the Pentium 4 (which is a terrible pain in laptops).

My A7V with the KT133 sucked, but that may have been partly because of Asus; my KT333 mobo with the USB2 southbridge was really fine and didn't have the nForce 2 bug. Nvidia chipsets are said to be crap nowadays (and the Ethernet stuff is broken on my nForce 5 and my buddy's nForce 4 mobo).

I'll miss the option of a VIA chipset, or even ULi, who offered AGP+PCIe mobos with a surprisingly solid reputation.
 
VIA had so many issues it ain't funny: reverse-engineered AGP that didn't work right until after they agreed to pay a license to Intel; 1, count them, 1 set of drivers MS will support for XP; and a southbridge that caused havoc with Sound Blaster cards.

When it comes to Nvidia, they need to get their head on straight about drivers and supporting software for their chipsets, as the hardware itself is, for the most part, OK if you wait 6 months. I've never had issues with my nForce-based NICs (NF1-NF6, I have owned them); then again, I don't install Nvidia's IDE/SATA drivers (I use Windows drivers) or their NIC support software either, just the NIC driver, and it works fine.

People should probably stop installing the entire NF driver/software package when they build machines, and I'm willing to bet almost all issues would go away.
 
Try telling that to some of the poor sods using Nforce 2/3 chipsets with Vista.

I was helping 2 friends migrate to Vista, but they were having random BSODs, something I honestly hadn't seen on my own machines and only very rarely on others. After much ado testing components in another machine, it finally came down to the motherboards causing it. Joy.

Yet here I've helped people with older ULi, ATI, Intel, SiS, and even one VIA machine who have had flawless transitions. Well, flawless once you convince them not to use their Creative Live/Audigy cards. :p

As far as I'm concerned, Nvidia chipsets are the new VIA chipsets with regard to quality. Which is strange, since their workstation boards with their chipsets are pretty darn solid.

Regards,
SB
 
When it comes to Nvidia, they need to get their head on straight about drivers and supporting software for their chipsets, as the hardware itself is, for the most part, OK if you wait 6 months. I've never had issues with my nForce-based NICs (NF1-NF6, I have owned them); then again, I don't install Nvidia's IDE/SATA drivers (I use Windows drivers) or their NIC support software either, just the NIC driver, and it works fine.

People should probably stop installing the entire NF driver/software package when they build machines, and I'm willing to bet almost all issues would go away.

You can go by your own sample of one all you want but forums agree nVidia chipset issues abound. After my nForce4 fiascos it will take years for nVidia to earn my trust in their chipsets again. So far from what I've seen, they haven't even started yet.

edit: wow, how do I delete this post? After reading this dumb VIA tangent I forgot what thread I was in.

uuuuh...hope the GT200 is fast!
 
Any source? I think such a big difference is unbelievable, but IF true it could be very nice :)

i am sorry but you are dreaming

the Old GTX is just at a disadvantage compared to the new one

expect twice as fast - with the usual marketing spin

i guess i can finally post here .. same day i left ATF
 
Every vendor sucks at one point or another. I forgave ATI for the ATI Rage Pro being such a piece of shit, everyone seems to have forgiven Nvidia for the GeForce FX, and Intel for the Pentium 4 (which is a terrible pain in laptops).

My A7V with the KT133 sucked, but that may have been partly because of Asus; my KT333 mobo with the USB2 southbridge was really fine and didn't have the nForce 2 bug. Nvidia chipsets are said to be crap nowadays (and the Ethernet stuff is broken on my nForce 5 and my buddy's nForce 4 mobo).

I'll miss the option of a VIA chipset, or even ULi, who offered AGP+PCIe mobos with a surprisingly solid reputation.

Hold on, the Rage Pro being a PoS? .. you aren't talking about the Rage Fury 32, are you?

That was my first ATi GPU, and i think i paid about $200-250 for it. i liked it a lot, especially the IQ; i remember playing MDK2 and noticing how much more vibrant the colors were than on my GeForce32-SDR [SGRAM, i think], which was admittedly much faster.

AMD needs to work with the devs like nVidia does with the TWIMTBP programme in the first place. Then they can make the rules like nVidia does now.

they do .. it's called "get in the game", i think.
 
apoppin said:
they do .. it's called "get in the game", i think.
I know what it's called, but since it practically doesn't exist... say, when did you last see a game bearing the GITG logo? Or let me put it another way: a game that did not have the TWIMTBP logo?
 
Looks like Nvidia is talking about something in two weeks - http://www.theinquirer.net/gb/inquirer/news/2008/05/02/nvidia-throws-party

Hopefully Charlie gets the scoop before then.

You are cordially invited to attend the NVIDIA Editor's Day May 2008 event, which will be held at the NVIDIA Santa Clara campus from May 13th to May 14th, 2008.

We will be holding in-depth technical discussions on our new consumer GPUs and other NVIDIA technologies that will help shape the future of visual computing.
 
I know what it's called, but since it practically doesn't exist... say, when did you last see a game bearing the GITG logo? Or let me put it another way: a game that did not have the TWIMTBP logo?

Of course .. and do you realize why?

ATi went broke .. they had to go to ailing AMD to join forces because ATi wanted to help AMD with their CPU-GPU vision of Fusion
- they over-engineered their x1900 series, and although it was successful, it was also too expensive for them

AMD is too broke to pay attention; it is all about *priorities* .. nVidia is cash-rich and can buy anyone they want with really attractive and helpful programs for the devs.

Good gawd, nVidia just released that incredible tool to help devs "see it our way" in creating games; some of the most helpful info i have seen anywhere.

Yet, nVidia is also overbearing and irritates the people they work with.

AMD - if it survives - will also get back "into the game"; they are just struggling to catch up [imo]
 
ATi went broke .. they had to go to ailing AMD to join forces
I don't think so; ATi was quite healthy at the time of the buyout. The problem with R580 was marketing: G71 was an inferior chip, but nVidia backed it up with solid marketing lies. ATi, although they could back up their chip with true numbers, didn't do a thing.

It seems to me there's too little buzz about GT200 for it to be released so soon. But perhaps nVidia has properly sealed every hole that could leak information...
 
I don't think so; ATi was quite healthy at the time of the buyout. The problem with R580 was marketing: G71 was an inferior chip, but nVidia backed it up with solid marketing lies. ATi, although they could back up their chip with true numbers, didn't do a thing.

It seems to me there's too little buzz about GT200 for it to be released so soon. But perhaps nVidia has properly sealed every hole that could leak information...

You just agreed with me!

ATi had a very expensive, "over-engineered" GPU in the x1900 series, while G71 was "good enough" and it made nVidia tons of cash.

At ATF video i predicted G80 .. if i could do it, i know ATi saw the writing on the wall and went right away to AMD to start talks - that is "history"; we know ATi went to AMD to help them with Fusion. And the 2900xt was delayed because it was not up to AMD's own standards when they decided to change their strategy and go after the GTS midrange instead of the GTX. imo, ATi would have brought out a "dustbuster" if they had been in charge of it before AMD stopped them.

I also predicted 6 weeks ago on their forum that "GT 200 was Ready and simply WAITING for r700's launch" - which i also said would be "by june"; it is all there if you want me to link it for you

i really try to watch the trends, and my knowledge of nvidia/amd/ati/intel past histories makes it relatively easy to predict the near future, which i love to do; i scour the 'net for real tidbits that proved reliable previously ... for fun
 
ATi had a very expensive, "over-engineered" GPU in the x1900 series, while G71 was "good enough" and it made nVidia tons of cash.
Well if you look only at nVidia's profits, then yes. It's kinda sad that people let themselves be ripped off like that.
we know ATi went to AMD to help them with Fusion.
We do?
And the 2900xt was delayed because it was not up to AMD's own standards when they decided to change their strategy and go after the GTS midrange instead of the GTX. imo, ATi would have brought out a "dustbuster" if they had been in charge of it before AMD stopped them.
I believe that is not the case. I would be happy to discuss it with you, however I don't think this is the right thread...
I also predicted 6 weeks ago on their forum that "GT 200 was Ready and simply WAITING for r700's launch" - which i also said would be "by june"; it is all there if you want me to link it for you
I believe you, but I don't agree: if they could already manufacture the cards, they would be selling them at insanely high price points because of the lack of competition.
i really try to watch the trends...
Me too (and unlike you, I'm being paid for it :) )
 
Hold on the Rage Pro being a PoS? .. you aren't talking about the Rage Fury 32 are you?
No, he's talking about the Rage Pro. It was ATI's attempt at finally matching the Voodoo1 while also having 2D. The Rage Pro was a bug-ridden mess because of ATI's horribly awful drivers of the time. They even came out with "Turbo" drivers back then that actually slowed games down but sped up benchmarks.


And I also agree that NVIDIA chipsets haven't been the best. I have had better luck with VIA's KT266A and later than with NVIDIA's stuff (NF2 and 4). The nForce chipsets are OK as long as you figure out which features not to enable and which drivers not to install.
 