NVIDIA shows signs ... [2008 - 2017]

Ooooh, did I mention I like ATi? :|

So do I.
In fact, the Radeon 8500 and 9600XT I've owned make up the longest stretch of any single GPU brand I've ever run as my primary GPU.
But I don't think this is the right topic to be discussing ATi's products and our user experiences.
 
There we go again. Would you mind controlling your ATi fanboyism for once, and not go ranting every time someone 'forgets' to credit ATi in a topic that isn't about ATi in the first place? (Or the other way around, for that matter... like last time, when we were in a topic about Avivo and you started ranting about how I didn't criticise Badaboom in that thread?)
Now, if I had said that ATi *didn't* support them, perhaps that would give you the right to start a rant. But I didn't. I just didn't think it was necessary to mention ATi, since we were discussing nVidia here. Besides, I think most people here know that any DX10/DX10.1 parts will support it. Oh wait, that includes S3 as well (and even Intel's IGPs?). Did you miss that bit?
Please...

Speaking as someone who has a hate-on for AMD's graphics cards and will for probably quite a while(loooooooooooong story)... HE DID NOT SAY THAT AT ALL.

He was merely pointing out that the advantages you're giving the nVidia cards are also advantages for AMD cards.
 
It was faster than the GF3 Ti 500 with the "it's about damn time" drivers. :p
Fantastic card despite that.

Exactly. I bought it early on, because I wanted a shader card, and PS1.4 is what pulled me towards the 8500 (that, and the GF4 wasn't out yet, so it was also the fastest card).
Initially the drivers left quite a bit to be desired... E.g., I had written something with heavy cubemapping on my GeForce 2 GTS (the card I had before the 8500), and it ran faster on the GeForce 2 than on the 8500.

But indeed, once they fixed their drivers, the card was pretty damn good.
 
He was merely pointing out that the advantages you're giving the nVidia cards are also advantages for AMD cards.

Yes, but why does he have to point that out, especially in a topic that doesn't discuss ATi products, but rather nVidia's financial situation? I didn't consider ATi relevant to that post, as we were discussing nVidia's future products relating to Windows 7.
Besides, I never said otherwise. I think it's a pretty 'duh' remark. And the way he phrased it also leaves a lot to be desired.
And of course the obvious, as I already said: the advantage ALSO applies to brands other than just nVidia and ATi. Why did he not mention those? Exactly: because he only wants to talk about how good ATi is all day.
Well, he can do that in plenty of other topics. He can't expect everyone else to discuss ATi everywhere, no matter how inappropriate that might be, even if he feels that way.
 
lol, why do people even make those sorts of claims in this day and age when every word uttered is stored for eternity :D
 
lol, why do people even make those sorts of claims in this day and age when every word uttered is stored for eternity :D

Because that's him? That's his "I'm better than you" attitude? Anyway, he covered himself with "could be"s.
 
You have to give him some credit though... nVidia is important enough for Sony to buy their PS3 GPUs from.
In fact, the GPU is arguably the most important chip in the PS3 (although the Cell is no slouch either, of course).

And although I doubt they'll be bigger than Intel anytime soon, they do seem to have made Intel nervous enough about this GPGPU stuff to come up with Larrabee :)
 
I'm looking forward to 2012:
While the Sony quote was from Jen-Hsun, this one definitely wasn't. It's from Michael Hara, who is still their Investor Relations guy today BTW.

Also, I don't like to defend such quotes because they're obviously insane, but I think it's only fair to point out that the point was not so much that NVIDIA would be 100x larger than they were in 2002; it's that they would be 10x bigger while Intel would be much smaller, because Intel's CPU business wasn't sustainable in the long term while NV's business probably would be.

Essentially, what they were thinking about wasn't $600 GPUs with the CPU selling for $100. They were thinking of Ion (or even better, Ion2 with a VIA Nano), where the CPU would eventually sell for $10-15 and the MCP/GPU for $20-30. While still too aggressive and very much a crazy thing to say back in 2002, at least it wasn't just random wishful thinking.
 
Now unless ATi has DX11 hardware planned at (or before) the Win7 release, I see no reason why nVidia would have to have it.

Other than the fact that even without Dx11, ATI hardware will still benefit from the speedups that come with supporting Dx10.1. I'm sure Nvidia wouldn't want their cards to be at a large disadvantage when more and more Dx10.1 features are supported as Dx11 gains traction. Especially considering that in most cases the Dx10.0 path is approximately 20% slower on average than the Dx10.1 path (which is included in Dx11).

Regards,
SB
 
Come on SB, you really think there will be DX11 software ready to take advantage of DX10.1 anytime soon? Even today we only have a smattering of true DX10 titles.
 
Other than the fact that even without Dx11, ATI hardware will still benefit from the speedups that come with supporting Dx10.1. I'm sure Nvidia wouldn't want their cards to be at a large disadvantage when more and more Dx10.1 features are supported as Dx11 gains traction. Especially considering that in most cases the Dx10.0 path is approximately 20% slower on average than the Dx10.1 path (which is included in Dx11).

Regards,
SB

nVidia has gone without DX10.1 support for a while now, and hasn't lost any sleep over it.
I really don't think DX11/Win7 will have any effect on that. It's not like it makes the DX10.1 cards do MORE than they already would with DX10.1 under Vista (at least, not outside advantages like multithreaded rendering and compute shaders, which go for any DX10 hardware). Nothing changes.
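To put that in concrete terms, here's a quick sketch of how DX11 runs on such cards: the application hands the D3D11 runtime a list of feature levels, and a DX10.1 part simply comes back as feature level 10_1. (Just an illustrative sketch against the standard D3D11CreateDevice call; the helper name and the fallback order are mine, not something from this discussion.)

#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for 11_0 first, then fall back to the 10.x levels that DX10/DX10.1-class
// cards expose under the DX11 runtime.
bool CreateDeviceWithFallback(ID3D11Device** device, ID3D11DeviceContext** context)
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL obtained = D3D_FEATURE_LEVEL_10_0;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                      // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                      // no software rasterizer module
        0,                            // no creation flags
        requested,
        ARRAYSIZE(requested),
        D3D11_SDK_VERSION,
        device,
        &obtained,                    // which level the GPU actually supports
        context);

    // API-side features (deferred contexts / multithreaded command recording)
    // are available at any feature level; hardware features like CS 5.0 need
    // 11_0, while 10.x-class cards at best get the downlevel CS 4.x profiles,
    // which are an optional cap on that hardware.
    return SUCCEEDED(hr);
}

So apart from those API-side extras, a DX10.1 card under DX11 is doing exactly what it could already do under Vista.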
 
So are all the people who were attacking Charlie for listing other chips going to apologize? I doubt it.
 