NVIDIA shows signs ... [2008 - 2017]

Nicely done. Let's all play straight console ports from now on.

As consumers our foremost option is to vote with our wallets. It is entirely reasonable to NOT support a company that actively damages the purchased product for part of the customer base. The particular company in question may still come out ahead - after all, they get money from nVidia for mistreating their AMD customers and it is not at all a given how it balances out economically.
But why should consumers financially support a company whose policies they don't agree with?
 
If you look all that up, it is called research. I did it. I then wrote it up, that is called reporting.

No Charlie, what you do isn't reporting. Your unhealthy obsession with Nvidia and the fantasies that result from it are entertaining indeed but far from research. See, as usual in your enthusiasm to diss Nvidia you missed the point. The poster I responded to was using Nvidia's financial results as an indication of GT200 yields and margins. Hence, I asked him to explain why AMD is also losing money with their oh so great yields that you love to "report" on. And that's notwithstanding the vast sums of money Nvidia spends on extra-curricular activities. Yet, Nvidia's non-GAAP gross margin was 37% last quarter (excluding the Bumpgate charge). Curious, no?

So, rather than running around whining, why don't you actually go and do the research like I did? Then you will have your answer, to whatever degree you feel is necessary to document the problem or as you posit, lack thereof.

What problem is that exactly? That Nvidia's execution is poor as of late? What is this problem that I claim does not exist? As for whining, well that's exceptionally ironic coming from you....

Your homework Trini is to figure out which is which. Then you can consider yourself edumacated, and have someone put a gold star sticker on your forehead. This is not a path off the short bus however, that will take a little more time.

Now, now don't get all testy. Framebuffer is the one who brought up credentials. But since we're on the topic I'll assume from your response that you don't have any :) Besides, you don't need a degree to smell what you're shovelling.

The next time you don't believe what I write, go do the research, you will look a lot smarter.

Fortunately for me then that I don't have to "look" smart on the internet. I don't have a blog, semi-accurate or otherwise ;)

By the way, I'm still waiting to hear what you think of Fermi from an architectural standpoint. The proclamations of doom are getting really stale.
 
The volume costs I was talking about when I gave that price aren't what you are saying, nor was I talking about yields vs. cost per wafer.

Charlie, the stuff you cut out and that link: those guys do it for a living. If you want, I suggest you buy the yearly reports and then look at your numbers; if you can't afford to do so, contact them and see if they would be gracious enough to give you last year's info, or the year before that. Once you learn how to communicate and understand where I'm coming from (if you've read anything in your life about the semiconductor industry and implicit and explicit cost models and evaluations), then I'll chat and I'll be all ears. At this point it's pretty obvious you have no clue what you are talking about ;)
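If you want the per-wafer versus per-good-die distinction spelled out, here is the textbook arithmetic with made-up numbers; nobody's real wafer prices, die sizes or defect densities are public, so treat every figure below as illustrative only:

```cpp
// Illustrative only: textbook cost-per-good-die arithmetic with invented numbers.
// It shows why "cost per wafer" and "cost per good die" are different conversations.
#include <cmath>
#include <cstdio>

int main() {
    // All inputs are hypothetical round numbers, not TSMC/NVIDIA/AMD figures.
    const double wafer_cost_usd = 5000.0;   // price of one processed 300 mm wafer
    const double die_area_mm2   = 470.0;    // a large GPU die
    const double wafer_area_mm2 = 3.14159 * 150.0 * 150.0; // 300 mm wafer
    const double defect_density = 0.4;      // defects per cm^2, assumed

    // Crude gross-die estimate (ignores edge loss and scribe lines).
    const double dies_per_wafer = wafer_area_mm2 / die_area_mm2;

    // Simple Poisson yield model: Y = exp(-A * D0), with A in cm^2.
    const double yield = std::exp(-(die_area_mm2 / 100.0) * defect_density);

    const double good_dies         = dies_per_wafer * yield;
    const double cost_per_good_die = wafer_cost_usd / good_dies;

    std::printf("gross dies/wafer : %.1f\n", dies_per_wafer);
    std::printf("yield            : %.1f %%\n", yield * 100.0);
    std::printf("cost per good die: $%.2f\n", cost_per_good_die);
    // Buying "per wafer" means the buyer eats the yield risk; buying
    // "per good die" shifts that risk to the foundry and changes the price.
    return 0;
}
```

The only point of the sketch is that the same wafer price gives very different per-die costs depending on the yield assumption and on who carries the yield risk, which is why the two purchasing models aren't interchangeable.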

Why bother when I can call the companies that are in question and ask for the info directly? I don't need a layer that is part obfuscation, part PR. I know those kinds of reports are good if you can't go direct. I can.

I also do get the science behind it. And have other sources.
http://www.theinquirer.net/inquirer...le-macbook-pros-have-nvidia-bad-bump-material
Nvidia is still frothing over this, trying to blame ATI at every opportunity, and telling everyone I don't get the science. Then again, they did say there was no truth to the chipset cancellation story I put up a year ago too.

-Charlie
 
If you want me to be specific: nV's and AMD's wafer purchasing isn't the same. I know that nV's purchasing model isn't based on buying wafers, which, if you didn't get the hint in my second post to you, well......

I'm going to be blunt: there is no truth to your article on profits per chip/card.

BTW, I don't know why you pointed to your bumpgate article (self-promotion ;)).

As for the chipset article: since you wrote it, it's been a year, and the announcement is that they are still in legal proceedings. They aren't jumping off the boat; Intel and nV haven't come to an agreement, which is quite different from what your article stated.
 
Are you saying that Nvidia is NOT getting out of the chipset market?
 
I'm not sure; I never made that comment, did I? Those legal proceedings have been going on since Feb of this year, so there is no way Charlie could have made that guess, since his article was in August of '08, unless he has leverage over Intel's lawyers. And how many lawyers do you know that have a sling in their mouth? This wouldn't have happened. Furthermore, his article was based on conjecture that it was because of bad chipsets. He put 2 and 3 and 4 together to make a story. After AMD's buyout of ATi, and their opening up of Xfire to other platforms, that was a good conjecture, I won't disagree with that, but again, it's a good guess.

http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-dead
 
If anyone is to blame, it's Nvidia for trying to fracture the market.
Oh yes. Let's all sit and do nothing, and when someone's done something that you haven't, let's say that it's bad. Just like AMD does right now.
NV's giving people reasons to buy games on PC and not on consoles. Of course they do it so that people will go and buy their h/w. I don't see what's wrong with that at all.

Edit: By the way, are you part of the program?
What program?

As consumers our foremost option is to vote with our wallets. It is entirely reasonable to NOT support a company that actively damages the purchased product for part of the customer base. The particular company in question may still come out ahead - after all, they get money from nVidia for mistreating their AMD customers and it is not at all a given how it balances out economically.
But why should consumers financially support a company whose policies they don't agree with?
You and many others have got this whole thing completely backwards. NV is enriching the product with additional content, be it PhysX or AA in UE3 under D3D9. Why would NV do this if not to get more h/w sales? What would you get if NV didn't do this at all? What has AMD done to make games better looking lately -- and no, I'm not talking about some marketing bullshit like supporting open physics standards (yeah, "supporting" them with presentations, I guess, because as it turned out it's being developed on NV's hardware)?
OK, AMD has Dirt 2 DX11 now (which got pushed back nearly 4 months on PC because of how fast AMD's devrel works; hooray!!..). But if Battleforge is any indication, they'll probably do it in such a way that only DX11 GPUs benefit from the added quality, even though most of what gets added will probably be possible to implement on DX11 feature level 10 h/w (which is mostly NV's current h/w). I wonder what the general reaction to such an event will be. Battleforge ported to DX11 while ignoring all the DX11 stuff for DX10 h/w went largely unnoticed somehow, but NV's AA implementation in BAA turned into a shitstorm of stupidity.
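To be concrete about the "feature level 10" point: D3D11 lets a single code path create a device that falls back to DX10-class hardware, roughly like the sketch below (a rough illustration, not any particular game's code):

```cpp
// Minimal illustration of D3D11 feature levels: the same device-creation call
// can fall back to 10_1/10_0 hardware, which is the basis of the
// "most of it could run on DX10-class GPUs" argument above.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool CreateDeviceWithFallback(ID3D11Device** device,
                              ID3D11DeviceContext** context,
                              D3D_FEATURE_LEVEL* chosenLevel) {
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,   // "real" DX11 GPUs (tessellation, CS 5.0, ...)
        D3D_FEATURE_LEVEL_10_1,   // DX10.1-class hardware
        D3D_FEATURE_LEVEL_10_0,   // DX10-class hardware
    };

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                    // no software rasterizer module
        0,                          // no creation flags
        levels,
        ARRAYSIZE(levels),
        D3D11_SDK_VERSION,
        device,
        chosenLevel,                // reports which level was actually granted
        context);

    // Features beyond the granted level (e.g. hardware tessellation on 10_0)
    // have to be skipped or emulated by the application.
    return SUCCEEDED(hr);
}
```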
And on that "got money from NVIDIA" thing I keep hearing. Care to provide any basis for that allegation?
 
Oh yes. Let's all sit and do nothing and when someone's done something that you don't have then let's say that this is bad.

The only acceptable courses of action are to do nothing (like AMD) or to do something, spend all the time and money but not gain any advantage from it. Anything else is unfair.

What program?

Presumably Nvidia's "focus group". In which case I guess that means everybody who disagrees with you is part of AMD's.....

Battleforge ported to DX11 while ignoring all the DX11 stuff for DX10 h/w went largely unnoticed somehow, but NV's AA implementation in BAA turned into a shitstorm of stupidity.

Fudo was right on the money with this one. It's happening all the time but it's starting to become a bigger deal as Nvidia gets into higher profile titles. Mirror's Edge, Assassin's Creed and now Batman. Nobody raised a stink over Cryostasis did they?
 
Are you saying that Nvidia is NOT getting out of the chipset market?

For those that do not enjoy critical thinking let me lay this out in a simple manner.

Lawyers are not free.

Nvidia is a company, they desire to make money.

Nvidia is paying lawyers boatloads of money to argue with Intel about whether they can make a chipset.

Nvidia would not be giving away boatloads of money if they did not at least desire to build a chipset.

That is rather clear.

It doesn't mean they will build any more. They may quit b/c they cannot come to an agreement where they believe they can make money on them anymore.
 
You and many others got this whole thing completely backwards. NV's enriching the product...
I was specifically talking about the offending company disabling AA for AMD products, as evidenced by the feature being completely functional if the application was fooled into believing that an nV card was present. It seems reasonable to assume that this course of action was at the request (demand?) of nVidia. Which company one chooses to object to in this case is a matter of personal preference; either or both would seem to be reasonable options. (For me, I see nVidia's position as understandable if ugly, but the position of the developer... not behaviour I want to support financially, no.)
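Purely as an illustration of the kind of vendor check being described (this is not the game's actual code, and the function name is hypothetical; it just uses the standard D3D9 adapter query):

```cpp
// Hypothetical sketch of a vendor-ID gate, NOT Batman: AA's real code.
// It only shows how an application can tell which GPU vendor is present,
// which is the mechanism the "fooled into believing an nV card was present"
// observation points at: spoofing the ID makes the check pass.
#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

constexpr DWORD kVendorNvidia = 0x10DE;  // PCI vendor ID: NVIDIA (ATI/AMD is 0x1002)

bool InGameMsaaAllowed() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DADAPTER_IDENTIFIER9 id = {};
    HRESULT hr = d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->Release();
    if (FAILED(hr)) return false;

    // A policy gate like this keys a feature off the vendor ID rather than
    // off any actual capability query -- which is exactly what is being
    // objected to in the posts above.
    return id.VendorId == kVendorNvidia;
}
```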

Again, some would argue that it is the DUTY of the consumer to vote with his wallet in a market economy to help ensure that the system works. For the life of me, I can't see why I should support this particular company. It's not as if I'm missing out on something terribly important by not playing their particular games.
 
It seems reasonable to assume that this course of action was at the request (demand?) of nVidia.

So you don't buy the story that Nvidia was responsible for the implementation?

Again, some would argue that it is the DUTY of the consumer to vote with his wallet in a market economy to help ensure that the system works. For the life of me, I can't see why I should support this particular company. It's not as if I'm missing out on something terribly important by not playing their particular games.

By all means, it's your right to deprive yourself (and your friends too aye SB :)) of a game that you want to play.
 
Presumably Nvidia's "focus group". In which case I guess that means everybody who disagrees with you is part of AMD's.....

It's still only 4 people for Nvidia. It's shrunk. Then expanded. Then shrunk again. It's always been about 4 people. :p
 
I was specifically talking about the offending company disabling AA for AMD products, as evidenced by the feature being completely functional if the application was fooled into believing that an nV card was present.

Does the feature work on products which are neither AMD nor NVIDIA?

It seems reasonable to assume that this course of action was at the request (demand?) of nVidia.
It is also reasonable to assume that the Sun orbits the Earth, if you choose to focus solely on the movement of the Sun in the sky and ignore other possible interpretations.
 