NVIDIA shows signs ... [2008 - 2017]

Have all 4 been active as heck in this thread? :|

;)

Nope. I'm the only person in the focus group, that I'm aware of, who even posts at Beyond3D. Rollo barely posts anymore... ( :) )

Amorphous helps primarily on the Nvidia forums (he is to nZone what I am to SLIZone, though we'll soon be working together since we're merging). And Anthony (keysplayer) posts at Anandtech.


What? Since when?
 
I was not referring to the focus group, but to the "The Way It's Meant To Be Played" program.
Oh, that free programming service that nVidia provides, which you claim doesn't give any economic benefit to the developer or any incentive to cause them favoritism?

One catch of the The Way It's Meant To Be Played marketing and development program is that Nvidia actually does end up spending money in ways that publishers and developers benefit from.

Most of the money never exchanges hands, but Nvidia offers thousands of programmer hours as the company codes advanced effects, tries to make things faster, gets PhysX inside games, and does this mostly for free. Since labour is also something you need to pay for, Nvidia ends up paying millions in salaries and other costs that come out of the TWIMTBP program.

EDITED BITS: Thanks Chris, I was just joking on that. I actually don't recognize any of those other focus group members' names, so I doubt they've been posting at any of the forums I have been... but still, appreciate the info. :)
 
Nope. I'm the only person in the focus group, that I'm aware of, who even posts at Beyond3D. Rollo barely posts anymore... ( :) )

Hmmm, yeah haven't seen him in a while. But what's the point of being a focus group member if you're not interacting with the community? And digi, you should try looking in the mirror before making those "jokes" :LOL:
 
Hmmm, yeah haven't seen him in a while. But what's the point of being a focus group member if you're not interacting with the community? And digi, you should try looking in the mirror before making those "jokes" :LOL:

Well, I can say there's a whole lot more to it than "interacting" with the community. A lot of it is observing the community, helping Nvidia recognise trends, and providing feedback/thoughts regarding your observations, as well as driver/hardware beta testing and delivering driver feedback directly to Nvidia.

But I see your point. I don't know what he is doing right now.
 
"the way it's meant to be played" program
Me, as in me personally, just one person? :)

The only acceptable courses of action are to do nothing (like AMD) or to do something, spend all the time and money but not gain any advantage from it. Anything else is unfair.
It starts to look like this. I wonder what all these people would have said back then, when there were TNT2 and Voodoo3 and NV was pushing the industry toward higher-resolution, 32-bit textures and frame buffers while 3dfx was saying that 256x256x16-bit textures and "22-bit" color buffers were "enough". Or when there was 3dfx with Glide and nothing else at all -- I guess 3dfx had no right to do Glide and start this whole damn thing we're now talking so much about.

Fudo was right on the money with this one. It's happening all the time, but it's starting to become a bigger deal as Nvidia gets into higher-profile titles: Mirror's Edge, Assassin's Creed, and now Batman. Nobody raised a stink over Cryostasis, did they?
Well the particular case with BAA is that NV implemented something that can (supposedly) be easily implemented on AMD cards too. The question is who's right here: NV, AMD or Eidos.
NV made something by themselves for a PC version of a console game. I see no reason why they should implement that for any other vendor's h/w. I mean, are we supposed to demand that AMD wait until GF100 comes out to test their DX11 renderers in Battleforge, Dirt 2, Stalker, AvP? Sounds like a load of crap, doesn't it? It's the same as demanding that NV test their BAA AA implementation on AMD's h/w.
Eidos probably wouldn't have bothered porting BAA to PC at all if it didn't receive technical help from NV's devrel. OK, that might be a bit extreme, but I do think that NV providing (free!) technical help for console ports means a lot in today's PC gaming market conditions. So that basically means that Eidos didn't give a sh. They'd have been perfectly happy with X360 and PS3 versions of BAA only. It's not a good position from which to demand something specifically for the PC version from them.
That leaves us with AMD, who said that they provided Eidos with a solution for enabling AA in BAA on Radeons, but Eidos said that no one actually came and built that solution into BAA's code. Considering that Eidos couldn't care less about the PC version, they certainly decided not to do it themselves -- especially since NV had done it for them for free. So where was AMD's devrel when it was supposed to be working with Eidos on this? It looks like there was some exchange of emails, with some code probably, but that clearly wasn't enough for Eidos since they didn't want to spend a penny on the PC version of BAA. So whose problem is it in the end that BAA PC was released without MSAA support on Radeons? NV's? Eidos's? Or maybe AMD had something to do to get that MSAA into the engine?

I was specifically talking about the offending company disabling AA for AMD products, as evidenced by the feature being completely functional if the application was fooled into believing that an nV card was present. It seems reasonable to assume that this course of action was at the request (demand?) of nVidia.
To me it seems reasonable to assume that that course of action was a result of Eidos supporting the feature only on tested hardware. Since AMD couldn't be bothered to work with Eidos on implementing and testing that feature on Radeons, Eidos simply disabled it for Radeons.
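
For context on the "fooled into believing that an nV card was present" part: vendor gating like this is usually nothing more exotic than reading the adapter's PCI vendor ID. A hypothetical sketch of such a gate -- illustrative only, not Eidos's actual code; the IDs are the standard PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for AMD/ATI):

```cpp
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Hypothetical sketch of vendor-based feature gating (not Eidos's code).
// 0x10DE = NVIDIA, 0x1002 = AMD/ATI (standard PCI vendor IDs).
bool IsNvidiaAdapter()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;

    D3DADAPTER_IDENTIFIER9 id = {};
    bool isNvidia = false;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        isNvidia = (id.VendorId == 0x10DE);

    d3d->Release();
    return isNvidia;
}

// The AA path exists and works either way; it is simply not exposed
// unless an NV adapter is detected.
bool AllowInGameMSAA()
{
    return IsNvidiaAdapter();
}
```

Which is also why faking the reported vendor ID is enough to light the feature up on other hardware.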

Which company one chooses to object to in this case is a matter of personal preference; either or both would seem to be reasonable options. (For me, I see nVidia's position as understandable if ugly, but the position of the developer... not behaviour I want to support financially, no.)
See above.

Again, some would argue that it is the DUTY of the consumer to vote with his wallet in a market economy to help ensure that the system works. For the life of me, I can't see why I should support this particular company. It's not as if I'm missing out on something terribly important by not playing their particular games.
That company gave you a better-looking (at least on the h/w of a vendor which wasn't too lazy to work with that company) PC version of a game which might have ended up being console-exclusive. And now you're punishing that company for this. Would you like to take a guess at what that company will most probably do after such a consumer reaction to them releasing better-looking PC versions of their console games?
I think that next time you'll get a plain console port at best and no PC version at all at worst.
That's what AMD's doing right now with its bashing of PhysX and NV's devrel work. They are destroying the only market where they have some kind of a foothold right now. I find that amazingly stupid.
 
Just spoke to Nvidia. Apoppin is not a member of the focus group. I'm not gonna give out his full name or affiliation (he writes for a website). If he wants to clarify further, he can.
 
Just spoke to Nvidia. Apoppin is not a member of the focus group. I'm not gonna give out his full name or affiliation (he writes for a website). If he wants to clarify further, he can.

Yeah, he already told us he's at AlienbabelTech. Don't know why SH thought he was in the focus group.
 
Is there some reason you're avoiding an answer?
Is there a reason for asking someone that question? How can a person be a part of a program for publishers and development studios? How can a person be a part of a program which is about games, not people? Do you think that I'm some kind of title from Eidos? 8)
 
That's true.
I'm not sure if PhysX licensing dictates anything about its branding being on the box. I can't really remember if any PhysX game has any PhysX stuff on the box.

Linking to BSOBS as a credible site while at the same time disparaging any other author or side is humorous at best.

Secondly, there actually is no reliable market share data about the success of the various physics packages at the moment. From a purely personal perspective, while PhysX is in more games, Havok consistently sells more copies. That probably has something to do with PhysX basically being free while Havok actually has support.
 
Mods, could it be time for a little cleanup? Or maybe a cooldown period on this topic in general.
 
Stop doing that, people!

This thread is a train wreck as far as quality discussion and minimal civility go. Not that I'm surprised given the nature of the thread, but still, it's hitting all the lows it could short of involving long-winded posts about NV30.

Now, should I really have to remind people that it is not considered apropos to directly question whether a member is receiving incentives from one of the IHVs to favor positive hype of their products?

Let's not even mention the fact that ad hominem attacks and useless finger-pointing exercises are never welcome. The latter are especially unneeded when they're made by people who weren't even part of the argument.

It goes without saying that banstick-wielding expert personnel will be dispatched to any sector of this forum if these basic rules aren't respected.


TL;DR. LOL!: Don't call people shills, and don't go making a post with nothing but you calling people's posts BS if you've got nothing to back you up save for another member's post/article. Let them deal with their stuff between themselves; no need for hype-men, this is not a rap battle.

...Hmm, that stupid analogy got me thinking. I wonder who would win a rap battle between Charlie and Razor1. Not our ears, I'm afraid.
 
The posts about Batman: Arkham Asylum and AA were moved to a more fitting thread:

http://forum.beyond3d.com/showthread.php?t=54786

No, seriously folks, if you guys want an "I HATE NVIDIA *cries little man-tears*" thread, start one. Obviously, make the thread title something more reasonable, such as "What I dislike about NVIDIA's practices" or something along those lines. That way, you'd set the negative tone of the thread from the get-go and leave finance-related threads free of talk about dubious developer support in exchange for unfair advantages.

A lot of the criticism displayed here is justified; as you can see from the B:AA thread I linked, I'm not in disagreement at all with some of it. It's just that we need threads to focus on the topics they cover. Or else the thread turns into an unreadable mess of concurrent discussions <insert SMP joke here>.
 
If you want me to be specific, nV's and AMD's wafer purchasing isn't the same. I know that nV's purchasing model isn't based on buying wafers, which, if you didn't get the hint in my second post to you, well......

I'm going to be blatent there is no truth to your article on profit's per chip/card.

BTW I don't know why you pointed to your bumpgate article (self-promotion ;))

The chipset article: since you wrote that article, it's been a year, and the announcement is that they are still in legal proceedings. They aren't jumping off the boat; Intel and nV haven't gotten to an agreement -- quite different from what your article stated.

The wafers are probably sold at a different price, mainly due to processing steps required, not volume. That said, it isn't enough to change the price substantially, and not enough to move the price of a good die by much more than rounding error. What's your point?
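
To put rough numbers on "rounding error" (illustrative figures of my own, not numbers from either article): cost per good die is just wafer cost divided by candidate dies times yield, so a modest difference in wafer price moves the per-die cost by single-digit dollars.

```cpp
#include <cstdio>
#include <initializer_list>

// Illustrative only -- assumed numbers, not figures from any article.
// Cost per good die = wafer cost / (gross dies per wafer * yield).
int main()
{
    const double grossDies = 100.0; // assumed candidate dies per 300mm wafer
    const double yield     = 0.40;  // assumed fraction of dies that are good

    // Baseline wafer price vs. one ~5% higher (e.g. extra processing steps).
    for (double waferCost : {5000.0, 5250.0})
    {
        double perGoodDie = waferCost / (grossDies * yield);
        std::printf("wafer $%.0f -> $%.2f per good die\n",
                    waferCost, perGoodDie);
    }
    return 0;
}
```

That works out to $125.00 vs. $131.25 per good die; against the price of a high-end card, a delta like that is noise.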

I know SOMETIMES NV's model isn't based on wafer cost, which is why I PRINTED THAT AND THE YIELD BREAKPOINT here.
http://www.semiaccurate.com/2009/08/18/nvidia-takes-huge-risk/
Then again, you can't troll as well if you read the articles. Facts have this nasty way of scattering the roaches in forums.

To sum up, you agree with most of my points and most of my numbers, but jump up and down saying I am wrong. Then you refuse to provide any numbers of your own. Wow, wonderful argument there!

To quote you from the above, "I'm going to be blatent [SIC] there is no truth to your article on profit's per chip/card." OK, there are two ways to solve this issue, one involves numbers, the other does not. They both involve the word 'up'. I will repeat my earlier statement, put up or shut up. You have only posted one vague number that agreed with me.

As for the bumpgate article, I was trying to point out to you that I do actually get the science behind this. You posted a long list of things that affect wafer cost instead of answering the questions posed, so I pointed out that I do in fact understand those. The link was there to point you to an example. I am starting to think you don't read very much.

To belabor that point, you seem to be one of those people that wait for PR announcements before believing something. I keep forgetting that you don't actually go out and talk to the people doing the work and making the products. My bad.

Let me explain it a little more slowly. The article was written in August 2008. The suit was filed on Feb 18, 2009. I will assume you have read it fully before commenting on it like I have, otherwise you would look pretty stupid and be trolling.

To refresh your memory, in the filing on P8 para 18, it says, "In early 2007, Intel informed NVIDIA that it planned to introduce Nehalem architecture processors in 2008." It goes on in para 19 to say, "A series of discussions ensued between Intel and NVIDIA as to whether Disputed NVIDIA MCPs are licensed under the Agreements. It was, and is, Intel's position that the Disputed NVIDIA MCPs are not licensed under the CLA because they cannot provide an interface between an Intel processor and system memory."

There you have it: it started in early 2007, and was very public a year later. Maybe in your world disputes start after the court filing, but not in mine. In my world, court filings only happen after a dispute cannot be solved through negotiation or other means.

In this case, it was very clear, and very public over a year before you noticed. It was one of the points, along with Nvidia canceling the development of future products and reassigning engineers, that clued me in. Actually, you might say that the licensing issue was minor in comparison to them ending the programs.

Then again, you seem to be the type that listens to EVERY word that PR says, and takes it as gospel, so keep on believing that they JUST made that decision. And they are, giggle, using those, heh heh, resources to.... heh... no.... pain in side from laughter.... to.... to.... work on the SLI licensing program for Intel chipsets. Bwahahahaha! I wish I had that link handy, it will give you another talking point to explain how this could have happened. Bwahahahaha!

-Charlie
 
lol, now we are talking about risk wafers? BTW I wasn't talking about Fermi or risk wafers.

Oh, and that article has a lot of holes too. Hot lots aren't strictly used for yield; not to mention that, according to that article, if they did get a return of chips that low while doing a yield analysis, they wouldn't even have done it. BTW, do you know the equations used when doing hot lots for yield analysis? (You do realize that even if nV got X number of chips from the hot lots, as you stated, they would have to calculate risk using a formula.) And yes, I do have yield numbers for Fermi from about the time you wrote that article: it's more like 15% for fully functional chips, and it goes up to around double that with salvage parts. Keep pulling up your old articles to make yourself look good; it just doesn't work. It's easy to cut them down like swiss cheese, because the factual information you have is castrated by your fanatical ramblings, which make no sense.
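
For reference, one simple form of the kind of yield equation being alluded to is the Poisson model, Y = exp(-A * D0). A rough sketch of backing out the defect density implied by the 15% figure above -- the ~530 mm^2 die area is my assumption for a GF100-class chip, not a number from this thread:

```cpp
#include <cmath>
#include <cstdio>

// Poisson yield model: Y = exp(-A * D0), with A the die area in cm^2 and
// D0 the defect density in defects/cm^2. Given a quoted yield, back out
// the implied defect density. The die area is an assumption.
int main()
{
    const double dieAreaMM2 = 530.0;               // assumed GF100-class die
    const double dieAreaCM2 = dieAreaMM2 / 100.0;  // 5.3 cm^2
    const double yield      = 0.15;                // the 15% quoted above

    const double d0 = -std::log(yield) / dieAreaCM2; // D0 = -ln(Y) / A
    std::printf("%.0f%% yield on a %.0f mm^2 die implies D0 ~ %.2f defects/cm^2\n",
                yield * 100.0, dieAreaMM2, d0);
    return 0;
}
```

More elaborate models (Murphy, negative binomial) exist, but the shape of the calculation is the same.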

The bumpgate article -- I did give you credit for that in the past, but I don't believe you wrote it yourself. You would have needed a lot of help, and "a lot" is an understatement.

The license agreement between Intel and nV was done 5 years ago. Intel wanted to renegotiate the deal with nV, nV didn't want to, and that's when Intel filed for breach. TO REFRESH YOUR MEMORY, knowing that would happen before Feb of 2008 would have been impossible for anyone outside of the people involved -- or are you just omniscient, knowing it was going to happen 7 months before it actually happened? The negotiations didn't go sour till, oh, end of 2008 / early 2009. The negotiations were known publicly, but the course of action by either company wasn't, till Feb of 2009.
 
Linking to BSOBS as a credible site while at the same time disparaging any other author or side is humorous at best.
Have you actually read that link or is it again the same thing as with all of NV's bashers -- just assume a bunch of stuff and you're done?
Hint: Theo's providing a link to Develop magazine. I guess that's "humorous at best" too.
 
But if Battleforge is any indication, they'll probably do it in such a way that only DX11 GPUs will benefit from the added quality, while most of what will be added is probably going to be possible to implement on DX11 feature level 10 h/w (that's mostly NV's h/w today). I wonder what the general reaction to such an event will be. Battleforge being ported to DX11 while ignoring all the DX11 stuff for DX10 h/w went largely unnoticed somehow, but NV's AA implementation in BAA turned into a shitstorm of stupidity.

There are only two new DX11 features for DX10 hardware.
1. Multithread render context support. In BattleForge there was no use for this at all. The engine already has API-independent multithreaded render support.
2. Compute Shader. It was my personal decision to not use CS 4.x right now. There were two primary reasons for this:
a. The SDK sucks when it comes to CS 4.x. No documentation at all. As our AO shader goes right to the limits of CS 5.0, it is not funny at all to find all the CS 4.x limitations that block us from doing a CS 4.x version by playing trial and error with the shader compiler.
b. CS 4.x support was not enabled by default in the nvidia drivers at the time we had to decide what to do. This was a clear sign for us that the CS 4.x support isn't ready for prime time.
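
For reference, whether CS 4.x is actually exposed on DX10-class hardware can be queried at runtime through ID3D11Device::CheckFeatureSupport; a minimal sketch, assuming a D3D11 device already created at feature level 10.x:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Minimal sketch: ask the runtime whether Compute Shader 4.x (plus raw
// and structured buffers) is exposed on a feature level 10.x device.
bool SupportsCS4x(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                           &opts, sizeof(opts))))
        return false;
    return opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != FALSE;
}
```

If the driver hasn't enabled CS 4.x, this simply returns false -- which is exactly the situation described in (b).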
 
There are only two new DX11 features for DX10 hardware.
Yes, and there is only one beyond that for DX11 hardware -- tessellation (I know about the new BCs and other stuff, but those are smaller and less important features). And you're not using tessellation in BF DX11 at all. That's why I find it very strange that you've decided not to support FL10 h/w, since you're basically not using FL11 h/w features.

1. Multithread render context support. In BattleForge there was no use for this at all. The engine already has API-independent multithreaded render support.
How can it be API-independent? Are you sure that you're not mixing up two different things here -- engine multithreading and renderer multithreading? AFAIK the second depends on the API and isn't possible before DX11.
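
To make concrete what renderer multithreading at the API level means here: D3D11 introduced deferred contexts, which let worker threads record command lists that the immediate context later replays; nothing equivalent exists in D3D9/D3D10. A minimal sketch, assuming the device and immediate context already exist:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Minimal sketch of D3D11 deferred contexts: a worker thread records
// commands into a command list; the immediate context replays it.
void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... on a worker thread: issue state changes / draw calls on `deferred` ...

    ID3D11CommandList* commandList = nullptr;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commandList)))
    {
        // Back on the render thread: replay the recorded commands.
        immediate->ExecuteCommandList(commandList, FALSE);
        commandList->Release();
    }
    deferred->Release();
}
```

Engine-side job systems (what BattleForge already had) parallelize scene work but still funnel all API calls through one thread; that's the distinction being drawn.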

2. Compute Shader. It was my personal decision to not use CS 4.x right now. There were two primary reasons for this:
a. The SDK sucks when it comes to CS 4.x. No documentation at all. As our AO shader goes right to the limits of CS 5.0, it is not funny at all to find all the CS 4.x limitations that block us from doing a CS 4.x version by playing trial and error with the shader compiler.
b. CS 4.x support was not enabled by default in the nvidia drivers at the time we had to decide what to do. This was a clear sign for us that the CS 4.x support isn't ready for prime time.
Why are we talking about NVIDIA here? AMD has a big bunch of FL10(.1) h/w itself. If they care about their current customers, and not just about selling new cards to those customers, they should probably have pushed for CS4(.1) before CS5. I mean, isn't it clear that CS4 will be more widely used in the foreseeable future -- even after the whole EG line is on shelves and NV releases Fermi cards?
As for the SDK -- I'm sure that MS and NV would provide whatever documents you may have needed to implement CS4 support. It's not like this was a feature that couldn't wait another month or two anyway.
Are you planning on implementing CS4 support once Win7 launches, NV and AMD enable it in their drivers by default, and the CS4 SDK matures?
 