The DevRel Duel (Again)

Where R600 will fail is in upcoming DX10 games with that "The Way It's Meant To Be Played" logo; that's one thing I'm willing to bet on. Crysis, World in Conflict and Quake Wars are all nVidia's friends.

It would have to be a real monster of a card to beat nVidia in these titles.
 
Where R600 will fail is in upcoming DX10 games with that "The Way It's Meant To Be Played" logo; that's one thing I'm willing to bet on. Crysis, World in Conflict and Quake Wars are all nVidia's friends.

It would have to be a real monster of a card to beat nVidia in these titles.
What you should be asking yourself is why ATI doesn't have a similar developer program.
 
I've been asking myself that question since R300. Sure they had that HL2 thing going on, but it wasn't anywhere near the nVidia - id Software deal.

I guess it costs a lot. :D
 
Cost shouldn't be a problem, though. If a strategy is effective, ATI should jump on it. I've been getting the feeling lately that ATI really has been somewhat detached from the market as a whole. From what I've heard, ATI is really a technology-driven company, and they haven't spent nearly as much time and money as they should have on reaching out to consumers and developers: finding out what developers want, helping developers optimize for their graphics cards, and, most importantly, finding out what consumers want.

I really don't see anything inherently wrong with nVidia helping game developers to optimize their games for nVidia's graphics cards. I would only see something wrong if nVidia was actively attempting to prevent developers from optimizing for ATI hardware. I can't say that isn't happening, but I see no reason to level accusations without evidence (and I'm sure other posters who know me will glean other motivations...but that's okay :) ).

I really don't see why ATI isn't putting at least as much effort into communicating with developers.

Now, I could be totally off my rocker as to how much each company is doing, as I'm really not an industry insider. But it certainly seems this way, what with things like TWIMTBP compared to the relative lack of GITG titles.
 
Where R600 will fail is in upcoming DX10 games with that "The Way It's Meant To Be Played" logo; that's one thing I'm willing to bet on. Crysis, World in Conflict and Quake Wars are all nVidia's friends.

It would have to be a real monster of a card to beat nVidia in these titles.

Just because it's stamped with the green machine's logo doesn't mean ATi automatically has a performance pitfall; many popular games have shown this. Some even have driver bugs for months on nVidia hardware (terrible bugs, such as what EQ2 experienced for months after launch, or the recent problems with TR: Legend) while ATi hardware runs pretty much without issue.

And contrary to popular belief, ATi does get into the devs' houses; it's just not in your face. Prey, for example, was backed by ATi, and I bet most people didn't even notice.

Crytek, by the way, has their hands in multiple pockets. If you'll remember, Far Cry, even while backed by nVidia, was one of the first games to make use of SM2.0b, and they also created that machinima project for ATI. AND that game remained one of ATi's strengths for benchmarking right up until the recent 8800 launch.

Brand stamping on games has meant relatively little, in terms of performance outcome, since its introduction.

And I don't know about the rest of you, but I'd prefer my hardware just work and not see its brand logo pop up while a game loads, rather than have it fail or falter and get the logo. It doesn't mean anything to me if it works right (which it should anyway, so I expect it), and if it doesn't, it's an embarrassment.

Lastly, I don't see where they have room to optimize game code for one piece of hardware over the other when it comes to DX10+, since it should be almost completely hardware agnostic, unlike previous DirectX versions. OpenGL is another story.
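The hardware-agnostic point can be illustrated with a toy sketch (Python for brevity; the device names and caps values below are invented stand-ins, not real API queries). Under D3D9 a game typically branched on per-device capability bits, while any D3D10 part must expose the whole SM4 feature set, so that branch largely disappears:

```python
# Toy model of render-path selection. D3D9 exposed per-device "caps"
# that games branched on; D3D10 mandates one feature set, so the branch
# goes away. All data here is illustrative, not queried from real drivers.

# Hypothetical caps for some D3D9-era cards:
d3d9_caps = {
    "GeForce FX 5900": {"ps_version": (2, 0), "fp16_blend": False},
    "Radeon X800":     {"ps_version": (2, 0), "fp16_blend": False},
    "Radeon X1900":    {"ps_version": (3, 0), "fp16_blend": True},
}

def pick_d3d9_path(caps):
    """Choose a render path the way D3D9 titles had to: by caps bits."""
    if caps["ps_version"] >= (3, 0) and caps["fp16_blend"]:
        return "sm3_hdr"
    if caps["ps_version"] >= (2, 0):
        return "sm2"
    return "fixed_function"

def pick_d3d10_path(_device):
    """Under D3D10 every compliant device supports the full SM4 set."""
    return "sm4"

for name, caps in d3d9_caps.items():
    print(f"{name}: D3D9 path = {pick_d3d9_path(caps)}, D3D10 path = {pick_d3d10_path(name)}")
```

With no caps bits to branch on, "optimizing for one vendor" in D3D10 mostly reduces to tuning shaders for a particular architecture's performance profile rather than gating features.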
 
I really don't see why ATI isn't putting at least as much effort into communicating with developers.

Now, I could be totally off my rocker as to how much each company is doing, as I'm really not an industry insider. But it certainly seems this way, what with things like TWIMTBP compared to the relative lack of GITG titles.

Because Nvidia is always trying to slip in proprietary code so things work better, or differently, on their chips. Either that, or it takes more coding, or they have to do it for you.
 
Chalnoth said:
I really don't see why ATI isn't putting at least as much effort into communicating with developers.

Now, I could be totally off my rocker as to how much each company is doing, as I'm really not an industry insider. But it certainly seems this way, what with things like TWIMTBP compared to the relative lack of GITG titles.
No offence meant with this...

Because Nvidia is always trying to slip in proprietary code so things work better, or differently, on their chips. Either that, or it takes more coding, or they have to do it for you.

[Sorry for going off the R600 topic]
Or it's just easier for devs to use the custom tools NV provides as part of joining the programme. Meaning it shortens the time to develop your code; and if NV is the only one making custom tools for devs, NV can put in whatever proprietary code it wants, and the dev just drops that call into their code!! <== It's my guess on this... so please don't ask for a link!!
[Trying to get back to the R600 topic]

DemoCoder said:
Let me see if I get this straight now from what's being bandied about:

R600 is

1) 512-bit bus
2) not scalar, but essentially Xenos or R520 style ALUs (DX10 compliant hopefully)
3) HW/driver "packing" of SIMD ops to boost efficiency
4) 16-32 ROPs
5) mega clocking (and mega-dustbuster?)

?

Just wondering why you'd think the R600 might not be fully DX10 compliant? There's even a rumour that it will be DX10.1 too...
What would be the main difference, to you as a code developer, between a scalar architecture and a non-scalar one (something more complex and flexible, like a combination of vector and scalar units)?
Which kind of architecture would benefit you most? Why?

I ask because I have no clue about GPU coding, but I've been gaining more interest in it lately...

Edit: typo as usual...
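On the scalar-vs-vector question, the practical difference is largely ALU lane utilization. A back-of-the-envelope sketch (the instruction mix below is invented purely for illustration, not measured from any real shader):

```python
# Compare lane utilization of a vec4 SIMD ALU against a fully scalar design.
# The mix says what fraction of shader ops actually use 1/2/3/4 components;
# these numbers are made up for illustration.
instruction_mix = {1: 0.40, 2: 0.20, 3: 0.10, 4: 0.30}

def vec4_utilization(mix):
    """A vec4 ALU issues one op per cycle; lanes beyond the op's width idle."""
    used_lanes = sum(width * frac for width, frac in mix.items())
    issued_lanes = 4 * sum(mix.values())  # all 4 lanes burn every issue
    return used_lanes / issued_lanes

def scalar_utilization(mix):
    """A scalar design issues one component at a time, so (ignoring
    scheduling stalls and other overheads) no lanes are wasted."""
    return 1.0

print(f"vec4 utilization:   {vec4_utilization(instruction_mix):.1%}")
print(f"scalar utilization: {scalar_utilization(instruction_mix):.1%}")
```

With this made-up mix, the vec4 design keeps only 57.5% of its lanes busy. That is exactly the gap the rumoured HW/driver "packing" of SIMD ops would try to close, and it's why a per-component (scalar) issue design like G80's can win on scalar-heavy shader code even at a lower nominal op rate.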
 
Because Nvidia is always trying to slip in proprietary code so things work better, or differently, on their chips. Either that, or it takes more coding, or they have to do it for you.


Man, that is so much bullshit now it isn't even funny. It's always the developer's choice to use whatever code they want; proprietary code only gets into a program with the developer's approval.
 
I really don't see anything inherently wrong with nVidia helping game developers to optimize their games for nVidia's graphics cards. I would only see something wrong if nVidia was actively attempting to prevent developers from optimizing for ATI hardware. I can't say that isn't happening, but I see no reason to level accusations without evidence (and I'm sure other posters who know me will glean other motivations...but that's okay :) ).

I'm fairly certain that nVidia's deal with Bethesda Softworks was the main reason there was no HDR+AA for ATI's cards in Oblivion. They said something like "it's not possible", etc., and yet they had HDR+AA running on the X360, and then there's the Chuck patch...
 
I'm fairly certain that nVidia's deal with Bethesda Softworks was the main reason there was no HDR+AA for ATI's cards in Oblivion. They said something like "it's not possible", etc., and yet they had HDR+AA running on the X360, and then there's the Chuck patch...


How about Valve and the DX9 path for the FX cards, same thing? It's always up to the developers. It doesn't matter if nV or ATi bitches that a developer isn't doing enough for their cards; it's the developer's choice from start to finish.
 
I really don't see why ATI isn't putting at least as much effort into communicating with developers.
Hmm, is TWIMTBP really the same thing as "communicating with developers"? It's definitely more visible, but as someone here said, it doesn't necessarily mean anything more than that they paid for the game to start with the NV logo.
But NV is certainly more aggressive in marketing. There might be a game somewhere that starts with an ATi logo, but it's very rare; I don't think they (ATi) do that.
I don't know how much more time and effort NV puts into working with devs, but they certainly spend more money on advertising.

And TWIMTBP being so visible is probably also the reason people are so paranoid about it. :LOL:
 
I remember Epic staff stating on their forums that TWIMTBP had nothing to do with them, and that the money basically went to the publishers for an advert at the beginning of their games. These kinds of marketing ads should be considered separate from developer relations.

So while ATI isn't willing to spend money on ads, and there's no doubt that's another place where their marketing falls down and they fail to get in front of the customer (in comparison to Nvidia), it shouldn't be taken to mean that ATI has less of a technical relationship with the developers they work with.
 
How about Valve and the DX9 path for the FX cards, same thing? It's always up to the developers. It doesn't matter if nV or ATi bitches that a developer isn't doing enough for their cards; it's the developer's choice from start to finish.

HL2 never ran on any console with an FX card. Besides, Oblivion is not an FPS, so it doesn't 'need' frame rate the same way, and it can be played with HDR+AA. I don't see any technical reason for their decision.
 
HL2 never ran on any console with an FX card. Besides, Oblivion is not an FPS, so it doesn't 'need' frame rate the same way, and it can be played with HDR+AA. I don't see any technical reason for their decision.


I'm not arguing about a technical reason; I'm saying it was the developer's choice ;) . Someone outside of Valve made a patch to get the DX9 path working on FX cards with _pp hints, and Valve didn't want to do that, not to mention the path that was there in the leaked source code and the leaked half-built HL2 game.

If you want to argue technical reasons: Valve said the FX cards were too slow to run in DX9. Yes, they were too slow... to run Valve's DX9 path.

Anyways, this is off topic, lol. PM me if you want to discuss it further :)
 
I remember Epic staff stating on their forums that TWIMTBP had nothing to do with them, and that the money basically went to the publishers for an advert at the beginning of their games. These kinds of marketing ads should be considered separate from developer relations.

That's a whole bag of crap you're carrying there and you know it..

Uhm.. Lego Star Wars? I mean, since when are nVidia-based graphics cards the only cards in the world that can actually produce a glow around a lightsaber?

Anyway.. when did Crytek go back to nVidia? I thought they showed their first few demos on ATI hardware? .. And where's Alan Wake when you need him?
 
That's a whole bag of crap you're carrying there and you know it..

Uhm.. Lego Star Wars? I mean, since when are nVidia-based graphics cards the only cards in the world that can actually produce a glow around a lightsaber?

Anyway.. when did Crytek go back to nVidia? I thought they showed their first few demos on ATI hardware? .. And where's Alan Wake when you need him?

If I understood what the hell you were saying, I might be able to formulate a reply. I am simply reporting what Epic staff stated at a time when people on their forums were complaining that ATI cards ran UT2003/4 better than Nvidia cards, and asking whether they could have their own ATI startup graphic (several were made by modders).

You might not know this, but often several things (such as copy protection) are added by the publisher after the devs finish their work. The startup ads are among them. What gets supported on which cards is down to the devs, not down to what money goes to the publisher for ads. As several other people have pointed out, quite a few TWIMTBP games have run better on ATI hardware than Nvidia hardware at release.

Whether ATI or Nvidia has better Dev-Rel with a given development company is a separate issue from what a publisher gets paid for logo ads at the beginning of a game intro.
 
If I understood what the hell you were saying, I might be able to formulate a reply. I am simply reporting what Epic staff stated at a time when people on their forums were complaining that ATI cards ran UT2003/4 better than Nvidia cards, and asking whether they could have their own ATI startup graphic (several were made by modders).

You might not know this, but often several things (such as copy protection) are added by the publisher after the devs finish their work. The startup ads are among them.

You wrote that devs say that TWIMTBP has nothing to do with their development of games.
So how come some TWIMTBP titles (Lego Star Wars, Splinter Cell: Chaos Theory) offer the full experience on nV-based cards and a crippled display on non-nV boards? SM1.4 vs. SM3.0 for SC:CT, and some bizarre things like shadows, lightsaber glow, etc. only running on nV-based hardware.
 
You wrote that devs say that TWIMTBP has nothing to do with their development of games.
So how come some TWIMTBP titles (Lego Star Wars, Splinter Cell: Chaos Theory) offer the full experience on nV-based cards and a crippled display on non-nV boards? SM1.4 vs. SM3.0 for SC:CT, and some bizarre things like shadows, lightsaber glow, etc. only running on nV-based hardware.

Probably because Nvidia wrote the code for the developer, and ATI didn't. That's got no relationship to whether the publisher got paid for an ad.
 