Another ATI vs NVIDIA review from gamersdepot

Pete said:
Again, I don't think he was the one perpetrating "fraud."
No, you are correct...he was just a willing accomplice in it. :rolleyes:

Reverend said:
If you'll read what JC has said thus far, you'll know what video card he recommends without saying it.
Dumb question: why doesn't he just say it? I'm really glad you can read between the lines to know what JC means, but I'm used to dealing with people who have a hard time understanding what drivers even are....all they know about Doom3 and performance is that stupidly unbalanced benchmark, and from that they've concluded that the FX is a better card for D3.

Mebbe it is, mebbe it ain't; but JC is most definitely helping out nVidia right now with his silence.

As well as not knowing precisely what JC and his involvement with id Software is about.
Hmmmmm....I do believe he owns it and is the driving factor behind it, or something silly like that.

C'mon Anthony, what are you trying to say? That JC doesn't have the power of decision making at iD? Puh-lease! :rolleyes:

I'll stop harping on it if it's making you angry, but I am NOT changing my opinion that JC is a sell-out to nVidia until I am shown something to indicate otherwise.

nVidia bought JC, simple. :(
 
Why don't we all wait to see just how noticeable the visual differences are between FX12 and FP16/24/32 before going on the attack?
 
John Reynolds said:
Why don't we all wait to see just how noticeable the visual differences are between FX12 and FP16/24/32 before going on the attack?

Low precision = sucks
High precision = slow
Mid precision = okay

This is only for current cards on the market.

In future cards it will be like this:

Low precision = sucks
High precision = Great
Mid precision = sucks

nVidia bought JC, simple.

This is just so silly. I guess Ati bought out Valve. ;)

On another note:
I'm not afraid to say that I agree with Rev on this.
 
Since the leak the D3 builds were changed such that a hardware dongle is required to allow the current beta builds to run - since the leak ATI have neither had the dongle nor received any internal builds.

However, I think JC may have been fairly unhappy about the manner in which the benchmarks were conducted and the types of comments it has generated subsequently, which could be one of the contributing factors in why he might not include a benchmark.
 
I think JC didn't have a choice. He's developing an engine that ID is going to license for millions over the next few years, so he had no choice but to implement low precision to make up for Nvidia's poor hardware in order that his product runs well on as many major cards as possible. This is especially obvious when you realise the initial work for D3 started a couple of years ago, when the graphics card landscape looked very different from today.

The Nvidia D3 benchmark is another matter - IMO, that's just JC being childish and getting his own back on ATI for leaking the D3 beta. If ID wasn't being paid to advertise Nvidia cards and give ATI a slap in the face, they would have contacted ATI and got as much coverage for Doom 3 as possible from *both* chip suppliers, instead of turning it into an Nvidia showcase.
 
Doomtrooper said:
Childish....riiight.

Here we have a Developer that clearly has been pushing higher precision, but then turns around and states "well the NV30 path is lower precision and there is no image loss"..since we know the NV30, NV31 and NV34 operate mostly in FX12 precision it is simply hypocritical to state FX12 is acceptable, especially from their 'flagship' super duper hyped Doom 3.

He should make up his mind; Doom 3 obviously isn't a good example for precision....but it sure is mentioned a lot in 'plan' updates.


you'll read what JC has said thus far, you'll know what video card he recommends without saying it

Sure, latest plan update states he uses a 5800 Ultra as his primary development card..the card that has been cancelled.
That is a snippet of information that was written by him, no one else.

Was JC biased at E3 last year when it was shown on ATI hardware only?
Having a 5800 Ultra in his machine is probably a sign he is having to do extra work to get to a similar quality. I had a Kyro2 in my machine while on SH2 for quite a while; it wasn't actually a recommendation, quite the opposite. I spent much longer on the Kyro2 and Matrox G400 than I did on the ATI 9700 for SH2, so I guess I'm biased for writing all those 'custom' pipelines.

As for the higher precision argument, what? He has stated that he thinks higher precision is a good thing but, shock horror, his CURRENT engine doesn't need it. If you had actually listened to developers, we've clearly stated that higher precision only comes into play when the art path and lighting algorithms NEED it. He doesn't use it because his engine doesn't NEED it, because it's a fixed-point fragment processing engine, and by doing so millions of his customers get a better deal! It just so happens that one card doesn't do fixed-point fragment processing, so it has to run at a higher precision. That doesn't mean all the other cards should be forced to when it's NOT necessary for HIS artwork and lighting calculations. That's one of the reasons it would make a crappy benchmark: it has half a dozen different pipelines, all using the same basic art path and lighting model.

I prefer the ATI 9700, both as a developer and a gamer, but I don't feel the need to insult somebody who is making a BETTER game for his customers. It's a game developer's job to make the best game for whatever hardware is out there. JC should be praised for going the extra mile for all those GFFX owners, not insulted. Just the same as he did for those of us with hardware accelerators back in the Quake days, or releasing the source code for newbie devs to look at, etc.
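To put the point above in concrete terms, here's a rough back-of-the-envelope sketch (my own illustration, not id's code; it assumes FX12 means 12-bit signed fixed point over [-2, 2), i.e. 10 fractional bits, which is one common reading of the format) of why a single fixed-point lighting calculation can survive the trip to an 8-bit framebuffer with essentially no visible loss:

```python
# Sketch: why 12-bit fixed point can be "enough" for a diffuse-lighting
# calculation whose result lands in an 8-bit framebuffer.
# ASSUMPTION: FX12 = 12-bit signed fixed point covering [-2.0, 2.0),
# i.e. 10 fractional bits (a 1/1024 step size).

def quantize_fx12(x: float) -> float:
    """Clamp to [-2, 2) and round to the nearest 1/1024 step."""
    x = max(-2.0, min(x, 2.0 - 1.0 / 1024))
    return round(x * 1024) / 1024

def to_8bit(x: float) -> int:
    """Convert a [0, 1] colour value to an 8-bit framebuffer value."""
    return round(max(0.0, min(x, 1.0)) * 255)

# Simple N.L diffuse term times a light intensity, full precision vs FX12.
n_dot_l = 0.7371
light = 0.85

full = n_dot_l * light
fx12 = quantize_fx12(quantize_fx12(n_dot_l) * quantize_fx12(light))

# Once the 8-bit framebuffer rounds both results, a single operation
# like this differs by at most one step out of 255 - usually zero.
assert abs(to_8bit(full) - to_8bit(fx12)) <= 1
```

The caveat, of course, is that error accumulates across long dependent chains of operations, which is exactly the "when the art path and lighting algorithms NEED it" condition described above.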
 
Doomtrooper said:
This coming from a Guy that is pushing for 64-bit color
..
Here we have a Developer that clearly has been pushing higher precision, but then turns around and states "well the NV30 path is lower precision and there is no image loss"..since we know the NV30, NV31 and NV34 operate mostly in FX12 precision it is simply hypocritical to state FX12 is acceptable, especially from their 'flagship' super duper hyped Doom 3....

Isn't the 64-bit color he's talking about there the precision of the framebuffer, not the graphics pipeline?

There is also a BIG difference between wanting something for future generations of cards and what he accepts for running Doom 3. Especially since Doom3 has to run acceptably on older DX8 hardware also. Besides, he hasn't said that there isn't an image quality loss in Doom3 when going from ARB2 -> NV30 -> R200 and so forth. Just that the difference is very small. Even between the R300 and R200 path which is FP24 -> FX16(?).
 
Pardon me, but isn't JC a businessman and isn't iD a company? I never heard that companies had to be fair and equal, rather that they had to compete. One way of competing is by choosing who you partner with, how closely you share secrets, and what you do and don't say in public about your partners.

JC is not a referee on trial for bias. He is a talented coder who is also a businessman trying to make a living and helping the fortunes of everyone at id in the process.

As a businessman, what is the issue with keeping quiet about your allies' weaknesses and their competitors' strengths? He is not lying or deceiving anyone - he is simply refraining from full disclosure because no law compels him to. Did we get upset because for the last 20 years Bill Gates didn't stand up and say that at (initially) 10,000 lines of code UNIX really shows what an Operating System should be, and that Windows' millions of lines of shit code suck arse, and that really it's a pretend Operating System and GUI that we'll get right by the 20th iteration - but hell, keep on paying me for the next cruddy version? No, we made him rich.

JC knows how to code and how to partner. ATi through at least one employee hurt id, now they are seeing the fruits of these actions in all John doesn't say or do. There is no law or moral obligation that says JC has to be neutral and can't accept heavy incentives from whoever he likes.

Its more the challenge of great sites like B3D to keep the records straight - not the players on the field in the game at the moment.
 
g__day said:
JC knows how to code and how to partner. ATi through at least one employee hurt id, now they are seeing the fruits of these actions in all John doesn't say or do. There is no law or moral obligation that says JC has to be neutral and can't accept heavy incentives from whoever he likes.

That is absolutely true.

On the other hand, there is also no law or moral obligation from those who use iD's products that say we have to be neutral, and must "accept" his business practice without protest ourselves.

In other words, while one can't really blame ID for "punishing" ATI if the leak came from within ATI for example, one also can't blame ATI users for being upset that the game may not be as optimal on ATI hardware as it should (if ATI does not get the same access to the code that nVidia does.)

So, Carmack can pick whatever alliances he wants...but should not really complain about the consequences of doing so.
 
Joe DeFuria said:
g__day said:
JC knows how to code and how to partner. ATi through at least one employee hurt id, now they are seeing the fruits of these actions in all John doesn't say or do. There is no law or moral obligation that says JC has to be neutral and can't accept heavy incentives from whoever he likes.

That is absolutely true.

On the other hand, there is also no law or moral obligation from those who use iD's products that say we have to be neutral, and must "accept" his business practice without protest ourselves.

In other words, while one can't really blame ID for "punishing" ATI if the leak came from within ATI for example, one also can't blame ATI users for being upset that the game may not be as optimal on ATI hardware as it should (if ATI does not get the same access to the code that nVidia does.)

So, Carmack can pick whatever alliances he wants...but should not really complain about the consequences of doing so.
Yup, but in fairness I don't see Carmack being the one complaining.

I know he's a businessman and iD is a business, but that doesn't mean I have to like or agree with his/their decisions. I expect that both code paths will end up looking pretty much the same as I do respect Carmack's skills, but I can't help feeling he made a huge mistake on that benchmark fiasco and I resent his behavior/participation/taciturn approval of it. :(
 
Guys I agree with you! I own a 9700 Pro and I think ATi is the ants pants at the moment.

I think John is a great guy, and wish him all the best. I really wish the world were fairer and the best always won. NVidia is buying advantages - its their right and its clever to do so.

God I wish it were a level playing field, but it isn't yet. If and as video cards perform more to open standards rather than proprietary extensions, we may see the playing field level out - so long as 3d APIs keep up with 3d hardware and you aren't forced into extending APIs to reveal your smarts.

I think John has to live with his choices - but its his path to walk and I am sure his life isn't stress free. He made the best decision he could at the time faced with the options he could see and his experience to decide.

It will be weird when Doom 3 gets released as a major game in 2004, where it may be a standout - but in my mind it is proprietary to NVidia. And is it more DX9-equivalent or DX8-equivalent from its use or not of shaders, anyone?
 
g__day said:
Guys I agree with you! I own a 9700 Pro and I think ATi is the ants pants at the moment.

I think John is a great guy, and wish him all the best. I really wish the world were fairer and the best always won. NVidia is buying advantages - its their right and its clever to do so.

God I wish it were a level playing field, but it isn't yet. If and as video cards perform more to open standards rather than proprietary extensions, we may see the playing field level out - so long as 3d APIs keep up with 3d hardware and you aren't forced into extending APIs to reveal your smarts.

I think John has to live with his choices - but its his path to walk and I am sure his life isn't stress free. He made the best decision he could at the time faced with the options he could see and his experience to decide.
Well I don't know Mr.Carmack so I can't say what kind of guy he is, but I'm glad we can at least agree a bit. :)

It will be weird when Doom 3 gets released as a major game in 2004, where it may be a standout - but in my mind it is proprietary to NVidia. And is it more DX9-equivalent or DX8-equivalent from its use or not of shaders, anyone?
JC has already said it's much more of a DX8.1 game than a DX9 game, if that helps. ;)
 
Pete said:
...
WaltC, The Carmack already explained why he uses an FX card as his primary development platform, and it should be obvious even without his explanation: the FX allows him to code for more codepaths on a single card. FX gives him NV2x, NV3x, and ARB2 on one card; R3xx gives him only R2xx and ARB2. I'm sure he has the (dozen) odd R3xx's in various machines to test performance, but I'd imagine the convenience of one machine to test more codepaths is more important to him.

Don't be so fast to attribute bad intentions to a coder who has been repeatedly heralded as brilliant and (bluntly) honest. And don't forget that nV was in the driver's seat for quite some time.

Still, I've been rooting for AMD + ATi for a while now. We'll see how HL2 and D3 perform on an Opteron/Athlon64 + 9800/XT setup when they all debut. :)

I would think then that the politic thing for JC to say would be that he "divides his time equally among his two primary development platforms, the R3x0 and nV3x," or something along those lines, because it affords him what he needs to write his code paths. After all, he can't do R2xx on an nV3x, and the ATi and nVidia OpenGL extensions are pretty different. Like you say, I doubt he's doing much coding for ATi on an nV3x box...;)

Anyway, the comments I reference are several months old and I don't know what he's using currently as his "primary" development platform. I really wasn't attributing "bad intentions" to him--just understandable ones, from his point of view. Prior to R3x0, nVidia had the better OpenGL drivers, and so his leaning in that direction is entirely understandable because he can't/won't program in D3D. But I think the overall picture has cleared up a lot in just the last six months, so I wouldn't be surprised to see him approaching things differently now. The interesting thing for me is this: if the figures I saw in another report which stated ATi had captured some 97% of the $300+ 3d-card market this year are accurate, then JC shouldn't be wasting a lot of time on nV3x code paths for D3--he'd probably get more mileage out of nV2x paths...;)
 
Bouncing Zabaglione Bros. said:
The Nvidia D3 benchmark is another matter - IMO, that's just JC being childish and getting his own back on ATI for leaking the D3 beta.

I think people are forgetting there is a publisher involved here as well.
 
DaveBaumann said:
Since the leak the D3 builds were changed such that a hardware dongle is required to allow the current beta builds to run - since the leak ATI have neither had the dongle nor received any internal builds.

BTW, talking about the alpha leak--I thought it had been resolved months ago that it did not come from ATi (as was originally reported along with the fiction that an ATi employee had been fired for leaking it.) Was that report not accurate--did the leak actually come from ATi after all?

However, I think JC may have been fairly unhappy about the manner in which the benchmarks were conducted and the types of comments it has generated subsequently, which could be one of the contributing factors in why he might not include a benchmark.

I'm thinking it's a case of JC wisely not wanting to become entangled in the snare of IHV partisanship, for one thing, although this doesn't preclude him integrating a D3 timedemo benchmark into the game, I wouldn't think. It may be that he's not releasing any more "demos/benchmarks" to IHVs, to avoid the kind of thing that happened with the nv35 launch and the then-current build of D3. That would make a lot of sense to me. Also, I can't help wondering what might be going on behind the scenes with nVidia and its drivers as far as "optimizations" nVidia might be running with current builds of D3. I'll bet there's a real interesting story there--which we'll probably never be let in on...
 
if the figures I saw in another report which stated ATi had captured some 97% of the $300+ 3d-card market this year are accurate, then JC shouldn't be wasting a lot of time on nV3x code paths for D3--he'd probably get more mileage out of nv2x paths...

The problem is that the $300+ 3d-card market is rather small compared to the < $300 one.
 
I wholeheartedly agree with what DeanoC said.

[rant]
It's quite sad when people that do 3D in theory only judge developers in terms of bias, when they are biased themselves.

Developers tend to develop for the hardware that users have. There are numerous people on this board that don't seem to understand even that simple fact.

It's not like saying, oh I like this card I code for that, and if you don't have one screw yourself, I don't care if you buy our game or not...
[/rant]
 
Just a new guy question

How much precision do you really need for stencil ops?

I have a vague feeling it's a lot less than 12
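For what it's worth, that vague feeling is right: stencil values in the shadow-volume technique are just small integer counters, and an 8-bit stencil buffer (the common case) already handles 255 overlapping volumes. A toy sketch of depth-pass counting (my own illustration, not id's code):

```python
# Sketch of why stencil ops need so little precision: the stencil value
# just counts shadow-volume front faces minus back faces in front of a
# pixel (classic depth-pass counting). The count only needs enough bits
# for the maximum number of overlapping volumes - 8 bits is plenty.

def stencil_count(events):
    """events: +1 for each shadow-volume front face in front of the
    pixel, -1 for each back face."""
    stencil = 0
    for e in events:
        stencil += e  # real hardware wraps or saturates at the bit depth
    return stencil

# A pixel inside two overlapping shadow volumes: two front faces
# entered, no matching back faces in front of it -> in shadow.
assert stencil_count([+1, +1]) == 2

# A pixel outside all volumes: every entry matched by an exit -> lit.
assert stencil_count([+1, -1, +1, -1]) == 0
```

So the 12-bit FX12 question is really about fragment (colour) precision, a separate issue from the stencil counting itself.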
 
Bjorn said:
The problem is that the $300+ 3d-card market is rather small compared to the < $300 one.

How is it a "problem"?....;) Games like HL2, and presumably D3 when it ships next year, will look best and run best on these products. It is on these products that 90% + of all Internet reviews of these games will be conducted. You will even see IHV's using these games to promote their 3d products in this price range.

I understand your point that the $300+ market is a small piece of the overall graphics chip market based on volume, but by the same token the vast majority of PC's in use will not see either D3 or HL2 installed on them, either...;) (And couldn't run them well if at all as many have no AGP/3d capability.) I think that when your analysis is restricted to the ~10M or so people who comprise the active 3d-gaming market worldwide, the prevalence of these products is much more apparent.
 
DaveBaumann said:
Bouncing Zabaglione Bros. said:
The Nvidia D3 benchmark is another matter - IMO, that's just JC being childish and getting his own back on ATI for leaking the D3 beta.

I think people are forgetting there is a publisher involved here as well.

Very true.
Furthermore, John Carmack does _not_ call the shots at id, although he obviously has a strong position. Which was brought to public attention when Paul Steed was "let go" against JC's very strong wishes. JC even wrote about it in his .plan.

JCs function at id software is not primarily that of "businessman", other people do that job and typically make the decisions in that area.

Entropy
 