John Carmack talked @ GDC

That company has become way too technology-driven instead of content-driven.
............
Hopefully Doom III delivers a really great gameplay "experience" and does not once again merely remind us how great the engine "could be" if someone else would make a game from it.
Unfortunately, it seems that id has become primarily a "game-engine company". Not so unfortunate, actually: it makes money, eases the dev cycle of games that license the tech, lets those "other games" concentrate on gameplay :p , etc. For the sake of avoiding disappointment, it is advisable to let the "Doom 3 will have great gameplay" clause sit at the bottom of the wishlist.
 
Joe DeFuria said:
Slight correction.

Carmack knows when the driver code follows his spec. That is, when drivers are tuned to perform for his engines, and when they produce the results he wants.
No, not his spec: the OpenGL spec, or the appropriate extension spec. The OpenGL spec is typically much more exact than Microsoft's DirectX spec. Additionally, it seems to me that his games typically have had the fewest rendering errors of all games I've played, not to mention fewer patches. It seems to me that he is very good at writing code in a very careful and exacting manner, so as to reduce the number of bugs.
 
Chalnoth said:
Joe DeFuria said:
Slight correction.

Carmack knows when the driver code follows his spec. That is, when drivers are tuned to perform for his engines, and when they produce the results he wants.
No, not his spec: the OpenGL spec, or the appropriate extension spec. The OpenGL spec is typically much more exact than Microsoft's DirectX spec. Additionally, it seems to me that his games typically have had the fewest rendering errors of all games I've played, not to mention fewer patches. It seems to me that he is very good at writing code in a very careful and exacting manner, so as to reduce the number of bugs.

Yeah, of course. :rolleyes: That's why he f**ked up the 3DNow! part completely. :rolleyes:

Edit: aster.
 
Chalnoth said:
No, not his spec: the OpenGL spec, or the appropriate extension spec. The OpenGL spec is typically much more exact than Microsoft's DirectX spec. Additionally, it seems to me that his games typically have had the fewest rendering errors of all games I've played, not to mention fewer patches. It seems to me that he is very good at writing code in a very careful and exacting manner, so as to reduce the number of bugs.

No, it seems to me that because Carmack's engines are the "most important" with respect to games, IHVs tune their drivers and work with him appropriately.

In other words, it's easy to have "few rendering errors" when you have IHVs knocking at your door saying "how can we help you?"

If you're not Carmack, it's a different story.
 
His prestige in the industry only affects games that have been released, or bugs he has already submitted to the appropriate company. It won't affect his view of which drivers are better, as he'll still see what bugs are there when he first writes the code.
 
Chalnoth said:
His prestige in the industry only affects games that have been released, or bugs he has already submitted to the appropriate company. It won't affect his view of which drivers are better, as he'll still see what bugs are there when he first writes the code.

That has nothing to do with what you said about Carmack's games being the least bug-prone. He gets preferential treatment from the IHVs (for obvious reasons), so IHVs will cater to him... even if it's at the expense of others.
 
That only reduces the value of the evidence I stated. I only said that as an attempt to post evidence that he's pretty good at writing game engines.
 
Chalnoth said:
That only reduces the value of the evidence I stated. I only said that as an attempt to post evidence that he's pretty good at writing game engines.

I didn't think it was debatable that Carmack is at least "pretty good" at writing game engines... is it?

However, the popularity of his engines (and IHV reliance on benchmarks of them) paradoxically makes Carmack a bit less qualified to comment on general driver robustness and stability. What he's most qualified to address is the IHVs' responsiveness to whatever problems he encounters. ;)
 
Once again, when he's writing a new game engine, he's forging into new territory, and there's no way for IHVs to help him until he asks for help. So he definitely is qualified to comment on driver quality.
 
Chalnoth said:
Once again, when he's writing a new game engine, he's forging into new territory, and there's no way for IHV's to help him until he asks for help. So he definitely is qualified to comment on driver quality.

Once again: he is FAR from being very good at game engines. He DID completely blow up the AMD support in Q3, as everybody knows very well.

It could be:
1. intentional, for the sake of Intel - nobody thinks so, but this would easily destroy your rosy picture of him, Chalnoth
2. accidental, because he's not as good as Chalnoth would like to paint him for us here - I think this is the case

Choose your point, Chalnoth.
 
T2K, could you please point me to some information on this 3DNow! issue? I was told that the SSE-detection routine is flawed and won't recognize the Athlon XP as an SSE-capable part, but that's about it. No one mentioned that even 3DNow! is broken in Q3... :oops:
 
anaqer said:
T2K, could you please point me to some information on this 3DNow! issue? I was told that the SSE-detection routine is flawed and won't recognize the Athlon XP as an SSE-capable part, but that's about it. No one mentioned that even 3DNow! is broken in Q3... :oops:

Maybe you should select sources more carefully next time. :p It's a well-known story... ;)

PS: I'm looking for a link, hold on...

PS2: http://216.239.39.104/search?q=cache:jp2IFmh6qaQJ:www.planetmars.co.za/forums/index.php%3Fshowtopic%3D7630%26view%3Dnew+broken+3dnow+quake+3&hl=en&ie=UTF-8 - this is a forum link from Google's cache
 
http://www.overclockers.com/tips752/

A semi-link to the info...

I believe the original website that housed this info no longer exists... the guy provided a "hack/patch" to enable SSE support in Quake 3 and RTCW on an Athlon XP (which supports SSE). This is related to the broken 3DNow! support, which he also fixed (on the same site)...

If I recall correctly, Quake 2 was the first to take advantage of 3DNow! instructions... after that, I believe no further support came for it, only SSE (I don't think any SSE2 instructions have been used in recent engines)... and furthermore, the detection mechanism for these instructions is flawed.

E.g. Windows Media Encoder in some old benchmark assumed that SSE was only applicable to Intel CPUs and not AMD's (the Athlon XP being the first AMD chip to support it).
 
Actually, Intel's docs specifically state that anybody using SSE should first check to see if it's an Intel CPU before checking for the SSE flag.
 
Actually, Intel's docs specifically state that anybody using SSE should first check to see if it's an Intel CPU before checking for the SSE flag.
That might have been correct when this started (Katmai)... however, that's VERY WRONG now that AMD supports these extensions as well.
 
Chalnoth said:
Actually, Intel's docs specifically state that anybody using SSE should first check to see if it's an Intel CPU before checking for the SSE flag.

It has nothing to do with the fact that JC shipped fully broken 3DNow! support in Quake 3.

PS: As AMD Zone said at the time:
Note to people who think these DLLs are "unfair" in benchmarks. This is horribly incorrect. Quake3 (even with an Athlon XP) forces AMD chips to use 3DNow, not SSE. Little do people know the 3DNow! code is completely broken in Quake3. It does not help at ALL.
The Pentium 3 and 4 chips however are both detected as "Pentium 3" chips, which automatically uses SSE which DOES work in Quake3 and gives a significant FPS boost. These DLLs don't enable SSE but do help with Athlons.. and if SSE was to ever be enabled in Quake3 for Athlon processors they would blow Pentium 4 chips out of the water even more. In the tests I ran a P4-2.53 @ 3.32GHz, 175/700fsb lost to a "stock" Athlon XP 2700+(2166MHz, 166fsb) using these dlls...
 
I don't see how not writing assembly for proprietary CPU extensions translates to inability to properly write OpenGL code. Anyway, my previous comment was a statement about the support of SSE, not JC.
 
I just found THIS...
VansHardware said:
"NewAgeOC’s DLLs appear to elevate performance across the board for chips from all major vendors by similar amounts. This suggests that the performance gains are obtained from the use of newer and better compilers rather than fixing a processor specific coding bug."

Something to keep in mind about all the DLLs floating around: the most they can possibly optimize is game logic, AI and minor stuff like UI management. They have no effect whatsoever on the rendering engine itself... But still I wonder when, by whom, and most importantly how it was deduced that the 3DNow! part doesn't work - is there a way to disable 3DNow! instructions in the CPU for testing purposes? I don't mean to call bullshit! on this at all, but proper information seems rather hard to track down for something so well known. :?

Chalnoth said:
I don't see how not writing assembly for proprietary CPU extensions translates to inability to properly write OpenGL code.

I was about to write something along those lines... botching 3DNow! support across the whole game WOULD put a dent in his image as an excellent coder (given sufficient proof, mind you) - but shipping a broken SSE detection routine and not fixing it for 4 years is more likely to indicate a move to satisfy Intel than him "being far from being very good in game engines".
 
anaqer said:
- but shipping a broken SSE detection routine and not fixing it for 4 years is more likely to indicate a move to satisfy Intel than him "being far from being very good in game engines".

Well, if we're going to say that, then it certainly isn't out of line to suggest Carmack would "satisfy" a certain graphics IHV over another as well, which is what started this 'mini debate.' ;)
 