John Carmack talked @ GDC

Agreed.
I just nefariously derailed the discussion and used the opportunity to learn a bit about this 3DNow! fiasco... come to think of it, another reason to wish for a D3 release "Real Soon Now" (TM) - it's only a matter of months after that, and we might just get the Q3 engine source. Ah, a boy can dream... ;)
 
Chalnoth said:
I don't see how not writing assembly for proprietary CPU extensions translates to inability to properly write OpenGL code. Anyway, my previous comment was a statement about the support of SSE, not JC.

You never can see the obvious things if they don't back up your mostly faulty statements - like your comments in question.
 
anaqer said:
Chalnoth said:
I don't see how not writing assembly for proprietary CPU extensions translates to inability to properly write OpenGL code.

I was about to write something along those lines... botching 3DNow! support across the whole game WOULD put a dent in his image as an excellent coder ( given sufficient proof, mind you ) - but messing up an SSE detection routine and not fixing it for 4 years is more likely to indicate a move to satisfy Intel than him "being far from being very good in game engines".

Maybe you should start reading the whole story from the beginning instead of only pieces of it, dear anaqer. Do your research and read first.

Understanding doesn't start with statements like this, but rather with being well-informed on the subject...

FYI: broken 3DNow supp != broken SSE detection

PS2: if you and Chalnoth are interested, I can probably get those DLLs and you can check the difference on your own.
 
T2k said:
FYI: broken 3DNow supp != broken SSE detection

PS2: if you and Chalnoth are interested, I can probably get those DLLs and you can check the difference on your own.

I know, that's why I handled the two problems separately. I was saying that messing up a CPU flags check does not a bad coder make, but it's an entirely different matter to consistently not use 3DNow!-friendly code.

I actually have three DLLs sitting around somewhere which are supposed to boost scores on Athlon XPs, but I'm not using them, as they never gave a noticeable FPS increase. It could be that these are not the ones everybody knows about; I just got them from a link on PlanetQuake sometime last year.

( As for reading the whole story from the beginning - that's exactly what I wanted to do with this 3DNow! incident, but, like the others, all I found was either broken links or second-hand info in short news items and forum postings. I would still very much like to learn exactly how this broken 3DNow! support was detected. )
 
anaqer said:
I actually have three DLLs sitting around somewhere which are supposed to boost scores on Athlon XPs, but I'm not using them, as they never gave a noticeable FPS increase. It could be that these are not the ones everybody knows about; I just got them from a link on PlanetQuake sometime last year.

Similar case here - somewhere, on some CD... but I remember I got some nice boosts at the time I used to play Q3 on my Athlon.
 
Deathlike2 said:
If I recall correctly, Quake 2 was the first to take advantage of 3DNow! instructions... after that, I believe no further support came for it - only SSE (I don't think any SSE2 instructions have been used in recent engines)... and furthermore, the detection mechanism for these instructions is flawed.

AMD made a Quake2 patch that enabled 3DNow! support in Quake2.
 
AMD made a Quake2 patch that enabled 3DNow! support in Quake2.
Ah, didn't realize that... I heard that after this patch was made, 3DNow! looked like the way to go for AMD (well, their version of the SSE concept)... before the Athlon... :D

If you run RTCW on an AMD Athlon XP (or better) and quickly look at the startup console... you'll see it will detect an AMD CPU w/3DNow!... it may be a mistake, but it is very probable it isn't using any SSE support...
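
For reference, a detection routine like that boils down to a couple of CPUID queries. Here's a minimal sketch (my own illustration, NOT id's actual code; bit positions are from the Intel/AMD manuals) - note that SSE sits in the standard leaf while 3DNow! sits in AMD's extended leaf, which makes it easy to test one flag while meaning the other:

/* cpucheck.c - hypothetical feature check, not id's code.
   Build with e.g.: gcc cpucheck.c -o cpucheck */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Standard CPUID leaf 1: EDX bit 25 = SSE, bit 26 = SSE2. */
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("SSE:    %s\n", (edx & (1u << 25)) ? "yes" : "no");
        printf("SSE2:   %s\n", (edx & (1u << 26)) ? "yes" : "no");
    }

    /* AMD extended leaf 0x80000001: EDX bit 31 = 3DNow!. */
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        printf("3DNow!: %s\n", (edx & (1u << 31)) ? "yes" : "no");

    return 0;
}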
 
Being a great programmer doesn't mean you don't have any bugs, or that you are the best at everything and know everything all the time. It is possible for Carmack to be better than 95% of all programmers, but still not good at writing 3DNow! code, especially if he's got other things to do. The man singlehandedly writes his own game engines and has changed industry rendering paradigms several times. Not just a great coder, but a leader as well. He gets well deserved kudos for his contributions.

T2k said:
Sorry to say this but DC... you're obviously WAY BIASED. It's simply ridiculous how you're acting when it comes to NVIDIA... it's just, let me put it this way, DUMB.

Whatever you're trying to suggest about JC - and even though I do think HB is dumb and biased like a 3rd-grade child - this is still true: JC REALLY, REALLY prefers NVIDIA, regardless of its disgusting, cheating, fake, obviously crappy and low-level business tricks. DO YOU HEAR ME, DC???

Actually - with all due respect, man - I'm getting tired of your unconditional love for whatever NV is doing. Just check your latest posts - isn't this ridiculous? Like the stupidest child in the neighborhood...

As for your comments to me T2k, you're getting a little batty. Look at your hyperbolic response, take some blood pressure pills man. You take my comments that Carmack is a great programmer, and that he is one of the least PR-savvy bullshitters in the industry, a straight-talking kind of guy who usually gets to the point, and spin this into an ad hominem attack on me and Carmack.

Unconditional love for NV? What are you smoking? You need to go back and read my comments on the NV3x. Perhaps in your mind, if I say "The NV3x is really disappointing, and features X, Y, and Z are bad, BUT it does have some redeeming qualities", you think this is unconditional love.

To me, it's plain honesty. Many of the people on this BBS have blinders on when it comes to the NV3x architecture and routinely claim that there is nothing good about it. From time to time, I pop in to point out some of the potential issues with it that are not wholly bad. And discussing whether or not multiprecision is good or bad, or whether or not an optimizing compiler can be put into a driver, is not unconditional love for NV.
 
Chalnoth said:
Joe DeFuria said:
Slight correction.

Carmack knows when the driver code follows his spec. That is, when drivers are tuned to perform with his engines, and when they produce the results he wants.
No, not his spec - the OpenGL spec, or the appropriate extension spec. The OpenGL spec is typically much more exact than Microsoft's DirectX spec. Additionally, it seems to me that his games typically have had the fewest rendering errors of all games I've played, not to mention fewer patches. It seems to me that he is very good at writing code in a careful and exacting manner, so as to reduce the number of bugs.

I wouldn't say his games have fewer patches than other games. Quake 1 got up to 1.06/7, Quake 2 took a few months to get semi-bug-free up to version 3.20, and Quake 3 had patches coming out over two years after the game did, ending with 1.32 (speaking of which, why do benchmark sites always use 1.17 or other older versions? The AMD bug?).
 
Reznor007 said:
(speaking of which, why do benchmark sites always use 1.17 or other older versions? The AMD bug?)

Part laziness, part backwards compatibility with scores from previous reviews. ( They can't use demo001.dm3 with anything above 1.17. )
 
anaqer said:
Reznor007 said:
(speaking of which, why do benchmark sites always use 1.17 or other older versions? The AMD bug?)

Part laziness, part backwards compatibility with scores from previous reviews. ( They can't use demo001.dm3 with anything above 1.17. )

Well they shouldn't use demo001.dm3 in the first place...
 
Remember the guy who came on and made up a bunch of crap about Unreal 3 and the NV40 sucking? Well, he got shot down - but the whole issue of 3DNow! being broken could have originated the same way. Or, alternatively, it could increase the FPS by only 1 in every 300. Without verified information it seems like a bunch of whiners saying that anyone who is more respected and well thought of than me must suck. Perhaps it is the rebel instinct - people want to fight the power they perceive Carmack and others as having. Who knows.

More to the point: what makes an excellent engine coder? Well, nothing - there is no test, there is only opinion, and the opinion is that JC is excellent. If you are in the minority sitting in a padded cell, banging your fists against your thick skull and saying "NO, NO, I am better and JC is not excellent", then I am sorry - take your medication and calm down.

If you notice, I didn't say he is the best - I don't know who is, and I don't care, and different people bring different things to the table - but to say he is not excellent at what he does is just silly.

edit: The links you posted to back up the assertions are full of morons who say things like:
have you tried playing some older games like Quake2 on a new (much faster) system. I tried playing Q2 on my current system and DAMN !!!
I was running around so fast that I couldn't see where the hell I was going !!!!

LOL - sure it makes you run faster, except that it doesn't. Sure, DOS games and other old games do that, but only if the game speed is tied to the system clock.

Err, wait - here is the source himself:
Gouhan said:
There we have it, people. As you can read in another thread, the ATi cards are smacking the nVidia cards upside the head as far as HL2 goes. Now, to all those who were against vendor-specific rendering paths: this is exactly what happens. Now you tell me again how this is good for anybody?
As it's a known fact there's no way in Hell the ATi Radeon 9800 Pro is 3x faster than the FX 5900 Ultra.

Well, a known fact according to whom? It seems that in many situations the people here showed it was 3x as fast... ah well.
 
Hyp-X said:
Well they shouldn't use demo001.dm3 in the first place...

Agreed, but it's not like that would hold them back... :(


T2k:

About the DLLs... turns out "somewhere sitting around" meant right inside one of my custom PK3 files in baseq3. Doh - no wonder I could measure no difference when copying in a second instance, as they were already being put to good use. :LOL:
I checked now, and they do give a nice 9-12% boost depending on quality settings. What would be good to know, though, is whether this performance delta is present on Intel machines as well ( like the VansHardware charts suggested )... Can somebody check this?
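
( In case anyone wants to try: the usual way to get comparable numbers is from the Q3 console - this is the standard timedemo procedure, substitute whatever demo you have for "four": )

timedemo 1
demo four

( The console then prints a line like "xxx frames, x.x seconds: xxx.x fps" - run it two or three times and use the stable value. )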
 
The Quake 3 DLL issue has been beaten to death, but here's once more:

- by default Quake 3 uses only the virtual machine to run the game code (which is why there aren't any .dll files in the .pk3 files, only .vm)
- you can compile the game code to a .dll, which is then used if found, unless you're playing on so-called pure servers
- most mods use the .vm route as well, since the same code then works on Windows, Linux and Mac

Obviously there's a boost from using native code compared to virtual machine code, duh...
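
( For anyone wanting to check which path their setup takes: if I remember the released 1.32 source right, this is controlled per module by the vm_* cvars - do verify against your own tree: )

seta vm_game 0   // load the native qagamex86.dll
seta vm_game 1   // interpret the qagame.qvm bytecode
seta vm_game 2   // compile the bytecode to native code at load time

( vm_cgame and vm_ui work the same way for the client-game and UI modules. )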
 
jpaana said:
Obviously there's a boost from using native code compared to virtual machine code, duh...

Makes sense, thanks for pointing this out. Do you also happen to have some info on how it was decided that the 3DNow! support is broken? Really, one would expect things like this to be thoroughly covered by multiple review sites. :?
 
anaqer said:
Do you also happen to have some info on how it was decided that the 3DNow! support is broken? Really, one would expect things like this to be thoroughly covered by multiple review sites. :?

No knowledge as such. It might be that, because the Pentium 4 was on top in Quake 3 until the Athlon 64 arrived, people decided there was something wrong with the game that made it slower on Athlons.

I'm not saying there isn't a problem with 3DNow! support, but that it doesn't have anything to do with the game code, which is what the .vm and .dll files contain. The rendering code might use SSE/3DNow! for math, which could explain some of the difference.

As to why the Pentium 4 runs Q3 so well: probably a combination of the trace cache, more FSB bandwidth, etc., that happens to suit interpreting virtual machine code, which is quite different from running "ordinary" code. Benchmarks using a Java virtual machine without any JIT would be interesting.
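
( To illustrate why interpretation stresses a CPU so differently: the hot loop of a bytecode interpreter is essentially one dispatch branch per opcode, exactly the kind of pattern a trace cache and branch predictor can make or break. A toy C sketch of my own - nothing to do with Q3's actual VM: )

#include <stdio.h>

/* Toy stack-machine interpreter - illustrative only, NOT Q3's VM. */
enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

static void run(const int *code)
{
    int stack[64], sp = 0;
    for (;;) {
        switch (*code++) {   /* one hard-to-predict branch per opcode */
        case OP_PUSH:  stack[sp++] = *code++;          break;
        case OP_ADD:   sp--; stack[sp-1] += stack[sp]; break;
        case OP_PRINT: printf("%d\n", stack[sp-1]);    break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    /* computes 2 + 3 and prints 5 */
    const int prog[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    run(prog);
    return 0;
}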
 
Since I'm not home (I just popped in to test a new wireless setup @ my accountant's ;) - barter is cool :D), I don't have time to point out why this is IMO probably BS (I mean, it has nothing to do with the fact that 3DNow! is FULLY broken in the original Q3), so let me just ask one question: if it were true, why is it that the P4 doesn't get any significant boost from your compiled - so-called native - code? :rolleyes:

And also please, go and compile the original and then compare its performance to our fixed DLLs...

It would have been better to do that before you posted your guess... as it stands, it is just another assumption, in this case from you, with no tests against our facts (which are backed up by tests).
 
From what I can recall Carmack said that processor optimizations made a marginal difference in the engine, and that it was up to driver writers to make good use of them.
 
T2k said:
Since I'm not home (I just popped in to test a new wireless setup @ my accountant's ;) - barter is cool :D), I don't have time to point out why this is IMO probably BS (I mean, it has nothing to do with the fact that 3DNow! is FULLY broken in the original Q3), so let me just ask one question: if it were true, why is it that the P4 doesn't get any significant boost from your compiled - so-called native - code? :rolleyes:

And also please, go and compile the original and then compare its performance to our fixed DLLs...

It would have been better to do that before you posted your guess... as it stands, it is just another assumption, in this case from you, with no tests against our facts (which are backed up by tests).

Don't know what you are trying to say here, but in quick testing - compiling the original Quake 3 1.32 source with Visual Studio .NET 2003, default optimizations (SSE optimizations OFF) - I get 191.1-191.6 fps (three-run set) in a timedemo of four.dm_68 with virtual machine code, and 259.3-260.2 fps with the .dlls. Quite a significant difference in my book.

And this machine is a Pentium M laptop with Radeon 9600 mobility...

Edit:

tested with SSE2 optimizations as well and got a whopping 264.8-266.8 fps with the DLLs, yeah...

Edit 2:

tested with a 2.66GHz Pentium 4, with Radeon 8500LE:

vm: 197.8-199.1
dll: 214.9-215.7
sse2: 215.3-215.4

I thought the 8500 was beginning to be the bottleneck at the 800x600 resolution I ran those in, but 640x480 gives only 1-2 fps more.
 
I just did a compile of the mod source too, with an ooold VS6 ( no idea what optimizations were used, I just set "compile for: maximum speed" ), and I can confirm that it helps just as much as the "Athlon Optimized" DLLs I downloaded earlier ( curiously, file size is down by about 30% too ).
 