If Doom 3 had been written in D3D....

For instance, a GeForce 6800 Non-Ultra with an Athlon 1800+ will probably provide a better gaming experience than a Radeon 8500 using an Athlon 64.

This is too vague. What is "a better gaming experience"?
I can turn the resolution higher on the 9700 without getting a performance hit... The rendering quality is also slightly higher... That could be seen as a better gaming experience.
The thing is... The maximum framerate is too low... And that is not what I'd call a good gaming experience. Even in 640x480 with low detail, the framerate is not very high. Any other game would run at hundreds of fps in 640x480 with the lowest detail settings... Doom3 still manages about 30 fps tops... Which is barely enough for a decent gaming experience. Before you know it, it drops below playable when there are a few enemies around.
Funny how the same PC can easily do Max Payne 2 with lots of enemies in a room full of stuff which can be blown up, and not have a problem at all. And Max Payne 2 would be a game that is likely to be CPU-limited, since the graphics are a piece of cake for a 9700.
 
Scali said:
Actually, I never said that either. I just said that Carmack is the only moron still coding OpenGL... That does not imply that all morons code OpenGL.

Looks like you need a refresher course on basic logic, because this is totally illogical :D

If one says that Carmack is the only "moron" still coding OGL, that implies that no one else is coding OGL, and it also implies that anyone else who has coded in OGL is a moron. Anyone who can put two and two together can see that.
 
If one says that Carmack is the only "moron" still coding OGL, that implies that no one else is coding OGL, and it also implies that anyone else who has coded in OGL is a moron. Anyone who can put two and two together can see that.

Your logic is flawed there.
It doesn't imply that no one else is coding OGL, it implies that no other "moron" still codes OGL (there could still be non-morons coding OGL).
Which doesn't imply that "anyone else" who has coded in OGL is a moron, but merely that there have been other "morons" that have been coding OGL, but aren't anymore.
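To spell it out (a formal rendering of that one sentence, with the predicate names just shorthand): "Carmack is the only moron still coding OGL" amounts to

$$\forall x\ \bigl(\mathrm{Moron}(x) \land \mathrm{StillCodesOGL}(x) \rightarrow x = \mathrm{Carmack}\bigr)$$

which is perfectly compatible with $\exists x\ \bigl(\lnot\mathrm{Moron}(x) \land \mathrm{StillCodesOGL}(x)\bigr)$, i.e. non-morons still coding OGL, and with morons who coded OGL in the past but stopped. What it does not entail is $\forall x\ \bigl(\mathrm{CodedOGL}(x) \rightarrow \mathrm{Moron}(x)\bigr)$, i.e. that anyone who has ever coded OGL is a moron.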

Just out of curiosity... how much is 2 + 2 by your logic?
 
Scali said:
This is too vague. What is "a better gaming experience"?

Just about as vague as your comparison between the 8500 and 9700 ;) What I meant was, higher framerates at comparable image quality settings, and comparable framerates at higher image quality settings, as they both enhance the "gaming experience" obviously.

I can turn the resolution higher on the 9700 without getting a performance hit... The rendering quality is also slightly higher... That could be seen as a better gaming experience.

So the "gaming" experience really is better on a 9700 than an 8500 (especially if you actually bothered to control variables and use the same CPU). Surprise surprise!

The thing is... The maximum framerate is too low... And that is not what I'd call a good gaming experience. Even in 640x480 with low detail, the framerate is not very high. Any other game would run at hundreds of fps in 640x480 with the lowest detail settings

Maybe you feel you are not getting a good gaming experience on your machine and for your tastes, but your general impression also contradicts what every single reviewer has said about the game: from the sound quality, to the visuals, to the playability, to the CPU scaling.

Funny how the same PC can easily do Max Payne 2 with lots of enemies in a room full of stuff which can be blown up, and not have a problem at all. And Max Payne 2 would be a game that is likely to be CPU-limited, since the graphics are a piece of cake for a 9700.

This argument does not sound logical at all. Simply because Max Payne 2 plays well on your computer equipped with a 9700, you expect all brand new games to perform the same on your years-old hardware? Sounds a bit silly to me.
 
Scali said:
Your logic is flawed there.
It doesn't imply that no one else is coding OGL, it implies that no other "moron" still codes OGL (there could still be non-morons coding OGL).

This is an inane argument, totally illogical other than on a juvenile level. Yeah, let's group the "moron" developers in one corner, and say that Carmack is the only one in the group who codes in OGL while all the other morons in the group code in D3D.

Get a clue already, you are just running around in circles here (and obviously basic logic is not your strong point).
 
So the "gaming" experience really is better on a 9700 than an 8500 (especially if you actually bothered to control variables and use the same CPU). Surprise surprise!

It's useless to use the same CPU, since I already know that the 9700 is much faster then. It's also useless to use the same graphics card, because we already know that the faster CPU will get the better framerate.

The point is (jesus christ, is it that hard to get), that despite the MUCH faster card, a reasonably decent CPU like the 1800+ is barely able to drag out playable framerates, while the 8500 with a faster CPU is capable of quite high framerates. Which is something that I have not seen in any other game.

Maybe you feel you are not getting a good gaming experience on your machine and for your tastes, but your general impression also contradicts what every single reviewer has said about the game: from the sound quality, to the visuals, to the playability, to the CPU scaling.

Maybe I have a different view on Doom3 than the average reviewer who's been kissing up to John Carmack for the past few years, getting interviews and sparse info. Then again, I haven't seen any review with a system similar to mine. HardOCP only tested the real low-end... Not a P 1.5 with a 6800U for example. I wonder what their reaction would be if they got pretty much the same framerates as with the GF4MX.
But no, they went straight up to the 2.4 GHz stuff when testing the more recent videocards.

This argument does not sound logical at all. Simply because Max Payne 2 plays well on your computer equipped with a 9700, you expect all brand new games to perform the same on your hardware? Sounds a bit silly to me.

Maybe if you understood why... Max Payne 2 is basically the same sort of game as Doom3, CPU-wise... We have a very nice physics engine, we have very nice environmental 3D sound, we have some reasonably smart enemies running around, it is mostly indoors, and in reasonably small areas at one time (single rooms). Max Payne 2 actually has a lot more enemies running around, and a lot more objects that react to physics, than Doom3.

Now, Doom3 has the more advanced rendering... but that should mostly affect the GPU, not the CPU. And since Max Payne 2 is a piece of cake for both the CPU and the GPU, I would expect Doom3 to still be a piece of cake for the CPU, and only be hard on the GPU (that is how my Doom3-style rendering works anyway, as does for example the Battle Of Proxycon in 3dmark03).

But somewhere along the line something has gone wrong: suddenly the CPU is limiting the game, while it is still a piece of cake for the GPU, which simply doesn't get enough frames sent to it to run fast, regardless of resolution or detail settings.

In short, I think Carmack's code is unbalanced. He does way too much on the CPU, and way too little with the graphics and audio hardware.
I bet that Half-Life 2 will be completely different, running much better on the 1800+ than on the P4, like every other game out there that actually uses the extra hardware.
 
This is an inane argument, totally illogical other than on a juvenile level. Yeah, let's group the "moron" developers in one corner, and say that Carmack is the only one in the group who codes in OGL while all the other morons in the group code in D3D.

Yes, that is a possibility, from a purely logical point-of-view.
By the rules of logic, it is all valid... And the rules of logic are well-defined, and do not operate on levels such as 'juvenile' or 'adult'. Logic is universal.

Get a clue already, you are just running around in circles here (and obviously basic logic is not your strong point).

My logic was not the flawed one. So who should be getting a clue?
It just shows how much people implicitly read into things you never actually said.
I merely pointed out all the things that I DID say, from a purely logical point-of-view.
 
Scali, I disagree, I think Doom3 shows a very good balance between CPU and GPU. I don't think it's reasonable to believe pairing a high-end GPU with a low-end CPU will give you good results. You have to pair equal classes.
 
Scali said:
The point is (jesus christ, is it that hard to get), that despite the MUCH faster card, a reasonably decent CPU like the 1800+ is barely able to drag out playable framerates, while the 8500 with a faster CPU is capable of quite high framerates.

And a 9700 with a faster CPU is capable of even higher framerates, so what? These relative differences will depend on several variables: CPU architecture, GPU architecture, game architecture. Like I hinted at earlier, a faster and newer card such as the 9700 tends to benefit even more from a faster CPU than a slower card like the 8500 does. It would also not be inconceivable that a GeForce 5900XT would be faster with an 1800+ in Doom 3 than an 8500 with a faster CPU.

Maybe I have a different view on Doom3 than the average reviewer who's been kissing up to John Carmack for the past few years, getting interviews and sparse info.

I see no good reason for reviewers to deceive the public or be dishonest in their findings, given the fact that the game will be out in a day or so, and people will be able to find out for themselves.

Not a P 1.5 with a 6800U for example. I wonder what their reaction would be if they got pretty much the same framerates as with the GF4MX.
But no, they went straight up to the 2.4 GHz stuff when testing the more recent videocards.

A 6800U matched with a P 1.5 would be a total mismatch. Generally, as the performance of a graphics card increases, the benefit it gets from a faster CPU also increases. Also, faster and newer graphics cards tend to have faster and newer CPUs accompanying them. That's why most people who use 8500s will probably in general have a slower CPU than those people who own 9700s.

In short, I think Carmack's code is unbalanced.

LOL, of course you would think that. Unfortunately he is not here to respond to the anonymous comment, but I'm sure he'd think the same thing of your code too ;)

I bet that Half-Life 2 will be completely different

HL2 will be different for you, because it was designed more with the R3xx architecture in mind (and of course, it is written in your beloved D3D).

running much better on the 1800+ than on the P4, like every other game out there that actually uses the extra hardware

An 1800+ besting a P4 (no MHz given, so whatever that means) in every other game out there? LOL, yeah, sure. I am quite certain that there are some CPU comparison tests done by some hardware reviewers that show otherwise (meaning that performance differences depend on the game tested). I will try to dig up something for you.
 
Bah, typed up a huge reply, then decided against it. In the end, this moron here (me) will still be buying and playing the moron's game, but wouldn't know what Scali ever produced.

edit: too many typos.
 
Xmas wrote:
Scali, I disagree, I think Doom3 shows a very good balance between CPU and GPU. I don't think it's reasonable to believe pairing a high-end GPU with a low-end CPU will give you good results. You have to pair equal classes.

Just remember what 3dfx and nVidia said back then: "In a few years, you'll only have to upgrade your gfx card, since most of the workload will not be handled by the CPU anymore". Me still waiting... ;)
 
I think it shows a good balance between the two. But that's because the game should have been out for a year already. We should have been playing it with the FX line and the Radeon 9500+ line.

With 3 GHz CPUs if we were lucky. Now we have 3.8 GHz CPUs out there and cards 3 times faster than the last gen.


I dunno. I wish this had come out last year. Far Cry and Painkiller ruined this for me.
 
I don't think it's reasonable to believe pairing a high-end GPU with a low-end CPU will give you good results. You have to pair equal classes.

Erm, I believe that about two years ago, both the 1800+ and the 9700 were near the high end. There was the 9700Pro of course, and with CPUs... maybe 2200+ or so?
So as far as I recall, they are equal classes.
 
It would also not be inconceivable that a GeForce 5900XT would be faster with an 1800+ in Doom 3 than an 8500 with a faster CPU.

Erm... Why would it not be inconceivable that a 5900XT is faster than an 8500 with a faster CPU, while apparently the 9700 is not faster, because it is CPU-limited?
Does the 5900XT magically accelerate the CPU?

An 1800+ besting a P4 (no MHz given, so whatever that means) in every other game out there?

Obviously still the 2.4 GHz P4 with the 8500, that was the subject of the discussion the whole time.

I am quite certain that there are some CPU comparison tests done by some hardware reviewers that show otherwise (meaning that performance differences depend on the game tested). I will try to dig up something for you.

Well, Doom3 is a start. I doubt that there are many other games that will be slower on the 1800+. 3dmark03 surely isn't.
 
Just remember what 3dfx and nVidia said back then: "In a few years, you'll only have to upgrade your gfx card, since most of the workload will not be handled by the CPU anymore".

This is true for most games, Doom3 seems to be the exception... But apparently people look up to Carmack like he is a god, so whatever he does is right.
 
Back on topic: if Doom 3 was a D3D game, it'd look exactly the same, but people would be bitching about it not being OGL... :LOL:
 
I prefer OGL over D3D with NV or ATi. D3D has a nasty habit of getting hiccups and pauses all around. Haven't seen any with OGL-based games, no matter which hardware I used. They always run smooth.
Even Far Cry doesn't run well on my rig (D3D), but Doom 3 runs fast like nuts (OGL).
 
Scali said:
Okay... so you can re-use the texture register from an earlier stage... How often is that useful, and how often does it actually reduce stages?

Very often. I used it all the time in the R100/GF2 era. For per-pixel lighting I'd dot the bumpmap with a vector field map in stage 0, then modulate with the light color in stage 1.
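For anyone who hasn't touched the fixed-function pipeline since then, here is a minimal D3D8/9-style sketch of the closest standard two-stage setup: DOT3 in stage 0, modulate with the light color in stage 1. It assumes the per-pixel light vector is packed into the interpolated diffuse color rather than sampled from a second texture, and the device/texture/color parameters are placeholders, so treat it as an illustration of the two-stage idea rather than the exact R100/GF2 path described above.

```cpp
#include <d3d9.h>

// Classic fixed-function DOT3 bump lighting in two texture stages.
// Assumes the (biased) light vector is interpolated in the diffuse color.
void SetupDot3Lighting(IDirect3DDevice9* device,
                       IDirect3DTexture9* normalMap,
                       D3DCOLOR lightColor)
{
    // Stage 0: N.L -- dot the normal map (texture 0) with the light vector
    // carried in the interpolated diffuse color.
    device->SetTexture(0, normalMap);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: modulate the N.L result with the light color (texture factor).
    device->SetRenderState(D3DRS_TEXTUREFACTOR, lightColor);
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_CURRENT);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_TFACTOR);

    // End of the cascade.
    device->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}
```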
 
This article shows exactly what I mean:

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2149&p=7

Look at the XP2000+ (almost as slow as my 1800+): it gets 46.1 fps, and that's it... Changing the resolution doesn't change anything. Only on 3+ GHz systems does the resolution seem to have any impact at all on the framerate; below that, they are completely CPU-limited.
And there is more than a factor of 2 speed difference between the slowest and the fastest CPUs in the test, which is even more than the processor ratings from AMD would indicate.

So these figures justify my suspicions: Doom3 is way too CPU-heavy.
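The rule of thumb behind that reading of the numbers can be sketched in a few lines (the function name, the threshold, and the low-resolution figure are just illustrative choices of mine, not anything from the article):

```cpp
#include <cstdio>

// Rule of thumb: if dropping the resolution barely changes the framerate,
// the GPU is being starved and the CPU is the bottleneck.
// The 10% threshold is an arbitrary illustrative choice.
bool looksCpuLimited(double fpsHighRes, double fpsLowRes)
{
    return (fpsLowRes - fpsHighRes) / fpsHighRes < 0.10;
}

int main()
{
    // Hypothetical numbers in the spirit of the XP2000+ result quoted above.
    double fpsAt1024 = 46.1;  // 1024x768
    double fpsAt640  = 46.5;  // 640x480 -- barely any faster
    std::printf("CPU-limited: %s\n",
                looksCpuLimited(fpsAt1024, fpsAt640) ? "yes" : "no");
    return 0;
}
```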
 