DX7 GFX CAN MATCH DX9-LEVEL GRAPHICS!!

NeoCool said:
Hhhhmmmm, true. Actually, Valve doesn't do a very good job in HL2 of taking advantage of DX6-DX9's full potential; I'm not very impressed at all. But does anyone have screenshots of HL2 comparing the IQ across DX6-DX9 class hardware? :?:

While I understand what you mean, what you're asking a developer to do is kill the hardware or severely limit their potential audience. It's not a console; we still have people buying new games to play them on Voodoo 2's.
Meaning: they cannot program a game to take full advantage of DX9 without killing the hardware. I propose that if you made a game that took every DX9 option and enabled it, it would kill a 9800XT. So you can imagine what an 8500 or GF4 would do.

It will probably be R500/NV50 before we see DX9 in all its glory.

My two cents.
 
NeoCool said:
No, I'm not an NVIDIA employee; I've never even been to their HQ. Hah. And no, er, I'm not promoting DX7 features over DX8/DX9 features, I'm just saying DX7 features aren't taken advantage of nearly enough. :D

I think you're making the all too common mistake of confusing the technology available with the developers who create the games. Much of the way things appear in a game has less to do with technology and more to do with the talent of the artists doing the work. It's garbage in, garbage out... :) Even if you are using the most powerful 3D chip ever made with the most advanced API, if the textures and other assets you use look like crap, then the technology won't improve much on that, because it can't. OTOH, powerful new hardware in concert with new APIs which support it, in combination with artists & programmers who know what they're doing, can produce some mighty impressive results... :)

A good example of this is the recent Deus Ex 2: Invisible War demo from IS/Eidos. The IQ of the demo is pure-tee crud on my 9800P, which has everything to do with the demo itself and nothing to do with my hardware or the API.

As someone else pointed out earlier, advances in the API have to do with standardizing approaches for doing complex rendering in a much simpler manner, so that developers as a group feel encouraged to raise the IQ bar, because doing so no longer intimidates them. Just because you can do something in DX7 in, say, 12 steps as opposed to 2 steps in DX9 doesn't mean that you should do it in DX7, or that you'd even want to.

In short, you'll always see differences among developers between the technology they use in a game and how they use it. Two developers can claim to use "DX7" or "DX9", and yet the graphical results each obtains on the same hardware and API can be light years apart.
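
To make the "12 steps vs. 2 steps" point concrete, here's a toy CPU-side sketch (made-up values, hypothetical code, not from any real engine or API): the "DX7-style" path has to build the lighting expression up in the framebuffer one blended pass at a time, while the "DX9-style" path evaluates the whole thing in a single shader-like step, and both end up with the same pixel.

[code]
// Hypothetical illustration of multi-pass vs. single-pass lighting.
// All values and pass breakdowns are made up for the example.
#include <cstdio>

struct Color { float r, g, b; };

static Color add(Color a, Color b)   { return { a.r + b.r, a.g + b.g, a.b + b.b }; }
static Color mul(Color a, Color b)   { return { a.r * b.r, a.g * b.g, a.b * b.b }; }
static Color scale(Color a, float s) { return { a.r * s, a.g * s, a.b * s }; }

int main() {
    // Inputs a pixel might see: base texture, lightmap, specular term.
    Color baseTex  = { 0.6f, 0.5f, 0.4f };
    Color lightMap = { 0.9f, 0.8f, 0.7f };
    Color specular = { 0.2f, 0.2f, 0.2f };
    float gloss    = 0.5f;

    // "DX7-style": several passes, each blending another term into the
    // framebuffer, because fixed-function hardware can only combine a
    // couple of terms per pass.
    Color framebuffer = { 0, 0, 0 };
    framebuffer = add(framebuffer, mul(baseTex, lightMap)); // pass 1: diffuse * lightmap
    framebuffer = add(framebuffer, scale(specular, gloss)); // pass 2: add gloss-scaled specular
    // ...a real multi-pass path might need further passes for fog, detail maps, etc.

    // "DX9-style": the whole expression evaluated in one pass.
    Color onePass = add(mul(baseTex, lightMap), scale(specular, gloss));

    printf("multi-pass: %.2f %.2f %.2f\n", framebuffer.r, framebuffer.g, framebuffer.b);
    printf("one pass:   %.2f %.2f %.2f\n", onePass.r, onePass.g, onePass.b);
    return 0;
}
[/code]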
 
Socos said:
I propose that if you made a game that took every DX9 option and enabled it, it would kill a 9800XT.
Graphics APIs aren't a series of options that can be turned on and off. In fact, it is conceivable to write a program that would attempt to draw an infinite number of pixels in one frame (of course, the program would never run to completion).

So yes, with any graphics API, you can kill performance on any graphics card, if that is what you desire.

In this way, "making full use" of an API doesn't really mean all that much. There's still the fact that the developers are trying to display actual graphics, and there is a plethora of different ways to render a scene.

The only comparison that makes any sense is, when development began, what baseline hardware did the developers assume the game would be run on? That will determine a subset of rendering algorithms that are feasible. The actual algorithm chosen will determine how well the game engine can adapt to more powerful hardware.

DOOM3, for instance, was developed with the assumption that shadow volume generation would need to be done on the CPU. This precluded the use of any sort of higher-order surfaces. If the NV40 and R420 can efficiently calculate shadow volumes on the GPU, then JC's choice of rendering algorithm will have limited how well the DOOM3 engine could translate to more powerful hardware.

Similarly, JC was forced to use stencil buffers for the shadowing because that was the only shadow technique available on his baseline hardware.

Now, DOOM3 will be pretty slow on the original GeForce and Radeon graphics cards not because DOOM3 "fully uses DX7," but because the rendering algorithm chosen requires many passes on that hardware (not to mention DOOM3 doesn't use DirectX at all...). It doesn't even have anything to do with what options are turned on. The original GeForce and Radeon just don't have the flexibility to be efficient with JC's rendering algorithm.
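
For anyone curious what "shadow volume generation on the CPU" actually involves, here is a rough, self-contained sketch of the silhouette step (toy tetrahedron and made-up light position, not id's actual code): classify each triangle as light-facing or not, then pick the edges shared by one of each. The engine then extrudes those edges away from the light and renders the resulting volume into the stencil buffer, once per light, which is where all the extra passes on older hardware come from.

[code]
// Hypothetical silhouette-edge extraction for stencil shadow volumes.
// Toy mesh and light position, for illustration only.
#include <algorithm>
#include <cstdio>
#include <map>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };
struct Tri  { int a, b, c; };

static Vec3 sub(Vec3 p, Vec3 q)   { return { p.x - q.x, p.y - q.y, p.z - q.z }; }
static Vec3 cross(Vec3 p, Vec3 q) { return { p.y*q.z - p.z*q.y, p.z*q.x - p.x*q.z, p.x*q.y - p.y*q.x }; }
static float dot(Vec3 p, Vec3 q)  { return p.x*q.x + p.y*q.y + p.z*q.z; }

int main() {
    // Toy closed mesh (a tetrahedron) and a point light.
    std::vector<Vec3> verts = { {0,0,0}, {1,0,0}, {0,1,0}, {0,0,1} };
    std::vector<Tri>  tris  = { {0,2,1}, {0,1,3}, {0,3,2}, {1,2,3} };
    Vec3 light = { 2.0f, 2.0f, 2.0f };

    // 1. Classify each triangle as facing the light or not.
    std::vector<bool> facesLight(tris.size());
    for (size_t i = 0; i < tris.size(); ++i) {
        Vec3 n = cross(sub(verts[tris[i].b], verts[tris[i].a]),
                       sub(verts[tris[i].c], verts[tris[i].a]));
        facesLight[i] = dot(n, sub(light, verts[tris[i].a])) > 0.0f;
    }

    // 2. An edge is on the silhouette if exactly one of the two
    //    triangles sharing it faces the light.
    std::map<std::pair<int,int>, int> lightFacingCount;
    for (size_t i = 0; i < tris.size(); ++i) {
        int e[3][2] = { {tris[i].a, tris[i].b}, {tris[i].b, tris[i].c}, {tris[i].c, tris[i].a} };
        for (auto& ed : e) {
            auto key = std::make_pair(std::min(ed[0], ed[1]), std::max(ed[0], ed[1]));
            lightFacingCount[key] += facesLight[i] ? 1 : 0;
        }
    }
    for (auto& kv : lightFacingCount)
        if (kv.second == 1)  // one light-facing neighbour, one back-facing neighbour
            printf("silhouette edge: %d-%d\n", kv.first.first, kv.first.second);
    return 0;
}
[/code]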
 
NeoCool said:
The EBM TRANS link doesn't work, btw. And yeah, GeForce1/2/4MX/Radeon 7xxx-level hardware in D3 @ 320x240 with HIGH detail settings should get like 30 fps or so. The graphics cards themselves support the stencil buffer in hardware, along with geometry acceleration; combine that with their specular lighting support, hardware T&L, and DOT3BM/EBM support, and you could certainly play D3 at a reasonable frame rate at high settings. You'd just have to settle for a reeealllly low resolution and no AA/AF. ;)

High detail settings in Doom3 include AF.
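
For reference, the DOT3 trick that class of hardware relies on is just a per-pixel dot product done in the texture combiners: the normal map stores the normal remapped into [0,1], and the combiner expands both inputs back to [-1,1] and dots them, giving an N dot L lighting term per pixel. A tiny sketch with made-up values (not from any real game):

[code]
// Hypothetical illustration of the DOT3 bump mapping math.
#include <cstdio>

struct RGB { float r, g, b; };   // values in [0,1], as a texture would store them

// Pack a unit vector into the [0,1] range a texture can hold.
static RGB pack(float x, float y, float z) {
    return { x * 0.5f + 0.5f, y * 0.5f + 0.5f, z * 0.5f + 0.5f };
}

// The DOT3 combiner op: expand both colours back to [-1,1], then dot them.
static float dot3(RGB a, RGB b) {
    return (2*(a.r-0.5f))*(2*(b.r-0.5f))
         + (2*(a.g-0.5f))*(2*(b.g-0.5f))
         + (2*(a.b-0.5f))*(2*(b.b-0.5f));
}

int main() {
    RGB normalTexel = pack(0.0f, 0.0f, 1.0f);        // normal pointing straight out of the surface
    RGB lightColor  = pack(0.0f, 0.7071f, 0.7071f);  // light direction at 45 degrees, packed the same way
    printf("per-pixel N dot L = %.3f\n", dot3(normalTexel, lightColor)); // ~0.707
    return 0;
}
[/code]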
 
Chalnoth said:
(not to mention DOOM3 doesn't use DirectX at all...).

No, Doom 3 will require and use DirectX in software and DirectSound in hardware; the only category Doom 3 won't use DirectX for is graphics, and that's where the OpenGL API comes in. Or, instead of using DirectSound for Doom 3, OpenAL or hardware-accelerated 3D surround sound or EAX will be used. Otherwise, DirectX will be used in Doom 3, no doubt, though not really hardware-wise. :)
 
NeoCool said:
Ailuros said:
"dx8 details" (?) as in what and where exactly?

They are hardly noticeable: some more detailed fluid surfaces, maybe some other little details. Same with Unreal2. :rolleyes:

Feel free to consult Daniel Vogel about the imaginary dx8 effects. There's quite a difference between utilizing specific units/instructions and categorizing effects for a specific API. Last time I checked on dx8.1 and dx9.0 compliant hardware, I couldn't see anything that goes out of the dx7 realm.
 
I doubt the Linux and Mac versions/ports of D3 will benefit from DX. ;)
 
NeoCool said:
...or EAX will be used.

EAX is NOT a complete API. It is just a set of extensions to DirectSound3D and OpenAL. Remember EAX stands for Environmental Audio Extensions.
 