3D tech with immediate benefits

Reverend

We have AA and aniso as prime examples. N-patches can fall into this category as well (if devs can come up with patches for existing games). I would discount insane speed from the "3D Tech" category.

What next? Any clues anyone?
 
With better 3d capabilities on 3d cards it seems like the 2d display limitation will be the next major hurdle. Sharp has something unique here:

3D display

Surround gaming also seems unique, if an affordable setup is available.
 
This 3D Display seems to be very similar to what Elsa did with their Ecomo4D Displays, though with a much lower price tag.
 
Some of you may laugh at me, but an immediate benefit would also be stable, bug-free drivers.

With this in mind I do not believe you can lay the blame entirely on the IHV. Sometimes it is Macro$shaft's fault at the OS level, and the developers' problem on the games/applications side.

Accept my apologies, Rev; I'm pretty sure I have strayed a little off topic, but it is an 'immediate benefit'.
 
Reverend said:
We have AA and aniso as prime examples. N-patches can fall into this category as well (if devs can come up with patches for existing games). I would discount insane speed from the "3D Tech" category.

What next? Any clues anyone?

Deferred rendering, I think we need to move away from immediate mode renderers... the way we do things now is not efficient; we need to be doing things much more efficiently than we are now... that would show an immediate benefit...
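A minimal sketch of the efficiency argument above (my own illustration, not anything from this thread): immediate-mode shading runs lighting for every fragment drawn, including ones later overwritten, while deferred shading resolves visibility first and lights each pixel exactly once. The function names and the single-pixel "scene" here are made up for the example.

```python
# Hypothetical illustration of deferred vs. immediate-mode shading cost.

def expensive_lighting(attrs, counter):
    """Stand-in for a costly lighting calculation; counts its invocations."""
    counter[0] += 1
    return attrs["albedo"] * attrs["n_dot_l"]

def immediate_mode(fragments, counter):
    """Shade every fragment in draw order; later fragments overwrite earlier ones."""
    final = None
    for frag in fragments:          # overdraw: lighting runs once per fragment
        final = expensive_lighting(frag, counter)
    return final

def deferred(fragments, counter):
    """Resolve visibility first (into a 'G-buffer'), then light the survivor once."""
    gbuffer = fragments[-1]         # last-drawn fragment wins in this toy model
    return expensive_lighting(gbuffer, counter)

# Four overlapping fragments covering one pixel:
frags = [{"albedo": a, "n_dot_l": 0.5} for a in (0.2, 0.4, 0.6, 0.8)]
imm_count, def_count = [0], [0]
assert immediate_mode(frags, imm_count) == deferred(frags, def_count)
print(imm_count[0], def_count[0])   # 4 lighting runs vs. 1
```

Same final pixel, a quarter of the lighting work; with heavy overdraw and real shaders the gap is what makes deferred approaches attractive.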

3D Textures? that could be interesting
 
IMO, a transparent (to the user) method of ensuring that lighting in games is uniform across different configurations. Some games can look radically different (at their defaults) due to variations in the default gamma settings of monitors and graphics cards. The end-user experience is spoilt, and dubious advantages are gained or lost, through things like shadows being far too dark, too light, or non-existent just by changing the graphics card and/or monitor.

Giving artists certainty about what the user would experience would allow for greater creativity and subtlety.
 
Gamma correction in the pixel pipe and higher precision are two others, Rev. Although they require some patching, it's minimal.
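To show why gamma correction in the pixel pipe matters, here's a minimal sketch (my own, not from the thread), assuming a simple power-law gamma of 2.2: blending framebuffer values directly in gamma space produces a visibly wrong result compared with blending in linear light.

```python
# Illustration (assumed power-law gamma of 2.2, not any real card's pipeline).

GAMMA = 2.2

def to_linear(v):
    """Decode a gamma-encoded value in [0, 1] to linear light."""
    return v ** GAMMA

def to_gamma(v):
    """Encode a linear-light value in [0, 1] back to gamma space."""
    return v ** (1.0 / GAMMA)

def blend_naive(a, b):
    """50/50 blend done directly on gamma-encoded values (the wrong way)."""
    return 0.5 * (a + b)

def blend_correct(a, b):
    """50/50 blend done in linear light, then re-encoded (the right way)."""
    return to_gamma(0.5 * (to_linear(a) + to_linear(b)))

# Blending black and white: the naive blend gives 0.5, which displays far
# darker than true half-intensity; the linear-light blend gives ~0.73.
print(round(blend_naive(0.0, 1.0), 2))    # 0.5
print(round(blend_correct(0.0, 1.0), 2))  # 0.73
```

The same error shows up in any blend that happens in gamma space, which is exactly why doing the correction in the pixel pipe (rather than per-application) is an immediate, mostly-transparent win.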
 
I fully understand the limitations that Surround Gaming impose: budget, desktop space, etc.

But if you're purely talking about a feature that brings about immediate benefits with limited developer requirements (or none), then I would honestly say look no further than Surround Gaming.

It really is one of those deals that once you sit down and experience it, you will never want to go back to a single display game.

I sent Matrox a Quake2 build that I made from the source code that I snagged off the ID FTP site. In all, it took me about 10 minutes to identify the changes needed to make Surround Gaming work. 10 whole minutes.

This is one of Surround Gaming's strongest points: it requires very little coding to make it work. Granted, there are some things certain games might need additional adjustments for, such as the HUD, etc. But most developers would have a very good idea of where in the code the changes would need to be made.
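The kind of small change involved can be sketched like this (my own illustration, not Typedef's actual Quake2 patch): the core of a triple-wide setup is recomputing the horizontal FOV from the new aspect ratio, so the extra monitors extend the view instead of stretching it. The 73.74-degree vertical FOV below is just the value that yields Quake's familiar 90-degree horizontal FOV on a 4:3 display.

```python
import math

# Hypothetical sketch of the FOV math behind a triple-wide display setup.

def horizontal_fov(vertical_fov_deg, aspect):
    """Derive the horizontal FOV (degrees) from a vertical FOV and aspect ratio."""
    v = math.radians(vertical_fov_deg)
    h = 2.0 * math.atan(math.tan(v / 2.0) * aspect)
    return math.degrees(h)

# Single 4:3 display vs. three of them side by side (aspect 4:1):
single = horizontal_fov(73.74, 4.0 / 3.0)      # ~90 degrees
triple = horizontal_fov(73.74, 3 * 4.0 / 3.0)  # ~143 degrees
print(round(single, 1), round(triple, 1))
```

The rest of a patch like the one described above would be plumbing: rendering at the triple-wide resolution and keeping HUD elements anchored to the center screen.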

Anyhow, I've been really blown away by the feature. If you think back to what some said about the Voodoo5's FSAA...Turn it on, never look back...that's the same way I view S.G.
 
misae said:
With this in mind I do not believe you can lay the blame entirely on the IHV. Sometimes it is Macro$shaft's fault at the OS level, and the developers' problem on the games/applications side.

With that in mind, and back to game rendering, would one have to look any further than the D3D renderer found in Diablo II and the Diablo II xpack? Since upgrading my system within the last year to an Athlon XP 1900+, 512 MB of Corsair XMS-2400 (in 2 DIMMs so as to get full use of the memory bus architecture on the nForce chipset), and an Asus A7N266-E, I'm finally at a place where sorceresses being in the game doesn't bring it to a grinding halt at 5 fps with 30 skipped frames (despite only 100 ping and lots of free memory, as shown by the fps command). It averages about 60-85 fps, sometimes lower on some parts of the map, but this is at only 800x600 in 16-bit color, mind you...

But as for necromancers, they still bring the thing down to around 10 fps, and I'm left wondering how much of an upgrade would be needed for the D3D renderer to remain at playable frame rates and not look choppy. The D3D renderer was some intermix of 2D sprites and D3D, supposedly to make the thing run better on even a 3dfx Banshee, and the box lists a PII 233 as all the game should need...
 
The Diablo2 D3D renderer plain *sucks*. It's slow on ANY kind of hardware when the screen gets busy. It just can't be avoided, it seems; even GeForce3s and 1.7 GHz CPUs make this game chug like crazy in big fights.

It can't be used for benchmarking at all; whereas Unreal-engined games were mostly CPU-bound, Diablo2 seems to be, like, laws-of-nature-bound or something. :)

And don't blame the poor necromancer, please; he gets to take all the crap when it's Blizzard's lame-ass programmers who're at fault! :D


*G*
 
Grall said:
The Diablo2 D3D renderer plain *sucks*. It's slow on ANY kind of hardware when the screen gets busy. It just can't be avoided, it seems; even GeForce3s and 1.7 GHz CPUs make this game chug like crazy in big fights.

It can't be used for benchmarking at all; whereas Unreal-engined games were mostly CPU-bound, Diablo2 seems to be, like, laws-of-nature-bound or something. :)

And don't blame the poor necromancer, please; he gets to take all the crap when it's Blizzard's lame-ass programmers who're at fault! :D


*G*

Honestly I've never played the game (because I hate the Diablo series and similar games) but WHAT THE HELL is Diablo 2 doing with a D3D renderer? Seems kind of overkill for a 2D game...
 
Typedef: One could argue the same point about TRUFORM by your logic, and it comes to the end user at a much smaller cost: $0.00! All that's needed is some developer or enthusiast to code a few lines and patch the game. It has been done with Quake, Quake2 and UT.

Nagorak: It added special effects for the spells and such, and also added a bit of perspective-corrected depth. The Glide renderer is much faster, though, and has prettier effects.

BTW Son Goku = [HCA]Son Goku?, yes or no will suffice :)
 
The reason your Diablo was running slow was more because of too many sprites than because of the 3D effects. You can see the same effect in Starcraft by building a lot of Terran missile towers in one place.
 
gkar1 said:
Typedef: One could argue the same point about TRUFORM by your logic, and it comes to the end user at a much smaller cost: $0.00! All that's needed is some developer or enthusiast to code a few lines and patch the game. It has been done with Quake, Quake2 and UT.

No. Patching the game won't suffice. You need to work with artists to produce N-patch-compatible 3D models. This is not necessarily an easy task.

And as for "immediate benefits" 3D tech, another one would be some form of temporal antialiasing. The simplest approach is to render at insane speed (600 fps or so) and blend the frames with a filter (a Gaussian should be good enough) down to the normal 60 or 85 fps that current displays can show. Of course, the benefit might be small except for perhaps racing games and other simulation games.
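The blending step described above can be sketched as follows (my own illustration, with made-up function names): collapse a group of oversampled sub-frames into one display frame using normalized Gaussian weights. For simplicity each "frame" here is a single scalar brightness sample rather than a full image.

```python
import math

# Hypothetical sketch of temporal antialiasing by Gaussian frame blending.

def gaussian_weights(n, sigma=1.0):
    """Gaussian weights for n sub-frames, centered and normalized to sum to 1."""
    mid = (n - 1) / 2.0
    w = [math.exp(-((i - mid) ** 2) / (2.0 * sigma ** 2)) for i in range(n)]
    total = sum(w)
    return [x / total for x in w]

def blend_subframes(subframes, sigma=1.0):
    """Collapse rendered sub-frames into one displayed frame."""
    w = gaussian_weights(len(subframes), sigma)
    return sum(wi * f for wi, f in zip(w, subframes))

# e.g. 10 sub-frames rendered at 600 fps blended into one 60 fps frame.
# An edge crossing the pixel mid-interval smears into a motion-blurred 0.5:
subframes = [0.0] * 5 + [1.0] * 5
print(round(blend_subframes(subframes, sigma=2.0), 3))  # 0.5
```

The Gaussian just weights the middle of the display interval most heavily; a box filter (uniform weights) would be the cruder alternative, at the cost of a harsher motion-blur trail.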
 
Very simple "requests":
Force colordepth.
Force texture color depth.
Force Z/W-buffer depth.


I know they are at least partially available on most cards; I'd just like to see full support for them through the control panels.

I'd also like to have SSAA "back"; MSAA sucks crap (I haven't seen the 9700 in action, though), or else some much-improved texture filtering.
 
And don't blame the poor necromancer, please; he gets to take all the crap when it's Blizzard's lame-ass programmers who're at fault!

Yes, indeed it is their programming, though I've finally hit a point where the necro is the one I see it most with (specifically an army necro, as a CE/amp-damage necro such as in cow runs doesn't have the same effect). One would think that this long after release, and with so many hardware upgrades (given they allegedly designed it to run on the low-end hardware of its day), it would actually run better today. But nooo... I'm not even sure an NV30 or NV35 with a Sledgehammer would do it upon release, though in a year's time or so I might find out, perhaps to my chagrin...

BTW Son Goku = [HCA]Son Goku?, yes or no will suffice

No

The reason your Diablo was running slow was more because of too many sprites rather than the 3d effects.

It is the D3D renderer, however: if one puts the game in DirectDraw mode, one does not see this. And it was Blizzard who decided to program their D3D renderer in this manner (allegedly, as I remember the PR, to make it easier to run on low-end cards like a Banshee). What they got instead was a game that performs poorly on any hardware. IMO, their decision to use 2D sprites in their D3D renderer the way they did was one of the lousiest design decisions they made...

Well, I haven't got Warcraft III yet. Perhaps someone who has can say whether War III's rendering avoids this issue or plays just as horrendously (anyone have a GF4 Ti4600 or ATI 9700 to compare it on now)?
 
Even if Truform required nothing else (on the modeling side, programming, etc.), it still doesn't come close to what S.G. brings to the table.

All in my opinion, of course.
 
Diablo 2 ran awesome on my Voodoo 3 in Glide.
It runs very well in heavy fights on my system. There are just 5 places in the game where it slows down: one spot in each of the 5 towns.
In Act 5 it's when taking a direct route to the blacksmith.
In heavy battles it doesn't slow down for me. I think earlier Nvidia drivers would be better for D2. I haven't touched the game for a year.
 
Son Goku said:
Well, I haven't got Warcraft III yet. Perhaps someone who has can say whether War III's rendering avoids this issue or plays just as horrendously (anyone have a GF4 Ti4600 or ATI 9700 to compare it on now)?

War3 is now completely 3D and runs fine from what I saw of it on my Radeon 8500. I don't feel the graphics were very impressive in a "blow you away" kind of way. I also thought the game was pretty boring, but then again I don't really like RTS games either. Just glad I borrowed it from a friend instead of spending $60 on it (WTF was with them overpricing the game so badly at release? You can get UT2003 for $40, and Blizzard charged $60 for War3????).
 
K.I.L.E.R said:
Diablo 2 ran awesome on my Voodoo 3 in Glide.
It runs very well in heavy fights on my system. There are just 5 places in the game where it slows down: one spot in each of the 5 towns.
In Act 5 it's when taking a direct route to the blacksmith.
In heavy battles it doesn't slow down for me. I think earlier Nvidia drivers would be better for D2. I haven't touched the game for a year.

What I suspect, though can't prove, is that the Glide renderer was much more optimized than the D3D renderer, which one would use with all non-3dfx cards (which means all video cards being manufactured today). There were a lot of delays (with D2 classic), and optimizing the D3D code might have been left out in rushing the game out the door... However, the xpack was in mid-development when the announcement was made that nVidia was buying 3dfx's assets... and from the look of it, it saw no improvement prior to the later release of the xpack.

War3 is now completely 3D and runs fine from what I saw of it on my Radeon 8500.

Thank goodness... if it runs fine on a Radeon 8500, it should be OK on my GF3 Ti500, or when I upgrade next year or so.
 