Ultra High Mode in UT 2003

What about Aquanox? Doesn't Aquanox utilize a robust DirectX 8 engine with pixel and vertex shader support, along with an advanced lighting model?
 
The question is whether there is no difference between High and UltraHigh due to vmem requirements, or whether the setting is simply being ignored in software until the next patch.

I'd try the Ultra High terrain detail at something like 800x600, no AF and no AA, to free up as much vmem as possible and see if that is enough to show a difference. I do know the 9700 Pro takes a supersampling-sized footprint as far as vmem usage goes, as verified by ATI on the Rage3D forums (i.e. 1600x1200 with AA takes a ~110MB vmem footprint with no textures at all!).
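For what it's worth, a quick back-of-the-envelope of my own (assuming 6x AA with 32-bit color and a 32-bit Z/stencil buffer, all stored at full sample resolution) lands in the same ballpark as ATI's figure:

Code:
// Rough vmem estimate for 1600x1200 with 6x AA. My own numbers, not ATI's;
// assumes 32-bit color, 32-bit Z/stencil, and AA buffers stored at full
// sample resolution (i.e. a supersample-sized footprint).
#include <cstdio>

int main() {
    const double w = 1600, h = 1200, samples = 6, bytes = 4;
    const double mb = 1024.0 * 1024.0;

    double colorAA = w * h * samples * bytes;  // multisampled color buffer
    double depthAA = w * h * samples * bytes;  // multisampled Z/stencil buffer
    double front   = w * h * bytes;            // resolved front buffer

    std::printf("color %.1f MB, depth %.1f MB, front %.1f MB, total %.1f MB\n",
                colorAA / mb, depthAA / mb, front / mb,
                (colorAA + depthAA + front) / mb);  // roughly 95 MB
    return 0;
}

That's already roughly 95MB before a single texture is uploaded, so it's easy to see why the remaining headroom might not cover another texture detail tier.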

So, in theory, if there are still better textures past "High" on the texture slider, they could either be ignored for whatever reason, or be failing gracefully and falling back due to memory requirements.
 
It would be nice if it is a mode that works on all cards.... however, it wouldn't be unprecedented for Epic to support this feature on only one card.... we all remember UT and the extra CD you could only use with S3 cards until Vogel added support into OpenGL, right?
 
Well, Daniel posted a much more comprehensive answer to this over at Rage3D and it explains it very well. I'll quote it here:

I can assure you that we did NOT lock out anything from customers that is on the CD.

We pick (conservative) default settings based on detected HW, and the code currently in place will only pick the highest texture detail on 256 MByte cards by default, though you can easily change this either via the menu or by modifying the ini file.

At highest texture detail some levels might use up to 180 MByte of data (textures + geometry), and if you have a lot of players this number might be even higher.

Here's how the detail settings work:

FWIW, we didn't ship with any textures that require a detail setting above High to be shown at full detail. The texture detail is basically a bias against the LOD level defined in the texture package. So e.g. large textures might have a "NormalLOD" of 2, which means at normal texture LOD ("Normal") the engine will skip the first two mip levels. At "Lower" it will skip 3 and at "Higher" it will skip only 1. To the best of my knowledge the highest NormalLOD used in UT2k3 is 2, which means that by setting your texture detail to "High" (ini and menus use a different notation as texture detail ended up being too fine-grained, and I'm referring to ini options) you'll end up with the full quality textures. We also do some clamping so small textures won't lose mipmaps at low texture detail settings.

Below are the ini options and their bias levels.

-4 UltraHigh
-3 VeryHigh
-2 High
-1 Higher
0 Normal
+1 Lower
+2 Low
+3 VeryLow
+4 UltraLow

As this is too fine-grained for the regular user, we mapped the words differently, so XYZ in the menus doesn't necessarily map to XYZ in the ini; this might have caused some confusion.

-- Daniel, Epic Games Inc.
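To make Daniel's bias explanation concrete, here's a minimal sketch of how a detail setting could translate into skipped mip levels. This is purely my own illustration with hypothetical names, not Epic's code; the clamp at the end is a simplified stand-in for the clamping he mentions for small textures.

Code:
// Illustration of a texture-detail bias applied against a texture's NormalLOD.
// Hypothetical names -- not Epic's actual code.
#include <algorithm>
#include <cstdio>

// The ini detail settings, expressed as the bias values listed above.
enum DetailBias {
    UltraHigh = -4, VeryHigh = -3, High = -2, Higher = -1,
    Normal = 0, Lower = +1, Low = +2, VeryLow = +3, UltraLow = +4
};

// A texture with NormalLOD = 2 starts at mip 2 on "Normal"; the detail bias
// shifts that starting mip. Clamping keeps the result inside the mip chain
// (a simplified stand-in for protecting small textures at low detail settings).
int firstMipToLoad(int normalLOD, int detailBias, int numMips) {
    int mip = normalLOD + detailBias;
    return std::max(0, std::min(mip, numMips - 1));
}

int main() {
    // A large texture with NormalLOD = 2 and 12 mip levels:
    std::printf("High:   mip %d\n", firstMipToLoad(2, High,   12));  // 0 -> full detail
    std::printf("Normal: mip %d\n", firstMipToLoad(2, Normal, 12));  // 2
    std::printf("Lower:  mip %d\n", firstMipToLoad(2, Lower,  12));  // 3
    return 0;
}

With NormalLOD never going above 2 in the shipping content, this also shows why anything past "High" (bias -2) can't add detail: you're already at mip 0.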
 
rhink said:
It would be nice if it is a mode that works on all cards.... however, it wouldn't be unprecedented for Epic to support this feature on only one card.... we all remember UT and the extra CD you could only use with S3 cards until Vogel added support into OpenGL, right?

Actually, it was as usual done by an outside party; AFAIK Loki was the first to release an OpenGL renderer that allowed all cards to use the second CD... just like with most online games, sadly, most of the improvement comes from the internet, including maps.
There is also the other side of that: binds and aimbots are already being used in UT2003, and we have to rely on some other coder out there to make another CHSP or bot detection/Pure Server mod. Epic didn't think it was important, I guess.

Carmack has commented on the speed gains from executing everything in a single pass, as much as 30%:

The fragment level processing is clearly way better on the 8500 than on the
Nvidia products, including the latest GF4. You have six individual textures,
but you can access the textures twice, giving up to eleven possible texture
accesses in a single pass, and the dependent texture operation is much more
sensible. This wound up being a perfect fit for Doom, because the standard
path could be implemented with six unique textures, but required one texture
(a normalization cube map) to be accessed twice. The vast majority of Doom
light / surface interaction rendering will be a single pass on the 8500, in
contrast to two or three passes, depending on the number of color components
in a light, for GF3/GF4 (*note GF4 bitching later on).

Initial performance testing was interesting. I set up three extreme cases to
exercise different characteristics:

A test of the non-textured stencil shadow speed showed a GF3 about 20% faster
than the 8500. I believe that Nvidia has a slightly higher performance memory
architecture.

A test of light interaction speed initially had the 8500 significantly slower
than the GF3, which was shocking due to the difference in pass count. ATI
identified some driver issues, and the speed came around so that the 8500 was
faster in all combinations of texture attributes, in some cases 30+% more.
This was about what I expected, given the large savings in memory traffic by
doing everything in a single pass.

Not sure why UT2003 would not benefit the same way :-?
 
There is nothing wrong with IHVs paying for special features, versions, or patches for their cards. It was quite common in the past for IHVs to pay for 3D ports (of software-rasterized games) which were included in a game's bundle. Giants is another example of a game that had a special "NVidia" version.

If ATI had paid Epic to include a special TruForm mode (which, to this day, only really works well on ATI hardware), I doubt Doomtrooper and others would be complaining.
 
Maybe because UT already runs way faster than Doom 3 is going to run even in one pass, and the bottleneck in UT is more the CPU than the graphics card?

By the way, set physics detail to low... you'll barely tell the difference, and it won't eat up so much CPU time.
 
If ATI had paid Epic to include a special TruForm mode (which, to this day, only really works well on ATI hardware), I doubt Doomtrooper and others would be complaining.

I'm still very interested in how some people like to take starkly, blatantly incomparable situations and somehow equate them as the "same"....

There is obviously nothing wrong with supporting a feature, technology or bonus on one chipset versus another, as long as the reasoning behind it is a genuine technological advantage over other IHVs.

When the GF3 first came out, there wouldn't have been any argument against a game that used pixel/vertex shaders. Given the GF3 was the only card with support for them at the time, this would be perfectly acceptable.

In other words, if there are some advanced textures in UT2003 that require 256MB of vmem, then any videocard with 256MB should be able to use them. The complaints would start when you have competing products of equal or similar capabilities and a feature has been intentionally thwarted to give an illusion of hardware superiority (i.e. another 256MB videocard that is artificially prevented from using them, outside the realm of driver bugs or similar).

Equating TruForm to this is baffling, to say the least. Obviously, if TruForm were supported by NVIDIA, it would make for an equivalent argument, but since there is no TruForm support on current-generation NVIDIA cards, it's a ludicrous example. If some form of "middle ground" HoS support with wider market share could be added, that would always be preferred.

So you are correct: I doubt people would be complaining when there is no reasonable or logical reason to complain. If brand X has feature Y and brand Z does not, then there is no valid argument for why brand Z should have this feature.
 
umm, call me stupid, but UT2003 DOES support truform. and quite nicely, too. Pity it is visible just on player (and weapon) models. those adrenaline pills sure could use some rounding. (and remember tim's rant about how truform sucks)

also, Doomtrooper, AFAIK Loki = Vogel. Vogel, please correct me, if i'm wrong.

And if you ask me, this is just another example of a hearsay-theresay argument thread. there are still MONTHS before we'll see the nv30 on the shelves, and i don't care the smallest bit if there will be an nv-specific feature or not. i got mine.
in other news: i just received a 9700 board. woohoo!
 
In other words, if there are some advanced textures in UT2003 that require 256MB of vmem, then any videocard with 256MB should be able to use them. The complaints would start when you have competing products of equal or similar capabilities and a feature has been intentionally thwarted to give an illusion of hardware superiority

Thank you, Shark. This is exactly my concern. I know that you and I are having a little *discussion* right now at Rage3D... but I really appreciate that you can see why this is different from just supporting an IHV's feature.
 
quattro said:
umm, call me stupid, but UT2003 DOES support truform. and quite nicely, too. Pity it is visible just on player (and weapon) models. those adrenaline pills sure could use some rounding. (and remember tim's rant about how truform sucks)

also, Doomtrooper, AFAIK Loki = Vogel. Vogel, please correct me, if i'm wrong.

And if you ask me, this is just another example of a hearsay-theresay argument thread. there are still MONTHS before we'll see the nv30 on the shelves, and i don't care the smallest bit if there will be an nv-specific feature or not. i got mine.
in other news: i just received a 9700 board. woohoo!

Loki was a company that was involved in converting games to run on Linux. Unfortunately they folded...

Vogel worked at Loki and was responsible for the UT conversion for Linux. As Linux does not support D3D, an OpenGL-compatible renderer was necessary. The original UT OpenGL renderer basically sucked ass. Anyway, he was the one who wrote the OpenGL renderer for the Linux port, and eventually that made its way back to the Windows version. After Loki filed for Chapter 11, he was hired by Epic. At least that's what I understand; he would know better than I, of course. ;)
 
Also, can someone explain why this thread is still going? Daniel Vogel just denied the story about higher-res textures being supported only on the NV30. It's completely false... any discussion pertaining to it is thus irrelevant.
 
Well Shark, you didn't read what I said. I said any IHV should be able to pay any ISV to put in special support for features that their hardware has. The situation is exactly comparable to TruForm. I did not say "any IHV should be able to pay any ISV to disable the game from working on their card when, in fact, it really could work at appropriate performance levels."

You, however, are operating on a false rumor that somehow UT simply says

Code:
If ULTRA_HIGH_RES == ON and CARD != NV30 then FAIL
Yes, that isn't a comparable situation, but that's not what I claimed anyway.

What really happens is something like this

Code:
IF OpenGL_extension_exists(NVidia_Extension) AND can_run_extension_fast(NVidia_Extension) then Enable_Special_Feature();

For example, UT2003 could be enhanced to use NV_vertex_program2 over ARB_vertex_program if there are some shaders that require data-dependent branching. Or, if UT2003 actually had some pixel shaders, it could be enhanced to use NV_fragment_program if NV30-specific pixel shader features were needed.
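For what it's worth, the detection side of that pseudocode is mundane. Here's a rough sketch of the classic extension-string check (glGetString and the extension names are real; the "fast enough" flag is a hypothetical stand-in for whatever heuristic an engine would use, and a current GL context is assumed):

Code:
// Sketch of renderer-side vertex path selection -- not UT2003's actual code.
#include <GL/gl.h>
#include <cstring>

static bool hasExtension(const char* name) {
    // Assumes a current GL context; classic (pre-GL3) extension-string query.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != 0 && std::strstr(ext, name) != 0;
}

enum VertexPath { VP_NV2, VP_ARB, VP_FIXED };

VertexPath pickVertexPath(bool nvPathIsFastEnough) {
    if (hasExtension("GL_NV_vertex_program2") && nvPathIsFastEnough)
        return VP_NV2;    // NV30-specific path (data-dependent branching)
    if (hasExtension("GL_ARB_vertex_program"))
        return VP_ARB;    // vendor-neutral fallback
    return VP_FIXED;      // fixed-function fallback
}

Nothing about that pattern locks other vendors out of anything; it just picks the best path the driver exposes.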

And there is nothing wrong with any IHV paying for the addition of support for their specific OpenGL extensions or hardware features.

The fact is, you are all too willing to latch onto any ominous rumor before it is confirmed as long as it is anti-NVidia.
 
I certainly understand the concept of offering features in a title that may enhance the sale of a 3d product.

I really like how this talk about game titles and 3D companies so easily throws around words like "pay" and "cash".
 
Actually, DemoCoder, if you had actually read what I posted, you would understand I haven't latched onto any rumor.

You, however, are operating on a false rumor that somehow UT simply says
Code:
If ULTRA_HIGH_RES == ON and CARD != NV30 then FAIL

No, actually what I said is:
The question is whether there is no difference between High and UltraHigh due to vmem requirements, or whether the setting is simply being ignored in software until the next patch.

I don't recall saying anything about the NV30, or "if CARD != NV30 then FAIL", one bit. I'd be very interested where this came from, as it surely didn't come from me.

And I disagree that IHVs should be able to finance game makers to enable abilities for themselves and/or disable abilities for other IHVs, especially in games that have benchmark features.

This serves no purpose except to taint the playing field by allowing the highest bidder to present artificial "superiority", and to bring the hardware enthusiasts' battle to the home gamer rather than leaving it in website forums and on MadOnion, where it belongs.

Your average Joe game consumer only cares if their game works, works well, and works as advertised. The moment a game comes out with "Advanced reflections! (NVidia only), Superior pixel shader effects! (ATI only), and lifelike shadows (Parhelia only)" will be the day game developers realize what this causes: forums full of complaining customers. Just go look at Bioware's forums concerning shiny water and Neverwinter Nights. They have been on full-time thread deletion duty for months, until they finally created a "sticky" thread and committed to getting pixel shader effects for everyone else with said capability.

So the fact truly is, you are willing to latch onto any pro-NVidia propaganda or methods long before any such rumors can substantiate themselves in reality... and to simply discount any real discussion or interest in these matters as some sort of anti-NVidia agenda... which is funny, as I have purchased and used every NVIDIA product since the TNT2. :)
 
Doomtrooper said:
Carmack has commented on the speed gains from executing everything in a single pass, as much as 30%:
As said before, UT2003 could have different bottlenecks which limit these gains. But I would suspect that, since UT and Doom seem to be using multiple passes for different reasons, they will not necessarily yield the same results.

If the CPU were the bottleneck, and using a single pass saved 30% of the work off the GPU, would that mean you can turn on more video card features for "free"?
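To put that question in numbers, a toy illustration (the millisecond figures are invented; only the max() relationship matters):

Code:
// Toy illustration of why a CPU-bound game gains little framerate from
// GPU-pass savings. Numbers are invented purely for illustration.
#include <algorithm>
#include <cstdio>

int main() {
    double cpuMs = 20.0;                   // per-frame CPU work (game logic, physics, driver)
    double gpuMs = 10.0;                   // per-frame GPU work
    double gpuSinglePassMs = gpuMs * 0.7;  // hypothetical 30% saving from fewer passes

    // With CPU and GPU working in parallel, frame time is roughly the slower of the two.
    std::printf("before: %.1f ms, after: %.1f ms\n",
                std::max(cpuMs, gpuMs), std::max(cpuMs, gpuSinglePassMs));
    // Both print 20.0 ms: the saving shows up as GPU headroom rather than framerate.
    return 0;
}

So in a CPU-limited title, the single-pass saving would mostly convert into headroom for "free" GPU-side features (AA, resolution, etc.) rather than extra frames.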
 
Luminescent said:
What about Aquanox? Doesn't Aquanox utilize a robust DirectX 8 engine with pixel and vertex shader support, along with an advanced lighting model?

Bah, Aquanox isn't much of a real game. It's much more of a tech demo.

And no, it's not truly DX8 unless it requires DX8 hardware to run. That is, one cannot possibly see the maximum out of given PC hardware until that hardware is assumed to be the minimum spec.
 
Humus said:
Mephisto said:
No, UT2003 uses neither pixel nor vertex shaders.

From UT2003.ini:

[D3DDrv.D3DRenderDevice]
UseHardwareVS=True
MaxPixelShaderVersion=255

This setting has no effect. It was used for internal technology testing (as the engine can support VS, but the game does not use it).
 