Ultra High Mode in UT 2003

You talk about this like it's a bad thing. The things that you describe we don't have in any other game to date. Graphically, UT2k3 is easily the most advanced we've yet seen (As a side note: bump maps...if they haven't been done well, they might as well not have been done at all...I don't think they've been done well yet...).

There is no "life" in the game, everything is static. For me, UT2K3 looks as boring as any other multitexturing out there. I scoff at high static poly counts, especially as you still see the individual polys without TruForm.
 
SharkFood said:
Your average Joe game consumer only cares if their game works, works well, and works as advertised. The moment a game comes out with "With advanced reflections! (NVidia only), Superior pixel shader effects! (ATI only) and lifelike shadows (Parhelia only)" will be the day game developers realize what this causes- forums full of complaining customers.

Or "Enable TruForm(tm)" (ATI only).

The flaw in your argument is that the game DOES work as advertised for Joe Consumer. The fact that Joe Consumer has a check-box that says "Enable Environment Bump Mapping" or "Enable TruForm" that is grayed out, or that he can't select stenciled shadows (no stencil buffer) or advanced pixel shader effects (DX7-class card), is irrelevant. The game was advertised as "minimum recommended configuration == TNT-class card," and if his setup meets the minimum spec, it will run as advertised.

However, having the minimum spec does not guarantee that the game will look or run the same on every single person's system. The PC is NOT a Game CONSOLE.

We are not talking about UT2003 NOT RUNNING AS ADVERTISED, we are talking about a switch that an NV30 owner can flip to enhance the experience on his own card.


People who buy more expensive cards SHOULD be able to buy games that have been patched or ported to use the advanced functionality. For example, the NVidia Giants patch that enabled S3TC and DOT3 for cards that supported them.

Likewise, if I owned an NV30 and NVidia paid Epic to make a special mode/patch that made the game look 10x better on an NV30, I would gleefully thank NVidia for encouraging Epic to make my advanced hardware actually useful. Likewise if Epic enhanced UT2k3 to use DX9 PS2.0 for my R300 PRO (which I own now).

3dfx would never have made it as far as they did had John Carmack not produced a version of Quake that ran specifically on the 3dfx GL minidriver, a hacked-up OpenGL implementation that NO ONE in the industry supported at the time Carmack implemented it.


If the tables were turned and ATI was paying developers for ports and patches, you'd be talking about how great ATI's customer and developer support is. How nice of the company to influence developers to support the features of your card. Basically, anyone who supports PS1.4 is supporting an ATI-only feature as well. Shouldn't you be complaining about this?

The fact of the matter is, I am pro-ATI and pro-NVidia and I am tired of these silly corporate groupies twisting every bit of news into anti-Something. The only reason I bring up ATI and TruForm is to try to shake some sense into you so you can see the transparent hypocrisy of your statements.


When ATI buys an exclusive port to one of their hardware features, I expect to see righteous indignation from you on B3D railing against them for forcing into a game any feature that is not DX9 compliant or standard OGL (no proprietary ATI-only extensions). Otherwise, your righteous indignation against this issue for NVidia is a sham.

Hmm, I wonder which case is true?
 
Hi there,

I, too, wonder why this thread is still going on. People should re-read what Daniel posted and his quotes, too.

DemoCoder said:
We are not talking about UT2003 NOT RUNNING AS ADVERTISED, we are talking about a switch that an NV30 owner can flip to enhance the experience on his own card.
Well, NV30 owners can't. There are no special NV30 features with UT2003, no textures are "locked" and need to be "unlocked" by a special patch. As Daniel made clear, the whole issue was a misunderstanding by PC Games.

ta,
-Sascha.rb
 
However, having the minimum spec does not guarantee that the game will look or run the same on every single person's system. The PC is NOT a Game CONSOLE.

The flaw in your argument is that nobody ever said that a PC is a GAME CONSOLE... or anything even remotely close to this wrongful assumption. Once again, I'm baffled where this is coming from as it's surely not from me. :)

And we are also not talking about the minimum spec. We are talking about superior or equal spec. End of story.

People who buy more expensive cards SHOULD be able to buy games that have been patched or ported to use the advanced functionality. For example, the NVidia Giants patch that enabled S3TC and DOT3 for cards that supported them.

Absolutely in agreement. But any other card that has equal (or superior) support for S3TC and DOT3 should also have the option available to them.

If the tables were turned and ATI was paying developers for ports and patches, you'd be talking about how great ATI's customer and developer support is.

It's good to know Miss Cleo has no competition, because, quite frankly, your prediction skills totally suck. It doesn't matter what IHV financed artificial* and fictional superiority in games- I'm not for it one bit.

If Joe Gamer has equal support for feature XYZ, there should be absolutely no reason why feature XYZ shouldn't be available in games purchased. IHV financing is no reasoning for this.

I am pro-ATI and pro-NVidia and I am tired of these silly corporate groupies twisting every bit of news into anti-Something.

Maybe if you state that 100 more times, someone might actually start to believe it. :) It's pretty obvious from your posting history that you will only cite trivial ATI advancements, while earnestly condemning beyond all reason anything with mere mention of 3dfx. To claim objectivity given that history is kind of a moot point... but I digress.

The only reason I bring up ATI and TruForm is to try to shake some sense into you so you can see the transparent hypocrisy of your statements.

And as I have already quite clearly explained- TruForm isn't even within the same universe as the kind of issue being described here. If NVIDIA and ATI both had a middle-ground form of HOS that could be supported, your arguments might actually hold merit... but as there isn't such a thing at this time, they are nothing more than grandstanding, a desperate attempt to draw fire away from the point with a massive false equivalence.

I'll say it as simply as I possibly can in hopes that you may actually understand it:
A featureset that is equally shared amongst multiple IHVs should never, under any circumstances, be allowed to be made exclusive through IHV financial tampering.

That is as simple as it gets. Does the NV25 support TruForm? No, it doesn't. So how does this qualify? Simply put- it doesn't.

Like I already said, but was hopelessly ignored in the good name of grandstanding- a GF3 game released with p/v shaders at the point of the GF3's release would be a *good thing*- even if only this IHV had a product that could expose these features. This isn't artificial thwarting of hardware... there was no equal or superior support for these features at the time.

Does the ATI card have 128MB of video memory? Yes, it does. Should it have the same texture limits (given equal/shared methods of TC) in games as other 128MB video cards, exceptions being possible bugs or other issues preventing their use? Yes, it sure had better.

It's a pretty basic and simple principle... and no, I'm not accusing Epic of having done this at this point in time; instead I have been posting that this alleged "rumor" needs more research before people start casting stones. I'd also recommend reading the original Rage3D threads on the subject, since I'm still totally without any objective reason to believe this is the case given all the evidence available.

In theory, this very specific subject is a bad idea. The accusation is that an extra level of graphic detail can be enabled on NV30 hardware for the sole reason that it is an NV30- not because of any other reason- and that will only lead to very pissed-off customers. If there is some technological advance in the NV30 that makes the feature impossible on other platforms, then there is no argument against such a feature. This mirrors the situation with TruForm + ATI.

It's a very simple point.. so simple a child can understand it.
 
Too bad your entire argument is based on an untrue rumor. You keep falling back to this mythical scenario where an IHV (NVidia) paid for a feature that is supported on other platforms (ATI) but still disabled in the game (UT2003).

I'm talking about IHVs paying to ENABLE features; you are talking about bribes to DISABLE features of other cards without a legitimate reason (like a broken implementation or poor performance). In any case, you were talking about fiction, and I was talking about reality, so I could see how you could get confused. But you're right, they were in separate universes -- fiction vs. reality.


Whatever my "posting credentials", the fact of the matter is, I have at least owned nearly every NVidia, ATI, and 3dfx card, and I frequently defend ATI in these forums against unwarranted claims. I have yet to see you defend NVidia against any unfair attack. In fact, you are frequently the originator of them. I don't originate rumor-attack posts, and I don't try to build elaborate conspiracy theories on top of them.

When in doubt, doubt.
 
In any case, you were talking about fiction, and I was talking about reality, so I could see how you could get confused. But you're right, they were in separate universes -- fiction vs reality.

Actually, we were indeed talking of two different universes.. except they aren't fiction vs. reality, they are objective vs. non-objective.

There is no objective proof that Hellbinder's claims are fictional or non-fictional at this time. I clearly outlined what would be required to deem this topic "closed", and none of the criteria needed to make such a determination can be satisfied at this time.

So as there is no evidence of reality OR fiction, only the non-objective would be quick to discount any such possibility without proof or evidence... or to create all sorts of sweeping, incomparable examples to FUD up the issue.

It's obvious where the non-objectivity exists- it's due to an unequivocal and unreasonable faith/loyalty toward one company versus another, which is the very point indeed.

Oh, and when in doubt, it's usually good practice to organize all available data, work out what is needed to prove or disprove a given hypothesis, and then either keep or discard it... not faithfully assume the best through rose-colored glasses (to coin a phrase)...
 
Sharkfood said:
There is no objective proof that Hellbinder's claims are substantiated as fictional or non-fictional at this time.

Daniel Vogel just came here and denied it...that's right from the horse's mouth, wtf more do you want????!!! :rolleyes:

http://66.224.5.66/board/showthread.php?s=&postid=1331474697#post1331474697

And just to make it easy for you:

Vogel (at Rage3D) said:
Your sources are wrong

I can assure you that we did NOT lock out anything from customers that is on the CD.

We pick (conservative) default settings based on detected HW and the code currently in place will only pick the highest texture detail on 256 MByte cards by default though you can easily change this either via the menu or by modifying the ini file.

At highest texture detail some levels might use up to 180 MByte of data (textures + geometry) and if you have a lot of players this number might be even higher

Here's how the detail settings work:

FWIW, we didn't ship with any textures that require a detail setting above High to be shown at full detail. The texture detail is basically a bias against the LOD level defined in the texture package. So e.g. large textures might have a "NormalLOD" of 2, which means at normal texture LOD ("Normal") the engine will skip the first two mip levels. At "Lower" it will skip 3 and at "Higher" it will skip only 1. To the best of my knowledge the highest NormalLOD used in UT2k3 is 2, which means that by setting your texture detail to "High" (ini and menus use a different notation as texture detail ended up being too fine-grained, and I'm referring to ini options) you'll end up with the full quality textures. We also do some clamping so small textures won't lose mipmaps at low texture detail settings.

Below are the ini options and their bias level.

-4 UltraHigh
-3 VeryHigh
-2 High
-1 Higher
0 Normal
+1 Lower
+2 Low
+3 VeryLow
+4 UltraLow

As this is too fine-grained for the regular user, we mapped the words differently, so XYZ in the menus doesn't necessarily map to XYZ in the ini; this might have caused some confusion.

-- Daniel, Epic Games Inc.
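To make the bias arithmetic Daniel describes concrete, here is a small illustrative sketch (my own, not Epic's code; the clamping threshold `min_mips_kept` is a guessed parameter) of how a per-texture NormalLOD might combine with the global detail bias:

```python
# Illustrative sketch (not Epic's code): how a per-texture NormalLOD
# combines with the global texture-detail bias to pick mip levels.
# Bias values follow the ini table quoted above: UltraHigh = -4 ... UltraLow = +4.

DETAIL_BIAS = {
    "UltraHigh": -4, "VeryHigh": -3, "High": -2, "Higher": -1,
    "Normal": 0, "Lower": 1, "Low": 2, "VeryLow": 3, "UltraLow": 4,
}

def mips_to_skip(normal_lod: int, detail: str, mip_count: int,
                 min_mips_kept: int = 4) -> int:
    """Number of top mip levels the engine would drop.

    A texture with NormalLOD=2 skips 2 mips at "Normal", 3 at "Lower",
    and only 1 at "Higher". Clamping keeps small textures from losing
    all their mipmaps at low detail settings (min_mips_kept is a guess).
    """
    skip = normal_lod + DETAIL_BIAS[detail]
    skip = max(skip, 0)                                   # can't skip a negative number of mips
    skip = min(skip, max(mip_count - min_mips_kept, 0))   # clamp so small textures keep mipmaps
    return skip
```

With NormalLOD=2, the "High" setting (bias -2) skips zero mip levels, i.e. full-detail textures, which matches Daniel's point that nothing shipped on the CD needs a setting above High.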
 
Nagorak-

You may want to go back to page three of this very thread where I quoted that entire body from Daniel that you just quoted back to me.

It still doesn't reasonably prove or disprove the hypothesis, as that would require real-world testing in order to qualify (or disqualify) any given hypothesis.

Again, if it becomes a matter of "trust"- be it trust in Daniel Vogel or trust in Hellbinder's hypothesis based on a quote from another Epic employee... there is nothing objective about trust.

It's a non-issue as there is *nothing* to prove or disprove the accusation. There is also no way to test or verify the accusation at current so any belief either way is unfounded. Simple point.
 
The burden of proof is on the person making the claim. Rumors are not facts and don't need to be "disproven"; the accuser has to prove them. The accuser not only offered no proof, but we have credible testimony showing the opposite.

Yet you still choose to cling to this hypothesis. Let go of your NVidia hatred. It leads to the dark side.


(God forbid there is some bug in Epic's code that makes it not work on some cards but needs a patch to fix. Sharkfood will take this as proof positive that NVidia orchestrated a sinister deal with Epic to sabotage ATI)
 
Let go of your NVIDIA phanboi-ism (edit- wow fboy is a censored word! lol) and come into the light of objectivity.

Look at page one of this thread- the statement was:
According to Epic's Mark Rein, the mode will come with a patch delivered subsequently, as soon as NVidia's new NV30 appears.

Objectivity would also suggest that the knee-jerk counter from Daniel Vogel conflicts with the statement from Mark Rein.

Which is the more objective outlook on the situation? One that blindly favors the pro-NV30 point of view, that the first statement was in error, or one that blindly follows the anti-NV30 point of view, that the second statement was in error?

I simply have stated that neither quick judgement is correct. In order to determine which of the two statements is in error it would require:
1) This upcoming patch.
2) A 256MB NV30-based board.
3) A 256MB ATI-based board.

That would be the only way to discount one statement or the other as fictional.

You can leap on your personal favorite as blatantly as you wish, or even try to argue superfluous and unrelated blather all you wish. I'm still sticking to the fact that neither statement can be proven OR disproven at this time.
 
Something is unproven until it is proven. This is not fboyism, it is Logic 101. When you gather your crucial evidence, come back to us.
 
No, you have two conflicting statements...

1) A new texture detail mode that won't be available until after the NV30 is released.
2) That new texture mode is alive and well right now.

Logic 101 would suggest that when both are totally unproven, leaping on one over the other suggests bias... aka fboyism.

You just happened to leap on #2... others leapt on #1.

I leapt on trying to create a suite of tests to prove or disprove either 1 or 2. No such test can be created at this time, so no leaping to a favorite IHV-specific number for this soldier. :)
 
A point I would like to bring up here: n-patch or HOS support really didn't come about until DX8. (Please correct me if I am wrong here.) On that note, it seems that UT2003 is supporting this DX8 feature. If NVidia cards did support n-patches or HOS, would the implementation work in UT2003 on an NVidia card? TruForm is ATI marketing speak for n-patches, and n-patches are a part of DX8. Hrm, DemoCoder, the only way for your argument to be logical is if the UT2003 patch has some DX9 feature that is supported only on the NV30 and not the R300 core?
 
Shark & Demo..... Guys, you are beating each other up with arguments from 3 years ago! :rolleyes: The bottom line is that, if you take away these 2 words: ATI & nVidia, YOU BOTH AGREE! :eek:

If both of you would just drop the "raised hair on the back of the neck" banter, and look at what you are both saying......it's pretty much the same.

Fact 1: If anyone supports any advanced feature of any video card, then IT'S A GOOD THING!
Fact 2: If anyone disables or doesn't use any feature in a given manufacturer's product to make another manufacturer look better, then IT'S A BAD THING!
 
Because he's a paid stoolie. It's obvious. ;)

I'd use the eye roll, but that one is sooo condescending.
 
Sabastian said:
A point I would like to bring up here: n-patch or HOS support really didn't come about until DX8. (Please correct me if I am wrong here.) On that note, it seems that UT2003 is supporting this DX8 feature. If NVidia cards did support n-patches or HOS, would the implementation work in UT2003 on an NVidia card? TruForm is ATI marketing speak for n-patches, and n-patches are a part of DX8. Hrm, DemoCoder, the only way for your argument to be logical is if the UT2003 patch has some DX9 feature that is supported only on the NV30 and not the R300 core?

There are many DX features that are optional. Just because a feature exists in DX doesn't mean that it isn't vendor specific (e.g. PS1.4). N-patches are effectively an ATI feature at this juncture. No one else implements them in HW like ATI, and other implementations using the CPU are too slow to use. (The NV30 might change this.) If you wish to follow your line of reasoning, then I hope you won't complain about the VS2.0/PS2.0 "Extended" model in DX9 (not VS3.0/PS3.0). To the point, these add those NV30-specific shading instructions we all know and love to the DX9 assembly. That is, data-dependent branching (among other things) is now in 2.0 and can be queried for as a caps bit.
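The caps-bit pattern described above can be sketched roughly as follows (hypothetical flag names; a real D3D title would read these from the device caps rather than a set of strings). The point is that the game picks the best path the hardware reports, instead of checking a vendor ID:

```python
# Hedged sketch of caps-driven shader-path selection: enable whatever
# the reported capability bits allow, falling back gracefully.
# The flag names here are illustrative, not real D3D constants.

def pick_shader_path(caps: set) -> str:
    """Return the best shader path the reported caps allow."""
    if "PS_2_0_EXTENDED" in caps:   # e.g. data-dependent branching
        return "ps2.0-extended"
    if "PS_2_0" in caps:
        return "ps2.0"
    if "PS_1_4" in caps:
        return "ps1.4"
    return "fixed-function"
```

Under this scheme, any card that truthfully advertises the "Extended" caps gets the fancy shaders, whoever paid for them to be written.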

So if NVidia pays Epic to write fancy shaders that use "Extended" 2.0 DX9 shaders, you won't complain about it, right, since, after all, it's in DX9!


Basically, I don't see anything wrong with an ISV targeting a specific card feature, be it exposed via OpenGL extension, or via DX9. That's why I don't object to (in reality) ATI-only TruForm patches to games, or even PS1.4 specific shaders. I think this is a good thing.

Sharkfood is talking about someone paying specifically to make sure a feature DOESN'T WORK on a particular card.

Well, depending on the situation, I think this may be considered a bad thing, or a good thing. I don't think it is evil a priori.


Consider this scenario: NVidia pays Epic $1 million to create a new UT2003 "NV30 Special Edition" which will feature dazzling new shaders and reengineered geometry and textures. A team of artists and programmers works for 3 months on the extras. NVidia wants to bundle this "Special Edition" with OEMed cards as an incentive for people to buy NV30s. Let us suppose that this new version would actually run on ATI cards as well.


Now NVidia paid $1 million for new content to be developed, but ATI benefits as well as a "free rider" by showing off the R300 using this souped-up UT2003. Wouldn't NVidia be justified in stopping ATI from free riding on NVidia-funded development efforts? All that new art might never have been created had it not been funded by NVidia in this scenario. I think Epic/NVidia would be justified in charging ATI $$$ to enable this to work with their cards.


If someone funds the development of IP, the person who owns it has a right to control it. In this case, if NVidia hired Epic to make a special version of UT2003 to demo their hardware, NVidia should retain the rights to decide who is allowed to run it.


Are we to say that NVidia has no right to develop games or contract developers to make "exclusive" games? Should Microsoft, Sony, and Nintendo be forced to go cross-platform with their exclusives as well? Absurd. NVidia is legally and morally within their rights to fund games and lock them to their hardware via any mechanism they want.

There is a big difference between direct sabotage and protecting your investment. If NVidia paid Epic to insert bugs or "performance-killing wait loops" into UT2003 to make the benchmarks run poorly on the R300, I'd say that's way bad. That's development that benefits no one and harms someone else. But if NVidia contracts Epic to develop a special version of UT2003 that runs only on the NV30 (even if IN PRINCIPLE it could run well on the R300), I still think they are perfectly within their rights to dictate who can run it.

Remember the ASUS "wallhack" cheating Detonators? In that case, the drivers could run on any NVidia card, but ASUS put copy protection into them so that (without a hack) they would only work with ASUS's OEMed NVidia card. ASUS was perfectly within their rights to "lock" the software to the hardware shipped, since they funded development of these new drivers as their main selling point! No one else has the right to "steal" them.


So even if Sharkfood's theory is right, that a special mode exists for 256MB video memory cards but in reality will only work on NVidia's cards, I say, so what? If NVidia actually paid money to have Epic create new hi-res textures for this, why should ATI get them for free?
 
So even if Sharkfood's theory is right, that a special mode exists for 256MB video memory cards but in reality will only work on NVidia's cards, I say, so what? If NVidia actually paid money to have Epic create new hi-res textures for this, why should ATI get them for free?

Well, for starters, that is NOT my theory (for the third time...). I don't HAVE a theory, which was the whole point... I just see two statements from Epic in contradiction and simply cannot objectively choose which one is correct, as it's impossible to determine at this juncture... BUT..

Hypothetically speaking- if UT2003 had a special, financed NV30 mode... one that NVIDIA actually paid money to have Epic create for proprietary use on their hardware, I wouldn't have *any* beef with this, provided it was *clearly* advertised and labeled as such.

Going as far as calling it "NVIDIA's UT2003" versus "Epic's UT2003" would be sufficient, if not overtly identifying. My personal opinion is- people can do whatever they want, provided it's done in such a way that is clearly defined as not to leave anything to be misconstrued or build false expectations.

I'm still a bit baffled by the comparison of HOS vs. "Extended Mode" with VS/PS2.0. N-patches are only effectively ATI-specific because NVIDIA intentionally removed the support due to poor performance on current-generation video cards (as stated directly by NVIDIA on the DX mailing list). Something that is removed intentionally due to poor performance isn't a very viable explanation of why something becomes vendor specific. If this were the case, then IHVs would just disable anything their product does poorly at compared to the competition and suddenly cry "Proprietary!" :)

As far as "Extended Mode" in VS/PS2.0 goes, there would be no qualms if it was chosen for the correct purpose. Obviously, a barebones, simplistic shader implementation that could easily be expressed even with DX8.0 PS1.1 would have no excuse aside from agenda, but something that obviously gained value/benefit from the additions in 2.0 would be totally welcome.
 
OK, DemoCoder, I am not going to argue with you, as I pretty much agree. But my point rests on the idea that since DX8 supports n-patches, NVidia could have supported them if they had chosen to do so. If this happens to be a DX9 feature, it could be that ATI's R300 supports whatever the feature is; NVidia will just get all the glory for it, given the timing of the release. Supposedly DX9 will also launch around the time of the launch of the NV30. [speculation] Hrm, what are the chances that the HLSL is Cg-based? [/speculation]
 
DemoCoder said:
I think Epic/Nvidia would be justified in charging ATI $$$ to enable this to work with their cards.

Hrm, I don't agree with this statement at all. (Seems I missed it when I skimmed your post earlier and should have been more critical; a second cup of java helps considerably, as I don't agree with your argument as much as I initially thought.) ATI should not be obliged to give ANY money to their main competitor, not a dime. If ATI is in fact blocked out of the affair as a result of some sort of proprietary software that may not be available to ATI end users, there will be outrage, with Epic/NVidia suffering the brunt of the critique. Now, I would understand it if NVidia created the game themselves, but this third party entering the picture leaves a dirty taste in my mouth. IMHO, if it comes to that, ATI should not pay NVidia a dime, and should consider legal action if at all possible.

In the end, as a result of this hypothetical proprietary software, vast portions of gamers would be locked out of enjoying the supposed higher quality of gaming, while NVidia gains the perception that its hardware is somehow superior when in fact it isn't. What about other IHVs; should they give money to NVidia for supporting this as well? I think the suggestion is preposterous. If NVidia pays to make the debut of the NV30 shine, then great, but the DX9 effects should be available on all DX9 cards.

The more I think about the proposition, the more it seems terribly wrong. ATI should not be obliged to give a dime to NVidia, or for that matter to the developer. The developer should be betting that if it develops a good game it will sell well, not hiring themselves out as thugs. But it seems Epic (and its partners) have chosen a side here. I don't like it personally. ATI does not intend to charge for n-patches, nor could it, so the comparison there is rather poor IMO. ATI should get the patch for free, as ATI supports the technology to make it work straight out. BTW, it costs hundreds of millions to develop this sort of hardware; I think that is good reason enough.
 