UT2003 visuals only on NVIDIA?

As far as I know, this myth originated in a German magazine which claimed in its UT2003 review that the game includes special effects for the NV30, but Daniel Vogel has stated numerous times that this is wrong. Unfortunately the magazine never corrected itself, and the myth still seems to be around...
 
Uttar said:
Not 100% true. The GFFX does look better in UT2003 than in many other apps - so I guess they did develop with nVidia hardware in mind more than ATI's hardware...

Of course, some people will think the other games are evyl and made with ATI hardware in mind instead. :rolleyes: But I've grown to ignore that type of person :)


Uttar

Well, there's some truth here on both sides. A few days after the game was released, Mark Reign (sp?) made a semi-lengthy post on the Infogrames site (tech-support section, IIRC) which I found strange for several reasons.

He stated bluntly that Epic had never used--even once--an R300 in the development of the game even though, he said, they'd had a prototype 9700P sent over from ATi "sitting unpacked in a box" for some time. He then became extremely apologetic for some reason, because some people were reporting "problems" running the game with the 9700P--problems no one at Epic had yet verified at the time Mark made this post. Mark vowed to quickly test and correct any problems they found.

I responded that I'd had zero problems with the game and was very surprised to learn Epic had not used an R300 in the game's development. In other words, had Mark not made this admission I simply would never have guessed it. Game ran great for me ROOB with my 9700P. That's the truth--I saw nothing which would have remotely led me to the conclusion Mark admitted. His apologies were very premature, at least as far as I could see, and I told him so...;) I suppose that after he got a production R300 9700P up and running with the game he must have agreed as he had very little more to say on the subject.

I think it speaks well of ATi's DX driver efforts, as well as Epic's DX code base, when a new product like the 9700P, never even used in the development of a game like UT2K3, runs it near flawlessly ROOB (I don't recall any significant problems with it). In the days that followed I stayed attuned to the forum and thought it extremely interesting that by far the lion's share of the problems reported there concerned nVidia 3D products...! Granted, the general technical skill of most of the people who frequented that forum, as far as I could tell from the comments, left a great deal to be desired, and I suspect that many of the problems (regardless of hardware) were matters of basic system configuration. I thought the whole episode was very telling and very interesting.

I agree with those who fail to find anything of substance behind the "The way it's meant to be played" promotion which nVidia obviously paid Atari to put there (I substituted an ATi logo a few days after buying the game). As long as nVidia continues to market "special feature" support without documenting what the features are, a highly skeptical attitude is justified. I've seen nothing in the game to justify such a claim.
 
RussSchultz said:
I'd like every developer house to let me know what they've tested it on and what they've developed using so I can avoid the morons who only test one product or don't cross develop.

Since they don't do that, if they plaster endorsements on their product (which to me suggests "we're in bed" with said endorser, develop using said endorser's product, and likely only test with said endorser) and my video card is not endorsed, I'm not going to risk my $50 on something that might run like ass.

I can assure you (I shipped a "Way it's meant to be played" logo game), the logo thing has nothing to do with the developer. Nothing, nada; it's entirely a publisher thing. The publisher tells you that Logo X stays up for n seconds. The publisher gives you devrel contact details (which you generally have anyway) and some free boards come your way.

We got roughly equal treatment off ATI and NVIDIA (even little old Matrox sent us some cards). All were helpful (as they always are) and all cards were treated equally.

No game from a major publisher will ship untested on the major video cards. When we go gold, it has to work on every card in the test lab (which will generally be every card you can a) buy or b) get devrel to send you).
 
DeanoC: You worked on Silent Hill 2 for the PC, right? Guess that means you work for Creature Labs. SH2 is quite a departure for the company, isn't it?
 
cellarboy said:
DeanoC: You worked on Silent Hill 2 for the PC, right? Guess that means you work for Creature Labs. SH2 is quite a departure for the company, isn't it?

For the ex-company, you mean. Creature Labs went bankrupt a few weeks ago. More than 50 jobs, IIRC, were lost. There are still some negotiations over the sale of the assets, AFAIK.

RIP, CL! Hmm, were you still working there, DeanoC? Lost your job?

Whoa, I never realized SH2 was a Creature Labs port for the PC, anyway. Never bought it, and was never too interested in it either.

Although I was a huge fan of the Creatures series a few years ago - did a lot of agents, was one of the webmasters of one of the big sites, frequented the chats & forums a lot - you get the idea.


Uttar
 
In the case of UT2003, there are visual effects that can only be experienced on NVIDIA hardware.

This is incorrect. FWIW, if you want to be nitpicky, ATI could claim the same, as UT2003 supports TruForm.

-- Daniel, Epic Games Inc.
 
I'm fairly certain either you misunderstood something or Mark posted something with a slightly bad choice of words :)

We had R300 boards several months before going gold, though it's true that we didn't use them as main development machines as the drivers weren't stable enough at that point. ATI took care of the problem before UT2003 went gold though.

Most people didn't want to switch cards / change configurations at that point in development, and most artists used device spanning for multi-mon back then, which AFAIK ATI still doesn't support :(

I've been doing compatibility work for UT2003 and I've certainly run the game on R300 and reported bugs to ATI. I can't remember whether our in-house testers used an R300, though ATI had access to the game for QA and Infogrames' testers tested on R300 cards as well.

FWIW, none of the R300 stability problems were bugs in the game.

-- Daniel, Epic Games Inc.

WaltC said:
Well, there's some truth here on both sides. A few days after the game was released, Mark Reign (sp?) made a semi-lengthy post on the Infogrames site (tech-support section, IIRC) which I found strange for several reasons.

He stated bluntly that Epic had never used--even once--an R300 in the development of the game even though, he said, they'd had a prototype 9700P sent over from ATi "sitting unpacked in a box" for some time. He then became extremely apologetic for some reason, because some people were reporting "problems" running the game with the 9700P--problems no one at Epic had yet verified at the time Mark made this post. Mark vowed to quickly test and correct any problems they found.

I responded that I'd had zero problems with the game and was very surprised to learn Epic had not used an R300 in the game's development. In other words, had Mark not made this admission I simply would never have guessed it. Game ran great for me ROOB with my 9700P. That's the truth--I saw nothing which would have remotely led me to the conclusion Mark admitted. His apologies were very premature, at least as far as I could see, and I told him so...;) I suppose that after he got a production R300 9700P up and running with the game he must have agreed as he had very little more to say on the subject.

I think it speaks well of ATi's DX driver efforts, as well as Epic's DX code base, when a new product like the 9700P, never even used in the development of a game like UT2K3, runs it near flawlessly ROOB (I don't recall any significant problems with it). In the days that followed I stayed attuned to the forum and thought it extremely interesting that by far the lion's share of the problems reported there concerned nVidia 3D products...! Granted, the general technical skill of most of the people who frequented that forum, as far as I could tell from the comments, left a great deal to be desired, and I suspect that many of the problems (regardless of hardware) were matters of basic system configuration. I thought the whole episode was very telling and very interesting.

I agree with those who fail to find anything of substance behind the "The way it's meant to be played" promotion which nVidia obviously paid Atari to put there (I substituted an ATi logo a few days after buying the game). As long as nVidia continues to market "special feature" support without documenting what the features are, a highly skeptical attitude is justified. I've seen nothing in the game to justify such a claim.
 
Dan just to clear this up once and for all without any kind of ambiguity (can you imagine now a 10 page thread arguing about what you wrote just now :rolleyes: ) can you tell us when you first had access to the GFFX (i.e. after it went Gold or before)? I mean actual hardware and not some fancy paper telling you what its specs were ;)
 
NV30 was not included in our test matrix before going gold with UT2003, though NVIDIA did show a tech demo of our next-gen stuff during the NV30 launch event at Comdex.

BTW, I love "ambiguity"

-- Daniel, Epic Games Inc.

Tahir said:
Dan just to clear this up once and for all without any kind of ambiguity (can you imagine now a 10 page thread arguing about what you wrote just now :rolleyes: ) can you tell us when you first had access to the GFFX (i.e. after it went Gold or before)? I mean actual hardware and not some fancy paper telling you what its specs were ;)
 
vogel said:
Most people didn't want to switch cards/ change configurations at that point in development and most artists used device spanning for multi- mon back then which AFAIK ATI still doesn't support :(

I would like to think that you would know about the specs of the hardware you are developing for.
 
Mr Dan Vogel, (is that your name or am I tripping again?)
I'm pretty sure Ati does have multi-mon, as I am running 2 monitors at the same time right now. Are you referring to Ati not having a specific feature in its MM capabilities?
 
Yikes!

1) He said he personally tested for the R300.

2) His mention of multi-monitor usage, wrong or right, was proposed as a second hand impression.

3) UT 2k3 was at the end of its development when the 9700 was launched... please keep in mind the time period to which he is referring. Maybe you'd want to ask about 8500s at least as a prelude to jumping all over him?


So, what exactly did he say that warranted hostility?
 
As far as the "The way it's meant to be played" logo goes, that's 100% marketing. nVidia most likely paid them to put it up there, and if not, nVidia gave them something to put it there.

As for the R300 being released at the end of UT2K3 development, that doesn't really matter. Game developers will get prototype hardware to test on long before it goes retail.
 
And they probably received some early boards, which were likely unstable precisely because they were early. That's probably why they didn't instantly switch everything over to the new card for production or development, but introduced it in the way Dan mentioned instead.

The game didn't have a bunch of issues with the R300 at launch, and it was tested on the R300. They didn't switch over to the R300, which was brand new as they were finalizing the game, but they did test it.

OK. So, what's the problem here? Oh, the Big Honking logo? Yeah, I hate the logo something fierce too. But I thought we were talking about bugs and the R300's role in the game's development? Exclusive features? We even received an answer indicating non-nVidia cards aren't disadvantaged. Well, I'm glad that there is at least a distinction between these issues.
 
K.I.L.E.R said:
Mr Dan Vogel, (is that your name or am I tripping again?)
I'm pretty sure Ati does have multi-mon, as I am running 2 monitors at the same time right now. Are you referring to Ati not having a specific feature in its MM capabilities?

He said device spanning.
That means having one 3D device to cover multiple monitors.
Under D3D only Matrox cards support this.
(I'm not sure about OGL.)
 
I see no reason to attack Mr. Vogel. The fact that the R300 wasn't used for UT2K3 dev is a no-brainer... just when was it developed? BEFORE there was an R300! ;) While it was available at the very end of development, you can't condemn them for not using it... only IF it wasn't checked for compatibility.

And, as far as multimonitor support goes, well, ATI definitely has some problems there. Check out any of the forums that deal in video editing - I recommend VideoVegas - and you will see nothing good said (and a lot of bad!) about ATI's multimonitor abilities. Sorry, but I'm just telling it as it is.

The thing here is this is all a bit OT... The question was whether there was specific content for the FX that made it a better choice than an R300 for UT2K3... and that has been answered. The only question now remaining - and this goes to a few other threads (Half-Life 2?) - is just why does nVidia PR feel it has to imply what amounts to lying where their FX series of cards is concerned? :rolleyes:
 
martrox said:
(Half-Life 2?) ... Just why does nVidia PR feel it has to imply what amounts to lying where their FX series of cards is concerned? :rolleyes:

This is what a good aggressive PR group does. Confuse everyone and sell a lot of HW. Nvidia has been doing this for a long time and gets away with it. ATI does an 'optimization' for Quake and gets roasted.
 
Hyp-X said:
K.I.L.E.R said:
Mr Dan Vogel, (is that your name or am I tripping again?)
I'm pretty sure Ati does have multi-mon, as I am running 2 monitors at the same time right now. Are you referring to Ati not having a specific feature in its MM capabilities?

He said device spanning.
That means having one 3D device to cover multiple monitors.
Under D3D only Matrox cards support this.
(I'm not sure about OGL.)
IIRC (but I could be confusing this with something else), both Nvidia and ATI can do that. However, the OGL context cannot exceed 2560x2560 in the case of R300/350/RV350 (and 2048x2048 on R200/RV250), so you can't cover two high-res monitors completely.
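The arithmetic behind that limit is easy to sketch. Here's a minimal Python illustration (the helper names and the side-by-side horizontal-span assumption are mine; the context-size limits are the ones quoted in the post above) of why a spanned desktop can blow past the maximum context size:

```python
# Max GL context sizes as quoted above (hypothetical table for illustration).
MAX_CONTEXT = {
    "R300": (2560, 2560),  # R300/R350/RV350
    "R200": (2048, 2048),  # R200/RV250
}

def spanned_desktop(monitors):
    """Side-by-side span: total width is the sum, height is the tallest monitor."""
    return sum(w for w, _ in monitors), max(h for _, h in monitors)

def fits(gpu, monitors):
    """Can one context cover the whole spanned desktop on this GPU?"""
    max_w, max_h = MAX_CONTEXT[gpu]
    w, h = spanned_desktop(monitors)
    return w <= max_w and h <= max_h

# Two 1280x1024 monitors span to 2560x1024:
print(fits("R300", [(1280, 1024), (1280, 1024)]))  # True  (exactly at the 2560 cap)
print(fits("R200", [(1280, 1024), (1280, 1024)]))  # False (2560 > 2048)
```

So two modest monitors just squeak under the R300 cap, while anything higher-res (say, two 1600x1200 panels, spanning to 3200x1200) exceeds it, which is the "can't cover two high-res monitors completely" point.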
 