GeForce FX = GeForce 4600 Ultra?

Tagrineth said:
I wonder if anything became of Carmack considering a cube-map-emulated/cube-map-free codepath for the Kyro II, due to its 'surprising performance'. Guess the performance just wasn't quite good enough... a shame, really.

I think Carmack "passed the ball" to PVR's OpenGL driver team, in a way, in the interview Reverend had with him.

In retrospect I'd rather have PVR finally release a new card than that ;)
 
Doomtrooper said:
Not breathtaking, but acceptable. Looking through the config file is interesting since this was an E3 demo leak, yet there's no reference to ATI at all.
seta r_useParhelia "0"
seta r_useNV20 "0"
seta r_useNV30 "0"

:LOL:

If ATI was the default, no settings would need to be changed for ATI cards.
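For what it's worth, that's exactly how cvar-driven configs behave: the file only records seta lines for values that differ from the built-in defaults, so an ATI-default build would write no ATI entries at all. A minimal sketch of the idea (hypothetical code, obviously not id's actual implementation):

#include <iostream>
#include <map>
#include <string>

// Hypothetical sketch, not id's actual code: each cvar has a built-in
// default, and the config file only needs "seta" lines for values that
// differ from that default -- so a default codepath leaves no trace.
std::map<std::string, std::string> cvars = {
    {"r_useParhelia", "0"},
    {"r_useNV20",     "0"},
    {"r_useNV30",     "0"},
};

std::string pickBackend() {
    if (cvars["r_useNV30"] == "1")     return "NV30";
    if (cvars["r_useNV20"] == "1")     return "NV20";
    if (cvars["r_useParhelia"] == "1") return "Parhelia";
    return "ARB";  // generic fallback, what ATI cards would hit here
}

int main() {
    std::cout << "renderer backend: " << pickBackend() << "\n";
    return 0;
}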
 
Doom3 still looks totally underwhelming to me. It's just going to be another game where you have to turn the monitor brightness up halfway just to be able to see anything. Then again from what I can see in those dim screen shots, nothing really strikes me as worth seeing to begin with. o_O
 
Nagorak said:
Doom3 still looks totally underwhelming to me. It's just going to be another game where you have to turn the monitor brightness up halfway just to be able to see anything. Then again from what I can see in those dim screen shots, nothing really strikes me as worth seeing to begin with. o_O
Gah, you are the kind of person game devs hate.
You HAVE to see everything, so you crank the gamma/monitor brightness.
It is SUPPOSED to be dark.

Note: upon reading this, I realize it seems very negative. I don't mean it to be insulting or rude; I just don't get people who crank the gamma until they can see everything when it's clear the dev didn't want you to be able to see everything. It would be like playing Unreal (not UT, Unreal) with the "nolighting" option on. Sure, it would suck.
 
I don't think ATI was used as a reference for anything. Looking at Carmack's comments, he stated they tried all the cards available and ATI's card (the 9700) was the fastest. The speed most certainly was not from optimizations, but from pure rendering power.

If there were optimizations, you would think the config file would show them. Obviously, looking at this, NV20, NV30, and Parhelia are getting something, at least in this build...

I thought it was quite funny, since this is about 8 months old and the NV30 reference is in there already. E3 was May 22, 2002, and we're still 2-3 months away from official NV30 cards.
 
If memory serves, the E3 demo was being shown on 'next-gen ATI hardware,' which means a lower-clocked R300 board. While Carmack was working on making the game run well on all boards, it makes sense that the build he brought to show on ATI hardware was optimized for it.
 
Well, using your logic, how would Carmack code for a card he has really little experience with? In fact, ATI engineers stayed up all night prior to E3 to get the 9700 ready. You have to provide the product up front to code optimizations. At best this build may have some minor 8500 optimizations in it...

The new ATI card was clearly superior. I don't want to ding NVidia for anything, because NVidia has done everything they possibly could; but in every test we ran, ATI was faster. (John Carmack)

I wonder why... 256-bit bus, etc. Hardware improvements, IMO.
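Rough math on the bus point, using the commonly listed clocks (my arithmetic; treat the numbers as approximate):

#include <iostream>

// Back-of-the-envelope memory bandwidth, assuming the commonly listed
// specs: Radeon 9700 Pro = 256-bit bus @ 310 MHz DDR memory,
// GeForce4 Ti 4600 = 128-bit bus @ 325 MHz DDR memory.
int main() {
    double r300 = (256 / 8.0) * 310e6 * 2 / 1e9;  // bytes/clock * clock * DDR
    double nv25 = (128 / 8.0) * 325e6 * 2 / 1e9;
    std::cout << "R300: " << r300 << " GB/s\n";   // ~19.8 GB/s
    std::cout << "NV25: " << nv25 << " GB/s\n";   // ~10.4 GB/s
    return 0;
}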

Quotes from QuakeCon, from Tim Willits and Fred Nilsson of id...

The presentation in the DOOM III theater was quite the scene at QuakeCon. After being thrown to the front of the line before all of the fans (which I feel both bad and good about), I was treated to an extended version of what was shown at E3. Only this time it was shown at higher detail and resolution. "We ran the DOOM III presentation at E3 in 640x480 in medium quality," explained Tim Willits. "Without changing the executable, we're running it here at 800x600 in high quality, and that's due to the great work that ATI has done on the 9700 drivers." The presentation was also running the game on a 2.2GHz Pentium 4 with that lovely Radeon 9700, so it was no wonder that it was moving smoothly. Both Tim Willits during our little talk and Carmack in his keynote later were quick to point out that the game could run on anything as low as a GeForce1, however. "The game running at full features is with a GeForce3 video card or higher. It'll run on anything down to a GeForce1 because of the hardware acceleration, but we feel that some of the graphical features would have to be turned down. But with the new products from ATI and nVidia coming out before the release of the game, we're sure that we'll have great penetration for the game full feature."
 
Doomtrooper said:
Not breathtaking, but acceptable. Looking through the config file is interesting since this was an E3 demo leak, yet there's no reference to ATI at all.


seta r_useParhelia "0"
seta r_useNV20 "0"
seta r_useNV30 "0"

:LOL:

JC is probably just using the ARB codepath for the shader implementation on ATI's cards in that build.
 
Doomtrooper said:
The new ATI card was clearly superior. I don't want to ding NVidia for anything, because NVidia has done everything they possibly could; but in every test we ran, ATI was faster. (John Carmack)

I wonder why... 256-bit bus, etc. Hardware improvements, IMO.

There are also processing improvements to be had. As I've been saying, a DX9 card won't have to do as much calculating as a DX8 card for the same effects. But this may not have been the case at the time of JC's writing, as ATI has been very slow in getting the R300's advanced features exposed.
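To illustrate what I mean (contrived numbers, not anything from Doom 3 itself): a DX8-class fragment pipeline can't evaluate pow() directly, so a specular exponent gets built from repeated squarings across combiner stages or passes, while a DX9-class pipeline evaluates the whole term at once.

#include <cmath>
#include <iostream>

// Contrived illustration, not Doom 3 code: approximating specular^16.
float specularDX8(float nDotH) {
    // four squarings = exponent 16, spread over combiner stages/passes
    float s = nDotH;
    for (int i = 0; i < 4; ++i)
        s *= s;
    return s;
}

float specularDX9(float nDotH) {
    return std::pow(nDotH, 16.0f);  // one instruction on DX9-class hardware
}

int main() {
    // same result either way, but far fewer steps on the DX9 side
    std::cout << specularDX8(0.9f) << " vs " << specularDX9(0.9f) << "\n";
    return 0;
}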
 
The Doom 3 alpha leak uses the R200 code path (Radeon 8500 extensions) on ATI boards. When you start the game, pull down the console and scroll up, and you'll see the OpenGL init messages. There's no point guessing which code path the demo uses when you can clearly see it in the demo.
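For anyone curious how an engine can tell which path to use without user cvars: it can test the driver's extension string (the one glGetString(GL_EXTENSIONS) returns). A minimal sketch of that idea; the selection logic below is hypothetical, not the alpha's actual code:

#include <cstring>
#include <iostream>

// Hypothetical path selection from a GL extension string; the leaked
// build's real rules may differ, but the mechanism is the same.
const char* pickFragmentPath(const char* extensions) {
    if (std::strstr(extensions, "GL_NV_register_combiners"))
        return "NV20";
    if (std::strstr(extensions, "GL_ATI_fragment_shader"))
        return "R200";
    return "ARB";  // lowest common denominator
}

int main() {
    // A Radeon 8500/9700-era driver advertises ATI_fragment_shader:
    const char* atiExts = "GL_ARB_multitexture GL_ATI_fragment_shader";
    std::cout << pickFragmentPath(atiExts) << "\n";  // prints: R200
    return 0;
}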
 