Interview with Epic's Tim Sweeney

What you write sounds like a good reason, but when R300 was released, TWIMTBP devs didn't choose the R300 route; they chose to slow down game development instead, and that's why we saw no real DX9 games for years.

They could do something by starting the same kind of money-hungry program, but oops, I forgot, AMD has no money left. So the only hope left is their own driver team (and R650, if it's not just a shrink), but driver optimisation needs time, so it's a lose-lose situation.
That's not true; most developers did choose R300, and quickly. That's why ATI had an advantage in new games when new parts from NV came out.
 
Haha so it's AMD's heroic stable of driver developers vs Nvidia's evil stash of money? :LOL:

No.
Sometimes, after a game is released and the AMD driver team has had a chance to optimise, they do a great job without any tricks, but in reality that's the game developers' job, which they "forget" to do.
AMD needs to learn how capitalism works: without their own orange tree, they can't sell oranges ;)
 
That's not true; most developers did choose R300, and quickly. That's why ATI had an advantage in new games when new parts from NV came out.

R300 was already history by the time DX9 games started shining, and ATI got no advantage later on from R300's DX9 capability.
 
The way I read it is that Sweeney thinks the 8800 GTX has been the top-performing part over the last year. Does anyone really disagree with that assessment? :smile:

Jock McSweeney said:
for the past year, NVidia hardware has been ahead fairly consistently

I mean come on... It's not rocket science, backhanders or anything. It's a simple fact. He even says 'fairly', which, if you read between the lines, might imply better performance on X1 vs G7 if you squint hard enough.
 
Oddly enough, if you remember back in the day, NVidia chips benefited in Quake3/Doom3 due to their double-Z performance and (IIRC) this design choice was in part because they had discussed with Carmack how he intended to design his future engines. Obviously, at the time Carmack's engines were the be-all and end-all in computer gaming graphics.

In the new generation of engines, UE3 stands to be extremely widely used by many licensees so I wonder if NVidia discussed future developments with Sweeney and this is one of the reasons why the G80 has high fillrate. After all, the following sounds like a perfect match to G80's capabilities:

Most effects of UT3 are more bound to the fillrate than to basic features like geometry processing.

Is it just coincidence that NVidia's designs also seem to dovetail well with the engines of the day or just good planning? On the other hand, this could again indicate that ATI/AMD haven't been planning so well in recent times.
 
I'm pretty sure that ID, Epic and nVidia have helped steer each other's future products. Why wouldn't they?
 
I'm pretty sure that ID, Epic and nVidia have helped steer each other's future products. Why wouldn't they?

Yes, but the question is, why haven't ATI/AMD done the same?

I certainly don't think that anything underhanded has occurred, but I do wonder what factors influenced ATI to think that such huge increases in ALU power were important and fillrate perhaps less so. Did other developers indicate that this was what their engines were working towards, or did ATI's designers simply misread the general direction of developers? Perhaps ATI were thinking about the HPC market too much? :?:
 
Yes, but the question is, why haven't ATI/AMD done the same?

I certainly don't think that anything underhanded has occurred, but I do wonder what factors influenced ATI to think that such huge increases in ALU power were important and fillrate perhaps less so. Did other developers indicate that this was what their engines were working towards, or did ATI's designers simply misread the general direction of developers? Perhaps ATI were thinking about the HPC market too much? :?:
And how many UTKx-engine games are there? There's always buzz about Epic, then the game comes out, then it becomes niche.... MAYBE this next gen will be the cat's meow... kinda doubt it.
 
And how many UTKx-engine games are there? There's always buzz about Epic, then the game comes out, then it becomes niche.... MAYBE this next gen will be the cat's meow... kinda doubt it.

If I recall correctly, each iteration of Unreal Engine tech has been followed or preceded by at least 4-6 titles that make it to retail. Granted, some of them are pretty poor titles, but some are pretty good (Vampire, for example, I had a lot of fun with). Unreal Engine 3.0 (and the ones leading up to it) has also been seeing wider adoption.

I'd imagine Nvidia just has a policy of identifying who they think is going to release the next buzzword engine and then tailoring their hardware to maximize potential on that engine while also giving generally good performance.

Up until R600, ATI had been doing a fairly good job of maintaining good overall performance in a large variety of games without relying on any one particular engine, although in the past OpenGL performance had been their Achilles' heel.

Other than the few months the 7800 GTX was out before the X1800 XT and its driver improvements arrived, ATI was in general ahead in performance until G80 launched, at which point it's been a bit of a slaughter when it comes to performance.

Regards,
SB
 

Originally Posted by Epic
Unreal Tournament 3 will ship with full DX10 support, with multi-sampling being the biggest visible benefit of the new graphics interface. Additionally, with DX10 under Vista we have the possibility to use the video memory more efficiently, to be able to display textures with a higher grade of detail than would be possible with the DX9 path of Vista. Most effects of UT3 are more bound to the fillrate than to basic features like geometry processing. That's why DX10 has a great impact on performance, while we mostly forgo the integration of new features.

I'm wondering, since he specifically mentions DX10 under Vista with regard to video memory and textures, if that means they might be using Vertex Texture Fetch? Or some such?

He also specifically mentions in the sentence directly after the fillrate comment that DX10 has a great impact on performance. If the game is fillrate limited and DX10 provides for a greater increase in performance over DX9, it sort of implies that DX10 is doing something more efficiently than DX9 with regards to fillrate.

In other words, what makes DX10 more efficient with regard to fillrate than DX9? And would this help R600 in any way?

Regards,
SB
 
Why does he say 'of Vista'?

Could it be that the XP version allows the higher texture resolutions in DX9?
 
I would have hoped that Beyond3D at least would be above the conspiracy-theory nonsense that infects other websites about how evil Nvidia pays developers to make games suck on non-Nvidia cards with TWIMTBP. Unfortunately, that seems to not be the case even though it's a classic urban legend and there has never been the tiniest shred of evidence that Nvidia has ever paid developers to only optimize for their cards.

And how many UTKx-engine games are there? There's always buzz about Epic, then the game comes out, then it becomes niche.... MAYBE this next gen will be the cat's meow... kinda doubt it.

A list of games which have used the various Unreal Engines can be found here:

http://en.wikipedia.org/wiki/Unreal_engine

As you can see, UE is one of the most heavily licensed and used game engines out there today. UE3 in particular has grown in scope to the point where it is now more of a middleware engine like RenderWare. Now that Criterion is part of EA and is no longer offering RenderWare for licensing outside of existing contracts, I suspect UE3 will be a very popular replacement in the market for middleware engines, because it can be customized with all the major physics, AI, and sound APIs.
 