Matrox & Selective Reviewing?

OpenGL guy said:
Well, at least the OpenGL drivers were good... :)
Hint: I used to work on the Performance ICD for the Savage 4 and some other chips.

I was actually waiting for you to comment on this, since I remember you saying you used to work on the Savage 4 drivers (I've been on the Beyond3D boards for some time now and I have a very good but unfortunately also very selective memory :)) :)

It ran Quake 2 pretty well, that much I remember. So maybe the OpenGL drivers weren't so bad. I can't really remember which games I had problems with and which I didn't, though.
 
OpenGL guy said:
Bjorn said:
As another previous Savage 4 owner, I would say that "crap" is way too nice a word to use when describing the Savage 4 drivers 8)

Well, at least the OpenGL drivers were good... :)

Hint: I used to work on the Performance ICD for the Savage 4 and some other chips.


Ahhh... but not as good as the Savage2000 Metal driver for UT that I wrote ;)

- Andy.
 
Sidenote: DroneZ looks absolutely stunning, yet the gameplay leaves a lot to be desired.

Has DroneZ been publicly available for sale, or is it simply a game NV paid Zeta games to develop to bundle with its cards?
 
Hrm, interesting. Do either of you know Tony? He used to work for Diamond before they became part of S3. I don't know if he went to S3, as he's been working for Micron for a while now...
 
jb said:
DroneZ is OpenGL-only and uses NV-specific OpenGL calls to add in the TnL processing. Thus GF3/4 cards get better results. The same functionality would run on ATI cards if the developers provided for the use of the equivalent ATI calls. Vulpine is the same way (custom NV OpenGL calls).

I don't have a problem with this, as back when these apps were being developed GF3s were the only thing they had to target. Now, with P10, Matrox and ATI parts offering the same features, I would hope the developers give us a patch. And if these apps are used in reviews (which again is perfectly fine), just please make note of the fact that they use optimized NV calls.

In these kinds of cases -- where the game engine will really be based on the demo engine (I don't know much about Vulpine, though; correct me where I'm wrong) -- isn't it the job of 3Dlabs, Matrox and ATI to push for those patches? So if the games happen to favor Nvidia only, haven't the others actually done their job poorly? (Of course developers should be interested in maxing out their customer base too.) Sometimes it's almost like some people blame Nvidia for getting their product accelerated, patch or no patch, and that doesn't seem completely fair to me. I don't mean jb or anyone else here, just that I've seen that attitude appear in places from time to time.
 
What if the developer says NO? DroneZ and Aquanox were designed on Nvidia hardware, and Vulpine GLMark wouldn't EVEN run on a Radeon 64 MB DDR, due to heavy use of Nvidia PROPRIETARY extensions, until a patch was released.
Even Aquanox being a Direct3D game doesn't mean it's not OPTIMIZED for specific hardware, i.e. a compiler similar to Cg...
The key is that DroneZ and GLMark both contain PROPRIETARY extensions that no ATI card can support, so it's up to the developer to make the change to a universal extension.

To show you how pathetic GLMark is: I had a buddy drop by my shop and we took out his GTS and put a Radeon 8500 in... this was on a P3 1 GHz machine. He wanted to see what all the fuss was about with the IQ of the Radeon and was looking for a new card.
Vulpine ran significantly faster on the GTS than on a Radeon 8500 clocked at 300 MHz, but when we ran MOHAA and Serious Sam the 8500 was putting up numbers 3x better than the GTS :rolleyes:
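To make the "universal extension" point above concrete, here is a minimal sketch (not from this thread) of how an OpenGL game of that era typically picks a vertex path at startup: it checks the driver's extension string and prefers a vendor-specific fast path when it's there, falling back to something every card exposes. The extension names are real OpenGL extensions, but the path names and the hasExtension() helper are just illustrative assumptions.

Code:
#include <GL/gl.h>
#include <cstring>

// Hypothetical render-path labels for this sketch.
enum VertexPath { PATH_NV_VAR, PATH_ATI_VAO, PATH_GENERIC };

// Naive substring check of the extension string (good enough for a sketch;
// a real engine should match whole tokens).
static bool hasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::strstr(ext, name) != 0;
}

VertexPath chooseVertexPath()
{
    if (hasExtension("GL_NV_vertex_array_range"))
        return PATH_NV_VAR;    // NVIDIA-specific fast vertex path
    if (hasExtension("GL_ATI_vertex_array_object"))
        return PATH_ATI_VAO;   // ATI's rough equivalent
    return PATH_GENERIC;       // plain vertex arrays run everywhere
}

A patch of the kind discussed above often amounts to little more than adding the second branch and the render path behind it.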
 
Re: As for HardOCP and Brent

Tygrus_Artesiaoa said:
Really though, I'm trying to play around with FRAPS. Many games today don't have built-in benchmarking, including some of the cool games out now. Supposedly FRAPS doesn't kill performance anymore either.

If FRAPS wasn't around, many of my previews would be stale. As long as you can come close to replicating actual gameplay with various graphics settings, FRAPS is an excellent measure of performance.

For example, in a GeForce3 Ti 200 review and others, I play part of a level in Max Payne that I have memorized with FRAPS running behind the scenes.

http://www.nvnews.net/reviews/prolink_gf3_ti200/page_9.shtml

This technique is indicative of actual gameplay as opposed to the typical Max Payne benchmark that's been used. This type of analysis isn't rocket science and reviewers should come up with imaginative ways to measure performance.
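For what it's worth, here is a minimal sketch of the arithmetic behind this kind of manual FRAPS run: reduce a list of per-frame times to an average and a worst-case frame rate. It assumes a hypothetical frametimes.txt with one frame time in milliseconds per line; that is not FRAPS's actual log format, just an illustration of the idea.

Code:
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream in("frametimes.txt");   // hypothetical: one frame time (ms) per line
    std::vector<double> ms;
    for (double t; in >> t; )
        ms.push_back(t);
    if (ms.empty()) { std::cerr << "no samples\n"; return 1; }

    double total = 0.0, worst = 0.0;
    for (double t : ms) { total += t; if (t > worst) worst = t; }

    std::cout << "frames:  " << ms.size() << "\n"
              << "avg fps: " << 1000.0 * ms.size() / total << "\n"
              << "min fps: " << 1000.0 / worst << "\n";
    return 0;
}

The minimum figure is arguably the one that matches what you feel while playing, which is the point of benchmarking real gameplay instead of a canned timedemo.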
 
This technique is indicative of actual gameplay as opposed to the typical Max Payne benchmark that's been used. This type of analysis isn't rocket science and reviewers should come up with imaginative ways to measure performance.

Good idea there, MikeC.
You get an actual feel for how smooth the game plays with various cards and settings.

Hopefully we will see more of this type of analysis in the future, rather than just running a benchmark and noting down numbers on paper.
 
This technique is indicative of actual gameplay as opposed to the typical Max Payne benchmark that's been used. This type of analysis isn't rocket science and reviewers should come up with imaginative ways to measure performance.

So very true.
 
Mike,

While I can't disagree with your point completely, there's one detail that bothers me: results have to be reproducible.

In a case like the one you illustrated above with Max Payne, you'd have to find a way to record a cutscene/replay that reflects your findings as accurately as possible and then publish it for download.

Ideally, if webspace weren't so valuable and a lot of users weren't still tied to 56K connections, a reviewer could also record AVIs showing a demo or cutscene running in real time, with the frame counter on of course. A bit exaggerated, but it could keep users from raising doubts.

I personally prefer to see the absolute worst-case scenarios from a game: those demos that absolutely stress the tested hardware to its limits.
 
Well, I still believe a benchmark demo needs to reflect a real-world setup (I've said this before)... a massive multiplayer game like Tribes 2, Unreal Tournament or MOHAA isn't represented by some of the pre-generated demos that have shipped with games.
If the game is a single-player game like Doom 3 and the settings used are 1600x1200 with FSAA and anisotropic filtering enabled, 99% of the time those settings are not what's used in the online game.
If the game is a single-player game and the demo reflects what you will see in the game (Quake 2's famous timedemo is a perfect example here), then sure, I have no problem with the reviewer posting benchmarks with full eye candy. What I do have a problem with is posting those scores for a multiplayer game with as many as 32 players running around, which is usually A LOT more graphics-intensive than the average timedemo.
Sorry, three-year-old Quake 3 doesn't apply.
What I took out of Mike's comment is that some games don't get used just because they lack a built-in frame counter... BIG DEAL.
FRAPS has proven to be accurate, and games like Madden 2002 and NHL 2002, with very highly detailed models, would be great benchmarks IMO.
 
Doomtrooper,

I've no objection at all against using FRAPS or Mike's method. Nor, by absolute worst-case scenarios, did I mean AA or aniso enabled.

Take UT: there's more than one benchmark demo available for it. Reverend, I believe, was attempting to create a non-CPU-limited demo for that game when he created the Thunder demo. IMO it reflects single-player performance more than anything else. For multiplayer performance I'd rather use utbench to see how hardware would fare online, as CPU-limited as it may be.

Are the timedemos/benchmarks at fault in the cases you describe, or the interpretations given to the results in the end?

I merely meant to say that results have to be reproducible by the reader. Rev has used Max Payne in his reviews too, and I recall him having published the cutscene he used. When you as a user can reproduce the results, then you can be certain how representative they are (or aren't) of specific games.

How certain are you about UPT2002, for example, at the moment? See my point?
 
Cut Kyle some slack. I don't really see eye to eye with him but I don't see how you can slag him for stating his opinions (whatever they may be) on his own website.

Regarding benchmarking: people like to compare... that's the problem.
 
Doomtrooper said:
What if the developer says NO? DroneZ and Aquanox were designed on Nvidia hardware, and Vulpine GLMark wouldn't EVEN run on a Radeon 64 MB DDR, due to heavy use of Nvidia PROPRIETARY extensions, until a patch was released.

Even Aquanox being a Direct3D game doesn't mean it's not OPTIMIZED for specific hardware, i.e. a compiler similar to Cg...
The key is that DroneZ and GLMark both contain PROPRIETARY extensions that no ATI card can support, so it's up to the developer to make the change to a universal extension.

Why would a developer say 'no'? Resources and target audience. If the latter isn't at some sort of critical mass then it's hard to justify spending the former. This was certainly true for ATI regarding games until somewhat recently. This is still true for PVR.

If the dev team doesn't have the resources (programmers cost big money), then who is going to pony up for the cost? This is exactly how 3dfx got games to support Glide: because 3dfx PAID for it in one way or another. Nowadays NVidia's developer support team has taken 3dfx's place. They literally go out of their way to make sure upcoming games work best on their cards. Is that a developer's fault? Maybe, but it's hard to argue against the economics when NVidia is literally doing the R&D (and allowing it to be cut-and-pasted, no less, afaik) on the software side.

If you want to blame anyone for the lack of ATI support, blame ATI because they allowed themselves to lose developer mindshare. It's not some sort of dev team conspiracy.
 
Developers say 'NO' all the time...
I've posted this already once; it's by an ATI dev relations fellow named Jeff Royle...

This question refers to a partial role of the Developer Relations team actually. As I'm part of this group I have a certain knowledge of how we handle many of these cases.

There are a high number of games that were developed on non-ATI boards which means any driver bugs they have may be worked around in code. If the game is not tested on ATI boards before release and these bugs found, the game goes gold and ships that way. When the bug is eventually found and determined to be a game bug, we contact the developers of the game and let them know. We can then request a patch if they are willing and even offer advice on how to fix it. ATI will not knowingly break a driver to make a game work.

In rare cases, developers will not create a patch and then we can only take note of the title and try to remember the bug for future reference. The state of the development community seems to be shifting for the better these days and many bugs are hammered out well in advance of shipping, some later on. We do our best to get all titles tested and bugs found.

The biggest problem we encounter is that end users don't always realize it's not a driver issue that causes the problems. When the game is written as above, on different graphics hardware and bugs are just accepted and worked around in code then it's hard for us to say "But the problem is in the game" because end users see it works for other people on different graphics hardware. ATI already has a bad rap for drivers and yet we won't intentionally leave a bug in a driver. Competitors occasionally will leave a known bug in the driver maybe because they are afraid of what everyone will think when they actually fix it.

Recently I've seen a trend where developers let us know of bugs in the competitor drivers which acts as a heads up for us. Then when other developers come across a problem we can offer the advice that it may not be a bug on our side.

This has gotten too long winded and poorly worded I'm sure so that's it for now.

Jeff

DroneZ runs on my 8500; that's all the developer cares about. Does it have all the eye candy and optimizations that a GeForce 3 user saw? NO.
 
If you want to blame anyone for the lack of ATI support, blame ATI because they allowed themselves to lose developer mindshare. It's not some sort of dev team conspiracy.

This is exactly the kind of thinking that needs to stop. Games need to work well on all hardware with no vendor-specific optimizations; that was the whole idea behind DirectX and OpenGL (before proprietary extensions that required license agreements arrived)... Glide is gone, and that way of thinking should go with it.
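As a minimal sketch of what that vendor-neutral idea looks like in practice under DirectX 8 (not something from this thread): the engine asks the driver what it can do through the standard caps structure and picks a render path from that, never from who made the chip. The device pointer and the path names are assumptions for illustration.

Code:
#include <d3d8.h>

// Hypothetical render-path labels for this sketch.
enum RenderPath { PATH_FIXED_FUNCTION, PATH_VS11, PATH_VS11_PS11 };

RenderPath chooseRenderPath(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return PATH_FIXED_FUNCTION;                // be conservative on failure

    const bool vs11 = caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);
    const bool ps11 = caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);

    if (vs11 && ps11) return PATH_VS11_PS11;       // GF3/4, Radeon 8500 class hardware
    if (vs11)         return PATH_VS11;
    return PATH_FIXED_FUNCTION;                    // works on everything
}

Any card that reports the caps gets the eye candy, regardless of vendor.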
 
Your first post contained nothing that disagreed with what I wrote. ATI knows/knew they had a bad rep for drivers. Period. ATI knew that they had lost developer mindshare (as I've mentioned before, an ATI rep at E3 admitted this to me and even had an NVidia card in his machine!). Again, they've gotten MUCH better, and to be honest, for my home machine I wish I had not gotten my Ti200 (even though it OCs like a monster) and had instead gotten a cheap 8500 (which I even recommended to a fellow team member, who went ahead and ordered one).

Doomtrooper said:
If you want to blame anyone for the lack of ATI support, blame ATI because they allowed themselves to lose developer mindshare. It's not some sort of dev team conspiracy.

This is exactly the kind of thinking that needs to stop. Games need to work well on all hardware with no vendor-specific optimizations; that was the whole idea behind DirectX and OpenGL (before proprietary extensions that required license agreements arrived)... Glide is gone, and that way of thinking should go with it.

Games will work well on all hardware when all hardware is equal in terms of performance and marketshare. Till then you're out of luck.
 
I can't believe anyone today would still think that's good for the gaming industry, but to each their own.
I don't buy products based on market share... I buy based on price, performance and features; otherwise I would drive a GM and own Intel, and I do neither :rolleyes:

Being BIG does NOT mean better.
 