More 3DMark from ExtremeTech

Deflection said:
http://www.extremetech.com/article2/0,3973,888060,00.asp

I thought this was a pretty fair review from both sides of all the hoopla. Updated drivers from both sides as well as an updated version of the benchmark. They even (*gasp*) talked to developers and Dell as well as ATI and Nvidia for the article instead of just running the benchmark.

Yes, out of the sites that wrote editorial content last week, ExtremeTech was the only one that actually wanted to have a conference call with us prior to going live with their opinions. While we had not been able to write out all the technical details (our response was sent out on Friday night), their assessment of the situation was very in-depth and accurate.

Cheers,

AJ

Please note that there's nothing wrong with writing or reporting news as fast as it happens. However, with editorial content, it's often considered polite to have more than a single source.
 
Nagorak said:
I think you're mixing up card generations and DX generations. The reason the Radeon 9700 is faster than the Radeon 8500 has nothing to do with DX9. Remember, the R8500 was a lot faster than the original Radeon too (and even faster than an equally clocked R7500). If DX9 had never been released, the 9700 would still be faster simply because it's a newer (and better) architecture. In fact, outside of 3DMark we have no real measure of DX9 performance at all.

I don't really see anything at all to support your contention that DX8.1 is just going to fall by the wayside. DX8.1 added a lot more than some of the other DX versions, so I don't really see how it's a "waste". Time will tell, I suppose.

Actually, no, I don't think I've mixed things up. I know that it's got nothing to do with DX9 itself. DX9 cards are still a LOT faster than DX8 cards, though.

And as Dave reported:

The point being that I've had a number of conversations with developers saying that, well, DX8 was a bit of a waste - it's the interim release that introduces stuff, but ultimately gets pushed to the sidelines.
 
I don't believe any of the 1.4 shader capable cards will be considered "too slow" by a lot of people even two years from now. One of my flatmates in my student apartment has a Celeron 500 with a TNT2 video card. He'll be taking over my 9000 Pro after this summer, as I will be upgrading to something faster by then.

He happily plays NFS: Hot Pursuit 2 on his current system, which is almost exactly the minimum supported spec for this game. He has to play at 640*480 with all details at the lowest and still gets what I consider crappy FPS. He doesn't see any problem with this, however, and even occasionally wins against me and other flatmates with faster systems. Recently he bought Jedi Knight 2 and is happy with that (at low detail, etc.) as well.

I know several more people like him. I am quite sure he (and they) will have no problem playing Unreal 3 or whatever at 640*480 on a Radeon 9000 two years from now.

What does this mean from a developer's (or should I say publisher's) point of view? IMHO it would be worthwhile to code the game's basic shader models (for walls, skins, etc., like in GT2 and 3) in PS1.4 (with a 1.1 fallback if possible), with optional 2.0 for advanced effects. Kind of like how Morrowind is DX6/7 with an optional pixel shader for water.
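Just to illustrate the idea (this is my own rough sketch, not anything from the post or an actual game): a renderer doing that kind of tiering would typically check the reported pixel shader version in the Direct3D 9 caps at startup and pick a path from it. The enum and function names here are made up for the example.

[code]
// Minimal sketch of caps-based shader path selection under Direct3D 9.
// PS 2.0 drives the optional advanced effects, PS 1.4 is the main path,
// and PS 1.1 is the fallback for older DX8 hardware.
#include <d3d9.h>

enum ShaderPath { PATH_PS11, PATH_PS14, PATH_PS20 };  // illustrative names

ShaderPath SelectShaderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    // Compare the device's reported pixel shader version against known levels.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS20;   // optional advanced effects (Radeon 9700 class)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;   // main path (Radeon 8500/9000 class)
    return PATH_PS11;       // fallback for PS 1.1 hardware (GeForce3/4 class)
}
[/code]

The point is just that the 1.4 and 1.1 paths cover the bulk of the content, and the 2.0 branch only adds extras on top, much like Morrowind's optional water shader.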
 
Hi AJ,

Yes, out of the sites that wrote editorial content last week, ExtremeTech was the only one that actually wanted to have a conference call with us prior to going live with their opinions. ... Please note that there's nothing wrong with writing or reporting news as fast as it happens. However, with editorial content, it's often considered polite to have more than a single source.

Well spoken.

This is the exact issue that I raised with Lars and HardOCP concerning their opinions on 3DMark. I had no problems with them "running the story" (about nVidia's complaints with 3DMark) as fast as possible.

But creating editorial content (and, in the case of HardOCP, coming to a conclusion on 3DMark use) without so much as even attempting to factor in a response from FutureMark is absurd IMO. It's quite simply "bad form."
 