5900Ultra review at [H]ard OCP

Disagree BIGTIME... this has already been discovered, yet there's no mention of it (and Daniel Vogel replied in that thread and is aware):

http://www.beyond3d.com/forum/viewtopic.php?t=6719&start=0

Yet he has no problem putting up graphs with the FX card disabling Trilinear even though it is set in the control panel.
I would not call that a fair review at all. In fact, there was no renaming of .exes, and they used that same old [H] benchmark that has been targeted for optimizations (a 50% improvement in recent driver releases).

http://www.hardocp.com/article.html?art=NDk2LDM=

Typical inaccurate REVIEW from [H]!
 
I'm really disappointed in Brent not mentioning the whole FX filtering mess in UT2003. He knows about it.

The rest of the review's style was good though, Brent; don't lose your knowledge and integrity... please.
 
I've had a conversation with Brent about this already, and all I can say is that I was absolutely and utterly floored by the response he gave me. [H] are now setting a really dangerous precedent IMO.
 
Doomtrooper said:
they used that same old [H] benchmark that has been targeted for optimizations (a 50% improvement in recent driver releases).
Were those improvements in the 44.03 release? Because that's what Brent used.
 
DaveBaumann said:
I've had a conversation with Brent about this already, and all I can say is that I was absolutely and utterly floored by the response he gave me. [H] are now setting a really dangerous precedent IMO.

Details? If it's something they believe in, I'm sure the HardOCP team will have no problem with you making it public knowledge. I'm seriously guessing the gist of it is "if you can't see it, it's not cheating", i.e. if Nvidia pulls the same "optimisations" in games, then they are valid in the timedemos.

Sad. I used to visit HardOCP several times a day - I haven't been there since they abandoned any pretense of objectivity or knowledge.
 
To paraphrase: "The fact is that it's faster; when people play the game, that's the IQ they will get, and they don't run it with mipmap levels on; apparently real IQ doesn't matter to users just so long as it looks OK and plays fast."

This is a really damning path to go down. Now it's a case that users shouldn't necessarily be presented with the actual IQ these games boards were sold as delivering, just so long as it gives passable IQ and good performance (or just enough to make it look good in the face of the competition). Apparently users can't see the difference between real Trilinear and the "very close to Bilinear" that's being given on the detail textures, so what's the point of having Trilinear at all now, eh?
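For anyone unclear about what's actually being dialed down here: Trilinear blends the two nearest mipmap levels by the fractional LOD, while Bilinear snaps each sample to a single level, which is exactly what produces visible banding at the level transitions. A minimal sketch (plain Python with made-up values, not any driver's or game's actual code):

```python
# Sketch of mip-level selection, not any driver's actual code.
# Each mip level is represented by its index so the difference in
# behaviour is easy to see.

def bilinear_mip(lod: float) -> float:
    """Bilinear: sample only the nearest mip level.

    Neighbouring pixels can jump between whole levels, which shows
    up on screen as banding at the mip transitions.
    """
    return float(int(lod + 0.5))  # snap to a whole level

def trilinear_mip(lod: float) -> float:
    """Trilinear: blend the two surrounding mip levels by the
    fractional LOD, giving a smooth ramp with no visible bands."""
    lo = int(lod)
    frac = lod - lo
    return lo * (1.0 - frac) + (lo + 1) * frac

if __name__ == "__main__":
    for lod in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(lod, bilinear_mip(lod), trilinear_mip(lod))
```

Across a LOD sweep, the bilinear version steps abruptly from one level to the next while the trilinear version ramps smoothly, which is why dropping detail textures back to near-Bilinear reintroduces the banding people are reporting.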

Of course, there were screenshots there to show the IQ delivered, but that was only a half-assed job, as it's apparently OK to say that one board "dominates" another without actually showing comparative IQ.

The precedent that [H] is setting here is just utterly fucking sad and stupid: all ATI has to do is release a set of drivers that does exactly the same, dial down IQ to increase performance, and that ultimately puts us on a path that screws all consumers.

We'll also gloss over the fact that the situation here is exactly the same as the whole Quake/Quack fiasco. Back then [H] was willing to make a whole bunch of noise about it, but there's nary a peep now that the shoe is on the other foot, which really leads you to only one conclusion.
 
I think it's something more along the lines of "Nvidia pays us this much $ and will continue to do so if we play ball with them and help bail them out of the mess they got themselves into." No one with basic typing skills and the ability to function as a normal human can rightly justify what Nvidia has done lately.

I'm giving them the benefit of the doubt in thinking they are greedy, as opposed to being so completely stupid as to think Nvidia's cheats are OK.
 
I've been stating this since the FX launch: HardOCP is paid off. Time for people to accept the truth... denial in the face of so many facts just shows how "strung along" some people are.
 
DaveBaumann said:
The precedent that [H] is setting here is just utterly fucking sad and stupid: all ATI has to do is release a set of drivers that does exactly the same, dial down IQ to increase performance, and that ultimately puts us on a path that screws all consumers.

What next, force all benchmarks down to monochrome 320x240 and claim 600 frames per second? So much for "cinematic rendering". :rolleyes:

In my opinion, HardOCP are Nvidia's lapdogs. Brent now has the same low credibility as Kyle. Looks like the only thing Nvidia succeeded in doing is dragging everyone else into the gutter with them.

I hope Dave Orton has the courage to stick to the high ground and keep ATI focussed on making great, fast cards with good IQ - something Nvidia isn't capable of doing at all.
 
I had the opportunity to play a little with an FX 5800, and AF is totally messed up in UT2003, just as the shots showed; you can see it when playing, there is mipmap banding with the Quality AF setting. So much for the "it can't be seen when playing" BS. Application AF didn't work at all; that gave me something weird, like bilinear with a tweaked mipmap LOD.


...and Dave, the f word? You must really be pissed :oops:
 