First few GFX benches

Status
Not open for further replies.
OpenGL guy said:
I disagree: Recall the pinwheel shots that were posted here before. The 9700 Pro had much better results than non-gamma corrected results.

As I recall, the difference was largely monitor-dependent, which makes sense if you think about why gamma-correct FSAA is good in the first place (it's a correction for how monitors display data). On my monitor, it turns out that it's slightly better, but not by a significant amount. Overall, the gamma-correct FSAA tends to over-correct. This is why I still feel that the Radeon 9700 really should have some sort of calibration utility for gamma.

Do you know if the gamma correction level for FSAA is adjustable on the Radeon 9700? If so, an optimal situation would be to have something similar to the pinwheel program I produced (though potentially also examining each color channel separately...), so that every owner who has the inclination can adjust the gamma as it suits himself/herself.
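The idea of a user-adjustable gamma for the FSAA resolve is easy to sketch. This is purely illustrative Python, not anything from ATI's driver; the function names and the 2.2 default are my own assumptions:

```python
# Sketch of a gamma-correct FSAA sample resolve with an adjustable
# display gamma, mirroring the idea of a per-monitor calibration
# utility. 2.2 is only a common default, not a value any driver is
# known to expose.

def resolve_samples(samples, display_gamma=2.2):
    """Average FSAA coverage samples in linear light, then re-encode.

    `samples` holds per-sample intensities in gamma (display) space,
    each in [0.0, 1.0]. A naive resolve just averages them directly;
    a gamma-correct resolve linearizes first, which is why edge
    gradients look different depending on the monitor's actual gamma.
    """
    linear = [s ** display_gamma for s in samples]  # decode to linear light
    avg = sum(linear) / len(linear)                 # filter in linear space
    return avg ** (1.0 / display_gamma)             # re-encode for the display

def naive_resolve(samples):
    """Plain average in gamma space (no correction)."""
    return sum(samples) / len(samples)

# A half-covered edge pixel: two white samples, two black samples.
edge = [1.0, 1.0, 0.0, 0.0]
print(naive_resolve(edge))         # 0.5
print(resolve_samples(edge, 2.2))  # ~0.73 -- brighter, matching perceived coverage
```

If the correction over-corrects on a given monitor, lowering `display_gamma` toward 1.0 pulls the result back toward the naive average, which is exactly the per-user adjustment being asked for.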
 
So what's more important to image quality? Antialiasing, or anisotropic filtering?

Ask that to 100 different people and you'll probably get half on one side, half on the other.

Who says image quality can be made strictly objective?
 
Joe DeFuria said:
On the contrary, IMO, it's because B3D is one of the few with enough depth and intelligence to actually do an informative "shoot-out," is exactly why I'd like to see B3D do them. ;)

Exactly what I was getting at ;)
 
Chalnoth said:
Do you know if the gamma correction level for FSAA is adjustable on the Radeon 9700? If so, an optimal situation would be to have something similar to the pinwheel program I produced (though potentially also examining each color channel separately...), so that every owner who has the inclination can adjust the gamma as it suits himself/herself.

3D Deep had such a utility.
 
Sharkfood, you just don't get it. You are so fixated on trying to pin me down as pro-Nvidia that you simply don't read. Saying something isn't *quantifiable* doesn't mean you can't compare it.


I haven't owned any NVidia stock for a year, and I haven't had an Nvidia card in my computer almost as long. I haven't once said anything bad about the Radeon 9700. All I have continued to say is that these two IHVs each have different feature focuses and make different tradeoffs. On the whole, they will run most games about the same, and ATI will score better in some areas, and NVidia in others. People will choose these cards based on which areas they feel are more important. Some people will perceive ATI's FSAA IQ to be slightly better. However, for many people, this fact alone won't be enough to get them to buy the card. If FSAA IQ were the only thing that mattered, 3dfx would now be king.


On the reverse, we get hundreds of people writing nasty stuff about the NV30 and it isn't even out yet, with many specifics about it still shrouded in mystery. If correcting nonsense assertions and speculations is being pro-NVidia, so be it. Maybe if you roll back to around the R300 launch, you'll see I gave ATI the same treatment on many issues related to programmable shading.

I think I have been very fair to both IHVs on this board. But I'll bet your history is not so clean.
 
I haven't seen a lot of bad things said on this forum about the NV30. I don't like the physical size of it, and the price doesn't sound good from rumours... those are the only threads that said anything 'bad' about it, and those are legitimate arguments.

This thread is centering on IQ in review comparisons, which is a good thing IMO...
 
On the contrary, IMO, it's because B3D is one of the few with enough depth and intelligence to actually do an informative "shoot-out," is exactly why I'd like to see B3D do them.

True.

Damn, it's cold outside. Sodding cat.
 
Heathen said:
On the contrary, IMO, it's because B3D is one of the few with enough depth and intelligence to actually do an informative "shoot-out," is exactly why I'd like to see B3D do them.

True.

Damn, it's cold outside. Sodding cat.

Don't tell me you walk your cat...
 
Wow, I am truly amazed. Is it so hard to just be wrong sometimes?

Shark, you understand demo's point exactly, yet you turn it into something else so you can "prove" it wrong. Your reply to demo's quantify/quality lesson went right AROUND what demo said. Sheepish.

You can't quantify something that is subjective. Why is that so hard to understand? If you could, that something wouldn't be subjective. That's why demo made the crack about the movie reviewer...
 
Sharkfood, you just don't get it. You are so fixated on trying to pin me down as pro-Nvidia that you simply don't read.

I didn't say pro-NVIDIA. And no, it's the concept and not the IHV in particular. And I would simply state you should read my posts in their entirety as you have now dreamt up your own content from them.

The concept is: no, you can't state a standard deviation for "Image Quality," but you can clearly illustrate A > B... so accordingly, in these cases, benchmarks cannot then be displayed as A = B.

We have seen it with SS graphs vs MS, we have seen it with trilinear vs bilinear. We have seen it with a broken zbuffer popping geometry in and out of a scene vs clear/perfect. We have seen it with complete details missing (rocket trails, fences, trees, skylines, etc.). In all these cases, we have seen the A = B stipulation, with the focus being the ability to benchmark these kinds of stark variances in order to provide higher numbers for a particular IHV.

In all cases, the fallback excuse for such radically perverse behavior has been the age-old "IQ is subjective" nonsense, which doesn't even come close to applying in these kinds of comparisons.

For crying out loud, Anand posted benchmarks of the 9700 Pro with 16x anisotropic filtering versus a Ti4600 at 4x anisotropic filtering.

I still can't believe that, in the past, sites reviewing the 8500 and GF3 benchmarked trilinear anisotropic filtering on one card against bilinear anisotropic filtering on the other. Running 2x, 4x or 8x on both products was stark and dramatic: one had extremely aggressive LOD with starkly visible seams, the other had no such issue. You can count on one hand the number of websites that made this a point, versus the hundreds of sources that just benchmarked them all together as A = B.

On the reverse, we get hundreds of people writing nasty stuff about the NV30 and it isn't even out yet

This is where someone would get the theory of your "unfairness" to a particular IHV.

It's not hundreds of people writing nasty stuff or dreaming up fictional or factless stories. When nearly every reviewing source clearly states the thing is loud, sounds like a buzzsaw, and that NVIDIA reps joked "Well, our gamers won't hear the loud fan since they will be wearing headphones" (paraphrased), these are concerns based on evidence. People concerned with size and noise aren't making baseless chatter.

People saying that all IQ is subjective, and that no bounds should be drawn, are living in a Utopian world where no IHV cuts corners or markets features to a substantially lesser degree than others. Especially when a singular approach is a direct attempt at dramatically reducing quality for the sake of improving performance, the "IQ is subjective" umbrella only encourages such behavior.

We had the same problems 3-4 years ago with surreal default LOD bias, back when it had a larger impact on benchmarks. It was known that benchmarks would be run blindly and that the fancy graph peaks were all that mattered, regardless of final rendered image quality. Anyone trying to point out that a texture in benchmark A had only 18-20 isolated, distinct colored pixels versus 270 in benchmark B was thrown under that same "IQ is subjective" umbrella, when there was no doubt that the texture's details were dramatically truncated in benchmark A. The artists making the game didn't spend the time putting those details into the texture so that LOD bias could throw 90% of them away.
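The "distinct colored pixels" comparison is actually easy to make concrete. Here is a toy Python sketch (the synthetic texture and the 2x2 box filter are my own simplifications, not any IHV's actual mip generation) that counts distinct texel values as a texture is filtered down, the same kind of crude but repeatable measurement being described:

```python
# Toy illustration: an aggressive LOD bias makes the hardware sample from
# a smaller mip level, and each box-filter step of mip generation collapses
# distinct texel colors. Counting distinct colors per level is a crude but
# quantifiable proxy for the detail loss. The texture here is synthetic;
# real mip generation may use better filters.

def downsample(tex):
    """One mip step: 2x2 box filter over a square list-of-lists texture."""
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) // 4
             for x in range(n)] for y in range(n)]

def distinct_colors(tex):
    """Number of distinct texel values in the texture."""
    return len({t for row in tex for t in row})

# 8x8 texture with a different value in every texel (maximum detail).
base = [[y * 8 + x for x in range(8)] for y in range(8)]
levels = [base]
while len(levels[-1]) > 1:
    levels.append(downsample(levels[-1]))

for i, lvl in enumerate(levels):
    print(f"mip {i}: {len(lvl)}x{len(lvl)}, {distinct_colors(lvl)} distinct colors")
```

A number like this is objective and reproducible, which is exactly why "IQ is subjective" doesn't cover cases where one card is sampling from a visibly coarser level of detail.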

That's the point.
 
Who's making excuses? And for what?

You just summed up what's already been said, Shark. Obviously you can compare image quality between cards. How is stating that IQ is subjective an excuse or a fallback? It just is. Nothing evil or conspiratorial at all.
It's what turns reviews into a quagmire of useless figures and screenshots more often than not.

No one here is saying you can't compare and contrast images rendered by two different cards and see which may be capable of superior IQ. You just can't put a number on it or give a reference for it.

And round and round she goes...
 
Sharkfood said:
It's not hundreds of people writing nasty stuff or dreaming up fictional or factless stories. When most every reviewing source clearly states the thing is loud, sounds like a buzzsaw and that NVIDIA reps joked by saying "Well, our gamers wont hear the loud fan since they will be wearing headphones" (kind of thing, paraphrased)- these are concerns based on evidence. People concerned with size and noise aren't making baseless chatter.

This is the kind of nonsense I'm talking about. No one has had the chance to "review" anything, because there are no review samples. This is baseless chatter. When the card ships, and you can actually review a production card, then you will have evidence.


I touched an NV30 in person at the COMDEX launch; I watched it inside an open case playing UT2003. I heard no abnormal noise levels. Moreover, NVidia made "Silent Running" part of the touted features of the NV30 presentation I attended.

Sounds like a "buzzsaw", I mean COME ON. Talk about hyperbole. Just go review the way Doomtrooper or HellBinder describe the NV30 whenever they mention it. What's next, sounds like a 747 takeoff? Space Shuttle launch?

Puh-lease. The card isn't out yet, and there are zero reviews of any production samples. It's the very definition of groundless attack. You think I am pro-Nvidia because I have to correct such nonsense even from people who sound like they should know better (e.g. you).

I see the "usual crowd" being fairer about the paper launch of S3's DeltaChrome than I've seen them be towards the NV30. That's fair as in "I welcome the competition, but can't make a judgement yet, so I'll withhold baseless speculation and judgement until it ships."
 
DemoCoder said:
Sounds like a "buzzsaw", I mean COME ON. Talk about hyperbole. Just go review the way Doomtrooper or HellBinder describe the NV30 whenever they mention it. What's next, sounds like a 747 takeoff? Space Shuttle launch?

Puh-lease. The card isn't out yet, and there are zero reviews of any production samples. It's the very definition of groundless attack. You think I am pro-Nvidia because I have to correct such nonsense even from people who sound like they should know better (e.g. you).

*cough*

Umm, yes it has, by TweakTown... albeit an early card, but the reference to noise is made quite clearly...



http://forums.tweaktown.com/showthread.php?s=&threadid=6811



- Cooling

While we couldn’t see it, the fan cooling the heat pipes was very loud – we are talking almost Delta-like volume levels. Possibly, as we get closer to seeing these cards in retail, nVidia may tweak the cooling systems to a more noise tolerable level – at least I hope so.

When quizzed by a gamer at the sound levels coming from the back of the card, an nVidia rep was quick to suggest that it wouldn’t matter much because gamers would be using headphones during their gaming. Unless the cooling technology has thermal throttling (which it very well may, mind you) I would have to disagree with this notion.

Say you are listening to music or fragging away with your desktop speakers, the hum of the cooling fan will still be audible since we do not all use headphones.

Puh-lease get the facts straight ;)
 
You get your facts straight. TweakTown did not "review" anything. They got the same demo I got at COMDEX, which was to stand next to a bunch of NVidia launch partner machines and check out the demos.

But of course, I heard no such "Delta"-like noise coming from the machine, nothing louder than the fan I currently hear on my card. Moreover, the Nvidia reps there were touting Silent Running, where the driver adjusts fan speeds based on load, as a major noise reduction feature. Quite the contrary of the "gamers won't notice noise" attitude.
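For what it's worth, a load- or temperature-driven fan curve of the sort "Silent Running" implies is simple to sketch. NVIDIA never published the actual logic; every threshold and duty-cycle number below is invented for illustration:

```python
# Generic piecewise-linear fan curve of the kind a "driver adjusts fan
# speed with load" feature implies. The temperature thresholds and duty
# cycles are made up; the real driver behavior is unknown.

def fan_duty(temp_c, curve=((40, 0.25), (60, 0.5), (85, 1.0))):
    """Map a GPU temperature (deg C) to a fan duty cycle in [0, 1].

    Below the first point the fan idles at that point's duty; between
    points the duty is linearly interpolated; above the last point the
    fan runs flat out.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)  # position between points
            return d0 + frac * (d1 - d0)      # linear interpolation
    return curve[-1][1]

print(fan_duty(35))  # 0.25 -- idle/desktop: quiet
print(fan_duty(50))  # 0.375 -- light 3D load
print(fan_duty(90))  # 1.0 -- full load: full speed
```

The point of such a scheme is exactly the one under debate: the card could be quiet at the desktop and loud only under sustained load, which is why a noisy show-floor demo and a "Silent Running" marketing claim aren't necessarily contradictory.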

But of course, we know you are apt to believe sources biased towards your own beliefs.

The facts are, no one has a production card to review, and hence, no one knows how the AF or AA will look, or if anything's changed.

Those are the FACTS. When you can come up with something I can scientifically reproduce or observe, then you can open your mouth.
 
I'll open my mouth whenever I want... dig? And unlike you, they got pictures... where are yours, Coder? Are you calling TweakTown liars? Are you saying they are spreading FUD? Did you get to look inside the tower?

Even when it's in black and white...

Example # 2...

http://www6.tomshardware.com/graphic/20021118/geforcefx-03.html

The demo board, which NVIDIA demonstrated in an nForce2 system, produced a lot of heat. The air coming out of the fan grille is hot to the touch. While the system was quite loud overall, we could still make out the Flow FX fan - not a very positive trait. NVIDIA has promised to refine the design to make it quieter.


That's two websites that commented on the noise (I guess they are lying too :LOL: )... do us a favor and save some internet bandwidth ;)
 
Technically the Tweaktown people can't accurately make any claims about the noise produced by the GeForce FX cooler, since they admit they couldn't see it. Obviously they didn't get to stick their heads inside the tower either, and there's no evidence that the noise was coming from the GeForce FX and not some other fan in the system. I don't believe that is the case, and in all likelihood the noise was being produced by the GeForce FX, but for argument's sake, since you're always so concerned about proof, I feel I should point out your proof is pretty limp.

Regardless, I'm sure the fan is quite loud, and I'm sure the speed control in the drivers will make it easier for people to have conversations without having to turn off their computer first. However, I still don't think it can possibly be louder than the two YS-Tech fans on my old P125 heatsink, running constantly in an open case next to me. That would be quite a feat.
 
Two sites reported it; that's enough evidence in my books (unless they are all lying :rolleyes: )...

I'm sure it will get better... that is not the point. DC is stating it's all wrong, that Shark's comments are unwarranted, and I'm saying Shark has all the evidence he needs to make those claims.
Stockholders never like to see their investment in a bad light :D
 
Doomtrooper said:
Two sites reported it; that's enough evidence in my books (unless they are all lying :rolleyes: )...

I'm sure it will get better... that is not the point. DC is stating it's all wrong, that Shark's comments are unwarranted, and I'm saying Shark has all the evidence he needs to make those claims.
Stockholders never like to see their investment in a bad light :D

Yep, they reported. But were any of their reports based on a review of the final product?
 
It doesn't matter, because any negative news, even if on alpha silicon and PCB, is good news for Doomtrooper and gang.

Benefit of the doubt? Wait and see attitude? Unheard of. Try frolicking with glee.

But boy, watch them spring into action over any early bugs in ATI's drivers that were reported.
 