GeForce FX Digit-Life tidbits

Ailuros said:
I don't want to get the thread off track here, but the site's credibility hit another low IMHO when I read their latest October digest. Anyone curious can just check the high-end usability ratings, for instance.

Why? Those are just calculated numbers. I'm not sure I agree with the way they calculated those numbers, but it's just calculations.

As a side note, it really seems like Digit-Life does a very good job at putting forth tons and tons of calculations. They don't seem to do a good job at interpreting those calculations.

The way I see it, I don't care who says what about what the GeForce FX supports. We'll find out when the card's in people's hands. If it doesn't support DM with the launch drivers, then it doesn't support it. Possible future enhancements are irrelevant (at launch).

Put another way, anytime you can put in a "maybe," I'd just like to put in an "I don't care, we'll find out later."

After all, the only time it really matters what a video card does and does not support is when you're going to either buy it or program for it. In either case, the only things you should ever look at are the features supported at purchase time. Right now it doesn't matter, because it's not available for purchase. And if you're a game developer, I would just move forward assuming the GeForce FX doesn't support DM. If it turns out to, great. If it doesn't, no loss.
 
Chalnoth said:
Ailuros said:
I don't want to get the thread off track here, but the site's credibility hit another low IMHO when I read their latest October digest. Anyone curious can just check the high-end usability ratings, for instance.

Why? Those are just calculated numbers. I'm not sure I agree with the way they calculated those numbers, but it's just calculations.

As a side note, it really seems like Digit-Life does a very good job at putting forth tons and tons of calculations. They don't seem to do a good job at interpreting those calculations.

The way I see it, I don't care who says what about what the GeForce FX supports. We'll find out when the card's in people's hands. If it doesn't support DM with the launch drivers, then it doesn't support it. Possible future enhancements are irrelevant (at launch).

Put another way, anytime you can put in a "maybe," I'd just like to put in an "I don't care, we'll find out later."

After all, the only time it really matters what a video card does and does not support is when you're going to either buy it or program for it. In either case, the only things you should ever look at are the features supported at purchase time. Right now it doesn't matter, because it's not available for purchase. And if you're a game developer, I would just move forward assuming the GeForce FX doesn't support DM. If it turns out to, great. If it doesn't, no loss.

That sentence didn't require a novel-sized reply in the first place. If you want to justify their tactics or ethics, then fine.

Fact remains that the R300 should be at the top of those lists as a true high-end card, not three variations of a mainstream Ti4200.

And I'll personally start to care about a GF FX when it finally arrives on shelves. Then Digit-Life can tune their high-end usability rating accordingly and put that on top, pfffffffffff.
 
no_way said:
OT:
Dave, did you get the answers to the questions you sent over to NVidia yet?

No. If we get them back at all (and I hope we do), they said it would be a little while yet. I think the questions were a little too deep, so they had to go over to the techies in the States (and then probably pass through the PR filters afterwards). The comment I got was "Hopefully they'll be back roughly in time for your GFFX reference sample". So, let's hope we see them back soon! :D
 
Ailuros said:
Fact remains that the R300 should be at the top of those lists as a true high-end card, not three variations of a mainstream Ti4200.

It wasn't a subjective analysis. It was a calculation, one that included price, as well as lower-resolution benchmarks. Why is it so hard for you to see that this has nothing to do with bias? Just poorly-selected data...
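For what it's worth, here is a minimal sketch of the kind of calculation being argued about. Digit-Life's actual formula isn't given in this thread, so the weighting, the fps averages, and the prices below are all assumptions; the only point is that any score which divides performance by price, fed mostly with low-resolution results, can rank a cheap mid-range card above a faster, more expensive high-end one:

```python
# Hypothetical illustration only: Digit-Life's real formula is not known here,
# and every number is made up. A score that divides performance by price,
# using mostly low-resolution (CPU-limited) benchmarks, can easily place a
# cheap mid-range card above a faster, more expensive high-end one.

def usability_rating(avg_fps, price_usd):
    """Toy 'performance per dollar' score."""
    return avg_fps / price_usd

# Assumed averages dominated by low-resolution tests, where the Ti4200 and
# the 9700 Pro end up scoring almost the same fps.
ti4200   = usability_rating(avg_fps=150.0, price_usd=120.0)   # 1.25
r9700pro = usability_rating(avg_fps=165.0, price_usd=330.0)   # 0.50

print(f"Ti4200:   {ti4200:.2f}")
print(f"9700 Pro: {r9700pro:.2f}")
# The cheaper card "wins", even though it is nowhere near a high-end part
# once AA/AF or high resolutions enter the picture.
```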
 
Why is it so hard for you to see that this has nothing to do with bias? Just poorly-selected data...

I'm not going to pass judgement myself at this time, but ask yourself this question:

WHY was the data selected "poorly"? Two reasons I can think of:

Either:
1) Ignorance / lack of understanding of what makes "good" data...
2) A deliberate attempt to spin results toward an intended conclusion (bias).

It could be either case...but the point is, you can't rule out bias as you have done...
 
But bias being the cause inherently assumes that they'd be willing to try a few different combinations of benchmark settings and final-score calculations before settling on the final result. I don't think they're willing to do that much work.

And, as I said, I don't think they're that good at analyzing the results, and, conversely, I don't think they'd be very good at just "cooking up" an anti-Radeon 9700 calculation without trying it out.

In other words, I'm going to say that they're not biased because I don't have much respect for their intelligence (whoever they may be...).
 
Just poorly-selected data...

Consistently poorly selected data. The highlighted part is exactly what makes the difference here. It's not a site that appeared yesterday.

I don't think they're that good at analyzing the results, and, conversely, I don't think they'd be very good at just "cooking up" an anti-Radeon 9700 calculation without trying it out.

I'll put a reminder sticker on my monitor, then, for their digest after the GF FX release. Let's see how good they'll be with that one.

But bias being the cause inherently assumes that they'd be willing to try a few different combinations of benchmark settings and final-score calculations before settling on the final result. I don't think they're willing to do that much work.

In most cases they just pull results from previously conducted benchmarks. They don't re-benchmark each and every card every month for every application they run.

Ironically, there are tons of AA, aniso, and AA/aniso numbers from all recent high-end accelerators available in different reviews on the site. It's just that the last combination puts any GF4 Ti model at a serious disadvantage.

They do more work than you can imagine to arrive at even such an obnoxious compilation. It's even sadder that you, of all people, are innocently trying to justify them.
 
Hehe. Not just poorly selected data, but poorly selected from often poor data. Remember the superlinear overclocking results?
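As an aside, the reason superlinear overclocking numbers are a red flag is simple arithmetic: in a purely GPU-limited test, performance should scale at most linearly with the clock. A rough sanity check, with made-up numbers (the original results aren't reproduced in this thread):

```python
# Sanity check with assumed numbers: in a purely GPU-limited benchmark,
# performance should scale at most linearly with the core clock. A measured
# gain larger than the clock gain points to bad data (shifting CPU limits,
# driver or settings changes, measurement error), not to the hardware.

base_clock, oc_clock = 300.0, 330.0        # MHz, a 10% core overclock
base_fps, oc_fps     = 80.0, 95.0          # reported frame rates

clock_gain = oc_clock / base_clock - 1.0   # 0.10
fps_gain   = oc_fps / base_fps - 1.0       # ~0.19

if fps_gain > clock_gain + 0.01:           # small tolerance for run-to-run noise
    print(f"Superlinear scaling: {fps_gain:.0%} more fps from a "
          f"{clock_gain:.0%} overclock. Recheck the data.")
```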
 
It's even sadder that you, of all people, are innocently trying to justify them.
They don't need any justification at all! Do you have a GeForce FX card? Did you test it? Did you talk to NVIDIA about it? No, you didn't, and they did, so what possible justification could there be here?! They bring accurate info from NVIDIA themselves, and they bring numbers according to what they've been told and tested (that's right, tested), nothing more, and your rambling doesn't interest them at all (that also applies to certain other individuals around here).

P.S.
What bias? I don't understand certain people; what possible bias do you see here?! I for one see constant bias towards ATI around here: the same people expressing the same opinion about the same subject over and over again, and frankly, most of these people blindly speculate!

NVIDIA are behind schedule, most definitely; there can be no discussion here, they're late with their NV3x parts. But whether that says anything about the timeframe of the announcements of their new products is unknown yet, especially since they've pretty much solved all the issues surrounding the move to 0.13... which I'm not so sure about for ATI (although I did hear positive rumours).

Why don't you people just stop it? We got your point by now, no need to repeat it a hundred times! I agree with Chalnoth here...
 
One side is always biased against the other, but I wouldn't call Ailuros anti-nVidia or pro-ATI. I don't think he has ever owned an ATI card. If a neutral person calls an IHV or website on something, the 'fans' always leap to the defence of their favourite.
 
The information on the vertex shader side comes directly from NVIDIA's Geoff Ballew's mouth, and he was one of the personnel on the NV30 project, so I'll defer to his knowledge rather than anything DL says. Other previews have mentioned things similar to Geoff's statement about the shader being a 'P10-style array' as well.

"A sea of vertex math engines" or something very similar to that.
 
alexsok said:
It's even sadder that you, of all people, are innocently trying to justify them.
They don't need any justification at all! Do you have a GeForce FX card? Did you test it? Did you talk to NVIDIA about it? No, you didn't, and they did, so what possible justification could there be here?! They bring accurate info from NVIDIA themselves, and they bring numbers according to what they've been told and tested (that's right, tested), nothing more, and your rambling doesn't interest them at all (that also applies to certain other individuals around here).

P.S.
What bias? I don't understand certain people; what possible bias do you see here?! I for one see constant bias towards ATI around here: the same people expressing the same opinion about the same subject over and over again, and frankly, most of these people blindly speculate!

NVIDIA are behind schedule, most definitely; there can be no discussion here, they're late with their NV3x parts. But whether that says anything about the timeframe of the announcements of their new products is unknown yet, especially since they've pretty much solved all the issues surrounding the move to 0.13... which I'm not so sure about for ATI (although I did hear positive rumours).

Why don't you people just stop it? We got your point by now, no need to repeat it a hundred times! I agree with Chalnoth here...

I'd urge you to actually read a thread from start to finish before you reply to it.

Digit-Life has in its October digest a high-end usability rating where in first, second and third place you'll find a GF4 Ti 4200, with an R300 only in fifth place. Now which of the two is the true high-end card, in your opinion?

That's what it was about.

I won't even deal with the rest of your outburst; you're way off track anyway.
 
I don't think he has ever owned an ATI card.

Let's just say that there isn't a Radeon I haven't played on yet. Having a Ti4400 clocked at 300/650 MHz in my main gaming machine right now, I know exactly where its limits are.

I chose it because it covered my needs more efficiently at the time of purchase, but that doesn't keep me from acknowledging that there's something with far more potential out there on shelves in the meantime.
 
Chalnoth said:
Why? Those are just calculated numbers. I'm not sure I agree with the way they calculated those numbers, but it's just calculations.

based on spurious data. Two things struck me:

1. In some tests they have the 8500 and GF4 MX460 beating the 9700 Pro - that suggests a recheck, don't you think?

2. They also use the ATI high-quality driver setting - which enables 2xAA and 8xAF. The AA shouldn't affect too many scores, but the AF will a bit, and that isn't a level playing field. Of course, with 2xAA and 8x tri AF the GF4 range would still sit on top of those ratings, I'm sure :p

Of course this is the site that complained about the V5 artifacting in Giants, which was only actually visible if you deliberately set certain settings in the drivers (16-bit colour with the fastest depth precision setting). Funny that, 16-bit colour and a Z-buffer hack causing Z-artifacts in a game with a large view depth :-?
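To make the second point above concrete, here is a small illustration with assumed numbers (neither the fps figures nor the roughly 18% anisotropic-filtering cost come from Digit-Life's data): benchmarking one card with the 2xAA/8xAF "HQ" preset and the other at default settings can flip which card looks faster.

```python
# Illustrative only: the fps numbers and the ~18% anisotropic filtering cost
# are assumptions, not Digit-Life's data. The point is that comparing one card
# measured with 2xAA/8xAF against another measured at default settings is not
# a level playing field, and can flip which card appears faster.

def apply_af_cost(fps, af_cost=0.18):
    """Rough model: 8x anisotropic filtering shaves ~18% off the frame rate."""
    return fps * (1.0 - af_cost)

r9700_default = 170.0                     # assumed fps at default settings
gf4_default   = 155.0                     # assumed fps at default settings

# One card benchmarked with the 'HQ' driver preset, the other without:
r9700_hq = apply_af_cost(r9700_default)   # ~139 fps with 8xAF enabled
print(f"GF4 (default): {gf4_default:.0f} fps  vs  9700 (HQ preset): {r9700_hq:.0f} fps")
# The slower card now 'wins' purely because of mismatched driver settings.
```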
 
alexsok said:
P.S.
What bias? I don't understand certain people; what possible bias do you see here?! I for one see constant bias towards ATI around here: the same people expressing the same opinion

I think anyone who looks at this objectively would realize that one of the major indicators of someone being biased (although not 100% accurate) is how they perceive others. It's your position compared to others' that makes someone else's idea seem radical or fanatical.

I constantly find your comments biased towards Nvidia. You don't even see it; you just notice that others are more oriented towards ATI and you chalk it up to them being ATI fans.

And let me make myself clear... The last ATI card I owned was the ATI Wonder! :) My last three cards have been Nvidia. Perhaps I am slightly leaning towards ATI, but I believe for the most part that is because I almost feel I have to counterbalance the Nvidia hype that is constantly being spread.
 
Randell said:
Of course this is the site that complained about the V5 artifacting in Giants, which was only actually visible if you deliberately set certain settings in the drivers (16-bit colour with the fastest depth precision setting). Funny that, 16-bit colour and a Z-buffer hack causing Z-artifacts in a game with a large view depth :-?

This really doesn't mean anything to me. As I said, I just don't think they're that smart, though it would probably be more accurate to just say that they're not that careful (which is likely one reason why they're able to put out so many performance graphs...).

As a quick example, I noticed that the Radeon 9700 produced rather unplayable problems in the static portions of the screen in Baldur's Gate 2, but only with AA/AF enabled. This game doesn't gain any realistic benefit from either, so why would I ever have them enabled for the game?

Obviously, because I hadn't turned them off after the previous game. What X-bit Labs did was probably almost identical.

I constantly find your comments biased towards Nvidia. You don't even see it; you just notice that others are more oriented towards ATI and you chalk it up to them being ATI fans.

Well, if you really paid any attention, you'd realize that there was one glaring inaccuracy in the above two sentences...though I'm sure you'll guess at the one I'm not referring to...
 
Chalnoth said:
Obviously, because I hadn't turned them off after the previous game. What X-bit Labs did was probably almost identical.

No, read their notes. They specifically say they use HQ settings for the Radeons. There is nothing accidental about that bit.
 