Tom's VGA Charts Part 3

OpenGL guy said:
I noticed that in the "F-bucks" list the Radeon 9700 is colored as a DX8 card...

Grey cards were colored that way because they're older and probably not available in any real quantity anymore, not because of any specific capabilities. The 9500 and 5800 are colored that way too.
 
I don't see anything wrong with it. At least not if you also read the conclusions:

We see that the Radeon 9600 Pro/XT and the GeForce FX 5700 Ultra based boards offer very good value for money. When image quality settings are raised, the Radeon 9600 XT is an easy recommendation. Not surprisingly, the enthusiast models don't cut such a good figure owing to their high price - at least in the standard tests. In the quality tests, on the other hand, the Radeon 9800 (non-Pro) looks very attractive indeed. NVIDIA's top models, meanwhile, have no such luck, due to their comparatively low scores.

And below each F-Bucks chart:

Important: This table does not represent the gaming experience you get with a card. It's essential that you take a close look at our benchmark result tables further up in this article to find out if your personal favorite fits your needs!!
 
So what's the use then of the FBucks thing?
Why is a Ti 4600 #1 with AF+FSAA when it doesn't get a playable framerate in the AF+FSAA benchmarks?

But I'm glad I got a R9800 ;)
 
hjs said:
So what's the use then of the FBucks thing?

I'm sure quite a few people on this board have had to answer the question "What's the best card for my money?" I guess this was Tom's Christmas gift to the rest of us so we could just tell them "Look at the chart and get off my back!" It's a nice effort, but it's purely based on price (and even the prices are suspect). If your card is cheap, you don't need much performance to make it look good. Hence the good showing from the bottom-end NV stuff. All that will come from this is more people buying those damn MXs and 5200s and wondering why the performance stinks. You can't make a decision based on FBucks, and unfortunately, some people will. A nice effort, but terribly misguided.
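The flaw is easy to see with a little arithmetic. A toy sketch (hypothetical fps and price figures, not THG's measured data) of a raw fps-per-dollar score:

```python
# Toy numbers only -- hypothetical fps and prices, NOT THG's actual data.
cards = {
    "GeForce4 MX 440": (30.0, 40.0),   # (avg fps, street price in $)
    "Radeon 9600 Pro": (55.0, 130.0),
    "Radeon 9800 Pro": (80.0, 300.0),
}

def fps_per_dollar(fps: float, price: float) -> float:
    # Raw ratio: rewards a low price just as much as a high framerate.
    return fps / price

# Rank cards best-to-worst by the raw ratio.
for name, (fps, price) in sorted(cards.items(),
                                 key=lambda kv: -fps_per_dollar(*kv[1])):
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$")
```

With these made-up numbers the slowest, cheapest card tops the list, even though its 30 fps average can dip well below playable in newer titles.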

hjs said:
Why is a Ti 4600 #1 with AF+FSAA when it doesn't get a playable framerate in the AF+FSAA benchmarks?

Bad price I'd bet. Someone correct me if I'm wrong, but I was under the impression that Ti 4600s were usually in the $130 range, Ti 4200s about $75. As for the FX 5200, that looks suspiciously like the price of a 64-bit version, not the 128-bit that was tested. Yet another reason for a complete description of the test subjects and the results. At least then, we could see where they're coming from.

More realistic prices (correct me if I'm wrong) would put the NV cards behind the R300s where I'm sure we all expect them to be at high quality settings, and would probably put the R200s ahead at the low quality stuff.



pabst.


ps. not that one, ask if we're related and I'll have your head

pps. Did it always say the prices came from Bizrate? I could have sworn it said PriceGrabber yesterday.
 
hjs said:
So what's the use then of the FBucks thing?

It's just more misdirection to confuse the consumer and make Nvidia look better than they are. That's been Tom's game plan for some time now. Since Kyle gave up his job as NV's butt-boy, Tom (and Anand) have taken over the duties.

I mean come on! They are still trying to perpetuate the 8X1 Pipeline Fantasy for the high end NV3X cards. When will it stop?

I'll bet Nvidia just loved those fbucks charts. They will probably be featured in the next NV PR blitz. :rolleyes:
 
The F-Bucks thing is the stupidest thing I've seen.
What, a GF4 MX a good value? They have got to be kidding.
 
beyondhelp said:
hjs said:
So what's the use then of the FBucks thing?

It's just more misdirection to confuse the consumer and make Nvidia look better than they are.

Actually those charts did show an interesting trend that I wish THG would've talked about more. Notice how when you shift from the Standard Quality chart to the High Quality chart, a bunch of ATI cards shoot up towards the top of the list, including some higher-end cards? Definitely shows how the ATI cards let you pile on the IQ features more efficiently than Nvidia cards do.
 
Nazgul said:
beyondhelp said:
hjs said:
So what's the use then of the FBucks thing?

It's just more misdirection to confuse the consumer and make Nvidia look better than they are.

Actually those charts did show an interesting trend that I wish THG would've talked about more. Notice how when you shift from the Standard Quality chart to the High Quality chart, a bunch of ATI cards shoot up towards the top of the list, including some higher-end cards? Definitely shows how the ATI cards let you pile on the IQ features more efficiently than Nvidia cards do.

I see what you're getting at, but do typical consumers understand that? I'm sorry, but the whole fbucks idea is a really bad joke. The formula they use to arrive at the fbucks number is terribly flawed and basically rewards the cheap cards solely because they're cheap. A couple of the older benches used inflate the fbucks scores beyond their worth as a reliable indicator of present and FUTURE performance.

You see, that's one thing Tom et al. didn't point out at all in relation to their fbucks charts: future-proofing. This is the DX9 generation now. What would the fbucks numbers be for, say, only a cross section of DX9 games, I wonder? You know, games typical of what's coming out in the next year? (And why did they drop the Aquamark scores from the fbucks number, really?)

See, for me, part of the problem is the benchmark cross section they use. Only 3 DX9 tests out of ten, if you can even call them DX9 tests. I personally think any Xbox port is biased, since it was designed to run on NV hardware and may give them an unfair advantage. X-2 is known to run better on Nvidia products (read: biased), which leaves Aquamark as probably the only useful DX9 test used. No Splinter Cell, no TR:AOD, no 3DMark03, no


The going prices for the cards listed are suspect in my book too. A few dollars one way or the other could skew scores disproportionately, especially on the cheap end. And a few of those LOW prices look a little too low to me. (Unless you're bidding on eBay. I have a friend who just snagged a 9600 Pro for $90.) Nah, sorry, but it is totally misleading to those consumers (most of them) who don't know better.

...puts on TF hat...
If I were a marketer at Nvidia, I doubt I could have come up with a better scheme to sell leftover legacy cards than those fbucks charts. The unwashed, uneducated masses will look at those charts and assume a big fbucks number equates to better bang for the buck, and that couldn't be further from the truth (looking ahead). Tom's should be getting a bundle of money from Nvidia for their services as a marketing partner... actually, I wouldn't be at all surprised to learn that a little Green Fairy dropped that particular idea into Tom's lap. What a brilliant marketing ploy to sell off old stock! ...takes off tin foil hat and sighs... :LOL: ;)
 
X2 isn't biased--one of the mods/devs at the Egosoft forums claims to get 75-100FPS with his 9800 Pro.

It could be possible--I get 43FPS in the benchmark with a 5700U and all things maxed.
 
ByteMe said:
Just to help clear things up: it's a typo. They meant to use FU-bucks. Now you understand?

:LOL:

They claim the 4600 is going for $65 on Bizrate...
 
hjs said:
So what's the use then of the FBucks thing?

IMHO, it's got none, just like any other performance/currency unit or price/performance measurement.

1 - decide how much you're willing to pay
2 - buy the best without blowing your budget

Seems simple enough, don't it? Hard to fit into a nice, easy to understand graph, though.

cu

incurable
 
I was just wondering how a GeForce4 MX 440 could score more fps per $ than EVERY FX/9xxx card. And in the quality setting (4x AA, 8x AF), how is the 5200 above every FX/9xxx card except the 9700? Other than that, the benchmarks were useful in the rest of the article, but it just seems weird.
 
For the FBucks thing to make any sense or have any relevance it really would have to work on some kind of sliding scale -

eg. Something like (off the top of my head):

For each fps below 10 - no fbucks
For each fps between 10 and 20 - 0.25 fbucks
For each fps between 20 and 40 - 0.5 fbucks
For each fps between 40 and 60 - 0.75 fbucks
For each fps above 60 - 1 fbuck

This would introduce the idea of minimum playable frame rates into the mix - of course the exact amount of fbucks and the ranges could promote endless arguments.
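The tiered scheme above can be sketched as a small function (illustrative only; the band edges and per-fps rates are just the hypothetical values from the list):

```python
def fbucks_for_fps(fps: float) -> float:
    """Tiered score: each fps earns the rate of the band it falls in."""
    # (band upper edge, fbucks per fps within the band) -- hypothetical values
    bands = [(10, 0.0), (20, 0.25), (40, 0.5), (60, 0.75), (float("inf"), 1.0)]
    score, prev_edge = 0.0, 0.0
    for upper, rate in bands:
        span = min(fps, upper) - prev_edge  # fps falling inside this band
        if span <= 0:
            break
        score += span * rate
        prev_edge = upper
    return score

def fbucks_per_dollar(fps: float, price: float) -> float:
    # Value metric: tiered score divided by street price.
    return fbucks_for_fps(fps) / price
```

Under this scheme a card crawling along below 10 fps earns nothing no matter how cheap it is, while the score climbs faster per fps as the framerate moves into more playable territory (e.g. `fbucks_for_fps(30)` gives 7.5, `fbucks_for_fps(70)` gives 37.5).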

Some might, for example, argue that you shouldn't get fbucks once the fps hits some high-cutoff point (say 100fps) since you won't see any benefits. Of course, this would then ignore the possibility of running at higher resolutions where you could use the extra power...

The system as presented above would also tend to undervalue playable frame rates in games that are more difficult (eg. Halo) vs. framerates in easy games (eg. Quake3), so you would really need to do some sort of renormalisation per-game.

FBucks are an interesting idea, but the system definitely needs some work IMHO if a 5200 can outscore a 9600 for high-quality gaming with AA and anisotropic filtering. :oops: :rolleyes:
 
Of course, 60 fps earning 1 fbuck is fine for an LCD, but for a CRT it would have to be 85 fps (85 Hz being the ideal monitor refresh setting where, for the majority, the eye no longer picks up on the scan redraw). ;)
 