Late, noisy, HUGE!!! - makes 17k Mark...

Hyp-X said:
The 3dmark database is useless anyway.
I remember when GF3Ti200 came out it was ranked higher than the GF3-classic.

Now I know there's an easy explanation for this - GF3Ti200 was faster with the DetXP drivers than the GF3-classic with the old ones.
But it still makes the database useless.

Hyp-X vs worm
Round 1
<horn>

:D :devilish:

Edit: Should I rename the topic? Really? I really don't believe we can't move on... :rolleyes:
 
Even though the "advanced" test is benched with "real" numbers, hehe, it isn't tabulated into the "final" score because it would have been an arduous task to do so. Fine.

So.....the benched "advanced" test is a "feature" test. And what is a feature test?

Is it to "showcase" the "advanced" feature?

Let's say it is.

So how does 3dmark "showcase" this "advanced" feature?

By allowing the "advanced" shader feature to be emulated, so to speak, by a lesser shader version.

Isn't this odd?

Why would they do this?

It doesn't count in the final score, so why not really "showcase" the "advanced" feature by offering what 1.4 shaders can offer?

Many look at 3dmark 2001 and may see a very impressive benchmark suite with really cool looking tests. The Nature demo simply was incredible to see for many, as were the gaming demos. A lot of imagination and talent there at 3dmark.....no doubt about it, imho.

So with all the abilities of 1.4 shaders over lesser versions, and the proven imagination and talent at 3dmark.....what does 3dmark come up with for the "advanced" test?

A mundane-looking "advanced" shader test that a lesser version can emulate. This I find odd considering it is a "feature test" meant to "showcase" an "advanced" feature, imho. Given the imagination and talent that 3dmark has shown in the past, well, I think they could have offered much more, yet didn't, and their motives may be deemed suspect by some, and clearly are by some posters here. imho.
 
[smilie_bett.gif]
 
worm[Futuremark] said:
Agreed. I asked for this many posts ago already.

If you had held to your "sworn vows" to leave this thread, it would be over.
Don't make melodramatic calls of "I'm leaving" and then not follow through with them, and then blame the thread's continuance on others.
 
Back to business... I don't understand something:

Quoted from here: http://www.nordichardware.se/artiklar/Intervjuer/English/nVidia/

"[NH]: Could you give us an estimated price and date of retail availability for the GeForce FX in Europe?
[nV]: We're looking at a ballpark figure of around 600 Euros however final pricing will be determined by the partners. The high end only makes up for 2% of our sold products and at launch GeForce FX will only be available in limited numbers, for those who want the latest and greatest the GeForce FX will be the product to go for. GeForce FX should pop up at retailers across Europe by the end of February."


and quoted from Dave's Q&A with John Malley:

"Q2: When will we see BFG Tech GeforceFX's in retail ? And can you talk about the promotion Best Buy has for $399?

A: We should see BFG Tech GeforceFX's starting in late February/March , as the pre-orders stated. Best Buy's promotion is pretty good and $399 is the price. I saw a PNY GeforceFX order for $499, but I think it was a placeholder for their actual price of $399.

Q3: Is this for the Ultra version of the GeforceFX?

A: Not sure what you mean by the "Ultra" version. As far as I know there's only one version of the GeforceFX, with a 500MHz core/1 GHz effective DDRII memory."


So, is there really any chance we'll see such a wide price gap between two manufacturers?
(Not to mention that everybody will buy the cards from NV...)
 
[smilie_bett.gif]


A disrespectful little icon, I must say, offered in reply to my "armchair" view.


I agree. Can't we move along, people?

I don't think the OT 3dmark discussion was boring....not at all; many other posters here were drawn into this discussion, with many posts that prove it. I guess my post was the breaking or sleeping point, so to speak, which is understandable.

Now, I agree to do my part to move along and try to bring the subject back on-topic, or wide awake so to speak, since it is your responsibility.
 
worm[Futuremark] said:
Care to elaborate on what you base your accusation on?

I take it back.

It certainly has uses. It shows the average speed of the card since its release, across various driver updates (which sometimes gave a 30% increase).

It just isn't appropriate for determining the current speed ratio of the cards.
But I guess no one wants to use that score for that.
My bad.
 
mboeller said:
It might just be me, but looking at the picture above the card appears to be covering the SECOND PCI slot as well

ARRGGG....

You are right, the GFFx covers 2 PCI-slots. This card is really huge. :oops: :oops:


Yep, no denying it actually consumes 2 PCI slots in addition to the AGP slot...!....By comparison, the only thing 3dfx's V5 6K had against it was length--this card more than compensates in width--good grief, that's utterly absurd and ridiculous--that a company would manufacture something as ignoramously thick-headed as that is a measure of how desperate it truly must be. That's horrific.

What will be even dumber, by far, is if nVidia decides *not* to ship the GF FX at anything other than 500MHz. With a normal depth to the cooling rig, even at 350-400MHz, and $150 shaved off the price, nVidia could sell a boatload more of those than it will sell of these! If nVidia doesn't sell anything but this non-silent-running version, I'll know they have truly popped their corks. That fan is just sad--too sad...
 
WaltC said:
mboeller said:
It might just be me, but looking at the picture above the card appears to be covering the SECOND PCI slot as well

ARRGGG....

You are right, the GFFx covers 2 PCI-slots. This card is really huge. :oops: :oops:


Yep, no denying it actually consumes 2 PCI slots in addition to the AGP slot...!....By comparison, the only thing 3dfx's V5 6K had against it was length--this card more than compensates in width--good grief, that's utterly absurd and ridiculous--that a company would manufacture something as ignoramously thick-headed as that is a measure of how desperate it truly must be. That's horrific.

What will be even dumber, by far, is if nVidia decides *not* to ship the GF FX at anything other than 500MHz. With a normal depth to the cooling rig, even at 350-400MHz, and $150 shaved off the price, nVidia could sell a boatload more of those than it will sell of these! If nVidia doesn't sell anything but this non-silent-running version, I'll know they have truly popped their corks. That fan is just sad--too sad...

Exhibit A:

Look at the language in this message. "good grief", "utterly absurd", "ridiculous", "ignoramously thick-headed", "desperate", "horrific", and so on. Remember my message about the level of irrational, hyperbolic anti-Nvidia rhetoric going on here among the fanAT*cs? You think it's pro-NVidia to reply to this stuff and ask for some rationality, some fairness? I can almost imagine Walt screaming into his monitor saying "f*ck NVidia, I hate them, I hate them. They hurts me PRECIOUS ATI, and WE WANTS to HURT THEM"

I don't see such childish hyperbole against ATI on these boards among the so-called "pro-Nvidia" people, such as myself, Russ, or even Chalnoth.
 
I really don't understand what the big deal about the cooler is. I admit I was shocked when I first saw it, but now I don't really see what the fuss is about. If you have a high end gaming machine, I really doubt it's silent to begin with.
 
Because it's just another thing to pick on Nvidia for. ATI had a big insider trading scandal and it was hardly a blip on these boards. If it had been NVidia, Doomtrooper, HellBinder, WaltC, et al, would now be talking about how this is more evidence of how evil and unethical NVidia is as a corporation.

I remember how the bash du jour was how NV30 didn't support DX9 displacement mapping, and of course the usual crowd jumped on this until I pointed out that the R300 didn't either.


Basically, if you can find any weakness or negative thing about NVidia or its products, it is blown way out of proportion. Meanwhile, anything lacking from ATI is merely glossed over.


Or, at least that's how I read this board.
 