NVIDIA's Last Minute Effort - 6850

There were no emotions, just an observation that Tom's Hardware is obviously biased. History is used to show bias; if you cannot see the bias, then you either see it from a different perspective (which includes the possibility of you sharing the bias), choose not to see it, or it isn't there at all. But I would bet that the majority of the people here would agree with me about THG's bias.


When you use words and phrases like "sucks balls", it injects an emotional tone.


Secondly, my emotions, if any, are not directed at THG; they are directed at you. And some people here may have bought a 5800 Ultra and will not be able to easily detach their emotions from the subject (although I am not one of those people).


You see, emotions will do that. They will take something as meaningless as a video card and turn it into a personal mission statement.


As for the review being a year ago, I am still waiting for the IQ comparisons and the drivers used in that review to be made public, as THG promised.


Does it really matter? And I mean, does it really? Like I said, maybe it is time to get over it.


And after you said, "I am an impartial reviewer and even I was disappointed by the features of the X800. I can't imagine how anybody who was really looking forward to what the R420 had in store for them could feel."

You can't possibly expect us to believe that you are impartial, right? Especially after then quoting the NVIDIA SM3.0 marketing line when most of us are taking a wait-and-see attitude on the benefits of SM3.0 at this point. Don't insult us like that.


Heh, well, I obviously can't expect you to believe it. But as for SM3.0: I wasn't really worried about PS3.0 as much as VS3.0. I was really expecting it, and yes, it was a letdown that it wasn't included. Whether you believe me or not, I really don't care.


And can you please support your statement that the X800 doesn't even really support SM2.0x? I would like to see how you reached that conclusion.

Well, show me exactly what it has outside of the increased registers and the F-Buffer. And will the F-Buffer be enabled via the drivers this time, or will it just be another marketing gimmick like last time?

I will await your next emotionally driven response.
 
Actually, since I went to school to become a Merchant Mariner, "sucks balls" is far more conversational than emotional, but you would have to know me to understand that.

Secondly, a $400-$500 investment in hardware is only meaningless to those who consider that level of investment meaningless; this, of course, is a matter of perspective. You may consider spending $500 no big deal; I, however, do not. To each their own.

Also, if a review site promises IQ comparisons, reviews cards on drivers that never get released, and then never comments on either again, it matters a great deal to whether I ever choose to trust that site again. So while it may not matter to you, it does to me, and probably to many others. Again, a matter of perspective.

Have you read the B3D (p)review? Have you seen the ShaderMark test results? Just curious.
 
I don't think I ever once said a $500 investment is meaningless, did I?

As for the ShaderMark results, I am looking at them right now. What exactly do you want me to see? I don't typically answer loaded questions.
 
The simple fact of the matter is that ATI built a card that both gamers and OEMs are going to like.

Gamers want:

Speed
High resolutions during game play
High AA/AF during game play
Smooth frame rates
Did I mention speed?

OEMs want:

Market demand with strong brand recognition
Good supply from vendor
Good price from vendor
Power consumption that fits within normal specs
Single slot for obvious reasons
Smaller and quieter if possible

ATI delivered all of the above with IMMEDIATE availability. Just looking at the cards side by side tells you that NVDA is scrambling to remain competitive. NVDA's brute-force approach is obvious and leads to high power consumption, noisy fans, wasted space, and more heat inside your case. ATI built it, delivered it, and told you when to expect the rest of the R420 line. NVDA paper-launched with no firm dates and a slew of "new" cards and beta drivers to confuse everyone.

NVDA is trying to deflect attention away from the R420 at all costs. It is really pathetic if you ask me, and they are desperate. Anyone who is honest with themselves can see this. We all need answers to the following questions from NVDA:

1. Why release the review boards at 400/550 if those are not the intended specs? Why the sudden Ultra Extreme?
2. Why paper launch instead of immediate availability like ATI? Can't you coordinate the launch and availability?
3. When are the boards going to be available? Which ones are going to be released and when?
4. Are the yields at IBM really that bad? 10,000 total NV40 cores, of which only 2,000 are Ultra quality and only 200 are Ultra Extreme quality? Will there be availability issues?
5. Why are you using beta drivers that have questionable optimizations? Why the sudden release of the 61.11 drivers?
6. What is going on with the Far(t) Cry benching issues? Isn't that a TWIMTBP game that has 3.0 support? Why isn't the 6800 excelling in these benchmarks?
7. Why did you build such a big power hungry chip? What about the reaction of the OEM market?
8. Why does the NV40 lag behind the R420 at high resolutions with AA/AF set higher?
9. If NV40 is going to "own ATI", "they are smoking something hallucinogenic", and "low-k is dangerous", why is it that the R420 was well received today and matches/beats the NV40 in most cases?


I have nothing but questions from/for NVDA. They are spreading the FUD nice and deep with the willing/unwilling support of several sites and are desperately trying to deflect attention from ATI. Any idiot can jump up and down in a crowded room and say "Look at me! Look at me!". People will stop and watch the first or second time. But then it gets old and the people stop looking and reacting to your BS because there is no substance to your actions. NVDA is short on credibility and is doing NOTHING but making it worse. The analysts and OEMs will not forget about the NV30 debacle so soon. I smell an even worse debacle brewing now.
 
after the ati "Quack" cheating fiasco, i really don't think nvidia would be so stupid as to think it is possible to pull off this trick...
 
dlo_olb said:
after the ati "Quack" cheating fiasco, i really don't think nvidia would be so stupid as to think it is possible to pull off this trick...

Welcome to 2004. Did you happen to nap through last year?
 


ATI were desperate for performance then, for the title - as Nvidia have been for the last 2 years ;)
 
mondoterrifico said:
dlo_olb said:
after the ati "Quack" cheating fiasco, i really don't think nvidia would be so stupid as to think it is possible to pull off this trick...

Welcome to 2004. Did you happen to nap through last year?

so just because it happened last year doesn't mean it will happen again ;)




himomo
 
Someone once wrote (paraphrasing here): "It will be a scrap". That guy weren't half wrong (smart fella).

Looking at this thread it seems we have nuclear armageddon rather than a scrap... Doomtrooper I need your popcorn smiley ;)
 
Maintank said:
Well, show me exactly what it has outside of the increased registers and the F-Buffer. And will the F-Buffer be enabled via the drivers this time, or will it just be another marketing gimmick like last time?

The new instruction limits are a hardware function - they are not achieved via any F-Buffer tweaks. The F-Buffer cannot be enabled in DirectX; however, it should have support in OpenGL 2.0.
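
For reference, the profile differences being argued over (base SM2.0 vs. ATI's extended 2.0b profile vs. NV40's SM3.0) can be sketched as a rough capability table. The figures below are the DirectX 9 spec minimums as commonly cited for each profile; treat them as illustrative, not authoritative:

```python
# Rough comparison of the DirectX 9 pixel shader profiles under debate.
# Numbers are the commonly cited spec minimums (illustrative only).
PS_PROFILES = {
    "ps_2_0": {"instruction_slots": 96,  "temp_registers": 12, "dynamic_branching": False},
    "ps_2_b": {"instruction_slots": 512, "temp_registers": 32, "dynamic_branching": False},  # R420's extended profile
    "ps_3_0": {"instruction_slots": 512, "temp_registers": 32, "dynamic_branching": True},   # NV40
}

def gains(older, newer):
    """Capabilities where `newer` exceeds `older` (True > False for flags)."""
    return [cap for cap in PS_PROFILES[older]
            if PS_PROFILES[newer][cap] > PS_PROFILES[older][cap]]
```

So the jump from ps_2_0 to ps_2_b is mostly about instruction slots and registers (the "increased registers" mentioned above), while ps_3_0 adds dynamic flow control on top of that.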
 
John Reynolds said:
Joe DeFuria said:
FiringSquad is just ATROCIOUS. Beyond Belief. They don't even have an X800 P/REVIEW. Their article is titled NVIDIA's GeForce 6800 Ultra Extreme and 6800 GT. Included in the review are X800 benchmarks. Seriously! Are they actually going to do a separate X800 review, or is that it?

Agreed. And they use the 61.11s too. Unbelievable. Just when you think you've seen online reviewing's worst, someone pulls out a shovel and digs the well deeper.

oh come on guys, FS is OK... and what about their conclusion?

We’re also impressed by ATI’s driver quality. While NVIDIA’s ForceWare 61 driver is riddled with bugs, we found our experience with ATI’s beta driver for the X800 series to be a much more pleasurable experience. For instance, the ForceWare control panel wasn’t always up to snuff, even as frames would occasionally pause for three seconds or more in IL-2 Sturmovik: Forgotten Battles regardless of the GeForce card used. Right now, we’d have to give the driver quality edge to ATI, despite the fact that their current CATALYST driver doesn’t provide all of the features currently found in ForceWare 50, much less NVIDIA’s upcoming 60 release (although we’ve been told that ATI’s CATALYST team has quite a few goodies in the works for ATI card owners). We’d take stability over features any day of the week in a display driver and this is exactly what ATI is delivering at this moment.

ATI will also beat NVIDIA to market with X800. The X800 PRO has begun shipping to retailers as of today, while the high-end X800 XT Platinum Edition will be shipping on May 21st. The X800 PRO occupies the same $400 price point as the GeForce 6800 GT while the $500 X800 XT Platinum Edition will duke it out with NVIDIA’s GeForce 6800 Ultra. As of right now, we’d take the X800 XT Platinum Edition over NVIDIA’s GeForce 6800 Ultra if we were in the market for a graphics upgrade.

It’s a close call at the $400 price point though. The X800 PRO has its fair share of positives, while the GeForce 6800 GT is no slouch either. We’ll have to wait for retail GeForce 6800 GT boards before a definitive answer can be determined. But based on our performance results and overclocking, ATI’s X800 PRO definitely won’t disappoint if you’re looking to upgrade.

NVDA having too much influence over there? ... that's taking it a bit too far ;)
 
thop said:
It seems the marketing guys at nV responsible for previous debacles were not fired ... the engineers must be pissed.
Uhhh - that would be Vivoli and Tamasi. Ratman and Boy Blunder. Stan and Ollie. Abbot & Costello. Masters of marketing du jour.
 
How times have changed......
I noticed these "happenings" when the R300 was trashing everything in its path, and they have gotten worse/more frequent ever since.
I won't even get into what some folks feel was happening even b4 that.
Those were some MAGICAL 3dmk jumps I can tell ya that.

No one can REALLY say NV drivers are REALLY better anymore....
I remember when that was their credo.
 
muzz said:
No one can REALLY say NV drivers are REALLY better anymore....
I remember when that was their credo.
I think Carmack was really the one who was the cause of most of the trouble. Calling NVidia's drivers his "Gold Standard" really did nothing but let NVidiots say "Hah! Eat that, ATI!" which obviously just made the flamewars grow hotter.
 