ATI to show Nvidia no mercy!!

From what I have heard, Nvidia will be sending out working samples to the hardware sites in the middle of December.

I will quote myself, as I did say "samples to hardware sites mid December".


And a few months, six months, what is the difference? :rolleyes: :D


The GeForce FX was supposed to be announced in August and released in the November/December timeframe. It was actually announced in November and released in February, so they are four months behind where they should be.

However, Nvidia is behind; the only question is what they can do to counter the R350 and R400. It is far too early for them to release these new drivers that make the GeForce FX 30% faster, and what about the NV35? How far away is that? Nvidia do have the money and the support to hold out for a couple of product cycles, but after that they will need to have made up the gap or they risk losing out.
 
The bottom line is that the 9700 was a whopper of a product release.

It will have taken nVidia nearly two full years to go from the GF3 to the next generation NV30. In that two years, all we have gotten from nVidia has been uninspiring speed improvements to the NV2x core (though the top end 4600 is certainly quite a bit faster than the original GF3).

Meanwhile, ATi made just as big of a leap in less than one year. That is simply amazing. They made twice the leap in less than 12 months that nVidia made in more than 18.
 
Bigus Dickus said:
It will have taken nVidia nearly two full years to go from the GF3 to the next generation NV30. In that two years, all we have gotten from nVidia has been uninspiring speed improvements to the NV2x core (though the top end 4600 is certainly quite a bit faster than the original GF3).

Um, okay. The NV25 was the fastest of any of nVidia's refreshes (with respect to the previous release).
 
sas_simon said:
The GeForce FX was supposed to be announced in August and released in the November/December timeframe. It was actually announced in November and released in February, so they are four months behind where they should be.

However, Nvidia is behind; the only question is what they can do to counter the R350 and R400. It is far too early for them to release these new drivers that make the GeForce FX 30% faster, and what about the NV35? How far away is that? Nvidia do have the money and the support to hold out for a couple of product cycles, but after that they will need to have made up the gap or they risk losing out.

Even if "only" 4 months,it's an eternity in the graphics board field...
The "super detonators"....I doubt they'd pull it off this time,they'd want to have as good drivers as possible to defeat the 9700 whereas back then they could afford to not have it on release since they were king of the hill.(and I don't salute their choice since it in reality hindered owners of their cards from using them at the full capacity and wasn't imho the great thing it was written up to be...pure marketing and a way to fight the 8500)
Yes,nVidia needs to get back on track and that's why it's so good they're not the leader anymore,without competition you only compete with yourself and why waste loads of cash on that....?
I think ATi needs to get their Linux drivers better though,I have a friend that's HC Linux and HC gamer and he got the GF3 over the 8500 only due to better Linux drivers back when...(he only uses Win for gaming)
I salute ATi for what they've done,they have released a great card and should be saluted for it no matter if you're for one company or not ("you" is general,not a specific person) now I'm eagerly awaiting to see how things turn out in the future since the "war" about being king has started. :)
 
I tire of all the ATi supporters (and even some of the nVidia supporters) who claim nV25 was nothing more than nV20 with an extra vertex shader and more clock speed.

Do you self-professed experts actually bother to research your claims before making them?

Here is an excerpt from a David Kirk (I'm sure you have heard of him...) interview:

http://www.hwzone.it/html/text.php?id=282
Question 12.

MR: I had an interview with Dan Vivoli exactly one year ago (the English version can be found here: http://www.hwzone.it/html/text.php?id=205 ). In that interview, he stated that NV30 would have been the first NVIDIA chip to use "something" from the acquired 3dfx technology. Can you tell me anything more, since one year has passed?

DK: The GeForce4 series included the VPE - video processing engine - which was designed at 3DFX. Also, some of the FSAA technology - filter on scanout - was first developed at 3DFX. This technology allows for very high speed anti-aliased rendering, while using less memory bandwidth. So, you can see that the 3DFX team really hit the ground running when they arrived at NVIDIA.
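Ed note: for the curious, here is a rough back-of-the-envelope sketch (Python) of why "filter on scanout" can save memory bandwidth. The resolution, sample count and refresh rate are made-up illustrative numbers, not anything NVIDIA has published; the point is only that skipping a separate resolve pass removes one write and one read of the final frame per refresh.

width, height = 1024, 768          # assumed display resolution
samples       = 2                  # e.g. 2x multisampling
bytes_px      = 4                  # 32-bit colour
refresh_hz    = 60

pixels = width * height

# Conventional approach: read all samples, write a resolved frame to memory,
# then the scanout hardware reads that resolved frame back.
resolve_traffic  = pixels * samples * bytes_px   # read the samples
resolve_traffic += pixels * bytes_px             # write the resolved frame
resolve_traffic += pixels * bytes_px             # scanout reads it again

# Filter on scanout: the samples are read and blended during scanout itself,
# so the extra write/read of a resolved frame never happens.
scanout_traffic = pixels * samples * bytes_px

to_mb_s = lambda b: b * refresh_hz / 1e6
print(f"resolve then scanout: {to_mb_s(resolve_traffic):.0f} MB/s")
print(f"filter on scanout:    {to_mb_s(scanout_traffic):.0f} MB/s")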
 
The GF4 is nothing more than a bad GF3 refresh. QAA became poor quality. They reduced the blur. :/
I played UT2K3 at 1024x768 32bpp with QAA on my GF3 and it never once went below 35fps, same with MW and every other game, all at maxed details.

I believe it was a mistake not to offer a QAA blur slider as an option on the GF4. IMO the GF4 may be slightly faster (not 2x faster :/) with its bad QAA.
If you want to argue with another person's opinion, go ahead. Just remember my opinion is neither wrong nor right, just a preference.

If I were buying a new PC with unlimited $$$, I would not buy a GF4. GF3 with QAA "blur" all the way for me.

"Opinion is applicable to a judgment based on grounds insufficient to rule out the possibility of dispute..." www.dictionary.com is your best friend. :)
Synonyms: opinion, view, sentiment, feeling, belief, conviction, persuasion
 
Yeah, but you just make sure your opinion is ludicrous in every single case, probably to get attention or rouse other members into an argument.

It gets tiring sometimes. You can only hear someone claim that they really like slow framerates at low resolutions with horrible QAA blur and no AF so many times before you finally just start thinking of them as mentally damaged.
 
Kiler, why are you continually confusing poor image quality for good image quality?

The Quincunx/4x9 AA improvements were, well, improvements. By using a Gaussian weighting, the GF4's AA is clearer than what the GF3 allowed.

But personally, I don't like either of those methods. I generally game at 2x FSAA when using the GF4.
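Ed note: for readers wondering what Quincunx actually computes, here is a tiny Python sketch of the filtering step. The weights (1/2 for the centre tap, 1/8 for each of the four corner taps) are the commonly quoted ones; treat this as an approximation of the idea, not NVIDIA's exact hardware implementation.

# Blend one centre sample with four corner samples shared with neighbouring
# pixels; the five taps form the quincunx pattern the mode is named after.
def quincunx(centre, corners):
    assert len(corners) == 4
    return 0.5 * centre + sum(0.125 * c for c in corners)

# A bright pixel whose corner samples straddle a dark edge gets softened:
print(quincunx(1.0, [1.0, 1.0, 0.0, 0.0]))   # 0.75

The same blending is what produces the blur being argued about in this thread: taps from outside the pixel smooth edges, but they also pull texture detail toward its neighbours.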
 
Well, Quincunx can definitely have its uses, and there are games in which I use it, combined with 8x level aniso and a conservative negative LOD offset (-0.8; SS:SE would be a good example).

As far as the amount of blurring in Quincunx goes, I'd rather say it's a matter of personal preference. I personally prefer the NV25 iteration of it: not only is it clearer, it also carries almost the same performance drop as 2x RGMS.

Those who really prefer a bit more blurred scenery can always combine Quincunx with a +0.5 LOD setting.
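Ed note: the LOD offsets mentioned above are just a bias added to the computed mip level before the hardware picks which mipmap to sample. A minimal sketch of the idea (simplified; real hardware clamps the result and blends between adjacent levels):

import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_levels=10):
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lod, 0.0), num_levels - 1)

print(mip_level(4.0))        # 2.0 -> default mip choice
print(mip_level(4.0, -0.8))  # 1.2 -> a sharper, more detailed level
print(mip_level(4.0, +0.5))  # 2.5 -> a blurrier level

So a negative offset like -0.8 counteracts some of Quincunx's blur by selecting sharper mip levels, while +0.5 leans the other way.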
 
IMO Quincunx was really only useful on the GF2 line, which was really too slow to use "real FSAA". From the GF3 onward, Quincunx became more or less outdated.
 
Nagorak said:
IMO Quincunx was really only useful on the GF2 line, which was really too slow to use "real FSAA". From the GF3 onward, Quincunx became more or less outdated.
Except that GF2 cards are incapable of Quincunx FSAA; it was first introduced with the GF3.
 
Bigus Dickus said:
Yeah, but you just make sure your opinion is ludicrous in every single case, probably to get attention or rouse other members into an argument.

It gets tiring sometimes. You can only hear someone claim that they really like slow framerates at low resolutions with horrible QAA blur and no AF so many times before you finally just start thinking of them as mentally damaged.

Big D:
That's your opinion and to me that is ludicrous. Want to argue over opinions? :rolleyes:

So what if I do like low framerates and massive blur? I don't give a damn if you want me to say "high frame rates and resolutions with XxAA and low LOD are better"; I will not say it, because I don't believe it/see it/feel it.

Chalnoth: "Kiler, why are you continually confusing poor image quality for good image quality?", I can use the same argument on you. Since I prefer what you call "low" image quality over what you call "high" image quality.
I believe in using what I like. If you can't see that then I do apologise.
 
K.I.L.E.R said:
Big D:
That's your opinion and to me that is ludicrous. Want to argue over opinions? :rolleyes:

So what if I do like low framerates and massive blur? I don't give a damn if you want me to say "high frame rates and resolutions with XxAA and low LOD are better"; I will not say it, because I don't believe it/see it/feel it.

Yeah, I'm just not convinced you actually believe that. I'm about as convinced that anyone could truly hold such an opinion as I am that someone could truly be a solipsist.
 
Well, if I close my browser, all of you cease to exist. :p


Ed note: Dang, that's a crappy icon for sticking your tongue out.
 
K.I.L.E.R said:
Chalnoth: "Kiler, why are you continually confusing poor image quality for good image quality?", I can use the same argument on you. Since I prefer what you call "low" image quality over what you call "high" image quality.
I believe in using what I like. If you can't see that then I do apologise.

Believe it or not, image quality can be measured objectively. As it pertains to overall 3D graphics, image quality includes two things:

1. Reducing artifacts.
2. Improving the simulation of reality.

Artifacts that can be seen on video cards today include z-buffer errors, texture aliasing, edge aliasing, color banding and dithering. These can be reduced by various image quality enhancements such as FSAA and higher-precision pipelines and storage formats.

Improving the simulation of reality falls under things like anisotropic filtering (which makes the result of filtering more closely match the underlying data), pixel and vertex shaders, shadows, better lighting, etc.

If you believe things like, say, anti-aliasing actually reduce image quality, then you've just convinced yourself that the aliased image is what a game is supposed to look like, and therefore "good." This is a totally different idea from image quality. This is just not wanting things to change.
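Ed note: a toy illustration of the edge-aliasing point above. With one sample per pixel a polygon edge is either "in" or "out", so coverage snaps to 0% or 100%; more samples approximate the true coverage, which is what anti-aliasing is estimating. (Made-up sample positions, not any particular piece of hardware.)

def coverage(samples, edge_x=0.3):
    # Fraction of sample positions that fall left of a vertical edge at edge_x.
    return sum(1 for x, y in samples if x < edge_x) / len(samples)

one_sample   = [(0.5, 0.5)]
four_samples = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

print(coverage(one_sample))    # 0.0 -> the pixel snaps entirely to one side (jaggies)
print(coverage(four_samples))  # 0.5 -> closer to the true 30% coverage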
 
Quincunx to me is still a positive feature overall because it helps smooth out the edges more than just a 2x RGMS setting. The ability to sharpen textures and add anisotropy certainly helps with the slight blur.
 