NV40: Surprise, disappointment, or just what you expected?

The Baron said:
It's ironic. PS2.0 never mattered in the last generation to Mal because NVIDIA couldn't really do it, but now PS3.0 is the biggest thing ever.

Newsflash: Some PS3.0 tests on NV4x hardware show that it runs at less than a quarter of the speed it should. Driver immaturity? We'll see.

I didn't think you could do a true PS 3.0 test until it is officially released in the API. Does OpenGL 1.5 already support it??
 
Malfunction said:
It really doesn't make a difference to me at this point if ATi can produce 10 to 15 frames more while running at lower precision (FP24). It is the great quality drivers nVidia is known for, and features that are nonexistent on ATi cards, that win my money over less precision and higher FPS.

Have you been living under a rock for the past year? Did you miss all the issues Nvidia has had with their drivers for the last year? The lowered quality. Missing textures. Missing special effects. I'd hardly say Nvidia is the king of quality drivers.
 
Stryyder said:
The Baron said:
It's ironic. PS2.0 never mattered in the last generation to Mal because NVIDIA couldn't really do it, but now PS3.0 is the biggest thing ever.

Newsflash: Some PS3.0 tests on NV4x hardware show that it runs at less than a quarter of the speed it should. Driver immaturity? We'll see.

I didn't think you could do a true PS 3.0 test until it is officially released in the API. Does OpenGL 1.5 already support it??
DX9.0c beta. Exposes ps_3_0 and vs_3_0.
 
I think you guys should chill a bit though.

Remember that comparison between the power lines on the 9800 and the 6800? I have to admit I liked the NVIDIA one better. The 9800 looked like it had better AA, but it no longer looked like a power line; it looked like a fairy cable or something :) because it looked wispy instead of hard and defined.
 
As I comb the internet for other reviews and game benchmarks I am getting more and more concerned. Here are some benches from Nordichardware.com:

Mafia is my favorite game of all time, and I played it at 1280x960 with 6xAA and 16xAF (tri) on my Radeon 9700 Pro overclocked to 400/360 just fine. I am planning to play it again shortly. The 6800U would be a step down for me, using 4xAA instead of the 6xAA I used before. 6xAA made this game look just awesome.

Tomb Raider I am currently playing, along with Far Cry; not so good here either, considering it's a PS2.0 game.

So what is going on here? Bad drivers is my best guess, or HyperZ on the Radeon is kicking the 6800U's butt.

OOPS, I can't link the graphs from the review (I hate that). Anyway, here is a link to the page.

http://www.nordichardware.com/reviews/graphiccard/2004/nv40/index.php?ez=12
 
I had expected much more. Look at those framerate numbers:
http://www.3dcenter.de/artikel/nv40_benchmarks/index17.php
Then there is the issue with the optimizations in Far Cry, so the framerates will be even lower!
However, they say that with about 150,000 polygons around, the framerates wouldn't be higher anyway. I don't understand this. Could someone explain?
They also enabled invisible mode to disable the AI (less CPU utilization).


Vadi
 
Isn't Pixel Shader 3.0 an improvement on PS 2.0? I mean, it's not just a number thing here, right?

PS 2.0 is great, though why is it being replaced rather quickly? No, I am not a developer, though some of you who claim to be in the know had little idea that: 1) the existence of PS 2.0 games would be limited; 2) the speed at which current hardware could render PS 2.0 was hardly worth getting giddy about; 3) Far Cry is out now, and PS 3.0 support will be implemented for it soon. From what I am seeing, there is an advantage to going PS 3.0 over 2.0.

Now, as much time as ATi has had fixing drivers for 3DMark03... they couldn't tell that the blur was there?

About the backwards compatibility, I still play DX7 games... lmao. Yes, I know that is sad... but I haven't had any probs with nVidia drivers, though I did while being a long-time ATi supporter. (Radeon 64, 9500, 9700)

The difference in image quality in some games was difficult for me to see between the 9800 and 5950. Sometimes it was completely apparent that the nVidia card blew; other times it was hard to tell. If I am playing an FPS game, there is no way in hell I am gonna tell other than in frame rate. If it is smooth and beyond 60FPS, I have no probs. Far above 60FPS @ 16x12, all I can say is wow! :oops:

So I do speak with experience of both IHVs.
 
PS3.0 isn't replacing PS2.0. Developers are writing their code in HLSL. Some of it will compile for PS3 and PS2, some will need to have additional content written.

The difference between PS3 and PS2 isn't like the big change from PS2 to PS1.

VS3.0 is the big cool feature added this time.
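To illustrate the PS2/PS3 compile-target point above, here is a conceptual sketch in plain Python (not HLSL, and not any real compiler's behavior; the light values and the "evaluation" counting are invented for illustration). It shows why the same high-level loop can cost more on a ps_2_0-style target, which has no dynamic flow control and must unroll everything, than on a ps_3_0-style target, which can branch per pixel:

```python
# Hypothetical per-pixel light list; most lights contribute nothing.
LIGHTS = [1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

def shade_ps2_style(lights):
    """ps_2_0-style: no dynamic flow control, so the compiler unrolls
    the loop and every light is evaluated for every pixel."""
    evals = 0
    color = 0.0
    for intensity in lights:
        color += intensity * 0.5   # stand-in for per-light lighting math
        evals += 1
    return color, evals

def shade_ps3_style(lights):
    """ps_3_0-style: data-dependent branching lets dark lights be
    skipped per pixel, producing the same color with less work."""
    evals = 0
    color = 0.0
    for intensity in lights:
        if intensity == 0.0:
            continue               # dynamic early-out
        color += intensity * 0.5
        evals += 1
    return color, evals

color2, evals2 = shade_ps2_style(LIGHTS)   # evals2 == 8: every light shaded
color3, evals3 = shade_ps3_style(LIGHTS)   # evals3 == 2: dark lights skipped
```

Same image either way; the branching target just does less work per pixel, which is one of the practical advantages being argued about in this thread.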
 
Malfunction said:
PS 2.0 is great, though why is it being replaced rather quickly?

They are removing ps_2_0 from DX9c? No, didn't think so.

No, I am not a developer, though some of you who claim to be in the know had little idea that: 1) the existence of PS 2.0 games would be limited; 2) the speed at which current hardware could render PS 2.0 was hardly worth getting giddy about; 3) Far Cry is out now, and PS 3.0 support will be implemented for it soon. From what I am seeing, there is an advantage to going PS 3.0 over 2.0.

Of course ps3 will have some advantages.

Now, as much time as ATi has had fixing drivers for 3DMark03... they couldn't tell that the blur was there?

You plan on running 6xAA or better with no AF often, do you?
 
Malfunction said:
Now, as much time as ATi has had fixing drivers for 3DMark03... they couldn't tell that the blur was there?
There is no 'blurring' going on, and there is nothing wrong with the antialiasing.

In the 6800 8xAA shot there is more detail as a result of the supersampling, however if you check out the 4xAA shots you will see that the tail detail is the same on all cards because they are using pure multisampling.

Oh, and neither AF nor supersampling will help the textures in the "Ratbag's Leadfoot" shot that you posted very much, because the textures being used are apparently so low res that there doesn't really appear to be much more detail to be had. ;)
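The multisampling/supersampling distinction described above can be shown with a toy example. This is a conceptual Python sketch, not real hardware behavior: the high-frequency checker "texture" and the rotated-grid sample positions are invented for illustration. MSAA shades once at the pixel centre (so interior texture detail matches no-AA), while SSAA shades each subsample and averages:

```python
def texture(x, y):
    # Made-up high-frequency checker: value flips every 0.25 units,
    # i.e. several times inside a single 1-unit pixel.
    return 1.0 if (int(x / 0.25) + int(y / 0.25)) % 2 == 0 else 0.0

def msaa_pixel(px, py):
    """Multisampling: the shader (texture lookup) runs ONCE at the
    pixel centre; only coverage/depth are taken per subsample, so
    interior texture detail is identical to no AA."""
    return texture(px + 0.5, py + 0.5)

def ssaa_pixel(px, py):
    """Supersampling: the shader runs at each of 4 subsamples
    (rotated-grid positions) and the results are averaged, capturing
    sub-pixel texture detail that MSAA misses."""
    offsets = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]
    return sum(texture(px + ox, py + oy) for ox, oy in offsets) / 4.0

print(msaa_pixel(0, 0))  # 1.0 - single centre sample misses the detail
print(ssaa_pixel(0, 0))  # 0.5 - averaging subsamples resolves it
```

Which is why the 8xAA shots (which mix in supersampling) show extra tail detail while the pure-multisampling 4xAA shots look the same on all cards.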
 
andypski said:
Oh, and neither AF nor supersampling will help the textures in the "Ratbag's Leadfoot" shot that you posted very much, because the textures being used are apparently so low res that there doesn't really appear to be much more detail to be had. ;)

Ya, that's what I figured/found too, though I like to verify those things with *supposed people in the know, because they know more than I... apparently. :LOL:

I don't believe I will be playing an old game like Leadfoot on an NV40 with anything less than max AA. I just don't understand why I *have to believe nVidia is not the better IHV for me?
 
Malfunction said:
andypski said:
Oh, and neither AF nor supersampling will help the textures in the "Ratbag's Leadfoot" shot that you posted very much, because the textures being used are apparently so low res that there doesn't really appear to be much more detail to be had. ;)

Ya, that's what I figured/found too, though I like to verify those things with *supposed people in the know, because they know more than I... apparently. :LOL:

I don't believe I will be playing an old game like Leadfoot on an NV40 with anything less than max AA. I just don't understand why I *have to believe nVidia is not the better IHV for me?

You're free to believe whatever you want. People just get a bit touchy when you present faulty arguments as fact around here.

edit: fixed bad formatting
 
Malfunction said:
I just don't understand why I *have to believe nVidia is not the better IHV for me?

1) Umm, because you're using non-factual assumptions to bolster your claims that one product is superior to another, all without even waiting for the other product to be announced and previewed?

2) If both products were previewed, you'd at least be able to juggle their individual strengths and weaknesses and how those play to or against your personal preferences (which is what I'm waiting to do).

3) You have a rather strong streak of favoritism toward a particular IHV. Your freedom to do so, of course, but just don't expect open bias to be received with wide open arms on this board.
 
noko said:
...

So what is going on here? Bad drivers is my best guess, or HyperZ on the Radeon is kicking the 6800U's butt.
...

Regarding fps, it looks like driver issues and CPU limitations.

1) HyperZ allows greater efficiency, and even with poor efficiency the 6800 U should have more than enough power to excel at Mafia at 4xAA.

2) The 6800 U has its own corresponding feature to HyperZ.
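The basic idea behind HyperZ-style hierarchical-Z rejection can be sketched quickly. This is a conceptual Python toy, not ATI's (or NVIDIA's) actual implementation; the tile granularity and the smaller-is-nearer depth convention are assumptions for illustration:

```python
class HiZTile:
    """One screen tile in a hypothetical hierarchical depth buffer."""

    def __init__(self):
        self.max_depth = 1.0   # farthest depth currently stored in the tile

    def try_reject(self, tri_min_depth):
        """If the NEAREST point of an incoming triangle in this tile is
        still behind EVERYTHING already drawn here, the whole tile can
        be skipped without any per-pixel depth tests or shading."""
        return tri_min_depth > self.max_depth

tile = HiZTile()
tile.max_depth = 0.3          # a near wall already covers this tile

# A triangle whose nearest depth in this tile is 0.8 lies entirely
# behind the wall: rejected before any per-pixel work.
print(tile.try_reject(0.8))  # True
print(tile.try_reject(0.1))  # False: may be visible, test per pixel
```

That coarse rejection is the "greater efficiency" point 1) refers to: occluded geometry costs almost nothing instead of a full per-pixel pass.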
 
So if I still play Medal of Honor, John, is ATi the better way to go for me? Is it the same for all the OpenGL games I play?

What determines the better card to go with, John? Does it make a difference if you play more OpenGL games? Does it make a difference if the ratio of OpenGL to DX games in my library is about even? If they are DX 6 to 9? Driver compatibility and backwards compatibility? How do I decide, John? :?

Edit - fixed this line "Does it make a difference if they are just about even, the ratio of Open GL games to DX games in my library?"
 
Malfunction said:
So if I still play Medal of Honor, John, is ATi the better way to go for me? Is it the same for all the OpenGL games I play?

What determines the better card to go with, John? Does it make a difference if you play more OpenGL games? Does it make a difference if the ratio of OpenGL to DX games in my library is about even? If they are DX 6 to 9? Driver compatibility and backwards compatibility? How do I decide, John? :?

Well, the first step to making an informed decision is to actually see what both ATI and Nvidia release, look at both products, and then decide what is right for you, using whatever criteria are important to you.

It sure as hell isn't seeing the release from one company only, and then deciding based on half the information you need.
 
Malfunction said:
So if I still play Medal of Honor, John, is ATi the better way to go for me? Is it the same for all the OpenGL games I play?

What determines the better card to go with, John? Does it make a difference if you play more OpenGL games? Does it make a difference if the ratio of OpenGL to DX games in my library is about even? If they are DX 6 to 9? Driver compatibility and backwards compatibility? How do I decide, John? :?

Edit - fixed this line "Does it make a difference if they are just about even, the ratio of Open GL games to DX games in my library?"

Buy them both and see how they work. Trolling for validation isn't going to get you an answer to your question.
What's kinda funny here is the way the NV40 is being compared to the R3xx, and the R3xx is holding its own :eek:
 