First REAL FX Benchmarks

MuFu said:
Sabastian said:
Despite high Vorschusslorbeeren...

Very disappointing considering the high Vorschusslorbeeren, I agree. :(

MuFu.
ROFL

*copy / paste -> quotes file*

Don't you just love babelfish? :D

ta,
-Sascha.rb

P.S. for those who care: Vorschusslorbeeren = "laurels", i.e. praise in advance.
 
DaveBaumann said:
Are you looking at Komplett? They were advertising them at about £160 ish the other day - they appear to have gone up.

Yeah - they had it on offer last week for £169 but obviously they sold out pretty quickly and now the price is ~£185. Kinda bummed I bought a 9500NP (that won't mod) for £140 now. :rolleyes:

These results don't bode well for NV31 at all, especially if what CMKRNL says about the ~<300MHz clockspeed is true. If this 8-pipe, 8x1 architecture is only just being held afloat by its high clockspeeds when compared to ATi's equivalent, I hate to think how a frequency-retarded, 4x1 derivative will fare against the 9500 Pro. :?

Looking forward to that 9700NP review, Dave... :)

MuFu.
 
I also have a 9700 NP on the way...$360 Canadian...I couldn't turn down that price.

My review won't be nearly as in-depth as Dave's though :D
 

:p
 
I wonder if the GeForce FX Ultra is just unbalanced after all and in today's games really needs more bandwidth. Does the lack of a 256-bit bus really strangle the GPU - is it data starved? Why else would a GPU with a 40% higher clock rate than the 9700 PRO fall so flat?
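As a back-of-envelope sketch of this bandwidth-vs-fillrate point, here is a quick comparison using the commonly reported specs (assumptions, not official figures: FX Ultra at 500MHz core with 1000MHz effective DDR-II on a 128-bit bus; 9700 Pro at 325MHz core with 620MHz effective DDR on a 256-bit bus; 8 pixel pipes each, per the discussion above):

```python
# Back-of-envelope fillrate vs. bandwidth comparison.
# Specs below are the commonly reported ones -- treat as illustrative.

def gpixels_per_s(core_mhz, pixel_pipes):
    """Peak pixel fillrate in Gpixels/s."""
    return core_mhz * 1e6 * pixel_pipes / 1e9

def gbytes_per_s(effective_mem_mhz, bus_bits):
    """Peak memory bandwidth in GB/s (effective DDR rate * bus width)."""
    return effective_mem_mhz * 1e6 * (bus_bits / 8) / 1e9

# GeForce FX Ultra: 500MHz core, 8 pipes, 128-bit bus, 1000MHz effective DDR-II.
fx_fill = gpixels_per_s(500, 8)    # 4.0 Gpixels/s
fx_bw   = gbytes_per_s(1000, 128)  # 16.0 GB/s

# Radeon 9700 Pro: 325MHz core, 8 pipes, 256-bit bus, 620MHz effective DDR.
r300_fill = gpixels_per_s(325, 8)  # 2.6 Gpixels/s
r300_bw   = gbytes_per_s(620, 256) # 19.84 GB/s

# Bandwidth available per peak pixel written: lower = more data starved.
print(f"FX Ultra : {fx_bw / fx_fill:.1f} bytes/pixel")   # 4.0
print(f"9700 Pro : {r300_bw / r300_fill:.1f} bytes/pixel") # 7.6
```

Despite the far higher core clock, the FX has markedly fewer bytes of bandwidth to spend per pixel, which would fit the "data starved" theory.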

I know the testing at tecChannel is not detailed enough, nor focused on all the right areas, but if this trend continues NVidia have some serious egg-on-face problems here.

I also wonder if NVidia have only just got their drivers fully working (I heard rumours that many features weren't working well). If so, in the next few months the drivers might get quite a bit better in some areas.

All the same - let's wait and see what Beyond3D says.

BTW - has anyone noticed the article on DirectX 9 testing at

http://www.digit-life.com/articles2/radeon/r9500-9700-dx9-p2.html#p11

To me it was a good (long) read
 
So...

Anybody have any late breaking, concrete info on R350 availability?

I think anybody who is seriously thinking of buying this FX product is nuts. If the R350 specs are remotely close to being true, I see absolutely zero advantage in owning an FX over R350. Honestly, not one single advantage.

1. ATI product is more quiet
2. ATI product offers better AA implementation
3. ATI product supports higher A.F. levels
4. ATI product is cheaper
5. ATI product historically has better 2D quality

etc. etc. etc.

Heck, it almost seems like ATI could hold off R350 release altogether, based on the early numbers...But to really make a statement, they should release it to really drive home the notion that they're the unquestioned technology leader.
 
Type - you forgot - current ATI owners didn't wait 6 months for high resolution, quality AA gaming for no reason, either ;)
 
Here's my quick'n'dirty review from my 9700 non-pro...
Good:
- drivers on 3D (no problems so far.)
- DX9 support
- working and simple control panel (no need for additional tweakers for 3D settings.)
- excellent 2D and 3D quality as well as performance.
- pretty easy installation. (see the last point on the "bad" list.)

Bad:
- No Hardware Video Layer for the 2nd display. (BitBlt is supported, but usually the video layer is initialized and then bitblitted to the 2nd head, which is awfully slow.)
- multi-display features: what features? I am coming from the Matrox world and I can't find any of those good features Matrox has. Oh, wait... Theater mode. Ahh... well, it took more than 20 minutes and several reboots on 98 to get it working, and after that it was still buggy.
- No OpenGL acceleration on either head in 2-display mode.
- The bulk card didn't include (at least in my case) the TV-out dongle cable (proprietary connector on the card), which makes using TV-out practically impossible. (Still hunting for that. I need to get TV-out working.)
- refresh rate problems even on a clean install of 98SE. (Because my "old" trusty HP P1110 was unable to send Plug and Play monitor data, it was detected as a "Default Monitor". I tried to manually load the correct .inf file for it, but even with that I was unable to force the card to use anything other than 60Hz. Using Rage3D Tweak I was able to force the correct refresh rates, but then when I installed a second monitor (a Philips 15" 150S3f TFT) on the second head, it tried to use the same refresh rates for it too. Only after I had manually set up all 9 modes (640x480-1024x768 at 8, 16 and 32 bits) to run at 60Hz was I finished. The last time setting up a monitor was this hard was when I had a 14" KFC and (surprisingly) an ATI Rage II+ PCI about five years ago.)


my verdict:
Excellent gaming card for single-monitor use. BUT if you want to get some work done on more than one display on the same machine, I highly recommend looking for better multi-display solutions.
 
Typedef Enum said:
I think anybody who is seriously thinking of buying this FX product is nuts. If the R350 specs are remotely close to being true, I see absolutely zero advantage in owning an FX over R350. Honestly, not one single advantage.

1. ATI product is more quiet
2. ATI product offers better AA implementation
3. ATI product supports higher A.F. levels
4. ATI product is cheaper
5. ATI product historically has better 2D quality

I think the gun is being jumped somewhat here. We still need plenty of firm benchmarks to say where GFFX stands right now, and what we've seen so far isn't a particularly extensive (or necessarily representative) set thus far. And we certainly haven't seen any IQ research yet.

Also, I would hold off on reaching any conclusions about the performance and/or configuration of R350 in the absence of any information. Given that they will need to clock the thing substantially higher to eke out more fillrate (they cannot rely on bandwidth alone), I still strongly suspect that we are going to see some exotic cooling solutions crop up there as well.
 
I would wager a fair amount of $$ that what we see now...is what you're going to get.

Why the hell has nVidia been so quiet lately? Why are reviews coming in so late? Why was nVidia so strict about benchmarking variables w/ MaximumPC?

The answer is very clear and obvious. This product barely outperforms the R300 when things like AA aren't factored in w/ many of the benchmarks, and will actually get beaten when bandwidth comes into play.

...and it will get much worse with an even better R350 product.
 
Typedef Enum said:
I would wager a fair amount of $$ that what we see now...is what you're going to get.

Why the hell has nVidia been so quiet lately? Why are reviews coming in so late? Why was nVidia so strict about benchmarking variables w/ MaximumPC?

The answer is very clear and obvious. This product barely outperforms the R300 when things like AA aren't factored in w/ many of the benchmarks, and will actually get beaten when bandwidth comes into play.

...and it will get much worse with an even better R350 product.

/Slaps self in head and stares in utter Amazement.. :oops:

Getting that Matrox Parhelia has really changed you, Type.
 
Nappe1 said:
Here's my quick'n'dirty review from my 9700 non-pro...
Good:
- drivers on 3D (no problems so far.)
- DX9 support
- working and simple control panel (no need for additional tweakers for 3D settings.)

Btw, I DO have - OGL-refresh, OC, custom res.

- excellent 2D and 3D quality as well as performance.
- pretty easy installation. (see the last point on the "bad" list.)

Bad:
- No Hardware Video Layer for the 2nd display. (BitBlt is supported, but usually the video layer is initialized and then bitblitted to the 2nd head, which is awfully slow.)

?
I did watch two different movies on the CRT and TV - at the same time.

- multi-display features: what features? I am coming from the Matrox world and I can't find any of those good features Matrox has.

Which is what, the very long taskbar? Totally useless - I have to roll several feet to see my clock in the bottom right corner?

Oh, wait... Theater mode. Ahh... well, it took more than 20 minutes and several reboots on 98 to get it working, and after that it was still buggy.

1. W98 IS NOT SUPPORTED ANYMORE BY ITS MAKER - ATI still supports it, though.

2. I use XP Pro and a SONY Wega-series TV; I have never had any problem watching PC movies on my TV.

- No OpenGL acceleration on either head in 2-display mode.

I'm not sure, but SS:SE supports it, even on a 9700, right?

- The bulk card didn't include (at least in my case) the TV-out dongle cable (proprietary connector on the card), which makes using TV-out practically impossible. (Still hunting for that. I need to get TV-out working.)

If you buy the boxed version, you'll have all the necessary cables. It's not related to the pros and cons of the card... ;)

- refresh rate problems even on a clean install of 98SE. (Because my "old" trusty HP P1110 was unable to send Plug and Play monitor data, it was detected as a "Default Monitor". I tried to manually load the correct .inf file for it, but even with that I was unable to force the card to use anything other than 60Hz.
Using Rage3D Tweak I was able to force the correct refresh rates, but then when I installed a second monitor (a Philips 15" 150S3f TFT) on the second head, it tried to use the same refresh rates for it too. Only after I had manually set up all 9 modes (640x480-1024x768 at 8, 16 and 32 bits) to run at 60Hz was I finished. The last time setting up a monitor was this hard was when I had a 14" KFC and (surprisingly) an ATI Rage II+ PCI about five years ago.)

Use XP. At least it officially sports the 60Hz BS. :D

my verdict:
Excellent gaming card for single-monitor use. BUT if you want to get some work done on more than one display on the same machine, I highly recommend looking for better multi-display solutions.

Which one? Matrox is kewl, but the 3rd monitor is very slow for gaming...

http://hrc.webzeppelin.hu/-=Utils=-/ut2k3sg.jpg
 
nelg said:
I also wonder how long you can strain this card before the thermal protection option in the driver kicks in? If this GPU heats up high and fast and therefore has to downclock, all these benchmarks would have to be considered purely academic.

You can set it up from the driver: choose between "Desktop (2D)" and "Performance (3D)"
 
DaveBaumann said:
BTW - can anyone explain how to measure the power consumption?

Well, I'd guess a multichannel ammeter inline with the external power feed. From that you can derive power across the three rails. This presumes that only data/address signals pass over the bus.

54W sounds about right for R300-942 so their testing methods seem reliable enough. 75W is unbelievable.
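To sketch the arithmetic behind that ammeter method: power is just the sum of voltage times current over each supply rail. The rail voltages are the standard PC ones; the current readings below are hypothetical values I picked to land near the ~54W figure, not actual measurements.

```python
# Deriving board power from per-rail current measurements, assuming a
# multichannel ammeter inline with the external power feed.
# Example currents are illustrative, NOT measured values.

RAIL_VOLTS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}  # standard PC supply rails

def board_power(currents_amps):
    """Total power in watts: P = sum of V * I over each rail."""
    return sum(RAIL_VOLTS[rail] * amps for rail, amps in currents_amps.items())

# Hypothetical readings that would add up to roughly the ~54W figure:
readings = {"+12V": 2.5, "+5V": 3.6, "+3.3V": 1.8}
print(f"{board_power(readings):.1f} W")  # 12*2.5 + 5*3.6 + 3.3*1.8 = 53.9 W
```

This only captures what flows through the measured feed, of course - any current drawn through the AGP slot itself would be missed, hence the caveat above about the bus.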

MuFu.
 
I imagine the fan is drawing somewhere around 5-6 watts... it sounds like it spins at a lot of RPM... or the noise may come from the fins... just guessing.
 
I figured the power consumption had to be unreasonable in order for that cooler to be necessary; they wouldn't put that cooler on just for show.

I foresee lots of power problems with this card. I wonder how many stock power supplies can deliver that kind of power.
 
Typedef Enum said:
1. ATI product is more quiet
2. ATI product offers better AA implementation
3. ATI product supports higher A.F. levels
4. ATI product is cheaper
5. ATI product historically has better 2D quality

Well, I have a number of reasons to want an FX over the R350:

1. Linux support. I need to work in Linux, and currently I can't on an ATI card.
2. Anisotropic implementation. I really do not like texture aliasing, of which I have noticed more on the 9700 Pro, and the blurring of text in some games is just as bad.
3. Various visual quality issues on the 9700, most likely related to drivers, in various games: z-buffer errors, color banding.

While it is certain that the anti-aliasing visual quality will be somewhat lower, I find that small compared to the other issues that I've noticed on the 9700.

Btw, I wouldn't buy an FX with the "FX Flow" attached...so that means I'd probably end up going for one of the non-Ultras, if I choose to purchase one.
 