A Few 9700 Screenshots

Thanks for the shots. Did you notice any banding in any of the games?


The reason I ask is that most games today are obviously programmed using integer-based pipelines, and I was wondering whether the floating-point ones make any difference in how accurate the color is or should be.
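
To put the question in concrete terms, here's a toy sketch of my own (nothing measured on real hardware): an integer pipeline rounds every intermediate result to a 1/255 step, while a floating-point pipeline only rounds once when the final color is written out.

Code:
# Toy comparison: repeatedly darken a dark shade, once with 8-bit rounding
# after every operation (integer pipeline) and once carried in float until
# the final write (floating-point pipeline).

def to_8bit(x):
    # Snap a 0..1 color value to the nearest 1/255 step.
    return round(x * 255) / 255

def darken_int8(color, factor, passes):
    # Round to 8 bits after every pass, as an integer pipeline would.
    for _ in range(passes):
        color = to_8bit(color * factor)
    return color

def darken_float(color, factor, passes):
    # Keep full precision until the very end.
    for _ in range(passes):
        color *= factor
    return to_8bit(color)

for start in (0.02, 0.05, 0.10):   # dark, fog-like shades
    print(start, darken_int8(start, 0.9, 8), darken_float(start, 0.9, 8))

# The integer path rounds the 0.02 and 0.05 starting shades onto the very
# same 1/255 step, while the float path keeps them apart. Collapsing
# neighboring shades like that is what shows up on screen as banding.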
 
If you buy either an R300 or the NV30, you're basically buying a "1.0" release of a new architecture at a premium price. Those of us who buy these chips this year will be spending $300-500 for them, but both ATI and NVidia are going to release improved versions (ATI -> .13um, NV30 -> Ti "better yield" edition) next year, and those will sell for much less. I mean, why buy a hot, power-hungry R300 @ .15um for $400 when next year you can buy a cooler, less power-hungry, higher-clocked R300 @ .13um for less? The .15um R300 IMHO is not ATI's main product, it's just a "concept car" like the NV30 to sell the lower-end units.


Moreover, by the time the .13um R300 / NV30 "Ti" ships, the drivers will be way more stable and debugged, and DirectX 9 will be out.


Those of us who are early adopters are used to paying a premium for a "first gen" chip with driver and/or hardware problems and a lack of software support. Remember, the original 8500 drivers didn't even support SmoothVision. The GF1 was SDR! T&L support took a long time.

It's disingenuous to say that buying an NV30 in Dec will get you a "barely functional" product. WTF does that mean? Either it will pass the functional tests, or it won't. NVidia isn't going to sell you a board that simply doesn't work for $400-500! And how is this any different from ATI sorting bad R300 yields and selling "barely functional R300"s rebranded as 9500s?


Most likely, the NV30's on store shelves will be the "cream of the crop" from the yields, and anything that fails functional tests will either be rebranded and resold as another product, or thrown away.

In some cases, yes, drivers can work around problems by emulating or deactivating features, but if you see bad NV30 benchmark numbers because of missing or emulated features, are you gonna buy it? Is NVidia going to ship it? I doubt it. They will just send working review units, and hold off on ramp up until it is debugged, just as they have done in the past with the "late" GF3/GF4.


Yes, the early adopters of the high-end R300/NV30 are going to lose out. We are buying immature products: no DX9, immature drivers, no software support in games, "pushing the envelope of TSMC" products.


I just don't see a big difference between ATI and NVidia's situation here, since I think NV30/R300 = GeForce1/Radeon. They are the first step in a new generation/platform.
 
Ascended Saiyan said:
Thanks for the shots. Did you notice any banding in any of the games?

The reason I ask is that most games today are obviously programmed using integer-based pipelines, and I was wondering whether the floating-point ones make any difference in how accurate the color is or should be.

Well, I didn’t really see any banding, but then I think it’s going to be difficult to see any issues with straight textured titles. Remember, as well, games won’t make use of the extra texturing / pipeline flexibility by default, so if a game forces multipass then it’s still going to be using 32-bit framebuffers for the intermediate passes.
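
To put rough numbers on that (just a back-of-the-envelope sketch, not how any particular driver actually schedules its passes): once a pass’s result hits an 8-bit-per-channel framebuffer it gets rounded to 1/255 steps no matter how precise the internal maths was, and that rounding stacks up with every extra pass.

Code:
# Four lighting contributions summed in one long pass (rounded once at the
# final write) versus forced multipass, where every contribution is rounded
# to 8 bits per channel as it is blended into the framebuffer.

def quantize(x):
    # Writing to a 32-bit (8 bits per channel) framebuffer snaps the value
    # to the nearest 1/255 step, clamped to 0..1.
    return min(max(round(x * 255), 0), 255) / 255

def single_pass(base, light, terms=4):
    # Sum everything in full precision, round once on the final write.
    return quantize(base * light * terms)

def multipass(base, light, terms=4):
    # First pass writes one contribution, each further pass additively
    # blends the same contribution on top -- an 8-bit round every time.
    contribution = quantize(base * light)
    result = contribution
    for _ in range(terms - 1):
        result = quantize(result + contribution)
    return result

gradient = [i / 1023 for i in range(256)]   # a smooth dark ramp
print(len({single_pass(c, 0.05) for c in gradient}), "shades single pass")
print(len({multipass(c, 0.05) for c in gradient}), "shades multipass")

# The single-pass version keeps over a dozen distinct dark shades across
# the ramp; the multipass version collapses them down to four.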

DemoCoder said:
I just don't see a big difference between ATI and NVidia's situation here, since I think NV30/R300 = GeForce1/Radeon. They are the first step in a new generation/platform.

I think you’re wrong there. As you point out, the first GF was SDR, and its performance benefits over TNT were a little questionable for a while. While ATI may be selling the DX9 features, the same as NVIDIA selling T&L, in reality we all know that means nothing right now because DX9 isn’t here yet. However, what you do have is a clear performance advantage with R300 and then the ability to run high resolutions, high quality filtering and high levels of FSAA – more so than anything else, and that’s its selling point. Whether it’s the first step in a new generation / platform means nothing; it’s whether it can offer the end user benefits now, and clearly this can (as will NV30).
 
Democoder, this has what to do with the screenshots?

By the way, thanks for the shots Wavey. When you get done with all of your review can you just send the card my way?
 
jjayb said:
Democoder, this has what to do with the screenshots?

I'm trying to figure that out myself.

Yes, the early adopters of the high-end R300/NV30 are going to lose out. We are buying immature products: no DX9, immature drivers, no software support in games, "pushing the envelope of TSMC" products.

Those shots and the benchmark scores we've seen on this "immature" product with its "immature" drivers fully justify its price to this gamer. And who knows, by the time the NV30 ships the 9700 might be MSRPing for $300-350 and have another 10-15% to its performance due to driver updates, so, IMO, you're painting a somewhat lopsided picture. Not that that surprises me.
 
Thx for the pics Dave... I was looking forward to some more. I do notice on a lot of the racing shots you are usually near last place :LOL:

Democoder...

I did some searching through your past comments from when the Ti4600 was released, and I didn't see you complaining about the $400 price then. Your comments are way off base and of course show your true intentions.
 
Whether it’s the first step in a new generation / platform means nothing; it’s whether it can offer the end user benefits now, and clearly this can (as will NV30).

To further drive home this point....

I clearly remember initial GeForce SDR scores not showing much improvement over, and sometimes coming in slower than, the previous-gen TNT2 Ultra.

I also recall nVidia pimping brand new "TreeMark" demos (and then DMZG) to "show the power" of the GeForce over the TNT.

This situation is clearly different. There ARE NO DX9 demos even available to benchmark, and ATI did not release any GL demos with 9700 extensions either. The clear power of the 9700 can be shown in today's existing titles and benchmarks.

The question of whether the 9700 is worth the price is still valid of course. However, the 9700 brings much more to the table than the "promise" of new features that require developer support.

I almost felt sorry for those who bought a GeForce256 SDR based on promises of a T&L X-Mas, etc. At least the DDR version a couple of months later offered a good boost in bandwidth to be able to run at higher resolutions.

Oh, and those pics look great! ;)
 
DemoCoder said:
We are buying immature products: no DX9, immature drivers, no software support in games, "pushing the envelope of TSMC" products.


So you are saying that TSMC's .15um process is immature? Hardly.
ATI made the right technological and financial decision by going .15um IMO.
 
DaveBaumann said:
Well, I didn’t really see any banding, but then I think it’s going to be difficult to see any issues with straight textured titles. Remember, as well, games won’t make use of the extra texturing / pipeline flexibility by default, so if a game forces multipass then it’s still going to be using 32-bit framebuffers for the intermediate passes.

Try turning the gamma way up. The best example I've seen of this is in Morrowind. If you have Morrowind, try finding a dungeon that has lots of fog and, with the gamma way up, go in and see if there's any banding. There's quite a lot of banding on my GeForce4.
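
The reason that trick works (a rough sketch of the math, assuming a plain power-law gamma ramp rather than whatever curve the driver really applies): raising gamma stretches the dark end of the range, so 8-bit steps that sit almost on top of each other near black get pulled several display levels apart.

Code:
# Map a few near-black 8-bit codes through a power-law gamma ramp and see
# how far apart they land on the display.

def apply_gamma(code, gamma):
    # 8-bit source code -> 8-bit display value after the gamma curve.
    return round(((code / 255) ** (1.0 / gamma)) * 255)

for code in (2, 3, 4, 5, 6):          # dark fog shades, one step apart
    normal = apply_gamma(code, 1.0)
    boosted = apply_gamma(code, 2.5)  # "gamma way up"
    print(f"source {code} -> display {normal} (gamma 1.0), {boosted} (gamma 2.5)")

# At gamma 1.0 adjacent source steps stay one display level apart; at
# gamma 2.5 the same steps land four to six levels apart, which is why
# the bands suddenly become easy to see in dark fog.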
 
MikeC said:
Joe DeFuria said:
I clearly remember initial GeForce SDR scores not showing much improvement over, and sometimes coming in slower than, the previous-gen TNT2 Ultra.

You could use a refresher course there Joe :)

http://www.nvnews.net/reviews/geforce_256/configuration.shtml

Mike, with early drivers the GF1 SDR did score slower than a TNT2 Ultra in Gamespot's Gamegauge suite. I just tried finding that article on Gamespot but couldn't; still, I clearly remember that its early D3D drivers weren't the greatest.
 
You could use a refresher course there Joe

Heh...pulled right from that review:

Direct 3D games will also have better performance. However, the benefits of the GeForce 256 will not be realized until game developers start making use of DirectX 7's transform and lighting capabilities.

I must admit that this review was challenging in finding applications that make use of the GeForce 256's transform and lighting capabilities. While games based on the Quake engine use the OpenGL transform pipeline, it took the results of the CAD benchmarking and the demos (TLCMark and The Whole Experience) provided by NVIDIA, to show the power of the GeForce 256.

Yes, there are cases, in some GLQuake benchmarks, where there is considerable improvement. But tests like the infamous "crusher" demo ("THE" Quake2 test), Kingpin... all marginal improvements.

And in the D3D Expendable tests, as I remembered, some showed the GeForce256 SDR as being slower.

Those benchmarks also reminded me of a characteristic of the SDR: in many cases, compared to the TNT, it tended to win by higher margins in 16-bit and lower resolutions... presumably because it took some burden off the CPU in those CPU-limited situations.

And that was another problem: nobody wanted a GeForce to run in 16 bit or low resolutions, where it really excelled relative to the TNT. The TNT was already "good enough" for those cases.

Imagine if the Radeon 9700 was only marginally better than the GeForce4 with Aniso and AA, but much faster in "raw" performance at moderate resolutions. Now that's a solution for a problem that doesn't exist!
 
My point is that what Joe stated was correct; what your own data shows contradicts your numbers, and it was tested over a wide range of hardware. This was a new product, but it was barely edging out a TNT2 Ultra. Low resolution was used to eliminate video card bottlenecks, and 800x600 was the more popular resolution for gaming 3 years ago, with an average monitor size of 15"-17".

[attached benchmark graph: Image138.gif]


Democoder's comments make it sound like you are not getting anything useful by buying one; well, the above screenshots contradict that statement... eye candy.
 
chalnoth said:
Try turning the gamma way up. The best example I've seen of this is in Morrowind. If you have Morrowind, try finding a dungeon that has lots of fog and, with the gamma way up, go in and see if there's any banding. There's quite a lot of banding on my GeForce4.

Why go looking for problems? If you have to turn the gamma "way up" to see it, that's pointless. Are you going to be playing with the gamma "way up"? Does anybody use these cards for what they were meant for anymore? PLAYING GAMES!!! It's like a frigging witch hunt whenever a new card comes out. "Hey, let's pause it here, zoom in really close, turn up the gamma, turn down the textures and stand on our heads. I think we'll find a problem with the new card that way." Goes for both sides of the fence.
 