A Few 9700 Screenshots

Sorry, I replied in the wrong thread. I was replying to the "tape out" thread where someone said NVidia will be shipping a "barely working" NV30 this Christmas, which seems like a ridiculous comment to make. I mean, do we have any figures on Radeon 9700 PRO yields?

And for the record, I don't think ATI's decision to use .15um is any better than 3dfx's decision to use SDR memory and older processes. ATI really pushed the limits of .15um. Their design is a huge, customized die and a power-hungry beast. Clearly, they are going to ship a .13um R300 and follow-on products that use less power, clock higher, and are cheaper.

NVidia has always targeted memory technology and processes ahead of time and made big bets. Their schedule might be slightly delayed, but I think it was fundamentally the right thing to do. They gained valuable experience doing it. Eventually, the design and process will mature, and NVidia will benefit from it, the same way they benefited from the DDR market when it matured.

Back in the 3dfx days, NVidia was accused of the same thing. DDR was a "risky" bet; multichip was better, more reliable. DDR was expensive, supply was low, etc etc. Now, only a few years later, DDR is everywhere and racing towards 1 GHz.
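Rough back-of-the-envelope numbers show why the DDR bet mattered once supply caught up: peak bandwidth is just bus width times the effective data rate. A quick sketch in Python, assuming the commonly quoted GeForce 256 board clocks (treat them as approximate):

Code:
# Peak memory bandwidth = (bus width in bytes) * (effective transfer rate).
# DDR moves data on both clock edges, so its effective rate is 2x the clock.

def peak_bandwidth_gb_s(bus_width_bits: int, clock_mhz: float, ddr: bool) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    effective_rate = clock_mhz * (2 if ddr else 1)              # MT/s
    return (bus_width_bits / 8) * effective_rate * 1e6 / 1e9    # bytes/s -> GB/s

# GeForce 256 SDR: 128-bit bus, ~166 MHz SDRAM
print(f"SDR: {peak_bandwidth_gb_s(128, 166, ddr=False):.1f} GB/s")   # ~2.7 GB/s
# GeForce 256 DDR: 128-bit bus, ~150 MHz DDR (300 MT/s effective)
print(f"DDR: {peak_bandwidth_gb_s(128, 150, ddr=True):.1f} GB/s")    # ~4.8 GB/s

Same core, nearly double the bandwidth from the memory choice alone.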

6 months from now, all of this whining about NVidia's .13um yield problems will seem like a thing of the past, and in hindsight, it will be seen as a bold move.

(Precedent: On B3D, there were oodles of rumors about GF3 yield problems, and rumors that GF3 boards would cost as high as $800 because of bad yields and still expensive DDR)
 
(Precedent: On B3D, there were oodles of rumors about GF3 yield problems, and rumors that GF3 boards would cost as high as $800 because of bad yields and still expensive DDR)

GeForce 3s debuted in Canada at $700, so I'd say they were expensive :-?
I also don't remember the CEO stating that the GeForce 3 hadn't taped out, or admitting to problems with .15 micron either...

I think this situation is different.
 
Well, based on your logic, then I guess there's no reason to upgrade right now either...

The results from that 640x480 graph look mighty familiar.

And I must confess that I had been using a 19" CRT for the previous 3+ years as well... 15-17" was the previous generation, as far as I was concerned (and I was just a college student then).
 
I'd bet ATi has already been working on .13um designs for the R300. By the time .13um is viable for the NV30, it probably will be viable for ATi too.
 
And for the record, I don't think ATI's decision to use .15um is any better than 3dfx's decision to use SDR memory and older processes.

Ahh...but there is a big difference, and why ATI's decision "turned out" much better: ATI shipped their product sooner than the competition.

That was 3dfx's PLAN. Decisions only look bad if they don't turn out as planned. ;) There are risks with either approach (push the limits of existing tech, or go for bleeding-edge new tech). 3dfx's thinking, like ATI's, was that they had a better chance of getting their product out first by using established technology, rather than relying on newer tech that "should" be ready in 18 months. (18 months from when such decisions are made.)

In 3dfx's case, 3dfx didn't "execute" properly, and the new tech came on-line pretty much on schedule.

In ATI's case, ATI DID execute very well, and nVidia is the one that might end up really regretting the decision... depending on how late NV30 actually is.

In other words, nobody really cares what decision is made about which tech, memory, etc. to use. If 3dfx had managed to ship 3 months earlier than the GeForce, rather than 3 months after, they might actually still be around today.
 
6 months from now, all of this whining about NVidia's .13um yield problems will seem like a thing of the past, and in hindsight, it will be seen as a bold move.

That is, assuming 6 months from now there are NV30 chips out in quantity. ;)

No one ever said aggressively trying 0.13 micron isn't a bold move. At the same time, the same can be said for believing you can design a 110-million-transistor chip with DX9 floating-point pipes on 0.15 and clock it at 325 MHz. That's the point.

Either way you try it, these high-end designs are BOLD. In this case, it's ATI that gets the feather in its cap though, for getting the product out first.
 
The average monitor size for gamers today is 17", I would say, based on the Valve survey... that means 1024x768, maybe 1280x1024. Of course, compare FSAA and anisotropic filtering, as that is what we pay the big bucks for with these high-end cards...

So a 9700 is getting 2x the performance of a Ti4600 in this example at 1024x768... I don't compare CPU-limited conditions, and I don't compare high-end cards running without any features enabled either.

[Chart: aaaf_perf.jpg — AA/anisotropic filtering performance at 1024x768]
 
Doomtrooper said:
My point is that what Joe stated was correct... what your data shows contradicts your numbers, and it's tested over a wide range of hardware... this was a new product but was barely edging out a TNT Ultra

My data shows similar results in Expendable, but not in 3DWinMark99 nor DMZG. Anand was foolish to state that "the drivers are the downfall to the GeForce 256's Direct3D performance" based only on measuring performance in Expendable.

But I digress as my initial response to Joe was to make a point that you've replicated in your reply to me.

"...this was a new product but was bareley edging out a TNT Ultra"

That statement is misleading and it would be prudent for some of us to take the time to think about what has been written before posting. That's all I'm after.
 
DemoCoder said:
I mean, do we have any figures on Radeon 9700 PRO yields?
Good enough to sell, oooh, right about now :)

And for the record, I don't think ATI's decision to use .15um is any better than 3dfx's decision to use SDR memory and older processes. ATI really pushed the limits of .15um. Their design is a huge, customized die and power hungry beast.
Well, 3dfx still was VERY delayed in using the tried-and-true SDRAM tech... ATi has delivered a DX9 chip before NV30... whether or not it's faster or whatever is irrelevant, because history showed that the 3dfx chip came out after its intended date, with fewer marketing features than the already released NVIDIA product. ATi has not suffered that "delay"; instead, ATi delivered ahead of NVIDIA this round (DX9 products).


6 months from now, all of this whining about NVidia's .13um yield problems will seem like a thing of the past, and in hindsight, it will be seen as a bold move.
I'm willing to bet no one knows the future for sure ;)
 
jjayb said:
Why go looking for problems?

Because it is one of the many purposes of the Beyond3D forums :) It may well be a fact that if gamma is the "determining factor", 128-bit FP is no real advantage.

I find it quite amusing that many folks want technical content at this site yet the constant gripe about "But will we notice this in actual games?" is also prevalent.

You want advancement or what-you-can-see-generally-speaking? :)

FWIW :

Any product that is better than the last "best product" is undeniably "better". Whether in specs or performance or OOTB image quality improvements.

The keyword, insofar as this forum is concerned AFAICS, is whether we are talking about "products" or architecture.

We need to make the distinction (or difference in principle, at least in this forum, the way I see it) between "3D technology" and "available products": either they're topics of interesting discussion, or a chance to say "The product doesn't exist; the publicly-available/announced 'architecture', therefore, is not worth discussing."
 
jjayb said:
Why go looking for problems? If you have to turn the gamma "way up" to see it, that's pointless. Are you going to be playing with the gamma "way up"?

Because, in actuality, I do play Morrowind with the gamma "way up." Yes, it's a cheap way of getting around using torches, but I just don't think the torches in the game are bright enough anyway, so I just boost the gamma.
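As a rough illustration of why a gamma boost is the kind of thing that exposes precision differences in the first place (a minimal sketch with made-up values, not anything from the posts above): the boost stretches the dark codes apart, so steps an 8-bit framebuffer can't avoid become visible bands, while a floating-point buffer still has shades in between.

Code:
# Minimal sketch (illustrative values only): a gamma boost maps v -> v^(1/gamma),
# which stretches the dark end of the range. Adjacent 8-bit codes end up much
# further apart (visible banding), while a float buffer keeps in-between shades.

def boost_gamma(v: float, gamma: float = 2.2) -> float:
    """Apply a simple gamma boost to a normalized [0, 1] intensity."""
    return v ** (1.0 / gamma)

# Two adjacent dark codes in an 8-bit framebuffer...
a, b = 4 / 255, 5 / 255
print(f"8-bit step before boost: {b - a:.4f}")                            # ~0.004
print(f"8-bit step after boost:  {boost_gamma(b) - boost_gamma(a):.4f}")  # ~0.016

# ...versus a float buffer, which can still represent the halfway shade.
mid = (a + b) / 2
print(f"float midpoint after boost: {boost_gamma(mid):.4f}")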
 
Joe DeFuria said:
Ahh...but there is a big difference, and why ATI's decision "turned out" much better: ATI shipped their product sooner than the competition.

Um, ATI's R300 decision has certainly not yet "turned out" at all. It's far too early to see what sort of impact the decisions the two companies made will have.
 
But I digress as my initial response to Joe was to make a point that you've replicated in your reply to me.

"...this was a new product but was bareley edging out a TNT Ultra"

That statement is misleading

Let me refine that point, and maybe we'll all be happy: ;)

The measurable difference in "today's games" between the Ti 4600 and the 9700 is much larger than the measurable difference between the GeForce SDR and the TNT-2 Ultra....
 
Um, ATI's R300 decision has certainly not yet "turned out" at all. It's far too early to see what sort of impact the decisions the two companies made will have.

Um, all we know for sure is that the R300 is here now, and the NV30 isn't. Please, Chalnoth, give ATI credit where it is due.

None of us are making grand claims that the results of ATI's decision will mean the destruction of nvidia, or even some magical turning point. In the grand scheme of things, this ONE high-end product probably won't have a great impact either way... it's how these companies are able to follow up with the more mainstream ones that will make the big difference.

That being said, ATI is set to release the 9500 in Q4...
 
Joe DeFuria said:
it's how these companies are able to follow up with the more mainstream ones that will make the big difference.

Which is precisely why ATI's decision hasn't "turned out" at all.

Additionally, as far as I know, the 9700 is not yet available in stores (I saw ETAs of around early September... i.e. still a week or two away).
 
Chalnoth said:
Joe DeFuria said:
Ahh...but there is a big difference, and why ATI's decision "turned out" much better: ATI shipped their product sooner than the competition.

Um, ATI's R300 decision has certainly not yet "turned out" at all. It's far too early to see what sort of impact the decisions the two companies made will have.

I still can't pick up one off a store shelf, and it's almost September. Even when they start to ship, when will they be available in volume?

I mean, the PlayStation 2 "shipped" in October. I know, I had to drive around for miles trying to find the one store that had a few. I think we need to really draw the line between "paper launched", "shipped a trickle of boards" and "wide availability". Paper launching or shipping a trickle isn't likely to beat any competitor any more than a concept car is likely to cause consumers to stop buying Civics.
 
Which is precisely why ATI's decision hasn't "turned out" at all.

Eh...whatever, Chalnoth. What's so difficult about "ship comparable product first by at least 3 months = right decision for that product?"

Oh, and ATI has shipped to consumers who ordered directly from them. People have shipping tracking numbers and everything.

It's shipping, OK? Get over it. :rolleyes:
 
Paper launching or shipping a trickle isn't likely to beat any competitor any more than a concept car is likely to cause consumers to stop buying Civics.

Holy hell, people.

This is a low-volume product to begin with. Just remember to draw these same lines when the NV30 "launches / trickles / volume ships", as with every other previous high-end nVidia product.

As I said...this is ONE PRODUCT. No one made any grand claims of "beating" a competitor as a whole with this product. But for God's sake, product vs. product, can you at least concede that R-300 has the LEAD at this time?

The high-end product is more about mind-share than making a direct difference in the bottom line, IMO. Funny thing about mind-share and indirect benefits though....
 
This is a low-volume product to begin with. Just remember to draw these same lines when the NV30 "launches / trickles / volume ships", as with every other previous high-end nVidia product.

Yes, I'm going to reply to myself. 8)

I know what you'll say...."Man...the NV30 is in such huge demand, I can't find it in stock on the shelves!" :rolleyes:
 
Chalnoth said:
Which is precisely why ATI's decision hasn't "turned out" at all.

Well.. I just got back from Toronto and a couple of meetings with ATI on various things... but I can assure you the 9700 has "turned out" for ATI. They are shipping 9700s and they had them in the office by the dozens.

And anyone who's trying to compare the 9700 to the GeForce SDR launch is nuts... I've been playing with this 9700 for our review for a couple of days now since I got back, and no matter how you try to spin die size/heat/process/features/whatever, the bottom line is that this board delivers an impressive package over anything else around. And yes, it is available now. Cards are in the mail to those who pre-ordered as we speak.

Edit: fixed quote.
 