Nvidia GT300 core: Speculation

DegustatoR: No, it's 3-way CrossFire against 2-way SLI (even the dual-GPU HD4870X2 is pretty close to the dual-GPU GTX295). Anyway, the multi-card hype is over, almost nobody buys it (despite its low price), nobody tests it, so why care? Maybe 99% of users have a single-card setup...
 
I agree with you, but your rhetorical question seemed to be pointed in another direction, toward AMD and nVidia. And you obviously know that AMD had its reasons for this.
Yeah, and I even know what that reason is, and it has little to do with the high end being unimportant and much more to do with AMD being in the red for several years. The question was aimed more at the public, who somehow think that having just one GPU is better than having the same GPU plus another, more complex GPU. It all boils down to what you want to do with these GPUs. If you don't want to compete in the ultra high end and you don't want to compete in the workstation and GPGPU space, then you're OK with only middle-class hardware. If not, then you're probably heading for trouble with such a strategy.

Contradict much?
Nvidia is using G92, a 1.5-year-old chip, because they simply cannot earn a profit selling a G200 chip at those prices. G200 is too big and costly to sell for less than ~$150, i.e. G92 prices.
GT200 was never meant to be sold at current prices. The problem is that NV DOESN'T HAVE A CHIP that can compete with RV770 while having the same complexity. It's not a problem of GT200, it's a problem of NV not having the right middle-class GPU.
And it's the same problem (although a bigger one of course, because the middle class IS more important than the high end) that AMD has with the absence of a high-end GPU. Why else would they do RV790?

DegustatoR: No, it's 3-way CrossFire against 2-way SLI (even the dual-GPU HD4870X2 is pretty close to the dual-GPU GTX295).
I'm not talking about the current situation. GT200 is bad compared to RV770 and that's exactly what makes 4870X2/CF look as good as it does -- the greatness of ONE RV770, not some miraculous AFR scaling or anything.

Anyway, the multi-card hype is over, almost nobody buys it (despite its low price), nobody tests it, so why care? Maybe 99% of users have a single-card setup...
Well, that means that whoever has the fastest single-GPU card wins. But I don't think it's that simple, otherwise NV would have won the battle against RV770.
 
Anyway, sales of the GTX285 are extremely low. If you look at the Steam stats, the card is as common as the old lackluster HD2900XT, which hasn't been available for more than a year. Every ultra-value 1-2 generation old board is twice as frequent in gaming PCs...

Even the one-generation-old GTX280, which is now sold at a bargain price (the local price is lower than the HD4890's), is only as common as the 1.5-generation-old HD3870...

http://store.steampowered.com/hwsurvey/directx/

Are you sure that this kind of strategy is really good?
 
Anyway, sales of the GTX285 are extremely low. If you look at the Steam stats, the card is as common as the old lackluster HD2900XT, which hasn't been available for more than a year.
I don't think you can compare 2900XT and GTX285 sales using Steam statistics. Many 2900XTs have been upgraded to newer cards since then, and the GTX285 has been on sale for less time than the 2900XT was.

Are you sure that this kind of strategy is really good?
What kind of strategy? Having a high-margin high-end chip (GT200) in addition to your low-margin high-volume chip (G92)? Yes, I'm quite sure that it's better than just having one low-margin high-volume chip, because it gives you and your customers more choice.
NV's current problem is that G92 is old and can't compete with RV770, and GT200 is relatively bad and never really was able to compete with RV770. Again, the problem is that NV doesn't have a proper RV770 competitor, not that it has a big high-end GT200. A big high-end chip in itself isn't a problem; it's a way into the ultra-high-end/workstation/server/GPGPU markets. It's not like you can't do a two-chip AFR card with two of your middle-class chips if you have a high-end chip. You still can, and you have more options.
 
Yes, you can't compare them directly, but their positions are similar, which definitely means something. Even the GTX280, which is EOL now, isn't much more frequent than the HD2900.

The GTX280 had 150% of the die space of the HD2900XT (R600), required a more complex PCB (NVIO chip + additional routing), and its sale price was, for the majority of its market life-span, significantly lower than the sale price of the HD2900XT. Despite its low price, the card wasn't very successful on the market. That doesn't sound like the description of a great product to me.

Almost every website called the HD2900XT the biggest fiasco since the days of NV30, but all indicators show that the GTX280 and 285 are in a similar or even worse position, and despite that, there are still a lot of people who consider this the right step.

I like big monolithic GPUs - if they bring a performance, technology and IQ advantage over the competition - e.g. X1950XTX. But that is not the case with GT200/b.

Many people (even on this forum) criticised R580 for poor price-effectiveness compared to G71. Now the situation is reversed (and RV790 is in an even better position: IQ comparable to the competition, a technological advantage in DX10.1 over DX10, and even some marketing advantages: GDDR5, 3rd generation of 55nm, etc.) - but only a few of the R580 critics are willing to say the same about GT200. Why?

I'm not sure it's reasonable to presume that GT300 will be similar to GT200. It seems to me that the whole super-clocked scalar concept was successful only due to the failure of R600. Despite the huge die size, more texturing, blending and Z power, and an effective scalar core at double the clock (additional performance), the resulting performance is not very stunning.

I'd expect some drastic changes - at least a significant increase in the ALU:TEX ratio.
 
The GTX280 had 150% of the die space of the HD2900XT (R600), required a more complex PCB (NVIO chip + additional routing), and its sale price was, for the majority of its market life-span, significantly lower than the sale price of the HD2900XT. Despite its low price, the card wasn't very successful on the market.

I don't think you should compare them.
The GTX280 is a high-end part, the 2900XT was not.
Aside from that, we weren't in a financial crisis back when the 2900XT was launched, so prices can't be compared. Most people could afford to pay a lot more for something like a videocard a few years ago.

I, for example, bought an 8800GTS320 at the time, rather than a 2900XT.
They were similar in price and performance, just below the high end, at sharp prices for the time. I didn't buy an 8800GTX then, and I wouldn't buy a GTX280 now. I would have bought a 2900XT... it's just that I thought the 8800GTS was the better product in that price range.

I bet the 8800GTS sold far more than the 8800GTX, and the GTX280 is the 'replacement' for the 8800GTX as the high-end card (or perhaps even for the Ultra, or both) - not for the 8800GTS, so it also has no relation to the 2900XT.

The 8800GTS probably also sold a LOT more than the 2900XT, which proves the fiasco that the 2900XT was.

Almost every website called the HD2900XT the biggest fiasco since the days of NV30, but all indicators show that the GTX280 and 285 are in a similar or even worse position, and despite that, there are still a lot of people who consider this the right step.

But the 2900XT *was* a fiasco. It was supposed to compete with the 8800GTX, but fell well short of that. So what you had was a card that was too expensive, too noisy, and too power-hungry for the performance it delivered. As such it had to be pitched against the 8800GTS instead, which wasn't as power-hungry and noisy (and was probably cheaper for nVidia to build).
The GTX280 and GTX285 don't have that problem. They have very good power efficiency (better than ATi), run quietly (better than ATi), and the price is in line with performance.
The thing is just that, because of its high price and performance, it is out of reach for many people. That makes it an enthusiast product, not a fiasco (just like a Ferrari is for enthusiasts; they don't sell many cars, but that doesn't make them a fiasco either).

There is only one 'problem' with the GTX series... that nVidia may have wanted to charge higher prices, but they can't, because of competition from ATi.
 
Yes, you can't compare them directly, but their positions are similar, which definitely means something. Even the GTX280, which is EOL now, isn't much more frequent than the HD2900.
GTX285 is an update to GTX280; you should add their sales together.

The GTX280 had 150% of the die space of the HD2900XT (R600), required a more complex PCB (NVIO chip + additional routing), and its sale price was, for the majority of its market life-span, significantly lower than the sale price of the HD2900XT. Despite its low price, the card wasn't very successful on the market. That doesn't sound like the description of a great product to me.
It wasn't fast enough for the majority of G80 users to go and upgrade -- that was the main problem with GTX280. Plus the gaming landscape changed -- there are only a couple of games which aren't playable on G80/G92/RV770, and all of them aren't playable on GT200 either -- so why buy it? GT200 was a bad GPU from the performance-update point of view.

Almost every website called the HD2900XT the biggest fiasco since the days of NV30, but all indicators show that the GTX280 and 285 are in a similar or even worse position, and despite that, there are still a lot of people who consider this the right step.
Not really. GT200/b is still the fastest GPU on the market and GTX295 is the fastest videocard on the market. Neither NV30 nor R600 ever was, and that's why they were failures. GT200 is a failure for NV, but it's a good GPU for consumers.

I like big monolithic GPUs - if they bring a performance, technology and IQ advantage over the competition - e.g. X1950XTX. But that is not the case with GT200/b.
GT200/b is the first GPU on the market with IEEE DP capability. It's the fastest GPU on the market even now, 9 months after launch. It supports PhysX and now has the ability to "force" SSAO in some titles. That's features, performance and quality.
It sucks only in the perf/mm^2 area. In every other area GT200 is a very solid GPU.
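
To put a rough number on the perf/mm^2 point, here's a back-of-envelope sketch in C. The die sizes are the commonly cited figures (GT200 ~576mm2 on 65nm, RV770 ~256mm2), and the ~25% single-GPU performance lead is purely an illustrative assumption, not a benchmark:

```c
/* Back-of-envelope perf/mm^2 comparison. Die sizes are the commonly
   cited figures; the 25% performance lead is an illustrative
   assumption, not a measured number. */
#include <stdio.h>

int main(void)
{
    const double gt200_mm2 = 576.0;  /* GT200 @ 65nm, commonly cited */
    const double rv770_mm2 = 256.0;  /* RV770 @ 55nm, commonly cited */
    const double perf_lead = 1.25;   /* assumed GTX280 vs HD4870 lead */

    double area_ratio = gt200_mm2 / rv770_mm2;               /* 2.25x */
    printf("area ratio:      %.2fx\n", area_ratio);
    printf("perf/mm^2 ratio: %.2fx\n", perf_lead / area_ratio); /* ~0.56x */
    return 0;
}
```

Even granting GT200 a healthy single-GPU lead, it comes out at roughly half the performance per mm^2 of RV770, which is the whole complaint in a nutshell.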

Many people (even on this forum) criticised R580 for poor price-effectiveness compared to G71.
If I remember correctly, it was the strange (at the time) choice of ALU/TU balance that was criticised most, not R580's poor price-effectiveness in general. I myself preferred R580 to G71 back then.

Now the situation is reversed (and RV790 is in an even better position: IQ comparable to the competition, a technological advantage in DX10.1 over DX10, and even some marketing advantages: GDDR5, 3rd generation of 55nm, etc.) - but only a few of the R580 critics are willing to say the same about GT200. Why?
What are you talking about? Everyone SCREAMS about GT200 being bad because of low perf/mm^2!

I'm not sure it's reasonable to presume that GT300 will be similar to GT200. It seems to me that the whole super-clocked scalar concept was successful only due to the failure of R600.
Yeah, that's probably why Intel has chosen the same concept for LRB =)

I'd expect some drastic changes - at least a significant increase in the ALU:TEX ratio.
I think it's pretty safe to assume that GT300 will be a G80/GT200 evolution, i.e. it won't be as drastic an architectural change as NV20->NV30 or G70->G80 was. Serial scalars won't go anywhere, and the ALU/TU ratio will be increased again of course, but that's far from "drastic changes". Somehow I think that most of the improvements will go into the thread dispatcher, which will allow them to significantly increase processing efficiency. GDDR5 support will give them the same bandwidth per line as AMD has/will have, so any current AMD AA performance/PCB advantage will probably disappear. The most interesting questions about GT300 are the DP performance and design decisions, and the tessellation performance compared to RV870. But I wouldn't expect any drastic changes in the underlying architecture.
 
GT200/b is the first GPU on the market with IEEE DP capability.

The only chips in the consumer market with IEEE FP anything are conventional x86 CPUs.
Larrabee's vector ops are an unknown, but at least its x87 pipeline should match some level of the specification.

GPUs currently have varying degrees of "sort of IEEE, getting there maybe".
 
The only chips in the consumer market with IEEE FP anything are conventional x86 CPUs.
Larrabee's vector ops are an unknown, but at least its x87 pipeline should match some level of the specification.

GPUs currently have varying degrees of "sort of IEEE, getting there maybe".
The GT200 specs say "IEEE 754 single & double precision floating point units". I'm not an expert on these things - are they lying?
 
Dual-GPU setups are still more of a hassle, whether they take two slots or one. I am just glad that the mid-range AMD cards are good. Just think if they had made only a low-end chip and tried to sell four of them on one card for the mid-range.
 
The GT200 specs say "IEEE 754 single & double precision floating point units". I'm not an expert on these things - are they lying?

If their statements were that their math is fully IEEE compliant, then yes they would be lying.

Nvidia had a presentation slide showing some of the things they do not support, including flags and exceptions.

The DP is more fully featured than in the past, and denormal support is massively faster than it is on an x86, but not every bit of the specification is there.
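
To make "flags and exceptions" concrete, here's a minimal host-side C99 sketch (nothing GPU-specific, just the CPU's fenv.h) of the sticky status flags a fully compliant IEEE 754 implementation has to expose:

```c
/* Minimal demo of IEEE 754 sticky status flags, host-side C99.
   Build without -ffast-math, e.g.: cc -std=c99 flags.c -lm */
#include <stdio.h>
#include <float.h>
#include <fenv.h>

int main(void)
{
    feclearexcept(FE_ALL_EXCEPT);      /* clear all sticky flags */

    /* DBL_MIN / 3.0 is both tiny and inexact, so a compliant
       implementation must raise the underflow and inexact flags. */
    volatile double tiny = DBL_MIN;    /* smallest normal double */
    volatile double r = tiny / 3.0;    /* subnormal, rounded result */

    printf("result: %g\n", r);
    if (fetestexcept(FE_UNDERFLOW)) puts("FE_UNDERFLOW raised");
    if (fetestexcept(FE_INEXACT))   puts("FE_INEXACT raised");
    return 0;
}
```

GT200 can produce the subnormal result (and quickly, per the above), but going by that presentation slide there's no way for code running on it to test flags like these afterwards - which is exactly the gap between "IEEE 754 units" and full compliance.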
 
I don't think you should compare them.
The GTX280 is a high-end part, the 2900XT was not.
The HD2900XT was more expensive than the GTX280 for the majority of its life-span.
Aside from that, we weren't in a financial crisis back when the 2900XT was launched, so prices can't be compared.
Are you sure? The price of GT200 fell before we heard the first mention of the crisis, and more than half a year before the crisis affected anything. Prices went down shortly after the RV770 launch - it was ATi's aggressive price policy, not the crisis.
The 8800GTS probably also sold a LOT more than the 2900XT, which proves the fiasco that the 2900XT was.
The HD4800 also sells better than the GTX280, but that isn't related to the situation I was talking about. ATi was able to sell the 400mm2 HD2900XT at a higher price, in amounts similar to what nVidia achieves selling the 600mm2 GTX280 at a lower price. And 3/4 of GTX280 sales were made before the crisis impacted these markets.

But the 2900XT *was* a fiasco. It was supposed to compete with the 8800GTX, but fell well short of that. So what you had was a card that was too expensive, too noisy, and too power-hungry for the performance it delivered. As such it had to be pitched against the 8800GTS instead, which wasn't as power-hungry and noisy (and was probably cheaper for nVidia to build).
Is there a single reason why the GTS should be cheaper to produce? G80 was bigger than R600, an additional chip (NVIO) was required, 25% more VRAM was required...

The GTX280 and GTX285 don't have that problem. They have very good power efficiency (better than ATi), run quietly (better than ATi), and the price is in line with performance.
That's very nice, but then why isn't the board more widespread? Why don't people buy it more? Is it really profitable for nVidia to sell 500-600mm2 GPUs at the launch price of the 128bit 150mm2 GF6600GT from 3 years ago (or the GF8600GTS from 2 years ago)? Back then, none of us could have imagined a more desperate step than selling a 512bit 600mm2 GPU at a mainstream price.

Power efficiency, reference coolers or temperatures are not the key features of a graphics board. The key aspects for evaluating a 3D accelerator are 3D performance, price (maybe the price/performance ratio), image quality and features. Operational characteristics really aren't the main criterion for choosing a graphics card. Maybe they can help in choosing the right product if I have two boards with similar performance, a similar price and a similar feature set (e.g. HD4850 / GF9800GTX+).

GTX285 is an update to GTX280; you should add their sales together.
RV670 was an update to R600. Should I add it to their sales, too?
Not really. GT200/b is still the fastest GPU on the market and GTX295 is the fastest videocard on the market. Neither NV30 nor R600 ever was, and that's why they were failures. GT200 is a failure for NV, but it's a good GPU for consumers.
Very few people care about the fastest solutions on the market these days, especially if they can buy a solution which is 5-15% slower, 20-30% cheaper and supports up-to-date standards (DX10.1).
GT200/b is the first GPU on the market with IEEE DP capability. It's the fastest GPU on the market even now, 9 months after launch. It supports PhysX and now has the ability to "force" SSAO in some titles. That's features, performance and quality.
It sucks only in the perf/mm^2 area. In every other area GT200 is a very solid GPU.
Yes, it would be a nice GP-GPU accelerator, but there are no meaningful GP-GPU applications for end users, so it's logical that the qualities of this card are measured by its 3D performance. Because of the poor performance/size ratio, the chip is sold for a fraction of the sale price it was designed for. How can I consider GT200 a very solid GPU if it is competitive (performance-wise) only in a price segment two grades lower than the one it was designed for?

Such massive price drops on a high-end solution indicate to the user that the manufacturer isn't sure about the product's position and that something is wrong with it. The GF7900GTX was slower and less featured than its competitor, but nVidia denied that - touted how great it was - and it worked. The product was successful on the market without any price drop.

Yeah, that's probably why Intel has chosen the same concept for LRB =)
The concept of GT200 was probably good for GP-GPU applications, but what sells graphics cards these days is still their 3D performance.
GDDR5 support will give them the same bandwidth per line as AMD has/will have, so any current AMD AA performance/PCB advantage will probably disappear.
I don't think ATi's AA advantage comes from BW. The GTX285 has >25% more BW, but its 8x MSAA performance is identical to the HD4890's on average.
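
For reference, the arithmetic behind those BW numbers (using the commonly cited reference specs, so treat the figures as approximate):

```c
/* Bandwidth back-of-envelope, using commonly cited reference specs
   (approximate): GTX285 = 512-bit GDDR3 @ 2484 MT/s,
   HD4890 = 256-bit GDDR5 @ 3900 MT/s. */
#include <stdio.h>

/* bytes per transfer (bus width / 8) times transfers per second */
static double bw_gb_s(int bus_bits, double mt_per_s)
{
    return (bus_bits / 8.0) * mt_per_s / 1000.0;
}

int main(void)
{
    double gtx285 = bw_gb_s(512, 2484.0);  /* ~159 GB/s */
    double hd4890 = bw_gb_s(256, 3900.0);  /* ~125 GB/s */

    printf("GTX285: %.1f GB/s\nHD4890: %.1f GB/s\n", gtx285, hd4890);
    printf("GTX285 advantage: %.0f%%\n", (gtx285 / hd4890 - 1.0) * 100.0);
    return 0;
}
```

Per line, though, that GDDR5 moves roughly 57% more data than the GDDR3 (3900 vs 2484 MT/s), which is what the "same bandwidth per line" point above is about.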
 
The bottom line is: no one would have cared about the new ATI strategy if their 260mm2 chip hadn't been that insanely fast. We can debate all day about which strategy is best (big GPUs, more but smaller GPUs, etc.), but in the end what surprised everyone is that it's possible to pack all that power into such a "small" chip (btw, still waiting for some sort of educated explanation of how they did it; it wasn't an incremental improvement, that's for sure).

Remember: small chip that can perform close to massive chips == win.
 
Yes, and that's why I cannot understand DegustatoR's claims that they (ATI) are killing the market.

Fact is, nVidia has this huge chip that does not perform. Market price is dictated by performance alone.
 
NVidia planned to sell GT200 as Tesla and as Quadro, regardless of the price paid by consumers. Not only is the profit there "remarkable", but NVidia also gave two sectors of the market a continued reason to buy "nothing but NVidia". With 99% GPGPU and 80%+ workstation market share, it seems to me the retail price of GT200 was pretty much irrelevant.

NVidia was probably selling Tesla GT200bs (easy-binning GT200bs) too, months before the consumer launch - presumably because that channel's mechanics are rather different from the consumer channel's.

GT200 development costs are also, effectively, part of the long-running CUDA start-up costs. 5-year plan, not 6-month schedule. So GT200's payback time will last well into the future beyond GT300 simply because NVidia was able to reinforce the position it acquired with G80, solidifying its 2-year lead.

I dare say NVidia chose to scrap "G100" in order to produce a double-precision GT200, incurring a delay of 6 months, which shows that NVidia really didn't give two hoots what consumers thought of it. It was more important for NVidia to reinforce the CUDA battlelines - and it seems that RV670's double-precision capability forced a swift re-appraisal.

Though I will admit NVidia does seem to have handed out a lot of Teslas. And "professional" has supposedly collapsed, worse than consumer. So NVidia just has to wait a bit longer for the return on GT200. No big deal when the real return is absolute domination in GPGPU.

Jawed
 
The HD2900XT was more expensive than the GTX280 for the majority of its life-span.

As I already said, you can't compare prices directly when the economy has changed so dramatically. Videocards have become cheaper in general.

Are you sure? The price of GT200 fell before we heard the first mention of the crisis, and more than half a year before the crisis affected anything. Prices went down shortly after the RV770 launch - it was ATi's aggressive price policy, not the crisis.

Regardless of what caused it, prices today aren't the same as they were 2-3 years ago.

Is there a single reason why the GTS should be cheaper to produce? G80 was bigger than R600, an additional chip (NVIO) was required, 25% more VRAM was required...

The GTS is a 'salvage part' while the 2900XT isn't (in fact, ATi probably had to push hard with the clock speeds to remain competitive, so yields probably wouldn't have been that great), and they also had a 320 MB version, so less VRAM required, and simpler/cheaper boards (remember, also a 320-bit memory bus rather than 512-bit).
So I think it's quite certain that the GTS was cheaper to produce than the 2900XT, for more than one reason.

That's very nice, but then why isn't the board more widespread? Why don't people buy it more?

Because it's an enthusiast part, not a mainstream part.
Since there are many mainstream parts that are cheaper and offer very acceptable gaming performance, most people will just go for a cheaper option. As I said, I personally went for an 8800GTS back then, and I wouldn't go for a GTX280 today either. I'd go for the GTX260 or GTX275.

Is it really profitable for nVidia to sell 500-600mm2 GPUs at the launch price of the 128bit 150mm2 GF6600GT from 3 years ago (or the GF8600GTS from 2 years ago)?

Since they don't sell that many, it probably doesn't bother them much how profitable it is.

Power efficiency, reference coolers or temperatures are not the key features of a graphics board. The key aspects for evaluating a 3D accelerator are 3D performance, price (maybe the price/performance ratio), image quality and features.

I think you are looking at this only from an enthusiast perspective. OEMs don't like high-end videocards because they require bigger, more expensive PSUs and better case cooling solutions.
I personally don't like noisy videocards either... While I like to play the odd game, I also use my PC in a home recording studio, and I like it quiet.
Another reason why I didn't get a 2900XT was that I would have needed to replace my PSU, while the 8800GTS320 would run on my current PSU.
 
The GTS is a 'salvage part' while the 2900XT isn't (in fact, ATi probably had to push hard with the clock speeds to remain competitive, so yields probably wouldn't have been that great)
The HD2900XT was clocked at 740MHz. The majority of boards were able to run at 840MHz with the stock voltage and cooler. The problem was extreme power requirements, not yields. Even the GT and PRO versions were very OC-friendly; 800MHz was quite common at the XT's voltage level.

This argument is groundless. Yield issues were never mentioned in relation to R600, there were no problems with R600 supplies, and both the performance and cost-down derivatives had good OC headroom.

This is the same as arguing that G80 yields were probably bad because the majority of the GPU was clocked at 1500MHz to stay competitive :rolleyes:

and they also had a 320 MB version, so less VRAM required
The GTS320 was a cheaper and slower part for a different price segment (it also suffered from some bugs which negatively affected performance). I'm comparing the GTS640 to the HD2900XT ~ two products with comparable price and performance.

and simpler/cheaper boards (remember, also a 320-bit memory bus rather than 512-bit).
So I think it's quite certain that the GTS was cheaper to produce than the 2900XT, for more than one reason.
Sorry, but the GeForce 8800 GTS used the same (384-bit) PCB as the GTX, so the HD2900XT had only a bit more PCB routing because of its slightly wider bus. Do you really think that this minor difference outweighs an additional chip (NVIO), a more complex PCB because of the additional routing for NVIO, 25% more RAM, and a 70mm2 bigger die covered by a heatspreader? I don't think so.
 