Price of current high-end cards in the future?

UPO

We are all waiting for the new chips from both NVIDIA and ATI. The question is: when the R420 and NV40 finally appear on shelves (April-May), what will the price drop (percentage-wise) be for cards like the R350 or the FX 5950?
What do you guys think?
 
Future prices are hard to predict. It's really just guessing. :p

In the short term, prices will drop quite noticeably. This has already happened to some extent for the 9800 Pro 128; it can be had for ~$215 here in the US.
But where prices go later mainly depends on how many cards are still around. The 9700 Pro has dropped to ~$190 or so, but probably won't drop much more due to scarcity. ATI stopped producing the R300, so the supply quickly became limited, which kept the price up because there was still solid demand.
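
Just to put rough numbers on "quite noticeably": taking the 9800 Pro's usual $399 launch MSRP (from memory, so treat the launch figure as an assumption), the drop so far works out to

\[
\text{drop} = \frac{P_{\text{launch}} - P_{\text{street}}}{P_{\text{launch}}} = \frac{399 - 215}{399} \approx 46\%
\]

By the same formula, the 9700 Pro at ~$190 is down roughly 52%, which is about where scarcity seems to have frozen it.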

Therefore, what happens to R350, R360, and NV35 prices depends on whether those chips will still be in production. Anybody have any info/rumors/speculation on production plans for those chips over the next few months?
 
Vortigern_red said:
look at this!

This price seems to be lower than many 9600XTs in the UK. No stock ATM, but a sign of things to come?

Those sold out in seconds... I'd be surprised if they ever get more stock.

You can get a Sapphire 9600XT for £113 (and other brands for as little as £100, though I'd recommend Sapphire) at the moment from Overclockers.co.uk. By far the best place to buy one from.
 
Quitch said:
Vortigern_red said:
look at this!

This price seems to be lower than many 9600XTs in the UK. No stock ATM, but a sign of things to come?

Those sold out in seconds... I'd be surprised if they ever get more stock.

You can get a Sapphire 9600XT for £113 (and other brands for as little as £100, though I'd recommend Sapphire) at the moment from Overclockers.co.uk. By far the best place to buy one from.

Erm, no, I don't think so... :p
 
Thanks for answers!

I was wondering about it and came to the conclusion that it all depends on how the companies react to the market:
If both NVIDIA and ATI focus on increasing (or keeping) their margins, there will be some balance and the price drop will be rather moderate. But if one company gets a significant technological advantage, it may try to get rid of its competitor by starting a price war...

Any comments appreciated :)
 
I'm hoping for more competition this time around. I was quite disappointed that last year, despite the tough competition between ATI and NVIDIA, their high-end parts still cost so much; $499 US is way too much. Yes, prices did go down and there were lower-cost alternatives, but I get all giddy imagining an all-out price war driving the high-end parts down much quicker!
 
Thunderbird said:
I'm hoping for more competition this time around. I was quite disappointed that last year, despite the tough competition between ATI and NVIDIA, their high-end parts still cost so much; $499 US is way too much. Yes, prices did go down and there were lower-cost alternatives, but I get all giddy imagining an all-out price war driving the high-end parts down much quicker!

My theory is that there were too many fanboys.

It goes like this: since over 80% of people had already decided what they would get regardless of the price/performance difference, there was no incentive for ATI and NV to have a price war...

Of course, that may be total BS, but it is one plausible theory. (But most likely wrong :D)
 
If you have a tech lead and are making good margins, you don't cut prices. All that does is hurt your profits and devalue your product. It also makes it harder to raise prices in the future, as both end customers and your distribution chain have got used to your lower margins giving them a lower price.

If, like ATI, you've been in the lead and are selling chips as fast as you can make them, there is no incentive to cut prices until your competition is hurting you.
 
Tahir said:
Quitch said:
Vortigern_red said:
look at this!

This price seems to be lower than many 9600XTs in the UK. No stock ATM, but a sign of things to come?

Those sold out in seconds... I'd be surprised if they ever get more stock.

You can get a Sapphire 9600XT for £113 (and other brands for as little as £100, though I'd recommend Sapphire) at the moment from Overclockers.co.uk. By far the best place to buy one from.

Erm, no, I don't think so... :p

So where in Britain can you get a cheaper Sapphire Radeon 9600XT 128MB? Where would you recommend I shop?
 
Quitch said:
So where in Britain can you get a cheaper Sapphire Radeon 9600XT 128MB? Where would you recommend I shop?

Overclockers.co.uk is fine as long as you never, ever have to return anything to them - I've heard so many horror stories about their returns and RMA policy that it's incredible.
 
You wouldn't believe how much sense you make.

Bouncing Zabaglione Bros. said:
If you have a tech lead and are making good margins, you don't cut prices. All that does is hurt your profits and devalue your product. It also makes it harder to raise prices in the future, as both end customers and your distribution chain have got used to your lower margins giving them a lower price.

If, like ATI, you've been in the lead and are selling chips as fast as you can make them, there is no incentive to cut prices until your competition is hurting you.
 
K.I.L.E.R said:
You wouldn't believe how much sense you make.

Bouncing Zabaglione Bros. said:
If you have a tech lead and are making good margins, you don't cut prices. All that does is hurt your profits and devalue your product. It also makes it harder to raise prices in the future, as both end customers and your distribution chain have got used to your lower margins giving them a lower price.

If, like ATI, you've been in the lead and are selling chips as fast as you can make them, there is no incentive to cut prices until your competition is hurting you.


If you turn it around the other way, you can understand why Nvidia & their partners have been suffering lower margins for quite a while, and are now suddenly putting out products with very low prices for the specification.

Nvidia is having trouble competing on performance/IQ, so they are falling back on marketing (the use of their well-known brand name, cheating in benchmarks, etc.) and price cutting. Nvidia would also be devaluing their products and would find it hard to raise prices again in the future, as above.

The only reason ATI would have to cut prices is to make sure Nvidia don't get a massive price advantage by (for instance) pricing the 5950U to make it a viable alternative to the 9600XT, thereby dropping Nvidia's top-end product into the mid-range price bracket. However, this would hurt Nvidia's margins very badly - they would be losing money with every card sold. Even if Nvidia were willing to take such a hit in order to gain sales, they are in a very bad position because their more expensive top-range card can still barely match the much cheaper mid-range cards from ATI.
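
To put purely illustrative numbers on "losing money with every card sold" (real board costs aren't public, so the $250 build cost and $199 street price here are invented for the example):

\[
\text{margin per card} = P_{\text{street}} - C_{\text{build}} = \$199 - \$250 = -\$51
\]

Multiply a hypothetical per-card loss like that across every unit shipped and a price war gets very expensive very quickly.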

Nvidia is in a no-win situation with regard to price wars until they can gain performance parity with ATI. ATI has better cards that cost less to produce and make better margins. Nvidia can't go to "price war" against that without it costing them ridiculously large amounts of money.
 
Hanners said:
Quitch said:
So where in Britain can you get a cheaper Sapphire Radeon 9600XT 128MB? Where would you recommend I shop?

Overclockers.co.uk is fine as long as you never, ever have to return anything to them - I've heard so many horror stories about their returns and RMA policy that it's incredible.

I've RMA'd with Overclockers and had nothing but excellent support and a swift turnaround. It's Scan I've had problems with: they refused to pay postage, they didn't keep me updated on status changes, they didn't tell me when they had dispatched a replacement, and the turnaround time was dreadful.

I went through all that, twice, and have decided I will never buy from Scan again.
 
Bouncing Zabaglione Bros. said:
K.I.L.E.R said:
You wouldn't believe how much sense you make.

Bouncing Zabaglione Bros. said:
If you have a tech lead and are making good margins, you don't cut prices. All that does is hurt your profits and devalue your product. It also makes it harder to raise prices in the future, as both end customers and your distribution chain have got used to your lower margins giving them a lower price.

If, like ATI, you've been in the lead and are selling chips as fast as you can make them, there is no incentive to cut prices until your competition is hurting you.


If you turn it around the other way, you can understand why Nvidia & their partners have been suffering lower margins for quite a while, and are now suddenly putting out products with very low prices for the specification.

Nvidia is having trouble competing on performance/IQ, so they are falling back on marketing (the use of their well-known brand name, cheating in benchmarks, etc.) and price cutting. Nvidia would also be devaluing their products and would find it hard to raise prices again in the future, as above.

The only reason ATI would have to cut prices is to make sure Nvidia don't get a massive price advantage by (for instance) pricing the 5950U to make it a viable alternative to the 9600XT, thereby dropping Nvidia's top-end product into the mid-range price bracket. However, this would hurt Nvidia's margins very badly - they would be losing money with every card sold. Even if Nvidia were willing to take such a hit in order to gain sales, they are in a very bad position because their more expensive top-range card can still barely match the much cheaper mid-range cards from ATI.

Nvidia is in a no-win situation with regard to price wars until they can gain performance parity with ATI. ATI has better cards that cost less to produce and make better margins. Nvidia can't go to "price war" against that without it costing them ridiculously large amounts of money.
Yes, you are right - Nvidia wouldn't win a price war. One (nitpicking, I admit) thought: there is one factor ATI would have to consider - cards based on NV3x need a lot of careful coding when doing pixel shaders. The chance that developers will bother to spend their time heavily tuning shaders for Nvidia cards is (IMHO) proportional to the number of cards sold. So it is in Nvidia's (and not ATI's :) ) interest to boost sales, even if it hurts their margins.
(Not that I believe in such a tactic - just a thought :) )
 
UPO said:
Yes, you are right - Nvidia wouldn't win a price war. One (nitpicking, I admit) thought: there is one factor ATI would have to consider - cards based on NV3x need a lot of careful coding when doing pixel shaders. The chance that developers will bother to spend their time heavily tuning shaders for Nvidia cards is (IMHO) proportional to the number of cards sold. So it is in Nvidia's (and not ATI's :) ) interest to boost sales, even if it hurts their margins.
(Not that I believe in such a tactic - just a thought :) )

The first problem is that developers won't do it - historically, developers just like to write their code and have it work properly. If they have to spend huge amounts of time getting code (in this case shaders) to run properly on Nvidia products, they will simply scrap those shaders for Nvidia hardware and run a DX8 path with reduced visuals. The more hoops you make a developer jump through for your hardware, the more likely they are to simply say "screw it" and refuse to play with your hardware. This is why Nvidia is writing shaders for developers.
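
For what it's worth, that fallback decision is usually just a bit of startup logic in the engine. A minimal C++ sketch (the PCI vendor IDs are real, but the ShaderPath names and the NV3x rule are my own illustration, not any shipping engine's code):

```cpp
#include <cstdint>
#include <cstdio>

// Illustrative names - engines label their render paths however they like.
enum class ShaderPath { DX9_Full, DX8_Fallback };

struct GpuInfo {
    std::uint16_t vendorId;        // PCI vendor ID: 0x10DE = Nvidia, 0x1002 = ATI
    int           psMajorVersion;  // pixel shader version reported by the driver
};

// Hypothetical selection rule: hardware with no PS2.0 gets the DX8 path,
// and NV3x-class PS2.0 hardware is demoted to DX8 as well, rather than
// receiving hand-tuned PS2.0 shaders.
ShaderPath choosePath(const GpuInfo& gpu) {
    if (gpu.psMajorVersion < 2)
        return ShaderPath::DX8_Fallback;   // pre-DX9 hardware
    if (gpu.vendorId == 0x10DE && gpu.psMajorVersion == 2)
        return ShaderPath::DX8_Fallback;   // NV3x: PS2.0 present but too slow
    return ShaderPath::DX9_Full;           // R3x0 and other capable DX9 parts
}

int main() {
    const GpuInfo nv35 = {0x10DE, 2};
    const GpuInfo r350 = {0x1002, 2};
    std::printf("NV35 -> %s\n", choosePath(nv35) == ShaderPath::DX9_Full ? "DX9" : "DX8");
    std::printf("R350 -> %s\n", choosePath(r350) == ShaderPath::DX9_Full ? "DX9" : "DX8");
    return 0;
}
```

A few lines of capability checking versus weeks of per-vendor shader tuning is exactly the trade-off that makes "screw it, run DX8" so attractive.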

This is especially true with DX9 and the situation where Nvidia is no longer in the lead. Why develop for Nvidia hardware which is "only meant for developers" when you can code for ATI hardware that is ready and able to do primetime DX9 from the get-go?
Why should a developer support special paths for NV3x hardware that will be superseded by the time their game ships?

Secondly, Nvidia would only need to gain this kind of special-path coding support from developers if NV4x were going to have the same problems and issues as NV3x. If Nvidia still need developers to write in extra-special and fiddly support for shaders on NV4x, they are in major trouble - NV4x will fail in the marketplace. If Nvidia increase their shader performance to more acceptable levels, I expect them to drop all this "special support" for NV3x. Anyone who owns one of these cards will find they are stuck with extremely poor shader support, not even hacked down to FP16 as is happening today.

I doubt Nvidia will pay the big bucks needed to sell NV3x at a loss in order to gain market share so they can train developers to support special paths, just as they launch next-gen hardware that shouldn't, *mustn't* (if Nvidia is not to lose even more market share & profit), have the same kinds of issues and needs for special coding.
 
Bouncing Zabaglione Bros. said:
This is especially true with DX9 and the situation where Nvidia is no longer in the lead. Why develop for Nvidia hardware which is "only meant for developers" when you can code for ATI hardware that is ready and able to do primetime DX9 from the get-go?
Why should a developer support special paths for NV3x hardware that will be superseded by the time their game ships?

I think it depends a bit on your viewpoint...

Looking solely from the programmer's POV, I quite agree, but if you look at it from the entire company's POV, I believe that changes.
The company wants to sell as many games as possible, and to do that they have to cater to the consumers and the hardware those consumers are using...
Considering that Nvidia is still outselling ATI (as we could see from the Peddie report), that means putting in some extra time creating shaders for those consumers.

Yes, they can just use a DX8 path, but that would most likely decrease sales/increase returns due to the "quality" of the graphics (I'm not talking about any of the shortcuts Nvidia is using, but rather what the game can deliver). If they have just bought a shiny new 5600 and find that they don't get graphics as good as their friends who've all bought 9600 Pros/5700s, they'll complain, and it won't be to Nvidia, since it's not Nvidia who created the game.

So from a sales POV, putting in a bit of extra time creating shaders that will run OK/well on Nvidia hardware is worth it, IMO, compared to the risk that consumers won't buy/will return the game due to the lackluster graphics/performance.

So as long as Nvidia has the larger market share of DX9 cards, companies will work a bit extra to make sure that the graphics are as good as they can be before Nvidia's drivers degrade them.
 
MrGaribaldi said:
Considering that Nvidia is still outselling ATI (as we could see from the Peddie report), that means putting in some extra time creating shaders for those consumers.

Nvidia is only outselling ATI with its low-end "DX9" parts. These are cards that are barely capable of running shaders at all. If you want your game to use DX9 features, these low-end Nvidia cards cannot be targeted as DX9 cards unless you want single-digit slideshows. You have to target these Nvidia cards as DX8 if you want performance. For the developer, this would be the best way of servicing the Nvidia target audience.

Why, as a developer, would you target your DX9 shader path at hardware that cannot run it acceptably? As a developer, you would target DX9 at more capable hardware, and get more acceptable performance on less capable hardware by using DX8. This is what Valve have done with HL2.

I still stand by the basic rule that the more work you ask a developer to do to support your "special hardware" outside of the standard API, the less likely the developer is to do that extra work. Unless there is a big payback, they are just as likely to blow you off and point at other hardware to say "it works fine there".
 