How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of HD5xxx-series?

  • GT300 Performance Preview Articles

    Votes: 29 19.7%
  • New card based on the previous architecture

    Votes: 18 12.2%
  • New and Faster Drivers

    Votes: 6 4.1%
  • Something PhysX related

    Votes: 11 7.5%
  • Powerpoint slides

    Votes: 61 41.5%
  • They'll just sit back and watch

    Votes: 12 8.2%
  • Other (please specify)

    Votes: 10 6.8%

  • Total voters
    147
Status
Not open for further replies.
You really think they're done paying off bumpgate? :|

Maybe, maybe not, but either way it has no bearing on their long-term profitability. They have lots of cash, so bumpgate hurts the wallet, but that's about it. What they should be worried about is their horrendous execution as of late, because that's turning into a trend now.
 
From the talk so far, you guys are already calling it GAME OVER for NV. ATI hasn't even released one card yet, and the prices are going to be through the roof. We have no specs, no nada. Only a wow demo at a bazillion-by-bazillion resolution.

We can't say ATI wins and NV loses (I am an ATI fanboy, though :)). Yes, ATI has the best card, but NV has its fanboys, its PhysX, its CUDA, and all the games it has paid off so far. If ATI can land a blow on NV now, then we might actually begin to see some even games that aren't paid for. What ATI needs to do is not go straight for the $5,000,000 card now, but hurt NV's pot o' gold while it can. As it is turning out, ATI's cards are equal to or better than NV's, and all NV has is its frills. If ATI can knock off the frills, then ATI will pull ahead not only now, but in the future.

Also, this is business, and no one goes down without a fight. Look at ATI/AMD. D@mn. Those two have been underdogs for a while now and they are still fighting. NV is a BIG company; one 2-4 month stint is not going to wipe them out. By then, however, ATI will have a refined system for making these cards and can probably cut the price in half, which will hurt NV again. ATI will go back to old tactics (which we all like :)) and NV will have a chunk missing from their side (which I like again :)).

POINT BEING! IT'S NOT AN "END ALL, BE ALL" SITUATION NOW! Watch it all unfold, and hope and pray ATI gets their 6xxx out by the end of July next year so I can start college off with a ripped computer, lol.

First post :) I've been following this thread and decided to jump in. HI ALL
 
Marketing B.S. aside, I'm not sure I see where all the "game over for Nvidia" stuff is coming from. AMD/ATI will release their 5xxx series before Nvidia, and within a few months Nvidia will have their equivalent DX11 part. In the past, sometimes Nvidia has led for a bit, other times ATI has. As of now, Nvidia is still outselling ATI in the desktop add-in card space, and overall it would surprise me if they didn't keep the lead in single-GPU performance once they launch their DX11 part, given that they seem to be OK with having the larger transistor budget. There was a lot of talk last generation about how having the smaller GPU would give AMD a very large and clear price/performance lead that Nvidia would not be able to compete against, but it doesn't seem to have worked out that way. Now people are saying the same about this upcoming generation before any hard facts are available (AFAIK).

Nvidia is probably not going to be able to keep as much of the market share as they've enjoyed in the past - but beyond that, what am I missing that's leading to this 'look how far they've fallen' talk?
 
Marketing B.S. aside, I'm not sure I see where all the "game over for Nvidia" stuff is coming from. AMD/ATI will release their 5xxx series before Nvidia, and within a few months Nvidia will have their equivalent DX11 part. In the past, sometimes Nvidia has led for a bit, other times ATI has.
I'm not sure who here is making such claims; a difficult situation does not equal game over.

As of now, Nvidia is still outselling ATI in the desktop add-in card space, and overall it would surprise me if they didn't keep the lead in single-GPU performance once they launch their DX11 part, given that they seem to be OK with having the larger transistor budget.
Really? I thought AMD made significant gains in the mobile space and gained a smaller amount in desktop, both at the expense of Nvidia. The outselling part is really out of the blue for me.

There was a lot of talk last generation how having the smaller GPU would allow AMD to have a very large and clear price/performance lead that Nvidia would not be able to compete against, but it doesn't seem to have worked out that way. Now people are saying the same about this upcoming generation before any hard facts are available (AFAIK).

Nvidia is probably not going to be able to keep as much of the market share as they've enjoyed in the past - but beyond that, what am I missing that's leading to this 'look how far they've fallen' talk?
And how is that? You have come to this conclusion without knowing the performance of Cypress, the die size of G300, or the performance of G300. That seems far-fetched, just like the outselling part.
 
And how is that? You have come to this conclusion without knowing the performance of Cypress, the die size of G300, or the performance of G300. That seems far-fetched, just like the outselling part.

He's talking about the GT200/RV770 generation here. And he's right: ATI didn't have a very substantial performance/price advantage, though they probably enjoyed a nice performance/cost one.
 
He's talking about the GT200/RV770 generation here. And he's right: ATI didn't have a very substantial performance/price advantage, though they probably enjoyed a nice performance/cost one.
They did. I think the 4800 series sold more than the GTX 200 series; the Steam survey comes to mind.
 
All ATI needs to do is keep hacking away at NV. They did a little with the 48xx, and now with the 5xxx out early they can do some damage here. And of course NV will hold the single-chip crown... it would be stupid to release a 395 slower than a 5870 X2, or a 385 slower than a 5890...

Also, with OpenCL coming in, that whole PhysX thing is going to start hitting a wall. Games will become much more compute-intensive thanks to optimal performance, and PhysX will be a big waste of money in the long run.
 
All ATI needs to do is keep hacking away at NV. They did a little with the 48xx, and now with the 5xxx out early they can do some damage here. And of course NV will hold the single-chip crown... it would be stupid to release a 395 slower than a 5870 X2, or a 385 slower than a 5890...

Also, with OpenCL coming in, that whole PhysX thing is going to start hitting a wall. Games will become much more compute-intensive thanks to optimal performance, and PhysX will be a big waste of money in the long run.

What is it with ATI fans and this "OpenCL is our savior" shit? ATI has yet to release a certified driver that supports OpenCL; Nvidia has at least one that we know of.
 
How difficult would it be for Nvidia to hack something like Eyefinity into their next line of cards (or even do a prototype using a current chip for demo purposes)? In the past, they've been pretty good at neutralizing new features that others launch that way. I sort of remember them getting kinda crappy FSAA into their cards when the Voodoo 5 launched.
 
What is it with ATI fans and this "OpenCL is our savior" shit? ATI has yet to release a certified driver that supports OpenCL; Nvidia has at least one that we know of.

Because, like it or not, OpenCL is pretty awesome. Everyone can use it, and it uses all CPU cores and all GPUs.

If it gets big, it will set a standard for a lot of things in games and such.

NV's PhysX/CUDA only works on their hardware, so game companies aren't totally for it, because it leaves ATI owners out of the full game. With OpenCL, NV or ATI owners can play the full game.
 
What is it with ATI fans and this "OpenCL is our savior" shit? ATI has yet to release a certified driver that supports OpenCL; Nvidia has at least one that we know of.
It's a whole "open standards" thing; some people seem to prefer it and think it stands a much better chance of success than a proprietary solution.
 
That is sad... how could Nvidia fall so far from G80? Is it not ironic that ATI's present success could not have started worse, while Nvidia raced off the line? The HD5xxx architecture is an evolution of the HD2xxx, while GT300 has its roots in G80... so how did Nvidia mess up, or was the G80 design never 'smart' enough to last the race?

I guess it shows that R600 was much more scalable compared to G80.

However, can I just add that I don't see how it's possible for ATI/AMD to keep going with this small-die approach, assuming they keep following the same trend as seen in RV670 -> RV770 -> RV870. What I mean is: RV870 is looking quite large, just a bit smaller than the original R600 (both in terms of size and power consumption). I don't know how ATI is going to fit two RV870s on a single board...
 
lol at the post up top :p AMD is a little slow/bad with drivers and stuff like that, but you're probably right. It's in some sort of beta stage up in HQ.
 
Steam includes 4830 and 4850 in those numbers, cards that went up against G92, not GT200.

It does, but if you want to compare the 48xx to the GT200 series, you can add up the GT200 card totals and it comes out to something like 13.6 vs. 7.7 or thereabouts... roughly a 7:4 ratio, last time I checked.
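For what it's worth, taking the survey shares quoted above at face value (13.6 vs. 7.7, figures as reported in the post, not independently verified), the ratio works out like this:

```python
# Steam survey shares quoted in the thread (percent of surveyed systems)
gt200_total = 13.6   # summed GT200-series share (figure from the post above)
hd4800_total = 7.7   # summed HD 48xx share (figure from the post above)

ratio = gt200_total / hd4800_total
print(f"{ratio:.2f}")  # prints 1.77, i.e. closer to 7:4 than to 3:2
```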

The G92s and their infinite rebrandings have made it hard to tally total sales since RV770's release.

gongo: Maybe the G80 design was overrated, because R600 was unable to compete with it - not because of its architecture, but because of its bugs.

You brought up a good point. Looking back, with hindsight of course, G80 was a very definite improvement over former Nvidia cards.

That said, its success was also due in no small part to the fact that R600 stank which made G80 look that much better. But looking back also at the G80 derivatives, we start seeing some issues.

For instance, with the G92 cards and their smaller memory bandwidth, we started to see some weaknesses. We all know how higher resolutions and AA kill the G92 cards - and once RV770 came out, it was even more obvious that the G92 required a lot more memory bandwidth than the RV770.

GT200 had numerous changes but kept the core concepts from G80... yet as we saw in RV770 vs. GT200, from a production-cost standpoint the chip size and transistor count certainly favor the RV770 by far: smaller and fewer transistors by a larger margin than the performance difference.

Now it remains to be seen what GT300 has in store, but it's certainly an interesting point you brought up, no-X.
 