What do you expect for R650

What do you expect for HD Radeon X2950XTX

  • Faster than G80 Ultra by about 25-35% overall
    Votes: 23 (16.4%)
  • Faster than G80 Ultra by about 15-20% overall
    Votes: 18 (12.9%)
  • Faster than G80 Ultra by about 5-10% overall
    Votes: 18 (12.9%)
  • About the same as G80 Ultra
    Votes: 16 (11.4%)
  • Slower than G80 Ultra by about 5-10% overall
    Votes: 10 (7.1%)
  • Slower than G80 Ultra by about 15-25% overall
    Votes: 9 (6.4%)
  • I cannot guess right now
    Votes: 46 (32.9%)
  • Total voters: 140
The 8800, uncontested champion in performance and features for 9 months (!), has achieved a whopping 1% penetration among online gamers.

Uh, yeah, considering the entire population of online gamers didn't buy new hardware in the last 9 months, exactly how is that statistic relevant? No new product will displace a significant portion of the market. What percentage of new sales among online gamers do you think the 8800s garnered in the last few months?

ATI was king in the mobile space when mobile gaming wasn't popular. R580's performance/watt relegated it to the desktop space. You are stretching to find correlations between isolated factors while ignoring everything else that was going on. ATI's market-share loss, in spite of their performance leadership with R580, was due to horrible execution and zero marketing.

Entropy said:
The high end monster card path is, IMO, a dead end in terms of technology evolution.
Regardless of technology though, what is more important is that the market rejects them. People just don't buy, in spite of advertising and tech reporting hype. So maybe it would make sense for the manufacturing and reporting industries to start asking themselves what people actually desire and spend their efforts there. It's not likely to be GPGPU and Crossfire....

You honestly think technology would advance faster without a high-end segment? All that will do is lower the bar that new products need to exceed. I guess they should stop making larger and higher-resolution monitors as well? I get the feeling your relevant universe of consumers includes the folks who walk into a BB and grab whatever cheapo card they see on the shelf. To be honest all this just comes off as mitigation of R600's failure to dominate. I'm sure all this anti-high-end sentiment will vanish once they are competitive again.

IMO, better mid-range hardware is a side effect of having a high-end segment. When engineering time and effort is spent squeezing every last drop of performance out of a process, those gains scale downward as well. If these companies didn't have to push the envelope, the result would not be better $300 cards.
 
Entropy said:
The 8800, uncontested champion in performance and features for 9 months (!), has achieved a whopping 1% penetration among online gamers.
Uh, yeah, considering the entire population of online gamers didn't buy new hardware in the last 9 months, exactly how is that statistic relevant? No new product will displace a significant portion of the market. What percentage of new sales among online gamers do you think the 8800s garnered in the last few months?
Well, considering that in its 9 months it gathered less than 1% of the Steam survey (frequent online gamers), the adoption rate is likely to be way lower if you take the larger demographic into consideration, dominated by games such as SIMS/SIMS2/WoW/HP/Civ and so on. MUCH lower. Additionally, even though half a million is a large sample, five thousand 8800GTS/GTX users are not. Are they even representative of consumers? How many of these are connected to the industry in one capacity or another, checking drivers and compatibility, producing reviews for on- and off-line publications, working at game companies wanting to stay on top of developments, working in the channel, et cetera?

1% is probably a huge overestimation of the impact of the 8800 series. The actual penetration among regular game players must reasonably be very much lower.

And this is my point for these forums: it is OK to be a technology enthusiast, it is OK to be in the industry and want to be among interested people. But if you get too cosy in your small group and start to believe that the values of that group are shared by other people, you are prone to make mistakes, in thought or in action. Sticky the damn market data. People can draw whatever conclusions they want from it, but at least a rudimentary reality check would be readily available.



IMO, better mid-range hardware is a side effect of having a high-end segment. When engineering time and effort is spent squeezing every last drop of performance out of a process, those gains scale downward as well. If these companies didn't have to push the envelope, the result would not be better $300 cards.

This is really another issue - but I'll address it as well, even though the reply is more IMHO. When I compare the capabilities of, for instance, my old Radeon 8500 card with today's new DX10 offerings, I see almost the same pixel-pushing ability per Watt. The benefits of three generations of lithographic development went largely into pushing the feature set of the graphics ASICs. Performance advances have been bought with the coin of increased power draw and the increased parallelism allowed by lithographic advances. Who has pushed for the feature set advances? Well, some new technologies have undeniably benefited large parts of the market. But no consumers ever stood on the barricades demanding IEEE-compliant rounding behavior, Crossfire, GPGPU control flow or a number of other largely industry-internal features. They cost transistors, and therefore power and money, and they cost engineering resources. Nvidia and ATI have had competitive reasons to push these features: against each other, to raise the barrier of entry into the market for any other interested party, and to try to open up alternative justifications for their products, asked for or not.

The consumers probably just wanted higher performance, as cheaply as possible, at low noise levels and low power draw.

A huge gap between these two positions has opened up over the last five years or so. Nvidia and ATI had a duopoly, so they could get away with pushing the market in the direction they desired - consumers had no real alternative other than integrated graphics, so they financed this development, even though they may not have been terribly interested in the benefits compared to just getting more pixels pushed more cheaply. But it is interesting to speculate on where we would have been if the industry had spent its resources on enhancing solutions for portables rather than desktops. It would not present less competition or fewer challenges, only different ones from those that led us where we are today - with a widely publicised high end that next to nobody is interested in actually buying, and mid-end cards with lackluster performance for their cost that are unable to show much benefit from their technological advances. As a thought experiment: if AMD had produced and sold the R580 chip on 65nm, achieving lower power draw and cost, and had offered that to compete with the 8600GT, would it serve people's needs in today's market?

Technology advances along the paths we choose. In this particular case I'd like those choices to be more consumer-oriented. After all, these are largely gameplay accessories we are talking about.


PS. Oh, and $300 cards aren't mid-level by any stretch of the imagination. You make my Ivory Tower point perfectly. Good products in the $80-$180 bracket or so should be the design target.
 
Come again? An $80-$150 price point for good performance? In this day and age? Only if Santa decides to be nice. I too loved the days when $299 was the price point for enthusiast hardware in the GPU market, but those days are... umm... gone.

Anyway, this is a losing battle; you choose to see what you want to see (as do all of us humans - objectivity does not exist for us, no matter how much one trumpets his own :)). You want to see ATi as doing something really great, and you want to discard the rules that have applied to the GPU market since... well, its inception. That's OK. But in the grand scheme of things, I have serious doubts that this market will ever behave in the way you envisage.
 
Come again? An $80-$150 price point for good performance? In this day and age? Only if Santa decides to be nice. I too loved the days when $299 was the price point for enthusiast hardware in the GPU market, but those days are... umm... gone.

Well, it depends on how you define the enthusiasts, doesn't it? Look at the damn data yourself! The percentages are there. The single-card price point of $250 and up commanded 2.1% of the market for add-in boards. (Note that this study looks at the category that is likely to pay the most for its graphics solution in isolation. Including preinstalled and, obviously, all portable graphics would paint a much lower-level picture.) Categorizing $300 graphics as "mid-level" is completely disconnected from both market data and actual user data.

http://www.xbitlabs.com/news/video/display/20070413235044.html
 
As a thought experiment: if AMD had produced and sold the R580 chip on 65nm, achieving lower power draw and cost, and had offered that to compete with the 8600GT, would it serve people's needs in today's market?


Absolutely not. It doesn't have the DX10 checkbox. And people really just want more power. They don't care about the other stuff.

Not having DX10 is just another euphemism for dated and underpowered, and the card would be considered a joke. I can see the INQ lambastings at the expense of this card already.

Anyway, since R650 was confirmed fake, exactly what is the point of this thread now?
 
Well, it depends on how you define the enthusiasts, doesn't it? Look at the damn data yourself! The percentages are there. The single-card price point of $250 and up commanded 2.1% of the market for add-in boards. (Note that this study looks at the category that is likely to pay the most for its graphics solution in isolation. Including preinstalled and, obviously, all portable graphics would paint a much lower-level picture.) Categorizing $300 graphics as "mid-level" is completely disconnected from both market data and actual user data.

http://www.xbitlabs.com/news/video/display/20070413235044.html

And yet even the data set you're quoting places the $150-$249 cards dead in the middle. I'm quite certain it's simply a matter of how the chart is formatted, nothing more. Do you not understand that simply staring at how much something sold is a primitive way to analyze how a market containing that product behaves? Do you honestly think that those $150-$249 cards that sold in heaps sold on their own merits? If one IHV had only the $249 card as its top dog, and the other went up to the $700 mark aimed at enthusiasts who aren't enthusiasts in your conception, which one do you think would have sold more? Do you think it is easier to make more from less, or less from more (translation: is upscaling easier than downscaling when it comes to GPUs? My answer would be no, so there's another aspect of developing high-end stuff you're choosing to graciously ignore)?

Failing to understand how indirect effects occur - like variations in mind-share due to how the competitor is perceived ("they're top dogs, they have the mega-bombad best product on the market, their lower end can't suck, and I'll buy that because $700 is a ridiculous price" / "mehh, they're good, not bad, not great... hmm, I'll have to think about it") - and trying to pin everything on sheer volume while disregarding how that volume is actually achieved is a poor approach in the present case, IMHO.
 
Absolutely not. It doesn't have the DX10 checkbox. And people really just want more power. They don't care about the other stuff.

Not having DX10 is just another euphemism for dated and underpowered, and the card would be considered a joke. I can see the INQ lambastings at the expense of this card already.

What?

Remind me where G84 sits in terms of power, please. A 750MHz shrunk R580 would have:

- 100% more bandwidth
- 200% more raw PS power
- 33% less raw VS power
- 10% more texturing power
- 120% more ROP power

Looking at the charts, R580 is already way faster than G84 in most current games, with some very rare corner cases such as Tomb Raider, which seems to love raw texturing power (it already favored G71 over R580).

Add that to the 65nm process and the relatively low frequency, and you end up with a small, low-power, faster chip with no useless DX10 sticker.
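
For anyone who wants to sanity-check those deltas, here's a rough sketch. The unit counts and clocks below are my own assumptions for a hypothetical 65nm R580 at 750MHz and an 8600 GTS-class G84; the post only states the resulting percentages, and raw shader "power" is messier to compare (vec4 ALUs vs. scalar SPs), so it's left out:

Code:
# Rough throughput deltas: hypothetical 750 MHz R580 shrink vs. G84
# (8600 GTS class). Assumed specs, for illustration only.

def delta(ours, theirs):
    """Percentage by which `ours` exceeds `theirs` (negative = deficit)."""
    return (ours / theirs - 1.0) * 100.0

CORE_R580, CORE_G84 = 750, 675  # MHz core clocks (the R580 one is hypothetical)

bw_r580 = 256 / 8 * 2.0                             # 256-bit bus @ 1 GHz GDDR -> 64 GB/s
bw_g84  = 128 / 8 * 2.0                             # 128-bit bus @ 1 GHz GDDR -> 32 GB/s
tex_r580, tex_g84 = 16 * CORE_R580, 16 * CORE_G84   # 16 TMUs each
rop_r580, rop_g84 = 16 * CORE_R580, 8 * CORE_G84    # 16 ROPs vs. 8 ROPs

print(f"bandwidth: {delta(bw_r580, bw_g84):+.0f}%")    # +100%
print(f"texturing: {delta(tex_r580, tex_g84):+.0f}%")  # +11%
print(f"ROPs:      {delta(rop_r580, rop_g84):+.0f}%")  # +122%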
 
PSU-failure said:
Add that to the 65nm process and the relatively low frequency, and you end up with a small, low-power, faster chip with no useless DX10 sticker.
R580 die size = 352 mm^2 at 90nm
G84 die size = 169 mm^2 at 80nm

Assuming perfect scaling to 65nm, die area shrinks with the square of the linear feature size:
90nm -> 65nm: (65/90)^2 = 52.2%
80nm -> 65nm: (65/80)^2 = 66.0%

Thus:
R580 at 65nm = 352 mm^2 * 0.522 = 184 mm^2
G84 at 65nm = 169 mm^2 * 0.660 = 111.5 mm^2

R580 would be 65% larger, and thus would cost ~2x as much, assuming perfect scaling.

Since scaling isn't perfect, R580 would end up even larger relative to G84, all at 65nm of course.
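
The same arithmetic as a minimal sketch; the Poisson yield model in the last comment is a standard first-order approximation, my addition rather than something from the post:

Code:
# Ideal die-area scaling goes with the square of the linear feature size.

def area_scale(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

r580_65 = 352 * area_scale(90, 65)          # 352 * 0.522 = ~183.7 mm^2
g84_65  = 169 * area_scale(80, 65)          # 169 * 0.660 = ~111.6 mm^2
print(f"ratio: {r580_65 / g84_65:.2f}x")    # ~1.65x larger

# Cost grows faster than area: fewer candidate dies per wafer *and* lower
# yield per die (first-order Poisson model: yield = exp(-defect_density * area)),
# which is how ~65% more area can plausibly mean ~2x the cost.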
 
Well, considering that in its 9 months it gathered less than 1% of the Steam survey (frequent online gamers), the adoption rate is likely to be way lower if you take the larger demographic into consideration, dominated by games such as SIMS/SIMS2/WoW/HP/Civ and so on. MUCH lower.

Steam is not a representative sampling of new sales in the last 9 months. Apparently statistics are very easy to recite but difficult to understand.

And this is my point for these forums: it is OK to be a technology enthusiast, it is OK to be in the industry and want to be among interested people. But if you get too cosy in your small group and start to believe that the values of that group are shared by other people, you are prone to make mistakes, in thought or in action.

Don't know why you keep talking about this as if we care about the "values of other people". How many Sims-playing housewives do you encounter on these boards? It is in this community that these companies develop their brands, and what's interesting to us is the only thing that's relevant in this context, IMO.

Performance advances have been bought with the coin of increased power draw and the increased parallelism allowed by lithographic advances. Who has pushed for the feature set advances? The consumers probably just wanted higher performance, as cheaply as possible, at low noise levels and low power draw.

I won't even touch that 8500 statement, since I have to believe you're kidding. I'm not sure how you can separate graphics features from graphics performance, as they are intimately related and interdependent. How exactly do you measure progress, if not by feature set and performance? What is it that you think people want from newer graphics hardware - low noise levels, low power draw and last year's performance and features? Nope.

PS. Oh, and $300 cards aren't mid-level by any stretch of the imagination. You make my Ivory Tower point perfectly. Good products in the $80-$180 bracket or so should be the design target.

So we don't have good products in the $80-$180 bracket now? How exactly would you go about bringing more powerful products to that price range? I'm sure the IHVs will be all ears :)
 
Categorizing $300 graphics as "mid-level" is completely disconnected from both market data and actual user data.

I'm really failing to see your point here. Your "market" and "users" are obviously irrelevant to the vast majority of discussions on this board and in the community in general. Why should they be included in any approach we take to classifying hardware performance? I don't even consider sub-$100 cards, as they literally can't do shit in 3D. And that's the vast majority of the market that you seem obsessed with - a market that's pretty much irrelevant to us.
 
Well, it depends on how you define the enthusiasts, doesn't it? Look at the damn data yourself! The percentages are there. The single-card price point of $250 and up commanded 2.1% of the market for add-in boards. (Note that this study looks at the category that is likely to pay the most for its graphics solution in isolation. Including preinstalled and, obviously, all portable graphics would paint a much lower-level picture.) Categorizing $300 graphics as "mid-level" is completely disconnected from both market data and actual user data.

http://www.xbitlabs.com/news/video/display/20070413235044.html

Interesting chart, but I believe they have it wrong. They say the enthusiast market's average cost is $420, yet run it down to $250? My view on this: $400 and up would be enthusiast, $200-$399 performance, $100-$199 mainstream, and sub-$100 entry level. Not sure if others would agree with that, but that is my take on how that chart should appear.
 
fudzilla.com said:
Computex 07: there is no 55/65nm R6xx high-end part


Our friends at VR-Zone mentioned R680, but our sources close to ATI confirmed that there is no such card. ATI made some presentations mentioning false chips, including R650, R670 and now even R680, just to catch whoever is leaking the info.

There should be some kind of refresh, though.
 
There is an R650, I'm quite sure; it's been mentioned for ages and ages already, pre-R600 launch too.
 
Interesting chart, but I believe they have it wrong. They say the enthusiast market's average cost is $420, yet run it down to $250? My view on this: $400 and up would be enthusiast, $200-$399 performance, $100-$199 mainstream, and sub-$100 entry level. Not sure if others would agree with that, but that is my take on how that chart should appear.
The chart clearly states 'ASP' (Average Selling Price), so there is no way to determine what ranges they are thinking of as far as I can tell. It is correct, however, that it is fundamentally impossible for a market category's upper range to be equal to its ASP, unless there is only one product sold in that entire category. Errr... no. I would expect 'performance' to be $199-$349 or something along those lines.

Furthermore, it should be noted that this chart is AIB-only. The $199-and-below unit sales are obviously MUCH higher than that, once you take OEMs into consideration.
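
To illustrate the ASP point with a toy sales mix (all prices and volumes here are invented purely for the arithmetic):

Code:
# ASP is a unit-weighted average, so it sits strictly between the cheapest and
# most expensive SKU in a category - it can only equal the category's ceiling
# if everything sells at that single price. Hypothetical data below.
sales = [(250, 40_000), (400, 25_000), (600, 10_000), (830, 5_000)]  # (price $, units)

units = sum(u for _, u in sales)
asp = sum(price * u for price, u in sales) / units
print(f"ASP: ${asp:.0f}")   # ~$377 here; a $420 ASP implies pricier SKUs in the mix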
 
The chart clearly states 'ASP' (Average Selling Price), so there is no way to determine what ranges they are thinking of as far as I can tell. It is correct, however, that it is fundamentally impossible for a market category's upper range to be equal to its ASP, unless there is only one product sold in that entire category. Errr... no. I would expect 'performance' to be $199-$349 or something along those lines.

Furthermore, it should be noted that this chart is AIB-only. The $199-and-below unit sales are obviously MUCH higher than that, once you take OEMs into consideration.

My point really was that, IMHO, $250 was way too low a price point for an enthusiast card.
$250 would clearly be in the performance segment by both our standards (which we differ on by 50 bucks). I chose my price points as I feel $400, right now, is the cutoff between enthusiast and performance. The XT, GTX and Ultra all fall above this point. The GTS (both) fall below it, and I would call them performance cards, not high-end enthusiast cards.
 
Don't forget, the reason nV got to where they are is because they are market leaders. Without leadership products, ATi didn't make much headway pre-9700, did they? I don't remember them having more than 30% market share with products that were second in performance.

Why would low-end graphics integrated into CPUs hurt nV? Come on, there is no way those can take on discrete midrange and high-end cards. If it were that easy, the IGPs we have today would have been able to achieve what discrete cards are doing.

The original Radeon and the 8500 weren't bad products at all compared to Nvidia's offerings, yet I don't think either broke out of single-digit market share. The low-end refreshes of both did decently though, especially the 9000-9200 chips based on the 8500, which benefited from the halo effect of the 9700 Pro.
ATI targeted the high end for quite a while before they finally broke in; remember the Rage Fury Pro MAXX Turbo?

Call me a cynic, but I'm not incredibly surprised that current IGPs are awful.

NV2A and the nForce IGPs were pretty good, though it's been way downhill from there. Not that horrible, though; the Nintendo Wii's GPU has even worse performance per mm^2, and it was designed for that purpose exclusively.
 
The HD 2900 actually reminds me of 8500.

8500's story went like this, IMO: tons of potential compared to GeForce 3. Much more advanced shader capabilities, dual vertex shaders, hyped fancy AA, hardware tessellation, and seemingly the best hidden-surface/overdraw/bandwidth management. The drivers let it down for a long time. The "Smoothvision" AA was a real disappointment in performance because it was supersampling, and it was awfully slow at it. The special features were a mixed bag in performance and with regard to how much they actually got used. And then it got pretty much squashed by GeForce 4.

Gosh that sounds awfully familiar. I might have to jump back on that bandwagon! It's kinda nostalgia-inducing. ;)
 
The HD 2900 actually reminds me of 8500.

8500's story went like this, IMO: tons of potential compared to GeForce 3. Much more advanced shader capabilities, dual vertex shaders, hyped fancy AA, hardware tessellation, and seemingly the best hidden-surface/overdraw/bandwidth management. The drivers let it down for a long time. The "Smoothvision" AA was a real disappointment in performance because it was supersampling, and it was awfully slow at it. The special features were a mixed bag in performance and with regard to how much they actually got used. And then it got pretty much squashed by GeForce 4.

Gosh that sounds awfully familiar. I might have to jump back on that bandwagon! It's kinda nostalgia-inducing. ;)
It's very similar, apart from the fact that, from what we know so far, R600 doesn't really seem to be that much more advanced than G80 from a hardware architecture standpoint; it seems that both chips employ a lot of very advanced technology and cool ideas.
 
The original Radeon and the 8500 weren't bad products at all compared to Nvidia's offerings, yet I don't think either broke out of single-digit market share. The low-end refreshes of both did decently though, especially the 9000-9200 chips based on the 8500, which benefited from the halo effect of the 9700 Pro.
ATI targeted the high end for quite a while before they finally broke in; remember the Rage Fury Pro MAXX Turbo?

True, but targeting is one thing, being number 1 is another :smile: As you said, the 9700 really helped ATi break out of the mold they were in and become top dog for a few years.

The HD 2900 actually reminds me of 8500.

On paper the 8500 was better than any of the GF3s, even the Ti 500, but as you said the drivers held it back. Unlike the 8500, the 2900 XT has deficiencies: on paper it's not as great as the competition in some categories, so you can't really expect the 2900 to step up like the 8500 did on drivers alone.
 
On paper the 8500 was better than any of the GF3s, even the Ti 500, but as you said the drivers held it back. Unlike the 8500, the 2900 XT has deficiencies: on paper it's not as great as the competition in some categories, so you can't really expect the 2900 to step up like the 8500 did on drivers alone.

There were deficiencies in the 8500, too: it was AFAIK a 2x3 (pipes x TMUs per pipe) setup while GF3 was 4x1, and in pure fill rate GF3 was faster, while in multitexturing things were different.
And anyway, R600 also has strong points. So it really depends on where the software goes.
 