Taking a step back and removing my editor’s hat for a second (/disclaimer), I cannot help but be a bit disappointed with the current state of the graphics industry. Although we’ve had some positive events, it seems as though we’ve been facing far more negatives. Somewhat surprisingly, the blame can’t be placed solely on one or two vendors. Rather, each company appears to have managed to disappoint the enthusiast community to some extent, although some certainly more so than others.
Easily the most frustrating and disappointing turn of events would be the lack of any DX10 games. Because DX10 is a “feature” which can only be had when using the Vista operating system, millions of WinXP users are forced to stick to DX9 games. By tying DX10 to Vista, Microsoft has effectively forced developers to largely ignore the small DX10 install-base for now and instead focus their energy on addressing the huge install-base of DX9 (and below) users. Beyond the market issues restricting the release of DX10 games, there is also the learning curve involved in coding for DX10. Although one could argue that development for DX10 is significantly easier than coding for something like the Cell architecture of Sony’s PS3 console, it is anything but trivial. As with all new APIs, there are tricks which need to be learned and trade-offs which need to be identified and understood. The adjustment to DX10 takes time, and time is a luxury most developers don’t have. Publishers want content sooner rather than later, and development time is anything but a bargain. Keeping all this in mind, we soon realize why we don’t have dozens of DX10 titles at retail.
Moving our attention towards hardware, we encounter more disappointment. Looking at the last decade or so, the graphics industry has had a few revolutionary moments which dramatically changed the face of gaming. From the original GeForce DDR to ATI’s Radeon 9700, certain products have been released which offered a huge leap forward in terms of performance, image quality, and functionality. With DX10 on the immediate horizon, many expected the current crop of flagship GPUs to be the next iteration of “groundbreaking” products. Unfortunately, that hasn’t quite happened on several fronts.
With regards to GPUs, our main disappointment currently resides with AMD’s new Radeon HD 2900XT. At the end of the day, we’re looking at a “flagship” GPU which is significantly late and doesn’t appear to be a “flagship” class GPU after all. Nearly everyone here can recall the hype surrounding R600 over the last few quarters. Not since NV30 and the amusing hype movies (especially the one with the “hidden” nod to Uttar) has a GPU driven so much discussion. In fact, many (myself included) expected R600 to be as groundbreaking as R300. From the moment the world first caught wind of R600, it was touted as a DX10 monster with insane bandwidth and unparalleled performance. As time progressed, we heard additional rumors of a massive heatsink assembly, unprecedented power consumption, and an enormous PCB. ATI/AMD chimed in now and then to keep the hype machine surrounding R600 going. However, looking at their tone over the last two quarters, one can clearly see how dramatically things changed. Whereas comments were initially made about the “performance crown” for the enthusiast market, they later became much more focused on the entire family of R600-based cards and on overall value.
Looking at the end result in the form of the Radeon HD 2900XT, we can identify which rumors came true and which did not. Unfortunately, the rumors of exceptional performance fell terribly short, while those regarding power consumption and noise came true. Luckily, the rumor of a surprisingly low MSRP also came true, although not for the reasons one would prefer. Looking at an architecture consisting of over 700M transistors and multiple years of costly development, one would imagine the flagship R600 product would command a healthy price premium. Considering the respins the GPU went through at the fab and the millions of dollars those cost, it was almost inconceivable to imagine an MSRP below $499. However, we sit here today with AMD’s “flagship” level card carrying a $399 MSRP.
Although I’d love to think AMD chose the $399 MSRP to bring a dose of reality to the insane prices flagship GPUs command these days and to make the cards attainable to a larger group of people, I’m confident that is not the case. Rather, AMD realized that the final product had been massaged as much as possible and the end result still fell drastically short of both internal and public performance expectations. Can you imagine the pushback AMD would have felt if they tried selling the Radeon HD 2900XT for $499 or more? Ultimately, AMD realized that their “flagship” card would be matched performance-wise by NVIDIA’s $399 MSRP GeForce 8800 GTS, so they were forced to sell the card at the same price. Given the financial burdens AMD is currently facing and the huge development cost of bringing R600 to market, I am confident the company did not choose $399 out of the kindness of their hearts.
Perhaps the most frustrating aspect of R600 is the fact that we are dealing with a card that still feels rushed and unfinished. Despite having additional time due to multiple, significant delays, we find ourselves facing major driver bugs and issues. Unless respins of the GPU changed the architecture, one would imagine the vast majority of driver work would have been completed ages ago. Certainly it seems reasonable to expect polished drivers, with nothing more than a few tweaks here and there still necessary, after such a long development period.
Fortunately, the Radeon HD 2900XT appears to be a diamond in the rough, so to speak. On top of an interesting new architecture and some cool new functionality, we can see a great deal of potential. ATI fans and enthusiasts alike will certainly be waiting in eager anticipation for the almost certain refresh of R600 on a smaller process. Armed with cooler temperatures and higher frequencies, that GPU may well reach the performance levels many expected of R600. Whether AMD is able to keep the MSRP for their flagship-level GPU at $399 remains to be seen. For the sake of the industry, that is certainly a trend I would love to see gain traction.
On the other side of the fence you have NVIDIA. Without question, NVIDIA caught the world by surprise with the release of the G80 architecture. Despite the crafty musings of David Kirk seemingly indicating otherwise, G80 turned out to be a unified-shader architecture. In addition to being the only DX10 GPU on the market for roughly six months, G80 provided exceptional performance and improved image quality. Having learned their lesson with NV30, NVIDIA opted for a more mature process and managed to produce a strong new GPU with excellent yields. My only gripe with G80 would initially have been the lack of the enhanced PureVideo features found on the GeForce 8600 series, since the card was NVIDIA’s flagship and should have had all the latest bells and whistles. However, that soon changed once NVIDIA released their “new” flagship card with the launch of the GeForce 8800 Ultra. Looking at a market flooded with factory-overclocked GeForce 8800 GTX cards, the enthusiast community expected some lofty increases in both core and memory frequencies. Unfortunately, the new product launched with minimal increases at best while commanding an almost obscene price premium. Looking at Newegg and other retailers, it was not uncommon to see factory-overclocked GeForce 8800 GTX cards with frequencies higher than the new Ultra cards at hundreds of dollars less! Although I understand NVIDIA released the card as a tool for market positioning and to keep a tight grip on the performance crown, I was sorely disappointed with the GeForce 8800 Ultra. In hindsight, it would have been great to see the company offer the new GeForce 8800 Ultra at the GTX model’s initial MSRP and lower prices across the board on the other GeForce 8800-class models.
In addition to allowing consumers to get even more performance for their money, this would have placed AMD in a truly difficult situation with regards to the “flagship” Radeon HD 2900XT, as it would be going up against the GeForce 8800 GTX in benchmarks. Instead, NVIDIA is seemingly much more concerned with garnering as much revenue as possible, so such a situation is nothing more than a dream for the enthusiast community.
After reading this, I’m sure some will complain of prejudice or bias and write this rant off as the ramblings of an NVIDIA fanboy. Some will point out that I was once an Editor for nV News, which happens to be an NVIDIA fansite. However, I hope many will also realize that I’ve since been an Editor for HardOCP, AMDMB, PC Perspective, and Hot Hardware, as well as a Product Manager at Alienware. I’ve written countless positive and negative reviews for each company over the years and would hardly consider myself a “fanboy” of either vendor. At the end of the day, I am an enthusiast who loves gaming and wants to get the most performance my budget allows. Whether that comes from AMD or NVIDIA is a non-issue, as I simply want the best hardware I can get my hands on. In the end, I think Kyle had it right when we were talking about the industry late one night at a QuakeCon years past. He said it just comes down to “who offers the best overall gaming experience”. Give me a card which offers the highest framerate with the best image quality and the most functionality, and you can put whatever brand’s sticker on it you want.
Taking a glimpse at the horizon, we have some very promising advancements ahead for the graphics industry. A crop of solid DX10 titles is slated for release, including Crysis and World in Conflict, as well as a wealth of UE3-based games. With this in mind, GPU vendors are working hard to polish their new architectures and prepare to release a refresh or a new design altogether. With new hardware on its way from Intel and AMD in the form of new processors, chipsets, and platforms, we should see some healthy increases in performance throughout the entire system as well. Overall, the last quarter or two have been a preview of sorts for the real excitement brewing in the second half of the year. Let’s cross our fingers and hope the industry sees better execution across the board and strong competition again on all fronts.