Disappointed With Industry

pelly

Newcomer
Taking a step back and removing my editor’s hat for a second (/disclaimer), I cannot help but be a bit disappointed with the current state of the graphics industry. Although we’ve had some positive events, it seems as though we’ve been facing a great deal more negatives. Somewhat surprisingly, blame can’t be placed solely on one or two vendors. Rather, each company appears to have managed to disappoint the enthusiast community to some extent, although some certainly more so than others.

Easily the most frustrating and disappointing turn of events is the lack of any DX10 games. Because DX10 is a “feature” which can only be had when using the Vista operating system, millions of WinXP users are forced to stick to DX9 games. By tying the two together, Microsoft has forced developers to largely ignore the small install-base of DX10 for now and instead focus their current energy on addressing the huge install-base of DX9 (and below) users. Beyond the market issues restricting the release of DX10 games, there is also the learning curve involved in coding for DX10. Although one could argue development for DX10 is significantly easier than coding for something like the Cell architecture of Sony’s PS3 console, it is anything but trivial. As with all new APIs, there are tricks which need to be learned and trade-offs which need to be identified and understood. The adjustment to DX10 takes time, and time is a luxury most developers don’t have. Publishers want content sooner rather than later, and development time is anything but a bargain. Keeping all this in mind, we soon realize why we don’t have dozens of DX10 titles in retail.

Moving our attention towards hardware, we encounter more disappointment. Looking at the last decade or so, the graphics industry has had a few revolutionary moments which have dramatically changed the face of gaming. From the original GeForce DDR to ATI’s Radeon 9700, certain products have been released which offered a huge leap forward in terms of performance, image quality, and functionality. With DX10 on the immediate horizon, many expected the current crop of flagship GPUs to be the next iteration of “groundbreaking” products. Unfortunately, that hasn’t quite happened on several fronts.

With regard to GPUs, our main disappointment currently resides with AMD’s new Radeon HD 2900XT. At the end of the day, we’re looking at a “flagship” GPU which is significantly late and doesn’t appear to be a “flagship” class GPU after all. Nearly everyone here can recall the hype surrounding R600 over the last few quarters. Not since NV30 and the amusing hype movies (esp. the one with the “hidden” nod to Uttar) did a GPU drive so much discussion. In fact, many (including myself) expected R600 to be as ground-breaking as R300. From the moment the world first heard wind of R600, it was touted to be a DX10 monster with insane bandwidth and unparalleled performance. As time progressed we heard additional rumors of a massive heatsink assembly, unprecedented power consumption, and an enormous PCB. ATI/AMD chimed in now and then to keep momentum going for the hype machine surrounding R600. However, looking at their tone over the last two quarters one can clearly see how dramatically things changed. Whereas comments were initially made regarding the “performance crown” for the enthusiast market, comments then became much more focused upon the entire family of R600-based cards and on overall value.

Looking at the end result in terms of the Radeon HD 2900XT, we can identify which rumors came true and which did not. Unfortunately, the rumors of exceptional performance fell terribly short, and those regarding power consumption and noise came true. Luckily, the rumor regarding a surprisingly low MSRP did come true, although not for the reasons one would prefer. Looking at an architecture consisting of over 700M transistors and multiple years of costly development, one would imagine the flagship R600 product would command a healthy price premium. Considering the respins at the fab the GPU went through and how many millions those cost, it was almost inconceivable to imagine an MSRP below $499. However, we sit here today with AMD’s “flagship” level card carrying a $399 MSRP.

Although I’d love to think AMD chose the $399 MSRP to bring a dose of reality to the insane prices flagship GPUs are commanding these days and make the cards attainable to a larger group of people, I’m confident that is not the case. Rather, AMD realized that the final product had been massaged as much as possible and the end result fell drastically short of both internal and public expectations for performance. Can you imagine the pushback AMD would feel if they tried selling the Radeon HD 2900XT for $499 or more? Ultimately, AMD realized that their “flagship” card would be matched performance-wise by NVIDIA’s $399 MSRP GeForce 8800 GTS, so they were forced to sell the card for the same price. Given the financial burdens AMD is feeling currently and the huge development cost associated with bringing R600 to market, I am confident that the company did not choose $399 out of the kindness of their hearts.

Perhaps the most frustrating aspect of R600 is the fact that we are dealing with a card that still feels rushed and unfinished. Despite having additional time due to multiple and significant delays, we find ourselves facing major driver bugs and issues. Unless respins of the GPU changed the architecture, one would imagine the vast majority of driver work would have been completed ages ago. It certainly seems reasonable to expect polished drivers, with nothing more than a few tweaks here and there still necessary, after such a long development period.

Fortunately, the Radeon HD 2900XT appears to be a diamond in the rough, so to speak. On top of an interesting new architecture and some cool new functionality, we can see a great deal of potential. ATI fans and enthusiasts in general will certainly be eagerly anticipating the almost certain refresh of R600 on a smaller process. Armed with cooler temperatures and higher frequencies, it seems as though that GPU will be able to meet the performance levels many expected from R600. Whether AMD is able to keep the MSRP for its flagship GPU at $399 remains to be seen. For the sake of the industry, that is certainly a trend I would love to see gain traction.

On the other side of the fence you have NVIDIA. Without question, NVIDIA caught the world by surprise with the release of the G80 architecture. Despite the crafty musings of David Kirk seemingly indicating otherwise, G80 turned out to be a unified-shader architecture. In addition to being the only DX10 GPU on the market for roughly six months, the G80 GPU provided exceptional performance and increased image quality. Having learned their lesson with NV30, NVIDIA opted for a more refined process and managed to produce a strong new GPU with excellent yields. My only gripe with G80 would initially have been the lack of the enhanced PureVideo features found on the GeForce 8600 series, since the card was NVIDIA’s flagship and should have all the latest bells and whistles. However, that soon changed once NVIDIA released their “new” flagship card with the launch of the GeForce 8800 Ultra. Looking at a market flooded with factory overclocked GeForce 8800 GTX cards, the enthusiast community expected some lofty increases in both core and memory frequencies. Unfortunately, the new product launched with minimal increases at best, although it commanded an almost obscene price premium. Looking at Newegg and other retailers, it was not uncommon to see factory overclocked GeForce 8800 GTX cards with frequencies higher than the new Ultra cards while costing hundreds of dollars less! Although I understand NVIDIA’s release of the card as a tool for market positioning and to ensure its grip on the performance crown stayed tight, I was sorely disappointed with the GeForce 8800 Ultra. In hindsight, it would have been great to see the company offer the new GeForce 8800 Ultra at the GTX model’s initial MSRP and lower prices across the board on the other GeForce 8800-class models.
In addition to allowing consumers to get even more performance for their money, this would have placed AMD in a truly difficult situation with regards to the “flagship” Radeon HD 2900XT, as it would be going up against the GeForce 8800 GTX in benchmarks. Instead, NVIDIA is seemingly much more concerned with garnering as much revenue as possible, so such a situation is nothing more than a dream for the enthusiast community.

After reading this, I’m sure some will complain of prejudice or bias and write this rant off as the ramblings of an NVIDIA fanboy. Some will point out that I was once an Editor for nV News, which happens to be an NVIDIA fansite. However, I hope many will also realize that I’ve since been an Editor for HardOCP, AMDMB, PC Perspective, and Hot Hardware, as well as a Product Manager at Alienware. I’ve written countless positive and negative reviews for each company over the years and would hardly consider myself a “fanboy” of either vendor. At the end of the day, I am an enthusiast who loves gaming and wants to get the most performance my budget allows. Whether that comes from AMD or NVIDIA is a non-issue, as I simply want the best hardware I can get my hands on. In the end, I think Kyle had it right when we were talking about the industry late one night at a Quakecon in years past. He said it just comes down to “who offers the best overall gaming experience”. Give me a card which offers the highest framerate with the best image quality and most functionality, and you can put whatever brand’s sticker on it you want.

Taking a glimpse at the horizon, we have some very promising advancements for the graphics industry. A crop of solid DX10 titles is slated to be released, including Crysis and World in Conflict, as well as a wealth of UE3-based games. With this in mind, GPU vendors are working hard to polish their new architectures and prepare to release a refresh or a new design altogether. With new hardware on its way from Intel and AMD in the form of new processors, chipsets, and platforms, we should be seeing some healthy increases in performance throughout the entire system as well. Overall, the last quarter or two have been a preview of sorts for the real excitement brewing in the second half of the year. Let’s cross our fingers and hope that the industry sees better execution across the board and that we see some strong competition again on all fronts.
 
Excellent read pelly - I for one agree on many points, particularly on pricing. I think the Ultra is a clear indication that R600's performance caught Nvidia by surprise. They were clearly positioning it against something which never materialized, and looking at R600, could not have materialized in this time frame.
 
I don't agree with the sentiment at all: despite this already being a mature industry, it is still able to increase performance over the previous generation by a large factor, and there's little evidence that it's about to slow down. How much more exciting is that than the world of CPUs, where not only is the increase slower, but where the effort to make effective use of it is ten times harder on the software programmers.

It has been the case for a decade now that the development of a game takes multiple years. Realistically, decent development of DX10 effects can only have really started about 6 months ago. If you're disappointed about the lack of DX10 games at this point, it's a case of unrealistic expectations. And never mind that the vast majority still has a DX9 rig. I think it's completely normal to see new stuff only trickle in.

And wrt price: compare an 8800 GTX / 8800 GTS 640 / 2900XT to a 7950GX2 / 7900GTX / X1950XTX, some of them released less than a year ago, and observe how much more performance you get for the money now versus then.

Taken together, answer this question: what has to happen for you not to be disappointed, and how do you think commercial enterprises have to behave to get there?

Because I really don't realistically see how we can have it much better than the way it is now.
 
If you're disappointed about the lack of DX10 games at this point, it's a case of unrealistic expectations. And never mind that the vast majority still has a DX9 rig.

That sentence completely supports my point with regards to DX10 and the attachment to Vista. The vast majority of the market is running WinXP (of some flavor) and using DX9-class hardware. If Microsoft made DX10 available on WinXP, the addressable market for a DX10 game would increase exponentially. Since DX10 is still strictly Vista and relatively small in terms of install-base, publishers and developers have a tough time justifying the time/effort/money to get a DX10 game out now...They've obviously started DX10 development, but the majority of their efforts will go towards WinXP users (ie: non-DX10).
 
The first game to make major use of DirectX 9 was Tomb Raider: Angel of Darkness. It was released on June 20, 2003, says Wikipedia (I could check GameFAQs, but that's the timeframe I remember for the game anyway). Now, R300 was released September 02, NV30 was November 02/January 03, NV35 was already out, and there wasn't the whole Vista thing to keep in mind.

This isn't slower than any other cycle, despite the Vista launch and all that.
 
Hey Tim....

I understand your analogy...but let's remember that DX9 games could always revert to DX8.1 or lower...

With DX10 games, you almost need two different engines since things are handled so differently...You either go full-bore DX10 and then have to somehow support WinXP users...or you focus on the DX9+ market and then add a DX10 option for a bonus...although with less attention and development time so the eye candy might suffer...


The "catch" with all this is that DX10 is only available to Vista...We have DX10 hardware, but no games to play....We can run DX9 titles....but by the time any volume of DX10 games are out, the "refresh" of the DX10-capable GPU's we have will be available...
 
The "catch" with all this is that DX10 is only available to Vista...We have DX10 hardware, but no games to play....We can run DX9 titles....but by the time any volume of DX10 games are out, the "refresh" of the DX10-capable GPU's we have will be available...
And R350 and NV35 were both out by the time TR: AoD came out. I'm not seeing your point at all. You want D3D10 stuff because it's cool, okay, but time (and DX9) has shown that even if D3D10 were available on XP it would not result in increased adoption. I'm going to guess that the percentage of people with G8x or R/RV6x0 cards who are running XP is much lower than you would think if you just read forums.
 
but time (and DX9) has shown that even if D3D10 were available on XP it would not result in increased adoption.

I have to respectfully disagree here Tim...

If a publisher was looking at having to invest an additional ~$1 million to understand and develop for DX10 (an arbitrary number...but rest assured, it would be high), they would be much more willing to put in that money if the target market was 100 million users instead of 10 million...

I think it is safe to assume the install-base for non-Vista flavors of Windows is at least 2 or 3 times the install base for Vista.

As a publisher....you can churn out a DX9 (or below) title using code you're familiar with in a short amount of time and cater to a huge market.....Or you can invest a ton of time, effort, and money to develop your first DX10 title and cater to a relatively small market...with delays likely as DX10 is new...
 
That sentence completely supports my point with regards to DX10 and the attachment to Vista.
They have a DX9 HW rig. Dx10 on XP wouldn't change that one bit.

I think it is safe to assume the install-base for non-Vista flavors of Windows is at least 2 or 3 times the install base for Vista.
I think that assumption is way too conservative and it helps to make your argument: How about 10 times instead of 2 or 3? It's completely understandable that game companies target this market first and slowly transition.
 
I think the point is that right now DX10 development is already a questionable financial proposition because of the (very small) amount of hardware out there capable of running it; it is further complicated by the fact that "many of the few" who actually do have the needed hardware may not have the software (i.e. Vista) to run it.
 
LOL...Dig...you always manage to get me to laugh with your posts... ;)

And Geeforcer...thanks for properly phrasing what I've been trying to say... :D

At the end of the day, people spending $400 or more on a graphics card really should have some titles that let it flex its muscles...Not everyone has a 30" LCD, and these cards laugh at most games at resolutions of 1600x1200 or lower...

When DX10 games do come out....there's going to be a ton of people forced to "upgrade" (some might even say downgrade) to Vista just to play them properly...
 
True, but the people won't follow until the industry gives 'em a compelling reason to go there. :yep2:

I still don't use Vista personally, and just upgraded to 2GB of RAM finally. I've been living about a generation or two back and I find it more than enough for some quality gaming and a hell of a lot easier on the wallet. ;)

Being an early adopter means a lot of out of pocket expense, a lot of headaches, and waiting for stuff to use the hardware you have....what's the point? :|
 
Reading Pelly's posts, I get the sense that he's misstated, on purpose or not, his real beef re DX10.

Which is really: "Goddamn MS for making people upgrade the OS to get an API upgrade too."

Is that it Pelly? And if so, why not just say it that way rather than sort of suggest the ISV's are falling down on the job somehow? :smile:
 
Reading Pelly's posts, I get the sense that he's misstated, on purpose or not, his real beef re DX10.

Which is really: "Goddamn MS for making people upgrade the OS to get an API upgrade too."

Is that it Pelly? And if so, why not just say it that way rather than sort of suggest the ISV's are falling down on the job somehow?

Although I can't really blame you for not reading every portion of my PC hardware version of War & Peace, I'll cut you some slack and post the bits here where the flames touched Microsoft... ;)

1) "When DX10 games do come out....there's going to be a ton of people forced to "upgrade" (some might even say downgrade) to Vista just to play them properly..."

2) "Because DX10 is a “feature” which can only be had when using the Vista Operating system, millions of WinXP users are forced to stick to DX9 games. Due to their actions, Microsoft has forced developers to largely ignore the small install-base of DX10 at this time and instead focus their current energy on addressing the huge install-base of DX9 (and below) users"

With that said, however, you can't overlook publishers...and developers to some extent. Granted, it's a business and money needs to be made in order to keep the lights on...However, it would be great to see more developers and publishers embracing DX10...
 