Disappointed With Industry

Gripe with the 'industry'

After being an avid PC gamer for almost two decades, I finally lost so much faith in the PC gaming industry that, instead of becoming an early DX10 adopter (as I had with almost every other card release), I went out and purchased my first console, the Xbox 360. After owning the system for 6 months now, I have come to many conclusions about why the state of the PC leaves me in despair.

You have to sit back for a moment and ask yourself: what is the primary goal of the industry? We're all here to play and enjoy games. It seems we have almost lost sight of that as our egos took over and the quest for the biggest and baddest 3D accelerator became almost more important than the games it was designed to run.

I believe the primary sources of my discontent are: product cycles, drivers/APIs, and bloat.

The PC industry seems to have gotten everything horribly backwards. Games, and in particular their sequels, have 'cycles' of 2 to 4 years. Even the god-awful antics of EA only have them releasing games on a minimum one-year cycle. Yet the hardware used to run those games is being churned out every 6-12 months. It's lunacy. Now, I'm a technology enthusiast. I want man to proceed forward at the fastest rate possible. I'm upset that we don't have Moon bases, let alone Mars bases, but come on, slow down and get it together. It seems that hardware manufacturers have taken it upon themselves to improve how our games run by increasing the raw power they have to use. This in turn means that game developers don't get time to optimize their code for a generation of hardware, because over the development cycle of a game there have been four to eight graphics card refreshes. Not only that, but it's terribly expensive for the consumer! It's a horrible cycle (hah ;/ ) that is destined to fail. This brings me on to my next point, drivers.

Currently we seem to be stuck in a mindset that says if the game doesn't run well on a specific hardware solution, but other games do, then the drivers just haven't been optimized. It doesn't matter; the big billion-dollar company will get round to sorting out the problem and the game will run well. Eventually the drivers will be optimized for all the games. Or will they? More importantly, should they? I don't have a lot of knowledge on this subject, but in my opinion, drivers should be optimized for APIs, and that's it. The API has a set design and has to work in a uniform way so as to preserve the duality which we enjoy (or used to) in the hardware industry. I know that shaders add complexity to this subject because they're so programmable, but you don't see CPU manufacturers having the same problem. Their 'API' is x86 and although their hardware works in fundamentally different ways, you almost never hear of a program being optimized for different hardware (unless it's an SSEx optimization, but they're not too common), let alone the hardware manufacturer optimizing their chips for certain programs individually. You might argue that there's nothing wrong with that; the end result is still the same. We have our games and they run well on the hardware.

The problem is that this only holds true for the games hardware sites choose to use in their benchmark suites. The less popular titles get ignored and the indie developers (with new and fresh ideas) struggle. We've seen the problems this creates over the years: drivers are 'optimized' for the benchmark games and false results are released. In my opinion, this completely defeats the point of analysing benchmarks unless you are only going to run the specific games that have been tested and nothing else. You have no way of knowing if your $500 purchase will run flawlessly in all the games you own, whereas a processor benchmark is a fairly clear indication of the performance improvement over a previous generation.

I think the DX10/Vista debate has been done to death already, so I won’t go into my views on that terrible decision.

Now we get down to our good old friend, bloat. More specifically, lack of optimization. This is where the 3-5 year product cycle of the console shines through. Rather than just letting the hardware manufacturer improve the performance of your games, the software developer now has time to get to know the hardware and the API inside and out. They know the hardware won't change in the next 6 months, they don't need to plan ahead, and they can just get down to the task at hand and churn out games that run flawlessly and beautifully. The two games that I will use as examples are, in essence, console ports. This does create a problem as even less optimization has been done for the PC, but as I've said, that's not entirely the fault of the game developer. The first game is Rainbow Six: Vegas. From all accounts this game runs horribly, even on the big daddy, the 8800 GTX. The second game is Colin McRae: DiRT and the story is the same: it runs poorly on anything less than the 8800 GTX. Both of these titles run flawlessly on the Xbox 360 and look absolutely amazing. Considering the Xbox costs $100 less than the 8800 GTX alone, I start to battle for words to describe the insanity.

If only hardware developers would slow down. They could still keep releasing newer cards, but why not just release cards on refined processes? Re-release (does a release occur after a lease?) the X1950XT on 65nm, with smaller cooling and better power consumption, to keep the revenue coming in. I'm sure there are many people who would find that tempting, including OEMs. Let the card live out in the wild for 3 years, then release a huge generational jump: a completely new and refined architecture that will blow everyone's minds. I'm sure game companies would love that, and in turn we'd see better games, maybe some where gameplay was the focus, not just pushing more polys and textures. I don't see another solution to this problem, and currently the industry is on a slippery slope. It's possible that if one of these companies fails, that might solve the problem. The company with the monopoly would slow down and some sense would be regained, but you never want a monopoly; it's just too scary.

If the Formula 1 of gaming (the PC) fails, where does that leave the GT (the consoles) technology-wise? All I know is that today my 7950GX2 has been used to process text and HTML, I haven't purchased a PC game in over 6 months while I have purchased exactly 10 Xbox games, and I spent most of the day playing the DiRT demo on my Xbox, not my PC.

Hope my Sunday afternoon rant wasn’t too boring.
 
I don’t have a lot of knowledge on this subject, but in my opinion, drivers should be optimized for APIs, and that’s it.
Great APIs are flexible and orthogonal. They can and will be used in ways that were not initially foreseen by the API architects. It's up to the driver writers to convert those orthogonal features into something that maps to the hardware. There are usually different ways to do this, and since it's often not known up front how application developers are going to use a feature, driver writers will come up with something that works for what they think is the common case.

Then there's scarcity of resources and Amdahl's law: if a particular feature can be made super efficient by spending 2 more months on it, but it only accounts for 2% of overall performance, maybe it's better to first optimize code paths that actually matter... until a program comes along that uses the API differently, so that this 2% feature becomes 20%. What's happening now for DX10 is exactly what was to be expected: since there were no games, driver writers didn't have a clear idea which features would be used more than others. It's only normal that it will take time to see usage patterns emerge and optimize the drivers accordingly.
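(To put rough numbers on that trade-off: Amdahl's law says the overall speedup from accelerating one feature is 1 / ((1 - p) + p/s), where p is the fraction of runtime that feature accounts for and s is how much faster you make it. The 2% and 20% figures below are just the ones from the paragraph above, not measurements from any real driver. At p = 0.02, even an infinite s only buys you 1/0.98, about a 2% overall gain; at p = 0.20 the ceiling is 1/0.80 = 1.25, a 25% gain. Same two months of engineering, roughly ten times the payoff, depending entirely on which games actually ship.)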

And then there's the simple matter of performance bugs that are not caught during testing: an O(n^2) code path that doesn't get exposed because the test data set isn't large enough, freakish compiler behavior, etc.
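A hypothetical illustration of that kind of bug (plain C, names invented for the example, not taken from any actual driver): a resource tracker that scans a flat list on every insert is perfectly fine on the small scenes a test suite uses, and a quadratic disaster on a real game.

Code:
#include <stddef.h>

#define MAX_TRACKED 65536

/* Hypothetical per-frame resource tracker. */
static const void *tracked[MAX_TRACKED];
static size_t tracked_count;

void track_resource(const void *res)
{
    /* Linear scan to avoid duplicates: each call is O(n), so binding n
     * resources per frame costs O(n^2) in total.  With the ~100 resources
     * of a QA test scene this is invisible; with tens of thousands in a
     * shipping game it becomes the bottleneck nobody tested for. */
    for (size_t i = 0; i < tracked_count; ++i)
        if (tracked[i] == res)
            return;
    if (tracked_count < MAX_TRACKED)
        tracked[tracked_count++] = res;
}

The fix (a hash table, or sorting once per frame) is usually trivial; spotting it before a real workload hits it is the hard part.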

Their ‘API’ is x86 and although their hardware works in fundamentally different ways, you almost never hear of a program being optimized for different hardware (unless it’s an SSEx optimization, but they’re not too common), let alone the hardware manufacturer optimizing their chips for certain programs individually.

Your example of a CPU instruction set is broken: individual instructions are very small building blocks with little or no side effects. And even there, optimization trade-offs have been made: you only need to compare a REP MOVSD copy loop with a hand-coded copy loop on an i386 and on a Conroe and you'll see right away how relative performance has shifted, even with a very stable API. And, yes, many, many moons ago, I would write two different ASM versions just to ensure maximum performance on different CPUs.
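For readers who haven't written one of these, here's a minimal sketch of the two copy loops being compared (GCC-style x86 inline assembly; the names and framing are mine, not silent_guy's actual code). Both are correct against the same stable instruction set, yet which one is faster has flipped back and forth across CPU generations, which is exactly the point.

Code:
#include <stddef.h>
#include <stdint.h>

/* String-instruction copy: a single REP MOVSD moves 'dwords' 32-bit words.
 * The microcoded setup cost vs. steady-state throughput varies wildly
 * between an i386, a P4 and a Conroe. */
static void copy_rep_movsd(void *dst, const void *src, size_t dwords)
{
    __asm__ volatile("rep movsd"
                     : "+D"(dst), "+S"(src), "+c"(dwords)
                     :
                     : "memory");
}

/* Hand-coded copy loop: plain loads and stores the compiler is free to
 * schedule and unroll.  On some cores this beats REP MOVSD, on others it
 * loses, even though the ISA never changed. */
static void copy_loop(uint32_t *dst, const uint32_t *src, size_t dwords)
{
    for (size_t i = 0; i < dwords; ++i)
        dst[i] = src[i];
}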

They can still keep releasing newer cards, but why not just release the cards on refined processes.
Re-releasing any chip on a smaller process is a major undertaking: most of your combinational logic can stay the same, so you'll save time there, but almost all the rest has to be redone. Analog cells must be redesigned and qualified. Synthesis, DFT, place & route done from scratch. Power circuits tuned. And then the whole process of full product qualification.

When you don't have enough manpower to re-release AND work on your next generation product, and when the competition is trying hard to beat you, I think the choice is obvious.
 
... Eventually the drivers will be optimized for all the games. Or will they? More importantly, should they? I don't have a lot of knowledge on this subject, but in my opinion, drivers should be optimized for APIs, and that's it. The API has a set design and has to work in a uniform way so as to preserve the duality which we enjoy (or used to) in the hardware industry. I know that shaders add complexity to this subject because they're so programmable, but you don't see CPU manufacturers having the same problem. Their 'API' is x86 and although their hardware works in fundamentally different ways, you almost never hear of a program being optimized for different hardware (unless it's an SSEx optimization, but they're not too common), let alone the hardware manufacturer optimizing their chips for certain programs individually. You might argue that there's nothing wrong with that; the end result is still the same. We have our games and they run well on the hardware.

Overall, I agree with a good amount of what you said in the rest of your post. However, the one aspect to consider with regard to the portion I've quoted above is that a game engine is incredibly complex and can certainly drift miles apart from the "standard" API approach once you factor in the various programming styles and approaches of developers. Consider Sweeney or Carmack here...and understand that these "cutting edge" developers are typically creating engines that use dramatic new approaches. Smaller developers will generally abide by the standard reference approaches outlined in the API's SDK. However, certain developers will definitely try new approaches and take different steps in order to either achieve better performance or add some new enhancement (be it IQ or gameplay). In the past, hardware vendors and their driver teams would have to support the generic API and then "tweak" the driver for the particular approach used in a game once it had already hit retail. Fortunately, graphics card vendors are now working much more closely with developers to get a head start on supporting new features and approaches. Ideally, the driver which supported the new approaches would be ready at the same time as the game and would be flawless. Unfortunately, complexity and time are enemies in this case, and it usually takes a fair amount of time to tweak and polish until the game is as fast and as visually impressive as possible given that specific IHV's hardware.

Now, if every developer produced games using a cookie-cutter approach to the API....we'd certainly have polished drivers...but at the cost of innovation and differentiation between games.

In terms of the whole PC vs console topic, the console guys are fortunate enough not to have to worry about driver optimizations and such. However, they are faced with the evil issue of "ports" and having to deal with titles that are crude ports between consoles versus those that are coded to fully take advantage of a given console's hardware (i.e. the PS3).

In the end, neither side is perfect....so I end up with a PC, a PS3, an empty wallet, and some complaints... ;)
 
I agree that my x86/API comparison was an oversimplification, but I still believe that you shouldn't have to rely on drivers being optimized for every single game individually, as that creates problems like the ones I stated above. There must be a better way.

@pelly: If you were worried about the console port problem, you should have got an Xbox360 instead (fanboy alert). It seems to be the primary development platform for most companies and doesn't suffer from ports as much as the other systems, if at all.

@silent_guy: As far as I know, the people required to shrink a chip to a more refined process are quite different from the team developing the next ASIC, and they aren't going to be any busier releasing refined cards since they'd be working on new cards less. Also, drivers would become more mature and wouldn't need to be written almost from scratch every time, so the team working on that wouldn't need to be as big. I agree that this probably won't change due to the competition problem, but that doesn't mean it shouldn't. It's depressing.
 
I agree that my x86/API comparison was an oversimplification, but I still believe that you shouldn't have to rely on drivers being optimized for every single game individually, as that creates problems like the ones I stated above. There must be a better way.
You're assuming here that driver optimizations made for CoJ, CoD, and LP won't help newer games. With DX10 being so new, I doubt that this is the case: there must be plenty of performance opportunities available that don't require game-specific tweaks.

@silent_guy: As far as I know, the people required to shrink a chip to a more refined process are quite different from the team developing the next ASIC...
Yes, they may be different individuals. ;)
The team will be quite a bit smaller, because RTL and front-end verification can be skipped, but the other tasks are the same. It's not a walk in the park, and it ties up a significant number of specialists who are better used doing something profitable and future-proof.
 
You're assuming here that driver optimizations made for CoJ, CoD, and LP won't help newer games. With DX10 being so new, I doubt that this is the case: there must be plenty of performance opportunities available that don't require game-specific tweaks.

Sure, I agree with that, but it raises another issue: the TWIMTBP and Get in the Game programs. While I have yet to see ATI go anywhere with theirs, nVidia's program has found its way into most games out there, specifically the new DX10 games. While that's good for nVidia card owners, it does kind of mean that ATI hardware will never run those games as fast as the hardware would normally allow. IMO, this means that ATI owners, and to a lesser degree nVidia owners, have a mixed bag of games that run well on their brand new hardware. While the drivers eventually make up for this over time, it really can be annoying. What makes it even worse is that the game runs brilliantly on a console that is over a year old, with all the bells and whistles, purely because the developers have one system to work with.

I guess what I'm trying to get at is that I wish there was a way of unifying the API or the instruction set (so to speak) in such a way that it limited the radical performance difference seen between a game that has driver optimization and a game that doesn't, but without limiting creativity. Basically, some kind of standardization allowing the PC model to get closer to the console model in terms of development, because I would much rather be playing games on my PC than on the Xbox.

I'm not sure how you can side with the way it's being done now: the consumer is stuck with the $500 bill for the new hardware, then has to deal with driver issues that may span the short life of the product, only to go through the same thing all over again in 12 months, just so they can play a game an Xbox owner has no issues with. I believe that more people are going to become disheartened if this continues. We are already seeing declines in GPU sales, and if that trend increases then the whole industry is going to suffer, not just the PC. Obviously I don't expect any of my solutions (if they can be called that) to be adopted; I was just floating ideas out there as I feel something has to be done.

I'll just leave you with one extra thing to consider. The Xbox360 has a 203W power supply...
 
I'll just leave you with one extra thing to consider. The Xbox360 has a 203W power supply...

Make no mistake, the casual gamer or someone on a tight budget will certainly fare better with consoles, as they have a life cycle of years instead of months. As time progresses, developers gain familiarity with the platform and learn how to squeeze every ounce of performance from the hardware. As such, impressive games will continue to appear on Xbox 360 and PS3 for years to come.

However...

The cutting-edge enthusiast will only be satisfied with the "latest and greatest" on the PC side. Granted, they will likely own a console or two (I own a PS3). However, nothing will allow a person to keep pace with the rate of innovation like the PC can. Simply put...consoles are a static platform, which creates limitations for development. The PC is a dynamic platform which (for a cost...sometimes severe) can keep pace with the innovation on the software side. As new engines are created, new APIs made, and new titles developed...there can be hardware which will fully take advantage of it and give the user the best gaming experience possible. Again, this typically comes at a huge price premium compared to consoles...However, the ability to alter FSAA and AF settings...the option of running incredibly high resolutions...etc...make the PC the ultimate enthusiast platform...since these are all options which cannot be found on a console...

So if the Xbox 360 is able to provide a good enough gaming experience for you and you're happy saving money with the smaller PSU not jacking up your electricity bill...congrats and happy fragging for years to come! However, you're going to have a tough time getting folks here to trade in their uber-PC systems in favor of a console...If there was ever a community made up of the enthusiasts I mentioned above...this is it! :D

As long as I can afford it...I'll choose to play PS3 AND PC games...and do my best to keep up with the product-cycles...lol...
 
Well, I chose the Wii. I haven't been playing PC games for at least a year now, just because most of them simply suck. And I agree with PsychoZA: it's insanity buying new HW every few months just to have it sit there and heat up the room while driving your electricity bills sky-high. Let alone the fact that you still wouldn't really need anything better than a good old GF7 or Radeon X1 series for the current crop of games.

Pelly: I'm an enthusiast and would surely like to have the latest and greatest if I had any reason to. But the latest and greatest is all but useless right now, let alone that there are no games out there right now which would make me want it.
 
Pelly: I'm an enthusiast and would surely like to have the latest and greatest if I had any reason to. But the latest and greatest is all but useless right now, let alone that there are no games out there right now which would make me want it.

That's exactly what I'm saying, and it's not because we're currently going through a bad time in the development cycle; it's been going this way for years now. Piracy on the PC is probably another big factor scaring developers away, but that's a whole other issue.

The sad thing is, as soon as Crysis comes out and I realise it won't perform as I would like on my GX2, I'll be online searching out the best deals to blow $2000 on hardware to run it. I never learn.

My comment on the PSU had nothing to do with the electricity bill or heat; I was trying to make a point about the efficiency of the system and how inefficient PCs have become.
 
Well, I chose the Wii. I haven't been playing PC games for at least a year now, just because most of them simply suck. And I agree with PsychoZA: it's insanity buying new HW every few months just to have it sit there and heat up the room while driving your electricity bills sky-high. Let alone the fact that you still wouldn't really need anything better than a good old GF7 or Radeon X1 series for the current crop of games.

And how much is an X1 or GF7 part now? Not too much. That is what I like about PC gaming: I just live in the past for the most part, but I get games for $3.99 from the 48-hour madness sales, etc...
 
What the PC platform really needs IMO is more standard middleware, more standard formats for art outsourcing (I'm not just thinking Russia & China, but also merchant websites, which currently are the very opposite of standardization) and, very importantly, a completely new business model.

Today, MMORPGs are more lucrative than pretty much every other game genre combined. That should tell you something. Note that I'm talking about that in terms of operating profits (thus, including development and running costs), not just revenue or whatever. The margins are just, let us say, delicious.

The business model for most PC games right now is just awful. Heck, it's not really worse than the console model, but that one works because of sheer volume. As it is, non-MMO PC games rarely have very attractive financial returns AFAIK...

If the costs could be reduced through better outsourcing strategies (including more middleware sharing) and smarter art pipelines, and the gross margins could be increased through schemes such as digital distribution (STEAM is a rip-off, though), then the market has a chance to make a comeback. Sadly, I'm not seeing many signs of that.

As for the current state of PC gaming, I've got to agree with most of your points. One exception though: where could I get games such as the new Sam & Max episodes, except on a PC? :) Okay, I know Psychonauts was ported to consoles, but given how little success it had even there, I doubt that's going to happen again.
 
If we talk about art outsourcing, the formats are the smallest of all problems. The art must fulfill the art style and performance criteria of a game. This cannot simply be ensured with standard formats.

Middleware is another problem. Packages like SpeedTree tend to have a huge impact on the overall game design, as games that use such a package tend to look very similar.
 
If we talk about art outsourcing, the formats are the smallest of all problems.
Yes and no. I was told once that because most artists will do things in slightly different and non-standard ways, you might have to waste 1-3 hours per art asset you buy off a merchant website. This drastically reduces the economic viability of doing this. It is, of course, far from the only problem.

The art must fulfill the art style and performance criteria of a game. This cannot simply be ensured with standard formats.
I wouldn't be so worried about the performance characteristics if there were more choice on the market (which would happen if there were more buyers...), but the art style is indeed a major concern. If a game supports more exotic features than just the traditional ones, those would also have to be manually added to the art asset by the buyer.

But the art style is indeed the biggest problem. Once again, a larger choice would help. Or, if it becomes a larger business, you might have more people focusing on that professionally, thus sticking to the same kind of art style.

Do understand that I'm not thinking of this for large-budget productions. Doing everything from scratch, in-house, and based on your engine's specifications is always going to give the best results if you have good artists. However, for small-budget projects (which is where the PC truly has a chance to shine and innovate, purely imo) various forms of outsourcing may be highly appealing, if some of the problems mentioned above were somehow resolved or diminished.

Middleware is another problem. Packages like SpeedTree tend to have a huge impact on the overall game design, as games that use such a package tend to look very similar.
I agree completely. An artist once told me that he didn't want to use SpeedTree for a project and would rather stick to an art style without as much vegetation, not because SpeedTree was bad... but because, without significant effort, it is much too generic. It is possible to customize trees in very cool ways and differentiate yourself from other titles, but that takes a lot of effort.

In the case of some other middleware solutions, there isn't even any differentiation possible. This is indeed a major problem, but it doesn't apply to all forms of middleware.
 
Not only from an art standpoint, but the entire engine as well. Every FPS game developer seems to be reinventing the wheel a lot of the time when they could all just use one engine and custom-tailor it to their needs. I have a weird feeling that the Unreal Engine could be that engine; the only problem is that the small developers with the new ideas probably can't afford it. Within each genre there must be one way of doing something that is simply the best. The best method of doing shadows/lighting/collision detection/physics, so why should each developer try to come up with the solution independently? It seems counterproductive (I also wonder the same thing about other aspects of life, e.g. car engines: there must be one way that is the best for a certain engine capacity). I seem to remember Carmack saying something like that a short while after Doom 3 came out. I guess that is a bit of a commie way of thinking about things though...
 
Slight tangent...

I too am getting my gaming and eye candy from an Xbox 360 for financial reasons. And I'm even doing some fun projects on it with XNA (love the GPU/controllers, hate the C#, math perf and garbage collection).

I oohed at the Rainbow 6 Vegas graphics, and chuckled at the framerates on PC. But, I can't handle playing FPS against humans w/o a mouse for aiming - it's just frustrating not being able to aim fast or spin 180. And not for lack of trying - I've logged plenty of hours in co-op. Serious RTS people feel the same way.

So, getting to the point - I wonder if the motivation for not supporting a mouse on the 360 is trying to stem the tide of gamers migrating off Windows? Or if it's really about "fairness" on the 360? PC users are used to everyone having different HW (advantages). And there was that weird PC vs console experiment in Shadowrun - but I think that was to push Vista/Games for Windows.
 
I think different companies "re-invent the wheel" WRT FPS games because no one person is either "the greatest" at or "focused" on every aspect of a 3D engine.

Each engine has its focuses and its trade-offs. The Doom 3 engine, for example, focused on shadows and lighting (?) and made trade-offs in other areas. The Unreal Engine went for large, expansive levels and thus made trade-offs in other areas.

I'd imagine making a "perfect engine" would take close to an infinite amount of time, as technology doesn't stay stagnant. As technology progresses, new avenues open up for implementing features that were impossible in the past. And yet you will still have to make trade-offs with what you want to accomplish given the resources you have available.

In regards to the Xbox 360 not encouraging the use of a keyboard and mouse: this is still a holdover from the original Xbox. Microsoft was very keen to make as few links between the console and PCs as possible, to avoid people thinking of the Xbox as just a PC in a different case.

That is, they wanted the image of a console, and anything that might outwardly remind someone of a PC was anathema. For example, the controller inputs (on the original Xbox) were basically USB ports; however, they were designed not to look like USB ports so that people wouldn't associate the Xbox with a PC.

P2P file sharing in conjunction with software piracy is what will kill the PC gaming industry. Piracy is far too mainstream now to stem the tide, from what I can see. And thus I believe PC gaming will eventually follow in the footsteps of another great home computer that went down due to piracy: the Amiga.

Regards,
SB
 