Your thoughts: Patches for console games bring more than what is expected from the PC

Let's try this: I would like to hear from our knowledgeable members what they think about a somewhat recent trend, namely the really "PC-like" patches for console games.

You might be puzzled by what I'm trying to convey when I talk about "PC-like" patches, so let me put it into perspective this way: I'm talking about adding options that could potentially reduce the game's overall performance. That might not be a terrible issue on the PC, given that you could always, as a developer/publisher, rely on the "get a better PC" line. Something that is quite hard to imagine on a console, it being a closed platform and all...

Moreover, you might want to know why this question comes up all of a sudden. Well, I just stumbled upon the release notes of the upcoming patch for Saints Row, posted in another thread, and one particular point attracted my interest:

  • Added player options to enable v-sync during cutscenes and/or gameplay, with warning that it will affect framerate.

I thought that I would just discuss it briefly on IRC, but no one around seemed interested enough to actually discuss it (joke about it, they could do, though). So, what are your thoughtful and quite detailed, as always, of course, opinions on this particular subject?

Note that this isn't a thread about patches on consoles in general. It's precisely about adding graphical (or other) options that come at the price of performance. In this case it was a patch, but it could just as well have shipped with the original game.

Simply put, in this case, developers are telling you, on a closed box, to choose between image quality and performance... In other words, think carefully about the potential ramifications this type of situation may cause in the future before reaching for the apologist shelf and picking a nice, convenient item to apply as a prebaked answer to the question asked in this thread.
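To put some rough numbers on that trade-off, here is a small back-of-the-envelope sketch (my own illustration, not anything from the Saints Row patch notes): with plain double-buffered v-sync on a 60 Hz display, a frame that misses the ~16.7 ms budget has to wait for the next refresh, so the effective rate snaps down to 30 fps, whereas presenting immediately keeps the rate higher at the cost of tearing.

```c
#include <math.h>
#include <stdio.h>

/* Effective frame rate for a constant per-frame render time.
 * With v-sync, the frame time is rounded up to a whole number of
 * refresh intervals before the buffer flip; without it, the frame
 * is presented as soon as it is ready (and may tear). */
static double effective_fps(double frame_ms, int vsync, double refresh_hz)
{
    if (!vsync)
        return 1000.0 / frame_ms;
    double refresh_ms = 1000.0 / refresh_hz;
    double intervals = ceil(frame_ms / refresh_ms);
    return refresh_hz / intervals;
}

int main(void)
{
    printf("17 ms frame, v-sync on:  %.1f fps\n", effective_fps(17.0, 1, 60.0)); /* 30.0 */
    printf("17 ms frame, v-sync off: %.1f fps\n", effective_fps(17.0, 0, 60.0)); /* ~58.8 */
    return 0;
}
```

That cliff from roughly 59 fps down to 30 fps is exactly the kind of cost the patch note is warning about.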

Once again, everybody's take on the subject, as long as it's backed up with reasonable points, is welcome on this thread.
 
I'm all in favour of options. Being able to choose the settings on PC that create the balance between performance and quality is fair enough. However, that's not what I want from consoles! I'd take your argument a step backwards, as the trend in patches seems to coincide with a trend in games that are very PC-like in implementation. We wouldn't and shouldn't need performance-adjusting patches if the game was properly targeted to run on the hardware at a standard 30/60 fps with VSync enabled. Any game that finds itself disabling VSync or dropping frames should be redesigned to not do either of those things!

If games are going to continue to be released with dodgy framerates and tearing, I'd like the option to adjust settings to get a balance I preferred. But, I wish that weren't the case. I want every game to have that original console experience - you put in the game, turn on the machine, it's up in not much time, plays without any settings needing to be adjusted, and provides a rock-solid framerate with no graphical glitches.
 
Some things I like to think about: what if the game had come with that option originally, what would the reaction be, leaving the nature of patches out of it?

And we all know there are games whose frame rate has suffered at times. Would you have liked to turn off the v-sync, or should they have got it right in the first place? I know GTA:SA had frame rate problems in the burning building level on the Xbox. Not sure if this was the same on the PS2, but I never saw it come up in reviews for either.

Basically we are given a choice between two bad options, choosing your poison, because it's still going to get messed up either way. I know that despite v-sync or frame rate problems I've enjoyed games. It's just that we expect more. We expect a product that isn't technically flawed for the price we pay, and I don't really think that is an unreasonable expectation. Only better options can give us the choice we need: the option that isn't a compromise between one problem and another. So where's the game that should be bought instead of Saints Row?
 
On a closed platform where the user cannot do anything to improve the performance, there should be no such compromises. It's solely the developer's fault for not working within the limits presented to them, in terms of hardware, dev tools and their own skillset.
 
On a closed platform where the user cannot do anything to improve the performance, there should be no such compromises.

I think it's important to remember that these platforms are not as closed as they have been historically.

Up until this gen, the primary output choices have been PAL or NTSC (does SECAM count?). In the modern home, with the possibility of PAL, NTSC, 720p, 1080i and 1080p, is it reasonable that developers should set all the rendering options in stone, or should they allow consumers to choose which compromises suit them best?

I suspect the market will answer this question better than any of us can but my main concern would relate to whether allowing people to make these selections could distort score comparisons or give certain players an advantage in online modes.
 
Unfortunately, as the percentage of users that go online and the number of online-only titles grow, the more PC-like consoles will become in terms of patching. It's only a matter of time until we see a console version of the BF2 situation.
 
I think the most immediate change we'll see if this trend catches on is that every publisher from now on will have a huge list of "stunning visual effects" or "unparalleled quality" their games have (or will have, they promise). They just won't mention that the games can't really do all of the things on the list at the same time.

It'll also probably screw with QA for any games that come out. What would have been unacceptable degradation of game performance or reliability at the initial release will probably get by if it's just tacked on later.

I doubt QA budgets are going to be that much bigger to cover all this.
 
The more options the better, the customer is king, one size doesn't fit all + other clichés.
Perhaps the only limitations should be those that would help gameplay in a networked situation,
e.g. turning off shadows (but the option should still be there in single player).
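A loose sketch of what that restriction could look like (the struct and rule below are purely hypothetical, not from any actual engine): purely local settings such as v-sync stay in the player's hands everywhere, while settings that change what other players can see get clamped once a networked match starts.

```c
#include <stdbool.h>
#include <stdio.h>

struct render_options {
    bool shadows;   /* affects what players can see, hence fairness online */
    bool vsync;     /* purely local presentation choice */
};

/* Hypothetical rule: honour the player's choices in single player,
 * but force visibility-affecting options on in networked games. */
static struct render_options apply_network_rules(struct render_options user,
                                                 bool networked)
{
    struct render_options out = user;
    if (networked)
        out.shadows = true;
    return out;
}

int main(void)
{
    struct render_options prefs = { .shadows = false, .vsync = true };
    struct render_options sp = apply_network_rules(prefs, false);
    struct render_options mp = apply_network_rules(prefs, true);
    printf("shadows: single player %d, multiplayer %d\n", sp.shadows, mp.shadows);
    return 0;
}
```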
 
I'm with the more options the better crowd, with a bit of a giggle added as a caveat because the consoles are beginning to mimic the PCs more and more. ;)
 
As a computer bigot since the time of the 8-bit Atari, I say I'll switch to console gaming when the experience becomes the same as playing on the PC. :p

Patches aren't always a bad thing... at least when the game's from a good company. Many patches added new features, levels, and gameplay. Half-Life, Neverwinter Nights, the Unreal series, and many others all had major additions that improved those games.

What concerns me is the ease of selling online through Live and the Sony and Nintendo stores may result in companies charging for the extra features.

If that happens, I'm going to blame every idiot that bought horse armor.
 
I'm with Shifty. If you can't keep a steady framerate with Vsync enabled, you shouldn't even publish. However, I would be in favor of allowing you to turn other graphical features on and off in exchange for frames, except for one thing. You know that if you allowed a sliding graphical detail feature, all the released screens would be at ultra-super-max settings, which would run at about 10 fps on the console, while getting even a stable 30 fps would require running at a significantly lower detail level. It's bad enough with the super high-res, ultra-AA/AF, camera-angles-that-don't-appear-in-game dev shots. This would make it a thousand times worse.
 
I think it's an important differentiating feature of consoles that you can "just play" a game, and that they shouldn't lose that feature. However, with that said I am all for giving the savvy user the OPTION of PC-style customization. I don't think it can hurt as long as that primary aspect of "ease of play" is still intact.
 
To answer your question as an average gamer...

What is the cost of implementing said feature? Most people would say 'yes' to new features, but they may not use them in real life.

If it's only useful for a handful of people, I'd just ignore the request and focus on stuff that is useful to everyone. In the Saints Row example, I think the value the "new option" brings sounds more like a compromise that the devs should make instead of the users. Such a compromise may be nice to gurus but brings little (or worse still, confusion) to users at large. Just my 2 cents' worth.
 
I'm not really sure on what to expect.

On one hand, it's convenient because it could salvage the game by fixing the bugs that were missed, or enhance it by adding features that players wanted or complained about.

On the other, it might promote lousy/sloppy/sub-par games, because it would end up in the "who cares if it's bad right now, we can always patch it anyway" mentality.
 
I like Console Gaming precisely because I don't have to twist and tweak my settings to get the game working just right (or at all). The Console is a closed system with fixed performance, so make a game that matches the system in the best possible way.

Ideally, patches shouldn't be necessary, but if they do happen, they should either fix a bug, or add a feature. I think where patching is most acceptable might be where games follow an episodic content structure, like Half-Life 2, MMOs, or even GT:HD Premium.

I'm not against patches per se, but I don't want to do any performance management. Which, by the way, does not mean I am against optimisations for different resolutions - if you can make the game look better in SD by adding some special support for it, that's great, but there too, the game should just adjust to my display settings without me having to think about it.
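As a loose illustration of that "just adjust to my display" point (everything here is hypothetical, not taken from any real console SDK): the title could query the output mode the console reports and derive its render settings from it, so the player never sees a settings screen, and SD displays get the spare headroom spent on something like extra anti-aliasing.

```c
#include <stdio.h>

enum display_mode { MODE_480I, MODE_480P, MODE_720P, MODE_1080I };

struct render_settings {
    int width, height;
    int msaa_samples;
};

/* Pick a preset from the detected output mode instead of exposing sliders. */
static struct render_settings settings_for(enum display_mode mode)
{
    switch (mode) {
    case MODE_720P:  return (struct render_settings){ 1280, 720, 2 };
    case MODE_1080I: return (struct render_settings){ 1920, 1080, 0 };
    default:         return (struct render_settings){ 640, 480, 4 }; /* SD: spend headroom on AA */
    }
}

int main(void)
{
    /* In a real title the mode would come from the platform's video API. */
    struct render_settings s = settings_for(MODE_720P);
    printf("%dx%d, %dx MSAA\n", s.width, s.height, s.msaa_samples);
    return 0;
}
```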
 
I like console gaming since consoles are a fast pick-up, turn-on-and-play system. I play games on my PC, but I am not a hardcore PC gamer, and personally I really hate to spend half an hour each time I get a new PC game just tweaking it until I feel it's good enough. That means activating AA, or deactivating AF to get more FPS, etc. Those things really take a while and personally I don't like them.

I still remember how much time I ended up spending tweaking FEAR, turning options that I liked ON and sacrificing other options that I did not care about by turning them OFF. It took me a good while to tweak it to a level where I was satisfied.

If the future of console gaming is having those kinds of options, like enabling AF with a notice that enabling it will reduce performance, then console gaming has no future. Why? Because consoles will be going more and more down the PC route and will reach a point where the two are the same.
 
All I can say is: I KNEW IT!

I knew that the second it was possible, companies would spend less money on bug testing/QA!

It's already too late to stop, though.

With the ability to patch, the ability to give users choices that affect performance is fine.

We're all playing PCs now. :(

(Having more than one version of a console? Upgradeable? Things like that.)

EDIT: I apologize for being OT.
 
I haven't read all the posts, but I am sure someone has said this.

There is zero reason for a console game to release with any major game-killing bugs. It is a closed hardware platform. Add-ons are different, like roster updates in sports games. Major bug fixes are unacceptable.
 