The slow demise of the console experience [*spawn*^4]

Even then, when a game ran at 30 fps it was normally a stable 30 fps. Tearing just didn't happen on consoles. Stuttery framerates didn't happen either, at least not until the end of the PS2 era, by my uncertain recollection.

I remember tearing happening quite regularly in games like Unreal Championship, Ghost Recon, and Splinter Cell.

Even on consoles known for 60fps games (like the DC) there were plenty of games with stuttery frame rates. UT on both the DC and PS2 had frame rate issues, for example. UC on the Xbox was also known for its frame rate issues. I know I'm bringing up Unreal games a lot, but those are some of the first examples to spring to mind. :p

I'm sure if I actually tried, I could remember more games with inconsistent frame rates. Honestly, I think the slow demise of the console experience happened when we moved from 2D to 3D.

I guess it was this overreaching of the hardware that has continued, maybe because of marketing? My associations from last gen are a comparison between NWN and Morrowind on PC, and the likes of BGDA and GT3 on PS2. The console games were always consistent; the PC games were always either tearing horribly or juddering with an inconsistent framerate when V-sync'd. That kinda makes sense when the developers couldn't target a specific hardware level, so they couldn't tune the game to run consistently at a given level. It makes sense to just create the game and let the hardware do the best it can. But on console we've seen a changing attitude, where devs are trying to do more and will happily sacrifice framerate and IQ to achieve it, where before they'd have pared back some other aspect of the game like model detail or view range.
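
To illustrate the judder mechanism with V-sync: with plain double-buffered V-sync a finished frame can only be shown on a refresh boundary, so render times are effectively rounded up to whole refresh intervals, and a game hovering around 16-20 ms per frame flips between 60 fps and 30 fps presentation. A rough sketch with made-up numbers (a 60 Hz display and double buffering assumed, no triple buffering):

[code]
import math

REFRESH_MS = 1000.0 / 60.0   # one refresh is ~16.7 ms on a 60 Hz display

def displayed_interval(render_ms):
    # A finished frame waits for the next refresh, so the interval between
    # displayed frames is the render time rounded up to a whole refresh.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Illustrative render times in milliseconds (not measured from any real game).
for render_ms in (14.0, 16.0, 18.0, 20.0, 34.0):
    shown = displayed_interval(render_ms)
    print("render %5.1f ms -> shown every %5.1f ms (%2.0f fps)"
          % (render_ms, shown, 1000.0 / shown))
[/code]

A couple of milliseconds either side of the 16.7 ms budget is the difference between 60 fps and 30 fps presentation, which is exactly the kind of judder you get when a game can't be tuned to one fixed hardware level.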

I think a large part of it has to do with marketing. Look back at how high they raised the bar with all of the pre-rendered concept videos, tech demos, and talk of system power. MS and especially Sony have done what they could to burn in the idea that these systems are all-powerful machines capable of Pixar levels of CGI. I can only imagine the thoughts going through developers' heads when watching MS and Sony's E3 conferences. Just recently we saw DICE having to defend why the console versions featured lower player counts, because of the expectations the fanbase had built up.

To be fair, it was similar last gen too, AFAIR. Weren't most PS2 games (and even GC games) sub-480p?

If you don't take it as an absolute, I think it's a valid point. Relative to PC, and by a considerable margin, most console games (and this goes back to the 16-bit era and Amiga vs. PC) ran without bugs and with consistent framerates. I recall R-Type on the Sega Master System would drop sprites from the display in order to prevent slowdown! Some games did turn into slideshows, like Populous, where they were being ambitious. Overall though, the performance advantage of consoles over PC was pronounced, especially without the setup issues of the PC, or the incompatibilities. As time has progressed, those advantages have dwindled. DX has sort of solved setup in a lot of cases, although drivers still have a large impact. Screen tear and judder are no less prevalent in console games, perhaps even less so on PC if a game is targeting consoles and then running on much beefier PC hardware. Bugs are as common in console games. Apart from being able to put in a disc and play without having to install (an advantage the PS3 is quickly eroding with mandatory installs), the reasons for owning a console aren't what they were.

I could be wrong, but aren't games an order of magnitude more complex now than they were 10-15 years ago? Development schedules have stayed the same, if not gotten shorter, while the work to create these games has grown significantly.

Given how advanced we want our games to be, and how often we want these experiences, I'm afraid that level of polish is a rarity these days.
 
The PC is still a PC, and consoles still provide a better out-of-the-box experience.

It depends on your priorities. I play 99.9% of the time on PC now, and when I do go back to console they just look silly in comparison; even the latest and greatest console stuff all looks horribly dated. I just think that graphics have fallen back dramatically in people's priorities, so they don't care anymore. Friends, achievements, quality online, etc. have shifted to the forefront. I definitely miss that part playing on PC, as the PC severely lags the 360 in those departments, but then again I'm a graphics whore, so I can't look at console games much anymore. I realize I'm in the minority though.
 
It depends on your priorities. I play 99.9% of the time on PC now, and when I do go back to console they just look silly in comparison; even the latest and greatest console stuff all looks horribly dated. I just think that graphics have fallen back dramatically in people's priorities, so they don't care anymore. Friends, achievements, quality online, etc. have shifted to the forefront. I definitely miss that part playing on PC, as the PC severely lags the 360 in those departments, but then again I'm a graphics whore, so I can't look at console games much anymore. I realize I'm in the minority though.

I don't think graphics have fallen back, I just think the average console gamer is extremely ignorant. I don't really mean this in an insulting way, just that they don't know any better.

Take BF3 for example. I went to the midnight launch last night; they had BF3 running on the PS3 and it drew a crowd, with almost everyone talking about how good it looked. Those same people might not have been so impressed if they knew how much better it can look running on a high-end PC, IMO.

I think if MS or Sony put out a new console with shiny new graphics for $400 in the next couple of years, it'll gain a lot of attention, largely due to the improved graphics.
 
I don't think graphics have fallen back, I just think the average console gamer is extremely ignorant. I don't really mean this in an insulting way, just that they don't know any better.

I'm not sure that's really the case. I don't doubt they want better graphics, or would recognize better graphics if they saw them. I just don't think graphics are as important anymore, as each gen keeps giving diminishing returns. So when this thread is titled the demise of the console experience, I don't think it's a demise at all. Graphics are good enough: yeah, there's screen tear, but so what; sure, there are low-res textures, oh well. Does it give me 15+ hours of single-player campaign? Does it have great online? Is it fun to play? Does it look pretty good? Cool, they get my $60. Does PC look better? Sure, I know it does, but my friends are on console and the graphics on console are good enough anyway. At least I think that's where the mindset has gone.
 
I remember tearing happening quite regularly in games like Unreal Championship, Ghost Recon, and Splinter Cell.

Yep. The greatest thing about PC gaming has always been the ability to set performance based on your GPU and your preferences in terms of AA, framerate, resolution and tearing.

I don't know about today (I don't PC game anymore), but there used to be quite a few top-tier PC games where, at release, even the latest and greatest card would choke if you turned everything up to high/ultra.

The reason PC and consoles have switched roles is that the vast majority of the investment in pushing hardware to its edges in terms of visual IQ is on the console. If you don't think that's true, then just look at what's possible on PC and what's possible on consoles, and tell me a time when the gap between a high-end PC and a six-year-old console was anywhere near this close.
 
I'm not sure that's really the case. I don't doubt they want better graphics, or would recognize better graphics if they saw them. I just don't think graphics are as important anymore, as each gen keeps giving diminishing returns. So when this thread is titled the demise of the console experience, I don't think it's a demise at all. Graphics are good enough: yeah, there's screen tear, but so what; sure, there are low-res textures, oh well. Does it give me 15+ hours of single-player campaign? Does it have great online? Is it fun to play? Does it look pretty good? Cool, they get my $60. Does PC look better? Sure, I know it does, but my friends are on console and the graphics on console are good enough anyway. At least I think that's where the mindset has gone.

I think you nailed it right there.

The only weird exception is ... frame rate. People both notice it more than they think and claim it bothers them more than it does. The worst offender is the PC gamer who correctly talks about the advantage of a higher frame rate while bragging about averaging 35 fps compared to a console game capped at 30, seemingly oblivious to stutters that take the experience well below that baseline and that would fail a console game in validation. Bonkers.
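
To put a number on how misleading that kind of average is: a handful of long frames barely move the average framerate, but they are exactly what gets perceived as stutter. A quick sketch with invented frame times (hypothetical numbers, not from any real capture):

[code]
# 100 frames: mostly 25 ms (40 fps), plus five 100 ms stutter spikes.
frame_times_ms = [25.0] * 95 + [100.0] * 5

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s      # ~35 fps "average"
worst_fps = 1000.0 / max(frame_times_ms)          # 10 fps during a spike

print("average: %.1f fps, worst single frame: %.1f fps" % (avg_fps, worst_fps))
[/code]

The average still reads about 35 fps, yet the player sits through 10 fps moments that a locked-30 console game never has.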

Changing the subject back more broadly to "the console experience", I've recently rediscovered the joys of tile-based consoles, run on a CRT, with games that run at 60 fps (as standard, not as a bragging right) and respond instantly to a D-pad rather than some cumbersome, partially analogue thumbstick.

An analogue thumbstick makes a great low-accuracy, 360-degree direction pointer, but a shitty joystick, a shitty D-pad and a shitty mouse. Also, the 360 and PS3 pads' D-pads make shitty D-pads, so that doesn't help. :(
 
I don't think graphics have fallen back, I just think the average console gamer is extremely ignorant. I don't really mean this in an insulting way, just that they don't know any better.

Take BF3 for example. I went to the midnight launch last night; they had BF3 running on the PS3 and it drew a crowd, with almost everyone talking about how good it looked. Those same people might not have been so impressed if they knew how much better it can look running on a high-end PC, IMO.

I think if MS or Sony put out a new console with shiny new graphics for $400 in the next couple of years, it'll gain a lot of attention, largely due to the improved graphics.

I think that's true for a lot, but not all. There are a lot of posters here impressed by the Vita's possible performance as a handheld, even though they are in no way ignorant of what's possible on consoles or PCs.

Most of us here know what a PC with a high-end GPU can do, but this board is not full of "meh"s when talking about visuals on consoles. Most people don't use a high-end PC as a frame of reference when judging console games. That would be like using a Murciélago to judge $40K sports cars.
 
I would have very happily stuck with PC gaming, having switched after the Amiga died a death (having started with a VCS 2600...), but it was always such a painful experience under Windows that the PS1 was a breath of fresh air. I don't think that this has changed. It's just that the hardware has reached such a powerful level that it masks a huge amount of the issues that still affect it.
Programming paradigms have also shifted to better, more capable models, and this has had a positive impact on game performance too, though the environment games live in has, if anything, become even more resource-hungry. Especially on the PC.
IMO, the current difficult period that console games appear to be going through is akin to the transition from consoles to home computers, then to PCs, and back to consoles again. It's a maturing market where gamers' expectations are exceeding the ability of the hardware, and the games manufacturers are making more mature games with stronger content. For me the transition was from books (then RPGs) to films, and then to games, which combine all the best elements and give me the lazy-imagination option combined with the excitement of being in control.
Basically, the apparent slide in QC is only because our expectations are set so high.
 
Battlefield 3 is the only game that offers me a compelling reason to play on a PC. Diablo 3 will as well, I guess. Otherwise, I already have money sunk in a console that will play games, albeit with lesser quality. Framerates and stability are for the most part very good, without any of the hassles of drivers or other weird issues like those seen with Rage and other titles. I already have a laptop, which can't be used for gaming very much, that does everything I need a PC to do. So why would I invest even a few hundred dollars in a second PC, just for games, when I already have a console? Battlefield 3 is about the only game that makes it tempting, because of the larger maps and 64 players.
 
This is a crappy thread. Demise of the console experience? WTH are you guys playing? Console gaming is better than it has ever been.
 
This is a crappy thread. Demise of the console experience? WTH are you guys playing? Console gaming is better than it has ever been.
Really? So you've lots of experience through the years of buying a console game, having to wait while it installs, then playing until you hit a game-breaking bug and having to wait a month until it gets patched?
 
Really? So you've lots of experience through the years of buying a console game, having to wait while it installs, then playing until you hit a game-breaking bug and having to wait a month until it gets patched?

I remember back in the day, all the Megadrive and SNES owners laughing at Amiga and ST owners with their long loading times and stupid copy protection codes and multicoloured cardboard wheels (Amiga and ST owners laughed back though, because all their games were "free").

Cut to today, where you install a console game, wait for it to load and have to enter a f***ing code to get access to your content. Don't play for a while and you need to buy a new map pack or get relegated to Leper Island. That's not the console experience, that's the Amiga experience mixed with the *Horse Armour experience.

*Horse Armour increasingly not optional.

This is a crappy thread. Demise of the console experience? WTH are you guys playing? Console gaming is better than it has ever been.

In many ways, but definitely not all.

Being able to co-op with chums across the country (like you used to do on Double Dragon and Streets of Rage) is awesome. Halo is awesome (not so much Halo Reach, but still). Not having the left side of your screen disappear and a black border appear on the right when you use an RGB SCART lead (meaning you have to take the back off your telly and attack it with a screwdriver) is awesome. But that's not the whole story.
 
Really? So you've lots of experience through the years of buying a console game, having to wait while it installs, then playing until you hit a game-breaking bug and having to wait a month until it gets patched?

That's just how the progression of technology goes -- and honestly I expect that it will only get better. How long have consoles had the ability to store gigs of data... since the PS2, if I recall? And it was pretty painful back then with FFXI. It's been employed en masse since then with newer consoles, and at a time when not only have drive speeds increased, but solid-state storage is becoming more affordable and physical media is being phased out in favor of digital distribution. It only seems bad right now because it's still relatively immature; that's really just how progression goes.

And yes - I prefer the patch approach instead of having to return my hard media for replacement or facing the possibility of never having a patch for a game-breaking bug at all. Game-breaking bugs are nothing new - the difference is that the fixes can actually be pushed home for once, compared to previous generations.
 
That's just how the progression of technology goes -- and honestly I expect that it will only get better.
I can't believe that. Software is more buggy now than ever, in every way. We have buggy TVs and recorders and phones, for goodness sake! Let alone computers. As others rightly say, it's because everything has become more complex, but as a result it can't really get better unless we find a new way to develop complex software that leaves out the bugs.

And yes - I prefer the patch approach instead of having to return my hard media for replacement or facing the possibility of never having a patch for a game-breaking bug at all. Game-breaking bugs are nothing new - the difference is that the fixes can actually be pushed home for once, compared to previous generations.
They were very rare. Buggy console games are now common. The fact we can, sometimes, fix issues after release isn't really a better situation than not having those issues.

I think the main take-home for me is that console gaming is no longer trouble-free and stress-free. I experience considerable frustration and annoyance with my hobby at times because the software doesn't work, there's often no contact with the developers, and patches may be released but often don't fix the problems anyway. There was one patch for FIFA 11 that didn't improve the game in how we played it, but did introduce a graphical bug in a particular stadium; the subsequent patch never fixed this bug. Borderlands never got its network code or voice chat fixed. My friend bought one of the LEGO games (HP or SW) and there was a game-stopping bug that got fixed a month or two later. The next LEGO game he bought had a similar bug, although he managed to glitch his way out. Buying a game nowadays comes with an element of risk, whereas when I played Master System and Amiga and PS1 and PS2, the games I played worked. Maybe that's inevitable due to more complex software, but inevitable or not, it signifies console gaming isn't what it used to be.
 
And yes - I prefer the patch approach instead of having to return my hard media for replacement or facing the possibility of never having a patch for a game-breaking bug at all. Game-breaking bugs are nothing new - the difference is that the fixes can actually be pushed home for once, compared to previous generations.

Along with the ability to patch comes an increased willingness to ship in a state that will require patching. A blessing and a curse.

Taking steps to minimise the impact and frequency of patching, like MS has done with the 360, allows for the optimal balance IMO.
 
This is a crappy thread. Demise of the console experience? WTH are you guys playing? Console gaming is better than it has ever been.

I actually agree with you, to some extent. But I agree with Shifty as well.

On the one hand, during the PS2 days (which for me were the nicest, in general and in this respect), there was no firmware, there were no HDD installs, and there were no patches. So you bought your game, put it in, and done. Publishers had to be extra careful releasing their games, because once the game was out with bugs, those bugs couldn't be fixed short of printing another run.

On the other hand, now you can plug in your (let's say) PS3, go into an online store, buy and download some demos, buy one or two nice games, and off you go. No disc even necessary. Occasionally, free upgrades or additional content pop up, for (compared to retail prices) next to nothing (what's 2 euros for another pinball table? You got two plays on one in the Arcade hall for the same money back in the day). Games sometimes get new features for free for a long, long time. You can play games online, compare scores with friends, stream content from other devices, play all disc formats you can buy in store, rent/buy online, stream netflix (if you're in the right country), and whatever else.

But with all the new features come new potential problems. And so in general, the number of issues has also increased. Games can now be patched, and boy do we know it - we all know about the day-one patch, and Sony's incremental patch system combined with no limits on patching means that getting a game running can take longer than going to the store to buy it in the first place, except it's AND, not OR. Online servers can't always keep up, and sometimes games forget to take that into account (GT5 at release, where even several menu actions called the non-responding server, for ultimate fail).

So it has gotten both better and worse, and certainly being able to patch games has made console games similar to PC in that respect (and sometimes worse). When it's bad, with patches, DRM, broken online, etc., it's very bad. But when it's good, it's fantastic and miles beyond last-gen. If, for instance, I compare SingStar on PS3 with last-gen SingStar, then wow, what an amazing difference. That I've had one unstable version, and once had to call support to re-download all my songs because I got a new PS3, is easily made up for by the myriad of fantastic features that have been added to the game and the ability to pick and choose from the more than 1000 songs in the online store (including songs in my native language), compare scores with friends in clubs, play together online with camera support, use voice control, and save, edit and share videos all from within the game, just to mention a few of my highlights. ;)
 
I can't believe that. Software is more buggy now than ever, in every way. We have buggy TVs and recorders and phones, for goodness sake! Let alone computers. As others rightly say, it's because everything has become more complex, but as a result it can't really get better unless we find a new way to develop complex software that leaves out the bugs.

They were very rare. Buggy console games are now common. The fact we can, sometimes, fix issues after release isn't really a better situation than not having those issues.

I think the main take-home for me is that console gaming is no longer trouble-free and stress-free. I experience considerable frustration and annoyance with my hobby at times because the software doesn't work, there's often no contact with the developers, and patches may be released but often don't fix the problems anyway. There was one patch for FIFA 11 that didn't improve the game in how we played it, but did introduce a graphical bug in a particular stadium; the subsequent patch never fixed this bug. Borderlands never got its network code or voice chat fixed. My friend bought one of the LEGO games (HP or SW) and there was a game-stopping bug that got fixed a month or two later. The next LEGO game he bought had a similar bug, although he managed to glitch his way out. Buying a game nowadays comes with an element of risk, whereas when I played Master System and Amiga and PS1 and PS2, the games I played worked. Maybe that's inevitable due to more complex software, but inevitable or not, it signifies console gaming isn't what it used to be.

Eh, I think this is the main reason in the end... the huge cost and time of development don't allow for deeper beta testing, so that's what's going on. Most of the time, biblical development times cause the closure of a software house (see Team Bondi...); I don't think things will change for the better next generation either... honestly I can't see any realistic solution to this problem, maybe when costs decrease enough to enable reasonable beta testing, but I don't know whether that will ever happen...
 
I don't know about any of this. Gaming on the PC, which I did for many, many years, was a FAR FAR FAR more horrible, buggy experience than anything I've seen on a console. Sure, my Super Nintendo was rock solid, but as has been mentioned, those games were incredibly simple compared to what you're seeing now, with the myriad of online integrations. There is no way to write perfect, bug-free software, especially with increasing complexity. If that is your expectation, then just stop gaming. I will say that one of the current-gen consoles does provide a much more hassle-free experience than the other, but they're both good.
 
I don't know about any of this. Gaming on the PC, which I did for many, many years, was a FAR FAR FAR more horrible, buggy experience than anything I've seen on a console. Sure, my Super Nintendo was rock solid, but as has been mentioned, those games were incredibly simple compared to what you're seeing now, with the myriad of online integrations. There is no way to write perfect, bug-free software, especially with increasing complexity. If that is your expectation, then just stop gaming.
Bug-free isn't a fair expectation. But whatever happened to QA testing ensuring no killer bugs? How can games get released where the basic netcode doesn't work, or the voice chat craps out? Once upon a time a console company wouldn't allow such buggy games. They'd reject the game and tell the devs to go fix these bugs because they weren't providing a good enough experience for their platform. We still have that in theory, as evidenced by Under Siege's long-overdue patch, so how come it's not in force universally, ensuring 1) games mostly work and 2) where there is a game-stopping bug, devs are held to account and made to get a fix out in short order? If you bought a LEGO game on PS2, it worked flawlessly at 60fps. Buy a LEGO game this gen and you get tearing or frame stutters and bugs that trap you and prevent progression, and you just wait and wait until maybe a patch is released. Does that not strike you as a demise?
 
That's one thing I like about Nintendo this generation: many of their first-party games aim for, and run at, 60 fps.

I am not a tech guy, and I may often have to look at side-by-side comparisons to see that "oh, the PC is better here and there", but I am very sensitive to unstable framerates.

I somewhat like that the mentality of always having to upgrade your PC has disappeared this generation; it means my aged PC (HD4870 512MB) can still play modern games with good framerates... The "clusterfuck" of Rage on PC reminded me of how bothersome a gaming PC can be in comparison to the consoles, though.

One other thing that makes PC my primary choice for gaming, though, is that I don't approve of Xbox Live's pay-to-play... I mean, if I had paid for Live since the 360's release, that money would have been enough for a whole new 360 today.

One thing I fear for next-gen, though: with all the current focus on 3D, I guess it's only fair to assume that next-gen will be even more about 3D... With the extra burden of rendering 3D for most/all games, is there any reason to expect devs to put performance over graphics next-gen?

I also don't like that consoles nowadays are all so dependent on being online... I can take my SNES or GC with me anywhere and just plug it in and play, but current games are released with so many glitches that have to be patched away.
 