Even then, when a game was 30 fps it was normally a stable 30 fps. Tearing just didn't happen on consoles, and stuttery framerates didn't happen until the end of the PS2 era, by my uncertain recollection.
I remember tearing happening quite regularly in games like Unreal Championship, Ghost Recon, and Splinter Cell.
Even on consoles known for 60fps games (like the DC) there were plenty of games with stuttery frame rates. UT on both the DC and PS2 had frame rate issues, for example, and UC on the Xbox was also known for its frame rate issues. I know I'm bringing up Unreal games a lot, but those are some of the first examples that spring to mind.
I'm sure if I actually tried, I could remember more games with inconsistent frame rates. Honestly, I think the slow demise of the console experience started when we moved from 2D to 3D.
I guess it's this overreaching of the hardware that has continued, maybe because of marketing? My associations from last gen are a comparison between NWN and Morrowind on PC, and the likes of BGDA and GT3 on PS2. The console games were always consistent; the PC games were always either tearing horribly or juddering with an inconsistent framerate when V-sync'd. That kinda makes sense when the developers couldn't target a specific hardware level, so couldn't tune the game to run consistently at a given level. It makes sense to just create the game and let the hardware do the best it can. But on console we've seen a changing attitude where devs are trying to do more and will happily sacrifice framerate and IQ to achieve it, where before they'd have pared back some other aspect of the game, like model detail or view range.
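To make the V-sync judder point concrete, here's a rough sketch (my own toy numbers, not taken from any actual engine): with V-sync on a 60 Hz display, a frame can only be shown on a refresh boundary, so any render time that creeps over 16.7 ms gets rounded up to a whole multiple of the refresh period.

#include <stdio.h>
#include <math.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;               /* 60 Hz display */
    const double render_ms[] = { 14.0, 17.0, 25.0, 16.0, 35.0 };   /* made-up per-frame render times */
    const int n = sizeof(render_ms) / sizeof(render_ms[0]);

    for (int i = 0; i < n; i++) {
        /* With V-sync the frame waits for the next refresh boundary, so the
         * presented interval is the render time rounded up to a multiple of
         * 16.7 ms. Without V-sync it would be shown immediately, mid-scan,
         * which is where tearing comes from. */
        double presented = ceil(render_ms[i] / refresh_ms) * refresh_ms;
        printf("render %5.1f ms -> shown every %5.1f ms with V-sync (~%.0f fps)\n",
               render_ms[i], presented, 1000.0 / presented);
    }
    return 0;
}

So a game that wobbles between 15 and 18 ms per frame doesn't look like a smooth "58 fps" under V-sync; it flips between 60 and 30, and that flipping is the judder. The old console approach was to pare content back until the render time never crossed that boundary.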
I think a large part of it has to do with marketing. Look back at how high they raised the bar with all of the pre-rendered concept videos, tech demos, and talk of system power. MS and especially Sony have done what they could to burn in the idea that these systems are all-powerful machines capable of Pixar levels of CGI. I can only imagine the thoughts going through developers' heads while watching MS and Sony's E3 conferences. Just recently we saw DICE having to defend why the console versions feature lower player counts, precisely because of the expectations the fanbase has been given.
To be fair, it was similar last gen too, AFAIR. Weren't most PS2 games (and even GC games) sub-480p?
If you don't take it as an absolute, I think it's a valid point. Relative to PC, and by a considerable margin, most console games (and this goes back to the 16-bit era and Amiga vs. PC) ran without bugs and with consistent framerates. I recall R-Type on the Sega Master System would drop sprites from the display in order to prevent slowdown! Some games did turn into a slide show where they were being ambitious, like Populous. Overall, though, the performance advantage of consoles over PC was pronounced, especially without the setup issues of PC, or the incompatibilities. As time has progressed, those advantages have dwindled. DX has sort of solved setup in a lot of cases, although drivers still have a large impact. Screen tear and judder are no less prevalent in console games now, and are perhaps even less of an issue on PC when a game targets consoles and then runs on much beefier PC hardware. Bugs are as common in console games. Apart from being able to put in a disc and play without having to install, an advantage the PS3 is quickly doing away with through mandatory installs, the reasons for owning a console aren't what they were.
I could be wrong, but aren't games an order of magnitude more complex now than they were 10-15 years ago? Development schedules have stayed the same, if not gotten shorter, while the work it takes to create these games has grown significantly.
Given how advanced we want our games to be, and how often we want these experiences, I'm afraid that level of polish is a rarity these days.