Digital Foundry Retro Discussion [2016 - 2017]


Yep. I honestly have no idea why anyone thinks 8K will ever be necessary for the home.
I also wonder how DF staff are going to work out the resolution of any given game once it's over 1080p. I don't think there are going to be steps to count.
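(For what it's worth, the step-counting idea comes down to a simple ratio: on a hard, unfiltered edge, each visible stair step corresponds to one row of the game's native framebuffer, so the native height is roughly the output height divided by the upscale factor. The sketch below is only a rough illustration of that arithmetic, with made-up function and parameter names, and is not meant to represent DF's actual tooling.)

```python
def estimate_native_height(output_height: int, rise_output_px: int, num_steps: int) -> float:
    """Estimate a game's native vertical resolution from a stair-stepped edge.

    On a hard, unfiltered edge, each visible 'step' corresponds to one row of
    the native framebuffer, so rise_output_px / num_steps approximates the
    vertical upscale factor of the capture.
    """
    scale = rise_output_px / num_steps   # output pixels per native row
    return output_height / scale         # native rows in the framebuffer


# Example: an edge in a 1080p capture rises 15 output pixels across 10 visible
# steps, suggesting a 1.5x upscale, i.e. a ~720-line native framebuffer.
print(estimate_native_height(1080, rise_output_px=15, num_steps=10))  # 720.0
```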

Additionally, Digital Foundry Retro looks VERY promising. Hopefully they won't focus only on retro consoles, but will also cover arcade boards and machines from the past, like the Model 1, Model 2, ZN-1, Naomi, etc.

http://www.eurogamer.net/articles/digitalfoundry-2016-introducing-df-retro

Plus, I wonder whether games like Gran Turismo 4 and other legendary games of the time really delivered in the framerate department. People used to say how incredible it was that Gran Turismo 4 ran at a flawless 60 fps, but I certainly doubt it was flawless.
 
I'd like an episode on Transformers PS2. That game was a tech marvel of its day, and was well liked here because of that (well, that and it's a solid third-person shooter. Not super great, but solid).
 
Digital Foundry Retro: GoldenEye 007 and Perfect Dark

Conclusions:

- The N64 resolution was 320x222 in standard mode and 640x222 in the high-res mode (Expansion Pak).

- The frame rate in GoldenEye was about 15-20 fps, with drops to around 10 fps.

- Perfect Dark ran at about 15 fps, at times dropping below 10 fps.

- The four-player split-screen multiplayer ran at around 10 fps in both games, with Perfect Dark sometimes dropping below that.

- The Perfect Dark remaster (Xbox 360) runs at 1920x1080 and 60 fps.


Now I can understand why Sega turned down Silicon Graphics' offer to build the Saturn's graphics hardware, and why the design went into the N64.
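(As an aside on where figures like these come from: a common approach, though not necessarily DF's exact method, is to capture the console's output at a fixed 60 Hz and count unique frames per second, since a game rendering below 60 fps repeats frames in the capture. The sketch below is a minimal illustration of that idea; the function name and threshold are made up, it assumes a NumPy array of grayscale frames, and real analysis tools also have to deal with things like torn frames.)

```python
import numpy as np

def unique_frames_per_second(frames: np.ndarray, capture_hz: int = 60,
                             diff_threshold: float = 1.0) -> list[float]:
    """Rough per-second fps estimate from a fixed-rate capture.

    frames: (N, H, W) grayscale captures taken at capture_hz. A captured frame
    counts as 'new' if it differs from the previous one by more than
    diff_threshold on average; repeats mean the game rendered below capture_hz.
    """
    diffs = np.abs(frames[1:].astype(np.int16) - frames[:-1].astype(np.int16))
    is_new = diffs.mean(axis=(1, 2)) > diff_threshold
    rates = []
    for start in range(0, len(is_new) - capture_hz + 1, capture_hz):
        window = is_new[start:start + capture_hz]
        rates.append(float(window.sum()))  # unique frames in that second ~= fps
    return rates
```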
 
I'm pretty sure the N64 was significantly more powerful than the Saturn. Not as many people cared about low framerates back then (before we had YouTube channels dedicated to telling us, lol), so we ended up getting games that pushed the hardware further than they probably should have. There's also the factor of Nintendo discouraging developers from using the higher-performance modes (microcode or whatever) for the GPU in favor of more accurate ones.

I mean, I'm sure the N64 could do a lot more than the Saturn or PlayStation at a consistent 30 or 60 fps.
 
Gamers cared long before YouTube was invented. People who were into arcade fighting games were already frame counting, as it was important both for the consistency of moves and for performing your own moves reliably. It was also key to knowing which moves could "beat" or "interrupt" the opponent's moves and which were risky to use. That goes all the way back to the '80s. Arcade fighters that couldn't deliver consistent, high framerates would generally fail.

Serious PC gamers were already turning down settings to get a guaranteed 60 FPS, not only for competitive play but for a more enjoyable single-player experience. Many considered a high framerate to offer not only a better gaming experience but a more visually pleasing one as well. That was in the '90s. In the '80s, they'd physically replace the clock generator on motherboards to boost the clock speed for more consistent performance in games.

People couldn't do much about framerates on consoles, so they put up with it. Consoles offered a fraction of the arcade experience in the home, and that was enough for a lot of people to tolerate a lot of dodgy gameplay, bugs, glitches, and performance issues. It wasn't so much that no one cared, but that those who did care couldn't do anything about it. And without the internet, their complaints were limited to their local group of friends or local gaming community. But it was always a problem.

YouTube has nothing to do with people finding inconsistent or low FPS unsatisfying. The internet, on the other hand, allowed those people to connect with others who felt the same way, in a way that was impossible before the internet became ubiquitous. And they no longer had to be quiet and just put up with it.

What YouTube and the internet did do, however, is make resolution in games a far larger issue than it actually is. It's not difficult to feel the difference between 30 and 60 FPS in a fast action game. It's even easier to feel the difference between a consistent framerate and an inconsistent one. It's far, far harder to tell the difference between 900p and 1080p, or even 720p and 1080p, on most people's TVs in most families' living rooms. That requires side-by-side screenshots for most people to see the difference. Move that back to ~3 meters in the living room with a 42-55" HDTV and the vast majority of people can't tell the difference. For PC gaming, however, that isn't the case, as you usually sit within 1-2 feet (~1/3 - 2/3 of a meter) of your monitor.

Regards,
SB
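(The viewing-distance point above can be sanity-checked with a bit of geometry. Assuming 20/20 vision resolves roughly one arcminute, the sketch below, with a made-up function name, computes the angle a single pixel of a 16:9 display subtends at a given distance; at 3 m on a 55" set, both 1080p and 720p pixels sit close to that limit.)

```python
import math

def pixel_arcminutes(diagonal_inches: float, horizontal_pixels: int,
                     viewing_distance_m: float) -> float:
    """Angle (in arcminutes) subtended by one pixel of a 16:9 display."""
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)  # screen width from diagonal
    pixel_pitch_m = width_m / horizontal_pixels
    return math.degrees(math.atan2(pixel_pitch_m, viewing_distance_m)) * 60

# A 55" TV viewed from 3 m: 1080p pixels subtend ~0.73', 720p pixels ~1.09',
# both close to the ~1 arcminute figure usually quoted for 20/20 vision.
for pixels in (1920, 1280):
    print(pixels, round(pixel_arcminutes(55, pixels, 3.0), 2))
```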
 
You're right, but I was kind of joking about the YouTube thing. I played Saturn Doom as a kid and never noticed the framerate, though...
 
We certainly cared about standards in select genres. We just didn't turn into whinging, hyperbole-spewing toddlers when a frame was lost somewhere. We even managed to cherish odd performance outliers like the 30 fps Soul Edge (or rather the 25 fps version for us Europeans). Imagine the meltdown if someone were to announce a 30 fps beat-em-up in 2016. And at the time of the PlayStation and N64, we really didn't chase high framerates on the PC either. If anything, we were chasing halfway-playable framerates in state-of-the-art titles like Wing Commander IV. Maxing that game out while getting a shaky 10-15 fps at 640x480 was a-okay. The quest for high framerates came much later and was pretty much only a thing in the pro gaming scene. And comparing yesteryear's pro Counter-Strike players to the raging forumites of today isn't quite an apples-to-apples comparison.
 
... It's not difficult to feel the difference between 30 and 60 FPS in a fast action game. It's even easier to feel the difference between a consistent framerate and an inconsistent one. It's far, far harder to tell the difference between 900p and 1080p, or even 720p and 1080p, on most people's TVs in most families' living rooms. That requires side-by-side screenshots for most people to see the difference. Move that back to ~3 meters in the living room with a 42-55" HDTV and the vast majority of people can't tell the difference.

Regards,
SB
You can't generalize. Different people, different perceptions.
 
Indeed. I always preferred higher framerates, turning down detail on PC to get something that wasn't a slideshow. But those of us raised on the Amiga had the benefit of awesome graphics and, by and large, a steady 30 fps minimum. It's only really in the 3D era that things got ropey, which basically points to the fact that good 3D is pretty damned difficult to do. ;)
 
Obviously nobody wants a slideshow, but when a vocal minority acts like a bunch of frame-pacing issues are nothing short of ruinous to a game like Bloodborne, a game which somehow still managed to rake in one GOTY award after another, I'd say we're looking at a slight disconnect from reality. Don't get me wrong, I'm just as thrilled as the next guy that Doom looks as good as it does while running circles around similar or worse-looking games in terms of performance. I'm just not expecting a €400 box which was never advertised as a 1080p/60 monster in the first place to pull off that kind of feat in every game. And if anyone actually did, I wouldn't say they were duped like many of them claimed they were. They were simply naive idiots.
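(For anyone unclear on the term: frame pacing is about the cadence of frame delivery rather than the average rate. A run of frames that alternates short and long frame times can still average 30 fps while feeling far worse than an evenly paced 30. The sketch below, with made-up names and data, just illustrates that distinction; it is not an analysis of Bloodborne itself.)

```python
from statistics import mean, pstdev

def pacing_report(frame_times_ms: list[float]) -> dict:
    """Summarise average frame rate and how evenly frames are delivered.

    Even pacing at 30 fps means every frame takes ~33.3 ms; a run alternating
    16.7 ms and 50 ms also averages ~30 fps but judders badly.
    """
    avg = mean(frame_times_ms)
    return {
        "avg_fps": round(1000.0 / avg, 1),
        "frame_time_stdev_ms": round(pstdev(frame_times_ms), 1),  # 0 = perfectly even
    }

# Both runs average ~30 fps, but only the first is evenly paced.
print(pacing_report([33.3] * 6))        # stdev ~0 ms  -> smooth
print(pacing_report([16.7, 50.0] * 3))  # stdev ~17 ms -> judder at the same average fps
```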
 
When I was a kid I often avoided N64 versions of games because they ran slower. *shrug* And I'd do the same for some PS1 games as well.

Low framerates suck, and they often hurt my eyes.
 
We certainly cared about standards in select genres. We just didn't turn into whinging, hyperbole-spewing toddlers when a frame was lost somewhere. We even managed to cherish odd performance outliers like the 30 fps Soul Edge (or rather the 25 fps version for us Europeans). Imagine the meltdown if someone were to announce a 30 fps beat-em-up in 2016. And at the time of the PlayStation and N64, we really didn't chase high framerates on the PC either. If anything, we were chasing halfway-playable framerates in state-of-the-art titles like Wing Commander IV. Maxing that game out while getting a shaky 10-15 fps at 640x480 was a-okay. The quest for high framerates came much later and was pretty much only a thing in the pro gaming scene. And comparing yesteryear's pro Counter-Strike players to the raging forumites of today isn't quite an apples-to-apples comparison.

Not really. People were already chasing framerates in Doom all the way back in 1993 (quite a few people picked up 486 CPUs just to get higher framerates), and on a large scale, i.e. not the small scale of changing clock oscillators on the motherboard. Discussions at local computer user groups and BBSes would often focus on how games performed on given hardware and on reaching at least a solid 30 FPS. It was a similar situation with Ultima Underworld and Strike Commander. Speaking of Strike Commander, flight sim enthusiasts were also huge on chasing framerates.

Further advancements in games like Quake allowed much more control over settings that directly impacted framerates during gameplay, which helped them attain higher levels of popularity. Now you weren't limited to upgrading hardware in order to reach playable framerates; you could also adjust your software settings to better match your level of hardware.

3dfx, Rendition, and later Nvidia earned much of their reputation not just for making hardware-accelerated 3D gaming affordable, but for raising framerates to what many considered playable levels, versus having to put up with poor framerates because there was no choice.

Just because the vast majority of people back in the '80s and early '90s had no choice but to accept low or inconsistent framerates due to hardware budget limitations doesn't mean they were happy with it, just that they couldn't do anything about it and accepted it. Once games appeared with the ability to adjust settings in-game to increase framerates, the conversation went from "ugh, this game performs poorly" to "what settings should I change to get [30 FPS, 30 FPS stable, 60 FPS, or 60 FPS stable]?" People no longer had to accept that their game ran badly because they couldn't afford hardware that allowed smoother gameplay. Once that happened, they could be more vocal about it, as well as discuss solutions for increasing performance.

People have a choice now. Some choose to accept it because it's good enough for them, or they have no choice because the game is exclusive to a platform that can't run it at higher framerates. Many people don't accept it, because they find it unacceptable, especially as it directly affects gameplay, more so in fast-paced games where fine control is a benefit. Dark Souls 2 at 60 FPS is a hugely better experience than Dark Souls 2 at 30 FPS with drops below that. Bloodborne would also be a significantly better experience at 60 FPS than at 30 FPS. The fact that you can only play it at 30 FPS because you have no choice doesn't change that. The fact that it's just a 30 FPS game also doesn't make it a bad game. But it's unquestionable that it'd be a better game at 60 FPS.

Regards,
SB
 
Not really. People were already chasing framerates in Doom all the way back in 1993 (quite a few people picked up 486 CPUs just to get higher framerates), and on a large scale, i.e. not the small scale of changing clock oscillators on the motherboard.

People also picked up 386 CPUs for Wing Commander, and Pentiums for WC3 or Strike Commander. This wasn't because they were after some lofty goal like a locked 60 fps, though. It was because these games really did run like shit otherwise. I still remember the three-tiered spec recommendations from magazines back in the day: according to the German magazine PC Player, you'd need a 486 DX50 to get a "perfect" experience with Strike Commander (I believe a 386 DX40 was the minimum, but that rendered the game basically unplayable, even if you opted for pure Gouraud shading). Except "perfect" really meant between 15 and 20 fps on medium settings back then.
 
Hey, I played Wing Commander 3 on a 486 SX25. Well, it ran... quite well, actually. But on the later Pentium 100, the resolution bump to 640x480 was much better.
I also played Crusader: No Regret on those machines and it ran well... but when I play them now, they aren't as fluid, because the engine was capped at 20 fps (if I'm not mistaken). That was fine on the small screens of the time (a 14" monitor), but now, on a >21" monitor, those framerates are just no longer acceptable.
 
Ironically, the graphics were rendered on Amigas in 4096 colours and converted to VGA's 256 colour palette. But it was ridiculous to expect a 7 MHz 68000 to run a game designed for CPUs several times faster and without the overhead of bitmapped display planes.
 
Posted this in the other thread for BC stuff, but


Thanks - looks like it'll be worth a replay (loved it first time round) - out of interest, is there a site that condenses all these down with an easy-to-view comparison of original vs emulated?
 