Not that I'm doubting brute-force rendering, which is what most PC GPUs tend to do... but I don't subscribe to the PC mantra that PCs will always sustain "consistent" frame rates.
One thing that can affect that is what else people have running on their PC. I use my gaming PC purely for games and overnight video renders, nothing more, so mine runs consistently and cleanly for gaming. I had a friend who couldn't figure out why he was getting inconsistent performance in games, and it turned out he was leaving a torrent app downloading and seeding constantly in the background, among other things.
Also, some people still use ancient operating systems which simply don't run as well. I use Windows 8.1, which runs somewhat faster and leaner than previous versions, but many stick to OSes based on decade-old code, so it's not surprising they have issues whereas mine runs solid and consistent. People running ancient versions of Windows will have worse file I/O, worse memory use, and on and on.
Regarding benchmarks: if they're going by lowest fps rather than the average, their numbers will always be misleading on PC. That's because sometimes on PC (depending on the game) data has to cross the PCIe bus en masse, such as on a level change or a large visibility change, and that can cause a momentary stutter. In the real world that's largely irrelevant as it represents 0.0000001% of your gaming time, but it shows up in a minimum-fps measurement, which gives the illusion of a large slowdown even when there really isn't one. That's why I personally disregard min and peak fps and look at the average instead. Min is useless because it doesn't represent reality, and peak is useless because who cares about 200fps when you're staring at a wall.
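To put rough numbers on it, here's a quick sketch with made-up frame times (a steady 60fps run with one stutter frame thrown in, the exact values are just for illustration) showing how a single hitch drags the minimum way down while barely denting the average:

```python
# Hypothetical run: ~60 seconds at a steady 60 fps, plus one 250 ms stutter
# frame where a level change pushes data across the PCIe bus.
frame_times_ms = [16.7] * 3599 + [250.0]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # total frames / total seconds
min_fps = min(fps_per_frame)
max_fps = max(fps_per_frame)

print(f"average fps: {avg_fps:.1f}")  # ~59.6 -- barely dented by the stutter
print(f"minimum fps: {min_fps:.1f}")  # 4.0   -- looks like a massive slowdown
print(f"maximum fps: {max_fps:.1f}")  # ~59.9 -- tells you nothing useful
```

One bad frame out of 3600 and the min reads 4fps, which is exactly the kind of number that makes a perfectly smooth run look broken on a benchmark chart.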