Didn't make a great deal of difference for games either way. Although I'd argue more res was that little bit more useful for smaller/clearer UIs. I guess NTSC was marginally less flickery.

20% higher framerate is more important than 20% higher resolution.
Yet ironically 60 isn't divisible by 24, so movies had to use a 3:2 pulldown, whereas for PAL movies were just run that bit faster at 25 fps.

Plus, 60 is divisible by 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, and 60. 50 is only divisible by 1, 2, 5, 10, 25, and 50. This gives a bit more leverage when updating below the refresh rate, especially when you miss an update.
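A quick illustrative sketch of that arithmetic (my own, not anything from a real video pipeline): list the divisors of each refresh rate, and derive the per-frame pacing pattern you get when a film rate doesn't divide the refresh evenly.

    # Hypothetical illustration of the divisor/pulldown arithmetic above.
    def divisors(n):
        # Frame rates that divide the refresh evenly, i.e. judder-free fallbacks.
        return [d for d in range(1, n + 1) if n % d == 0]

    def pacing(film_fps, refresh_hz, frames=8):
        # Refresh ticks spent on each source frame when pacing film_fps onto refresh_hz.
        return [(i * refresh_hz) // film_fps - ((i - 1) * refresh_hz) // film_fps
                for i in range(1, frames + 1)]

    print(divisors(60))    # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
    print(divisors(50))    # [1, 2, 5, 10, 25, 50]
    print(pacing(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] -> the 3:2 pulldown cadence
    print(pacing(25, 50))  # [2, 2, 2, 2, 2, 2, 2, 2] -> PAL's even pacing, film sped up to 25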
None of which matters, as the reason for the different update rates was the power system, not a technical decision to choose better framerates or anything. Europe ran on 50 Hz AC so the electronics sync'd to that, and NA and much of RoW ran at 60 Hz and sync'd to that.
Once displays decoupled from the power supply, 60 Hz was the more established refresh rate so the one electronics were built for. No particular technical reason. This then affected EU TV content shot at 50 fps being displayed on 60 Hz screens. TBH I don't even know if 'PAL' is still a thing. Hopefully not! So we then had the worst of all worlds, juddery TV and juddery movies. But now displays can sync to all sorts of content, it's not an issue. Yay.
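For a concrete picture of that judder (same illustrative pacing idea as the sketch above, not from any real scaler): a 60 Hz panel has to show every fifth frame of 50 fps content twice.

    # Hypothetical: 50 fps content paced onto a 60 Hz panel.
    print([(i * 60) // 50 - ((i - 1) * 60) // 50 for i in range(1, 11)])
    # [1, 1, 1, 1, 2, 1, 1, 1, 1, 2] -> every 5th frame doubled = judder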
/history.
Well I'd say it's still either a choice of 1080p or 2160p.

Part of the problem for consoles is that once upon a time everyone's TV had the same resolution, NTSC or PAL; now that's no longer the case.