Just to add to what London-Boy just replied: if your TV refreshes at 60 Hz and your game runs at 45 frames per second, the game's frame rate and the TV's refresh rate wouldn't be properly synced, thus producing tearing.
If your game, however, runs at 30 fps and your TV refreshes at 60 Hz, then in a simplified example, the TV would receive a new frame every second refresh - no fluctuations, and as a result no tearing. A game running at 60 fps on a TV with a 60 Hz refresh rate would be running in parallel, so you wouldn't get any tearing either.
If your game runs at 45 fps and your TV refreshes at 60 Hz, the fundamental problem is that they both start in parallel (the TV's 1st refresh = the game's 1st frame), but from that point on they'd keep missing each other when the refreshes occur.
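You can see the mismatch with a quick bit of arithmetic. This is just a rough sketch under an idealized assumption (the display refreshes exactly every 1/60 s and the game completes a frame exactly every 1/fps s - real frame times jitter, of course):

```python
REFRESH_HZ = 60  # assumed TV refresh rate

def aligned_refreshes(fps, n_frames=12):
    """Count how many of the first n_frames frame completions
    land exactly on a display refresh tick."""
    refresh_period = 1 / REFRESH_HZ
    frame_period = 1 / fps
    hits = 0
    for i in range(1, n_frames + 1):
        t = i * frame_period          # time this frame finishes
        ticks = t / refresh_period    # refresh count at that moment
        # A frame only lines up if it finishes exactly on a refresh
        if abs(ticks - round(ticks)) < 1e-9:
            hits += 1
    return hits

print(aligned_refreshes(30))  # -> 12: every frame lands on a refresh
print(aligned_refreshes(60))  # -> 12: every frame lands on a refresh
print(aligned_refreshes(45))  # -> 4: only every 3rd frame lines up
```

At 45 fps, a frame finishes every 1/45 s while the TV scans out every 1/60 s, so only every third frame (every 1/15 s) coincides with a refresh - the rest arrive mid-scanout, which is exactly when tearing shows up.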
My example using PAL refresh rates was an unlucky one - living in PAL territory, I sometimes use those numbers while forgetting that the majority uses NTSC and the popular 30 / 60 fps. Hope that answers your question.