I don't understand these strange theories. Robert Hallock posted it straight on Reddit: 62.5 FPS is the average FPS result, and utilization per batch was 51%, 71.9%, and 92.3%. It's dead simple.
62.5 FPS is more than 58.7 FPS → better performance.
Better efficiency simply means that they achieve more FPS per watt in that test than a GTX 1080 does.
Hallock confirmed on Reddit that 62.5 FPS was the average FPS result, and the average utilization for that result was 1.83x, i.e. 91.5% per card.
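The arithmetic behind those numbers is straightforward; a quick sketch (the 62.5 FPS and 1.83x figures are from Hallock's Reddit post, the per-card split is just the combined scaling divided by two, and the wattage figure is a hypothetical placeholder, not a measured value):

```python
avg_fps = 62.5            # reported dual-GPU average FPS
combined_scaling = 1.83   # reported average utilization (2.0x would be perfect scaling)

# Per-card utilization: combined scaling split evenly across both cards
per_card = combined_scaling / 2
print(f"Per-card utilization: {per_card:.1%}")   # 91.5%

# "Efficiency" read as perf/W: FPS divided by board power.
def fps_per_watt(fps, watts):
    return fps / watts

# Hypothetical power draw, only to show the calculation shape:
print(f"Perf/W at 250 W: {fps_per_watt(avg_fps, 250):.3f} FPS/W")
```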
I don't know about the efficiency stuff; I think people are reading that to mean perf/W.