This is sort of a Part 2 of my previous thread. I've decided to go the Mac mini route: an i7-8700B 6C/12T CPU and 32GB of RAM connected to an eGPU. Paired with it is a recently purchased 60Hz 4K IPS display with very decent 1440p scaling, so high-FPS gaming is not in the cards now or in the future. Gaming can, and mostly will, be done in Windows. For the best macOS compatibility and the least amount of "hacking" on my part, I'll be going with an AMD GPU.
I've had my eye on the 5700 XT, as I suspect it won't bottleneck too badly at 60 FPS over the 4-lane PCIe 3.0 (plus overhead) eGPU connection. Or should I wait and see what RDNA2 brings to the table? With luck, 4K 60 FPS, or 1440p 60 FPS with ray tracing, becomes possible. If so, I don't mind shoving a second-hand RX 580 in the box and waiting a bit. Then again, that might be fruitless if it bottlenecks regardless.
So the real head-scratcher is how badly games will saturate the 4-lane PCIe 3.0 link (plus overhead) at my 60 FPS, high-resolution/eye-candy target. Which raises the question in the title: where is the point of diminishing returns? What GPU, realistically, is the practical limit here?
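For a rough sense of the ceiling, here's a back-of-envelope bandwidth comparison. The 22 Gb/s figure is the commonly quoted usable PCIe cap over Thunderbolt 3 (the cable carries 40 Gb/s total, but PCIe data gets only a slice of it), not something I've measured on this box:

```python
# Back-of-envelope check of the eGPU link versus native slots.
# Assumed figures: ~3.94 GB/s for a native PCIe 3.0 x4 slot after
# 128b/130b encoding, and ~22 Gb/s of PCIe data over Thunderbolt 3.

GBIT_TO_GBYTE = 1 / 8  # gigabits per second -> gigabytes per second

pcie3_x4_native = 3.94                    # GB/s, native x4 slot
tb3_pcie_usable = 22 * GBIT_TO_GBYTE      # ~2.75 GB/s over the eGPU cable
pcie3_x16 = 15.75                         # GB/s, what a desktop card expects

print(f"Native PCIe 3.0 x4 : {pcie3_x4_native:.2f} GB/s")
print(f"TB3 eGPU usable    : {tb3_pcie_usable:.2f} GB/s")
print(f"Fraction of x16    : {tb3_pcie_usable / pcie3_x16:.0%}")
```

So the eGPU link offers well under a fifth of the x16 bandwidth a desktop card is designed around, and a bit under three quarters of even a native x4 slot. How much that hurts at 60 FPS depends on the game's per-frame transfer traffic, which is exactly the open question.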