Where did you generate the Ethereum address you're mining to, and what kind of wallet are you using? First, it seems your payout threshold is still at the default of one full ETH. You need to lower it to the minimum of 0.05 ETH to force a payout. Then the transaction fee (and maybe a fixed deduction) is subtracted, and the remaining ETH is credited to your address - wherever you got that from; maybe it even shows up inside your wallet by then.
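Just to make the math concrete, here's a rough sketch of how a pool payout typically works. The 0.05 ETH minimum matches what I described above, but the exact fee is an assumption and varies by pool, so check your pool's docs:

# Rough sketch of a typical pool payout, not any specific pool's rules.
# The fee value below is an assumption; pools differ.
payout_threshold = 0.05   # ETH, lowered from the 1 ETH default
tx_fee = 0.001            # ETH, assumed network/pool fee per payout

def payout(balance_eth):
    """Return the amount credited to your address, or 0 if still below threshold."""
    if balance_eth < payout_threshold:
        return 0.0
    return balance_eth - tx_fee

print(payout(0.06))  # e.g. 0.059 ETH actually lands at your address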
Is there a reason to think that the GTX 1070 Ti will perform better than the original GTX 1070 in mining?
I know the memory bandwidth and latency specs should be the same, but hopefully the memory will be of better quality and allow higher overclocking.
Also, do the roughly 27% more CUDA cores help with mining?
Remember that mining, contrary to public opinion, is not Ethereum-only. Ethereum indeed is largely memory-bound, and there you can eke out maybe slightly better efficiency thanks to two factors (if you're lucky): One, you could clock the chip lower and still hit the minimum FLOPS needed to saturate the memory bandwidth, thus saving a few watts. Two, you're practically guaranteed a newer GPU production-wise, which could have slightly better electrical characteristics due to improved process maturity at TSMC.
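To illustrate why memory bandwidth is the ceiling for Ethash, here's a back-of-the-envelope sketch. The 256 GB/s figure is the spec value for 8 Gbps GDDR5 on a 256-bit bus (same on both cards); real-world hashrates land somewhat below the ceiling:

# Back-of-the-envelope Ethash hashrate ceiling from memory bandwidth.
# Ethash does 64 mix accesses of 128 bytes each per hash = 8 KiB of DAG reads.
BYTES_PER_HASH = 64 * 128  # 8192 bytes

def ethash_ceiling_mhs(bandwidth_gb_s):
    """Theoretical upper bound in MH/s for a given memory bandwidth in GB/s."""
    return bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6

# 1070 and 1070 Ti both have 256 GB/s of bandwidth.
print(ethash_ceiling_mhs(256))  # ~31 MH/s ceiling; real cards land around 26-30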
But there are (a lot of?) coins which also depend on FLOPS; Zcash is the foremost that comes to mind. Here, higher TFLOPS lead to higher earnings, and that's a niche where a Ti would show an advantage over a non-Ti. BUT - and this is the big but - pricing so far seems to put the Ti almost on par with the cheaper 1080s, which in turn have slightly more FLOPS again. With those, though, you're basically giving up the option of mining ETH because of GDDR5X.
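For a rough sense of that FP32 gap, here's a small sketch using Nvidia's reference core counts and boost clocks (2 FLOPs per core per clock); factory-overclocked partner cards will land higher:

# Rough FP32 throughput comparison from reference specs.
# Boost clocks are Nvidia's reference values; partner cards clock higher.
cards = {
    "GTX 1070":    (1920, 1683e6),  # (CUDA cores, boost clock in Hz)
    "GTX 1070 Ti": (2432, 1683e6),
    "GTX 1080":    (2560, 1733e6),
}

for name, (cores, clock) in cards.items():
    tflops = 2 * cores * clock / 1e12
    print(f"{name}: ~{tflops:.1f} TFLOPS")
# -> ~6.5, ~8.2, ~8.9 TFLOPS respectively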
A 1070 Ti, if you can get a decently priced one, would be your most flexible, but not best-in-all-cases, option for mining and gaming from Nvidia. And there are the Radeon RX Vega cards to consider as well; especially for XMR, DCR and ETH they seem to do well - in fact, for ETH I know it, for the others it's an educated guess.