Running into the final 5 months before games go gold, having development kits with only half the memory bandwidth would not be optimal. MS and Sony were shipping development kits far closer to final unit performance this close to launch. Unless, of course, 25 GB/s is indeed the final figure.
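For context on where a figure like 25 GB/s could sit: peak bandwidth is just bus width times transfer rate, so (assuming an LPDDR4-style configuration, which is purely my assumption and not a leaked spec) a single 64-bit channel at 3200 MT/s lands right around the quoted number, while doubling the bus width gives the "dev kit is half of final" case. Quick back-of-the-envelope:

```python
# Sketch only; the memory type, bus width and transfer rate are assumptions.
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer x million transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mts / 1000

print(peak_bandwidth_gbs(64, 3200))   # 25.6 GB/s - close to the quoted dev kit figure
print(peak_bandwidth_gbs(128, 3200))  # 51.2 GB/s - what "half bandwidth" would imply for retail
```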
Nintendo have traditionally waited on software, rather than hardware, to be ready for launch.
Looking back to last year, cost per transistor was speculated to be lower on 20 nm than on 16 nm (http://www.eetimes.com/author.asp?doc_id=1326937). For a company willing to sign a long-term contract, there may be some good deals to be had on an older process such as 20 nm.
For the Wii U, Nintendo went with Renesas's 45 nm process for the GPU when IBM had a newer 32 nm eDRAM-capable process (which MS utilised for the final 360 shrink), and with IBM's 45 nm for the CPU when IBM were already rolling out their own Power processors on 32 nm. Presumably, it was cheaper for Nintendo to shop around for older nodes than to go with an all-IBM SoC.
You don't think the 16/14 nm node got cheaper after the Galaxy S6 launched on 14 nm? That report is 16+ months old and will be nearly two years old at launch. With 16/14 nm production now far wider, and with the new 10 nm process taking hold for the big manufacturers, I have to assume the deals would all be in favor of 16/14 nm over the stopgap 20 nm process. The prices were already close over a year ago, when Nintendo and Nvidia were really nailing the design down, so I think 16/14 nm would have been the obvious choice for them, considering the ability to drop active cooling at the X1's target performance, which would drastically decrease the price of the unit (dollars rather than cents).