I have no doubt that they did the demo; what I question is whether the level of DETAIL in the simulation matched the level of detail in the REAL LOTR scenes, or whether it was CUT DOWN for the demo the same way NVidia CUT DOWN the resolution of the geometry and textures in the Final Fantasy demo.
LOTR was rendered on 200+ dual-CPU Linux boxes backed by 12 terabytes of network-attached storage. The average scene database was over 2 GB, IIRC. This means that, without even thinking about rendering, the CPU would have to compute and generate over 2 GB of scene data per frame; at real-time frame rates that's tens of gigabytes per second, a data rate that far outstrips the bandwidth of the CPU.
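Just to put rough numbers on it (my own assumed figures, not anything from WETA): 2 GB per frame at, say, 30 fps comes out to roughly 60 GB/s.

#include <stdio.h>

int main(void) {
    /* Back-of-envelope only; 2 GB/frame and 30 fps are assumptions. */
    const double bytes_per_frame = 2.0 * 1024 * 1024 * 1024;  /* ~2 GB scene data */
    const double frames_per_sec  = 30.0;                      /* assumed real-time rate */

    double gb_per_sec = bytes_per_frame * frames_per_sec / (1024.0 * 1024 * 1024);
    printf("Required throughput: %.0f GB/s\n", gb_per_sec);   /* prints ~60 GB/s */
    return 0;
}

That's an order of magnitude beyond what the memory subsystem of a single CPU of that era could move, let alone generate.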
If I simply wrote a one-line C program that did nothing but write out a stream of bytes over and over, I wouldn't be able to push that much data.
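A minimal sketch of the kind of program I mean (buffer size and byte count are arbitrary choices of mine):

#include <stdio.h>
#include <string.h>

int main(void) {
    /* Write ~2 GB of raw bytes to stdout as fast as possible.
       Time it, e.g. `time ./writer > /dev/null`, and compare the sustained
       rate against 2 GB per *frame* at real-time frame rates. */
    static char buf[1 << 20];                 /* 1 MB buffer of zeros */
    memset(buf, 0, sizeof buf);
    for (int i = 0; i < 2048; i++) {          /* 2048 x 1 MB = 2 GB */
        if (fwrite(buf, 1, sizeof buf, stdout) != sizeof buf)
            return 1;                         /* bail out on a short write */
    }
    return 0;
}

And that's just copying pre-made bytes, not computing a scene database.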
But if you want to believe that Massive was running the true LOTR scene data, in full detail, at real-time frame rates, go right ahead.
Looks like those 200+ dual-CPU Linux boxes were a big waste of money then, and someone at WETA Digital is in trouble.