These guys here: https://penrosestudios.com/ (watch "Experience Penrose"!) are using a facial mocap app where the actor can see the mocap results immediately on the iPhone screen in front of them.
How about implementing this in LIVE FACE? I'm not sure whether it would slow down communication between iClone and the iPhone, but rendering a dummy character directly in the app could avoid extensive data exchange between LIVE FACE and the PC.
I think this could make things much easier and give faster results.
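Just to illustrate the idea roughly (not saying this is how LIVE FACE works internally): a minimal ARKit sketch where the phone drives a local dummy head on-screen and only streams the small set of blendshape coefficients to the PC. The `sendToPC` callback and the morph-target names on the dummy head are assumptions for the example.

```swift
import ARKit
import SceneKit

// Hypothetical sketch: track the face with ARKit, animate a local "dummy"
// preview head on the phone, and send only the ~52 blendshape coefficients
// (a few hundred bytes per frame) to the PC.
final class FacePreviewController: NSObject, ARSCNViewDelegate {

    let sceneView: ARSCNView
    // Placeholder for whatever network call the plugin already uses;
    // only the compact coefficient dictionary would go over the wire.
    let sendToPC: ([ARFaceAnchor.BlendShapeLocation: Float]) -> Void

    init(sceneView: ARSCNView,
         sendToPC: @escaping ([ARFaceAnchor.BlendShapeLocation: Float]) -> Void) {
        self.sceneView = sceneView
        self.sendToPC = sendToPC
        super.init()
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // 1. Drive the on-screen dummy head locally, so the actor sees the
        //    result with no network round trip. (Assumes the dummy head's
        //    morph targets are named after the ARKit blendshape keys,
        //    e.g. "jawOpen", "eyeBlinkLeft".)
        if let morpher = node.childNodes.first?.morpher {
            for (location, weight) in faceAnchor.blendShapes {
                morpher.setWeight(CGFloat(truncating: weight),
                                  forTargetNamed: location.rawValue)
            }
        }

        // 2. Send only the compact coefficients to the PC.
        let coefficients = faceAnchor.blendShapes.mapValues { $0.floatValue }
        sendToPC(coefficients)
    }
}
```

The point is that the preview rendering stays entirely on the phone, so the data going to iClone wouldn't grow at all compared to what's sent today.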