Hi,
I want to render in Unreal but also render the figure inside iClone, and then composite the two renders in post.
This should be easy with an iClone-piloted camera, but in reality it isn't, because the Unreal and iClone renders do not match well enough for a frame-by-frame overlay.
I believe the solution is to change the way Live Link transfers the animation. Why does the transfer need to happen in real time? I would prefer Live Link to have the option to transfer slowly but accurately! To me, frame accuracy is far more important.
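To make concrete what I mean by "slow but accurate", here is a rough Python sketch of the two transfer models I have in mind. Every function in it is a made-up placeholder, not the real Live Link or iClone API; it only illustrates the difference between streaming against the wall clock and stepping the timeline one frame at a time.

```python
# Sketch only: hypothetical placeholders, NOT the actual Live Link / iClone API.
import time

FPS = 24                 # target render frame rate
FRAME_COUNT = 48         # two seconds of animation
FRAME_TIME = 1.0 / FPS

def sample_pose(frame):
    """Placeholder: evaluate the character pose at an exact frame index."""
    return {"frame": frame, "time": frame * FRAME_TIME}

def send_pose(pose):
    """Placeholder: push one pose across the link and wait for confirmation."""
    return pose["frame"]

# Mode A: real-time streaming. Poses are sampled against the wall clock, so if
# the sender can't keep up, frames are dropped or time-shifted on the receiver.
def stream_realtime(duration_s):
    received = []
    start = time.monotonic()
    while (elapsed := time.monotonic() - start) < duration_s:
        frame = int(elapsed * FPS)        # whichever frame "now" happens to land on
        received.append(send_pose(sample_pose(frame)))
        time.sleep(0.001)                 # simulated per-pose cost
    return received

# Mode B: offline, frame-stepped transfer (what I am asking for). The timeline
# is advanced one frame at a time and each pose is confirmed before moving on,
# so the receiver ends up with exactly frames 0..N-1, however long it takes.
def transfer_frame_accurate(frame_count):
    return [send_pose(sample_pose(f)) for f in range(frame_count)]

if __name__ == "__main__":
    rt = stream_realtime(FRAME_COUNT * FRAME_TIME)
    fa = transfer_frame_accurate(FRAME_COUNT)
    print("real-time:     ", len(set(rt)), "unique frames out of", FRAME_COUNT)
    print("frame-stepped: ", len(set(fa)), "unique frames out of", FRAME_COUNT)
```

Mode B is obviously slower, but that is the point: for compositing I don't care how long the transfer takes, only that both sides end up with the same pose on the same frame number.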
I am also not sure about the 60 frames per second tutorial you have put out. Most films render at 24 fps. Do I need 60?
I've watched this video on getting a proper frame rate:
https://www.youtube.com/watch?v=bOlnjHQuSFQ
I do not think this video is very good because it skips some really important information:
1. Why 60? Can't I capture at 24 if that's my target render frame rate?
2. If a higher capture frequency gives better results, wouldn't multiples of 24 be better than 60? 48, perhaps? (See my rough arithmetic after this list.)
3. I use a Titan X Pascal and at times can't get over 26 fps with a single character. What sort of hardware setup are you using in this video to get 60 fps?
4. How should the figure be set up? Mine is interacting with linked props; some could be combined. Would that speed things up?
5. Does texture size matter? I assume not, but I don't know.
6. If I did manage to reach 60 fps, would it even make a difference for the 24 fps render output I am aiming for?
7. Even if I reach 60 fps, I am guessing I'd still have time slippage between a 24 fps render in iClone and a 24 fps render in Unreal. Is there any way to get frame-accurate renders between UE and iClone for compositing?
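Here is the quick arithmetic behind questions 2 and 7, as a small Python sketch. No iClone or Unreal API is involved; it just checks how far each 24 fps render frame falls from the nearest captured sample at a few capture rates.

```python
# Plain arithmetic: how well do various capture rates line up with 24 fps output?
RENDER_FPS = 24

def worst_offset_ms(capture_fps, seconds=1):
    """Worst-case gap (in ms) between a 24 fps render frame and the nearest capture."""
    worst = 0.0
    for f in range(RENDER_FPS * seconds):
        t = f / RENDER_FPS                              # when the render frame falls
        nearest = round(t * capture_fps) / capture_fps  # closest captured sample time
        worst = max(worst, abs(t - nearest) * 1000.0)
    return worst

for fps in (24, 48, 60, 72):
    print(f"capture at {fps:>2} fps -> worst offset from a 24 fps frame: "
          f"{worst_offset_ms(fps):.2f} ms")
```

If that arithmetic is right, 24, 48 and 72 land exactly on the 24 fps frame times, while 60 leaves roughly an 8 ms offset on every other render frame, which is exactly why I am asking whether a multiple of 24 would be the better target.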