Votes
4
Product:
iClone 6
Version:
6.42
Status:
Active
Issue 288
Facial Puppeteer Improvements
In working with the Face Puppet, I'd like to suggest a revamp of the controls to add more capability. Moving my mouse around for big movements like turning the head or looking up and down is fine, but for finer, more subtle movements it seems unnecessarily difficult. What occurred to me is that you can't keyframe the facial performance in any precise way. Then it occurred to me that you have all of these expressions, but no way to precisely mix them during a performance. As I was looking over Persona and the Perform menu option, it dawned on me that we should have a specialized Persona Studio for the face: something where you can see the face in perspective or orthographic view. When you select the iAvatar, a list of their facial expressions appears to the left or right, with the timeline for the face below. Before you start the performance, you should be able to scroll through the list of expressions, check the ones you're going to use, and add them to the performance tab, so you only have to scroll through the stuff you need and not the whole list.

Then you start the vocal track, and instead of clicking and dragging with the mouse, you click on each item through the take. So you can set when they blink just by clicking on the blink button, which adds a keyframe for the action. Here's where it gets interesting. Once you've gone through the blink track, you can click on a keyframe and then add any other facial change at that time, which adds an entry for it that you can offset, plus you get a strength slider for the action so that you can make it subtle or exaggerated. If you like the combination of dials, you can save it as a quick expression that can be saved to each actor. You can also stretch/squash the time it takes to perform the action so that you can have a slow blink or a fast blink, as you prefer. Or if you want the character to smile from one point all the way through the dialogue, you can set the expression to remain through the performance and just adjust it subtly by adding facial tics. If the character is bone-based, then you would have visual locators for the face bones so that you could do the same thing with a bone-based character. The idea is not to re-invent anything but to give you access to what's already there in a more precise fashion.
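To make the proposal concrete, here's a minimal sketch of how the track data behind such an editor might look: each checked expression gets its own track of keyframes, and each keyframe carries a time, a strength (the slider), and a duration (the stretch/squash). All names here (`ExpressionTrack`, `ExpressionKey`, etc.) are purely illustrative, not anything from iClone's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class ExpressionKey:
    time: float      # seconds into the vocal track
    strength: float  # slider value: 0.0 = subtle .. 1.0 = exaggerated
    duration: float  # stretch/squash: how long the action takes

@dataclass
class ExpressionTrack:
    name: str                          # e.g. "Blink", "Smile", or a custom morph
    keys: list = field(default_factory=list)
    hold: bool = False                 # keep the expression through the performance

    def add_key(self, time, strength=1.0, duration=0.2):
        """Clicking the expression button during the take adds a keyframe."""
        self.keys.append(ExpressionKey(time, strength, duration))

    def offset(self, delta):
        """Shift every keyframe in the track by delta seconds."""
        for k in self.keys:
            k.time += delta

# Clicking "Blink" twice during the take, then nudging the whole track later:
blink = ExpressionTrack("Blink")
blink.add_key(1.2)                               # normal blink
blink.add_key(3.5, strength=0.4, duration=0.6)   # slow, subtle blink
blink.offset(0.1)                                # offset the whole track by 0.1 s
```

The point of the sketch is just that once each action is a keyframe with a strength and a duration, offsetting, exaggerating, and stretching it all become simple edits rather than re-recorded takes.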

The idea is to use what's already available to the character. What makes it even more powerful is that if the character already has custom facial expressions, you don't have to do anything but dial them in during the performance. So if you brought in a custom morph for, say, vampire fangs and added it to the character's custom facial expressions, you could now keyframe it and have it dial in slowly or quickly. I think it's a better way to get a character performance because it is more precise than doing take after take with the puppeteer. I've done a mockup of what I have in mind.
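The "dial in slowly or quickly" part is just an interpolation of the morph's strength between two keyframes. A hypothetical sketch, assuming a simple linear ramp (the function name and parameters are made up for illustration, not from any real tool):

```python
def morph_weight(t, t_start, t_end, w_start=0.0, w_end=1.0):
    """Linearly interpolate a morph's strength between two keyframed values.

    Before t_start the morph holds w_start; after t_end it holds w_end;
    in between it ramps linearly.
    """
    if t <= t_start:
        return w_start
    if t >= t_end:
        return w_end
    frac = (t - t_start) / (t_end - t_start)
    return w_start + frac * (w_end - w_start)

# A slow 2-second dial-in vs. a quick 0.25-second one, sampled at t = 1.0 s:
slow = morph_weight(1.0, 0.0, 2.0)    # halfway through the slow ramp
quick = morph_weight(1.0, 0.0, 0.25)  # the quick ramp has already finished
```

Squashing or stretching the keyframe spacing is all it takes to turn a slow creep of the fangs into an instant snap.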
OS: Windows 10
Attachment:
  • FacePuppet2.jpg
Submitted by will2power71
COMMENTS (2)
VirtualMedia
Couldn't agree more. The face puppet has a lot of untapped potential but is very limited in its current form. Improved facial animation tools, imho, are the most important feature that needs to be addressed in IC.
Delerna
I am definitely voting for facial puppeteer improvements. The record-and-move-the-mouse method kind of works for certain things, but keyframing facial emotions and being able to move keyframes around in the timeline would give much better control and fine-tuning where needed.