Animate 3D: Face Tracking FAQ
Animate 3D Face Tracking is markerless motion capture from any RGB camera. Full-body tracking with face is now possible - no hardware or software needed! Check out our full announcement here.
The new Face Tracking Toggle is located under the ‘Animation Output’ settings. Simply turn it on and generate your animations as normal.
You can upload any video that is full-body (head-to-toe), half-body (head-to-waist), or headshot (full face within frame). The larger and clearer the face is in frame, the more accurate your results will be.
The face tracking animation is included as blendshape weights in the normal default animation export, or comes already retargeted onto your custom character. From the default export, you can then retarget it to a character of your choice.
Our default characters work best with face tracking; however, you can use your own character with a 39-blendshape subset of the 52 ARKit blendshape standard, or use the custom characters generated by the built-in avatar creator.
See the Face Tracking Technical Specifications below.
If you use our default characters to create the facial animations, you can retarget the blendshape weights, which conform to the ARKit Blendshape standard, to your own characters yourself in your favorite DCC tools. If you use Custom Characters to create the facial animations, and your custom character has a face rig containing the 39-blendshape subset of the 52 ARKit Blendshape standard, the downloaded .FBX or .GLB animations will already be retargeted to your custom character.
Our face tracking output uses a subset of the 52 ARKit Blendshape standard. Our specific setup includes 39 blendshapes in total, plus rotations on one head joint and two eyeball joints. You can use the full standard, but make sure the Blendshape Specifications below are followed so that animation retargeting and custom characters work correctly.
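To make the export format concrete, the sketch below shows the standard blendshape math: each frame's exported weights scale per-shape vertex deltas that are added to the base mesh. This is a generic illustration of how blendshape weights deform a mesh, not DeepMotion's implementation; the mesh data and weights in the example are made up.

```python
def apply_blendshapes(base_vertices, shape_deltas, weights):
    """Deform a mesh by weighted blendshape deltas.

    base_vertices: list of (x, y, z) tuples for the neutral face.
    shape_deltas:  {blendshape_name: list of (dx, dy, dz)} per vertex.
    weights:       {blendshape_name: float in [0, 1]} for one frame.
    """
    result = [list(v) for v in base_vertices]
    for name, w in weights.items():
        deltas = shape_deltas.get(name)
        if deltas is None or w == 0.0:
            continue  # shapes the model lacks are simply skipped
        for i, (dx, dy, dz) in enumerate(deltas):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# A half-open jaw: weight 0.5 applies half of the jawOpen delta.
base = [(0.0, 0.0, 0.0)]
deltas = {"jawOpen": [(0.0, -1.0, 0.0)]}
posed = apply_blendshapes(base, deltas, {"jawOpen": 0.5})
```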
Name Your Blendshapes Correctly: Custom characters with the 39 ARKit blendshapes below can be used for face tracking; the full standard set of 52 blendshapes can also be used if desired. When face tracking is enabled, Animate 3D applies the blendshape weights by blendshape name, so make sure your model’s blendshape names exactly match the ARKit standard. If they don’t, rename them before you upload to Animate 3D.
Joint Setup: Your character rig needs to have a head joint and two eyeball joints. The eyeball joints should control the rotation of your eyeball mesh, which should be looking straight ahead by default.
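A quick sanity check on the joint requirement can be scripted before upload. The joint names used here ("Head", "LeftEye", "RightEye") are assumptions for illustration; substitute whatever names your own rig uses for the head and eyeball joints.

```python
# Hypothetical pre-upload check that a rig contains the joints face
# tracking drives: one head joint and two eyeball joints.
REQUIRED_FACE_JOINTS = {"Head", "LeftEye", "RightEye"}  # assumed names

def missing_face_joints(rig_joint_names):
    """Return the set of required face joints absent from the rig."""
    return REQUIRED_FACE_JOINTS - set(rig_joint_names)

# Example: a rig missing its right eyeball joint.
gaps = missing_face_joints(["Hips", "Spine", "Head", "LeftEye"])
```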
Full-Body Humanoid Characters Needed: We currently only support full-body custom characters, not head-only ones. This is because face tracking is a supplement to body tracking; we don’t support capturing the face alone. Your character therefore also needs to satisfy our custom character requirements for body tracking. You can check out our Custom Character FAQ here.
These 39 Blendshapes are a subset of the ARKit blendshapes:
"mouthLeft",
"mouthRight",
"mouthSmileLeft",
"mouthSmileRight",
"mouthDimpleLeft",
"mouthDimpleRight",
"mouthStretchLeft",
"mouthStretchRight",
"mouthFrownLeft",
"mouthFrownRight",
"mouthPressLeft",
"mouthPressRight",
"mouthPucker",
"mouthFunnel",
"mouthUpperUpLeft",
"mouthUpperUpRight",
"mouthLowerDownLeft",
"mouthLowerDownRight",
"mouthShrugUpper",
"mouthShrugLower",
"mouthRollUpper",
"mouthRollLower",
"cheekPuff",
"cheekSquintLeft",
"cheekSquintRight",
"jawOpen",
"jawLeft",
"jawRight",
"jawForward",
"browInnerUp",
"browOuterUpLeft",
"browOuterUpRight",
"browDownLeft",
"browDownRight",
"noseSneerLeft",
"noseSneerRight",
"mouthClose",
"eyeBlinkLeft",
"eyeBlinkRight"
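Since Animate 3D matches blendshapes strictly by name, a small script can verify a model's shape names against the 39-name subset above before upload. This is an unofficial sketch; how you extract the name list from your model (FBX SDK, Blender's Python API, etc.) is up to your tooling.

```python
# The 39-blendshape subset listed above, used for a pre-upload name check.
ARKIT_SUBSET_39 = {
    "mouthLeft", "mouthRight", "mouthSmileLeft", "mouthSmileRight",
    "mouthDimpleLeft", "mouthDimpleRight", "mouthStretchLeft",
    "mouthStretchRight", "mouthFrownLeft", "mouthFrownRight",
    "mouthPressLeft", "mouthPressRight", "mouthPucker", "mouthFunnel",
    "mouthUpperUpLeft", "mouthUpperUpRight", "mouthLowerDownLeft",
    "mouthLowerDownRight", "mouthShrugUpper", "mouthShrugLower",
    "mouthRollUpper", "mouthRollLower", "cheekPuff", "cheekSquintLeft",
    "cheekSquintRight", "jawOpen", "jawLeft", "jawRight", "jawForward",
    "browInnerUp", "browOuterUpLeft", "browOuterUpRight", "browDownLeft",
    "browDownRight", "noseSneerLeft", "noseSneerRight", "mouthClose",
    "eyeBlinkLeft", "eyeBlinkRight",
}

def check_blendshape_names(model_shape_names):
    """Return (missing, extra) relative to the 39-name subset.

    Missing names must be added or renamed before upload; extra names
    (e.g. the rest of the full 52 ARKit set) are simply unused.
    """
    names = set(model_shape_names)
    return ARKIT_SUBSET_39 - names, names - ARKIT_SUBSET_39
```

For example, a model whose smile shape is exported as "MouthSmile_L" would show up under `missing`, flagging it for a rename to "mouthSmileLeft".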