Rokoko, makers of the Smartsuit, have announced a facial motion capture add-on for Rokoko Studio, coming in June. Details are sparse at this time, but the initial version will capture and record facial tracking data into Rokoko Studio, which can be broadcast along with Smartsuit data to Rokoko's real-time plugins for Unity, Unreal, and MotionBuilder.
No information is available yet on how to make real-time models compatible with the iPhone X data. Other mobile devices will be supported in the future.
The subscription fee will be $35/month – hopefully that will be as flexible as the other Rokoko Studio add-ons, which allow month-to-month subscriptions as needed and discounted yearly subscriptions.
Helsinki-based Next Games has released ARKit Animation Tool as a free asset for Unity. It records facial blendshapes from the iPhone X face scanner into the Unity editor through Apple's ARKit.
ARKit Animation Tool on the Asset Store
Realtime for iClone is a facial tracking application powered by Faceware Technologies that communicates solely with Reallusion's iClone 7. Realtime for iClone helps Reallusion empower indie developers and professional studios with tools for real-time facial motion capture and recording. Fast, accurate, and markerless, all from a PC webcam.
Apparently this requires two purchases, in addition to iClone 7 and Character Creator 2:
- iClone Facial Mocap Plug-in for Faceware ($599)
- Faceware Realtime for iClone ($599)
Add one more purchase, 3DXchange 7 Pipeline, if you want to export the figures or animations out of the iClone universe.
No one ever claimed iClone or its attractive Character Creator is cheap, but unsurprisingly many software bundles can be purchased through Reallusion at a constantly fluctuating discount.
Apple's ARKit documentation includes instructions for mapping facial expressions to blendshapes when using the face recognition technology on the iPhone X.
A Unity blog post ARKit Face Tracking on iPhone X states that Unity will be releasing a sample scene where ARKit is used to animate a 3D head, although that demo scene is not yet available.
I’ve compiled a list of ARKit’s blendshapes from the Apple Developer website:
Mouth and Jaw
Eyebrows, Cheeks, and Nose
Apple Developer: Creating Face-Based AR Experiences
Each key in this dictionary (an ARFaceAnchor.BlendShapeLocation constant) represents one of many specific facial features recognized by ARKit. The corresponding value for each key is a floating point number indicating the current position of that feature relative to its neutral configuration, ranging from 0.0 (neutral) to 1.0 (maximum movement).
You can use blend shape coefficients to animate a 2D or 3D character in ways that follow the user’s facial expressions. ARKit provides many blend shape coefficients, resulting in a detailed model of a facial expression; however, you can use as many or as few of the coefficients as you desire to create a visual effect. For example, you might animate a simple cartoon character using only the eyeBlinkLeft and eyeBlinkRight coefficients. A professional 3D artist could create a detailed character model rigged for realistic animation using a larger set, or the entire set, of coefficients.
You can also use blend shape coefficients to record a specific facial expression and reuse it later. The init(blendShapes:) initializer creates a detailed 3D mesh from a dictionary equivalent to this property’s value; the serialized form of a blend shapes dictionary is more portable than that of the face mesh those coefficients describe.
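To make the dictionary concrete, here is a minimal sketch of reading the per-frame blend shape coefficients from an ARFaceAnchor and applying them to a SceneKit character. It assumes a head model whose morph target names happen to match ARKit's blendShapeLocation raw values (e.g. "jawOpen", "eyeBlinkRight") — in practice most models need a name-mapping table.

```swift
import ARKit
import SceneKit

// Sketch: drive a character's morph targets from ARKit face tracking.
// Assumes `characterNode` carries an SCNMorpher whose target names
// match ARFaceAnchor.BlendShapeLocation raw values (an assumption).
class FaceCaptureDelegate: NSObject, ARSCNViewDelegate {
    let characterNode: SCNNode

    init(characterNode: SCNNode) {
        self.characterNode = characterNode
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Each key is an ARFaceAnchor.BlendShapeLocation; each value is
        // 0.0 (neutral) through 1.0 (maximum movement), per Apple's docs.
        for (location, weight) in faceAnchor.blendShapes {
            characterNode.morpher?.setWeight(CGFloat(truncating: weight),
                                             forTargetNamed: location.rawValue)
        }
    }
}
```

Because the blendShapes dictionary is just named floats, recording it per frame (rather than the face mesh itself) is what makes the captured performance portable between applications, as the Apple documentation quoted above notes.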
Face ID is enabled by the TrueDepth camera and is simple to set up. It projects and analyzes more than 30,000 invisible dots to create a precise depth map of your face.
The TrueDepth camera analyzes more than 50 different muscle movements to mirror your expressions in 12 Animoji.
see: Apple buys Faceshift
Winner of the SIGGRAPH 2016 Award for Best Real-Time Graphics and Interactivity, this scene based on Ninja Theory’s upcoming game, Hellblade: Senua’s Sacrifice, was shot, edited and rendered to final quality in minutes, a process that would normally take weeks or months.
This real-time cinematography project was developed by Epic Games, Ninja Theory, Cubic Motion & 3Lateral, with additional support from House of Moves, IKinema, NVIDIA and Technoprops.