Faceware’s Facial Mocap for iClone

Realtime for iClone is a facial tracking application powered by Faceware Technologies that communicates solely with Reallusion’s iClone 7. It gives indie developers and professional studios tools for real-time facial motion capture and recording: fast, accurate, and markerless, all from a PC webcam.

Apparently this requires two purchases, in addition to iClone 7 and Character Creator 2:

  • iClone Facial Mocap Plug-in for Faceware ($599)
  • Faceware Realtime for iClone ($599)

Add one more purchase, 3DXchange 7 Pipeline, if you want to export the figures or animations out of the iClone universe.

No one ever claimed iClone or its attractive Character Creator is cheap, but unsurprisingly, many software bundles can be purchased through Reallusion at a constantly fluctuating discount:
https://mocap.reallusion.com/iClone-faceware-mocap/default.html

ARKit maps your expressions to Blendshapes

Apple’s ARKit contains instructions for mapping facial expressions to blendshapes when using their face recognition technology on the iPhone X.

A Unity blog post, ARKit Face Tracking on iPhone X, states that Unity will be releasing a sample scene in which ARKit is used to animate a 3D head, although that demo scene is not yet available.

I’ve compiled a list of ARKit’s blendshapes from the Apple Developer website:

Left Eye

  • eyeBlinkLeft
  • eyeLookDownLeft
  • eyeLookInLeft
  • eyeLookOutLeft
  • eyeLookUpLeft
  • eyeSquintLeft
  • eyeWideLeft

Right Eye

  • eyeBlinkRight
  • eyeLookDownRight
  • eyeLookInRight
  • eyeLookOutRight
  • eyeLookUpRight
  • eyeSquintRight
  • eyeWideRight

Mouth and Jaw

  • jawForward
  • jawLeft
  • jawRight
  • jawOpen
  • mouthClose
  • mouthFunnel
  • mouthPucker
  • mouthLeft
  • mouthRight
  • mouthSmileLeft
  • mouthSmileRight
  • mouthFrownLeft
  • mouthFrownRight
  • mouthDimpleLeft
  • mouthDimpleRight
  • mouthStretchLeft
  • mouthStretchRight
  • mouthRollLower
  • mouthRollUpper
  • mouthShrugLower
  • mouthShrugUpper
  • mouthPressLeft
  • mouthPressRight
  • mouthLowerDownLeft
  • mouthLowerDownRight
  • mouthUpperUpLeft
  • mouthUpperUpRight

Eyebrows, Cheeks, and Nose

  • browDownLeft
  • browDownRight
  • browInnerUp
  • browOuterUpLeft
  • browOuterUpRight
  • cheekPuff
  • cheekSquintLeft
  • cheekSquintRight
  • noseSneerLeft
  • noseSneerRight

Apple Developer: Creating Face-Based AR Experiences

Each key in this dictionary (an ARFaceAnchor.BlendShapeLocation constant) represents one of many specific facial features recognized by ARKit. The corresponding value for each key is a floating point number indicating the current position of that feature relative to its neutral configuration, ranging from 0.0 (neutral) to 1.0 (maximum movement).

You can use blend shape coefficients to animate a 2D or 3D character in ways that follow the user’s facial expressions. ARKit provides many blend shape coefficients, resulting in a detailed model of a facial expression; however, you can use as many or as few of the coefficients as you desire to create a visual effect. For example, you might animate a simple cartoon character using only the jawOpen, eyeBlinkLeft, and eyeBlinkRight coefficients. A professional 3D artist could create a detailed character model rigged for realistic animation using a larger set, or the entire set, of coefficients.
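The simple-cartoon example above can be sketched in plain Python. This is a hypothetical illustration of the data flow, not Apple’s API: the blend shapes arrive as a dictionary of name-to-coefficient values in the 0.0–1.0 range, and the toy rig parameters (`mouth_open_degrees`, eye openness) are invented for this sketch.

```python
# Hypothetical sketch: drive a toy cartoon rig from ARKit-style
# blend shape coefficients (each in the 0.0-1.0 range).
# The rig parameter names are made up for illustration.

def animate_cartoon(blend_shapes):
    """Map three coefficients onto a toy rig's parameters."""
    jaw = blend_shapes.get("jawOpen", 0.0)          # 0.0 = closed, 1.0 = fully open
    blink_l = blend_shapes.get("eyeBlinkLeft", 0.0)
    blink_r = blend_shapes.get("eyeBlinkRight", 0.0)
    return {
        "mouth_open_degrees": 30.0 * jaw,           # scale coefficient to a joint angle
        "left_eye_openness": 1.0 - blink_l,         # invert: blink 1.0 means eye shut
        "right_eye_openness": 1.0 - blink_r,
    }

# One captured frame: half-open jaw, left eye fully blinking.
frame = {"jawOpen": 0.5, "eyeBlinkLeft": 1.0, "eyeBlinkRight": 0.0}
pose = animate_cartoon(frame)
```

Unused coefficients are simply ignored, which is the point of the passage: a rig only has to consume the subset it cares about.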

You can also use blend shape coefficients to record a specific facial expression and reuse it later. The ARFaceGeometry init(blendShapes:) initializer creates a detailed 3D mesh from a dictionary equivalent to this property’s value; the serialized form of a blend shapes dictionary is more portable than that of the face mesh those coefficients describe.
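The portability point is easy to see in a sketch: a blend shapes dictionary is nothing but names and floats, so it serializes to a small, human-readable snapshot, whereas the equivalent face mesh would carry full vertex data. The coefficient values below are invented for illustration.

```python
import json

# Hypothetical sketch: a recorded expression is just name -> coefficient,
# so it serializes compactly and can be reloaded later to rebuild the mesh.
expression = {"jawOpen": 0.42, "eyeBlinkLeft": 0.0, "mouthSmileLeft": 0.8}

saved = json.dumps(expression)      # portable, human-readable snapshot
restored = json.loads(saved)        # round-trips without loss
```

Storing the coefficients rather than the mesh also means the same recorded expression can be replayed on differently-shaped face geometry.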

Real-Time Cinematography in Unreal Engine 4

Winner of the SIGGRAPH 2016 Award for Best Real-Time Graphics and Interactivity, this scene based on Ninja Theory’s upcoming game, Hellblade: Senua’s Sacrifice, was shot, edited and rendered to final quality in minutes, a process that would normally take weeks or months.

This real-time cinematography project was developed by Epic Games, Ninja Theory, Cubic Motion & 3Lateral, with additional support from House of Moves, IKinema, NVIDIA and Technoprops.

https://www.unrealengine.com/blog/unreal-engine-4-powers-real-time-cinematography-at-siggraph

Cinema Face Cap – Facial Capture for Unity

Cinema Face Cap is a markerless facial capture software solution for Unity 5.x. Dust off your Microsoft Kinect® 2.0 and start creating your own custom facial animations. You can even configure your own models to the system using the advanced Output Wizard! Make your character talk, look around, blink, laugh, frown, and much more!

Main Features:
– Supports capture of 20 facial blendshapes.
– Record a facial capture session without ever leaving your Unity project.
– Live model preview right inside your current Unity scene using your own models.
– Facial masking: record only the parts you need.
– Captures facial animation units as well as head orientation (turning, looking, etc.).
– Custom filtering and smoothing to fine-tune your capture.
– Configure your own blendshape mapping for your own models using the Output Wizard.
– Save animations for use with your own models.
– Automatically builds an animation library within your project.
– Save and review raw data sessions.

FaceRig


FaceRig enables anyone to live-puppet a fully 3D figure, or sprite-based 2.5D character. It uses your webcam to map your facial movements and speech to animate the avatar – with a growing list of additional modules supporting Intel® RealSense™ cameras, and Leap Motion™ (for hand control). Expressions and idle animations can be triggered by mouse or keyboard, and an app for mobile is on the way.


FaceRig can render QuickTime video or an image sequence, or broadcast a live puppet session like a webcam for Skype or streaming. A number of fun avatars are available. Custom props and rigged models can also be imported, as well as models from Live2D with an add-on. Renders are watermarked even for the Pro version.

FaceRig $14.99
Webcam-based tracking, fully featured for home, non-commercial use.

FaceRig Pro Upgrade $64.99
Required for people who make significant ad-based revenue from the venues where they showcase their creations.

IRFaceRig Free
Free for everyone; it works only with the Intel® RealSense™ SDK and the Intel® RealSense™ Camera on systems with Intel® CPUs.

FaceRig support for Leap Motion™ Controller Free
Combines the Leap Motion™ Controller (for hand tracking) with another regular camera (for face tracking).

FaceRig Live2D Module $3.99
Brings the Live2D technology to FaceRig, enabling hand-drawn avatars that move and behave as if they were 3D while keeping all the aspects that make hand-drawn 2D avatars special. Use the seven models provided, or create your own with the Live2D Cubism Editor.

Under Development:

FaceRig Studio, targeted at businesses, which will also enable numeric mo-cap tracking, intended for use with professional software.

FaceRig Mobile for iOS and Android is currently in development.

FaceRig for Mac and/or Linux is in the development plan; it will most likely come to life after the release of the mobile version.

FaceRig website
FaceRig on Steam