Tutorials

Basic

01/ Getting started with MocapX

Learn how to quickly set up MocapX

02/ Pose library

Learn how to work with the pose library

03/ macOS Installation

Learn how to install MocapX on macOS

Advanced

04/ Head connection

Learn how to connect the head

05/ Eye connection

Learn how to connect the eyes

06/ Sample data

Download sample data and project files

Example

07/ Tips and Tricks

Learn how to quickly generate poses

08/ Advanced Skeleton

Using Advanced Skeleton with MocapX

FAQ

Most frequently asked questions and answers

MocapX runs on every iPhone and iPad with iOS 12. Please note that facial tracking is only supported on devices with Face ID. Click here for a full list of hardware features.

Apple’s TrueDepth camera, which is responsible for facial mocap, is exactly the same in the iPhone X and the iPhone 11.

Yes, of course. The MocapX app and plug-in were created for animators to speed up the animation process. You can use the MocapX plug-in for any commercial and non-commercial projects. If you do anything interesting in MocapX, let us know and we will be happy to share it.

Please note that more complicated rigs may decrease real-time performance depending on your computer’s processing power and graphics card. However, MocapX data is captured and saved at a full 60 fps.

Facial tracking in the MocapX app uses Apple’s Face ID technology to capture facial expressions and transfer the data directly to Maya. This is a paid feature in the application and is only available for devices with Face ID capability. Click here for a full list of hardware features.

We support all Maya versions from 2016 to 2019. The recommended operating systems are Windows 10 with iTunes installed, macOS High Sierra or later, and Linux with CentOS 7.4 (Wi-Fi connection only).

Generally speaking, yes! You can connect MocapX data directly to your rig controllers or use PoseLib to drive multiple controllers on your rig. We provide demos as well as tutorials and other useful information.
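
As a minimal sketch of what such a connection looks like in Maya’s Python API: the node name mocapxDevice1, the attribute jawOpen, and the controller jaw_ctrl below are placeholder assumptions, not actual MocapX plug-in names, so substitute the names from your own scene.

    from maya import cmds

    # Placeholder names: "mocapxDevice1.jawOpen" stands in for whatever node and
    # attribute the MocapX plug-in exposes for the connected device, and
    # "jaw_ctrl.translateY" for a controller on your rig.
    source_attr = 'mocapxDevice1.jawOpen'
    target_attr = 'jaw_ctrl.translateY'

    # Wire the streamed value into the rig controller; force=True replaces any
    # existing incoming connection on the target attribute.
    if cmds.objExists(source_attr) and cmds.objExists(target_attr):
        cmds.connectAttr(source_attr, target_attr, force=True)
    else:
        cmds.warning('Update the attribute names to match your scene.')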

Sliders and joysticks are available in the free version of the MocapX app and allow you to control objects and attributes in real time in Maya. You can record live sequences and save them as animation clips for later use. For example, you can animate a car or a plane just by rotating your iPhone.

The Android version of MocapX is in the works and is coming soon.

Please be aware that for the Android version, you will have to use a device with technology similar to Apple’s TrueDepth camera (infrared camera + dot projector).

The Google Pixel 4 looks like a good fit for now.

We currently support real-time facial motion capture only in Maya, through our Maya plug-in and PoseLib editor. A Unity plug-in for live mocap will come in Q2 2020, and an Unreal Engine connection in late 2020.

With the PRO version, you can both stream to Maya in real time and record a clip locally for later use in Maya (the clip length is limited only by the available storage on your device). The basic version includes 30 seconds of free local recording, and you can buy additional recording time up to 45 minutes. With the basic version, you cannot stream facial expressions to Maya directly; however, you can test the connection between your PC and Maya using the keypads and sliders.

Your purchase history can be found only in your App Store account.

Currently, no. Your purchase is tied to your Apple ID and to the device on which the purchase was made.

Get the MocapX App and Plug-in

Download MocapX – it’s free

MocapX App

Free

Works with iOS 12

MocapX Plug-in

Free

Windows 10, macOS 10.12, Linux CentOS 7

Download