🎥 Demo Video: https://www.youtube.com/watch?v=SqiHZh8AU64
🛒 Unity Asset Store: https://assetstore.unity.com/packages/tools/ai-ml-integration/real-time-full-body-tracking-system-350552
Real Time Full Body Tracking is a complete, all-in-one real-time human tracking solution for Unity, combining Full Body Pose Tracking, Hand Tracking, and Face Mesh Tracking with animation into a single, unified plugin.
Built on high-performance RGB-based computer vision, the system works with any standard camera — no depth sensors or special hardware required. Each tracking module (Face, Hand, Pose) can be enabled or disabled independently to balance performance and accuracy.
This plugin is designed for creators building VTuber systems, digital humans, AR/VR experiences, motion-based games, virtual try-on apps, and real-time avatar animation. With plug-and-play prefabs, demo scenes, and inspector-based configuration, you can get realistic avatar tracking running in minutes.
The system supports humanoid avatar animation, including facial blendshapes, runs smoothly at up to 60 FPS, and works seamlessly with Ready Player Me avatars.
- Full Body Pose Tracking using RGB camera
- Hand Tracking with 20+ 3D landmarks per hand
- Face Mesh Tracking & facial animation (blendshape-based)
- Independent module control (Face / Pose / Hand)
- Real-time tracking with low latency
- No depth camera required
- Animate any Humanoid avatar
- Works out-of-the-box with Ready Player Me
- Full body motion replication
- Up to 180° body rotation tracking
- Automatic IK & FK handling
- Facial animation via ARKit-compatible blendshapes
- Hand mesh & finger joint animation
- Adjustable motion smoothness
- Configurable visibility thresholds
- Auto-scaling avatar based on camera distance
- Multiple model types (Lite / Heavy)
- Supports Camera, Video, and Image input
- 🖥️ Windows, macOS, Linux (out of the box)
- 📱 Android & iOS (Lite model recommended)
- Pose Tracking Demo
- Face Tracking Demo
- Hand Tracking Demo
- Full Tracking Demo (Face + Hand + Body)
- VTuber Demo Scene
- VTuber & Live Streaming Avatars
- Digital Humans & Virtual Characters
- Motion-Based Games
- Gesture-Based UI & Interaction
- Avatar Motion Capture (Mocap)
- Education & Training Simulations
- Virtual Try-On Applications
- Remote Collaboration with Avatars
- Fitness & Training Avatars
- Research & HCI Projects
- AR/VR Virtual Presence Systems
- Avatar Puppeteering & Animation Preview
- Open Unity
- Create a new scene
- Add an Orthographic or Perspective camera
- Position it to face the avatar
- Create an empty GameObject
- Attach the SolutionGraphRunner script
- Select the Bootstrap Prefab
- Open the App Settings Scriptable Object
- Configure:
  - Input Source (Camera / Video / Image)
  - Model Type (Lite / Heavy)
  - Running Mode
  - Resolution
  - Enable / Disable Face, Pose, Hand Tracking
- Drag the MainCanvas Prefab into the Scene Hierarchy
Refer to the demo scenes for recommended configurations.
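As a rough sketch of how the options above fit together (illustrative only — the asset ships its own App Settings ScriptableObject, and the type and field names here are assumptions, not the plugin's actual API):

```csharp
using UnityEngine;

// Illustrative sketch only: mirrors the options listed above as a Unity
// ScriptableObject. The plugin's real App Settings asset may differ.
public enum InputSource { Camera, Video, Image }
public enum ModelType { Lite, Heavy }

[CreateAssetMenu(menuName = "Tracking/App Settings (Sketch)")]
public class AppSettingsSketch : ScriptableObject
{
    public InputSource inputSource = InputSource.Camera;
    public ModelType modelType = ModelType.Lite;
    public int resolutionWidth = 640;
    public int resolutionHeight = 480;

    // Each tracking module can be toggled independently to balance
    // performance and accuracy.
    public bool enableFaceTracking = true;
    public bool enablePoseTracking = true;
    public bool enableHandTracking = true;
}
```

Keeping configuration in a ScriptableObject lets you swap presets (e.g. a mobile Lite preset vs. a desktop Heavy preset) without touching scene objects.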
- Must use Humanoid Rig
- Facial animation requires blendshapes (ARKit recommended)
Download an avatar with facial blendshapes:
https://models.readyplayer.me/avatar_id.glb?morphTargets=ARKit&textureAtlas=1024
Convert GLB → FBX before importing into Unity.
Sample avatar: https://drive.google.com/file/d/1KxctzTEvZ7js_Y3c3wn5v1vXsHxmbO4i/view?usp=sharing
- Import the avatar model
- Select the model → Rig tab
- Set Animation Type = Humanoid
- Apply changes
- Drag the avatar into the scene
- Position it in front of the camera
- Add the FaceTracking script to the avatar, then assign blendshapes and face settings
- Add the PoseTracking script and configure body tracking options
- Add the HandTracking scripts for the left & right hands, and assign hand meshes if needed
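To illustrate how blendshape-based facial animation works in Unity generally (this is a generic engine sketch, not the plugin's FaceTracking source — the tracked coefficient is faked here so the snippet stands alone):

```csharp
using UnityEngine;

// Generic illustration of driving an ARKit-style blendshape on a
// SkinnedMeshRenderer. In the real plugin, the FaceTracking script
// supplies per-frame coefficients from the camera feed.
public class BlendshapeDriverSketch : MonoBehaviour
{
    public SkinnedMeshRenderer faceMesh; // avatar head with ARKit blendshapes

    void Update()
    {
        // Look up the "jawOpen" ARKit blendshape on the mesh.
        int index = faceMesh.sharedMesh.GetBlendShapeIndex("jawOpen");
        if (index < 0) return; // blendshape not present on this mesh

        // Placeholder coefficient in [0, 1]; a tracker would supply this.
        float coefficient = Mathf.PingPong(Time.time, 1f);

        // Unity blendshape weights run 0-100.
        faceMesh.SetBlendShapeWeight(index, coefficient * 100f);
    }
}
```

This is also why the Ready Player Me URL above requests `morphTargets=ARKit`: without those blendshapes on the mesh, the lookup fails and the face cannot be animated.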
| Field | Description |
|---|---|
| Scale Factor | Adjust avatar scale to match human |
| Visibility Threshold | Landmark confidence threshold |
| Smoothness | Animation interpolation control |
| In Place | Keep avatar position fixed |
| Match Scale | Auto-scale based on camera distance |
| Skeleton Parent | Parent transform for landmarks |
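The Visibility Threshold and Smoothness fields interact roughly as follows (a generic sketch of common landmark filtering, not the plugin's internal code): low-confidence landmarks are discarded, and accepted ones are interpolated toward rather than applied directly.

```csharp
using UnityEngine;

// Generic sketch of per-landmark filtering with a confidence threshold
// and an interpolation factor (not the plugin's actual implementation).
public class LandmarkFilterSketch : MonoBehaviour
{
    [Range(0f, 1f)] public float visibilityThreshold = 0.5f;
    [Range(0f, 1f)] public float smoothness = 0.8f;

    Vector3 _smoothed;

    // Call once per frame with the raw landmark and its confidence.
    public Vector3 Filter(Vector3 rawLandmark, float visibility)
    {
        // Below the threshold, hold the last good position instead of
        // letting a noisy detection jerk the avatar.
        if (visibility < visibilityThreshold)
            return _smoothed;

        // Higher smoothness keeps more of the previous position,
        // trading responsiveness for stability.
        _smoothed = Vector3.Lerp(rawLandmark, _smoothed, smoothness);
        return _smoothed;
    }
}
```

Raising the threshold suppresses jitter from occluded limbs; raising smoothness adds latency but steadies the motion, which matches the tuning advice in the performance tips below.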
- Use Low Resolution Input for better FPS
- Ensure bright, even lighting
- Use Lite models on mobile devices
- Tune Visibility Threshold & Smoothness
- Disable unused modules (Face / Hand / Pose)
📧 Email: realmoctopus@gmail.com
🌐 Unity Asset Store Support: Contact via the Real Time Full Body Tracking asset page
If you encounter any issues or need help with setup or customization, feel free to reach out.
🚀 Real Time Full Body Tracking — build expressive, real-time avatars in Unity using only an RGB camera.
