Real Time Full Body Tracking – Full Body, Face & Hand Tracking for Unity


🎥 Demo Video: https://www.youtube.com/watch?v=SqiHZh8AU64

🛒 Unity Asset Store: https://assetstore.unity.com/packages/tools/ai-ml-integration/real-time-full-body-tracking-system-350552


Introduction

Real Time Full Body Tracking is an all-in-one, real-time human tracking solution for Unity that combines Full Body Pose Tracking, Hand Tracking, and Face Mesh Tracking with avatar animation in a single, unified plugin.

Built on high-performance RGB-based computer vision, the system works with any standard camera — no depth sensors or special hardware required. Each tracking module (Face, Hand, Pose) can be enabled or disabled independently to balance performance and accuracy.

This plugin is designed for creators building VTuber systems, digital humans, AR/VR experiences, motion-based games, virtual try-on apps, and real-time avatar animation. With plug-and-play prefabs, demo scenes, and inspector-based configuration, you can get realistic avatar tracking running in minutes.

The system supports humanoid avatar animation, including facial blendshapes, runs smoothly at up to 60 FPS, and works seamlessly with Ready Player Me avatars.


Key Features

Core Tracking

  • Full Body Pose Tracking using RGB camera
  • Hand Tracking with 20+ 3D landmarks per hand
  • Face Mesh Tracking & facial animation (blendshape-based)
  • Independent module control (Face / Pose / Hand)
  • Real-time tracking with low latency
  • No depth camera required

Avatar Animation

  • Animate any Humanoid avatar
  • Works out-of-the-box with Ready Player Me
  • Full body motion replication
  • Up to 180° body rotation tracking
  • Automatic IK & FK handling
  • Facial animation via ARKit-compatible blendshapes
  • Hand mesh & finger joint animation

Performance & Control

  • Adjustable motion smoothness
  • Configurable visibility thresholds
  • Auto-scaling avatar based on camera distance
  • Multiple model types (Lite / Heavy)
  • Supports Camera, Video, and Image input

Platform Support

  • 🖥️ Windows, macOS, Linux (out of the box)
  • 📱 Android & iOS (Lite model recommended)

Demo Scenes Included

  • Pose Tracking Demo
  • Face Tracking Demo
  • Hand Tracking Demo
  • Full Tracking Demo (Face + Hand + Body)
  • VTuber Demo Scene

Use Cases

  1. VTuber & Live Streaming Avatars
  2. Digital Humans & Virtual Characters
  3. Motion-Based Games
  4. Gesture-Based UI & Interaction
  5. Avatar Motion Capture (Mocap)
  6. Education & Training Simulations
  7. Virtual Try-On Applications
  8. Remote Collaboration with Avatars
  9. Fitness & Training Avatars
  10. Research & HCI Projects
  11. AR/VR Virtual Presence Systems
  12. Avatar Puppeteering & Animation Preview

Project Setup

Step 1: Create a New Scene

  • Open Unity
  • Create a new scene

Step 2: Add a Camera

  • Add an Orthographic or Perspective camera
  • Position it to face the avatar

Step 3: Attach Solution Scripts

  1. Create an empty GameObject

  2. Attach the following scripts:

    • Solution
    • Graph
    • Runner
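
The attachment in Step 3 can also be done from code, for example in an editor bootstrap script. This is a sketch only: `Solution`, `Graph`, and `Runner` are assumed here to be `MonoBehaviour` components shipped with the plugin, as named in the step above; verify their actual class names and namespaces in the imported package.

```csharp
using UnityEngine;

// Sketch: programmatically create the host GameObject from Step 3
// and attach the three solution scripts. The component types are
// assumed from the plugin package and may live in a namespace.
public class TrackingBootstrap : MonoBehaviour
{
    void Awake()
    {
        var host = new GameObject("TrackingSolution");
        host.AddComponent<Solution>();
        host.AddComponent<Graph>();
        host.AddComponent<Runner>();
    }
}
```

Attaching the scripts in the Inspector, as described above, is equivalent; the code route is mainly useful for automated scene setup.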

Step 4: Configure App Settings

  1. Select the Bootstrap Prefab

  2. Open the App Settings Scriptable Object

  3. Configure:

    • Input Source (Camera / Video / Image)
    • Model Type (Lite / Heavy)
    • Running Mode
    • Resolution
    • Enable / Disable Face, Pose, Hand Tracking
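
For reference, the App Settings Scriptable Object described above could be mirrored roughly as follows. This is a hypothetical reconstruction, not the plugin's actual source: the real asset ships with the package, and its field and enum names may differ.

```csharp
using UnityEngine;

// Hypothetical mirror of the App Settings Scriptable Object.
// Field names are illustrative; check the asset in the Inspector.
public enum InputSource { Camera, Video, Image }
public enum ModelType  { Lite, Heavy }

[CreateAssetMenu(menuName = "Tracking/AppSettings")]
public class AppSettings : ScriptableObject
{
    public InputSource inputSource = InputSource.Camera;
    public ModelType   modelType   = ModelType.Lite;  // Lite recommended on mobile
    public Vector2Int  resolution  = new Vector2Int(640, 480);
    public bool enableFaceTracking = true;
    public bool enablePoseTracking = true;
    public bool enableHandTracking = true;
}
```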

Step 5: Add Main Canvas

  • Drag the MainCanvas Prefab into the Scene Hierarchy

Refer to the demo scenes for recommended configurations.


Avatar Setup

Avatar Requirements

  • Must use Humanoid Rig
  • Facial animation requires blendshapes (ARKit recommended)

Recommended: Ready Player Me Avatar

Download an avatar with facial blendshapes:

https://models.readyplayer.me/avatar_id.glb?morphTargets=ARKit&textureAtlas=1024

Convert GLB → FBX before importing into Unity.

Sample avatar: https://drive.google.com/file/d/1KxctzTEvZ7js_Y3c3wn5v1vXsHxmbO4i/view?usp=sharing

Import Avatar into Unity

  1. Import the avatar model
  2. Select the model → Rig tab
  3. Set Animation Type = Humanoid
  4. Apply changes

Add Avatar to Scene

  • Drag the avatar into the scene
  • Position it in front of the camera

Attach Tracking Scripts

Face Tracking

  • Add FaceTracking script to the avatar
  • Assign blendshapes and face settings

Pose Tracking

  • Add PoseTracking script
  • Configure body tracking options

Hand Tracking

  • Add HandTracking scripts for left & right hands
  • Assign hand meshes if needed

Common Configuration Fields

| Field | Description |
| --- | --- |
| Scale Factor | Adjusts avatar scale to match the tracked person |
| Visibility Threshold | Minimum landmark confidence required for tracking |
| Smoothness | Controls animation interpolation |
| In Place | Keeps the avatar's position fixed |
| Match Scale | Auto-scales the avatar based on camera distance |
| Skeleton Parent | Parent transform for the generated landmarks |
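
To illustrate how a Smoothness-style field typically behaves (this is a generic sketch, not the plugin's actual implementation), a frame-rate-independent interpolation toward the latest tracked position might look like:

```csharp
using UnityEngine;

// Generic illustration of a smoothness parameter in [0, 1]:
// higher values give slower, more damped motion toward the
// latest landmark position supplied by a tracker.
public class SmoothedFollower : MonoBehaviour
{
    [Range(0f, 1f)] public float smoothness = 0.8f;
    public Vector3 targetPosition; // latest landmark from the tracker

    void LateUpdate()
    {
        // Frame-rate-independent blend factor (normalized to 60 FPS).
        float t = 1f - Mathf.Pow(smoothness, Time.deltaTime * 60f);
        transform.position = Vector3.Lerp(transform.position, targetPosition, t);
    }
}
```

Lower smoothness tracks raw landmarks more tightly but passes through more jitter; higher smoothness trades responsiveness for stability.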

Performance Tips

  • Use Low Resolution Input for better FPS
  • Ensure bright, even lighting
  • Use Lite models on mobile devices
  • Tune Visibility Threshold & Smoothness
  • Disable unused modules (Face / Hand / Pose)
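
Disabling unused modules can also be done at runtime. The sketch below assumes `FaceTracking` and `HandTracking` are the `MonoBehaviour` components named in the Avatar Setup section; the actual types in the package may differ.

```csharp
using UnityEngine;

// Hypothetical helper: turn off tracking modules the scene
// does not need, to reclaim CPU/GPU time.
public static class TrackingModules
{
    public static void DisableUnused(GameObject avatar, bool needFace, bool needHands)
    {
        if (!needFace)
            foreach (var f in avatar.GetComponents<FaceTracking>()) f.enabled = false;

        if (!needHands)
            foreach (var h in avatar.GetComponents<HandTracking>()) h.enabled = false;
    }
}
```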

Support & Contact

📧 Email: realmoctopus@gmail.com

🌐 Unity Asset Store Support: Contact via the Real Time Full Body Tracking asset page

If you encounter any issues or need help with setup or customization, feel free to reach out.


🚀 Real Time Full Body Tracking — build expressive, real-time avatars in Unity using only an RGB camera.
