Convert SOMA human motion captures into humanoid robot joint animation. Takes BVH motion files as input and produces robot-playable CSV joint data as output using GPU-optimized inverse kinematics via Newton and high-performance computation with NVIDIA Warp.
The retargeting pipeline handles proportional human-to-robot scaling, multi-objective IK solving with joint limits, feet stabilization to maintain ground contact, and per-DOF joint limit clamping. Currently supports SOMA as the input skeleton and Unitree G1 (29 DOF) as the output robot. Additional robot targets are planned.
SOMA Retargeter is part of the SOMA body model ecosystem for humanoid motion data.
Note: This project is in active development. The API may change between releases as the design is refined.
- Python: 3.12
- Git LFS: Installed and initialized for asset downloads
- OS: Windows (x86-64) and Linux (x86-64, aarch64)
- GPU: NVIDIA GPU (Maxwell or newer), driver 545+ (CUDA 12). No local CUDA Toolkit installation required.
Setup instructions

**Method 1: conda**

```bash
conda create -n soma-retargeter python=3.12 -y
conda activate soma-retargeter
git lfs pull
pip install .
```

**Method 2: uv**

Follow the official installation guide if uv is not yet installed.

```bash
git lfs pull
uv sync
```

`uv sync` creates an isolated `.venv` virtual environment inside the project directory, installs the correct Python version, and resolves all dependencies.

Note (Linux): for the GUI viewer to work, install tkinter:

```bash
sudo apt-get install python3.12-tk
```

Note (Windows): if `imgui-bundle` fails to install, the Microsoft Visual C++ Redistributables may be missing. Download them from the official Microsoft documentation.
This repo includes 10 sample BVH/CSV pairs in assets/motions/ for immediate testing.
For large-scale motion data, see the SEED dataset (Skeletal Everyday Embodiment Dataset) published by Bones Studio. SEED provides a large-scale collection of human motions on the SOMA uniform-proportion skeleton, which is the expected input format for this tool. The G1 robot motion data included in SEED was retargeted using SOMA Retargeter.
When using uv (Method 2), replace `python` with `uv run` in the commands below.
```bash
python ./app/bvh_to_csv_converter.py --config ./assets/default_bvh_to_csv_converter_config.json --viewer gl
```

The viewer displays the source SOMA motion alongside the retargeted robot in a 3D viewport. Use the right panel to load BVH files, run retargeting, and save CSV output. Playback controls at the bottom allow scrubbing, speed adjustment, and looping. Toggle visibility of the skinned mesh, skeleton, joint axes, and positioning gizmos.
Process a folder of BVH files without a display. Set import_folder and export_folder in the config file, then run:
```bash
python ./app/bvh_to_csv_converter.py --config ./assets/default_bvh_to_csv_converter_config.json --viewer null
```

Batch mode recursively finds all .bvh files in the import folder, processes them in configurable batch sizes, and writes CSV files to the export folder, mirroring the input directory structure.
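The discovery-and-mirroring behavior described above can be sketched as follows. The `import_folder` and `export_folder` names match the config keys mentioned earlier, but the helper itself (`plan_batch`, the default batch size) is illustrative, not the tool's actual code.

```python
from pathlib import Path

def plan_batch(import_folder: str, export_folder: str, batch_size: int = 8):
    """Recursively find .bvh files under import_folder and pair each with
    a mirrored .csv path under export_folder, yielded in fixed-size batches."""
    src, dst = Path(import_folder), Path(export_folder)
    pairs = [
        (p, dst / p.relative_to(src).with_suffix(".csv"))
        for p in sorted(src.rglob("*.bvh"))
    ]
    for i in range(0, len(pairs), batch_size):
        yield pairs[i : i + batch_size]
```

For example, `in/sub/b.bvh` maps to `out/sub/b.csv`, so the export folder reproduces the import folder's subdirectory layout.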
| File | Description |
|---|---|
| bvh_to_csv_converter.py | Main entry point. Drives both interactive and headless batch retargeting modes. |
| Module | Description |
|---|---|
| animation/ | Core data structures for skeletons, animation buffers, IK, and skinned meshes. |
| assets/ | File I/O for BVH, CSV, and USD formats. |
| pipelines/ | Retargeting pipeline: IK solving, feet stabilization, and joint limit clamping. |
| robotics/ | Human-to-robot scaling and robot output formatting. |
| renderers/ | Visualization for the interactive viewer. |
| utils/ | Math, pose, coordinate conversion, Newton and Warp helpers. |
| configs/ | JSON configuration for retargeting, scaling, and feet stabilization parameters. |
SOMA Retargeter is a support tool within the SOMA ecosystem for humanoid motion data:
- SOMA Body Model - Parametric human body model with standardized skeleton, mesh, and shape parameters
- GEM-X - Human motion estimation from video
- Kimodo - Kinematic motion diffusion model for text and constraint-driven 3D human and robot motion generation
- ProtoMotions - GPU-accelerated simulation and learning framework for training physically simulated digital humans and humanoid robots
- SONIC - Whole-body control for humanoid robots, training locomotion and interaction policies
This project draws inspiration from, and builds upon, excellent open-source work, including:
This codebase is licensed under Apache-2.0.
This project will download and install additional third-party open source software projects. Review the license terms of these open source projects before use.

