MOTION CAPTURE (MoCap) technology enables the recording and analysis of human movement, with applications spanning biomechanical research and clinical rehabilitation to animation and interactive entertainment[1]. Traditional optical motion capture systems provide high-precision data but require dedicated studio spaces, optimal lighting conditions, and investments often exceeding $20,000[2]. While IMU-based commercial alternatives offer portability, they remain costly due to proprietary hardware and software ecosystems[3].
Mesquite Mocap (https://github.com/Mesquite-Mocap) is an IoT-based motion capture system with open-source software, hardware, and architecture that uses a wireless network of inertial sensors and a consumer phone to track human movement. The system implements a hierarchical network architecture, real-time data transmission, and browser-based visualization and recording. Our contributions include: a low-cost, open-source wireless sensor network design for human motion capture using commodity hardware; a robust network design optimized for real-time sensor data transmission with minimal latency; real-time kinematic reconstruction; a comprehensive performance evaluation comparing accuracy with commercial systems; and a demonstration of practical applications across multiple domains.
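To give a flavor of the real-time kinematic reconstruction step, here is a minimal sketch of forward kinematics driven by per-segment orientation quaternions. It assumes each IMU node reports a world-frame unit quaternion for its body segment; the segment names, bone vectors, and hierarchy below are hypothetical illustrations, not the actual Mesquite packet format or skeleton.

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * q_conjugate.
    qv = np.array([0.0, *v])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), q_conj)[1:]

# Hypothetical skeleton: segment -> (parent, rest-pose bone vector in meters).
# Parents are listed before children so one pass suffices.
SKELETON = {
    "hips":      (None,        np.array([0.0, 0.0, 0.0])),
    "spine":     ("hips",      np.array([0.0, 0.3, 0.0])),
    "upper_arm": ("spine",     np.array([0.2, 0.0, 0.0])),
    "forearm":   ("upper_arm", np.array([0.25, 0.0, 0.0])),
}

def reconstruct(orientations):
    # orientations: dict of per-segment world-frame unit quaternions (from IMUs).
    # Each joint position = parent position + parent orientation applied to the
    # rest-pose bone vector.
    positions = {}
    for name, (parent, offset) in SKELETON.items():
        if parent is None:
            positions[name] = np.zeros(3)
        else:
            positions[name] = positions[parent] + rotate(orientations[parent], offset)
    return positions
```

With identity orientations the pose simply unfolds into the rest pose; rotating a single segment's quaternion moves every descendant joint with it, which is what makes per-bone IMU readings sufficient to drive a full armature.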
This talk/demo will include a full demonstration of the Mesquite Mocap system, including the calibration procedure and retargeting of the recorded motion to custom armatures and 3D engines such as Unity and Unreal Engine. The entire system can be built for under $600 (see BOM here: https://github.com/Mesquite-Mocap/mesquite.cc?tab=readme-ov-file#project-hardware-requirements), compared to commercially available systems that cost significantly more. The utility of the system, which can be calibrated and used "in the wild," will also be demonstrated.
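To illustrate what a calibration procedure of this kind involves, here is a sketch of a common static-pose (e.g. T-pose) approach for IMU suits: each sensor's reading during the pose is compared against the known reference orientation of its bone, yielding a per-sensor offset applied to all subsequent frames. This is a generic technique, not necessarily Mesquite's exact procedure; the function names are hypothetical.

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    # Conjugate (inverse for unit quaternions).
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def tpose_offset(q_measured, q_reference):
    # Offset mapping the sensor reading captured during the static pose
    # onto the bone's known reference orientation in that pose.
    return quat_mul(q_reference, quat_conj(q_measured))

def apply_calibration(q_offset, q_raw):
    # Corrected bone orientation for every subsequent frame.
    return quat_mul(q_offset, q_raw)
```

The key property is that however the sensor is strapped onto the body segment, applying the offset cancels the mounting misalignment, so calibration can be repeated quickly in the field.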
Code of Conduct: https://github.com/Mesquite-Mocap/mesquite.cc?tab=coc-ov-file#contributor-covenant-code-of-conduct
In this talk, a walkthrough of the build and network design of a wearable IMU-based motion capture system will be presented. The presenter will bring two motion capture suits that participants can touch, feel, and use. The motion capture system is comparable to commercial optical systems (such as OptiTrack), with only 2–5 degrees of deviation. See GIF below for comparison (blue is OptiTrack; orange is Mesquite). The offline network architecture will be presented in detail, including hints at future enhancements.
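One common way to quantify an orientation-deviation figure like the 2–5 degrees above is the smallest rotation angle between corresponding orientation quaternions from the two systems, averaged over a trial. The sketch below assumes both systems output time-aligned unit quaternions in (w, x, y, z) order; the function names are illustrative, not from the Mesquite codebase.

```python
import numpy as np

def angular_deviation_deg(q_a, q_b):
    # Smallest rotation angle between two unit quaternions.
    # abs() handles the double-cover ambiguity: q and -q encode the
    # same rotation; clip() guards arccos against rounding error.
    d = abs(np.dot(q_a, q_b))
    return np.degrees(2.0 * np.arccos(np.clip(d, -1.0, 1.0)))

def mean_deviation_deg(frames_ref, frames_test):
    # Mean per-frame deviation over a recorded trial (two N x 4 sequences).
    return float(np.mean([angular_deviation_deg(a, b)
                          for a, b in zip(frames_ref, frames_test)]))
```

A per-joint breakdown of this metric also shows where soft-tissue motion or magnetic disturbance hits IMU tracking hardest, which is useful when comparing against an optical ground truth.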
Demo of a DIY motion capture system
Development roadmap of an open-source/hardware project
Future improvements to existing platform
Looks like a cool project to demo on stage.
Looks like a really cool project which is just coming out. These techniques have applications beyond motion capture, including activity monitoring in sports and exercise. I expect an engaging presentation on this one.
There are other popular open hardware solutions such as SlimeVR (https://www.crowdsupply.com/slimevr/slimevr-full-body-tracker). The kit-of-parts approach of this project makes it DIY-friendly for people building from scratch.