Talk
Beginner

Mesquite MoCap: Democratizing Real-Time Motion Capture with Affordable, Open-Source, Networked IMU Hardware and WebXR SLAM

Approved

MOTION CAPTURE (MoCap) technology enables the recording and analysis of human movement, with applications spanning from biomechanical research and clinical rehabilitation to animation and interactive entertainment [1]. Traditional optical motion capture systems provide high-precision data but require dedicated studio spaces, optimal lighting conditions, and investments often exceeding $20,000 [2]. While IMU-based commercial alternatives offer portability, they remain costly due to proprietary hardware and software ecosystems [3].

Mesquite MoCap (https://github.com/Mesquite-Mocap) is an IoT-based motion capture system with open software, hardware, and architecture that uses a wireless network of inertial sensors and a consumer phone to track human movement. The system implements a hierarchical network architecture, real-time data transmission, and browser-based visualization and recording. Our contributions include: a low-cost, open-source wireless sensor network design for human motion capture using commodity hardware; a robust network design optimized for real-time sensor data transmission with minimal latency; real-time kinematic reconstruction; a comprehensive performance evaluation comparing accuracy with commercial systems; and a demonstration of practical applications across multiple domains.
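To make the pipeline concrete, here is a minimal sketch of the per-sample work the browser side must do: decode an orientation sample from a sensor node and rotate a bone vector for kinematic reconstruction. The wire format shown (a 1-byte node id followed by a quaternion as four little-endian float32 values) is a hypothetical illustration, not the project's actual protocol, which is defined in the repository.

```python
import struct
import math

# Hypothetical wire format (NOT the project's actual protocol): each frame
# carries one sample per IMU node — a 1-byte node id followed by a unit
# quaternion (w, x, y, z) as four little-endian float32 values.
PACKET = struct.Struct("<B4f")

def parse_sample(frame: bytes):
    """Decode one IMU sample; re-normalize the quaternion to guard
    against float32 truncation introduced on the wire."""
    node_id, w, x, y, z = PACKET.unpack(frame)
    norm = math.sqrt(w*w + x*x + y*y + z*z)
    return node_id, (w / norm, x / norm, y / norm, z / norm)

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z) — the step that
    points each skeleton bone along its sensor's reported orientation."""
    w, x, y, z = q
    vx, vy, vz = v
    # q * v * q^-1 expanded via cross products: v' = v + 2w(u x v) + 2(u x (u x v))
    cx, cy, cz = y*vz - z*vy, z*vx - x*vz, x*vy - y*vx
    dx, dy, dz = y*cz - z*cy, z*cx - x*cz, x*cy - y*cx
    return (vx + 2*(w*cx + dx), vy + 2*(w*cy + dy), vz + 2*(w*cz + dz))

# Example: node 3 reporting a 90-degree rotation about the Z axis.
s = math.sqrt(0.5)
frame = PACKET.pack(3, s, 0.0, 0.0, s)
node_id, q = parse_sample(frame)
bone = rotate(q, (1.0, 0.0, 0.0))  # the X axis rotates onto the Y axis
```

Keeping the sample this small is what makes browser-side, real-time reconstruction plausible: each frame is a fixed-size struct, so decoding is O(1) per node with no parsing ambiguity.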


This talk/demo will include a full demonstration of the Mesquite MoCap system, including the calibration procedure and retargeting of the recorded motion to custom armatures and 3D environments such as Unity and Unreal Engine. The entire system can be built for under $600 (see the BOM here: https://github.com/Mesquite-Mocap/mesquite.cc?tab=readme-ov-file#project-hardware-requirements), compared with commercial systems that often cost $20,000 or more. The system's ability to be calibrated and used in the "wild" will also be demonstrated.


Code of Conduct: https://github.com/Mesquite-Mocap/mesquite.cc?tab=coc-ov-file#contributor-covenant-code-of-conduct 

This talk will present a walkthrough of the build and network design of a wearable IMU-based motion capture system. The presenter will bring two motion capture suits that participants can touch, feel, and use. The system's accuracy is comparable to commercial optical systems (such as OptiTrack), deviating by only 2-5 degrees. See the GIF below for a comparison (blue is OptiTrack; orange is Mesquite). The offline network architecture will be presented in depth, along with hints at future enhancements.

  1. Demo of a DIY motion capture system

  2. Development roadmap of an open-source/hardware project

  3. Future improvements to existing platform

Tutorial about using a FOSS project
Introducing a FOSS project or a new version of a popular project
Which track are you applying for?
Main track

Tejaswi Gowda, PhD
Assistant Professor Arizona State University
https://www.linkedin.com/in/tejaswigowda/
Speaker Image

Approvability: 100%
Approvals: 5
Rejections: 0
Not Sure: 0

Looks like a cool project to demo on stage.

Reviewer #1
Approved
Reviewer #2
Approved

Looks like a really cool project which is just coming out. These techniques have applications beyond motion capture, including activity monitoring of sports and exercises. I expect an engaging presentation on this one.

There are other popular open hardware solutions such as SlimeVR (https://www.crowdsupply.com/slimevr/slimevr-full-body-tracker). The kit of parts approach of this project makes it DIY friendly for people building from scratch.

Reviewer #3
Approved
Reviewer #4
Approved
Reviewer #5
Approved