Muscles in Time


Exploring the intricate dynamics between muscular and skeletal structures is pivotal for understanding human motion. However, acquiring ground-truth muscle activation data is resource-intensive, which has led to a scarcity of datasets. The Muscles in Time (MinT) dataset addresses this gap by introducing a large-scale synthetic muscle activation dataset. MinT is created by enriching existing motion capture datasets with muscle activation simulations from biomechanical models using the OpenSim platform, a widely accepted tool in biomechanics and human motion research.

Neural networks designed for human motion understanding have historically relied on indirect data such as video or motion capture, similar to prisoners in Plato's cave who see only shadows rather than the true objects. Despite advances in capturing human motion, current systems do not account for the complex inner mechanics, particularly the muscle activations driving human movement. These activations are key to understanding physical exertion and motion difficulty, but they are often overlooked because of the limitations of traditional data collection methods such as electromyography (EMG).

To overcome these challenges, MinT incorporates simulations that provide detailed muscle activation information. Starting from simple pose sequences, we extract fine-grained muscle activation timings and interactions within the human musculoskeletal system. MinT contains over nine hours of simulation data, covering 227 subjects and 402 muscle strands, offering a comprehensive and scalable resource for further research into human motion.
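
The figures above imply a simple per-sequence structure: a time series of activation values for each of the 402 muscle strands. The following is a minimal, hypothetical sketch of inspecting one such sequence; the file name, NumPy on-disk format, and (frames x muscles) layout are assumptions of this sketch, not the published MinT format.

    # Hypothetical inspection of one MinT-style sequence; the path and
    # the (frames x muscles) array layout are assumptions of this sketch.
    import numpy as np

    activations = np.load("sample_sequence.npy")  # placeholder file
    n_frames, n_muscles = activations.shape
    assert n_muscles == 402  # muscle strands, per the dataset description

    # OpenSim-style activations are normalized to [0, 1], so per-muscle
    # means give a rough picture of exertion across the clip.
    print(f"{n_frames} frames x {n_muscles} muscle strands")
    print("mean activation per muscle:", activations.mean(axis=0))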

Please find additional information about this dataset and its usage at https://simplexsigil.github.io/mint

Simulation of muscle activations on the basis of motion capture data
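
For context, the standard OpenSim route from motion capture to muscle activations is inverse kinematics followed by static optimization. Below is a minimal sketch, assuming the official opensim Python bindings as in recent releases; all model and setup file names are placeholders, and the exact configuration used for MinT is documented on the project page.

    # Sketch of the generic OpenSim workflow the description above refers
    # to; model and setup files below are placeholders, not MinT artifacts.
    import opensim as osim

    # Musculoskeletal model whose muscle paths define the muscle strands.
    model = osim.Model("musculoskeletal_model.osim")  # placeholder
    model.initSystem()

    # Step 1: inverse kinematics fits model coordinates to the captured
    # motion (configured via an XML setup file in OpenSim).
    ik = osim.InverseKinematicsTool("ik_setup.xml")  # placeholder
    ik.run()

    # Step 2: static optimization distributes the joint moments across
    # muscles, yielding per-muscle activation time series.
    analyze = osim.AnalyzeTool("analyze_setup.xml")  # placeholder
    analyze.getAnalysisSet().cloneAndAppend(osim.StaticOptimization())
    analyze.run()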


Identifier
DOI https://doi.org/10.35097/VDPCEFSThBWlDPFL
Metadata Access https://www.radar-service.eu/oai/OAIHandler?verb=GetRecord&metadataPrefix=datacite&identifier=10.35097/VDPCEFSThBWlDPFL
Provenance
Creator Schneider, David
Publisher Computer Vision for Human Computer Interactions Lab (cv:hci), Institute for Anthropomatics and Robotics (IAR), Karlsruhe Institute of Technology
Contributor RADAR
Publication Year 2024
Funding Reference Carl-Zeiss-Stiftung (CZS), project JuBot - Jung bleiben mit Robotern (Staying Young with Robots)
Rights Open Access (info:eu-repo/semantics/openAccess)
OpenAccess true
Representation
Language English
Resource Type Dataset
Format application/x-tar
Discipline Computer Science; Computer Science, Electrical and System Engineering; Engineering Sciences