The increasing affordability and miniaturization of sensors such as accelerometers, gyroscopes, magnetometers, and heart rate monitors make it possible for people to record detailed information about their personal activities, such as sleep patterns, estimated calories burnt, distances run, and steps taken. This has caught on as a social trend known as the quantified self movement. Commercial devices such as the Fitbit, Jawbone Up, and Nike FuelBand have popularized this trend, making it increasingly easy to record and analyze such activity data.
Given that different activities require different coordinated movements across different body parts, such technology potentially lends itself to analyzing whether a particular activity is being performed correctly. This is known as Qualitative Activity Recognition (QAR).
This project examined the level of accuracy that can be achieved in detecting whether a person is performing a bicep curl correctly. It used data from people performing bicep curls with multiple sensors attached to key locations on their bodies. This data was fed through a Random Forest machine learning algorithm, which learned to detect when the exercise was performed with good form. The trained model achieves 99.08% accuracy on a held-out portion of the dataset.
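As a rough illustration of the approach described above, the sketch below trains a Random Forest on synthetic stand-in data and evaluates it on a held-out split. The synthetic features, the binary good-form/bad-form labels, and all parameter choices (number of trees, split sizes) are assumptions for demonstration only; the actual project used real body-worn sensor data and its own model settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for sensor features (accelerometer, gyroscope, etc.)
# and binary labels (good form vs. bad form). Purely illustrative.
X, y = make_classification(
    n_samples=1000,
    n_features=20,
    n_informative=10,
    n_classes=2,
    random_state=42,
)

# Hold out a portion of the data for evaluation, as in the project.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

# Fit a Random Forest and measure accuracy on the held-out set.
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
accuracy = accuracy_score(y_test, clf.predict(X_test))
print(f"Held-out accuracy: {accuracy:.4f}")
```

A held-out split like this estimates how the model generalizes to unseen repetitions, which is why the reported figure is on withheld data rather than the training set.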
The full report for this project, including the source code, can be viewed at the following link