This project classifies different human activities into their respective actions using the `LibSVM` library.
Gaussian Latent Dirichlet Allocation
This FMCW radar-based dataset for food-intake monitoring contains 70 meal sessions (4132 eating gestures and 893 drinking gestures) from 70 participants, with a total duration of 1155 minutes. Four eating styles (fork & knife, chopsticks, spoon, hand) are included.
ROS/ROS2 -- Navigation, Manipulation, Mimicking, Sensor Fusion, VR, Speech Recognition, Activity Recognition, Computer Vision
Official GitHub page of the arXiv publication "WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition"
Wearable Sensor based Human Activity Recognition with Recurrent Neural Networks.
Using random forest to recognize human activity.
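A random-forest approach like the one above can be sketched with scikit-learn. The windowed accelerometer features and two activity labels below are illustrative assumptions, not the repository's actual data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed accelerometer features:
# each row = [mean_x, mean_y, mean_z, std_x, std_y, std_z]
n_per_class = 200
walking = rng.normal([0.1, 0.2, 9.8, 1.5, 1.5, 1.2], 0.3, (n_per_class, 6))
sitting = rng.normal([0.0, 0.0, 9.8, 0.1, 0.1, 0.1], 0.3, (n_per_class, 6))
X = np.vstack([walking, sitting])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = walking, 1 = sitting

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

In practice the features would be computed over sliding windows of real sensor streams; the forest's per-feature importances (`clf.feature_importances_`) are a common way to inspect which statistics drive the predictions.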
Human Activity Recognition from Accelerometer Data
A Computer Vision Framework to run CV models and tasks.
The Human Activity Recognition database was built from recordings of 30 study participants performing activities of daily living (ADL) while carrying a waist-mounted smartphone with embedded inertial sensors. The objective is to classify each recording into one of the six activities performed.
A set of out-of-the-box dimensionality reduction techniques conducted on the HAR dataset, utilizing Python
Searching Efficient Models for Human Activity Recognition
Multi-view balance related body landmark (joints) dataset with synchronized center of pressure (CoP)
Human Activity Recognition Example Model Zoo
SmartRelationship Android App
A machine learning approach to predict the activities of a person.
In this project, we track people's physical activities using smartphone sensors placed at different positions on the body.
Self-Explainable Zero-shot Human Activity Recognition Network
The objective of this case study is to build a model that predicts human activities such as Walking, Walking Upstairs, Walking Downstairs, Standing, Sitting, or Lying, using the smartphone's built-in accelerometer (which measures acceleration) and gyroscope (which measures angular velocity).
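A classification task of this shape can be sketched with an RBF-kernel SVM (scikit-learn's `SVC` wraps LibSVM internally). The synthetic six-cluster feature matrix below stands in for the real accelerometer/gyroscope features and is purely an illustrative assumption:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
activities = ["Walking", "Walking Upstairs", "Walking Downstairs",
              "Standing", "Sitting", "Lying"]

# One synthetic cluster of 6-D accel/gyro summary features per activity.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(100, 6))
               for i in range(len(activities))])
y = np.repeat(np.arange(len(activities)), 100)

# Scale features, then fit an RBF-kernel SVM (LibSVM backend).
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Scaling before the SVM matters because the RBF kernel is distance-based; unscaled accelerometer and gyroscope channels with different units would otherwise dominate the kernel.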