Object Detection toolkit based on PaddlePaddle. It supports object detection, instance segmentation, multiple object tracking and real-time multi-person keypoint detection.
Human Activity Recognition example using TensorFlow on a smartphone sensor dataset and an LSTM RNN. Classifying the type of movement among six activity categories - Guillaume Chevalier
Python implementation of a KNN classifier using the DTW (dynamic time warping) distance
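The KNN-with-DTW approach can be sketched in plain Python. This is a minimal illustration of the general technique, not code from the repository; the function names are my own.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = DTW distance between prefixes a[:i] and b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def knn_classify(query, train, k=1):
    """k-NN over labeled sequences using DTW distance.

    `train` is a list of (sequence, label) pairs.
    """
    neighbors = sorted(train, key=lambda sl: dtw_distance(query, sl[0]))[:k]
    labels = [lbl for _, lbl in neighbors]
    return max(set(labels), key=labels.count)
```

Because DTW allows elastic stretching along the time axis, `dtw_distance([1, 2, 3], [1, 1, 2, 2, 3, 3])` is 0.0 even though the sequences differ in length, which is why DTW pairs so well with nearest-neighbor classification of variable-speed activity signals.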
Real-Time Spatio-Temporally Localized Activity Detection by Tracking Body Keypoints
Convolutional Neural Network for Human Activity Recognition in Tensorflow
[IJCAI-21] "Time-Series Representation Learning via Temporal and Contextual Contrasting"
Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we perform Human Activity Recognition (HAR), classifying the type of movement among 6 or 18 categories on two different datasets.
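The LSTM cell that such recurrent HAR models stack and stretch over time can be sketched for one timestep in plain Python, assuming the standard gate equations (this is a generic textbook cell for scalar input and state, not the repository's implementation):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One timestep of a standard LSTM cell with scalar input and state.

    W maps each gate name ("i", "f", "o", "g") to (w_x, w_h, bias).
    Standard equations: c = f*c_prev + i*g ; h = o*tanh(c).
    """
    gate = lambda name, act: act(W[name][0] * x + W[name][1] * h_prev + W[name][2])
    i = gate("i", sigmoid)    # input gate
    f = gate("f", sigmoid)    # forget gate
    o = gate("o", sigmoid)    # output gate
    g = gate("g", math.tanh)  # candidate cell value
    c = f * c_prev + i * g    # new cell state
    h = o * math.tanh(c)      # new hidden state
    return h, c
```

Real models use vector states and learned weight matrices, run the cell in both time directions (bidirectional), and add residual connections between stacked layers, but each layer still reduces to this per-step update.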
MotionSense Dataset for Human Activity and Attribute Recognition (time-series data generated by smartphone sensors: accelerometer and gyroscope) (PMC Journal) (IoTDI'19)
Unity's privacy-preserving human-centric synthetic data generator
Quickly add MediaPipe Pose Estimation and Detection to your iOS app, enabling powerful features driven by body or hand tracking.
An up-to-date, curated list of awesome IMU-based Human Activity Recognition (ubiquitous computing) papers, methods, and resources. Note that most of the collected research is based on IMU data.
[TKDD 2023] AdaTime: A Benchmarking Suite for Domain Adaptation on Time Series Data
Abnormal Human Behavior Detection / Road Accident Detection from Surveillance Videos / Real-World Anomaly Detection in Surveillance Videos / C3D Feature Extraction
Multi Person Skeleton Based Action Recognition and Tracking
Human Activity Recognition based on WiFi Channel State Information
Classifying the physical activities performed by a user based on accelerometer and gyroscope sensor data collected by a smartphone in the user’s pocket. The activities to be classified are: Standing, Sitting, Stairsup, StairsDown, Walking and Cycling.
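A common first step for this kind of smartphone-sensor classification is to segment the raw accelerometer/gyroscope stream into fixed-length overlapping windows and compute simple statistics per window. The sketch below illustrates that generic preprocessing step; the window size, step, and feature choices are illustrative assumptions, not taken from the repository:

```python
import math

def sliding_windows(samples, size=128, step=64):
    """Split a 1-D signal into overlapping fixed-length windows."""
    return [samples[i:i + size]
            for i in range(0, len(samples) - size + 1, step)]

def window_features(window):
    """Mean and standard deviation of one window -- a minimal feature vector."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return (mean, math.sqrt(var))
```

Each per-window feature vector (typically extended with per-axis min/max, energy, and correlation features) then feeds any standard classifier to predict one of the activity labels.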
This repository provides the code and data used in our paper "Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art", in which we implement and evaluate several state-of-the-art approaches, ranging from methods based on handcrafted features to convolutional neural networks.
Human Activity Recognition using Channel State Information
Self-supervised learning for wearables using the UK-Biobank (>700,000 person-days)
Implementation of action recognition using a 3D ConvNet on the UCF-101 dataset.