EMG: a Myo armband for people who can't talk

This was a group project (prototype) carried out by a team of computer scientists, several biomedical engineers, and one industrial engineer.

This code is incomplete and unfinished: it requires some minor changes as well as files that are not included here (such as the Myo SDK), and it needs the Myo armband hardware to run.

The goal of this project was to use the data provided by the Myo armband (arm position and hand gestures) to construct an alphabet.

The prototype produces a letter for each available gesture in each arm position.
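
As an illustration, the sketch below shows one way such a mapping could be organised in Python. The position names, gesture names, and letter assignments are made-up examples, not the mapping used by the prototype.

```python
# Hypothetical sketch of mapping (arm position, gesture) pairs to letters.
# The five gesture names match the Myo armband's built-in poses; the three
# position names and the letter assignments are invented for illustration.

GESTURES = ["fist", "wave_in", "wave_out", "fingers_spread", "double_tap"]
POSITIONS = ["low", "middle", "high"]

# Enumerate all position/gesture combinations and assign one letter to each.
ALPHABET = {
    combo: letter
    for letter, combo in zip(
        "abcdefghijklmno",
        [(pos, gesture) for pos in POSITIONS for gesture in GESTURES],
    )
}


def letter_for(position, gesture):
    """Return the letter assigned to this position/gesture pair, or None."""
    return ALPHABET.get((position, gesture))


if __name__ == "__main__":
    print(letter_for("middle", "fist"))  # -> "f" in this example mapping
```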

It can also "predict" words that are stored in a notebook file by taking the first letter(s) into account.
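
The prediction step can be sketched as a simple prefix lookup against a word list, as below. The file name `words.txt` (one word per line) is an assumption standing in for the project's notebook file.

```python
# Hypothetical sketch of the word "prediction": suggest words from a plain
# word list that start with the letters produced so far. "words.txt" is an
# assumed stand-in for the notebook file used by the prototype.


def load_words(path="words.txt"):
    """Read candidate words, one per line, skipping blank lines."""
    with open(path, encoding="utf-8") as f:
        return [line.strip().lower() for line in f if line.strip()]


def predict(prefix, words, limit=5):
    """Return up to `limit` words that start with the given prefix."""
    prefix = prefix.lower()
    return [word for word in words if word.startswith(prefix)][:limit]


if __name__ == "__main__":
    words = load_words()
    print(predict("he", words))  # e.g. ["hello", "help", ...] depending on the list
```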

Some links that may be useful:

- https://developerblog.myo.com/myo-unleashed-python/
- https://myo-python.readthedocs.io/en/latest/
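
For reference, a rough (untested) sketch of reading pose events with the myo-python bindings linked above might look like the following; the SDK path and the way poses would feed into the alphabet are assumptions, not the project's actual code.

```python
# Rough, untested sketch based on the myo-python documentation linked above.
# The sdk_path value is an assumption; the pose handling would need to be
# plugged into the alphabet/prediction logic of the prototype.
import myo


class PoseListener(myo.DeviceListener):
    def on_paired(self, event):
        print("Paired with", event.device_name)

    def on_pose(self, event):
        # event.pose is one of the armband's built-in poses
        # (rest, fist, wave_in, wave_out, fingers_spread, double_tap).
        print("Pose:", event.pose)


if __name__ == "__main__":
    myo.init(sdk_path="./myo-sdk")  # path to the Myo SDK (assumed)
    hub = myo.Hub()
    listener = PoseListener()
    while hub.run(listener.on_event, 500):
        pass
```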
