Gesture Recognition Rock Paper Scissors

This project recognizes hand signs for the game of Rock-Paper-Scissors using deep learning. The system uses a convolutional neural network (CNN) to classify hand signs captured through a webcam, so a round can be played simply by showing a gesture to the camera. The application also includes features that make it accessible to visually impaired individuals, making it a fun and inclusive way to enjoy the game.
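The README does not spell out the exact network architecture. As a rough illustration only, a small Keras-style CNN for three-class hand-sign classification could look like the sketch below; the input resolution, layer widths, and training setup are assumptions, not the project's actual configuration.

    # Illustrative only: a small CNN for 3-class hand-sign classification.
    # Input size, layer widths, and class order are assumptions, not taken from this repo.
    from tensorflow.keras import layers, models

    NUM_CLASSES = 3  # rock, paper, scissors

    def build_model(input_shape=(150, 150, 3)):
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(128, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

The three softmax outputs correspond to the rock, paper, and scissors classes; the 98% test accuracy quoted below refers to the repository's own trained model, not to this sketch.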

Key Features

  • Webcam-Based Game: Users can play Rock-Paper-Scissors using hand signs captured by a webcam.
  • High Accuracy: The CNN model achieves a classification accuracy of over 98% on the test dataset.
  • Accessibility: The application is designed to be accessible to visually impaired individuals, providing an inclusive gaming experience.
  • Easy to Use: Simple controls allow users to interact with the game using keyboard shortcuts.

Installation

To run the project locally, follow these steps:

  1. Clone the repository to your local machine:

git clone https://github.com/GhufranBarcha/Gesture-Recognition-Rock-Paper-Scissors

  2. Install the required dependencies using pip:
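Assuming the repository ships a requirements.txt (typical dependencies would be TensorFlow/Keras and OpenCV), the install command would look like:

pip install -r requirements.txt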

  3. Download the pretrained model weights from the GitHub repository and place them in the project directory.

  4. Run the app.py script to start the webcam-based game:
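python app.py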

Usage

Once the application is running, follow these steps to play the game:

  1. Position your hand in front of the webcam, making a Rock, Paper, or Scissors gesture.
  2. Press the spacebar to capture an image of your hand sign.
  3. The system will classify the hand sign and display the result on the screen.
  4. Repeat the process to play additional rounds of the game.
  5. Press the ESC key to exit the game when finished.
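For reference, a minimal sketch of how such a spacebar-driven capture-and-classify loop could be implemented with OpenCV and a saved Keras model is shown below. The model file name, input size, and class-label order are illustrative assumptions and may not match app.py.

    # Minimal sketch of a spacebar-driven capture-and-classify loop.
    # The file name "model.h5", the 150x150 input size, and the label order are assumptions.
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    LABELS = ["paper", "rock", "scissors"]  # assumed class order
    model = load_model("model.h5")          # assumed weights file name

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Rock-Paper-Scissors", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == 27:       # ESC exits the game
            break
        if key == 32:       # spacebar captures the current frame
            img = cv2.resize(frame, (150, 150))
            img = img.astype("float32") / 255.0
            pred = model.predict(img[np.newaxis, ...])
            print("Detected:", LABELS[int(np.argmax(pred))])

    cap.release()
    cv2.destroyAllWindows()

The loop mirrors the controls described above: ESC ends the session, and each spacebar press classifies the current webcam frame and reports the detected gesture.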
