Sign Language Interpreter

Currently the project only has a command-line interface, but I am planning to improve this and add a GUI.

Third-year university project: an infrared device called a 'Leap Motion' and its API are used to gather hand, joint, and finger coordinates, which are then used to train an AI model to translate British Sign Language (BSL).
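For context, the snippet below is a minimal sketch of what pulling coordinates from the Leap Motion API looks like. It is not the project's actual code: it assumes the Leap Motion v2 SDK's Python 2.7 bindings (the Leap module) are importable, and the function name collect_frame is illustrative only.

    # Minimal sketch (Python 2.7): poll the Leap Motion controller for palm and
    # fingertip coordinates. Assumes the v2 SDK's Python bindings (Leap.py /
    # LeapPython) are on the path; collect_frame is an illustrative name, not
    # part of DataExtractor.py.
    import Leap

    def collect_frame(controller):
        """Return a flat list of x, y, z coordinates for one tracking frame."""
        frame = controller.frame()           # most recent frame from the device
        coords = []
        for hand in frame.hands:
            palm = hand.palm_position        # Leap.Vector, in millimetres
            coords.extend([palm.x, palm.y, palm.z])
            for finger in hand.fingers:
                tip = finger.tip_position
                coords.extend([tip.x, tip.y, tip.z])
        return coords

    if __name__ == "__main__":
        controller = Leap.Controller()
        print(collect_frame(controller))     # one sample; a real capture loops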

Folder Structure

  • .ipynb_checkpoints - contains all files necessary for Jupyter Notebook to work || DO NOT ALTER.

  • model_training - .ipynb source code files that are opened in Jupyter Notebook.

    • .ipynb_checkpoints - contains all files necessary for Jupyter Notebook to work || DO NOT ALTER.
    • data - contains the hand-tracking data used when developing the models
    • model_testing - all the output files of my model-testing script
    • saved_models - all the different models that I have developed and saved for later use (see the sketch after this list)
  • scripts - where the main application scripts reside; DataExtractor.py is the driver module.

  • tests - some screenshots of graphs and confusion matrices of different models
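The models in saved_models are meant to be reloaded later for prediction. The snippet below is a hypothetical sketch only: the filename bsl_model.pkl, the pickle format, and the 18-value feature vector (palm plus five fingertips, x/y/z each) are assumptions; check saved_models and the notebooks in model_training for the real names and feature layout.

    # Hypothetical sketch: load a model from saved_models and classify one
    # flattened coordinate vector. Filename, format and feature count are
    # assumptions, not taken from the project.
    import pickle

    with open("saved_models/bsl_model.pkl", "rb") as f:
        model = pickle.load(f)

    # One frame's worth of features: palm + five fingertips, x/y/z each.
    sample = [0.0] * 18
    print(model.predict([sample]))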

Prerequisites

  • A Leap Motion Controller is connected to the computer via USB, and its drivers have been installed.
  • Python 2.7 is installed; take note of its install directory.
  • The python.exe file in your Python 2.7 folder has been renamed to python27.exe, if not already done (a quick check is shown after this list).
  • Python 3.9 is installed.
  • All Python dependencies and packages are installed.
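To confirm both interpreters are set up as expected, run the following in a command prompt (a sanity check only; exact version numbers will vary):

    python27 --version    (should report Python 2.7.x)
    python --version      (should report Python 3.9.x)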

How to run

  1. Make sure all the prerequisites are met.
  2. Run the command prompt (run as administrator if you encounter any problems).
  3. Navigate to the project's "scripts" folder using the 'cd' command. Example: cd C:\Users\winba\Desktop\TYP_SUPPLEMENTARY\scripts
  4. Run the DataExtractor script using python27. Example: python27 DataExtractor.py
  5. Follow on-screen instructions.
