
Sign Language

The Sign Language app is an Android application which can translate static ASL and BSL signs, such as the fingerspelling alphabet. Translated signs are displayed to the user and can be combined to build sentences. The app is currently a proof of concept illustrating low-cost, freely available, offline Sign Language recognition using purely visual data.

The current beta version of this app can be tested here:

(Click the image below to watch the video demo.)

For a comprehensive step-by-step guide to using these applications, and some additional information on how they work, please see my new help repo.

Getting Started

  • Clone the repo onto your local machine by using:

    HTTPS: git clone https://github.com/Mquinn960/sign-language.git

    SSH: git clone git@github.com:Mquinn960/sign-language.git

  • Ensure the prerequisites below are installed/satisfied

  • If you're using Android Studio, load the project and hit run

Prerequisites

  • Load in a trained SVM XML file created using the Offline Trainer
    • https://github.com/Mquinn960/offline-trainer
    • You must place the file (named trained.xml) in the sign-language\app\src\main\res\raw\ directory; it is loaded the first time the app runs (see the loading sketch after this list)
    • Note: if you are just looking for a quick start, you can download a sample trained.xml here
  • Ensure you have an Android smartphone connected to your computer with USB debugging enabled
  • It is recommended that you use the Android Studio IDE noted below
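
The trained.xml file in res/raw is read the first time the app runs. The snippet below is a minimal sketch of how that loading step can look with OpenCV's Java bindings; the class name, method name and R.raw.trained reference are illustrative assumptions rather than the app's actual code, and it assumes the OpenCV native library has already been initialised.

```java
import android.content.Context;
import org.opencv.ml.SVM;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch only: OpenCV's SVM.load() expects a file path, so the raw
// resource is first copied out of the APK into app-private storage.
public final class SvmLoader {

    public static SVM loadTrainedSvm(Context context) throws IOException {
        // R.raw.trained corresponds to res/raw/trained.xml
        File xmlFile = new File(context.getFilesDir(), "trained.xml");
        try (InputStream in = context.getResources().openRawResource(R.raw.trained);
             FileOutputStream out = new FileOutputStream(xmlFile)) {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
        // Load the SVM model produced by the Offline Trainer
        return SVM.load(xmlFile.getAbsolutePath());
    }
}
```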

Development Environment

User Guide

For a comprehensive step-by-step guide to using these applications, and some additional information on how they work, please see my new help repo.

  • Using this app requires a trained.xml file, which contains the trained SVM model used to recognise your Sign Language gestures (a sketch of this classification step follows this list)
  • Follow the instructions in the Offline Trainer repo to create this file from training images, which you can either capture with the Dataset Creator app or source online
  • Add the trained.xml file to the app's raw resources folder as described in the Prerequisites above
  • Start the app, and point the smartphone camera at a person performing Sign Language alphabet gestures
  • Use the onscreen buttons to capture the translated gestures
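
Behind those buttons, each captured frame is reduced to a feature vector by the imaging kernel and classified by the loaded SVM, and the recognised letter can then be appended to the sentence being built. The snippet below sketches only that final classification step; the featureVector layout, the label-to-letter mapping and the class/method names are hypothetical, not the app's actual code.

```java
import org.opencv.core.Mat;
import org.opencv.ml.SVM;

// Sketch of the classification step only; the feature extraction that
// produces featureVector lives in the app's imaging kernel.
public final class GestureClassifier {

    // featureVector: a 1 x N CV_32F row vector describing one captured gesture
    public static char classify(SVM svm, Mat featureVector) {
        // StatModel.predict() returns the numeric class label chosen by the SVM
        float label = svm.predict(featureVector);
        // Hypothetical mapping: labels 0..25 correspond to letters A..Z
        return (char) ('A' + Math.round(label));
    }
}
```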

Exporting the imaging kernel

If you alter the Sign Language app and want to use its imaging kernel to train a new model with the Offline Trainer, you must first run the Gradle make-jar task.

  • Edit the app's Gradle build file sign-language\app\build.gradle
  • Comment out the entire com.android.application build step
  • Uncomment the com.android.library task (see the build file sketch after this list)
  • Perform a Gradle sync
  • Run the make-jar Gradle task
  • Find the exported imaging kernel at sign-language\app\build\outputs\jar\app-release-null.jar and import it into the Offline Trainer's "new" folder, as described in that repo's README
  • (Optional) Run any training you want in the Offline Trainer, then take the trained SVM XML file and import it into the Sign Language app as described in the User Guide above
  • Undo the two build-file changes above (restore com.android.application and re-comment com.android.library) and run the main app task again.
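
As a rough illustration of the two build-file changes above, the swap in sign-language\app\build.gradle amounts to something like the lines below; treat this as a sketch only, since the actual file in this repo may organise the application and library steps differently.

```groovy
// sign-language\app\build.gradle (illustrative sketch, not the repo's exact contents)

// apply plugin: 'com.android.application'   // commented out while exporting the kernel
apply plugin: 'com.android.library'          // uncommented so the make-jar task can run
```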

Built With

  • Java 8 - Java Programming Language
  • Android - Android OS
  • Gradle - Gradle build system
  • OpenCV - The Open Source Computer Vision Library

Contributing

  • Feel free to submit issues to this repository, but please include enough detail to make them actionable
  • Use Feature Branching where possible
  • Submit Pull Requests to @Mquinn960 for review

Authors

License

This project is licensed under the MIT License - see the LICENSE.md file for details
