---
license: apache-2.0
tags:
- tensorflow-lite
- edge-ai
- asl-recognition
- mediapipe
- computer-vision
- gesture-recognition
library_name: tensorflow
inference: false
datasets: []
model-index:
- name: ASL-TFLite-Edge
  results: []
---

# ASL-TFLite-Edge

This repository contains a TensorFlow Lite model trained to recognize American Sign Language (ASL) fingerspelling gestures from hand landmark data. The model is optimized for real-time inference on edge devices.

## 🧠 Model Details

- **Format:** TensorFlow Lite (`.tflite`)
- **Input:** 64x64 RGB image (generated from hand landmarks via MediaPipe)
- **Output:** Softmax probabilities over 59 ASL character classes (including a padding token)
- **Frameworks:** TensorFlow, MediaPipe

## 📁 Files Included

- `asl_model.tflite` – the TFLite model file for ASL recognition
- `inference_args.json` – JSON file listing the columns selected from the parquet landmark data for inference
- `tflite_inference.py` – inference script that runs predictions on raw `.parquet` landmark files

## 🚀 How to Run Inference

You can clone the repository with Git LFS, or download the model file programmatically from Hugging Face with the `huggingface_hub` library (see the loading sketch in the appendix below).

### Clone the repository

```bash
git lfs install
git clone https://huggingface.co/ColdSlim/ASL-TFLite-Edge
cd ASL-TFLite-Edge
```

### Requirements

```bash
pip install -r requirements.txt
```

### Run the Script

```bash
python tflite_inference.py path/to/sample.parquet
```

### Output

```bash
Predicted class index: 5
```

> 🔍 You can map this class index back to its character using the `char_to_num` mapping used during training.

## 📌 Example Workflow

1. Extract right-hand landmark data with MediaPipe and store it in a `.parquet` file (see the landmark-extraction sketch in the appendix below).
2. Ensure the file contains the same `selected_columns` listed in `inference_args.json`.
3. Run `tflite_inference.py` to get the predicted class.

## 🧾 License

This project is licensed under the Apache 2.0 License.

## 👨‍💻 Author

Developed by Manik Sheokand for sign language fingerspelling recognition on edge devices using TensorFlow Lite.
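
## 🧪 Appendix: Loading the Model with `huggingface_hub` (Sketch)

A minimal sketch of downloading `asl_model.tflite` from the Hub and running it through the TensorFlow Lite interpreter. The dummy input, the final `argmax`, and the commented `num_to_char` reverse mapping are assumptions for illustration only; see `inference_args.json` and `tflite_inference.py` for the actual preprocessing, and use your training-time `char_to_num` mapping for label names.

```python
import numpy as np
import tensorflow as tf
from huggingface_hub import hf_hub_download

# Download the TFLite model file from this repository on the Hub.
model_path = hf_hub_download(repo_id="ColdSlim/ASL-TFLite-Edge",
                             filename="asl_model.tflite")

# Load the model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path=model_path)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input shaped and typed to match the model's first input tensor.
# Replace this with a real preprocessed sample (see tflite_inference.py).
dummy_input = np.zeros(input_details[0]["shape"],
                       dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])

# Take the most likely class of the first prediction in the output tensor.
predicted_index = int(np.argmax(probs, axis=-1).ravel()[0])
print("Predicted class index:", predicted_index)

# Hypothetical: map the index back to a character with the training-time
# char_to_num mapping, e.g.
#   num_to_char = {v: k for k, v in char_to_num.items()}
#   print("Predicted character:", num_to_char[predicted_index])
```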
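
## ✋ Appendix: Extracting Hand Landmarks to Parquet (Sketch)

A minimal sketch of step 1 of the workflow above: extracting hand landmarks with MediaPipe's Hands solution and writing them to a `.parquet` file. The input video name and the column naming scheme (`x_right_hand_0`, `y_right_hand_0`, …) are assumptions; the real column set must match `selected_columns` in `inference_args.json`.

```python
import cv2
import mediapipe as mp
import pandas as pd

rows = []
hands = mp.solutions.hands.Hands(static_image_mode=False,
                                 max_num_hands=1,
                                 min_detection_confidence=0.5)

cap = cv2.VideoCapture("sign_video.mp4")  # hypothetical input video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        continue
    # Take the first detected hand (max_num_hands=1); optionally verify it is
    # the right hand via results.multi_handedness before keeping the frame.
    landmarks = results.multi_hand_landmarks[0].landmark
    row = {}
    for i, lm in enumerate(landmarks):
        # Assumed column names -- replace with those in inference_args.json.
        row[f"x_right_hand_{i}"] = lm.x
        row[f"y_right_hand_{i}"] = lm.y
        row[f"z_right_hand_{i}"] = lm.z
    rows.append(row)

cap.release()
hands.close()

# Writing parquet requires pyarrow or fastparquet to be installed.
pd.DataFrame(rows).to_parquet("sample.parquet", index=False)
```

The resulting file can then be passed to the inference script, e.g. `python tflite_inference.py sample.parquet`.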