
AI Case Study

SignAll translates spoken English into sign language, enabling better communication for the hearing impaired using vision processing and natural language processing

Using vision processing to understand hand movements and body language, combined with machine learning and natural language processing, SignAll has developed a proprietary platform that translates spoken language into written messages or hand signals, enabling easier communication for people with hearing difficulties.

Industry

Public And Social Sector

NGO

Project Overview

"Sign Language is a complex and nuanced language that consists of different elements. To translate accurately, it is imperative to be able to detect: Body movement, facial expressions, hand shapes.

SignAll has developed innovative patent-pending technology combining computer vision, machine learning and natural-language-processing algorithms.
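To illustrate how these three elements might fit together, the sketch below is a hypothetical recognition pipeline, not SignAll's code: the vision stage is stubbed out as pre-extracted hand landmarks, the machine-learning stage is a simple nearest-centroid classifier over landmark clips, and the language stage is a toy gloss-to-English step. All names, dimensions and data are assumptions made for illustration.

# Hypothetical sketch only -- not SignAll's implementation.
# Stage 1 (vision): per-frame hand landmarks (assumed already extracted).
# Stage 2 (machine learning): nearest-centroid matching of a clip to a sign "gloss".
# Stage 3 (NLP): naive assembly of glosses into an English sentence.
import numpy as np

N_FRAMES = 30      # frames per sign clip (assumed)
N_LANDMARKS = 21   # landmarks per hand, as in common hand-tracking models
N_DIMS = 3         # x, y, z coordinates

def extract_features(clip: np.ndarray) -> np.ndarray:
    """Flatten a clip of hand landmarks (N_FRAMES x N_LANDMARKS x N_DIMS)
    into a single feature vector."""
    return clip.reshape(-1)

class NearestCentroidSignClassifier:
    """Match a feature vector to the closest per-gloss centroid learned
    from labelled example clips."""
    def __init__(self):
        self.centroids = {}  # gloss label -> mean feature vector

    def fit(self, clips_by_gloss: dict) -> None:
        for gloss, clips in clips_by_gloss.items():
            feats = np.stack([extract_features(c) for c in clips])
            self.centroids[gloss] = feats.mean(axis=0)

    def predict(self, clip: np.ndarray) -> str:
        feat = extract_features(clip)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(feat - self.centroids[g]))

def glosses_to_english(glosses: list) -> str:
    """Toy language step: real systems need grammar-aware translation;
    this simply joins the recognised glosses."""
    return " ".join(glosses).capitalize() + "."

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training data standing in for recorded sign clips.
    train = {
        "hello": [rng.normal(0.2, 0.05, (N_FRAMES, N_LANDMARKS, N_DIMS)) for _ in range(5)],
        "thank you": [rng.normal(0.8, 0.05, (N_FRAMES, N_LANDMARKS, N_DIMS)) for _ in range(5)],
    }
    clf = NearestCentroidSignClassifier()
    clf.fit(train)

    # A new "clip" resembling the "hello" cluster.
    clip = rng.normal(0.2, 0.05, (N_FRAMES, N_LANDMARKS, N_DIMS))
    print(glosses_to_english([clf.predict(clip)]))  # -> "Hello."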

Reported Results

Proof of concept; results not yet available

Technology

Function

R&D

Product Development

Background

"Over 100 million people are unable to hear. Being deaf from birth or childhood, many of these people use sign language as their primary form of communication."

Benefits

Data
