Humans express six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. At Sagacify, we used facial recognition technology to build a machine learning agent that can detect these emotions on human faces.
People sometimes find it difficult to read emotions themselves, so one can easily imagine how hard it is for a machine. The project required two models, both based on deep convolutional neural networks (CNNs): the first handles face detection, while the second specializes in emotion detection. The face detection model was trained on a dataset of thousands of images, while for the emotion model we created a labeled dataset of RGB images and video clips.
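To give a rough idea of what the second model could look like, here is a minimal Keras sketch of a convolutional classifier over the six emotion classes. It is not Sagacify's actual architecture: the input size, layer sizes, and preprocessing are assumptions made purely for illustration.

```python
# Illustrative sketch of an emotion-classification CNN (not the production model).
# Assumes 64x64 RGB face crops and the six basic emotion classes.
import tensorflow as tf
from tensorflow.keras import layers, models

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def build_emotion_cnn(input_shape=(64, 64, 3), num_classes=len(EMOTIONS)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        # One probability per emotion class.
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```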
Both models are then combined to recognize emotions in real time. The first model detects and extracts every face in each frame of the video stream; the second model then analyzes each facial expression and predicts the emotion shown on the user's face.
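The following sketch shows how such a real-time loop can fit together. It substitutes OpenCV's bundled Haar cascade for the CNN-based face detector described above and assumes an `emotion_model` like the one sketched earlier; all names and parameters are illustrative, not Sagacify's implementation.

```python
# Illustrative per-frame pipeline: detect faces, crop them, classify the emotion.
import cv2
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Stand-in detector: OpenCV's pretrained frontal-face Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def annotate_frame(frame, emotion_model):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        # Crop the face, convert BGR -> RGB, and scale to the model's input size.
        face = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
        face = cv2.resize(face, (64, 64)).astype("float32") / 255.0
        probs = emotion_model.predict(face[np.newaxis, ...], verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        # Draw the detection box and the predicted emotion on the frame.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame
```

In practice, a loop would read frames from `cv2.VideoCapture`, pass each one through `annotate_frame`, and display or stream the annotated result.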
But what’s in it for companies? In the coming years, this technology could become a valuable asset across a range of fields.
It can already prove useful to some businesses today, and by working with new clients, Sagacify will build the expertise needed to develop new applications of this technology. Interested? Contact us and we will be happy to discuss the possibilities.
More information about the process can be found in our thoughts section.