As demand in the global IoT market rapidly evolves, an ever-increasing number of devices are being connected, generating tremendous amounts of data. At the same time, advances in deep learning have enabled us to tackle complex challenges by ingesting structured and unstructured data in the form of audio, video, text, and images, powering applications such as facial recognition.
Visual computing coupled with machine learning (ML), in particular, has become quite popular. These paired technologies can be found in a broad range of applications in the commercial, home security, consumer electronics, and anomaly detection markets, among others.
As one might imagine, training a deep neural network requires enormous computing resources, such as a powerful GPU to accelerate the process. Deep learning inference, however, requires considerably less computing power, making it possible to run computer vision and ML inference tasks on embedded hardware and software solutions such as Samsung’s ARTIK platform. To demonstrate this, we created Application Note 140 (AN140).
AN140 details the implementation of a deep learning facial recognition system capable of identifying the faces of six celebrities. The ML model is a deep neural network with several convolutional and fully connected layers. We ran the training process on a GPU-based computer, using a subset of Microsoft’s “MS-Celeb-1M” dataset (a collection of celebrity images) to teach the engine the subjects it would be identifying.
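AN140’s exact architecture is not reproduced here, but a network of this general shape — stacked convolutional layers followed by fully connected layers, ending in a six-way classifier — can be sketched with TensorFlow’s Keras API. The layer counts, filter sizes, and input resolution below are illustrative assumptions, not the values from the application note:

```python
import tensorflow as tf

def build_model(num_classes=6, input_shape=(96, 96, 3)):
    """Sketch of a small face-classification CNN.

    The sizes here are placeholders chosen for illustration; AN140's
    actual network may differ in depth, width, and input resolution.
    """
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        # Convolutional feature extractor
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        # Fully connected classifier head
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training such a model is then a matter of calling `model.fit` on the prepared face images and integer celebrity labels.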
Once training was complete, we used an ARTIK 710 system-on-module (mounted on a development board) as the embedded hardware to run the ML inference. The resulting ML engine achieved 91% facial recognition accuracy. As part of this project, we ported the TensorFlow™ ML library to the ARTIK 710, so you can now easily install the library from the binary and develop your own ML applications on ARTIK.
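On the device, inference reduces to feeding a preprocessed face image through the trained network and mapping the output back to a name. The post-processing step can be sketched with plain NumPy; the label names and the confidence-threshold rejection below are assumptions for illustration — AN140’s actual label set and decision logic are in the downloadable reference code:

```python
import numpy as np

# Placeholder labels: AN140 identifies six celebrities, but the actual
# names are not reproduced here.
LABELS = ["celebrity_0", "celebrity_1", "celebrity_2",
          "celebrity_3", "celebrity_4", "celebrity_5"]

def softmax(logits):
    """Convert raw network outputs into a probability distribution."""
    shifted = logits - np.max(logits)   # subtract max for numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum()

def classify(logits, threshold=0.5):
    """Map network outputs to a label; reject low-confidence predictions.

    The 0.5 rejection threshold is a hypothetical choice, not a value
    taken from AN140.
    """
    probs = softmax(np.asarray(logits, dtype=np.float64))
    idx = int(np.argmax(probs))
    if probs[idx] < threshold:
        return "unknown", float(probs[idx])
    return LABELS[idx], float(probs[idx])
```

For example, `classify([0.0, 5.0, 0.0, 0.0, 0.0, 0.0])` maps the strongly dominant second logit to `"celebrity_1"`.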
This project demonstrates ARTIK as a platform capable of running ML applications, especially those involving the inference process. For the complete package, you can download the documentation and reference code for AN140 here.