How to Build a Deep Learning Classification System for Less than $600 USD
Introduction
Deep learning is set to alter the machine vision landscape in a big way. It is enabling new applications and disrupting established markets. As a product manager with FLIR, I have the privilege of visiting companies across a diverse range of industries; every company I visited this year is working on deep learning. It's never been easier to get started, but where do you begin? This article provides an easy-to-follow framework for building a deep learning inference system for less than $600.
What is deep learning inference?
Inference is the use of a deep-learning-trained neural network to make predictions on new data. Inference is far better at answering complex and subjective questions than traditional rules-based image analysis. By optimizing networks to run on low-power hardware, inference can be run “on the edge”, near the data source. This eliminates the system’s dependence on a central server for image analysis, leading to lower latency, higher reliability, and improved security.
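As a simple illustration of what a classification network's inference step produces, the sketch below uses plain Python with made-up logits and labels (both are assumptions for illustration, not output from any real network): it converts a network's raw output scores into class probabilities with a softmax and reports the most likely label.

```python
import math

def softmax(logits):
    """Convert raw network outputs (logits) into probabilities that sum to 1."""
    # Subtract the max logit for numerical stability before exponentiating.
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw outputs from a 3-class defect classifier.
labels = ["scratch", "dent", "no_defect"]
logits = [2.0, 1.0, 0.1]

probs = softmax(logits)
best = max(range(len(probs)), key=probs.__getitem__)
print(labels[best], round(probs[best], 2))  # prints: scratch 0.66
```

Running this on the edge means the camera or single-board computer executes exactly this kind of computation locally, so only the final label (not the full image) needs to leave the device.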
1. Selecting the hardware
The goal of this guide is to build a reliable, high-quality system to deploy in the field. While it is beyond the scope of this guide, combining traditional computer vision techniques with deep learning inference can deliver high accuracy and computational efficiency by leveraging the strengths of each approach. The Aaeon UP Squared-Celeron-4GB-32GB single-board computer has the memory and CPU power required for this approach. Its X64 Intel CPU runs the same software as traditional desktop PCs, simplifying development compared to ARM-based, single-board computers (SBCs).
The code that enables deep learning inference uses branching logic; dedicated hardware can greatly accelerate the execution of this code. The Intel® Movidius™ Myriad™ 2 Vision Processing Unit (VPU) is a very powerful and efficient inference accelerator, which has been integrated into our new inference camera, the Firefly DL.
| Part | Part Number | Price [USD] |
|------|-------------|-------------|
| USB3 Vision deep learning enabled camera | | 299 |
| Single-board computer | UP Squared-Celeron-4GB-32GB-PACK | 239 |
| 3 m USB 3 cable | | 10 |
| Lens | | 10 |
| Software | Ubuntu 16.04/18.04, TensorFlow, Intel NCSDK, FLIR Spinnaker SDK | 0 |
| **Total** | | **558** |
2. Software Requirements
There are many free tools available for building, training and deploying deep learning inference models. This project uses an array of free and open source software. Installation instructions for each software package are available on their respective websites. This guide assumes you are familiar with the basics of the Linux console.
1. Collect training data
2. Train network (augmentation optional)
3. Evaluate performance
4. Convert to Movidius graph format
5. Deploy to Firefly DL camera
6. Run inference on captured images

Fig. 1. Deep learning inference workflow and the associated tools for each step.
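The train-and-convert steps of this workflow can be sketched from the terminal. The script path, model choice, flag values, and layer names below are assumptions for illustration only; the Getting Started guide linked in the next section gives the exact commands for the Firefly DL.

```shell
# Step 2: retrain an image classifier on your own labeled images using
# TensorFlow's transfer-learning script (script location and flags vary
# by TensorFlow version).
python retrain.py \
    --image_dir ./training_images \
    --output_graph retrained_graph.pb \
    --output_labels retrained_labels.txt

# Step 4: convert the frozen TensorFlow graph to a Movidius graph file
# with the Intel NCSDK compiler. The input/output node names ("input",
# "final_result") depend on the network you trained.
mvNCCompile retrained_graph.pb \
    -in input \
    -on final_result \
    -o inference.graph
```

The resulting `inference.graph` file is what gets deployed to the Firefly DL's on-board Myriad 2 VPU in step 5.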
3. Detailed Guide
Getting Started with Firefly Deep Learning on Linux provides an introduction to retraining a neural network, converting the resulting file into a Firefly-compatible format, and displaying the results using SpinView. It walks users through the step-by-step process of training and converting inference networks themselves from the terminal.
Related Articles
Neural Networks Supported by the Firefly Deep Learning shows which neural networks we have tested and verified to work on the Firefly-DL.
Tips on creating Training Data for Deep Learning Neural Networks goes over how to create an efficient deep learning neural network by creating high-quality training data for a given application.
Troubleshooting neural network graph conversion issues provides helpful tips on issues that may occur when converting inference network files to a Firefly-compatible format, and how to resolve them.