Inference Engine Developer Guide

Introduction to Intel® Distribution of OpenVINO™ toolkit

The Intel® Distribution of OpenVINO™ toolkit is a comprehensive toolkit that you can use to develop and deploy vision-oriented solutions on Intel® platforms. Vision-oriented means the solutions use images or videos to perform specific tasks. Example use cases include autonomous navigation, digital surveillance cameras, robotics, and mixed-reality headsets.

The Intel® Distribution of OpenVINO™ toolkit includes the following components: the Deep Learning Model Optimizer, which converts trained models into the Inference Engine format; the Deep Learning Inference Engine, a unified API for high-performance inference on Intel® hardware; a set of sample applications; and a collection of pre-trained, optimized models.

This guide provides an overview of the Inference Engine, describes the typical workflow for performing inference on a pre-trained and optimized deep learning model, and introduces a set of sample applications.

NOTE: Before you perform inference with the Inference Engine, your models must be converted to the Inference Engine format using the Model Optimizer. To learn how to use the Model Optimizer, refer to the Model Optimizer Developer Guide. To learn about the pre-trained and optimized models delivered with the Intel® Distribution of OpenVINO™ toolkit, refer to Pre-Trained Models.
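As a minimal sketch of this workflow, the following C++ example loads a model that has already been converted to the Inference Engine format and runs synchronous inference on the CPU. It assumes the Core API available in recent Inference Engine releases; the model paths "model.xml" and "model.bin" are placeholders, and real applications must also fill the input blobs with preprocessed data before calling Infer().

```cpp
#include <inference_engine.hpp>
#include <iostream>

int main() {
    try {
        // Core object that discovers and manages the available inference devices.
        InferenceEngine::Core core;

        // Read a network converted to IR format by the Model Optimizer.
        // "model.xml" / "model.bin" are placeholder paths for this sketch.
        InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml", "model.bin");

        // Load the network onto a target device ("CPU" here; "GPU", "MYRIAD", etc. are also possible).
        InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");

        // Create an inference request and run synchronous inference.
        InferenceEngine::InferRequest request = executable.CreateInferRequest();

        // In a real application, fill each input blob with preprocessed image data here,
        // e.g. via request.GetBlob(inputName) for every entry of network.getInputsInfo().
        request.Infer();

        // Retrieve the output blobs for post-processing.
        for (const auto& output : network.getOutputsInfo()) {
            InferenceEngine::Blob::Ptr outputBlob = request.GetBlob(output.first);
            std::cout << "Output '" << output.first << "' has "
                      << outputBlob->size() << " elements" << std::endl;
        }
    } catch (const std::exception& ex) {
        std::cerr << "Inference failed: " << ex.what() << std::endl;
        return 1;
    }
    return 0;
}
```

The sample applications delivered with the toolkit follow the same basic sequence: read the network, load it onto a device, prepare inputs, run inference, and process the outputs.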

Typical Next Step: Introduction to Intel® Deep Learning Deployment Toolkit