Inference Engine Samples

The Inference Engine sample applications are simple console applications that demonstrate how you can use the Inference Engine in your applications.

The OpenVINO™ toolkit sample and demo applications are available in the <INSTALL_DIR>/deployment_tools/inference_engine/samples directory. The difference between samples, tools and demos is the following:

The OpenVINO™ toolkit includes the following samples and tools:

The OpenVINO™ toolkit includes the following demos:

Media Files Available for Samples

To run the sample applications, you can use images and videos from the media files collection available at https://github.com/intel-iot-devkit/sample-videos.
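
For example, you can fetch the whole collection with Git and point a sample at one of the downloaded files; the destination directory below is only an illustration:

git clone https://github.com/intel-iot-devkit/sample-videos.git ~/sample-videos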

Samples that Support Pre-Trained Models

You can download the pre-trained models using the OpenVINO Model Downloader or from https://download.01.org/opencv/. The table below shows the correlation between models, samples, and supported plugins. The plugin names are exactly as they are passed to the samples with the -d option. For the correlation between plugins and supported devices, see the Supported Devices section. The samples are available in <INSTALL_DIR>/deployment_tools/inference_engine/samples.
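
For example, you can fetch a single model from the table below with the Model Downloader. The sketch assumes the downloader script is located in <INSTALL_DIR>/deployment_tools/tools/model_downloader and that the --name and -o options select the model and output directory; check the downloader documentation for your release if the path or options differ:

cd <INSTALL_DIR>/deployment_tools/tools/model_downloader
python3 downloader.py --name face-detection-adas-0001 -o ~/openvino_models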

NOTE: MYRIAD below stands for Intel® Movidius™ Neural Compute Stick, Intel® Neural Compute Stick 2, and Intel® Vision Accelerator Design with Intel® Movidius™ Vision Processing Units.

| Model | Samples supported on the model | CPU | GPU | MYRIAD | HETERO:FPGA,CPU |
| --- | --- | --- | --- | --- | --- |
| action-recognition-0001-decoder | Action Recognition Demo | Supported | Supported | | |
| action-recognition-0001-encoder | Action Recognition Demo | Supported | Supported | | |
| driver-action-recognition-adas-0002-decoder | Action Recognition Demo | Supported | Supported | | |
| driver-action-recognition-adas-0002-encoder | Action Recognition Demo | Supported | Supported | Supported | |
| resnet50-binary-0001 | Classification Sample | Supported | Supported | | |
| person-attributes-recognition-crossroad-0230 | Crossroad Camera Demo | Supported | Supported | Supported | |
| person-reidentification-retail-0031 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-reidentification-retail-0076 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-reidentification-retail-0079 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-vehicle-bike-detection-crossroad-0078 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| human-pose-estimation-adas-0001 | Human Pose Estimation Demo | Supported | Supported | Supported | |
| semantic-segmentation-adas-0001 | Image Segmentation Demo | Supported | Supported | | |
| instance-segmentation-security-0033 | Instance Segmentation Demo | Supported | Supported | Supported | |
| instance-segmentation-security-0049 | Instance Segmentation Demo | Supported | Supported | Supported | |
| age-gender-recognition-retail-0013 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| emotions-recognition-retail-0003 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| face-detection-adas-0001 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| face-detection-adas-binary-0001 | Interactive Face Detection Demo | Supported | Supported | | |
| face-detection-retail-0004 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| facial-landmarks-35-adas-0002 | Interactive Face Detection Demo | Supported | Supported | Supported | |
| license-plate-recognition-barrier-0001 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| vehicle-attributes-recognition-barrier-0039 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| vehicle-license-plate-detection-barrier-0106 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| face-reidentification-retail-0095 | Smart Classroom Demo | Supported | Supported | Supported | Supported |
| landmarks-regression-retail-0009 | Smart Classroom Demo | Supported | Supported | Supported | Supported |
| person-detection-action-recognition-0005 | Smart Classroom Demo | Supported | Supported | Supported | |
| person-detection-action-recognition-teacher-0002 | Smart Classroom Demo | Supported | Supported | Supported | |
| single-image-super-resolution-1032 | Super Resolution Demo | Supported | Supported | Supported | |
| single-image-super-resolution-1033 | Super Resolution Demo | Supported | Supported | Supported | |
| text-detection-0002 | Text Detection Demo | Supported | Supported | Supported | |
| text-recognition-0012 | Text Detection Demo | Supported | Supported | | |
| face-person-detection-retail-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| pedestrian-and-vehicle-detector-adas-0001 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| pedestrian-detection-adas-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| pedestrian-detection-adas-binary-0001 | any demo that supports SSD*-based models, above | Supported | Supported | | |
| person-detection-retail-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| person-detection-retail-0013 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| road-segmentation-adas-0001 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| vehicle-detection-adas-binary-0001 | any demo that supports SSD*-based models, above | Supported | Supported | | |
| vehicle-detection-adas-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |

*Several C++ samples referenced above have simplified equivalents in Python (<INSTALL_DIR>/deployment_tools/inference_engine/samples/python_samples)*.

Note that FPGA support comes through heterogeneous execution, for example, when the post-processing happens on the CPU.
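
The device string from the table is passed to a sample or demo with the -d option. For example, the following sketch runs the Security Barrier Camera Demo on the FPGA with CPU fallback; the binary name, model path, and input file below are placeholders, so check the demo's own documentation for the exact options:

./security_barrier_camera_demo -i <path_to_video>/input.mp4 -m <path_to_model>/vehicle-license-plate-detection-barrier-0106.xml -d HETERO:FPGA,CPU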

Build the Sample Applications

Build the Sample Applications on Linux*

The officially supported Linux* build environment is the following:

Use the following steps to build the sample applications on Linux:

NOTE: If you have installed the product as a root user, switch to root mode before you continue: sudo -i

  1. Navigate to a directory that you have write access to and create a samples build directory. This example uses a directory named build:
    mkdir build

    NOTE: If you ran the Image Classification demo, the samples build directory was already created in your home directory: /home/<user>/inference_engine_samples_build/

  2. Go to the created directory:
    cd build
  3. Run CMake to generate the Make files for release or debug configuration:
    • For release configuration:
      cmake -DCMAKE_BUILD_TYPE=Release <INSTALL_DIR>/deployment_tools/inference_engine/samples
    • For debug configuration:
      cmake -DCMAKE_BUILD_TYPE=Debug <INSTALL_DIR>/deployment_tools/inference_engine/samples
  4. Run make to build the samples:
    make

For the release configuration, the sample application binaries are in <path_to_build_directory>/intel64/Release/; for the debug configuration, they are in <path_to_build_directory>/intel64/Debug/.
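
For reference, the complete sequence from the steps above for a release build looks like the following; the -j flag for a parallel build is optional:

mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release <INSTALL_DIR>/deployment_tools/inference_engine/samples
make -j $(nproc)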

Build the Sample Applications on Microsoft Windows* OS

The recommended Windows* build environment is the following:

To build the sample applications for Windows, run the build_samples_msvc2015.bat or build_samples_msvc2017.bat batch file. These files create and build solutions for the sample code using Microsoft Visual Studio 2015 or 2017, respectively. For example, to build the samples using Microsoft Visual Studio 2015:

<INSTALL_DIR>\inference_engine\samples\build_samples_msvc2015.bat

The sample application binaries are in the following folder:

You can also build a generated solution yourself, for example, if you want to build binaries in the Debug configuration. Run the appropriate version of Microsoft Visual Studio and open the generated solution file from the following folders:
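
If you prefer the command line, a generated solution can also be built with MSBuild from a Developer Command Prompt. The solution name below is only an example; use the one created by the batch file:

msbuild Samples.sln /p:Configuration=Debug /p:Platform=x64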

Get Ready for Running the Sample Applications

Get Ready for Running the Sample Applications on Linux*

Before running compiled binary files, make sure your application can find the Inference Engine and OpenCV libraries. Run the setupvars script to set all necessary environment variables:

source <INSTALL_DIR>/bin/setupvars.sh

**(Optional)**: The OpenVINO environment variables are removed when you close the shell. You can instead set the environment variables permanently as follows:

  1. Open the .bashrc file in <user_home_directory>:
    vi <user_home_directory>/.bashrc
  2. Add this line to the end of the file:
    source /opt/intel/openvino/bin/setupvars.sh
  3. Save and close the file: press the Esc key and type :wq.
  4. To test your change, open a new terminal. You will see [setupvars.sh] OpenVINO environment initialized.

You are ready to run sample applications. To learn about how to run a particular sample, read the sample documentation by clicking the sample name in the samples list above.
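
For example, after building the samples and sourcing setupvars.sh, a typical invocation looks like this; the sample name, model, and image paths below are placeholders, and -m, -i, and -d are the options most samples accept:

cd <path_to_build_directory>/intel64/Release
./classification_sample_async -m <path_to_model>/model.xml -i <path_to_image>/image.png -d CPU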

Get Ready for Running the Sample Applications on Windows*

Before running compiled binary files, make sure your application can find the Inference Engine and OpenCV libraries. Use the setupvars script, which sets all necessary environment variables:

<INSTALL_DIR>\bin\setupvars.bat

To debug or run the samples on Windows in Microsoft Visual Studio, make sure you have properly configured Debugging environment settings for the Debug and Release configurations: set correct paths to the OpenCV libraries and to the debug and release versions of the Inference Engine libraries. For example, for the Debug configuration, go to the Debugging category in the project's Configuration Properties and set the PATH variable in the Environment field to the following:

PATH=<INSTALL_DIR>\deployment_tools\inference_engine\bin\intel64\Debug;<INSTALL_DIR>\opencv\bin;%PATH%

where <INSTALL_DIR> is the directory in which the OpenVINO toolkit is installed.

You are ready to run sample applications. To learn about how to run a particular sample, read the sample documentation by clicking the sample name in the samples list above.
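
For example, after running setupvars.bat in the same command prompt, a built sample can be started in a similar way; the sample name and paths below are placeholders:

cd <path_to_build_directory>\intel64\Release
classification_sample_async.exe -m <path_to_model>\model.xml -i <path_to_image>\image.png -d CPU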

See Also