Converting an ONNX* Model

Introduction to ONNX

ONNX* is a representation format for deep learning models. ONNX allows AI developers to easily transfer models between different frameworks, which helps them choose the best combination of tools for their task. Today, PyTorch*, Caffe2*, Apache MXNet*, Microsoft Cognitive Toolkit*, and other tools are developing ONNX support.

Supported Public ONNX Topologies

Model Name | Path to Public Models master branch
bert_large | model archive
bvlc_alexnet | model archive
bvlc_googlenet | model archive
bvlc_reference_caffenet | model archive
bvlc_reference_rcnn_ilsvrc13 | model archive
inception_v1 | model archive
inception_v2 | model archive
resnet50 | model archive
squeezenet | model archive
densenet121 | model archive
emotion_ferplus | model archive
mnist | model archive
shufflenet | model archive
VGG19 | model archive
zfnet512 | model archive
GPT-2 | model archive

The models listed above are built with operation set version 8, except the GPT-2 model. Models that are upgraded to higher operation set versions may not be supported.

Supported PyTorch* Models via ONNX Conversion

Starting from the 2019R4 release, the OpenVINO™ toolkit officially supports public PyTorch* models (from the torchvision 0.2.1 and pretrainedmodels 0.7.4 packages) via ONNX conversion. The list of supported topologies is presented below:

Package Name | Supported Models
Torchvision Models | alexnet, densenet121, densenet161, densenet169, densenet201, resnet101, resnet152, resnet18, resnet34, resnet50, vgg11, vgg13, vgg16, vgg19
Pretrained Models | alexnet, fbresnet152, resnet101, resnet152, resnet18, resnet34, resnet50, resnext101_32x4d, resnext101_64x4d, vgg11
ESPNet Models |

Supported PaddlePaddle* Models via ONNX Conversion

Starting from the R5 release, the OpenVINO™ toolkit officially supports public PaddlePaddle* models via ONNX conversion. The list of supported topologies downloadable from PaddleHub is presented below:

Model Name | Command to download the model from PaddleHub
MobileNetV2 | hub install mobilenet_v2_imagenet==1.0.1
ResNet18 | hub install resnet_v2_18_imagenet==1.0.0
ResNet34 | hub install resnet_v2_34_imagenet==1.0.0
ResNet50 | hub install resnet_v2_50_imagenet==1.0.1
ResNet101 | hub install resnet_v2_101_imagenet==1.0.1
ResNet152 | hub install resnet_v2_152_imagenet==1.0.1

NOTE: To convert a model downloaded from PaddleHub, use the paddle2onnx converter.
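As an illustration, the download-and-convert flow might look like the following shell commands. The paddle2onnx option names shown are an assumption and may differ between converter versions; check `paddle2onnx --help` for the exact flags:

```shell
# Download one of the supported models from PaddleHub.
hub install resnet_v2_50_imagenet==1.0.1

# Convert the downloaded (serialized) model to ONNX.
# --model_dir and --save_file are assumed paddle2onnx options.
paddle2onnx --model_dir <MODEL_DIR> --save_file resnet50.onnx
```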

The list of supported topologies from the models v1.5 package:

NOTE: To convert these topologies, first serialize the model by calling the (description) command, and then use the paddle2onnx converter.

Convert an ONNX* Model

The Model Optimizer process assumes you have an ONNX model that was directly downloaded from a public repository or converted from any framework that supports exporting to the ONNX format.

To convert an ONNX* model:

  1. Go to the <INSTALL_DIR>/deployment_tools/model_optimizer directory.
  2. Use the mo.py script to simply convert a model, specifying the path to the input model .onnx file:
    python3 mo.py --input_model <INPUT_MODEL>.onnx

There are no ONNX*-specific parameters, so only framework-agnostic parameters are available to convert your model.
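For example, a complete invocation using a few of those framework-agnostic parameters might look like this (the paths and shape are illustrative; --input_shape, --data_type, and --output_dir are standard Model Optimizer options):

```shell
# Convert an ONNX model, overriding the input shape, precision, and
# the directory where the generated IR files are written.
python3 mo.py --input_model model.onnx \
              --input_shape [1,3,224,224] \
              --data_type FP16 \
              --output_dir ./ir_output
```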

Supported ONNX* Layers

Refer to Supported Framework Layers for the list of supported standard layers.