Convert TensorFlow* BERT Model to the Intermediate Representation

Pre-trained models for BERT (Bidirectional Encoder Representations from Transformers) are publicly available.

Supported Models

Currently, the following models from the pre-trained BERT model list are supported:

Download the Pre-Trained BERT Model

Download and unzip an archive with the BERT-Base, Multilingual Uncased Model.

After the archive is unzipped, the directory uncased_L-12_H-768_A-12 is created and contains the following files:

    bert_config.json
    bert_model.ckpt.data-00000-of-00001
    bert_model.ckpt.index
    bert_model.ckpt.meta
    vocab.txt

Pre-trained model meta-graph files are bert_model.ckpt.*.

Convert TensorFlow BERT Model to IR

To generate the BERT Intermediate Representation (IR) of the model, run the Model Optimizer with the following parameters:

python3 ./mo_tf.py \
--input_meta_graph uncased_L-12_H-768_A-12/bert_model.ckpt.meta \
--output bert/pooler/dense/Tanh \
--disable_nhwc_to_nchw \
--input Placeholder{i32},Placeholder_1{i32},Placeholder_2{i32}
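The --input option marks the three placeholders as i32 inputs. In the reference BERT implementation these correspond, by assumption here, to the token IDs, the attention mask, and the segment IDs. A minimal sketch of preparing such inputs with NumPy, assuming a maximum sequence length of 128:

```python
import numpy as np

max_seq_len = 128  # assumed sequence length; must match the shape the IR is run with

# Hypothetical tokenized input "[CLS] hello world [SEP]", padded to max_seq_len.
# Real token IDs come from vocab.txt in the downloaded archive.
token_ids = [101, 7592, 2088, 102]

# Placeholder (assumed to be input_ids): token IDs, zero-padded.
input_ids = np.zeros((1, max_seq_len), dtype=np.int32)
input_ids[0, :len(token_ids)] = token_ids

# Placeholder_1 (assumed to be input_mask): 1 for real tokens, 0 for padding.
input_mask = np.zeros((1, max_seq_len), dtype=np.int32)
input_mask[0, :len(token_ids)] = 1

# Placeholder_2 (assumed to be segment_ids): all zeros for a single segment.
segment_ids = np.zeros((1, max_seq_len), dtype=np.int32)
```

All three arrays are int32, matching the {i32} casts in the command above; --output bert/pooler/dense/Tanh cuts the graph at the pooled sentence embedding, so the resulting IR takes these three tensors and produces that embedding.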