Pre-trained models for BERT (Bidirectional Encoder Representations from Transformers) are publicly available.
Currently, the following models from the list of pre-trained BERT models are supported:
BERT-Base, Multilingual Cased
BERT-Base, Multilingual Uncased
Download and unzip the archive with the BERT-Base, Multilingual Uncased model.
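The download step can be sketched as follows. The URL below is a placeholder, not the real link: take the actual download link for "BERT-Base, Multilingual Uncased" from the official google-research/bert repository.

```shell
# Placeholder URL -- substitute the real "BERT-Base, Multilingual Uncased"
# link from the official google-research/bert README.
BERT_URL="https://storage.googleapis.com/bert_models/<release-date>/<model-name>.zip"

wget "$BERT_URL"                 # fetch the pre-trained model archive
unzip "$(basename "$BERT_URL")"  # unpack it into the model directory
```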
After the archive is unzipped, the directory
uncased_L-12_H-768_A-12 is created and contains the following files:
bert_config.json
bert_model.ckpt.data-00000-of-00001
bert_model.ckpt.index
bert_model.ckpt.meta
vocab.txt
Pre-trained model meta-graph files are bert_model.ckpt.*
To generate the BERT Intermediate Representation (IR) of the model, run the Model Optimizer with the following parameters:
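A sketch of the conversion command is shown below, assuming the archive was unzipped into uncased_L-12_H-768_A-12 in the current directory and that the Model Optimizer script mo.py is on the path. The output node name and the input placeholder names are assumptions based on the public BERT-Base graph; verify them against your model before running.

```shell
# Sketch: convert the BERT meta-graph to Intermediate Representation (IR).
# Paths, the output node, and the placeholder names below are assumptions --
# adjust them to match your installation and the actual graph.
MODEL_DIR=uncased_L-12_H-768_A-12

python3 mo.py \
    --input_meta_graph "$MODEL_DIR/bert_model.ckpt.meta" \
    --output bert/pooler/dense/Tanh \
    --input Placeholder{i32},Placeholder_1{i32},Placeholder_2{i32}
```

The three placeholders correspond to the input IDs, input mask, and segment IDs of the BERT graph; marking them as i32 tells the Model Optimizer to treat them as 32-bit integer inputs.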