Introduction to Inference Engine Device Query API

This section provides a high-level description of the process of querying different device properties and configuration values. Refer to the Hello Query Device Sample sources and the Multi-Device Plugin guide for examples of using the Inference Engine Query API in user applications.

Using the Inference Engine Query API in Your Code

The Inference Engine Core class provides the following methods to query device information and to get or set device configuration properties: GetAvailableDevices, GetConfig, and GetMetric, described below.

The InferenceEngine::ExecutableNetwork class also supports the Query API through its GetMetric, GetConfig, and SetConfig methods.

Query API in the Core Class

GetAvailableDevices

std::vector<std::string> availableDevices = core.GetAvailableDevices();

The function returns a list of available devices, for example:

MYRIAD.1.2-ma2480
MYRIAD.1.4-ma2480
FPGA.0
FPGA.1
CPU
GPU.0
GPU.1
...
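As the list above shows, device names follow a `DEVICE.INSTANCE` pattern, optionally with an architecture suffix such as `-ma2480`. The split can be sketched with only the standard library; the helper name `splitDeviceName` is illustrative and not part of the Inference Engine API:

```cpp
#include <cassert>
#include <string>
#include <utility>

// Split a device name such as "GPU.1" or "MYRIAD.1.2-ma2480" into the
// device type ("GPU", "MYRIAD") and the instance part ("1", "1.2-ma2480").
// A name without a dot, such as "CPU", has an empty instance part.
std::pair<std::string, std::string> splitDeviceName(const std::string& name) {
    const auto dot = name.find('.');
    if (dot == std::string::npos) {
        return {name, ""};
    }
    return {name.substr(0, dot), name.substr(dot + 1)};
}
```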

Each device name can then be passed to InferenceEngine::Core::LoadNetwork, as well as to the GetConfig and GetMetric calls shown below, to address a specific device.

GetConfig()

The code below demonstrates how to check whether the HETERO device dumps .dot files with split graphs during the split stage:

bool dumpDotFile = core.GetConfig("HETERO", HETERO_CONFIG_KEY(DUMP_GRAPH_DOT)).as<bool>();

For documentation on common configuration keys, refer to ie_plugin_config.hpp. Device-specific configuration keys can be found in the corresponding plugin folders.
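Inference Engine configuration values are conventionally carried as strings such as "YES" and "NO" (see PluginConfigParams in ie_plugin_config.hpp), which `.as<bool>()` interprets for you. The mapping can be sketched with the standard library alone; the helper name `configToBool` is illustrative, not part of the API:

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

// Map the conventional "YES"/"NO" configuration strings to a bool,
// rejecting anything else rather than guessing.
bool configToBool(const std::string& value) {
    if (value == "YES") return true;
    if (value == "NO")  return false;
    throw std::invalid_argument("unexpected config value: " + value);
}
```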

GetMetric()

  • To extract device properties such as available devices, device name, supported configuration keys, and others, use the InferenceEngine::Core::GetMetric method:
std::string cpuDeviceName = core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>();

A returned value looks as follows: Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz.

NOTE: All metrics have a specific type, which is specified during metric instantiation. The list of common device-agnostic metrics can be found in ie_plugin_config.hpp. Device-specific metrics (for example, for HDDL or MYRIAD devices) can be found in the corresponding plugin folders.
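The NOTE above matters in practice: a metric value is stored with a specific type, and `.as<T>()` only succeeds when `T` matches it. The rule behaves much like `std::any`, which the sketch below uses as an analogy (this illustrates the typing rule only, not the actual Parameter implementation):

```cpp
#include <any>
#include <cassert>
#include <string>

// A metric value stored with a specific type, as the plugin instantiated it.
static const std::any metricValue =
    std::string("Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz");

// Reading the value back with the matching type succeeds ...
bool readWithCorrectType() {
    return std::any_cast<std::string>(metricValue) ==
           "Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz";
}

// ... while asking for a different type throws, much as a mismatched
// .as<T>() call on a metric would fail.
bool readWithWrongTypeThrows() {
    try {
        (void)std::any_cast<int>(metricValue);
        return false;
    } catch (const std::bad_any_cast&) {
        return true;
    }
}
```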

Query API in the ExecutableNetwork Class

GetMetric()

The method is used to get an executable network specific metric, such as METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS):

auto network = core.ReadNetwork("sample.xml");
auto exeNetwork = core.LoadNetwork(network, "CPU");
auto nireq = exeNetwork.GetMetric(METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS)).as<unsigned int>();

Or the current temperature of the MYRIAD device:

auto network = core.ReadNetwork("sample.xml");
auto exeNetwork = core.LoadNetwork(network, "MYRIAD");
float temperature = exeNetwork.GetMetric(METRIC_KEY(DEVICE_THERMAL)).as<float>();

GetConfig()

The method is used to get the configuration values the executable network has been created with:

auto network = core.ReadNetwork("sample.xml");
auto exeNetwork = core.LoadNetwork(network, "CPU");
auto ncores = exeNetwork.GetConfig(PluginConfigParams::KEY_CPU_THREADS_NUM).as<std::string>();
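Note that this key's value comes back as a string (hence `.as<std::string>()` above), so it has to be converted before it can be used in arithmetic. A minimal standard-library sketch; the helper name `configToInt` is hypothetical:

```cpp
#include <cassert>
#include <string>

// Convert a configuration value reported as a string, for example "8"
// for KEY_CPU_THREADS_NUM, into an int for further use.
int configToInt(const std::string& value) {
    return std::stoi(value);
}
```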

SetConfig()

The only device that supports this method is Multi-Device.
