ie_plugin_config.hpp File Reference

A header for advanced hardware-related properties for Inference Engine plugins, to be used in the SetConfig, LoadNetwork, and ImportNetwork methods of plugins.

#include <string>
#include <tuple>
#include <vector>
#include <map>
#include "ie_precision.hpp"
#include "hetero/hetero_plugin_config.hpp"
#include "multi-device/multi_device_config.hpp"
#include "cldnn/cldnn_config.hpp"
#include "gna/gna_config.hpp"



Namespaces
 InferenceEngine
 Inference Engine C++ API.
 InferenceEngine::PluginConfigParams
 Generic plugin configuration.


Macros

#define METRIC_KEY(name)   InferenceEngine::Metrics::METRIC_##name
 shortcut for defining common Inference Engine metrics
#define EXEC_NETWORK_METRIC_KEY(name)   METRIC_KEY(name)
 shortcut for defining common Inference Engine ExecutableNetwork metrics
#define METRIC_VALUE(name)   InferenceEngine::Metrics::name
 shortcut for defining metric values
#define CONFIG_KEY(name)   InferenceEngine::PluginConfigParams::_CONFIG_KEY(name)
 shortcut for defining configuration keys
#define CONFIG_VALUE(name)   InferenceEngine::PluginConfigParams::name
 shortcut for defining configuration values
#define AUTO_CONFIG_KEY(name)   InferenceEngine::_CONFIG_KEY(AUTO_##name)
 A macro which provides an AUTO-mangled name for a configuration key with the given name


Enumerations

enum class  InferenceEngine::Metrics::DeviceType { integrated = 0 , discrete = 1 }
 Enum to define possible device types.


Variables

static constexpr auto InferenceEngine::Metrics::METRIC_AVAILABLE_DEVICES = "AVAILABLE_DEVICES"
 Metric to get a std::vector<std::string> of available device IDs. String value is "AVAILABLE_DEVICES".
static constexpr auto InferenceEngine::Metrics::METRIC_SUPPORTED_METRICS = "SUPPORTED_METRICS"
 Metric to get a std::vector<std::string> of supported metrics. String value is "SUPPORTED_METRICS".
static constexpr auto InferenceEngine::Metrics::METRIC_SUPPORTED_CONFIG_KEYS = "SUPPORTED_CONFIG_KEYS"
 Metric to get a std::vector<std::string> of supported config keys. String value is "SUPPORTED_CONFIG_KEYS".
static constexpr auto InferenceEngine::Metrics::METRIC_FULL_DEVICE_NAME = "FULL_DEVICE_NAME"
 Metric to get a std::string value representing a full device name. String value is "FULL_DEVICE_NAME".
static constexpr auto InferenceEngine::Metrics::METRIC_OPTIMIZATION_CAPABILITIES = "OPTIMIZATION_CAPABILITIES"
 Metric to get a std::vector<std::string> of optimization options per device. String value is "OPTIMIZATION_CAPABILITIES".
 Possible values of the OPTIMIZATION_CAPABILITIES metric:
static constexpr auto InferenceEngine::Metrics::FP32 = "FP32"
static constexpr auto InferenceEngine::Metrics::BF16 = "BF16"
static constexpr auto InferenceEngine::Metrics::FP16 = "FP16"
static constexpr auto InferenceEngine::Metrics::INT8 = "INT8"
static constexpr auto InferenceEngine::Metrics::BIN = "BIN"
static constexpr auto InferenceEngine::Metrics::WINOGRAD = "WINOGRAD"
static constexpr auto InferenceEngine::Metrics::BATCHED_BLOB = "BATCHED_BLOB"
static constexpr auto InferenceEngine::Metrics::METRIC_RANGE_FOR_STREAMS = "RANGE_FOR_STREAMS"
 Metric to provide information about a range for streams on platforms where streams are supported.
static constexpr auto InferenceEngine::Metrics::METRIC_RANGE_FOR_ASYNC_INFER_REQUESTS = "RANGE_FOR_ASYNC_INFER_REQUESTS"
 Metric to provide a hint for a range for the number of async infer requests. If the device supports streams, the metric provides the range for the number of infer requests per stream.
static constexpr auto InferenceEngine::Metrics::METRIC_NUMBER_OF_WAITING_INFER_REQUESTS = "NUMBER_OF_WAITING_INFER_REQUESTS"
 Metric to get an unsigned int value of the number of waiting infer requests.
static constexpr auto InferenceEngine::Metrics::METRIC_NUMBER_OF_EXEC_INFER_REQUESTS = "NUMBER_OF_EXEC_INFER_REQUESTS"
 Metric to get an unsigned int value of the number of infer requests in the execution stage.
static constexpr auto InferenceEngine::Metrics::METRIC_DEVICE_ARCHITECTURE = "DEVICE_ARCHITECTURE"
 Metric which defines the device architecture.
static constexpr auto InferenceEngine::Metrics::METRIC_DEVICE_TYPE = "DEVICE_TYPE"
 Metric to get a type of device. See DeviceType enum definition for possible return values.
static constexpr auto InferenceEngine::Metrics::METRIC_DEVICE_GOPS = "DEVICE_GOPS"
 Metric which defines the Giga OPS per second count (GFLOPS or GIOPS) for a set of precisions supported by the specified device.
static constexpr auto InferenceEngine::Metrics::METRIC_IMPORT_EXPORT_SUPPORT = "IMPORT_EXPORT_SUPPORT"
 Metric which defines support of import/export functionality by plugin.
static constexpr auto InferenceEngine::Metrics::METRIC_NETWORK_NAME = "NETWORK_NAME"
 Metric to get the name of a network. String value is "NETWORK_NAME".
static constexpr auto InferenceEngine::Metrics::METRIC_DEVICE_THERMAL = "DEVICE_THERMAL"
 Metric to get a float value of device thermal. String value is "DEVICE_THERMAL".
static constexpr auto InferenceEngine::Metrics::METRIC_OPTIMAL_NUMBER_OF_INFER_REQUESTS = "OPTIMAL_NUMBER_OF_INFER_REQUESTS"
 Metric to get an unsigned integer value of the optimal number of executable network infer requests.
static constexpr auto InferenceEngine::PluginConfigParams::YES = "YES"
 generic boolean values
static constexpr auto InferenceEngine::PluginConfigParams::NO = "NO"
static constexpr auto InferenceEngine::PluginConfigParams::KEY_CPU_THREADS_NUM = "CPU_THREADS_NUM"
 Limits the number of threads used by Inference Engine for inference on the CPU.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_CPU_BIND_THREAD = "CPU_BIND_THREAD"
 The name for setting CPU affinity per thread option.
static constexpr auto InferenceEngine::PluginConfigParams::NUMA = "NUMA"
static constexpr auto InferenceEngine::PluginConfigParams::HYBRID_AWARE = "HYBRID_AWARE"
static constexpr auto InferenceEngine::PluginConfigParams::CPU_THROUGHPUT_NUMA = "CPU_THROUGHPUT_NUMA"
 Optimize CPU execution to maximize throughput.
static constexpr auto InferenceEngine::PluginConfigParams::CPU_THROUGHPUT_AUTO = "CPU_THROUGHPUT_AUTO"
static constexpr auto InferenceEngine::PluginConfigParams::KEY_CPU_THROUGHPUT_STREAMS = "CPU_THROUGHPUT_STREAMS"
static constexpr auto InferenceEngine::PluginConfigParams::KEY_PERF_COUNT = "PERF_COUNT"
 The name for setting performance counters option.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_DYN_BATCH_LIMIT = "DYN_BATCH_LIMIT"
 The key defines the dynamic limit of batch processing.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_DYN_BATCH_ENABLED = "DYN_BATCH_ENABLED"
 The key checks whether dynamic batch is enabled.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_CONFIG_FILE = "CONFIG_FILE"
 This key directs the plugin to load a configuration file.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_LOG_LEVEL = "LOG_LEVEL"
 The key for setting the desirable log level.
static constexpr auto InferenceEngine::PluginConfigParams::LOG_NONE = "LOG_NONE"
static constexpr auto InferenceEngine::PluginConfigParams::LOG_ERROR = "LOG_ERROR"
static constexpr auto InferenceEngine::PluginConfigParams::LOG_WARNING = "LOG_WARNING"
static constexpr auto InferenceEngine::PluginConfigParams::LOG_INFO = "LOG_INFO"
static constexpr auto InferenceEngine::PluginConfigParams::LOG_DEBUG = "LOG_DEBUG"
static constexpr auto InferenceEngine::PluginConfigParams::LOG_TRACE = "LOG_TRACE"
static constexpr auto InferenceEngine::PluginConfigParams::KEY_DEVICE_ID = "DEVICE_ID"
 The key for setting the device to execute on. Values: device IDs starting from "0" (first device), "1" (second device), etc.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_EXCLUSIVE_ASYNC_REQUESTS = "EXCLUSIVE_ASYNC_REQUESTS"
 The key for enabling exclusive mode for async requests of different executable networks and the same plugin.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_DUMP_EXEC_GRAPH_AS_DOT = "DUMP_EXEC_GRAPH_AS_DOT"
 This key enables dumping of the internal primitive graph.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_ENFORCE_BF16 = "ENFORCE_BF16"
 The key for enabling execution in bfloat16 precision whenever possible.
static constexpr auto InferenceEngine::PluginConfigParams::KEY_CACHE_DIR = "CACHE_DIR"
 This key defines the directory which will be used to store any data cached by plugins.
static constexpr auto InferenceEngine::KEY_AUTO_DEVICE_LIST = "AUTO_DEVICE_LIST"
 Limit device list config option, with comma-separated devices listed.

Detailed Description

A header for advanced hardware-related properties for Inference Engine plugins, to be used in the SetConfig, LoadNetwork, and ImportNetwork methods of plugins.