InferenceEngine::Core Class Reference

This class represents the Inference Engine Core entity. It can throw exceptions safely to the application, where they can be handled properly. More...

#include <ie_core.hpp>

Public Member Functions

  Core (const std::string &xmlConfigFile=std::string())
  Constructs Inference Engine Core instance using XML configuration file with plugins description. See RegisterPlugins for more details. More...
 
std::map< std::string, Version >  GetVersions (const std::string &deviceName) const
  Returns plugins version information. More...
 
void  SetLogCallback (IErrorListener &listener) const
  Sets a logging callback. Logging is used to track what is going on inside the plugins and the Inference Engine library. More...
 
ExecutableNetwork  LoadNetwork (CNNNetwork network, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >())
  Creates an executable network from a network object. Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources). More...
 
void  AddExtension (IExtensionPtr extension, const std::string &deviceName)
  Registers extension for the specified plugin. More...
 
ExecutableNetwork  ImportNetwork (const std::string &modelFileName, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >())
  Creates an executable network from a previously exported network. More...
 
ExecutableNetwork  ImportNetwork (std::istream &networkModel, const std::string &deviceName={}, const std::map< std::string, std::string > &config={})
  Creates an executable network from a previously exported network. More...
 
QueryNetworkResult  QueryNetwork (const ICNNNetwork &network, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >()) const
  Query device if it supports specified network with specified configuration. More...
 
void  SetConfig (const std::map< std::string, std::string > &config, const std::string &deviceName=std::string())
  Sets configuration for device, acceptable keys can be found in ie_plugin_config.hpp. More...
 
Parameter  GetConfig (const std::string &deviceName, const std::string &name) const
  Gets configuration dedicated to device behaviour. The method is targeted to extract information which can be set via SetConfig method. More...
 
Parameter  GetMetric (const std::string &deviceName, const std::string &name) const
  Gets a general runtime metric for dedicated hardware. The method is needed to request common device properties which are executable-network agnostic, such as device name, temperature, or other device-specific values. More...
 
std::vector< std::string >  GetAvailableDevices () const
  Returns devices available for neural networks inference. More...
 
void  RegisterPlugin (const std::string &pluginName, const std::string &deviceName)
  Registers a new device and a plugin which implements this device inside Inference Engine. More...
 
void  UnregisterPlugin (const std::string &deviceName)
  Removes plugin with specified name from Inference Engine. More...
 
void  RegisterPlugins (const std::string &xmlConfigFile)
  Registers plugin to Inference Engine Core instance using XML configuration file with plugins description. XML file has the following structure: More...
 

Detailed Description

This class represents the Inference Engine Core entity. It can throw exceptions safely to the application, where they can be handled properly.

Constructor & Destructor Documentation

§ Core()

InferenceEngine::Core::Core ( const std::string &  xmlConfigFile = std::string() )
explicit

Constructs Inference Engine Core instance using XML configuration file with plugins description. See RegisterPlugins for more details.

Parameters
xmlConfigFile A path to .xml file with plugins to load from. If XML configuration file is not specified, then default Inference Engine plugins are loaded from the default plugin.xml file.
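For illustration, a minimal sketch of constructing a Core instance. The custom configuration path is a hypothetical example, not a file shipped with the library:

```cpp
#include <ie_core.hpp>

int main() {
    // Default construction: plugins are loaded from the default plugin.xml file.
    InferenceEngine::Core core;

    // Alternatively, pass a custom XML configuration file with plugin
    // descriptions (hypothetical path, see RegisterPlugins for the format).
    InferenceEngine::Core customCore("custom_plugins.xml");
    return 0;
}
```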

Member Function Documentation

§ AddExtension()

void InferenceEngine::Core::AddExtension ( IExtensionPtr  extension,
const std::string &  deviceName 
)

Registers extension for the specified plugin.

Parameters
deviceName Device name to identify the plugin to add an extension to
extension Pointer to already loaded extension

§ GetAvailableDevices()

std::vector<std::string> InferenceEngine::Core::GetAvailableDevices ( ) const

Returns devices available for neural networks inference.

Returns
A vector of devices. The devices are returned as { CPU, FPGA.0, FPGA.1, MYRIAD }. If there is more than one device of a specific type, they are enumerated with the .# suffix.
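A short sketch of enumerating the available devices; which names appear depends on the hardware and plugins present on the system:

```cpp
#include <ie_core.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // Prints one device name per line, e.g. "CPU" or "MYRIAD".
    for (const std::string &device : core.GetAvailableDevices()) {
        std::cout << device << std::endl;
    }
    return 0;
}
```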

§ GetConfig()

Parameter InferenceEngine::Core::GetConfig ( const std::string &  deviceName,
const std::string &  name 
) const

Gets configuration dedicated to device behaviour. The method is targeted to extract information which can be set via SetConfig method.

Parameters
deviceName A name of a device to get a configuration value for.
name Name of the configuration key to request.
Returns
Value of config corresponding to config key.

§ GetMetric()

Parameter InferenceEngine::Core::GetMetric ( const std::string &  deviceName,
const std::string &  name 
) const

Gets a general runtime metric for dedicated hardware. The method is needed to request common device properties which are executable-network agnostic, such as device name, temperature, or other device-specific values.

Parameters
deviceName A name of a device to get a metric value for.
name Metric name to request.
Returns
Metric value corresponding to metric key.
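A sketch of requesting a common device property. The METRIC_KEY macro and the FULL_DEVICE_NAME key come from ie_plugin_config.hpp:

```cpp
#include <ie_core.hpp>
#include <ie_plugin_config.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // Ask the CPU plugin for its full device name; the returned Parameter
    // is converted to the expected type with Parameter::as<T>().
    InferenceEngine::Parameter p = core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME));
    std::cout << p.as<std::string>() << std::endl;
    return 0;
}
```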

§ GetVersions()

std::map<std::string, Version> InferenceEngine::Core::GetVersions ( const std::string &  deviceName ) const

Returns plugins version information.

Parameters
deviceName Device name to indentify plugin
Returns
A map of plugin names to their version information
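A sketch of printing the version information for the plugin serving a device; the Version fields used here (description, buildNumber) are assumed from the Inference Engine API:

```cpp
#include <ie_core.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // Each entry maps a plugin name to its Version structure.
    for (const auto &entry : core.GetVersions("CPU")) {
        std::cout << entry.first << ": " << entry.second.description
                  << " (build " << entry.second.buildNumber << ")" << std::endl;
    }
    return 0;
}
```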

§ ImportNetwork() [1/2]

ExecutableNetwork InferenceEngine::Core::ImportNetwork ( const std::string &  modelFileName,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = std::map< std::string, std::string >() 
)

Creates an executable network from a previously exported network.

Parameters
deviceName Name of a device to load the executable network on
modelFileName Path to the location of the exported file
config Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference
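A sketch of re-importing a network that was previously saved with ExecutableNetwork::Export; the blob file name is a hypothetical example:

```cpp
#include <ie_core.hpp>

int main() {
    InferenceEngine::Core core;
    // "exported_model.blob" is a hypothetical file produced earlier by
    // ExecutableNetwork::Export on a device that supports export/import.
    InferenceEngine::ExecutableNetwork network =
        core.ImportNetwork("exported_model.blob", "MYRIAD");
    return 0;
}
```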

§ ImportNetwork() [2/2]

ExecutableNetwork InferenceEngine::Core::ImportNetwork ( std::istream &  networkModel,
const std::string &  deviceName = {},
const std::map< std::string, std::string > &  config = {} 
)

Creates an executable network from a previously exported network.

Parameters
deviceName Name of a device to load the executable network on
networkModel Network model stream
config Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference

§ LoadNetwork()

ExecutableNetwork InferenceEngine::Core::LoadNetwork ( CNNNetwork  network,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = std::map< std::string, std::string >() 
)

Creates an executable network from a network object. Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources)

Parameters
network CNNNetwork object acquired from CNNNetReader
deviceName Name of device to load network to
config Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation
Returns
An executable network reference
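A sketch of the typical load-and-infer flow. The model file names are hypothetical; CNNNetReader is assumed available via the umbrella inference_engine.hpp header:

```cpp
#include <inference_engine.hpp>

int main() {
    // Read the IR model (hypothetical file names).
    InferenceEngine::CNNNetReader reader;
    reader.ReadNetwork("model.xml");
    reader.ReadWeights("model.bin");
    InferenceEngine::CNNNetwork network = reader.getNetwork();

    // Compile the network for a device and run one inference.
    InferenceEngine::Core core;
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}
```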

§ QueryNetwork()

QueryNetworkResult InferenceEngine::Core::QueryNetwork ( const ICNNNetwork &  network,
const std::string &  deviceName,
const std::map< std::string, std::string > &  config = std::map< std::string, std::string >() 
) const

Query device if it supports specified network with specified configuration.

Parameters
deviceName A name of a device to query
network Network object to query
config Optional map of pairs: (config parameter name, config parameter value)
Returns
Pointer to the response message that holds a description of an error if any occurred
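A sketch of inspecting which layers a device supports. It assumes the QueryNetworkResult exposes a supportedLayersMap member mapping layer names to device names, and that the network object was obtained elsewhere (e.g. via CNNNetReader):

```cpp
#include <inference_engine.hpp>
#include <iostream>

// Prints the layers of "network" that the CPU plugin reports as supported.
void printSupportedLayers(InferenceEngine::Core &core,
                          const InferenceEngine::ICNNNetwork &network) {
    InferenceEngine::QueryNetworkResult result = core.QueryNetwork(network, "CPU");
    for (const auto &layer : result.supportedLayersMap) {
        std::cout << layer.first << " -> " << layer.second << std::endl;
    }
}
```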

§ RegisterPlugin()

void InferenceEngine::Core::RegisterPlugin ( const std::string &  pluginName,
const std::string &  deviceName 
)

Registers a new device and a plugin which implements this device inside Inference Engine.

Parameters
pluginName A name of a plugin. Depending on the platform, pluginName is wrapped with a shared library suffix and prefix to identify the full library name
deviceName A device name to register the plugin for. If the device name is not specified, it is taken from the plugin using the InferenceEnginePluginPtr::GetName function

§ RegisterPlugins()

void InferenceEngine::Core::RegisterPlugins ( const std::string &  xmlConfigFile )

Registers plugin to Inference Engine Core instance using XML configuration file with plugins description. XML file has the following structure:

<ie>
<plugins>
<plugin name="" location="">
<extensions>
<extension location=""/>
</extensions>
<properties>
<property key="" value=""/>
</properties>
</plugin>
</plugins>
</ie>
  • name identifies the name of a device enabled by the plugin
  • location specifies the absolute path to a dynamic library with the plugin. The path can also be relative to the Inference Engine shared library, which allows a common configuration to be used across systems with different setups.
  • Properties are set to the plugin via the SetConfig method.
  • Extensions are set to the plugin via the AddExtension method.
Parameters
xmlConfigFile A path to the .xml file with plugins to register.

§ SetConfig()

void InferenceEngine::Core::SetConfig ( const std::map< std::string, std::string > &  config,
const std::string &  deviceName = std::string() 
)

Sets configuration for device, acceptable keys can be found in ie_plugin_config.hpp.

Parameters
deviceName An optional name of a device. If the device name is not specified, the config is set for all the registered devices.
config Map of pairs: (config parameter name, config parameter value)
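A sketch of setting a device-specific option. The CONFIG_KEY macro and the CPU_THREADS_NUM key come from ie_plugin_config.hpp and are shown as an example key:

```cpp
#include <ie_core.hpp>
#include <ie_plugin_config.hpp>

int main() {
    InferenceEngine::Core core;
    // Limit the CPU plugin to four inference threads; keys and accepted
    // values are defined in ie_plugin_config.hpp.
    core.SetConfig({{CONFIG_KEY(CPU_THREADS_NUM), "4"}}, "CPU");
    return 0;
}
```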

§ SetLogCallback()

void InferenceEngine::Core::SetLogCallback ( IErrorListener &  listener ) const

Sets a logging callback. Logging is used to track what is going on inside the plugins and the Inference Engine library.

Parameters
listener Logging sink

§ UnregisterPlugin()

void InferenceEngine::Core::UnregisterPlugin ( const std::string &  deviceName )

Removes plugin with specified name from Inference Engine.

Parameters
deviceName Device name identifying plugin to remove from Inference Engine

The documentation for this class was generated from the following file: