interface InferenceEngine::ICore¶
Overview¶
Minimal ICore interface that allows a plugin to get information from the core Inference Engine class.
#include <ie_icore.hpp>
class ICore
{
// methods
virtual CNNNetwork ReadNetwork(
const std::string& model,
const Blob::CPtr& weights
) const = 0;
virtual CNNNetwork ReadNetwork(
const std::string& modelPath,
const std::string& binPath
) const = 0;
virtual SoExecutableNetworkInternal LoadNetwork(
const CNNNetwork& network,
const std::string& deviceName,
const std::map<std::string, std::string>& config = {}
) = 0;
virtual SoExecutableNetworkInternal LoadNetwork(
const std::string& modelPath,
const std::string& deviceName,
const std::map<std::string, std::string>& config
) = 0;
virtual SoExecutableNetworkInternal ImportNetwork(
std::istream& networkModel,
const std::string& deviceName = {},
const std::map<std::string, std::string>& config = {}
) = 0;
virtual QueryNetworkResult QueryNetwork(
const CNNNetwork& network,
const std::string& deviceName,
const std::map<std::string, std::string>& config
) const = 0;
virtual Parameter GetMetric(
const std::string& deviceName,
const std::string& name
) const = 0;
virtual std::vector<std::string> GetAvailableDevices() const = 0;
virtual bool DeviceSupportsImportExport(const std::string& deviceName) const = 0;
};
Detailed Documentation¶
Minimal ICore interface that allows a plugin to get information from the core Inference Engine class.
Methods¶
virtual CNNNetwork ReadNetwork(
const std::string& model,
const Blob::CPtr& weights
) const = 0
Reads a model from an IR string and a weights blob.
Parameters:
model – string with IR
weights – shared pointer to a constant blob with weights
Returns:
A CNNNetwork object representing the read network.
virtual CNNNetwork ReadNetwork(
const std::string& modelPath,
const std::string& binPath
) const = 0
Reads IR xml and bin files.
Parameters:
modelPath – path to the IR xml file
binPath – path to the bin file; if the path is empty, tries to read a bin file with the same name as the xml file, and if no such bin file is found, loads the IR without weights
Returns:
A CNNNetwork object representing the read network.
virtual SoExecutableNetworkInternal LoadNetwork(
const CNNNetwork& network,
const std::string& deviceName,
const std::map<std::string, std::string>& config = {}
) = 0
Creates an executable network from a network object.
Users can create as many networks as they need and use them simultaneously (up to the limits of the hardware resources).
Parameters:
network – CNNNetwork object acquired from Core::ReadNetwork
deviceName – name of the device to load the network to
config – optional map of pairs (config parameter name, config parameter value), relevant only for this load operation
Returns:
An executable network reference
virtual SoExecutableNetworkInternal LoadNetwork(
const std::string& modelPath,
const std::string& deviceName,
const std::map<std::string, std::string>& config
) = 0
Creates an executable network from a model file.
Users can create as many networks as they need and use them simultaneously (up to the limits of the hardware resources).
Parameters:
modelPath – path to the model
deviceName – name of the device to load the network to
config – optional map of pairs (config parameter name, config parameter value), relevant only for this load operation
Returns:
An executable network reference
virtual SoExecutableNetworkInternal ImportNetwork(
std::istream& networkModel,
const std::string& deviceName = {},
const std::map<std::string, std::string>& config = {}
) = 0
Creates an executable network from a previously exported network.
Parameters:
networkModel – network model stream
deviceName – name of the device to load the executable network on
config – optional map of pairs (config parameter name, config parameter value), relevant only for this load operation
Returns:
An executable network reference
virtual QueryNetworkResult QueryNetwork(
const CNNNetwork& network,
const std::string& deviceName,
const std::map<std::string, std::string>& config
) const = 0
Queries whether a device supports the specified network with the specified configuration.
Parameters:
network – network object to query
deviceName – name of the device to query
config – optional map of pairs (config parameter name, config parameter value)
Returns:
An object containing a map of pairs: a layer name -> the name of a device supporting this layer.
virtual Parameter GetMetric(
const std::string& deviceName,
const std::string& name
) const = 0
Gets a general runtime metric for dedicated hardware.
The method is needed to request common device properties that are executable-network agnostic, such as the device name, temperature, and other device-specific values.
Parameters:
deviceName – name of the device to request the metric from
name – metric name
Returns:
Metric value corresponding to metric key.
virtual std::vector<std::string> GetAvailableDevices() const = 0
Returns devices available for neural networks inference.
Returns:
A vector of devices. The devices are returned as { CPU, FPGA.0, FPGA.1, MYRIAD }. If there is more than one device of a specific type, they are enumerated with a .# suffix.
virtual bool DeviceSupportsImportExport(const std::string& deviceName) const = 0
Checks whether a device supports the network Export & Import functionality.
Parameters:
deviceName – name of the device to check
Returns:
True if the device has the IMPORT_EXPORT_SUPPORT metric in SUPPORTED_METRICS and this metric returns "true"; False otherwise.