This class represents the Inference Engine Core entity. Its methods may throw exceptions, which the application should catch and handle.
#include <ie_core.hpp>
Public Member Functions

| | Core (const std::string &xmlConfigFile=std::string()) |
| | Constructs an Inference Engine Core instance using an XML configuration file with plugins description. See RegisterPlugins for more details. |
| std::map< std::string, Version > | GetVersions (const std::string &deviceName) const |
| | Returns plugins version information. |
| void | SetLogCallback (IErrorListener &listener) const |
| | Sets a logging callback. Logging is used to track what is going on inside the plugins and the Inference Engine library. |
| ExecutableNetwork | LoadNetwork (CNNNetwork network, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >()) |
| | Creates an executable network from a network object. Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources). |
| void | AddExtension (IExtensionPtr extension, const std::string &deviceName) |
| | Registers an extension for the specified plugin. |
| ExecutableNetwork | ImportNetwork (const std::string &modelFileName, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >()) |
| | Creates an executable network from a previously exported network. |
| QueryNetworkResult | QueryNetwork (const ICNNNetwork &network, const std::string &deviceName, const std::map< std::string, std::string > &config=std::map< std::string, std::string >()) const |
| | Queries a device whether it supports the specified network with the specified configuration. |
| void | SetConfig (const std::map< std::string, std::string > &config, const std::string &deviceName=std::string()) |
| | Sets a configuration for a device; acceptable keys can be found in ie_plugin_config.hpp. |
| Parameter | GetConfig (const std::string &deviceName, const std::string &name) const |
| | Gets a configuration dedicated to device behaviour. The method extracts information that can be set via the SetConfig method. |
| Parameter | GetMetric (const std::string &deviceName, const std::string &name) const |
| | Gets a general runtime metric for dedicated hardware: common device properties that are executable-network agnostic, such as the device name, temperature, and other device-specific values. |
| std::vector< std::string > | GetAvailableDevices () const |
| | Returns the devices available for neural network inference. |
| void | RegisterPlugin (const std::string &pluginName, const std::string &deviceName) |
| | Registers a new device and a plugin which implements this device inside Inference Engine. |
| void | UnregisterPlugin (const std::string &deviceName) |
| | Removes a plugin with the specified name from Inference Engine. |
| void | RegisterPlugins (const std::string &xmlConfigFile) |
| | Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description. |
explicit InferenceEngine::Core::Core (const std::string & xmlConfigFile = std::string())

Constructs an Inference Engine Core instance using an XML configuration file with plugins description. See RegisterPlugins for more details.

| xmlConfigFile | A path to the .xml file with plugins to load. If the XML configuration file is not specified, default Inference Engine plugins are loaded from the default plugin.xml file. |
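A minimal construction sketch, assuming the Inference Engine headers and runtime are available; the custom XML path below is hypothetical:

```cpp
#include <ie_core.hpp>

int main() {
    // Default construction: plugins are discovered through the
    // default plugin.xml file shipped with Inference Engine.
    InferenceEngine::Core core;

    // Alternatively, pass a custom plugin description file
    // ("my_plugins.xml" is a hypothetical path for illustration).
    InferenceEngine::Core customCore("my_plugins.xml");
    return 0;
}
```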
void InferenceEngine::Core::AddExtension (IExtensionPtr extension, const std::string & deviceName)
Registers extension for the specified plugin.
| deviceName | Device name to identify the plugin to add an extension to |
| extension | Pointer to already loaded extension |
std::vector< std::string > InferenceEngine::Core::GetAvailableDevices () const
Returns devices available for neural networks inference.
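A usage sketch, assuming the Inference Engine runtime is installed and at least one plugin is registered:

```cpp
#include <ie_core.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // Each entry is a device name such as "CPU" or "GPU"; devices of the
    // same type may be suffixed with an index, e.g. "GPU.0", "GPU.1".
    for (const std::string &device : core.GetAvailableDevices()) {
        std::cout << device << std::endl;
    }
    return 0;
}
```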
Parameter InferenceEngine::Core::GetConfig (const std::string & deviceName, const std::string & name) const
Gets configuration dedicated to device behaviour. The method is targeted to extract information which can be set via SetConfig method.
| deviceName | A name of a device to get a configuration value for. |
| name | A config key name. The method returns the value of the config corresponding to this key. |
Parameter InferenceEngine::Core::GetMetric (const std::string & deviceName, const std::string & name) const
Gets a general runtime metric for dedicated hardware. The method requests common device properties that are executable-network agnostic, such as the device name, temperature, and other device-specific values.

| deviceName | A name of a device to get a metric value for. |
| name | A metric name to request. |
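A sketch of requesting a standard metric, assuming the Inference Engine runtime and a CPU plugin are available:

```cpp
#include <ie_core.hpp>
#include <ie_plugin_config.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // FULL_DEVICE_NAME is one of the standard metrics declared in
    // ie_plugin_config.hpp; the returned Parameter is unwrapped with as<T>().
    std::string fullName =
        core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>();
    std::cout << fullName << std::endl;
    return 0;
}
```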
std::map< std::string, Version > InferenceEngine::Core::GetVersions (const std::string & deviceName) const
Returns plugins version information.
| deviceName | Device name to identify the plugin |
ExecutableNetwork InferenceEngine::Core::ImportNetwork (const std::string & modelFileName, const std::string & deviceName, const std::map< std::string, std::string > & config = std::map< std::string, std::string >())
Creates an executable network from a previously exported network.
| deviceName | Name of a device to load the executable network on |
| modelFileName | Path to the location of the exported file |
| config | Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation |
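A sketch of importing a previously exported network, assuming the Inference Engine runtime is available; "model.blob" and the "MYRIAD" device are illustrative assumptions:

```cpp
#include <ie_core.hpp>

int main() {
    InferenceEngine::Core core;
    // "model.blob" is a hypothetical file produced earlier by
    // ExecutableNetwork::Export on a compatible device.
    InferenceEngine::ExecutableNetwork network =
        core.ImportNetwork("model.blob", "MYRIAD");

    // The imported network is ready for inference without recompilation.
    InferenceEngine::InferRequest request = network.CreateInferRequest();
    request.Infer();
    return 0;
}
```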
ExecutableNetwork InferenceEngine::Core::LoadNetwork (CNNNetwork network, const std::string & deviceName, const std::map< std::string, std::string > & config = std::map< std::string, std::string >())
Creates an executable network from a network object. Users can create as many networks as they need and use them simultaneously (up to the limitation of the hardware resources).

| network | CNNNetwork object acquired from CNNNetReader |
| deviceName | Name of a device to load the network to |
| config | Optional map of pairs: (config parameter name, config parameter value) relevant only for this load operation |
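An end-to-end sketch, assuming the Inference Engine runtime is installed; the IR file paths are hypothetical:

```cpp
#include <ie_core.hpp>
#include <cpp/ie_cnn_net_reader.h>

int main() {
    InferenceEngine::Core core;

    // Read an IR model; "model.xml" and "model.bin" are hypothetical paths.
    InferenceEngine::CNNNetReader reader;
    reader.ReadNetwork("model.xml");
    reader.ReadWeights("model.bin");

    // Compile the network for the CPU plugin; the config map is optional.
    InferenceEngine::ExecutableNetwork executable =
        core.LoadNetwork(reader.getNetwork(), "CPU");

    // Run a single synchronous inference.
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}
```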
QueryNetworkResult InferenceEngine::Core::QueryNetwork (const ICNNNetwork & network, const std::string & deviceName, const std::map< std::string, std::string > & config = std::map< std::string, std::string >()) const
Queries a device whether it supports the specified network with the specified configuration.
| deviceName | A name of a device to query |
| network | Network object to query |
| config | Optional map of pairs: (config parameter name, config parameter value) |
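A sketch of inspecting which layers a device can execute, assuming the Inference Engine runtime is available; the IR paths are hypothetical:

```cpp
#include <ie_core.hpp>
#include <cpp/ie_cnn_net_reader.h>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::CNNNetReader reader;
    reader.ReadNetwork("model.xml");  // hypothetical IR paths
    reader.ReadWeights("model.bin");
    InferenceEngine::CNNNetwork network = reader.getNetwork();

    InferenceEngine::QueryNetworkResult result =
        core.QueryNetwork(network, "CPU");

    // supportedLayersMap maps each supported layer name to the device
    // that is able to execute it.
    for (const auto &entry : result.supportedLayersMap) {
        std::cout << entry.first << " -> " << entry.second << std::endl;
    }
    return 0;
}
```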
void InferenceEngine::Core::RegisterPlugin (const std::string & pluginName, const std::string & deviceName)
Registers a new device and a plugin which implements this device inside Inference Engine.
| pluginName | A name of plugin. Depending on platform pluginName is wrapped with shared library suffix and prefix to identify library full name |
| deviceName | A device name to register plugin for. If device name is not specified, then it's taken from plugin using InferenceEnginePluginPtr::GetName function |
void InferenceEngine::Core::RegisterPlugins (const std::string & xmlConfigFile)
Registers plugins to an Inference Engine Core instance using an XML configuration file with plugins description. The XML file has the following structure:

<ie>
    <plugins>
        <plugin name="" location="">
            <extensions>
                <extension location=""/>
            </extensions>
            <properties>
                <property key="" value=""/>
            </properties>
        </plugin>
    </plugins>
</ie>
| name | Identifies the name of the device enabled by the plugin. |
| location | Specifies the absolute path to the dynamic library with the plugin. The path can also be relative to the Inference Engine shared library, which allows a common config for different systems with different configurations. |

Properties are set to the plugin via the SetConfig method. Extensions are set to the plugin via the AddExtension method.

void InferenceEngine::Core::SetConfig (const std::map< std::string, std::string > & config, const std::string & deviceName = std::string())
Sets a configuration for a device; acceptable keys can be found in ie_plugin_config.hpp.

| deviceName | An optional name of a device. If the device name is not specified, the config is set for all the registered devices. |
| config | Map of pairs: (config parameter name, config parameter value) |
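A sketch combining SetConfig and GetConfig, assuming the Inference Engine runtime and a CPU plugin are available:

```cpp
#include <ie_core.hpp>
#include <ie_plugin_config.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    // CPU_THREADS_NUM is a standard key declared in ie_plugin_config.hpp;
    // here it limits CPU inference to four threads.
    core.SetConfig({{CONFIG_KEY(CPU_THREADS_NUM), "4"}}, "CPU");

    // The value can be read back through GetConfig.
    std::string threads =
        core.GetConfig("CPU", CONFIG_KEY(CPU_THREADS_NUM)).as<std::string>();
    std::cout << threads << std::endl;
    return 0;
}
```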
void InferenceEngine::Core::SetLogCallback (IErrorListener & listener) const

Sets a logging callback. Logging is used to track what is going on inside the plugins and the Inference Engine library.

| listener | Logging sink |
void InferenceEngine::Core::UnregisterPlugin (const std::string & deviceName)
Removes a plugin with the specified name from Inference Engine.
| deviceName | Device name identifying plugin to remove from Inference Engine |