ffmpeg/libavfilter/dnn
Guo, Yejun ff37ebaf30 dnn: add openvino as one of dnn backend
OpenVINO is a Deep Learning Deployment Toolkit hosted at
https://github.com/openvinotoolkit/openvino. It supports CPU, GPU
and heterogeneous plugins to accelerate deep learning inference.

Please refer to https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md
to build openvino (the C library is built at the same time). Please add
the cmake option -DENABLE_MKL_DNN=ON to enable the CPU path. With
default options on my system, the header files and libraries are
installed to /usr/local/deployment_tools/inference_engine/.

To build FFmpeg with openvino, taking my system as an example, run:
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/deployment_tools/inference_engine/lib/intel64/:/usr/local/deployment_tools/inference_engine/external/tbb/lib/
$ ../ffmpeg/configure --enable-libopenvino --extra-cflags=-I/usr/local/deployment_tools/inference_engine/include/ --extra-ldflags=-L/usr/local/deployment_tools/inference_engine/lib/intel64
$ make
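As a usage sketch, a filter such as dnn_processing can then select the
new backend at runtime. The model file and the input/output tensor
names below are illustrative placeholders, not values shipped with
this commit:
$ ./ffmpeg -i input.mp4 -vf dnn_processing=dnn_backend=openvino:model=srcnn.xml:input=x:output=y output.mp4
OpenVINO models consist of an .xml topology file plus a .bin weights
file with the same base name, so only the .xml path is passed here.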

Here are the features provided by OpenVINO inference engine:
- support for more DNN model formats
It supports TensorFlow, Caffe, ONNX, MXNet and Kaldi models by
converting them into the OpenVINO format with a python script, and a
Torch model can first be converted into ONNX and then into the
OpenVINO format.

See the script at https://github.com/openvinotoolkit/openvino/tree/master/model-optimizer/mo.py
which also performs some optimization at the model level.
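As a conversion sketch (the model file name and output directory here
are hypothetical, and available flags depend on the OpenVINO version),
a TensorFlow frozen graph can be converted with the Model Optimizer
like this:
$ python3 mo.py --input_model srcnn.pb --output_dir ./openvino_model
This produces the .xml/.bin pair that the openvino backend loads.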

- optimization at the inference stage
It optimizes for x86 CPUs with SSE, AVX, etc.

It also optimizes for Intel GPUs based on OpenCL.
(Only Intel GPUs are supported, because an Intel OpenCL extension is
used for the optimization.)

Signed-off-by: Guo, Yejun <yejun.guo@intel.com>
Signed-off-by: Pedro Arthur <bygrandao@gmail.com>
2020-07-02 09:36:34 +08:00
Makefile
dnn_backend_native.c
dnn_backend_native.h
dnn_backend_native_layer_conv2d.c
dnn_backend_native_layer_conv2d.h
dnn_backend_native_layer_depth2space.c
dnn_backend_native_layer_depth2space.h
dnn_backend_native_layer_mathbinary.c
dnn_backend_native_layer_mathbinary.h
dnn_backend_native_layer_mathunary.c
dnn_backend_native_layer_mathunary.h
dnn_backend_native_layer_maximum.c
dnn_backend_native_layer_maximum.h
dnn_backend_native_layer_pad.c
dnn_backend_native_layer_pad.h
dnn_backend_native_layers.c
dnn_backend_native_layers.h
dnn_backend_openvino.c
dnn_backend_openvino.h
dnn_backend_tf.c
dnn_backend_tf.h
dnn_interface.c