# Build OpenVINO™ Runtime for Windows systems
The software was validated on:
- Microsoft Windows 10 (64-bit) with Visual Studio 2019
Table of contents:
- Software Requirements
- Build Steps
- Additional Build Options
- Building Inference Engine with Ninja* Build System
## Software Requirements
- CMake* 3.14 or higher
- Microsoft* Visual Studio 2019, version 16.8 or later
- (Optional) Intel® Graphics Driver for Windows* (30.0) driver package.
- Python 3.7 or higher for OpenVINO Runtime Python API
- Git for Windows*
## Build Steps
1. Clone the repository and initialize submodules:

   ```sh
   git clone https://github.com/openvinotoolkit/openvino.git
   cd openvino
   git submodule update --init --recursive
   ```

2. Create a build directory:

   ```sh
   mkdir build && cd build
   ```
   > **NOTE**: By default, the build enables the Inference Engine GPU plugin to infer models on your Intel® Processor Graphics. This requires you to download and install the Intel® Graphics Driver for Windows (26.20) driver package before running the build. If you don't want to use the GPU plugin, use the `-DENABLE_INTEL_GPU=OFF` CMake build option and skip the installation of the Intel® Graphics Driver.
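For example, a configure command with the GPU plugin disabled might look like the following sketch, which combines the x64 generator used in the next step with the option above:

```sh
cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Release -DENABLE_INTEL_GPU=OFF ..
```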
3. In the `build` directory, run `cmake` to fetch project dependencies and generate a Visual Studio solution.

   For Microsoft* Visual Studio 2019 x64 architecture:

   ```sh
   cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_BUILD_TYPE=Release ..
   ```

   For Microsoft* Visual Studio 2019 ARM architecture:

   ```sh
   cmake -G "Visual Studio 16 2019" -A ARM -DCMAKE_BUILD_TYPE=Release ..
   ```

   For Microsoft* Visual Studio 2019 ARM64 architecture:

   ```sh
   cmake -G "Visual Studio 16 2019" -A ARM64 -DCMAKE_BUILD_TYPE=Release ..
   ```
4. Build the generated solution in Visual Studio, or build from the command line:

   ```sh
   cmake --build . --config Release --verbose -j8
   ```

   Note that this process may take some time.
5. Before running the samples, add the paths to the Threading Building Blocks (TBB) and OpenCV binaries used for the build to the `%PATH%` environment variable. By default, the CMake-based script downloads TBB binaries to the `<openvino_repo>/inference-engine/temp/tbb/bin` folder and OpenCV binaries to the `<openvino_repo>/inference-engine/temp/opencv_4.5.0/opencv/bin` folder.
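The `%PATH%` update from the last step can be sketched as the following commands, assuming the default download locations above (replace `<openvino_repo>` with the path to your checkout):

```bat
set PATH=<openvino_repo>\inference-engine\temp\tbb\bin;%PATH%
set PATH=<openvino_repo>\inference-engine\temp\opencv_4.5.0\opencv\bin;%PATH%
```

Note that `set` only affects the current console session; use the System Properties dialog to make the change persistent.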
## Additional Build Options
- Internal JIT GEMM implementation is used by default.

- Threading Building Blocks (TBB) is used by default. To build the Inference Engine with OpenMP threading, set the `-DTHREADING=OMP` option.

- Required versions of the TBB and OpenCV packages are downloaded automatically by the CMake-based script. If you want to use the automatically downloaded packages but have already installed TBB or OpenCV packages configured in your environment, you may need to clear the `TBBROOT` and `OpenCV_DIR` environment variables before running the `cmake` command; otherwise the packages won't be downloaded, and the build may fail if incompatible versions were installed.

- If the CMake-based build script cannot find and download an OpenCV package that is supported on your platform, or if you want to use a custom build of the OpenCV library, refer to the Use Custom OpenCV Builds section for details.

- To enable or disable the CPU and GPU plugins, use the `cmake` options `-DENABLE_INTEL_CPU=ON/OFF` and `-DENABLE_INTEL_GPU=ON/OFF`, respectively.
- To build the OpenVINO Runtime Python API:

  - First, install all additional packages (e.g., cython and opencv) listed in the `src\bindings\python\src\compatibility\openvino\requirements-dev.txt` file:

    ```sh
    pip install -r requirements-dev.txt
    ```

  - Second, enable the `-DENABLE_PYTHON=ON` option in the CMake step above (Step 4). To specify an exact Python version, use the following options:

    ```sh
    -DPYTHON_EXECUTABLE="C:\Program Files\Python37\python.exe" ^
    -DPYTHON_LIBRARY="C:\Program Files\Python37\libs\python37.lib" ^
    -DPYTHON_INCLUDE_DIR="C:\Program Files\Python37\include"
    ```

  - To build a wheel package (.whl), enable the `-DENABLE_WHEEL=ON` option in the CMake step above (Step 4).

  - After the build process finishes, export the newly built Python libraries to the user environment variables:

    ```sh
    set PYTHONPATH=<openvino_repo>/bin/intel64/Release/python_api/python3.7;%PYTHONPATH%
    set OPENVINO_LIB_PATH=<openvino_repo>/bin/intel64/Release;%OPENVINO_LIB_PATH%
    ```

    or install the wheel with pip:

    ```sh
    pip install <openvino_repo>/build/wheel/openvino-2022.2.0-000-cp37-cp37-win_amd64.whl
    ```
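As a quick smoke test after installing the wheel or setting the variables above, you could try importing the bindings and listing the detected devices; this sketch assumes the `openvino.runtime` module layout used by 2022.x builds:

```sh
python -c "from openvino.runtime import Core; print(Core().available_devices)"
```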
- OpenVINO Runtime compilation options:

  - `-DENABLE_OV_ONNX_FRONTEND=ON` enables building of the ONNX importer.
## Building Inference Engine with Ninja* Build System

```sh
call "C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\VC\Auxiliary\Build\vcvars64.bat"
cmake -G Ninja -Wno-dev -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --config Release
```