# 2. Build neural network inference library
At the time of writing, there are four back-ends available for running the neural network inference:
| Back-end | GPU support | CPU support | Inference speed | Effort to install |
|---|---|---|---|---|
| TensorRT (default) | :heavy_check_mark: | :x: | :fire::fire::fire::fire: | :warning::warning: |
| OpenVINO | (:heavy_check_mark:) not tested yet | :heavy_check_mark: | :fire::fire::fire: | :warning: |
| MXNet | :heavy_check_mark: | :heavy_check_mark: | :fire::fire: | :warning::warning::warning: (CPU only) / :warning::warning::warning::warning: (GPU) |
| Torch | :heavy_check_mark: | :heavy_check_mark: | :fire: | :warning: |
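Regardless of which back-end you pick, the engine talks to it through a single inference interface, so the search code does not depend on any one library. The following is a minimal C++ sketch of how such a backend abstraction could look; the class names, method signatures, and the `BACKEND_TENSORRT` macro are illustrative assumptions for this example, not CrazyAra's actual API.

```cpp
#include <memory>
#include <stdexcept>
#include <vector>

// Hypothetical backend-agnostic inference interface (illustrative names,
// not CrazyAra's actual API).
class NeuralNetBackend {
public:
    virtual ~NeuralNetBackend() = default;
    // Run a forward pass on a batch of encoded input planes and fill
    // the value and policy outputs consumed by the search.
    virtual void predict(const std::vector<float>& inputPlanes,
                         std::vector<float>& valueOutput,
                         std::vector<float>& policyOutput) = 0;
};

#ifdef BACKEND_TENSORRT
// One concrete implementation per back-end in the table above; a real
// implementation would wrap a TensorRT engine instead of this stub.
class TensorRTBackend : public NeuralNetBackend {
public:
    void predict(const std::vector<float>& inputPlanes,
                 std::vector<float>& valueOutput,
                 std::vector<float>& policyOutput) override {
        // ... execute the TensorRT engine here ...
        (void)inputPlanes; (void)valueOutput; (void)policyOutput;
    }
};
#endif

// Factory returning whichever back-end was enabled at build time,
// e.g. via a -DBACKEND_TENSORRT compile definition (assumed flag name).
std::unique_ptr<NeuralNetBackend> createBackend() {
#if defined(BACKEND_TENSORRT)
    return std::make_unique<TensorRTBackend>();
#else
    throw std::runtime_error("No inference back-end enabled at build time.");
#endif
}
```

Selecting the back-end at compile time keeps the binary free of unused library dependencies, which is why the installation effort in the table only applies to the one back-end you actually build against.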