Winograd
About
- Convolutions can be implemented using the Winograd algorithm, which trades extra additions and data transforms for fewer multiplications
- For the common F(2x2, 3x3) case (3x3 kernel, 2x2 output tile), it reduces the number of floating-point multiplications by a factor of 2.25x: 16 multiplications per output tile instead of 36 (see the sketch below)
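The 2.25x figure comes from the F(2x2, 3x3) transform in Lavin & Gray. A minimal C++ sketch (not from this wiki; input and filter values are made up) of the 1-D building block F(2,3), which produces 2 outputs of a 3-tap convolution with 4 multiplications instead of 6:

```cpp
// Winograd F(2,3) minimal-filtering form (Lavin & Gray), checked against
// the direct computation. 1-D case: 4 multiplications instead of 6.
#include <array>
#include <cassert>
#include <cmath>
#include <iostream>

int main() {
    std::array<float, 4> d = {1.f, 2.f, 3.f, 4.f};   // input tile (made-up values)
    std::array<float, 3> g = {0.5f, 1.0f, -1.0f};    // 3-tap filter (made-up values)

    // Direct computation: 6 multiplications for 2 outputs.
    float y0_ref = d[0]*g[0] + d[1]*g[1] + d[2]*g[2];
    float y1_ref = d[1]*g[0] + d[2]*g[1] + d[3]*g[2];

    // Winograd F(2,3): 4 multiplications (m1..m4); the filter-side factors
    // (g0+g1+g2)/2 etc. are normally precomputed once per filter.
    float m1 = (d[0] - d[2]) * g[0];
    float m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) * 0.5f;
    float m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) * 0.5f;
    float m4 = (d[1] - d[3]) * g[2];

    float y0 = m1 + m2 + m3;
    float y1 = m2 - m3 - m4;

    assert(std::fabs(y0 - y0_ref) < 1e-6f && std::fabs(y1 - y1_ref) < 1e-6f);
    std::cout << y0 << " " << y1 << "\n";  // matches the direct result
}
```

Nesting the 1-D transform in both spatial dimensions gives the 2-D F(2x2, 3x3) case: 16 multiplications per 2x2 output tile instead of 36, i.e. the 2.25x reduction, at the cost of extra additions and the input/filter/output transforms.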
FAQ
- Is Winograd faster than FFT-based convolution for CNNs? Yes, for the small kernels (e.g. 3x3) that dominate modern CNNs
- Does oneDNN use Winograd? Yes
- Does cuDNN use Winograd? Yes
- How does Winograd compare against conventional GEMM-based (im2col) implementations?
- Is Winograd faster than direct convolution? Yes, often significantly (see the oneDNN note below)
oneDNN and Winograd
Executing convolution using the Winograd algorithm often gives a significant performance boost compared with using the Direct algorithm. Details about the algorithm can be found in Fast Algorithms for Convolutional Neural Networks by A. Lavin and S. Gray.
Source: https://oneapi-src.github.io/oneDNN/winograd_convolution.html
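As a hedged illustration (not taken from the oneDNN documentation), the sketch below requests Winograd explicitly through the oneDNN v3.x C++ API by passing `algorithm::convolution_winograd` when creating the convolution primitive descriptor. The shapes are made up; Winograd is only implemented for certain shape/ISA combinations, so primitive descriptor creation can throw, and `algorithm::convolution_auto` instead lets the library pick between Direct and Winograd.

```cpp
// Sketch: requesting the Winograd convolution algorithm via the oneDNN v3.x C++ API.
// Shapes are illustrative; creation throws if Winograd is unsupported on this CPU.
#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    // 1x16x56x56 f32 input, 32 filters of 3x3, padding 1 -> 1x32x56x56 output.
    // format_tag::any lets the implementation pick its preferred (Winograd) layouts.
    memory::desc src_md({1, 16, 56, 56}, memory::data_type::f32, memory::format_tag::any);
    memory::desc wei_md({32, 16, 3, 3},  memory::data_type::f32, memory::format_tag::any);
    memory::desc dst_md({1, 32, 56, 56}, memory::data_type::f32, memory::format_tag::any);

    // algorithm::convolution_winograd asks for Winograd instead of convolution_direct.
    auto pd = convolution_forward::primitive_desc(
            eng, prop_kind::forward_inference, algorithm::convolution_winograd,
            src_md, wei_md, dst_md,
            /*strides*/ {1, 1}, /*padding_l*/ {1, 1}, /*padding_r*/ {1, 1});

    convolution_forward conv(pd);

    // Allocate memory in the layouts chosen by the primitive descriptor.
    memory src(pd.src_desc(), eng), wei(pd.weights_desc(), eng), dst(pd.dst_desc(), eng);
    conv.execute(strm, {{DNNL_ARG_SRC, src}, {DNNL_ARG_WEIGHTS, wei}, {DNNL_ARG_DST, dst}});
    strm.wait();
}
```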
cuDNN and Winograd
- Supports the `CUDNN_CONVOLUTION_FWD_ALGO_WINOGRAD_NONFUSED` algorithm in the `cudnnConvolutionForward` function (see the sketch below)
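The following C++ sketch is an assumption-laden illustration, not an excerpt from the cuDNN samples: shapes, data types, and error handling are made up, and it forces the non-fused Winograd algorithm by passing `CUDNN_CONVOLUTION_FWD_ALGO_WINOGRAD_NONFUSED` to `cudnnConvolutionForward`. cuDNN returns `CUDNN_STATUS_NOT_SUPPORTED` if no Winograd kernel exists for the given configuration, which is why the algorithm-finding/heuristic APIs are normally used instead of hard-coding it.

```cpp
// Sketch: forcing the non-fused Winograd algorithm in cudnnConvolutionForward.
// Illustrative shapes; data is left uninitialized since only the call path matters here.
#include <cudnn.h>
#include <cuda_runtime.h>
#include <cstdio>

#define CHECK(call) do { auto st = (call); if (st != 0) { \
    std::printf("error %d at line %d\n", (int)st, __LINE__); return 1; } } while (0)

int main() {
    cudnnHandle_t handle;
    CHECK(cudnnCreate(&handle));

    cudnnTensorDescriptor_t xDesc, yDesc;
    cudnnFilterDescriptor_t wDesc;
    cudnnConvolutionDescriptor_t convDesc;
    CHECK(cudnnCreateTensorDescriptor(&xDesc));
    CHECK(cudnnCreateTensorDescriptor(&yDesc));
    CHECK(cudnnCreateFilterDescriptor(&wDesc));
    CHECK(cudnnCreateConvolutionDescriptor(&convDesc));

    // 1x16x56x56 input, 32 filters of 3x3, padding 1 -> 1x32x56x56 output.
    CHECK(cudnnSetTensor4dDescriptor(xDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, 1, 16, 56, 56));
    CHECK(cudnnSetFilter4dDescriptor(wDesc, CUDNN_DATA_FLOAT, CUDNN_TENSOR_NCHW, 32, 16, 3, 3));
    CHECK(cudnnSetConvolution2dDescriptor(convDesc, 1, 1, 1, 1, 1, 1,
                                          CUDNN_CROSS_CORRELATION, CUDNN_DATA_FLOAT));
    CHECK(cudnnSetTensor4dDescriptor(yDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, 1, 32, 56, 56));

    // Hard-code the non-fused Winograd algorithm.
    cudnnConvolutionFwdAlgo_t algo = CUDNN_CONVOLUTION_FWD_ALGO_WINOGRAD_NONFUSED;

    size_t wsSize = 0;
    CHECK(cudnnGetConvolutionForwardWorkspaceSize(handle, xDesc, wDesc, convDesc, yDesc,
                                                  algo, &wsSize));

    void *x = nullptr, *w = nullptr, *y = nullptr, *ws = nullptr;
    cudaMalloc(&x, 1 * 16 * 56 * 56 * sizeof(float));
    cudaMalloc(&w, 32 * 16 * 3 * 3 * sizeof(float));
    cudaMalloc(&y, 1 * 32 * 56 * 56 * sizeof(float));
    if (wsSize) cudaMalloc(&ws, wsSize);

    const float alpha = 1.f, beta = 0.f;
    CHECK(cudnnConvolutionForward(handle, &alpha, xDesc, x, wDesc, w, convDesc,
                                  algo, ws, wsSize, &beta, yDesc, y));
    return 0;
}
```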