function transforms (aka torch.func, functorch)
Page Maintainers: @zou3519
Scope
- understand what composable function transforms are and their most common use cases (see the sketch after this list)
- understand what DynamicLayerStack is and how it is used to implement composition of function transforms
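To make the first item concrete, here is a minimal sketch of composing two transforms to get per-sample gradients; the `loss` function, its shapes, and the data are made up for illustration:

```python
import torch
from torch.func import grad, vmap

# A toy loss over a single example; names and shapes are illustrative.
def loss(weights, example):
    return (example @ weights).sin().sum()

weights = torch.randn(3)
examples = torch.randn(8, 3)  # a batch of 8 examples

# grad(loss) differentiates w.r.t. weights; vmap maps that gradient
# function over the batch dimension of `examples`, so the two
# transforms compose into a per-sample-gradient function.
per_sample_grads = vmap(grad(loss), in_dims=(None, 0))(weights, examples)
print(per_sample_grads.shape)  # torch.Size([8, 3])
```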
Learn about function transforms
- Read through the whirlwind tour
- Read through the advanced autodiff tutorial
- Read through the per-sample-gradients tutorial
- Read through the model ensembling tutorial
Exercise
The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
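For reference, that reverse-mode composition looks roughly like the following (a sketch, assuming `f` takes and returns a single Tensor; this is not the exercise solution, which asks for forward-mode variants):

```python
import torch
from torch.func import vmap, vjp

def jacobian_via_vjp(f, x):
    # vjp returns f(x) plus a function mapping a cotangent v to v @ J.
    y, vjp_fn = vjp(f, x)
    # One one-hot cotangent per output element; vmap pushes all of
    # them through vjp_fn in a single batched call.
    cotangents = torch.eye(y.numel(), dtype=y.dtype).view(-1, *y.shape)
    rows, = vmap(vjp_fn)(cotangents)
    return rows.view(*y.shape, *x.shape)

x = torch.randn(3)
J = jacobian_via_vjp(torch.sin, x)
print(torch.allclose(J, torch.diag(torch.cos(x))))  # True
```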
- Without looking at the source code for jacfwd or torch.autograd.functional.jacobian, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD computes Jacobian-vector products, while reverse-mode AD (vjp, grad) computes vector-Jacobian products.
- Write a function to compute the Jacobian by composing vmap and jvp.
Both functions should have the following signature:

```python
def jacobian(f, *args):
    pass
```
You can assume that `f` accepts multiple Tensor arguments and returns a single Tensor.
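As a starting point, the forward-mode primitive you will need is `torch.func.jvp`, which evaluates `f` and a Jacobian-vector product in a single forward pass; a minimal sketch:

```python
import torch
from torch.func import jvp

x = torch.randn(3)
v = torch.ones(3)  # the tangent (the "vector" in Jacobian-vector product)

# jvp returns (f(x), J @ v) in one pass of forward-mode AD.
y, jv = jvp(torch.sin, (x,), (v,))
print(torch.allclose(jv, torch.cos(x) * v))  # True: J = diag(cos(x))
```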
Understand how PyTorch implements composable function transforms
Read through this gdoc.
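As intuition for the gdoc: each transform pushes an interpreter onto a stack when it is entered, each operator is handled by the topmost interpreter first, and that interpreter re-dispatches the remaining work to the layers below it. The following is a toy Python illustration of that idea only; it is not PyTorch's actual C++ implementation, and all names in it are made up:

```python
# Toy sketch of an interpreter stack in the spirit of DynamicLayerStack.
# Not PyTorch's real implementation; names are illustrative.
interpreter_stack = []

class Interpreter:
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        interpreter_stack.append(self)  # entering a transform pushes a layer
        return self
    def __exit__(self, *exc):
        interpreter_stack.pop()         # leaving the transform pops it

def dispatch(op):
    # The topmost (innermost) interpreter handles the op first, then
    # hands the lowered op down to the next layer, until none remain.
    for layer in reversed(interpreter_stack):
        print(f"{op} interpreted by {layer.name}")

with Interpreter("vmap"):        # vmap(grad(f))(x): vmap is entered first,
    with Interpreter("grad"):    # then grad sits on top of the stack
        dispatch("aten::sin")    # grad interprets first, then vmap
```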
Next
Back to the Core Frontend Onboarding