{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# For tips on running notebooks in Google Colab, see\n# https://pytorch.org/tutorials/beginner/colab\n%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Jacobians, Hessians, hvp, vhp, and more: composing function transforms\n======================================================================\n\nComputing jacobians or hessians are useful in a number of\nnon-traditional deep learning models. It is difficult (or annoying) to\ncompute these quantities efficiently using PyTorch\\'s regular autodiff\nAPIs (`Tensor.backward()`, `torch.autograd.grad`). PyTorch\\'s\n[JAX-inspired](https://github.com/google/jax) [function transforms\nAPI](https://pytorch.org/docs/master/func.html) provides ways of\ncomputing various higher-order autodiff quantities efficiently.\n\n```{=html}\n
This tutorial requires PyTorch 2.0.0 or later.
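\nBefore diving in, here is a minimal sketch (an illustrative toy example, assuming PyTorch 2.0+ so that `torch.func` is importable) of two of the transforms covered below, `jacrev` and `hessian`:\n\n```python\nimport torch\nfrom torch.func import jacrev, hessian\n\n# A toy scalar-valued function of a vector input.\ndef f(x):\n    return x.sin().sum()\n\nx = torch.randn(3)\nprint(jacrev(f)(x))   # Jacobian of f at x, shape (3,): cos(x)\nprint(hessian(f)(x))  # Hessian of f at x, shape (3, 3): diag(-sin(x))\n```\n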
```{=html}\n