{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# For tips on running notebooks in Google Colab, see\n# https://codelin.vip/beginner/colab\n%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Introduction to ONNX** \\|\\| [Exporting a PyTorch model to\nONNX](export_simple_model_to_onnx_tutorial.html) \\|\\| [Extending the\nONNX exporter operator support](onnx_registry_tutorial.html) \\|\\|\n[Export a model with control flow to\nONNX](export_control_flow_model_to_onnx_tutorial.html)\n\nIntroduction to ONNX\n====================\n\nAuthors: [Ti-Tai Wang](https://github.com/titaiwangms), [Thiago\nCrepaldi](https://github.com/thiagocrepaldi).\n\n[Open Neural Network eXchange (ONNX)](https://onnx.ai/) is an open\nstandard format for representing machine learning models. The\n`torch.onnx` module provides APIs to capture the computation graph from\na native PyTorch `torch.nn.Module`{.interpreted-text role=\"class\"} model\nand convert it into an [ONNX\ngraph](https://github.com/onnx/onnx/blob/main/docs/IR.md).\n\nThe exported model can be consumed by any of the many [runtimes that\nsupport ONNX](https://onnx.ai/supported-tools.html#deployModel),\nincluding Microsoft\\'s [ONNX Runtime](https://www.onnxruntime.ai).\n\n```{=html}\n
Currently, you can export the model to ONNX through either TorchScript (https://pytorch.org/docs/stable/jit.html) or ExportedProgram (https://pytorch.org/docs/stable/export.html), selected with the boolean parameter ``dynamo`` in ``torch.onnx.export``. In this tutorial, we will focus on the ExportedProgram approach.