{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# For tips on running notebooks in Google Colab, see\n# https://pytorch.org/tutorials/beginner/colab\n%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Introduction to ONNX** \\|\\| [Exporting a PyTorch model to\nONNX](export_simple_model_to_onnx_tutorial.html) \\|\\| [Extending the\nONNX exporter operator support](onnx_registry_tutorial.html) \\|\\|\n[Export a model with control flow to\nONNX](export_control_flow_model_to_onnx_tutorial.html)\n\nIntroduction to ONNX\n====================\n\nAuthors: [Ti-Tai Wang](https://github.com/titaiwangms), [Thiago\nCrepaldi](https://github.com/thiagocrepaldi).\n\n[Open Neural Network eXchange (ONNX)](https://onnx.ai/) is an open\nstandard format for representing machine learning models. The\n`torch.onnx` module provides APIs to capture the computation graph from\na native PyTorch `torch.nn.Module`{.interpreted-text role=\"class\"} model\nand convert it into an [ONNX\ngraph](https://github.com/onnx/onnx/blob/main/docs/IR.md).\n\nThe exported model can be consumed by any of the many [runtimes that\nsupport ONNX](https://onnx.ai/supported-tools.html#deployModel),\nincluding Microsoft\\'s [ONNX Runtime](https://www.onnxruntime.ai).\n\n```{=html}\n
NOTE:
\n```\n```{=html}\n
\n```\n```{=html}\n

Currently, you can export the model to ONNX through either TorchScript (https://pytorch.org/docs/stable/jit.html) or ExportedProgram (https://pytorch.org/docs/stable/export.html), selected by the boolean parameter dynamo in torch.onnx.export. In this tutorial, we will focus on the ExportedProgram approach.

\n```\n```{=html}\n
\n```\nWhen setting `dynamo=True`, the exporter will use\n[torch.export](https://pytorch.org/docs/stable/export.html) to capture\nan `ExportedProgram` before translating the graph into ONNX\nrepresentations. This approach is the new and recommended way to export\nmodels to ONNX. It works with PyTorch 2.0 features more robustly, has\nbetter support for newer ONNX operator sets, and consumes fewer resources,\nmaking it possible to export larger models.\n\nDependencies\n------------\n\nPyTorch 2.5.0 or newer is required.\n\nThe ONNX exporter depends on extra Python packages:\n\n> - [ONNX](https://onnx.ai) standard library\n> - [ONNX Script](https://onnxscript.ai) library that enables\n> developers to author ONNX operators, functions and models using a\n> subset of Python in an expressive yet simple fashion\n> - [ONNX Runtime](https://onnxruntime.ai) accelerated machine\n> learning library.\n\nThey can be installed through [pip](https://pypi.org/project/pip/):\n\n``` {.bash}\npip install --upgrade onnx onnxscript onnxruntime\n```\n\nTo validate the installation, run the following commands:\n\n``` {.python}\nimport torch\nprint(torch.__version__)\n\nimport onnxscript\nprint(onnxscript.__version__)\n\nimport onnxruntime\nprint(onnxruntime.__version__)\n```\n\nEach [import]{.title-ref} must succeed without any errors, and the\nlibrary versions must be printed out.\n\nFurther reading\n---------------\n\nThe list below refers to tutorials that range from basic examples to\nadvanced scenarios, not necessarily in the order they are listed. 
Feel\nfree to jump directly to the specific topics that interest you, or sit tight\nand have fun going through all of them to learn everything about the\nONNX exporter.\n\n::: {.toctree hidden=\"\"}\n:::\n" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.12" } }, "nbformat": 4, "nbformat_minor": 0 }