exportONNXNetwork


Export network to ONNX model format



exportONNXNetwork(net,filename) exports the deep learning network net with weights to the ONNX™ format file filename. If filename exists, then exportONNXNetwork overwrites the file.

This function requires the Deep Learning Toolbox™ Converter for ONNX Model Format support package. If this support package is not installed, then the function provides a download link.

exportONNXNetwork(net,filename,Name,Value) exports a network using additional options specified by one or more name-value pair arguments.


Examples

Load a pretrained SqueezeNet convolutional neural network.

net = squeezenet
net = 

  DAGNetwork with properties:

         Layers: [68×1 nnet.cnn.layer.Layer]
    Connections: [75×2 table]
     InputNames: {'data'}
    OutputNames: {'ClassificationLayer_predictions'}

Export the network as an ONNX format file in the current folder called squeezenet.onnx. If the Deep Learning Toolbox Converter for ONNX Model Format support package is not installed, then the function provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install.

filename = 'squeezenet.onnx';
exportONNXNetwork(net,filename)

Now, you can import the squeezenet.onnx file into any deep learning framework that supports ONNX import.
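You can also reimport the file into MATLAB for a round-trip check. As a hedged sketch, assuming `importONNXNetwork` from the same support package is available:

```matlab
% Sketch: reimport the exported model with importONNXNetwork, which ships
% in the same Deep Learning Toolbox Converter for ONNX Model Format support
% package. 'OutputLayerType' tells the importer to append a classification
% output layer to the imported network.
net2 = importONNXNetwork('squeezenet.onnx','OutputLayerType','classification');

% Inspect the reimported network; its layers can differ from the original.
analyzeNetwork(net2)
```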

Input Arguments


net — Trained network or graph of network layers, specified as a SeriesNetwork, DAGNetwork, dlnetwork, or layerGraph object.

You can get a trained network (SeriesNetwork, DAGNetwork, or dlnetwork) in these ways:

  • Import a pretrained network. For example, use the googlenet function.

  • Train your own network. Use trainNetwork to train a SeriesNetwork or DAGNetwork. Use a custom training loop to train a dlnetwork.

A layerGraph object is a graph of network layers. Some of the layer parameters of this graph might be empty (for example, the weights and bias of convolution layers, and the mean and variance of batch normalization layers). Before using layerGraph as an input argument to exportONNXNetwork, initialize the empty parameters by assigning random values. Alternatively, you can do one of the following before exporting:

  • Convert layerGraph to a dlnetwork by using layerGraph as an input argument to dlnetwork. The empty parameters are automatically initialized.

  • Convert layerGraph to a trained DAGNetwork by using trainNetwork. Use layerGraph as the layers input argument to trainNetwork.
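As a sketch of the first option, converting a layerGraph with empty learnable parameters into an initialized dlnetwork might look like the following (the layer sizes and names are illustrative):

```matlab
% Build a small untrained layer graph. At this point the convolution
% weights and batch normalization statistics are empty.
layers = [
    imageInputLayer([28 28 1],'Name','in','Normalization','none')
    convolution2dLayer(3,16,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')];
lgraph = layerGraph(layers);

% Converting the layer graph to a dlnetwork automatically initializes
% the empty learnable parameters.
dlnet = dlnetwork(lgraph);

% The initialized network can now be exported.
exportONNXNetwork(dlnet,'untrained_net.onnx')
```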

You can detect errors and issues in a trained network or graph of network layers before exporting to an ONNX network by using analyzeNetwork. exportONNXNetwork requires SeriesNetwork, DAGNetwork, and dlnetwork to be error free. exportONNXNetwork permits exporting a layerGraph with a missing or unconnected output layer.
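A minimal sketch of this pre-export check:

```matlab
% Check a trained network for errors and issues before exporting.
% analyzeNetwork opens the Network Analyzer and reports errors and
% warnings; exportONNXNetwork requires the network to be error free.
net = squeezenet;
analyzeNetwork(net)
```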

filename — Name of file, specified as a character vector or string scalar.

Example: 'network.onnx'

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

Example: exportONNXNetwork(net,filename,'NetworkName','my_net') exports a network and specifies 'my_net' as the network name in the saved ONNX network.

NetworkName — Name of ONNX network to store in the saved file, specified as a character vector or a string scalar.

Example: 'my_squeezenet'

OpsetVersion — Version of the ONNX operator set to use in the exported model. If the default operator set does not support the network that you are trying to export, then try a later version. If you import the exported network into another framework, and the importer does not support the operator set that you used during export, then the import can fail.

To ensure that you use the appropriate operator set version, consult the ONNX operator documentation [3]. For example, 'OpsetVersion',9 exports the maxUnpooling2dLayer to the MaxUnpool-9 ONNX operator.

Example: 6
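As a sketch of choosing a later operator set for a layer that needs one, based on the maxUnpooling2dLayer case mentioned above:

```matlab
% Sketch: export with a later operator set when the default set does not
% support a layer. For example, maxUnpooling2dLayer maps to the
% MaxUnpool-9 ONNX operator, which requires operator set 9.
exportONNXNetwork(net,'network.onnx','OpsetVersion',9)
```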


Limitations

  • exportONNXNetwork supports ONNX versions as follows:

    • The function supports ONNX intermediate representation version 6.

    • The function supports ONNX operator sets 6, 7, 8, and 9.

  • exportONNXNetwork does not export settings or properties related to network training such as training options, learning rate factors, or regularization factors.

  • If you export a network containing a layer that the ONNX format does not support (see Layers Supported for ONNX Export), then exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks.

  • Because of architectural differences between MATLAB® and ONNX, an exported network can have a different structure compared to the original network.


If you import an exported network, layers of the reimported network might differ from the original network and might not be supported.

More About


Layers Supported for ONNX Export

exportONNXNetwork can export a trained MATLAB deep learning network that includes multiple inputs and multiple outputs to the ONNX model format. To learn about multiple-input and multiple-output deep learning networks, see Multiple-Input and Multiple-Output Networks.
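As a hedged sketch of exporting a multiple-input network, the following builds a small two-input dlnetwork and exports it (all layer names and sizes are illustrative, and the network is untrained):

```matlab
% Sketch: build a dlnetwork with two image inputs whose branches merge
% in an addition layer, then export it to ONNX.
lgraph = layerGraph();
lgraph = addLayers(lgraph,[ ...
    imageInputLayer([28 28 1],'Name','in1','Normalization','none')
    convolution2dLayer(3,8,'Name','conv1')]);
lgraph = addLayers(lgraph,[ ...
    imageInputLayer([28 28 1],'Name','in2','Normalization','none')
    convolution2dLayer(3,8,'Name','conv2')]);
lgraph = addLayers(lgraph,additionLayer(2,'Name','add'));
lgraph = connectLayers(lgraph,'conv1','add/in1');
lgraph = connectLayers(lgraph,'conv2','add/in2');

% dlnetwork initializes the learnable parameters of both branches.
dlnet = dlnetwork(lgraph);
exportONNXNetwork(dlnet,'multi_input_net.onnx')
```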


[1] Open Neural Network Exchange.

[2] ONNX.

[3] ONNX Operators.

Introduced in R2018a