
exportONNXNetwork

Export network to ONNX model format

Description

exportONNXNetwork(net,filename) exports the deep learning network net with weights to the ONNX™ format file filename. If filename exists, then exportONNXNetwork overwrites the file.

This function requires the Deep Learning Toolbox™ Converter for ONNX Model Format support package. If this support package is not installed, then the function provides a download link.


exportONNXNetwork(net,filename,Name=Value) exports a network using additional options specified by one or more name-value arguments. For example, you can specify the name and batch size of the ONNX network.
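
For example, a single call can set several of these options at once. The file name and option values below are illustrative only; see the name-value argument descriptions later on this page.

% Export with a custom ONNX network name and a fixed batch size of 1
exportONNXNetwork(net,"network.onnx",NetworkName="my_net",BatchSize=1)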

Examples


Load the pretrained SqueezeNet convolutional neural network.

net = imagePretrainedNetwork("squeezenet")
net = 
  dlnetwork with properties:

         Layers: [68×1 nnet.cnn.layer.Layer]
    Connections: [75×2 table]
     Learnables: [52×3 table]
          State: [0×3 table]
     InputNames: {'data'}
    OutputNames: {'prob_flatten'}
    Initialized: 1

  View summary with summary.

net is a dlnetwork object that contains the layers and learnable parameters of the network, among other properties.

Analyze the network.

analyzeNetwork(net)

(Figure: analyzeNetwork view of the SqueezeNet architecture and layer information.)

analyzeNetwork displays an interactive plot of the network architecture and a table containing information about the network layers. You can also use analyzeNetwork to detect errors and issues in net before exporting it to the ONNX format. In this case, net is error free.

Export the network net as an ONNX format file named squeezenet.onnx. Save the file to the current folder. If the Deep Learning Toolbox Converter for ONNX Model Format support package is not installed, then exportONNXNetwork provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install.

filename = "squeezenet.onnx";
exportONNXNetwork(net,filename)

Now you can import the squeezenet.onnx file into any deep learning framework that supports ONNX import.
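
As a quick check, you can also import the file back into MATLAB with importNetworkFromONNX, which is part of the same support package. The round trip below is only a sketch; as noted under Limitations, the reimported layers can differ from the original layers.

% Reimport the exported model and inspect it
netReimported = importNetworkFromONNX("squeezenet.onnx");
analyzeNetwork(netReimported)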

Input Arguments


net: Trained network, specified as a dlnetwork object.

You can get a trained network in these ways:

  • Import a pretrained network by using the imagePretrainedNetwork function

  • Train a dlnetwork object by using the trainnet function or a custom training loop

exportONNXNetwork requires net to be error free. You can detect errors and issues in a trained network before exporting it to an ONNX network by using analyzeNetwork.
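
For example, a minimal sketch of the second option, using a small feature classification network with hypothetical layer sizes and randomly generated data, might look like this:

% Define a small classification network (hypothetical layer sizes)
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(8)
    reluLayer
    fullyConnectedLayer(3)
    softmaxLayer];

% Randomly generated training data, for illustration only:
% 100 observations, 4 features, 3 classes
X = rand(100,4);
T = categorical(randi(3,100,1));

% Train with trainnet, analyze the result, then export it
options = trainingOptions("adam",MaxEpochs=5,Verbose=false);
net = trainnet(X,T,layers,"crossentropy",options);
analyzeNetwork(net)
exportONNXNetwork(net,"smallnet.onnx")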

filename: Name of the file, specified as a character vector or string scalar.

Example: "network.onnx"

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: exportONNXNetwork(net,filename,NetworkName="my_net") exports a network and specifies "my_net" as the network name in the saved ONNX network.

NetworkName: Name of the ONNX network to store in the saved file, specified as a character vector or string scalar.

Example: NetworkName="my_squeezenet"

OpsetVersion: Version of the ONNX operator set to use in the exported model, specified as a positive integer in the range [6, 18]. If the default operator set does not support the network you are trying to export, then try using a later version. If you import the exported network into another framework and the importer does not support the operator set that you used during export, then the import can fail.

To ensure that you use the appropriate operator set version, consult the ONNX operator documentation [3]. For example, OpsetVersion=9 exports the maxUnpooling2dLayer to the MaxUnpool-9 ONNX operator.

Example: OpsetVersion=6
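
For instance, if export with the default operator set does not support a layer in your network, retrying with a later version can help. The file name below is illustrative.

% Retry the export with the newest operator set this function supports
exportONNXNetwork(net,"network.onnx",OpsetVersion=18)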

BatchSize: Batch size of the ONNX network, specified as [] or as a positive integer. If you specify BatchSize as [], the ONNX network has a dynamic batch size. If you specify BatchSize as a positive integer k, the ONNX network has a fixed batch size of k.

Example: BatchSize=10
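
For example, the two calls below export the same network once with a dynamic batch dimension and once with a fixed batch size of 10. The file names are illustrative.

% Dynamic batch size
exportONNXNetwork(net,"network_dynamic.onnx",BatchSize=[])

% Fixed batch size of 10
exportONNXNetwork(net,"network_batch10.onnx",BatchSize=10)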

Limitations

  • exportONNXNetwork supports these ONNX versions:

    • ONNX intermediate representation version 9

    • ONNX operator sets 6–18

  • exportONNXNetwork does not export settings or properties related to network training such as training options, learning rate factors, or regularization factors.

  • If you export a network containing a layer that the ONNX format does not support (see Layers Supported for ONNX Export), then exportONNXNetwork saves a placeholder ONNX operator in place of the unsupported layer and returns a warning. You cannot import an ONNX network with a placeholder operator into other deep learning frameworks.

  • Because of architectural differences between MATLAB® and ONNX, an exported network can have a different structure compared to the original network.

Note

If you import an exported network, layers of the reimported network might differ from layers of the original network, and might not be supported.

More About


Layers Supported for ONNX Export

exportONNXNetwork can export the following:

Tips

  • You can export a trained MATLAB deep learning network that includes multiple inputs and multiple outputs to the ONNX model format. To learn about a multiple-input and multiple-output deep learning network, see Multiple-Input and Multiple-Output Networks. A sketch of exporting a two-input network follows this tip.
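
The sketch below builds a small two-input, one-output dlnetwork with hypothetical layer sizes, initializes it, and exports it with the same call used for single-input networks. The network is untrained and shown for illustration only, and the sketch assumes a release in which you can call addLayers and connectLayers directly on a dlnetwork object.

% Two branches with different input types (hypothetical sizes)
imageBranch = [
    imageInputLayer([28 28 1],Normalization="none",Name="images")
    convolution2dLayer(3,8,Padding="same")
    reluLayer
    flattenLayer(Name="flat")];
featureBranch = [
    featureInputLayer(10,Name="features")
    fullyConnectedLayer(16,Name="fcFeat")];
tail = [
    concatenationLayer(1,2,Name="cat")
    fullyConnectedLayer(3)
    softmaxLayer(Name="out")];

% Assemble the two-input network and initialize its learnable parameters
netMulti = dlnetwork(imageBranch,Initialize=false);
netMulti = addLayers(netMulti,featureBranch);
netMulti = addLayers(netMulti,tail);
netMulti = connectLayers(netMulti,"flat","cat/in1");
netMulti = connectLayers(netMulti,"fcFeat","cat/in2");
netMulti = initialize(netMulti);

% Export the multiple-input network with the same call as a single-input network
exportONNXNetwork(netMulti,"two_input_net.onnx")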

References

[1] Open Neural Network Exchange. https://github.com/onnx/.

[2] ONNX. https://onnx.ai/.

[3] ONNX Operators. https://github.com/onnx/onnx/blob/main/docs/Operators.md.

Version History

Introduced in R2018a
