Unable to specify BatchSize in exportONNXNetwork

3 views (last 30 days)
Gabor Balazs on 22 Mar 2022
Commented: Sivylla Paraskevopoulou on 9 May 2022
Hello!
I want to export a simple neural network from MATLAB to the ONNX format, so I am using the exportONNXNetwork function. The documentation lists the name-value argument "BatchSize", which fixes the batch size of the exported network; I need it set to 1.
exportONNXNetwork(net, 'network.onnx', "BatchSize", 1)
Unfortunately, I cannot set this name-value argument as described in the documentation (https://www.mathworks.com/help/deeplearning/ref/exportonnxnetwork.html); it is not defined anywhere inside the exportONNXNetwork function, and the call throws an error in iValidateInputs:
Error using nnet.internal.cnn.onnx.exportONNXNetwork>iValidateInputs (line 49)
'BatchSize' is not a recognized parameter. For a list of valid name-value pair arguments, see the documentation for this function.
Error in nnet.internal.cnn.onnx.exportONNXNetwork (line 29)
[NNTNetwork, Filename, NetworkName, OpsetVersion] = iValidateInputs(NNTNetwork, Filename, defaultOpset, varargin{:});
Error in exportONNXNetwork (line 38)
nnet.internal.cnn.onnx.exportONNXNetwork(Network, filename, varargin{:});
Error in eval_mlp (line 76)
exportONNXNetwork(net, 'network.onnx', "BatchSize", 1)
How can I export a neural network from MATLAB to the ONNX format with a fixed batch size?
1 comment
Sivylla Paraskevopoulou on 9 May 2022
The name-value argument BatchSize was added to the exportONNXNetwork function in R2022a. Which MATLAB version are you using?
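For reference, a minimal sketch of guarding the call on the MATLAB release, assuming net is the trained network from the question (R2022a corresponds to MATLAB version 9.12):
% Pass BatchSize only on R2022a or newer, where exportONNXNetwork accepts it.
if verLessThan('matlab', '9.12')
    % Older release: export without a fixed batch size; the batch
    % dimension would then have to be fixed outside of MATLAB.
    exportONNXNetwork(net, 'network.onnx');
else
    exportONNXNetwork(net, 'network.onnx', 'BatchSize', 1);
end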

Answers (0)

Version

R2021a
