How to implement a Transformer with separate encoder and decoder inputs and cross-attention in MATLAB?

Hello,
I was able to build Transformer models in encoder-only and decoder-only form by stacking layers. However, I would now like to implement a model similar to the original Transformer paper, which takes two separate inputs (encoder input and decoder input) and applies cross-attention between them.
The difficulties I am facing are:
  1. trainNetwork does not seem to support multiple inputs.
  2. MATLAB does not provide a built-in cross-attention layer.
  3. When I try to implement a custom cross-attention layer, I run into numerous errors during training.
Is there any recommended way or workaround to build and train a Transformer model in MATLAB that supports encoder-decoder input structure with cross-attention?
Thank you in advance for your help.

Answers (1)

Matt J
Matt J on 20 Aug 2025 at 21:49
Edited: Matt J on 20 Aug 2025 at 21:54
"trainNetwork does not seem to support multiple inputs."
trainNetwork is deprecated. You should use trainnet instead, which supports multiple inputs. From the doc:
"For neural networks with multiple inputs, you must use a TransformedDatastore or CombinedDatastore object."
"MATLAB does not provide a built-in cross-attention layer."
Perhaps the example Create Cross-Attention Neural Network is relevant to you.
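In case it helps, below is a rough sketch of how the encoder-decoder wiring might look, assuming your release provides attentionLayer (available in recent releases); the layer names and sizes are illustrative only:

numChannels = 128;   % illustrative embedding size
numHeads    = 4;     % illustrative number of attention heads

% Encoder branch: self-attention over the source sequence.
encoder = [
    sequenceInputLayer(numChannels, Name="enc_in")
    selfAttentionLayer(numHeads, numChannels, Name="enc_selfattn")
    layerNormalizationLayer(Name="enc_norm")];

% Decoder branch: causal self-attention, then cross-attention. The
% sequential connection feeds the decoder output into the attention
% layer's "query" input; "key" and "value" come from the encoder below.
decoder = [
    sequenceInputLayer(numChannels, Name="dec_in")
    selfAttentionLayer(numHeads, numChannels, AttentionMask="causal", Name="dec_selfattn")
    layerNormalizationLayer(Name="dec_norm")
    attentionLayer(numHeads, Name="crossattn")
    layerNormalizationLayer(Name="cross_norm")
    fullyConnectedLayer(numChannels, Name="fc")];

net = dlnetwork(encoder, Initialize=false);
net = addLayers(net, decoder);

% Cross-attention: the encoder output provides the keys and values.
net = connectLayers(net, "enc_norm", "crossattn/key");
net = connectLayers(net, "enc_norm", "crossattn/value");

net = initialize(net);

You can then train this dlnetwork with trainnet and a combined datastore as sketched above, with one datastore column per network input ("enc_in", "dec_in") followed by the targets.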
