How to implement a Transformer with separate encoder and decoder inputs and cross-attention in MATLAB?
Hello,
I was able to build Transformer models in encoder-only and decoder-only form by stacking layers. However, I would now like to implement a model similar to the original Transformer paper, which takes two separate inputs (encoder input and decoder input) and applies cross-attention between them.
The difficulties I am facing are:
- trainNetwork does not seem to support multiple inputs.
- MATLAB does not provide a built-in cross-attention layer.
- When I try to write a custom cross-attention layer myself, I run into numerous errors during training.
Is there any recommended way or workaround to build and train a Transformer model in MATLAB that supports encoder-decoder input structure with cross-attention?
Thank you in advance for your help.
Answers (1)
Matt J on 20 Aug 2025 at 21:49
Edited: Matt J on 20 Aug 2025 at 21:54
"trainNetwork does not seem to support multiple inputs."
trainNetwork is deprecated. You should be using trainnet instead, which does support multiple inputs. From the documentation:
"For neural networks with multiple inputs, you must use a TransformedDatastore or CombinedDatastore object."
"MATLAB does not provide a built-in cross-attention layer."
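If your release is new enough (I believe around R2024a or later), Deep Learning Toolbox does include an attentionLayer with separate query, key, and value inputs, which can serve as a cross-attention block when the query comes from the decoder branch and the keys/values come from the encoder branch. A minimal sketch, assuming that layer is available in your release and using placeholder layer names and sizes:

% Minimal encoder-decoder skeleton with cross-attention, assuming
% attentionLayer is available. The single-block encoder/decoder is a
% placeholder; a real model would add positional encoding, residual
% connections, and stacked blocks.
numFeatures = 64;  numHeads = 4;  numClasses = 10;

encoder = [
    sequenceInputLayer(numFeatures, Name="src")
    selfAttentionLayer(numHeads, numFeatures, Name="enc_selfattn")
    layerNormalizationLayer(Name="enc_out")];

decoder = [
    sequenceInputLayer(numFeatures, Name="tgt")
    selfAttentionLayer(numHeads, numFeatures, Name="dec_selfattn")
    layerNormalizationLayer(Name="dec_norm")];

head = [
    layerNormalizationLayer(Name="xattn_norm")
    fullyConnectedLayer(numClasses, Name="fc")
    softmaxLayer(Name="softmax")];

net = dlnetwork;                      % start empty, then add the branches
net = addLayers(net, encoder);
net = addLayers(net, decoder);
net = addLayers(net, attentionLayer(numHeads, Name="xattn"));
net = addLayers(net, head);

% Cross-attention: queries from the decoder, keys/values from the encoder
net = connectLayers(net, "dec_norm", "xattn/query");
net = connectLayers(net, "enc_out",  "xattn/key");
net = connectLayers(net, "enc_out",  "xattn/value");
net = connectLayers(net, "xattn",    "xattn_norm");

net = initialize(net);

This two-input dlnetwork can then be trained with trainnet and a CombinedDatastore as sketched above. If attentionLayer is not available in your release, the same wiring can be reproduced with a custom two-input layer that calls the attention function on dlarray data, or with a fully custom training loop.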