Text Analytics Toolbox Model for BERT-Large Network
Pretrained BERT-Large Network for MATLAB
85 downloads
Updated
19 Jun 2024
BERT-Large is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. The model has 24 self-attention layers and a hidden size of 1024.
To load a BERT-Large model, you can run the following code:
[net, tokenizer] = bert(Model="large");
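As a minimal usage sketch, the returned tokenizer can encode raw text into token codes for the network. The example string and variable names below are illustrative, not from the original listing:

```matlab
% Load the pretrained BERT-Large model and its tokenizer
% (requires the Text Analytics Toolbox Model for BERT-Large Network).
[net, tokenizer] = bert(Model="large");

% Encode a sentence into numeric token codes using the tokenizer.
% The input string here is an arbitrary example.
str = "Text analytics with deep learning";
tokens = encode(tokenizer, str);
```

The encoded tokens can then be passed to the network for downstream tasks such as feature extraction or fine-tuning; see the Text Analytics Toolbox documentation for the expected input format.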
MATLAB Release Compatibility
Created with
R2023b
Compatible with any release from R2023b to R2024b
Platform Compatibility
Windows, macOS (Apple Silicon), macOS (Intel), Linux