Text Analytics Toolbox Model for BERT-Base Multilingual Cased Network

Pretrained BERT-Base Multilingual Cased Network for MATLAB
59 downloads
Updated 11 Sep 2024
BERT-Base Multilingual Cased is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. The model has 12 self-attention layers and a hidden size of 768.
To load the BERT-Base Multilingual Cased model and its tokenizer, run the following code:
[net, tokenizer] = bert(Model="multilingual");
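As a minimal usage sketch, assuming the bertTokenizer object returned by bert and its encode method from Text Analytics Toolbox, you can convert text into token codes before passing them to the network (the sample string below is illustrative only):
% Encode a sentence into WordPiece token codes using the multilingual tokenizer.
str = "BERT unterstützt viele Sprachen.";
tokenCodes = encode(tokenizer, str);   % cell array containing a vector of token codes
tokenCodes{1}                          % inspect the numeric codes for the sentence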
MATLAB Release Compatibility
Created with R2023b
Compatible with any release from R2023b to R2024b
Platform Compatibility
Windows macOS (Apple Silicon) macOS (Intel) Linux
