  1. Fine-tuning mBART to unseen languages - Medium

    Jun 19, 2023 · In this article, we’ll show you how to specialise mBART, a multilingual translation model, to translate Galician, a low-resource language that is not among its original languages. Let’s …

  2. MBart and MBart-50 - Hugging Face

    MBart is a multilingual encoder-decoder (sequence-to-sequence) model primarily intended for translation tasks. As the model is multilingual, it expects sequences in a different format (a minimal translation sketch appears after this list). A …

  3. MBart — transformers 3.5.0 documentation - Hugging Face

    MBart is a multilingual encoder-decoder (seq-to-seq) model primarily intended for translation tasks. As the model is multilingual, it expects sequences in a different format. A special language …

  4. mBART - Anwarvic's Blog

    Jan 22, 2020 · The whole idea behind mBART is to apply the BART architecture to large-scale monolingual corpora across many languages where the input texts are noised by masking …

  5. Fine-tune neural translation models with mBART

    Jun 10, 2020 · The basic idea is to train the model using monolingual data by masking a sentence that is fed to the encoder, and then have the decoder predict the whole sentence including the … (a conceptual sketch of such a noised input/target pair appears after this list)

  6. bhattbhavesh91/language-translation-using-mBart-50 - GitHub

    Streamlit app to translate text to or between 50 languages with mBART-50 from Hugging Face and Facebook. If you like my work, you can support me by buying me a coffee by clicking the link …

  7. mBART generative model pre-training & fine-tuning.

    We proposed fine-tuned language models based on the pre-trained language models AraBERT and M-BERT to perform Arabic GED using two approaches: the token level and …

  8. mBART Generative Model Pre-training & Fine-tuning V-C.2.

    Among transformer models, Generative Pre-trained Transformer (GPT) has achieved remarkable performance in text generation, or natural language generation (NLG), which needs the …

  9. mBART: Multilingual AI Translation | SERP AI

    mBART represents a significant advancement in multilingual AI translation, offering several key improvements over previous approaches. The architecture employs a standard sequence-to …

  10. Enhancing Neural Machine Translation with Fine-Tuned mBART50 …

    Jan 3, 2024 · In this paper, the resilience of the self-supervised multilingual sequence-to-sequence pre-trained model (mBART50) was investigated when fine-tuned with small …
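
A couple of the results above call out concrete usage details that a short sketch can make tangible. Results 2 and 3 note that mBART, being multilingual, expects language-tagged inputs. The snippet below is a minimal, hedged translation sketch (an illustration, not code taken from any of the listed pages) using the Hugging Face transformers API and the facebook/mbart-large-50-many-to-many-mmt checkpoint; the English example sentence and the French target are arbitrary choices.

    # Minimal mBART-50 translation sketch. Assumes the transformers library and
    # the facebook/mbart-large-50-many-to-many-mmt checkpoint are available.
    from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

    model_name = "facebook/mbart-large-50-many-to-many-mmt"
    model = MBartForConditionalGeneration.from_pretrained(model_name)
    tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

    # The source language must be declared before tokenizing, so the encoder
    # input carries the matching language token.
    tokenizer.src_lang = "en_XX"
    encoded = tokenizer("The weather is nice today.", return_tensors="pt")

    # The target language is chosen by forcing its language token as the first
    # generated token. convert_tokens_to_ids is used here because the
    # lang_code_to_id helper shown in older docs may differ across versions.
    fr_token_id = tokenizer.convert_tokens_to_ids("fr_XX")
    generated = model.generate(**encoded, forced_bos_token_id=fr_token_id)
    print(tokenizer.batch_decode(generated, skip_special_tokens=True))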
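
Results 4 and 5 describe mBART’s self-supervised objective: corrupt a sentence, feed the noised version to the encoder, and train the decoder to reconstruct the original. The sketch below only illustrates how one such (noised input, full target) pair could be assembled with the same tokenizer; the actual pre-training noising (span masking over a fraction of the words plus sentence permutation) is more involved, and the masked span here is hand-picked purely for illustration.

    # Conceptual sketch of a single denoising pair for mBART-style pre-training.
    # The noising is simplified: one hand-chosen span replaced by <mask>.
    from transformers import MBart50TokenizerFast

    tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
    tokenizer.src_lang = "en_XX"  # monolingual pair: source and target language match
    tokenizer.tgt_lang = "en_XX"

    original = "Economic growth has slowed considerably in recent years."
    noised = f"Economic growth has {tokenizer.mask_token} in recent years."

    batch = tokenizer(noised, text_target=original, return_tensors="pt")
    # batch["input_ids"]: the noised sentence the encoder sees
    # batch["labels"]:    the full original sentence the decoder must reproduce
    print(tokenizer.decode(batch["input_ids"][0]))
    print(tokenizer.decode(batch["labels"][0]))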
