A Neural Machine Translation (NMT) tutorial with OpenNMT-py, covering data preprocessing, model training, evaluation, and deployment.
- Data Processing (notebook | code) (see the corpus filtering sketch after this list)
- NMT Model Training with OpenNMT-py (notebook)
- Translation/Inference with CTranslate2 (code) (see the inference sketch after this list)
- MT Evaluation with BLEU and other metrics (tutorial | code | notebook) (see the sacreBLEU sketch after this list)
- Simple Web UI (tutorial | code)
- Running TensorBoard with OpenNMT (tutorial)
- Low-Resource Neural Machine Translation (tutorial)
- Domain Adaptation with Mixed Fine-tuning (tutorial)
- Overview of Domain Adaptation Techniques (tutorial)
- Multilingual Machine Translation (tutorial)
- Using Pre-trained NMT models with CTranslate2 (M2M-100 | NLLB-200) (see the NLLB-200 sketch after this list)
- Domain-Specific Text Generation for Machine Translation (paper | article | code)
- Adaptive Machine Translation with Large Language Models (paper | code)
- Fine-tuning Large Language Models for Adaptive Machine Translation (paper | code)
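The data processing step mainly covers cleaning and preparing a parallel corpus before training. As a rough illustration only (the notebook is the reference), here is a minimal corpus filtering sketch with pandas; the file names `train.en`/`train.fr`, the assumption that the two files are line-aligned, and the length-ratio threshold are all placeholders:

```python
# Minimal parallel-corpus filtering sketch (file names and thresholds are placeholders).
# Assumes train.en and train.fr are line-aligned, one segment per line.
from pathlib import Path
import pandas as pd

df = pd.DataFrame({
    "source": Path("train.en").read_text(encoding="utf-8").splitlines(),
    "target": Path("train.fr").read_text(encoding="utf-8").splitlines(),
})

# Drop empty or whitespace-only segments and exact duplicate pairs
df = df.replace(r"^\s*$", pd.NA, regex=True).dropna().drop_duplicates()

# Drop pairs whose source/target length ratio suggests misalignment
ratio = df["source"].str.split().str.len() / df["target"].str.split().str.len()
df = df[ratio.between(0.5, 2.0)]

Path("train.clean.en").write_text("\n".join(df["source"]) + "\n", encoding="utf-8")
Path("train.clean.fr").write_text("\n".join(df["target"]) + "\n", encoding="utf-8")
```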
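For translation/inference, the trained OpenNMT-py model can be converted to the CTranslate2 format (e.g. with `ct2-opennmt-py-converter`) and run through the CTranslate2 Python API. A minimal inference sketch, assuming a converted model directory `ct2_model/` and SentencePiece subword models `source.model`/`target.model` (all placeholder paths):

```python
# CTranslate2 inference sketch for a converted OpenNMT-py model (paths are placeholders).
import ctranslate2
import sentencepiece as spm

sp_source = spm.SentencePieceProcessor(model_file="source.model")
sp_target = spm.SentencePieceProcessor(model_file="target.model")

translator = ctranslate2.Translator("ct2_model", device="cpu")  # or device="cuda"

sentences = ["How are you today?"]
tokens = sp_source.encode(sentences, out_type=str)        # subword-tokenize the source
results = translator.translate_batch(tokens, beam_size=5)

for result in results:
    print(sp_target.decode(result.hypotheses[0]))          # detokenize the best hypothesis
```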
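For evaluation, corpus-level BLEU and chrF can be computed with sacreBLEU's Python API. A minimal sacreBLEU sketch, assuming `hypotheses.txt` and `reference.txt` (placeholder file names) contain one detokenized segment per line:

```python
# Corpus-level BLEU and chrF with sacreBLEU (file names are placeholders).
from pathlib import Path
from sacrebleu.metrics import BLEU, CHRF

hypotheses = Path("hypotheses.txt").read_text(encoding="utf-8").splitlines()
references = Path("reference.txt").read_text(encoding="utf-8").splitlines()

# sacreBLEU expects a list of reference streams, hence the extra list around `references`
print(BLEU().corpus_score(hypotheses, [references]))
print(CHRF().corpus_score(hypotheses, [references]))
```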
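Pre-trained multilingual models such as NLLB-200 can also be served with CTranslate2 after conversion (e.g. `ct2-transformers-converter --model facebook/nllb-200-distilled-600M --output_dir nllb-200-ct2`). A minimal NLLB-200 sketch; the output directory name and the language codes are examples only:

```python
# NLLB-200 inference sketch with CTranslate2 (model directory and language codes are examples).
import ctranslate2
import transformers

src_lang, tgt_lang = "eng_Latn", "fra_Latn"

translator = ctranslate2.Translator("nllb-200-ct2", device="cpu")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-600M", src_lang=src_lang
)

sentences = ["Machine translation is useful."]
source = [tokenizer.convert_ids_to_tokens(tokenizer.encode(s)) for s in sentences]
target_prefix = [[tgt_lang]] * len(sentences)   # force decoding into the target language

results = translator.translate_batch(source, target_prefix=target_prefix, beam_size=5)

for result in results:
    target_tokens = result.hypotheses[0][1:]    # drop the leading target-language token
    print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens)))
```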
To cite this tutorial:

@misc{moslem-2022-OpenNMTtutorial,
title = "{OpenNMT-py} Tutorial: Neural Machine Translation Data Preprocessing, Model Training, and Evaluation",
author = "Moslem, Yasmin",
year = "2022",
publisher = "GitHub",
url = "https://github.com/ymoslem/OpenNMT-Tutorial",
note = "GitHub repository"
}

@inproceedings{moslem-etal-2022-domain,
title = "Domain-Specific Text Generation for Machine Translation",
author = "Moslem, Yasmin and
Haque, Rejwanul and
Kelleher, John and
Way, Andy",
booktitle = "Proceedings of the 15th biennial conference of the Association for Machine Translation in the Americas (Volume 1: Research Track)",
month = sep,
year = "2022",
address = "Orlando, USA",
publisher = "Association for Machine Translation in the Americas",
url = "https://aclanthology.org/2022.amta-research.2",
pages = "14--30",
}