FBK-Fairseq
Open source repository with the code and models used in recent papers
by Marco Gaido | May 30, 2023 | Software
A neural adaptive machine translation system that adapts to context and learns from corrections
by Dennis Fucci | May 30, 2023 | Software
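As a rough illustration of what "learning from corrections" can mean in an adaptive MT setting, here is a minimal Python sketch of a correction memory with fuzzy retrieval of previously post-edited segments. The names and the retrieval strategy are illustrative assumptions, not the system's actual adaptation mechanism.

```python
# Toy correction memory: store human post-edits and retrieve the most similar
# ones for a new source sentence, so they can be used to adapt the translation.
# Purely illustrative; not the actual adaptation algorithm of the system above.
from difflib import SequenceMatcher

correction_memory: list[tuple[str, str]] = []  # (source, corrected translation)

def add_correction(source: str, post_edit: str) -> None:
    """Store a human correction so it can influence future translations."""
    correction_memory.append((source, post_edit))

def retrieve_similar(source: str, k: int = 1) -> list[tuple[str, str]]:
    """Return the k stored corrections whose source is most similar to the input."""
    return sorted(
        correction_memory,
        key=lambda pair: SequenceMatcher(None, source, pair[0]).ratio(),
        reverse=True,
    )[:k]

add_correction("apri il file di configurazione", "open the configuration file")
print(retrieve_similar("apri il file di log"))
```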
AQET (Adaptive Quality Estimation Tool) is an open-source package for Quality Estimation of Machine Translation that continuously learns from post-edited sentences.
by Andrea Piergentili | May 30, 2023 | Software
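To give an idea of the continuous-learning setup, the following toy Python sketch predicts a quality score from a few surface features and updates its weights whenever a post-edit provides a true label (e.g., an HTER score). Features, model, and update rule are illustrative assumptions, not AQET's actual implementation.

```python
# Toy online quality estimator: a linear model over simple surface features,
# updated with one SGD step each time a post-edited reference yields a label.
def features(source: str, translation: str) -> list[float]:
    """Illustrative surface features: bias, length ratio, punctuation ratio, long-token ratio."""
    src_toks, mt_toks = source.split(), translation.split()
    len_ratio = len(mt_toks) / max(len(src_toks), 1)
    punct = sum(t in ",.;:!?" for t in mt_toks) / max(len(mt_toks), 1)
    long_toks = sum(len(t) > 12 for t in mt_toks) / max(len(mt_toks), 1)
    return [1.0, len_ratio, punct, long_toks]

class OnlineQE:
    """Linear quality estimator updated online with stochastic gradient descent."""
    def __init__(self, n_features: int = 4, lr: float = 0.05):
        self.w = [0.0] * n_features
        self.lr = lr

    def predict(self, x: list[float]) -> float:
        return sum(wi * xi for wi, xi in zip(self.w, x))

    def update(self, x: list[float], target: float) -> None:
        """One SGD step on squared error once the true quality label is known."""
        error = self.predict(x) - target
        self.w = [wi - self.lr * error * xi for wi, xi in zip(self.w, x)]

qe = OnlineQE()
x = features("il gatto è sul tavolo", "the cat is on the table")
print("predicted quality:", qe.predict(x))
qe.update(x, target=0.1)  # hypothetical HTER computed against the post-edit
```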
An extension of MGIZA++ that makes it possible to align sentence pairs in an online mode.
by Dennis Fucci | May 30, 2023 | Software
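For readers unfamiliar with word alignment, the sketch below runs a toy EM loop for IBM Model 1, the kind of lexical translation model estimated by GIZA++/MGIZA++; the online extension incrementally updates such statistics as new sentence pairs arrive. The code is purely illustrative and not taken from the tool.

```python
# Toy IBM Model 1 estimation with EM, followed by a per-word best alignment.
from collections import defaultdict

corpus = [
    ("la casa".split(), "the house".split()),
    ("la casa verde".split(), "the green house".split()),
]

t = defaultdict(lambda: 0.25)  # translation probabilities t(e | f), uniform init

for _ in range(10):  # EM iterations
    counts, totals = defaultdict(float), defaultdict(float)
    for src, tgt in corpus:
        for e in tgt:
            norm = sum(t[(e, f)] for f in src)
            for f in src:
                c = t[(e, f)] / norm  # expected count of the pair (e, f)
                counts[(e, f)] += c
                totals[f] += c
    for (e, f), c in counts.items():
        t[(e, f)] = c / totals[f]

# Align each target word to its most likely source word.
src, tgt = corpus[1]
for e in tgt:
    print(e, "->", max(src, key=lambda f: t[(e, f)]))
```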
The IRST Language Modeling (IRSTLM) Toolkit features algorithms and data structures suitable for estimating, storing, and accessing very large n-gram language models.
by Dennis Fucci | May 30, 2023 | Software
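As a minimal illustration of what estimating an n-gram language model involves, the toy example below counts bigrams and turns them into conditional probabilities; IRSTLM does the same at a far larger scale, with proper smoothing, pruning, quantization, and memory-efficient data structures.

```python
# Toy bigram language model: count n-grams and compute maximum-likelihood
# conditional probabilities (no smoothing). Illustrative only.
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

corpus = [
    "<s> the cat sat on the mat </s>",
    "<s> the cat ate the fish </s>",
]

bigram_counts, unigram_counts = Counter(), Counter()
for line in corpus:
    toks = line.split()
    bigram_counts.update(ngrams(toks, 2))
    unigram_counts.update(ngrams(toks, 1))

def p(word, history):
    """Maximum-likelihood P(word | history) for a bigram model."""
    return bigram_counts[(history, word)] / unigram_counts[(history,)]

print(p("cat", "the"))  # "the cat" occurs 2 times, "the" occurs 4 times -> 0.5
```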
Moses is a statistical machine translation system that allows you to automatically train translation models for any language pair.
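A core step in training a phrase-based system is extracting phrase pairs consistent with a word alignment; the simplified sketch below illustrates that step on a toy sentence pair. It omits unaligned-word expansion, phrase scoring, and reordering models, which the real Moses training pipeline handles.

```python
# Simplified consistent phrase-pair extraction from a word-aligned sentence pair.
# Illustrative only; the actual Moses pipeline adds many refinements.
def extract_phrases(src, tgt, alignment, max_len=4):
    pairs = set()
    for i1 in range(len(src)):
        for i2 in range(i1, min(i1 + max_len, len(src))):
            # Target span covered by the alignment points of src[i1..i2].
            tgt_points = [j for (i, j) in alignment if i1 <= i <= i2]
            if not tgt_points:
                continue
            j1, j2 = min(tgt_points), max(tgt_points)
            # Consistency: no alignment point may link tgt[j1..j2] outside src[i1..i2].
            if any(j1 <= j <= j2 and not (i1 <= i <= i2) for (i, j) in alignment):
                continue
            if j2 - j1 + 1 <= max_len:
                pairs.add((" ".join(src[i1:i2 + 1]), " ".join(tgt[j1:j2 + 1])))
    return pairs

src = "das Haus ist klein".split()
tgt = "the house is small".split()
alignment = {(0, 0), (1, 1), (2, 2), (3, 3)}  # (source index, target index) pairs
for s, t in sorted(extract_phrases(src, tgt, alignment)):
    print(s, "|||", t)
```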
‼️ *Paper, code, models, and outputs are out* for one of our latest papers, "AlignAtt", on Simultaneous Speech Translation, recently published at #Interspeech2023!
📍Official Paper: https://www.isca-speech.org/archive/pdfs/interspeech_2023/papi23_interspeech.pdf
📍Repo (code, etc.): https://github.com/hlt-mt/FBK-fairseq/blob/master/fbk_works/ALIGNATT_SIMULST_AGENT_INTERSPEECH2023.md
#NLProc #NLP #speech #translation
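For a rough intuition of an attention-based simultaneous policy in the spirit of AlignAtt, the sketch below emits a candidate target token only when its cross-attention is not concentrated on the most recent audio frames; the window size, function names, and data here are illustrative assumptions, so see the paper and repository above for the actual method and code.

```python
# Toy attention-based emission policy for simultaneous speech translation:
# emit a candidate token only if the frame it attends to most is NOT among the
# newest frames of the partial input (otherwise wait for more audio).
def should_emit(cross_attention: list[float], last_frames: int = 2) -> bool:
    """cross_attention: attention weights of the candidate token over the
    encoder frames available so far (non-negative, summing to 1)."""
    most_attended = max(range(len(cross_attention)), key=lambda i: cross_attention[i])
    return most_attended < len(cross_attention) - last_frames

print(should_emit([0.10, 0.60, 0.20, 0.05, 0.05]))  # True: grounded in early audio
print(should_emit([0.05, 0.05, 0.10, 0.20, 0.60]))  # False: attends to the newest frames, wait
```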
Our pick of the week by @apierg: "Prompt-Driven Neural Machine Translation" by Li et al., Findings ACL 2022.
#machine #translation #MT #NLP #NLProc #ACL #computational #linguistics #prompt
The call for diversity and inclusion (D&I) subsidies for #EMNLP2023 is online!
EMNLP 2023 is providing D&I funds for registration, caregiving, bandwidth, travel and VPN subsidies.
Deadline: October 20, 2023 11:59pm (Anywhere on Earth)
Details:
#NLProc
One of the very first multilingual and multimodal models to obtain performance competitive with dedicated models, and maybe the first of many? Anyway, a very interesting read: https://arxiv.org/pdf/2308.11466.pdf