AINEVA is the association of the Italian regions and autonomous provinces whose territory includes the Alps; its goal is to coordinate the efforts of its local members in prevention and public information on snow and avalanches. On a daily basis, AINEVA members compile and publish bulletins of current conditions and avalanche forecasts written in Italian. To make these bulletins accessible to non-Italian speakers (e.g. foreign tourists), AINEVA members provide translations of the originals into languages such as English, French, German and Slovenian. AINEVA and FBK started a 20-month collaboration at the end of 2009 with the goal of developing a system for the automatic translation of these bulletins into English, French and German.
Emanuele Pianta Award for the Best Master’s Thesis in Computational Linguistics submitted at an Italian university and defended between August 1st 2024 and July 31st 2025
- Deadline: August 1st, 2025 (11:59 pm CEST)
- All details online: https://clic2025.unica.it/emanuele-pianta-award-for-the-best-masters-thesis/
Our pick of the week by @DennisFucci: "Speech Representation Analysis Based on Inter- and Intra-Model Similarities" by Yassine El Kheir, Ahmed Ali, and Shammur Absar Chowdhury (ICASSP Workshops 2024)
#speech #speechtech
Findings from https://ieeexplore.ieee.org/document/10669908 show that speech SSL models converge on similar embedding spaces, but via different routes. While overall representations align, individual neurons learn distinct localized concepts.
Interesting read! @fbk_mt
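The "similar embedding spaces" claim is the kind of thing typically quantified with a representational-similarity metric such as linear CKA. A minimal sketch (illustrative only, not the paper's code; the function name and toy data are assumptions):

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two representation
    matrices of shape (n_samples, n_features). Returns a score in [0, 1];
    1 means the representations are identical up to orthogonal transforms."""
    X = X - X.mean(axis=0)  # center each feature
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(Y.T @ X, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den

# Toy stand-ins for layer activations from two speech models
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))      # "model A" embeddings
Y = X @ rng.standard_normal((64, 64))   # linearly related "model B"

print(linear_cka(X, X))  # identical representations score 1.0
print(linear_cka(X, Y))  # related but not identical: between 0 and 1
```

A high CKA between two SSL models' layers is compatible with individual neurons encoding different localized concepts, since the metric compares whole subspaces, not single units.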
What do Italians really ask of artificial intelligence?
FBK, in collaboration with RiTA, launches a survey open to everyone to understand real uses, habits and needs.
Taking part takes just 10 minutes; find out more: https://magazine.fbk.eu/it/news/italiani-e-ia-cosa-chiediamo-veramente-allintelligenza-artificiale/
🚀 Last call for the Model Compression for Machine Translation task at #WMT2025 (co-located with #EMNLP2025)!
Test data out on June 19 ➡️ 2 weeks for evaluation!
Can you shrink an LLM and keep translation quality high?
👉 https://www2.statmt.org/wmt25/model-compression.html #NLP #ML #LLM #ModelCompression