Deep neural language and translation models have revolutionized modern natural language processing. Data-intensive approaches using deep learning and artificial neural networks are applied to the majority of NLP tasks and require extensive high-performance computing resources.
Projects:
- HPLT – High-Performance Language Technologies (2022-2025)
- Uncertainty-aware neural language models (2022-2024)
- Behind the words: Deep neural models of language meaning for industry-grade applications (2021-2023)
- FoTran – Found in Translation (ERC, 2018-2024)
- OPUS-MT – Open Translation Models, Tools and Services (ELG pilot project)
- EOSC-Nordic – the NLPL use case within the Nordic chapter of the European Open Science Cloud
- OPUS (supported by NLPL), including OPUS-MT
People:
- Jörg Tiedemann
- Mathias Creutz
- Hande Celikkanat
- Michele Boggia
- Stig-Arne Grönroos
- Timothee Mickus
- Marianna Apidianaki
- Sami Virpioja
- Khalid Alnajjar
- Juan Raul Vazquez Carrillo
- Aarne Talman
- Mikko Aulamo
- Eetu Sjöblom
- Robert Östling (2016)
Publications: