Invited lecture: Recent Findings and Advances in Context-Aware and Unsupervised Neural Machine Translation


Machine translation has reached impressive translation quality for many language pairs. The improvements are largely due to the introduction of neural networks to the field, resulting in modern sequence-to-sequence neural machine translation (NMT) models. NMT is at the core of many large-scale industrial tools for automatic translation, such as Google Translate, Microsoft Translator, Amazon Translate and many others.

Current NMT models work at the sentence level, meaning they translate sentences independently. However, sentences are almost always contextualized in some way, and in many practical use cases a user is interested in translating a document in full. Translating sentences independently results in an incoherent document translation due to inconsistent translation of ambiguous source words or incorrect translation of anaphoric pronouns. I will present state-of-the-art context-aware NMT models that address this problem and show why context-aware NMT is essential on the road to human-level translation. I will also present our latest work, which calls into question recent results suggesting that the complex task of coreference resolution for pronoun translation, a task requiring strong reasoning capabilities, is successfully addressed by NMT.
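To make the idea concrete, below is a minimal sketch (not the speaker's actual model) of a common concatenation approach to context-aware NMT, in which previous source sentences are prepended to the current sentence with a separator token so that the encoder can see document context. The separator symbol and function name are illustrative assumptions.

    # Minimal sketch: building context-augmented source inputs for a
    # concatenation-style context-aware NMT system. "<sep>" is a
    # hypothetical separator token; the real vocabulary symbol varies by system.
    SEP = "<sep>"

    def build_context_inputs(document, context_size=1):
        """For each sentence, prepend up to `context_size` previous source sentences."""
        inputs = []
        for i, sentence in enumerate(document):
            context = document[max(0, i - context_size):i]
            if context:
                inputs.append(f" {SEP} ".join(context + [sentence]))
            else:
                inputs.append(sentence)
        return inputs

    if __name__ == "__main__":
        doc = [
            "The bat flew out of the cave.",
            "It was startled by the light.",  # the pronoun "It" needs context to be translated correctly
        ]
        for line in build_context_inputs(doc):
            print(line)

With the previous sentence attached, a context-aware model can disambiguate words like "bat" and choose the correct gender for anaphoric pronouns, which a sentence-level model cannot do.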

In the second part of the talk, I will briefly present our latest work on unsupervised NMT. Strong MT systems require large corpora of translated sentences, which are available for only a limited number of language pairs out of the more than 6,500 languages in the world. Unsupervised NMT (UNMT) builds translation models using monolingual corpora only. However, current UNMT methods work well only for language pairs for which large amounts of monolingual data are available. I will present an approach that addresses this issue and show our results on English-Macedonian and English-Albanian translation.
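As background for this part, the sketch below illustrates the iterative back-translation loop that underlies many UNMT systems: two translation models are trained from monolingual data alone by repeatedly translating one language to produce synthetic parallel data for the other direction. The ToyModel class and function names are hypothetical stand-ins, not the models discussed in the talk.

    # Minimal sketch of iterative back-translation for unsupervised NMT
    # (assumed generic pipeline, not the speaker's exact method).
    class ToyModel:
        """Hypothetical stand-in for a real NMT model."""
        def __init__(self, name):
            self.name = name
            self.training_data = []

        def translate(self, sentences):
            # A real model would decode here; the toy just tags each sentence.
            return [f"[{self.name}] {s}" for s in sentences]

        def train(self, pairs):
            # A real model would update its parameters on (source, target) pairs.
            self.training_data.extend(pairs)

    def unsupervised_training(mono_src, mono_tgt, iterations=3):
        src2tgt = ToyModel("src->tgt")
        tgt2src = ToyModel("tgt->src")
        for _ in range(iterations):
            # Back-translate target monolingual data into synthetic sources,
            # then train src->tgt on (synthetic source, real target) pairs.
            synthetic_src = tgt2src.translate(mono_tgt)
            src2tgt.train(list(zip(synthetic_src, mono_tgt)))
            # Do the same in the other direction.
            synthetic_tgt = src2tgt.translate(mono_src)
            tgt2src.train(list(zip(synthetic_tgt, mono_src)))
        return src2tgt, tgt2src

    if __name__ == "__main__":
        en_mono = ["Hello world.", "Where is the library?"]
        mk_mono = ["Здраво свету.", "Каде е библиотеката?"]
        unsupervised_training(en_mono, mk_mono)

The key point is that no parallel corpus is ever required: each model bootstraps the other by generating increasingly better synthetic training data, which is why the amount and quality of monolingual data matters so much.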



  Date and Time

  • Date: 16 Dec 2020
  • Time: 06:00 PM to 07:30 PM
  • All times are (UTC+01:00) Skopje

  Location

  • Skopje, Macedonia

  Hosts

  • Contact Event Hosts

  Registration

  • Starts 10 December 2020 12:29 PM
  • Ends 16 December 2020 07:30 PM
  • All times are (UTC+01:00) Skopje
  • No Admission Charge


  Speakers

Dario Stojanovski, M.Sc.

Topic:

Recent Findings and Advances in Context-Aware and Unsupervised Neural Machine Translation


Biography:

Dario Stojanovski is a PhD student at the Center for Information and Language Processing at LMU Munich under the supervision of Prof. Dr. Alexander Fraser. His main research interests are context-aware and unsupervised neural machine translation. His work has been published at top-tier Natural Language Processing and Machine Translation conferences, and he was involved in LMU Munich's high-scoring MT systems at the WMT shared tasks. He also worked as an Applied Research Intern at Amazon Research Berlin. He completed his bachelor's and master's studies at the Faculty of Computer Science and Engineering at Ss. Cyril and Methodius University in Skopje, where he also worked as a researcher focusing on sentiment analysis and emotion identification.
