Di Wu
I pressed the shutter at random and captured this scene.
Hi there! I’m Di Wu 吴迪, a PhD candidate (since September 2022) supervised by Christof Monz
at the Language Technology Lab, University of Amsterdam.
I enjoy designing models and approaches that are driven by intuition from a deep understanding of a specific
field, equipped with modern neural architectures, and grounded in real-world scenarios. The research that attracts me
most consists of simple, insightful, and far-reaching creations or findings, such as Word2Vec.
I mainly focus on Natural Language Processing, and more specifically on Machine Translation.
Here are some problems I am currently interested in. If you’d like to discuss them, leave me a message. I’m always
open to collaborations, or any kind of chat.
Selected Papers
- Representational Isomorphism and Alignment of Multilingual Large Language Models
- Di Wu, Yibin Lei, Andrew Yates, Christof Monz, 2024
- EMNLP2024 Findings, [PDF]
- How Far can 100 Samples Go? Unlocking Zero-Shot Translation with Tiny Multi-Parallel Data
- Di Wu, Shaomu Tan, Yan Meng, David Stap, Christof Monz, 2024
- ACL2024 Findings, [PDF]
- Beyond Shared Vocabulary: Increasing Representational Word Similarities across Languages for Multilingual Machine Translation
- Di Wu, Christof Monz, 2023
- EMNLP2023, [PDF]
- UvA-MT’s Participation in the WMT23 General Translation Shared Task
- Di Wu, Shaomu Tan, David Stap, Ali Araabi, Christof Monz, 2023
- WMT2023, Winning System, [PDF]
- SLUA: A Super Lightweight Unsupervised Word Alignment Model via Cross-Lingual Contrastive Learning
- Di Wu, Liang Ding, Dacheng Tao, 2022
- IWSLT2022, [PDF]
- SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling
- Di Wu, Liang Ding, Fan Lu, Jian Xie, 2020
- EMNLP2020, [PDF]
Ding~ You may know me better from this personal page.