
Wominjeka!
I am a Lecturer and a Postdoctoral Fellow at the University of Melbourne. I received my PhD in Computer Science from the University of Melbourne in 2018 (check my PhD thesis). Although my other degrees are also in computer science and machine learning, I have always aspired to study linguistics. Most of my current research is therefore in the field of computational linguistics. More specifically, I use technology to explore systematicity and regularities in language and, vice versa, utilize the regularities in linguistic structures to improve NLP models and learning algorithms, with a focus on under-resourced languages and their documentation.
The ultimate goal of my research is to find universal principles and underlying structures in the organization of human language in general and of individual languages in particular. Using statistical models, I aim to understand the constraints that led to the shared properties observed in most languages around the world.
A (non-exhaustive) list of my research interests: computational approaches to linguistic typology, computational social science, formal languages, information theory, machine translation, low-resource and multilingual NLP, NLP for under-resourced languages, and field research.
I co-organized the SIGTYP 2019–2022 workshops and shared tasks, as well as the SIGMORPHON 2017–2022 shared tasks on morphological reinflection. I am also co-organizing FieldMatters 2022 and LoResMT 2022. Finally, I am an active member of the UniMorph Project.
Curious readers are invited to check my CV for more detailed information. :-)
Teaching
I am currently co-lecturing COMP10001 "Foundations of Computing", Semester 2 (with Chris Leckie and Huey Yee Chan). If you are a student taking this subject, please visit the course page.
Please visit the Teaching section for the full list of subjects I have taught.
Most Recent Publications
UniMorph 4.0: Universal Morphology. (2022). In Proceedings of the 12th International Conference on Language Resources and Evaluation (LREC)
by Khuyagbaatar Batsuren, Omer Goldman, Salam Khalifa, Nizar Habash, Witold Kieraś, {...}, David Yarowsky, Ryan Cotterell, Reut Tsarfaty, Ekaterina Vylomova
Read in PDF
The SIGTYP 2022 Shared Task on the Prediction of Cognate Reflexes. (2022). In Proceedings of the 4th Workshop on Research in Computational Linguistic Typology and Multilingual NLP (pp. 52–62)
by Johann-Mattis List, Ekaterina Vylomova, Robert Forkel, Nathan Hill, Ryan Cotterell
Read in PDF
The SIGMORPHON–UniMorph 2022 Shared Task 0: Generalization and Typologically Diverse Morphological Inflection. (2022). In Proceedings of the 19th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology (pp. 176–203)
by Jordan Kodner, Salam Khalifa, Khuyagbaatar Batsuren, Hossep Dolatian, Ryan Cotterell, {...}, Jeremiah Young, Ekaterina Vylomova
Read in PDF
The Neuroscientification of Psychology: The Rising Prevalence of Neuroscientific Concepts in Psychology From 1965 to 2016. (2022).
Perspectives on Psychological Science, 17(2) (pp. 519–529)
by Nick Haslam, Ekaterina Vylomova, Sean C. Murphy, Sarah J. Wilson
Read in PDF
Please visit my Google Scholar profile for the full list.
Invited Talks
Documenting and modeling inflectional paradigms in under-resourced languages. (Cardamom Series; Jan, 2022)
Slides
Watch the Recording
The Secret Life of Words: Exploring Regularity and Systematicity. Episode II. (Keynote at SIGMORPHON 2021; Aug, 2021)
Slides
Event Page
UniMorph and Morphological Inflection Task: Past, Present, and Future. (SIGTYP Lecture Series; Aug, 2021)
Slides
Event Page
Watch the Recording
The Secret Life of Words: Exploring Regularity and Systematicity (w/Ryan Cotterell). (Moscow State University; Nov, 2020)
Slides
Event Page
Watch the Recording
What Do Neural Models "Know" About Natural Language? (CHDH Seminar Series; Apr, 2020)
Slides
Event Page