Computing and Information Systems - Theses

Source-Free Transductive Transfer Learning for Structured Prediction
Kurniawan, Kemal Maulana (2023-07)
Current transfer learning approaches make two strong assumptions: that the source domain data is available and that the target domain has labelled data. These assumptions break down when the source domain data is private and the target domain has no labelled data. We therefore consider the source-free unsupervised transfer setup, in which both assumptions are violated, across both languages and domains (genres). To transfer structured prediction models in this setting, we propose two methods: Parsimonious Parser Transfer (PPT), designed for single-source transfer of dependency parsers across languages, and PPTX, its multi-source extension. Both methods outperform baselines. We then improve PPTX with logarithmic opinion pooling (PPTX-LOP) and find that it is an effective multi-source transfer method for structured prediction in general. Next, we study whether our source-free transfer methods still provide improvements when pretrained language models (PTLMs) are employed. We first propose Parsimonious Transfer for Sequence Tagging (PTST), a variant of PPT designed for sequence tagging, and then evaluate PTST and PPTX-LOP on domain adaptation of semantic tasks using PTLMs. We show that for globally normalised models, PTST improves precision and PPTX-LOP improves recall. Besides unlabelled data, the target domain may also have models trained on various tasks other than the task of interest. To investigate whether these models can be used to improve performance in source-free transfer, we propose two further methods, and find that with one of them, leveraging these models improves recall over direct transfer. Finally, we critically discuss the findings of the thesis and conclude. We cover relevant subsequent work and close with a discussion of limitations and future work.
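
The abstract names logarithmic opinion pooling as the mechanism behind PPTX-LOP. As a rough sketch of that general idea only (not the thesis's implementation; the uniform weights, label count, and function below are illustrative assumptions), logarithmic opinion pooling combines the predicted distributions of several source models through a weighted geometric mean, p(y) proportional to the product over sources k of p_k(y)^{w_k}:

    import numpy as np

    def log_opinion_pool(probs, weights=None):
        """Pool per-label distributions from K source models.

        probs: array of shape (K, L) -- K source models, L labels.
        weights: optional length-K pooling weights (uniform if None).
        """
        probs = np.asarray(probs, dtype=float)
        k = probs.shape[0]
        if weights is None:
            weights = np.full(k, 1.0 / k)  # uniform weights (an assumption)
        # Work in log space for numerical stability.
        log_pool = weights @ np.log(probs + 1e-12)
        log_pool -= log_pool.max()  # stabilise before exponentiating
        pooled = np.exp(log_pool)
        return pooled / pooled.sum()

    # Example: three hypothetical source models scoring four candidate labels.
    p1 = [0.70, 0.10, 0.10, 0.10]
    p2 = [0.40, 0.40, 0.10, 0.10]
    p3 = [0.60, 0.20, 0.10, 0.10]
    print(log_opinion_pool([p1, p2, p3]))

In PPTX-LOP the pooling is applied to structured outputs (e.g. dependency trees) rather than the flat label distributions shown here, which this sketch does not capture.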