SwePub
Search the SwePub database


Result list for the search "WFRF:(Sayyed Zeeshan Ali)"


  • Results 1-2 of 2
1.
  • Dakota, Daniel, et al. (author)
  • Bidirectional Domain Adaptation Using Weighted Multi-Task Learning
  • 2021
  • In: IWPT 2021. Stroudsburg, PA, USA: Association for Computational Linguistics. ISBN 9781954085800, pp. 93-105
  • Conference paper (peer-reviewed), abstract:
    • Domain adaptation in syntactic parsing is still a significant challenge. We address the issue of data imbalance between the in-domain and out-of-domain treebanks typically used for the problem. We define domain adaptation as a multi-task learning (MTL) problem, which allows us to train two parsers, one for each domain. Our results show that the MTL approach is beneficial for the smaller treebank. For the larger treebank, we need to use loss weighting in order to avoid a decrease in performance below the single-task baseline. In order to determine to what degree the data imbalance between the two domains and the domain differences affect results, we also carry out an experiment with two imbalanced in-domain treebanks and show that loss weighting also improves performance in an in-domain setting. Given loss weighting in MTL, we can improve results for both parsers.
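The loss-weighting idea described in the abstract above can be sketched in a few lines. This is only an illustrative sketch under my own assumptions (a shared encoder with one classification head per domain, hypothetical names such as WeightedMTLModel, and arbitrary weight values); it is not the authors' implementation and it omits everything parser-specific.

    # Minimal weighted multi-task learning sketch (PyTorch).
    # Hypothetical illustration, not the code behind the paper above.
    import torch
    import torch.nn as nn

    class WeightedMTLModel(nn.Module):
        def __init__(self, input_dim=100, hidden_dim=64, num_labels=40):
            super().__init__()
            # Shared representation used by both domain-specific heads.
            self.shared_encoder = nn.Sequential(
                nn.Linear(input_dim, hidden_dim), nn.ReLU()
            )
            # One task-specific head per treebank/domain.
            self.head_in = nn.Linear(hidden_dim, num_labels)
            self.head_out = nn.Linear(hidden_dim, num_labels)

        def forward(self, x, task):
            h = self.shared_encoder(x)
            return self.head_in(h) if task == "in" else self.head_out(h)

    model = WeightedMTLModel()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Hypothetical loss weights; the point is that the per-task losses are
    # weighted before being summed, not these particular values.
    loss_weights = {"in": 1.0, "out": 0.3}

    def training_step(batch_in, batch_out):
        (x_in, y_in), (x_out, y_out) = batch_in, batch_out
        loss = (loss_weights["in"] * criterion(model(x_in, "in"), y_in)
                + loss_weights["out"] * criterion(model(x_out, "out"), y_out))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Toy data standing in for parser features and labels.
    x = torch.randn(8, 100)
    y = torch.randint(0, 40, (8,))
    print(training_step((x, y), (x, y)))

The single weighted sum of per-task losses is the only point being illustrated; how to choose the weights so that neither parser falls below its single-task baseline is what the paper studies.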
2.
  • Sayyed, Zeeshan Ali, et al. (author)
  • Annotations Matter: Leveraging Multi-task Learning to Parse UD and SUD
  • 2021
  • In: Findings of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics. ISBN 9781954085541, pp. 3467-3481
  • Conference paper (peer-reviewed), abstract:
    • Using multiple treebanks to improve parsing performance has shown positive results. However, it is unclear what role similar, yet competing, annotation decisions play in parser behavior. We investigate this within a multi-task learning (MTL) dependency parser setup on two parallel treebanks, UD and SUD, which, while possessing similar annotation schemes, differ in specific linguistic annotation preferences. We perform a set of experiments with different MTL architectural choices, comparing performance across various input embeddings. We find languages tend to pattern in loose typological associations, but generally the performance within an MTL setting is lower than that of single-model baseline parsers for each annotation scheme. The main contributing factor seems to be the competing syntactic annotation information shared between treebanks in an MTL setting, which is shown in experiments against differently annotated treebanks. This suggests that how the annotation signal is encoded, and its influence on possible negative transfer, matters more in an MTL setting than the choice of input embeddings.
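The second abstract traces the MTL performance drop to competing annotation decisions between UD and SUD. As a rough, hedged illustration (not an example from the paper), the toy snippet below shows how the two schemes can assign different heads to the same tokens; the fragment and head indices follow my understanding of typical UD vs. SUD conventions for adpositions and may not match any official treebank.

    # Toy comparison of UD vs. SUD head assignments (illustrative assumption).
    tokens = ["the", "book", "on", "the", "table"]

    # heads[i] = 1-based index of token i's head (0 = root).
    ud_heads  = [2, 0, 5, 5, 2]  # UD: content word "table" heads the PP, "on" marks case
    sud_heads = [2, 0, 2, 5, 3]  # SUD: function word "on" heads the PP and attaches to "book"

    disagreements = [
        (tok, u, s)
        for tok, u, s in zip(tokens, ud_heads, sud_heads)
        if u != s
    ]
    print(f"{len(disagreements)}/{len(tokens)} tokens get different heads:")
    for tok, u, s in disagreements:
        print(f"  {tok!r}: UD head={u}, SUD head={s}")

A shared MTL encoder has to serve both sets of head assignments at once, which is the kind of competing signal the abstract points to as a likely source of negative transfer.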
Publication type
conference paper (2)
Content type
peer-reviewed (2)
Author/editor
Dakota, Daniel (2)
Sayyed, Zeeshan Ali (2)
Kuebler, Sandra (1)
University
Uppsala universitet (2)
Language
English (2)
Research subject (UKÄ/SCB)
Natural sciences (2)
Year
2021 (2)

 