SwePub

Hit list for the search "WFRF:(de Lhoneux Miryam 1990 ) srt2:(2017)"


  • Results 1-3 of 3
1.
  • de Lhoneux, Miryam, 1990-, et al. (author)
  • From raw text to Universal Dependencies : look, no tags!
  • 2017
  • In: Proceedings of the CoNLL 2017 Shared Task. Vancouver, Canada: Association for Computational Linguistics. ISBN 9781945626708, pp. 207-217
  • Conference paper (peer-reviewed), abstract:
    • We present the Uppsala submission to the CoNLL 2017 shared task on parsing from raw text to universal dependencies. Our system is a simple pipeline consisting of two components. The first performs joint word and sentence segmentation on raw text; the second predicts dependency trees from raw words. The parser bypasses the need for part-of-speech tagging, but uses word embeddings based on universal tag distributions. We achieved a macro-averaged LAS F1 of 65.11 in the official test run and obtained the 2nd best result for sentence segmentation with a score of 89.03. After fixing two bugs, we obtained an unofficial LAS F1 of 70.49.
2.
  • de Lhoneux, Miryam, 1990-, et al. (author)
  • Arc-Hybrid Non-Projective Dependency Parsing with a Static-Dynamic Oracle
  • 2017
  • In: IWPT 2017 15th International Conference on Parsing Technologies. Pisa, Italy: Association for Computational Linguistics. ISBN 9781945626739, pp. 99-104
  • Conference paper (peer-reviewed), abstract:
    • We extend the arc-hybrid transition system for dependency parsing with a SWAP transition that enables reordering of the words and construction of non-projective trees. Although this extension potentially breaks the arc-decomposability of the transition system, we show that the existing dynamic oracle can be modified and combined with a static oracle for the SWAP transition. Experiments on five languages with different degrees of non-projectivity show that the new system gives competitive accuracy and is significantly better than a system trained with a purely static oracle.
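The transition system summarized in the abstract above can be illustrated with a minimal, hypothetical sketch of the four arc-hybrid transitions, including the SWAP move that reorders words to allow non-projective trees (this is an illustration only, not the authors' implementation; the static-dynamic oracle and the scoring model are omitted):

```python
# Minimal sketch of the arc-hybrid transition system extended with SWAP
# (hypothetical illustration; oracle and scoring model omitted).

class Config:
    def __init__(self, n_words):
        # Tokens are 1-based indices; 0 stands for the artificial root.
        self.stack = []
        self.buffer = list(range(1, n_words + 1))
        self.arcs = {}  # dependent -> head

    def shift(self):
        # Move the first buffer item onto the stack.
        self.stack.append(self.buffer.pop(0))

    def left_arc(self):
        # The first buffer item becomes the head of the stack top, which is popped.
        dep = self.stack.pop()
        self.arcs[dep] = self.buffer[0]

    def right_arc(self):
        # The second stack item becomes the head of the stack top (0 = root).
        dep = self.stack.pop()
        self.arcs[dep] = self.stack[-1] if self.stack else 0

    def swap(self):
        # Move the stack top back into the buffer behind its first item;
        # this reordering is what permits crossing (non-projective) arcs.
        self.buffer.insert(1, self.stack.pop())

# Projective toy example, "the dog barks": the <- dog <- barks, barks = root.
c = Config(3)
for t in (c.shift, c.left_arc, c.shift, c.left_arc, c.shift, c.right_arc):
    t()
print(c.arcs)  # {1: 2, 2: 3, 3: 0}
```

In the full system, a classifier chooses among these transitions at each step; the paper's contribution is that the dynamic oracle for the first three transitions can be combined with a static oracle for SWAP.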
3.
  • de Lhoneux, Miryam, 1990-, et al. (author)
  • Old School vs. New School : Comparing Transition-Based Parsers with and without Neural Network Enhancement
  • 2017
  • In: Proceedings of the 15th Treebanks and Linguistic Theories Workshop (TLT), pp. 99-110
  • Conference paper (peer-reviewed), abstract:
    • In this paper, we attempt a comparison between "new school" transition-based parsers that use neural networks and their classical "old school" counterpart. We carry out experiments on treebanks from the Universal Dependencies project. To facilitate the comparison and analysis of results, we only work on a subset of those treebanks. However, we carefully select this subset in the hope to have results that are representative for the whole set of treebanks. We select two parsers that are hopefully representative of the two schools, MaltParser and UDPipe, and we look at the impact of training size on the two models. We hypothesize that neural network enhanced models have a steeper learning curve with increased training size. We observe, however, that, contrary to expectations, neural network enhanced models need only a small amount of training data to outperform the classical models, but the learning curves of both models increase at a similar pace after that. We carry out an error analysis on the development sets parsed by the two systems and observe that overall MaltParser suffers more than UDPipe from longer dependencies. We observe that MaltParser is only marginally better than UDPipe on a restricted set of short dependencies.