SwePub

What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?

de Lhoneux, Miryam, 1990- (author)
Uppsala universitet, Institutionen för lingvistik och filologi
Stymne, Sara, 1977- (author)
Uppsala universitet, Institutionen för lingvistik och filologi
Nivre, Joakim, 1962- (author)
Uppsala universitet, Institutionen för lingvistik och filologi
MIT Press, 2020
English.
In: Computational Linguistics (Association for Computational Linguistics). MIT Press. ISSN 0891-2017, E-ISSN 1530-9312. Vol. 46, no. 4, pp. 763-784
  • Journal article (peer-reviewed)
Abstract
There is a growing interest in investigating what neural NLP models learn about language. A prominent open question is whether it is necessary to model hierarchical structure. We present a linguistic investigation of a neural parser that adds insights to this question. We look at transitivity and agreement information of auxiliary verb constructions (AVCs) in comparison to finite main verbs (FMVs). This comparison is motivated by theoretical work in dependency grammar and in particular the work of Tesnière (1959), where AVCs and FMVs are both instances of a nucleus, the basic unit of syntax. An AVC is a dissociated nucleus; it consists of at least two words, and an FMV is its non-dissociated counterpart, consisting of exactly one word. We suggest that the representation of AVCs and FMVs should capture similar information. We use diagnostic classifiers to probe agreement and transitivity information in vectors learned by a transition-based neural parser in four typologically different languages. We find that the parser learns different information about AVCs and FMVs if only sequential models (BiLSTMs) are used in the architecture but similar information when a recursive layer is used. We find explanations for why this is the case by looking closely at how information is learned in the network and looking at what happens with different dependency representations of AVCs. We conclude that there may be benefits to using a recursive layer in dependency parsing and that we have not yet found the best way to integrate it in our parsers.
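
The diagnostic-classifier setup described in the abstract can be sketched concretely. The sketch below is a minimal illustration, not the authors' implementation: the variables parser_vectors (fixed-dimensional verb representations extracted from a trained parser, e.g. BiLSTM outputs) and transitivity_labels (gold transitivity annotations) are hypothetical placeholders filled with random data, and a plain logistic-regression probe stands in for whatever classifier the paper used.

# Minimal sketch of a diagnostic classifier (linear probe).
# All data here is random placeholder data; in practice the vectors
# would be extracted from a trained transition-based parser and the
# labels would come from gold morphosyntactic annotation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 1000 verb representations of dimension 128,
# each labelled 0 (intransitive) or 1 (transitive).
parser_vectors = rng.normal(size=(1000, 128))
transitivity_labels = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    parser_vectors, transitivity_labels, test_size=0.2, random_state=0)

# The probe itself: if a simple linear classifier predicts the property
# well above chance from the vectors alone, the parser's representations
# are taken to encode that property.
probe = LogisticRegression(max_iter=1000)
probe.fit(X_train, y_train)
print("probe accuracy:", accuracy_score(y_test, probe.predict(X_test)))

In the paper's terms, running such a probe separately on AVC and FMV representations and comparing the accuracies is what reveals whether the two nucleus types encode similar information.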

Subject headings

NATURAL SCIENCES -- Computer and Information Sciences -- Language Technology (hsv//swe, hsv//eng)

Keywords

Computational Linguistics

Publication and content type

ref (subject category)
art (subject category)

