SwePub

What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?

de Lhoneux, Miryam, 1990- (author)
Uppsala University, Department of Linguistics and Philology
Stymne, Sara, 1977- (author)
Uppsala University, Department of Linguistics and Philology
Nivre, Joakim, 1962- (author)
Uppsala University, Department of Linguistics and Philology
MIT Press, 2020
English.
In: Computational Linguistics (Print), Association for Computational Linguistics. MIT Press. ISSN 0891-2017, E-ISSN 1530-9312. Vol. 46, no. 4, pp. 763-784.
  • Journal article (peer-reviewed)
Abstract
There is a growing interest in investigating what neural NLP models learn about language. A prominent open question is whether it is necessary to model hierarchical structure. We present a linguistic investigation of a neural parser that adds insights to this question. We look at transitivity and agreement information of auxiliary verb constructions (AVCs) in comparison to finite main verbs (FMVs). This comparison is motivated by theoretical work in dependency grammar, in particular the work of Tesnière (1959), where AVCs and FMVs are both instances of a nucleus, the basic unit of syntax. An AVC is a dissociated nucleus, consisting of at least two words; an FMV is its non-dissociated counterpart, consisting of exactly one word. We suggest that the representations of AVCs and FMVs should capture similar information. We use diagnostic classifiers to probe agreement and transitivity information in vectors learned by a transition-based neural parser in four typologically different languages. We find that the parser learns different information about AVCs and FMVs if only sequential models (BiLSTMs) are used in the architecture, but similar information when a recursive layer is used. We explain why this is the case by looking closely at how information is learned in the network and at what happens with different dependency representations of AVCs. We conclude that there may be benefits to using a recursive layer in dependency parsing and that we have not yet found the best way to integrate it in our parsers.
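
To make the probing setup concrete, here is a minimal sketch of a diagnostic classifier of the kind the abstract describes: a shallow classifier trained to predict a linguistic property (here, transitivity) from frozen parser-internal vectors. Everything below is illustrative: the variable names, sizes, and the choice of logistic regression are assumptions for the sketch, not the authors' actual setup. In the paper, the vectors would be states from the parser's BiLSTM or recursive layer over AVC and FMV tokens.

```python
# Minimal sketch of a diagnostic classifier, assuming parser-internal
# vectors have already been extracted from a trained parser.
# `vectors` and `labels` are hypothetical placeholders: in the paper,
# vectors would be BiLSTM (or recursive-layer) states for AVC and FMV
# tokens, and labels would encode agreement or transitivity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tokens, dim = 1000, 128                     # illustrative sizes
vectors = rng.normal(size=(n_tokens, dim))    # stand-in for parser states
labels = rng.integers(0, 2, size=n_tokens)    # stand-in transitivity labels

X_train, X_test, y_train, y_test = train_test_split(
    vectors, labels, test_size=0.2, random_state=0)

# The probe is deliberately simple: if a shallow classifier can recover
# the property from frozen vectors, the parser has encoded that property.
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"probe accuracy: {probe.score(X_test, y_test):.3f}")
```

The study's analysis then comes from comparing such probe accuracies for AVC vectors against FMV vectors, across parser architectures with and without a recursive layer.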

Subject headings

NATURAL SCIENCES -- Computer and Information Sciences -- Language Technology (hsv//eng)

Keyword

Computational Linguistics

Publication and Content Type

ref (peer-reviewed)
art (journal article)
