SwePub

Can the Transformer Learn Nested Recursion with Symbol Masking?

Bernardy, Jean-Philippe, 1978 (author)
University of Gothenburg, Department of Philosophy, Linguistics and Theory of Science
Ek, Adam, 1990 (author)
University of Gothenburg, Department of Philosophy, Linguistics and Theory of Science
Maraev, Vladislav, 1986 (author)
University of Gothenburg, Department of Philosophy, Linguistics and Theory of Science
Stroudsburg, PA : The Association for Computational Linguistics, 2021
English.
In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, August 1 - 6, 2021, Online / Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli (Editors). - Stroudsburg, PA : The Association for Computational Linguistics. - 9781954085541
  • Conference paper (peer-reviewed)
Abstract
We investigate whether, given a simple symbol masking strategy, self-attention models are capable of learning nested structures and of generalising over their depth. We do so in the simplest setting possible, namely languages consisting of nested parentheses of several kinds. We use encoder-only models, which we train to predict randomly masked symbols, in a BERT-like fashion. We find that accuracy is well above the random baseline, remaining consistently above 50% both when increasing the nesting depth and when increasing the distance between training and testing. However, the predictions made correspond to a simple parenthesis-counting strategy rather than to a push-down automaton. This suggests that self-attention models are not suitable for tasks which require generalisation to more complex instances of recursive structures than those found in the training set.
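As a rough illustration of the setup the abstract describes (not the authors' released code), the Python sketch below generates well-nested strings over several kinds of parentheses (a Dyck-style language) and applies BERT-style random symbol masking; the bracket inventory, depth limit, and masking probability are assumed values chosen only for illustration.

import random

# Illustrative parameters only; the paper's exact generation procedure,
# bracket inventory and masking ratio are not given in this record.
KINDS = ["()", "[]", "{}"]          # several kinds of parentheses (Dyck-3)
MASK = "<mask>"

def gen_dyck(max_depth, target_len):
    """Generate one well-nested bracket string as a list of symbols."""
    out, stack = [], []
    while len(out) < target_len:
        # Close the innermost open bracket either at random or when the
        # depth limit is reached; otherwise open a new bracket.
        if stack and (len(stack) >= max_depth or random.random() < 0.5):
            out.append(stack.pop())
        else:
            kind = random.choice(KINDS)
            out.append(kind[0])
            stack.append(kind[1])
    out.extend(reversed(stack))     # close any brackets still open
    return out

def mask_symbols(tokens, p=0.15):
    """BERT-style masking: hide a random subset of symbols; the hidden
    originals are the prediction targets for the encoder."""
    masked, targets = [], []
    for t in tokens:
        if random.random() < p:
            masked.append(MASK)
            targets.append(t)
        else:
            masked.append(t)
            targets.append(None)    # no prediction at this position
    return masked, targets

if __name__ == "__main__":
    toks = gen_dyck(max_depth=4, target_len=20)
    inp, tgt = mask_symbols(toks)
    print("original:", "".join(toks))
    print("masked:  ", " ".join(inp))

Under these assumptions, testing generalisation over depth would amount to training on strings up to one max_depth and measuring masked-symbol accuracy on deeper strings.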

Subject headings

NATURVETENSKAP  -- Data- och informationsvetenskap -- Språkteknologi (hsv//swe)
NATURAL SCIENCES  -- Computer and Information Sciences -- Language Technology (hsv//eng)

Keyword

Language Models
Transformer
Dyck Languages
BERT

Publication and Content Type

ref (peer-reviewed)
kon (conference paper)



