Tree-Planted Transformers: Unidirectional Transformer Language Models with Implicit Syntactic Supervision
Ryo Yoshida, Taiga Someya, Yohei Oseki
August 2024
International
Preprint
Type: Conference paper
Publication: Findings of the Association for Computational Linguistics: ACL 2024 (acceptance rate: ??%)
First Author
Peer-reviewed
Related
Composition, Attention, or Both?
Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars
Localizing Syntactic Composition with Left-Corner Recurrent Neural Network Grammars
Dissociating Syntactic Operations via Composition Count
Targeted Syntactic Evaluations on the Chomsky Hierarchy