Ryo Yoshida

Ph.D. student at the University of Tokyo

Biography

I’m Ryo Yoshida, a Ph.D. student in the Department of Language and Information Sciences, Graduate School of Arts and Sciences, the University of Tokyo. My research focuses on natural language processing, cognitive modeling, and syntactic supervision.

Interests
  • Natural Language Processing
  • Cognitive Modeling
  • Syntactic Supervision
Education
  • Doctor of Arts, April 2023 – Present

    Department of Language and Information Sciences, Graduate School of Arts and Sciences, University of Tokyo

  • Master of Arts, April 2021 – March 2023

    Department of Language and Information Sciences, Graduate School of Arts and Sciences, University of Tokyo

  • Bachelor of Arts, April 2017 – March 2021

    Department of Humanities and Social Sciences, College of Arts and Sciences, University of Tokyo

Skills
  • Python (4 years)
  • R (a little)
  • Haskell (beginner)

Experience
  • CTO, Apr 2022 – Mar 2024, Kyoto
  • Academic Specialist (学術専門職員), Nov 2021 – Mar 2023, Tokyo
  • AI Engineering Intern, May 2020 – Oct 2022, Tokyo

Recent Publications

(2025). Derivational Probing: Unveiling the Layer-wise Derivation of Syntactic Structures in Neural Language Models. Proceedings of the 29th Conference on Computational Natural Language Learning (acceptance rate: 18.4%).


(2025). Developmentally-plausible Working Memory Shapes a Critical Period for Language Acquisition. Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (acceptance rate: 20.3%).


(2025). If Attention Serves as a Cognitive Model of Human Memory Retrieval, What is the Plausible Memory Representation? Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (acceptance rate: 20.3%).


(2025). Investigating Psychometric Predictive Power of Syntactic Attention. Proceedings of the 29th Conference on Computational Natural Language Learning (acceptance rate: 18.4%).


(2024). Emergent Word Order Universals from Cognitively-Motivated Language Models. Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (acceptance rate: 21.3%).


Contact