
2022, Vol. 37, Issue 4

Research Article

28 February 2022. pp. 475-489
Abstract
Neural network language models (LMs) have recently been successful at tasks that require sensitivity to syntactic structure. We provide further evidence for this sensitivity by showing that adding an adaptation-as-priming paradigm to L2 LSTM LMs improves their ability to track abstract structure, compared to a non-adaptive counterpart. Because adaptation yields a gradient similarity metric between structures, this mechanism allows us to reconstruct the organization of the L2 LMs' syntactic representational space. In doing so, we find that sentences containing a particular type of relative clause behave similarly to other sentences with the same type of relative clause in the L2 LMs' representational space, in keeping with recent studies of L1 LM adaptation. We also demonstrate that the similarity between sentences is not driven by the specific words they contain. Our results show that L2 LMs can track abstract structural properties of sentences, just as L1 LMs do.
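The adaptation-as-priming paradigm described in the abstract is typically implemented by briefly fine-tuning a pretrained LM on a set of "prime" sentences sharing one structure and then measuring how much the surprisal of held-out "target" sentences drops; the size of the drop serves as the gradient similarity metric between the two structures. The sketch below illustrates that general recipe under stated assumptions (a PyTorch-style LSTM LM that maps a batch of token ids to logits plus a hidden state, hypothetical `prime_ids`/`target_ids` tensors, and illustrative hyperparameters); it is not the paper's code.

```python
# Minimal sketch of an adaptation-as-priming loop (not the authors' implementation).
# Assumptions: a PyTorch LSTM LM that maps a (1, seq_len) batch of token ids to
# (logits, hidden_state); prime_ids and target_ids are lists of 1-D LongTensors.
import copy
import math

import torch
import torch.nn.functional as F


def sentence_surprisal(model, token_ids):
    """Total surprisal of a sentence in bits under the (frozen) model."""
    model.eval()
    with torch.no_grad():
        logits, _ = model(token_ids[:-1].unsqueeze(0))          # predict each next token
        log_probs = F.log_softmax(logits.squeeze(0), dim=-1)
        token_logp = log_probs[torch.arange(len(token_ids) - 1), token_ids[1:]]
    return -token_logp.sum().item() / math.log(2)


def adaptation_effect(base_model, prime_ids, target_ids, lr=2e-3):
    """Drop in target surprisal after briefly fine-tuning a copy of the LM on primes.

    A larger drop means the LM treats the primes and targets as structurally more
    similar; this surprisal reduction is the gradient similarity metric.
    """
    before = sum(sentence_surprisal(base_model, t) for t in target_ids)

    adapted = copy.deepcopy(base_model)                          # keep the base LM intact
    optimizer = torch.optim.SGD(adapted.parameters(), lr=lr)
    adapted.train()
    for p in prime_ids:                                          # one pass over the primes
        optimizer.zero_grad()
        logits, _ = adapted(p[:-1].unsqueeze(0))
        loss = F.cross_entropy(logits.squeeze(0), p[1:])
        loss.backward()
        optimizer.step()

    after = sum(sentence_surprisal(adapted, t) for t in target_ids)
    return before - after
```

Computing this effect for every pair of structure types (e.g., priming with subject relatives and testing on object relatives) yields a similarity matrix over structures, from which the organization of the representational space can be reconstructed.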
Information
  • Publisher: The Modern Linguistic Society of Korea
  • Publisher (Korean): 한국현대언어학회
  • Journal Title: The Journal of Studies in Language
  • Journal Title (Korean): 언어연구
  • Volume: 37
  • Number: 4
  • Pages: 475-489