Showing 1–20 of 32 results
| Date | Name |
|------|------|
| Jun 22, 2020 | An information-theoretic account of semantic interference in word production |
| Nov 5, 2018 | Do RNNs learn human-like abstract word order preferences? |
| May 20, 2024 | Linguistic Structure from a Bottleneck on Sequential Information Processing |
| Mar 8, 2019 | Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State |
| Aug 18, 2017 | The Natural Stories Corpus |
| Sep 8, 2017 | A Statistical Comparison of Some Theories of NP Word Order |
| Sep 5, 2018 | RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency |
| Jan 28, 2025 | How Linguistics Learned to Stop Worrying and Love the Language Models |
| Oct 1, 2015 | Response to Liu, Xu, and Liang (2015) and Ferrer-i-Cancho and Gómez-Rodríguez (2015) on Dependency Length Minimization |
| May 13, 2024 | An information-theoretic model of shallow and deep language comprehension |
| May 19, 2025 | Clarifying orthography: Orthographic transparency as compressibility |
| Oct 12, 2020 | Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models |
| Apr 21, 2021 | Sensitivity as a Complexity Measure for Sequence Classification Tasks |
| Jun 6, 2023 | A Cross-Linguistic Pressure for Uniform Information Density in Word Order |
| Mar 3, 2019 | Structural Supervision Improves Learning of Non-Local Grammatical Dependencies |
| Mar 11, 2022 | When classifying grammatical role, BERT doesn't care about word order... except when it matters |
| Aug 31, 2018 | What do RNN Language Models Learn about Filler-Gap Dependencies? |
| Dec 16, 2022 | A unified information-theoretic model of EEG signatures of human language processing |
| Dec 29, 2023 | Exploring the Sensitivity of LLMs' Decision-Making Capabilities: Insights from Prompt Variation and Hyperparameters |
| Apr 10, 2026 | You Can't Fight in Here! This is BBS! |