Computational Phonology

Humans intuitively pattern meaningless symbolic elements into meaningful units when we comprehend and produce language. The rules and structures governing this patterning, known as phonology, act as a cognitive filter between the underlying structure of language and its externalization, whether through speech, sign, or tactile language. The wide variety of patterns that humans produce across languages, and their mappings from underlying to surface forms, shows remarkable computational simplicity and admits efficient, accurate learning algorithms. I draw on automata theory, formal language theory, grammatical inference, and mathematical logic to clarify the nature of the human mental capacity for phonology.
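The computational simplicity referred to above is the subregular hypothesis: most phonotactic patterns fall in weak subclasses of the regular languages. A minimal sketch of one such class, a strictly 2-local grammar that bans certain adjacent symbol pairs, is below; the banned bigrams are hypothetical and illustrative only, not drawn from any of the papers listed here.

```python
# Toy sketch of a strictly 2-local (SL2) phonotactic grammar, one of the
# simplest subregular language classes: a word is well-formed iff it
# contains no banned pair of adjacent symbols.

BANNED_BIGRAMS = {("m", "b"), ("n", "g")}  # hypothetical constraints

def well_formed(word):
    """Scan adjacent symbol pairs; reject on any banned bigram."""
    padded = ["#"] + list(word) + ["#"]  # '#' marks word boundaries
    return all((a, b) not in BANNED_BIGRAMS
               for a, b in zip(padded, padded[1:]))

print(well_formed("banana"))  # True: no banned adjacent pair
print(well_formed("amba"))    # False: contains the pair "mb"
```

Because membership is decided by a single pass over adjacent pairs, such grammars are learnable from positive data alone, which is part of what makes the subregular view attractive for phonology.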


Rawski, Jonathan. 2017. Phonological Complexity is Subregular: Evidence from Sign Language. To appear in Proceedings of the 51st Annual Meeting of the Chicago Linguistic Society.



Rawski, Jonathan, Jeffrey Heinz, Jane Chandlee, and Adam Jardine. 2018. How the Constraint Space Structure Facilitates Learning. Tenth North American Phonology Conference. Concordia University, Montreal.

Rawski, Jonathan. 2018. The Logical Complexity of Phonology Across Speech and Sign. Invited Talk. Institut Jean Nicod, Ecole Normale Superieure, Paris.

Rawski, Jonathan. 2017. Phonological Complexity is Subregular: Evidence from Sign Language. Talk. The 51st Annual Meeting of the Chicago Linguistic Society. University of Chicago. [slides]



Rawski, Jonathan. 2018. Subregular Complexity Across Speech and Sign. Poster. Society for Computation in Linguistics 1st Meeting. Salt Lake City.

Rawski, Jonathan, Aniello De Santo, and Jeffrey Heinz. 2017. Reconciling Minimum Description Length with Grammar-Independent Complexity Measures. Poster. MIT Workshop on Simplicity in Grammar Learning.


Computational Neuroscience

Phonology lies at the intersection of linguistics and neurobiology. Phonological rules govern the physical instantiation of linguistic thought, yet phonology is also the first fully abstract computational layer that perceptual encoding and decoding must pass through. The mapping problem, that is, how these symbolic rules are instantiated in neuronal systems, is central to understanding the biological capacity for language. How brains perform this feat depends on their particular structure and computing power, and remains largely an open question.
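One bridge between symbolic constraints and graded, neurally plausible computation is the Maximum Entropy (log-linear) formulation of Harmonic Grammar, which figures in several of the entries below. The sketch that follows is a generic toy illustration of that formulation, not code from any listed work; the candidate forms, constraints, and weights are all hypothetical.

```python
import math

# Toy Maximum Entropy (Harmonic) grammar: each candidate output is scored
# by a weighted sum of its constraint violations (its "harmony"), and
# candidate probabilities come from a softmax over harmonies.

def maxent_probs(violations, weights):
    """violations maps candidate -> list of violation counts,
    one count per constraint; weights is one weight per constraint."""
    harmony = {c: -sum(w * v for w, v in zip(weights, vs))
               for c, vs in violations.items()}
    z = sum(math.exp(h) for h in harmony.values())  # normalizer
    return {c: math.exp(h) / z for c, h in harmony.items()}

# Two hypothetical candidates evaluated by two hypothetical constraints
# (a markedness constraint and a faithfulness constraint):
probs = maxent_probs({"amba": [1, 0], "aba": [0, 1]}, weights=[2.0, 1.0])
```

Because the probabilities are a differentiable function of the weights, gradient-based learning rules can adjust the grammar continuously, which is what makes this formulation a natural meeting point for symbolic phonology and neural models of learning.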


Rawski, Jonathan. 2017. Syntax, Prosody, and Neural Oscillatory Computation. In preparation.

Rawski, Jonathan, and Boris Gutkin. Homeostatic Influences on Learning Maximum Entropy Grammars. In preparation.

Rawski, Jonathan. 2016. Homeostasis in Harmonic Grammar. Ms., Higher School of Economics.

Rawski, Jonathan. 2015. Second-Language Phonology Learning and Neuroplasticity. Linguistic Portfolios, Vol. 4.



De Santo, Aniello, Jonathan Rawski, Amanda Yazdani, and J.E. Drury. Quantified Sentences as a Window into Prediction and Priming. Chicago Linguistic Society Annual Meeting, Chicago, Illinois.

Rawski, Jonathan. 2016. Homeostatic Reinforcement Learning for Harmonic Grammars. Talk. SYNC Conference. CUNY, New York.
Rawski, Jonathan, and Boris Gutkin. 2016. Homeostatic Reinforcement Learning for Harmonic Grammars. Invited Talk. MIT Linguistics Dept.
Rawski, Jonathan. 2016. Homeostasis in Harmonic Grammar. Invited Talk. Laboratoire de Neurosciences Cognitives, Ecole Normale Superieure, Paris, France.

Rawski, Jonathan. 2015. Linguistic Structure from Neural Computation. Talk. Center for Cognition Seminar, Higher School of Economics.



De Santo, Aniello, Jonathan Rawski, and J.E. Drury. 2017. ERP Effects for Quantifier Complexity, Priming, and Truth Value in an Auditory/Visual Verification Task. Poster. Society for the Neurobiology of Language Annual Meeting.

Rawski, Jonathan. 2017. A Homeostatic Space for OT Phonology. Poster. Pronouns: Syntax, Semantics, Processing Conference. Moscow, Russia.


Public Writing and Talks

"Pirates and Emperors: On Publishers, Journalists, and Academic Elites." Talk at the New School for Social Research, March 2018


"A Dangerous Nuclear Ignorance" Article in Counterpunch Magazine, August 2017