We comment on the technical interpretation of the study by Watson et al. and caution against their conclusion that the behavioral evidence in their experiments points to nonhuman animals' ability to learn syntactic dependencies, since the results are also consistent with the learning of phonological dependencies found in human languages.
We demonstrate a computational restriction on iterative prosody in phonology using logical transductions. We show that the typology is fundamentally local but requires output recursion, formulated via quantifier-free transductions and least-fixed-point operators, respectively. We focus on two case studies from iterative prosody: iterative secondary stress, and the more complex case of iterative syllabification and epenthesis in Arabic dialects. The second case study involves formalizing Ito's (1989) analysis of directional syllabification.
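As a concrete illustration of the "local rule plus output recursion" claim, the following is a minimal sketch in Python, not the paper's logical formalism: iterative secondary stress is computed by repeatedly applying a purely local update step until a fixed point is reached, in the spirit of the least-fixed-point operators mentioned above. The syllable encoding ('1' primary, '2' secondary, '0' unstressed) and the particular rule are hypothetical choices for illustration only.

# Sketch: alternating secondary stress via iteration of a local rule.
# Assumed encoding: '1' = primary stress, '2' = secondary, '0' = unstressed.

def stress_step(sylls):
    # One pass of a local rule: an unstressed syllable whose left neighbor is
    # unstressed and whose next-left neighbor is stressed receives secondary
    # stress.  The rule inspects only a bounded window, so it is local.
    out = list(sylls)
    for i in range(2, len(sylls)):
        if sylls[i] == '0' and sylls[i - 1] == '0' and sylls[i - 2] in ('1', '2'):
            out[i] = '2'
    return out

def iterative_stress(n_syllables):
    # Start from initial primary stress and reapply the local step until
    # nothing changes, i.e. until a fixed point of stress_step is reached.
    current = ['1'] + ['0'] * (n_syllables - 1)
    while True:
        updated = stress_step(current)
        if updated == current:
            return ''.join(current)
        current = updated

# Example: iterative_stress(7) returns '1020202', i.e. secondary stress on
# every other syllable after the initial primary stress.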
This paper examines the characterization and learning of grammars defined with enriched representational models. Model-theoretic approaches to formal language theory traditionally assume that each position in a string belongs to exactly one unary relation. We consider unconventional string models in which positions can have multiple, shared properties, which are arguably useful in many applications. We show that the structures given by these models are partially ordered, and we present a learning algorithm that exploits this ordering relation to effectively prune the hypothesis space. We prove that this learning algorithm, which takes positive examples as input, finds the most general grammar that covers the data.
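The following is a hedged sketch, in Python, of how such a partial order can prune a hypothesis space. It is an illustration under assumed choices (a toy property inventory, width-2 factors, and a forbidden-factor style of grammar), not the algorithm proved correct in the paper.

# Words are sequences of property bundles; a bundle with fewer properties is
# more general, and a factor built from more general bundles subsumes the more
# specific ones.  Candidates are visited from general to specific, and any
# factor subsumed by an already-banned factor is skipped without inspection.
from itertools import combinations, product

PROPERTIES = ('C', 'V', 'nasal')   # hypothetical property inventory
K = 2                              # factor width assumed for the sketch

def bundles():
    # All property bundles, from the empty bundle (matches anything) upward.
    return sorted((frozenset(c) for r in range(len(PROPERTIES) + 1)
                   for c in combinations(PROPERTIES, r)), key=len)

def matches(factor, word):
    # A factor matches a word if some window realizes it: each bundle of the
    # factor is a subset of the properties at the corresponding position.
    k = len(factor)
    return any(all(f <= p for f, p in zip(factor, word[i:i + k]))
               for i in range(len(word) - k + 1))

def subsumes(general, specific):
    # Positionwise subset: the more general factor describes more strings.
    return all(g <= s for g, s in zip(general, specific))

def learn(data):
    # Ban the most general k-factors unattested in the positive data; every
    # more specific factor they subsume is pruned, shrinking the search.
    candidates = sorted(product(bundles(), repeat=K),
                        key=lambda f: sum(len(b) for b in f))
    banned = []
    for factor in candidates:
        if any(subsumes(g, factor) for g in banned):
            continue
        if not any(matches(factor, w) for w in data):
            banned.append(factor)
    return banned

# Example: learn([({'C'}, {'V'}), ({'C'}, {'V', 'nasal'}, {'C'})]) returns the
# most general unattested width-2 factors over the toy inventory.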
We explore the generative capacity of morphological theories of reduplication, classifying them computationally using a hierarchy of string-to-string function classes. As a process, reduplication requires only the regular class of functions. We show that various morphological theories necessarily treat it as a more expressive polyregular function, while others maintain regularity. We discuss the significance of this formal result for reduplicative functions and recognition.
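As a hedged illustration of the function classes involved (and not the paper's formal constructions), the contrast is visible in output growth: total reduplication maps w to ww, with output linear in the input and computable by scanning the input twice, whereas a map like w to w^|w| grows quadratically and falls outside the regular functions but inside the polyregular ones.

def reduplicate(w):
    # Total reduplication, w -> ww: two left-to-right sweeps over the same
    # input, mirroring a two-way transducer strategy; output growth is
    # linear, so the function stays within the regular string functions.
    out = []
    for _sweep in range(2):
        for symbol in w:
            out.append(symbol)
    return ''.join(out)

def copy_per_segment(w):
    # A deliberately more powerful map, w -> w^|w|: one full copy per input
    # segment, so output growth is quadratic; such functions exceed the
    # regular class but remain polyregular.
    return ''.join(w for _ in w)

# reduplicate('pato')       -> 'patopato'
# copy_per_segment('pato')  -> 'patopatopatopato'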