Learning implicational models of universal grammar parameters

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Publication details

Title of host publication: The Evolution of Language: Proceedings of the 12th International Conference (EVOLANGXII)
Date published: 29 Jan 2018
Number of pages: 10
Publisher: Online at http://evolang.org/torun/proceedings/papertemplate.html?p=176
Place of publication: Torun, Poland
Editors: C. Cuskley, M. Flaherty, H. Little, L. McCrohon, A. Ravignani, T. Verhoef
Original language: English

Abstract

The use of parameters in the description of natural language syntax must balance the need to discriminate among (sometimes subtly different) languages, which can be seen as a cross-linguistic version of Chomsky's descriptive adequacy (Chomsky, 1964), against the complexity of the acquisition task that a large number of parameters would imply, which is a problem for explanatory adequacy. Here we first present a novel approach in which machine learning is used to detect hidden dependencies in a table of parameters. The result is a dependency graph in which some of the parameters can be fully predicted from others. These findings can then be subjected to linguistic analysis, which may either refute them by providing typological counter-examples from languages not included in the original dataset, dismiss them on theoretical grounds, or uphold them as tentative empirical laws worthy of further study. Machine learning is also used to explore the full sets of parameters that are sufficient to distinguish one historically established language family from others. These results provide a new type of empirical evidence about the historical adequacy of parameter theories.
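The dependency-detection idea can be illustrated with a minimal sketch. The language names, parameter names, and values below are invented for illustration only, and the check is reduced to the simplest case: pairwise implications of the form Pi = 1 ⇒ Pj = 1 that hold for every language in the table (the paper's actual approach applies machine learning to a real parameter table and can capture richer dependencies).

```python
# Hypothetical toy table of binary syntactic parameters.
# Rows are languages, columns are parameters P1..P4 (all values invented).
table = {
    "LangA": {"P1": 1, "P2": 1, "P3": 0, "P4": 1},
    "LangB": {"P1": 1, "P2": 1, "P3": 1, "P4": 1},
    "LangC": {"P1": 0, "P2": 0, "P3": 0, "P4": 1},
    "LangD": {"P1": 0, "P2": 1, "P3": 1, "P4": 1},
}

def implications(table):
    """Return all pairwise implications (Pi, Pj), meaning Pi=1 => Pj=1,
    that hold in every language of the table."""
    params = sorted(next(iter(table.values())))
    found = []
    for pi in params:
        for pj in params:
            if pi == pj:
                continue
            # The implication holds if every language with Pi=1 also has Pj=1.
            if all(row[pj] == 1 for row in table.values() if row[pi] == 1):
                found.append((pi, pj))
    return found

print(implications(table))
```

Implications surviving such a check across a large typological dataset are exactly the candidate "tentative empirical laws" described above: each one removes a degree of freedom from the acquisition task, since the implied parameter's value need not be learned independently.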
