- uploaded 16 February 2018
Independent of branches like Quantitative Linguistics and Computational Linguistics, linguistics has witnessed a quite remarkable quantitative turn over the last two decades. Major drivers of this development have been the design of ever more numerous and ever larger electronic corpora, the increasing importance of psycho- and neurolinguistic experiments in exploring language processing and language variation, and the availability of ever more sophisticated statistical tools for handling large, complex linguistic data sets. Has this quantitative turn been to the detriment of qualitative methods, or even of linguistic theorizing in general? Has linguistics reached the point of a "quantitative crisis", as has recently been proclaimed for a range of academic disciplines, or is it still a discipline characterized by a healthy equilibrium, if not a mutual reinforcement, of quantitative and qualitative approaches?