Generative grammar: Difference between revisions
== Principles ==
Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.<ref name="WasowHandbookUmbrella">{{cite encyclopedia |title=Generative Grammar |encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=296,311|doi-broken-date=11 June 2025|quote="...generative grammar is not so much a theory as a family of theories, or a school of thought... [having] shared assumptions and goals, widely used formal devices, and generally accepted empirical results"}}</ref><ref name=carnie_p5>{{Cite book |last=Carnie|first=Andrew|title=Syntax: A Generative Introduction|author-link=Andrew Carnie|publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|page=5}}</ref>
=== Cognitive science ===
Generative grammar studies language as part of [[cognitive science]]. Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language.<ref>{{Cite book|last=Carnie|first=Andrew|author-link=Andrew Carnie|title=Syntax: A Generative Introduction |publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|pages=4–6,8}}</ref><ref name="WasowHandbookMental">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=295–296,299–300|isbn=978-0-631-20497-8}}</ref><ref name="AdgerCogSci">{{cite book |last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|page=14|isbn=978-0199243709}}</ref>
Like other approaches in linguistics, generative grammar engages in [[linguistic description]] rather than [[linguistic prescriptivism|linguistic prescription]].<ref>{{Cite book|last=Carnie|first=Andrew|author-link=Andrew Carnie|title=Syntax: A Generative Introduction|publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|page=8}}</ref><ref name="WasowHandbookPreDes">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=295,297|isbn=978-0-631-20497-8}}</ref>
=== Explicitness and generality ===
Generative grammar proposes models of language consisting of explicit rule systems, which make testable [[falsifiability|falsifiable]] predictions. This is different from [[traditional grammar]], where grammatical patterns are often described more loosely.<ref name="WasowHandbookExpGen">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=298–300|isbn=978-0-631-20497-8}}</ref><ref name="AdgerExpGen">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=14–15|isbn=978-0199243709}}</ref> These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible.
For example, because English [[imperative mood|imperative]] [[tag questions]] obey the same restrictions that second person [[future tense|future]] [[declarative mood|declarative]] tags do, [[Paul Postal]] proposed that the two constructions are derived from the same underlying structure. By adopting this hypothesis, he was able to capture the restrictions on tags with a single rule. This kind of reasoning is commonplace in generative research.<ref name="WasowHandbookExpGen"/>
Particular theories within generative grammar have been expressed using a variety of [[formal system]]s, many of which are modifications or extensions of [[context free grammars]].<ref name="WasowHandbookExpGen"/>
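The idea of an explicit rule system can be made concrete with a toy context-free grammar. The rules and vocabulary below are invented purely for illustration and are far simpler than any grammar proposed in the generative literature:

```python
import random

# A toy context-free grammar: each nonterminal maps to a list of possible
# expansions, where an expansion is a sequence of nonterminals or words.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "N":   [["cat"], ["dog"], ["mouse"]],
    "V":   [["chased"], ["meowed"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol into a list of terminal words."""
    if symbol not in GRAMMAR:          # a terminal: already a word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for sym in expansion:
        words.extend(generate(sym))
    return words

print(" ".join(generate()))  # e.g. "the cat chased a mouse"
```

Every string the grammar produces is "generated" by it in the technical sense used above: the grammar explicitly defines the set of well-formed sentences, so its predictions can be checked against speakers' judgments.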
=== Competence versus performance ===
Generative grammar generally distinguishes [[linguistic competence]] and [[linguistic performance]].<ref name="WasowHandbookCompPerf">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=297–298|doi-broken-date=11 June 2025}}</ref> Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system which puts these rules to use.<ref name="WasowHandbookCompPerf"/><ref>{{cite book|last=Pritchett|first=Bradley|year=1992|title=Grammatical competence and parsing performance|publisher=University of Chicago Press|page=2|isbn=0-226-68442-3}}</ref> This distinction is related to the broader notion of [[David Marr (neuroscientist)#Levels_of_analysis|Marr's levels]] used in other cognitive sciences, with competence corresponding to Marr's computational level.<ref>{{cite book |last=Marr|first=David|author-link=David Marr (neuroscientist)|year=1982|title=Vision|publisher=MIT Press|isbn=978-0262514620|page=28}}</ref>
For example, generative theories generally provide competence-based explanations for why [[English language|English]] speakers would judge the sentence in (1)<!--This refers to "*That cats is eating the mouse". Please update labels if necessary.--> as [[acceptability (linguistics)|odd]]. In these explanations, the sentence would be [[ungrammatical]] because the rules of English only generate sentences where [[demonstrative]]s [[Agreement (linguistics)|agree]] with the [[grammatical number]] of their associated [[noun]].<ref name="AdgerCompPerf">{{cite book |last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=4–7,17|isbn=978-0199243709}}</ref>
:(1) *That cats is eating the mouse.
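The agreement rule behind (1) can be sketched as a toy competence model. The lexicon and the plural heuristic below are illustrative simplifications invented for this sketch, not a serious fragment of English grammar:

```python
# Toy competence rule: a demonstrative must agree in grammatical number
# with its noun. Lexicon and heuristic are illustrative inventions.
DEMONSTRATIVES = {"this": "sg", "that": "sg", "these": "pl", "those": "pl"}

def noun_number(noun):
    # Crude heuristic adequate for this toy lexicon: plurals end in "-s".
    return "pl" if noun.endswith("s") else "sg"

def agrees(demonstrative, noun):
    """Return True iff the demonstrative matches the noun's number."""
    return DEMONSTRATIVES[demonstrative] == noun_number(noun)

print(agrees("that", "cat"))   # True  — "that cat" is generated
print(agrees("that", "cats"))  # False — "*that cats" is not generated
```

On a competence-based account, (1) is odd because no sequence violating this agreement rule is generated by the grammar at all.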
:(2) *The cat that the dog that the man fed chased meowed.
In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing. As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence.<ref name="DillonMomaSlides"/><ref>{{cite encyclopedia|title=Deriving competing predictions from grammatical approaches and reductionist approaches to island effects|encyclopedia=Experimental syntax and island effects|year=2013|last1=Sprouse|first1=Jon|last2=Wagers|first2=Matt|last3=Phillips|first3=Colin|author-link3=Colin Phillips|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|editor-link2=Norbert Hornstein|publisher=Cambridge University Press|doi=10.1017/CBO9781139035309.002|doi-broken-date=11 June 2025}}</ref> For example, while many generative models of syntax explain [[syntactic island|island effects]] by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance.<ref>{{cite encyclopedia|title=On the nature of island constraints I: Language processing and reductionist accounts|encyclopedia=Experimental syntax and island effects|year=2013|last=Phillips|first=Colin|pages=64–108|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|publisher=Cambridge University Press|url=http://www.colinphillips.net/wp-content/uploads/2014/08/phillips2013_islands1.pdf|doi=10.1017/CBO9781139035309.005|isbn=978-1-139-03530-9}}</ref><ref>{{cite encyclopedia|title=Islands in the grammar? Standards of evidence|encyclopedia=Experimental syntax and island effects|year=2013|last1=Hofmeister|first1=Philip|last2=Staum Casasanto|first2=Laura|last3=Sag|first3=Ivan|pages=42–63|author-link3=Ivan Sag|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|publisher=Cambridge University Press|doi=10.1017/CBO9781139035309.004|isbn=978-1-139-03530-9}}</ref>
Non-generative approaches often do not posit any distinction between competence and performance. For instance, [[usage-based models of language]] assume that grammatical patterns arise as the result of usage.<ref>{{cite book|last1=Vyvyan|first1=Evans|author-link=Vyvyan Evans|last2=Green|first2=Melanie|year=2006|title=Cognitive Linguistics: An Introduction|publisher=Edinburgh University Press|pages=108–111|isbn=0-7486-1832-5}}</ref>
=== Innateness and universality ===
A major goal of generative research is to figure out which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some [[domain-specific]] aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be.<ref name="WasowHandbookUniversality">{{cite encyclopedia |title=Generative Grammar |encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|page=299|doi-broken-date=11 June 2025}}</ref><ref name="PesetskyUG">{{cite encyclopedia|title=Linguistic universals and universal grammar|encyclopedia=The MIT encyclopedia of the cognitive sciences|year=1999|last=Pesetsky|first=David|author-link=David Pesetsky|editor-last1=Wilson|editor-first1=Robert|editor-last2=Keil|editor-first2=Frank|publisher=MIT Press|doi=10.7551/mitpress/4660.001.0001|pages=476–478|isbn=978-0-262-33816-5}}</ref>
The idea that at least some aspects are innate is motivated by [[poverty of the stimulus]] arguments.<ref name="AdgerPOS">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=8–11|isbn=978-0199243709}}</ref><ref name="Lasnik&LidzPOS">{{cite encyclopedia |title=The Argument from the Poverty of the Stimulus|last1=Lasnik|first1=Howard|author-link1=Howard Lasnik|last2=Lidz|first2=Jeffrey|author-link2=Jeffrey Lidz|encyclopedia=The Oxford Handbook of Universal Grammar|year=2017|editor-last=Roberts|editor-first=Ian|editor-link=Ian Roberts (linguist)|publisher=Oxford University Press|url=https://jefflidz.com/Docs/LasnikLidz2016.pdf}}</ref> For example, one famous poverty of the stimulus argument concerns the acquisition of [[yes–no question]]s in English. This argument starts from the observation that children only make mistakes compatible with rules targeting [[hierarchical structure (linguistics)|hierarchical structure]] even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange [[constituent (linguistics)|constituents]] in [[Tree (data_structure)|tree structure]]s. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.<ref name="AdgerPOS"/><ref name="Lasnik&LidzPOS"/><ref>{{cite journal|last1=Crain|first1=Stephen|author-link1=Stephen Crain|last2=Nakayama|first2=Mineharu|year=1987|title=Structure dependence in grammar formation|journal=Language|volume=63|issue=3|pages=522–543|doi=10.2307/415004|jstor=415004}}</ref> The empirical basis of poverty of the stimulus arguments has been challenged by [[Geoffrey Pullum]] and others, leading to back-and-forth debate in the [[language acquisition]] literature.<ref name="PullumScholz">{{cite journal|last1=Pullum|first1=Geoff|author-link1=Geoff Pullum|last2=Scholz|first2=Barbara|author-link2=Barbara Scholz|date=2002|title=Empirical assessment of stimulus poverty arguments|journal=The Linguistic Review|volume=18|issue=1–2|pages=9–50|doi=10.1515/tlir.19.1-2.9}}</ref><ref name="LegateYang">{{cite journal|last1=Legate|first1=Julie Anne|author-link1=Julie Anne Legate|last2=Yang|first2=Charles|author-link2=Charles Yang (linguist)|date=2002|title=Empirical re-assessment of stimulus poverty arguments|journal=The Linguistic Review|volume=18|issue=1–2|pages=151–162|doi=10.1515/tlir.19.1-2.9|url=https://www.ling.upenn.edu/~ycharles/papers/tlr-final.pdf}}</ref> Recent work has also suggested that some [[recurrent neural network]] architectures are able to learn hierarchical structure without an explicit constraint.<ref>{{cite journal|last1=McCoy|first1=R. Thomas|last2=Frank|first2=Robert|last3=Linzen|first3=Tal|year=2018 |title=Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks|journal=Proceedings of the 40th Annual Conference of the Cognitive Science Society|pages=2093–2098|url=https://tallinzen.net/media/papers/mccoy_frank_linzen_2018_cogsci.pdf}}</ref>
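The contrast between a linear rule and a hierarchical rule can be made concrete with the classic example "The man who is tall is happy". Both rules below, and the flat word-list representation, are deliberate simplifications of the argument rather than real parsers:

```python
# Two candidate yes-no question rules for English, applied to the
# declarative "the man who is tall is happy". Illustrative sketch only.

def linear_rule(words):
    """Front the linearly first auxiliary ("is"), ignoring structure."""
    i = words.index("is")
    return [words[i]] + words[:i] + words[i + 1:]

def hierarchical_rule(subject_np, aux, predicate):
    """Front the main-clause auxiliary, whatever the subject contains."""
    return [aux] + subject_np + predicate

declarative = ["the", "man", "who", "is", "tall", "is", "happy"]

print(" ".join(linear_rule(declarative)))
# "is the man who tall is happy" — ungrammatical, yet never produced by children

print(" ".join(hierarchical_rule(["the", "man", "who", "is", "tall"], "is", ["happy"])))
# "is the man who is tall happy" — the attested form
```

The linear rule is simpler to state, but it fronts the auxiliary inside the relative clause; the fact that children's errors never follow the linear pattern is the observation the poverty-of-the-stimulus argument builds on.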
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by [[Hagit Borer]] holds that the fundamental syntactic operations are universal and that all variation arises from different [[feature (linguistics)|feature]]-specifications in the [[mental lexicon|lexicon]].<ref name="PesetskyUG"/><ref>{{cite encyclopedia|title=Parameters|encyclopedia=The Oxford Handbook of Linguistic Minimalism|year=2012|last=Gallego|first=Ángel|editor-last=Boeckx|editor-first=Cedric|publisher=Oxford University Press|doi=10.1093/oxfordhb/9780199549368.013.0023}}</ref> On the other hand, a strong hypothesis adopted in some variants of [[Optimality Theory]] holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.<ref name="PesetskyUG"/><ref name="McCarthyOT">{{cite book|last=McCarthy|first=John|year=1992|title=Doing optimality theory|publisher=Wiley|pages=1–3|isbn=978-1-4051-5136-8}}</ref> In a 2002 paper, [[Noam Chomsky]], [[Marc Hauser]] and [[W. Tecumseh Fitch]] proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.<ref>{{cite journal|last1=Hauser|first1=Marc|author-link1=Marc Hauser|last2=Chomsky|first2=Noam|author-link2=Noam Chomsky|last3=Fitch|first3=W. Tecumseh|author-link3=W. Tecumseh Fitch|year=2002|title=The faculty of language: what is it, who has it, and how did it evolve|journal=Science|volume=298|issue=5598|pages=1569–1579|doi=10.1126/science.298.5598.1569|pmid=12446899}}</ref>
In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.<ref name="WasowHandbookUniversality"/>
=== Phonology ===
Phonology studies the rule systems which organize linguistic sounds. For example, research in phonology includes work on [[phonotactic]] rules which govern which [[phonemes]] can be combined, as well as those that determine the placement of [[stress (linguistics)|stress]], [[tone (linguistics)|tone]], and other [[suprasegmental]] elements.<ref name="ClementsPhonology">{{cite encyclopedia|title=Phonology|encyclopedia=The MIT encyclopedia of the cognitive sciences|year=1999|last=Clements|first=Nick|author-link=Nick Clements|editor-last1=Wilson|editor-first1=Robert|editor-last2=Keil|editor-first2=Frank|publisher=MIT Press|doi=10.7551/mitpress/4660.003.0026|pages=639–641}}</ref> Within generative grammar, a prominent approach to phonology is [[Optimality Theory]].<ref name="McCarthyOT"/>
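The core evaluation step of Optimality Theory can be sketched in a few lines: candidate output forms are compared on a strictly ranked list of violable constraints, and the candidate with the best violation profile wins. The constraints, the input /kat/, and the candidates below are invented placeholders, not an analysis from the literature:

```python
# Minimal sketch of Optimality Theory evaluation. Constraints are
# functions counting violations; ranking is the order of the list.

def evaluate(candidates, ranked_constraints):
    """Return the candidate whose violation vector is best under
    lexicographic comparison of the ranked constraints."""
    def violations(cand):
        return tuple(c(cand) for c in ranked_constraints)
    return min(candidates, key=violations)

# Toy constraints: no_coda penalizes syllables ending in a consonant;
# faith penalizes deleting segments from the hypothetical input /kat/.
no_coda = lambda cand: sum(1 for syll in cand.split(".")
                           if not syll.endswith(("a", "e", "i", "o", "u")))
faith = lambda cand: abs(len(cand.replace(".", "")) - len("kat"))

# Ranking faith >> no_coda keeps the final consonant; reversing the
# ranking deletes it — variation from ranking alone, as in the text.
print(evaluate(["kat", "ka"], [faith, no_coda]))  # "kat"
print(evaluate(["kat", "ka"], [no_coda, faith]))  # "ka"
```

Only the ranking changes between the two calls, which is how the strong universalist hypothesis mentioned above derives cross-linguistic variation from a fixed set of constraints.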
=== Semantics ===
=== Music ===
Generative grammar has been applied to [[music theory]] and [[musical analysis|analysis]] since the 1980s.<ref>{{cite journal|last1=Baroni|first1=Mario|last2=Maguire|first2=Simon|last3=Drabkin|first3=William|year=1983|title=The Concept of Musical Grammar|journal=Music Analysis|volume=2|issue=2|pages=175–208|doi=10.2307/854248|jstor=854248}}</ref> One notable approach is [[Fred Lerdahl]] and [[Ray Jackendoff]]'s [[Generative theory of tonal music]], which formalized and extended ideas from [[Schenkerian analysis]].<ref>{{cite book|last=Lerdahl|first=Fred|author2=Ray Jackendoff|title=A Generative Theory of Tonal Music|url=https://archive.org/details/generativetheory0000lerd|publisher=MIT Press|year=1983|isbn=978-0-262-62107-6}}</ref>
=== Biolinguistics ===
Analytical models based on semantics and [[discourse]] [[pragmatics]] were rejected by the [[Bloomfieldian]] school of linguistics<ref name=Garvin_1954>{{cite journal |last=Garvin |first=Paul L.|title=Prolegomena to a Theory of Language by Louis Hjelmslev; Francis J. Whitfield |year=1954|journal=Language |volume=30 |issue=1 |pages=69–96 |doi=10.2307/410221|jstor=410221}}</ref> whose derivatives place the [[object (grammar)|object]] into the [[verb phrase]], following from [[Wilhelm Wundt]]'s [[Völkerpsychologie]]. Formalisms based on this convention were constructed in the 1950s by [[Zellig Harris]] and [[Charles Hockett]]. These gave rise to modern generative grammar.<ref name=Seuren_1998>{{cite book|author=Seuren, Pieter A. M. |authorlink=Pieter Seuren |year=1998|title=Western linguistics: An historical introduction|publisher=Wiley-Blackwell|isbn=0-631-20891-7|pages=160–167}}</ref>
As a distinct research tradition, generative grammar began in the late 1950s with the work of [[Noam Chomsky]].<ref>{{cite book|last=Newmeyer|first=Frederick|year=1986|title=Linguistic Theory in America|publisher=Academic Press|pages=17–18|isbn=0-12-517152-8}}</ref> However, its roots include earlier [[Structuralism|structuralist]] approaches such as [[glossematics]], which themselves had older roots, for instance in the work of the ancient Indian grammarian [[Pāṇini]].<ref name="Koerner_1978">{{cite book|last=Koerner|first=E. F. K.|title=Toward a Historiography of Linguistics: Selected Essays|publisher=John Benjamins|date=1978|pages=21–54|chapter=Towards a historiography of linguistics}}</ref><ref>Bloomfield, Leonard, 1929, 274; cited in Rogers, David, 1987, 88</ref><ref>Hockett, Charles, 1987, 41</ref> Military funding for generative research was an important factor in its early spread in the 1960s.<ref>Newmeyer, F. J. (1986). Has there been a 'Chomskyan revolution' in linguistics?. Language, 62(1), p. 13</ref>
The initial version of generative syntax was called [[transformational grammar]]. In transformational grammar, rules called transformations mapped a level of representation called [[deep structure]]s to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, an active sentence such as "The doctor examined the patient" and "The patient was examined by the doctor", had the same deep structure. The difference in surface structures arises from the application of the passivization transformation, which was assumed to not affect meaning. This assumption was challenged in the 1960s by the discovery of examples such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room".<ref>{{Cite journal |last=Heitner |first=Reese |date=2003-10-03 |title=An Integrated Theory of Linguistic Descriptions [1964] |url=https://doi.org/10.1111/1467-9191.00147 |journal=The Philosophical Forum |volume=34 |issue= | The initial version of generative syntax was called [[transformational grammar]]. In transformational grammar, rules called transformations mapped a level of representation called [[deep structure]]s to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, an active sentence such as "The doctor examined the patient" and "The patient was examined by the doctor", had the same deep structure. The difference in surface structures arises from the application of the passivization transformation, which was assumed to not affect meaning. 
This assumption was challenged in the 1960s by the discovery of examples such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room".<ref>{{Cite journal |last=Heitner |first=Reese |date=2003-10-03 |title=An Integrated Theory of Linguistic Descriptions [1964] |url=https://doi.org/10.1111/1467-9191.00147 |journal=The Philosophical Forum |volume=34 |issue=3–4 |pages=401–416 |doi=10.1111/1467-9191.00147 |issn=0031-806X|url-access=subscription }}</ref> | ||
After the [[Linguistics wars]] of the late 1960s and early 1970s, Chomsky developed a revised model of syntax called [[Government and binding theory]], which eventually grew into [[Minimalist program|Minimalism]]. In the aftermath of those disputes, a variety of other generative models of syntax were proposed including [[relational grammar]], [[Lexical Functional Grammar|Lexical-functional grammar]] (LFG), and [[Head-driven phrase structure grammar]] (HPSG).<ref>{{Citation | | After the [[Linguistics wars]] of the late 1960s and early 1970s, Chomsky developed a revised model of syntax called [[Government and binding theory]], which eventually grew into [[Minimalist program|Minimalism]]. In the aftermath of those disputes, a variety of other generative models of syntax were proposed including [[relational grammar]], [[Lexical Functional Grammar|Lexical-functional grammar]] (LFG), and [[Head-driven phrase structure grammar]] (HPSG).<ref>{{Citation |last1=Sadler |first1=Louisa |title=Morphology in Lexical-Functional Grammar and Head-driven Phrase Structure Grammar |date=2018-12-13 |work=The Oxford Handbook of Morphological Theory |pages=211–243 |editor-last=Audring |editor-first=Jenny |url=https://academic.oup.com/edited-volume/34505/chapter/292751429 |access-date=2025-05-08 |publisher=Oxford University Press |language=en |doi=10.1093/oxfordhb/9780199668984.013.17 |isbn=978-0-19-966898-4 |last2=Nordlinger |first2=Rachel |editor2-last=Masini |editor2-first=Francesca|url-access=subscription }}</ref> | ||
Generative phonology originally focused on [[rewriting|rewrite rules]], in a system commonly known as ''SPE Phonology'' after the 1968 book [[The Sound Pattern of English]] by Chomsky and [[Morris Halle]]. In the 1990s, this approach was largely replaced by [[Optimality theory]], which was able to capture generalizations called [[conspiracy (phonology)|conspiracies]] which needed to be stipulated in SPE phonology.<ref name ="McCarthyOT"/> | Generative phonology originally focused on [[rewriting|rewrite rules]], in a system commonly known as ''SPE Phonology'' after the 1968 book [[The Sound Pattern of English]] by Chomsky and [[Morris Halle]]. In the 1990s, this approach was largely replaced by [[Optimality theory]], which was able to capture generalizations called [[conspiracy (phonology)|conspiracies]] which needed to be stipulated in SPE phonology.<ref name ="McCarthyOT"/> | ||
Revision as of 10:15, 11 June 2025
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists,[1] tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
Generative grammar began in the late 1950s with the work of Noam Chomsky, having roots in earlier approaches such as structural linguistics. The earliest version of Chomsky's model was called Transformational grammar, with subsequent iterations known as Government and binding theory and the Minimalist program. Other present-day generative models include Optimality theory, Categorial grammar, and Tree-adjoining grammar.
Principles
Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.[2][3]
Cognitive science
Generative grammar studies language as part of cognitive science. Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language.[4][5][6]
Like other approaches in linguistics, generative grammar engages in linguistic description rather than linguistic prescription.[7][8]
Explicitness and generality
Generative grammar proposes models of language consisting of explicit rule systems, which make testable, falsifiable predictions. This differs from traditional grammar, where grammatical patterns are often described more loosely.[9][10] These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible. For example, because English imperative tag questions obey the same restrictions that second-person future declarative tags do, Paul Postal proposed that the two constructions are derived from the same underlying structure. By adopting this hypothesis, he was able to capture the restrictions on tags with a single rule. This kind of reasoning is commonplace in generative research.[9]
Particular theories within generative grammar have been expressed using a variety of formal systems, many of which are modifications or extensions of context-free grammars.[9]
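For illustration, a context-free grammar is explicit enough to be written down and executed directly. The following sketch uses an invented toy fragment (two nouns, one verb, one determiner, not drawn from any particular published analysis) and enumerates every sentence its rules license:

```python
import itertools

# A toy context-free grammar: each rule rewrites a nonterminal
# as a sequence of nonterminals and/or terminal words.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["cat"], ["mouse"]],
    "V":   [["chased"]],
}

def generate(symbol="S"):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in RULES:          # a terminal word
        yield [symbol]
        return
    for expansion in RULES[symbol]:
        # combine the expansions of each child symbol
        for parts in itertools.product(*(list(generate(s)) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = [" ".join(s) for s in generate()]
# Generates "the cat chased the mouse", "the mouse chased the cat", etc.
```

Because the rules are explicit, the grammar's predictions are exhaustive and checkable: it generates exactly four sentences, and any string outside that set is predicted to be ill-formed by this (deliberately tiny) fragment.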
Competence versus performance
Generative grammar generally distinguishes linguistic competence and linguistic performance.[11] Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system which puts these rules to use.[11][12] This distinction is related to the broader notion of Marr's levels used in other cognitive sciences, with competence corresponding to Marr's computational level.[13]
For example, generative theories generally provide competence-based explanations for why English speakers would judge the sentence in (1) as odd. In these explanations, the sentence would be ungrammatical because the rules of English only generate sentences where demonstratives agree with the grammatical number of their associated noun.[14]
- (1) *That cats is eating the mouse.
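A competence-based explanation of this kind can be made concrete. In the sketch below (the feature-annotated lexicon is invented for illustration), a demonstrative–noun–verb string counts as generated only if the number features agree, so the string in (1) is simply underivable:

```python
# A toy competence grammar: demonstratives, nouns, and verb forms carry
# a number feature ("sg" or "pl") that must match for the rules to apply.
# Strings the rules cannot generate are predicted to be ungrammatical.
DEMONSTRATIVES = {"this": "sg", "that": "sg", "these": "pl", "those": "pl"}
NOUNS = {"cat": "sg", "cats": "pl"}
VERBS = {"is eating": "sg", "are eating": "pl"}

def generates(dem, noun, verb):
    """True iff the grammar derives `dem noun verb ...` (all features agree)."""
    return DEMONSTRATIVES.get(dem) == NOUNS.get(noun) == VERBS.get(verb)

generates("that", "cat", "is eating")    # grammatical: features all "sg"
generates("that", "cats", "is eating")   # underivable: *That cats is eating...
```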
By contrast, generative theories generally provide performance-based explanations for the oddness of center-embedding sentences like the one in (2). According to such explanations, the grammar of English could in principle generate such sentences, but doing so in practice is so taxing on working memory that the sentence ends up being unparsable.[14][15]
- (2) *The cat that the dog that the man fed chased meowed.
In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing. As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence.[15][16] For example, while many generative models of syntax explain island effects by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance.[17][18]
Non-generative approaches often do not posit any distinction between competence and performance. For instance, usage-based models of language assume that grammatical patterns arise as the result of usage.[19]
Innateness and universality
A major goal of generative research is to figure out which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some domain-specific aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be.[20][21]
The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments.[22][23] For example, one famous poverty of the stimulus argument concerns the acquisition of yes–no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.[22][23][24] The empirical basis of poverty of the stimulus arguments has been challenged by Geoffrey Pullum and others, leading to back-and-forth debate in the language acquisition literature.[25][26] Recent work has also suggested that some recurrent neural network architectures are able to learn hierarchical structure without an explicit constraint.[27]
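The contrast between the two candidate rules can be made concrete. In the sketch below (the example sentence and its main-clause bracketing are hand-supplied for illustration), the linear rule fronts the first auxiliary in the word string and yields an ungrammatical question, while the structure-sensitive rule fronts the auxiliary of the main clause:

```python
# Declarative: "The man who is tall is happy."
sentence = ["the", "man", "who", "is", "tall", "is", "happy"]

def linear_rule(words):
    """Front the FIRST auxiliary in linear order (a rule children never try)."""
    i = words.index("is")
    return [words[i]] + words[:i] + words[i + 1:]

def structural_rule(subject, aux, predicate):
    """Front the auxiliary of the MAIN clause, identified structurally."""
    return [aux] + subject + predicate

linear_rule(sentence)
# -> ['is', 'the', 'man', 'who', 'tall', 'is', 'happy']   (ungrammatical)

structural_rule(["the", "man", "who", "is", "tall"], "is", ["happy"])
# -> ['is', 'the', 'man', 'who', 'is', 'tall', 'happy']   (grammatical)
```

Both rules give the same output for simple sentences like "The man is happy", which is why the child's input data alone do not obviously distinguish them.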
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon.[21][28] On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.[21][29] In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.[30]
In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.[20]
Subfields
Research in generative grammar spans a number of subfields. These subfields are also studied in non-generative approaches.
Syntax
Syntax studies the rule systems which combine smaller units such as morphemes into larger units such as phrases and sentences.[31] Within generative syntax, prominent approaches include Minimalism, Government and binding theory, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).[3]
Phonology
Phonology studies the rule systems which organize linguistic sounds. For example, research in phonology includes work on phonotactic rules which govern which phonemes can be combined, as well as those that determine the placement of stress, tone, and other suprasegmental elements.[32] Within generative grammar, a prominent approach to phonology is Optimality Theory.[29]
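The core of Optimality Theory's evaluation procedure, in which output candidates are compared on ranked, violable constraints, can be sketched in a few lines. The constraint names (*CODA, MAX-IO) are standard in the OT literature, but the mini-language data below are invented for illustration:

```python
def no_coda(form, underlying):
    """*CODA: one violation if the form ends in a consonant."""
    return 1 if form and form[-1] not in "aeiou" else 0

def max_io(form, underlying):
    """MAX-IO: one violation per underlying segment deleted from the output."""
    return len(underlying) - len(form)

def evaluate(underlying, candidates, ranking):
    """Return the candidate with the best violation profile under `ranking`.

    Profiles are compared lexicographically, so a higher-ranked constraint
    always outweighs any number of lower-ranked violations.
    """
    def profile(form):
        return tuple(constraint(form, underlying) for constraint in ranking)
    return min(candidates, key=profile)

# A language ranking *CODA above MAX-IO deletes the final consonant:
evaluate("pat", ["pat", "pa"], [no_coda, max_io])    # -> "pa"
# The opposite ranking preserves it:
evaluate("pat", ["pat", "pa"], [max_io, no_coda])    # -> "pat"
```

Re-ranking the same two constraints yields two different "languages", which is how, on the strong universalist hypothesis mentioned above, cross-linguistic variation is modeled.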
Semantics
Semantics studies the rule systems that determine expressions' meanings. Within generative grammar, semantics is a species of formal semantics, providing compositional models of how the denotations of sentences are computed on the basis of the meanings of the individual morphemes and their syntactic structure.[33]
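The idea of compositional interpretation can be illustrated with a sketch that is far simpler than the formal systems actually used in the field: denotations are values and functions, and composition is function application guided by the syntax tree. The two-entity model below is invented for illustration:

```python
# A toy model: which entities sleep.
sleeps_facts = {"ann": True, "bob": False}

# Lexical denotations: a proper name denotes an entity; an intransitive
# verb denotes a function from entities to truth values.
den = {
    "Ann":    "ann",
    "Bob":    "bob",
    "sleeps": lambda x: sleeps_facts[x],
}

def interpret(tree):
    """Compute a sentence's denotation bottom-up from its syntax tree."""
    if isinstance(tree, str):                        # a leaf: look it up
        return den[tree]
    subject, predicate = tree
    return interpret(predicate)(interpret(subject))  # function application

interpret(("Ann", "sleeps"))   # the truth value of "Ann sleeps" in this model
interpret(("Bob", "sleeps"))   # the truth value of "Bob sleeps" in this model
```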
Extensions
Music
Generative grammar has been applied to music theory and analysis since the 1980s.[34] One notable approach is Fred Lerdahl and Ray Jackendoff's Generative theory of tonal music, which formalized and extended ideas from Schenkerian analysis.[35]
Biolinguistics
Recent work in generative-inspired biolinguistics has proposed that universal grammar consists solely of syntactic recursion, and that it arose recently in humans as the result of a random genetic mutation.[36] Generative-inspired biolinguistics has not uncovered any particular genes responsible for language. While some prospects were raised at the discovery of the FOXP2 gene,[37][38] there is not enough support for the idea that it is 'the grammar gene' or that it had much to do with the relatively recent emergence of syntactical speech.[39]
History
Analytical models based on semantics and discourse pragmatics were rejected by the Bloomfieldian school of linguistics[40] whose derivatives place the object into the verb phrase, following from Wilhelm Wundt's Völkerpsychologie. Formalisms based on this convention were constructed in the 1950s by Zellig Harris and Charles Hockett. These gave rise to modern generative grammar.[41]
As a distinct research tradition, generative grammar began in the late 1950s with the work of Noam Chomsky.[42] However, its roots include earlier structuralist approaches such as glossematics which themselves had older roots, for instance in the work of the ancient Indian grammarian Pāṇini.[43][44][45] Military funding to generative research was an important factor in its early spread in the 1960s.[46]
The initial version of generative syntax was called transformational grammar. In transformational grammar, rules called transformations mapped a level of representation called deep structure to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, the active sentence "The doctor examined the patient" and its passive counterpart "The patient was examined by the doctor" had the same deep structure. The difference in their surface structures arose from the application of the passivization transformation, which was assumed not to affect meaning. This assumption was challenged in the 1960s by the discovery of examples such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room".[47]
After the Linguistics wars of the late 1960s and early 1970s, Chomsky developed a revised model of syntax called Government and binding theory, which eventually grew into Minimalism. In the aftermath of those disputes, a variety of other generative models of syntax were proposed including relational grammar, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).[48]
Generative phonology originally focused on rewrite rules, in a system commonly known as SPE Phonology after the 1968 book The Sound Pattern of English by Chomsky and Morris Halle. In the 1990s, this approach was largely replaced by Optimality theory, which was able to capture generalizations called conspiracies which needed to be stipulated in SPE phonology.[29]
Semantics emerged as a subfield of generative linguistics during the late 1970s, with the pioneering work of Richard Montague. Montague proposed a system called Montague grammar which consisted of interpretation rules mapping expressions from a bespoke model of syntax to formulas of intensional logic. Subsequent work by Barbara Partee, Irene Heim, Tanya Reinhart, and others showed that the key insights of Montague Grammar could be incorporated into more syntactically plausible systems.[49][50]
See also
- Cognitive linguistics
- Cognitive revolution
- Digital infinity
- Formal grammar
- Functional theories of grammar
- Generative lexicon
- Generative metrics
- Generative principle
- Generative semantics
- Generative systems
- Parsing
- Phrase structure rules
- Syntactic Structures
References
- ↑ Template:Cite Dictionary.com
- ↑ Wasow, Thomas (2003). "Generative Grammar". In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 296, 311. doi:10.1002/9780470756409.ch12.
- ↑ a b Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. p. 5. ISBN 978-0-631-22543-0.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b c Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ a b c Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b c Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Garvin, Paul L. (1954). "Prolegomena to a Theory of Language by Louis Hjelmslev; Francis J. Whitfield". Language 30 (1): 69–96. doi:10.2307/410221. JSTOR 410221.
- ↑ Seuren, Pieter A. M. (1998). Western linguistics: An historical introduction. Wiley-Blackwell. pp. 160–167. ISBN 0-631-20891-7.
- ↑ Newmeyer, Frederick (1986). Linguistic Theory in America. Academic Press. pp. 17–18. ISBN 0-12-517152-8.
- ↑ Koerner, E. F. K. (1978). "Towards a historiography of linguistics". Toward a Historiography of Linguistics: Selected Essays. John Benjamins. pp. 21–54.
- ↑ Bloomfield, Leonard, 1929, 274; cited in Rogers, David, 1987, 88
- ↑ Hockett, Charles, 1987, 41
- ↑ Newmeyer, F. J. (1986). "Has there been a 'Chomskyan revolution' in linguistics?". Language 62 (1): 13.
- ↑ Heitner, Reese (2003). "An Integrated Theory of Linguistic Descriptions [1964]". The Philosophical Forum 34 (3–4): 401–416. doi:10.1111/1467-9191.00147.
- ↑ Sadler, Louisa; Nordlinger, Rachel (2018). "Morphology in Lexical-Functional Grammar and Head-driven Phrase Structure Grammar". In Audring, Jenny; Masini, Francesca (eds.). The Oxford Handbook of Morphological Theory. Oxford University Press. pp. 211–243. doi:10.1093/oxfordhb/9780199668984.013.17. ISBN 978-0-19-966898-4.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
Further reading
- Chomsky, Noam. 1965. Aspects of the theory of syntax. Cambridge, Massachusetts: MIT Press.
- Hurford, J. (1990) Nativist and functional explanations in language acquisition. In I. M. Roca (ed.), Logical Issues in Language Acquisition, 85–136. Foris, Dordrecht.