Chapter 1 Generative Grammar
INTRODUCTION: Generative Grammar (Noam Chomsky) attempts to account for the native speaker's ability to produce language. Generative linguistics shifts the object of study from traditional, external linguistics to the native speaker's knowledge of language (internal): the internal principles responsible for the linguistic data of different languages. Its goal is to develop a Universal Grammar (UG) which provides the tools to describe any natural language. The assumption is that human beings have a genetically endowed language faculty that allows children to develop a grammar of any natural language.
1. METHODOLOGICAL UNDERPINNINGS: Language: an intellectual activity that produces speech by means of mental tools and mental structures. A theory of language has to provide answers to:
o Particular phenomena;
o Learnability;
o Universality.
Linguistics provides theories of Language. Andrew Radford: the theory must be fully explicit, and to be explicit it must be formal. Chomsky's theory of the initial state of human language, or UG, implies that the grammar of any particular external language is an instance of such a grammar. Example:
a) Mary washes herself
b) John washes himself
a') *Herself washes Mary
b') *Himself washes John
Any sentence has: 1) a complex hierarchical structure; 2) a subject that must hierarchically dominate its objects (a syntactic constraint).
The linguist cannot gather a list of all the possible sentences. Some linguistic evidence may serve us to analyse a language: why the native speaker accepts some sentences and rejects others. Independent evidence should be found. Example: a) John wishes that Mary looks at herself in the mirror; b) *Mary wishes that John looks at herself in the mirror.
Here we can conclude that anaphors (herself, himself) need to be coreferential with their closest nominal antecedent.
a) *John wishes that Mary looks at her in the mirror (her = Mary); b) Mary wishes that John looks at her in the mirror (her = Mary).
Here pronouns can corefer with their farthest antecedent, but not with their closest one.
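The two contrasts above can be mimicked with a toy checker. The sketch below is my own illustration, not the chapter's formalism: the gender lexicon and the flat list of clause subjects are invented simplifications. It encodes the generalizations just stated: anaphors require their closest nominal antecedent, pronouns reject it.

```python
GENDER = {"John": "m", "Mary": "f"}
ANAPHORS = {"herself": "f", "himself": "m"}
PRONOUNS = {"her": "f", "him": "m"}

def coreference_ok(subjects, item, antecedent):
    """subjects: clause subjects from the matrix clause to the most embedded one,
    e.g. ["John", "Mary"]; the anaphor/pronoun sits in the most embedded clause."""
    closest = subjects[-1]
    if item in ANAPHORS:
        # Anaphor: must agree in gender and take the *closest* antecedent.
        return ANAPHORS[item] == GENDER[antecedent] and antecedent == closest
    # Pronoun: must agree in gender but must *not* take the closest antecedent.
    return PRONOUNS[item] == GENDER[antecedent] and antecedent != closest

# "John wishes that Mary looks at herself"   (herself = Mary)  -> OK
print(coreference_ok(["John", "Mary"], "herself", "Mary"))   # True
# "*Mary wishes that John looks at herself"  (herself = Mary)  -> ruled out
print(coreference_ok(["Mary", "John"], "herself", "Mary"))   # False
# "Mary wishes that John looks at her"       (her = Mary)      -> OK
print(coreference_ok(["Mary", "John"], "her", "Mary"))       # True
# "*John wishes that Mary looks at her"      (her = Mary)      -> ruled out
print(coreference_ok(["John", "Mary"], "her", "Mary"))       # False
```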
The set of principles of UG considered with respect to the particular evidence of a language constitutes the grammar of that language; but not all principles of UG appear in all grammars (language diversity), through the notion of parameter setting. Languages do not differ with respect to the computational system, but with respect to specific morphological properties included in the lexicon of each particular language. We cannot speak of different systems of rules to explain apparently contradictory data in the variation of languages, but of a set of specific linguistic parameters allowed by UG. Language diversity should be understood in terms of parameters and universal principles. We may speak of a derivational component equal for all languages, in which a language L sets a system of grammar allowed by UG, parameterized for that particular language: the syntax in a broad sense (including phonology). The syntax is assumed to provide three fundamental levels of representation:
- Deep Structure (DS);
- Phonetic Form (PF);
- Logical Form (LF).
Each lexical unit in the lexicon contains a system of features corresponding to its phonetic and syntactic properties, which determine its sound, meaning and syntactic role through more general principles parameterized for each language.
2. COMPETENCE AND LANGUAGE ACQUISITION.
Linguistic competence: the set of rules making up the speaker's knowledge, which allows her to learn and produce language. It explains the ability to render grammaticality judgements with respect to an unlimited number of English sentences, and to produce new strings of the language according to English grammar.
Universal competence: grammatical rules innate across languages.
Particular competence: the specific rules of one language in its environment.
Performance: the counterpart of competence; the set of constraints that limit the use of language; the actual ability to speak a language.
Language acquisition is based:
- On the innate principles of UG;
- On the exposure to a specific language.
With them as a basis, we develop a grammar of a language, the core grammar of that language:
Triggering experience (Language X) --> UG (with parameters) --> Core grammar of Language X
However, we would not expect a linguistic system to be a pure case; rather, all sorts of accidents of history would have contaminated the system.
2.1 Insufficient external stimuli.
The linguist's task is reduced to formalizing the theory of a specific language. To do so, she needs to look both at UG and at the particular grammar of the language, showing how the principles of UG are fixed for each particular parameter. In generative work, English is taken to set the obligatory-subject value of the parameter, whereas Spanish, Italian, etc., set the null-subject value:
Paul looks at her because he loves her. (Paul la mira porque la quiere.)
In English the subject pronoun (he) of the second clause is obligatory, whereas in Spanish the corresponding subject is null.
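A toy illustration (my own sketch, with invented parameter values, not part of the notes) of the idea: the same UG requirement that every clause have a subject position combines with a language-particular parameter value determining whether that subject may be null.

```python
# Assumed pro-drop settings for illustration only.
NULL_SUBJECT = {"English": False, "Spanish": True, "Italian": True}

def clause_ok(language, overt_subject):
    """A finite clause needs an overt subject unless the language licenses a null one."""
    return overt_subject or NULL_SUBJECT[language]

print(clause_ok("English", overt_subject=False))  # False: *"looks at her because loves her"
print(clause_ok("Spanish", overt_subject=False))  # True : "la mira porque la quiere"
```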
2.2 UG and parameters
The linguist also needs to explain how each language is acquired. The child is never exposed to all possible sentences of her language, but to those crucial examples which help her fix her own language's parameters. The linguist's task is to identify those crucial examples, which actually trigger language acquisition:
I like the boy who/that I see in that picture.
*I like the boy speaks Japanese.
(The relative pronoun cannot be omitted in subject relative clauses, regardless of the human antecedent.)
It is to be assumed that the child will only hear grammatical sentences, being always exposed to positive evidence. Therefore, the problem of language acquisition is reduced to fixing the parameters allowed by UG. In conclusion, it is proposed that UG is an innate, unconscious knowledge of grammar present at birth; children have the ability to take in these rules and apply them; and the child is able to take on whichever language she is exposed to, since all languages have common elements and are intertranslatable.
SUMMARY: If we want to provide our theory of language with explanatory adequacy, we need to assume UG. By adopting linguistic competence as the core grammar, language acquisition becomes the task of learning those marked options that characterize a particular parameter.
1) We cannot count on all possible data of a particular language to formalize a grammar. Besides, if we only pay attention to linguistic performance, we will sometimes obtain contradictory data caused by other factors such as slips of the tongue, insecurity, nervousness, etc.;
2) Linguistic competence is unlimited and in principle we may learn any particular language;
3) The speaker may produce new sentences, never heard before, and be sure that such sentences are grammatical in the grammar of his language.
3. LEXICAL AND FUNCTIONAL CATEGORIES. SUBCATEGORIZATION AND CONSTITUENT STRUCTURE.
Lexicon: the basis for all grammatical relations:
- It satisfies external constraints of the interface relation with other levels of grammatical representation;
- Constituent structure and subcategorization give an account of the organization of syntactic categories and of the structure of the semantic universe.
English lexicon: the lexical categories are N, V, A, Adv and P; we classify them by means of their features. Functional categories: e.g. conjunctions.
3.1 Syntactic and morphological evidence.
A word belongs to a grammatical category because it shares certain properties with other words, behaving in a similar way in the grammar:
N: share the plural form (suffix -s); can be identified by the definite article;
V: can bear morphological features when finite (progressive with the -ing suffix; to + V as a non-finite form).
Adj: the comparative with the -er suffix; can be modified by the adverb very.
Adv: some end in the suffix -ly.
Prepositions: can be intensified by right, straight, etc.
Although the main focus of generative grammar was initially on transformational rules (operating on complete sentences), during the past 25 years the lexicon has taken on an increasingly important role, since many phenomena seem to be better explained by rules associated with a lexical entry or a set of lexical entries.
3.2 Lexical categories.
The relevant morphological evidence for lexical categories comes from:
> Inflectional properties: some words may take or end in the same suffix; the two major categories of words with these distinctive properties are:
- N: number (sing/pl), in contrast with
- V: tense (present/past), defective inflection for person (-s in the present).
> Derivational properties: relate to the process by which a word can be used to form a different kind of word by adding another morpheme.
The other piece of evidence for specific categories is syntactic distribution:
> A precede N;
> Only V may follow a modal V.
We can differentiate a finite present V from a non-finite V taking the suffix -ing by virtue of the fact that only the latter may be combined with the auxiliary be to form the progressive. If UG is responsible for words belonging to one particular category, irregular forms present a problem. In fact, the English child overregularizes stems and their regular forms (went, wented, wenting). All these irregular forms will occur in the child's lexicon as marked lexical entries.
3.3 Constituent structure and subcategorization
In the first studies of the lexicon only four categories (N, V, A, P) could project a particular structure, that is, have different projection levels in terms of which a Specifier position (Spec) and a Complement position (Comp) could be distinguished. Some years later, the constituent structure proposed for the lexical categories was extended to functional categories.
A complement (C) is a phrase that a lexical category takes or selects. Some verbs, for example, take specific kinds of complements:
a) Died cannot have any complement;
b) Relied (+ on) must have a Prepositional Phrase (PP) as a complement;
c) Dismembered + NP;
d) Talked + optional PP complement with to and/or optional PP complement with about as the head.
These complement-selection requirements can be represented in subcategorization frames:
die, V, [_]
rely, V, [_ PP[on]]
dismember, V, [_ NP]
talk, V, [_ (PP[to]) (PP[about])]
_ : the position of the lexical head.
[ ]: delimit the phrase.
( ): optional complement.
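As an illustration, these frames can be encoded as a small lookup table. The sketch below is my own encoding (the tuple format and the checker are invented for the example, not part of the notes): each verb is paired with the complements it selects, and a simple function checks whether a given set of complements satisfies the frame.

```python
# Each frame entry: (category, selected head or None, obligatory?)
SUBCAT = {
    "die":       [],                                             # die, V, [_]
    "rely":      [("PP", "on", True)],                           # rely, V, [_ PP[on]]
    "dismember": [("NP", None, True)],                           # dismember, V, [_ NP]
    "talk":      [("PP", "to", False), ("PP", "about", False)],  # optional PPs
}

def frame_satisfied(verb, complements):
    """complements: list of (category, head) pairs actually present in the VP."""
    remaining = list(complements)
    for cat, head, obligatory in SUBCAT[verb]:
        match = next((c for c in remaining
                      if c[0] == cat and (head is None or c[1] == head)), None)
        if match:
            remaining.remove(match)
        elif obligatory:
            return False           # a required complement is missing
    return not remaining           # no unselected complements left over

print(frame_satisfied("die", []))               # True : "died"
print(frame_satisfied("rely", []))              # False: *"relied"
print(frame_satisfied("rely", [("PP", "on")]))  # True : "relied on her"
print(frame_satisfied("talk", [("PP", "to")]))  # True : "talked to Mary"
print(frame_satisfied("die", [("NP", None)]))   # False: *"died the book"
```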
If a complement with a particular head is subcategorized for, that head is listed as a feature on the complement (rely, talk). Adjectives, nouns and prepositions also subcategorize for their complements. AP examples:
- Blue (*blue that Mary will wear); - Afraid (of); - Orthogonal (to); - Ambivalent (to Peter) (about his feelings); - Certain (that); - Insistent (to his husband) (that they'll leave).
NP examples:
- Team (of students); - Individual; - Book (about/to); - Generosity (to); - Dislike (of); - Ambivalence (to P.) (about his feelings); - Rumor (that he'll be sacked); - Message (to) (about his).
PP examples:
- About [the meeting]; - Before [he resigned]; - From [every part of the country]; - [hurry] up.
We can generalize that the lex.cat. (N,V, A, P) in English: > Subcategorize for their Cms; > Precede their Cms in the phrase; > Co-occur with other constituents.
Finally, heads and complements are not the only parts of phrases: NPs can be preceded by words or even phrases like the, no, some, every, Mary's, my father's; APs, by degree words: very, extremely, rather, quite. These items differ from complements because they precede the lexical category and are not subcategorized for. They are specifiers.
3.4 Functional categories.
Chomsky puts forward a similar constituent structure for functional categories such as I (Inflection) and C (Complementizer), where the Inflectional Phrase (IP) replaces S. In this system, the more usual notation is XP, X' and X instead of X2, X1 and X0:
C: complementizers like that, for, if; its Specifier is the landing site for wh-movement;
I: the category of verbal inflection and, in English, of modal auxiliaries;
IP: its Specifier hosts the subject of the sentence.
Interestingly, the functional projection I subcategorizes for the lexical categories. So it is through subcategorization in the lexicon that functional categories relate to lexical categories, and this will be projected in the syntax.
Finally, Chomsky's proposal of functional categories such as CP and IP has been followed by many other minor categories and inflectional elements. Abney, for example, proposes that determiners and degree elements (D and Deg) are also heads in X-bar theory. Pollock, on the basis of languages like French (where V is inflected for tense and agreement features), proposes that the elements under I, Tense (TP) and Agreement (AgrP), are realized as separate functional categories, each heading their own projection. The hypothesis is that T subcategorizes for AgrP as its complement and Agr for VP, in this order.
Other proposed functional heads for the verbal system:
- Tenny: Asp = aspectual markers of auxiliaries;
- Pollock: Neg or Pol = negative and affirmative markers;
- Ouhalla: Mod = modal auxiliary; Voice or Pass = passive morphology;
- Chomsky: Agr-S = subject agreement; Agr-O = object agreement.
For the nominal system, Hale & Keyser propose a head for Case, K(ase), above D. Others like Ritter put forward a functional category NumP above NP. Although these new heads seem a radical innovation, the existence of functional categories has always been recognized. Different names for the distinction between lexical and functional heads are:
- Open-class items / closed-class items
- Content words / function words
- Lexical formatives / grammatical formatives
- Major categories / minor categories
Almost all the functional heads proposed correspond to traditional grammar. Each major part of speech (N, V, A) has its own characteristic grammatical categories:
PART OF SPEECH   GRAMMATICAL CATEGORIES
N                case, definiteness, number, gender
V                mood, tense, polarity, aspect, voice
A                degree
What is new is the way they project syntactically. In the case of NP, the way it is analyzed has changed:
> Classical analysis of NP: the determiner isn't the head of NP, but its Spec (sister of N).
> Modern analysis of NP: the determiner is the head, taking NP as its complement.
The same difference is seen in the analysis of degree elements, auxiliaries, negation, and complementizers. Functional heads aren't therefore an exclusive peculiarity of the generative view of syntactic structure, but a matter of general linguistic concern.
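The contrast between the two analyses of the NP can be made concrete with a toy tree representation. The nested-tuple notation and the bracketing helper below are my own illustration (not from the notes), using "the book" as the example:

```python
# Classical analysis: the determiner sits inside NP as a sister of N.
classical = ("NP", ("Det", "the"), ("N", "book"))
# DP-hypothesis (Abney): the determiner is a head taking NP as its complement.
dp_style  = ("DP", ("D", "the"), ("NP", ("N", "book")))

def bracketed(node):
    """Render a tree as labelled brackets, e.g. [DP [D the] [NP [N book]]]."""
    label, *children = node
    parts = [bracketed(c) if isinstance(c, tuple) else c for c in children]
    return f"[{label} {' '.join(parts)}]"

print(bracketed(classical))  # [NP [Det the] [N book]]
print(bracketed(dp_style))   # [DP [D the] [NP [N book]]]
```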
3.4.1 English auxiliaries and infinitival to.
In traditional grammar, auxiliaries behave in a different manner than other verbs, so they belong to a different category. They typically take a V expression as complement and have the semantic function of marking grammatical properties associated with the relevant V: tense, aspect, voice, mood, modality. Following Radford's description of functional categories, examples of auxiliaries (with the V complement in brackets) are:
> Have = traditionally considered a perfective auxiliary: He has/had [gone]
> Be = an imperfective/progressive auxiliary: She is/was [staying home]
> Do = a dummy auxiliary: He really does/did
> The modal auxiliaries: You can/could [help]; I shall/should [say a lot]
The main difference between auxiliaries and main verbs is that auxiliaries undergo certain syntactic processes:
- Inversion in questions;
- They can be directly negated by a following not (n't);
- They appear in sentence-final tags;
- They must appear in short answers.
Main verbs do not undergo these processes.
4. THE SYNTAX OF CLAUSES.
a. Syntactic analysis of phrases and larger clauses, extending X-bar theory to them;
b. The internal structure of each constituent in the clause and all possible word orders in declarative, negative, and interrogative sentences;
c. Analysis of passives;
d. Reformulation of the notions subject, object and indirect object in a more refined syntactic analysis;
e. A data-motivated, stepwise introduction to the main tenets of the Government and Binding (GB) theory (Chomsky).
INTERPRETABLE AND UNINTERPRETABLE FEATURES.
According to minimalism, the phonetic, grammatical and semantic properties of words can be described in terms of sets of features. The Minimalist Program puts forward a grammar with only two levels of representation: Phonetic Form (PF) and Logical Form (LF), each of which contains only phonetically or semantically interpretable features respectively, as required by the UG constraint of Full Interpretation. Other grammatical features are uninterpretable (they can't be read off by the PF or LF components), being exclusively responsible for certain syntactic operations proper to the CS (computational system). But how do they enter the CS?
1) Lexical items (with their phonetic, semantic, and grammatical features) are selected from the lexicon;
2) Constituents are combined together in a pairwise fashion to form a phrase-structure tree by the process of merger;
3) After Spell-Out, PF and LF operations are responsible for the processing of phonetic and semantic features. All uninterpretable features are invisible by now, for they have been checked properly in the syntactic derivation.
The resulting model is:
Phrase formation --> Spell-Out --> PF operations (phonetically interpretable features) / LF operations (semantically interpretable features).
Grammatical features include:
A. Number (sg/pl): plays an obvious role in the syntax of agreement;
B. Gender: in the syntax of reflexive anaphors;
C. Person: in the syntax of subject-verb agreement;
D. Features which determine the morphological form of items:
a. the case features of pronouns;
b. the inflectional features of V.
They do not include features which have no morphological or syntactic correlate.
Examples: words such as calf, foal, kitten, etc. share the feature [young], but it is a purely semantic feature which doesn't play any role in grammar, so it is not a grammatical feature. Chomsky: interpretable features are relevant for LF and include categorial features [+V, +P, +A] and the nominal phi-features (+fem, +sing); they aren't deleted after being checked. Uninterpretable features are deleted and involve the Case features of NPs and V and the phi-features of V. Reasons for this distinction:
- Some features remain visible after checking, so they cannot have been deleted.
- Why are the V- and N-features of early Minimalism replaced by categorial, Case and phi-features? Because the Extended Projection Principle effects (clauses must have structural subjects) are accounted for by a strong D-feature in I (Chomsky): with the introduction of bare phrase structure, Minimalist trees don't automatically project a Specifier position, and the Specifier of IP must somehow be made present.
- If Radford's principle of Full Interpretation is correct, and PF and LF representations contain only phonetically or semantically interpretable features, then uninterpretable features must be eliminated in the derivation for it to converge at LF (by means of checking).
CHECKING OTHER GRAMMATICAL FEATURES.
The Spec-Head relation is the primary means of expressing checking relations in the MP.
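A highly simplified sketch of the checking idea just described, assuming a Spec-Head style relation between an inflectional head and a subject DP; the class names, feature sets and functions are my own invention for illustration, not the MP machinery itself:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    interpretable: set = field(default_factory=set)
    uninterpretable: set = field(default_factory=set)

def check(probe, goal):
    """Spec-Head style checking: uninterpretable features on the probe that the
    goal carries as interpretable features are deleted."""
    probe.uninterpretable -= goal.interpretable

def converges(items):
    """The derivation converges (Full Interpretation) only if no
    uninterpretable feature survives at the interfaces."""
    return all(not it.uninterpretable for it in items)

T  = Item("T",  uninterpretable={"person", "number"})   # verbal agreement features
DP = Item("DP", interpretable={"person", "number"})     # phi-features of the subject

check(T, DP)
print(converges([T, DP]))   # True: the agreement features were checked and deleted
```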
Abney's DP-hypothesis for nominals: in the MP, NPs are part of a DP structure when they are internal arguments.
a. *This books
b. These books
Here the grammatical (number) feature of N can be checked against a functional category expressing agreement.
MP: the operation Move (later renamed Attract by Chomsky) must meet certain economy conditions.
Minimal Link Condition (MLC, Rizzi's Relativized Minimality): the MLC forces the shortest move and is inviolable; given several potential landing sites, the closest must be chosen or else the derivation crashes.
Chomsky's formulation of the operation Attract/Move: K attracts α only if there is no β, β closer to K than α, such that K attracts β. Here, K is the target and α is the Case feature of N.
Chomsky: the functional projection for checking the Nominative Case feature is the Tense Phrase (TP), the old IP.
BARE PHRASE STRUCTURE.
This theory starts from several assumptions: categories are elementary constructions built from the properties of lexical items, satisfying the Inclusiveness Condition. In bare phrase structure there are no bar levels and no distinction between lexical items and the heads projected from them, so an item can be X or XP.
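Returning to the Minimal Link Condition and the Attract formulation stated above, a schematic sketch (my own simplification: positions are reduced to integer distances from the target K, and the candidate names and features are invented) of "attract only the closest constituent bearing the relevant feature":

```python
def attract(target_feature, candidates):
    """candidates: list of (name, distance_from_K, features).
    K may only attract the closest candidate bearing target_feature;
    attracting a farther one would violate the MLC (shortest move)."""
    matching = [c for c in candidates if target_feature in c[2]]
    if not matching:
        return None
    return min(matching, key=lambda c: c[1])[0]

# Two DPs both bearing a Case feature: only the higher (closer) one may be attracted.
candidates = [("DP_subject", 1, {"Case"}), ("DP_object", 2, {"Case"})]
print(attract("Case", candidates))   # 'DP_subject' -- the shortest move wins
```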