Linguistics 522

Lecture 2

Structure-dependence turns out to be a very important notion.

We linked it to two ideas.

These correspond to the two things Carnie talks about at the outset of Chapter 2. We take up phrasehood first. Does everybody find this absolutely uncontroversial? Sometimes in traditional grammar there's a notion of a verb group. The above proposal misses this completely. In general there will be a number of plausible candidates that intuitions won't decide among. And as you will see, linguists will disagree. In general, we will need arguments for phrasehood; intuitions won't do.

So phrasehood is the first component of what Carnie means by structure in structure-dependence. The other component is categoriality: we have exactly the same phrases as before, but now they are labeled with category labels like NP, M, VP, AP, PP. As an alternative and completely equivalent representation of the tree, we have the labeled bracketing in (5)

In fact this is exactly the labeled bracketing that was used to generate the above tree in the tree-drawing web-site.
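The equivalence between trees and labeled bracketings can be made concrete with a short sketch. This is not the website's actual code; the parser and the sample bracketing below are illustrative assumptions:

```python
# Minimal sketch: parse a labeled bracketing like "[NP [D the] [N man]]"
# into a nested (label, children) structure -- the same information a tree
# diagram displays. The bracketing string is a made-up example.

def tokenize(s):
    """Split a bracketing string into '[', ']', labels, and words."""
    return s.replace("[", " [ ").replace("]", " ] ").split()

def parse(tokens):
    """Parse one bracketed constituent from the front of the token list."""
    tok = tokens.pop(0)
    assert tok == "[", "a constituent must open with '['"
    label = tokens.pop(0)            # category label: NP, D, N, ...
    children = []
    while tokens[0] != "]":
        if tokens[0] == "[":
            children.append(parse(tokens))   # recurse into a sub-phrase
        else:
            children.append(tokens.pop(0))   # a terminal word
    tokens.pop(0)                    # consume the closing "]"
    return (label, children)

tree = parse(tokenize("[NP [D the] [N man] [PP [P with] [NP [D two] [N Cadillacs]]]]"))
print(tree[0])   # label of the whole phrase: NP
```

Because the nested structure and the bracketing string carry exactly the same information, either one can be generated from the other, which is why the tree-drawing site can take a bracketing as input.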

The trees we'll be using, then, embody two notions: constituency (phrasehood) and categoriality. There's a third notion which isn't really made explicit in Chapter 2 but which will become important later: headedness. Each of the phrases has a word that is its head. We'll argue for this in detail later, but it's the head of the phrase that determines many of its properties. For instance, the man with two Cadillacs has man as its head.

Here is some evidence:

For NPs, the head determines its number agreement properties and what kind of entity the NP as a whole describes. The man with two Cadillacs has man as its head and refers to a man, not a Cadillac.
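This head-driven behavior can be sketched in code: the number of a hypothetical NP structure is read off its head noun, not off a noun buried inside a modifying PP. The representation and the crude plural test below are illustrative assumptions, not a serious morphological analysis:

```python
# Sketch: the number of an NP is determined by its head noun, not by
# nouns inside modifiers. "The man with two Cadillacs" has head "man"
# (singular), even though "Cadillacs" inside the PP is plural.
# The (label, contents) representation is an illustrative assumption.

np = ("NP", [
    ("D", "the"),
    ("N", "man"),            # the head of the whole NP
    ("PP", [("P", "with"),
            ("NP", [("D", "two"), ("N", "Cadillacs")])]),
])

def head_noun(np_node):
    """Return the N daughter of the NP itself, skipping nouns inside modifiers."""
    label, children = np_node
    for child in children:
        if child[0] == "N":
            return child[1]
    raise ValueError("NP has no head noun")

def number(np_node):
    """Crude guess for illustration: plural iff the head ends in -s."""
    return "plural" if head_noun(np_node).endswith("s") else "singular"

print(head_noun(np))   # man
print(number(np))      # singular -- so "The man with two Cadillacs IS here"
```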

So we move on to arguments for the kind of structure we're assuming.

Determining Parts of Speech

  1. Noun: table, destruction, family, theory ...
  2. Adjective: green, utter, delicious, syntactic
  3. Adverb: cleverly, very, quite
  4. Verb: walk, sleep, criticize...
  5. Modal: may, could, can, will...
  6. Preposition: of, about, on, on top of, with, to ...
  7. Determiner: the, every, few, many, more, seven, both, only, my?...

Word-level Evidence

Word-level phonological evidence for lexical categories:

This argument needs to be adjusted slightly for American speakers.

Semantics is sensitive to category. Ambiguity of phrases:

Notice headedness is changing as well as category.

Morphology is sensitive to category. Verb inflection

These inflections go on verbs, not adjectives, not nouns. Criticisms?

Modals do inflect for tense:

Modals don't inflect completely like verbs. Some modals lack past tense forms:

What do we SAY here when we mean the past tense form of must? Modals never have an -ing form or participle form, and never have an agreeing form inflected in -s.

Adjectives and adverbs take -er. (16),(17), p. 59.

Nouns take plural -s.

Prepositions take NO inflection. [some languages DO have inflecting prepositions.]

Determiners have no single defining morphological characteristic. Hmmm. Worry about this.

Why no consistent inflection? Do any determiners take any inflection at all? Well maybe.

What about my, his, her? Maybe these are possessive forms? If so, they would be possessive forms of I, he, she. But these are not determiners, so they're not what we're looking for now: morphological processes that apply to determiners. And in the end, we're not even going to be sure that this process produces determiners, because we're going to have some questions about whether possessives should be thought of as determiners.

Here's an example to think about. More is sometimes treated as the comparative of many. This is too complicated to argue for now, but the intuition rests on analogous pairs like these:

  1. John bought a cheap book.
  2. John bought a cheaper book (than Mary).
  3. The lecture attracted many students.
  4. The lecture attracted more students (than teachers).

In any case this lack of morphology for determiners seems to be a peculiarity of English. In many languages, determiners inflect quite freely for number and gender. Spanish, for example. Examples?

All of this was INFLECTIONAL morphology. There is another kind of morphology called DERIVATIONAL.

Digression: What is the difference between derivational and inflectional morphology? Some diagnostics for inflectional affixes

We can also find derivational morphological evidence.

Summarizing the evidence thus far:

At the same time we've been developing a set of diagnostics, or tests, for each category. Thus, taking the affixes -ness and -ly is a good test for adjectivehood.

A very important kind of evidence for categories is distributional evidence.

For a distributional argument for category X we need a context in which only things of category X can occur. For example, is this a good distributional context for arguing for adjectives? No, adverbs go here too.

For adjectives we need something more like:

For adverbs, how about:

For prepositions:

Or:

Phrases: Arguing for constituency

Thus far structural evidence for parts of speech. Moving on to phrases.

Remember there are two kinds of information in representations of structure (trees), phrasal information, what words make meaningful units, and category information, what the categories of phrases are. So we'll be looking for evidence of both kinds, evidence that the things we hypothesize to be phrases are actual linguistic units, and evidence that they have the categories we say they have.

We're going to call phrases constituents, because we generally consider them in a syntactic context, so they are constituents of the sentences they occur in.

Morphological evidence. (parallels the word-level morphological argument). The possessive affix: 's.

Observation 1: The possessive affix attaches at the end of the entire phrase, not onto the head.

This is in contrast to what we saw before: inflectional and derivational morphemes attaching onto words. What seems to be going on is that the affix has a particular structurally defined position it goes onto. This is so different from other affixes that people have argued that it should be treated as a different kind of thing altogether. Sometimes the English possessive is called a clitic. Rather than attaching to particular kinds of words like other affixes, clitics attach to phrases or have structurally defined positions. There's a discussion of a Serbo-Croatian clitic near the end of Chapter 1.

Observation 2: The possessive only attaches to NPs. We have something that attaches to phrases of a specific category, namely NP, so we have morphological evidence for phrase-level categories, just as we had morphological evidence for word-level categories. At the same time, we have evidence for the phrase boundary, because the possessive affix pays attention to where the noun phrase ends.
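The contrast between a word-level affix and the phrase-level possessive can be sketched: the clitic lands on whatever word happens to end the NP, not on the head noun. The flat word-list representation is an illustrative assumption:

```python
# Sketch: the possessive 's attaches at the right edge of the whole NP,
# not onto the head noun. The flat word list is an illustrative assumption.

def add_possessive(np_words):
    """Attach 's to the phrase-final word, wherever the NP happens to end."""
    return np_words[:-1] + [np_words[-1] + "'s"]

# The head noun is "man", but 's lands on "England", the phrase-final word:
print(" ".join(add_possessive(["the", "man", "from", "England"])))
# the man from England's
```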

Semantic evidence (parallels the word-level semantic argument)

This sentence is ambiguous: it has two readings. Ambiguities are very important in syntactic argumentation. Why? Just as there are different sources for acceptability, so there are different sources for ambiguity. An ambiguity means a sentence has two or more ways of being interpreted. But the rules of the language are supposed to account for meanings. Where does that difference come from? We assume that the rules can provide meaning differences in two ways: different words are chosen, or different structures. These two examples represent different kinds of ambiguity.

The first ambiguity hinges on the meaning of the word flight. This is a lexical ambiguity.

The second case is different. No word changes meaning between the two readings. It's a question of what the relationships among the lexical meanings are. This is a syntactic or structural ambiguity. In one case flying to Paris modifies we (we are doing the flying); in the other (slightly silly) reading, it modifies the Eiffel Tower (the Eiffel Tower is doing the flying). [The question of how we would represent this difference in trees is a complicated one we set aside for now.]

Next we look at semantic evidence based on a structural ambiguity.

The structures we will assume:

  1. not-possible reading: [The president] [could not] [ratify the treaty].
  2. possible-not reading: [The president] [could] [not ratify the treaty].

Now assume an adverb goes before the VP constituent. Then these two different structures would predict different adverb placements.

  1. [The president] [could not] simply [ratify the treaty].
  2. [The president] [could] simply [not ratify the treaty].
And this works. (1) has only the not-possible reading; (2) has only the possible-not reading.
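The prediction can be made concrete with a small sketch: given either bracketing, inserting the adverb immediately before the VP constituent yields exactly the two word orders above. The flat list-of-constituents representation is an illustrative assumption:

```python
# Sketch: the two structures differ in which words form the VP.
# Placing an adverb right before the VP then yields different strings.
# The (label, words) list representation is an illustrative assumption.

# not-possible reading: "not" groups with the modal, outside the VP.
not_possible = [("NP", ["the", "president"]),
                ("M",  ["could", "not"]),
                ("VP", ["ratify", "the", "treaty"])]

# possible-not reading: "not" is part of the VP.
possible_not = [("NP", ["the", "president"]),
                ("M",  ["could"]),
                ("VP", ["not", "ratify", "the", "treaty"])]

def insert_adverb(sentence, adverb):
    """Insert the adverb immediately before the VP constituent."""
    out = []
    for label, words in sentence:
        if label == "VP":
            out.append(adverb)   # adverb goes right before the VP
        out.extend(words)
    return " ".join(out)

print(insert_adverb(not_possible, "simply"))
# the president could not simply ratify the treaty
print(insert_adverb(possible_not, "simply"))
# the president could simply not ratify the treaty
```

The same two structures, with no extra stipulation, place the adverb on opposite sides of not: this is the sense in which the structural account of the ambiguity is independently motivated.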

Form of argument:

  1. A structural distinction posited which accounts for one phenomenon. (the ambiguity)
  2. The same structural difference is independently motivated. That is, the account in (1) is not ad hoc. Other phenomena are accounted for by the same structural distinction.

Further independent evidence:

  1. What the president could not do is ratify the treaty.
  2. What the president could do is not ratify the treaty.
These syntactic variations on the first sentence are called pseudo-clefts. (1) has only the not-possible reading; (2) has only the possible-not reading.

Contraction facts

  1. [The president] [couldn't] [ratify the treaty].
  2. * [The president] [could] [n't ratify the treaty].

Distributional Evidence for Phrases


Call this second sentence a case of pre-posing.

Lots of elements prepose:

Some things do not.

Hypothesis: only a whole phrase (and not just PART of a phrase) can be preposed.

Things which are not constituents do not prepose

Why do we say that strings like up her mother in these sentences are not constituents? One: the word-order flexibility of up.

Two: When the particle precedes the NP, the particle and verb must be adjacent, in contrast to prepositions and verbs:

This suggests the verb and particle form a constituent in these phrasal verbs. If the verb and particle form a constituent, then the particle and NP do not. Is the argument clear here? The particle can't belong to BOTH constituents.

Preposing Constraint


Movement Constraint

Next constraint: Sentence fragments must be phrasal constituents:

We started out looking for arguments for phrasal constituents, but really two notions are emerging as important:

An argument for categoriality. The other kind of information in our trees is category. To motivate syntactic category as part of our notion of structure, we need some linguistic phenomena that are sensitive to category.

Two kinds of adverbs:

Why is this not an S position?

Why is this not a VP position?

Why are these positions both?

Can you articulate a property that these positions share that makes them ambiguous? Extend this. Why not:

More arguments for constituency

Coordination Constraint:

Shared Constituent Constraint:

Problems with Coordination. As we noted at the beginning of the chapter, not all the diagnostics we come up with are going to converge on the same result. Here's a case in point: What is being coordinated here? Do these strings form a constituent? What do the other tests say? In general coordination is a less reliable test.

Anaphora test

The man who wrote the book on Transformational Grammar was greeted at Kennedy Airport today by massive crowds, cheers, and fainting. He is universally adored.

He refers to the same individual as the entire Noun Phrase:

We call this the antecedent of the pronoun.

Pronoun takes entire NP as antecedent and syntactically behaves like entire NP.

It's really a pro-NP not a pro-noun.

Pro-VP (so, as):


Pro PP (there):

But what about this?

Refined hypothesis: The pronominal form there has the distribution of a PP, but its antecedent can be anything that denotes a place, either a PP or an NP.

General pronominalization Claim:


See examples (100) and (101) in the text, pp. 82-83.

Another process that takes all of a phrase, not part of it, limited in this case to VPs. We have a diagnostic for VPs.

Word versus phrases

Sentences of the following kind: We argue now that the italicized words are phrases as well as being words. Cars is an NP. Useful is an AdjP. So there are single-word phrases. The first idea is that all the processes that we claimed applied to FULL phrases apply to these single-word strings.

We noted that conjunction is a somewhat unreliable test. But conjunction does seem to require parallelism. That is, the things conjoined need to be alike. Notice that in our example a phrase is coordinated with the single word intolerant. Examples illustrating the need for like things in coordination:

The second idea is that wherever we have full phrases, these one-word strings can occur too.

Thus these one-word strings have the same distribution as phrases. In other words, these one-word strings are passing the distributional test for phrasehood.