The Theory behind the Cascadia Syntax Graphs

1. Cascadia Syntax Graphs

1.1 Head-driven Phrase Structure Grammar

The linguistic framework implemented by the Cascadia Syntax Graphs[1] is Head-driven Phrase Structure Grammar (HPSG). HPSG is a generative, non-transformational framework in the tradition of “Constraint-Based Lexicalism.” This phrase functions, along with “non-transformational,” as a sort of technical shorthand for several of the key tenets of the theory, each of which deserves some explanation of its own below.

1.1.1 Constraint Based Syntax

Head-driven Phrase Structure Grammar is one of a number of constraint-based frameworks that have been developed over the past thirty-five years. The other most well-known of these are Lexical-Functional Grammar and Generalized Phrase Structure Grammar, from which HPSG is derived.[2] Constraint-based syntactic theory stands in direct opposition to the transformational frameworks developed by Noam Chomsky and his adherents.[3]

This is true in two respects. First of all, whereas transformational linguistic frameworks use syntactic derivation to move from hypothesized abstract syntactic structures to surface forms, Head-driven Phrase Structure Grammar focuses entirely on the surface forms and structure of the language – or, as Sag, Wasow, and Bender rather pointedly put it, “There are no operations that destructively modify any representations.”[4] The grammatical phenomena which originally triggered the use of transformations, movement, and syntactic derivation are dealt with through the interaction between theoretical principles, grammar rules, and the individual lexical entries for the words of a given clause.[5] Once a given grammatical structure for a clause is generated, there is no further change: “[P]hrase structures are not rearranged, trimmed, or otherwise modified via transformational rules.”[6]

But constraint-based syntax also stands in opposition to mainstream generative grammar in the way it structures grammatical information. Andrew Radford defines the Autonomous Syntax Principle as follows: “No syntactic rule can make reference to pragmatic, phonological, or semantic information.”[7] Whereas in mainstream generative grammar syntax is an autonomous grammatical module whose rules function independently of all other linguistic information, HPSG and other constraint-based frameworks begin with the fundamental concept that different aspects of linguistic meaning constrain each other: phonology, morphology, information structure, and semantics all influence and constrain syntax. Indeed, one might even go so far as to say that syntax is the cumulative result of the interface of these components of linguistic structure and meaning.

1.1.2 Lexicalism

The richness of HPSG’s lexicon must be viewed as one of the key features of the framework as well as one of its major strengths. For a framework to be considered lexicalist, it must adhere to the Lexicalist Hypothesis, also called the Lexical Integrity Principle, stated in example (1).

(1) “No syntactic rule can refer to elements of morphological structure.”[8]

What this means is that syntax and syntactic rules can only work with the grammatical-semantic information provided by a given inflectional form and not the processes which caused that form. Morphological structure is not influenced by syntactic structure.[9] This principle makes the lexicon the critical and driving component of the framework, where lexical rules rather than transformations dictate much of syntactic structure.
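The division of labor described above can be sketched in code. The following is a minimal illustration, not Cascadia’s implementation: the rule name, the dict representation, and the example entries are all hypothetical. A lexical rule builds the inflected form and its grammatical features before syntax ever sees it; syntax consults only the finished output.

```python
# A minimal sketch of the Lexicalist Hypothesis: morphology happens inside
# the lexical rule, and syntax only sees the resulting feature bundle.
# All names and the dict-based representation are illustrative assumptions.

def third_singular_rule(lexeme):
    """Lexical rule: base verb lexeme -> 3rd-singular present word entry."""
    return {
        "form": lexeme["stem"] + "s",          # morphological process, internal
        "pos": "verb",
        "agr": {"person": 3, "number": "sg"},  # exposed as grammatical features
        "valence": lexeme["valence"],
    }

like = {"stem": "like", "valence": ["NP", "NP"]}
likes = third_singular_rule(like)

# Syntax can consult the features of the finished word...
assert likes["agr"] == {"person": 3, "number": "sg"}
# ...but no syntactic rule refers to how "likes" was built from "like" + "s".
```

The point of the sketch is the one-way flow: the rule’s internal concatenation is invisible to anything downstream, which only reads the output features.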

1.1.3 Head-driven

As noted above, there are several Constraint-based Lexicalist frameworks for syntax. What, then, makes HPSG unique among them? The words “head-driven” reflect HPSG’s approach to phrase structure grammar, in which complex categories and feature structures play a central role.[10] Previous phrase structure grammars were what we will call simply context-free grammars (CFGs), in which the vast majority of morpho-syntactic information played little or no role in the formation of syntactic structure. CFGs can go a long way in analyzing natural language, but they have their limits. Thus, syntactic rules such as those in example (2) below would correctly analyze the sentence in figure 1.

(2) S –> NP VP
VP –> V NP

Fig. 1
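To make the CFG idea concrete, here is a toy recognizer for the rules in example (2). The lexicon, the sentence (borrowed from figure 2), and the added NP → Det N rule are assumptions for illustration only; they are not Cascadia’s grammar.

```python
# A toy context-free recognizer for rules like those in example (2).
# Lexicon, sentence, and the extra NP -> Det N rule are illustrative.

LEXICON = {"Alex": "NP", "likes": "V", "the": "Det", "opera": "N"}
RULES = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["Det", "N"]],   # added so "the opera" can form an NP
}

def parse(cat, words):
    """Return True if `words` can be analyzed as category `cat`."""
    if len(words) == 1 and LEXICON.get(words[0]) == cat:
        return True
    for rhs in RULES.get(cat, []):
        # try every split point between the two daughters
        for i in range(1, len(words)):
            if parse(rhs[0], words[:i]) and parse(rhs[1], words[i:]):
                return True
    return False

print(parse("S", "Alex likes the opera".split()))  # True
```

Notice that the recognizer manipulates nothing but bare category labels; that poverty of information is exactly the limitation discussed next.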


But there is a fundamental problem here: nothing about the framework itself can tell us anything about what a linguistically natural syntactic rule is. Because context-free grammar is exactly that – context free – there is nothing within the theory that would prevent rules such as those in example (3).

(3) NP –> P P V
VP –> Det N

Yet we know implicitly that verbs do not function as immediate daughters of noun phrases and nouns cannot be immediate daughters of verb phrases. This is the headed nature of language.

Moreover, the simple rules of CFGs grow exponentially more complicated when we deal with questions of agreement or transitivity. And it is here that the headed nature of language comes to the rescue once again. HPSG takes the position not only that phrases are headed by their lexical counterparts (i.e., nouns head noun phrases), but also that the grammatical and morpho-syntactic information encoded in the lexical entry for a given inflectional form is carried from the terminal node level all the way up the headed phrase structure. This includes subject agreement, noun phrase agreement, valency, and any number of other grammatical features. This is seen in figure 2 below.
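This upward percolation of head information can be sketched as follows. The dict-based feature structures and the function name are hypothetical simplifications of HPSG’s formalism, but they show the core idea: a phrase shares the HEAD features of its head daughter, so word-level information is visible at every phrasal node above it.

```python
# A minimal sketch of head-feature percolation (the Head Feature Principle).
# Plain dicts stand in for HPSG feature structures; names are illustrative.

def project(head_daughter, *non_heads):
    """Build a phrase that shares its head daughter's HEAD features."""
    return {
        "head": head_daughter["head"],   # same object, passed up unchanged
        "daughters": [head_daughter, *non_heads],
    }

likes = {"head": {"pos": "verb", "agr": {"per": 3, "num": "sg"}}}
the_opera = {"head": {"pos": "noun", "agr": {"per": 3, "num": "sg"}}}

vp = project(likes, the_opera)           # VP headed by "likes"
s = project(vp)                          # clause headed by the VP

# The 3sg agreement encoded on the word is visible at the top of the tree:
assert s["head"]["agr"] == {"per": 3, "num": "sg"}
assert s["head"] is likes["head"]        # one structure, shared all the way up
```

Sharing (rather than copying) the head structure mirrors the tagged reentrancies ([1], [2], [3]) in HPSG trees: the phrase and its head daughter literally contain the same feature structure.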

Fig. 2[11]
Alex likes the opera.

At first glance the tree is rather intimidating, but note the relationship between the word-level feature structures for “likes” in the tree and the feature structures in the rest of the clause. The required specifier for “likes” is filled by the subject “Alex” as [1], and the required complement is filled by the noun phrase “the opera” as [2]. Likewise, the specifier requirement of “opera” is satisfied by “the” as [3]. In this way, the grammar and syntax are constrained by the phrasal heads in the clause. Phrase structure is head-driven.
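The cancellation of valence requirements in figure 2 can be sketched in the same toy representation (again an illustrative simplification, not Cascadia’s formalism): the head’s COMPS list is discharged when its complement combines with it, and its SPR list when the specifier does.

```python
# A sketch of valence saturation: the head verb demands a specifier (SPR)
# and a complement (COMPS), and each combination discharges one demand.
# The dict representation and function names are illustrative assumptions.

likes = {"head": "verb", "spr": ["NP"], "comps": ["NP"]}

def combine_with_complement(head, comp_cat):
    assert head["comps"] and head["comps"][0] == comp_cat
    return {**head, "comps": head["comps"][1:]}    # complement requirement met

def combine_with_specifier(head, spr_cat):
    assert not head["comps"] and head["spr"] == [spr_cat]
    return {**head, "spr": []}                     # specifier requirement met

vp = combine_with_complement(likes, "NP")   # "likes the opera"
s = combine_with_specifier(vp, "NP")        # "Alex likes the opera"
assert s["spr"] == [] and s["comps"] == []  # a fully saturated clause
```

A clause is complete exactly when both lists are empty; an unsaturated phrase (a VP still seeking its subject, say) simply carries its remaining demands upward.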

What this does for Cascadia is not only constrain the syntactic parser that creates the trees themselves, but also provide significant power for users searching the database for syntactic constructions. Essentially, any morpho-syntactic information used in a given search only needs to be included at the level where that information functions as part of the head of a phrase. Thus, if one were curious about occurrences of μέλλω + infinitive where the infinitival clause precedes μέλλω (found only in Acts 23:3), there is no need to create a search as complex as that in figure 3.

Fig. 3


Rather, all that is needed is the significantly simpler search seen in figure 4.

Fig. 4

The difference and power of the HPSG approach is immediately apparent and the results are identical.

[1] Andi Wu and Randall Tan, Cascadia Syntax Graphs of the New Testament (Logos, 2009).


[2] Peter Culicover and Ray Jackendoff’s recent Simpler Syntax Hypothesis (Simpler Syntax [Oxford: Oxford University Press, 2005]) should also be viewed as within the group of Constraint-based Lexicalist frameworks, since it builds on a combination of LFG, HPSG, and Construction Grammar in its devastating critique of Chomsky’s work, or more specifically, what they call “Mainstream Generative Grammar” (Ibid., 3n1; cf. R.D. Borsley, “Peter W. Culicover and Ray Jackendoff, Simpler Syntax, Oxford University Press, Oxford (2005),” Lingua 117, no. 4 [April 2007]: 741-749).

[3] The most relevant work of Chomsky here is: Noam Chomsky, Lectures on Government and Binding: The Pisa Lectures, 7th ed. (Mouton de Gruyter, 1993); The Minimalist Program (The MIT Press, 1995).

[4] Ivan A. Sag, Thomas Wasow, and Emily M. Bender, Syntactic Theory: A Formal Introduction (2nd ed.; Stanford, Cal.: CSLI, 2003), 295.

[5] Unfortunately, because of the detailed formalism of HPSG and the inherent complexity of the rules and lexical entries, providing an example of the interaction of principles, rules, and lexical entries would cause more confusion than explanation.

[6] Ibid.

[7] Andrew Radford, Transformational Grammar: A First Course (CTL; Cambridge: University Press, 1988), 30.

[8] Mary Dalrymple, Lexical Functional Grammar (S&S 34; San Diego: Academic Press, 2001), 84. See also the introduction to HPSG by Sag, Wasow, and Bender (Syntactic Theory, 295), who do not express it so concisely.

[9] Consider for example the fact that while “free” word order languages are rather common cross-linguistically, there is no such thing as a free morpheme order for the internal structure of words.

[10] Much of the reasoning behind this rather cryptic statement involves a large portion of the history of syntactic theory going back to the 1950’s and 60’s when Context Free Phrase Structure dominated the scene. Going through the historical reasoning behind HPSG with reference to both early phrase structure grammars and also Chomsky’s original transformational grammar is well beyond the scope of this paper.

[11] Sag, Wasow, and Bender, Syntactic Theory, 104.

10 thoughts on “The Theory behind the Cascadia Syntax Graphs”

  1. After the first paragraph, what a relief to find that all those parts of the first paragraph that I didn’t understand were going to be clarified … The only part of the first paragraph that was crystal clear was the final dozen words or so: “the key tenants of the theory, each of which deserves some explanation of its own below.”

  2. Mike, I really appreciate this post. I have yet to be successful in searching the Cascadia database on my own. Maybe this will help me understand the underlying principles.

    Question only slightly related to the topic – I know how to get the lemma in the search with the g: dealio. But I am clueless as to getting the morphology into the search. In your example, I cannot figure out how to get the mood and part-of-speech parameters entered. Do you have a quick answer?

    1. Yes. I do have a quick answer.

      In the search window, after you select a node of your search, there is what Logos 4 calls the “Specifics” panel on the right side. Beneath the “Head Term Text & Lemma” section of the panel (which you’ve found for entering the text and lemma) is “Head Term Semantic Domains,” and beneath that is “Head Term Morphology.” In the morphology box, you need to first type @ and then a window will pop up to give you more options.

      The typing of @ first before anything else isn’t terribly intuitive, but after that, it should be smooth sailing.

      Let me know if you have any more challenges.

  3. You’re right, it was smooth sailing after that. Didn’t find that little trick anywhere in the documentation, although it almost seems like I knew that once upon a time. Thanks a million.

  4. Mike,
    I’m just getting around to reading this and found it helpful (especially the end). I’m starting to get more and more interested in Cascadia as opposed to Opentext.

    1. That’s great to hear. I had been worried about its technicality. I’m actually in the process of revising the paper I presented at BibleTech:2010, of which this was a part. I’ll be posting it when that’s complete.
