The Role of Dependency Grammar in Linguistic Theory and Practice
A Comparative Study of Tesnière’s Dependency Grammar and Chomsky’s Phrase Structure Grammar Using English and Japanese Languages
Recently, I came across this quote by the 20th-century French linguist Lucien Tesnière: “Every word in a sentence does not operate independently but depends on another”.
This got me thinking about the role of words within a sentence, specifically the role of verbs, a concept that Tesnière himself deeply investigated. This couldn’t have been more timely as I only recently started learning Japanese, a language that essentially frames almost every sentence around the verb or action it conveys.
In the study of linguistics, understanding the structure of language is extremely important, and various theories have been proposed to decipher the complex architecture of human communication. Among these, dependency grammar, developed by Lucien Tesnière, offers its own unique take on syntactic analysis. This theory prioritises relationships between words based on dependency rather than grouping them into hierarchical constituents.
The relevance of comparing different syntactic models becomes apparent when dealing with the practical challenges of parsing, translation, and language teaching.
So in this article, I shall discuss the distinctions and practical implications of Tesnière’s dependency grammar and juxtapose it with Noam Chomsky’s phrase structure grammar. If you are unfamiliar with Chomsky’s work in this regard, then I highly recommend this article that I wrote earlier this year. I have covered Chomsky’s thesis in detail in that article and other articles on this Substack.
This analysis is particularly pertinent for those intrigued by languages like Japanese, where traditional syntactic models encounter limitations. By engaging with this article, you will gain a deeper understanding of how different grammatical theories provide the tools to better describe and analyse languages across the spectrum, offering practical benefits for both academic research and real-world applications.
Dependency Grammar
Dependency grammar, as articulated by Lucien Tesnière, positions the verb as the central axis of syntactic structures, asserting that all other elements in a sentence orbit this all-important word. This conceptualisation diverges markedly from traditional phrase structure grammars that segment the sentence into nested hierarchies of constituents (noun phrases, verb phrases, etc.).
Dependency grammar simplifies the analysis of sentence structure by focusing on the relationships between words, i.e., how each word connects to its governor (or head) with direct lines that denote dependency. This model is visually represented through tree diagrams, where each node (word) links directly to the node (word) upon which it depends. Tesnière illustrated this with his stemmas, which not only clarified the linear relationships but also highlighted the functional roles of words as they contribute to the overall meaning of the sentence.
Consider the English sentence, "Alice gave Bob a book." In a dependency diagram, "gave" serves as the root, with "Alice," "Bob," and "book" all depending on "gave." "Alice" is the subject, "Bob" the indirect object, and "book" the direct object, each linked directly to the verb without intermediary nodes, reflecting their semantic roles more transparently than constituent-based models typically allow.
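To make the idea concrete, here is a minimal sketch in Python of how such a dependency analysis could be represented in code. The Word class and its attach helper are purely illustrative names, not part of any particular library, and the relation labels simply restate the informal ones used above.

from dataclasses import dataclass, field

@dataclass
class Word:
    """A node in a dependency tree: a word, its relation to its head, and its dependents."""
    text: str
    relation: str = "root"
    dependents: list["Word"] = field(default_factory=list)

    def attach(self, text: str, relation: str) -> "Word":
        """Attach a dependent word and return it."""
        dep = Word(text, relation)
        self.dependents.append(dep)
        return dep

# "Alice gave Bob a book." -- the verb is the root; everything else depends on it.
gave = Word("gave")
gave.attach("Alice", "subject")
gave.attach("Bob", "indirect object")
book = gave.attach("book", "direct object")
book.attach("a", "determiner")

def show(node: Word, depth: int = 0) -> None:
    """Print the tree with one level of indentation per dependency link."""
    print("  " * depth + f"{node.text} ({node.relation})")
    for dep in node.dependents:
        show(dep, depth + 1)

show(gave)

Running this prints the verb at the top and each argument one level below it, mirroring the stemma-style diagrams used throughout this article.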
Contrasting with Phrase Structure Grammar
While phrase structure grammar, particularly as developed by Chomsky, offers a powerful apparatus for analysing language through rules that generate tree structures, it can obscure some of the direct relationships between words by embedding them within larger constituents. For instance, in the same sentence, "Alice gave Bob a book," phrase structure would require us to parse the sentence into a noun phrase and a verb phrase, which are then broken down into smaller constituents. This method often adds layers of complexity that may not directly correspond to the functional roles words play.
Why Dependency Grammar Matters
Dependency grammar’s focus on direct relationships and minimal reliance on abstract constituents makes it particularly advantageous for analysing languages with free or flexible word order, such as Russian, Japanese or Turkish. This approach aligns closely with recent advances in computational linguistics, where dependency-based models have enabled more accurate parsing algorithms by mirroring the way words actually relate to one another in running text.
By adhering to Tesnière’s view, where he argues that "the connections between words in a sentence are not to be seen as merely syntactic formalities but as vital arteries through which the lifeblood of meaning circulates," linguists and language technologies gain a framework for dissecting the syntax that more closely resembles how meaning is constructed and interpreted in human languages.
Tesnière vs Chomsky
Tesnière's model is particularly effective for analysing languages with free or flexible word order because it focuses on the functional dependencies between words rather than their positions within a fixed phrase structure. This approach aligns more closely with semantic interpretation, where the verb acts as the nucleus of sentence meaning, and all other elements are directly linked to it, reflecting their semantic roles.
Chomsky’s framework, on the other hand, with its emphasis on rules and transformations, is adept at dealing with complex syntactic phenomena like embedded clauses and passive constructions. The hierarchical nature of this grammar allows for a detailed analysis of how different language components interact within a sentence, making it particularly suitable for languages like English, where word order plays a critical role in grammatical structure.
Dependency grammars, with their direct modelling of syntactic relations, often provide a more straightforward framework for developing parsing technologies. This is especially true in the context of languages where maintaining the order of words is less crucial to the meaning than the relationships between them.
Conversely, phrase structure grammars have been foundational in developing theories of syntax that accommodate a wide range of linguistic data, helping create models for generating and transforming syntactic structures. This has significant implications for linguistic theory, especially in areas that explore the cognitive aspects of language processing.
To better understand these differences, let's examine specific sentence structures and how each model would parse them.
Example 1
Sentence: "The cat sat on the mat."
Dependency Grammar: In this model, "sat" is the central verb and the root of the structure. The words "the cat" (subject) and "on the mat" (prepositional phrase indicating location) are dependent on "sat." This can be visualised as
sat
├── the cat
└── on the mat
Phrase Structure Grammar: Phrase structure would break this down into a noun phrase (NP) and a verb phrase (VP). The VP "sat on the mat" includes a prepositional phrase (PP) nested within it. The structure might look like
S
├── NP
│   └── The cat
└── VP
    ├── V
    │   └── sat
    └── PP
        ├── P
        │   └── on
        └── NP
            └── the mat
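If you want to reproduce both analyses on a computer, a quick sketch follows. It assumes the nltk package is installed (nltk is used here only for its Tree class and pretty-printer, not because the article depends on it), and the dependency edges are the Tesnière-style ones drawn above, where the preposition "on" depends directly on the verb.

from nltk import Tree  # assumes the nltk package is installed

# The constituency analysis sketched above, written in bracketed notation.
ps_tree = Tree.fromstring(
    "(S (NP (DT The) (NN cat)) (VP (V sat) (PP (P on) (NP (DT the) (NN mat)))))"
)
ps_tree.pretty_print()

# The same sentence as bare dependency edges: each word points at its head,
# and the verb "sat" is the root, so it appears only as a head.
dependency_edges = [
    ("The", "cat"),
    ("cat", "sat"),
    ("on", "sat"),
    ("the", "mat"),
    ("mat", "on"),
]
for dependent, head in dependency_edges:
    print(f"{dependent} -> {head}")

Note how the dependency version needs nothing beyond the words themselves, while the constituency version introduces category nodes (S, NP, VP, PP) that never appear in the sentence.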
Example 2
Sentence: "The student who read the book laughed."
Dependency Grammar: "Laughed" is the main verb and the root. "The student" is the subject depending on "laughed," and "who read the book" is a relative clause depending on "student." The verb "read" within the relative clause makes "the book" its dependent.
laughed
└── the student
    └── who read
        └── the book
Phrase Structure Grammar: This analysis would involve multiple layers of nesting. The sentence would be divided into an NP containing a relative clause and a VP.
S
├── NP
│   ├── The student
│   └── S'
│       ├── NP
│       │   └── who
│       └── VP
│           ├── V
│           │   └── read
│           └── NP
│               └── the book
└── VP
    └── V
        └── laughed
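In practice, dependency analyses like the one above are usually stored as a flat list in which each word records the position of its head, which is roughly how treebank formats such as CoNLL-U work. Below is a minimal hand-built sketch in Python of that idea for the relative-clause sentence; the head indices and plain-English relation labels simply restate the analysis given above and are not the output of any parser.

# (index, word, head_index, relation); a head_index of 0 marks the root.
tokens = [
    (1, "The",     2, "determiner"),
    (2, "student", 7, "subject"),
    (3, "who",     4, "subject"),
    (4, "read",    2, "relative clause"),
    (5, "the",     6, "determiner"),
    (6, "book",    4, "direct object"),
    (7, "laughed", 0, "root"),
]

for index, word, head, relation in tokens:
    head_word = "ROOT" if head == 0 else tokens[head - 1][1]
    print(f"{word:8} --{relation}--> {head_word}")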
From the above examples, we can understand that dependency grammar provides a more straightforward and semantically transparent analysis for sentences, especially when dealing with languages that exhibit a significant degree of syntactic flexibility. It emphasises the direct relationships between verbs and their arguments or modifiers, which can be particularly beneficial for semantic analysis and machine translation.
Phrase structure grammar, on the other hand, with its detailed constituent breakdown, is highly effective for handling complex syntactic transformations and is well suited to linguistic research that focuses on the structural intricacies of language, such as movement phenomena and syntactic ambiguity.
Dependency Grammar and Japanese Syntax
The structure of the Japanese language, with its characteristic Subject-Object-Verb (SOV) order and flexible sentence construction, makes it an excellent case study for the application of Lucien Tesnière's dependency grammar.
Example 1
Sentence in Japanese: 猫がマットの上に座った。 (The cat sat on the mat.)
Transliteration: Neko ga matto no ue ni suwatta.
Dependency Grammar: In this sentence, "suwatta" (sat) serves as the root. The subject "neko" (cat) and the location phrase "matto no ue ni" (on the mat) both depend directly on the verb. This can be visually represented as
suwatta
├── neko ga
└── matto no ue ni
This structure highlights how each component of the sentence connects directly to the verb, reflecting their functional roles clearly and succinctly.
Example 2
Sentence in Japanese: 本を読んだ学生が笑った。 (The student who read the book laughed.)
Transliteration: Hon o yonda gakusei ga waratta.
Dependency Grammar: "Waratta" (laughed) is the main verb and the root. "Gakusei ga" (the student) is the subject depending on "waratta," and i (who read the book) forms a dependent clause linked to "gakusei." The diagram for this structure would be
waratta
└── gakusei ga
    └── hon o yonda
This representation clearly delineates how the relative clause connects to the noun it modifies, demonstrating dependency grammar’s utility in mapping even complex sentence structures in Japanese without additional syntactic overhead.
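The same flat encoding used for the English example works here too. A small, hand-built Python sketch follows; the chunks (which keep particles such as ga and o together with their nouns, much like the stemma above) and the relation labels are my own illustrative simplification, not the output of a Japanese parser.

# "Hon o yonda gakusei ga waratta." -- the student who read the book laughed.
# Each entry: (chunk, gloss, head, relation); the sentence-final verb is the root.
chunks = [
    ("hon o",      "book + object marker",     "yonda",   "direct object"),
    ("yonda",      "read (past)",              "gakusei", "relative clause"),
    ("gakusei ga", "student + subject marker", "waratta", "subject"),
    ("waratta",    "laughed",                  None,      "root"),
]

for chunk, gloss, head, relation in chunks:
    target = head if head is not None else "ROOT"
    print(f"{chunk:12} [{gloss}] --{relation}--> {target}")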
Dependency Grammar in Computational Linguistics
Dependency grammar has become increasingly influential in the field of natural language processing (NLP), where understanding the relationships between words directly impacts the effectiveness of parsing algorithms and machine translation systems.
Consider a machine translation system designed to translate complex sentences from English to a relatively free word order language like German. Dependency grammar allows for a more flexible alignment of sentence components, ensuring that semantic relationships are maintained even when syntactic structures differ significantly between the source and target languages.
Visualisation
English: She gave him the book.
Dependency Diagram: [gave] -> [She, subject]; [him, indirect object]; [the book, direct object]
German: Sie gab ihm das Buch.
Dependency Diagram: [gab] -> [Sie, subject]; [ihm, indirect object]; [das Buch, direct object]
This direct mapping of dependencies across languages simplifies the translation process by maintaining the core grammatical relationships.
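For readers who want to see such dependencies extracted automatically, here is a short sketch using spaCy, one widely used NLP library (the article itself is not tied to this tool). It assumes spaCy and its small English model en_core_web_sm are installed; the relation labels it prints come from the model's own tag set rather than the informal labels used above.

import spacy

# Assumes spaCy and its small English model are installed:
#   pip install spacy
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("She gave him the book.")
for token in doc:
    # Each token points at its syntactic head; the main verb is the root
    # and, by spaCy's convention, is listed as its own head.
    print(f"{token.text:6} --{token.dep_}--> {token.head.text}")

Running the equivalent over the German sentence with a German model would yield a structurally parallel set of edges, which is exactly the cross-lingual correspondence sketched in the diagrams above.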
Notes on Japanese Grammar and the Philosophy of Language
My recent exposure to Japanese grammar has helped me discover the language’s reliance on verb dependency. It has also brought to my attention the importance of dependency grammar not just as a syntactic preference but as a reflection of how meaning is constructed and communicated. This structural emphasis on verbs (actions and states) over subjects (entities) ties into broader questions about the nature of reality as perceived and described through language.
In Japanese, the verb's positioning at the end of the sentence influences how actions and states are perceived as central to understanding interactions and events. This can be interpreted through the lens of existentialism, where actions define beings or through process philosophy, which sees processes rather than static entities as fundamental to the universe.
The dependency of other sentence elements on the verb in Japanese underscores the contextual nature of meaning in the language. Unlike languages with rigid syntactic structures, meaning in Japanese is highly dependent on context.
In a linguistic sense, this could be seen as a reflection of a worldview where entities are not isolated but exist only in relation to others and their actions.
The challenge for artificial intelligence in processing such languages is not just technical but also conceptual. It involves understanding and replicating a form of thought that sees the world in terms of interrelated actions and contexts, which is fundamentally different from the entity-centric view prevalent in Western linguistics.