Complexity of Linguistic Structures, Simplicity of Language Mechanisms
Inaugural lecture delivered at the Collège de France on Thursday 5 November 2020
1Mr. Administrator,
Dear colleagues,
Dear friends,
Ladies and Gentlemen,
2First of all, I wish to thank you for being here, or for joining us online, despite a health crisis that once again is causing us deep concern and is challenging our health, our social life, and our individual lives.
3But let us try to look ahead and talk today about language and general linguistics. Language is a central component of human life. We live immersed in language. We use it not only to structure our thoughts and to communicate and interact with others, but also in play and in artistic creation. Paradoxically, the ubiquitous presence of language makes it difficult to approach as a subject of scientific study; it is so inseparable from the main aspects of human life that we no longer see its remarkable properties.
4To study language scientifically therefore requires us to first stand back and have an inquiring mind. How would a scientist from another planet, with a language completely different from ours, study human language? Only with such distance can we perceive the intellectually interesting problem behind the obviousness of an all too familiar object. The scientific analysis can then begin.
5But how can such an inquiry be approached? Which perspective is legitimate and likely to be fruitful? In his late work The Descent of Man, Charles Darwin set out to link – one might say somewhat anachronistically – human cognitive capacities to his theory of evolution. He spoke, of course, about language:
Language certainly is not a true instinct, as every language has to be learnt. It differs, however, widely from all ordinary arts, for man has an instinctive tendency to speak, as we see in the babble of our young children, while no child has an instinctive tendency to brew, bake, or write. (Darwin, 1871: 51)
6Thus, there is not only a learning dimension to language, but also a natural “instinctive tendency” that is not found in other cultural practices. About a century earlier, in the first paragraph of his Essay on the Origin of Languages, Jean-Jacques Rousseau had already highlighted the two dimensions of language: natural and social. After emphasizing the inescapable component of learning in our linguistic capabilities (“Out of usage and necessity, each learns the language of his own country”), Rousseau wrote: “speech, being the first social institution, owes its form to natural causes alone”. Language is thus the point of contact between nature and culture, between individual and society. In very different contexts and forms, Rousseau and Darwin emphasize the particular, indeed unique, nature of language compared to other human capabilities.
7The subject can therefore be approached from many different angles. As an object that is both natural and social, language can be studied with the methods and explanatory models of the formal and natural sciences, and with the instruments offered by the humanities and social sciences. The “naturalist” approach will focus on the cognitive bases of language and the different aspects of the biological endowment that allows members of our species to learn and use languages. This perspective will be closely related to cognitive neuroscience and human biology, as well as to the formal sciences, which can provide precise tools for modelling. The “culturalist” approach will focus on the relationship between language and society, history, philology and literature.
8While these two main perspectives are sometimes seen as competing alternatives, they should rather be viewed as complementary; it is the very nature of language, its complexity and its hybridity that justify a plurality of perspectives and methods. In fact, they cannot be totally isolated from each other: one cannot study the biological foundations of language without using the empirical facts emanating from historically given languages. Conversely, the study of any socio-historical aspect of language can only be informed by findings on the natural foundations of linguistic capabilities. The adoption of either perspective is thus a matter of choosing a centre of gravity in a complex system, rather than establishing a viewpoint that is impervious to other inputs.
Syntactic creativity
9The intellectual tradition in which my work is situated has recent roots in the formal linguistics of the last sixty years, primarily generative grammar, a research programme which is in turn grounded in a much older tradition. This tradition is marked, or even defined, by a foundational question that has been called the question of “creativity” in the regular use of language.
10Every speaker is constantly faced with new sentences that they have never heard in their previous linguistic experience. Yet they can integrate these new objects, understand them, and react appropriately in a coherent dialogue, possibly by creating in turn another new object, another sentence. What exactly is this feeling of familiarity with what is constantly new?
11I wish to emphasize that the novelty of sentences is not at all exceptional, it is quite normal in the use of language. For example, it is highly unlikely that in your previous linguistic experience, any of you have heard the exact sentence that I am producing right now.
12The other striking aspect linked to novelty is the unbounded nature of our ability to build new sentences. The lexicon is a finite set of words, and while new words can of course be added, at any given moment the set is finite, representable by a well-defined material object such as a dictionary. When it comes to sentences, this is by no means the case. There is nothing that could resemble an exhaustive repertoire of the sentences of a language, a dictionary of sentences. We may wonder what the longest word in a language is, and to get the answer simply count the phonemes or the graphemes. But we cannot identify “the longest sentence” in a language; irrespective of the length and complexity of a sentence, it can always be lengthened by adding something, a subordinate clause, for example. We are indeed in the realm of the unlimited; the analogy with the case of natural numbers has often been mentioned.
13These simple observations have roots dating back a long time. In the Discourse on Method, René Descartes noted that the ability to arrange individual signs into coherent sentences suited to the discursive context distinguished humans from animals:
For it is highly deserving of remark, that there are no men so dull and stupid […] as to be incapable of joining together different words, and thereby constructing a declaration by which to make their thoughts understood; and that on the other hand, there is no other animal, however perfect and happily circumstanced, which can do the like. (Descartes, 1637/1850: 98)1
14Linguistic abilities also provide a reliable test for distinguishing man from machine, according to Descartes:
For we may easily conceive a machine to be so constructed that it emits vocables […] but not that it should arrange them variously so as appositely to reply to what is said in its presence, as men of the lowest grade of intellect can do. (Ibid.: 97-98)2
15Descartes’ observations remain fundamentally valid almost four centuries later. Despite the extraordinary advances in computer science, which make the automata that Descartes was so passionate about seem simple, some properties of language that are easily accessible to the five-year-old child pose significant problems for human-computer linguistic interaction programs. We are still a long way from an artificial intelligence system that can pass the famous Alan Turing test in a completely general way (Turing, 1950). Although conceived in a very different context, this test is in a sense a modern version of Descartes’ test to distinguish between a human and a machine. At the same time, the precise study of communication systems by animal cognition experts confirms that the “languages” of our closest cousins, non-human primates, are very different from human language and are radically poorer and more limited.
16What is at the root of this difference? Descartes suggests that it is the ability to “arrange various words together”, that is, a combinatorial ability. The importance of combinatorics for human language had been emphasized some years earlier, in a different context, by Galileo. At a certain point in Galileo’s Dialogue, the question is raised as to what the greatest achievements of human genius are. The heights of painting, sculpture, architecture, poetry, music, science, arte navigatoria, and so on are all mentioned, but Sagredo, one of the three participants in the dialogue, introduces another idea. The greatest invention of mankind could be alphabetical writing:
[...] but after all the stupendous inventions, what kind of eminence of mind was that of the man who thought of finding a way to communicate his deepest thoughts to some other person, even at great distance of place and time? ... And with what ease? With the different arrangements of twenty little characters on a paper. (Galileo, 1630/1953: 105)3
17Galileo rightly characterizes writing as an invention, a cultural product (we saw the same point in Darwin), but one that is based on a crucial empirical discovery, a natural property of language: its radically combinatorial nature, that is, the fact that a few dozen distinct sounds, phonemes, combine to produce a few thousand morphemes, which combine to form the words of a language, which, in turn, combine to define an unlimited number of phrases and sentences, which enable a member of our species to express and communicate “i suoi più reconditi pensieri” (his or her deepest thoughts).
18About thirty years after Galileo’s Dialogue, in The Port-Royal Grammar: General and Rational Grammar, Arnauld and Lancelot took up Galileo’s words almost to the letter, identifying “one of the greatest advantages which man has over all the other animals” as the ability to combine a small number of sounds, enabling us to signify our thoughts and communicate to others “all that we conceive and all the diverse movements of our souls” (Arnauld and Lancelot, 1660/1975: 65-66).
19The words of Galileo and the Port-Royal grammarians clearly reflect the two contrasting terms in the title of this inaugural lecture: the simplicity of the basic ingredients and the complexity of the product of their combination. This is the conception that is taken up in Wilhelm von Humboldt’s famous sentence, “language makes infinite use of finite means”.
20A major contribution of structural linguistics in the first half of the twentieth century was to study these “finite means” and to show their systematic nature. But focusing on inventories of the basic ingredients is only part of the analysis of language; it is still necessary to describe the properties and understand the nature of the combinatorial mechanism, the real engine of the system, which allows “infinite use”.
Recursiveness
21Noam Chomsky’s seminal contribution in the 1950s was to address the question of the combinatorial nature of language and to answer it with precision. Knowledge of a language involves the storage of finite inventories (phonemes, morphemes, words) and the mastery of a system of combinatorial rules that are applied to generate complex linguistic expressions. It is possible to study these systems in detail using certain formalisms developed in studies of the logical foundations of mathematics such as systems of rewriting rules (Chomsky, 1955, 1957).
22From then on, the emphasis shifted from finite inventories to dynamic systems that generate potentially unlimited sets of representations. This shift in emphasis would go on to characterize the following decades of linguistic research, up to the present.
23These aspects were already mentioned in the teachings of illustrious scholars at the Collège de France in the early 1960s. Émile Benveniste pointed out that:
Phonemes, morphemes, words [...] can be counted; they are finite in number. But not sentences. [...] The sentence, an indefinite creation of limitless variety, is the very life of language in action. (Benveniste, 1966: 129)4
and Claude Lévi-Strauss noted that:
Syntax does not wait until it has been possible to enumerate a theoretically unlimited series of events before becoming manifest, because syntax consists in the body of rules which presides over the generation of these events. (Lévi-Strauss, 1964: 15-16)5
24But what allows a finite entity, such as a system of formal rules, a computer, or our brain, to master an infinite set of structures? In other words, what allows us to make “infinite use of finite means”? The answer is recursion: a simple, formalized concept studied in mathematical logic. Recursion is the property of a rule, or a system of rules, to reapply itself to its own result, thus creating a boundless loop.6 Italian mathematician Giuseppe Peano’s formalization of arithmetic, for instance, can be understood as involving the recursive rule “add 1” which, by successive additions of the unit, allows the generation of any natural number. The generative grammar of a language can be conceived of in the same way, as a recursive system capable of generating an infinite number of sentences. Let us pause for a moment to consider the original generative formalisms as presented, for example, in Nicolas Ruwet’s wonderful An Introduction to Generative Grammar (1967/1973). It was my first reading in generative grammar, and it made me decide to further my studies in this direction. Consider, for example, the following small system of rewriting rules, simplified in many respects:

25This simple system makes it possible to build the following tree structure, with rules (1), (2), (3) functioning as instructions to build elementary trees that are assembled together like Lego bricks:

If a sentence can contain another sentence within it, this allows the system to loop and thus to generate a family of indefinitely expandable structures such as:
26The sentences that we normally use, or find in a text, are much less monotonous than this, because the forms of recursion are numerous and can be alternated and varied, but the abstract principle is clear.
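To make the looping property concrete, here is a minimal sketch in Python. The toy rules and vocabulary are my own illustrative assumptions, deliberately simpler than and distinct from rules (1), (2), (3); the crucial ingredient is the same, namely that the sentence symbol S can reappear inside one of its own expansions.

```python
import random

# Hypothetical toy rewriting rules (an illustration, not the original (1)-(3)):
# S can reappear inside VP, which is what creates the recursive loop.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Mary"], ["John"], ["the child"]],
    "VP": [["sleeps"], ["thinks", "that", "S"]],
}

def expand(symbol, depth=0, max_depth=4):
    """Rewrite a symbol by recursively applying the rules above."""
    if symbol not in RULES:                 # a word: nothing left to rewrite
        return [symbol]
    choice = random.choice(RULES[symbol])
    if depth >= max_depth:                  # cap the demo; the grammar itself is unbounded
        choice = RULES[symbol][0]
    words = []
    for item in choice:
        words.extend(expand(item, depth + 1, max_depth))
    return words

for _ in range(3):
    print(" ".join(expand("S")))
# possible output: "John thinks that Mary thinks that the child sleeps"
```

The cap on depth is only there to keep the demonstration finite; nothing in the rule system itself imposes an upper bound on the number of embeddings.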
27The exact nature of the recursive procedures used by human languages has been discussed in detail for a good half century in formal linguistics. Recently, some linguists have come to the conclusion that there is only one recursive rule of great simplicity and generality, which subsumes specific rules like (1), (2) and (3). This is the operation called merge: the key operation of the Minimalist Program, the most recent framework of generative grammar (Chomsky, 1995; see also Chomsky, 2020, for a recent discussion). Again, it was Chomsky who introduced it, forty years after his first major book, Syntactic Structures (Chomsky, 1957).

28Two elements A and B can be merged, thus creating the new expression [A B]. The merging is recursive, so [A B] can be merged with another element C, giving rise to the complex structure [C [A B]], and so on.
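A minimal sketch of this idea, under the simplifying assumption that a syntactic object is either a word or a pair of previously built objects; labels and all further structure are deliberately left out, so this is an illustration of the combinatorial point rather than a formal definition of merge.

```python
def merge(a, b):
    """Combine two syntactic objects into a new object [a b]."""
    return (a, b)

def show(obj):
    """Render a merged object with the bracket notation used in the text."""
    if isinstance(obj, str):
        return obj
    left, right = obj
    return f"[{show(left)} {show(right)}]"

ab = merge("A", "B")        # [A B]
cab = merge("C", ab)        # merge reapplies to its own output: [C [A B]]
dcab = merge("D", cab)      # and so on, without any upper bound

print(show(dcab))           # -> [D [C [A B]]]
```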
29As far as we know, recursive merge is a universal property of human language. No known language limits its utterances to single words, or to combinations of two words, or three words, or four words, which would be expected of a system without operations of recursive merge. This is a crucial and distinctive property of human language. If we look at animal communication systems, especially among our closest cousins, non-human primates, for example in the recent synthesis by Schlenker et al. (2016), we find only a few rudimentary forms of two-element combination with no trace of recursiveness. Recursive merging thus appears to be a hallmark of adult human language, which raises important questions about the ontogeny and phylogeny of this human capability – questions that can be explored only in an interdisciplinary cognitive neuroscience framework.
Linearity and hierarchy
30Let us shift our attention now from combinatorial rules to the representations generated. Any utterance, any oral or written sentence, appears to our senses as a sequence of elements: sounds (or gestures, in sign languages), graphemes, and words. In the written sentence (7), for example, the words follow one another like beads on a necklace.
31But if what I have said about generation is correct, then linguistic representations are necessarily more complex than simple linear sequences. A vertical, hierarchical dimension is inevitable. This two-dimensionality of linguistic representations is well expressed by tree structures. The tree associated with a sentence can be seen as the recapitulation of all the applications of merge necessary for its generation, where each application generates an elementary tree. This is how a short sentence like (7) is represented by the tree (8), where I simplify things in many respects.7

32These representations have remarkable properties, which have been studied in detail. First, the branching is always binary. One could easily imagine representations with multiple branching, but natural languages use only binary branching. This was an important discovery, among many others, by Richard Kayne, who, along with Nicolas Ruwet, was my professor at the University of Paris 8-Vincennes. Even in cases where ternary or n-ary branching would be plausible, such as the double object construction in English (Mary gave Bill a book), an in-depth analysis shows that binary structuring is correct (Kayne, 1983).
33What kind of empirical argument can be given in support of an invisible, inaudible hierarchical structure like (8), which accompanies the immediately visible linear/sequential structure? There is good reason to think that the invisible hierarchical structure is much more fundamental to the functioning of language than the visible linear structure. Grammatical rules are invariably sensitive to hierarchical organization. Whenever the two structures clash, the hierarchical structure always wins. To try to illustrate this point, I will now discuss the processes of movement and locality effects.
Movement and locality
34The construction of structures by merging separate parts is clearly the central procedure of linguistic combinatorics, but not the only one. In natural languages, constituents are often pronounced in positions that differ from the positions in which they are interpreted. Consider the following type of question about a constituent:
35The expression which book at the beginning of the sentence must be interpreted as the direct object of the verb buy. There is a dependency between the initial expression and the verb governing its interpretation.
36The classical way of approaching the problem of these long-distance dependencies is to postulate a movement. In an abstract representation, the expression which book is positioned as the direct object of buy (in (10)a), before being moved to the position at the beginning of the sentence (in (10)b):

37The movement leaves a “trace”, a copy in the initial position – a position that remains unpronounced, but which is visible to our mental grammar. Now, all movement obeys principles of locality, which limit its application in certain contexts. Consider, for example, the following asymmetry: an interrogative element can be taken out of a declarative clause, but not out of an indirect question. Given (11)a as a baseline, the main question (11)b can be constructed, while from the indirect interrogative (12)a, it is impossible to obtain the main question (12)b, in the interpretation where one wonders about the time of departure (the asterisk indicates this impossibility):


38Why this restriction? Here we see a well-studied principle of locality, “Relativized Minimality”, which precludes any movement across an element of the same type as the element that has been moved (Rizzi, 1990, 2004, 2018). Movement from Y to X is precluded in a configuration such as the following, where Z is an element of the same type as X:

… X … Z … Y …
39In the representation of sentence (12)b, it is the interrogative element who that precludes the movement of when, another interrogative element. But in other contexts, the intervention effect is not found. From a baseline such as (14)a, one can very well derive a question like (14)b:
40Why is it that in (14)b who does not determine an intervention effect, as in (12)b? The tree structures immediately provide the answer:


41In (15), who intervenes hierarchically between when and its trace, in that it is lower in the tree than when, and higher than the trace, a property that can be expressed precisely by the formal relation of c-command8, but which I leave here at an intuitive level. This triggers the locality effect. On the other hand, in (16) who intervenes only in the linear sequence; it is too deeply embedded to affect the proper formation of the structure. Overall, it is only the hierarchical intervention that enters the locality calculation; the linear intervention is simply not seen by the system.
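To make the contrast concrete, here is a schematic sketch in Python: binary trees are encoded as nested pairs, c-command is defined for leaves (a c-commands b if the sister of a dominates b), and intervention is computed hierarchically. The two toy trees are stand-ins that mimic only the gross geometry of configurations like (15) and (16); they are not the actual representations.

```python
def contains(tree, leaf):
    """True if the (sub)tree dominates the given leaf."""
    if isinstance(tree, str):
        return tree == leaf
    return any(contains(daughter, leaf) for daughter in tree)

def c_commands(tree, a, b):
    """A leaf a c-commands a leaf b iff the sister of a dominates b."""
    if isinstance(tree, str):
        return False
    left, right = tree
    if left == a and contains(right, b):
        return True
    if right == a and contains(left, b):
        return True
    return c_commands(left, a, b) or c_commands(right, a, b)

def hierarchical_intervention(tree, x, z, trace):
    """Z intervenes between X and its trace iff X c-commands Z and Z c-commands the trace."""
    return c_commands(tree, x, z) and c_commands(tree, z, trace)

# Schematic analogue of (15): the intervener c-commands the trace -> locality violation.
bad = ("when", ("you", ("wonder", ("who", ("left", "t")))))
# Schematic analogue of (16): the intervener only precedes the trace linearly.
good = ("when", (("friend", "who"), ("left", "t")))

print(hierarchical_intervention(bad, "when", "who", "t"))   # True  -> movement blocked
print(hierarchical_intervention(good, "when", "who", "t"))  # False -> movement allowed
```

In both toy trees "who" precedes the trace in the linear string; only in the first does it also c-command it, and only there does the intervention count for locality.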
42What is true for movement and locality is also true for all sorts of grammatical processes, from morpho-syntactic rules of case assignment, to rules of agreement and phenomena such as the interpretation of pronouns and anaphoric elements.9 It is also true for other interface phenomena between syntax and semantics, such as the assignment of scope to quantifiers, not to mention interface processes between syntax and phonology, such as the constraints on liaison in French and many other phono-syntactic phenomena (Manzini, 1983; Rizzi and Savoia, 1993). In short, our entire mental grammar is sensitive to invisible hierarchical relations rather than to the immediately observable linear order.
The experimental turn
43Let us try to reconstruct the logic of the reasoning that we have developed. In the cases analysed above, we have compared two hypotheses on the organization of linguistic structures, one linear, based on the sequence of words, and the other hierarchical, based on the organization of the tree structure. The precise wording of these hypotheses makes it possible to generate predictions which we can test on the basis of judgments of acceptability and interpretation by native speakers. This reasoning therefore involves two components:
the construction of a precise model with deductive depth, capable of generating empirical predictions;
the verification of the predictions.
44These are the two pillars of the scientific method established in the natural sciences from the seventeenth century onwards, the Galilean lesson of sensata esperienza (experience based on the senses) and necessarie dimostrazioni (necessary demonstrations), of which Galileo spoke in 1615 in his famous letter to Christina of Lorraine, Grand Duchess of Tuscany, a sort of manifesto of modern science. These two components are applicable to the study of language.
45For several decades, formal linguistic research relied on a radically simplified experimentalism, based on speakers’ introspective capabilities. Any native speaker of a language can answer simple questions such as: could you say such and such a sentence in such and such a context? Does the sentence allow for such and such an interpretation? Using this simple methodological apparatus, formal linguists have been able to build up an extremely rich repertoire of knowledge about the languages of the world, attesting to the richness and subtlety of the linguistic knowledge that any speaker can access through introspection. This huge repertoire of data is now becoming increasingly accessible to the scientific community. For example, the Terraling project, organized and directed by Hilda Koopman, offers an open and searchable database of this repertoire, which is constantly being expanded (Koopman and Guardiano, 2020).
46Despite its formidable effectiveness, the technique of acceptability and interpretation judgments can only be part of the story. First, what is accessible to introspection has well-defined limits: we have introspective access to the results of our mental calculations on linguistic structures, but not to the mental mechanisms involved or to their functioning, which remain totally unconscious. Secondly, these methods are difficult to use in certain situations, for example with the young child. Thirdly, it is good scientific practice to vary the nature of empirical data as much as possible.
47For all these reasons, and others, in the last quarter century formal linguistics has opened up to a real experimental turn, characterized by a wide variety of techniques and methodologies.
48Extensive experimentation is essential if we wish to know how linguistic knowledge is represented in a speaker’s brain. This is a central theme in neurolinguistics. Here, it intersects with a typical theme of comparative linguistics, that of possible and impossible linguistic rules. If we compare languages, we find that certain rules, though easy to imagine, are not attested in any language. In some cases, there are clear reasons of principle for this exclusion. For example, consider rules that would systematically violate hierarchical sentence structures; these rules are never attested in the languages of the world. No language in the world places negation in a fixed position in the linear sequence of words, for instance as the third word of the sentence. And no language in the world forms an interrogative sentence by mirroring the word order of the declarative sentence.
49Andrea Moro and colleagues considered how possible and impossible rules are learned and represented in the brain.10 They showed that, even if we can learn both linguistically possible and linguistically impossible rules, our brains make a clear distinction between the two. Functional magnetic resonance imaging (fMRI) shows that linguistic circuits, primarily Broca’s area, are involved in learning possible rules, while other non-linguistic circuits are involved in learning impossible rules. We are capable of learning the latter, just as we are of learning all kinds of non-linguistic rules in everyday life, but our brain discriminates between the two types of rules and does not engage the language circuits in learning impossible linguistic rules.
50How does our brain compute recursive procedures, thus mastering the unlimited set of sentences in a language? Recursion in the brain is, in a way, the Holy Grail of neurolinguistic research. How should we proceed? First, we need to identify the neural structures that are involved in processing recursive structures, and then use imaging research and formal modelling to understand how this takes place.
51Pursuing this type of question, Stanislas Dehaene, Christophe Pallier and their colleagues have shown, using functional MRI, that certain areas of the linguistic circuit are progressively activated as the size of the phrases – and therefore the number of merge operations – increases; not only certain components of Broca’s area, but also the superior temporal sulcus. This activation occurs independently of lexical semantics; it is observed even if words are replaced by “made-up words” that have no meaning (for example: “the ranf is gorping the gibble”11). The circuit identified by these authors thus seems to respond selectively to syntactic complexity, to the repeated application of merge.
52A considerable amount of work remains to be done, of course, to reach a precise understanding of the nature and functioning of the brain’s computations involved in language. Formal linguists and neuroscientists do however now speak technical languages that are sufficiently close and compatible to allow for fruitful collaboration, and knowledge in this area is constantly advancing.
Invariance and variation
53Let us return to the question of the complexity of language and look at one of its most striking dimensions. If we compare languages, the most salient aspect is variability, which affects all levels of analysis: variability of sounds, of morphological forms, of syntax, of lexicon, perhaps even of meaning (although semantic variability is more controversial, and in any case less immediately visible). However, linguists cannot stop at the observation of variability; they must consider the general framework within which variability can exist. The very possibility of a general linguistics depends on this, and the founders of the discipline clearly established it as a baseline. In the Course in General Linguistics, compiled by Ferdinand de Saussure’s students in the early twentieth century on the basis of his Geneva teachings, we read:
The aims of linguistics will be:
a. to describe all known languages and record their history […];
b. to determine the forces operating permanently and universally in all languages, and to formulate general laws which account for all particular linguistic phenomena historically attested; [...]. (Saussure, 1916/1959: 6)12
Similarly, in his opening lesson of the comparative grammar course delivered at the Collège de France, on 13 February 1906, Antoine Meillet said:
The search for general laws, whether morphological or phonetic, should now be one of the main goals of linguistics. But, by their very definition, these laws go beyond the boundaries of language families; they apply to all of humanity. (Meillet, 1906: 91)13
According to Meillet, we cannot therefore limit ourselves to identifying generalizations within a family of languages, such as the Indo-European languages; we must aim higher, with the search for universal laws of language.
54Émile Benveniste likewise pointed out that:
[…] linguistics has a dual purpose: it is the science of language and the science of languages. This distinction, which is not always made, is necessary: language, a human faculty, a universal and immutable characteristic of Man, is something other than languages, which are always particular and variable, and in which it is realized. But [...] these different paths are often intertwined [...]. (Benveniste, 1966: 19)14
55The question then arises as to how to connect the detailed description of individual languages with the quest for the universals of language. How can we best express the intertwining of these lines of inquiry, to which Benveniste is referring? Various approaches proposed in the course of the twentieth century have sought to answer this question, focusing sometimes on variation and sometimes on invariance, each to the detriment of the other, although both are equally essential.
56A simple idea, introduced in generative grammar in the late 1970s, defined the modern approach to the question and profoundly influenced comparative studies in recent decades. It is conceivable that the human faculty of language specifies a universal grammar, a system of mental calculation dedicated to language. This system consists of certain universal rules, subject to general principles of application and stipulating a finite number of parameters. The parameters are choice points left open by the system, which can typically assume one of two possible values. Equipped with this system, children set the parameters on the basis of their experience, the utterances they hear in their linguistic environment, and in this way acquire the grammar of their language, which is one of the possible realizations of the general system. From this point of view, language acquisition is a procedure of selection that children perform, based on their experience, from a set of options generated by their mind. This manner of conceiving of learning follows from findings on the initial cognitive state of the baby15, established in the research of Jacques Mehler and his extraordinary schools of experimentalists in France and Italy, as well as in the research of Jean-Pierre Changeux and Stanislas Dehaene on the neural bases of learning. From the perspective of modern science, these ideas and findings revisit the classical debate on the nature of knowledge, with its ancient roots in Greek thought.
57To stick to the linguistic aspects, according to the parametric model, human languages are variations on a single underlying theme, the universal grammar. Variability is important, but limited to the parametric choice points, and strongly constrained by the invariant structure of the system.
58Let us reconsider what I have said about linguistic structures, in light of parameter theory. Merge is a universal mechanism. All languages we know build structures on this basis. The resulting configurations are hierarchically organized; a fundamental hierarchical relation such as c-command and certain principles of locality are universal in scope.
59But the hierarchical structures still need to be linearized if they are to be usable. This is the principle of linearity of the signifier that Saussure spoke of. And the linear order is subject to variation. There is, prima facie, wide variability in the organization of word order in the languages of the world. But closer inspection shows that the variation is reduced to one of two possible forms of linearization. The two values of the linearization parameter are:

a. head-initial: the head precedes its complement;
b. head-final: the head follows its complement.
60The head is the element that selects, that determines the category of the phrase. For example, the verb determines the category of the verbal phrase and selects its complements, primarily the direct object. We thus have the verb-object order, known as “VO” (read book), in head-initial languages, and the object-verb order, known as “OV” (book read), in head-final languages. The exact nature of this parameter is a matter of debate; while in the classical approach, it could be conceived of as a specification that operates directly on the application of merge, Kayne (1994) treats it in terms of movement, and Berwick & Chomsky (2016) consider that this parameter operates on the externalization of structures that are not linearized within the grammar. I will not go into this important debate here and will focus instead on the classical approach in this illustration.
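As a rough sketch of the classical view just described, here is a toy linearization in which each phrase is simply a head plus its complement (complementizer above auxiliary, above verb, above object); the hierarchy is held constant and only the value of the parameter changes. The encoding and the example words are illustrative assumptions, not the actual structures discussed below.

```python
# Toy hierarchical structure: each phrase is a (head, complement) pair,
# with the complement possibly a phrase itself: C > Aux > V > Object.
CLAUSE = ("that", ("has", ("read", "the book")))

def linearize(node, head_initial=True):
    """Flatten a (head, complement) structure into a word string.
    The hierarchy stays constant; only the ordering parameter varies."""
    if isinstance(node, str):
        return node
    head, complement = node
    comp = linearize(complement, head_initial)
    return f"{head} {comp}" if head_initial else f"{comp} {head}"

print(linearize(CLAUSE, head_initial=True))
# -> "that has read the book"   (head-initial, VO-type order)
print(linearize(CLAUSE, head_initial=False))
# -> "the book read has that"   (head-final, OV-type order)
```

Because the parameter only affects the ordering of head and complement, while the hierarchical relations themselves stay untouched, the two outputs come out as mirror images of each other at every level of embedding.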
61This parameter may be set differently for different categories in a particular language. However, as Joseph Greenberg discovered, languages show a strong tendency towards consistency, towards harmonizing orders across the syntactic tree (Greenberg, 1963).16 English and Japanese are systematically organized as head-initial and head-final, respectively. In English, as in French, the verb precedes the direct object, the auxiliary precedes the verb phrase, the complementizer precedes the sentence. At each level, Japanese chooses the opposite order:

62As always, the corresponding trees give richer information. They show that the structures in the two languages are almost, but not entirely, mirror images of one another:

63This is so because certain properties remain invariant: the subject-predicate order remains the same; indeed, this order is constant in the vast majority of the world’s languages. In the few languages where the subject follows the predicate (e.g. verb-object-subject/VOS languages, such as Malagasy), an analysis of movement of the predicate to the beginning of the sentence seems plausible, based on an underlying structure with a final predicate: S [V O] → [V O] S.
64The idea of organizing the language system into principles and parameters has proved to be of extraordinary heuristic value, and comparative generative grammar has flourished on this basis (Chomsky, 1981; Rizzi, 1982; Kayne, 1983). Since the 1980s, the principles and parameters approach has led to an enormous enrichment of theoretically guided descriptions of the world’s languages, and of the empirical basis of theoretical models. Many aspects of these systems, even fundamental ones, are still subject to discussion: what is the format of the parameters? In other words, can we define what a possible parameter is? And what is the locus of the parameters, the place in the general system where the parametric information is stored? These questions are the subject of a thriving debate17 which, about forty years after the introduction of these notions, shows the extraordinary vitality and relevance of this approach and of the resulting comparative research.
The cartography of syntactic structures
65Syntactic structures are complex objects, not only in terms of the variation observed, but also of the richness of their internal articulation. For a quarter of a century, some linguists, fascinated by this form of complexity, have pursued research projects that attempt to map the structures. The aim is to draw maps, in the form of tree structures, of syntactic configurations in as much detail as possible. But what exactly is being mapped? We can certainly draw a map of a delimited space, but as I have emphasized, syntax is an infinite domain. How can an infinite space be mapped?
66To this we can reply that the space of syntactic structures is indeed infinite, being generated by recursive procedures of merge, according to the minimalist hypothesis. However, merge does not take place in a vacuum: it needs a lexicon, an inventory of elements that can be combined. Now, the elements of the lexicon, whether they are content elements (nouns, verbs, adjectives, etc.) or functional elements (articles, auxiliaries, complementizers, etc.), have particular combinatorial requirements, which determine relatively stable elementary tree assemblies constrained by the lexicon. Thus, on the one hand we have the atoms of syntax, the elements of the lexicon and the elementary trees that they project18, and on the other hand, stable aggregations of atoms, sorts of complex molecules. It is this molecular structure that is the subject of the cartographic analysis.
67That the tree-like structures created by merge are organized into hierarchical zones is a classic observation. For example, we can look at the structure of the sentence as being made up of the three hierarchically ordered zones circled in (21): the verbal zone, which expresses the argument structure (who does what to whom), the inflectional zone, which expresses the tense, among other things (and is sometimes labeled as T, sometimes as I, as in (22)a), and the peripheral zone of the complementizer, which expresses, among other things, the type or force of the sentence (declarative, interrogative, imperative, etc.).

68But (21) is only a very first approximation. As soon as we look at the details, we realize that these representations need to be enriched considerably. In particular, it would be useful to split the circled areas into sequences of smaller components, still organized hierarchically, while keeping the same basic geometry. The actual cartographic work can then begin.
69Thus, the progressive splitting of the inflectional zone (which bears the marks of the temporal, modal and aspectual inflections) into finer components, initiated by Jean-Yves Pollock in the mid 1980s (Pollock, 1989) and continued by several linguists, gave rise ten years later to Guglielmo Cinque’s detailed map, shown in representation (22)b, informed by a rich comparative database (Cinque, 1999).

70At the same time, in my own work, the initial system of the sentence, the complementizer system in (23)a, could be detailed under the cartographic lens, with the “fine structure” of this sentence periphery thus taking the form of another expanded functional sequence, as in (23)b.

71Detailed maps of all major types of phrases – nominal, verbal, adjectival, prepositional – have likewise been identified.19 The mapping projects, initially designed (at least for the left periphery) to detail certain areas of the syntactic tree in Romance and Germanic languages, soon showed a general purpose. Over the course of two decades, the empirical scope of these studies has been extended to many languages and language families. In addition to the initial core of Romance and Germanic languages (Rizzi, 1997, 2000, 2004; Cinque, 1999, 2002; Belletti, 2004a-b, 2009; Poletto, 2000; Laenzlinger, 1997; Cardinaletti, 2004; Giusti, 2002; Benincà and Munaro, 2008; Berthelot, 2017; Botteri, 2018; Grewendorf, 2002; Haegeman, 2012; Samo, 2019; among many other works), and among the Indo-European languages, some structural areas of Celtic (Roberts, 2004) and Slavic languages (Krapova and Cinque, 2008), Modern Greek (Roussou, 2000), Armenian (Giorgi and Haroutyunian, 2016), and Persian (Ilkhanipour, 2014) have been mapped. Outside the Indo-European family, there are cartographic analyses of Basque (Irurtzun, 2021), Finno-Ugric languages (Puskas, 2000; Jokilehto, 2014), Semitic languages (Shlonsky, 1998, 2014), and Cushitic languages (Frascarelli and Puglielli, 2007). Much research has been devoted to mapping the languages of sub-Saharan Africa, starting with the seminal work of Enoch Aboh on Gbe languages, but also with several works on Bantu languages (Aboh, 2004; Biloa, 2013, Bassong, 2010; Torrence, 2013). There has been considerable interest in mapping Chinese (Tsai, 2008, 2015; Pan, 2015, 2019; Si, 2017), Japanese (Endo, 2007; Saito, 2013) and, more recently, Korean (Korean Linguistic Society workshop, 2018), as well as Dravidian languages (Jayaseelan, 2008). There is also cartographic work on native American (Speas and Tenny, 2003; Nevins and Seki, 2017) and Australian (Legate, 2002, 2008) languages, as well as Austronesian languages (Pearce, 1999; Tsai et al., 2015) and Creole languages (Durrleman, 2008). Some sign languages have been mapped (Aboh and Pfau, 2012). Mapping research has informed studies of Romance and Germanic dialectology (Poletto, 2000; Benincà, 1996; Cruschina, 2012; Manzini and Savoia, 2005; Di Domenico, 2012; Pescarini, 2018), as well as studies of historical linguistics and of classical languages (Salvi, 2005; Danckaert, 2012; Benincà, 2006; Franco, 2010).20
72The cartographic projects thus have a broad descriptive dimension: we want to know which maps are empirically correct for the different languages of the world. And we want to ask the fundamental question of comparative linguistics, the question of invariance and variation: what properties remain constant across languages? What kinds of variation are found and what are the limits of possible variation? Here cartographic studies are fruitfully integrated with parametric models, promoting research on the parameter systems that affect the variation of structures.
73Other crucial theoretical questions are raised. Can we deduce empirically observed properties in cartographic representations from certain fundamental principles that govern our linguistic capabilities? For example, can we derive certain hierarchical orders observed in functional sequences from principles of locality, or from principles governing the interface between form and meaning? In this way, cartographic research can function as a powerful generator of questions for theoretical research, stimulating the construction of general models and determining a significant enhancement of the empirical basis of theoretical studies.
Mapping language acquisition
74Cartographic research can also extend beyond the sole dimension of adult grammar and interact with the study of language acquisition. Here, the question concerns the ways in which the complex structures that we have identified come into being in the development of language skills in children.
75Since the mission of the Collège de France is to present “research in the making”, I would like to conclude by briefly mentioning an ongoing research project that I am conducting in collaboration with Naama Friedmann from Tel Aviv University and Adriana Belletti from the University of Siena. This work poses the question of language development from the perspective of cartography: how do complex trees “grow” in the mind and brain of the child who is learning the language? The working hypothesis is very simple: the acquisition of complex structures proceeds from the bottom up. The child first acquires the lowest areas of the syntactic maps and then incrementally adds higher and higher areas, always following this bottom-up modality in the growth of the tree, without ever skipping an area. Applying this logic to the acquisition of the sentence periphery in Hebrew, we were able to identify three stages corresponding to the three hierarchically ordered areas of the tree structure shown in (24) (Friedmann, Belletti and Rizzi, 2021).

76In the first stage, the child shows no use of peripheral structures; only manifestations of the basic core of the sentence are found in their productions. The second stage shows the use of the lower part of the sentence periphery, with the appearance of partial (wh-) questions (Where are you going?) and adverb fronting (now I’m reading). The third stage manifests constructions that also involve the upper part of the periphery: forms of finite subordination (you see that mummy is leaving!), relative clauses, structures with topicalization (this book, I’m taking), and questions with why, which occupies a distinct, and higher, position compared to the other interrogative elements who, what, where, when, etc.
77We can thus identify specific temporal stages corresponding to the mastery of growing areas of the tree. Hence, the results of the cartographic research can provide an organizing principle for the study of language development. We are only at the beginning of this endeavour; it will be necessary to verify whether the temporal sequence observed in Hebrew is generalizable across languages and whether the same bottom-up organizing principle proves to be applicable and useful for analysing the development of other parts of the syntactic tree. This kind of approach, developed in a framework of fundamental research, has an important potential for applied research. It can, for example, provide a characterization of typical development, to be used as a baseline for the study of developmental pathologies, as well as for the study of the loss of linguistic abilities in certain adult pathologies, such as aphasia.
Conclusion
78What could be more normal for our species than combining words to organize and express complex thoughts, or watching our children learn to speak? Science’s job is to provide answers to the questions that ordinary experience cannot illuminate: how is all of this possible? What are the ingredients, what is the functioning of these capabilities, of these processes of learning that are constantly unfolding before our eyes? General linguistics endeavours not only to formulate these questions with precision, but also to work towards constructing elements of an answer, always partial and temporary, yet at the same time ever more refined and systematic.
79If the perspective that I have sought to illustrate is oriented in the right direction, then at the base of our remarkable linguistic capabilities there is an extremely simple and general combinatorial procedure that puts two linguistic expressions together to form a third. Yet this procedure is recursive and, by successive applications starting from a well-structured lexicon, it allows us, step by step, to build very complex configurations capable of expressing “our deepest thoughts”, as Galileo put it. If we look at these complex structures under a cartographic magnifying glass, we nevertheless see very simple elementary configurations, where the elements combine according to a constant tree geometry. In his book Les Atomes, French physicist Jean Perrin describes atomic theory as the effort to “explain the complicated visible by the simple invisible” (Perrin, 1913) – an oft-quoted characterization that has a much more general significance. The research that I have illustrated here attempts to conform, in its own field, to this explanatory ideal: the precise and detailed description of a complex universe of syntactic configurations is reduced to elementary mechanisms and configurations of extreme simplicity. However, the search for explanation in this approach always strives to comply with the requirements of description, which is the basis and the essential source of the explanatory dimension. The difficult but crucial challenge is to lose none of the expressive richness, fine articulation, and variety of the empirical domain described in the equally fundamental search for an explanatory model.
Acknowledgements
I would like to thank Adriana Belletti and Frédérique Berthelot for reading and commenting on an early version of this inaugural lecture, and especially Frédérique Berthelot for her editorial assistance.
Bibliography
Aboh Enoch and Pfau Roland, “Spatial Adpositions in Sign Language”, MIT Working Papers in Linguistics, 2012.
Aboh Enoch, The Morphosyntax of Complement-Head Sequences: Clause Structure and Word Order Patterns in Kwa, New York, Oxford University Press, 2004.
Arnauld Antoine and Lancelot Claude, Grammaire générale et raisonnée, Paris, Éditions Allia, 1660/2016. [English translation by Bernard E. Rollin and Jacques Rieux, The Port-Royal Grammar: General and Rational Grammar, Ann Arbor, University of Michigan, 1975.]
Baker Mark, The Atoms of Language, New York, Oxford University Press, 2001.
Bassong Paul Roger, The Structure of the Left Periphery in Basaa [manuscript], University of Yaounde I, Cameroun, 2010.
Belletti Adriana (ed.), Structures and Beyond, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 3, 2004b.
Belletti Adriana, “Aspects of the Low IP Area”, in Luigi Rizzi (ed.), The Structure of CP and IP, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 2, 2004a, pp. 16-51.
Belletti Adriana, “Inversion as Focalization”, in Aafke C.J. Hulk and Jean-Yves Pollock (eds.), Subject Inversion in Romance and the Theory of Universal Grammar, Oxford, Oxford University Press, coll. “Oxford Studies in Comparative Syntax”, 2001, pp. 60-90.
Belletti Adriana, Structures and Strategies, London-New York, Routledge, 2009.
Benincà Paola and Munaro Nicola (eds.), Mapping the Left Periphery, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 5, 2010, pp. 3-15.
Benincà Paola and Poletto Cecilia, “Topic, Focus and V2: Defining the CP Sublayers”, in Luigi Rizzi (ed.), The Structure of CP and IP, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 2, 2004a.
Benincà Paola, “A Detailed Map of the Left Periphery of Medieval Romance”, in Raffaella Zanuttini, Héctor Campos, Elena Herburger and Paul H. Portner (eds.), Crosslinguistic Research in Syntax and Semantics. Negation, Tense, and Clausal Architecture, Washington (D.C.), Georgetown University Press, 2006.
Benincà Paola, “La struttura della frase esclamativa alla luce del dialetto padovano”, in Paola Benincà, Guglielmo Cinque, Tullio De Mauro and Nigel Vincent (eds.), Italiano e dialetti nel tempo, Rome, Bulzoni, 1996.
Benveniste Émile, Problèmes de linguistique générale, Paris, Gallimard, 1966.
Berthelot Frédérique, Movement of and out of Subjects in French, PhD thesis, University of Geneva, 2017, https://archive-ouverte.unige.ch/unige:96575.
Berwick Robert and Chomsky Noam, Why Only Us: Language and Evolution, Cambridge (Mass.), MIT Press, 2016.
Bianchi Valentina and Frascarelli Mara, “Is Topic a Root Phenomenon?”, Iberia, vol. 2, 2010, pp. 43-48.
Bianchi Valentina, Bocci Giuliano and Cruschina Silvio, “Focus and its implicatures”, in Enoch Aboh, Jeannette Schaeffer and Petra Sleeman (eds.), Romance Languages and Linguistic Theory: Selected Papers from Going Romance 2013, Amsterdam, John Benjamins, 2014.
Biloa Edmond, The Syntax of Tuki: A Cartographic Approach, Amsterdam, John Benjamins, 2013.
Bocci Giuliano, The Syntax-Prosody Interface: A Cartographic Perspective with Evidence from Italian, Amsterdam, John Benjamins, 2013.
Boolos George, Burgess John and Jeffrey Richard, Computability and Logic: Fourth Edition, Cambridge, Cambridge University Press, 2002.
Botteri Daniele, Aspects of the Italian Interrogative System, PhD thesis, University of Siena, 2018.
Cardinaletti Anna, “Towards a Cartography of Subject Positions”, in Luigi Rizzi (ed.), The Structure of CP and IP, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 2, 2004a, pp. 115-165.
Changeux Jean-Pierre, L’Homme de vérité, Paris, Odile Jacob, 2002.
Changeux Jean-Pierre, L’Homme neuronal, Paris, Fayard, 1983.
Chomsky Noam, Lectures on Government and Binding, Dordrecht, Foris Publications, 1981.
Chomsky Noam, Syntactic Structures, The Hague, Mouton, 1957.
Chomsky Noam, The Logical Structure of Linguistic Theory, PhD thesis, University of Pennsylvania, Philadelphia; New York, Plenum Publishing Corporation, 1955/1975.
Chomsky Noam, The Minimalist Program, Cambridge (Mass.), MIT Press, 1995.
Chomsky Noam, The UCLA Lectures, transcription by Robert Freidin, 2020, available on LingBuzz, https://ling.auf.net/lingbuzz/005485.
Cinque Guglielmo and Krapova Iliyana, “DP and CP: a Relativized Minimality Approach to One of their Non Parallelisms”, paper delivered at the CIL13, University of Geneva, 2013.
Cinque Guglielmo (ed.), Functional Structure in DP and IP, New York, Oxford University Press, 2002.
Cinque Guglielmo, “Deriving Greenberg’s Universal 20 and its Exceptions”, Linguistic Inquiry, vol. 36, no. 3, 2005.
Cinque Guglielmo, Adverbs and Functional Heads: A Cross-linguistic Perspective, Oxford, Oxford University Press, 1999.
Cruschina Silvio, Discourse-Related Features and Functional Projections, New York, Oxford University Press, 2012.
Danckaert Lieven, Latin Embedded Clauses: The Left Periphery, Amsterdam, John Benjamins, 2012.
Darwin Charles, The Descent of Man, and Selection in Relation to Sex, London, John Murray, 1871.
Dehaene Stanislas, Apprendre !, Paris, Odile Jacob, 2018.
Dehaene Stanislas, Meyniel Florent, Wacongne Catherine, Wang Liping and Pallier Christophe, “The Neural Representation of Sequences: From Transition Probabilities to Algebraic Patterns and Linguistic Trees”, Neuron, vol. 88, no. 1, 2015, doi: 10.1016/j.neuron.2015.09.019.
Descartes René, Le Discours de la méthode, Paris, Union générale d’éditions, coll. “10-18”, 1637/1951. [English translation by John Veitch, Discourse on the Method of Rightly Conducting the Reason, and Seeking the Truth in the Sciences, Edinburgh, T. Constable, Printer to Her Majesty, 1850.]
Di Domenico Elisa, “La focalizzazione in perugino”, Rivista italiana di linguistica e dialettologia, vol. 14, pp. 145-170, 2012.
Durrleman Stéphanie, The Syntax of Jamaican Creole, Amsterdam, John Benjamins, 2008.
Endo Yoshio, Locality and Information Structure, Amsterdam, John Benjamins, 2007.
Franco Irene, Verbs, Subjects and Stylistic Fronting. A Comparative Analysis of the Interaction of CP Properties with Verb Movement and Subject Positions in Icelandic and Old Italian, PhD thesis, University of Siena, 2009.
Frascarelli Mara and Hinterhölzl Roland, “Types of Topics in German and Italian”, in Susanne Winkler and Kerstin Schwabe (eds.), On Information Structure, Meaning and Form, Amsterdam, John Benjamins, 2007, pp. 87-116.
Frascarelli Mara and Puglielli Annarita, “Focus in the Force-Fin System. Information Structure in Cushitic Languages”, in Enoch Aboh, Katharina Hartmann and Malte Zimmermann (eds.), Focus Strategies in African Languages, Berlin, Mouton de Gruyter, 2007, pp. 161-184.
Friedmann Naama, Belletti Adriana and Rizzi Luigi, “Growing trees: The acquisition of the left periphery”, Glossa: a journal of general linguistics, vol. 6, no. 131, pp. 1-38, 2021, doi: 10.16995/glossa.5877.
Galilei Galileo, Dialogo Sopra i Due Massimi Sistemi del Mondo, Torino, Einaudi, 1630/1970. [English translation by Stillman Drake, Dialogue Concerning the Two Chief World Systems – Ptolemaic & Copernican, Berkeley, University of California Press, 1953.]
Giorgi Alessandra and Haroutyunian Sona, “Word Order and Information Structure in Modern Eastern Armenian”, Journal of the Society For Armenian Studies, vol. 25, 2016, pp. 185-200.
Giusti Giuliana, “The Functional Structure of Noun Phrases: A Bare Phrase Structure Approach”, in Guglielmo Cinque (ed.), Functional Structure in DP and IP, New York, Oxford University Press, 2002, pp. 54-90.
Greenberg Joseph, “Some Universals of Grammar with Particular Reference to the Order of Meaningful Elements”, in Joseph Greenberg (ed.), Universals of Language, Cambridge (Mass.), MIT Press, 1963.
Grewendorf Gunther, “Left Dislocation as Movement”, in Sara Mauck and Jennifer Mittelstaedt (eds.), Georgetown University Working Papers in Theoretical Linguistics, vol. 2, 2002, pp. 31-38.
Haegeman Liliane, Adverbial Clauses, Main Clause Phenomena, and Composition of the Left Periphery, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 8, 2012.
Hagège Claude, “Les pronoms logophoriques”, Bulletin de la Société linguistique de Paris, no. 69, 1974, pp. 287-310.
Hager M’Boua Clarisse, Structure de la phrase en Abidji, PhD thesis, University of Geneva, Mauritius, Éditions universitaires européennes, 2014/2020.
Ilkhanipour Negin, “On the CP Analysis of Persian Finite Control Constructions”, Linguistic Inquiry, vol. 45, no. 2, 2014, pp. 323-331.
Irurtzun Aritz, “Why-Questions Break the Residual V2 Restriction (in Basque and Beyond)”, forthcoming in Gabriela Soare (ed.), Why Is ‘Why’ Unique? Its Syntactic and Semantic Properties, Amsterdam, John Benjamins, 2021.
Jayaseelan Karattuparambil, “Topic, Focus and Adverb Positions in Clause Structure”, Nanzan Linguistics, vol. 4, 2008, pp. 43-68.
Jokilehto Dara, “Finnish Contrastive Topics Get Passports to the Left Periphery: Refining an RM Approach”, paper delivered at CECIL’S 4, 2014, https://cecils.webclass.co/programme/Book%20of%20abstracts.pdf#page=40.
Karimi Simin and Piattelli-Palmarini Massimo (eds.), Linguistic Analysis, vol. 41, nos. 3-4, 2017.
Kayne Richard, Connectedness and Binary Branching, Dordrecht, Foris Publications, 1984.
Kayne Richard, The Antisymmetry of Syntax, Cambridge (Mass.), MIT Press, 1994.
Koopman Hilda and Guardiano Cristina, “Managing Data in TerraLing, a Large-Scale Cross-Linguistic Database of Morphological, Syntactic, and Semantic Patterns”, in Andrea Berez-Kroeker, Bradley McDonnell, Eve Koller and Lauren Collister (eds.), The Open Handbook of Linguistic Data Management, Cambridge (Mass.), MIT Press, 2022.
Krapova Iliyana and Cinque Guglielmo, “On the Order of WH-Phrases in Bulgarian Multiple WH-Fronting”, in Gerhild Zybatow, Luka Szucsich, Uwe Junghanns and Roland Meyer (eds.), Formal Description of Slavic Languages: The Fifth Conference, Leipzig 2003, Frankfurt am Main, Peter Lang, coll. “Linguistik International”, 2008, pp. 318-336.
Laenzlinger Christopher, Comparative Studies in Word Order Variations: Pronouns, Adverbs and German Clause Structure, Amsterdam, John Benjamins, 1997.
Legate Julie, “Warlpiri and the Theory of Second Position Clitics”, Natural Language and Linguistic Theory, vol. 26, no. 1, 2008.
Legate Julie, Warlpiri: Theoretical Implications, PhD thesis, MIT, 2002, https://www.ling.upenn.edu/~jlegate/main.pdf.
Lévi-Strauss Claude, Le Cru et le Cuit, Paris, Plon, 1964.
Manzini Maria Rita and Savoia Leonardo, I dialetti italiani e romanci, Alessandria, Edizioni dell’Orso, 2005.
Manzini Maria Rita, “Syntactic conditions on phonological rules”, MIT Working Papers in Linguistics, vol. 5, 1983, pp. 1-9.
Mehler Jacques and Dupoux Emmanuel, Naître humain, Paris, Odile Jacob, 1990.
Meillet Antoine, “L’état actuel des études de linguistique générale”, opening lecture of the course in comparative grammar at the Collège de France, 13 February 1906.
Mioto Carlos, A periferia esquerda no português brasileiro [manuscript], Universidade Federal de Santa Catarina, University of Siena, 1999.
Moro Andrea, Dynamic Antisymmetry, Cambridge (Mass.), MIT Press, 2000.
Moro Andrea, The Boundaries of Babel, Cambridge (Mass.), MIT Press, 2008.
Nelson Matthew, El Karoui Imen, Giber Kristof, Yang Xiaofang, Cohen Laurent, Koopman Hilda, Cash Sydney, Naccache Lionel, Hale John, Pallier Christophe and Dehaene Stanislas, “Neurophysiological Dynamics of Phrase-Structure Building during Sentence Processing”, Proceedings of the National Academy of Sciences of the USA, vol. 114, no. 18, 2017.
Pallier Christophe, Devauchelle Anne-Dominique and Dehaene Stanislas, “Cortical Representation of the Constituent Structure of Sentences”, Proceedings of the National Academy of Sciences of the USA, vol. 108, no. 6, 2011, pp. 2522-2527.
Pan Victor, “Mandarin Peripheral Construals at Syntax-Discourse Interface”, The Linguistic Review, vol. 32, no. 4, 2015, pp. 819-868.
Pan Victor, Architecture of the Periphery in Chinese: Cartography and Minimalism, London-New York, Routledge, coll. “Routledge Studies on Chinese Linguistics”, 2019.
Paul Waltraud, “Low IP area and left periphery in Mandarin Chinese”, Recherches linguistiques de Vincennes, 2005, pp. 111-134.
Paul Waltraud, New Perspectives on Chinese Syntax, Berlin-Boston, De Gruyter Mouton, 2014.
Peano Giuseppe, Aritmetica Generale e Algebra Elementare, Turin, Paravia, 1902.
Pearce Elizabeth, “Topic and Focus in a Head-initial Language: Maori”, Toronto Working Papers in Linguistics, vol. 16, 1999.
Perrin Jean, Les Atomes, Paris, Félix Alcan, 1913; new edition, Paris, Gallimard, 1991.
Pescarini Diego, “Subject and Impersonal Clitics in Northern Italian Dialects”, in Roberto Petrosino, Pietro Cerrone and Harry van der Hulst (eds.), From Sounds to Structures, Berlin, De Gruyter Mouton, 2018, pp. 409-431.
Pescarini Diego, La Microvariation syntaxique dans les langues romanes. Un modèle paramétrique, HDR dissertation, Université Côte d’Azur, 2020.
Poletto Cecilia, The Higher Functional Field: Evidence from Northern Italian Dialects, New York, Oxford University Press, 2000.
Pollock Jean-Yves, “Verb Movement, Universal Grammar and the Structure of IP”, Linguistic Inquiry, vol. 20, no. 3, 1989, pp. 365-424.
Post Emil, “Finite Combinatory Processes. Formulation 1”, Journal of Symbolic Logic, no. 1, 1936, pp. 103-105.
Puskas Genoveva, Word Order in Hungarian: The Syntax of A’-positions, Amsterdam-Philadelphia, John Benjamins, 2000.
Reinhart Tanya, The Syntactic Domain of Anaphora, PhD thesis, MIT, 1976, https://dspace.mit.edu/bitstream/handle/1721.1/16400/03491311-MIT.pdf?sequence=1&isAllowed=y.
Rizzi Luigi and Bocci Giuliano, “The Left Periphery of the Clause: Primarily Illustrated for Italian”, in The Blackwell Companion to Syntax, 2nd edition, 2015.
Rizzi Luigi and Cinque Guglielmo, “Functional Categories and Syntactic Theory”, Annual Review of Linguistics, vol. 2, 2016.
Rizzi Luigi and Savoia Leonardo, “Conditions on /u/ Propagation in Southern Italian Dialects: A Locality Parameter for Phonosyntactic Processes”, in Adriana Belletti (ed.), Syntactic Theory and the Dialects of Italy, Turin, Rosenberg & Sellier, 1993.
Rizzi Luigi, “Cartography, Criteria, and Labeling”, in Ur Shlonsky (ed.), Beyond the Functional Sequence, New York, Oxford University Press, 2015, pp. 314-338.
Rizzi Luigi, “Intervention Effects in Grammar and Language Acquisition”, Probus, vol. 30, no. 2, 2018, pp. 339-367.
Rizzi Luigi, “Labeling, Maximality, and the Head-Phrase Distinction”, The Linguistic Review, vol. 33, 2016, pp. 103-127.
Rizzi Luigi, “Locality and Left Periphery”, in Adriana Belletti (ed.), Structures and Beyond, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 3, 2004b, pp. 223-251.
Rizzi Luigi, “Notes on Cartography and Further Explanation”, Probus, vol. 25, no. 1, 2013.
Rizzi Luigi, “On the Cartography of Syntactic Structures”, in Luigi Rizzi (ed.), The Structure of CP and IP, Oxford-New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 2, 2004a.
Rizzi Luigi, “On the Position ‘Int(errogative)’ in the Left Periphery of the Clause”, in Guglielmo Cinque and Giampaolo Salvi (eds.), Current Studies in Italian Syntax: Essays Offered to Lorenzo Renzi, Amsterdam, Elsevier, 2001, pp. 267-296.
Rizzi Luigi, “The Fine Structure of the Left Periphery”, in Liliane Haegeman (ed.), Elements of Grammar: A Handbook of Generative Syntax, Dordrecht, Kluwer, 1997, pp. 281-337.
Rizzi Luigi, Comparative Syntax and Language Acquisition, London, Routledge, 2000.
Rizzi Luigi, Issues in Italian Syntax, Dordrecht, Foris Publications, 1982; new edition, The Hague, Mouton de Gruyter, 1993.
Rizzi Luigi, Relativized Minimality, Cambridge (Mass.), MIT Press, 1990.
Roberts Ian, “The C-system in Brythonic Celtic languages, V2, and the EPP”, in Luigi Rizzi (ed.), The Structure of CP and IP, coll. “The Cartography of Syntactic Structures”, vol. 2, 2004a, pp. 297-328.
Roberts Ian, Parameter Hierarchies and Universal Grammar, New York, Oxford University Press, 2019.
Rousseau Jean-Jacques, Essai sur l’origine des langues, Paris, L’Harmattan, 1781/2009. [English translation by John H. Moran and Alexander Gode, On the Origin of Language, Chicago, The University of Chicago Press, 1966.]
Roussou Anna, “On the Left Periphery: From Modal Particles to Complementisers”, Journal of Greek Linguistics, vol. 1, 2000, pp. 65-94.
Ruwet Nicolas, Introduction à la grammaire générative, Paris, Plon, 1967. [English translation by Norval S.H. Smith, An Introduction to Generative Grammar, Amsterdam, North-Holland Publishing Company, 1973.]
Saito Mamoru, “Sentence Types and the Japanese Right Periphery”, in Gunther Grewendorf and Thomas Zimmerman (eds.), Discourse and Grammar, Boston-Berlin, De Gruyter Mouton, coll. “Studies in Generative Grammar”, 2013.
Salvi Giampaolo, “Some Firm Points on Latin Word Order: The Left Periphery”, Universal Grammar in the Reconstruction of Ancient Languages, vol. 83, 2005.
Samo Giuseppe, A Criterial Approach to the Cartography of V2, Amsterdam-Philadelphia, John Benjamins, 2019.
Saussure Ferdinand de, Cours de linguistique générale, Paris, Payot, 1916/1995. [English translation by Wade Baskin, Course in General Linguistics, New York, Philosophical Library, 1959.]
Schlenker Philippe, Chemla Emmanuel, Schel Anne M., Fuller James, Gautier Jean-Pierre, Kuhn Jeremy, Veselinović Dunja, Arnold Kate, Cäsar Cristiane, Keenan Sumir, Lemasson Alban, Ouattara Karim, Ryder Robin and Zuberbühler Klaus, “Formal Monkey Linguistics”, Theoretical Linguistics, vol. 42, nos. 1-2, 2016.
Seki Lucy and Nevins Andrew, Strategies of Embedding and the Complementizer Layer in Kamayurà, 2017.
Shlonsky Ur, “Topicalization and Focalization: A Preliminary Exploration of the Hebrew Left Periphery”, in Anna Cardinaletti, Guglielmo Cinque and Yoshio Endo (eds.), Peripheries, Tokyo, H. Syobo, 2014, pp. 327-341.
Shlonsky Ur, Semitic Clitics, 1994, https://archive-ouverte.unige.ch/unige:8366.
Si Fuzhen (ed.), Studies on Syntactic Cartography, Beijing, China Social Sciences Press, 2017.
Speas Peggy and Tenny Carol, “Configurational Properties of Point of View Roles”, in Anna Maria Di Sciullo (ed.), Asymmetry in Grammar, Amsterdam, John Benjamins, 2003, pp. 315-345.
Tettamanti Marco, Alkadhi Hatem, Moro Andrea, Perani Daniela, Kollias Spyros and Weniger Dorothea, “Neural Correlates for the Acquisition of Natural Language Syntax”, NeuroImage, vol. 17, 2002, pp. 700-709.
Torrence Harold, The Clause Structure of Wolof: Insights into the Left Periphery, Amsterdam-Philadelphia, John Benjamins, 2013.
Tsai Wei-Tien Dylan (ed.), The Cartography of Chinese Syntax, New York, Oxford University Press, coll. “The Cartography of Syntactic Structures”, vol. 11, 2015.
Tsai Wei-Tien Dylan, “Left Periphery and How-Why Alternations”, Journal of East Asian Linguistics, vol. 17, 2008, pp. 83-115.
Turing Alan, “Computing Machinery and Intelligence”, Mind, vol. 59, no. 236, 1950, pp. 433-460.
Turing Alan, “On Computable Numbers, with an Application to the Entscheidungsproblem”, Proceedings of the London Mathematical Society, 2nd series, vol. 42, 1936, pp. 230-265.
Footnotes
1 “Car c’est une chose bien remarquable qu’il n’y a point d’hommes si hébétés et si stupides […] qu’ils ne soient capables d’arranger ensemble diverses paroles, et d’en composer un discours par lequel ils fassent entendre leurs pensées ; et qu’au contraire il n’y a point d’autre animal, tant parfait et tant heureusement né qu’il puisse être, qui fasse le semblable.” (Descartes, 1637/1951: 86)
2 “Car on peut bien concevoir qu’une machine soit tellement faite qu’elle profère des paroles […], mais non pas qu’elle les arrange diversement, pour répondre au sens de tout ce qui se dira en sa présence, ainsi que les hommes les plus hébétés peuvent faire.” (Descartes, 1637/1951: 86)
3 “[…] ma sopra tutte le invenzioni stupende, qual eminenza di mente fu quella di colui che s’immaginò di trovar modo di comunicare i suoi più reconditi pensieri a qualsivoglia altra persona, benché distante per lunghissimo intervallo di luogo e di tempo? … E con qual facilità? Con i vari accozzamenti di venti caratteruzzi sopra una carta.” (Galileo, 1630/1970: 130)
4 “Les phonèmes, les morphèmes, les mots […] peuvent être comptés ; ils sont en nombre fini. Les phrases, non. […] La phrase, création indéfinie, variété sans limite, est la vie même du langage en action.” (Benveniste, 1966: 129)
5 “La syntaxe n’attend pas pour se manifester qu’une série théoriquement illimitée d’événements aient pu être recensés, parce qu’elle consiste dans le corps de règles qui préside à leur engendrement.” (Lévi-Strauss, 1964: 15-16)
6 On recursion theory: Turing, 1936; Post, 1936; and the discussion in Boolos, Burgess and Jeffrey, 2002.
7 In (8) the lexical verb say does not move to T, as is always the case for lexical verbs in contemporary English (Pollock, 1989). On the mechanism by which the verb is associated with the inflection -ed, see Chomsky, 1957 and 1995 for different proposals.
8 C-command (or “constituent command”, Reinhart, 1976) is a basic structural relation: A c-commands B when B is contained in the node merged with A. In (15), who intervenes hierarchically between when and its trace because who c-commands the trace and is itself c-commanded by when. In (16), by contrast, who is too deeply embedded to c-command anything in the main clause, so it does not intervene hierarchically between when and its trace. (A schematic illustration of the relation is sketched at the end of these notes.)
9 Including “logophoric” pronouns, studied by Claude Hagège, my predecessor in theoretical linguistics at the Collège de France (Hagège, 1974).
10 Tettamanti et al., 2002; Moro, 2008.
11 Pallier et al., 2011; Dehaene et al., 2015; Nelson et al., 2017.
12 “La tâche de la linguistique sera : a. de faire la description et l’histoire de toutes les langues qu’elle pourra atteindre […] ; b. de chercher les forces qui sont en jeu d’une manière permanente et universelle dans toutes les langues, et de dégager les lois générales auxquelles on peut ramener tous les phénomènes particuliers de l’histoire ; […].” (Saussure, 1916/1995: 20)
13 “La recherche de lois générales, tant morphologiques que phonétiques, doit être désormais l’un des principaux objets de la linguistique. Mais, de par leur définition même, ces lois dépassent les limites des familles des langues ; elles s’appliquent à l’humanité entière.” (Meillet, 1906: 91)
14 “[…] la linguistique a un double objet, elle est science du langage et science des langues. Cette distinction, qu’on ne fait pas toujours, est nécessaire : le langage, faculté humaine, caractéristique universelle et immuable de l’homme, est autre chose que les langues, toujours particulières et variables, en lesquelles il se réalise. […] Mais […] ces voies différentes s’entrelacent souvent […].” (Benveniste, 1966: 19)
15 Mehler and Dupoux, 1990; Changeux, 1983, 2002; Dehaene, 2018.
16 For recent discussions on this line of research in terms of generative grammar, see among others: Cinque, 2005; Roberts, 2019.
17 See, for example, the special issue “Parameters” of the journal Linguistic Analysis, edited by Simin Karimi and Massimo Piattelli-Palmarini, vol. 41, nos. 3-4, 2017, https://www.linguisticanalysis.com/volume-41-issue-3-4.
18 In his excellent introduction to parameter theory, Baker (2001) uses the term “language atoms” in a different way, to refer to parameters. I prefer to use this term to refer to the ultimate ingredients of syntactic combinations.
19 A synthesis of the results is presented in Rizzi and Cinque, 2016.
20 More complete bibliographical information on cartographic studies is available on the website of the SynCart project (Syntactic Cartography), created at the University of Geneva by Giuliano Bocci, Karen Martini and Giuseppe Samo, https://www.unige.ch/lettres/linguistique/research/syntax-and-psycholinguistics/syncart/home.
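As a purely illustrative aside (not part of the original lecture), the c-command relation defined in note 8 can be stated in a few lines of code. The following Python sketch is hypothetical: the Node class, the labels and the toy structure only mimic, schematically, configurations like (15); they do not reproduce the lecture’s actual examples.

class Node:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children if children is not None else []
        self.parent = None
        for child in self.children:
            child.parent = self

    def contains(self, other):
        # True if `other` is this node or is dominated by this node.
        if other is self:
            return True
        return any(child.contains(other) for child in self.children)

def c_commands(a, b):
    # A c-commands B iff B is contained in the node merged with A (A's sister).
    if a.parent is None:
        return False
    return any(sister.contains(b)
               for sister in a.parent.children if sister is not a)

# Hypothetical mini-structure [XP who [YP when [ZP t ]]], for illustration only.
t = Node("t")
zp = Node("ZP", [t])
when = Node("when")
yp = Node("YP", [when, zp])
who = Node("who")
xp = Node("XP", [who, yp])

print(c_commands(who, t))   # True: t is contained in YP, the node merged with who
print(c_commands(t, who))   # False: who lies outside the node merged with t

On this toy structure the function returns the pattern described in note 8: the higher element c-commands the trace, while the trace, being more deeply embedded, c-commands nothing above it.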