Institut Jean Nicod




Presentation

 

Workshop in Honor of Sylvain Bromberger

May 15, 2017

 

Salle des Actes, Ecole normale supérieure, 45 rue d'Ulm, 75005 Paris.

Organized by Paul Egré (IJN)

Description: We are delighted to welcome Professor Sylvain Bromberger to the Ecole normale supérieure for a one-day workshop celebrating his work. Professor Bromberger will open the workshop with a presentation of his recent research on why-questions, followed by five papers on topics connected to his work, including questions, knowledge and metacognition, homophony, polysemy, and the nature of linguistic theory.

Participants:

Sylvain Bromberger (MIT)
Isabelle Dautriche (Edinburgh)
Paul Egré (IJN)
Robert May (UC Davis)
David Nicolas (IJN)
Joëlle Proust (IJN)
Benjamin Spector (IJN)
 

Program

 

9:00-9:15: Arrival of participants

9:15-9:30: Opening Words (by Paul Egré)

9:30-10:30: Sylvain Bromberger (MIT)
On some aspects of why-questions

10:30-10:45: Coffee break

10:45-11:45: Benjamin Spector (IJN)
Plural predication, vagueness, and principles of language use

11:50-12:50: Joëlle Proust (IJN)
Knowing what one does not know: A dual-process view


--

12:50-14:30: Lunch break

--

14:30-15:30: David Nicolas (IJN)
Mass and count: grammar, ambiguity, and polysemy

15:35-16:35: Isabelle Dautriche (Edinburgh)
Homophones: a challenge for meaning(s) acquisition?

16:35-16:50: Coffee break

16:50-17:50: Robert May (UC Davis)
Derivation and Interpretation

17:50-18:10: Conclusions, by Robert May & Paul Egré

 

Abstracts:

On some aspects of why-questions
Sylvain Bromberger, MIT

Why-questions are very different from other wh-questions and even from their close neighbor "How come" questions. In this talk I will explore some of these differences and their significance.


Homophones: a challenge for meaning(s) acquisition?
Isabelle Dautriche, Emmanuel Chemla, Anne Christophe

Homophones present the learner with a unique word-learning situation. While most words conform to a one-to-one mapping between form and meaning, a homophone is a phonological form associated with several unrelated meanings. Mature language users use a diverse array of information sources (visual, linguistic…) to reach the correct interpretation of these words. However, children in the process of learning their language are faced with a different situation: they need to discover that a single word-form maps onto several distinct meanings. In two sets of studies, we explore the situations that lead children and toddlers to postulate homophony for a word. First with preschoolers learning novel words, we show that the distribution of the learning exemplars in conceptual space influences children’s inferences: observing exemplars clustered around two distant positions in conceptual space (e.g., 2 tigers and 2 beetles as opposed to 4 random animals) boosted the likelihood that the exemplars were sampled from two independent categories rather than from a single superordinate category (i.e., animal). Second, we show that 20-month-olds are willing to learn a second meaning for a word they know, provided that the two homophones are sufficiently distant syntactically (e.g. ‘an eat’ is a good name for a novel animal), or semantically (e.g. ‘a sweater’ for a novel animal), but not when they are close (e.g. ‘a cat’ for a novel animal). Taken together, our results show that children recruit multiple sources of information to infer whether or not a given word-form is likely to instantiate a novel meaning.
 

Derivation and Interpretation
Robert May (UC Davis)

Here is a common picture of the relation of syntax and semantics: the syntax outputs formal representations of sentences, which are then assigned interpretations by the semantics. Part of the business of linguistic theory, so the story goes, is to specify these representations by a generative procedure, which are then given over to the interpretive component. While there are disputes about what interpretation amounts to on this picture, the overall picture is broadly accepted for the semantics of natural languages. In this paper, we will demur from this consensus. Broadly, our objection is just this: it is a mistake, in the context of linguistic theory, to think of the syntax as something that has an output. Rather, the syntax is a specification of a class of derivations, and derivations are not the sort of thing that is subject to semantic interpretation (or interpretation at all). To see this, we explore, in part from a historical perspective, the role of representation in linguistic theory, and how this is related to semantics. There will be three main points of focus.

(i) What is the role of semantics in giving a definition of a language (the Definition Problem)?

(ii) Given this definition, how are syntax and semantics related (the Interface Problem)?

(iii) To what extent can syntactic representations be appropriated as formal objects of interpretation (the Repurposing Problem)?

Our view is that (i) semantics is not part of the definition of the concept of a language specified by linguistic theory; (ii) standard conceptions of the syntax-semantics interface misconstrue the role of representation in linguistic theory; and (iii) syntactic representations are interpreted entities, and so not subject to further interpretation.
 

Mass and count: grammar, ambiguity, and polysemy
David Nicolas, IJN

According to the traditional view, languages like English distinguish two morphosyntactic subclasses of common nouns, mass nouns and count nouns. A defining characteristic of mass nouns, like "milk", "furniture", and "wisdom", is that they are invariable in grammatical number, while count nouns, like "rabbit", "match", and "idea", can be used in the singular and in the plural. Depending on the language, this basic morphosyntactic difference between the two types of noun is supplemented by differences in the determiners they can combine with. Thus, in English, mass nouns can be used with determiners like "much" and "a lot of", but not with "one" or "many". Conversely, count nouns can be employed with numerals like "one" and determiners like "many", but not with "much". However, as is well known, mass nouns (like "milk") can often be used as count nouns: "You should take a hot milk with some honey". And vice versa: "You will find a lot of rabbit around here". So, drawing inspiration from Bromberger's (2012) discussion of the partly similar case of color terms, on what basis should one individuate the uses of common nouns and their interpretations? What pertains to grammar, ambiguity, and polysemy?

Bromberger, S. (2012). Vagueness, ambiguity, and the "sound" of meaning. In M. Frappier, D. Brown & R. DiSalle (eds.), Analysis and Interpretation in the Exact Sciences, 75-93. Springer.


Knowing what one does not know: A dual-process view
Joëlle Proust, Institut Jean-Nicod     

Sylvain Bromberger has offered us insightful analyses of the importance of knowing what we don't know in the complex domain of scientific explanation. This prompts the additional research question: does the general ability to reliably detect what we don't know depend on having a language, or at least on having a propensity to communicate? Sensitivity to one's own ignorance, in humans, is seen as standardly expressed in questions, i.e. interrogative speech acts. The felicity conditions of interrogatives are that the agent does not know the truth about P, wants to know it, and believes that the addressee may supply it. Interrogative speech acts have been shown to be present in all cultures. With no words yet available, 20-month-old infants are able to strategically ask for informational help. Information-seeking questions are relentlessly asked by 3- to 5-year-old children. It is arguable, however, that the felicity conditions of interrogatives can be fulfilled by agents with no linguistic ability and no concept of knowledge, such as rhesus monkeys. From this viewpoint, it also appears unlikely that sensitivity to one's own ignorance needs to depend on having communicational goals, or that it requires hypothesizing a specially evolved non-conceptual attitude called "basic questioning". A more economical account is based on an independently motivated dual-process theory of metacognition.

 

Plural predication, vagueness and principles of language use
Benjamin Spector (IJN)

(Based on joint work with Manuel Križ)

The interpretation of plural definites (among others) displays two somewhat unexpected properties: non-maximality and homogeneity.

- Non-maximality refers to the fact that plural definites sometimes have less-than-universal quantificational force:

(1) Non-Maximality

[Context: A job interview]
The committee members smiled.
>> Can be appropriately used if, say, 8 out of 10 committee members smiled.

- Homogeneity refers to the fact that plural definites tend to have (near-)universal force in affirmative sentences, but only (near-)existential force in the scope of negation:

(2) Homogeneity
(a) John read the books on the reading list.
>> He must have read roughly all of them.
(b) John didn't read the books on the reading list.
>> He must have read roughly none of them.

These two properties are not restricted to plural definites, but are pervasive across other types of constructions (question-embedding, singular predication over complex objects, etc.).

We show how, by adopting (a) an underspecified semantics for plural predication that makes it intrinsically vague, and (b) certain principles of language use, we can account for both properties in a way that makes very specific and apparently correct predictions.

I will discuss to what extent these principles can be motivated from general principles of rational communication, and whether they also apply, in some cases, to ambiguity resolution.

 

Sponsor: The workshop is organized with main support from the Labex Institut d'Etudes de la Cognition (New Ideas program), Ecole normale supérieure.
