Trusting others is one of the most common epistemic practices we use to make sense of the world around us. Sometimes we have reasons to trust, sometimes not, and often our main reason to trust is the epistemic authority we attribute to our informants. How we weigh this authority, how we select the "good informants"[1], and which of their properties we use as indicators of their trustworthiness is a rather complex matter. Indicators of trustworthiness may be notoriously biased by our prejudices, they may be faked or manipulated by malevolent or merely self-interested informants, and they change through time and space. Discussions of the way these criteria are established and shared in a community range from the history of science to contemporary epistemology and moral philosophy. Their flavour can be more descriptive - as in Steven Shapin's account of the role of gentlemanly conduct in the establishment of scientific reputation in seventeenth-century Britain - or normative, as in Miranda Fricker's work on epistemic justice as the basic virtue a hearer must have in order to judge his or her informants in an unbiased way.[2] Here I do not want to propose yet another set of criteria for ascertaining the reliability of our informants; rather, I would like to discuss the variability of these criteria across different contexts of communication. Communication is an active process through which interlocutors not only transmit and receive information, but negotiate new epistemic standards by constructing together new reasons and justifications that are heavily influenced by the moral, social or political context and by the interests at stake on both sides, the speaker's and the hearer's. The kind of "doxastic responsibility"[3] hearers exercise over testimonial knowledge is sensitive to the way the communicative process takes place and to the many contextual factors that influence its success or failure.
The overall picture of trust in testimonial knowledge I want to suggest draws on some ideas I have developed about the relations between the epistemology of trust and the pragmatics of communication[4]. What I want to defend here is that the way we gain knowledge through communication is influenced by the responsibility we take in granting authority to a source of information. Our responsibility is not just a moral quality we happen to possess or have learnt through experience, but a way of approaching our communicative interactions, a stance towards credibility and credulity that is shared, in successful cases, with our informants. It is a dynamic stance, which varies with the context and the content at stake, but it is necessary for any epistemic enrichment of our cognitive life, apart from the marginal cases of epistemic luck[5]. An obvious advantage of a pragmatic approach to the epistemology of testimony is that it avoids an artificial picture of the hearer as a rational chooser who has the option of accepting or refusing a chunk of information presented to her by a speaker. In most cases we simply do not have the choice: we learn from others because we are immersed in conversational practices whose output is precisely the chunk of information we end up believing or disbelieving. There is no prior information to be gained that is not constructed in the process of communication.
In order to illustrate my main point, I will present in the rest of the paper a series of examples - some real, some fictional - that show how epistemic standards vary across contexts, how differently we weigh evidence as the interests at stake change and, finally, how the interplay between the pragmatic effects of communication and its epistemic consequences is a dynamic feature of the way we access new information. As contexts and stakes change, what we may once have considered a mere pragmatic vagary of conversation may become a crucial epistemic cue that orients our allocation of credibility and moral authority to our informants.
Let me start with a political example, one which had a huge impact on the credibility and authority of many political leaders.
But let us stick to the facts.
On February 3rd 2003, the British Government released an intelligence dossier entitled "Iraq - Its Infrastructure of Concealment, Deception and Intimidation", on how Iraq's security organisations operated to conceal weapons of mass destruction from UN inspectors, on the organisation of Iraqi intelligence, and on the effects of the security apparatus on ordinary citizens. The report had previously been sent to Mr. Colin Powell, who used some of the material for his well-known presentation at the United Nations (Wednesday, February 5th).
A few days later, Channel 4 News learned from an academic analysis[6] that large parts of the dossier had been copied, almost word for word, from published sources, notably an article by a postgraduate researcher on Iraq's intelligence services. The dossier, then, was not so much a lie as a piece of careless, misattributed evidence. Yet it is lying, not inaccuracy, that the philosophical tradition has condemned most severely[7].
Verily, lying is an ill and detestable vice. Nothing makes us men, and no other means keeps us bound one to another, but our word; knew we but the horror and weight of it, we would with fire and sword pursue and hate the same, and more justly than any other crime […] Whatsoever a lier should say, we would take it in a contrary sense. But the opposite of truth has many shapes, and an indefinite field.[8]
Kant's well-known position on lying is that it can never be justified, under any possible circumstances. Not lying is a universal law, with no exceptions: "This means that when you tell a lie, you merely take exception to the general rule that says everyone should always tell the truth"[9]. In his exchange with Benjamin Constant, who challenged the German philosopher's unconditional "duty to tell the truth", Kant restates his position even in response to an extreme example that Constant submits to him. Constant, who wrote against Kant that "The moral principle stating that it is a duty to tell the truth would make any society impossible if that principle were taken singly and unconditionally"[10], presents the following example as a clear case in which the duty to tell the truth is objectionable: if someone came to your door intending to murder a friend of yours who is hiding in your house and asked you where he is, would you still have the duty to tell him the truth? Kant replies, in a text entitled "On a Supposed Right to Lie Because of Philanthropic Concerns", that yes, even in such an extreme case you have the duty to tell the truth: "Truthfulness in statements that cannot be avoided is the formal duty of man to everyone, however great the disadvantage that may arise therefrom for him or for any other".[11] So, in the history of philosophy, sincerity seems to be a much more important duty than accuracy. Inaccuracy is a more recent "moral fault", one that comes with modernity and with the idea that trust in governments should be based not on the moral virtues of governors but on their expertise and on the reliability of the procedures that assure the good functioning of the State[12]. In a "disenchanted world" in which politics no longer depends on virtue, truthfulness has to be based on evidence, and an epistemic flaw in providing such evidence in support of a claim with political consequences is a moral weakness as well as a way of undermining the grounds of trust that sustain a society.
But let us go on with the story.
In June 2003 Alastair Campbell, Director of Communications in Tony Blair's government, was accused by the BBC journalist Andrew Gilligan of having "sexed up" another report on Iraq's weapons of mass destruction, released on September 24, 2002, which the Government had, unprecedentedly, decided to publish with a preface by the Prime Minister. Based on the testimony of an insider expert who had contributed to the report, Gilligan affirmed that the claim made in the dossier that Iraq could deploy weapons of mass destruction within 45 minutes had been exaggerated and inserted at the Government's behest, against the better judgement of the intelligence services.
In July, the name of Gilligan's informant was revealed to the press by the Ministry of Defence. On July 18th the microbiologist David Kelly, an adviser to the Foreign Office and the Ministry of Defence on chemical and biological weapons, was found dead, two days after a very tough hearing before a Parliamentary committee on July 15th. Having learned of his death while travelling abroad, Tony Blair announced an independent judicial inquiry into the circumstances of Kelly's death, chaired by Lord Hutton.
In January 2004, Lord Hutton released his report. After hearing 74 testimonies and analysing more than 300 claims, Lord Hutton reached the following conclusions:
- David Kelly killed himself, and no other person put pressure on him to do so.
- In his conversation with Mr. Gilligan on May 23rd, David Kelly was in breach of the rules governing the disclosure of confidential information, even though part of his job as an adviser to the Foreign Office and the Ministry of Defence involved speaking to the media and to institutions about Iraq's weapons.
- It is doubtful that David Kelly told Mr. Gilligan that the claim about the 45 minutes had been exaggerated.
- The Government did not exert any effective pressure to "sex up" the dossier.
The rest is well known: the main responsibility for the affair fell on the BBC, Gilligan lost his job, and the Director General of the BBC, Greg Dyke, resigned.
The result of the inquiry was disappointing for public opinion, because it was perceived as a way of acquitting the Government of its responsibilities. But here I do not want to enter the debate on the moral responsibilities of the Government in this affair.
This is also a particularly illuminating example of the close relation between epistemic authority and political authority in democratic societies. Social epistemology today is also becoming a "political epistemology": political authority is more and more sustained by various forms of epistemic authority. Experts, reports, oracles, think tanks and independent inquiries have to provide the evidence on which a political choice is going to be taken and judged.
Here are some of the provisional conclusions on trust in authority that I want to draw from this example:
1. Governments rely on experts in technical matters in order to take decisions. But the unprecedented choice of the British Government to publish the September 2002 dossier shows that the use of experts in this case went beyond the mere acquisition of information about the facts in Iraq: the dossier was also meant to give public, epistemic backing to a political decision.
2. Even if the September 2002 dossier was obviously produced for political reasons, that is, with the aim of establishing the facts that would justify the invasion of Iraq, a direct influence of the political authority on the presentation of facts is intolerable for the democratic functioning of a society, especially on a matter as delicate as the decision to send people to war. In general: when the potential consequences are grave, standards of evidence, objectivity and impartiality must be raised. An institution that has epistemic authority may establish the facts that justify a political decision, but its authority depends on its autonomy from political power.
In the second part of this paper I would like to explore how these epistemic standards nonetheless vary according to time, place and context of communication. For example, what my father's generation might have perceived as a form of gallantry and a well-mannered way of treating women in conversation, such as giving women guests in a restaurant a menu without prices, is nowadays perceived as an unjust way of blocking a particular category of people's access to information. Similarly, the paternalistic practice of doctors withholding part of the information about a patient's health in cases of serious diagnoses has now been excluded from medical ethics as a legitimate communicative practice. A contractualist relationship based on informed consent has become the standard way of dealing with ethical issues in medical decision making. This practice aims at readjusting the balance between the power of doctors and the autonomy of patients, in order to avoid the risks of abuse to which a relationship of blind trust would expose patients. Informed consent has been introduced as a moral and legal requirement for any medical intervention, be it for research or therapeutic purposes, by the 1997 European Convention on Human Rights and Biomedicine[14].
These examples show a continuum, which I wish to explore, between communicative practices and epistemic standards. How we talk to other people - what we say and don't say - is more than just a matter of linguistic preference for a certain style of conversation. We can adjust our language according to the information we want to give access to. And, as hearers, we always adjust our interpretations not only to our pragmatic expectations, but also to our epistemic needs.
For example, authority is an important epistemic cue in interpreting what other people say. Elegant experiments show that the same text, presented to two different groups of readers as coming from two different sources, one authoritative and the other not, gives rise to very different interpretations[15]. When the source is authoritative, even if the text is obscure, people tend to overinterpret it in order to make sense of it; when it is not, the interpretive effort is more limited and, if the text is too obscure, people rapidly conclude that it is nonsense (many of us have experienced the same "authority" effect in reviewing articles by colleagues and students... this is one of the reasons why peer review is anonymous!). Authority biases are thus extremely relevant to what we come to understand and believe from our informants. But this is not an on/off process in which we believe authoritative sources and disbelieve non-credible ones: it is the way we process information that comes from authorities, how we adjust our interpretation in order to make sense of what they say, that determines what we come to believe. It is thus the stance we take towards our informants, the way we exercise our epistemic responsibility, that makes us believe or not believe what we are told. The construction of testimonial knowledge is therefore a shared responsibility between informants and hearers: there are no purely unbiased informants - apart from some uninteresting cases such as the train timetable at the station - just as there are no naïve receivers of information. Even children, long considered the paradigmatic case of naïve, credulous creatures, have proved to be more sophisticated epistemic subjects than we used to think: they take into account cues of credibility in order to accept what an informant says, and they check the linguistic consistency of their informants in order to adjust their investment of credibility[16].
Epistemic responsibility is thus a matter of adjusting the way we interpret what other people say to our epistemic needs: if I am involved in small talk at a party where someone is speculating about the possibility of an invasion of Iraq, I may let the claims pass with little scrutiny; but if I had to act on the same information, as a journalist about to publish it or a politician about to vote on it, my standards for accepting it would be far more demanding.
The interplay between linguistic practices and epistemic concerns may seem a trivial claim. But, surprisingly, there is little work in philosophy and epistemology that ties together the debate on the pragmatics of communication and the debate on the acceptance of testimonial knowledge. For example, in discussing hearers' responsibilities in gaining knowledge from testimony, McDowell refers to a vague "doxastic responsibility" that the hearer should exercise before accepting testimonial information.[17] But what this doxastic responsibility consists of is left largely unexplained. I think that a fruitful way of conceiving this responsibility is to locate it in the inferential process of interpreting what others say, in the responsible stance we adopt in calibrating the epistemic weight we give to what we hear.
In communication, people do not look for true information, but for relevant information, that is, information that is relevant enough in a particular context to deserve our attention. But what is relevant in a context is a good proxy for information that has an epistemic value for us[18]. We trust other people to provide us with relevant information, and we adjust our epistemic requirements according to the context in which the interpretation takes place. We exercise our "epistemic vigilance", to use an expression coined by Dan Sperber[19], during interpretation by adopting a stance of trust that our interlocutors will provide information that is relevant for us. Any departure from the satisfaction of our expectations of relevance may result in a revision or a withdrawal of our default trust.
Let me illustrate this interplay between epistemology and interpretation with two fictional examples: the first aims to show that a departure from relevance may have effects on our epistemic stance; the second, conversely, illustrates how a change in our stance of trust may result in a different appraisal of relevance in interpretation.
Consider this case. Arianna is late tonight for the Parent-Children Association meeting at her son's school. It is not the first time she has been late for these kinds of events, and she feels awfully guilty. She justifies herself to the President of the Association by telling her a long and detailed story about a series of accidents and unforeseen events that explain her delay. She adds a little too much to her story: not only did the underground stop for 15 minutes because of an alarm, but she also fell on the stairs, broke her umbrella and had to shelter from the rain under a roof. Then she met a very old friend who told her about a serious illness, and she was too touched to interrupt the conversation brusquely... The President is listening with less and less attention: the relevance of what Arianna is saying is decreasing: too many details just to explain a delay. This lack of relevance weakens the President's stance of trust: why all these details? Maybe she isn't telling the truth?
Or consider this second example. A typical Parisian street scam consists in being approached by a stranger who pretends to be Italian, is very friendly and, after a conversation, tries to convince his "victim" to buy some fake leather jackets he has in his car. He deceives his "clients" by asking them to buy the jackets because he is unable to take them back with him to Italy. Now imagine that Jim, who has already heard about this kind of scam, is approached in the street by Jules, a friendly stranger who starts exactly this kind of conversation. Jules is no longer a reliable informant in Jim's eyes, and whatever he says, however amiable or apparently innocent, is interpreted as part of the set-up.
These two stories illustrate the interplay between trust and interpretation as I understand it here, that is, as the search for relevant information (information whose cognitive benefit is worth the effort needed to process it). In the first case, Arianna's description is too detailed: she is giving the President more information than is relevant for her, and this creates a suspicious attitude in the President. In the second case, Jules is no longer reliable in Jim's mind, and this biases the way Jim interprets what he is saying.
Our epistemic responsibility is first of all a matter of taking an appropriate stance of trust towards our informants, a sort of "virtual trust" that does not commit us to accepting as true what is said in conversation. Through our interpretation we weigh the authority and credibility of our informants according to our epistemic needs. On the other hand, the way informants "pack into language" what they want to say has epistemic consequences for our allocation of credibility. And the epistemic duty of informants amounts to being relevant for us in a context, thus suggesting to us some possible epistemic gains in listening to them. We may take the risk of adopting a trustful posture towards the speakers' willingness to be relevant, and yet check their trustworthiness and reliability through the process of interpretation.
Epistemic responsibilities are thus shared, but in a lighter sense than is often intended in the epistemological literature on testimonial knowledge: we share a context of communication and a practice of interpretation, and on both sides we take responsibility for the epistemic consequences of our social life.
References
AA.VV. (2004) The Hutton Inquiry, Tim Coates.
Adler, J. (2003) Belief's Own Ethics, MIT Press.
Clément, F.; Koenig, M.; Harris, P. (2004) “The Ontogenesis of Trust”, Mind & Language, 19, pp. 360-379.
Coady, C. A. J. (1992) Testimony: A Philosophical Study, Oxford, Clarendon Press.
Foley, R. (2001) Intellectual Trust in Oneself and Others, Cambridge University Press.
Fricker, E. (2006) "Testimony and Epistemic Autonomy", in J. Lackey and E. Sosa (eds.) The Epistemology of Testimony, Oxford University Press.
Fricker, M. (2007) Epistemic Injustice: Power and the Ethics of Knowing, Oxford University Press.
Gopnik, A., Graf, P. (1988) “Knowing how you know: Young Children’s Ability to Identify and Remember the Sources of Their Beliefs”, Child Development, 59, n. 5, pp. 1366-1371.
Holton, R. (1994) “Deciding to Trust, Coming to Believe”, Australasian Journal of Philosophy, vol. 72, pp. 63-76.
Moran, R. (2005) "Getting Told and Being Believed", Philosophers' Imprint, vol. 5, n. 5, pp. 1-29.
Origgi, G. (2004) “Is Trust an Epistemological Notion?” Episteme, 1, 1, pp. 61-72.
Origgi, G. (2005) "What Does it Mean to Trust in Epistemic Authority?", in P. Pasquino (ed.) Concept of Authority, Edizioni Fondazione Olivetti.
Origgi, G. (2007) "Le sens des autres. L'ontogenèse de la confiance épistémique", in A. Bouvier, B. Conein (eds.) L'épistémologie sociale, Paris, EHESS Editions.
Origgi, G. (2008) Qu’est-ce que la confiance?, Paris, VRIN.
Pettit, P., Smith, M. (1996) “Freedom in Belief and Desire”, The Journal of Philosophy, XCIII, 9, pp. 429-449.
Pritchard, D. (2005) Epistemic Luck, Oxford, Clarendon Press.
Ross, A. (1986) "Why Do We Believe What We Are Told?", Ratio, 28, pp. 69-88.
Ruffman, T., Slade, L., Crowe, E. (2002) "The Relation between Children's and Mothers' Mental State Language and Theory-of-Mind Understanding", Child Development, 73, pp. 734-751.
Sabbagh, M. A., Baldwin, D. (2001) "Learning Words from Knowledgeable vs. Ignorant Speakers: Links between Preschoolers' Theory of Mind and Semantic Development", Child Development, 72, pp. 1054-1070.
Shapin, S. (1994) A Social History of Truth, University of Chicago Press.
Sperber, D., Wilson, D. (1986/1995) Relevance: Communication and Cognition, Oxford, Blackwell.
Wilson, D., Sperber, D. (2002) "Truthfulness and Relevance", Mind, 111 (443), pp. 583-632.
[1] On the concept of "good informant" see E. Craig (1990) Knowledge and the State of Nature, Oxford, Clarendon Press.
[2] See S. Shapin (1994) A Social History of Truth, University of Chicago Press.
[3] The expression is due to McDowell. See J. McDowell (1998) "Knowledge by Hearsay", in Meaning, Knowledge, and Reality, Harvard University Press.
[4] Cf. G. Origgi (2004) "Is Trust an Epistemological Notion?", Episteme, 1, 1, pp. 61-72.
[5] For an interesting and recent analysis of such cases, see D. Pritchard (2005) Epistemic Luck, Oxford University Press.
[6] Cf. http://middleeastreference.org.uk/
[7] Cf. B. Williams (2002) Truth and Truthfulness, Princeton University Press.
[8] Cf. Montaigne, "On Lyers", Essays, Book 1, chapter 9.
[9] Cf. Kant, Groundwork of the Metaphysics of Morals.
[10] Cf. B. Constant (1797) Des réactions politiques.
[11] Cf. Kant, "On a Supposed Right to Lie Because of Philanthropic Concerns", often published as an appendix to the Groundwork.
[12] See on this point R. Hardin (1998) "Trust in Government", in V. Braithwaite, M. Levi (eds.) Trust and Governance, New York, Russell Sage Foundation.
[13] Cf. H. Frankfurt (2005) On Bullshit, Princeton University Press.
[14] Cf. on this case G. Origgi, M. Spranzi (2007) "La construction de la confiance dans l'entretien médical", in T. Martin, P.-Y. Quiviger (eds.) Action médicale et confiance, Presses Universitaires de Franche-Comté.
[15] Cf. G. Mosconi (1985) L’ordine
[16] Cf. F. Clément et al. (2004) "The Ontogenesis of Trust", Mind & Language, 19, pp. 360-379; G. Origgi (2007) "Le sens des autres. L'ontogenèse de la confiance épistémique", in A. Bouvier, B. Conein (eds.) L'épistémologie sociale, Paris, EHESS Editions.
[17] Cf. J. McDowell, cit.
[18] I am using here the technical concept of relevance developed by D. Sperber and D. Wilson in their post-Gricean approach to pragmatics. Cf. D. Sperber, D. Wilson (1986/1995) Relevance: Communication and Cognition, Oxford, Blackwell.
[19] Cf. D. Sperber, O. Mascaro (draft) “Mindreading, comprehension and epistemic vigilance”