Friday, February 08, 2008

Trust, Authority and Epistemic Responsibility

Draft. Do not quote. Paper submitted for a collective volume on Epistemic Justice.

Trusting others is one of the most common epistemic practices we use to make sense of the world around us. Sometimes we have reasons to trust, sometimes not, and often our main reason to trust rests on the epistemic authority we attribute to our informants. How we weigh this authority, how we select the “good informants”[1], and which of their properties we use as indicators of their trustworthiness is a rather complex matter. Indicators of trustworthiness may be notoriously biased by our prejudices, they may be faked or manipulated by malevolent or merely self-interested informants, and they change through time and space. Discussions of the way in which these criteria are established and shared in a community range from the history of science to contemporary epistemology and moral philosophy. Their flavour can be more descriptive – as in Steven Shapin’s account of the role of gentlemanly codes in the establishment of scientific reputation in seventeenth-century Britain – or normative, as in Miranda Fricker’s work on the virtue of epistemic justice as the basic virtue a hearer must have in order to judge his or her informants in an unbiased way.[2] Here, I do not want to propose yet further criteria for ascertaining the reliability of our informants; rather, I would like to discuss the variability of these criteria across different contexts of communication. Communication is an active process through which interlocutors not only transmit and receive information, but negotiate new epistemic standards by constructing together new reasons and justifications that are heavily influenced by the moral, social or political context and by the interests at stake on both sides, the speaker’s and the hearer’s. The kind of “doxastic responsibility”[3] hearers exercise over testimonial knowledge is sensitive to the way the communicative process takes place and to the many contextual factors that influence its success or failure.

The overall picture of trust in testimonial knowledge I want to suggest draws on some ideas I have developed about the relations between the epistemology of trust and the pragmatics of communication[4]. What I want to defend here is that the way we gain knowledge through communication is influenced by the responsibility we take in granting authority to a source of information. Our responsibility is not just a moral quality we happen to possess or have learnt through experience, but a way of approaching our communicative interactions, a stance towards credibility and credulity that is, in successful cases, shared with our informants. It is a dynamic stance, which varies with the context and the content at stake, but it is necessary for any epistemic enrichment of our cognitive life, apart from the marginal cases of epistemic luck[5]. An obvious advantage of a pragmatic approach to the epistemology of testimony is that it avoids an artificial picture of the hearer as a rational chooser who has the option of accepting or refusing a chunk of information presented to her by a speaker. In most cases we simply do not have that choice: we learn from others because we are immersed in conversational practices whose output is precisely the chunk we end up believing or disbelieving. There is no information to be gained prior to and independently of the process of communication in which it is constructed.

In order to illustrate my main point, I will present in the rest of the paper a series of examples – some real and some fictional – that show how epistemic standards vary across contexts, how differently we weigh evidence as the interests at stake change and, finally, how the interplay between the pragmatic effects of communication and its epistemic consequences is a dynamic feature of our way of accessing new information. As contexts and stakes change, what we may once have considered a mere pragmatic vagary of conversation may become a crucial epistemic cue that orients our allocation of credibility and moral authority to our informants.

Let me start with a political example – one which had a huge impact on the credibility and authority of many political leaders in the United States and Great Britain – namely, the search for evidence and the rhetoric of justification that surrounded the decision to attack Iraq in 2003. It is interesting to notice that this historical example is probably the first case, at least to my knowledge, in which epistemic reasons were so deeply involved in the justification of a political choice that they were used in the public debate in order to gain consensus. Producing evidence of the presence of weapons of mass destruction in Iraq became at a certain point the key issue in possessing a political justification to go to war. But the story I am going to tell also has an interesting epistemic moral: there are epistemic standards which people care about and whose violation is heavily sanctioned in terms of credibility. The social order of a society depends also on its cognitive order, and the backlash of public opinion against political leaders when the expertise they appealed to proved unreliable and its political exploitation disingenuous was a clear demonstration of this. I underline this point because there is a post-modern tendency today – in social science and in social epistemology too – to consider that all public opinion stems from ideology, and that the reasons given to justify an idea depend only on power. Yet I think that people are smarter and more responsible epistemic agents than post-modernists tend to assume: their awareness of epistemic standards orients their choices and the formation of their opinions, at least in mature democracies.

But let us stick to the facts.

On February 3rd 2003, the British Government released an intelligence dossier entitled “Iraq – Its Infrastructure of Concealment, Deception and Intimidation”, on how Iraq’s security organisations operated to conceal weapons of mass destruction from UN inspectors, on the organisation of Iraqi Intelligence and on the effects of the security apparatus on ordinary citizens. The report had previously been sent to Colin Powell, who used some of its material for his well-known presentation to the United Nations on February 5th.

A few days later, Channel 4 News learned from the Cambridge academic Glen Rangwala – a lecturer in Middle Eastern politics – that the dossier had been massively copied from an article by a post-graduate student, Ibrahim al-Marashi, published in September 2002 in The Middle East Review of International Affairs, and from some other articles in Jane’s Intelligence Review. Glen Rangwala was able to uncover the plagiarism by spotting the government’s duplication of material from his own site on Iraqi weaponry and Middle Eastern affairs[6]. The news was rapidly confirmed by Downing Street and the government apologised for the plagiarism, which actually turned out to be a rather lucky case of “blind” trust in expertise: further checks confirmed the accuracy of al-Marashi’s evidence about Iraqi Intelligence. But the backlash in public opinion was strong: the impression left was that of a dossier passed off as up-to-the-minute intelligence analysis but produced with no serious search for evidence. Furthermore, that Colin Powell could have used such shallow evidence to justify the necessity of an attack on Iraq in front of the world was perceived as a lack of moral and epistemic responsibility. In Bernard Williams’ terms, one can say that the loss of credibility was due in this case more to a lack of accuracy about truth than to a lack of sincerity[7]. This is interesting because there is a rich moral tradition which severely sanctions deception as one of the worst sins against our conspecifics. Our ties of trust are built on the credibility of the words we exchange with each other; hence lying is a betrayal of our human nature, a deep violation of the fundamental – almost natural – rules that make up human communities. Montaigne says, for example:

Verily, lying is an ill and detestable vice. Nothing makes us men, and no other means keeps us bound one to another, but our word; knew we but the horror and weight of it, we would with fire and sword pursue and hate the same, and more justly than any other crime […] Whatsoever a lier should say, we would take it in a contrary sense. But the opposite of truth has many shapes, and an indefinite field.[8]

Kant’s well-known position on lying is that it can never be justified under any circumstances. Not lying is a universal law, with no exceptions: “This means that when you tell a lie, you merely take exception to the general rule that says everyone should always tell the truth”[9]. In his exchange with Benjamin Constant, who challenged the German philosopher on the existence of an unconditional “duty to tell the truth”, Kant restates his position even in response to an extreme example that Constant submits to him. Constant, who wrote against Kant that “The moral principle stating that it is a duty to tell the truth would make any society impossible if that principle were taken singly and unconditionally”[10], presents the following example as a clear case in which the duty to tell the truth is objectionable: if someone came to your door intending to murder a friend of yours who is hiding in your house and asked you where he is, would you still have the duty to tell him the truth? Kant replies, in a text entitled “On a Supposed Right to Lie Because of Philanthropic Concerns”, that yes, even in such an extreme case you have the duty to tell the truth: “Truthfulness in statements that cannot be avoided is the formal duty of man to everyone, however great the disadvantage that may arise therefrom for him or for any other”.[11] So sincerity seems to have been a much more important duty than accuracy in the history of philosophy. Inaccuracy is a more recent “moral fault”, one that comes with modernity and with the idea that trust in governments should not be based on the moral virtues of governors but rather on their expertise and on the reliability of the procedures that assure the good functioning of the State[12]. In a “disenchanted world” in which politics no longer depends on virtue, truthfulness has to be based on evidence, and an epistemic flaw in providing such evidence in support of a claim with political consequences is a moral weakness as well as a way of undermining the grounds of trust that sustain a society.

But let us go on with the story.

In June 2003, Alastair Campbell, Director of Communications in Tony Blair’s government, was accused by the BBC journalist Andrew Gilligan of having “sexed up” another dossier on Iraq’s weapons of mass destruction, released on September 24, 2002, which the Government, in an unprecedented move, had decided to publish with a preface by the Prime Minister. Based on the testimony of an insider expert who had contributed to the report, Gilligan claimed that the assertion made in the dossier that Iraq could be ready to use chemical weapons within forty-five minutes was known by the Government to be exaggerated at the time it decided to include it. Here too, the use of evidence is surprisingly new: traditionally, political propaganda in support of war was not committed to using evidence to justify its choices; appeals to rhetorical values such as “loyalty to the Nation” or “the duty to defend our borders from the enemy” were all the information that governors felt obliged to share with citizens. Here, though, the British Government felt the urge to share the information contained in the dossier with its citizens, at its own risk: responsible receivers of this information were able to uncover another flaw, though of a different kind. In Harry Frankfurt’s crude terms, Campbell’s mistake is more of the order of “bullshit” than of a “lie”. Bullshit is common in our informationally overloaded societies: given that there is too much information around, bullshitting is a way of “sexing up” information in order to attract attention to it.[13] It is a way of playing with the relevance of the information, adding appealing pragmatic effects to it and thereby increasing its chances of being heard. But in this case, a not-so-innocent pragmatic manoeuvre was perceived as a grave violation of the epistemic standards that a decent society must endorse in order to sustain trust relationships between citizens and politicians. The consequences of this act are well known.

In July, the name of Gilligan’s informant was revealed to the press by the Ministry of Defence. On July 18th the microbiologist David Kelly, an adviser to the Foreign Office and the Ministry of Defence on chemical weapons, was found dead, two days after a very tough hearing before Parliament on July 15th. Having learned of his death while travelling in Japan, Tony Blair declared that an independent inquiry would be opened into the case. Lord Hutton was charged with establishing the facts surrounding David Kelly’s death.

In January 2004, Lord Hutton released his report. After having heard 74 testimonies and analysed more than 300 claims, Lord Hutton established the following facts:

  • David Kelly killed himself, under no pressure from any other person.
  • In his conversation with Mr. Gilligan on May 23, David Kelly was in breach of the rules governing the disclosure of confidential information, even if part of his job description as an adviser to the Foreign and Defence Offices involved speaking to the media and to institutions about Iraq’s weapons.
  • It is doubtful that David Kelly told Mr. Gilligan that the claim about forty-five minutes was exaggerated.
  • There was no effective pressure from the Government to “sex up” the dossier.

The rest is known: The main responsibility for the affair fell on the BBC, Gilligan lost his job and the Director General of the BBC, Greg Dyke, resigned.

The result of the inquiry was disappointing for public opinion, because it was perceived as a way of acquitting the government of its responsibilities. But here I do not want to enter the debate on the moral responsibilities of the UK government. Rather, what interests me in this example is the role played by the epistemic standards shared by responsible citizens in evaluating the credibility of the government’s testimony on Iraqi military power. The way in which information was filtered and evaluated was different in the two cases I have discussed: in the first case the flaw was due to inaccuracy, whereas in the second it was due to insincerity. Yet the fact that in both cases there were people able to detect and severely judge the flaw shows that real standards exist, that people usually have quite an accurate conception of them, and that it is not easy to fool everybody without paying a price in credibility.

This is also a particularly illuminating example of the strict relation between epistemic authority and political authority in democratic societies. Social epistemology today is also becoming a “political epistemology”: political authority is more and more sustained by various forms of epistemic authority: experts, reports, oracles, think-tanks and independent inquiries have to provide the evidence on the basis of which a political choice is to be taken and judged.

Here are some of the provisional conclusions on trust in authority that I want to draw from this example:

1. Governments rely on experts on technical matters to make decisions. But the unprecedented choice of the British Government to publish the September 2002 dossier shows that the use of experts in this case was more than a means of acquiring information about the facts in Iraq: it was also a way of legitimising its political action on the basis of the information contained in the dossier. The political authority appealed to the epistemic authority of its experts to justify its action.

2. In the February 2003 report, the expertise of the Intelligence services was questioned because of plagiarism: the information turned out to be correct, but the way it was acquired was unreliable and inappropriate. It seems, then, that the justification of epistemic authority matters to public opinion: an institution that has epistemic authority must not only hold the appropriate information but must also be justified in holding it. The “epistemic luck” of acquiring the right piece of information by chance (or through an unreliable method) deflates the authority of the institution.

3. Even if the September 2002 dossier was obviously produced for political reasons, that is, with the aim of establishing the facts that would justify the invasion of Iraq, a direct influence of the political authority on the presentation of facts is intolerable for the democratic functioning of a society, particularly on such a delicate matter as the decision to send people to war. In general: when the potential consequences are grave, the standards of evidence, objectivity and impartiality must be raised. An institution that has epistemic authority knows the facts that may justify a political decision, but its authority depends on its autonomy from political power.

4. A politically independent authority, Lord Hutton, was then charged with checking the facts and assessing the responsibilities of all the actors in the affair: experts, media and political authorities. His moral authority gave his report the special epistemic status of “ultimate truth about the case”. Our trust in the “cognitive order” of our society – that is, in who holds knowledge, by what rules and principles knowledge is distributed and diffused in a society, on what grounds experts should be believed – influences our trust in its social order and is influenced by it. Yet, as I said, real standards play a role, epistemic justice is a shared value, and massive violations of it are difficult to sustain, at least in democracies, which ground their consent in the autonomy and epistemic responsibility of their subjects.

In the second part of this paper I would like to explore how these real standards nonetheless vary according to times, places and contexts of communication. For example, what my father’s generation might have perceived as a form of gallantry and a well-mannered way of dealing with women in conversation, such as hiding the price list from women guests in a restaurant, is nowadays perceived as an unjust way of blocking a particular category of people’s access to information. Likewise, the paternalistic practice of doctors withholding part of the information about a patient’s health in cases of serious diagnosis is no longer accepted in medical ethics as a legitimate communicative practice. A contractualist relationship based on informed consent has become the standard way of dealing with ethical issues in medical decision making. This practice aims at readjusting the balance between the power of doctors and the autonomy of patients in order to avoid the risks of abuse to which a blind trust relationship would expose patients. Informed consent was introduced as a moral and legal requirement for any medical intervention, be it research or therapeutic, by the 1997 European Convention on Human Rights and Biomedicine[14].

These examples show a continuum, which I wish to explore, between communicative practices and epistemic standards. How we talk to other people – what we say and do not say – is more than just a matter of linguistic preference for a certain style of conversation. We can adjust our language according to the information we want to give access to. And, as hearers, we always adjust our interpretations according not only to our pragmatic expectations, but also to our epistemic needs.

For example, authority is an important epistemic cue in interpreting what other people say. Nice experiments show that the same text given to two different groups of people, with each group told it comes from a different source, one authoritative and the other not, produces very different interpretations among the readers[15]. In the case of the authoritative source, even if the text is obscure, people tend to overinterpret it in order to make sense of it; in the second case, the effort of interpretation is more limited and, if the text is too obscure, people rapidly conclude that it is nonsense (many of us have experienced the same “authority” effect in reviewing articles by colleagues and students… this is one of the reasons why the peer review process is anonymous!). Authority biases are thus extremely relevant to what we come to understand and believe from our informants. But this is not an on/off process in which we believe authoritative sources and disbelieve non-credible ones: it is the way we process information that comes from authorities, how we adjust our interpretation in order to make sense of what they say, that determines what we come to believe. It is thus the stance we take towards our informants, the way we exercise our epistemic responsibility, that makes us believe or not believe what we are told. The construction of testimonial knowledge is therefore a shared responsibility between informants and hearers: there are no purely unbiased informants – apart from some uninteresting cases like the train timetable at the station – just as there are no naïve receivers of information. Even children, long considered the paradigmatic case of naïve, credulous creatures, have proved to be more sophisticated epistemic subjects than we used to think: they take into account cues of credibility in deciding whether to accept what an informant says, and they check the linguistic consistency of their informants in order to adjust their investment of credibility[16].

Epistemic responsibility is thus a matter of adjusting the way we interpret what other people say to our epistemic needs: if I am involved in small talk at a party in which someone mentions the possibility of an invasion of Iran because of its military nuclear programme, I can accept loose evidence for the sake of conversation. But if I hear the same rumour from an insider at the Ministry of Foreign Affairs, and if the consequence for my life is that my son may risk his by fighting in a potential war, then it is my responsibility to raise my epistemic standards and to ask for further evidence about Iran’s military nuclear programme.

The interplay between linguistic practices and epistemic concerns may seem a trivial claim. But, surprisingly, there is little work in philosophy and epistemology that ties together the debate on the pragmatics of communication and the debate on the acceptance of testimonial knowledge. For example, in discussing hearers’ responsibilities in gaining knowledge from testimony, McDowell refers to a vague “doxastic responsibility” that the hearer should exercise before accepting testimonial information.[17] But what this doxastic responsibility consists of is left largely unexplained. I think that a fruitful way of conceiving this responsibility is to place it in the inferential process of interpreting what others say, in the responsible stance we assume in calibrating the epistemic weight we give to what we hear.

In communication, people do not look for true information but for relevant information, that is, information that is relevant enough in a particular context to deserve our attention. But what is relevant in a context is a good proxy for information that has epistemic value for us[18]. We trust other people to provide us with relevant information, and we adjust our epistemic requirements according to the context in which the interpretation takes place. We exercise our “epistemic vigilance”, to use an expression coined by Dan Sperber[19], during interpretation by adopting a stance of trust that our interlocutors will provide information that is relevant for us. Any departure from the satisfaction of our expectations of relevance may result in a revision or a withdrawal of our default trust.

Let me illustrate this interplay between epistemology and interpretation with two fictional examples: the first aims to show that a departure from relevance may have effects on our epistemic stance; conversely, the second illustrates how a change in our stance of trust may result in a different appraisal of relevance in interpretation.

Consider this case. Arianna is late tonight for the Parent-Children Association meeting at her son’s school. It is not the first time she has been late for these kinds of events, and she feels awfully guilty. She justifies herself to the President of the Association by telling her a long and detailed story about a series of accidents and unforeseen events that explain her delay. She adds a little too much to her story: not only did the underground stop for 15 minutes because of an alarm, but she also fell on the stairs, broke her umbrella and had to shelter from the rain under a roof. Then she met a very old friend who told her of a serious illness, and she was too moved to interrupt the conversation brusquely… The President listens with less and less attention: the relevance of what Arianna is saying is decreasing – too many details just to explain a delay. This lack of relevance weakens the President’s stance of trust: why all these details? Maybe she isn’t telling the truth?

Or consider this second example. A typical Parisian street fraud is to be approached by a stranger who pretends to be Italian, is very friendly and, after a conversation, tries to convince his “victim” to buy some fake leather jackets he has in his car. He deceives his “clients” by asking them to buy the jackets because he is supposedly unable to take them back to Italy for customs reasons, so that buying them is a very good bargain. Now, imagine that Jim is crossing the street and is approached by a guy with a strong Italian accent, who introduces himself as “Jules”. Jim starts a conversation with the default trust he usually displays, but at a certain point he remembers that a friend told him about this kind of fraud. He stays still, unable to withdraw immediately the attention he has granted to the guy. But Jules’ words are no longer the same to his ears. For example, Jules says: “Are you in a hurry?”. Jim is not in a hurry, but he interprets this as an invitation to spend more time with Jules and to accept his bargain proposals. He answers “Yes” and hurries away.

These two stories illustrate the interplay between trust and interpretation as I intend it here, that is, in the search for relevant information (information whose cognitive benefit is balanced against the effort required to process it). In the first case, Arianna’s description is too detailed: she is giving the President too much information for it to be relevant to her, and this creates a suspicious attitude in the President. In the second example, Jules is no longer reliable in Jim’s mind: this acts as a bias in the way Jim interprets what he is saying.

Our epistemic responsibility is first of all a matter of taking an appropriate stance of trust towards our informants, a sort of “virtual trust” that does not commit us to accepting as true what is said in conversation. Through our interpretation we weigh the authority and credibility of our informants according to our epistemic needs. On the other hand, the way informants “package in language” what they want to say has epistemic consequences for our allocation of credibility. And the epistemic duty of informants amounts to being relevant for us in a context, thus suggesting to us some possible epistemic gains to be had in listening to them. We may take the risk of adopting a trustful posture towards the speakers’ willingness to be relevant and yet check their truthfulness and reliability through the process of interpretation.

Epistemic responsibilities are thus shared, but in a lighter sense than is often intended in the epistemological literature on testimonial knowledge: we share a context of communication and a practice of interpretation, and we take on, on both sides, the responsibility for the epistemic consequences of our social life.

References

AA.VV. The Hutton Inquiry, Tim Coates Publisher, 2004, London.
Adler, J. (2003) Belief’s own ethics, Mit Press.
Clément, F.; Koenig, M.; Harris, P. (2004) “The Ontogenesis of Trust”, Mind & Language, 19, pp. 360-379.
Coady, A. (1992) Testimony, Oxford, Clarendon Press.
Foley, R. (2001) Intellectual Trust in Oneself and Others, Cambridge University Press.
Fricker, E. (2006) “Testimony and Epistemic Autonomy” in J. Lackey and E. Sosa (eds.) The Epistemology of Testimony, Oxford University Press.
Fricker, M. (2007) Epistemic Injustice, Oxford University Press.
Gopnik, A., Graf, P. (1988) “Knowing how you know: Young Children’s Ability to Identify and Remember the Sources of Their Beliefs”, Child Development, 59, n. 5, pp. 1366-1371.
Holton, R. (1994) “Deciding to Trust, Coming to Believe”, Australasian Journal of Philosophy, vol. 72, pp. 63-76.
Moran, R. (2005) “Getting Told and Being Believed”, in Philosophers’ Imprints, vol. 5, n. 5, pp. 1-29.
Origgi, G. (2004) “Is Trust an Epistemological Notion?” Episteme, 1, 1, pp. 61-72.
Origgi, G. (2005) “What Does it Mean to Trust in Epistemic Authority?” in P. Pasquino (ed.) Concept of Authority, Edizioni Fondazione Olivetti, Rome.
Origgi, G. (2007) “Le sens des autres. L’ontogenèse de la confiance épistémique”, in A. Bouvier, B. Conein (eds.) L’épistémologie sociale, EHESS Editions, Paris.
Origgi, G. (2008) Qu’est-ce que la confiance?, Paris, VRIN.
Pettit, P., Smith, M. (1996) “Freedom in Belief and Desire”, The Journal of Philosophy, XCIII, 9, pp. 429-449.
Pritchard, D. (2005) Epistemic Luck, Clarendon Press, Oxford.
Ross, A. (1986) “Why Do We Believe What We Are Told?”, Ratio, 28, pp. 69-88.
Ruffman, T.; Slade, L.; Crowe, E. (2002) “The Relation between Children’s and Mothers’ Mental State Language and Theory of Mind Understanding”, Child Development, 73, pp. 734-751.
Sabbagh, M. A.; Baldwin, D. (2001) “Learning Words from Knowledgeable vs. Ignorant Speakers: Links between Preschoolers’ Theory of Mind and Semantic Development”, Child Development, 72, pp. 1054-1070.
Shapin, S. (1994) A Social History of Truth, University of Chicago Press.
Sperber, D.; Wilson, D. (1986/1995) Relevance: Communication and Cognition, Basil Blackwell, Oxford.
Wilson, D.; Sperber, D. (2002) “Truthfulness and Relevance”, Mind, 111 (443), pp. 583-632.



[1] On the concept of “good informant” see E. Craig (1990), Knowledge and the State of Nature, Oxford, Clarendon Press, where he argues that our very concept of knowledge originates from the basic epistemic need in the State of Nature of recognizing the good informants, that is, those who are trustworthy and bear indicator properties of their trustworthiness.

[2] See S. Shapin (1994) A Social History of Truth, University of Chicago Press; M. Fricker (2007) Epistemic Injustice, Oxford University Press; and M. Fricker, this volume.

[3] The expression is due to McDowell. See J. McDowell (1998) “Knowledge by Hearsay”, in Meaning, Knowledge and Reality, Harvard University Press, Cambridge, Mass.

[4] Cf. G. Origgi (2004) “Is Trust an Epistemic Notion?”, Episteme, 1, 1.

[5] For an interesting and recent analysis of such cases, see D. Pritchard (2005) Epistemic Luck, Oxford University Press.

[6] Cf. http://middleeastreference.org.uk/

[7] Cf. B. Williams (2002), Truth and Truthfulness, Princeton University Press.

[8] Cf. Montaigne, “On Lyers”, Essays, Book 1, Ch. IX.

[9] Cf. Kant, Groundwork of the Metaphysics of Morals.

[10] Cf. B. Constant

[11] Cf. Kant, “On a Supposed Right to Lie Because of Philanthropic Concerns”, published as a postscript to the Groundwork.

[12] See on this point Russell Hardin: “Trust in Governments” in

[13] Cf. H. Frankfurt (2005), On Bullshit, Princeton University Press.

[14] Cf. on this case G. Origgi, M. Spranzi (2007) “La construction de la confiance dans l’entretien medical”, in T. Martin, P-Y. Quiviger (eds.) Action médicale et confiance, Presses Universitaires de Franche-Comté.

[15] Cf. G. Mosconi (1985) L’ordine del discorso, Il Mulino, Bologna; D. Sperber (2005) “The Guru Effect”, unpublished article, online at www.dan.sperber.com.

[16] Cf. F. Clément et al. (2004) “The Ontogenesis of Trust”, Mind and Language, 19, pp. 360-379; G. Origgi (2007) “Le sens des autres. L’ontogenèse de la confiance épistémique”, in A. Bouvier, B. Conein (eds.) L’épistémologie sociale, EHESS Editions, Paris.

[17] Cf. J. McDowell, cit.

[18] I am using here the technical concept of relevance developed by D. Sperber and D. Wilson in their post-Gricean approach to pragmatics. Cf. D. Sperber, D. Wilson (1986/95) Relevance: Communication and Cognition, Basil Blackwell. On the relations between relevance and truth, cf. D. Wilson and D. Sperber (2002) “Truthfulness and Relevance”, Mind, 111.

[19] Cf. D. Sperber, O. Mascaro (draft) “Mindreading, comprehension and epistemic vigilance”
