Belief bias (Persian: اعتقادگرایی) is a cognitive bias in which a person judges every argument by how believable its conclusion is, rather than by examining the strength of the argument itself. In other words, belief bias leads a person evaluating different arguments to attend only to how well each argument's conclusion fits with his or her personal beliefs and convictions.
This means that people often accept an argument that agrees with their beliefs without even considering whether its reasons are logical. Conversely, a person tends to reject an argument, even a logical and rational one, if it falls outside his or her value system.
Belief bias keeps people from thinking logically, because they rely only on an argument's conclusion; when that conclusion conflicts with their beliefs, they disregard the whole argument.
Each person's knowledge is a collection of facts and personal beliefs. The question is which of these a person gives more weight when drawing on that knowledge.
Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion. A person is more likely to accept an argument that supports a conclusion aligning with their values, beliefs and prior knowledge, while rejecting counterarguments to that conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relational reasoning and transitive reasoning.
A syllogism is a kind of logical argument in which one proposition (the conclusion) is inferred from two or more others (the premises) of a specific form. The classical example of a valid syllogism is:
All men are mortal.
Socrates is a man.
Therefore, Socrates is mortal.
An example of an invalid syllogism is:
All teenage girls are ambitious.
Teenage girls study hard.
Therefore, girls study hard because they are ambitious.
Typically, a majority of test subjects in studies incorrectly identify this syllogism as one in which the conclusion follows from the premises. It might be true in the real world that a) girls study and b) this is because they are ambitious. However, this argument is a fallacy, because the conclusion is not supported by its premises. The validity of an argument is independent of the truth of its conclusion: there are valid arguments for false conclusions and invalid arguments for true conclusions. Hence, it is an error to judge the validity of an argument from the plausibility of its conclusion. This is the reasoning error known as belief bias.
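The point that validity is a property of an argument's form, independent of whether its conclusion happens to be true, can be checked mechanically. The following sketch (not from the original article; all names are illustrative) brute-forces every small "world" of category memberships and declares an argument valid only if no world satisfies all premises while falsifying the conclusion. Because the causal "because" conclusion above resists this kind of formalization, the invalid case here uses a standard invalid form (the undistributed middle) instead.

```python
from itertools import product

# Minimal brute-force validity checker for "All X are Y" syllogisms
# (an illustrative sketch). A "world" is a set of individual types,
# each type a triple of memberships in categories A, B, C. An argument
# is valid iff no world makes all premises true and the conclusion false.

TYPES = list(product([False, True], repeat=3))  # (in_A, in_B, in_C)

def worlds():
    # Every subset of the 8 possible types is a possible world.
    for bits in product([False, True], repeat=len(TYPES)):
        yield [t for t, keep in zip(TYPES, bits) if keep]

def all_stmt(i, j):
    # "All <i> are <j>": every type in the world belonging to
    # category i also belongs to category j.
    return lambda w: all(t[j] for t in w if t[i])

def is_valid(premises, conclusion):
    return all(conclusion(w) for w in worlds()
               if all(p(w) for p in premises))

A, B, C = 0, 1, 2
# Valid form: All A are B; All B are C; therefore All A are C.
print(is_valid([all_stmt(A, B), all_stmt(B, C)], all_stmt(A, C)))  # True
# Invalid form (undistributed middle): All A are C; All B are C;
# therefore All A are B.
print(is_valid([all_stmt(A, C), all_stmt(B, C)], all_stmt(A, B)))  # False
```

The checker rejects the second form because there is a counterexample world: one containing an individual type that is in A and C but not in B satisfies both premises while falsifying the conclusion.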
When a person gives a response determined by the believability of the conclusion rather than by logical validity, this is referred to as belief bias only when a syllogism is used. The phenomenon is so closely tied to syllogistic reasoning that, when it occurs in areas such as Wason's selection task or the THOG problem, it is instead called "memory cueing" or the "effects of content".
Dual-process theory of belief bias
Many researchers in thinking and reasoning have provided evidence for a dual-process cognitive approach to reasoning, judgment and decision making. They argue that two mental processes (System 1 and System 2) compete for control of reasoning and decision making. System 1 can be described as an automatic response system characterised by "unconscious", "intuitive" and "rapid" evaluation, whereas System 2 is a controlled response system characterised by "conscious", "analytic" and "slow" evaluation; some researchers have even claimed to find a link between general intelligence and the effectiveness of decision making. It is important to note that the dual-process cognitive theory is different from the two-minds hypothesis. Research by Jonathan Evans in 2007 provided evidence for the view that System 1, which serves as a quick heuristic processor, competes for control with System 2's slower analytical approach. In the experiment, participants evaluated syllogisms of four kinds: valid arguments with unconvincing conclusions; valid arguments with convincing conclusions; invalid arguments with unconvincing conclusions; and invalid arguments with convincing conclusions. The results show that when the conclusion is believable, people accept invalid arguments as valid far more often than when it is not.
Influencing factors of belief bias
Time
Various studies have shown that the amount of time a subject is allowed when evaluating arguments affects the likelihood that belief bias occurs. In a study by Evans and Holmes in 2005, two different groups of people answered a series of reasoning questions. One group was given only two seconds to answer each question, whereas the other group was allowed as much time as they liked. A higher percentage of incorrect answers was found in the time-pressured group than in the other; the authors concluded that this resulted from a shift from logical to belief-biased thinking under time pressure.
Nature of content
The nature of the content presented can also affect an individual's belief bias, as shown by a study by Goel & Vartanian in 2011. In their experiment, 34 participants were presented with a syllogism on each trial. Each trial was either neutral or carried some degree of negative content. The negative content consisted of politically incorrect social-norm violations, such as the statement "Some wars are not unjustified; some wars involve raping of women; therefore, some raping of women is not unjustified". For syllogisms with neutral content, the results were consistent with earlier studies of belief bias; however, for syllogisms with negative emotional content, participants were more likely to reason logically about invalid syllogisms with believable conclusions instead of automatically judging them to be valid. In other words, the effect of belief bias is reduced when the content presented carries negative emotion. According to Goel and Vartanian, this is because negative emotion prompts us to reason more carefully and in more detail. This interpretation was supported by the observation that reaction times for questions with negative emotional content were significantly longer than for neutral questions.
In an experiment by Evans, Newstead, Allen & Pollard in 1994, subjects were given detailed instructions that made specific reference to the notion of logical necessity when answering the questions. A larger proportion of answers rejected invalid arguments with convincing conclusions than when no further instructions were given. These results indicate that when subjects receive elaborated instructions to reason logically, the effect of belief bias decreases.
In a series of experiments by Evans, Barston and Pollard (1983), participants were presented with an evaluation-task paradigm containing two premises and a conclusion, and were asked to evaluate its logical validity. The subjects, however, exhibited belief bias: they tended to reject valid arguments with unbelievable conclusions and to endorse invalid arguments with believable conclusions. Instead of following directions and assessing logical validity, the subjects based their assessments on personal beliefs.
Consequently, these results demonstrated a greater acceptance of believable (80%) than unbelievable (33%) conclusions. Participants also showed evidence of logical competence: valid conclusions were accepted more often (73%) than invalid ones (41%). Additionally, among valid arguments there was a marked difference between acceptance of believable conclusions (89%) and unbelievable conclusions (56%) (Evans, Barston & Pollard, 1983; Morley, Evans & Handley, 2004).
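Assuming, purely for illustration, that each validity-by-believability cell contained equal numbers of items, so that each reported marginal rate is the mean of its two cells, and reading the quoted 56% figure as the valid-but-unbelievable cell (the reading that makes the four marginals arithmetically consistent), the two unreported invalid cells can be reconstructed from the figures above:

```python
# Reconstructing the full 2x2 acceptance table from the rates quoted
# above, under the simplifying assumption of equal item counts per
# validity x believability cell (so each marginal averages its two cells).

believable, unbelievable = 80, 33   # marginal acceptance rates (%)
valid, invalid = 73, 41             # marginal acceptance rates (%)
valid_believable = 89               # quoted cell rate (%)
valid_unbelievable = 56             # quoted cell rate (%)

# Each believability marginal averages its valid and invalid cells:
invalid_believable = 2 * believable - valid_believable        # 71
invalid_unbelievable = 2 * unbelievable - valid_unbelievable  # 10

# Sanity check: the validity marginals come out as 72.5 and 40.5,
# matching the quoted 73% and 41% up to rounding.

# Belief-bias interaction: believability helps invalid arguments far
# more than valid ones.
belief_effect_valid = valid_believable - valid_unbelievable        # 33
belief_effect_invalid = invalid_believable - invalid_unbelievable  # 61

print(invalid_believable, invalid_unbelievable)    # 71 10
print(belief_effect_valid, belief_effect_invalid)  # 33 61
```

The reconstructed table makes the interaction concrete: believability raises acceptance of invalid arguments by roughly twice as much as it raises acceptance of valid ones, which is the signature pattern of belief bias.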
It has been argued that using more realistic content in syllogisms can facilitate more normative performance from participants, and it has been suggested that more abstract, artificial content will instead have a biasing effect on performance. Therefore, more research is required to understand fully how and why belief bias occurs and whether certain mechanisms are responsible for it. There is also evidence of clear individual differences in normative responding that are predicted by participants' response times.
A 1989 study by Markovits and Nantel gave participants four reasoning tasks. The results indicated “a significant belief-bias effect” that existed “independently of the subjects' abstract reasoning ability.”
A 2010 study by Donna Torrens examined differences in belief bias among individuals. Torrens found that “the extent of an individual's belief bias effect was unrelated to a number of measures of reasoning competence” but was, instead, related to that person's ability “to generate alternative representations of premises: the more alternatives a person generated, the less likely they were to show a belief bias effect."
In a 2010 study, Chad Dube and Caren M. Rotello of the University of Massachusetts and Evan Heit of the University of California, Merced, showed that “the belief bias effect is simply a response bias effect.”
In a 2012 study, Adrian P. Banks of the University of Surrey explained that “belief bias is caused by the believability of a conclusion in working memory which influences its activation level, determining its likelihood of retrieval and therefore its effect on the reasoning process.”