Atheism and the Psychology of Doubt and Belief

Updated on January 2, 2018

Joel is a writer who has spent 7 years researching and writing on topics of religion. He has a BA in psychology and an MA in education.


The Definition of Belief

The definition of the word "belief" has been called into dispute in recent years. Classically, "belief" simply meant any idea a person holds to be true. More recently, the concept of "belief" has become linguistically entangled with the concept of "faith." The definition of "faith" has drifted a great deal as well. Once synonymous with "trust," it has since become tied almost entirely to its use in religion. As religious beliefs have fallen out of fashion in a post-Enlightenment world, religious notions have come to be seen as far from "trustworthy." Consequently, "faith" is now "blind trust," and "belief" is basically "faith."

All this quibbling over definitions is disconcerting. From a psychological standpoint, everyone – when encountering a proposition – will assign that proposition to one of three categories: true, false, or unsure.

Since everyone has ideas they hold to be true which are actually true, and ideas they hold to be true which are actually false, the real question becomes: "How are beliefs formed, and how do they relate to the actual world in which we live?"

Beliefs and Knowledge

A strong example of the new definition of "belief" at work is Michael Shermer’s book The Believing Brain: From Ghosts and Gods to Politics and Conspiracies — How We Construct Beliefs and Reinforce Them as Truths. Shermer, himself an atheist, broadly seems to define “belief” as a conviction people arrive at intuitively. He argues that people adopt a belief as a result of the brain’s readiness to perceive patterns in the world around it, and to then assign agency to those patterns. Once a person has adopted a belief based purely upon intuition imposed on the world, Shermer says, the person looks for reinforcers for that belief, such that they provide reasons for belief after they have already believed.

Presumably, of course, Shermer believes that the system he defines in his book is accurate to reality. So either Shermer has come to that conclusion through the process he defines, or one ought to find a word other than “belief” to describe Shermer’s process. If Shermer doesn’t “believe” he’s stumbled upon a truth here, what does he do? Conclude it? Affirm it? Suspect it?

Further, when a psychologist such as Shermer tells a patient that she should “believe in herself” – is he saying that this patient ought to begin with an unfounded conviction of success, then find reasons to back up that conviction? In fact, he probably does. It kind of kills the message when one puts it that way, though.


Defining Belief

Either all people navigate the world around them operating on a hodgepodge of unfounded convictions – say, that the sky is blue, that cars have four tires, and that Michael Shermer is a quality psychologist – or people do, in fact, reach certain conclusions based on something other than intuition, and we ought to better work out the definition of “belief.”

The Oxford Dictionary gives “belief” as being “an acceptance that a statement is true or that something exists,” or “something one accepts as true or real; a firmly held opinion or conviction,” or “trust, faith, or confidence in someone or something.” Lastly, the dictionary concedes: “a religious conviction.”

So are there any studies addressing how one comes to the conclusion that something is true aside from intuition and pattern recognition? Or are all ideas about what is true reached in that manner, with investigation into why one’s preconceptions should be accepted coming only afterward?

If the latter, this is just further fuel for the argument that one’s ideas about things are entirely untrustworthy, and that we cannot ever “know” anything in the full sense of the word.


Dr. Alex Lickerman on Belief Formation

In his Psychology Today article, “Two Kinds of Beliefs,” Dr. Alex Lickerman espouses an idea similar to Shermer’s, but doesn’t leave a more traditional definition of “belief” off the table. Says Lickerman:

“Simply, a belief defines an idea or principle which we judge to be true.”

Despite his broader definition of “belief,” Lickerman, similar to Shermer, goes on to say that,

“We now know that our intellectual value judgments—that is, the degree to which we believe or disbelieve an idea—are powerfully influenced by our brains' proclivity for attachment. Our brains are attachment machines, attaching not just to people and places, but to ideas. And not just in a coldly rational manner. Our brains become intimately emotionally entangled with ideas we come to believe are true (however we came to that conclusion) and emotionally allergic to ideas we believe to be false.”

Here, Lickerman affirms the notion that people ought not necessarily trust anything they believe, because the way in which humans form beliefs is arbitrary – usually driven by environment and by preconceptions instilled early in life.

Lickerman goes on to say that, once a person forms a belief, they are drawn to things that back up that belief, and repulsed by things that do not – phenomena commonly known as “confirmation bias” and “disconfirmation bias.” Says Lickerman:

“Accuracy of belief isn't our only cognitive goal. Our other goal is to validate our pre-existing beliefs, beliefs that we've been building block by block into a cohesive whole our entire lives. In the fight to accomplish the latter, confirmation bias and disconfirmation bias represent two of the most powerful weapons at our disposal, but simultaneously compromise our ability to judge ideas on their merits and the evidence for or against them.”
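To make the asymmetry concrete, here is a minimal toy sketch – purely illustrative, not drawn from Lickerman’s article or any published model. The function name, step size, and discount factor are all hypothetical; the only point it demonstrates is that confirming evidence moves confidence far more than disconfirming evidence does.

```python
# Toy model of biased belief updating: confirming evidence gets full
# weight, disconfirming evidence is discounted. All names and numbers
# here are hypothetical illustrations, not a published psychological model.

def weigh_evidence(confidence: float, supports: bool, discount: float = 0.2) -> float:
    """Nudge confidence in a belief up or down on one piece of evidence.

    Confirming evidence counts in full; disconfirming evidence is
    multiplied by `discount`, so the belief erodes far more slowly
    than it grows.
    """
    step = 0.1
    if supports:
        return min(1.0, confidence + step)         # confirmation bias: full credit
    return max(0.0, confidence - step * discount)  # disconfirmation bias: discounted

confidence = 0.70
confidence = weigh_evidence(confidence, supports=True)   # rises to about 0.80
confidence = weigh_evidence(confidence, supports=False)  # falls only to about 0.78
```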

Lickerman, however, eventually shows his hand by piling on a heaping helping of his own disconfirmation bias. He says:

“This is why, for example, creationists continue to disbelieve in evolution despite overwhelming evidence in support of it and activist actors and actresses with autistic children continue to believe that immunizations cause autism despite overwhelming evidence against it.”

This is not to say he is necessarily wrong in his convictions about Creationism and anti-immunization campaigns, but at the point he says this, the article ceases to be a neutral, dispassionate explanation of facts drawn from studies, and begins making pronouncements on subjects the article is not equipped to address in terms of data collected and studies cited. He either assumes that the reader agrees with him, or that the reader will accept that he is correct on the basis of pure authority – exactly the kind of thing the article speaks against.

Lickerman betrays himself in the very next sentence:

“But if we look down upon people who seem blind to evidence that we ourselves find compelling, imagining ourselves to be paragons of reason and immune to believing erroneous conclusions as a result of the influence of our own pre-existing beliefs, more likely than not we're only deceiving ourselves about the strength of our objectivity.”

Lickerman suggests that adults should reason more like infants: accept those things that appear to be true on direct impression, rather than comparing them to pre-developed biases and forming conclusions backwards. Says Lickerman:

“…we have to identify the specific biases we've accumulated with merciless precision. And … we have to practice noticing how (not when) those specific biases are exerting influence over the judgments we make about new facts.”  

Scott Adams, the cartoonist known for his Dilbert comic, notes that people who have been given hypnotic suggestions will follow those suggestions – no matter how ridiculous – and then attempt to explain what they did in reasonable terms. In other words, someone may act upon a completely unreasonable impulse, then attempt to justify it through reason. This observation ties back into Lickerman’s theory of belief. Adams himself links it to religious beliefs.


Belief Mapping

Lickerman and Shermer largely agree in their diagnosis of how humans form beliefs; however, Lickerman is kind enough not to link his exclusively with religious and paranormal beliefs.

What Lickerman brushes up against in his theory is something which may well be termed “Belief Mapping.” Essentially, as Lickerman suggests, infants begin to map out the way the world functions through pure observation. They reach some general conclusions early on – things like “If I drop this, it falls,” and “If I touch that, I get smacked on the hand.” These are basic “if/then” observations which begin to form a pattern of how the world works around them.

This belief map is fleshed out a great deal in early childhood as children begin to interact with people and become aware that adults can show them things that work pragmatically. The concept of “authority” begins to form, and the child is perfectly comfortable accepting things on authority, since it generally proves to be good information. Authority becomes the primary channel for belief mapping, and may remain so for the rest of their lives (although the definition of “authority” may expand to include books, television, the internet, or any other source of information).

Once a person has formed a comprehensive enough belief map, they will compare new information against it and see where it fits in the schema of things. If the new information entirely contradicts the belief map, it is rejected. If it can be shoehorned into the belief map in some way, it is crammed in and the map is expanded accordingly. At this point, the belief map has become a Worldview.
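The accept/reject/expand loop just described can be restated as a short sketch. This is purely illustrative – the class and method names are hypothetical, and neither Lickerman nor Shermer proposes anything computational:

```python
# Purely illustrative sketch of the belief-mapping loop described above.
# The names (BeliefMap, judge) are hypothetical, not from any cited source.

class BeliefMap:
    def __init__(self):
        # Propositions currently held, e.g. {"dropped objects fall": True}
        self.beliefs = {}

    def judge(self, proposition: str, claimed: bool) -> str:
        """Compare a new claim against the existing map."""
        if proposition in self.beliefs:
            if self.beliefs[proposition] == claimed:
                return "accepted"   # confirms the map: adopted readily
            return "rejected"       # contradicts the map: dismissed outright
        # Novel claim: shoehorned in, and the map expands accordingly
        self.beliefs[proposition] = claimed
        return "integrated"

worldview = BeliefMap()
worldview.judge("dropped objects fall", True)    # "integrated"
worldview.judge("dropped objects fall", False)   # "rejected"
```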

This method of belief formation is not as terrible as Shermer and Lickerman might … well… believe. In one way, it is almost inevitable. One cannot continue to hold beliefs disconnectedly, in the helter-skelter manner of a child. Eventually, a person is apt to take the facts they hold and begin to connect them in some way. Inevitably, they will encounter and adopt the Worldview which makes best sense of the facts they hold, so that they may make sense of all facts they encounter in the future on the basis of that Worldview.

At this point, the person has a shortcut for judging the truth of any information they encounter. A new fact is encountered, immediately held up against the framework of the person's Worldview for comparison, and then adopted or dismissed accordingly. While not a flawless way of navigating the world of information, it has been an adequate method of thought for most of human existence. It increases the rate at which people can process new information, and decreases the number of facts people dismiss simply because they remain unsure.


Flaws in the System

The flaws of this system of belief formation have come sharply into focus with the arrival of the "Information Age." Now a person is bombarded by facts from every direction – like drinking from a fire hose. Worse, they are aware that much of the information out there is false or misleading. Belief mapping kicks into overdrive, and ideas are adopted or dismissed practically without consideration, based entirely upon which seem right and which seem wrong compared with the person's current belief map.

Consider, for instance, Fake News – sensationalized news stories which began circulating online in the mid-2010s. Fake News preys upon specific worldviews for propagation. If a story comes out that says something like, "President orders bombing of orphanage in Uganda," people who like the president will recognize the story for the sham that it is, because their belief map will not allow for that kind of egregious behavior from a man they respect. People who dislike the president, however, will eat the story up like candy, because it confirms what they already suspect about him.

Additionally, matters upon which a person has no set opinion will be accepted or rejected based on the person's Worldview. Thus, for instance, a person who has no interest in, nor opinion about, say, gun laws will – when confronted with the matter – tend to defend the position of their political party, based entirely upon their allegiance to that Worldview.


Belief Formation and the Scientific Method

However, this process of data collection, worldview formation, and fact confirmation is actually very similar to the way in which science works. A model is constructed to explain facts – say, field theory, which explains the fundamental nature of the material universe – and all new information is compared against the accepted model and judged accordingly. New information is either integrated into the current scientific model, held suspect because of the way it contradicts the current model, or accepted as accurate, resulting in a revision of the model. In many ways, Belief Mapping is the only way in which a person's thought processing may advance to maturity.
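The parallel can be sketched the same way as the belief map above – again purely illustrative, with hypothetical names and an arbitrary threshold. The one difference is that enough accumulated contradictions force a revision of the model rather than a dismissal of the data:

```python
# Illustrative sketch of the scientific-model version of the loop:
# a finding that fits is integrated; an isolated contradiction is held
# suspect; repeated contradictions trigger a revision of the model.
# All names and the threshold of 3 are hypothetical.

def judge_finding(model: set, anomalies: list, finding: str) -> str:
    if finding in model:
        return "integrated"        # consistent with the current model
    anomalies.append(finding)
    if len(anomalies) < 3:
        return "suspected"         # a lone anomaly: doubt the data first
    model.update(anomalies)        # anomalies pile up: revise the model
    anomalies.clear()
    return "model revised"

model, anomalies = {"objects fall"}, []
judge_finding(model, anomalies, "objects fall")   # "integrated"
judge_finding(model, anomalies, "light bends")    # "suspected"
```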

To completely reject the concept of “belief” based on human fallibility is to cut off one’s nose to spite one’s face. The human capacity to “believe” is both unavoidable and necessary to function.


If a caution may be taken from Shermer and Lickerman’s critique of belief formation, it is that one must be willing to modify one’s Worldview if strong enough evidence presents itself. Of course, this knife cuts both ways. If anyone has motivation to suspect their own core beliefs, it is the very person who has seen human fallibility in belief formation. Lickerman begins his article preaching against homeopathy and punctuates it with a rallying cry against creationism and anti-vaccination. Clearly Lickerman has some underlying audience he looks down upon for rationalizing their beliefs. Perhaps Lickerman's beliefs have been adequately researched and formed dispassionately, and perhaps not – but either way, a motive remains plain as he preaches the inadequacy of belief formation.

It could not be clearer that Shermer had a motive for his book beyond merely describing belief formation. It was, after all, subtitled “From Ghosts and Gods to Politics and Conspiracies — How We Construct Beliefs and Reinforce Them as Truths.” If anyone ought to know better than to undermine a point by showing his hand up front, it would be a psychologist commenting on belief formation.

Again, belief mapping has never been as problematic as it is in the Information Age. If a solution is to be reached, it starts with individuals being skeptical of their own belief maps and of all information they receive, no matter how attractive.

As far as communicating with others goes, educational theory offers a nice, common-sense method of integrating information into a person's worldview with the least amount of resistance: you meet people where they are.

An educator will, for instance, probe a student for their interests, then teach the subject matter by relating it to those interests. Math may relate to music or shopping, so if the student likes shopping, that interest may be tapped in order to teach them math.

Parents instinctively do this for children, too. To explain the concept of taxes, they may use chore money to demonstrate how it works. You find something the person has already integrated into their belief map, and then use that to demonstrate your point.

In short, Belief exists. It is a word relevant to everyone – at least by its classic definition. And everyone shares the same potential flaw: if one's worldview is flawed, one's belief formation will be poor at discerning accurate beliefs from inaccurate ones. One must question one's own belief map before attacking those of others.


The Psychology of Doubt

Doubt characterizes a state of mind in which a proposition that has been held as true becomes suspect, and then remains in a status of being held neither fully true nor fully false. It can also describe a state in which a mind encounters a new idea and is unable to decide upon the truth or falsity of that idea.

It can also describe something which is not trustworthy. This is especially the case with self-doubt: the inability to trust oneself to discern between that which is true and that which is false.

It can likewise happen that when a person encounters a source of information they have determined to be unreliable, any information from that source will be deemed uncertain as to its truthfulness.

Possibly the most common kind of doubt is self-doubt. Typically, people who self-doubt do so because of a negative self-image. They have come to the conclusion that they cannot trust themselves – either to come to reasonable conclusions or to control their own lives.

When people self-doubt, they typically have what is called an “external locus of control,” meaning they believe they have little or no control over their life and their environment. They don’t make things happen – things happen TO them.

The source of self-doubt is usually something that happens early in a person’s development, and it is typically reinforced by trusted outside sources. This being so, the person comes to rely on others to affirm or deny beliefs.

Such a person will look to others in order to validate beliefs. If and when peers or authorities deny a particular belief, the person will adopt the beliefs of those around them.

A person with fairly strong self-respect will tend to rely on their own capacity to affirm or deny beliefs. This person typically has an internal locus of control – meaning they are self-reliant. They rely on themselves to discern the truth or falsity of beliefs. Such a person is much less likely to self-doubt than the previous type, and it will take a great deal to convince them they have been wrong about something. For this kind of person, however, doubt is a much stronger force. If they are convinced in some way (usually through personal investigation rather than taking the word of some authority) that they have been wrong about something, they are almost certain to suffer, because they are self-reliant and have exposed a flaw in their own thinking.

On the basis of certain studies, atheists in general tend to be more self-reliant, with an internal locus of control. There are certainly irreligious people who are less self-reliant, but they tend to be the so-called “Nones,” who are willing to remain unsure about religion rather than making a firm decision as to the truth or falsity of the belief.

On average, the atheist – who has made a firm decision regarding the truth or falsity of religion – tends, according to studies, to be an analytical thinker and a self-reliant one as well. Atheists tend to be the kind of people who avoid herd mentality, such that they don’t feel the need for things like the emotional exultation of the worship experience or the sense of community offered by the church.

As mentioned before, it tends to be far less likely for someone with an internal locus of control and an analytic style of thinking to doubt their viewpoint, since they consider themselves the masters of their own beliefs.

This is not meant as a criticism of people with an internal locus of control – only to say that such people are far less able to change their views, since once they have formed a belief, it tends to be set in stone.

Doubt, in general, tends to be a very uncomfortable feeling – so much so that people will actively avoid or reject sources of information that might contradict the truths they endorse. This ties back to Lickerman's confirmation and disconfirmation biases.

The fact that doubt can cause mental – or even physical – discomfort should not be entirely surprising: when one's beliefs are called into doubt, it suggests that one cannot trust oneself to determine truth. When people question their own sensibilities, they have to question not just the one belief in question, but all beliefs they hold, because they realize they have the capacity for error.


