When you are bombarded with so much nonsense and misinformation every day, you need something to help you filter out the good stuff. You need a baloney detection kit...
This is the sixth and final part in my series on tracking down pseudoscience. Here are the previous posts in the series:
1. How Reliable is Your Source
2. Tracking Down Pseudoscience - Part 2
3. Tracking Down Pseudoscience - Part 3
4. Tracking Down Pseudoscience - Part 4
5. Tracking Down Pseudoscience - Einstein got it all Wrong!
Generic Introduction
For those who want to become educated in critical thinking, who want to be able to spot the baloney and the potential deception in other people's claims and arguments, and who are interested in solid reasoning, Carl Sagan and Michael Shermer propose their baloney detection kit: a set of questions to ask when in doubt.
Today I'm going to focus on question 10 from the baloney detection kit.
Baloney Detection - Point 10
Question 10 - Do the claimant's personal beliefs and biases drive the conclusions?
Most of us do not realize that many of the decisions we make every day are largely based on personal biases and ideologies and have little to do with rational thinking. The same goes for the thoughts we have and the actions we take.
Modern humans have evolved over the last 200,000 years, and earlier hominid species trace back roughly 2 million years. Throughout this time we developed automatic 'decisions' that kept us from falling prey to predators or from being cast out of the group.
Our inborn flaws in thinking have, in part, provided us with a safe place in the group or tribe.
Think of 'herd mentality', for example: you don't want to be the one going against the consensus within the group, because you risk being thrown out, and in a tribe-based society, getting by all alone is difficult. You are safer sticking with the group. Herd mentality is one of your inner biases - you apply it unconsciously, even when it works to your detriment.
Think of cognitive dissonance, or what happens when two or more contradictory ideas compete for a place within your thinking: cognitive dissonance creates mental unease and consumes mental 'energy'. When you are confronted with ideas and claims that contradict your beliefs, your tendency is to avoid the dissonance by disregarding the new claims and staying loyal to the beliefs you already hold; this is how confirmation bias works.
In general, not everything about cognitive biases and logical fallacies is bad. We should give some of them credit, as they help us get around even in today's society, which is very different from the environment we evolved in.
This does not mean we shouldn't educate ourselves about them. Being aware of your inner mental flaws is one step toward better thinking; however, know that awareness alone does not provide complete safety.
Rational and critical thinkers may also fall prey to these biases. Still, it is better to know about them than to run in total darkness (to be a passenger in your life instead of the driver).
A good application of critical thinking for minimizing personal biases is the scientific method. It is not only useful for people doing scientific research; I think it should be included in the decision-making repertoire of each and every one of us. I believe we would become a better society.
We can learn from the way in which the peer-review process is conducted in research:
Peer review has been around for roughly 300 years, ever since the first scientific journals. According to Elsevier:
"Peer review helps validate research, establish a method by which it can be evaluated, and increase networking possibilities within research communities."
The scientific method and the peer-review process are not without flaws; they have their critics and have drawn criticism. Nonetheless, they are among the best tools we have for validating research and minimizing bias.
Ending thoughts
Ideology and politics have no place in science. To be objective in your approach, you have to stave them off and look at the data you have at hand (good or bad). As Michael Shermer elegantly put it:
"Science deals in fuzzy fractions of certainties and uncertainties, where evolution and big bang cosmology may be assigned a 0.9 probability of being true, and creationism and UFOs a 0.1 probability of being true. In between are borderland claims: we might assign superstring theory a 0.7 and cryonics a 0.2.
In all cases, we remain open-minded and flexible, willing to reconsider our assessments as new evidence arises. This is, undeniably, what makes science so fleeting and frustrating to many people; it is, at the same time, what makes science the most glorious product of the human mind."
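Shermer's fuzzy fractions of certainty map naturally onto Bayesian reasoning: you hold a prior probability for a claim and revise it when new evidence comes in. As a rough illustration (my own sketch, not part of Shermer's kit), here is how such an update works in Python:

```python
# A minimal sketch (an illustration, not from the article) of revising a
# credence with Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated probability of a claim after seeing evidence."""
    numerator = p_evidence_if_true * prior
    marginal = numerator + p_evidence_if_false * (1 - prior)
    return numerator / marginal

# A Shermer-style borderland credence of 0.2 (e.g. cryonics),
# confronted with evidence 3x more likely if the claim is true.
belief = bayes_update(0.2, 0.6, 0.2)
print(round(belief, 2))  # prints 0.43 - the credence rises, but stays well below 0.9
```

The point is not the specific numbers (the 0.6 and 0.2 likelihoods are made up for the example) but the habit: assessments are provisional and move with the evidence, in either direction.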
I am hopeful that this series of 6 posts will help both you and me become better thinkers - critical thinkers prepared to deal with the misinformation, deception, and all the baloney thrown our way 24/7, intentionally or unintentionally.
While this is the end of my series on tracking down pseudoscience, it's not the end of my quest to develop good thinking. Stay tuned, there's more to come. In the meantime, I'll leave you with a 12-minute 'friendly' video explaining the scientific method, so that you can implement it in your life (personal and professional) right away:
Credits for Images: [By Center for Scientific Review [2] [Public domain], via Wikimedia Commons] and [By Nikolaus Kriegeskorte, CC-BY-3.0, via Wikimedia Commons].
#science #psychology #practical
Cristi Vlad, Self-Experimenter and Author