In 2024, we face information overload. In the country where I live, media outlets and their super-rich owners no longer compete over who breaks the news first. They share power, and the people above them dictate what gets written and which outlet carries it. Floods, mass casualties, and accidents go unreported because nobody cares. My point is that we receive a great deal of information, and we must not merely select it but evaluate it objectively, and then, according to our needs, pursue new information from sources we discovered through earlier ones. Consider this the prelude to a beginner’s guide to critical thinking.
IQ doesn’t exist? Then what is it that grows in infants, toddlers, and teenagers?
My point is that the average intellectually able baby isn’t ready to discuss whether a monotheistic God exists. But thanks to psychometrics, cognitive psychology, and neuroscience, we know there is something – describable, if only vaguely – that we call IQ.
It is a pity that humanity’s smartest brains discovered this concept – one of the most statistically robust in psychology – so late.
We also have talents and creativity, but these derive from something we call sheer intelligence.
How intelligence arises
Babies are born with basic cognitive abilities, and their early experiences are centered around sensory perceptions. In the first months of life, they recognize faces and voices and respond to stimuli like touch and sound. Their brains are developing rapidly, but their cognitive functions are mostly instinctual, with little to no problem-solving ability. By the time they reach the infant stage, around 6 months to 2 years, their understanding of the world begins to expand. Infants start to recognize patterns, form basic memories, and develop an understanding of cause and effect. Their ability to focus on objects and people strengthens, and they become curious about their environment. This early curiosity is an essential foundation for the development of general intelligence (the g factor), as it reflects the brain’s growing capacity to process and organize information.
As children transition to the toddler stage, typically between ages 2 and 4, their cognitive skills see a notable leap. Toddlers start to develop language skills, which are a critical component of IQ. They begin to form simple sentences, express basic emotions verbally, and understand simple instructions. They also show early problem-solving skills, such as figuring out how to use tools or navigate around obstacles. This period is crucial for brain development, as toddlers learn through exploration and play. Their reasoning abilities are still limited, but they start to grasp concepts like size, shape, and quantity. The g factor becomes more evident as they acquire skills that require memory, attention, and problem-solving, all of which are essential for intellectual development. Toddlers also begin to imitate others, showing early signs of social learning, which further contributes to cognitive growth.
Childhood IQ
During the childhood stage, particularly as they approach school age, children’s cognitive abilities take another significant step forward. Between ages 5 and 12, they develop more complex language and reasoning skills. They learn to read, write, and perform mathematical operations, which are directly linked to the development of general intelligence. Their ability to think abstractly begins to form, although it is still in the early stages. Children now understand concepts like time, sequence, and relationships between objects. They become capable of following multi-step instructions and solving problems that require logical thinking. In terms of the g factor, this period sees a steady increase in intellectual abilities, driven by formal education and increased exposure to structured problem-solving tasks. As they grow, children begin to understand and apply rules, both in social situations and in learning environments, which helps them develop critical thinking skills.
Adolescence
As they enter adolescence, typically around ages 13 to 19, teenagers experience significant cognitive and intellectual development. The brain continues to mature, especially in areas responsible for higher-order thinking, such as the prefrontal cortex. Teenagers develop the ability to think abstractly and critically, which allows them to engage in more complex problem-solving and decision-making. Their capacity for reasoning becomes more advanced: they can understand hypothetical situations, grasp moral and ethical dilemmas, and engage in reflective thinking. IQ levels often stabilize during adolescence, as the g factor reaches its adult potential. Teenagers also begin to develop metacognition, the ability to think about their own thinking processes, which is a hallmark of mature intellectual development. This stage is marked by a growing ability to plan, analyze, and synthesize information, skills that are essential for adult-level intelligence.
Throughout these stages, from babyhood to adolescence, the progression in IQ and g factor development is closely tied to both biological maturation and environmental influences. Early cognitive stimulation, education, and social interaction all play critical roles in shaping the intellectual growth of a child. By the time they reach adulthood, their g factor has typically reached its peak, reflecting the culmination of years of cognitive development, learning, and experience.
Let’s start from the basics
A child knows they have put a cup into the sink. Then they learn how to lace their shoes. And as the logical complexity rises, they become able to make rational claims about religion. But let’s move on to what they need.
What is information?
Information is data that has been processed, organized, or structured in a way that provides meaning or context. It can be used to understand, communicate, or make decisions, serving as a bridge between raw data and knowledge. Information often answers questions like who, what, where, when, or how, and is crucial in guiding human understanding and actions.
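To make the data-versus-information distinction concrete, here is a minimal sketch in Python (the scenario and numbers are my own illustration, not from any standard source): raw readings become information once they are processed, organized, and given context.

```python
# Raw data: a bare list of numbers, meaningless without context.
# (Hypothetical sensor values, used purely for illustration.)
readings = [21.5, 22.1, 35.9, 21.8]

# Information: the same data processed, organized, and framed so it can
# answer a question ("what happened, when?") and guide a decision.
average = sum(readings) / len(readings)
peak = max(readings)
print(f"Room temperature averaged {average:.1f} degrees C today; "
      f"the {peak:.1f} degree peak suggests a brief heat spike worth checking.")
```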
A beginner’s guide to critical thinking: where to find information?
When I was deciding whether any spiritual paths could be truthful, I heard about science. I didn’t know precisely what science was, but I found information – popular science books, articles, TV series. So I obtained that information. And it gave me a thousand possibilities of what to explore, along with the opinions of prominent scientists.
It was the same with philosophy. I had no clue why philosophy exists when there is science. I had come into contact only with continental philosophy, so I assumed that philosophy in general stands in stark contrast to science. But it is about information.
I learned not only that there is another kind of philosophy, but that it is compatible with science and even helps science (analytic philosophy). But I had to work with the information and then, finally, acquire critical thinking.
Critical thinking: How to work with information
A beginner’s guide to critical thinking could continue with how to work with information. A duck cannot work with information, but you, given your mental ability, can. And our brains are pretty good at reproducing reality (given how complex reality is). You can invent 10,000 theories or thoughts, but only one is true.
So you must work with it. Some people have distinct IQ profiles, some have talents and creativity, but critical thinking definitely requires raw thinking power.
What are cognitive biases, fallacies and formal fallacies?
Cognitive biases are systematic patterns of deviation from rational thinking. They occur when the brain uses mental shortcuts, or heuristics, to make decisions quickly, often leading to errors in judgment. These biases are the result of evolutionary pressures that favored speed and efficiency over perfect accuracy. For example, confirmation bias leads people to focus on information that supports their pre-existing beliefs while ignoring contradictory evidence. Cognitive biases affect daily decision-making, from personal choices to interpreting news, often distorting reality and reinforcing false beliefs.
Fallacies, on the other hand, are errors in reasoning that undermine the logic of an argument. They can be intentional or unintentional, often used in debate or rhetoric to mislead or persuade without sound logic. A common example is the ad hominem fallacy, where a person attacks their opponent’s character instead of addressing the actual argument. Fallacies exploit weaknesses in human reasoning and can be particularly persuasive when the listener is not critically evaluating the argument. Unlike cognitive biases, which are unintentional mental errors, fallacies often involve a deliberate misuse of logic.
Formal fallacies specifically refer to errors in the structure of an argument that violate the rules of logic. These fallacies occur in deductive reasoning, where the argument’s form is flawed, making the conclusion invalid even if the premises are true. An example is the “affirming the consequent” fallacy, where someone assumes that if a certain result is observed, the cause must have occurred, even though other causes could lead to the same result. Formal fallacies are distinct from informal fallacies because they involve the breakdown of logical form rather than content, making them detectable through strict logical analysis.
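To see why a formal fallacy is detectable from structure alone, here is a minimal sketch in Lean 4 (the proof assistant is my choice of illustration; the article itself names no formal system). Modus ponens is provable from the shape of the argument, while affirming the consequent admits a counterexample:

```lean
-- Modus ponens is valid: from "if P then Q" and P, we may conclude Q.
example (P Q : Prop) (h : P → Q) (hp : P) : Q := h hp

-- Affirming the consequent is invalid: from "if P then Q" and Q,
-- we may NOT conclude P. Counterexample: P := False, Q := True,
-- where P → Q and Q both hold but P does not.
example : ¬ (∀ P Q : Prop, (P → Q) → Q → P) :=
  fun h => h False True (fun f => f.elim) True.intro
```

Because only the form matters, the same counterexample works no matter what P and Q stand for – which is exactly what makes the fallacy “formal.”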
Our mental abilities are fit for the African savannah
Cognitive biases and fallacies developed over the course of human evolution as adaptive mechanisms for survival in harsh, unpredictable environments. Early humans often faced life-threatening situations that required quick, decisive action. Our brains evolved to use mental shortcuts – called heuristics – that allowed for fast judgments without the need for lengthy analysis. In prehistoric environments where dangers were immediate, these shortcuts helped individuals survive. For example, hearing a rustle in the bushes and assuming it was a predator might have been incorrect in many cases, but those who acted on the assumption survived more often. These quick, automatic responses helped early humans react to threats swiftly. This was a crucial advantage when lives depended on making fast decisions in environments filled with uncertainty.
One significant bias that emerged from this evolutionary pressure is confirmation bias, which refers to the tendency to search for, interpret, and remember information that confirms one’s existing beliefs or hypotheses. This bias likely evolved because it made decision-making more efficient in uncertain environments. Early humans who held onto successful beliefs and avoided constantly reevaluating them were better equipped to avoid threats. In a survival context, it was less risky to rely on past experiences that had worked rather than continuously challenging and revising assumptions, which could take too much time or energy. However, in the modern world, confirmation bias leads to flawed thinking and a refusal to engage with new, contradictory evidence. This is particularly harmful in complex areas like science, politics, and social issues, where sticking rigidly to one’s pre-existing beliefs can prevent learning, progress, and understanding.
The availability heuristic
Another crucial bias that developed during human evolution is the availability heuristic. This cognitive shortcut leads people to estimate the likelihood of events based on how easily examples come to mind. For early humans, this was a vital survival tool. If someone recently witnessed a dangerous event – such as a tribe member being killed by a predator – they would overestimate the probability of encountering that predator again. This mental shortcut increased vigilance and helped avoid danger. However, in modern times, the availability heuristic often leads to irrational fears and misjudgments. For example, people may have an exaggerated fear of airplane crashes or terrorist attacks because these events receive extensive media coverage and are easy to recall, even though they are statistically rare. This bias skews our perception of risk and leads to disproportionate reactions to low-probability events.
The appeal to authority fallacy, where people defer to authority figures without critical thinking, also has evolutionary roots. In small tribal societies, it made sense to rely on the wisdom of experienced leaders or elders, as their survival skills and knowledge of the environment were crucial for the group’s survival. Trusting authority was a way to ensure that individuals followed proven strategies for finding food, avoiding danger, and maintaining social order. This deference to authority was a survival advantage in small, close-knit communities where leaders had direct knowledge of the group’s needs. However, in today’s world, this bias can lead to blind obedience or uncritical acceptance of advice from authority figures who may not always have the necessary expertise. In modern, complex societies, this fallacy can lead to poor decision-making when people follow leaders, celebrities, or so-called experts without scrutinizing the validity of their claims.
The Bandwagon Effect
The bandwagon effect, another cognitive bias with evolutionary roots, evolved to ensure social cohesion within early human groups. In small tribes, conforming to group norms was essential for survival. Individuals who aligned with the group’s beliefs and behaviors were more likely to be accepted, gain protection, and access shared resources. Rejecting the group’s values could lead to isolation or exclusion, both of which were dangerous for early humans who relied on others for protection and cooperation in activities like hunting and foraging. Conformity fostered group harmony and reduced conflict, which benefited the tribe’s overall survival. However, in modern times, the bandwagon effect can be harmful. People adopt popular beliefs or behaviors without critical thinking, leading to groupthink. This can result in poor decisions, as individuals follow the crowd rather than evaluating the evidence or considering alternative viewpoints.
The sunk cost fallacy
The sunk cost fallacy is another cognitive bias that evolved as a survival mechanism but has become maladaptive in modern contexts. This fallacy occurs when people continue investing time, effort, or resources into something simply because they’ve already committed to it, even when it no longer makes sense to do so. In an evolutionary context, persistence in the face of adversity was often essential. Early humans who gave up on difficult tasks too quickly, such as hunting or gathering food, might have starved.
The ability to persist in challenging situations was a valuable trait, as it increased the chances of eventually securing vital resources. However, in the modern world, the sunk cost fallacy can lead people to stay in bad relationships, hold onto failing investments, or persist with unworkable strategies because they don’t want to feel that their previous efforts were wasted. This leads to greater losses and wasted resources, making the bias maladaptive in today’s complex environments.
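As a worked illustration of why sunk costs should not enter the decision (the figures below are hypothetical, chosen only to make the arithmetic visible), a rational chooser compares only future costs and payoffs:

```python
# Hypothetical figures for a project you have already paid 100 into.
sunk = 100                             # spent either way; identical in both branches
finish_cost, finish_payoff = 80, 50    # finishing costs 80 more, returns 50
abandon_cost, abandon_payoff = 0, 0    # walking away costs and returns nothing

# Only future consequences differ between the options, so only they matter.
net_if_finish = finish_payoff - finish_cost      # -30
net_if_abandon = abandon_payoff - abandon_cost   #   0

best = "abandon" if net_if_abandon > net_if_finish else "finish"
print(f"Ignoring the sunk {sunk}, the better move is to {best}.")
# The sunk 100 cancels out of any comparison between the two options;
# letting it sway the choice is exactly the fallacy described above.
```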
Throughout human evolution, these cognitive biases and fallacies served useful purposes by simplifying decision-making and helping our ancestors navigate dangerous and uncertain environments. They allowed for quick judgments that increased the likelihood of survival. However, in modern times, these same biases often lead to poor decision-making, as the environment has drastically changed. We no longer face the same immediate, life-threatening dangers that required such rapid judgments. Instead, the complexity of modern society demands more careful, rational thought and critical evaluation of information. Yet our brains are still wired to rely on shortcuts that are now frequently maladaptive. These evolutionary remnants are responsible for many of the errors in judgment and reasoning that humans still struggle with today.
A beginner’s guide to critical thinking continues: there are hundreds of cognitive biases, informal fallacies, and formal fallacies
Cognitive Biases:
- Confirmation Bias – People seek out and interpret information in a way that confirms their pre-existing beliefs, ignoring evidence that contradicts them.
- Availability Heuristic – People judge the likelihood of an event based on how easily examples come to mind, such as fearing flying after seeing news of a plane crash.
- Anchoring Bias – People rely heavily on the first piece of information they encounter, which influences their later judgments, even if the anchor is irrelevant.
- Hindsight Bias – After an event, people believe they predicted it all along, which leads to overconfidence in their predictive abilities.
- Dunning-Kruger Effect – People with low skill in an area overestimate their abilities, while those with expertise may underestimate their competence.
- Self-Serving Bias – People attribute their successes to personal qualities but blame failures on external factors, leading to skewed self-perceptions.
- Negativity Bias – People focus more on negative experiences than positive ones, which affects their judgment and emotional responses.
- Framing Effect – People make decisions based on how information is presented, such as reacting differently to “90% fat-free” versus “10% fat.”
- In-Group Bias – People favor those within their social group and may discriminate against outsiders, affecting judgment and behavior.
- Sunk Cost Fallacy – People continue to invest in a losing endeavor because of past investments, instead of cutting their losses.
- Optimism Bias – People believe that they are more likely to experience positive outcomes than negative ones, often underestimating risks.
- Status Quo Bias – People prefer things to stay the same, even if a change might lead to better outcomes.
- Illusory Correlation – People perceive a relationship between two unrelated variables, such as associating an object with bad luck.
- Halo Effect – People allow one positive trait to influence their overall impression of someone or something, which can distort judgment.
- Overconfidence Bias – People overestimate their knowledge, abilities, or predictions, which leads to risky decisions or poor outcomes.
- Recency Bias – People give more weight to recent information than to older data when making decisions, even when older data might be more relevant.
- Gambler’s Fallacy – People believe that random events are influenced by past events, such as assuming that after a string of losses, a win is due.
- Bandwagon Effect – People adopt beliefs or behaviors because others are doing so, often without critical evaluation.
- Base Rate Fallacy – People ignore statistical information (the base rate) in favor of specific, anecdotal information, leading to flawed conclusions (a worked example follows this list).
- False Consensus Effect – People overestimate how much others share their beliefs or behaviors, leading to mistaken assumptions about social norms.
- Outcome Bias – People judge the quality of a decision based on its outcome rather than the decision-making process, which can distort their evaluation of choices.
- Belief Bias – People accept or reject arguments based on how believable their conclusions are, rather than on the strength of the evidence.
- Clustering Illusion – People see patterns in random data, such as interpreting a random sequence of events as meaningful.
- Primacy Effect – People remember information presented first better than information presented later, which influences their overall judgments.
- Survivorship Bias – People focus on the successes in a group and ignore the failures, which can lead to overly optimistic views.
- Backfire Effect – When confronted with evidence that contradicts their beliefs, people may strengthen their original beliefs instead of reconsidering them.
- Attentional Bias – People focus on specific types of information, often what is personally or emotionally relevant, and ignore other important details.
- Placebo Effect – People experience improvements because they believe a treatment is effective, even if the treatment itself has no therapeutic value.
- Planning Fallacy – People consistently underestimate how long a task will take, leading to overly optimistic timelines.
- Fundamental Attribution Error – People attribute others’ actions to their character while downplaying situational factors.
- Endowment Effect – People value things more because they own them, leading to irrational attachment to personal possessions.
- Egocentric Bias – People overestimate how much others pay attention to them or share their thoughts and beliefs.
- Just-World Hypothesis – People believe that the world is fair and that people get what they deserve, leading to victim-blaming in unjust situations.
- Pessimism Bias – People expect negative outcomes more than positive ones, leading to overly cautious decisions.
- Bystander Effect – People are less likely to take action in emergencies when others are present, assuming someone else will intervene.
- Actor-Observer Bias – People explain their own actions by situational factors but attribute others’ actions to personal characteristics.
- Barnum Effect (Forer Effect) – People believe vague, general statements (such as horoscopes) apply specifically to them.
- Mere Exposure Effect – People develop a preference for things they are familiar with, leading to bias toward the known over the unknown.
- Risk Compensation – People take greater risks when they feel safer, such as driving more recklessly when wearing a seatbelt.
- Choice-Supportive Bias – After making a decision, people believe their choice was better than it actually was, downplaying its flaws.
- Zero-Sum Bias – People believe that one person’s gain must result in another’s loss, even in situations where mutual benefit is possible.
- Pro-Innovation Bias – People overestimate the benefits of new technologies or ideas while downplaying the risks.
- IKEA Effect – People value products more when they have participated in their creation, even if the end result is inferior.
- Curse of Knowledge – People who know something have difficulty imagining what it’s like for others who don’t, making it hard to explain complex ideas simply.
- Empathy Gap – People underestimate the influence of emotional states on decision-making, especially during periods of emotional neutrality.
- Moral Credential Effect – After doing something ethical, people feel entitled to act immorally, as if their previous good deeds compensate for bad actions.
- Projection Bias – People assume others share their thoughts, beliefs, or emotions, leading to misunderstandings.
- Decoy Effect – People are more likely to choose an option when a worse alternative is presented as a “decoy.”
- Fading Affect Bias – Negative memories fade faster than positive ones, leading to a more favorable view of past experiences.
- Third-Person Effect – People believe others are more influenced by media or external messages than they are themselves.
- Restraint Bias – People overestimate their ability to control impulsive behaviors, leading to poor decisions in tempting situations.
- Hot-Hand Fallacy – People believe that a streak of successes will continue, even when outcomes are random, such as in gambling or sports.
- Belief Perseverance – People cling to their beliefs even after they’ve been discredited, ignoring contradictory evidence.
- Paradox of Choice – Having too many options leads to decision paralysis and dissatisfaction with the choice eventually made.
- Loss Aversion – People fear losses more than they value gains, leading to overly cautious decision-making.
- Information Bias – People seek more information than necessary, believing it will improve decisions, even when it doesn’t.
- Scarcity Heuristic – People value things more simply because they are scarce, leading to irrational decisions.
- Bias Blind Spot – People recognize biases in others but fail to see their own, leading to distorted self-assessments.
- Hyperbolic Discounting – People prefer smaller, immediate rewards over larger, delayed rewards, even if the delayed rewards are more beneficial.
- Normative Bias – People conform to social norms, even when those norms conflict with personal beliefs or values.
- Expectation Bias – People’s expectations influence their perceptions and behavior, often leading to self-fulfilling prophecies.
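The base rate fallacy above is the one entry on this list that a little arithmetic dispels completely, so here is the worked example promised there – a minimal sketch with hypothetical numbers, applying Bayes’ theorem:

```python
# Hypothetical screening scenario: a condition affects 1% of people; the test
# catches 99% of true cases but also wrongly flags 5% of healthy people.
prevalence = 0.01        # P(condition) -- the base rate people tend to ignore
sensitivity = 0.99       # P(positive | condition)
false_positive = 0.05    # P(positive | no condition)

# Bayes' theorem: P(condition | positive) =
#     P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# Roughly 16.7%, not the intuitive "99%": because the base rate is low, most
# positive results come from the far larger healthy group.
```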
Informal Fallacies:
- Ad Hominem – Attacking the person making an argument instead of the argument itself, often to avoid addressing the issue at hand.
- Straw Man – Misrepresenting someone’s argument to make it easier to attack, often by exaggerating or distorting the original point.
- Slippery Slope – Arguing that a small first step will inevitably lead to a chain of negative events, without providing evidence for such a progression.
- Appeal to Authority – Believing something is true simply because an authority figure endorses it, without critically examining the evidence.
- False Dilemma – Presenting only two options when in reality more options exist, forcing a choice between extremes.
- Circular Reasoning (Begging the Question) – The argument’s conclusion is assumed in its premises, creating a logical loop that provides no real support.
- Red Herring – Introducing irrelevant information to distract from the main issue, often used to mislead or confuse.
- Post Hoc Ergo Propter Hoc – Assuming that because one event followed another, the first event caused the second, even when no evidence supports that conclusion.
- Appeal to Emotion – Manipulating emotions, such as fear or pity, to win an argument rather than using logical reasoning or evidence.
- False Equivalence – Treating two different things as if they are the same or comparable when they are not, often leading to misleading conclusions.
- Appeal to Ignorance – Arguing that something is true because it hasn’t been proven false, or vice versa, placing the burden of proof improperly.
- Equivocation – Using a word in two different senses within the same argument, causing confusion or misleading the audience.
- Appeal to Tradition – Arguing that something should continue simply because it has always been done that way, without examining if it is still valid or beneficial.
- Middle Ground Fallacy – Assuming that the truth lies between two extremes, regardless of the actual evidence.
- Tu Quoque (Whataboutism) – Deflecting criticism by accusing the critic of the same behavior, rather than addressing the actual argument.
- Appeal to Nature – Arguing that something is good because it is “natural” or bad because it is “unnatural,” without examining its merits.
- No True Scotsman – Defending a generalization by dismissing counterexamples as not being “true” or representative, protecting the claim from disproof.
- Composition Fallacy – Assuming that what is true of the parts must also be true of the whole. For example, “Each player on the team is excellent, so the team must be excellent.”
- Division Fallacy – Assuming that what is true of the whole must be true of its parts. For example, “The team is excellent, so each player must be excellent.”
- False Balance – Giving equal weight to two sides of an argument, even when one side is overwhelmingly supported by evidence.
- Appeal to Pity – Using sympathy or compassion to win an argument, instead of presenting logical reasons to support the conclusion.
- Hasty Generalization – Drawing a broad conclusion from a small or unrepresentative sample of data, often leading to stereotypes or faulty assumptions.
- Appeal to Consequences – Arguing that a belief is true or false based on whether the consequences are desirable, rather than evaluating the evidence.
- Genetic Fallacy – Judging an argument based on its origin rather than its actual merits or content.
- Appeal to Flattery – Using excessive praise to persuade someone, rather than relying on logical arguments.
- Appeal to Force – Using threats or intimidation to win an argument, rather than reasoning.
- Wishful Thinking – Accepting a claim because it would be pleasant or desirable if it were true, rather than based on evidence or reason.
- Special Pleading – Applying different standards to one’s own argument than to others’ without justification.
- Guilt by Association – Discrediting a person or argument by associating them with an unpopular group or idea.
- Loaded Question – Asking a question that contains a presupposition, forcing the respondent into an answer that seems incriminating regardless of their response.
- Moralistic Fallacy – Assuming that because something ought to be a certain way, it is that way, without considering reality.
- Appeal to Popularity (Bandwagon Fallacy) – Arguing that something is true or right because many people believe it, without considering the evidence.
- Non Sequitur – The conclusion doesn’t logically follow from the premises. For example, “She’s good at her job, so she must be a great parent.”
- False Cause (Correlation vs. Causation) – Confusing correlation with causation, assuming that because two things occur together, one must have caused the other.
- False Analogy – Comparing two things that are not truly comparable in relevant aspects.
- Cherry Picking – Selectively presenting evidence that supports a claim while ignoring contradictory evidence.
- Continuum Fallacy – Rejecting a claim because it is not precise or because there is no clear distinction between two extremes.
- Ecological Fallacy – Making inferences about individuals based on aggregate data for a group.
- Fallacy of Reification – Treating an abstract concept as if it were a real, tangible thing.
- Appeal to Spite – Arguing based on personal grudges or emotions rather than objective reasoning.
- Appeal to Wealth – Assuming that wealth equates to greater knowledge or moral standing.
- Appeal to Fear – Using fear to persuade others instead of logical arguments.
Formal Fallacies:
- Affirming the Consequent – A formal fallacy where one assumes that if “If A then B” is true, and B is true, then A must be true, which is incorrect.
- Denying the Antecedent – Assuming that if “If A then B” is true and A is false, then B must also be false. For example, “If it rains, the streets will be wet. It hasn’t rained, so the streets aren’t wet.”
- Fallacy of Four Terms – A formal fallacy where a syllogism includes more than three distinct terms, making the argument logically invalid.
- Quantifier Shift – Swapping the order of quantifiers in an argument, leading to incorrect conclusions.
- Argument from Fallacy – Assuming that because an argument contains a fallacy, its conclusion must be false.
- Illicit Major – A formal fallacy where the major term is distributed in the conclusion but not in the premise.
- Illicit Minor – A formal fallacy where the minor term is distributed in the conclusion but not in the premise.
- Undistributed Middle – A formal fallacy where the middle term in a syllogism is not distributed in either premise.
- Existential Fallacy – A conclusion asserts the existence of something when no such assumption was made in the premises.
- Conjunction Fallacy – People incorrectly assume that specific conditions are more probable than a general one.
- Masked Man Fallacy – A fallacy that occurs when a substitution of identical terms in a valid statement leads to a false conclusion.
- Affirming a Disjunct – Assuming that because one option in a “or” statement is true, the other must be false.
Critical thinking: priests, religious figures, and theologians are best at perverting reality
I have met with priests, read theology and philosophical theology, and I know all too well that they will use every cognitive bias, fallacy, and formal fallacy that can be used.
These are not honest arguments; they are circus acts in twisting logic.
The main narrative is this: they start with a conclusion – for Christianity, Jesus Christ – and then every single argument is bent so that they can at least legitimize some deity, if they are sane enough not to believe in a personal God.
Inventing and reasoning things out without succumbing to cognitive biases and fallacies
Everything essential has now been explained. You must use your intelligence (IQ, in a broader, generalizable sense), talents, and creativity while staying free of cognitive biases, fallacies, and formal fallacies.
My own experience with critical thinking and why you cannot believe in God
My own critical thinking led me to align with science and analytic philosophy. After a long, hard examination of my own thinking, I also concluded that what we see in politics is a circus: 98% of it happens in the background. The super-rich families I used to laugh at really do exist – along with their bankers. There is evidence for it.
I am an atheist, not even a deist. I don’t believe in any higher power, because logical objects, propositions, and so on must have existed first.
And I suspect that, free of the biases and fallacies above, you can’t really believe in a personal God.
Conclusion: A beginner’s guide to critical thinking
Staying free of all this is a huge challenge: you must keep it all in mind, be precise, and maintain your thinking.
However, once you master these skills, you gain brilliant insight into things most people have no clue about.
So it is demanding. But you will become an independent thinker (in a broader sense), and your world will be shaped by reality, not nonsense.