Fallacies & Informal Logic



     The first list below is from Don Lindsay's archive: http://www.don-lindsay-archive.org/
  • Ad Hominem (Argument To The Man):
    attacking the person instead of attacking his argument. For example, "Von Daniken's books about ancient astronauts are worthless because he is a convicted forger and embezzler." (Which is true, but that's not why they're worthless.)

    Another example is this syllogism, which alludes to Alan Turing's homosexuality:

    Turing thinks machines think.
    Turing lies with men.
    Therefore, machines don't think.

    (Note the  equivocation in the use of the word "lies".)

    A common form is an attack on sincerity. For example, "How can you argue for vegetarianism when you wear leather shoes ?" The  two wrongs make a right fallacy is related.

    A variation (related to  Argument By Generalization) is to attack a whole class of people. For example, "Evolutionary biology is a sinister tool of the materialistic, atheistic religion of Secular Humanism." Similarly, one notorious net.kook waved away a whole category of evidence by announcing "All the scientists were drunk."

    Another variation is attack by innuendo: "Why don't scientists tell us what they really know; are they afraid of public panic ?"

    There may be a pretense that the attack isn't happening: "In order to maintain a civil debate, I will not mention my opponent's drinking problem." Or "I don't care if other people say you're [opinionated/boring/overbearing]."

    Attacks don't have to be strong or direct. You can merely show disrespect, or cut down his stature by saying that he seems to be sweating a lot, or that he has forgotten what he said last week. Some examples: "I used to think that way when I was your age." "You're new here, aren't you ?" "You weren't breast fed as a child, were you ?" "What drives you to make such a statement ?" "If you'd just listen.." "You seem very emotional." (This last works well if you have been hogging the microphone, so that they have had to yell to be heard.)

    Sometimes the attack is on the other person's intelligence. For example, "If you weren't so stupid you would have no problem seeing my point of view." Or, "Even you should understand my next point."

    Oddly, the stupidity attack is sometimes reversed. For example, dismissing a comment with "Well, you're just smarter than the rest of us." (In Britain, that might be put as "too clever by half".) This is Dismissal By Differentness. It is related to  Not Invented Here and Changing The Subject.

    Ad Hominem is not fallacious if the attack goes to the credibility of the argument. For instance, the argument may depend on its presenter's claim that he's an expert. (That is, the Ad Hominem is undermining an  Argument From Authority.) Trial judges allow this category of attacks.

  • Needling:
    simply attempting to make the other person angry, without trying to address the argument at hand. Sometimes this is a delaying tactic.

    Needling is also Ad Hominem if you insult your opponent. You may instead insult something the other person believes in ("Argumentum Ad YourMomium"), interrupt, clown to show disrespect, be noisy, fail to pass over the microphone, and numerous other tricks. All of these work better if you are running things - for example, if it is your radio show, and you can cut off the other person's microphone. If the host or moderator is firmly on your side, that is almost as good as running the show yourself. It's even better if the debate is videotaped, and you are the person who will edit the video.

    If you wink at the audience, or in general clown in their direction, then we are shading over to Argument By Personal Charm.

    Usually, the best way to cope with insults is to show mild amusement, and remain polite. A humorous comeback will probably work better than an angry one.

  • Straw Man (Fallacy Of Extension):
    attacking an exaggerated or caricatured version of your opponent's position.

    For example, the claim that "evolution means a dog giving birth to a cat."

    Another example: "Senator Jones says that we should not fund the attack submarine program. I disagree entirely. I can't understand why he wants to leave us defenseless like that."

    On the Internet, it is common to exaggerate the opponent's position so that a comparison can be made between the opponent and Hitler.

  • Inflation Of Conflict:
    arguing that scholars debate a certain point. Therefore, they must know nothing, and their entire field of knowledge is "in crisis" or does not properly exist at all.

    For example, two historians debated whether Hitler killed five million Jews or six million Jews. A Holocaust denier argued that this disagreement made his claim credible, even though his death count is three to ten times smaller than the known minimum.

    Similarly, in "The Mythology of Modern Dating Methods" (John Woodmorappe, 1999) we find on page 42 that two scientists "cannot agree" about which one of two geological dates is "real" and which one is "spurious". Woodmorappe fails to mention that the two dates differ by less than one percent.

  • Argument From Adverse Consequences (Appeal To Fear, Scare Tactics):
    saying an opponent must be wrong, because if he is right, then bad things would ensue. For example: God must exist, because a godless society would be lawless and dangerous. Or: the defendant in a murder trial must be found guilty, because otherwise husbands will be encouraged to murder their wives.

    Wishful thinking is closely related. "My home in Florida is one foot above sea level. Therefore I am certain that global warming will not make the oceans rise by fifteen feet." Of course, wishful thinking can also be about positive consequences, such as winning the lottery, or eliminating poverty and crime.

  • Special Pleading (Stacking The Deck):

    using the arguments that support your position, but ignoring or somehow disallowing the arguments against.

    Uri Geller used special pleading when he claimed that the presence of unbelievers (such as stage magicians) made him unable to demonstrate his psychic powers.

  • Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation):
    assuming there are only two alternatives when in fact there are more. For example, assuming Atheism is the only alternative to Fundamentalism, or being a traitor is the only alternative to being a loud patriot.
  • Short Term Versus Long Term:
    this is a particular case of the Excluded Middle. For example, "We must deal with crime on the streets before improving the schools." (But why can't we do some of both ?) Similarly, "We should take the scientific research budget and use it to feed starving children."
  • Burden Of Proof:

    the claim that whatever has not yet been proved false must be true (or vice versa). Essentially the arguer claims that he should win by default if his opponent can't make a strong enough case.

    There may be three problems here. First, the arguer claims priority, but can he back up that claim ? Second, he is impatient with ambiguity, and wants a final answer right away. And third, "absence of evidence is not evidence of absence."

  • Argument By Question:

    asking your opponent a question which does not have a snappy answer. (Or anyway, no snappy answer that the audience has the background to understand.) Your opponent has a choice: he can look weak or he can look long-winded. For example, "How can scientists expect us to believe that anything as complex as a single living cell could have arisen as a result of random natural processes ?"

    Actually, pretty well any question has this effect to some extent. It usually takes longer to answer a question than ask it.

    Variants are the rhetorical question, and the loaded question, such as "Have you stopped beating your wife ?"

  • Argument by Rhetorical Question:
    asking a question in a way that leads to a particular answer. For example, "When are we going to give the old folks of this country the pension they deserve ?" The speaker is leading the audience to the answer "Right now." Alternatively, he could have said "When will we be able to afford a major increase in old age pensions?" In that case, the answer he is aiming at is almost certainly not "Right now."
  • Fallacy Of The General Rule:

    assuming that something true in general is true in every possible case. For example, "All chairs have four legs." Except that rocking chairs don't have any legs, and what is a one-legged "shooting stick" if it isn't a chair ?

    Similarly, there are times when certain laws should be broken. For example, ambulances are allowed to break speed laws.

  • Reductive Fallacy (Oversimplification):
    over-simplifying. As Einstein said, everything should be made as simple as possible, but no simpler. Political slogans such as "Taxation is theft" fall in this category.
  • Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue):
    if an argument or arguer has some particular origin, the argument must be right (or wrong). The idea is that things from that origin, or that social class, have virtue or lack virtue. (Being poor or being rich may be held out as being virtuous.) Therefore, the actual details of the argument can be overlooked, since correctness can be decided without any need to listen or think.
  • Psychogenetic Fallacy:
    if you learn the psychological reason why your opponent likes an argument, then he's biased, so his argument must be wrong.
  • Argument Of The Beard:

    assuming that two ends of a spectrum are the same, since one can travel along the spectrum in very small steps. The name comes from the idea that being clean-shaven must be the same as having a big beard, since in-between beards exist.

    Similarly, all piles of stones are small, since if you add one stone to a small pile of stones it remains small.

    However, the existence of pink should not undermine the distinction between white and red.

  • Argument From Age (Wisdom of the Ancients):
    snobbery that very old (or very young) arguments are superior. This is a variation of the Genetic Fallacy, but has the psychological appeal of seniority and tradition (or innovation).

    Products labelled "New ! Improved !" are appealing to a belief that innovation is of value for such products. It's sometimes true. And then there are cans of "Old Fashioned Baked Beans".

  • Not Invented Here:

    ideas from elsewhere are made unwelcome. "This Is The Way We've Always Done It."

    This fallacy is a variant of the Argument From Age. It gets a psychological boost from feelings that local ways are superior, or that local identity is worth any cost, or that innovations will upset matters.

    An example of this is the common assertion that America has "the best health care system in the world", an idea that a 2007 New York Times editorial refuted.

    People who use the Not Invented Here argument are sometimes accused of being sticks-in-the-mud.

    Conversely, foreign and "imported" things may be held out as superior.

  • Argument By Dismissal:

    an idea is rejected without saying why.

    Dismissals usually have overtones. For example, "If you don't like it, leave the country" implies that your cause is hopeless, or that you are unpatriotic, or that your ideas are foreign, or maybe all three. "If you don't like it, live in a Communist country" adds an emotive element.

  • Argument To The Future:
    arguing that evidence will someday be discovered which will (then) support your point.
  • Poisoning The Wells:
    discrediting the sources used by your opponent. This is a variation of Ad Hominem.
  • Argument By Emotive Language (Appeal To The People):
    using emotionally loaded words to sway the audience's sentiments instead of their minds. Many emotions can be useful: anger, spite, envy, condescension, and so on.

    For example, argument by condescension: "Support the ERA ? Sure, when the women start paying for the drinks! Hah! Hah!"

    Americans who don't like the Canadian medical system have referred to it as "socialist", but I'm not quite sure if this is intended to mean "foreign", or "expensive", or simply guilty by association.

    Cliche Thinking and Argument By Slogan are useful adjuncts, particularly if you can get the audience to chant the slogan. People who rely on this argument may seed the audience with supporters or "shills", who laugh, applaud or chant at proper moments. This is the live-audience equivalent of adding a laugh track or music track. Now that many venues have video equipment, some speakers give part of their speech by playing a prepared video. These videos are an opportunity to show a supportive audience, use emotional music, show emotionally charged images, and the like. The idea is old: there used to be professional cheering sections. (Monsieur Zig-Zag, pictured on the cigarette rolling papers, acquired his fame by applauding for money at the Paris Opera.)

    If the emotion in question isn't harsh, Argument By Poetic Language helps the effect. Flattering the audience doesn't hurt either.

  • Argument By Personal Charm:
    getting the audience to cut you slack. Example: Ronald Reagan. It helps if you have an opponent with much less personal charm.

    Charm may create trust, or the desire to "join the winning team", or the desire to please the speaker. This last is greatest if the audience feels sex appeal.

    Reportedly George W. Bush lost a debate when he was young, and said later that he would never be "out-bubba'd" again.

  • Appeal To Pity (Appeal to Sympathy, The Galileo Argument):
    "I did not murder my mother and father with an axe ! Please don't find me guilty; I'm suffering enough through being an orphan."

    Some authors want you to know they're suffering for their beliefs. For example, "Scientists scoffed at Copernicus and Galileo; they laughed at Edison, Tesla and Marconi; they won't give my ideas a fair hearing either. But time will be the judge. I can wait; I am patient; sooner or later science will be forced to admit that all matter is built, not of atoms, but of tiny capsules of TIME."

    There is a strange variant which shows up on Usenet. Somebody refuses to answer questions about their claims, on the grounds that the asker is mean and has hurt their feelings. Or, that the question is personal.

  • Appeal To Force:
    threats, or even violence. On the Net, the usual threat is of a lawsuit. The traditional religious threat is that one will burn in Hell. However, history is full of instances where expressing an unpopular idea could get you beaten up on the spot, or worse.
    "The clinching proof of my reasoning is that I will cut anyone who argues further into dogmeat."
    -- Attributed to Sir Geoffery de Tourneville, ca 1350 A.D.
  • Argument By Vehemence:
    being loud. Trial lawyers are taught this rule:
    If you have the facts, pound on the facts.
    If you have the law, pound on the law.
    If you don't have either, pound on the table.
    The above rule paints vehemence as an act of desperation. But it can also be a way to seize control of the agenda, use up the opponent's time, or just intimidate the easily cowed. And it's not necessarily aimed at winning the day. A tantrum or a fit is also a way to get a reputation, so that in the future, no one will mess with you.

    This is related to putting a post in UPPERCASE, aka SHOUTING.

    Depending on what you're loud about, this may also be an Appeal To Force, Argument By Emotive Language, Needling, or Changing The Subject.

  • Begging The Question (Assuming The Answer, Tautology):

    reasoning in a circle. The thing to be proved is used as one of your assumptions. For example: "We must have a death penalty to discourage violent crime". (This assumes it discourages crime.) Or, "The stock market fell because of a technical adjustment." (But is an "adjustment" just a stock market fall ?)

  • Stolen Concept:

    using what you are trying to disprove. That is, requiring the truth of something for your proof that it is false. For example, using science to show that science is wrong. Or, arguing that you do not exist, when your existence is clearly required for you to be making the argument.

    This is a relative of Begging The Question, except that the circularity there is in what you are trying to prove, instead of what you are trying to disprove.

    It is also a relative of Reductio Ad Absurdum, where you temporarily assume the truth of something.

  • Argument From Authority:
    the claim that the speaker is an expert, and so should be trusted.

    There are degrees and areas of expertise. The speaker is actually claiming to be more expert, in the relevant subject area, than anyone else in the room. There is also an implied claim that expertise in the area is worth having. For example, claiming expertise in something hopelessly quack (like iridology) is actually an admission that the speaker is gullible.

  • Argument From False Authority:
    a strange variation on Argument From Authority. For example, the TV commercial which starts "I'm not a doctor, but I play one on TV." Just what are we supposed to conclude ?
  • Appeal To Anonymous Authority:

    an Appeal To Authority is made, but the authority is not named. For example, "Experts agree that ..", "scientists say .." or even "they say ..". This makes the information impossible to verify, and brings up the very real possibility that the arguer himself doesn't know who the experts are. In that case, he may just be spreading a rumor.

    The situation is even worse if the arguer admits it's a rumor.

  • Appeal To Authority:
    "Albert Einstein was extremely impressed with this theory." (But a statement made by someone long-dead could be out of date. Or perhaps Einstein was just being polite. Or perhaps he made his statement in some specific context. And so on.)

    To justify an appeal, the arguer should at least present an exact quote. It's more convincing if the quote contains context, and if the arguer can say where the quote comes from.

    A variation is to appeal to unnamed authorities.

    There was a New Yorker cartoon, showing a doctor and patient. The doctor was saying: "Conventional medicine has no treatment for your condition. Luckily for you, I'm a quack." So the joke was that the doctor boasted of his lack of authority.

  • Appeal To False Authority:
    a variation on Appeal To Authority, but the Authority is outside his area of expertise.

    For example, "Famous physicist John Taylor studied Uri Geller extensively and found no evidence of trickery or fraud in his feats." Taylor was not qualified to detect trickery or fraud of the kind used by stage magicians. Taylor later admitted Geller had tricked him, but he apparently had not figured out how.

    A variation is to appeal to a non-existent authority. For example, someone reading an article by Creationist Dmitri Kuznetsov tried to look up the referenced articles. Some of the articles turned out to be in non-existent journals.

    Another variation is to misquote a real authority. There are several kinds of misquotation. A quote can be inexact or have been edited. It can be taken out of context. (Chevy Chase: "Yes, I said that, but I was singing a song written by someone else at the time.") The quote can be separate quotes which the arguer glued together. Or, bits might have gone missing. For example, it's easy to prove that Mick Jagger is an assassin. In "Sympathy For The Devil" he sang: "I shouted out, who killed the Kennedys, When after all, it was ... me."

  • Statement Of Conversion:
    the speaker says "I used to believe in X".

    This is simply a weak form of asserting expertise. The speaker is implying that he has learned about the subject, and now that he is better informed, he has rejected X. So perhaps he is now an authority, and this is an implied Argument From Authority.

    A more irritating version of this is "I used to think that way when I was your age." The speaker hasn't said what is wrong with your argument: he is merely claiming that his age has made him an expert.

    "X" has not actually been countered unless there is agreement that the speaker has that expertise. In general, any bald claim always has to be buttressed.

    For example, there are a number of Creationist authors who say they "used to be evolutionists", but the scientists who have rated their books haven't noticed any expertise about evolution.

  • Bad Analogy:

    claiming that two situations are highly similar, when they aren't. For example, "The solar system reminds me of an atom, with planets orbiting the sun like electrons orbiting the nucleus. We know that electrons can jump from orbit to orbit; so we must look to ancient records for sightings of planets jumping from orbit to orbit also."

    Or, "Minds, like rivers, can be broad. The broader the river, the shallower it is. Therefore, the broader the mind, the shallower it is."

    Or, "We have pure food and drug laws; why can't we have laws to keep movie-makers from giving us filth ?"

  • Extended Analogy:
    the claim that two things, both analogous to a third thing, are therefore analogous to each other. For example, this debate:
    "I believe it is always wrong to oppose the law by breaking it."
    "Such a position is odious: it implies that you would not have supported Martin Luther King."
    "Are you saying that cryptography legislation is as important as the struggle for Black liberation ? How dare you !"

    A person who advocates a particular position (say, about gun control) may be told that Hitler believed the same thing. The clear implication is that the position is somehow tainted. But Hitler also believed that window drapes should go all the way to the floor. Does that mean people with such drapes are monsters ?

  • Argument From Spurious Similarity:
    this is a relative of Bad Analogy. It is suggested that some resemblance is proof of a relationship. There is a WW II story about a British lady who was trained in spotting German airplanes. She made a report about a certain very important type of plane. While being quizzed, she explained that she hadn't been sure, herself, until she noticed that it had a little man in the cockpit, just like the little model airplane at the training class.
  • Reifying:
    an abstract thing is talked about as if it were concrete. (A possibly Bad Analogy is being made between concept and reality.) For example, "Nature abhors a vacuum."
  • False Cause:
    assuming that because two things happened, the first one caused the second one. (Sequence is not causation.) For example, "Before women got the vote, there were no nuclear weapons." Or, "Every time my brother Bill accompanies me to Fenway Park, the Red Sox are sure to lose."

    Essentially, these are arguments that the sun goes down because we've turned on the street lights.

  • Confusing Correlation And Causation:
    earthquakes in the Andes were correlated with the closest approaches of the planet Uranus. Therefore, Uranus must have caused them. (But Jupiter is nearer than Uranus, and more massive too.)

    When sales of hot chocolate go up, street crime drops. Does this correlation mean that hot chocolate prevents crime ? No, it means that fewer people are on the streets when the weather is cold.

    The bigger a child's shoe size, the better the child's handwriting. Does having big feet make it easier to write ? No, it means the child is older.

  • Causal Reductionism (Complex Cause):
    trying to use one cause to explain something, when in fact it had several causes. For example, "The accident was caused by the taxi parking in the street." (But other drivers went around the taxi. Only the drunk driver hit the taxi.)
  • Cliche Thinking:

    using as evidence a well-known wise saying, as if that is proven, or as if it has no exceptions.

  • Exception That Proves The Rule:

    a specific example of Cliche Thinking. This is used when a rule has been asserted, and someone points out the rule doesn't always work. The cliche rebuttal is that this is "the exception that proves the rule". Many people think that this cliche somehow allows you to ignore the exception, and continue using the rule.

    In fact, the cliche originally did no such thing. There are two standard explanations for the original meaning.

    The first is that the word "prove" meant test. That is why the military takes its equipment to a Proving Ground to test it. So, the cliche originally said that an exception tests a rule. That is, if you find an exception to a rule, the cliche is saying that the rule is being tested, and perhaps the rule will need to be discarded.

    The second explanation is that stating an exception to a rule proves that the rule exists. For example, suppose it was announced that "Over the holiday weekend, students do not need to be in the dorms by midnight". This announcement implies that normally students do have to be in by midnight.

    In either case, the cliche is not about waving away objections.

  • Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal to Common Practice):
    the claim, as evidence for an idea, that many people believe it, or used to believe it, or do it.

    If the discussion is about social conventions, such as "good manners", then this is a reasonable line of argument.

    However, in the 1800's there was a widespread belief that bloodletting cured sickness. All of these people were not just wrong, but horribly wrong, because in fact it made people sicker. Clearly, the popularity of an idea is no guarantee that it's right.

    Similarly, a common justification for bribery is that "Everybody does it". And in the past, this was a justification for slavery.

  • Fallacy Of Composition:

    assuming that a whole has the same simplicity as its constituent parts. In fact, a great deal of science is the study of emergent properties. For example, if you put a drop of oil on water, there are interesting optical effects. But the effect comes from the oil/water system: it does not come just from the oil or just from the water.

    Another example: "A car makes less pollution than a bus. Therefore, cars are less of a pollution problem than buses."

    Another example: "Atoms are colorless. Cats are made of atoms, so cats are colorless."

  • Fallacy Of Division:
    assuming that what is true of the whole is true of each constituent part. For example, human beings are made of atoms, and human beings are conscious, so atoms must be conscious.
  • Complex Question (Tying):

    unrelated points are treated as if they should be accepted or rejected together. In fact, each point should be accepted or rejected on its own merits.

    For example, "Do you support freedom and the right to bear arms ?"

  • Slippery Slope Fallacy (Camel's Nose)

    there is an old saying about how if you allow a camel to poke his nose into the tent, soon the whole camel will follow.

    The fallacy here is the assumption that something is wrong because it is right next to something that is wrong. Or, it is wrong because it could slide towards something that is wrong.

    For example, "Allowing abortion in the first week of pregnancy would lead to allowing it in the ninth month." Or, "If we legalize marijuana, then more people will try heroin." Or, "If I make an exception for you then I'll have to make an exception for everyone."

  • Argument By Pigheadedness (Doggedness):
    refusing to accept something after everyone else thinks it is well enough proved. For example, there are still Flat Earthers.
  • Appeal To Coincidence:

    asserting that some fact is due to chance. For example, the arguer has had a dozen traffic accidents in six months, yet he insists they weren't his fault. This may be Argument By Pigheadedness. But on the other hand, coincidences do happen, so this argument is not always fallacious.

  • Argument By Repetition (Argument Ad Nauseam):
    if you say something often enough, some people will begin to believe it. There are some net.kooks who keep reposting the same articles to Usenet, presumably in hopes it will have that effect.
  • Argument By Half Truth (Suppressed Evidence):

    this is hard to detect, of course. You have to ask questions. For example, an amazingly accurate "prophecy" of the assassination attempt on President Reagan was shown on TV. But was the tape recorded before or after the event ? Many stations did not ask this question. (It was recorded afterwards.)

    A book on "sea mysteries" or the "Bermuda Triangle" might tell us that the yacht Connemara IV was found drifting crewless, southeast of Bermuda, on September 26, 1955. None of these books mention that the yacht had been directly in the path of Hurricane Iona, with 180 mph winds and 40-foot waves.

  • Argument By Selective Observation:

    also called cherry picking, the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses. For example, a state boasts of the Presidents it has produced, but is silent about its serial killers. Or, the claim "Technology brings happiness". (Now, there's something with hits and misses.)

    Casinos encourage this human tendency. There are bells and whistles to announce slot machine jackpots, but losing happens silently. This makes it much easier to think that the odds of winning are good.

  • Argument By Selective Reading:

    making it seem as if the weakest of an opponent's arguments was the best he had. Suppose the opponent gave a strong argument X and also a weaker argument Y. Simply rebut Y and then say the opponent has made a weak case.

    This is a relative of Argument By Selective Observation, in that the arguer overlooks arguments that he does not like. It is also related to Straw Man (Fallacy Of Extension), in that the opponent's argument is not being fairly represented.

  • Argument By Generalization:

    drawing a broad conclusion from a small number of perhaps unrepresentative cases. (The cases may be unrepresentative because of Selective Observation.) For example, "They say 1 out of every 5 people is Chinese. How is this possible ? I know hundreds of people, and none of them is Chinese." So, by generalization, there aren't any Chinese anywhere. This is connected to the Fallacy Of The General Rule.

    Similarly, "Because we allow terminally ill patients to use heroin, we should allow everyone to use heroin."

    It is also possible to under-generalize. For example,

    "A man who had killed both of his grandmothers declared himself rehabilitated, on the grounds that he could not conceivably repeat his offense in the absence of any further grandmothers."
    -- "Ports Of Call" by Jack Vance
  • Argument From Small Numbers:
    "I've thrown three sevens in a row. Tonight I can't lose." This is Argument By Generalization, but it assumes that small numbers are the same as big numbers. (Three sevens is actually a common occurrence. Thirty three sevens is not.)

    Or: "After treatment with the drug, one-third of the mice were cured, one-third died, and the third mouse escaped." Does this mean that if we treated a thousand mice, 333 would be cured ? Well, no.

  • Misunderstanding The Nature Of Statistics (Innumeracy):

    President Dwight Eisenhower expressed astonishment and alarm on discovering that fully half of all Americans had below average intelligence. Similarly, some people get fearful when they learn that their doctor wasn't in the top half of his class. (But that's half of them.)

    "Statistics show that of those who contract the habit of eating, very few survive." -- Wallace Irwin.

    Very few people seem to understand "regression to the mean". This is the idea that things tend to go back to normal. If you feel normal today, does it really mean that the headache cure you took yesterday performed wonders ? Or is it just that your headaches are always gone the next day ?

    Journalists are notoriously bad at reporting risks. For example, in 1995 it was loudly reported that a class of contraceptive pills would double the chance of dangerous blood clots. The news stories mostly did not mention that "doubling" the risk only increased it by one person in 7,000. The "cell phones cause brain cancer" reports are even sillier, with the supposed increase in risk being at most one or two cancers per 100,000 people per year. So, if the fearmongers are right, your cellphone has increased your risk from "who cares" to "who cares".

  • Inconsistency:

    for example, the declining life expectancy in the former Soviet Union is due to the failures of communism. But, the quite high infant mortality rate in the United States is not a failure of capitalism.

    This is related to Internal Contradiction.

  • Non Sequitur:

    something that just does not follow. For example, "Tens of thousands of Americans have seen lights in the night sky which they could not identify. The existence of life on other planets is fast becoming certainty !"

    Another example: arguing at length that your religion is of great help to many people, then concluding that the teachings of your religion are undoubtedly true.

    Or: "Bill lives in a large building, so his apartment must be large."

  • Meaningless Questions:
    irresistible forces meeting immovable objects, and the like.
  • Argument By Poetic Language:
    if it sounds good, it must be right. Songs often use this effect to create a sort of credibility - for example, "Don't Fear The Reaper" by Blue Oyster Cult. Politically oriented songs should be taken with a grain of salt, precisely because they sound good.
  • Argument By Slogan:
    if it's short, and connects to an argument, it must be an argument. (But slogans risk the Reductive Fallacy.)

    Being short, a slogan increases the effectiveness of Argument By Repetition. It also helps Argument By Emotive Language (Appeal To The People), since emotional appeals need to be punchy. (Also, the gallery can chant a short slogan.) Using an old slogan is Cliche Thinking.

  • Argument By Prestigious Jargon:
    using big complicated words so that you will seem to be an expert. Why do people use "utilize" when they could utilize "use" ?

    For example, crackpots used to claim they had a Unified Field Theory (after Einstein). Then the word Quantum was popular. Lately it seems to be Zero Point Fields.

  • Argument By Gibberish (Bafflement):
    this is the extreme version of Argument By Prestigious Jargon. An invented vocabulary helps the effect, and some net.kooks use lots of CAPitaLIZation. However, perfectly ordinary words can be used to baffle. For example, "Omniscience is greater than omnipotence, and the difference is two. Omnipotence plus two equals omniscience. META = 2." [From R. Buckminster Fuller's No More Secondhand God.]

    Gibberish may come from people who can't find meaning in technical jargon, so they think they should copy style instead of meaning. It can also be a "snow job", AKA "baffle them with BS", by someone actually familiar with the jargon. Or it could be Argument By Poetic Language.

    An example of poetic gibberish: "Each autonomous individual emerges holographically within egoless ontological consciousness as a non-dimensional geometric point within the transcendental thought-wave matrix."

  • Equivocation:

    using a word to mean one thing, and then later using it to mean something different. For example, sometimes "Free software" costs nothing, and sometimes it is without restrictions. Some examples:

    "The sign said 'fine for parking here', and since it was fine, I parked there."

    All trees have bark.
    All dogs bark.
    Therefore, all dogs are trees.

    "Consider that two wrongs never make a right, but that three lefts do."
    - "Deteriorata", National Lampoon

  • Euphemism:
    the use of words that sound better. The lab rat wasn't killed, it was sacrificed. Mass murder wasn't genocide, it was ethnic cleansing. The death of innocent bystanders is collateral damage. Microsoft doesn't find bugs, or problems, or security vulnerabilities: they just discover an issue with a piece of software.

    This is related to Argument By Emotive Language, since the effect is to make a concept emotionally palatable.

  • Weasel Wording:
    this is very much like Euphemism, except that the word changes are done to claim a new, different concept rather than soften the old concept. For example, an American President may not legally conduct a war without a declaration of Congress. So, various Presidents have conducted "police actions", "armed incursions", "protective reaction strikes," "pacification," "safeguarding American interests," and a wide variety of "operations". Similarly, War Departments have become Departments of Defense, and untested medicines have become alternative medicines. The book "1984" has some particularly good examples.
  • Error Of Fact:
    for example, "No one knows how old the Pyramids of Egypt are." (Except, of course, for the historians who've read records and letters written by the ancient Egyptians themselves.)

    Typically, the presence of one error means that there are other errors to be uncovered.

  • Argument From Personal Astonishment:
    Errors of Fact caused by stating offhand opinions as proven facts. (The speaker's thought process being "I don't see how this is possible, so it isn't.")

    This isn't lying, quite. It just seems that way to people who know more about the subject than the speaker does.

  • Lies:
    intentional Errors of Fact. In some contexts this is called bluffing.

    If the speaker thinks that lying serves a moral end, this would be a Pious Fraud.

  • Contrarian Argument:
    in science, espousing some thing that the speaker knows is generally ill-regarded, or even generally held to be disproven. For example, claiming that HIV is not the cause of AIDS, or claiming that homeopathic remedies are not just placebos.

    In politics, the phrase may be used more broadly, to mean espousing some position that the establishment or opposition party does not hold.

    This is sometimes done to make people think, and sometimes it is needling, or perhaps it supports an external agenda. But it can also be done just to oppose conformity, or as a pose or style choice: to be a "maverick" or lightning rod. Or, perhaps just for the ego of standing alone:

    "It is not enough to succeed. Friends must be seen to have failed."
    -- Truman Capote

    "If you want to prove yourself a brilliant scientist, you don't always agree with the consensus. You show you're right and everyone else is wrong."
    -- Daniel Kirk-Davidoff discussing Richard Lindzen

    Calling someone contrarian risks the Psychogenetic Fallacy. People who are annoying are not necessarily wrong. On the other hand, if the position is ill-regarded for a reason, then defending it may be uphill.

    Trolling is Contrarian Argument done to get a reaction. Trolling on the Internet often involves pretense.

  • Hypothesis Contrary To Fact:
    arguing from something that might have happened, but didn't.
  • Internal Contradiction:
    saying two contradictory things in the same argument. For example, claiming that Archaeopteryx is a dinosaur with hoaxed feathers, and also saying in the same book that it is a "true bird". Or another author who said on page 59, "Sir Arthur Conan Doyle writes in his autobiography that he never saw a ghost." But on page 200 we find "Sir Arthur's first encounter with a ghost came when he was 25, surgeon of a whaling ship in the Arctic.."

    This is much like saying "I never borrowed his car, and it already had that dent when I got it."

    This is related to Inconsistency.

  • Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis):
    this is sometimes used to avoid having to defend a claim, or to avoid making good on a promise. In general, there is something you are not supposed to notice.

    For example, I got a bill which had a big announcement about how some tax had gone up by 5%, and the costs would have to be passed on to me. But a quick calculation showed that the increased tax was only costing me a dime, while a different part of the bill had silently gone up by $10.

    This is connected to various diversionary tactics, which may be obstructive, obtuse, or needling. For example, if you quibble about the meaning of some word a person used, they may be quite happy about being corrected, since that means they've derailed you, or changed the subject. They may pick nits in your wording, perhaps asking you to define "is". They may deliberately misunderstand you:

    "You said this happened five years before Hitler came to power. Why are you so fascinated with Hitler ? Are you anti-Semitic ?"

    It is also connected to various rhetorical tricks, such as announcing that there cannot be a question period because the speaker must leave. (But then he doesn't leave.)

  • Argument By Fast Talking:
    if you go from one idea to the next quickly enough, the audience won't have time to think. This is connected to Changing The Subject and (to some audiences) Argument By Personal Charm.

    However, some psychologists say that to understand what you hear, you must for a brief moment believe it. If this is true, then rapid delivery does not leave people time to reject what they hear.

  • Having Your Cake (Failure To Assert, or Diminished Claim):
    almost claiming something, but backing out. For example, "It may be, as some suppose, that ghosts can only be seen by certain so-called sensitives, who are possibly special mutations with, perhaps, abnormally extended ranges of vision and hearing. Yet some claim we are all sensitives."

    Another example: "I don't necessarily agree with the liquefaction theory, nor do I endorse all of Walter Brown's other material, but the geological statements are informative." The strange thing here is that liquefaction theory (the idea that the world's rocks formed in flood waters) was demolished in 1788. To "not necessarily agree" with it, today, is in the category of "not necessarily agreeing" with 2+2=3. But notice that writer implies some study of the matter, and only partial rejection.

    A similar thing is the failure to rebut. Suppose I raise an issue. The response that "Woodmorappe's book talks about that" could possibly be a reference to a resounding rebuttal. Or perhaps the responder hasn't even read the book yet. How can we tell ? [I later discovered it was the latter.]

  • Ambiguous Assertion:

    a statement is made, but it is sufficiently unclear that it leaves some sort of leeway. For example, a book about Washington politics did not place quotation marks around quotes. This left ambiguity about which parts of the book were first-hand reports and which parts were second-hand reports, assumptions, or outright fiction.

    Of course, lack of clarity is not always intentional. Sometimes a statement is just vague.

    If the statement has two different meanings, this is Amphiboly. For example, "Last night I shot a burglar in my pyjamas."

  • Failure To State:
    if you make enough attacks, and ask enough questions, you may never have to actually define your own position on the topic.
  • Outdated Information:
    information is given, but it is not the latest information on the subject. For example, some creationist articles about the amount of dust on the moon quote a measurement made in the 1950's. But many much better measurements have been done since then.
  • Amazing Familiarity:
    the speaker seems to have information that there is no possible way for him to get, on the basis of his own statements. For example: "The first man on deck, seaman Don Smithers, yawned lazily and fingered his good luck charm, a dried seahorse. To no avail ! At noon, the Sea Ranger was found drifting aimlessly, with every man of its crew missing without a trace !"
  • Least Plausible Hypothesis:

    ignoring all of the most reasonable explanations. This makes the desired explanation into the only one. For example: "I left a saucer of milk outside overnight. In the morning, the milk was gone. Clearly, my yard was visited by fairies."

    There is an old rule for deciding which explanation is the most plausible. It is most often called "Occam's Razor", and it basically says that the simplest is the best. The current phrase among scientists is that an explanation should be "the most parsimonious", meaning that it should not introduce new concepts (like fairies) when old concepts (like neighborhood cats) will do.

    On ward rounds, medical students love to come up with the most obscure explanations for common problems. A traditional response is to tell them "If you hear hoof beats, don't automatically think of zebras".

  • Argument By Scenario:
    telling a story which ties together unrelated material, and then using the story as proof they are related.
  • Affirming The Consequent:
    logic reversal. A correct statement of the form "if P then Q" gets turned into "Q therefore P".

    For example,

    "All cats die; Socrates died; therefore Socrates was a cat."

    Another example: "If the earth orbits the sun, then the nearer stars will show an apparent annual shift in position relative to more distant stars (stellar parallax). Observations show conclusively that this parallax shift does occur. This proves that the earth orbits the sun." In reality, it proves that Q [the parallax] is consistent with P [orbiting the sun]. But it might also be consistent with some other theory. (Other theories did exist. They are now dead, because although they were consistent with a few facts, they were not consistent with all the facts.)

    Another example: "If space creatures were kidnapping people and examining them, the space creatures would probably hypnotically erase the memories of the people they examined. These people would thus suffer from amnesia. But in fact many people do suffer from amnesia. This tends to prove they were kidnapped and examined by space creatures." This is also a Least Plausible Hypothesis explanation.

  • Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection):
    if your opponent successfully addresses some point, then say he must also address some further point. If you can make these points more and more difficult (or diverse) then eventually your opponent must fail. If nothing else, you will eventually find a subject that your opponent isn't up on.

    This is related to Argument By Question. Asking questions is easy: it's answering them that's hard.

    If each new goal causes a new question, this may get to be Infinite Regression.

    It is also possible to lower the bar, reducing the burden on an argument. For example, a person who takes Vitamin C might claim that it prevents colds. When they do get a cold, then they move the goalposts, by saying that the cold would have been much worse if not for the Vitamin C.

  • Appeal To Complexity:
    if the arguer doesn't understand the topic, he concludes that nobody understands it. So, his opinions are as good as anybody's.
  • Common Sense:
    unfortunately, there simply isn't a common-sense answer for many questions. In politics, for example, there are a lot of issues where people disagree. Each side thinks that their answer is common sense. Clearly, some of these people are wrong.

    The reason they are wrong is because common sense depends on the context, knowledge and experience of the observer. That is why instruction manuals will often have paragraphs like these:

    When boating, use common sense. Have one life preserver for each person in the boat.

    When towing a water skier, use common sense. Have one person watching the skier at all times.

    If the ideas are so obvious, then why the second sentence ? Why do they have to spell it out ? The answer is that "use common sense" actually meant "pay attention, I am about to tell you something that inexperienced people often get wrong."

    Science has discovered a lot of situations which are far more unfamiliar than water skiing. Not surprisingly, beginners find that much of it violates their common sense. For example, many people can't imagine how a mountain range would form. But in fact anyone can take good GPS equipment to the Himalayas, and measure for themselves that those mountains are rising today.

    If a speaker tells an audience that he supports using common sense, it is very possibly an Ambiguous Assertion.

  • Argument By Laziness (Argument By Uninformed Opinion):
    the arguer hasn't bothered to learn anything about the topic. He nevertheless has an opinion, and will be insulted if his opinion is not treated with respect. For example, someone looked at a picture on one of my web pages, and made a complaint which showed that he hadn't even skimmed through the words on the page. When I pointed this out, he replied that I shouldn't have had such a confusing picture.
  • Disproof By Fallacy:
    if a conclusion can be reached in an obviously fallacious way, then the conclusion is incorrectly declared wrong. For example,
    "Take the division 64/16. Now, canceling a 6 on top and a six on the bottom, we get that 64/16 = 4/1 = 4."
    "Wait a second ! You can't just cancel the six !"
    "Oh, so you're telling us 64/16 is not equal to 4, are you ?"

    Note that this is different from Reductio Ad Absurdum, where your opponent's argument can lead to an absurd conclusion. In this case, an absurd argument leads to a normal conclusion.

  • Reductio Ad Absurdum:
    showing that your opponent's argument leads to some absurd conclusion. This is in general a reasonable and non-fallacious way to argue. If the issues are razor-sharp, it is a good way to completely destroy his argument. However, if the waters are a bit muddy, perhaps you will only succeed in showing that your opponent's argument does not apply in all cases. That is, using Reductio Ad Absurdum is sometimes using the Fallacy Of The General Rule. However, if you are faced with an argument that is poorly worded, or only lightly sketched, Reductio Ad Absurdum may be a good way of pointing out the holes.

    An example of why absurd conclusions are bad things:

    Bertrand Russell, in a lecture on logic, mentioned that in the sense of material implication, a false proposition implies any proposition. A student raised his hand and said "In that case, given that 1 = 0, prove that you are the Pope". Russell immediately replied, "Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope."
  • False Compromise:

    if one does not understand a debate, it must be "fair" to split the difference, and agree on a compromise between the opinions. (But one side is very possibly wrong, and in any case one could simply suspend judgment.) Journalists often invoke this fallacy in the name of "balanced" coverage.

    "Some say the sun rises in the east, some say it rises in the west; the truth lies probably somewhere in between."

    Television reporters like balanced coverage so much that they may give half of their report to a view held by a small minority of the people in question. There are many possible reasons for this, some of them good. However, viewers need to be aware of this tendency.

  • Fallacy Of The Crucial Experiment:

    claiming that some idea has been proved (or disproved) by a pivotal discovery. This is the "smoking gun" version of history.

    Scientific progress is often reported in such terms. This is inevitable when a complex story is reduced to a soundbite, but it's almost always a distortion. In reality, a lot of background happens first, and a lot of buttressing (or retraction) happens afterwards. And in natural history, most of the theories are about how often certain things happen (relative to some other thing). For those theories, no one experiment could ever be conclusive.

  • Two Wrongs Make A Right (Tu Quoque, You Too, What's sauce for the goose is sauce for the gander):

    a charge of wrongdoing is answered by a rationalization that others have sinned, or might have sinned. For example, Bill borrows Jane's expensive pen, and later finds he hasn't returned it. He tells himself that it is okay to keep it, since she would have taken his.

    War atrocities and terrorism are often defended in this way.

    Similarly, some people defend capital punishment on the grounds that the state is killing people who have killed.

    This is related to Ad Hominem (Argument To The Man).

  • Pious Fraud:

    fraud done to accomplish some good end, on the theory that the end justifies the means.

    For example, a church in Canada had a statue of Christ which started to weep tears of blood. When analyzed, the blood turned out to be beef blood. We can reasonably assume that someone with access to the building thought that bringing souls to Christ would justify his small deception.

    In the context of debates, a Pious Fraud could be a lie. More generally, it would be when an emotionally committed speaker makes an assertion that is shaded, distorted or even fabricated. For example, British Prime Minister Tony Blair was accused in 2003 of "sexing up" his evidence that Iraq had Weapons of Mass Destruction.

    Around the year 400, Saint Augustine wrote two books, De Mendacio [On Lying] and Contra Mendacium [Against Lying], on this subject. He argued that the sin isn't in what you do (or don't) say, but in your intent to leave a false impression. He strongly opposed Pious Fraud. I believe that Martin Luther also wrote on the subject.




Browse Logical Fallacies

“Logic is the beginning of wisdom, not the end” – Leonard Nimoy

Logical fallacies are flaws in reasoning that weaken an argument, or tricks of thought used as a debate tactic in order to persuade people. They are commonplace in all types of debates and discussions – in politics, advertising, media, and our everyday conversations – whether they are used intentionally or committed unknowingly due to a lack of argumentation skills.

Accident Fallacy - Ad Hoc Fallacy - Ad Hominem Fallacies - Ad Hominem Abusive - Affirming the Consequent - Anecdotal Fallacy - Appeal to Authority - Appeal to Celebrity - Appeal to Coincidence - Appeal to Consequences - Appeal to Emotion - Appeal to Force - Appeal to Ignorance - Appeal to Nature - Appeal to Novelty - Appeal to Pity - Appeal to Probability - Appeal to Tradition - Argument from Fallacy - Argument From Incredulity - Bandwagon - Begging the Question - Burden of Proof - Cherrypicking - Circumstantial Ad Hominem - Composition - Denying the Antecedent - Division - Ecological Fallacy - Equivocation - False Analogy - False Dilemma - False Equivalence - Gambler’s Fallacy - Genetic - Gish Gallop - Hasty Generalization - Loaded Question - Middle Ground - No True Scotsman - Poisoning the Well - Post Hoc - Red Herring - Slippery Slope - Special Pleading - Straw Man - Texas Sharpshooter - Tu Quoque - Whataboutism

The fallacy summaries below are from https://fallacyinlogic.com/, where the full explanations contain much more information than these summaries.


GUIDE TO LOGICAL FALLACIES

A

ACCIDENT FALLACY

  • Accident Fallacy: Definition and Examples - a generalization is applied to a situation where, in reality, it doesn’t apply. Although “rules of thumb” can be useful and help us think more efficiently, they typically shouldn’t be taken as unconditional, universal rules. Also known as the “fallacy of the general rule” and “sweeping generalization”, this fallacy occurs when someone applies a general rule to a case in which the rule is inapplicable.

    An example of this would be: “Human beings have the ability to hear sounds. Therefore, all people are capable of hearing sounds.” Such a claim is fallacious because the general rule doesn’t apply here; the speaker ignores the fact that there are people who have a hearing disability.

AD HOC

  • Ad Hoc Fallacy: Definition and Examples - “ad hoc” refers to an idea or solution that is intended for a specific use, and not for any other uses. The fallacy occurs when someone comes up with a rationale or explanation to dismiss the counter-evidence to their claim in a bid to protect it. Since the ad hoc rescue is not an actual argument, it technically cannot be a logical fallacy. However, it is often still classified as one because it is used as a substitute for a proper argument.

    A person presents a new explanation – that is unjustified or simply unreasonable – of why their original belief or hypothesis is correct after evidence that contradicts the previous explanation has emerged. The explanation is specifically constructed to be used in a particular case and is created hastily at the moment rather than being the result of deliberate, fact-based reasoning.

AD HOMINEM

  • Ultimate Guide to Ad Hominem Fallacies: How And When Personal Attacks Are Fallacious - is based on personal and irrelevant attacks against the source of an argument, instead of addressing the argument itself. The attacker takes aim at their opponent’s supposed failings, that are unrelated to the issue at hand, rather than focusing on the validity of the argument or position they support. The attacks can be directed towards someone’s character, background, past actions, intelligence, morals, physical appearance, or credentials. As such, this fallacy tends to appeal to people’s emotions and prejudices instead of intellect.

    In the political arena, its use is also referred to as “mudslinging”, and it’s often the meat and potatoes of political campaigns. For instance, calling your opponent offensive nicknames, such as “lyin’ Hillary” and “crooked Hillary”, can be seen as fallacious ad hominems when they are used in an attempt to discredit the opponent’s arguments. In many cases, criticizing your adversary personally is a powerful (although unethical) strategy if your goal is to pull focus off the real issue. Personal insults tend to have an emotional appeal, which can be effective in manipulating the audience’s opinion and possibly damaging the credibility of the opposing side.

    There are five main types of ad hominems: abusive, circumstantial, tu quoque, guilt by association, and poisoning the well.

AD HOMINEM ABUSIVE

  • Ad Hominem Abusive (Personal Attack): Definition And Examples - occurs when someone verbally attacks the person making an argument, rather than criticizing the validity of their claim. In other words, it’s an attempt to discredit someone’s argument by directing the focus on their supposed failings – that are unrelated to the issue at hand – such as their character, intelligence, physical appearance, or morals.

AFFIRMING THE CONSEQUENT

  • What is the Formal Fallacy of Affirming the Consequent? - occurs when someone mistakenly infers that the converse of a true “if-then” statement must also be true. Also known as “converse error”, “asserting the consequent”, and “fallacy of the consequent”. In other words, it is assumed that if the proposition “if A, then B” is true, then “if B, then A” is true as well.

    A type of formal fallacy (or deductive fallacy), which refers to a flaw in the structure of a deductive argument. A deductive argument is one that is intended to provide a necessarily valid conclusion if the premises are true: its validity is dependent on the structure of the argument. Affirming the consequent is an invalid argument because its premises do not guarantee the truthfulness of the conclusion; the argument relies on erroneous conditional logic, and it is this structural flaw that renders the conclusion invalid.
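    To see why the form is invalid, it can help to check every truth-value assignment mechanically. The short Python sketch below is my own illustration, not part of the original summary; it brute-forces the truth table and prints the assignment where both premises hold but the conclusion fails.

        # Illustrative sketch: enumerate truth values for "if A then B" plus "B",
        # and look for a case where the premises are true but the conclusion "A" is false.
        from itertools import product

        def implies(a, b):
            return (not a) or b  # material conditional "if a then b"

        for a, b in product([True, False], repeat=2):
            premises_hold = implies(a, b) and b
            if premises_hold and not a:
                print(f"Counterexample: A={a}, B={b} -- premises true, conclusion 'A' false")

    Running it prints the single counterexample A=False, B=True, which is exactly the case the fallacious inference overlooks.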

ANECDOTAL FALLACY

  • Anecdotal Fallacy: Why Is The Use of Anecdotal Evidence Fallacious? - evidence that is collected in a non-scientific manner and supported by isolated, specific instances of an event. It relies on personal testimonies rather than on scientific evidence, and, consequently, is considered the weakest type of evidence. Arises when someone uses proof that relies on personal testimonies, such as a story based on someone’s individual experience, in order to support or refute a claim. The speaker draws a general conclusion based on a limited number of examples that are collected in an informal way (and often cherry-picked in favor of the argument).

    Person Y told me that he saw/heard X. Therefore, X must be true. “My grandfather was a heavy smoker most of his life, but he lived to be 90 years old. Therefore, smoking is not harmful to people.” Here, the notion that a single individual lived to old age despite smoking is anecdotal evidence and, in reality, does not prove that smoking is harmless. In such issues, there will be exceptions to the rule, and simply pointing out one of those exceptions doesn’t disprove the rule; only statistical evidence can show us how typical something is.

    Another problem with anecdotal evidence is that the way it is collected and presented is subject to cognitive biases. For instance, someone may be affected by the confirmation bias and bring up only those instances that support their existing beliefs, or they may lend unwarranted credibility to a particular claim due to our innate tendency, known as the bandwagon effect, to adopt something because a lot of other people are doing it.

APPEAL TO AUTHORITY

  • Appeal to Authority Fallacy: When and How Is It Wrong to Rely on Experts? - refers to the different ways of fallaciously using the statements or opinions of authority figures in order to support a conclusion. For instance, someone may assume that something must be true if a so-called expert believes it to be true, and no other evidence is needed. One misuses the testimonies of perceived authorities in an attempt to back up a certain claim or position.

    Making a claim based on the opinions of experts is by no means always unreasonable or fallacious. Relevant experts can provide us with strong reasons to believe that something is true... However, note that even the views of valid experts cannot guarantee that something is true; in terms of logic and argumentation, even experts can be wrong and their testimonies can only suggest that something is likely to be true, not that it is necessarily true.

    Various ways this fallacy may occur:
    APPEAL TO FALSE AUTHORITY - likely the most common way of erroneously citing (supposed) experts. It occurs when someone uses the words of poor or irrelevant authorities as evidence for a claim. In such a case, the authorities are unqualified or their expertise is not relevant to the argument being made (almost any celebrity endorsement in advertising).

    AGAINST THE CONSENSUS - when an expert’s views contradict the consensus (or general agreement) within their field of study, their testimony can only provide weak evidence. For example, if 97% of climate scientists say that climate change is real, it is unreasonable to base a conclusion on the beliefs of the 3% who disagree.

    IPSE DIXIT - Latin for “he himself said it”; refers to a situation where someone fallaciously uses themselves as an authority in an attempt to prove that something is true. For example, a patient asserts that their doctor’s opinion about their condition is wrong since “it is their body, so they must know better than doctors”.

    APPEAL TO UNNAMED AUTHORITIES - it’s fallacious to make a claim based on authorities’ opinions that cannot be verified. This type of claim often appeals vaguely to some unnamed experts. For example, someone claims that “most dentists say toothbrush X is the best kind of toothbrush for you, so it must be true”. If this is the only proof they offer, and we are unable to verify whether it’s actually correct, we don’t have real reasons to believe in its truthfulness.

APPEAL TO CELEBRITY

  • What Is Appeal to Celebrity? | Definition and Examples - based on the belief that celebrities are authoritative sources even in areas that are outside of their field of expertise (a specific form of the appeal to authority fallacy). Occurs especially frequently in advertising where companies hire famous people to endorse their brand and products... when a famous person’s opinion or argument is accepted as true simply because they are famous. The need for factual evidence and sound reasoning is dismissed due to the authoritative status they are perceived to have.

    However, since most celebrities are actors, singers, or sports stars, they are rarely experts on the subjects they talk about. “Celebrity X, who is a famous actor, says that the moon landing was a hoax and we must start acknowledging the fact. He/she must be right.” Here, the claim is accepted solely based on the celebrity status of the individual behind it, rather than on the actual merits of the claim. This is fallacious because, in terms of logic and argumentation, the popularity or social status of a person is irrelevant to the truth value of the claims they make.

APPEAL TO COINCIDENCE

  • Appeal to Coincidence Fallacy - one asserts that a certain event must have occurred due to a coincidence, despite all the evidence to the contrary. As such, it’s a form of slothful induction: A flawed inductive argument that rejects a reasonable conclusion even though there is strong evidence for it (also commonly known as “appeal to luck” or “appeal to bad luck”). Appeal to coincidence occurs when someone refuses to accept a conclusion supported with relevant evidence, and instead claims that a certain event or result was purely coincidental.

    One example would be: Jim has been fired from 7 different jobs in the past six months. He says that it has nothing to do with him or his skills; he has just been very unlucky. Here, it is very unlikely that Jim’s poor track record doesn’t have anything to do with him, even if such bad luck is possible. The evidence (being fired 7 times in 6 months) seems to strongly suggest that Jim has lots of room for improvement as an employee. If we have proof to show that a particular event was not, in fact, due to chance, it would be irrational to dismiss it.

APPEAL TO CONSEQUENCES

  • Appeal to Consequences – Definition and Examples - occurs when the truthfulness of a statement or belief is decided by the consequences it would have. Someone concludes that a statement, belief, or hypothesis must be true (or false) simply because it would lead to desirable (or undesirable) consequences if it were so. It has two logical forms, a positive and negative one. The positive form goes: “If X is true, then Y will happen. Y is desirable. Therefore, X is true.” And the negative one: “If X is true, then Y will happen. Y is undesirable. Therefore, X is false.”

    This is fallacious because the truth-value of a hypothesis or belief is not affected by the desirability of its consequences; the fact that a premise would have a positive outcome doesn’t make it more true. Moreover, the desirability of an outcome can be highly debatable because it is often based on a subjective point of view. This fallacy is seen as a type of emotional appeal: essentially, it makes inferences on the grounds of how people feel about something, rather than based upon facts and logic.

APPEAL TO EMOTION

  • Essential Guide: The Appeal to Emotion Fallacy With Examples - occurs when someone uses emotional appeals, such as pity, fear, and joy, instead of relevant facts and logic to support a claim. In other words, the arguer intends to get an emotional reaction from the listeners to help convince them that the claim being made is valid. For example, in a debate, someone may encourage the audience to dismiss the opposing argument and the evidence supporting it by arousing feelings of fear, hate, envy, or disgust towards the opponent.

    Appeal to emotion is a highly effective rhetorical technique in persuading and manipulating the recipient’s opinions, beliefs, and actions. It often utilizes loaded language — meaning language that is intended to raise emotions and directly affect the listener’s views — as well as concepts such as religion, country, and crime in order to appeal to various prejudices. Emotional appeals are particularly persuasive because, due to the nature of human cognition, people typically rely on their emotional responses to things when making decisions, instead of facts and logical reasoning (they tend to bypass people’s logic and skepticism commonly used in situations where one lacks factual evidence).

    The appeal to emotion is considered to be dishonest as a logical argument since it doesn’t rely on logic and fact-based reasoning. Put differently, no mathematician who values their own credibility would try to prove a mathematical theorem by appealing to the listener’s sympathy or any other feeling. The reality is that arguments are either valid or invalid; the fact that we desire it to be something doesn’t make it so.

    The common variations include: appeal to pride, appeal to popularity, appeal to nature, appeal to pity, appeal to fear, appeal to envy & appeal to hatred

APPEAL TO FORCE

  • Appeal to Force (Logical Fallacy): Definition and Examples - Someone uses force or a threat of force to gain acceptance for their argument or position. Rather than appealing to intellect, it fallaciously seeks compliance by evoking fear and anxiety. It is also known as argumentum ad baculum and appeal to the stick. Furthermore, it is a type of appeal to consequences: the truthfulness of the conclusion is decided by the consequences it would have, rather than looking at the actual merits of the argument. In essence, it states that “accept my argument, or I will punish you.”

APPEAL TO IGNORANCE

  • Appeal to Ignorance Fallacy – The Definition and Examples - Someone argues either for or against something because there is no contradicting evidence. In other words, it’s based on the mistaken assumption that a lack of evidence is evidence. One falls victim to this fallacious line of reasoning when they assert that a claim must be true if it hasn’t been proven false, or false if it hasn’t been proven true.

APPEAL TO NATURE

  • Appeal to Nature Fallacy – Definition and Examples - In essence, states that natural things are either good or better than synthetic ones. Occurs when it is claimed that something must be good or true because it is “natural”, or conversely, if something is “unnatural” it must be bad or untrue. These types of arguments are fallacious since the fact that a thing is “natural” cannot alone prove that it is somehow true or superior; it would be the factual evidence supporting the claim that makes it so.

APPEAL TO NOVELTY

  • What Is the Appeal to Novelty Fallacy? Definition and Examples - States that something, such as a product or idea, must be good or superior simply because it is “new”. May occur in two ways: First, something that is novel is claimed to be superior simply due to its virtue of being novel, and second, the alternative is said to be inferior because it is “older”.

    The underlying reasoning behind an appeal to novelty is that new versions will always be improved from the previous standards. However... “newness” alone will not guarantee improvement or superiority; even though the latest idea or solution can, and frequently does, prove to yield better results, sticking with the status quo is often the more sensible thing to do.

APPEAL TO PITY

  • Appeal to Pity – Definition and Examples - Someone appeals to the feeling of pity in place of valid evidence and sound reasoning. It’s a specific form of the appeal to emotion fallacy. Occurs when a person attempts to gain support for their claim or position by arousing the feeling of pity in their opponent and audience. Something must be true or false because it would be sad if it wasn’t. Like other emotional appeals... the way we feel about an argument doesn’t make it any more or any less true; it is the factual evidence supporting it that makes it so.

APPEAL TO PROBABILITY

  • Appeal to Probability Fallacy – Definition and Examples - Occurs when someone argues that because something will probably happen, or is probably true, it will necessarily happen, or is necessarily true. In other words, one mistakenly assumes that they can take for granted something that is probably the case. This is clearly fallacious: a probability is not the same as certainty, and, in terms of logic and argumentation, shouldn’t be treated as such. Its different variations include the appeal to improbability and the appeal to possibility.
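    Since the entry hinges on the gap between “probably” and “necessarily”, a tiny simulation can make it concrete. The snippet below is only an illustration with an assumed 90%-likely event; it is not taken from the source summary.

        # Illustrative sketch: even a 90%-probable event still fails in a noticeable
        # fraction of trials, so "probably true" cannot be treated as "necessarily true".
        import random

        random.seed(2)
        trials = 100_000
        failures = sum(random.random() >= 0.9 for _ in range(trials))
        print(f"The 90%-probable event failed in {failures} of {trials} trials")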

APPEAL TO TRADITION

  • Appeal to Tradition Fallacy: Definition and Examples - Occurs when someone claims that a particular action or belief must be good or true because it is traditional. It’s based on the false assumption that if something has been done a certain way for a long time (that is, traditionally), it is necessarily the right way of doing it. It relies on historical preferences instead of factual evidence; the only evidence it presents is simply the fact that something is or has been a common practice. This alone does not provide enough evidence for the claim, and it could be used to justify any discriminatory or incorrect belief that has been long held.

ARGUMENT FROM FALLACY

  • The Argument From Fallacy: Can Fallacious Arguments Have True Conclusions? - Occurs when a person claims that the conclusion of an argument must be wrong because the argument made for it contains a logical fallacy. Even if an argument contains a logical fallacy, it doesn’t mean that its conclusion is necessarily wrong. It is entirely possible to use a fallacious argument in an attempt to support any true proposition, without affecting its truth value. This doesn’t imply that it is wrong to point out when someone makes a fallacious argument: in most situations, it is reasonable to do so. Rather, we shouldn’t declare something to be false just because a certain argument made for it is incorrect.

ARGUMENT FROM INCREDULITY

  • Argument From Incredulity: How We Mistakenly Dismiss Concepts We Don’t Understand - Someone concludes that something must not be true (or false) since they cannot believe or imagine it being true (or false). Frequently used in debates over science and religion when certain theories and claims differ from our own deeply held beliefs. A form of appeal to ignorance. The fallacy of argument from incredulity occurs when someone asserts that because they can’t believe or imagine something being true, it must be false, or conversely, something must be true since they can’t imagine how it could be false. This line of reasoning is fallacious because it is not backed by real evidence.

B – D

BANDWAGON

  • Bandwagon Fallacy: How Appeals to Popular Beliefs Are Fallacious - Based on the assumption that something must be true or good if it’s in accordance with the opinions of many others. Also known by a number of other names: “appeal to popularity”, “argument by consensus” and “appeal to the gallery” to name a few. Bandwagon fallacy makes an appeal to a certain popular idea, value, or taste, and uses only its popularity (“everyone is doing it”) as evidence for its truthfulness. Put simply, it occurs when a person asserts that something must be true or good because it is popular. It’s the factual evidence supporting a theory that makes it true, not simply the fact that it’s popular. Many common ideas and beliefs are undoubtedly true, but many of them are also incorrect.

BEGGING THE QUESTION

  • Begging the Question Fallacy - Based on assumptions rather than on concrete evidence. It uses the claim it is trying to prove as a premise for the argument in order to prove the very same claim. It’s also known as petitio principii (Latin for “assuming the initial point”) and “chicken and the egg argument”, and it’s seen as a form of circular reasoning. The fallacy of begging the question occurs when the conclusion of an argument is assumed in one of its premises. The validity of this type of argument requires its own conclusion to be true.

BURDEN OF PROOF

  • The Burden Of Proof Fallacy – Definition And Examples - Refers to the obligation to provide supporting evidence for a claim. All logical arguments need to have sufficient evidence to back up their conclusions. Occurs when one abuses their burden of proof by attempting to shift it to someone else. In general, the person or party making an argument has the burden of proof to justify it (whether they argue that something is true or false). This applies, in particular, to situations where someone challenges a prevailing status quo or a well-established idea.

    In a debate, the burden of proof typically lies with the person making a claim; the opposing side doesn’t have a burden of proof until evidence has been provided for the original argument. However, once the evidence has been provided, it’s up to the opposing side to show that the evidence is insufficient. If the opposing side argues that your claim is invalid, then, in turn, the burden of proof is on them to justify the disagreement. Note that when someone makes an assumption, they don’t have a burden of proof; merely assuming that something may be true, for the sake of argument, doesn’t have to be justified. People sometimes evade their burden of proof by attributing their claim to a secondary source. And, although it can be acceptable to refer to a secondary source’s opinion, it often leads to a weak argument if it is used as the main evidence for a claim, especially if it’s done in order to avoid the burden of proof.

CHERRY PICKING

  • Cherry Picking (Logical Fallacy): Definition and Examples - Someone chooses to focus on the evidence that is in favor of their own position (while ignoring the evidence that would contradict it). It is also known as the fallacy of incomplete evidence, argument by selective observation, and the fallacy of exclusion. Frequently committed in various types of situations, from public debates to scientific research, and has major implications on how individuals present their claims, as well as on their reasoning process leading to their conclusions. It is a key factor why individuals tend to present, believe, and spread inaccurate or misleading information.

CIRCUMSTANTIAL AD HOMINEM

  • Circumstantial Ad Hominem: What Is It And Why Is It a Fallacy? - Also known as “appeal to motive” and “appeal to personal interest”, is a logical fallacy and one of the different types of ad hominem arguments. Like other types of ad hominem fallacies, this one also fallaciously focuses on the person behind the argument, rather than on the validity of the argument itself. Occurs when someone argues that their opponent’s argument must be invalid because his or her position is predisposed by their personal circumstances. In other words, this fallacy asserts that someone is arguing as they do only because they have a vested interest in the issue, and thus their argument should be dismissed as false.

    The circumstantial ad hominem is fallacious specifically because it claims that an argument is necessarily false if there is a conflict of interest between the arguer’s circumstances and the stance they hold. In reality, this alone is not strong enough evidence to make such an inference: A politician may really hold a certain view even if he were to gain personal advantage from its acceptance, or a salesman can truly believe that the product he is endorsing is the best product in the market. However, note that in a situation where there is strong evidence for a conflict of interest or enough reasons to believe that the arguer’s position is biased, it may not be unreasonable to point it out.

COMPOSITION

  • What Is Fallacy of Composition? Definition And Examples - Occurs when the properties of a whole and its parts are mistakenly thought to be transferable from one to the other (opposite of the fallacy of division). Arises when someone argues that something must be true of the whole because it is true of some parts of the whole. It’s considered to be fallacious because the collective (the group as a whole) and distributive (individual members of a group) don’t necessarily need to have the same properties. Even if the single parts of... [a] ...machine are “light”, it doesn’t follow that their total combined weight must also be “light.” The arguer doesn’t take into account the fact that while the whole and its parts can possess the same attributes, it is not necessarily so.

DENYING THE ANTECEDENT

  • The Formal Fallacy of Denying the Antecedent - Also known as inverse error and fallacy of the inverse, a logical fallacy whereby someone fallaciously makes an inverse deduction in a conditional statement. It takes one cause as a condition for something else to occur and then states that the latter won’t occur when the condition is observed to be untrue.

    It’s a type of formal fallacy, closely related to affirming the consequent. Denying the antecedent occurs when the consequent of an “if-then” statement is inferred to be false on the grounds that its antecedent is false. Even when both premises of the argument are true, its conclusion can still be incorrect; the premises do not guarantee the truthfulness of the conclusion. It’s a formal fallacy because the error arises from a flaw in the structure of the argument, rather than from the strength of the evidence supporting the conclusion.
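    As with affirming the consequent, the invalidity can be verified mechanically. The sketch below is an illustration of mine, not from the source; it shows the assignment where “if A then B” and “not A” are both true while the inferred conclusion “not B” is false.

        # Illustrative sketch: "if A then B" and "not A" do not entail "not B".
        from itertools import product

        def implies(a, b):
            return (not a) or b  # material conditional

        for a, b in product([True, False], repeat=2):
            premises_hold = implies(a, b) and (not a)
            if premises_hold and b:  # the conclusion "not B" fails exactly when B is true
                print(f"Counterexample: A={a}, B={b} -- premises true, 'not B' false")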

DIVISION

  • Fallacy of Division - Arises when the attributes of a whole are mistakenly presumed to apply to the parts, or members, of the whole (someone argues that something which is true of the whole, must also necessarily be true of each or some parts of the whole). It is based on the fallacious assumption that the attributes of the larger group and its members are transferable from one to the other. It is the converse of the fallacy of composition. It is also known as “false division” and “faulty deduction”.

E – G

ECOLOGICAL FALLACY

  • Ecological Fallacy: Definition and Examples - occurs when someone falsely assumes that an individual member of a group has the average characteristics of the larger group. An example - to assert that someone, who comes from a country with one of the highest crime rates, must be a criminal based on the crime statistics of their place of origin. This is considered fallacious since the evidence we have does not guarantee that the person is a criminal, even if it shows that they are more likely to be one. (may sometimes be confused with fallacy of division, however, the difference is that the latter is not seen as a statistical fallacy)

EQUIVOCATION

  • Equivocation Fallacy: Definition and Examples - arises when someone uses the same phrase to mean two different things in a way that renders the argument unsound. An example would be: “Singer X is a real star. The sun is also a star. Therefore, singer X and the sun are identical in many ways.” This is a simple example, but it shows how ambiguous language can be used — whether deliberately or by accident — to reach a conclusion that is not sound. Here, the word “star” is erroneously employed in two unrelated senses.

FALSE ANALOGY

  • The Fallacy of False Analogy: Definition and Examples - arises when one attempts to prove or disprove a claim using an analogy that is not suitable for the situation. It occurs either because one puts too much weight on the similarities, thus reasoning that the two cases being compared must be analogous in other respects too, or is unaware of the ways they are different. If the cases being compared do not share enough similarities, or the similarities are not really relevant to the issue at hand, the analogy is too weak to be used justifiably.

FALSE DILEMMA

  • False Dilemma Fallacy: The Definition And Examples - occurs when a limited number of choices, outcomes, or views are presented as the only possibilities when, in fact, more possibilities exist. As such, it unjustifiably puts issues into black-or-white terms. Also known as the either-or fallacy, all-or-nothing fallacy, and black-and-white thinking. Can be committed in two ways: by suggesting that there are only two possible options when more exist, or by incorrectly presenting the choices as mutually exclusive (only one of the options can be true). Also, one of the given options is often clearly undesirable, while the other one — which the arguer may want us to choose — seems acceptable and rational.

    Furthermore, it’s frequently characterized by “either-this-or-that” type of language, implying that if one of the choices is true, the other one must be false, or if you don’t accept one, the other must be accepted. In reality, however, both of the options may be false or could be accepted at the same time.

FALSE EQUIVALENCE

  • False Equivalence Fallacy – Or, Comparing Apples and Oranges - arises when one draws an equivalence between two things based on the notion that they share some characteristics which are, in reality, not significant enough to justify the comparison. It equates the two subjects on false grounds, either exaggerating the importance of the similarities or ignoring the differences that are in fact too important for the equivalence to be accurate. An example would be: “Bears and cats are both furry mammals; therefore, having a bear as a pet is just the same as having a cat.” Here, the supposed parallel is drawn from one or two similarities between the animals without taking into account any of the significant differences, such as the difference in size and the fact that bears are not domesticated animals.

GAMBLER’S FALLACY

  • The Gambler’s Fallacy – Definition and Example - the erroneous belief that a certain event is less (or more) likely to take place in the future since it already did (or didn’t) occur a number of times in the past. May occur in either one of the following ways: If an individual event appears to have happened frequently in the past, it is believed that the chances of it happening again are now lower. If an individual event hasn’t yet happened in the past, it is believed that it’s now more likely to happen in the future. These types of assumptions are against reason because the probability of the events occurring does not in any way depend on how the past turned out; when the events are random, and they are all independent of each other, then any future events cannot be influenced by the previous ones.
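    The independence claim is easy to check empirically. The simulation below is my own sketch with an assumed fair coin, not part of the source summary; it estimates the chance of heads immediately after a run of five heads and comes out close to one half, not lower.

        # Illustrative sketch: a fair coin's next flip is unaffected by a previous streak.
        import random

        random.seed(0)
        flips = [random.random() < 0.5 for _ in range(1_000_000)]

        after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                        if all(flips[i:i + 5])]
        print(f"P(heads | five heads in a row) ~ {sum(after_streak) / len(after_streak):.3f}")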

GENETIC

  • The Genetic Fallacy: Definition And Examples - occurs when someone judges a claim simply based on its origin, rather than looking at the actual merits of the claim. In other words, a claim is accepted or rejected on the basis of whom or where it came from. For example, dismissing an argument as invalid solely because the person behind it comes from a not-so-prestigious school would be a genetic fallacy.

    Generally, we should separate argument sources from the content of the argument; even if the source is deemed to be bad or good, it doesn’t mean that the argument itself is necessarily bad or good. Any claim should be evaluated by looking at its own merits (or demerits) unless the history of the claim is somehow related to its present-day value. As such, this line of reasoning becomes fallacious when the source or history of the claim is irrelevant to its truth value.

GISH GALLOP

  • Gish Gallop (Logical Fallacy): Definition and Examples - occurs when someone throws at you a myriad of half-truths and misleading statements in hopes of making their stance stronger. Debate tactic in which a person uses as many arguments as possible against their opponent, without any consideration into the strength of the arguments. The arguer’s aim is to quickly back their position with a large amount of “evidence”... only focuses on the quantity of the arguments, not quality, in order to achieve its objective: to make it too difficult for the opponent to respond. As a result, the person committing it seemingly gains an upper hand in the debate.

    Also, people who aren’t experts in the field under discussion find simple arguments, such as anecdotes, more persuasive than arguments involving complex scientific ideas and technical jargon. Naturally, such people – of whom a typical audience mostly consists – are more drawn to claims that they can understand.

H – R

HASTY GENERALIZATION

  • Hasty Generalization Fallacy: Definition And Examples - someone generalizes from a too-small sample size. The conclusion of the argument is made hastily without looking at more reliable statistics which would enable the arguer to make a more accurate judgment about the situation or issue. This is also an example of jumping to conclusions, which is a term in psychology that refers to decisions or judgments made without having all the relevant facts, thus reaching an unwarranted conclusion. (one draws a conclusion about the whole or majority of the whole on the basis of too few examples).

    This type of argumentation is common and is often difficult to avoid due to our built-in biases. For instance, the bias of group attribution error refers to our natural tendency to assume that the characteristics and preferences of one group member are similar to those of the other members of the same group. This is the source of many incorrect generalizations and stereotyping.

LOADED QUESTION

  • Loaded Question Fallacy (With Examples) - an attempt to limit the possible answers to only “yes” or “no”, where choosing either response would end up hurting the respondent’s credibility or reputation. It occurs when someone asks a question containing an unjustified (and often offensive) presupposition, such as “So have you always had a gambling problem?” The loaded question fallacy is a question containing an implicit assumption – one that is unverified or controversial – putting the person being questioned in a defensive and unfavorable position.

    It’s a type of trick question: it is designed to imply something that the interrogee probably disagrees with and to lead the listeners into believing that the implication is true. Moreover, it is typically made in a way that protects the person doing the questioning. Not every question containing an assumption is loaded; this logical fallacy occurs only if the implication being made is not a verified and accepted fact. A classic example is “Have you stopped beating your wife?” Also known as “complex question” (closely related to loaded question), “false question”, and “fallacy of presupposition”.

MIDDLE GROUND

  • Middle Ground Fallacy: Definition and Examples - a person argues that the correct conclusion must lie somewhere between two opposing arguments. It is also known as the argument from moderation and the golden mean fallacy. Occurs when someone asserts that a compromise between two opposing positions must be the truth. The form is: Person 1 argues A. Person 2 argues C. Therefore, B (a position between A and C) is correct.

    This type of thinking is erroneous because it draws a conclusion solely from the fact that a position is the middle point between two sides. That alone gives us no valid reason to believe it is true; picking a point between two false claims doesn’t make it correct, and a compromise between a truth and a falsehood is still false. As such, compromising for the sake of compromise is not a proper way to reach conclusions, even if the middle is sometimes the safest bet.

NO TRUE SCOTSMAN

  • No True Scotsman Fallacy - someone defends a generalization by redefining the criteria and dismissing examples that are contradictory. Also known as “appeal to purity” as it aims to refute any arguments or evidence against a certain ideal by appealing to its “purity”. Occurs when someone attempts to defend a universal claim by excluding any counter-examples for not being “pure” enough. They reject instances that don’t fit into the category by changing the definition to a more specific one, rather than acknowledging the evidence that contradicts the generalization.

    In this fallacy “Scotsmen” can be replaced with any other group. This type of argument is common and can be made for any group. For instance, it is often used to defend a particular religious group by excluding those who behave in unfavorable ways as not “true” members of the religion. An example of cherry-picking, although in reverse; rather than choosing only the examples that are beneficial, one denies all the disadvantageous ones. This fallacy does not occur if there is a clear and accepted definition of the group and what it requires to belong to that group, and this definition is violated by the arguer.

POISONING THE WELL

  • Poisoning the Well (Logical Fallacy): Definition and Examples - adverse information about a target is presented preemptively in order to discredit or ridicule the target’s subsequent claims. As such, it’s mostly used to weaken or refute an opponent’s argument before they even make it. It’s a variation of the ad hominem fallacy; it directly attacks the source of an argument, instead of addressing the argument itself. Also known as a smear tactic; rather than having to counter a claim in legitimate ways, one resorts to smearing their opponent’s reputation to detract from their credibility.

    The name comes from the analogy of a well whose water is poisoned. Imagine that you were thirsty and came across a well. As you reach out for the water to quench your thirst, your friend tells you that the well’s water is poisoned. As a result, that particular water doesn’t seem so appealing to you anymore and you make an easy decision not to drink it – even though you can’t know for sure whether your friend is correct or not. The same effect translates to argumentation: if someone has a tainted image (i.e. their “well is poisoned”), their words don’t seem so compelling and are more likely to get dismissed or ridiculed.

POST HOC

  • Post Hoc Fallacy: Why “A” Didn’t Necessarily Cause “B” - post hoc ergo propter hoc (“after this, therefore because of this”), or post hoc fallacy, is a logical fallacy that occurs when someone assumes that one event must have caused a later event simply because it happened after the other. This type of thinking is the basis for various kinds of beliefs, superstitions, and false findings in the search for causes of certain diseases.

    The post hoc fallacy is based on the false notion that since event B followed event A, event A must have caused event B. Such reasoning is logically fallacious because the fact that event A happened earlier doesn’t necessarily mean that it was the cause. Note that events that occur in succession may well be causally related, but they may also be completely unrelated apart from the fact that one happened after the other (or, “correlation does not imply causation”).

RED HERRING

  • Red Herring Fallacy: The Definition and Examples - an attempt to divert the attention away from the relevant issue by introducing another, irrelevant issue. It is an intentionally made distraction to move the argument or a question to a different issue that is easier to respond to. Also known as “ignoratio elenchi”, “irrelevant conclusion”, “beside the point”, “false emphasis” and the “Chewbacca defense”.

    In order to spot the fallacy, you need to remember that, essentially, an argument containing a red herring uses irrelevant information to change the topic of the discussion. When you do identify one, make it clear to the other party that the new issue is irrelevant to the topic at hand and, if needed, explain why they are committing a fallacy. From there you have three options: insist on continuing with the original topic (for example, by rephrasing your argument or question to direct the focus back to the original issue), accept the new topic of discussion and continue with it, or disengage from the argument.

S – W

SLIPPERY SLOPE

  • Slippery Slope Fallacy: What Is It And What Are Examples of It? - someone asserts that a certain proposition or action must be rejected because it would have unintended consequences, typically leading to a disastrous outcome. Essentially, they assume that a chain of events will occur without providing enough proof to support their view. One argues, without providing adequate evidence, that a relatively insignificant event or course of action will lead to a chain of consequences eventually resulting in some significant outcome, and therefore that the first decision, which would set that chain in motion, should be rejected.

    The basis of a slippery slope argument is that a certain action will have unintended consequences, and each step along the “slope” will logically lead to the next one. However, the connections made by the arguer are seen as unwarranted because they don’t provide direct evidence for them. As such, this fallacy can be often seen as a type of emotional appeal: Rather than relying on factual evidence, the arguer attempts to arouse fear in the listeners by showing that a particular decision may lead to some hypothetical, extreme consequences.

SPECIAL PLEADING

  • Special Pleading Fallacy: Definition and Examples - someone applies a certain set of criteria to other people and circumstances while exempting themselves from the same criteria. It is often committed in a situation where a person is emotionally attached to a position and feels the need to defend it, and, as a result, their reasoning is driven by emotions rather than logic. The arguer attempts to exempt themselves from the same standards that they expect to be applied to others. (can also be described as having double standards).

    This line of arguing is considered a fallacy specifically because it doesn’t – despite trying – justify why a particular case should be considered “special”. Naturally, if the case in question meets the same criteria that led other instances to be treated in a certain way, then we should expect a reasonable, fact-based explanation for why it should be an exception.

STRAW MAN

  • Straw Man Argument (Logical Fallacy): Definition and Examples - occurs when someone deliberately distorts or misrepresents their opponent’s position to make it easier to defeat. A flawed line of reasoning that occurs when someone substitutes an opposing argument with a distorted, oversimplified, exaggerated, or misrepresented version of it in order to make it easier to defeat. Typically, it gives the impression of being a reasonable counter to the original claim, but in reality, it attacks a position or view that their opponent doesn’t really hold; taking it out of context, focusing only on a single aspect of it, or being only remotely related to it.

    A good practice is to respond calmly and try to steer the discussion back on its course. To refute it, you may point out the fallacy and how it is incorrect. You may challenge them to justify their distorted view of your original argument, which will, in turn, put them on the defensive: they’ll have to either defend it or discard it. Choosing to completely ignore the straw man is often a bad idea: if your opponent keeps attacking it instead of your real position, it can get increasingly difficult for you to disprove it.

TEXAS SHARPSHOOTER

  • Guide to Texas Sharpshooter Fallacy | Definition and Examples - also known as the clustering fallacy, based on our tendency to look for similarities (or patterns) while ignoring differences and randomness. Occurs when someone ignores the differences in the data and overemphasizes the similarities, thus arriving at an inaccurate conclusion. Typically happens when a person finds and applies an existing pattern that nicely fits into their presumptions.

    For example, when someone happens to see a specific license plate number on their way to work, they might think that the odds of seeing that number must be extremely low. However, the odds were in fact the same as for seeing any other license plate number since there was no predetermined significance for seeing that specific number, prior to the fact that the person came across it.

    This line of reasoning is fallacious because it fails to take into account all the inconsistencies and randomness in the collected data. In almost every data collection, there will be coincidental clusters of data; if we randomly generate one hundred coordinates on a specific area — or fire one hundred bullets at the side of a barn — we’ll likely find clusters and patterns that seem like they must have a cause. As humans, we tend to look for patterns (see also clustering illusion) and causes and, consequently, often are quick to assume that a particular pattern must have some sort of significance.

    As such, in order for us to get meaningful results from a study, the target has to be pre-specified before we gather the data.
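    The barn-wall image above can be simulated directly. The sketch below uses a grid size and shot count I have assumed purely for illustration; it scatters one hundred random “bullet holes” over a 10x10 wall and counts the cells that look like deliberate clusters even though every shot was random.

        # Illustrative sketch: random, uniform "shots" still produce apparent clusters.
        import random
        from collections import Counter

        random.seed(1)
        holes = [(random.randint(0, 9), random.randint(0, 9)) for _ in range(100)]
        counts = Counter(holes)

        clusters = {cell: n for cell, n in counts.items() if n >= 3}
        print(f"Cells hit 3 or more times purely by chance: {len(clusters)}")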

TU QUOQUE

  • Tu Quoque Fallacy – The Definition And Examples - occurs when someone’s argument is discredited solely based on the allegation that their past actions or words are not consistent with their views. Also known as “ad hominem tu quoque” since it’s considered to be one of the different types of ad hominem arguments. Someone asserts that their opponent’s argument must be invalid because it is inconsistent with their past words and actions. In other words, one points out that the opponent has acted in the same manner themselves, and fallaciously uses the (alleged) hypocrisy as evidence to refute their argument.

    Dismisses the argument solely on grounds of personal shortcomings; it doesn’t disprove the logic of an argument, even though it may show the arguer’s hypocrisy. In fact, such arguments often don’t address the substance of the opposing claim at all, even though they appear as relevant counter-arguments (also known as “appeal to hypocrisy”, the “you too” fallacy, and “pot calling the kettle black” fallacy).

    Similarly to red herring arguments, appeals to hypocrisy are used as a distraction so that one may avoid having to deal with a certain issue or question. It’s quite common to hear “but what about X, look at what they did”-type allegations in various discussions, among both adults and children. It tends to include a strong emotional appeal, and thus can be effective in influencing people’s opinions and judgments. Such a strategy is often employed in the political arena: during a debate, a candidate shifts the focus to their opponent’s “poor” character, while seemingly refuting their argument, by pointing out that they are being a hypocrite.

WHATABOUTISM

  • Whataboutism – When People Counter Accusations with Accusations - occurs when a person attempts to divert the focus away from the current issue by making a counter-accusation. A specific form of the tu quoque fallacy in which someone’s claim is discredited due to alleged hypocrisy by the arguer.

    Also called whataboutery, it is a logical fallacy and rhetorical technique in which people respond to a difficult concern or question with a counter-accusation in order to divert attention to a different topic. As the name suggests, it’s characterized by the phrase “what about…?”, which is then followed by an issue that may be only remotely related to the original one. It’s typically used when one is charged with a harmful accusation regarding their past actions; one counters the charge by bringing up something negative about the opposing side and thus attempts to downplay the magnitude of their own actions.

LISTS OF FALLACIES



Abstraction
Abuse of Etymology
Accent
Accident
Affirmation of the Consequent
Affirmative Conclusion from a Negative Premiss
Affirming a Disjunct
Affirming One Disjunct
Affirming the Consequent
Fallacy of the Alternative Syllogism
Ambiguity
Ambiguous Middle
Amphiboly
Amphibology
Anecdotal Fallacy
Argument by Consensus
Appeal to / Argument from - Authority
Appeal to / Argument from - Celebrity
Appeal to / Argument from - Consequences
Appeal to / Argument from - Emotion
Appeal to / Argument from - Envy
Appeal to / Argument from - Fear
Appeal to / Argument from - Force
Appeal to / Argument from - Hatred
Appeal to / Argument from - Ignorance
Appeal to / Argument from - Nature
Appeal to / Argument from - Pity
Appeal to / Argument from - Popularity
Appeal to / Argument from - Pride
Argumentum ad Baculum
Argumentum ad Consequentiam
Argumentum ad Hominem
Argumentum ad Ignorantiam
Argumentum ad Invidiam
Argumentum ad Logicam
Argumentum ad Metum
Argumentum ad Misericordiam
Argumentum ad Naturam
Argumentum ad Nazium
Argumentum ad Odium
Argumentum ad Populum
Argumentum ad Superbiam
Argumentum ad Verecundiam
Asserting an Alternative
Asserting the Consequent
Authority of the Many
Bad Company Fallacy
Bad Reasons Fallacy
Bandwagon Fallacy
Base Rate Fallacy
Beard, Argument/Fallacy of the
Begging the Question
Biased Sample
Bifurcation
Black-and-White Fallacy
Black-or-White Fallacy
Card Stacking
Circular Argument
Circulus in Probando
Commutation of Conditionals
Company that You Keep Fallacy
Complex Question
Composition
Conjunction Effect
Conjunction Fallacy
Consequent, Fallacy of the
Continuum, Fallacy of the
Converse Accident
Converting a Conditional
Cum Hoc, Ergo Propter Hoc
Denial of the Antecedent
Denying a Conjunct
Denying the Antecedent
Dicto Simpliciter
Disjunctive Syllogism, Fallacy of
Division
Doublespeak
Either/Or Fallacy
Emotional Appeal
Equivocation
Etymological Fallacy
Exclusive Premisses
Existential Fallacy
Existential Assumption, Fallacy of
Fake Precision
Fallacist's Fallacy
Fallacy Fallacy
False Analogy
False Cause
False Conversion
False Dilemma
False Precision
Faulty Analogy
Formal Fallacy
Four-Term Fallacy
Gambler's Fallacy
Genetic Fallacy
Guilt by Association
Hasty Generalization
Heap, Fallacy of the
Hitler Card
Hot Hand Fallacy
Ignoratio Elenchi
Ignoring the Counterevidence
Illicit Contraposition
Illicit Conversion
Illicit Major
Illicit Minor
Illicit Negative/Affirmative
Illicit Process
Illicit Process of the Major
Illicit Process of the Minor
Illicit Quantifier Shift
Illicit Substitution of Identicals
Improper Disjunctive Syllogism
Improper Transposition
Informal Fallacy
Ipse Dixit
Irrelevant Thesis
Irrelevant Emotional Appeal
Loaded Language/Words
Loaded Question
Logical Fallacy
Many Questions
Masked Man Fallacy
Misplaced Precision
Misleading Appeal to Authority
Modal Fallacy
Modal Scope Fallacy
Monte Carlo Fallacy
Multiple Comparisons Fallacy
Naturalistic Fallacy
Negating Antecedent & Consequent
Negative Conclusion from Affirmative Premisses
Neglecting Base Rates
No-True-Scotsman Move/Ploy
Non Causa Pro Causa
One-Sided Assessment
One-Sidedness
Over-Extrapolation
Over-Generality
Over-Precision
Personal Attack
Petitio Principii
Plurium Interrogationum
Poisoning the Well
Post Hoc
Probabilistic Fallacy
Propositional Fallacy
Quantificational Fallacy
Quantifier Shift
Quaternio Terminorum
Question-Begging - Analogy
Question-Begging - Epithets
Questionable Analogy
Quoting Out of Context
Redefinition
Red Herring
Regression Fallacy
Regressive Fallacy
Scope Fallacy
Slanting
Slippery Slope
Some Are/Some Are Not
Special Pleading
Spurious Accuracy
Straw Man
Suppressed Evidence
Sweeping Generalization
Syllogistic Fallacy
Texas Sharpshooter Fallacy
Transposition, Improper
Tu Quoque
Two Negative Premisses
Two Wrongs Make a Right
Uncritical Extrapolation
Undistributed Middle
Unrepresentative Sample
Unwarranted Contrast
Vagueness
Vicious Circle
Volvo Fallacy
Weak Analogy
Wishful Thinking
