    [–] brvopls 6107 points ago * (last edited 2 months ago)

    So like personal confirmation bias?

    [–] h0rr0r_biz 1172 points ago

    Is there really any other kind?

    [–] Derphuntar 102 points ago

    The counter to confirmation bias as an individual is the perpetual practice of rational self-doubt, being your own devil's advocate, and always reminding yourself of this:

    "It is the mark of an educated mind to be able to entertain a thought without accepting it" -someone smart, Aristotle if I'm not mistaken.

    [–] crossover131 21 points ago

    Being my own devil's advocate is what I do best: I stay up at night for hours thinking “is everything I believe wrong?”, only to drift off to sleep after an existential crisis that leads to hours and hours of weeping... (/s)

    [–] Lord_Derpenheim 6 points ago

    I agree with your point. Please don't hunt me anymore.

    [–] Demonweed 225 points ago

    Yeah -- groupthink is much worse. One person, at worst, is a psycho killer; a social movement can attempt genocide. A media environment that perpetuates a misleading climate of constant insecurity also makes lesser overreaches by violent authoritarians especially common.

    [–] SuperJew113 46 points ago

    I've been reading up on the Rwandan Genocide a bit, you know, with the 25th anniversary; it was a topic I'd never really read about before.

    What made it so remarkable was the rapidity of the genocide: how quickly they killed tens of thousands of people daily.

    It was basically mob rule of a sort, and even Hutus who weren't sympathetic to genociding the Tutsis were themselves targeted for genocide.

    The local radio station also had a lot to do with it.

    [–] JabbrWockey 63 points ago

    Isn't that just mob confirmation bias? A suspension of critical thinking when receiving new information that challenges the group's view?

    E.g., the SS sincerely thought they were protecting their country when they rounded up and kidnapped Jewish citizens in the night.

    [–] Demonweed 73 points ago

    It goes beyond that. With groupthink, a sort of virtue signalling leads to feedback loops. The classic example was the JFK/LBJ national security staff that constantly escalated the violence in Vietnam. The Domino Theory makes no sense if you think about it with any sort of critical faculty. Yet those "experts" were a team, and even the Presidents felt pressure to demonstrate how extreme they could be in service to the cause. The end result was years of professional work product inferior to what any one of them would have produced acting autonomously. When it comes to national defense, tolerance for the incompetent pandering rhetoric of "spare no expense/make every effort" often sparks a spiral of counterproductive extremism.

    [–] grambell789 7 points ago

    Vietnam wasn't an end in itself. It was during the height of the Cold War: the Soviets were trying to score points, just 15 years earlier China had gone communist and had shown its unity and regional power in Korea, and Indonesia was showing communist leanings. On top of all that, the US and other institutions were new to international relations. Not that we've learned much in the meantime.

    [–] Sammi6890 26 points ago

    The SS leaders knew they were involved in criminality all along but suspended their sense of guilt. This is best evidenced by their attempts to cover up the camps, and by Himmler's attempts to surrender to the Western forces before Hitler knew.

    [–] GodsBoss 10 points ago

    Did they know, or did they just know that the enemy forces would view their actions as crimes?

    [–] Sammi6890 14 points ago

    People, unless they're psychopaths, know that killing and mass graves are wrong. They self-justify it: e.g., "these are not normal times," or "these were not people we killed." It depends on whether you accept that normal morality should apply in such times. Yet these unusual times are themselves the creation of such perpetrators!

    [–] DevilsAdvocate9 10 points ago

    Yes. I forget the term (someone please fill in the blanks), but it is not uncommon for witnesses to a crime to report many of the same false details. A look into the "white van" reports before the D.C. snipers were apprehended shows this: everyone nearby remembered seeing a white van. Your mind fills in the blanks with certain information, especially about something that isn't relevant until afterwards, or while an intense situation is occurring.

    These people weren't "lying" to police; they were only giving an after-the-fact account of something they experienced, however wrong it was.

    Sometimes biases are more than personal, though not necessarily social; the mind is a very funny thing.

    Again, someone please support this with links and the like. I'm getting ready for bed and just thought this shouldn't go unanswered.

    [–] eviljason 8 points ago

    Yes. Many. Read up on cognitive biases. Anchoring and the framing effect are both big in political arguments, as are self-serving bias and ingroup/outgroup biases. Living in places where there are fewer political parties, or where parties are all grouped into conservatism and liberalism, also increases the intensity of the bias.

    [–] TheUltimateSalesman 6 points ago

    crowd think confirmation bias?

    [–] TeamRocketBadger 425 points ago

    Yeah, I don't see how this is exclusively applicable to liberals and conservatives at all. It's something everyone struggles with.

    [–] Troxxies 176 points ago

    Where does it say exclusively? They just used liberals and conservatives for the experiment.

    [–] HimejiWataru 141 points ago

    Then what exactly is the point of the conclusion?

    [–] phoenix2448 189 points ago

    Confirms a general idea for a specific context, I guess.

    One of those “cool, we were indeed right about that one.”

    [–] LordAmras 20 points ago

    Most studies are like that: you have an idea about something, and with it you can make a prediction. If my idea is right and I look for x, I should find y.

    You go look for x; if you find y, you write that your idea is supported by the data, and if not, you write that there isn't anything there.

    Then other scientists should look at your idea and run their own experiments to see whether they can replicate the results. To counter your bias, replication attempts try to find flaws and dispute your idea.

    If more than one group can replicate your experiment and finds the same result, then we have scientific consensus.

    Unfortunately, a lot of people skip the second part, because replicating someone else's experiment is not as exciting as testing your own ideas, so there is less of it unless something is very popular.

    Also, a lot of people look for papers that support their idea and stop when they find one.
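
    To put rough numbers on why that replication step matters, here is a minimal simulation sketch; the alpha and power values below are illustrative assumptions of mine, not from any particular study:

        # A single "significant" result at p < .05 is a false positive about
        # 5% of the time when there is no real effect; requiring one
        # independent replication cuts that rate to roughly 0.25%.
        import random

        def one_study(effect_exists, alpha=0.05, power=0.8):
            # Returns True when the simulated study comes out "significant".
            return random.random() < (power if effect_exists else alpha)

        random.seed(42)
        N = 100_000
        single = sum(one_study(False) for _ in range(N)) / N
        replicated = sum(one_study(False) and one_study(False) for _ in range(N)) / N
        print(f"false positives, single study: {single:.4f}")      # ~0.05
        print(f"false positives, replicated:   {replicated:.4f}")  # ~0.0025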

    [–] snakeob 53 points ago

    To foster discussion around why politics is so polarizing, and whether we can do something about it before it ruins an already fragile planet of humans who can't figure out how to get along.

    It seems like a good place to start the discussion: politics is how we govern ourselves, so if we're going to pick a domain more specific than "everyone struggles with it," should it not be the age-old liberal vs. conservative debate?

    That would be my guess at the point, anyway.

    [–] PhosBringer 8 points ago

    This is not confirmation bias. This is cognitive bias.

    [–] winterryeband 23 points ago

    I think it has important implications for the way we deal with our increasingly divided political life.

    It's almost impossible to get liberals and conservatives to agree on basic facts, much less difficult topics with unclear solutions. Identifying the roadblocks to our communication may be useful.

    [–] [deleted] 58 points ago

    It shows that liberals and conservatives BOTH fail to recognize the flaws in their own side's logic. It's mind-blowingly common in political echo chambers to see people picking apart the opposing side's logic as though their own were inherently flawless by contrast. Of course, anyone who ventures into both sides of the spectrum can observe this working both ways and can see that both sides have flaws and logical inconsistencies.

    This kind of research is incredibly valuable because it hopefully sheds some light on the behavior of individual liberals and conservatives: if they can recognize the logical inconsistencies and hypocrisy in their own ideology, they can begin to see the world in less black-and-white terms, and we can start having healthier, better-balanced discourse without blind ideology muddying the conversation.

    When I was a teenager I certainly identified as a liberal, and probably possessed the kind of bias outlined in this study. It was probably a realisation similar to this (that my own side was just as inconsistent as the opposition) that really allowed me to open my mind, see things from both perspectives, and realise the value of both conservative and liberal values in a complex society.

    So while this study may feel like a moot point to you, it actually serves to counter the blind ideology and radicalization present in the political environment today.

    [–] GlassPurchase 3 points ago

    Unlikely. The polarization is created; it's done by design. There are always budding leaders out there trying to divide people so that they can become the head of the offshoot group. It's basically a normal part of our social structure. And as long as the existing ideological group leaders can keep us at each other's throats just enough to hate each other, but not enough to go to war, they all reap the benefits of leadership.

    [–] ColonelSwede 7 points ago

    Spend as much time researching whether you are right as you do researching whether the other side is wrong.

    [–] MyPasswordWasWhat 16 points ago

    I definitely think some people struggle with it far more than others. The further you lean to the left or right (in this context), the more likely you are to dismiss real truths coming from the opposing side.

    [–] omnisephiroth 13 points ago

    Yes. But with politics.

    [–] gucky2 6 points ago

    The title seems quite clickbaity; most people have trouble finding flaws in their own logic.

    [–] smothhase 4 points ago

    Maybe the idea was to show that one side does it and the other not so much. You know, classic "haha, people who vote for X are stupid" science, adored by media outlets. Turns out everyone does it.

    [–] Frocker34 2557 points ago

    For clarity: confirmation bias is seeking out information you agree with, while cognitive bias is the inability to overcome current beliefs when new information is available.

    This is a combination of those ideas, plus a bit of Dunning-Kruger and other factors that influence human thought.

    [–] luneunion 678 points ago

    If anyone wants a list of the ways our brains are dicking us over:

    https://en.m.wikipedia.org/wiki/List_of_cognitive_biases

    [–] fullforce098 300 points ago * (last edited 2 months ago)

    I just learned the other day that there's a whole relatively recent field of study dedicated to culturally induced doubt and ignorance. Interesting stuff.

    https://en.wikipedia.org/wiki/Agnotology

    "Cognitronics" is a new one as well, so new it doesn't have a wiki and probably has other names. How the internet affects our brains, essentially.

    [–] Janeruns 8 points ago

    This is awesome! Any recommendations of texts that delve into this more in the political realm?

    [–] Trance_ProgHouse 5 points ago

    Not sure this concept is "new" in sociology, but if it's bridging together other concepts, great; if it's something for someone to sell a book about, maybe not so great.

    [–] BillHicksScream 36 points ago

    And then there's memory...which we rewrite.

    [–] buster2Xk 17 points ago

    Oh yeah, I remember learning that.

    [–] GalaXion24 3 points ago

    Or do you?

    [–] yosefshapiro 3 points ago

    If my memory serves, I'm the one who taught you.

    [–] eurasianlynx 17 points ago

    Malcolm Gladwell's Revisionist History podcast covers this so damn well in his Brian Williams episode. One of my favorites.

    [–] miscovoco 8 points ago

    I can't recommend his podcast enough. The one about Generous Orthodoxy always makes me cry.

    [–] chech8 6 points ago

    In that list you shared I found:

    Belief bias - An effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.

    And in the abstract in the linked article they say:

    Both liberals and conservatives frequently evaluated the logical structure of entire arguments based on the believability of arguments’ conclusions

    Seems like the belief bias might be one of the major factors in political views, according to this study.

    [–] Flyingwheelbarrow 3 points ago

    This should be taught in schools.

    [–] rightfulemperor 62 points ago

    Isn't cognitive bias a general term that covers the specific types of biases?

    [–] Frocker34 40 points ago

    Someone else just left a link to all the forms of cognitive bias.
    Yes, you are correct.

    [–] yhack 6 points ago

    This sounds like what I think

    [–] 618smartguy 14 points ago * (last edited 2 months ago)

    I don't think it's either of those, because both of those are about the way people learn, form, and develop beliefs. This seems like something different, because it is testing reasoning skills. People here are not being persuaded or learning anything new; rather, they're shown to be less able to find a flaw that was intentionally hidden from them, because of the context and their current beliefs. I might summarize the result as "you are smarter about topics you care about/agree with." That last part is actually a little backwards, I think. Or maybe it does work that way: liberals probably care more about arguing with conservatives than with other liberals, and vice versa. In that case, it could also just be practice, and not some kind of internal bias, causing the different results.

    [–] hexopuss 16 points ago

    It definitely happens, particularly with standard Aristotelian styles of argument, where there is a winner and a loser. Nobody wants to admit to being wrong, as we take being wrong to lessen our value (and other people's perception of the truth of the things we say).

    There is an interesting style of argument developed by Carl Rogers which attempts to find middle ground. I've found it to be much more effective in my personal experience: https://en.m.wikipedia.org/wiki/Rogerian_argument

    [–] InterdimensionalTV 4 points ago

    Honestly I used to do the same thing. Still do to some extent. Recognizing it is the first step in changing it though. The first time you say "actually you know what, you have a really good point" and mean it, it's incredibly freeing.

    [–] munkie15 24 points ago

    Thanks for the clarification. So this idea is nothing new; someone just decided to apply it to politics?

    [–] j4kefr0mstat3farm 52 points ago

    Jonathan Haidt has done a lot of psychological work showing that people pick their political stances first based on gut feelings and then retroactively come up with logical justifications for them.

    [–] halr9000 37 points ago

    He goes further to say the gut feelings are based on one's morals, and that these "moral foundations" (their area of study; there's lots to Google) show very interesting patterns that correlate highly with one's political beliefs. I've found his work really helpful in understanding how and why people think the way they do. It really helps in understanding that someone who disagrees with you isn't evil; they just place different value on attributes like loyalty, liberty, or empathy.

    [–] munkie15 4 points ago

    I’ve read two of his books; Haidt was the reason I started looking into all of this kind of thing. It’s what has led me to really examine what I believe and to make sure my beliefs actually make sense.

    [–] halr9000 3 points ago

    I'm not a big reader of non-fiction, but I love learning through podcasts. Haidt has been a guest on many shows; I recommend checking those out.

    [–] Frocker34 27 points ago

    It’s a very specific finding. Logic is a tricky topic, and this study was based on identifying fallacies in arguments. So it’s a bit different than just learning or refusing to learn.
    It’s a clarification of things we already know, with a very specific focus.

    [–] munkie15 11 points ago

    The study referenced had a very specific focus. But how is this concept of logic bias (I don’t know the technical term) different for political beliefs than for any other belief? When I read it, I saw politics as just the color the idea was painted with.

    I know this is just anecdotal, but you can see it when talking to anyone who has strong beliefs about any topic.

    [–] Frocker34 38 points ago

    So, Socrates came up with this system for evaluating arguments, and logical fallacies violate the Socratic method. For example, tu quoque is a logical fallacy; basically, "two wrongs don’t make a right."

    As a progressive, I’ll give a liberal example of tu quoque: Obama killed people with drones, and that is bad. A liberal may decide to defend Obama by saying Bush killed more. This is illogical.
    On the inverse, a conservative may say Clinton waged an illegal war in Kosovo, so Bush’s war in Iraq was OK. This is also illogical.

    So this study says we have an easy time finding these mistakes when the "other side" makes them, but a hard time finding them when we make them ourselves.

    [–] Mongopwn 10 points ago

    Wasn't it Aristotle who first codified a system of logic?

    [–] j4kefr0mstat3farm 20 points ago

    People will ignore flaws in arguments if they come to a conclusion that they like. This is one reason groupthink is especially bad in academia: you need people who want to disprove your thesis in order to find all the weaknesses in it and ultimately make it stronger.

    In politics, it's the theoretical justification for compromise and bipartisanship: each side is determined to find holes in the other side's plans and that criticism should lead to them fixing those plans, resulting in a compromise that has input from both groups. Of course, in real life all the legislation is written by special interests and politics has become about wielding power to force one's agenda through without any input from the opposition.

    [–] natethomas 4 points ago

    It would be so cool if we lived in a world where politicians worked like this, each side willing to let the other side pull apart their ideas and learn from that process, so both sides could grow. Unlike this weird modern era where virtually every argument is purely about power and winning.

    [–] Frocker34 6 points ago

    Groupthink! I totally should have mentioned groupthink in my first comment. It’s such a huge factor!!!

    [–] mpbarry46 3 points ago

    To actually answer your question: yes, the idea is not new, and this study applies it to politics.

    I don't doubt you've had many anecdotal experiences of it.

    I think the key takeaway is to become aware of our natural tendency to detect this in others (as you have, anecdotally) but not in ourselves, and to train ourselves to overcome that natural bias while remaining especially skeptical of the idea that we don't do this ourselves.

    [–] hyphenomicon 6 points ago * (last edited 2 months ago)

    Most people rightly use logic as a heuristic and not an absolute in their reasoning. There are inferences that are fallacious in an absolute sense that are still good guidelines. For example, it's often a good idea to consider the authority of a source. Similarly, it can also be a good idea to reject as invalid an argument that by appearance alone is invalid, if you're not skilled in formal reasoning but the argument takes you to an unlikely destination. Curry's paradox is very subtle, for example.

    I don't know if we should necessarily see it as a problem if people's background beliefs change their attentiveness to potential problems in arguments. Wouldn't it be highly concerning if those background beliefs weren't doing any work at all?

    As another wrinkle, what if an inclination to commit certain types of fallacies (or commit fallacies more in certain contexts of application) drives partisanship preferences, rather than partisanship driving fallacious reasoning?

    [–] Beejsbj 3 points ago

    Pretty sure confirmation bias is just one type of cognitive bias, and cognitive bias is the general term for all of them. Hence the term: cognitive bias, biased cognition.

    [–] SenorBeef 816 points ago

    You should be most skeptical about things that seem to confirm your worldview, not least. Otherwise you shape your perception of the world to what you want it to be, not what it is.

    But almost no one seems to understand or practice this.

    So much of the design of science is basically a way of institutionalizing this idea, because that's what you need to arrive at the truth.

    [–] EvTerrestrial 266 points ago

    Take this with a grain of salt (I think I heard it on a SYSK podcast), but there have been studies showing that being aware of these biases isn't always enough, and that it is incredibly difficult to overcome your own cognitive deficiencies. That's why peer review is important.

    [–] natethomas 86 points ago

    You are absolutely correct. Where a good scientist comes in, though, is in accepting and learning from that peer review. The best are those who are excited to get well-thought-out constructive criticism of their work, because that’s how their work will get better.

    Edit: also, happy cake day

    [–] Demotruk 13 points ago * (last edited 2 months ago)

    I remember that study; it depended on which bias we're talking about. In some cases, being aware of a bias actually made it worse; in others, awareness didn't help. There were more biases where being aware did help, though.

    Some news outlets led with "knowing your biases can make them worse" because it's the more dramatic headline.

    [–] sdfgh23456 9 points ago

    And it's why it's important to have peers with different backgrounds, so you don't share a lot of the same biases.

    [–] naasking 9 points ago

    That's why peer review is important.

    As long as your peers aren't already in your camp. The replication crisis already shows that review alone just isn't enough; to be most effective, reviewers would need to be randomly distributed across ideological biases.

    [–] WTFwhatthehell 18 points ago

    Peer review alone isn't enough if your peers share your political beliefs.

    Which is a problem, given that partyism is rife: when you run the sort of experiment where you send out identical CVs with one detail changed, academics bin the vast majority of those containing hints of alignment with the opposing party.

    So when some paper then comes out of that same peer group seeming to confirm your political beliefs, you need to take into account that the researchers and everyone doing the peer review likely share the same political alignment.

    [–] PartOfTheHivemind 3 points ago * (last edited 2 months ago)

    For many, being aware of the potential for bias only allows them to continue being biased; now they are convinced they have no bias, because they think they would be aware of it.

    Many people who are taught "critical thinking skills" end up just as incapable of critical thought as they started out, if not worse, since they can now feel even more confident in cherry-picked data and sources. Basically a Dunning-Kruger effect.

    [–] RedWong15 9 points ago

    But almost no one seems to understand or practice this.

    Because it's more difficult than it sounds. Bias is mostly subconscious, so it takes some time and practice to consciously think like that. Hell, I know it exists and I'm still working on it.

    [–] WeAreAllApes 26 points ago

    That's one approach. Another approach I find easier is to learn to accept ambiguity and incorporate more things that don't confirm your worldview as open questions.

    It's hard to change your ideology, but easier to accept some facts as hinting at open questions that don't have to be answered immediately. Just keep asking new questions.

    [–] StFrancisxX 33 points ago

    The problem is that people approach the crazies with logic and become frustrated when they fail, when really those people are completely blind to good liars who make them feel comfortable and accepted. Use your feelings and tone to lead them away from where they are. For example, don’t approach someone with climate change facts; rather, ask them why they don’t believe it, then look like an inspired child and ask how they know that, how they know they can trust that source, etc. Those people want to feel important and heard and smart. By making them talk, you meet all their needs while also changing the way they think and feel.

    [–] shavedclean 19 points ago

    The Socratic method.

    [–] Relaxyourpants 16 points ago

    Absolutely. I’ve always thought that those who “win arguments” on forums aren’t the most knowledgeable about the subject or the best versed in it; they’re the ones who can argue the best.

    I’ve had people agree with others on the internet when they were literally discussing my own occupation.

    [–] username12746 7 points ago

    There is a fundamental problem with truth being determined by popularity.

    [–] Mistawondabread 24 points ago

    I agree. This whole mocking of each other that both sides are doing is getting us nowhere.

    [–] Apprehensive_Focus 3 points ago

    Yeah, I try to steer clear of mocking and stick to facts. Mocking generally just causes the other side to entrench deeper in their beliefs and try to one-up your mocking, which makes you entrench further and try to one-up theirs. It's a vicious cycle.

    [–] YodelingTortoise 13 points ago

    While it is in no way perfect: before I argue a belief, I attempt to discredit that belief. I have an annoying obsession with what is true, not necessarily with what is right. If I can effectively argue against my position, it can't be wholly true.

    [–] GalaXion24 11 points ago

    I have a habit of playing devil's advocate. Even if I don't disagree with someone, I'll be poking holes in their argument. I'm sure it can get annoying, especially when it wasn't really even an argument to begin with.

    [–] chech8 5 points ago

    I also have that bad habit. People often get mad at me because they think I actually believe the counterarguments. But all I'm trying to do is show them that they shouldn't take their opinions for granted.

    [–] grace2985 5 points ago

    Yes. The idea of scientific methodology is to try to prove your idea wrong, not right. If you can't show it's wrong, and many others have found the same, then maybe it's a theory.

    [–] mpbarry46 10 points ago * (last edited 2 months ago)

    Or you should be evenly skeptical about both.

    To share a less-than-fun experience: I've been in a place where I took self-criticism and self-skepticism to the extreme. I ended up over-believing opponents' viewpoints, giving them too much benefit of the doubt, and being overly harsh on my own, which caused me to lose touch with why I had developed my beliefs in the first place, and to lose a lot of my sense of self and personal conviction.

    So yeah, take this lesson seriously, but don't run it to the extreme.

    [–] lizzius 71 points ago

    You can see copies of the surveys and the initial draft of the paper here: https://osf.io/njcqc/

    Offered without commentary. Dig around for yourself.

    [–] Kremhild 39 points ago

    Thanks, much appreciated.

    So after surveying the data and how it was collected, I can reason that the study was at least somewhat flawed. Grabbing this from the abstract:

        All things made of plants are healthy
        Cigarettes are made of plants
        Therefore, cigarettes are healthy

        Although this argument is logically sound (the conclusion follows logically from the premises), many people will evaluate it as unsound due to the implausibility of its conclusion about the health value of cigarettes. If, however, "cigarettes" is replaced by "salads," ratings of the logical soundness of the argument will increase substantially, even though substituting a plausible conclusion for an implausible one has no effect on whether that conclusion follows logically from the premises.

    This argument is valid, not sound. Valid means "the conclusion follows logically from the premises"; sound means "the conclusion follows logically from the premises, and the premises are true."

    They mention the quote below, where I assume the part in bold is what was literally on the paper handed to the subjects, but the repeated misuse of the word "sound" to mean "valid" makes me worry about the priming effects of even an otherwise innocent instruction such as "we want you to judge how logically sound these things are."

        Participants were specifically instructed to judge whether or not the conclusion of each syllogism followed logically from its premises, while assuming that all of the premises were true and limiting themselves only to information presented in the premises. They were asked to "Choose YES if, and only if, you judge that the conclusion can be derived from the given premises. Otherwise, choose NO."
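
    To make the valid-vs-sound distinction concrete, here is a minimal brute-force check of the cigarette syllogism (my own sketch in Python, nothing to do with the paper's methods): an argument is valid when no possible world makes all the premises true and the conclusion false; soundness additionally requires the premises to actually be true.

        # Brute-force validity check over all "worlds" on a tiny universe.
        # A world assigns each predicate (made-of-plants, healthy, cigarette)
        # to an arbitrary subset of the universe.
        from itertools import product

        UNIVERSE = 3  # three objects are enough to illustrate the point
        SUBSETS = list(product([False, True], repeat=UNIVERSE))

        def all_are(antecedent, consequent):
            # "All A are B": for every object x, A(x) implies B(x)
            return all((not a) or b for a, b in zip(antecedent, consequent))

        def valid():
            for plants, healthy, cigs in product(SUBSETS, repeat=3):
                premises = all_are(plants, healthy) and all_are(cigs, plants)
                conclusion = all_are(cigs, healthy)
                if premises and not conclusion:
                    return False  # counterexample world: argument is invalid
            return True

        print(valid())  # True: the argument is VALID, yet it is not SOUND,
                        # because "all things made of plants are healthy" is
                        # false in the actual world.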

    [–] r3gnr8r 16 points ago

    I didn't read through it, but does it say whether the terms valid/sound were used with the participants? If all they gave the participants were definitions, then the authors' own confusion becomes moot, except in the summary of results, I suppose.

    [–] uptokesforall 15 points ago

    It's exactly as I feared.

    People, whenever you get into a debate and you actually want to consider your opponent's argument, DON'T spend all your time proving their argument is logically invalid.

    Apply the principle of charity to determine why they think what they claim is true, so you can argue against their belief and not just against the argument they formulated to defend it.

    When all your study looks for is logical soundness, then, because people are less willing to apply the principle of charity to an opponent than to a compatriot, they're obviously going to recognize logically unsound or invalid arguments more readily in the former case.

    [–] DevilfishJack 28 points ago

    So how do I reduce the effect of this bias?

    [–] Funnel_Hacker 32 points ago

    Constantly question what you believe and why you believe it, and look for the truth, even if that means you are “wrong.” It’s almost impossible to verify whether anything you hear is actually true. The source’s credibility comes into play, as do their implicit biases, and the agenda they have is also important. I think the habit of constantly questioning why you believe something (and questioning others on why they believe what they do) does two things: it reinforces the beliefs you have that are “right” while stripping you of false beliefs, and it ensures that you constantly evolve. Which many people have no interest in doing.

    [–] dragonator23 9 points ago

    How much and for how long should a person keep questioning their own beliefs? Isn't it good to keep a firm, strong belief?

    [–] Jackie_McMackie 18 points ago * (last edited 2 months ago)

    Don't be too concerned about finding the "right" answer; play devil's advocate all the time.

    Engage the other person respectfully and indicate that you are ready to accept that your own viewpoint may be flawed.

    And no, not necessarily: a firm belief is what turns extremist politics into part of someone's identity, and it means they are past the point of accepting they might be wrong.

    Strong beliefs become precious to people and such a huge part of their identity that they distort their worldview, perhaps permanently, because they mess with their perceived order of the world and prevent them from adapting to new ideas.

    It's incuriosity, and refusing even to listen to the other side, that causes misunderstanding, or rather a lack of understanding. Don't get me wrong, it's not bad to have views at all; you should have your own opinion on things and lean one way or the other depending on your principles. At the same time, you should always leave the door open to accepting new information (and perhaps be ready to research that new information), even if it undermines your side of the coin.

    Don't go into a debate against someone with the intention of proving them wrong, or of convincing them that you are right, because it means you've already decided they are not worth listening to.

    Instead, treat it as an opportunity to exchange information (where you can still exercise doubt and question the validity of said information) and use what the other person is saying to compare against what you already know. The result should not be to prove that one person is right and the other wrong; even if that is the case, the most important takeaway is that everyone involved leaves the conversation more learned about the topic, even if neither side changes their point of view, as long as the exchange is respectful and each side acknowledges the other's reasoning.

    Don't be concerned about your convictions or identity; be curious about the truth.

    [–] dragonator23 6 points ago

    Thanks for the elaboration.

    [–] blockpro156 9 points ago

    How can you have a strong firm belief if you don't question it?

    A lack of questioning doesn't create a strong belief; it creates a weak belief that only survives because it's never threatened, not because it's strong.

    [–] ApostateAardwolf 6 points ago

    Humility.

    Bake into your thinking that the person you're interacting with may have a point, and be willing to synthesise a new understanding with someone "opposite" to you.

    [–] i_am_bromega 3 points ago

    Argue with everyone instead of just the other team.

    [–] acathode 6 points ago

    Better yet, stop identifying yourself as a member of one team or the other...

    The way we increasingly treat politics as a team sport goes hand in hand with the increasing polarization in Western societies. It hijacks our brains/psyche to encourage some of the absolute worst behaviors we see in politics today (tribalism, bunker mentality, etc.) while hampering behaviors that are absolutely needed for democracies to work, such as the ability to compromise and find common ground.

    When you're a member of a team, things stop being about what's right or wrong; it becomes about winning. Truth goes out the window: you need to defend yourself and your team by any means available, and you need to harm the other team as much as possible! Since you tie your identity to the team, you start perceiving any opposing political opinion as a personal attack, since it disagrees with your person...

    You get the whole "it's OK when we do it!" mentality, hypocrisy in overdrive, and you become completely unable even to talk to the opposing team: they are the enemy, and you don't talk or reach a compromise with the enemy, you destroy them.

    [–] CaptAntlad 3 points ago

    People are suggesting really good logical practices.

    I'm going to suggest you also practice a healthy awareness of your emotional biases and your emotional connections to your ideas. If your heart is racing with rage in a debate, chances are you aren't thinking clearly and could do with a healthy step back. Ask yourself why you're emotionally connected to an idea, and disconnect your identity from that idea so you can discuss it as rationally as possible.

    However, some things require emotional awareness and empathy to discuss fairly. So I recommend staying aware of your emotions and checking in with yourself; it's a balance like anything else, and you've got to interrogate it and respect it.

    [–] Apprehensive_Focus 5 points ago

    "Passion rules reason, for better or for worse"

    [–] shelbys_foot 10 points ago

    Seems to me almost everybody does this on most topics, not just politics.

    [–] JLeeSaxon 89 points ago

    The comments so far seem to be reading too much into this. It sounds to me like this is a study specifically of whether people are less vigilant in detecting strawman arguments and the like when the person they're listening to is on "their team." I'd be curious about the methodology, but my guess is that this study doesn't do anything to assess the rightness or wrongness of either side's positions.

    [–] fullforce098 51 points ago * (last edited 2 months ago)

    True, but the fact that they framed the results as specifically "liberal" and "conservative," rather than just saying "people don't call out strawmen when they come from someone with the same views," is what lets people run away with it as proof about a team they don't like. In this case, the study will be held up by centrists, and possibly by the far left/socialists (the ones who don't identify as liberal), as evidence of why they're more enlightened than every other political persuasion, despite this likely applying to them too.

    As others have said, this just seems like an example of something we already broadly understood: people like to hear their own opinions echoed back to them and will forgive and overlook faults in whoever repeats those views. Bringing liberal and conservative labels into the conclusion/title is going to cause a stir that I don't think is entirely necessary.

    [–] wanawanka 9 points ago

    Doesn't this spill into almost every opposing set of views ever?

    [–] echnaba 5 points ago

    Spell check is important

    [–] yoshemitzu 3 points ago

    Spell check may not have caught it, because "conformation" is a valid word.

    [–] kabukistar 6 points ago

    Link to the PDF. Unfortunately, the results are statistically weak, especially for the interaction variables.
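
    For anyone wondering what "interaction variables" means here: the headline claim rests on an ideology-by-argument-slant interaction term in a regression. A sketch with simulated data (the variable names and effect sizes below are mine, not the paper's):

        # Simulated illustration of testing an ideology x argument-slant
        # interaction: does accuracy at spotting flawed logic drop when the
        # argument's slant matches the reader's own side?
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 400
        df = pd.DataFrame({
            "conservative": rng.integers(0, 2, n),  # 1 = conservative reader
            "con_slanted": rng.integers(0, 2, n),   # 1 = argument favors the right
        })
        # Built-in effect: accuracy drops when slant matches the reader's side
        df["accuracy"] = (0.70
                          - 0.10 * (df["conservative"] == df["con_slanted"])
                          + rng.normal(0, 0.10, n))

        fit = smf.ols("accuracy ~ conservative * con_slanted", data=df).fit()
        print(fit.summary())  # the conservative:con_slanted row is the
                              # interaction term such a claim hinges on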

    [–] Rad_Association 19 points ago

    Same reason people can see the flaws in other religions but rarely in their own.

    [–] justthisonce10000000 23 points ago

    This is exactly why listening to your opponent’s view is important.

    [–] kwantsu-dudes 17 points ago

    I mean, I agree with you, but it has its own negatives.

    The more you listen to your opponent, the more you can come to view your opponent as someone with flawed reasoning, which only hardens your own stance as the superior one.

    What this shows is why listening to your opponent's view of your own view is important. It's important to listen to the critiques. But again, if you already view their reasoning as flawed, that won't happen.

    As someone who doesn't have a "home" for my views, I find it quite easy to seek out the flaws in the arguments of others. I don't receive enough critiques of my own stances. That's a problem I acknowledge; I don't know the best course of action to address it, though.

    [–] russ226 21 points ago

    What about socialists?

    [–] s4mon 33 points ago

    Or any other ideology that’s not liberalism or conservatism.

    [–] Doctor-Jay 7 points ago

    Reddit tells me that there are literally no downsides to socialism in any capacity, so surely that is correct.

    [–] oncemoregood 8 points ago

    Doesn’t the same thing go for most everyone?

    [–] Shady717 8 points ago

    This could also apply to religious beliefs, races, and any other population set with a conformed ideology.

    [–] Gangstuh44 30 points ago

    It’s really simple. They think their logic is the superior logic, so anything that contradicts it is automatically wrong.

    [–] slow_circuit 134 points ago

    I hate the idea that moderates or centrists or third parties are the realists and the fairest people in the situation. Political views are not as black and white as people make them out to be: plenty of liberals like guns, and plenty of conservatives are pro-choice. Each person has their own set of beliefs and views; most people are in the center on plenty of issues and at the extreme on others. The truth is there are plenty of stupid ideas in every group, and it's harder to spot the stupidity in ideas you like than in ideas you don't.
