William Wellman made a film, The Ox-Bow Incident (1943), which is a forceful indictment of a lynching in a Nevada town in 1885. It all stems from a rumor that rustlers have killed a rancher and made off with his cattle. Henry Fonda plays a cowboy passing through town who opposes the enraged citizens taking the law into their own hands, but the near-unanimity against his position disarms him. It is the same role he would later play, with greater courage and determination, in Twelve Angry Men (Sidney Lumet, 1957). A mix of both themes, predating both films, can be found in Fury (Fritz Lang, 1936). Spencer Tracy arrives in a town where a girl has been kidnapped and a ransom demanded. He is taken for the kidnapper and preventively thrown in jail. A rumor spreads that he is the culprit, and the inflamed townspeople set the prison on fire. The assailants' shameful cover-up of one another at trial is broken only when the footage recorded by the journalists' cameras is shown. What can we do: it is as if lynchings had passed, without interruption, from the Wild West to the judicial (and political) system, with all its guarantees, and with rumors always stirring things up. Consider the case of Rocío Wanninkhof and Dolores Vázquez, or the assault on the Capitol in Washington.
Rumor is a term that arouses little interest in the study of political communication and the public sphere. On the one hand, it may be said that it smacks of gossip-mongering and the neighborhood courtyard, an anthropological universal, a fate that now circulates on networks and screens and has managed to elude science, the ballot box, the judges' gavels, and the questions of investigative journalists. On the other hand, it is clear that rumor does not overtly imply falsehood or malicious intent, as the terms disinformation or fake news do: rumors can eventually be verified; they are not always false. This vagueness, or ambiguity, makes rumor difficult to confront. How can one rationally oppose the insidious folk wisdom of "where the river sounds, water flows" and the preventive "slander away, something will stick"? And yet, what a powerful tool for disinformation.
Rumor is a resource that, ideally, needs only three people: the schemer, the victim, and the person who listens and acts. It is true that rumors that turn out to be true become news or reliable information, and later even History, collected in manuals and textbooks. But it is also a fact that those that are never confirmed, or are even shown to be blatantly false because there is evidence to the contrary, are not lost at all: they have a harmful longevity, like that of plastics in the environment, and, like certain viruses that keep exerting their infectious power by adapting to changing conditions, they resurface at particular moments of weakness or confusion.
Rumors are statements of fact about persons, groups, events, or institutions that spread from person to person and are made believable not by supporting evidence, since they have never been proven, but because other people appear to believe them. Their recipients and propagators are not just anyone: they are generally inclined to accept the rumor because it supports certain prior beliefs or assumptions, because it confirms intensely felt fears or hopes.
The fuel of rumor
The spread of rumors rests on two phenomena (see Cass R. Sunstein's On Rumors). The first is the cascade effect: in the absence of evidence to the contrary, we tend to believe what others believe and to side with the majority opinion, so the more people repeat a rumor, the bigger the cascade.
A famous experiment by the psychologist Solomon Asch is illuminating. A subject is placed in a group of nine, believing the others are also participants, when in fact they are confederates acting on the researcher's instructions. The task was to match a line drawn on a white card with one of three other lines, one of which was the same length as the sample while the other two were clearly different. In the first two rounds everyone gives the correct answer, which is fairly obvious. But in the third, all of the researcher's confederates pick a wrong line, in what looks like a blunder. Under this pressure to conform, the subject lets himself be dragged along by the erroneous majority opinion in 36.8% of cases, compared with a 1% error rate in the control tests. If this is what happens with our sensory perceptions, with facts before our very eyes, what will happen with rumors, which refer to facts beyond our sight and are recounted to us by people who did not themselves witness the event in question?
The second phenomenon is group polarization: when like-minded individuals deliberate together, they end up thinking more radically. An experiment conducted in Colorado in 2005 asked whether the United States should sign an international treaty to combat climate change. Each participant's views were gathered beforehand, and the issues arising from them (some of them soaked in rumor fuel) were what the debate would then revolve around: whether climate change is a scientific fact or a hoax, whether the commitments derived from such a treaty could avert disastrous effects for the US and the world in the medium term, whether the agreement would damage the US economy more than other economies, and so on. Participants were split into two large groups, liberals and conservatives, or rather Democratic- and Republican-leaning, and when their opinions were gathered again after only fifteen minutes of discussion the result was clear: both had radicalized their positions, and this in individual statements submitted anonymously.
Outside the laboratory, in real social interactions, cascading influence and group polarization can occur in many everyday settings: a conversation with friends in a bar, a political rally, a meeting or demonstration, not to mention a company shareholders' meeting, a neighborhood association or union assembly, among supporters at a football match in the stadium, among students assembled before a possible strike, and so on.
All these forms of cascade and polarization imply physical proximity among the participants, in a kind of shared excitement that invites imitation and also upping the ante. What about effects at a distance? According to many researchers, virtual Internet communities in fact function as "echo chambers" and algorithms as "filter bubbles", and both feed homophily, that is, the contagion of thinking the same thing while affirming oneself by pointing to an intransigent "other".
Debunk the rumor, satisfy rigor
Now, in the face of all these (frankly depressing) experiments, perhaps the most surprising thing is that someone has questioned the implications of what has just been described. In a paper published at Yale University in 2013, researchers asked two groups separated by political affiliation (Democrats and Republicans) a series of questions and offered them the chance to win $200 in a draw if they answered correctly. The chance of winning was 1 in 100, but each correct answer increased it significantly. The questions concerned data on the unemployment rate under George W. Bush, the number of soldiers killed in Iraq between 2003 and 2011, and the percentage of the federal budget devoted to health care for people without resources. In parallel, two similar control groups, also Democratic and Republican sympathizers, answered the same questions without any financial reward. The answers of the control groups diverged widely, as expected, but among the incentivized groups polarization fell by 55% (taking that of the two control groups as 100%); that is, their positions came considerably closer, and partisan, prejudiced responses softened a great deal.
The researchers hypothesized that, when in doubt about the data, participants leaned towards the partisan answer because they had no option to say "I don't know." When a small financial incentive was also attached to "I don't know," polarization fell further still, by up to 80%. So partisanship is not as stubbornly irreducible as those free, costless, throwaway answers might suggest. The partisan tips the balance in his own favor by adding a plus to what he knows or thinks he knows: he has not been deceived, he has not fallen into a trap; rather, he bends what he knows and thereby displays loyalty to his cause, sweetening or inflating the data that supports it. The seduction works wonderfully when it happens before an adoring crowd of fellow fans ready to give it a thumbs-up. Yet that same person can restrain himself the moment there is a good objective reason to do so, and even admit his ignorance when that attitude is rewarded.
It is a pity that political debates in Congress are not adapted to the format of Saber y ganar, where, besides being televised, the honorable members would be the contestants and Jordi Hurtado would act as moderator, reading out the questions written on the cards, with the answers verified on the spot and beyond dispute, to the blushing of the most obstinate. We would see a remarkable reduction in absurdly triumphalist or catastrophist claims, tendentiously interpreted figures, manipulated statistics and graphs, not to mention personal insults and humiliations.
Source: Informacion
