11.24.2013

Tolerating extreme positions

Last time I explained that the instrumental value of extremism lies not in realizing extreme ends, but rather in framing the limits of what is considered "reasonable" or "moderate" discussion. The upshot is that extremist views play an important organizing role in the social discourse, whether or not the extremists themselves are successful at realizing their ends. People tend to decry extremism and urge moderation in its place; but a careful understanding of the dynamics of social organization might suggest better strategies for tolerating extreme positions.

First, let's be precise about our terms. I'm using a very simple model of opinion dynamics, specifically the Deffuant-Weisbuch (DW) bounded confidence model from 2002; the figures below are taken from the paper linked here. The Hegselmann-Krause (HK) model and its extensions are more complex and interesting, but the simpler DW model is all we need for this post.

The DW model describes a collection of agents, each holding some opinion with some degree of confidence. When agents interact, they may have some impact on each other's beliefs, adjusting them slightly in one direction or another. The less confident I am in my beliefs, the more room there is for me to move in one direction or another, depending on the beliefs and confidence of the agents I interact with.
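To make that concrete, here's a minimal Python sketch of the pairwise update, paraphrasing the relative agreement rule from the 2002 paper (the function name and the default value of the intensity parameter mu are my own choices):

```python
def interact(x_i, u_i, x_j, u_j, mu=0.5):
    """One-way influence of agent i on agent j.

    x_* are opinions, u_* are uncertainties. Each agent's beliefs span a
    segment [x - u, x + u]; agent j only moves if i's segment overlaps
    j's by more than i's own uncertainty.
    """
    overlap = min(x_i + u_i, x_j + u_j) - max(x_i - u_i, x_j - u_j)
    if overlap > u_i:
        # Relative agreement: scales the pull by how much the segments agree.
        ra = overlap / u_i - 1.0
        x_j += mu * ra * (x_i - x_j)   # opinion moves toward x_i
        u_j += mu * ra * (u_i - u_j)   # confidence moves toward i's level
    return x_j, u_j
```

Notice that the pull is scaled by agent i's uncertainty: a very confident agent whose narrow opinion segment falls inside yours exerts the maximum pull, while a very uncertain agent exerts a weak one.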

On this model, "extremists" are people who a) hold minority opinions, and b) are very confident about those opinions. Extremists aren't likely to change their beliefs, but they can be influential in drawing others towards their positions, especially when beliefs in the general population are held with a high degree of uncertainty. In fact, that's exactly what the DW model shows.
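In the simulations below, the population is initialized along these lines; here's an illustrative setup (the proportions and the extremists' uncertainty are representative values of my own choosing, not the paper's exact settings):

```python
import random

def init_population(n=200, u_general=0.4, p_extremist=0.1, u_extremist=0.1):
    """Opinions spread uniformly over [-1, 1] with a shared general
    uncertainty; a small minority of extremists sit at the two fringes
    and hold those positions with very low uncertainty."""
    n_ext = int(n * p_extremist)
    opinions, uncertainties = [], []
    for k in range(n):
        if k < n_ext:
            opinions.append(1.0 if k % 2 == 0 else -1.0)  # split between fringes
            uncertainties.append(u_extremist)             # very confident
        else:
            opinions.append(random.uniform(-1.0, 1.0))
            uncertainties.append(u_general)
    return opinions, uncertainties
```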


In Figure 5, the y-axis represents the range of opinions people might hold, centered on 0. The extremists, in orange, hold their positions at the fringes with very low uncertainty; they are not easily swayed from their beliefs. But in this simulation the general uncertainty is also set fairly low (at 0.4). The graph simulates a series of interactions among the agents under these conditions, and we see the results stabilize as we move to the right of the graph, with the vast majority of agents (96%) converging somewhere near 0. Since most people were fairly confident in their beliefs, the extremists had very little impact, and the overall system stabilized into a moderate position.
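With the pieces above, the whole simulation is just a long series of random pairwise encounters. A rough version (the step count is arbitrary):

```python
def simulate(opinions, uncertainties, steps=50_000, mu=0.5):
    """Repeated random pairwise encounters; both agents update from the
    values each held before the encounter."""
    n = len(opinions)
    for _ in range(steps):
        i, j = random.sample(range(n), 2)
        x_i, u_i = opinions[i], uncertainties[i]
        x_j, u_j = opinions[j], uncertainties[j]
        opinions[j], uncertainties[j] = interact(x_i, u_i, x_j, u_j, mu)
        opinions[i], uncertainties[i] = interact(x_j, u_j, x_i, u_i, mu)
    return opinions, uncertainties

# Low general uncertainty, as in Figure 5: expect most opinions near 0.
ops, uns = simulate(*init_population(u_general=0.4))
```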

In contrast, Figure 6 describes a scenario where general uncertainty is very high (1.2). In this case, extremists are much more likely to sway those near them, and you can see how this eventually results in a radically polarized field of beliefs, where most agents are drawn away from the center towards one extreme or the other. In more complex models like HK, you can show that these bifurcations never reach a consensus, but instead tend to maintain themselves as independent opinion communities that never interact.
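You can reproduce this qualitative flip in the sketch above just by raising the general uncertainty, leaving everything else alone:

```python
# High general uncertainty, as in Figure 6: expect opinions to split
# toward the two fringes rather than cluster near 0.
ops, uns = simulate(*init_population(u_general=1.2))
```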

In any case, perfectly polarized situations like this are less common than situations where one extreme or the other dominates the field. Figure 7, below, shows a representative simulation where the upper extreme eventually dominates, again in a case of high uncertainty. Notice that in this simulation the extremes become increasingly attractive to agents near the middle, which corresponds with their decreasing uncertainty. But it's the agents just off-center, slightly more extreme and more confident than the average, who are doing most of the influential work. Interestingly, one extreme or the other might dominate even when the number of extremists on both sides is equal.
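To tell these three outcomes apart numerically, the 2002 paper uses an indicator built from the proportions of initially moderate agents that end up at each extreme; here's a rough version (the "near an extreme" threshold is my own choice, and it assumes the extremists come first in the arrays, as in the initialization sketch above):

```python
def convergence_indicator(final_opinions, n_extremists, tol=0.2):
    """Roughly 0 for central convergence, 0.5 for polarization into both
    extremes, and 1 when a single extreme captures the moderates."""
    moderates = final_opinions[n_extremists:]
    p_plus = sum(1 for x in moderates if x > 1.0 - tol) / len(moderates)
    p_minus = sum(1 for x in moderates if x < -1.0 + tol) / len(moderates)
    return p_plus ** 2 + p_minus ** 2
```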



This all demonstrates interesting instabilities in the dynamics of opinion, and suggests some immediate lessons for tolerating extremism in the discourse. These lessons might not be entirely surprising, but I think they are worth making explicit. By "toleration", I don't just mean the general respect for disagreement and difference that ought to accompany any human exchange. Instead, I mean "toleration" in the sense that, for instance, some digestive systems can tolerate lactose, while others (like mine) fall apart in its presence. Tolerating extremists means working in situations where the presence of extreme views is taken for granted and handled appropriately. The point is to understand precisely the power and influence extremists bring to social dynamics, and to anticipate and manage the extent of their potential impact. Extremists aren't wanton forces for chaos, discord, and fear; they are an ineliminable aspect of human social dynamics. If we're to manage those dynamics at all, we'll need to learn to digest extremist views.

First and foremost, extremists have influence primarily in situations of high uncertainty, when people's beliefs are unstable. This suggests that urging moderation, or attacking extremism directly, may not be a particularly effective strategy for dealing with extremist beliefs. In fact, engaging extremists directly might have the adverse consequence of raising the general uncertainty of beliefs, thereby making people more easily swayed towards extremist views. One characteristic of stable systems is that distinct belief communities don't interact much. To the extent that cross-talk between communities raises uncertainty within them, it also raises the potential influence extremists have over them.

It shouldn't be surprising, then, that extremists are usually marginalized and ignored, precisely as a way of reducing their potential impact on an uncertain public. But the model also shows that a lot of the work in pulling a moderate consensus towards an extreme is done not by the extremists but by the slightly off-center moderates who fall within the scope of influence of both the extremists and the centrist majority. Someone who "leans right" might actually do more to move overall opinion rightward than the "strong right" minority that represents the extreme, since the right-leaning moderate will generally have more interactions with, and chances to influence, the rest of the centrists. This result is counterintuitive, especially if we are trying to counteract extremists by urging moderation and marginalization. Moderating a formerly extreme position might nevertheless pull the discourse towards the extremes, by making those extremes more palatable and familiar to moderates through centrist-leaning proxies.

The DW model suggests an alternative to marginalizing extremists that may be more effective at managing their presence: instilling higher confidence in people's existing beliefs. Which is to say, education is more effective than moderation. Reducing uncertainty in the discourse reduces the influence of extremists without needing to attack or even address them directly. For instance, consider astrology: an extreme belief in the model's sense, which few people hold with high confidence. In 2008, the NSF found that 78% of college graduates believed that astrology was "not at all scientific", compared with 60% of high school graduates. Although college students rarely receive any formal refutation of astrological claims, the confidence instilled by their education nevertheless reduces the potential influence of extreme astrological beliefs. This confidence can be instilled whether or not astrological beliefs are engaged directly, and isn't contingent on deliberately avoiding or censoring that community. In fact, one common way of establishing increased confidence in science is through a contrast with alternatives like astrology. Science doesn't need to actively marginalize astrology through censorship or avoidance, because it is stable even in the presence of extremist alternatives.

Of course, not all extremes are bad. There might be good reasons to try to direct the beliefs of a stagnantly moderate population towards some extreme or other. After all, sometimes the extremists are right, and a strategy that merely seeks to avoid extreme positions also loses access to any wisdom they might provide. Even in science, a new theory or discovery might warrant a shift in the consensus opinion, so that what was formerly an extremist view becomes the norm. The importance of updating beliefs is precisely why we'd want to encourage cross-talk among communities even at the risk of introducing instability: those instabilities might open up advantages that can't be reached from intra-community talk alone. In other words, unstable positions are sometimes good, in that they can allow the discourse to find new and better positions, which is often more important than mere stability.

To sum up briefly: extremism isn't always a problem, and when it is, moderation and marginalization aren't always solutions! Sometimes the presence of extremists simply reflects the fact that people aren't entirely confident in their beliefs, in which case marginalizing extremists may actually have counterintuitive consequences. Dealing with that lack of confidence through education can be an effective and reliable alternative to marginalizing extremists.

I want to talk more deeply about the way belief communities form and maintain themselves, but for that we'll need a more complex model like HK, which we'll talk about next time! 


