[This is a follow-up post to “Worldviews, Narratives, and Ideologies” and “Everyone Has a Worldview.” My purpose is to understand how political ideas have an effect on events. You can find an overview with short summaries here that I will keep updated: “Synopsis: On Various Political Ideologies and Worldviews.”]
My starting-point for analyzing the impact of ideas on political developments is that the human mind has two modes of thought: intuitive and rational.
Intuitive thinking is very fast; it works mostly with rules of thumb and tries to match patterns. When new information comes in, our mind tries to fit it into what we already think we know. We may modify or even ignore facts that create tensions right at the start. If they cannot be ignored, the same mechanism can lead to changes in other views we hold, which may sometimes cascade to more and more of them. In extreme cases, this can lead to a conversion experience where we adopt a completely new view of the world. But that is rather rare. I have written about this phenomenon in my post: “Pavlov’s Dogs and Sudden Conversion.”
Intuitive thinking abhors contradictions and actively works to remove them. Confirmation of your views, by contrast, is not a problem, and it is hence gladly accepted as reinforcement. The purpose of all this is apparently to keep our views internally consistent. If in doubt, intuitive thinking goes for coherence, not for the facts. My understanding is that the purpose is to enable us to navigate the world with fast decisions. It is often better to operate on mistaken views that are consistent than on correct views that make decisions hard.
Ideas that are tied to emotions — especially strong feelings — are more easily absorbed by intuitive thinking and stick more strongly than those that come to us in a sober mood. We tend to lump feeling and thinking together. We also easily fail to distinguish sharply between factual assertions and ethical or even aesthetic judgments, which may all be handled in one go. So a typical mistake for our intuition is to answer a factual question with what we find good or what simply looks attractive.
All in all, the things we learn over time — plus perhaps also innate assumptions — result in what I call a “worldview,” an intuitive understanding of how the world works. It appears as self-evident to us, and it is effortless to draw upon it. There are many parts in a worldview: Assumptions about what is the case, what the relevant entities are, ethical and aesthetic judgments, claims about how different things are connected with each other, how they have developed in the past, expectations for the future, and so forth.
In a way, this is very complex: there are many parts here, related to each other in many ways. But then a worldview cannot cover everything. There are rather stringent constraints on how much information we can absorb. We tend to group information into larger entities. It is impossible to think of all Germans individually because there are so many. So we group them into “the Germans,” which can be handled like a single person.
Realistically, we know very little about most of the world. Many continents on our map just have to remain white. Still, with the help of mechanisms that can fill them out on the fly, we don’t experience this. It feels like we have this panoramic view of everything.
Try this as an experiment: Get into an intuitive attitude without conscious thinking and just listen to what comes immediately to your mind. Now, here is the question:
How much do you think you know about the world and different countries?
Just feel the answer. My immediate intuition is this: I have knowledge about all of them, even roughly on the same level. A little later, my intuition walks this back a little: Okay, I know perhaps more about some countries than others, but basically there are no blind spots.
Only when I start to concentrate on what I really know, do I realize that actually, I don’t know anything about many countries. I could not find them on a map or give a single bit of information. As for Ghana, the only thing I can find is the very name of the country and that it is in Africa. As for Uruguay, I can recollect a few facts. But if I had to write them down, they would easily fit on a postcard.
On further reflection, what I intuitively felt I knew falls apart. I know quite a few things about some countries, but not all that much. And what I also realize is that my knowledge is extremely uneven. Actually, if I concentrate long enough I become very doubtful as to whether I know anything at all, even about the country I live in. Still, when I get back into an intuitive groove it feels again as if I basically know about the whole world and even roughly on a par for all countries.
Maybe this is just me, but my hunch is that it is not that different for other people. I would say this is normal. That’s how our mind works, and we all have such a worldview. So I do not fault anyone for it. And often this is very innocuous. It only becomes a problem when we do not reflect on it. There is the common phenomenon that people who lack any knowledge are the most confident about what they feel they know, known as the Dunning-Kruger effect. Or in the words of Bertrand Russell:
The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt.
But then this is not actually about intelligence. Intelligent people — and Bertrand Russell himself! — suffer from the same problem, too. It is more about whether you are willing to challenge what you feel you know. We all have a worldview, and that’s just how the human mind works. It is pointless to try to get over it. The best we can do is force ourselves to resist the temptation when it matters. Otherwise, relying on a worldview is not such a problem.
Since the underlying mechanisms work to keep our worldview consistent, it is rather resistant to change. That can still happen, but for the most part only slowly. If our worldview comes under pressure when it is confronted with too many contradictions, it might flip to a completely different one, a conversion experience, especially if you are emotionally agitated. However, that is rather rare. Otherwise, we absorb contradictory evidence only very slowly.
So far, what I have explained is a worldview on the individual level. It depends on our life experiences, what we have been exposed to, how much consistency we bring into it, and so forth. Despite this individuality, worldviews also tend to align in groups. We may learn new elements from others, and they can then fall into place in a worldview. These others could be peers, but often authority makes the adoption of a worldview more likely. You learn much from your parents, your teachers, and in general from people you look up to because of their status or education.
Since worldviews often align, it is sensible to speak of them also on a social level, in the aggregate: the worldview of this or that group. But be careful: This can be a misleading simplification. A group may behave like an individual in the aggregate, but not in all regards. There are similarities, but also important differences.
As for the similarities: You can observe the same mechanisms in the aggregate that keep a worldview together, and sometimes also conversion experiences that grip a group in extreme situations when emotions run high. As for the differences: The aggregate worldview of a group can diverge and split up into many. That is unlike an individual, where the underlying mechanisms keep a worldview unified, perhaps apart from some mental illnesses.
Aggregation may also miss what is particular about the individual worldviews in a group. The same element may play different roles in different individual worldviews. Only where there is a consensus can you draw conclusions about the whole; otherwise you have to keep in mind that it is a mix. There may be a lot of overlap between individual worldviews, but not for everything. Hence what you can observe in the aggregate is only what is common, while the rest comes across as noise.
There are many more caveats, and yet it can be useful to speak also about the worldview of a group of people, e.g. of all adherents of a religion or of a political direction, although they will not agree on every detail and certainly not on matters that have nothing to do with it.
On the social level, worldviews can be more persistent than on the individual level. Not only mechanisms for removing internal tensions see to this, but also the alignment within a group. This can lead to even more continuity over time than for individual worldviews. The latter only live in an individual, while the former can go on over generations.
However, there are also mechanisms here that go in the other direction. Worldviews on a social level, for example, have to be handed down from generation to generation, and some parts can and even must go missing in the process. If the older generation had some experiences that were formative for their worldview, and the younger generation does not know about them or only in an indirect way, different parts of a worldview can take on new meanings. This can then lead to a new worldview and to major changes over longer time frames.
— — —
The second mode of thought is rational and slow. It is hard work, and humans tend to avoid it. Basically, we keep track of all the steps of an argument, check whether they work, and always ask questions like: Is that really true? Could it not be otherwise? Or: Is there some way to refute this?
We will take more time to reach a conclusion and only treat it at first as tentative. We might also put more work in, do research, and seek out information that supports an argument or undermines it. Often rational thought kicks in when our intuition is at the end of its tether. It is when you say to yourself: Something is wrong here, I have to concentrate to get this right.
But we can also pursue rational thinking all by itself. However, this is the exception rather than the rule, which is to work with intuitive thinking. Some people rarely use their rational side, others use it often. But then, we all use our intuition when we do not concentrate. That has nothing to do with intelligence, only perhaps with how much someone enjoys thinking things through.
I find Bertrand Russell funny in this regard. He can make very good arguments that appeal to reason, e.g. in philosophy and, of course, mathematics. But when he slacks off, he can be just as naive as everyone else. Try his arguments for Socialism. That often happens when even very intelligent people step outside their domain of expertise.
On their own ground, they know they have to be careful and that there are many caveats. But outside of it, they quite naively assume that a little intuition and absorbing a consensus view is all there is to it. Isaac Newton was a genius when it came to physics and mathematics, but he also dabbled in alchemy and made the prediction that the world would come to an end no earlier than 2060.
— — —
As opposed to intuitive worldviews, I call systems of ideas that work on a rational level “ideologies.” They may consist of different building blocks that can stand on their own. Ideologies typically use precise definitions, split larger questions into smaller chunks, build a sequence of arguments, and often also address possible objections. That is what we usually find in essays or books, although an ideology need not exist in writing.
While a distinction between the individual and the social level is important for worldviews because phenomena may work out in different ways, for ideologies the distinction is not as stark. An ideology is held together not by idiosyncratic associations, but by general rules. Of course, an individual can have their own take, e.g. they might simply get something wrong. But there is a general argument behind an ideology that is independent from who thinks about it. My take here is Platonic: ideologies live in a world of ideas. They are potentially always there, and are only realized in our individual minds. Aggregation is hence not such a problem, except when you try to distill what is common across similar but diverging ideologies.
Ideologies as present in human minds can evolve over time, and they can react to each other. They can also split up or merge with others. You would trace such a development then in a “history of ideas” or in postmodernist lingo as a “discourse.” Since ideologies are rather transparent and also often in some way preserved for posterity, it is rather easy to track them over time, while worldviews can remain elusive because they are implicit and have to be reconstructed. Worldviews are already hard to access in real time, and they are mostly lost over the long run. This leads to a certain survivorship bias where ideologies seem primary and worldviews secondary. It is as if some worldview came only into being when the first person tried to present it as an ideology. But that may often be wrong.
— — —
What makes an analysis difficult is that ideologies can be intertwined with worldviews in many ways. For example, ideologies can draw upon worldviews for background assumptions. An indication that a worldview plays a role is that certain steps in an argument are treated as obvious although they are not. Weak logic can also betray that a worldview supplies the self-evidence. Other warning signs are appeals to emotion as well as ethical or aesthetic judgments in a factual account, which are typical for intuitive thinking.
Ideologies often also go along with a certain worldview. While ostensibly the argument is rational, a panorama of the world is presented along with it and then reinforced, which appeals to intuitive thinking. You could view this as a mere illustration, but the worldview can take on a life of its own.
That is, for example, my analysis of Thomas Malthus’ “An Essay on the Principle of Population,” first published in 1798. While the rational argument is extremely weak, the appeal to intuition is very strong. In my view, that is what holds Malthus’ book together and explains its tremendous impact over time, not the stringency of his arguments. You can find an overview of my posts on the topic in: “Synopsis: What’s Wrong with the Malthusian Argument?”
There are also other connections between ideologies and worldviews. As noted above, rational thinking takes over from intuitive thinking when the latter runs into insurmountable problems. Someone may shop around for an ideology not because of its persuasiveness, but because it can fix an underlying problem in their worldview. If they engage in rational thinking themselves, there is a danger of motivated reasoning or sloppy logic because the driving force is actually a worldview in search of a solution.
— — —
But there is also a connection the other way around: The outcome of rational thought can become part of intuitive thinking. And in this way ideologies can also have an impact on worldviews. Since the latter are rather resistant to change, it may take some time before ideologies gain momentum in this way and become “folk wisdom.”
Take as an example the idea that we live on a sphere, which is not intuitive: From my German perspective, people in Australia hang with their heads down and will fall into space! Still, via rational thought the ancient Greeks seem to have been the first to get it right. It was then lost again, and later rediscovered. First there were only some people who understood that we live on a sphere, then it spread slowly. By now, this is “folk wisdom,” and we chuckle about how stupid people were in the past. There are some holdouts who stick with a more intuitive worldview, though.
That can also mean that in a cross-section at some point in time, different worldviews are present: some newer, some older. You can often find that worldviews contain elements from ideologies of yore. The lag can be by decades or a century and more.
As an example: People regularly speak about “fresh air” as if this were in and of itself something beneficial, apart from the fact that it surely feels good. What is behind this is the medical theory of the ancient Greeks that diseases come from bad odors or “miasmas” (pollutions). “Malaria” literally means “bad air.” And that was also the explanation: It came from the swamps, and that’s what caused the disease, not parasites (Plasmodia) transmitted by mosquitoes.
The idea is quite understandable: Where something rots, it stinks, and a certain revulsion against such odors is even part of our human nature. We react with disgust. But the odor itself is not the problem, only what it is associated with. The “miasma theory” remained the dominant explanation until late in the 19th century, when people began to understand that bacteria, fungi, or parasites were the real cause, not the odors. As for malaria, the French physician Charles Louis Alphonse Laveran figured it out in 1880. Still, a part of the old worldview survives to this day. And it is hard to shake off because it is so intuitive.
There can also be other effects when ideologies are absorbed at different speeds in a society. If it goes very fast, there can be an effect that I would call “shearing”: One part of society, e.g. the urban population, mostly adopts a new ideology, while another, e.g. the rural part, does not and preserves older views. If the divergence is rapid enough, the two may lose contact, and the worldviews in a society split up.
There is also another connection here: Political ideologies especially gain their force from the numbers, and maybe the preeminence, of the people who endorse them. If they can be absorbed into pre-existing worldviews, that can work fast; otherwise it might take some time or even fail. You could view this as a market where the demand comes from worldviews and the supply from ideologies. And that can determine which ideologies have an impact.
Returning to the example of Thomas Malthus’ “Essay on the Principle of Population”: The argument is so weak, but it caught on with educated people anyway. Why?
My hunch is that it seemed to solve a problem in the worldview of Malthus’ readers. The French Revolution had turned sour at the time, and everybody wondered why that was so. Malthus’ intuitive answer was that it was a law of nature: Since there will always be poor people and misery, it cannot be helped, and it is even good. That’s why the French Revolution could not work: it went against a natural law. And then his readers were perhaps also fretting about the poor rates, which had been going up for them as taxpayers in England. Malthus’ explanation came in handy here: The poor rates produce more poor people, which is why there are ever more of them and the poor rates keep on rising. It would be better if they were gone!
Actually, my guess is that many such theories owe their success to a certain flattery for an educated audience. Eugenics is a prime example here: You can think of yourself as the fortunate outcome of a ferocious “struggle for existence” (a term actually coined by Malthus) over the ages. You are the pinnacle of this astonishing process. And now all those idiots multiply to the max, and it all goes to waste. Of course, there should then be a program that preserves the successful specimens for the future — you! — and sees to it that the idiots do not get in the way. No eugenicist ever drew the conclusion that they themselves should not have children because they were indeed stupid.
My general take is that worldviews are primary. Everyone has one. It is conceivable, and probably the case, that many people have only a worldview and no explicit ideology. Rational thought is hard, and humans try to avoid it. You will not get around it altogether, but it seems possible to do without it to a large extent. That does not mean, though, that ideologies play no role for those who rely only on intuitive thinking. They may learn their views from others who are exposed to ideologies. And in a slow process, their worldviews, too, may result from the ideologies of yesteryear.
But then the main drivers appear to be worldviews, at least when you have to get a certain mass together. Most people who supported Communism probably never read “Das Kapital.” Still, they certainly absorbed an associated worldview. The same goes for National Socialism, which had its impact not via “Mein Kampf,” but again via an associated worldview. This is not to say that ideologies do not play a role, only that their impact is, in my view, more indirect and has to be understood via their effect on worldviews.