Sunday, March 16, 2025

of mathematical beauty, crystals and potatoes: a Darwinian explanation

της τε ταυτου φυσεως και της θατερον (the nature of the same and the different) -- Plato, Timaeus

Potatoes are not pretty. They are also not mathematically patterned, not even symmetric. You may have noticed that they grow in the earth unseen, as if hiding how ugly they are. 

Well, that'd be one just-so story of why they grow unseen. Maybe it's not so silly. A more reasonable just-so story of natural selection would conclude that they lack mathematical order and its beauty because they grow underground where you cannot see them. Natural selection for patterns would be wasted on potatoes. True of all the roots, btw.

This is, of course, the key to why patterns are perceived as beautiful. Randomness is mostly useless for communication since you can't control the signal -- randomness can only communicate randomness, of occasional communicative value, but quite limited to signaling itself, "random", regardless of context. Only patterns can communicate distinct, controlled signals. To the extent that communication benefits species, patterning should be found throughout the biological realm of visibility. Natural selection has happened upon a means by which to communicate and even attract organisms, a simple means of patterning with simple recursive math. (Language, the quintessential communicative means, itself is a highly recursive mathematical patterning, and it's all about controlled signals -- super-controlled and highly distinct.)

So natural selection, which "had no option" of using randomness for communication, leaves all of us visual organisms susceptible to patterning. It might even explain why we love theorizing to explain patterns. The regularities are the only option for effective communication, and theories are a form of communication. We want patterns because that's all that's worth understanding, as I think we'll see. 

Randomness is not only uncommunicative, it's not predictive, and as organisms surviving in time, we most especially want to predict future threats so we can avoid them. Natural selection again. Patterns, we love 'em. Randomness, not so much. It's nice for a change or a challenge. A mystery novel starts with the random, but we read to the end to bring it all into pattern. Timaeus' alternation of the same with the different, the pattern of the random with the predictable, has a special charm. We seem to be driven for prediction's sake to understand or find patterns in what seems at first random, but once understood, it's boring and we take it for granted. That's why news media leans towards threats and dangers, bleeding ledes. Cognition is not stable, it's a driven, impatient process. The familiar knowns are stable, and are soon taken for granted and forgotten, the oblivious obvious (post on this upcoming). The well of uncertainty that leads to distrust and fear seems bottomless: conspiracy theories are more fun than mere facts.

You object, "crystals are highly ordered and beautiful to our eyes, yet they are mostly in the earth". Ah, but that's the perfect exception that proves the theory. Crystals are not organisms.

They can't intend anything, much less communicate intentions outside themselves. Lacking reproductive DNA, they don't engage in generational natural selection. They're stuck with fixed forms that either survive (continue to exist in their local context) or don't, no chance of evolving better. 

That crystals abound in the earth tells us that they are not signaling (while organisms can signal, and can use patterning to do it), that they don't know what they're doing -- not a clue, and this is important for the panpsychists -- and that we're selected for liking patterns, for seeing them as beautiful or, to use more scientific language, attractive. It's a natural selection gimmick available only to reproductive generations. On this planet, that comprises the biological organisms. A distinguishing difference between the organic and inorganic is that inorganic patterns may go entirely unseen for their entire long duration; organic patterns are meant to be seen. 

To put it in the plainest terms, if crystals were organismic, they wouldn't be hiding underground, b****. :-)

the gnomiad: the science and sociology of life advice and their paradoxical puzzles

This post is a preliminary stub. I'm still working on the project, but here's a sketch of the science and sociology of gnomes (life advice). It's in three parts: 1) most surprisingly, the contrary pairing of life prescriptions shows that they are empty, which implies that the benefit we seek from advice or theories of the world, while psychologically soothing or reassuring, is impractical and unaligned with the realities we actually face every day; we seek a simplified, confirmatory world of fantasy while we actually make practical choices in reality, 2) the prevalence of one of the pairs, and absence of the other, in the culture raises many questions about cultural values or taboos or public reactions to those values and taboos, 3) most obviously, advice-giving influencers can be conveniently analysed politically and socially through their advice selection of one pair over the other. Finally, since philosophies and religions across the world are not only shot full of advice but are often motivated by advice-giving and draw their audience from advice-seekers, the social psychology of gnomes affords a novel perspective on these theories and insights into their audiences' motives. 

"...mystery, miracle and authority..." -- the Grand Inquisitor 

1. the complementarity of gnomes

Life advice is everywhere on social media. It has obvious appeal, but an objective look at advice -- objective in the sense of not looking for advice but just looking at advice as a phenomenon or set of phenomena -- reveals paradoxes and puzzles. First among them is the pairing of contrary advice. A piece of popular advice and its contrary can both be equally useful, wise and true. "Better safe than sorry" is just as good a piece of advice as "carpe diem" (seize the day) or "fortune favors the brave" -- that is, don't be safe or you'll be sorry. Even more puzzling, why do people find any of these gnomes inspiring, when their opposite should be equally inspiring? 

A moment's reflection on this complementarity of gnomes -- their tendency to come in contrary pairs -- shows that gnomes are little more than familiar, even banal definitions. "Better safe than sorry" is little more than a definition of what "safe" means; "seize the day" merely defines, metaphorically, "opportunity". And yet more puzzling, every English speaker already knows the definitions of these words and is familiar with the ideas they denote. Everyone already knows the benefit of safety and when safety is useful, and the same for opportunity and risk. We all know that we must assess risk and safety case by case. So what is the appeal of the advice?? Is it like quoting one's favorite song lyrics instead of using one's own words? Or is there something more going on? More, no doubt, as we'll see.

If life advice adds no useful information in practice, is the appeal purely psychological? Suppose it's just a way to make life seem simpler even though it has no practical value in decision-making. It's a kind of psychological soporific, a soothing reassurance to smooth away anxiety just a little bit. Grasping one of the complementary gnome pairs provides a way through the complexity of everyday life. And choosing that principle no doubt confirms what one wants to believe about oneself and one's place in the world.

I think this is one of the revealing surprises of what I want to call the gnomiad, collecting the range of advice (tongue-in-cheek hat tip to Wolfram), and gnomiology, the study of advice. Gnomiadry -- thinking about gnomes and their world -- tells us that our imagination of life is distinct from life as we live it and that this imaginary has emotional value even though it has no other practical value. Isn't that what religion provides as well? Could it be that philosophical and political systems supply the same confirmatory reassurance and soothing simplicity? And what about science and its laws dependent on ideal situations? Don't these theories or narratives, understandings and explanations of the phenomenal world reflect a deep and basic species need? Put in terms of the predictive mind (Friston, Clark), isn't this what defines an organism -- creating a theory of its environment and itself to predict its future and preserve its free energy? Ironic that for humans, preserving our free energy from the wastefulness of anxiety and stress leads us to false beliefs (like many political beliefs or fringe theories) and partial, one-sided beliefs like life advice.

So the pairing of gnomes leads us to understand them as vacuous, familiar tautologies and then to a question: what's their appeal? Gnomiology has an answer that reaches deeply into our organic nature. 

2. asymmetry of gnomes

Here's a second question. Many gnomes have contrary complements that have no appeal and are nowhere to be found in the culture. There are dozens of versions of "be yourself", "don't measure yourself by others", "don't live by others' opinions", "don't live your life trying to be someone else", "follow your truth" -- this advice is everywhere. But "conform!", which is, if you think about it, very good practical advice, is nowhere to be found. Why? Does that tell us something about our culture -- that we're narcissistic? Or does it tell us something about our resistance to our culture or upbringing or the oppressiveness of our culture? 

There are structural pressures on advice as well. People seeking life advice are for obvious reasons looking for personal solutions, so introspective advice -- "discipline yourself", "be yourself", "know yourself" -- should be more common than, say, Chaucer's "out of thy stall!" or Aristotle's "contemplate the universe", by which he meant, learn to understand the world outside yourself. The human inclination to confirm rather than to investigate the full range and experiment predicts that people looking for personal solutions will end up with introspective advice -- "stress is the source of your troubles", for example, not "introspection is a waste of time and a harmful aggravation of one's troubles; engage with the world or with others and you'll forget your troubles soon enough". Advice seeking may be a self-selection of introverted, self-affirming gnomes. 

3. advice-givers, the politics of the self-help industry

Finally, trends in advice are unevenly distributed. The anti-woke espouse stoicism, which seems to be a philosophy constructed almost entirely of personal advice. Aurelius' Meditations is one gnome after another (and people love it for this!). The stoic trend is all very political, or more accurately, politically apolitical: its politics is, "don't be political; attend to yourself." It's consistent with a lot of Christian morality including "give unto Caesar what is Caesar's" and "my kingdom is not of this world", and you can see it in Jordan Peterson's confluence of stoicism and Christianity. And I'm not trying to be clever here. Peterson is quite clear that his apolitical advice is political, especially when it comes to social justice and Marxism. He is a partisan anti-Marxist and anti-woke. Those are strong political stands, as stubborn as political stands typically are. 

So the choice of gnomes, cultural and individual, tells us a lot. In the case of individual choices we get a picture of our political divides. I'm not sure what the cultural selection tells us, but a sociology of gnomes promises to be revelatory as all sciences are. 

4. gnomes vs practical advice

All of the above gnomiadry applies only to life advice. Practical advice doesn't have any of these paradoxical puzzles. This again shows just how strange and useless yet appealing life advice is. 

Gnomes can even be empty tautologies. "Be humble" might be the most common advice on how to overcome confirmation bias. It's a perfect question-begging rephrasal of the problem to be solved: "if you don't want to prioritize your own beliefs, don't". Here's a post on practical advice to overcome confirmation bias: use Bayesian reasoning by looking for the base rate; we tend to be oblivious to the unthreatening normal, so don't focus on the rare threat at the expense of the frequent; look for disconfirmatory evidence, not confirmation; question metaphors, assumptions and intuitions. That's a program. List the items and apply. Practical. 
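To make the base-rate item concrete, here's a minimal numeric sketch (the numbers are hypothetical, chosen only for illustration): a warning signal that sounds impressively accurate still produces mostly false alarms when the threat it signals is rare.

```python
# Hypothetical numbers, for illustration only: a "threat detector" that is
# 90% sensitive and 90% specific, applied to a threat with a 1% base rate.
base_rate = 0.01         # P(threat)
sensitivity = 0.90       # P(alarm | threat)
false_alarm_rate = 0.10  # P(alarm | no threat)

# Bayes' rule: P(threat | alarm)
p_alarm = sensitivity * base_rate + false_alarm_rate * (1 - base_rate)
posterior = sensitivity * base_rate / p_alarm

print(f"P(threat | alarm) = {posterior:.2f}")  # ~0.08, not 0.90
# Ignoring the base rate -- the frequent, unthreatening normal -- makes the
# rare threat look about ten times more probable than it actually is.
```

That's the program in one worked step: the base rate does most of the work, and confirmation-hungry intuition ignores it.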

An Instagram video gives this advice to those who ask "what kind of gloves should I buy for street calisthenics?". The answer:

"it doesn't matter what gloves you wear, it's what's in the gloves that matter." 

Now, that's a kind of life-advice, frequent among exercisers: it's a form of "just do it!" (It's also a faint hint of an insult: You are asking about trivialities. Be a man and get down to your 6,000 push-ups!) The contrary complementary advice to "Just do it!" might include "discipline -- struggling with yourself -- is a losing battle, but simply structuring your time to include exercise may help get you to it". But "What gloves should I buy" also has a practical answer without any contrary complement. Here it is:

For calisthenics you should buy gloves with a grippy palm on the outside and also on the inside so the glove will not slip on the bar and your hand won't slip in the glove. Warning: avoid cheap rubber that will degrade quickly into a sticky surface, which will stick the glove to the bar. This can be lethal if you wear them while riding your bike as the gloves will stick to the handlebars whenever you try to lift a hand from the handlebar, and that will unintentionally steer the bike (and you) unpredictably into a truck or over a cliff or other places you'd rather not be. 

Such practical advice has no contrary. It's not an empty tautology like "if you want to do calisthenics, just do it!" It's substantive, if boring, practical advice. It's the kind of good advice you look for in consumer reviews. Some pithy gnomes are practical: use it or lose it (referring to movement and flexibility exercises). There's no complementary contrary unless it's "die young and leave a good-looking corpse", which lies on the border of cynical warning or sarcasm. 

What's so interesting about gnomes is their bizarre paradoxical puzzle and appeal. Useless, its opposite just as useless, and yet appealing. It has the mystery of religion, hasn't it? What did Ivan's Grand Inquisitor say, "mystery, miracle and authority"; that's what we're all looking for. Gnomes, not the plain bread of practical advice.

Gnomiology also plays into cultural confirmation and confirmation bias. A lot of calisthenics advice extols discipline -- do it even though you don't like it. The complementary contrary is equally sound and maybe even more practical: "discipline is a battle with yourself that you will lose; if you enjoy something, you'll do it; if you don't enjoy it, find something that you do enjoy". Discipline, the strength to overcome one's weaknesses, is consonant with our gender norms. Enjoyment isn't. Discipline plays on the inspiration of being masculine, and, no surprise, that's a lot of what calisthenics is for. 

the morality of gnomes?

"Health is wealth", popular among the calisthenics movement, seems undeniable and without contrary until you realize that health actually isn't wealth, and lack of money-wealth statistically leads to poor health, and money-wealth correlates with better health outcomes. But there's no gnome, "wealth is health!" 

There's something morally prudish about all of this. "Fortune favors the cut-throat competitor" and "money will bring you health" seem too crass and crude, let alone "fortune favors an a-hole". They violate our be-nice morality. The cultural notions of wisdom and sagacity are inconsistent with crassness, however practical. Even "Just do it!" doesn't imply any harm to anyone. "Cut in line if you can" might have practical value, but unless you're Peter Thiel, you'd never publicly advise it. It's anti-social. And yet, "be yourself" is hardly social. It's again the self-selection of self-help that underlies much of the gnomiad. Is "nice guys finish last" advice, a complaint, a cynical condemnation of the world we live in, or a lament for our best moral advice? 

a dynamics of advice?

Contradictory pairs of advice are by nature structured as a decision tree. Go with daring and you've abandoned safe. Since time always moves forward, this decision tree looks Markovian -- that is, it goes from one position to another and can't go back without loss of time. 

It already seems that time plays a really important role, and generates its own advice. "Time is money." There's no regaining time. And there's no complementary contradiction to that one unless it be the empty truism "live in the present" or "don't fret over spilt milk", both imperfect complements. "There's no regaining time" is itself a truism. Does it imply "choose carefully" or "follow your gut instinct"? "Don't fret over spilt milk" also has a complement, "learn from your mistakes". 

But is the tree really Markovian? What if the choice of "be brave" applies only to youth? We need a push-down machine for this tree and a time tracker. 

Of course I'm plagiarizing Chomsky and Wolfram, but these are the tools available. Maybe we can discover more as we go along. 
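For what it's worth, here's a minimal sketch of what such a machine might look like: a toy pushdown stack of life contexts plus a clock that only moves forward. Every context name and advice pairing in it is my own illustrative invention, not a worked-out model.

```python
# A toy pushdown "gnome machine": contexts are pushed and popped like a
# parser stack, and the advice pair you get depends on the context on top
# and on elapsed time. All names and pairings are illustrative inventions.
ADVICE = {
    "youth":    ("fortune favors the brave", "better safe than sorry"),
    "career":   ("keep your eyes on the prize", "be flexible"),
    "marriage": ("carpe diem", "better safe than sorry"),
}

class GnomeMachine:
    def __init__(self):
        self.stack = []   # pushdown store of active contexts
        self.clock = 0    # time only moves forward (the Markovian constraint)

    def push(self, context):
        self.stack.append(context)

    def pop(self):
        return self.stack.pop() if self.stack else None

    def tick(self, years=1):
        self.clock += years   # no regaining time

    def advise(self):
        if not self.stack:
            return "no context, no advice"
        daring, safe = ADVICE[self.stack[-1]]
        # a crude time rule: daring early, safe later
        return daring if self.clock < 30 else safe

m = GnomeMachine()
m.push("youth")
print(m.advise())   # fortune favors the brave
m.tick(35)
m.push("career")
print(m.advise())   # be flexible
```

The point of the sketch is only structural: the stack captures nested contexts, the clock captures the one-way street of time, and every piece of advice comes out indexed to both.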

You can imagine what's next. Contexts other than youth/age can apply -- "daring" might apply to careers, "safety" in relationships or marriages. What about long-term goals, ends prioritizing all other decisions? And "keep your eyes on the prize" contrasts with "be flexible" and "life is what happens on your way to your goal". These also depend on time and context. At what point should you abandon a goal and "cut your losses" instead of "be constant, perseverant, dedicated"? 

Work on these contradictions begins to seem futile. All the advice is like conflicting religions. From afar, they all seem foolish since they each purport to be true, but deny each other. "Don't heed advice, learn for yourself!" of course is the paradoxical limit. It seems on the face of it an empty truism -- one can't learn for someone else without learning oneself. But there's always the contrarian "be skeptical" and its paradoxical complement and consequence "be skeptical of being skeptical". Aristotle the Wise: moderation in all things. It sounds like a cop out too good to be true, but moderation has the virtue of being without paradox. But he should have said, moderation in most things. Which things? We still need context. 

When asked for life advice in an interview, a famous actor replied, "I don't give advice. People have to face things themselves." Respect. 

entropy and truth

Why is simplicity the measure of truth in science? 

On the face of it, it's obvious. Of two theories, both equal in the accuracy of their predictions, the simpler is obviously preferable. 

Okay, but why is the simpler preferable? A friend says, "fewer parts". That's question-begging: "fewer parts" is one definition of "simpler". The simpler is preferable because it's simpler!! But why should fewer parts be preferable? Because it's simpler. I want to show that the "fewer parts", though it can't be the answer, is a key to the answer, but it's important first to recognize that there really is a problem with simplicity=true. Short answer: scientific theories are about probability, not truth. 

So let's get started. 

"Simple" and "true" are not synonyms. They appear to be quite distinct properties. And yet Occam's Razor and theoretical economy, sometimes called theoretical beauty, parsimony -- these are all expressions of simplicity in science, and in choosing between two theories, one complex, the other simple, we scientists insist that the simpler must be true or closer to truth than the complex one even if the two theories have the same predictions. Bu' Y?

There are other metrics as well -- the relation to other theories, that is, does the theory fit smoothly into other well-established theories? -- but even this is a kind of simplicity measure. Can the different theories be reduced to a comprehensive one or do we have to tolerate many sources of prediction for many categories of phenomena? 

Looking at emergent or higher-level sciences like psychology or semiology, where there really are many theories unrelated to the reductionist science of physics, the need for the emergent laws is also driven by simplicity. Reductionist physics -- like quantum physics -- can describe the shape of a statue's nose, but not explain why it has that shape. To explain how that nose came to be, we need higher-level sciences like psychology and anthropology and art history to explain the role of the artist and why an artist would sculpt a nose-like shape -- a question that quantum physics can't answer but that these higher-level sciences can, sciences that are in many ways independent of physics (upcoming post "information faster than light"). The emergent science -- whether psychology or the anthropology of arts -- is not adding complexity, it's winnowing away the irrelevant complications of the reductionist details. 

A reductionist physical explanation might work if it started with the origin of the universe, detailing every moment until it arrived at the sculpting of the nose, but such a description would not only be a huge task, it would have to develop all the emergent sciences along the way, and it would be those emergent properties that explain the nose, not the physics, so the entire project would be useless. We already have those emergent sciences, so it's just simpler to cut to the chase. It's the quandary of reductionism described in Borges' "On Exactitude in Science", where the map is as large as the city it maps. It's useless. Reductionism isn't useful in understanding emergent properties like statues and cities. It's mere description. We want explanation. 

And there's actually no reason to believe that physics alone would be able to reconstruct the emergent properties that explain the form of a statue. After all, historically the sciences of psychology or neurology did not begin with physics. They began with observing emergent entities like human behaviors and brains and cities and economies. Again, if we lived in a simplistic Laplacean dream, we could retrace the world to its original state, but that would take us many billions of years before we'd provide an explanatory chain for the entire development of that bronze nose, and we'd still need those emergent sciences.

Emergent sciences, though less accurate and less predictive than an entire reductionist history of the world, are vastly simpler. There it is again.

Notice that here simplicity overrides accuracy (truth), since the emergent sciences don't explain or even describe the microstates of the nose, just its emergent properties, the most important of which is that viewers will recognize that it's a representation of a nose. In fact, the microstates are often irrelevant. Whether the bronze has a high or low nickel content is irrelevant to the viewer.  The value of an emergent science is the simplicity of its understandings (commonly called "explanations") at the expense of truth. IOW, simplicity in emergent sciences is preferable to truth. lol

So I was puzzled, in my stupid, slow way, by this prima facie obvious equivalence between truth and simplicity in the sciences -- Occam's Razor, parsimony, elegance, beauty. It seems so obvious and natural, yet when you think about it, why should it be so? The more you think about it, the more you realize that truth shouldn't have any relation to simplicity. These two, simple and true, are clearly not synonyms, so why the relation? 

I like asking stupid questions. They often lead to the most surprising answers. Why does gravity work over distances? Space-time. Why does time go in one direction? Probability in an entropic universe. There are stupid questions about politics, and stupid questions about psychology and stupid questions about language. Their answers reveal the most wonderful surprises about political views, psychology and language. As I'll be posting soon, judging crowds out learning, and so does assuming -- and therefore forgetting -- the obvious. 

It occurred to me recently that there is an obvious answer to the simplicity metric in science: simple things are more probable than complicated things, ceteris paribus (all other things equal). In an entropic world, where even time's arrow is nothing more than the fact that the probability of increasing entropy is vastly higher than of decreasing (though I have questions about this in a system being fed by an energy source like the sun; why isn't the direction of time reversed in natural selection evolution from simpler high entropy to lower entropic complexity? -- seems like the time's arrow theory is question-begging too), a likelihood metric for the sciences makes perfect sense. Coordinating many elements is vastly less likely than coordinating fewer. The math is exponential for just the number of elements; if there's an ordering of the elements, the possibilities are even greater, and if there's any recursion in the elements, the possibilities are infinite. The simplicity metric is really just a probability metric in an entropic universe. It's not just fewer parts. It's that fewer parts are more likely to be coordinated, given entropy. This answer is not question-begging. It's a practical result. 
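A quick back-of-the-envelope illustration of that exponential claim (the number twenty is arbitrary, chosen only to show the scale):

```python
import math

n = 20  # twenty elements, a deliberately small example

subsets = 2 ** n               # ways to choose which elements participate
orderings = math.factorial(n)  # ways to arrange all of them in a sequence

print(f"{subsets:,}")    # 1,048,576 possible subsets
print(f"{orderings:,}")  # 2,432,902,008,176,640,000 possible orderings
# Hitting one particular coordinated configuration by chance is one in these
# numbers; every part you remove improves the odds exponentially.
```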

I should re-emphasize that the reason this simplicity metric works is not because simpler theories predict better. The question is, why prefer a simple theory in cases where a more complex one predicts just as well, exactly as well? That's the puzzle question. Now, it's also true that simple means or theories are more likely to predict better because nature prefers simplicity, but that is a different stupid puzzle, though the answer is the same: entropy.

And remember, you can always jimmy a theory to make it work. It'll get more complex, but it'll work. What's wrong with the theory, then, if it really does work? It's less likely that in an entropic world such complexity would be its foundation. That doesn't refute the complex theory. It just makes the theory less likely -- the theory will be less likely, not its predictions, which will be the same.

This explanation of why theoretical simplicity is preferable has several interesting implications. For one, it implies that we're not assessing truth in science at all. We're assessing likelihoods. I believe that those of us in the sciences already hold this view, but it might be good to be more explicit about it. Scientists are well aware that science is not a body of knowledge but an ongoing investigation, and any scientific theory is not The Truth but merely the best approximation to The Truth currently available. Maybe we should shift that a little bit: a successful scientific theory is the most likely theory currently available. No mention of truth is needed (but see this post on the necessary use of truth in conversation). It's just likelihood -- its predictions are more likely, but even when the predictions are the same as more complicated theories, its terms are simpler, so the theory itself is more likely than an equally predictive but more complex theory. It's all just probability. 
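One standard way to put a number on this is the Bayesian "Occam factor", which I'm borrowing here as an illustration rather than as this post's own argument: a flexible theory has to spread its probability over all the outcomes its free parameter could have produced, so when the data are exactly what the simple theory predicted all along, the simple theory comes out more probable.

```python
from math import comb

# Ten coin flips, five of them heads: both theories fit this data.
n, k = 10, 5

# Theory A: the coin is fair -- no free parameters.
p_data_given_A = 0.5 ** n

# Theory B: the coin has an unknown bias, uniform prior over that bias.
# Its marginal likelihood averages over every bias it could have claimed:
# integral of p^k * (1-p)^(n-k) dp = 1 / ((n + 1) * C(n, k)).
p_data_given_B = 1 / ((n + 1) * comb(n, k))

print(f"P(data | simple)  = {p_data_given_A:.5f}")  # 0.00098
print(f"P(data | complex) = {p_data_given_B:.5f}")  # 0.00036
# Same fit, but the simpler theory is about 2.7x more probable, because the
# complex one hedged across outcomes that never occurred.
```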

Another interesting implication: it aligns theoretical value with another basic of physics, time. Both can be subsumed under probability: time's arrow and theoretical simplicity. 

Both share a kind of puzzling obviousness. Why should time progress in only one direction? Why should truth be simple? Two stupid-sounding questions. How could these phenomena be any different than they obviously are? 

Well, they are not stupid questions and their answers seem to be the same or closely related. There is no actual physical law of time's direction, and nothing specially truer about simpler theories. It's all just the probabilities of entropy. And for time, the probabilities don't require any math to understand. It is vastly more likely that a dropped egg will break into a mostly random mess than that a broken egg will jump up and put itself together into the highly ordered egg state that took millions of years of natural selection to develop. As I learned when I was teaching undergrads, the possible wrong answers are infinite and the right answers are few, so two identical wildly wrong answers in the submissions of two students can be a red flag of cheating, if the exam question was not a trick. Entropy explains all, and it's all probability.

Talk about tricks, the title of this post is one. The post's actual content is about simplicity and entropy. Its intent is not to explain truth at all, but to replace truth in the sciences with probability. That's not to abandon all truth. There are two uses of truth (or "true"), one metaphysical, used explicitly in the sciences, and another tacitly used in communication -- in conversation. There are many posts on this blog dealing with the communication function of truth and "true", in machine learning, in the logic of speech and the logic of conversation. The subtitle of this blog (de-perplex) mentions "interactive cognition", and the communication function of truth is one case of it. The metaphysical or scientific sense of "true" can be replaced with probability, but the interactive, communicative use of truth cannot be abandoned. 

Probably.


 


the sociology of false beliefs

Background (if you find this familiar and boring, skim to the next section which is the meat of the post)

Here are two mutually distinct sets of believers in our society: believers in government propaganda (weapons of mass destruction, e.g.) tend to dismiss fringe conspiracy theories, and believers in fringe conspiracy theories dismiss government propaganda (obviously, since an important component of fringe theory is distrust of government propaganda). 

Can we map these false beliefs -- like beliefs in government or partisan propaganda and fringe conspiracy theories -- onto the social pyramid or other social places like gender or religion or urban or rural? Can such socially located false beliefs tell us about those social places and vice versa, do the social places tell us about the beliefs? Answering those questions is my goal: a sociological explanation for who believes what and why.

Why only false beliefs? A true belief can be held solely because it is obviously true or highly plausible. It could be held because the believer is biased, or because the believer is rational. But a patently false belief -- a belief that contradicts its own premises or conflicts with facts accepted by the believer -- can't be explained by the belief's rational consistency or perception of accuracy. The holding of it can only be explained by some choice, bias, ignorance or lack of rationality in the believer. False beliefs tell us more about the believer than the belief. 

How do we know which beliefs are true or false? In many cases it's not obvious but there are clear cases of contradictory beliefs. There are also beliefs or theories that are far more complex than a simpler theory (see the post "Entropy and truth" about the role of simplicity in theory choice). These over-complicated theories raise the question of motivation. Why choose the complicated theory over the simpler one? The answer should give us direct insight into the psycho-social mapping. 

Cognitive psychology, from Tom Gilovich and Daniel Kahneman to Jon Haidt, tells us that our choice of beliefs (e.g., Gilovich's How We Know What Isn't So, Kahneman's Thinking, Fast and Slow, Haidt's The Righteous Mind, also see the posts on this blog "Fool's Errand Attachment: a cognitive bias" and "the free will oxymoron" among others) is not rationally motivated at heart; we believe whatever we want to believe, regardless of facts. The crucial word is "want". There is always some emotional desire motivating our choices, which we hide behind a veneer of rational justifications to defend and hide our selfish motivations from others -- and from ourselves, since honesty is easier than lying, and deceiving ourselves makes it easier to deceive others. 

Our rationality seems to be in the service of two emotional drives, one a selfish personal interest, the other an interactive social interest. Our behavioral cognitive psychologist says a liberal Democrat might provide lots of rational justifications for being a liberal Democrat, but it's mere rationalizing, a public show hiding some deeper emotional commitment. It may be, following Jon Haidt, a seeking for approval from one's chosen peers -- the virtue-signaling of the liberal. Or following Robin Hanson, it's a creating of one's identity within one's social context -- holding a respected or prestigious ideology, in his words "good-looking ideas". Or it may be, following Rob Henderson, the luxury ideas that signal high socio-economic status. These are all interactive social drives. On the other hand, the political beliefs may be driven by cognitive dissonance over personal gain, not at all an interactive choice but a merely self-serving one: for example, I'm paid to teach in a public university, so I'm inclined to favor a candidate who endorses public higher education, and I then defend my choice by scraping around for other justifications for voting for that person. 

To the extent that we hold our beliefs for reasons other than their superficial public rational justifications, our beliefs are falsely held, regardless of the truth value of those beliefs. We should expect, then, to find many false beliefs held for no reason but personal and interactive reasons alone. If personal interests and interactive social interests depend on social identity, false beliefs should tell us about social locations including distinctions of class, rural vs urban location, even race or gender. The relevance to identity politics is obvious, but in this post I'm interested in two classes, the professional class (the educated elite) and the non elite or anti-elite(?) class. I'll start with the interests and emotional commitments of the professional class and their investment in partisan propaganda.

the professional class -- the virtue-signaling class -- and their beliefs

The following is a very broad-brush picture. Think of it as a kind of caricature. It's meant to describe a statistical inclination among the members of a class. I am well aware of the many many counterexamples to this description and the description in the following section. I beg the reader for indulgence. 

Members of the professional class can be identified easily. And quickly. Within five minutes of meeting one, they will tell you what they do for a living. They have a career that they are proud to announce, "I'm a partner with Bellstein and Whistle", "I teach at Superior T U" or "I'm a physician with a clinical practice in insurance-guaranteed pharmaceutical FDA-approved remedies and addictions -- a branch of government finance, you should know". Humor aside, they perceive themselves to be, and are, privileged. They live in safe, clean, well-served neighborhoods; their children go to fancy schools. They themselves attended fancy schools and have fancy degrees with fancy letters before or after their names. Their interactive position is one of prestige and social respect, and they are, and perceive themselves as, the beneficiaries of the many goods that their society affords. 

They are, therefore and crucially, deeply invested in their society that gives them so much, including respect. The last thing they want to see is a socio-political revolution in which they'd risk losing everything. They are the defenders of the faith in American democracy, progress, justice and equality. They want to believe that their government and social structure are basically good or at least, better than any alternative, or at the very least, capable of improvement and for that reason worthy of upholding. 

However, they also perceive that a vast non elite public does not reap all these social benefits. They have two responses to this inequality. One is self-justification, holding the belief -- perhaps false belief -- that they deserve their benefits. They worked hard for their fancy degrees; their education has given them an appreciation for fine culture, a superior understanding of the world, of politics and of economics and science, whereas, they believe, the non elite are without all of this. According to elite perception, the non elites don't appreciate culture, abuse and debase the heritage of the language, and understand so little of politics that they vote against their own material interests (apparently ignoring entirely that liberalism is defined by voting against one's material interests in order to benefit society as a whole -- see the post on Hochschild vs Stanovich and the liberal blind spot). They believe that the non elites are concerned only with creature comforts, so they are naturally lazy and unambitious, lacking all merit. At best, they pity them. 

That strategy is widespread -- you see it in the regular publication every couple of years of a new book describing how the plebes are destroying the English language. It's almost comical how something so trivial and misguided (I'll post on this soon, but Pinker did a great job with this in his The Language Instinct) is the source of so much pride, while ignoring that the liveliest, subtlest and most beautiful linguistic innovation is ceaselessly created at the social non elite bottom. I digress. It's the linguist in me. 

This strategy of superiority, though widespread and deep, does not, however, make the educated elite feel any less dickish. If anything, it makes them feel more dickish. They need another strategy to alleviate, resolve or hide the dick. It's virtue-signaling, social and financial liberalism, in Phil Ochs' 1966 words, "love me, love me, love me, I'm a liberal". 

As every news venue must find and cater to an audience market share to serve up to its advertisers, the NYTimes, Atlantic and NPR provide the desired information -- including false beliefs -- to the educated elite. The NYT reader reads the NYT not only because it has prestige, but also because it tells its audience what that audience already wants to believe. Your government, essentially good but imperfect, can be improved by voting Democratic. If you vote for the Democratic Party, it will take a bit of your taxes, not too much, but some, and redistribute it to The Poor. Then your conscience will be clear. And if this doesn't work and doesn't save the society that you get so much from, blame it on the evil Republicans. It's a tidy, neat package wrapped in a pretty blue ribbon. 

If the NYT reports government propaganda, the NYT reader will believe it. Weapons of mass destruction is the classic case. NYT readers believed this propaganda even though anyone with their eyes open knew it was a dog and pony show from the start. Kofi Annan knew it. Hans Blix exposed it. Nevertheless NYT readers believed it. And today, now that we know that the NYT failed to present the propaganda as absurdly false, the educated elite still read the NYT. NYT readers are the original victims of Murray Gell-Mann amnesia. The professional class -- the educated elite -- are prone to false propaganda beliefs. It's their emotional investment in a society that provides them with all good things, especially their interactive social value -- respect -- that drives their beliefs. It's their need for virtue and virtue-signaling to cleanse their conscience. 

So much for the professional, educated elite class. 

[Once, when I was arguing that the educated elite have high voting rates because they believe in the voting process, the person I was talking to objected that there are too few of them to make a difference in elections. It dawned on me that on the street "elite" now refers to a cabal of CEOs or a small cohort of Skull & Bones members or Bilderberg members or Illuminati. Conspiracy theory has overtaken the language. In this post "the educated elite" refers to the professional class and those that identify with it or aspire to it.]

the non elite distrustful and respect-signaling class

The non elite have no need for virtue-signaling. They do not perceive that they are privileged within their society. Many actually do get a lot from the government, but it is in the form of subsidies, so they can't claim them for respect. On the contrary, subsidies are signs of inequality, of the injustices of the society. Subsidies do not argue for an investment in the society; they demonstrate the injustices of its economic structure.  

What the non elite want and don't get from the society is respect. They know that the professional class looks down on them. They don't have careers, they have jobs. They have subsidies, not fancy degrees. It's in this world of want that conspiracy theories and distrust of government thrive. Respect is available only from their peers. The essence of conspiracy street cred is "Those educated professionals, they believe government propaganda. I know better. The government at every turn is lying to them and they believe it. Not me. They vote, those chumps. They've been had, losers." A flat-earther once used those very words to me. "You think the earth is round? You've been had." 

The sense among the non elite or anti-elite that government is inaccessibly remote is a kind of structural violence. Not physical, but a psychological violence. Its consequence is this response: distrust of government and social distrust of "the elites", the perennial target of their conspiracy theories. 

the outcome

There are many false beliefs that can be mapped onto social spaces and class structure. The post on the Liberal Blind Spot focuses on religion and fictional political narratives in urban vs rural social locations. The current post has focused only on the mutually exclusive government propaganda beliefs and conspiracy theories among the educated elite and the non elite respectively. 

This mapping suggests that beliefs are reflexes of social place, that properties of social place -- especially degrees of prestige and respect, but also material wealth and perception of inequality -- incline towards specific beliefs to resolve cognitive dissonances (having more wealth than others, getting less respect than others). 

There are other reflexes besides propaganda beliefs and conspiracy beliefs. Those who believe propaganda dismiss conspiracy theories as stupid and dangerous. Those who believe in conspiracy theories dismiss propaganda believers as fools. 

There's also a behavioral reflex. The virtue-signaling class expect government to solve social problems, and they support change. They are prone to the Fool's Errand Attachment bias.

Conspiracy believers do not expect progress. Their baked-in distrust expects only harm from legislation and the corporate world's manipulations, harm in the form of agricultural poisoning, vaccines, surveillance, indoctrination, war. They are political fatalists (see the post on the Fool's Errand Attachment) and are naturally drawn to Reaganite neoliberalist ideology of small government and deregulation. 

However, the non elite are often found helping the people around them. Who is more dangerous, the non voting flat-earther sharing lunch with a homeless migrant and finding him a discarded winter coat, or the voting, unquestioning NYTimes reader, leaving the white cloister to join a protest march? Obviously, each thinks the other is more dangerous. Outside of the US, especially among the ordinary people in the so-called global south, the US Democratic Party is commonly viewed as extremely militaristic, dangerously aggressive and violently murderous. Something for the liberal Democrat to think about. 

Friday, March 14, 2025

true but wrong

My favorite statistic: 

Some years ago I came across Citigroup's Global Economic Outlook & Strategy. In it I found a page on revolutions! According to their research, a revolution in a nation will on average induce a 14% decline in that nation's GDP. And if it is coupled with a civil war, the dip is likely to be twice as great!! 

I love this statistic. I think it's safe to say that this decline in GDP is hugely important and no doubt more or less true. And yet it also misses everything important about a revolution. From the perspective of the revolutionary and social justice, it's strictly irrelevant. It's totally true, and utterly wrong.

When we consider theories and explanations and understanding, do we check to see whether we've got the right truth or the wrong one? Is there a bigger picture than the narrow truth? Is there a biggest picture? 

I know it's common to complain that, say, scientific theories are too narrow, that they may be true, but still wrong. I think these complainers are the narrow ones. Their view of the potential of the sciences is limited by their anti-scientism. There are many narrow scientific theories, but I think there are also scientific biggest pictures. 

The Citigroup statistic is narrow truth. The justifications for a revolution are also a narrow truth. A bigger picture is, obviously, a cost-benefit-plus-risk analysis, weighted by some values like justice, human rights etc. You can maybe see when the biggest picture is intuited by the public. If the regime is extremely repressive, it can be because it knows the public wants to take that risk. But a small radical military faction can also induce extreme oppression from the regime. A civil war following a revolution might also indicate a public rejection of that radical faction. Everyone involved is intuiting what Citigroup has measured in a number. Of course, Citigroup's statistic is addressed not to regimes nor to the public nor to the revolutionary. It's addressed to the investor, indifferent to them all. 

The anti-scientismists complain that science can't give us our values, our aesthetic sense, our emotions. But surely everyone knows that's not so. As E. O. Wilson pointed out, our moral values are the ones that natural selection selected for us as a social species, the values that conduce to social species survival. Same with emotions, and there's a post here on why mathematical beauty is beautiful to us, and it's all about natural selection of reliable signals. So could we retire this anti-scientismism? 

Ah, but once you know the scientific explanation for some value, will you still hold the value? That's a real problem, since some illusions are practically helpful and useful. Cleave only to truth and you're left with no reason to live at all. Except for your naturally selected values and emotions. 

So I say, don't worry about truth undermining your values. The values that science shows you are yours are in fact the ones you want. Don't worry, be happy. I mean, don't harm yourself, be engaged in something engrossing outside yourself. 

Saturday, February 15, 2025

"jones"'s four corollaries to Brandolini's Law (aka The Bull Shit Asymmetry Principle)

For those who don't know, "Brandolini's Law" is a tongue-in-cheek so-called law, akin to Godwin's Law and Murphy's Law, frequently applied to fringe conspiracy theories. It reads like this, according to Wikipedia: the energy required to debunk a stupid idea is orders of magnitude greater than the energy it takes to produce it. It is also known, again according to Wikipedia, as the bullshit asymmetry principle, coined by Alberto Brandolini, an Italian programmer, after listening to an interview with the former Prime Minister Silvio Berlusconi.

Here are four corollaries that seem to me consequent to his "Law": 

1. The asymmetry is in inverse relation to the quantity or availability of evidence. The less evidence available for or against the stupid idea, the harder to debunk. 

If you can't practically go to the moon, you can't easily demonstrate that stars don't appear on camera there. Instead you have to rely on an elaborate explanation of optics on lunar-like surface locations. A conclusive and thoroughly persuasive explanation might require a comprehensive understanding of physics, a comprehensive understanding that might take years of effort and math. And even if the debating parties were willing to pursue that study, considering the resources of intelligence immediately available to the discussants, it's probably impossible. People will believe what they want to believe regardless of any facts. Why should they bother to learn all that background? "Learning? F*** no!"

What's motivating the preference for one or another belief about the moon landing? Here are two posts explaining these preferences: the sociology of false beliefs and the Gates-Musk paradox: conspiracy theories are not about what you think they are about.

To extrapolate this corollary (corollary 1.a): where there is no evidence, the orders of magnitude of difficulty reach to infinity, iow, impossibility. This leads to another corollary:

2. Corollary 2 is the flip side of (1): wherever there is no easily available evidence, therever will be a playground for stupid ideas and hoaxes to fill the vacuum of information. 

Consider the difficulty of proving that China doesn't really exist -- that it's just a gov't conspiracy. You'd have to persuade people that all the local Chinese immigrants and tourists, along with everyone who claims to have visited China and returned, are in on the conspiracy, that all the books on China are fabricated, the newscasts from "China", etc., etc. And the risk would be high. Anyone can just buy a ticket and go visit to prove your conspiracy hoax is false. So where there's evidence, hoaxes will be few and stupid ideas only at the very extremist end of stupidity. 

However, it does not follow from (2) that wherever there's lots of evidence, therever will be no conspiracies. Corollary 2 does not lead to an infinite conclusion as (1.a) does. A stupid idea can always either twist evidence in favor of a theory, or find some coincidence that supports a theory. This is one reason why confirmation bias is so intransigent -- it's so easy to find supportive evidence for almost any theory. Fringe conspiracy theories, like the poor, will always be with us. 

A flat earther tells me that maps have historically always been flat. Proof! If I point out that all the portraits of George Washington are also flat, but he certainly couldn't have led the armies if he were flat, the response I get is, there are statues of Washington. And if I argue that there are no statues of Hitler :-), the response I get is, yes, Hitler was flat and black and white too. He's a gov't fake. 

The persuasiveness of a belief depends on the degree and source of distrust and the sources of information. Paranoia is extreme and often contradictory, but not unwarranted. It's the difference between confirmatory evidence, of which the paranoid has plenty, and disconfirmation, which the paranoid either hasn't thought of or hasn't been exposed to. 

3. This is a corollary of (2). Wherever there is little evidence available, therever will be a playground for real conspiracies, that is, propaganda. What's an opportunity for the goose is just as rich an opportunity for the gander. 

4. There's yet another corollary to (1). Where there's no evidence, or no practical evidence (like the moon), there the debunking of a stupid idea with evidence is a waste of time reaching towards an infinite waste of time, and anyone engaging in debunking the evidence of such a theory is infinitely foolish. 

That's not to say that debunkers should give up on stupid hoaxes. Disproofs don't depend only on empirical evidence or even on best theories of physics. Many hoaxes rely on internal contradictions. It's just a matter of identifying them. They are not always obvious, often because the absurdities of the conspiracy focus attention on the details set out by the conspiracy. Conspiracy theorists too easily set the terms of discourse. Accepting the assumptions of the theory is an easy trap to fall into.

The moon landing is a case in point. The temptation is to debunk the empirical evidence about stars and flags waving in "the breeze". It's easy to miss the contradictory assumption in the conspiracy itself, that gov'ts produce fakes for the purpose of propaganda. Take that seriously and you've got to ask, why didn't the Soviets (or as we used to say, the Russians -- a telling and interesting expression that has been overlooked since the collapse; it implies that we intuited that the CCCP was really an extractive Russian empire more than an ideological alternative, or maybe that was our gov't's propaganda) produce their own fake in response? Really. It was recognized across the globe that the Russians were ahead of the US in the space race. They'd sent the first satellite into orbit, the first animal (a dog), the first extra-orbital object, and the first human into orbit. Anyone would have believed a Russian photo of Russian cosmonauts carousing on the moon. And their photo could have been even more persuasive, correcting and improving the US photo!

This question -- why the Soviets didn't produce their own fake landing -- takes the conspiracy assumption seriously and renders it, if not contradictory and inconsistent, then certainly stupid. 

I once asked a dyed-in-the-wool conspiracy theorist why the Soviets didn't produce their own fake. His emailed answer (I don't know how long it took him to figure out the answer) was, if they had produced a fake, they'd be risking exposure! 

Do I need to point out that his answer is the reason to believe the US landing was real? That if he truly and fully credits that answer, then he can't also believe the US faked its landing? More accurately, he can't be a consistent thinker and hold onto the fake landing belief without some ad hoc extenuating belief like the US gov't doesn't care about being exposed even though the Soviet gov't does.

An even more interesting question is why anyone believes the conspiracy when the premise of the theory leads to an obvious inconsistency. One could salvage the theory by insisting that only the US practices propaganda, but you'd need an additional explanation for why that would be so. IOW, salvaging the theory only complexifies it, making it less probable. I've got a post on probability and truth ("entropy and truth") and another on why the lack of any Soviet fake moon landing is a really interesting puzzle regardless of whether the US landing was real or fake ("where are all the missing fake moon landings?").

Also notice that his answer is a thinking answer. Yet he doesn't think to apply it to the US fake -- he doesn't apply it to his own view. That's the character of human thinking -- we use it when we want it and only then. That's why conspiracy thinking is so important for us to understand ("the sociology of false beliefs"). It shows us the general character of our thinking in bold relief. 

Thursday, May 16, 2024

Musk again? A lesson in inefficiency and ambiguity

Originally published on Language and Philosophy, September 1, 2023

A friend, explaining why he admires Elon Musk, describes the efficiency of Musk’s auto-transport tunnel: subway train cars are expensive because they can’t be mass-produced, they require large tunnels whose boring requires exponentially greater energy for every increment of diameter, and trains must support rush hour capacity even on off hours, whereas autos are mass-produced cheaply on the assembly line, they require little tunnel space, and on off hours, only the occupied ones run. He concludes that the Musk tunnel is efficient.

Well not so fast, pun intended. And there’s an important linguistic and conceptual lesson to be learnt from that lack of speed.

The carrying capacity of a packed subway/metro car is about 250 people. The average train is eight cars, so about 2000 people running, in NYC, every two to five minutes in rush hour. Off-loading that efficiency onto autos would require about 2000 autos every five minutes, or about 7 autos per second. A car would have to have 1/7th of a second to load onto Musk’s tunnel platform to meet rush hour demand. Not likely.
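The arithmetic, spelled out with the post’s own round numbers:

```python
# Rush-hour throughput of one subway line vs. single-occupancy autos.
people_per_car = 250
cars_per_train = 8
headway_seconds = 5 * 60          # a train every five minutes (the slow case)

people_per_train = people_per_car * cars_per_train   # 2000 riders per train
autos_needed = people_per_train                      # one person per auto
autos_per_second = autos_needed / headway_seconds

print(people_per_train)                 # 2000
print(round(autos_per_second, 1))       # ~6.7 autos entering per second
print(round(1 / autos_per_second, 2))   # ~0.15 s per auto -- the "1/7th of a second"
```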

Even if that were possible, consider an on-ramp for a midpoint station B from destination A to C. In rush hour, there’d be a constant flow headed from A towards C.

A–>——B–>——–>C

To accommodate cars coming from station B, that traffic flow would have to slow down to allow a car to enter into the single-lane traffic of the tunnel, and the on-ramp itself would have to be backed up for the one-by-one entry onto the tunnel, even if this were all automated. A subway train typically has dozens of stations between its endpoints. Imagine the traffic jam in a single lane tunnel at rush hour, each car containing one person since the 2000 people in each train have now been distributed into individual cars. Depressing.

In other words, the Musk tunnel can’t handle rush hour traffic if used for a metro system. It would probably be so slow that either long lines would drive people away, or it would be priced high enough to prevent long queues. You don’t need to think long and hard to figure out which option will be the business model. It only makes sense as a luxury shuttle from airport to hotel district for the business class, with a discount during off hours.

And what does the subway/metro serve? The entire public, the society as a whole, the economy that serves everyone. The inefficiencies of the system are sacrifices to the priority it serves. And it is fast. Very fast by comparison!

Tunnels are not new. We know when and where to use them efficiently for the public. There’s a reason why we use tunnels only for A to B destinations, like midtown NYC to Weehawken — nobody needs to get out in the middle of a river. There are projects that serve money, others that serve the public, and some that serve the vanity of their promoters’ public image.

Here’s the lesson: a grotesquely inefficient idea implemented efficiently remains grotesquely inefficient. Its efficiencies are intended to create more profit for the owner. So of course there will be efficiencies in any business model. That does not at all imply that the business purpose is efficient.

This notion of “efficiency” is trading on an ambiguity between the efficiency of the business idea (Musk’s is not) and the efficiency of its implementation (conveniently for his profit). The inefficiencies reveal the respective purposes: serving the public, the society, the economy as a whole, or making money serving a luxury economy. This is why Adam Smith despised the wealthy — their servants could have been employed in the productive economy that serves all, but instead were wasted on serving wealthy individuals with no benefit to the nation’s economy, the wealth of the nation.

And it’s a lesson for the libertarian: because the market supplies demand, in unequal societies the wealthy will be served more and better in ways that are inefficient for the economy as a whole and the nation. Smith’s book is not entitled “The Wealth of Individuals” but “The Wealth of Nations”. Remember that.