The psychological consequences of equality

Our nitwit species has never overcome its own cleverness. If we find an idea or symbol or image that appears to compel people, we’ll use it — and worry about the consequences later.

Equality is a powerful symbol to use. It conveys inclusiveness, and an automatic sense of group bonding. “We all agree we all should be equal, right? Now all we have to do is crush those who disagree!” It’s also a neat way to institute a witch hunt. If your neighbor doesn’t believe in equality, maybe you deserve his farm.

But those well-worn (at least here on this blog) paths give way to a more interesting question: what are the psychological consequences of equality? In other words, does it make our brains healthier, and is it a good interface to life? Here are two problems with equality as it impacts our psyches.

  1. External focus
    • If we are all equal in value, then there is no way to distinguish ourselves except by our appearance. It’s like trying to make hamburgers interesting again. Put an avocado on the hippie one, arugula on the yuppie one, and a slab of ironically wholesome cheese for the hipsters. Your social rank is your burger. A bacon cheeseburger? You’re not as elite as someone with an arugula, avocado and feta burger.
    • Because we must assume others are equal, we cannot demand that we be measured by the content of our personalities instead of our external traits. We are interchangeable parts, not individuals who determine themselves from within. If you start asking that we be judged on moral character, intellectual ability, honesty and sincerity… well, that ruins equality, because we cannot look at you from a distance, see you are human, and figure you are equal. It would force us to engage with life, and that scares us.
    • Since we are all equal in value, and we cannot look within, external traits are how we draw attention to ourselves — and since others are doing it, we must all compete with them. In a mass of equal people, the person who figured out a unique and ironic hat stands out; this person is noticed, which advances their business, social and romantic prospects. Since there are few things not thought of before, this requires that we embrace oddity and ugliness, like modern art and freak shows, and correspondingly become more “tolerant” so we can pretend we like them.
  2. No striving
    • If we’re all equal and are going to get equal treatment, the reward has come before the labor. We now feel entitled to things and status, instead of seeing them as rewards for our contributions. As a result, everything we do becomes backward: we assume we belong, and therefore that whatever we do is right, but then we try to justify those actions by proving to others how altruistic or moral or unique/ironic we are.
    • Since equality is the goal of the society, rising above equality is socially problematic. So instead of striving to make ourselves better internally, or to contribute in ways that might cause conflict at all, we focus on making life more comfortable for ourselves. This inevitably involves selfish actions like retreating to the suburbs, buying an SUV, and turning up the volume to drown out the other equal people.
    • If equality is the norm, an attitude emerges which finds those who want to refine themselves or improve on anything but their material circumstance to be “elitist,” and that’s a problem since most equality-based societies exist after revolutions against the elites. You don’t want to raise your head above the herd, or it might get cut off. Don’t strive, except for the material comforts we all agree (equally) are important; coincidentally, these material comforts create the most waste and use the most energy.

It’s an interesting way to view the situation. If we could step back from our modern lives, we could see how simple it all is. There were revolutions, and we are obligated to consider them an absolute Good, in the same way religion defines Good and Evil. The revolutions aimed for equality because they wanted to overthrow hierarchies. Now you either obey the official revolutionary dogma, or you are considered an enemy of equality, and possibly destroyed.

You’re oblivious, dear parents

Every now and then someone from the adult world stirs themselves to study kids, and finds out what we all knew: adults and children live in different realities.

You know how at this blog we always talk about multiple factors being considered at one time, as if it were an essential cognitive tool? Check this:

1. Kids are clueless in certain ways
2. Adults are oblivious to certain things they must endure
3. Kids are aware in ways adults are not
4. Adult experience brings an awareness kids cannot have

All four are true — at the same time — which doesn’t invalidate either experience, but points us to where we should look.

A surprising number of teenagers — nearly 15 percent — think they’re going to die young, leading many to drug use, suicide attempts and other unsafe behavior, new research suggests.

The study, based on a survey of more than 20,000 kids, challenges conventional wisdom that says teens engage in risky behavior because they think they’re invulnerable to harm. Instead, a sizable number of teens may take chances “because they feel hopeless and figure that not much is at stake,” said study author Dr. Iris Borowsky, a researcher at the University of Minnesota.


Well, no kidding.

Our species cannot decide whether global warming will kill us or not happen at all.

Our species is tolerant of its criminals, parasites, etc. but never fails to go out of its way to bash down the one who rises above the crowd.

Our culture is garbage. Madonna, Michael Jackson? You’re kidding, right?

Our leaders are whores and the voters are even dumber whores who are content to be led with lies, because they cannot face difficult or complex truths.

Our media is full of fears, our leaders control us with fears, and worst of all, everyone around us appears oblivious to long-standing problems in our society — environment, racial conflict, crime, corruption — because these aren’t polite to mention.

Humanity has slipped into its own world, a world ruled by social devices and the avoidance of conflict, and as a result, cannot face reality.

At all.

Kids see this, because it’s new to them and they’re very afraid of these adult things they see coming down the pipe.

Adults survive by making polite commentary and ignoring problems, even though they have to know that eventually this mess will blow up in their faces… or in someone’s face, at least, because in fifty years these adults will be dead or on their way, and at that point, why should they care? (Obviously I disagree.)

So on to the next shocker:

American adults from young to old disagree increasingly today on social values ranging from religion to relationships, creating the largest generation gap since divisions 40 years ago over Vietnam, civil rights and women’s liberation.

A survey being released Monday by the Pew Research Center highlights a widening age divide after last November’s election, when 18- to 29-year-olds voted for Democrat Barack Obama by a 2-to-1 ratio.

Almost eight in 10 people believe there is a major difference in the point of view of younger people and older people today, according to the independent public opinion research group. That is the highest spread since 1969, when about 74 percent reported major differences in an era of generational conflicts over the Vietnam War and civil and women’s rights. In contrast, just 60 percent in 1979 saw a generation gap.


Remember how above I said all four factors were true at once? Kids are clueless about life and adults are oblivious to some things kids see, but kids are also inexperienced, whereas adult experience can be useful.

One of the biggest confusions we have is that kids are really good at spotting the elephant in the room, but their solutions are amateurish. Inexperienced, they tend to defend the individual, because they interpret the world personally. “It’s trying to get me,” they think, because they’ve been raised at the center of their own universe by their parents, and now they’re having to adapt to the fact that the world doesn’t care. It just does what it does, and if you get snared, oh well!

So now adults and kids not only exist in two different realities, but are heading toward different polarized political views, one of which is liberal and one of which is reactionary.

And all these confused people vote.

Why I don’t buy Apple

Most posts on this forum are not what I’d consider opinion pieces; they’re descriptions of knowledge about what will happen in certain circumstances, not prescriptions as to what should happen.

However, in this post, I’m going to describe why I detest Apple Computer, Inc. and will not buy any of their products, least of all a Macintosh computer. Ever.

So it looks like my Macbook Pro hates me. My monitor won’t display anything even though the computer is on. I can even log in and turn the volume on and off. I can hear my email sound and everything…but the monitor just doesn’t work.

{ pause for about 24 hours }

So I went in and he went through all the simple resets and tests that I had already gone through and he told me it was the logic board. I asked him to check if it was the NVIDIA defect and he did. Wasn’t that. Either I pay 1200 for him to fix it in store or I pay 300 to send it away. Lame but I guess I have to send it away.


This guy bought what’s probably a $1500 laptop and is now being told that he can’t get it fixed locally for a halfway decent price; he has to send it off and wait a month for it to return. And why has the machine blown out?

The motherboard has failed.

This seems to happen to Apple machines quite a bit. You won’t find much mention of this in the spammy internet, but starting around the time of the Macintosh II, Apple began taking shortcuts with its motherboards. It mounted some directly on the plastic of the case, and with others, used daughterboards in odd configurations, or used sub-standard power supplies.

The result is that Apple computers have been blowing motherboards since 1987.

The company has no incentive to change this because they’ve got their audience on the hook. Apple’s marketing is like a microcosm of modern society: they convince you to buy the product for social reasons, surround you with people who chant blank-eyed about how great it is, and then hook you… if you want to be cool like us, you need to keep buying Apple stuff.

Even back in the 1980s, the Apple fanbase was notoriously dishonest about how often their machines failed, or how poorly they stacked up against other machines. Apple users were even banned on several Houston BBSs because they couldn’t stop telling everyone else how inferior everyone else’s machines were.

What causes this? First, the ego hook: Apple is the hip company (remember those “1984” ads?). Second, the price hook: you just paid a lot more for this thing. It better be good! But if it’s not, what are you going to do… lower your social status by admitting you didn’t buy the luxury brand, Apple?

So Mac users buy their machines, take them home, and when the thing blows up, the repair price is usually the same… about 75% of the cost of a new one. What would you do in that case? Of course, you buy the new one, and start the depreciation curve over.

Or if you’re like this poor gent, you send it off for the $300 repair and see it again a month later. Back in the 80s, they used to repair machines with refurbished motherboards, which meant the machines often failed again, got sent away again, and weren’t seen for another month. After several months of no computer, that $1200 starts to look cheap.

The MacBook Pro 13″ has a 6-bit display. That means it cannot really display millions of colors. Yes, on Apple’s website it claims it can “support millions of colors,” but what they don’t tell you is that it does so through a process called “dithering.” Any designer knows what that means. Anyone else: it means the screen alternates closely related colors in a pattern in order to give you the perception of a blended color.

A few years ago, a few individuals started a class-action lawsuit against Apple for advertising millions of colors with their 6-bit displays. Unfortunately, they needed a “class” for a class-action lawsuit, and not enough people cared/noticed. The matter was settled out of court.

You already know I’m a designer, so you know how important color is to me. An 8-bit screen such as my 30″ Apple Cinema Display is able to achieve 16.7 million colors. A 6-bit MacBook Pro screen? 262,144 colors. That’s roughly 60 times fewer colors. That means for all of those colors it can’t display, it blends with nearby pixels. This is just embarrassing and unacceptable.

Louie Mantia
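
His arithmetic checks out. A minimal sketch (Python, for illustration only) of where those numbers come from:

    # Total displayable colors for an RGB panel at a given bit depth per channel.
    def colors(bits_per_channel):
        return (2 ** bits_per_channel) ** 3

    six_bit = colors(6)     # 2^18 = 262,144
    eight_bit = colors(8)   # 2^24 = 16,777,216, the "16.7 million"

    print(six_bit, eight_bit, eight_bit // six_bit)
    # 262144 16777216 64 -- the "roughly 60 times" fewer colors is really 64x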

They’re able to do this because of the difference between appearance and reality. If they can forge a fake appearance that seems to compliment you and raise your social status, then, like a crack addict, you’ll do anything to keep it up. That means shouting down others who don’t agree.

Since there are enough of you to cause problems for anyone trying to launch a product, career, or even just have friends, people learn to be quiet. And so the illusion spreads. Just like in our modern time, when we have a decentralized totalitarian state, where sacred dogmas are chanted at each other and those who disagree are seen as the modern untouchables.

It’s a mental control structure that’s hard to shake, isn’t it?

What we need to fix as a species

The “problem” with humanity, if you want it in a nutshell, is that we can choose what to believe and we can choose to ignore a necessary activity for a fun one.

While we might expect that behavior from orangutans and chimpanzees, our closest relatives, we also see it all the time in humanity.

As Matthew Thomas says in his classic article, “Why free software usability tends to suck”:

Volunteers hack on stuff which they are interested in, which usually means stuff which they are going to use themselves. Because they are hackers, they are power users, so the interface design ends up too complicated for most people to use.

The converse also applies. Many of the little details which improve the interface — like focusing the appropriate control when a window is opened, or fine-tuning error messages so that they are both helpful and grammatical — are not exciting or satisfying to work on, so they get fixed slowly (if at all).

MPT (archived)

Translated from his somewhat delicate reference: people only do what they find fun.

Of course, this is a powerful motivational tool, if we can make things fun. But some things just aren’t going to be. Our current means of controlling that is an economic system where some get to live the life divine and do the fun stuff, and others do not. Mostly, it sorts them by competence, so it works better than the alternative, which is state-assigned jobs and uniform rewards (raw socialism).

But there are still tasks that need doing, if we want our tools and technologies to be top notch.

It’s about completion: any job undertaken needs to be completed in whole, including the interface, the difficult work of long-term design, and the ancillary effects.

Even more than “fun,” we have a problem in that we can choose — using our big brains — to deny ideas or evidence that we find displeasing. Witness:

I was in Calcutta when the cyclone struck East Bengal in November 1970. Early dispatches spoke of 15,000 dead, but the estimates rapidly escalated to 2,000,000 and then dropped back to 500,000. A nice round number: it will do as well as any, for we will never know. The nameless ones who died, “unimportant” people far beyond the fringes of the social power structure, left no trace of their existence. Pakistani parents repaired the population loss in just 40 days, and the world turned its attention to other matters.

What killed those unfortunate people? The cyclone, newspapers said. But one can just as logically say that overpopulation killed them. The Gangetic Delta is barely above sea level. Every year several thousand people are killed in quite ordinary storms. If Pakistan were not overcrowded, no sane man would bring his family to such a place. Ecologically speaking, a delta belongs to the river and the sea; man obtrudes there at his peril.

In the web of life every event has many antecedents. Only by an arbitrary decision can we designate a single antecedent as “cause.” Our choice is biased — biased to protect our egos against the onslaught of unwelcome truths. As T.S. Eliot put it in Burnt Norton:

Go, go, go, said the bird: human kind
Cannot bear very much reality.

Were we to identify overpopulation as the cause of a half-million deaths, we would threaten ourselves with a question to which we do not know the answer: How can we control population without recourse to repugnant measures? Fearfully we close our minds to an inventory of possibilities. Instead, we say that a cyclone caused the deaths, thus relieving ourselves of responsibility for this and future catastrophes. “Fate” is so comforting.

Every year we list tuberculosis, leprosy, enteric diseases, or animal parasites as the “cause of death” of millions of people. It is well known that malnutrition is an important antecedent of death in all these categories; and that malnutrition is connected with overpopulation. But overpopulation is not called the cause of death. We cannot bear the thought.

Garrett Hardin Society

What is the result of our ignoring the cause/effect relationships in reality? We pick whichever causes are comforting to our notion of the personality as being in control of its world, and then we declare those important and the rest not.

The resulting focus on the “thing-in-itself,” viewing objects in isolation rather than as parts of the larger context that gives them their roles, allows us to deal harshly with immediate problems but completely ignore anything with a long-term consequence.

As Rowan Hooper wrote in an excellent article called “Is Earth set to go silent in the next hundred years?”:

But in his conclusion [Rees] got into truly cosmic realms, by offering his answer to a question he is often asked: Does astronomy offer any special extra perspective on our terrestrial lives?

Astronomers can set our home planet in a vast cosmic context: a backdrop of millions of galaxies, each containing billions of planets.

And we know that every atom in our body was forged in an ancient star somewhere in the Milky Way. We are literally the ashes of long-dead stars – the nuclear waste from the fuel that makes stars shine. To understand ourselves, we must understand the atoms we’re made of – but we must also understand the stars that made those atoms.

But there’s something else that astronomers can offer: an awareness of an immense future. The stupendous timespans of the evolutionary past are now part of common culture. We’re the outcome of more than four billion years of evolution. But most people still perceive humans as the culmination of the evolutionary tree. That hardly seems credible to me as an astronomer.

Our Sun’s less than half way through its life. Darwinian evolution surely hasn’t run its course. Any creatures witnessing the Sun’s demise 6 billion years hence won’t be human – they’ll be as different from us as we are from a bug. Posthuman evolution – here on Earth and far beyond, organic or silicon-based – could be as prolonged as the Darwinian evolution that’s led to us – and even more wonderful.

Rees ended by taking the viewpoint of an alien that had been watching our planet. It would, he said, have seen carbon dioxide in the atmosphere rising “anomalously fast, due to burning of fossil fuels”.

Will these hypothetical watching aliens see the Earth go silent in the next hundred years?

This brings us back to the question: what is the human dilemma that keeps us from seeing and acting on these problems?

As Hardin points out, we tend to pick and choose where we attribute cause. It’s much easier to blame the cyclone, which was the immediate prior act, than the situation that allowed the cyclone to wipe out so many. Similarly, it’s easy to finger government, a vast conspiracy (if you’re a leftist, it’s racist white male capitalists; if you’re a rightist, it’s anti-white socialists) controlling society, the rich, the poor, etc.

Could it be humanity’s epitaph will be six billion voices chanting in unison, in every language, “It’s not my fault”?

Could it be the solution to our problems is one that we’ve overlooked because it’s so obvious — to stop being polite about truth, to insist on it, and to insist on a design-level look at cause and effect?

That will offend many, but we presume it is a lesser fate than extinguishing ourselves.

The culture of non-culture

So we’ve had some celebrity deaths, and like all things they come in threes, although science can’t explain that. Granted, science is also still not sure if eggs are good for you, if we’re all biologically the same, or what quantum theory underlies all matter. But scientists will arrogantly tell you The Absolute Truth™ nonetheless.

The trifecta of celebrity mortality is complete: Farrah Fawcett, Michael Jackson, and Billy Mays.

A pin-up, a jingle writer, and a late-night TV pitchman.

Is this our “culture”? It’s the culture of non-culture. If you don’t have an ancestral culture with its dances, language, rituals, ceremonies, food preparation, costumes, literature, art and values, you just sort of pick up whatever trends are popular.

Michael Jackson was, at best, a talented songwriter in the pop style. Pop music, known for its endless repetition of catchy themes, is not rocket science to write. In fact, most of the best musicians avoid it because it’s really boring if you know anything about music or life. But Jacko was the king of pop, etc etc because we needed a hero and he was on our side during the Cold War. Awesome.

Farrah Fawcett, while a nice person, was known for her clingy swimsuit more than anything else. She did not invent rockets. She probably participated in human rights missions, but so do millions of others, except they’re not celebrities. Oh well.

Billy Mays was a lot of fun, because if you did encounter late-night TV when he was selling you some nostril cleansing product or tomato growing apparatus, he made it more amusing than most. But there’s not much distinction in that either.

What I’m getting at here… our culture is like the sweepings from the floor of history. We dote on these people because they’re famous, but then the trend changes, and things move on. We accumulate what’s left over and call it culture because we have nothing, because some wise idiot convinced us that culture, like strong government, was a form of oppression and we’d finally be “free” when we threw it out.

So now we get… heroes who aren’t heroes, a culture of non-culture, a society based not on working together but on barely tolerating each other?

Good thinking.


This blog endorses a kind of primal realism that many people call conservatism, although it has nothing to do with the conservatism of today. It’s more like conservationism. One of its basic ideas is that our problems are not external (type of government, economics, politics) but internal, in that most people are unable to discipline their inner monkey and so end up as forces of chaotic destruction.

“As people age, they often realize that many of their youthful decisions, which seemed so correct at the time, were not such great ideas after all.”

I haven’t noticed this. I have noticed that people tend to rationalize their behavior. Unfortunately people (personality-wise) change very little with age. So an impulsive ten year old will likely grow into an impulsive forty year old. And depressive people will remain depressive and honest people will remain honest.

People will make excuses for their behavior if they get caught, and they will make excuses for their hypocrisy either way. There isn’t much altruism in people. People only find religion after they’ve been condemned to death. If they manage to break out of jail they tend to lose that religion.


I liked the statement this brave fellow made, even if he made it so quietly he stands little chance of the lynch mob figuring out how hard he’s got their number.

People act through justifications. Justification means you do something, and then invent another reason why you should have done it. It wasn’t the reason why you did it. But it’s the reason you offer to others.

Justification is inherent to knowing how to manipulate others. You can use it before you act, even. “I’m going to take this cocaine and look at this child porn to keep them out of the hands of our children…think of the children!”

We use justification because as individuals, we assume we deserve everything we can get our little hands on. We haven’t progressed from an anarchist hunter-gatherer stage to having some conception of civilization, in which anarchy is destructive.

Because we assume we are right, we assume the world should adapt to us, so we pedantically explain, in tokens of moral righteousness, why we should be doing what we’re doing. And if others criticize us, we take it personally and attack them personally, because they attacked our assumption of being right, justified, and entitled.

Until humanity gets over this bad psychology, everything we do will be tinged with ruin.

Surprise, Surprise: Our Ancestors Weren’t Morons

Archeologists said yesterday that they had unearthed the oldest musical instruments ever found – several flutes that inhabitants of southwestern Germany laboriously carved from bone and ivory at least 35,000 years ago.

Just a few feet away from a bone flute, researchers discovered one of the oldest examples of figurative art – the sculpture of a woman carved from mammoth ivory, a find announced earlier this year. Excavations have also unearthed an array of other art, including carvings of mammoths, cave lions, and mythic half-animal, half-human figures.

A culture rich in figurative art, sophisticated adornments, and music does not directly result in better hunting or more successful reproduction, but music in particular might have had an indirect effect, providing better social ties or improving communication, according to Conard.


This thinking appears to be backwards: it assumes music helped humans evolve into what we are today, so that we could create symphonies, phone lines, and Facebook. No, I think we created music because we were further evolved 40,000 years ago than many of us like to believe, and had much better success in forming societies despite having to fight for survival more often in those earlier times.

Either way, the fact remains that the further back we dig, the more creepy our human past becomes to us moderns: we didn’t just evolve from block-headed monkeys into the iPhone users we are today, with a little Leave It To Beaver-esque 1950s society tucked neatly between our past and modern eras. There were societies of hunter-gatherers who could make art, play music, and do everything we do today, except they lived in a much more harsh and reality-driven world. Finding things like this in the context of modern society makes us think about our lineage in a backward fashion instead of owning up to the fact that maybe what we call “progress” isn’t all it’s cracked up to be.

The Dunning-Kruger effect


The Dunning-Kruger effect states that incompetent people are also incompetent in assessing their own performance. Therefore, less competent people think their performance is competent, while smarter people focus on their own flaws.

It explains, among other things, how in a society that places too much value on image, idiots and insane people are able to get ahead by overestimating their value and getting fools to agree with them.

The essence of the Dunning-Kruger effect is that “ignorance more frequently begets confidence than knowledge.” Studies have shown that the most incompetent individuals are the ones that are most convinced of their competence. At work this translates into lots of incompetent people who think they are superstars. And what is worse is that if you have a manager who doesn’t closely supervise work, he or she may judge performance based on outward appearances, using information like the confidence with which these incompetent blockheads speak.

An important corollary of this effect is that the most competent people often underestimate their competence. This is a result of how you frame knowledge. The more you know, the more you focus on what you don’t know. For instance, people who can name 15 of the 50 state capitals tend to think “I know 15.” People who know 45 of the 50 state capitals tend to think “I don’t know 5.”

Business Pundit
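
As a toy model only (all numbers invented, not taken from the study), that framing corollary is easy to sketch: if self-assessment error shrinks as actual skill rises, the pattern above falls out.

    import random
    random.seed(1)

    # Hypothetical model: the less you know, the rosier your self-estimate.
    def self_estimate(actual):
        bias = (1.0 - actual) * 0.4          # overestimation shrinks with skill
        noise = random.gauss(0, 0.05)
        return min(1.0, max(0.0, actual + bias + noise))

    for actual in (0.2, 0.5, 0.9):
        print(f"actual {actual:.0%} -> self-estimate {self_estimate(actual):.0%}")
    # Low performers land far above their true score; top performers barely move.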

Dunning and Kruger, two researchers at Cornell University, described their findings in a paper entitled “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments” in the Journal of Personality and Social Psychology.

Their conclusions can be summarized this way:

  1. incompetent individuals tend to overestimate their own level of skill,
  2. incompetent individuals fail to recognize genuine skill in others,
  3. incompetent individuals fail to recognize the extremity of their inadequacy,
  4. if they can be trained to substantially improve their own skill level, these individuals can recognize and acknowledge their own previous lack of skill.

Translation: without leaders at the top of the curve who are willing to call people on their incompetence, the incompetents will appear competent to other incompetents and be advanced, possibly even to the presidency.

This causes a mathematical problem for democracies since most people are not particularly competent at leadership, government or logical argument, meaning they are both unable to assess the best leadership choices and sure that they’re right.

It’s essentially similar to the Downing effect:

The Downing effect describes the tendencies of people with below average intelligence quotients (IQs) to overestimate their intelligence, and of people with above average intelligence to underestimate their intelligence. The propensity to predictably misjudge one’s own intelligence was first noted by C. L. Downing who conducted the first cross cultural studies on perceived intelligence.

His studies also evidenced that the ability to accurately estimate others’ intelligence was proportional to one’s own intelligence. This means the lower the IQ score of an individual, the less capably he or she can appreciate and accurately appraise others’ intelligence. The lower an individual’s IQ, the more likely they are to rate themselves as more intelligent than others around them.

Conversely, people with a high IQ, while better at appraising others’ intelligence overall, are still likely to rate people of similar intelligence as themselves as having higher IQs.


That tendency could go a long way toward explaining why many successful societies have relied on strong leaders who had no problem beating down the incompetent with force.

“Rights” may be a bad design

Fantastic, brave and thought-provoking article from David Mitchell at the Guardian:

Sacrificing our rights and freedoms, or the use of them, for the greater good is much called for at the moment. There’s pressure to recycle, pay higher taxes, not travel on planes, avoid products manufactured by enslaved children, stop borrowing money we can’t pay back, stop lending money to people who won’t pay it back and abstain from tuna. And psychologically we couldn’t be worse prepared.

For decades, our society has trumpeted liberty and its use, choice, self-expression, global travel and all forms of spending as inalienable rights. But only as the environment and economy teeter are we gradually becoming aware that with the power such liberties give us comes the responsibility to deal with the consequences.

But any self-sacrifice feels to us westerners like tyranny. We’re not ready for it. Our evolution into apex individualists has superbly attuned us to injustices against us while atrophying our awareness of the vastly greater number that work in our favour. It’s not our fault, it’s how we were raised.

Our fear of being encroached upon has made us forget that there are few freedoms that can be fully exercised without impinging on someone else’s. The freedom to stab has long since been subordinated to the freedom not to be stabbed. But we still have the freedom not to recycle and to borrow or lend money recklessly, regardless of others’ freedom to live on a habitable planet and in a functional economy. We’ve hugely prioritised our rights over our duties because it’s only the former that tyrants try to take away.

The Guardian

This blog has long covered the major problem of social reality, which is where people band together and create a consensual reality-image in order to protect themselves from anything they don’t want to do. At its core this thinking is negative and defensive: it knows what it hates but not what it loves.

It also makes us easy to manipulate: tell us that something is “not-free” and we are “free,” and we’re automatically against it, banded together into a lynch mob that doesn’t care about the details.

But “rights,” itself, as a paradigm, may be a bad design. It’s not a goal, but it is a surrogate for a goal. Instead of “do the right thing,” we have the mandate to “protect our inalienable right to do nothing we don’t want to do,” which makes us into brats who avoid doing the right thing because then we lose some of that freedom.

There’s another insidious problem which we see here:

The latest session of the United Nations Human Rights Council, which ended this past week in Geneva, was marked by a series of attempts to weaken the body.

Diplomats and non-governmental organisations have expressed concern over efforts by some states, including Cuba, China and Brazil, to muzzle independent reporting.

For many observers, a point of no return was reached during a special Council session on Sri Lanka in late May. The Sri Lankan government was able to impose the principle of non-interference in order to refuse an on-the-spot independent investigation.


“Rights” confers an implied right to dominate on whatever individual, group or political body promises more rights. This is post-WWII logic that the UK and USA used to justify much of what they did in defeating the Germans and Japanese, and later, what they had to do to keep the Soviets at bay. Us=good got replaced by us=free; we had more rights, they had no rights, so we had a moral imperative to destroy them.

But the problem of rights as a concept is that it empowers selfishness.

In developing nations, this is more poignant than in the West. If you’re trying to get everyone to work together, build an infrastructure, get educated and update your technology — because organization of society, an end to corruption and technology define passage toward the first world — people demanding their right to not cooperate become a problem.

And many of these people were the same ones who benefitted from primal kleptocracy, which is the order we see in most of the world today, where corrupt warlords rule not for the good of their people but for their own lifestyle. It’s natural, in a sense: if the people around you are too disorganized to build an infrastructure, you might as well exploit them and get it for yourself. But it perpetuates itself.

In the same way, in the West, the rights of individuals have trumped positive changes in countless instances. We don’t want anyone to tell us where we can or can’t live, who we can or can’t marry, what we can or can’t do, what we can or can’t ingest, and so on. But that leads to a universal monoculture of anti-culture, where there are no shared values because any value imposed causes someone to send up a shriek about their rights.

As David Mitchell points out, this is culminating in a legacy of disaster. Our society is neurotic, alcoholic and hooked on pills, sexually miserable, unable to form families, politically corrupt in that genteel way where nothing gets done but everyone still takes full pay, filled with unproductive and mindless jobs, hampered by regulations, endlessly frustrating to anyone halfway intelligent, and so on. That’s the kingdom of rights.

This blog has suggested in the past a simpler course of action: instead of asking reality to adapt to us, we should adapt to reality, which is a series of patterns created by natural forces. These natural forces do not limit themselves to material, but reflect degrees of organization; for example, a social group can experience entropy just as matter does and just as ideas do when transmitted multiple times. That’s reality, and it’s something that requires careful study to understand.

But we’re not even trying. We’ve created a kingdom of brats who just want to do what they think they want to do, and even if the results make them miserable, they’re still going to persist. It’s good to see this illusion of rights slowly unraveling.

Abortion and neo-eugenics

When your fear rules you, you get worse consequences than you would have by facing what you fear.

Abortion terrified us, because if the death of a fetus becomes a casual option, maybe life is not sacred after all.

Eugenics terrified us, because if someone doesn’t make the cut, maybe there will be mission creep and one day we won’t make the cut either.

We fear, we fear… we fear the consequences of nature, and we fear man’s ability to stand in for natural selection, something made obsolete by the fact of civilization itself and specialization of labor.

Instead of facing our fear, we denied it. Abortion battles raged and eugenicists were called fascists, racists — whatever, who cares, just some slander powerful enough to shut them down.

But now our technology has caught up with that.

Instead of picking people on the basis of the whole picture, meaning how they turned out as individuals, we’re going to pick them before they are born — by picking genes that are statistically likely to cause problems.

Males with a particular form of gene called MAOA are twice as likely to join a gang, compared to those with other forms, finds a new study of more than 2000 US teens. What’s more, gang members with these mutations are far more likely to use a weapon than other members.

Low MAOA activity has been linked previously to antisocial behaviour in people who experienced child abuse, and two brain regions involved in perceiving and controlling emotions are shrunken in people who have the mutation but no history of criminality or abuse.

New Scientist

This means a fair amount of throwing the baby out with the bathwater, since the same gene that may make teenage gangsters violent might also make someone with other genes an assertive leader. It’s like saying that redheads are more likely to be alcoholics, so we don’t want them — although at the higher end, smart redheads and redheads of Danish ancestry may be some of our best people. Statistics misleads us because it looks at one factor at a time, as the sketch below illustrates.
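
To make the one-factor problem concrete, here is a sketch with invented numbers (nothing here comes from the New Scientist study): a variant that doubles the pooled rate while mattering only in combination with a second factor, such as the abuse history mentioned above.

    # Invented counts: (variant, abuse_history) -> (gang_members, total)
    counts = {
        (True,  True):  (30, 100),
        (True,  False): ( 5, 100),
        (False, True):  (15, 100),
        (False, False): ( 5, 100),
    }

    def rate(cells):
        members = sum(m for m, _ in cells)
        total = sum(t for _, t in cells)
        return members / total

    pooled_variant = rate([v for k, v in counts.items() if k[0]])      # 17.5%
    pooled_other = rate([v for k, v in counts.items() if not k[0]])    # 10.0%
    print(f"pooled: {pooled_variant:.0%} vs {pooled_other:.0%}")       # "twice as likely"
    # Split by the second factor: 30% vs 15% with abuse, 5% vs 5% without.
    # The pooled statistic attributes to the gene what belongs to the combination.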

If abortion remains illegal, these future citizens will be fertilized outside the mother and then the embryos will be tested; those that have the wrong statistically prevalent genes will not be implanted. Whether they were alive or not, they will be dead.

Welcome to the new science of neo-eugenics:

Every year, 4.1 million babies are born in the USA. On the basis of the well-known risk of Down syndrome, about 6,150 of these babies would be expected to suffer from this genetic condition, which is caused by an extra copy of chromosome 21. In reality, only about 4,370 babies are born with Down syndrome; the others have been aborted during pregnancy. These estimates are based on a prevalence rate of 0.15% and an abortion rate of about 29% of fetuses diagnosed with Down syndrome in Atlanta, GA (Siffel et al, 2004), and Hawaii (Forrester & Merz, 2002)—the only two US locations for which reliable data are available. Data from other regions are similar or even higher: 32% of Down syndrome fetuses were aborted in Western Australia (Bourke et al, 2005); 75% in South Australia (Cheffins et al, 2000); 80% in Taiwan (Jou et al, 2005); and 85% in Paris, France (Khoshnood et al, 2004). Despite this trend, the total number of babies born with Down syndrome is not declining in most industrialized nations because both the number of older mothers and the conception rate is increasing.

These abortions are eugenic in both intention and effect—that is, their purpose is to eliminate a genetically defective fetus and thus allow for a genetically superior child in a subsequent pregnancy. This is a harsh way of phrasing it; another way is to say that parents just want to have healthy children. Nevertheless, however it is phrased, the conclusion is starkly unavoidable: terminating the pregnancy of a genetically defective fetus is widespread. Moreover, because none of the countries mentioned above coerce parents into aborting deformed fetuses, these abortions—which number many thousands each year—are carried out at the request of the parents, or at least the mothers. This high number of so-called medical abortions shows that many people, in many parts of the world, consider the elimination of a genetically defective fetus to be morally acceptable.


Welcome to what happens when you do not take charge.
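
Incidentally, the quoted figures are internally consistent; a quick check of the arithmetic:

    births = 4_100_000      # annual US births, as quoted
    prevalence = 0.0015     # 0.15% Down syndrome prevalence
    aborted = 0.29          # ~29% of diagnosed fetuses aborted

    expected = births * prevalence
    born = expected * (1 - aborted)
    print(round(expected), round(born))   # 6150 and 4366: the "about 6,150" and "about 4,370"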

While few of us will agree that aborting fetuses who are destined to be retarded or malformed is a bad thing, we can all see that parents are now going to shop for what they want in a child. This could even extend to hair color, eye color, and genes with statistical prevalence of commercially-desired traits: lawyer, doctor, gets along well with others, likes caviar, whatever.

But as we progress in control of our own evolution, look for these lines to blur. People are going to pick what they desire in a child and edit their results to match.

This will in turn force a theological issue: are all individuals sacred, or is life itself sacred, and composed of individuals, some of which we prune and some of which we reward?

As often happens, science is forcing our hand where our emotional minds are afraid to tread.


This means that instead of natural selection picking our wiliest and most logical people, we’re boutique shopping for external traits — and not considering the whole mix of traits, since we’re looking at statistical single traits each time.

Enjoy your brave new world.