The worst kind of disaster for humanity has a trifecta: it is unexpected, the solution is contrary to inertia, and the crisis is nearly complete the moment it triggers. Think of skidding out on a curve when there is ice on the road, for example.
You are bumbling along down a road you know well. There may be snow, but it has not yet occurred to you that the snow melted under the wheels of interstate truckers on meth and then refroze.
Consequently you go into the turn with confidence and then, suddenly, your wheels are no longer gripping the pavement. You treat this with inertia: it must be a steering problem, so you correct the direction, when in fact the real issue is getting your wheels to grip again.
The paradoxical solution is to steer into the skid and stay off the brakes, then steer out of the turn once you have re-established contact with terra firma. But once you enter the skid, the crisis is already upon you and the disaster nearly complete.
In the same way, replacement migration seems to be going hunky-dory until the diversity percentage hits a certain threshold, maybe a fifth, where the diversity becomes a swing vote in elections and an important consumer base. Suddenly you need them to agree to get anything done.
That is a crisis, and a terminal one, since at that point compromise always favors the party with the simpler demand. For the diversity, the demand is simple: we want to take over, because otherwise our position here is tenuous if you (say) change your laws and remigrate us.
Worse still, we get no warning signs except full-blown crisis and panic, as with many other threshold-based trigger systems:
Most warning systems rely on predefined procedural thresholds: alert levels, activation protocols and emergency plans that kick in once specific criteria are met. Forecasting may indicate that flooding is increasingly likely, for example, but measures such as evacuations or road closures can only be triggered after formal thresholds are crossed.
Before that point, risk information passes through many layers of interpretation and judgment, where early signals are often noted but not acted upon.
Thresholds serve important purposes. They help coordinate response, clarify chains of command and reduce unnecessary disruption. But they also embed a structural preference for certainty. Action is authorized only once risk is framed as imminent, even when credible evidence already points to escalating danger.
In other words, with scientific certainty you can only detect a crisis once it has happened. You get no points for avoiding a recession, revolution, or war; history does not notice. But once the crisis occurs, suddenly others can see what is happening, and it becomes a symbol.
When you base your detection of crises on thresholds, you have created a category: beyond this point, it is definitively a certain thing (a crisis). However, this shows us the weakness of categories, which is that, like linear history, they fail to note the continuum of states from cause to effect.
This creates an incentive to use categories as a weapon. Is the crisis officially defined as one or not? If not, it can be ignored; if it is, that definition can be inverted by questioning whether the threshold was truly met or not.
That in turn creates a battle over official definitions, waged through propaganda, to create false binaries like “good” and “evil” that shape the panic responses of the herd. These binaries serve to force minds to think positively of one of the extremes:
After a century of research on advertising, scholars still don’t have an empirically solid grasp of exactly how or why it works.
They found that ads not only boost visits to an advertiser’s website but also interfere with consumers’ recall of alternative brands, dislodging the competition from their minds. The findings help explain why spending millions on repetitive campaigns, even by well-known brands, is essential to remaining in people’s memories.
Repeat something enough to enough people and they begin repeating it, creating official “truth.” Repeat it as an opposite to what they fear and they will cling to it. The herd is swayed by enough force of message, even if the message is oversimplification.
These moral binaries allow people to easily confirm their bias while driving dissenters toward uncertainty, because the dissenters’ lack of a following has now been made into a negative:
New research from the University of California San Diego Rady School of Management explores how people constantly evaluate whether messages are true or false and finds that a surprisingly small ingredient — whether a word has an easy opposite — can shape how confident people feel when deciding whether a message is true.
The research shows that when companies frame a message with words that are “reversible,” meaning they have an easily retrievable opposite (such as intense/mild or guilty/innocent), people who disagree with the claim tend to mentally flip it to the opposite meaning (for example, “The scent is intense” becomes “The scent is mild”).
“For marketers, this creates a powerful advantage: by using easily reversible words in a positive affirmation — such as ‘the scent is intense’ — companies can maximize certainty among those who accept the claim while minimizing certainty among people who reject the message, because they tend to feel less strongly about their opposing belief,” said Maimone, who is now a postdoctoral scholar in marketing at the University of Florida.
If someone officially defines “good” and the herd accepts it, then anything else will be seen as “evil.” This is how Control operates: it focuses on methods instead of goals, and by categorizing methods in binaries, excludes anything but what the herd wants to believe is real.
Even worse, it turns out that if you tell a lie that flatters the moral pretense of equality, people are inclined to believe it automatically, which means they are rejecting everything else:
It turned out that people who resorted to prosocial lies (those intended to spare someone distress) were evaluated as more moral than those who told the truth directly. “Prosocial liars” who provided overly optimistic feedback were perceived positively, likely because they demonstrated sensitivity to the needs of the other party.
People described as “socially sensitive,” able to tailor their feedback to the recipient, were evaluated similarly. People in this group told the truth to those who are able to face it, while softening their feedback (and even lying) for those who might be hurt by criticism. This inconsistency did not lower their moral assessment; the study participants accepted a flexible approach to truth, assuming it served others.
In other words, society is run by socializing, or the act of making others feel comfortable in social settings, not any kind of thinking about end results. The herd fears its insignificance, therefore designates pretense of human importance as “good” and all else as “evil.”
Much of this arises from the power of doubt, which is that we fear bad outcomes more than we relish or celebrate good outcomes:
Research from the Universities of Bath and Waterloo (Canada) shows that when people imagine future losses, the emotional impact of dread is more than six times stronger than the pleasure they feel from anticipating equivalent gains.
Individuals who experience stronger negative than positive anticipatory emotions are significantly more likely to avoid risk and less willing to wait for delayed outcomes — even when waiting could lead to greater rewards.
The emotional impact of realized losses was found to be roughly twice as strong as that of equivalent gains, consistent with the established economic theory of loss aversion.
Humans are easily manipulated because the fear of loss is greater than the desire to achieve actual good results. Consequently, social groups like any idea that degrades responsibility to results and replaces it with subsidized anarchy.
Political correctness works by inducing us to repeat the binary. Instead of having it shouted at us, everything but it is removed from our vocabulary, resulting in a dialogue about the world as seen in the context of that topic.
When you get people repeating your dogma back to you, then you have altered memory permanently and they do not know a world outside of your creation:
Information that we select for ourselves, such as things we click online, has a stronger impact than passively acquired information on our perception of truth and falsehood.
The more often we see information, the more likely we are to believe it is truthful. This has been known for 50 years. Researchers in Bochum have now demonstrated that this “truth effect” becomes stronger when we select the information ourselves: All it takes is clicking a headline to give its contents more credence later.
To make this compelling, it is best to style it as a victory of some sort, usually “progress,” which makes the person feel important and significant for participating. As a result, they enjoy the process, since it feels to them like winning at life.
In support of their fiction-absolute, they must believe they are living the best life possible for themselves, so they favor any narrative where they are in control and doing something relevant even if no one else recognizes it.
This enjoyment, even if false and not satisfying in the long term, binds them to the task:
In nine studies, the research team — co-authored by Kaitlin Woolley ’12, professor of marketing at the Samuel Curtis Johnson Graduate School of Management, and Yuchen Wu, doctoral student in marketing — found that people relied more on the enjoyment they derived from the activity, and less on perceived or actual time spent in it, when gauging their progress toward a goal.
“Whether you feel good or not should not influence your progress judgments as much as the influence of time spent, but we found the opposite to be true,” Wu said. “People decide whether they are making progress or not simply based on how they feel during the pursuit.”
The problem is that as in all human errors, the method replaces the goal. Goals are end-states, like the perpetuation of a cycle or existence of a certain level of order. Methods are ways you get there, but this runaway method has no goal so merely repeats ad infinitum.
It turns out to be easy to manipulate people, since all you need to do is preach prosocial ideas like equality and property:
Across a series of studies, researchers found that acts involving equality and property powerfully shape how we see someone’s character, how much we trust them, and even whether we’re willing to cooperate with them in everyday life. These judgments happen quickly, consistently, and even when our attention is stretched thin.
“Fairness and respect for property may be the moral behaviors that matter most when it comes to social trust,” said study co-author Savannah Adams, U-M doctoral candidate.
When your audience has no actual goal like culture or the perpetuation of a civilization, they easily fall for this basic manipulation pattern. Tyrants rely on this, which is why per Plato and Aristotle they always import foreigners to be their permanent voting base.
This happened early in America with the Irish vote when demagogues weaponized the Irish against the founding population, replacing it:
James Michael Curley, a four-time mayor of Boston, used wasteful redistribution to his poor Irish constituents and incendiary rhetoric to encourage richer citizens to emigrate from Boston, thereby shaping the electorate in his favor. As a consequence, Boston stagnated, but Curley kept winning elections.
Whenever this starts happening, it justifies itself with the idea of fairness, a binary which implies that anything but it is unfair and bad.
Over time, by observing others, people become indoctrinated into this egalitarian behavior:
Published in eLife, the study found that participants became more willing to reject unfairly favorable offers after observing another person consistently do so.
“People can learn to punish advantageous inequity even when it might come at a cost to themselves,” Otto said.
In this way, you can see the trap of diversity. You start by simply accepting some as different, and then your culture gets erased in order to accommodate their cultures, and then you become dedicated to giving everything you have to them in order to be fair.
The West is learning. Diversity seemed like a good idea at the symbolic level but quickly broke down and now is threatening to genocide us, stop reproduction, bankrupt us, and leave behind a dysfunctional third world empire.
Like sliding on ice, we lost sight of the goal. The goal is to have our civilization. Diversity, presumed to be a method of achieving that, became a perpetual method and replaced that goal. It was always a trap and the only solution is to end it entirely.
Tags: advertising, curley effect, diversity, immigration, irish, political correctness, propaganda, thresholds