If there is one concept in the modern age that needs to be folded, spindled, and mutilated, it is the idea of safety. Safety represents an entirely negative idea: the removal of risk, which inevitably translates into protecting the weaker from the stronger by neutering power. In order to fully render power impotent, however, those who desire safety must also limit the information which justifies power, specifically any knowledge above that upon which the weaker are acting as part of their modus operandi.
People tend to view this modern age as anomalous because of its technological advances. This outlook rests on the fallacious assumption that technology exists on an absolute scale. Past empires have far exceeded the abilities of their neighbors in terms of technology, most notably the Greek, Persian, Roman, Mayan and Indian empires, but they fell by the same method by which the modern West is declining — class revolt, reckless outbreeding and corruption — mainly because technology alone does not insulate an empire from crisis.
Even the leadership equivalent of technology, advanced managerial and legal systems designed to dole out power in minute increments producing supposedly “equal” results, breaks down if given false starting assumptions or administered by those determined to circumvent it. In fact, management seems to work the opposite way of how it is intended by protecting the corrupt through its tendency to cloak them in authority and hide them behind a maze of rules, standards and measurements that baffle anyone but the extremely dedicated person with lots of time to sift through thousands of pages of bureaucratese.
These institutions justify themselves with the idea of safety, or the defense of people against potential harms, whether from themselves or others. Since the topic of our human tendency to do the exact opposite of what we need to be doing remains unpopular, their focus inevitably shifts to the mysterious enemy or scapegoat upon whom all failings can be blamed and in whose name all new powers can be rationalized. Like the mythological Satan, the best scapegoat is one who does not exist and cannot defend himself, such as the role of Emmanuel Goldstein in 1984, which seemed to be filled by various actors but may in fact not have existed at all.
If we scapegoat a nonsense entity, anything we attribute to that entity is assumed to be true without proof, and since the shadow figure cannot contradict the charges, they all stick. To hear the advocates of a government and society focused on “safety” tell it, risks lurk around every corner. Mattresses without tags will burst into flame and kill you; food additives will reach out and gift you with tumors in your sleep; bad thoughts will jump off the PDF page or out of a book and turn you into a full-fledged Nazi or an anarchist setting cars ablaze. Naturally, risks exist, but not to the degree that the safety advocates claim, and they are limited by the choices made by those who encounter them. Few people who avoid smoking in bed find their mattresses suddenly ablaze, and the risk of most “dangers” is less than the chance of being stung to death by bees, while everyday threats like obesity, drunkenness, accident and other failures of human self-control are the forces most likely to kill any one of us.
Even more, statistics lie about circumstances. Most who die of the various terrors described in wide-eyed self-important glow by the news are elderly, and many who manage to damage or destroy themselves do so in the midst of disorganized lives where a long stack of bad, selfish and short-term decisions leads to conditions where nothing but failure remains. An obese person living in a trailer park in the path of a tornado, sucking down his 15th menthol cigarette and fourth cheeseburger of the day while drinking watery beer and re-attaching his propane tank with chewing gum — maybe even amid a “hoarder”-style whirlpool of useless possessions — faces one real risk: that the accumulated stupidity will find some way to snuff him. This is where modernity disconnects cause and effect; if someone under such circumstances dies from a mattress fire, is the mattress to blame, or simply the tottering house of cards assembled by the oblivious human?
Governments dedicate infinite resources to “educating” us about risks, such that most public places are interrupted by ugly warning signs, blinking indicators and recorded messages. Hours of educational video, years’ worth of seminars and presentations, decades of mandatory classes and aeons of public policy discussion accompany these. If someone dies, it is a “tragedy” even if that person was worthless (and if we are honest, every single one of us considers some categories of people to be worthless) and brought it upon himself, and through the magic of “accountability” we blame those in power for this unnecessary death.
And nothing is worse than death, we the assembled crowd think from our armchairs, because we fear nothing more than death itself. Thus we panic, foam at our mental mouths and demand that something be done. The press fans the flames with hysterical paranoia disguised as “advice.” Politicians make rules, ugly signs and blinkers go up, and another barrier of red tape and bureaucracy is thrown in our path before we can accomplish simple life tasks. The accumulated frustration makes us angry and we scapegoat the world, much as before we scapegoated those who are more powerful than us. To take revenge on it, we find some reason to blame it, namely that it is bad and full of risk, and so we lash out at it with more rules. Then life gets more insufferable and the cycle begins again. Round and round. Round and round, again.
I suggest a society based on the creative principle instead: we focus on goals instead of fears. This requires recognition that life is not safe and never will be, and that the concept of “safety” — perceived as an abstraction in a universal context, then applied by our neurotic minds to every possible niche in our daily lives — is itself fallacious. We can design our society not to avoid risk, but to be logical, so that risk comes in proportion to our awareness of what is around us. This corresponds roughly to the results we get anyway, because even with thousands of rules idiots are dreaming up new ways to maim, mangle and murder themselves daily, but without the overhead of making ourselves into worrywarts.
What holds up this transition? I will submit to you this simple axiom: in a group of a hundred people, only a handful have actual direction. The rest have attached themselves to something — a job, a sports team, a church, an ideology, a dollar amount — that they can believe in, and they make their lives’ importance contingent upon it. When asked what they want from their leaders, they will (unsurprisingly) state not goals but fears. They have no goals, so what occupies their minds is interruption of what they already have, like bad gamblers unwilling to take risks who therefore equate taking any risk with the behavior of compulsive risk-takers who rolled the dice and lost everything. In a society ruled by popularity, the fear of risk takes over from any attempt at goals.
Almost all public policy can be explained through the quest for safety. Patriotism is safety from foreign threats; diversity and welfare are safety through buying off the lower classes; global warming is a kind of talisman against our general fear of the sheer havoc we are wreaking upon our environment. Democracy produces products in the form of visions, like how we project ourselves into the comfortable living room and stable families we see in video ads, and the best products channel an amorphous series of fears into a single symbol and produce a similarly symbolic solution. As with all human failings, our smart monkey-plus brains deceive us and we become a howling mob of simians demanding tangible assurances against an intangible order which determines our future.
It is a popular saying in our churches and political halls: “we are all one.”
In fact, it will make almost any group perk up and listen to you with misty eyes. It encapsulates so many of our sentiments in this fallen time, from egalitarianism to the idea that we should all “just get along.” But it is an incorrect and degraded version of a greater statement, much as our time is an inferior residue of a better one, albeit without iPads and hip-hop music.
The original statement, unfashionable in this time, reads “We are all one in God.” If you are atheistic, as many are (and as I tend to be whenever I fill out forms in triplicate), you may substitute purpose for the name of the deity. Most of the time, however, I find that religion, in parallel with what was once called “science” when that word encompassed all learning, marks the mind which has sought beyond the boundaries of the visible and into the non-existent structure that nonetheless emerges everywhere, in both logic and the arrangement of physical matter in discernible patterns. God, purpose, nature, logic — pick whichever you feel most comfortable with — because what I describe is common to them all.
Now, “we are all one” is a much more convenient statement. It is the equivalent of the kindergarten teacher saying that we should all share and get along, or the politician talking about bipartisanship, or even the come-one-come-all cry of the barker. It limits our focus to the human world only, and thus, like so many other human behaviors, is entirely social in its scope, which reduces our problems from a complex management of ourselves as both individuals and species to a simple matter of socializing with others. Like many things in this world, it is a surrogate for the real task: being easier to grasp and more tangible in focus, it comforts us by lightening the load on our minds as we weigh our future decisions.
On the other hand, “we are all one in God” represents a type of conditional statement. We are united where we are in God. This type of statement makes sense only when God refers to an order, not a physical person or discrete entity. God is Godliness, a participation in the holy order which produces the type of pattern that is simultaneously good, beautiful and true. The true means reality; the good means a morality of creating greater order — complexity, endurance, universality, efficiency, quality — wherever we go; and the beautiful shows us the transcendent in the mundane. It peels back the layer of the visible and shows us an invisible order pervading all reality, which gives it the possibility of purpose, and shows us a path to liking ourselves more. But these must occur at the same time, so that all three traits are one.
People shy away from this phrase not only because it mentions that least sociable of ideas, a higher order, possibly but not necessarily a metaphysical one, but because it mentions purpose. If God is an order, our purpose is that order; this is not to say that the order is inherent, because we can choose to avoid it. But like any thought which is more good, beautiful and true than others, it calls to us like a childhood dream or the image of early and perpetual love. The problem with purpose is twofold: first, we can fail to achieve it; second, we must cut out of the social circle those who fail to achieve it. Like the small rodents of the forest, we fear the predators in our world, but the greatest predator is existential despair, the sense that our lives have been wasted. We never want to be wrong, and wake up to find that we have spent our irreplaceable time and energy on the worthless, revealing ourselves as fools or lost souls. In other words, “in God” adds the burden of an order which the best of us embrace, revealing the rest of us as lesser beings, and with that burden comes the necessity of exclusivity. Some rise, others fall.
Exclusivity is the least popular topic in any social gathering. People take it personally when they are found wanting. (The best form of exclusivity is secret exclusivity, because then one may feel the rush of ego-opiates brought by recognition, but not suffer the wrath and resentment of others when they realize they did not make the cut.) In contrast, inclusivity remains perennially popular because it gives us all warm feelings. “All are welcome” and “we are all one” are the same statement. Inclusivity conveys not only the sense that peace will prevail and all will be happy, but the notion that the individual advocating it has risen above the earthbound tensions of animals and has become a higher being, if only socially. Our society sways under the weight of many would-be prophets who feel the rush of endorphins and dopamine that comes with having negated the self in preference for the group. Inclusivity creates a group where everyone feels good about themselves and feels safe from others.
In the view of that group, exclusivity represents a cruel and primitive urge to stimulate the ego by being above others. They view it as a vestige of our simian past and congratulate themselves on having enlightened, progressed, transcended and most of all “been better than that” or “been the bigger person.” Exclusivity threatens the circle of warm feelings that socialization through altruism/egalitarianism provides. With purpose, exclusion of some becomes inevitable. And yet without purpose, our lives become a prison confined to personal power and desires, a game which rapidly becomes pointless and boring, but which we play out of habit and the same desire for completism and uniformity which motivates our obsessive-compulsive cleaning and organizing of ideas. Purpose gives us a reason to rise above ourselves, but most people fear that challenge through the assumption that they will fail, even when it is highly unlikely that they will.
In an exclusive society defined by purpose, inclusion stands revealed for what it is: a great injustice. The person who does right gets the same reward as the person who does nothing, or who stops just short enough in his wrongdoing to fall below the threshold of laws or rules. To be fair to people, those who do more should receive more reward and recognition, and also be given power so they can more effectively continue doing more. This is the nature of any society with order, any belief system with consistency, and even nature itself, which rewards adaptation over illusion. Exclusivity is entirely incompatible with inclusivity because each is anathema to the other and would undo it in short order. Where “we are all one” is a statement of inclusion, “we are all one in purpose” forces upon us both the greatest gift of a life — purpose — and what we fear most, the loss of the pacifism and sociability uniting us into a happy circle.
At the same time, purpose raises standards. Our goal is not the constantly downgrading cycle of acceptance that lowers standards in order to fit every plausible candidate into the circle, but a rising standard which holds that as we improve, the goal rises higher. Perhaps we might even reach the stars. Where inclusion calls for a minimum standard, in other words a negative measure based on the fear of excluding, purpose calls for a positive standard which rewards all those who step outside their fuzzily self-referential minds and begin the climb toward excellence. This is why purpose, like so many experiences, begins with terror and then progresses, after an initial learning curve, to a golden era of greatness. But our fear holds us back.
You will hear “we are all one” anywhere people wish to unite others and manipulate them. Inclusivity carries an automatic threat: if you are not inclusive enough, you can be excluded. Exclusivity, on the other hand, has no additional hidden layer; you do what achieves a higher degree of the order sought and you are rewarded, and there is no other standard that can be used to exclude you. In addition, all benefit from the increased stability of society as a whole, and the decrease of ideas that make us neurotic. If we listen to pop-philosophers like Bill and Ted, who tell us to “be excellent to each other,” we see an initial step out of pure inclusivity. It is not enough to be all one; we must be excellent. If we wish to return that statement to its original balance, it can be shortened to the simple “be excellent.” Discover the order to life — and beyond! — and make yourself and your civilization excellent through it. Not everyone can participate, but those who are excluded fail only by their own fear, and those who beat the fear go on to become more of all the good in themselves.
Civilization creates its own fatal disease which is the predominance of popular notions over realistic ones. This disease proves difficult to diagnose because it is invisible, intangible and omnipresent. Like a virus in a computer network, it spreads through any and every program, elusive in its lack of a center to attack.
If these writings seem to rage too much against scapegoats — The Jews™, “thugs,” The Rich, government itself instead of the voters who empower it — it is to avoid falling into the pitfall of popular notions, which perpetually prefer a tangible and easily-understood target to the more complex task of unraveling different threads and separating truth from lies.
Other popular illusions sometimes get short shrift but merit our attention. Two of them are the “fact”-based narrative and the obsession with detail that demands lengthy research and a specialized vocabulary merely to discuss an item at a level deeper than “insight porn,” the pop-culture-styled contrarianism that creates a Thomas Kinkade level of philosophy: bright colors, simple scenes, and essentially a pleasant illusion that avoids the deeper problems within.
Many of us distrust the “fact”-based narrative for a simple reason:
“There are no facts, only interpretations.” – F.W. Nietzsche
That is to say: our language cannot convey wholly what is in reality, so it is inherently selective. This extends to fact-finding itself, which must choose facts to fit a narrative instead of assessing all facts and then looking to see what remains. A selective narrative produces a 300-page book of compelling ideas, where an assessment of all facts would produce a sprawling 10,000-page analysis that few would read, capped by a final chapter which seems to conjure broad conclusions as if by magic.
The left will always attack with the idea that conservative ideas are not “fact”-based, because the left specializes in cherry-picking data, especially within a recent time frame, mainly because their goal is to explain away the unbroken historical record of the failure of democracy, egalitarianism and subsidy-based economies (“socialism”). They have more to conceal than they have to say, so they specialize in generating “facts” that are in fact a very selective reading of reality, transferred into narrow categorical containers to produce a binary, and then spun into broad universal conclusions derived from relatively thin evidence.
Over the course of my life, I have seen both popular wisdom and the latest scientific studies fall. Not just arrive at a state of doubt; outright fail. This is because there are numerous levels of selection bias. Paul Krugman, a talented writer whose conclusions are often wrong because they are based on false assumptions, hits the nail on the head — a broken clock is right twice a day, perhaps — with this statement:
It doesn’t matter that the skeptics have been proved right. Simply raising questions about the orthodoxies of the moment leads to excommunication, from which there is no coming back. So the only “experts” left standing are those who made all the approved mistakes. It’s kind of a fraternity of failure: men and women united by a shared history of getting everything wrong, and refusing to admit it.
In other words, there is a selection bias among those who have become recognized leaders in their field, and it is not unfair to assume that much of this consists of destroying any ideas which conflict with their own. Their careers are based on their ideas; unlike even fifty years ago, when people were promoted based on their character and generalized abilities, in the current time people are vaulted to the top of their profession for attracting public interest. This leads to the second form of selection bias.
Crowd selection bias exists as a positive distinction, meaning that the masses reward what they find appealing. Note that these are not the masses as a whole, but the specific plurality which consumes news and intellectual products (usually books and movies). They ignore anything which is too complex or offends their conventional wisdom, but if they find a champion for an idea they find compelling, they will lift that person up through their purchases and attention. These heroes are the talk of the town for a few years, then are forgotten because their theories did not redefine the world. Thus Thomas Piketty passes into history and joins a list of other names I could cite here, but none of us would recognize them. They are past favorites, now comfortably serving as heads of departments or laboratories across the West.
In addition to the above selection biases, a type of negative selection bias exists which is fear of offending. We on the realist fringe are familiar with this one! Any idea that is too dangerous, or too insane — and the opposition likes to conflate these two much as the Soviets did — will be viewed as potentially incurring risk of offending either a plurality that is vocal or worse, a group or individual with protected pity-status. Those are dangerous and must be avoided, and so these are filtered out before they reach the surface.
Those three alone guarantee that “facts” as released into the mainstream will rarely provide useful information; “useful” is a better test than crowd favorite “valid,” which merely means placed in a form that is coherent. More likely, the facts issued forth will take the form of the far wall of an echo chamber, repeating what is already believed by excluding anything which does not fit that narrative.
Some useful facts make it through. These are either advanced by those who know their importance, or sneak past in a variety of guises. The best guise is insignificance, or the noting of a small detail and allowing others to interpret it. Another is as internal criticism within already accepted theory or ideology. Yet another is the infamous backwards attack, in which the researcher or writer advances a terrible argument in favor of an idea in order to show how hollow the idea is. These different types of guises are generally employed by those who work for the crowd heroes who run the departments.
None of these filters, however, disguises the raw problem with “fact”-based reasoning: the facts are chosen in order to be popular, and the method is bad. Modern science consists of surveying data, picking a factor to look at, and implying a causative relationship through statistical means that address only the data itself. Inherent in that are a number of assumptions which rely on universal tendencies in data, or similarities between contexts based on the form of information and not the specifics of its derivation, and these fail time and again. No one cares: this is an industry, not a moral crusade to be realistic.
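The survey-the-data-and-pick-a-factor method can be shown failing in miniature. The sketch below (a toy simulation; every number and name in it is invented for illustration, not drawn from any real study) generates an outcome that is pure noise, then tests 200 equally meaningless “factors” against it and keeps the best fit. One of them will always appear to “correlate,” which is the selection bias described above:

```python
import random
import statistics


def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


random.seed(1)
n = 30  # a small sample size, typical of many published studies

# The "outcome" is pure noise: it has no real relationship to anything.
outcome = [random.gauss(0, 1) for _ in range(n)]

# Survey 200 unrelated random "factors" and keep whichever fits best.
best = max(
    abs(pearson([random.gauss(0, 1) for _ in range(n)], outcome))
    for _ in range(200)
)
print(f"best |r| among 200 random factors: {best:.2f}")
```

With enough factors surveyed, the winning correlation is reliably “impressive” even though nothing causes anything here; only the selection step, not the data, produced the result.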
On the other side from the “fact”-based narrative is another narrative which seems to be different: the detail obsession of specific domains of knowledge and vocabulary, which holds that to discuss a topic you must have read thousands of pages of dense material and mastered many small nuances. If humans retained their ancestral intelligence, they would see this for what it is: job protection through obscurity. Remember “security through obscurity,” the idea that if you make your computer products cryptic enough, no one will hack them? It fails because hackers specialize in the cryptic; much as regulation offers more options to cheat, obscurity offers more wrinkles to exploit. Job security requires that specialized workers make their tasks so obscure and rife with tedious detail that outsiders cannot critique, oversee or redesign them. This perpetuates “the way we do things around here™” in perpetuity, guaranteeing jobs but reducing competitiveness. The same is true of academics and other thinkers, who want to claim ideographic space on the great blueprint of known ideas, and then defend it by making entry impossible, forcing those who would enter to adopt so much of the discipline’s language that they must accept the specific precepts of its owners.
Within this topic, I side with the philosophers: all ideas reduce to a very simple core, and there are not many actual ideas, so generally what one finds is a variation on a previous idea. What is needed is not an in-depth look but a clarification of the basic concepts in as few words and specialized terms as possible; otherwise discussion moves into a domain controlled by specific knowledge, which makes translation to other domains of knowledge nearly impossible. Academia hates this idea because it would put the philosophers and literature teachers back in charge, and since the best of those tend toward realism, they would focus on collapsing the empty spaces of rhetoric and domain-anchored dogma and replacing them with simpler, clearer concepts. Compare The Republic or Reverence to the average book of academic writing and the difference leaps from the page: good thinking expresses itself clearly in few concepts and then reveals their depth; bad thinking expresses itself in a nearly flat hierarchy of specialized concepts, hiding meaning within, then explains itself through examples which only gradually reveal what is actually being said.
As always, the problem of humanity chases us here. Why is it that all of our knowledge is corrupted, all of our leaders are bad, and all great civilizations extinguish themselves? The only smart money says that a similar pathology, or repeated behavior that is indifferent to its results, explains all three. We got a hint of this in the news this week, when attention-whoring itself became the story:
Sen. Claire McCaskill (D-Mo.) on Saturday backtracked from recent comments in which she seemed to suggest that Sen. Elizabeth Warren (D-Mass.) was getting more attention than she deserved by admitting what’s widely known about Washington: everyone seeks attention.
So that we all catch the tacit admission here, let us look at normal, healthy leadership. A good manager seeks what must be done to succeed and then works to accomplish it. But as McCaskill says, democratic Washington acts on the opposite principle: it seeks what is popular, and then finds a way to justify it by arguing toward some recognized policy goal. In other words, we are no longer in the domain of leadership but in entertainment, except that it uses the mantle of authority given to leaders to grant itself gravitas and extort money from us all. People, she said that politicians make their careers by attention-whoring; no one mentioned leadership or acting on what is important. Grab headlines and win, just like the “fact”-based studies, or do what is right and be ignored.
In this light, our society resembles a closed circle: each of us does what is popular, so that we may become popular, based on what has been popular in the past. Surface-level alterations, such as the hipster specialty of adding tubas to indie-rock bands and proclaiming it “a new sound,” are in fact affirmations of sameness in the way the exception proves the rule: if the only differentiation possible is aesthetic, then no other idea is possible, which affirms the predominance of the reigning idea. This closed circle means that we as a society are like a dog chasing its own tail, entirely self-referential and oblivious to the larger reality around us. “Fact”-based argument, and argument from detail-obsessed specialized domains of knowledge, are methodologies which endorse and promote this outlook. The end result is that reality is ignored and supplanted by social reality, the collective consensual hallucination formed of what people desire, judge or feel — in other words, what they wish were true instead of what they deduce or induce to be true. This is the end result of all crowd selection algorithms, whether democracy, consumerism or simple social popularity, and constitutes a revelation of the implicit goal of those methods, which is to obscure difficult truths by redirecting our focus elsewhere.
All of this leads to the point of the essay you are now (still?) reading: universalism creates subjectivity. Our theory is that in order to find objective truths, we must create an objective truth which is shared among people. However, by doing so, we grant a weight to that objectivity which guarantees it will be manipulated, and because people have different levels of the power of discernment — this is distinct from subjectivity; it suggests that we have different degrees of the same abilities, not different abilities which produce different truths — they will then use the same objective symbols and tokens but mean different things, gradually poisoning the objective truth by redefining its tokens. A better approach is to reject the objectivity/subjectivity dichotomy and instead take an esoteric approach, which may be summarized as “the truth reveals itself to those who are ready, in varying degrees according to readiness.” With esotericism, we expect no objective truth to be universal, and correspondingly guard against poisoning by cherry-picked facts (in “scientific” “studies”) and biased language controlling specialized domains of thought alike.
The media inflates again with seemingly endless bloviation about the Iraq war. Again this exists to conceal some difficult truths behind the scenes. Let us investigate the Iraq war (II) and see what may be found.
At first the questions seem simple and coincide with those asked by the media: was this a just war? Was it a successful war? And did it simply damage American prestige worldwide?
Each of these questions aims to hide a more complex truth.
First, was this a just war? Meaning: did the bad guys deserve punishment, and are we in a moral position to do it? Glossing over for a moment the specious and irrelevant question of whether there is morality to war, as it is generally a thought debated by those far from the battlefields who have no intention of participating, let us look to Saddam Hussein. There was much to admire about him as he unified Iraq at least temporarily.
However, the dark side of Saddam Hussein was that he was also a dominant power in the Arab world, and he had supported terrorist attacks against Israel, including launching Scud missiles from long distances. That in turn prompted the question of whether Iraq had “WMD,” a term that in its honest use means NBC (nuclear, biological and chemical) weapons. Lobbing 150kg of Semtex into downtown Haifa is bad enough, but hitting it with the same amount of VX gas could achieve a measurable percentage of genocide and precipitate a collapse of Israel. Speaking as realists, we must recognize that the West would not simply sit on its hands if that happened, so a rather extensive conflict would result. Further, Iraq was the weapons clearinghouse of the Middle East, and many of those weapons ended up in the hands of Palestinian terror groups. For these reasons, the war on Hussein was “justified,” if such a thing must be done.
In addition, the war against Hussein represented a clear response to the Middle East as a whole: if terrorist attacks happen to the United States, we will show up where you are and destroy enough that you will regret your support. Then you will be engaged in fighting us in your own backyard and will not have the time or resources to follow up on terror attacks. This also displaced the terrorists from Iraq, a populous area, to the more sparsely-populated regions of Afghanistan and Pakistan, where they could be droned from armchair comfort.
The question of WMD was never resolved. Iraq had developed WMD in the past and might have been doing so currently. What changed this question was Hussein’s ability to launch Scuds and the rising power of Islamic “extremism” in the wake of the 9/11 attacks. The Iraq war took all of that down a peg. Oddly, like the Viet Nam war, the Iraq war was massively successful in that it dissuaded others from following the path of mideast terror. Resistance happens in increments and is emboldened by a lack of strong response, because that means one can be a revolutionary and also face a likelihood of zero penalty. When the penalty rises, support falls. In the same way that the Viet Nam war stopped Chinese Communist expansion, the Iraq war stopped Muslim extremist expansion. This re-asserted a geopolitical balance in which the nations intolerant of this resistance movement held the upper hand, not only by being unwilling to tolerate it, but by picking a semi-arbitrary nation to sacrifice for having supported it in the past. Message delivered: support this and you may be next.
Then there comes the question of whether Iraq was a successful war and, with it, the question of whether America lost or won. The factors that determined these outcomes were beyond any single president. It makes sense to divide the Iraq action into two parts, the “war” and the “occupation.” The first resembles what Bush I did, which was to smash the opposing army and level huge parts of its industrial capacity, cutting it back if not to the stone age then at least to less harmful levels. As in the first Iraq war, the second one — the war part at least — was successful. After that came the occupation, where almost all of the casualties and expense occurred. Occupations, as happened also in Viet Nam, are generally costly because all of the strategic advantage goes to the guerrillas, much as it did during the American revolution. As luck would have it, occupations are also virtually demanded by any democracy fighting a war because it cannot support the un-democratic alternative, which is relocation or destruction of the subjugated population. You can avoid a guerrilla war, but it takes extreme means.
Saddam Hussein lobbed missiles at Israel, experimented with WMDs, gassed his own people and supported terrorists. Together these showed a pattern that revealed a strong nose-thumbing at Western authority in the middle east, which is essential since the middle east is Europe’s southern flank. This was in itself not a problem, but with 9/11 the Islamic terrorists got too arrogant and strong for our interests, so they had to be spanked down. In Iraq, this consisted of a successful war to depose a tyrant, and a less-successful occupation demanded by our “democratic feelings.” In Afghanistan, it consisted of a prolonged guerrilla war which drove its targets to remote areas where we identify them and drone them to this day. Much like Viet Nam, this war pushed back against a challenger to our authority, and radically reduced the support for terrorist organizations in the middle east.
Television chooses its own audience: the witless prospers and the wise disappears, and the case of The Assets is no different. Released in 2014, the show has disappeared from the site of its producer and most movie review sites, yet remains one of the most insightful and compelling narratives unleashed onto the small screen. Portraying realistic spycraft and situations, this show focuses on CIA agent Aldrich Hazen Ames and his decision to become a double agent for the Soviet Union in 1985, passing CIA secrets to his controllers in Moscow.
Filmed in eight episodes of approximately 45 minutes each, the show efficiently tells the complex story of Ames, a CIA case agent with drinking problems and a failed first marriage. When in Mexico, he meets a Colombian woman, Rosario Casas Dupuy, and brings her back to the United States where they get married. Spurred on by the agency’s failure to recognize his self-alleged brilliance and Rosario’s compulsive spending, Ames meets with KGB agents and begins to sell them CIA secrets. When they demand higher quality information in exchange for the type of money he desires, he delivers to them the complete files on every Russian working for the CIA from within the Soviet bureaucracy. Such people were called assets, and most of them were summarily executed. As this process shocks the CIA, a case officer named Sandy Grimes assembles a team to locate the source of the leaks by determining whether it was a communications failure, sloppy tradecraft or a human intelligence failing that allowed the most massive leak in CIA history to occur. It takes her and the team another nine years, interrupted by bureaucratic bungles, to gather enough evidence to first ascertain that Ames is the leak and second to enable his prosecution.
The Assets suffers for being a brainy show with a brainy topic that will not be appreciated by most Americans, who currently want to deny that the Cold War existed because we are both heading toward an ideological rigor like that of the Soviets and suffering the same problems that reduced that empire to rubble. In addition, people do not like television that grapples with moral ambiguity and points to life as a struggle greater than the personal achievements that glorify the individual. These officers sacrifice much of their lives in the belief that they are doing something good, as do some of the assets, especially Dimitri Polyakov, a highly-placed asset who loathed his government and resisted it by giving information to the West without asking for money in return. No car chases or glamorous overseas work intrude on what is a basic narrative of hunter and quarry, but The Assets raises questions of allegiance and morality that resonate throughout all eight episodes. Its portrayal of tradecraft looks accurate and emphasizes the long hours and evasive tactics of spies, and it pulls no punches and refuses to re-write history when it comes to Soviet treatment of those they capture, including ad hoc executions. As a result, this is both a grimly real and highly emotional portrayal of an intricate and deadly game, balancing scenes of intense and compactly-written dialogue with atmospheric intrusions into the lives and personalities of the people involved. In the process, it tells the story of several great friendships, a clash between empires, and the struggle of individuals to do what they see as right despite overwhelming odds.
Some may complain that this series reveals the Soviets to be brutal and calculating, but it also portrays some of their greatest moments in the strategic calculus of espionage. In addition, while individual Americans are shown as highly principled and thoughtful people, and the CIA is in general cast in the best light, the bureaucracy and complacency of the West also take center stage and show how grindingly slowly this investigation went — with several interruptions — as a result of bureaucracy and public-image wrangling that ultimately served no one but the enemy. If any theme can be assigned to this series, it is the primacy of individual morally-inspired action against the brutality of dictatorships and the glacial timorousness of bureaucracies alike. The cinematography takes a relatively straightforward approach, halfway between a documentary and a classic film, but the editing takes over by reducing shots to the shortest duration necessary. That technique creates a compelling energy in the story by giving each moment its due without becoming overly focused on any single part of the narrative. The result is a story that draws in the viewer, conveys factual detail and procedure very well, then explodes to an emotional conclusion as all the pieces fall into place and the story arc completes itself. It is a shame this series did not get more attention, as it uncovers one of the more interesting stories from a vital period in recent history.
Above you can read the short humorous pamphlet Euphemism: the Language of Evasion, by George W.S. Trow (click image for full-size scan). Probably dating back to the 1970s, this humor piece mocks the tendency of the age to conceal unpleasant truths behind nonsense language.
That phenomenon, while in itself not unique, represented a new wave in post-war Western society as the old social order and social hierarchy faded away and were replaced by the egalitarian flat hierarchy. Under the old ways, every person had a role, and the limitations of that social status, as well as the likely limitations of the person serving it, were well-known. The castes did not mix and so there was no need to obscure the grim truths of life.
Enter the 1960s. Now, everyone rubs elbows. This means that where equality is not in fact true — a common occurrence — there must be a social taboo on noticing things. For example, if you are a corporate lawyer and your daughter brings home her new boyfriend who works in food service, it is not considered polite to mention it. And yet with this, politeness was inverted: the original idea of politeness was to find a non-confrontational way of discussing just about anything, but the “new” politeness simply cut out any friction by censoring the speaker.
Four Ways to Avoid Unpleasantness
by George W.S. Trow
ESCAPE the ugly consequences of Straightforward Speech
The Language of Evasion
(* reg. trademark)
Do you need Euphemism?
Read these sentences:
You’re stupid aren’t you, Mary?
Is that a pimple on your chin, Melvin?
I understand you’re a garbage man.
So many people of your age seem to be dead.
Did you spot the treacherous Straightforward Words (evocative of painful reality) in these simple sample sentences? If you didn’t, you can expect endless difficulty and embarrassment in your pathetic little life. Let’s review the FIVE MOST TREACHEROUS WORDS IN OUR MOTHER TONGUE, the words that cry out for translation into Euphemism, the language of evasion. They are (and, if you play your cards right, you need never face them again): “STUPID,” “PIMPLE,” “GARBAGEMAN” and “DEAD.” Learn Euphemism, the only language endorsed by the Department of Health, Education and Welfare (as well as three leading Midwestern universities) and we’ll tell you how to avoid these dread words, EVEN WHEN TALKING TO OR ABOUT DEAD GARBAGEMEN!*
Our booklet, “The Lore of Euphemism,” available for a nominal fee, tells the moving story of Euphemscholar Nancy Tmolin, who translated the sentence “You’re a stupid, pimply, dead garbageman,” into Euphemism in ten seconds flat.
NOW LOOK AT THE SUBTLE PROBLEMS POSED BY THIS SECOND GROUP OF SAMPLE SENTENCES:
How come you don’t have any children?
I have plenty of time, Mother, and I would like to come to see you more often, but actually I find you depressing.
I guess you’re in the hospital for good this time.
But fat people always sweat, Bertha.
We’ll teach you to defuse even these problem sentences.
You will wear the miracle Eu-pho-phone (yew-foe-foe-nn), which automatically bleeps out offensive words in the speech of others.
Send coupon today:
I’m tired of saying what I mean.
I want to escape.
Help me learn Euphemism, the language of evasion.
The language of euphemism, like the political correctness that followed it in subsequent decades, focuses on eliminating inequality in language. The terms covered by this pamphlet relate to physical inequality (fat, pimply), class and caste (garbageman), desire for exclusion (depressing) and mental ability (stupid), in addition to classic white-lie territory like death and sterility.
As we enter an age where government and media support the “useful idiots” of the SJW-herd in enforcing speech codes on the population through manufactured outrage and convenient ostracism, whereby anyone singled out by the herd is dropped by all associates lest the herd attack them in turn, it is useful to remember the root of this line of thinking: evasion of reality.
Back in the hazy days of the early 2010s, a blog named In Mala Fide grew a reputation as a nexus between anti-liberals, men’s rights activists, pickup artists, red pillers, and the growing traditional right (although it disclaimed that association). It went down in glorious flames, and the writer Matt Forney later announced that he had been Ferdinand Bardamu, a persona whose name was borrowed from the protagonist of the novels of far-right traditionalist Louis-Ferdinand Celine.
During the glory era of In Mala Fide, I wrote four pieces for the site which are now replicated here:
Nihilism, in a nutshell, argues that life is without objective purpose or value. This philosophy has always seemed to cause controversy, as society seems always to have harbored some sort of fear where Nihilism is concerned. Briefly describe what Nihilism means to you, and its relation to Parallelism.
What is Nihilism?
Nihilism is a philosophy based on the idea that reality alone is important. It rejects belief, faith, wishful thinking, ideology, morality and socialization as in any way a form of reality and/or “inherent”; these are human projections. All potential actions are choices we can make. However, nihilists are not relativists. We do not say all choices are equal, because equality is also a human projection. All choices are simply whatever their results are, because intentions exist only within the human mind and are not important.
Most people want to read into nihilism the typical kiddie-rebellion fatalism that infects the industrialized nations: “Nothing matters, so do whatever you want!” This is broken, because nihilism eschews the yes/no question of “matters,” since even having something matter at all is a choice. Nihilism also avoids the “do whatever you want” because to prescribe that is to give it a value. The only statement nihilism makes is that nothing is real except reality. Human projections are irrelevant because they are unrelated to outcomes.
Every action we undertake on earth is a choice. Do I eat the red-spotted mushroom? The utilitarians will say that if most people like eating them, you should do it; the formalists will say that if it’s socially approved, you should do it; the instrumentalists will ask if the goal of eating the mushroom is moral; the materialists of course will say that it depends on what comforts or wealth it gets you. A nihilist says to use the scientific method and look at what the whole of the results are. Will it poison you? Will it mislead others? Will it harm the forest? Will it bring about any gain of any kind? These are all choices, and must be considered in turn.
Nihilism is not a morality. Morality is what comes between humans and making choices. I can choose to commit crimes, but if morality exists, I will be reacting to the moral judgment of right/wrong instead of the consequences of my actions. This puts us back to measuring our acts by intentions, when we should instead look at what the results will be. We then have to confront those results and say, “The result of this crime is that I’m going to force this person to work another 40 hours to pay for what I took, my reward will be 10% of the purchase value, and it’s likely that more people will follow my example and commit crimes.”
That sort of measurement is emotionally heavier than saying some action is bad or good. If an action brings about good results, we can talk about those anticipated results by looking at past similar actions and pointing out the similarity. In the same way, if a proposed action is likely to bring about bad results, we need to only compare it to past events. “Last time we lit our cigarettes off the propane tank, we blew up three houses and a dog. Is that the result we want again?”
Nihilism is not negation. If there is religion in a nihilist world, it is esotericism, or the discovery of religious principles from patterns in our environment. If there is morality in a nihilist world, it is unceasing awareness of consequences. These things can exist, but they, too, are choices. However, as mentioned above, nihilism is not relativistic, so “it’s a choice” doesn’t mean “it’s accepted” as it does in pluralist moralist societies. It means instead that the burden of consequences is upon the person who makes a choice.
Nihilism is also not anarchy. Anarchy is a moral judgment that a leadership structure should not exist. A nihilist will reject the idea that a State is necessary but, recognizing that leadership is a choice, must consider the consequences of types of leadership versus no leadership. Nihilism does not choose what “ought” to be; it chooses what works. And so the first nihilist question to an anarchist would be, “Where can I find a successful anarchist community?”
Unlike ideological political systems, nihilism does not view wishful thinking — what “ought” to be, what society “should” do, or a moral jihad for equality — as useful. It examines cause->effect relationships and, by looking at the desired effects, picks the corresponding cause (action) that can be undertaken to achieve them. As a result, it is pragmatist, or non-utilitarian consequentialist. This makes it more like the paleoconservative right and less like modern post-1789 state/ideology-based systems.
As a philosophy, nihilism recognizes that rejection of all values negates itself because it is in itself a value. Instead, nihilism views all values as choices. When these values are based on aspects of reality, they are nihilistic, but the creation of values like morality is dangerous because it removes us from thinking about reality and instead has us thinking about the words, symbols and relationships that comprise those values. A nihilist would suggest that the healthiest human system is one where we look at consequences alone.
Nihilism is ultimately a philosophy of affirmation. When we clear the human projection out of our heads, we are like children again and can, instead of reacting blindly to social projections, choose what we want out of life. As a conservative nihilist, I choose what Plato found to be the apex of human existence: the good, the beautiful and the true.
Why society fears Nihilism
I no longer believe that society exists. I should say instead that it’s a moving target. Societies have a life cycle just like humans. If you take care of your society, it can last for a really long time. If you do not, it self-destructs quickly. The remnants of destroyed societies are what we call third world nations. In each of these, there was once a prosperous society led by intelligent and noble people. These people pitied others, and so made life more hygienic, safer, abundant and easier for them, which resulted in incompetents outbreeding competents and dooming the society to failure.
During the early days of a civilization, there is no need for formalization. People recognize a shared purpose and a set of values for achieving that purpose. It can be as simple as adaptation to a geographic area, but only if it includes an added dimension, which is the desire not just to survive but to thrive. Essentially, the best human value is laziness, because it causes us to want to improve our knowledge and self-organization so that we have more time to relax, ponder, create music, wage war, fall in love, etc. You know of Maslow’s pyramid of needs; in my view, civilization begins in the upper parts of this pyramid, where emotions and the need to use the mind like a weapon are found.
Unfortunately, over time, the aforementioned process of “helping others” leads to a proliferation of incapable people. These people do not mean badly, but they have a fatal flaw, which is that they are thoughtless. They will either overpopulate their geographical area or cause some other tragedy of the commons (an event in which a public resource is exploited unto destruction because its use costs each individual nothing) and, as a result, will find themselves starving, diseased or in wars they cannot win. At that point they turn on their leaders, who are usually the people who had been trying to stop the decay while getting beaten back by the crowd of people who want to believe what they wish were true, not what they can discern is true.
As a result, wishful thinking predominates up until the very end, when there is a sudden and conclusive confrontation with reality itself, and the civilization falls apart. It does not simply explode, but all the levels of civilized behavior drop precipitously until it is corrupt, dishonest, whorelike, ugly, dirty, commerce-ridden, violent, and directionless. It is usually ruled by warlords or a military junta because such disorder requires authoritarian government to keep it in line.
During this process people attempt to enforce their wishful thinking because (a) they want to stay in denial about the collapse and (b) this enables them to control others and get ahead through manipulation. As a result, they invent the myth of inherency. These words we use to describe things are not just token symbols we exchange in their view, but are the actual names of things. Our religions are not interpretations of metaphysics, but the whole truth. Government and collective approval are the only legitimate ways to make decisions. Good is a certain list of things; bad is anything that opposes it. Soon we are living in a world of “inherent” symbols that are human-created and often either arbitrary or deliberately controlling.
This is the origin of modern control. Ancient control was cooperation based on having a hierarchy; a decent authoritarian state is essentially paternalistic pragmatism, a form of consequentialism (the idea that we measure our actions by their results, not their intent) that, unlike utilitarianism, is based on reality for society as a whole and not on the approval of a majority of its members, which is a subjective, or should we say “wishful thinking,” measurement. Modern control, unlike either, is individuals controlling one another to keep any of us from upsetting the fragile balance created by a civilization dedicated to equality. In practical terms, “equality” means pluralism, or that there is no right/wrong except for what is proscribed by the dominant ideology, which we see as giving us equality and thus “freedom.” To a modern person, freedom and equality mean the same thing, namely pluralism or no social standards, which is naturally extended to diversity/multiculturalism/internationalism (these terms mean the same thing) and approval of every underdog group that does not violate social/political norms.
Nihilism shatters this control by attacking inherency. As a nihilist, you realize that everything is indeed a choice. You can choose to deny reality. You can choose to eat feces. You can choose to shoot yourself in the head. All of these are possible choices, and there are only two ways to make such choices. The first way is wishful thinking; the second way is reality-based thinking. Since we know wishful thinking varies with the quality of the individual, and it can easily be observed that most individuals make most decisions poorly (I’ll add the Southern hybrid of good-will and pity: “Bless their hearts!”), it makes zero sense to pick wishful thinking, or a subjective standard. Instead, it is logical to pick a reality-based standard. The prole has trained himself to say “but who decides?” and the answer to that is obvious: we pick the best among us. However, to a non-nihilist, that answer seems dangerous. Someone is more than equal? There are differences between people? But you can’t say that in polite conversation! You will never get laid!
This is why nihilism is controversial. It destroys control but, unlike anarchy, does not affirm the necessity of control by picking an opposite model. Instead, it tells us we have choices. We can choose a rising society, or by making a different decision, choose to have a dying one. The results of our decisions are clear because similar types of decisions have been made in the past, and we can compare cause->effect and see what effects our actions are likely to have. Most people get freaked out by that “deterministic” view of life, so they choose to believe that they can choose an effect and then assign to it any cause they want; thus they can do whatever they want and claim they “intended” to have a certain effect. Tee hee, aren’t they clever! Logicians will recognize this as affirming the consequent: from “all A are B” it does not follow that all B are A, so observing the effect B does not entitle us to infer the cause A. Mistaken cause->effect reasoning is the foundation of our declining society today.
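For readers who want the fallacy spelled out, the point can be checked mechanically. The following sketch (an illustration added for clarity, not part of the original essay) enumerates every truth assignment and finds the case that refutes the reverse inference: A implies B can hold, and B can hold, while A is still false.

```python
# Truth-table check that "A implies B" does not let us infer A from B
# (the fallacy of affirming the consequent).
def implies(p, q):
    # Material implication: false only when p is true and q is false.
    return (not p) or q

# Search all four truth assignments for a counterexample to the
# reverse inference: A->B holds and B holds, yet A is false.
counterexamples = [
    (a, b)
    for a in (True, False)
    for b in (True, False)
    if implies(a, b) and b and not a
]

print(counterexamples)  # the single case A=False, B=True
```

The lone counterexample, A false and B true, is exactly the situation of someone claiming an intended cause after observing a convenient effect.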
On a simpler level, nihilism is controversial because people prefer pleasant/easy lies to complex/difficult truths. They want to hear absolute and universal guarantees, like the talismans of an ancient religion: just slaughter a lamb to Baal, and you will get rich. Don’t worry about your decisions and whether each was the right one; get the right symbol on there, and everything will be OK. Social decision-making works this way, interestingly enough. If I say nice things to my friend and then answer with wrong information when she asks me a factual question, I don’t get blamed or seen as having failed, because the link in the friendship is the social kindness, not accuracy. People want that level of acceptance-without-challenge extended to all portions of their lives.
What is Parallelism?
Parallelism is a solution to linear thinking. Nihilism has us thinking in terms of choices; parallelism has us realizing that to make these choices, we need to compare more than one factor out of many to consider the before-state and after-state of our decision. Humans tend to project their own arbitrary choices onto situations by choosing one factor out of thousands or millions to look at when evaluating a decision.
For example, “Will this new car produce more or less carbon output than my old car?” If you look only at that one factor, you’ll go buy a Prius, but then there’s the question of what environmental damage is caused by the batteries in the Prius and the energy required to make it. There are other questions to be asked as well: am I more likely to be in a wreck, and thus send both cars to the junkyard? Will this be as reliable as a “regular” car? Is a better use of the money required to pay for its higher cost to simply purchase a few acres of forest land? Can I drive less with my existing car? These questions involve the assessment of environmental impact only.
Parallelism suggests that decisions are made according to indicators found in parallel between multiple factors. This reduces the arbitrary nature of linear decision-making. As a corresponding notion, parallelism also suggests that structures exist in parallel throughout the universe. This includes the vertical dimension of complexity and the possibility of metaphysics. “As above, so below,” would be an expression of parallelism; another way to view it is that there are no structures in the cosmos which are radically incompatible with any others.
As such, parallelism is an attack on how most people conceive of religion. The average person is either (a) a materialist, believing that there is nothing but physical matter and thus that enhancing physical comfort for people is the best goal (utilitarianism), or (b) a dualist, believing that there is some “other side” where all things are pure and clear and people will live in perfection in the order of God or gods. Parallelism suggests instead that any additional metaphysical dimension will resemble what is here, because in all aspects of reality, nature uses mirrored structures to create an architectonic or self-balancing order. The greatest is found in the least and vice-versa. It is a perfect design.
In addition, parallelism points out another structure, a mechanism resembling natural selection, which is found in nature but also in mathematics and thought. Roughly speaking, for any possible action there are many parallel impulses, and each one reflects a certain degree of maturation toward completeness of organization. The most organized tend to form a parallel harmonic level — imagine the parallels themselves as verticals, and a horizontal line being drawn where completeness of order occurs — and thrive, while others go away. Our thoughts are like this: we have many impulses in response to a stimulus, and our brain selects those which are the most complete and which do not trigger any negative feedback loops.
Parallelism also has political implications, notably that it’s nonsense to base a society on a single arbitrary idea (equality, finance) when many other things need to be considered. We need to consider happiness, and more importantly, being a rising society where we’re constantly getting better at what we do, instead of a declining one. Physical health needs to be considered as well, as does environmental impact, as does social consequence. There is no “freedom” from any of the consequences of our actions.
Further, parallelism suggests that different civilizations go through the same patterns if they use similar forms of organization. This ratifies Plato’s “civilization cycle,” by which nations are born, age and die. Every nation that undertakes the attitude and organization typical of a senescent nation will become senescent; any nation that adopts the attitude and organization typical of a new nation will be reborn. Further, parallelism suggests that the fortunes of our societies are not caused by geography, but by where in the cycle we choose to put our effort. In addition, parallelism would have us thus separate these societies so that each can evolve according to its choices.
A parallelist worldview also includes the idea that we cannot divide leadership by separating it into different subject matters. For example, financial decisions affect the same things that legal or social decisions do, but so do non-government actions like those of the media, religions, social groups, etc. It makes more sense to organize government by the things upon which we are having an effect than by the flavor (religious, economic, social, political) of the activity undergone.
As such, parallelism is an entry point to the birth stage of the cycle of civilizations, called Tradition, and is utterly incompatible with modernity. However, since parallelism is reality-based, it explains the consequences of choices rather than formulating an ideology toward their ends. For this reason, it is a useful tool for diagnosing modern stumbles and finding ways to work around them.
What are some important figures in history that have shared the same viewpoint, to some degree?
Every great leader in history has recognized these principles to some degree. Nihilism belongs to strategic realists like Niccolò Machiavelli and Kautilya, but also to clear-minded thinkers like Siddhartha and Eckhart. Parallelism has to my knowledge never been articulated as such, but was an understood (which is better than written down — it lives in the culture and, as culture shapes its population through natural selection according to Race-Culture Theory, becomes part of the genetics of that population) part of ancient cultures.
Because these viewpoints are more descriptive (analysis of cause->effect decisions) than prescriptive, or ideological and moral values imposed on a population to control it, they do not comprise an ideology per se but are methods that can be applied by anyone. Josef Stalin can be said to have been a nihilist with his pronouncement “no man, no problem”; then again, Bill Clinton also displayed nihilistic thinking when he adopted the practice of creating his current political platform by reading the polls and selecting any idea that polled highly as something he would support. However, neither of them consciously adopted a nihilistic or parallelist viewpoint.
I would imagine that artists share a good deal of these philosophies because artists are naturally outsiders, since their job is to notice what society cannot. Further, artists are naturally realists, because in order to portray life accurately, one must notice how it functions, not the type of social statements that can be made to gloss over it or make it sound appealing. Finally, art is inherently meditative; meditation is the root of all understanding, since it calms the mind and allows exploration of all factors at once. To be an artist, you must find what is hidden in plain sight and style it so that it, and any solutions it demands, are appealing, making people want to engage with it. Artists fight back against the numbness induced by social conformity of behavior, which in turn exhausts the mind of any possibilities other than obedience and reward.
Dogs, despite being nature’s kindest and most enthusiastic animals, have the baffling habit of chasing their tails. They notice the attraction and lunge for it, as if this discovery of themselves could give their lives meaning.
Reputedly, humans are more intelligent and not prone to such behaviors. After some years of experience in the world, I can no longer agree. We are the ultimate tail-chasers but, being social animals, we’ve found a way to pretend that we are not chasing our own tails if we project the image of a tail onto others.
After a few weeks in the wild one may return to society and notice as if for the first time how it is literally covered in advertising. Not just the wheat-paste posters, the giant billboards lining the roads, the advertisements on TV and radio blaring from all angles, but even the little stuff.
People repeat what they’ve seen or heard. Movies and music even feature product placement. When they’re not doing that, people advertise themselves. They brag about their kids, pitch you a business plan, or, as happens every day, describe for you their method of doing something-or-other and expect you to validate it with approval.
The point is that we are chasing our own tail. Merchandisers try to find out what “the people” want and advertise it to them, hoping “the people” will buy it. As a result, there’s no leader. The people are in theory the deciders, but they are also shaped by those who want to benefit from their decisions. And so like a dog chasing its tail, business pursues consumers who pursue business.
It’s not any different in the world of politics. The candidates try to figure out what the voters want so they can offer it to them. The voters are in turn shaped by what the candidates are offering. The entire political process chases its own tail because no one is in control, only two groups attempting to placate one another.
At a social level the same thing is happening. To be popular, you need to be where it’s at. That means wherever what is trendy and viral right now is occurring. This gives us a whole crowd of people chasing its tail, waiting for someone to do something trendy so they can all chase the trend and thus earn the esteem of the crowd itself.
With this kind of circular logic going on, it’s no wonder our society can barely make simple decisions and has fallen behind, buried under a giant pile of unresolved details. We are not making decisions, but waiting for others to validate us, while they wait for the same from us, which creates a sort of pre-emptive negotiation based on our mutual weaknesses.
This mutual-weakness negotiation might be described as: “I won’t approve of what you’re afraid of, if you don’t approve of what I’m afraid of.” We are no longer leading by what we desire in a positive sense, but by what we must avoid in order not to destabilize our self-image.
Advertising is pitched to fears. Do you have bad breath? Don’t know what to cook for the kids so the neighbors think you’re a good mom? Afraid you don’t look sharp enough for that promotion? We have solutions for your fears, except because they don’t address the underlying problems (bad hygiene, neurotic distraction, low job performance) you’ll always come back for more. We won’t mention your weakness if you don’t mention we’re a scam.
Politics cannot focus on what we agree on because there’s almost nothing we can agree on. When you need to unite a group of people, each pulling in an individualistic direction, all they can agree on is avoiding big and obvious problems. Otherwise, they return to inertia. Our agreement is based on fear, specifically fear of whatever threatens our ability to remain oblivious to everything but fear. Politicians agree not to mention the callowness of voters, if voters don’t mention that politics is manipulative by its essential nature.
Our broken social scene reveals the real culprit. If you can envision a group of monkeys sitting around a clearing in the African jungle, you can see our glorious simian roots. Each monkey watches the others. When another monkey acts, whether to pick up a stick or pick a fruit, the other monkeys assess the likelihood of his success. If he succeeds, how can I get ahead off the back of what this guy is doing? I can imitate him and start a trend, and thus become “important.” Or, I can fling dung at him and shriek, making myself seem like a protector of the tribe.
The problem with this type of thinking also explains why the monkeys stayed in the jungle while humanity moved on. When you chase your own tail, you never find a direction beyond the tribe. Your world becomes the tribe, and you become blind to physical reality outside of what others think. You also limit your thoughts to variations on what has already been thought.
Humanity broke free, for a while. We rewarded the independent thinkers and as a result, we created a growing edge within our population. Our leaders made good choices and invented realistic responses, and so we thrived. But then the other monkeys sitting around the clearing saw this and wanted their share of the action.
Because being clever is easier than thinking, they started with cleverness. First they equated leadership thinking with “new ideas” instead of “realistic ideas.” Then they started inventing new ideas. Of course, like modern art, these ideas had nothing to do with reality and weren’t even new. They were new-looking variations of the same old stuff, because that’s what succeeds, in a social sense.
Success in a social sense, however, determined who succeeded, for a time.
Most of the monkeys can’t tell the difference between a “new idea” and a good idea, so when they saw that trend of newness forming, they got behind it. When others objected, the new monkeys flung dung and called those other monkeys reactionaries. What mattered was whatever was new, exciting, and made all the other monkeys excited. The chattering reached a fever pitch, the Bastille was stormed, and from then on the new monkeys ruled.
But as it says above, “for a time.” The new ideas did not work so well, but monkey society was resting on such a huge momentum of the past, both in terms of wealth and technology, that all it had to do was keep encouraging the same stuff to happen time and again. Keep throwing money at technology, advertising to the consumers, lying to the voters, and hoping it will all work out.
For over two centuries it seemed to, if you could ignore the fratricidal wars and gnawing sense of inner emptiness and purposeless existence. That doesn’t bother everyone. The people who could be extras in Idiocracy tend to find an empty existence pleasant because that way nothing impedes their pursuit of entertainment, donuts and sex. The screeching and flinging of dung reaches a fever pitch.
Now, the monkeytime has come to an end. The problems that we blew off because they were long-term, and thus not popular, have begun to manifest themselves. They aren’t apocalyptic, but worse: they’re never going away. They will slowly grind us down until we are a nub. We have developed tunnel vision about our own prospects.
It’s funny because that’s what those reactionary monkeys warned us about. Our new ideas were just chasing our own tail. And like all circular motions, eventually they wind down and we lose inertia, and then sit becalmed while decay absorbs us.
Of all that I have read or written on the decline of the West, very little examines a basic topic that might be the ultimate elephant in the room: the decline in sanity.
We have passed so far beyond the point of “common sense” and normalcy that insanity is seen as the norm to the degree that anyone calling for actual sanity is widely viewed as themselves insane.
It’s worth having a momentary chuckle about, before you look around worried that you let the mask slip and might be recognized as a realist among sheep.
The situation resembles the inverse of those “they walk among us” shows about aliens who disguise themselves as people and manipulate us. In this case, we — the few normal remnant — walk among them, and they control almost everything. Invasion of the Body Snatchers and They Live closely portrayed this idea, but even those missed the basic problem of modern democracy: the majority are the abnormal normal and the few who are actually sane are a persecuted minority.
Part of this insanity comes from the series of wars that brought us liberalism in the West: the American Civil War, the Napoleonic Wars, the World Wars and the war against Communism. We have grown up under the searchlights and to the beat of the drum, and now it is the only metaphor we understand. The War on Poverty. The War on Drugs. The War on Inequality. They come from the same source, which is our need to ideologically polarize people who have basically nothing in common.
Much of the insanity is inherent to social reality itself. We replaced natural order with social order. This was the triumph of The Enlightenment™: people no longer had to pay attention to reality, because they were now equal, and equal means that insanity is on par with sanity. You cannot tell people that they are equal and their ideas are all valid (an equal baseline, in other words) and then start imposing social constructs like sanity, gender, race, intelligence and reality upon them. Not that. You must accept them as they are. The first place to explore this was California, which quickly became a “get freaky with your bad self” welfare state. But all the world is now California, making each individual entitled to participate in every aspect of social life no matter how insane that person is. Until they start shooting, they are presumed to be sane, unless they’re right-wingers. Then it is off to a military hospital for re-education, or at least, soon will be.
Some of the basic craziness of the modern West comes from our need to deny certain things. Like… the downfall of our civilization. Or that most people are totally frickin’ nuts and it is socially taboo to notice that fact, so we all just hum and pretend not to care while secretly withdrawing all of our opinions, ideas and personalities from public life. The atmosphere in the modern West is downright Soviet in that, since any random statement may trigger the wrath of the lynch mob, the only safe statements are those which affirm the narrative of the big liberal papers, The New York Times and The Washington Post. Repeat what you read there and you are probably OK. Anything else? You might get drummed out of your job, have your rent or house note called in, be ostracized by friends and shopkeepers alike, and watch your spouse leave because he or she also fears ostracism. Those who deny denial are political enemies of the state, and thus of the two big liberal papers, and thus are the only acceptable targets of hatred in polite conversation.
It is viewed as a personal failing to notice denial. If you observe that our society is falling apart, people smugly tell you, “Well, sorry to hear it isn’t working out for you… It’s working just fine for me.” Coincidentally, they get promoted and find themselves well-off because they have the right opinions and the smarter they are, the more valuable it is that they publicly refuse to notice the denial. This increases insanity by making it impossible to speak commonsense observations in public, making each person who notices such things feel as if they personally are the problem. When we deny truth, only untruth remains, and those who are inclined to truth get treated like mental patients even though the actual mental patients are those getting rewarded by the system.
Our society represents not just inverted natural selection, but natural selection based on the wishful thinking of the average person who, upon being told he was “equal,” pushed that conclusion to its extreme — as was foreseeable — and equated insanity with sanity. Sanity is the new taboo; insanity is the new normal. A small remnant of people who remember nature, order, logical thinking and other forms of now-taboo “noticing” or “denial of denial” huddle in their basements, recording these thoughts only where they will not be observed in any significant form and definitely not tied to their own names. If they are discovered, a van full of mental patients wearing lab coats will show up and carry the sane away, lock them in an asylum and forget about them, sacrifices on the altar of our necessary denial.