One thing I’ve learned about life and society in general is that you need to be ready to change, and to admit error and move on, without becoming neurotic about it. This took some time, as I grew up in a very neurotic time and area of America, where people would wring their hands about small things and bypass the large issues because they didn’t touch our small, tedious, consumer lives. The combination of working too hard, feeling guilty about missing existential pleasures and thus turning to religion, and basic boredom created a neural stimulation festival where the slightest error or change provoked mental screaming and insidiously evasive behavior. Anything would excite a mental flaring except the issues most commonly avoided: mortality, love, meaning. It almost felt like you really could fool people by saying “he passed on” instead of “he fucking died.”
This morning, I opened up the virtual paper (still killing trees, are you?) and found an article titled “Reshuffle Points to Japan Reforms.” While normally this would not strike any of our registers, perhaps the intense moralizing before a war that has afflicted America has affected me, because I find myself looking through narrowed eyes at the language of the press. It reminds me of the suburban neurosis I describe above crossed with a high school theatre department’s sense of Drama, creating a self-important demand to categorize all things with a spin. It’s sort of like religion itself. He didn’t die, but it was “his time to go to Heaven.” Oh, so it wasn’t the interactions of a natural system, but it was deliberate in some form. Should it feel better then? In theatre departments, tights don’t simply tear; they become recalcitrant, they fight back at the actor or actress in question, and eventually triumph as paragons of the resistance and futility of life. Same way with the news: an article whose content is basically “Japan moves to plan B for failing banking system” becomes a crusade toward “reforms.”
When people were less inclined to turn bullshit into cake frosting, “reform” meant a place you went after drugs, alcohol and fast living had wrecked your prospects. Reform school. Parents would tell their children that if they screwed up once more in a big way, they were heading off to the large impenetrable brick building just outside of town, surrounded by a tall iron fence. “If I find cocaine in with your coloring books one more time, I’ll send you to reform school!” has become antiquated, because society has eroded to the point where this is commonplace, but the linguistic implication remains. Reform is fixing what was “bad.” It can’t be an error; it has to be an honest-to-goodness “bad” intent.
The article goes on:
“As a result, Koizumi has been under heavy domestic pressure to change his economic ministers, who have been criticized for not instituting more […] Yanagisawa had been criticized for being too timid with the country’s depressed banking industry, while Takenaka is a proponent of bold […]”
The implication here is that “radical reform” is the absolutely necessary course of action. It’s like an executive who picks up the phone and barks “Fire him!” whenever a subordinate has a bad day. Or Apple CEO Steve Jobs, long known for berating his employees into resigning only days after praising their significant efforts, and his famous “hero-shithead roller coaster” style of management by fear. What we have in Japan is similar, a panic reaction disguised as something positive. Like most of the rest of the lies in this society, this one rolls easily off the tongue and confuses us. What once was is bad, so we need some vast and gigantic change (as if banking itself has changed radically). We need progressive, liberal, radical changes. As a knee-jerk reaction.
In other knee jerk reaction news, Mark Zach has fucking died. Described as “distraught” by the news media, he was the police officer who last week pulled over one of the suspects in this week’s grimy and bloody failed bank robbery in Norfolk, NE. Apparently when entering a serial number from a gun found in the car into his computer, Zach transposed two digits and therefore didn’t realize the gun was stolen. Thus a heinous crime in theory was not prevented. Even though it’s a tenuous jump to assume that all four collaborators would have given up had one gotten busted, Zach obviously felt the pressure of many things, including a hysterical local community and a sudden zooming in of the news media who are slowly dramatizing every commercial-second-selling detail of the robbery. With all of this bearing down on him, Zach succumbed to the disease of America, and in a moment of assumption saw himself as both powerless and responsible, therefore fatally neurotic, something he put into action like a good, diligent worker.
Events like this make me think that the true enemy of humanity is the television screen, and the media culture it has endowed. After all, you can’t have a boring newscast. Jazz it up a bit. Add the human element. Put some “spin” on it. It wasn’t another boring day – someone has a cat who can spread its feces in the shape of the Virgin Mary, or is instituting progressive “reforms” to some system that happened to fall upon hard times. Get more smiling faces of “the common man” onscreen and show us how important the tedious, neurotic lives of the small people are.
Because that’s what sells, and the art of the parasitic seller needs no reform to be profitable.
American politics will perhaps never achieve the degree of honesty our founding fathers envisioned, because their vision wisely did not embrace democracy as we know it. Their vision of democracy was limited to those who make the daily leadership decisions that keep society going: those who owned the farms and businesses, and ran the churches and military.
In the last decade, we’ve seen The People swindle themselves by believing that the categories of good and evil communicate vital information. Someone labels Iraq evil, and they go to war; someone complains about the war, and they label the president who called the war good as evil.
In theory, having your peers get together and decide “as one big happy family” what should be done is an emotionally fulfilling plan. If we all make the decision, we’ll get it right, and there won’t be much dissent – everyone will be “represented.” However, the truth is that since the dawn of language, it’s been trivial for leaders to manipulate the average person.
Think about this. Joe American isn’t a “bad” person or a “good” person, but an average person. He works forty hours a week, fifty with commuting time included. He trusts his newspapers and teevee. With family, shopping, home repair, children and entertainment to look after, Joe’s a pretty distracted guy.
He’s not going to think critically about everything he reads, but will make a conclusion based on how it hits him at the moment — and of the Joes of the world, only one in ten can even hope to make such decisions, if given education and time. This is where Joe is susceptible.
Like a good hacker, a government knows the first step in any attack is to set up fake data that can later be referenced in order to compromise the system. Think of it as a mental trojan horse, or a virus: they effectively upload a binary logic into his brain.
First it’s “good” and “evil.”
Next comes the ol’ Cold War canard, about how America is “free” and “democratic” and therefore better than any other system on earth, and that this is why they all hate us.
Joe “Voter” American thus feels empowered when the decision is presented to him in a simplistic, binary way. “Well, who wouldn’t want to get rid of these horrible people?” he thinks. “I’m with the good guys! Let’s go get the bad guys!” he says – thus social control is instantly implanted and effected at the same time.
The New World Order will involve crushing any dissidents before we’re informed, because of the decisions “we” make now. The NWO gets its power because it can fool we, the people, by using this good/evil logic.
The discerning reader can now see why I, and many more thinkers and writers of late, have begun to speak up about democracy in the media age, referring to it by an older appellation: tyranny.
When we strip away the pretense, and the social justification, and the corporate-governmental-theocratic manipulation, democracy means “the mob rules” and does the bidding of its masters, who are pursuing only profit. We don’t need to blame corporations or government; we need to blame ourselves, or rather our lack of control in that we allow an undifferentiated Crowd to make decisions for us.
The only thing “wrong” with this is what doesn’t get done as a result. No collective action is taken to make civilization more intelligent, to protect and nurture the environment, or to plan for a sustainable long-term future.
The dead white males that are currently fashionable to disregard had a solution for this. When they spoke of democracy, they didn’t mean that everyone could vote. They meant that only men of a certain age who were powerful enough to be familiar with the workings of power should vote. These were men who had been required to survive and endure in a business environment, and who had seen the root process behind it. They are not as easily fooled as someone too young, someone who has not forged their way in society yet, or someone with a built-in compassion overabundance as a result of biological programming for child care. They were hardened and could make difficult decisions independent of “bad” and “good” programming.
So today, when they trot out a symbol – a “successful” black female enfranchised empowered voter – to tell you that it’s time to kill the brown man in Iraq or a white man at Ruby Ridge, think: It’s not.
Consider what you support carefully. And consider turning off your teevee and shutting down the instant popularity contest we call “democracy.” Before it’s impossible to make that change.
Postmodern humor: I think that’s what to call what is popular in the media today. Basically, it’s a contexting game. You mention enough pop culture things and then twist their context to be an inversion of their basic function, and it’s “funny.” It goes something like this:
McDonald’s Restaurants today announced that it would begin eliminating bodily waste as part of its new program to enhance the image of its bathrooms. Microsoft Chairman Bill Gates had no immediate comment, but Steve Ballmer informed us officiously, “They could have used MSDRAW.EXE.” The restaurant chain also plans to offer night courses. While evolution would be taught, creationism would be preserved as a “valid theory important to many people,” and degrees would be given in the fields of Not Hurting Feelings, Not Groping in Public and Discreet Vomiting. George W. Bush tossed the first hand grenade at Iraqi workers in the kitchen and proclaimed, “A new world order which will not tolerate evil will preserve its first strike capability to reduce threats from those who do not agree.” The former cast of Friends were in attendance and clapped loudly, with Jennifer Aniston handing out topless paparazzi photos of herself with her resume, phone number and blog address printed on each. Elsewhere former president Jimminy Carter proclaimed the value of a work relationship with Cuba’s Fidel Castro, who recently beat Venus Williams in mixed doubles tennis. Winona Ryder, arrested as part of the Buffalo, NY cell of al-Qaeda, proclaimed her innocence with a public protest involving nudity, an electric eel, and a recording of Martin Luther King, Jr. boffing Eleanor Roosevelt while Herbert Hoover and Ariel Sharon watch wearing matching dresses. In other news, Adolf Hitler was spotted on Miami Beach trading diamonds for plutonium which he promised was “for medical use only.” Speaking of, Nevada today legalized small amounts of uranium, plutonium and strontium-90 for “personal use” by those not already identified as al-Qaeda sleeper cells.
You could easily write a script to generate this kind of weird shit, but in the meantime, this type of thinking rules the media. It’s easy to create. It reinforces the need for more media. It can be done infinitely without worrying about a loss of content impetus. Sort of like the soft drink of post-web material. I blame THE ONION in part, SOUTH PARK in part, but generally, a convergence of media itself on the instant, the distracted, and the vague. Postmodernism left the hallowed halls of academia years ago and has trickled into public life, and when you filter vapid TV + the chaotic social self-worship of the internet + the increasingly fragmentary, absolutist view of the world through American politics and culture, you get this mess. “A dog’s breakfast,” as they used to say at the old country store, before Mr. T came in with Gov. Ventura and fed the lot of them what was left of John Holmes.
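To make the point concrete, here is a minimal sketch of the kind of script imagined above: recombine pop-culture names, corporate entities, and context-inverting actions at random. All of the templates and names in the lists are invented placeholders for illustration, not drawn from any real feed or generator.

```python
import random

# Illustrative placeholder vocabularies. Swap in any pop-culture detritus;
# the inversion of context does all the "comedic" work.
SUBJECTS = ["McDonald's", "Microsoft", "the former cast of Friends", "PETA"]
ACTIONS = [
    "announced a bold new program of",
    "publicly apologized for",
    "was spotted trading",
    "filed a patent covering",
]
OBJECTS = [
    "discreet vomiting",
    "plutonium for personal use",
    "creationism as a valid theory",
    "first-strike marketing synergy",
]

def headline(rng: random.Random) -> str:
    """Glue one random subject, action, and object into a faux-news sentence."""
    return f"{rng.choice(SUBJECTS)} {rng.choice(ACTIONS)} {rng.choice(OBJECTS)}."

if __name__ == "__main__":
    rng = random.Random(42)  # seeded so runs are reproducible
    for _ in range(3):
        print(headline(rng))
```

The point of the sketch is how little it takes: three flat lists and one random choice per slot can imitate the register indefinitely, which is exactly the "no loss of content impetus" property described above.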
How will the end come? I think for humanity, as for individuals, it will be “with a whimper and not a bang,” and furthermore, it will be a predictable downslide.
Very few deaths are truly shocking, or unexpected. Cancer runs its course, old age leaves little doubt that the end is on the books and will “be here shortly,” and even war is a question of which bullet, not that a bullet enters the forebrain and liquefies it, banishing life to a memory from a separate observer. Similarly – it seems to me – a society in degeneration will not blink when the decay has run its course, but will shrug with a release of tension that could approximate relief, and say “So it goes.”
Insert photographs of AGM-109f units penetrating buildings, a soldier hurriedly applying a field dressing only to realize it’s too late and move on, another family looking into the pit and trying to ignore the cracking gunfire behind them until the last possible moment.
Will our end be by war? I think not: I think instead it will be a gradual unknotting of a society’s ability to take care of itself, resulting in a breakdown both sublime and, eventually, profound. The only conflict will be the continuing problems which we stifle to end the loss, in effect ending debate on vital issues and thus smashing portions of our necessary social and intellectual infrastructure. Symbols will replace reality. Reward will replace motivation. Control will replace autonomous agreement on values. And when those linchpins are pulled? The guts will fall out.
We are in the age of unravelling, yet we’re still in control. I suppose all we need to counter this trend is what kills any trend: enduring values and something to believe in, something to work for. All it takes is for one strong voice to speak up against the suicide. But already, it may be that none of us are motivated.
Oh, yeah. Just a brief word about vegetarianism. Controversial philosopher F.W. Nietzsche expressed a dislike for vegetarians because he saw them as being unwilling to accept life as it necessarily is, including the concepts of dominance and predation, and thus they were creating an artificial world based on morality or a “looks like it should be” motivation instead of a natural impulse. In his view, they were Christians of a secular type.
Recently PETA served some Christians with a notice that the Church’s annual pig roasts were raising not only funds but also the dander of animal lovers. I recalled Nietzsche’s comments and had to laugh a bit. How can one group of Christians contradict another? Well, because they’re both predatory, grandstanding for attention off each other’s backs. “I am most moral! None others have the same righteousness – nor right – to rule as I! My subjective universe triumphs over the outside world!”
It’ll be interesting to see if they eat each other. In the meantime, for those of us who like meat, we wish PETA would do something useful, like forming a rating scheme showing which meat producers treat their animals like animals, and which treat them with a dishonorable degree of pointless cruelty and dangerous quick-growth schemes like hormones and artificial nutrient boosting. But that probably won’t happen – they’re more concerned with their public image as the “most moral.”
And those of us who lay no claim to being moral wish we could just eat them.
I read a lot about the war in Iraq — I mean, the coming slaughter of the untechnologized natives — and it makes me grin slightly sadly to think of the humor of declaring oneself “moral” and therefore going after “the bad people.” Who could live up to that standard when dealing with more than a few people? Rephrased, the question is: who would want to make himself that yardstick? Not I – “evil” republics and “good” republics, like people, are each a mixed bag.
Of course, this has the virtual Jerry Springer Audience Tamagotchi screaming, “But why does ‘evil’ exist then?” Having been an expert on evil for some time, my answer snaps back like a punch at an ARA rally: evil is our way of characterizing what doesn’t go our way. No lollipop? Bloody hell. In America, we’ve been taking what we want and making insane profits for so long that we think ‘evil’ is anything that threatens our ability to make millions from oil futures.
Evil is just another mental justification, a symbol that’s religious because you have to believe to think it’s real. Illusion, as Buddhists might say. There are no evil republics and our inner sense of ‘morality’ is rotted to the point where it pursues only appearance. It’s easy to talk shit about GWB, but is he any different than Clinton the comical criminal? Are either of these guys doing anything but a job? Their job is to keep the wealthy industries happy.
And it makes me laugh to see the right-left split keeping you all baffled. I run into these pie-eyed leftists who assure me that inegalitarianism is the greatest evil, that a White Supremacist empire runs the United States, and that there’s a crazy group of elites who have a holy war to keep down the more talented, impoverished brown peoples of the third world. The right as known to most of America seem to me to be slightly hardboiled liberals, and the far right — despite their correct identification of religions like Christianity and Judaism as an insane part of the problem — often become tied up by their own doctrinaire approach. What’s a boy looking for truth to do?
The first thing I’m doing is dropping any kind of pro-human viewpoint I have, by which I mean anything that assumes humanity is any different from any other animal. We do what feels good, including what benefits us. Our greed is self-preservation. We aren’t good or evil, but we are in competition, so it becomes useful to characterize each other in absolutes: stupid, gay, evil, good, ally, enemy. Oh, and we respond to self-programming, so given healthy values we respond and build healthy societies.
As I’m watching the USA spin and twist to justify its Judeo-Christian war against all that isn’t capitalist, Christian and allied with Democracy in the middle east, I’m sitting back and siding with nature. It’s not a he or a she or an it, but a series of principles that control the equilibrium of this universe. Whatever those are, however, in time they will start bringing down that which claims too much moral authority, by the very nature of the power that moral authority must wield. My final metaphor is this: power is like a drug. The more you can get, the better it is, but the more intoxicated you become, the less likely you are to see the thug in the corner with your name on his bullets.
The Postmodern Condition (1979), publ. Manchester University Press, 1984. The first five chapters of the main body of the work are reproduced here.
1. The Field: Knowledge in Computerised Societies
Our working hypothesis is that the status of knowledge is altered as societies enter what is known as the postindustrial age and cultures enter what is known as the postmodern age. This transition has been under way since at least the end of the 1950s, which for Europe marks the completion of reconstruction. The pace is faster or slower depending on the country, and within countries it varies according to the sector of activity: the general situation is one of temporal disjunction which makes sketching an overview difficult. A portion of the description would necessarily be conjectural. At any rate, we know that it is unwise to put too much faith in futurology.
Rather than painting a picture that would inevitably remain incomplete, I will take as my point of departure a single feature, one that immediately defines our object of study. Scientific knowledge is a kind of discourse. And it is fair to say that for the last forty years the “leading” sciences and technologies have had to do with language: phonology and theories of linguistics, problems of communication and cybernetics, modern theories of algebra and informatics, computers and their languages, problems of translation and the search for areas of compatibility among computer languages, problems of information storage and data banks, telematics and the perfection of intelligent terminals, to paradoxology. The facts speak for themselves (and this list is not exhaustive).
These technological transformations can be expected to have a considerable impact on knowledge. Its two principal functions – research and the transmission of acquired learning – are already feeling the effect, or will in the future. With respect to the first function, genetics provides an example that is accessible to the layman: it owes its theoretical paradigm to cybernetics. Many other examples could be cited. As for the second function, it is common knowledge that the miniaturisation and commercialisation of machines is already changing the way in which learning is acquired, classified, made available, and exploited. It is reasonable to suppose that the proliferation of information-processing machines is having, and will continue to have, as much of an effect on the circulation of learning as did advancements in human circulation (transportation systems) and later, in the circulation of sounds and visual images (the media).
The nature of knowledge cannot survive unchanged within this context of general transformation. It can fit into the new channels, and become operational, only if learning is translated into quantities of information. We can predict that anything in the constituted body of knowledge that is not translatable in this way will be abandoned and that the direction of new research will be dictated by the possibility of its eventual results being translatable into computer language. The “producers” and users of knowledge must now, and will have to, possess the means of translating into these languages whatever they want to invent or learn. Research on translating machines is already well advanced. Along with the hegemony of computers comes a certain logic, and therefore a certain set of prescriptions determining which statements are accepted as “knowledge” statements.
We may thus expect a thorough exteriorisation of knowledge with respect to the “knower,” at whatever point he or she may occupy in the knowledge process. The old principle that the acquisition of knowledge is indissociable from the training (Bildung) of minds, or even of individuals, is becoming obsolete and will become ever more so. The relationship of the suppliers and users of knowledge to the knowledge they supply and use is now tending, and will increasingly tend, to assume the form already taken by the relationship of commodity producers and consumers to the commodities they produce and consume – that is, the form of value. Knowledge is and will be produced in order to be sold, it is and will be consumed in order to be valorised in a new production: in both cases, the goal is exchange.
Knowledge ceases to be an end in itself, it loses its “use-value.”
It is widely accepted that knowledge has become the principal force of production over the last few decades; this has already had a noticeable effect on the composition of the work force of the most highly developed countries and constitutes the major bottleneck for the developing countries. In the postindustrial and postmodern age, science will maintain and no doubt strengthen its preeminence in the arsenal of productive capacities of the nation-states. Indeed, this situation is one of the reasons leading to the conclusion that the gap between developed and developing countries will grow ever wider in the future.
But this aspect of the problem should not be allowed to overshadow the other, which is complementary to it. Knowledge in the form of an informational commodity indispensable to productive power is already, and will continue to be, a major – perhaps the major – stake in the worldwide competition for power. It is conceivable that the nation-states will one day fight for control of information, just as they battled in the past for control over territory, and afterwards for control of access to and exploitation of raw materials and cheap labor. A new field is opened for industrial and commercial strategies on the one hand, and political and military strategies on the other.
However, the perspective I have outlined above is not as simple as I have made it appear. For the mercantilisation of knowledge is bound to affect the privilege the nation-states have enjoyed, and still enjoy, with respect to the production and distribution of learning. The notion that learning falls within the purview of the State, as the brain or mind of society, will become more and more outdated with the increasing strength of the opposing principle, according to which society exists and progresses only if the messages circulating within it are rich in information and easy to decode. The ideology of communicational “transparency,” which goes hand in hand with the commercialisation of knowledge, will begin to perceive the State as a factor of opacity and “noise.” It is from this point of view that the problem of the relationship between economic and State powers threatens to arise with a new urgency.
Already in the last few decades, economic powers have reached the point of imperilling the stability of the state through new forms of the circulation of capital that go by the generic name of multi-national corporations. These new forms of circulation imply that investment decisions have, at least in part, passed beyond the control of the nation-states. The question threatens to become even more thorny with the development of computer technology and telematics. Suppose, for example, that a firm such as IBM is authorised to occupy a belt in the earth’s orbital field and launch communications satellites or satellites housing data banks. Who will have access to them? Who will determine which channels or data are forbidden? The State? Or will the State simply be one user among others? New legal issues will be raised, and with them the question: “who will know?”
Transformation in the nature of knowledge, then, could well have repercussions on the existing public powers, forcing them to reconsider their relations (both de jure and de facto) with the large corporations and, more generally, with civil society. The reopening of the world market, a return to vigorous economic competition, the breakdown of the hegemony of American capitalism, the decline of the socialist alternative, a probable opening of the Chinese market – these and many other factors are already, at the end of the 1970s, preparing States for a serious reappraisal of the role they have been accustomed to playing since the 1930s: that of guiding, or even directing, investments. In this light, the new technologies can only increase the urgency of such a re-examination, since they make the information used in decision making (and therefore the means of control) even more mobile and subject to piracy.
It is not hard to visualise learning circulating along the same lines as money, instead of for its “educational” value or political (administrative, diplomatic, military) importance; the pertinent distinction would no longer be between knowledge and ignorance, but rather, as is the case with money, between “payment knowledge” and “investment knowledge” – in other words, between units of knowledge exchanged in a daily maintenance framework (the reconstitution of the work force, “survival”) versus funds of knowledge dedicated to optimising the performance of a project.
If this were the case, communicational transparency would be similar to liberalism. Liberalism does not preclude an organisation of the flow of money in which some channels are used in decision making while others are only good for the payment of debts. One could similarly imagine flows of knowledge travelling along identical channels of identical nature, some of which would be reserved for the “decision makers,” while the others would be used to repay each person’s perpetual debt with respect to the social bond.
2. The Problem: Legitimation
That is the working hypothesis defining the field within which I intend to consider the question of the status of knowledge. This scenario, akin to the one that goes by the name “the computerisation of society” (although ours is advanced in an entirely different spirit), makes no claims of being original, or even true. What is required of a working hypothesis is a fine capacity for discrimination. The scenario of the computerisation of the most highly developed societies allows us to spotlight (though with the risk of excessive magnification) certain aspects of the transformation of knowledge and its effects on public power and civil institutions – effects it would be difficult to perceive from other points of view. Our hypotheses, therefore, should not be accorded predictive value in relation to reality, but strategic value in relation to the question raised.
Nevertheless, it has strong credibility, and in that sense our choice of this hypothesis is not arbitrary. It has been described extensively by the experts and is already guiding certain decisions by the governmental agencies and private firms most directly concerned, such as those managing the telecommunications industry. To some extent, then, it is already a part of observable reality. Finally, barring economic stagnation or a general recession (resulting, for example, from a continued failure to solve the world’s energy problems), there is a good chance that this scenario will come to pass: it is hard to see what other direction contemporary technology could take as an alternative to the computerisation of society.
This is as much as to say that the hypothesis is banal. But only to the extent that it fails to challenge the general paradigm of progress in science and technology, to which economic growth and the expansion of sociopolitical power seem to be natural complements. That scientific and technical knowledge is cumulative is never questioned. At most, what is debated is the form that accumulation takes – some picture it as regular, continuous, and unanimous, others as periodic, discontinuous, and conflictual.
But these truisms are fallacious. In the first place, scientific knowledge does not represent the totality of knowledge; it has always existed in addition to, and in competition and conflict with, another kind of knowledge, which I will call narrative in the interests of simplicity (its characteristics will be described later). I do not mean to say that narrative knowledge can prevail over science, but its model is related to ideas of internal equilibrium and conviviality next to which contemporary scientific knowledge cuts a poor figure, especially if it is to undergo an exteriorisation with respect to the “knower” and an alienation from its user even greater than has previously been the case. The resulting demoralisation of researchers and teachers is far from negligible; it is well known that during the 1960s, in all of the most highly developed societies, it reached such explosive dimensions among those preparing to practice these professions – the students – that there was a noticeable decrease in productivity at laboratories and universities unable to protect themselves from its contamination. Expecting this, with hope or fear, to lead to a revolution (as was then often the case) is out of the question: it will not change the order of things in postindustrial society overnight. But this doubt on the part of scientists must be taken into account as a major factor in evaluating the present and future status of scientific knowledge.
It is all the more necessary to take it into consideration since – and this is the second point – the scientists’ demoralisation has an impact on the central problem of legitimation. I use the word in a broader sense than do contemporary German theorists in their discussions of the question of authority. Take any civil law as an example: it states that a given category of citizens must perform a specific kind of action. Legitimation is the process by which a legislator is authorised to promulgate such a law as a norm. Now take the example of a scientific statement: it is subject to the rule that a statement must fulfil a given set of conditions in order to be accepted as scientific. In this case, legitimation is the process by which a “legislator” dealing with scientific discourse is authorised to prescribe the stated conditions (in general, conditions of internal consistency and experimental verification) determining whether a statement is to be included in that discourse for consideration by the scientific community.
The parallel may appear forced. But as we will see, it is not. The question of the legitimacy of science has been indissociably linked to that of the legitimation of the legislator since the time of Plato. From this point of view, the right to decide what is true is not independent of the right to decide what is just, even if the statements consigned to these two authorities differ in nature. The point is that there is a strict interlinkage between the kind of language called science and the kind called ethics and politics: they both stem from the same perspective, the same “choice” if you will – the choice called the Occident.
When we examine the current status of scientific knowledge at a time when science seems more completely subordinated to the prevailing powers than ever before and, along with the new technologies, is in danger of becoming a major stake in their conflicts – the question of double legitimation, far from receding into the background, necessarily comes to the fore. For it appears in its most complete form, that of reversion, revealing that knowledge and power are simply two sides of the same question: who decides what knowledge is, and who knows what needs to be decided? In the computer age, the question of knowledge is now more than ever a question of government.
3. The Method: Language Games
The reader will already have noticed that in analysing this problem within the framework set forth I have favoured a certain procedure: emphasising facts of language and in particular their pragmatic aspect. To help clarify what follows it would be useful to summarise, however briefly, what is meant here by the term pragmatic.
A denotative utterance such as “The university is sick,” made in the context of a conversation or an interview, positions its sender (the person who utters the statement), its addressee (the person who receives it), and its referent (what the statement deals with) in a specific way: the utterance places (and exposes) the sender in the position of “knower” (he knows what the situation is with the university), the addressee is put in the position of having to give or refuse his assent, and the referent itself is handled in a way unique to denotatives, as something that demands to be correctly identified and expressed by the statement that refers to it.
If we consider a declaration such as “The university is open,” pronounced by a dean or rector at convocation, it is clear that the previous specifications no longer apply. Of course, the meaning of the utterance has to be understood, but that is a general condition of communication and does not aid us in distinguishing the different kinds of utterances or their specific effects. The distinctive feature of this second, “performative,” utterance is that its effect upon the referent coincides with its enunciation. The university is open because it has been declared open in the above-mentioned circumstances. That this is so is not subject to discussion or verification on the part of the addressee, who is immediately placed within the new context created by the utterance. As for the sender, he must be invested with the authority to make such a statement. Actually, we could say it the other way around: the sender is dean or rector – that is, he is invested with the authority to make this kind of statement – only insofar as he can directly affect both the referent (the university) and the addressee (the university staff) in the manner I have indicated.
A different case involves utterances of the type, “Give money to the university”; these are prescriptions. They can be modulated as orders, commands, instructions, recommendations, requests, prayers, pleas, etc. Here, the sender is clearly placed in a position of authority, using the term broadly (including the authority of a sinner over a god who claims to be merciful): that is, he expects the addressee to perform the action referred to. The pragmatics of prescription entail concomitant changes in the posts of addressee and referent.
Of a different order again is the efficiency of a question, a promise, a literary description, a narration, etc. I am summarising. Wittgenstein, taking up the study of language again from scratch, focuses his attention on the effects of different modes of discourse; he calls the various types of utterances he identifies along the way (a few of which I have listed) language games. What he means by this term is that each of the various categories of utterance can be defined in terms of rules specifying their properties and the uses to which they can be put – in exactly the same way as the game of chess is defined by a set of rules determining the properties of each of the pieces, in other words, the proper way to move them.
It is useful to make the following three observations about language games. The first is that their rules do not carry within themselves their own legitimation, but are the object of a contract, explicit or not, between players (which is not to say that the players invent the rules). The second is that if there are no rules, there is no game, that even an infinitesimal modification of one rule alters the nature of the game, that a “move” or utterance that does not satisfy the rules does not belong to the game they define. The third remark is suggested by what has just been said: every utterance should be thought of as a “move” in a game.
This last observation brings us to the first principle underlying our method as a whole: to speak is to fight, in the sense of playing, and speech acts fall within the domain of a general agonistics. This does not necessarily mean that one plays in order to win. A move can be made for the sheer pleasure of its invention: what else is involved in that labor of language harassment undertaken by popular speech and by literature? Great joy is had in the endless invention of turns of phrase, of words and meanings, the process behind the evolution of language on the level of parole. But undoubtedly even this pleasure depends on a feeling of success won at the expense of an adversary – at least one adversary, and a formidable one: the accepted language, or connotation.
This idea of an agonistics of language should not make us lose sight of the second principle, which stands as a complement to it and governs our analysis: that the observable social bond is composed of language “moves.” An elucidation of this proposition will take us to the heart of the matter at hand.
4. The Nature of the Social Bond: The Modern Alternative
If we wish to discuss knowledge in the most highly developed contemporary society, we must answer the preliminary question of what methodological representation to apply to that society. Simplifying to the extreme, it is fair to say that in principle there have been, at least over the last half-century, two basic representational models for society: either society forms a functional whole, or it is divided in two. An illustration of the first model is suggested by Talcott Parsons (at least the postwar Parsons) and his school, and of the second, by the Marxist current (all of its component schools, whatever differences they may have, accept both the principle of class struggle and dialectics as a duality operating within society).
This methodological split, which defines two major kinds of discourse on society, has been handed down from the nineteenth century. The idea that society forms an organic whole, in the absence of which it ceases to be a society (and sociology ceases to have an object of study), dominated the minds of the founders of the French school. Added detail was supplied by functionalism; it took yet another turn in the 1950s with Parsons’s conception of society as a self-regulating system. The theoretical and even material model is no longer the living organism; it is provided by cybernetics, which, during and after the Second World War, expanded the model’s applications.
In Parsons’s work, the principle behind the system is still, if I may say so, optimistic: it corresponds to the stabilisation of the growth economies and societies of abundance under the aegis of a moderate welfare state. In the work of contemporary German theorists, systemtheorie is technocratic, even cynical, not to mention despairing: the harmony between the needs and hopes of individuals or groups and the functions guaranteed by the system is now only a secondary component of its functioning. The true goal of the system, the reason it programs itself like a computer, is the optimisation of the global relationship between input and output, in other words, performativity. Even when its rules are in the process of changing and innovations are occurring, even when its dysfunctions (such as strikes, crises, unemployment, or political revolutions) inspire hope and lead to belief in an alternative, even then what is actually taking place is only an internal readjustment, and its result can be no more than an increase in the system’s “viability.” The only alternative to this kind of performance improvement is entropy, or decline.
Here again, while avoiding the simplifications inherent in a sociology of social theory, it is difficult to deny at least a parallel between this “hard” technocratic version of society and the ascetic effort that was demanded (the fact that it was done in name of “advanced liberalism” is beside the point) of the most highly developed industrial societies in order to make them competitive – and thus optimise their “irrationality” – within the framework of the resumption of economic world war in the 1960s.
Even taking into account the massive displacement intervening between the thought of a man like Comte and the thought of Luhmann, we can discern a common conception of the social: society is a unified totality, a “unicity.” Parsons formulates this clearly: “The most essential condition of successful dynamic analysis is a continual and systematic reference of every problem to the state of the system as a whole … A process or set of conditions either ‘contributes’ to the maintenance (or development) of the system or it is ‘dysfunctional’ in that it detracts from the integration, effectiveness, etc., of the system.” The “technocrats” also subscribe to this idea. Whence its credibility: it has the means to become a reality, and that is all the proof it needs. This is what Horkheimer called the “paranoia” of reason.
But this realism of systemic self-regulation, and this perfectly sealed circle of facts and interpretations, can be judged paranoid only if one has, or claims to have, at one’s disposal a viewpoint that is in principle immune from their allure. This is the function of the principle of class struggle in theories of society based on the work of Marx.
“Traditional” theory is always in danger of being incorporated into the programming of the social whole as a simple tool for the optimisation of its performance; this is because its desire for a unitary and totalising truth lends itself to the unitary and totalising practice of the system’s managers. “Critical” theory, based on a principle of dualism and wary of syntheses and reconciliations, should be in a position to avoid this fate. What guides Marxism, then, is a different model of society, and a different conception of the function of the knowledge that can be produced by society and acquired from it. This model was born of the struggles accompanying the process of capitalism’s encroachment upon traditional civil societies. There is insufficient space here to chart the vicissitudes of these struggles, which fill more than a century of social, political, and ideological history. We will have to content ourselves with a glance at the balance sheet, which is possible for us to tally today now that their fate is known: in countries with liberal or advanced liberal management, the struggles and their instruments have been transformed into regulators of the system; in communist countries, the totalising model and its totalitarian effect have made a comeback in the name of Marxism itself, and the struggles in question have simply been deprived of the right to exist. Everywhere, the Critique of political economy (the subtitle of Marx’s Capital) and its correlate, the critique of alienated society, are used in one way or another as aids in programming the system.
Of course, certain minorities, such as the Frankfurt School or the group Socialisme ou barbarie, preserved and refined the critical model in opposition to this process. But the social foundation of the principle of division, or class struggle, was blurred to the point of losing all of its radicality; we cannot conceal the fact that the critical model in the end lost its theoretical standing and was reduced to the status of a “utopia” or “hope,” a token protest raised in the name of man or reason or creativity, or again of some social category such as the Third World or the students – on which is conferred in extremis the henceforth improbable function of critical subject.
The sole purpose of this schematic (or skeletal) reminder has been to specify the problematic in which I intend to frame the question of knowledge in advanced industrial societies. For it is impossible to know what the state of knowledge is – in other words, the problems its development and distribution are facing today – without knowing something of the society within which it is situated. And today more than ever, knowing about that society involves first of all choosing what approach the inquiry will take, and that necessarily means choosing how society can answer. One can decide that the principal role of knowledge is as an indispensable element in the functioning of society, and act in accordance with that decision, only if one has already decided that society is a giant machine.
Conversely, one can count on its critical function, and orient its development and distribution in that direction, only after it has been decided that society does not form an integrated whole, but remains haunted by a principle of opposition. The alternative seems clear: it is a choice between the homogeneity and the intrinsic duality of the social, between functional and critical knowledge. But the decision seems difficult, or arbitrary.
It is tempting to avoid the decision altogether by distinguishing two kinds of knowledge. One, the positivist kind, would be directly applicable to technologies bearing on men and materials, and would lend itself to operating as an indispensable productive force within the system. The other – the critical, reflexive, or hermeneutic kind – by reflecting directly or indirectly on values or aims, would resist any such “recuperation.”
5. The Nature of the Social Bond: The Postmodern Perspective
I find this partition solution unacceptable. I suggest that the alternative it attempts to resolve, but only reproduces, is no longer relevant for the societies with which we are concerned and that the solution itself is still caught within a type of oppositional thinking that is out of step with the most vital modes of postmodern knowledge. As I have already said, economic “redeployment” in the current phase of capitalism, aided by a shift in techniques and technology, goes hand in hand with a change in the function of the State: the image of society this syndrome suggests necessitates a serious revision of the alternate approaches considered. For brevity’s sake, suffice it to say that functions of regulation, and therefore of reproduction, are being and will be further withdrawn from administrators and entrusted to machines. Increasingly, the central question is becoming who will have access to the information these machines must have in storage to guarantee that the right decisions are made. Access to data is, and will continue to be, the prerogative of experts of all stripes. The ruling class is and will continue to be the class of decision makers. Even now it is no longer composed of the traditional political class, but of a composite layer of corporate leaders, high-level administrators, and the heads of the major professional, labor, political, and religious organisations.
What is new in all of this is that the old poles of attraction represented by nation-states, parties, professions, institutions, and historical traditions are losing their attraction. And it does not look as though they will be replaced, at least not on their former scale. The Trilateral Commission is not a popular pole of attraction. “Identifying” with the great names, the heroes of contemporary history, is becoming more and more difficult. Dedicating oneself to “catching up with Germany,” the life goal the French president [Giscard d’Estaing at the time this book was published in France] seems to be offering his countrymen, is not exactly exciting. But then again, it is not exactly a life goal. It depends on each individual’s industriousness. Each individual is referred to himself. And each of us knows that our self does not amount to much.
This breaking up of the grand Narratives (discussed below, sections 9 and 10) leads to what some authors analyse in terms of the dissolution of the social bond and the disintegration of social aggregates into a mass of individual atoms thrown into the absurdity of Brownian motion. Nothing of the kind is happening: this point of view, it seems to me, is haunted by the paradisaic representation of a lost “organic” society.
A self does not amount to much, but no self is an island; each exists in a fabric of relations that is now more complex and mobile than ever before. Young or old, man or woman, rich or poor, a person is always located at “nodal points” of specific communication circuits, however tiny these may be. Or better: one is always located at a post through which various kinds of messages pass. No one, not even the least privileged among us, is ever entirely powerless over the messages that traverse and position him at the post of sender, addressee, or referent. One’s mobility in relation to these language game effects (language games, of course, are what this is all about) is tolerable, at least within certain limits (and the limits are vague); it is even solicited by regulatory mechanisms, and in particular by the self-adjustments the system undertakes in order to improve its performance. It may even be said that the system can and must encourage such movement to the extent that it combats its own entropy; the novelty of an unexpected “move,” with its correlative displacement of a partner or group of partners, can supply the system with that increased performativity it forever demands and consumes.
It should now be clear from which perspective I chose language games as my general methodological approach. I am not claiming that the entirety of social relations is of this nature – that will remain an open question. But there is no need to resort to some fiction of social origins to establish that language games are the minimum relation required for society to exist: even before he is born, if only by virtue of the name he is given, the human child is already positioned as the referent in the story recounted by those around him, in relation to which he will inevitably chart his course. Or more simply still, the question of the social bond, insofar as it is a question, is itself a language game, the game of inquiry. It immediately positions the person who asks, as well as the addressee and the referent asked about: it is already the social bond.
On the other hand, in a society whose communication component is becoming more prominent day by day, both as a reality and as an issue, it is clear that language assumes a new importance. It would be superficial to reduce its significance to the traditional alternative between manipulatory speech and the unilateral transmission of messages on the one hand, and free expression and dialogue on the other.
A word on this last point. If the problem is described simply in terms of communication theory, two things are overlooked: first, messages have quite different forms and effects depending on whether they are, for example, denotatives, prescriptives, evaluatives, performatives, etc. It is clear that what is important is not simply the fact that they communicate information. Reducing them to this function is to adopt an outlook which unduly privileges the system’s own interests and point of view. A cybernetic machine does indeed run on information, but the goals programmed into it, for example, originate in prescriptive and evaluative statements it has no way to correct in the course of its functioning – for example, maximising its own performance. How can one guarantee that performance maximisation is the best goal for the social system in every case? In any case the “atoms” forming its matter are competent to handle statements such as these – and this question in particular.
Second, the trivial cybernetic version of information theory misses something of decisive importance, to which I have already called attention: the agonistic aspect of society. The atoms are placed at the crossroads of pragmatic relationships, but they are also displaced by the messages that traverse them, in perpetual motion. Each language partner, when a “move” pertaining to him is made, undergoes a “displacement,” an alteration of some kind that not only affects him in his capacity as addressee and referent, but also as sender. These moves necessarily provoke “countermoves” – and everyone knows that a countermove that is merely reactional is not a “good” move. Reactional countermoves are no more than programmed effects in the opponent’s strategy; they play into his hands and thus have no effect on the balance of power. That is why it is important to increase displacement in the games, and even to disorient it, in such a way as to make an unexpected “move” (a new statement).
What is needed if we are to understand social relations in this manner, on whatever scale we choose, is not only a theory of communication, but a theory of games which accepts agonistics as a founding principle. In this context, it is easy to see that the essential element of newness is not simply “innovation.” Support for this approach can be found in the work of a number of contemporary sociologists, in addition to linguists and philosophers of language. This “atomisation” of the social into flexible networks of language games may seem far removed from the modern reality, which is depicted, on the contrary, as afflicted with bureaucratic paralysis. The objection will be made, at least, that the weight of certain institutions imposes limits on the games, and thus restricts the inventiveness of the players in making their moves. But I think this can be taken into account without causing any particular difficulty.
In the ordinary use of discourse – for example, in a discussion between two friends – the interlocutors use any available ammunition, changing games from one utterance to the next: questions, requests, assertions, and narratives are launched pell-mell into battle. The war is not without rules, but the rules allow and encourage the greatest possible flexibility of utterance.
From this point of view, an institution differs from a conversation in that it always requires supplementary constraints for statements to be declared admissible within its bounds. The constraints function to filter discursive potentials, interrupting possible connections in the communication networks: there are things that should not be said. They also privilege certain classes of statements (sometimes only one) whose predominance characterises the discourse of the particular institution: there are things that should be said, and there are ways of saying them. Thus: orders in the army, prayer in church, denotation in the schools, narration in families, questions in philosophy, performativity in businesses. Bureaucratisation is the outer limit of this tendency.
However, this hypothesis about the institution is still too “unwieldy”: its point of departure is an overly “reifying” view of what is institutionalised. We know today that the limits the institution imposes on potential language “moves” are never established once and for all (even if they have been formally defined). Rather, the limits are themselves the stakes and provisional results of language strategies, within the institution and without. Examples: Does the university have a place for language experiments (poetics)? Can you tell stories in a cabinet meeting? Advocate a cause in the barracks? The answers are clear: yes, if the university opens creative workshops; yes, if the cabinet works with prospective scenarios; yes, if the limits of the old institution are displaced. Reciprocally, it can be said that the boundaries only stabilise when they cease to be stakes in the game.
This, I think, is the appropriate approach to contemporary institutions of knowledge.
Introduction to the Bhagavad-Gita (from the translation of the Bhagavad-Gita by Swami Prabhavananda and Christopher Isherwood), by Aldous Huxley.
More than twenty-five centuries have passed since that which has been called the Perennial Philosophy was first committed to writing; and in the course of those centuries it has found expression, now partial, now complete, now in this form, now in that, again and again. In Vedanta and Hebrew prophecy, in the Tao Teh King and the Platonic dialogues, in the Gospel according to St. John and Mahayana theology, in Plotinus and the Areopagite, among the Persian Sufis and the Christian mystics of the Middle Ages and the Renaissance–the Perennial Philosophy has spoken almost all the languages of Asia and Europe and has made use of the terminology and traditions of every one of the higher religions. But under all this confusion of tongues and myths, of local histories and particularist doctrines, there remains a Highest Common Factor, which is the Perennial Philosophy in what may be called its chemically pure state. This final purity can never, of course, be expressed by any verbal statement of the philosophy, however undogmatic that statement may be, however deliberately syncretistic. The very fact that it is set down at a certain time by a certain writer, using this or that language, automatically imposes a certain sociological and personal bias on the doctrines so formulated. It is only in the act of contemplation, when words and even personality are transcended, that the pure state of the Perennial Philosophy can actually be known. The records left by those who have known it in this way make it abundantly clear that all of them, whether Hindu, Buddhist, Hebrew, Taoist, Christian, or Mohammedan, were attempting to describe the same essentially indescribable Fact.
The original scriptures of most religions are poetical and unsystematic. Theology, which generally takes the form of a reasoned commentary on the parables and aphorisms of the scriptures, tends to make its appearance at a later stage of religious history. The Bhagavad-Gita occupies an intermediate position between scripture and theology; for it combines the poetical qualities of the first with the clear-cut methodicalness of the second. The book may be described, writes Ananda K. Coomaraswamy in his admirable Hinduism and Buddhism, “as a compendium of the whole Vedic doctrine to be found in the earlier Vedas, Brahmanas and Upanishads, and being therefore the basis of all the later developments, it can be regarded as the focus of all Indian religion”; it is also one of the clearest and most comprehensive summaries of the Perennial Philosophy ever to have been made. Hence its enduring value, not only for Indians, but for all mankind.
At the core of the Perennial Philosophy we find four fundamental doctrines.
First: the phenomenal world of matter and of individualized consciousness–the world of things and animals and men and even gods–is the manifestation of a Divine Ground within which all partial realities have their being, and apart from which they would be non-existent.
Second: human beings are capable not merely of knowing about the Divine Ground by inference; they can also realize its existence by a direct intuition, superior to discursive reasoning. This immediate knowledge unites the knower with that which is known.
Third: man possesses a double nature, a phenomenal ego and an eternal Self, which is the inner man, the spirit, the spark of divinity within the soul. It is possible for a man, if he so desires, to identify himself with the spirit and therefore with the Divine Ground, which is of the same or like nature with the spirit.
Fourth: man’s life on earth has only one end and purpose: to identify himself with his eternal Self and so to come to unitive knowledge of the Divine Ground.
In Hinduism the first of these four doctrines is stated in the most categorical terms. The Divine Ground is Brahman, whose creative, sustaining and transforming aspects are manifested in the Hindu trinity. A hierarchy of manifestations connects inanimate matter with man, gods, High Gods, and the undifferentiated Godhead beyond.
In Mahayana Buddhism the Divine Ground is called Mind or the Pure Light of the Void; the place of the High Gods is taken by the Dhyani-Buddhas.
Similar conceptions are perfectly compatible with Christianity and have in fact been entertained, explicitly or implicitly, by many Catholic and Protestant mystics, when formulating a philosophy to fit facts observed by super-rational intuition. Thus, for Eckhart and Ruysbroeck, there is an Abyss of Godhead underlying the Trinity, just as Brahman underlies Brahma, Vishnu and Shiva. Suso has even left a diagrammatic picture of the relations subsisting between Godhead, triune God and creatures. In this very curious and interesting drawing a chain of manifestation connects the mysterious symbol of the Divine Ground with the three Persons of the Trinity, and the Trinity in turn is connected in a descending scale with angels and human beings. These last, as the drawing vividly shows, may make one of two choices. They can either live the life of the outer man, the life of the separative selfhood; in which case they are lost (for, in the words of the Theologia Germanica, “nothing burns in hell but the self”). Or else they can identify themselves with the inner man, in which case it becomes possible for them, as Suso shows, to ascend again, through unitive knowledge, to the Trinity and even, beyond the Trinity, to the ultimate Unity of the Divine Ground.
Within the Mohammedan tradition such a rationalization of the immediate mystical experience would have been dangerously unorthodox. Nevertheless, one has the impression, while reading certain Sufi texts, that their authors did in fact conceive of al haqq, the Real, as being the Divine Ground or Unity of Allah, underlying the active and personal aspects of the Godhead.
The second doctrine of the Perennial Philosophy–that it is possible to know the Divine Ground by a direct intuition higher than discursive reasoning–is to be found in all the great religions of the world. A philosopher who is content merely to know about the ultimate Reality–theoretically and by hearsay–is compared by Buddha to a herdsman of other men’s cows. Mohammed uses an even homelier barnyard metaphor. For him the philosopher who has not realized his metaphysics is just an ass bearing a load of books. Christian, Hindu, Taoist teachers wrote no less emphatically about the absurd pretensions of mere learning and analytic reasoning. In the words of the Anglican Prayer Book, our eternal life, now and hereafter, “stands in the knowledge of God”; and this knowledge is not discursive, but “of the heart,” a super-rational intuition, direct, synthetic and timeless.
The third doctrine of the Perennial Philosophy, that which affirms the double nature of man, is fundamental in all the higher religions. The unitive knowledge of the Divine Ground has, as its necessary condition, self-abnegation and charity. Only by means of self-abnegation and charity can we clear away the evil, folly and ignorance which constitute the thing we call our personality and prevent us from becoming aware of the spark of divinity illuminating the inner man. But the spark within is akin to the Divine Ground. By identifying ourselves with the first we can come to unitive knowledge of the second. These empirical facts of the spiritual life have been variously rationalized in terms of the theologies of the various religions. The Hindus categorically affirm that thou art That–that the indwelling Atman is the same as Brahman. For orthodox Christianity there is not an identity between the spark and God. Union of the human spirit with God takes place–union so complete that the word deification is applied to it; but it is not the union of identical substances. According to Christian theology, the saint is “deified,” not because Atman is Brahman, but because God has assimilated the purified human spirit into the divine substance by an act of grace. Islamic theology seems to make a similar distinction. The Sufi, Mansur, was executed for giving to the words “union” and “deification” the literal meaning which they bear in the Hindu tradition. For our present purposes, however, the significant fact is that these words are actually used by Christians and Mohammedans to describe the empirical facts of metaphysical realization by means of direct, super-rational intuition.
In regard to man’s final end, all the higher religions are in complete agreement. The purpose of human life is the discovery of Truth, the unitive knowledge of the Godhead. The degree to which this unitive knowledge is achieved here on earth determines the degree to which it will be enjoyed in the posthumous state. Contemplation of truth is the end, action the means. In India, in China, in ancient Greece, in Christian Europe, this was regarded as the most obvious and axiomatic piece of orthodoxy. The invention of the steam engine produced a revolution, not merely in industrial techniques, but also much more significantly in philosophy. Because machines could be made progressively more and more efficient, Western man came to believe that men and societies would automatically register a corresponding moral and spiritual improvement. Attention and allegiance came to be paid, not to Eternity, but to the Utopian future. External circumstances came to be regarded as more important than states of mind about external circumstances, and the end of human life was held to be action, with contemplation as a means to that end. These false and historically aberrant and heretical doctrines are now systematically taught in our schools and repeated, day in, day out, by those anonymous writers of advertising copy who, more than any other teachers, provide European and American adults with their current philosophy of life. And so effective has been the propaganda that even professing Christians accept the heresy unquestioningly and are quite unconscious of its complete incompatibility with their own or anybody else’s religion.
These four doctrines constitute the Perennial Philosophy in its minimal and basic form. A man who can practice what the Indians call Jnana yoga (the metaphysical discipline of discrimination between the real and the apparent) asks for nothing more. This simple working hypothesis is enough for his purposes. But such discrimination is exceedingly difficult and can hardly be practiced, at any rate in the preliminary stages of the spiritual life, except by persons endowed with a particular kind of mental constitution. That is why most statements of the Perennial Philosophy have included another doctrine, affirming the existence of one or more human Incarnations of the Divine Ground, by whose mediation and grace the worshipper is helped to achieve his goal–that unitive knowledge of the Godhead, which is man’s eternal life and beatitude. The Bhagavad-Gita is one such statement. Here, Krishna is an Incarnation of the Divine Ground in human form. Similarly, in Christian and Buddhist theology, Jesus and Gotama are Incarnations of divinity. But whereas in Hinduism and Buddhism more than one Incarnation of the Godhead is possible (and is regarded as having in fact taken place), for Christians there has been and can be only one.
An Incarnation of the Godhead and, to a lesser degree, any theocentric saint, sage or prophet is a human being who knows Who he is and can therefore effectively remind other human beings of what they have allowed themselves to forget: namely, that if they choose to become what potentially they already are, they too can be eternally united with the Divine Ground.
Worship of the Incarnation and contemplation of his attributes are for most men and women the best preparation for unitive knowledge of the Godhead. But whether the actual knowledge itself can be achieved by this means is another question. Many Catholic mystics have affirmed that, at a certain stage of that contemplative prayer in which, according to the most authoritative theologians, the life of Christian perfection ultimately consists, it is necessary to put aside all thought of the Incarnation as distracting from the higher knowledge of that which has been incarnated. From this fact have arisen misunderstandings in plenty and a number of intellectual difficulties. Here, for example, is what Abbot John Chapman writes in one of his admirable Spiritual Letters: “The problem of reconciling (not merely uniting) mysticism with Christianity is more difficult. The Abbot (Abbot Marmion) says that St. John of the Cross is like a sponge full of Christianity. You can squeeze it all out, and the full mystical theory remains. Consequently, for fifteen years or so, I hated St. John of the Cross and called him a Buddhist. I loved St. Teresa, and read her over and over again. She is first a Christian, only secondarily a mystic. Then I found that I had wasted fifteen years, so far as prayer was concerned.” And yet, he concludes, in spite of its “Buddhistic” character, the practice of mysticism (or, to put it in other terms, the realization of the Perennial Philosophy) makes good Christians. He might have added that it also makes good Hindus, good Buddhists, good Taoists, good Moslems and good Jews.
The solution to Abbot Chapman’s problem must be sought in the domain, not of philosophy, but of psychology. Human beings are not born identical. There are many different temperaments and constitutions; and within each psycho-physical class one can find people at very different stages of spiritual development. Forms of worship and spiritual discipline which may be valuable for one individual may be useless or even positively harmful for another belonging to a different class and standing, within that class, at a lower or higher level of development. All this is clearly set forth in the Gita, where the psychological facts are linked up with general cosmology by means of the postulate of the gunas. Krishna, who is here the mouth-piece of Hinduism in all its manifestations, finds it perfectly natural that different men should have different methods and even apparently different objects of worship. All roads lead to Rome–provided, of course, that it is Rome and not some other city which the traveler really wishes to reach. A similar attitude of charitable inclusiveness, somewhat surprising in a Moslem, is beautifully expressed in the parable of Moses and the Shepherd, told by Jalauddin Rumi in the second book of the Masnavi. And within the more exclusive Christian tradition these problems of temperament and degree of development have been searchingly discussed in their relation to the way of Mary and the way of Martha in general, and in particular to the vocation and private devotion of individuals.
We now have to consider the ethical corollaries of the Perennial Philosophy. “Truth,” says St. Thomas Aquinas, “is the last end for the entire universe, and the contemplation of truth is the chief occupation of wisdom.” The moral virtues, he says in another place, belong to contemplation, not indeed essentially, but as a necessary predisposition. Virtue, in other words, is not the end, but the indispensable means to the knowledge of the divine reality. Shankara, the greatest of the Indian commentators on the Gita, holds the same doctrine. Right action is the way to knowledge; for it purifies the mind, and it is only to a mind purified from egotism that the intuition of the Divine Ground can come.
Self-abnegation, according to the Gita, can be achieved by the practice of two all-inclusive virtues–love and non-attachment. The latter is the same thing as that “holy indifference,” on which St. Francois de Sales is never tired of insisting. “He who refers every action to God,” writes Camus, summarizing his master’s teaching, “and has no aims save His Glory, will find rest everywhere, even amidst the most violent commotions.” So long as we practice this holy indifference to the fruits of action, “no lawful occupation will separate us from God; on the contrary, it can be made a means of closer union.” Here the word “lawful” supplies a necessary qualification to a teaching which, without it, is incomplete and even potentially dangerous. Some actions are intrinsically evil or inexpedient; and no good intentions, no conscious offering of them to God, no renunciation of the fruits can alter their essential character. Holy indifference requires to be taught in conjunction not merely with a set of commandments prohibiting crimes, but also with a clear conception of what in Buddha’s Eightfold Path is called “right livelihood.” Thus, for the Buddhist, right livelihood was incompatible with the making of deadly weapons and of intoxicants; for the mediaeval Christian, with the taking of interest and with various monopolistic practices which have since come to be regarded as legitimate good business. John Woolman, the American Quaker, provides a most enlightening example of the way in which a man may live in the world, while practicing perfect non-attachment and remaining acutely sensitive to the claims of right livelihood. Thus, while it would have been profitable and perfectly lawful for him to sell West Indian sugar and rum to the customers who came to his shop, Woolman refrained from doing so, because these things were the products of slave labor. Similarly, when he was in England, it would have been both lawful and convenient for him to travel by stage coach.
Nevertheless, he preferred to make his journeys on foot. Why? Because the comforts of rapid travel could only be bought at the expense of great cruelty to the horses and the most atrocious working conditions for the post-boys. In Woolman’s eyes, such a system of transportation was intrinsically undesirable, and no amount of personal non-attachment could make it anything but undesirable. So he shouldered his knapsack and walked.
In the preceding pages I have tried to show that the Perennial Philosophy and its ethical corollaries constitute a Highest Common Factor, present in all the major religions of the world. To affirm this truth has never been more imperatively necessary than at the present time. There will never be enduring peace unless and until human beings come to accept a philosophy of life more adequate to the cosmic and psychological facts than the insane idolatries of nationalism and the advertising man’s apocalyptic faith in Progress towards a mechanized New Jerusalem. All the elements of this philosophy are present, as we have seen, in the traditional religions. But in existing circumstances there is not the slightest chance that any of the traditional religions will obtain universal acceptance. Europeans and Americans will see no reason for being converted to Hinduism, say, or Buddhism. And the people of Asia can hardly be expected to renounce their own traditions for the Christianity professed, often sincerely, by the imperialists who, for four hundred years and more, have been systematically attacking, exploiting, and oppressing, and are now trying to finish off the work of destruction by “educating” them. But happily there is the Highest Common Factor of all religions, the Perennial Philosophy which has always and everywhere been the metaphysical system of prophets, saints and sages. It is perfectly possible for people to remain good Christians, Hindus, Buddhists, or Moslems and yet to be united in full agreement on the basic doctrines of the Perennial Philosophy.
The Bhagavad-Gita is perhaps the most systematic scriptural statement of the Perennial Philosophy. To a world at war, a world that, because it lacks the intellectual and spiritual prerequisites to peace, can only hope to patch up some kind of precarious armed truce, it stands pointing, clearly and unmistakably, to the only road of escape from the self-imposed necessity of self-destruction. For this reason we should be grateful to Swami Prabhavananda and Mr. Isherwood for having given us this new version of the book–a version which can be read, not merely without that dull aesthetic pain inflicted by all too many English translations from the Sanskrit, but positively with enjoyment.
Translated by Harri Heinonen and Michael Moynihan
Introduction by Michael Moynihan
Is Pentti Linkola posing the most dangerous thoughts mankind has ever considered? Or is he this planet’s only remaining voice of sanity? Living an ascetic existence as a fisherman in a remote rural region of his frigid homeland, the Finnish philosopher has pondered mankind’s position vis-à-vis the earth it inhabits and dares to utter the unspeakable. In order for the planet to continue living, man—or Homo destructivus, as Linkola names him—must be violently thinned to a mere fraction of his current global population. Linkola’s metaphor for the predicament is as follows:
What to do, when a ship carrying a hundred passengers suddenly capsizes and only one lifeboat, with room for only ten people, has been launched? When the lifeboat is full, those who hate life will try to load it with more people and sink the lot. Those who love and respect life will take the ship’s axe and sever the extra hands that cling to the sides of the boat.
As time creaks onward, Linkola’s predictions and indictments grow more dire. He has come to realise that extreme situations demand extreme solutions: “We still have a chance to be cruel. But if we are not cruel today, all is lost.” The sworn enemy of Christians and Humanists both, Linkola knows that the fate of the earth will never be rescued by those who exalt “tenderness, love and dandelion garlands.” Neither the developed nor under-developed populations of the planet deserve to survive at the expense of the biosphere as a whole. Linkola has urged that millions will starve to death or be promptly slaughtered in genocidal civil wars. Mandatory abortions should be carried out for any female who has more than two offspring. The only countries capable of initiating such draconian measures are those of the West, yet ironically they are the ones most hamstrung by debilitating notions of liberal humanism. As Linkola explains, “The United States symbolizes the worst ideologies in the world: growth and freedom.” The realistic solution will be found in the implementation of an eco-fascist regime where brutal battalions of “green police,” having freed their consciences from the “syrup ethics,” are capable of doing whatever is necessary.
In Finland, Linkola’s books are best-sellers. The rest of the world clearly cannot stomach his brand of medicine, as was evidenced when the Wall Street Journal ran an article on Linkola in 1995. A stack of indignant hate-mail ensued from ostensibly turn-the-other-cheek Christians, loving mothers, and assorted do-gooders. One reader squawked, “Sincere advocates of depopulation should set an example for all of us and begin the depopulating with themselves.” Linkola’s reply is far more logical: “If there were a button I could press, I would sacrifice myself without hesitating if it meant millions of people would die.”
What follows is the first major text of Linkola’s to be translated into English. It is a chapter from his 1989 book Johdatus 1990-luvun ajatteluun [Introduction to the Thought of the 1990s].
* * *
What is man? “Oh, what art thou man?” the poets of the good old days used to wonder. Man may be defined in an arbitrary number of ways, but to convey his most fundamental characteristic, he could be described with two words: too much. I’m too much, you’re too much. There’s five billion of us—an absurd, astonishing number, and still increasing. . . . The earth’s biosphere could possibly support a population of five million large mammals of this size, given their food requirements and the offal they produce, in order that they might exist in their own ecological niche, living as one species among many, without discriminating against the richness of other forms of life.
What meaning is there in these masses, what use do they have? What essential new contribution is brought forth to the world by hundreds of human societies similar to one another, or by the hundreds of identical communities existing within these societies? What sense is there in the fact that every small Finnish town has the same choice of workshops and stores, a similar men’s choir and a similar municipal theater, all clogging up the earth’s surface with their foundations and asphalt slabs? Would it be any loss to the biosphere—or to humanity itself—if the area of Äänekoski no longer existed, and instead in its place was an unregulated and diverse mosaic of natural landscape, containing thousands of species and tilting slopes of gnarled, primitive trees mirrored in the shimmering surface of Kuhmojärvi lake? Or would it really be a loss if a small bundle of towns disappeared from the map—Ylivieska, Kuusamo, Lahti, Duisburg, Jefremov, Gloucester—and wilderness replaced them? How about Belgium?
What use do we have with Ylivieska? The question is not ingenious, but it’s relevant. And the only answer isn’t that, perhaps, there is no use for these places—but rather that the people in Ylivieska town have a reason: they live there. I’m not just talking about the suffocation of life due to the population explosion, or that life and the earth’s respiratory rhythm cry out for the productive, metabolic green oases they sorely need everywhere, between the areas razed by man. I also mean that humanity, by squirting and birthing all these teeming, filth-producing multitudes from out of itself, in the process also suffocates and defames its own culture—one in which individuals and communities have to spasmodically search for the “meaning of life” and create an identity for themselves through petty childish arguing.
I spent a summer once touring Poland by bicycle. It is a lovely country, one where small Catholic children, cute as buttons, almost entirely dressed in silk, turn up around every corner. I read from a travel brochure that in Poland the percentage of people who perished in the Second World War is larger than in any other country—about six million, if my memory doesn’t fail me. From another part of the brochure I calculated that since the end of the war, population growth has compensated for the loss threefold in forty years. . . . On my next trip after that, I went through the most bombed-out city in the world, Dresden. It was terrifying in its ugliness and filth, overstuffed to the point of suffocation—a smoke-filled, polluting nest where the first spontaneous impression was that another vaccination from the sky wouldn’t do any harm. Who misses all those who died in the Second World War? Who misses the twenty million executed by Stalin? Who misses Hitler’s six million Jews? Israel creaks with overcrowdedness; in Asia Minor, overpopulation creates struggles for mere square meters of dirt. The cities throughout the world were rebuilt and filled to the brim with people long ago, their churches and monuments restored so that acid rain would have something to eat through. Who misses the unused procreation potential of those killed in the Second World War? Is the world lacking another hundred million people at the moment? Is there a shortage of books, songs, movies, porcelain dogs, vases? Are one billion embodiments of motherly love and one billion sweet silver-haired grandmothers not enough?
All species have an oversized capacity for reproduction, otherwise they would become extinct in times of crisis due to variations of circumstances. In the end it’s always hunger that enforces a limit on the size of a population. A great many species have self-regulating birth control mechanisms which prevent them from constantly falling into crisis situations and suffering from hunger. In the case of man, however, such mechanisms—when found at all—are only weak and ineffective: for example, the small-scale infanticide practiced in primitive cultures. Throughout its evolutionary development, humankind has defied and outdistanced the hunger line. Man has been a conspicuously extravagant breeder, and decidedly animal-like. Mankind produces especially large litters both in cramped, distressed conditions, as well as among very prosperous segments of the population. Humans reproduce abundantly in times of peace and particularly abundantly in the aftermath of a war, owing to a peculiar decree of nature.
It may be said that man’s defensive methods are powerless against hunger controlling his population growth, but his offensive methods for pushing the hunger line out of the way of the swelling population are enormously eminent. Man is extremely expansive—fundamentally so, as a species.
In the history of mankind we witness Nature’s desperate struggle against an error of her own evolution. An old and previously efficacious method of curtailment, hunger, began to increasingly lose its effectiveness as man’s engineering abilities progressed. Man had wrenched himself loose from his niche and started to grab more and more resources, displacing other forms of life. Then Nature took stock of the situation, found out that she had lost the first round, and changed strategy. She brandished a weapon she hadn’t been able to employ when the enemy had been scattered in numbers, but one which was all the more effective now against the densely proliferating enemy troops. With the aid of microbes—or “infectious diseases” as man calls them, in the parlance of his war propaganda—Nature fought stubbornly for two thousand years against mankind and achieved many brilliant victories. But these triumphs remained localised, and more and more ineluctably took on the flavour of rear-guard actions. Nature wasn’t capable of destroying the echelon of humanity in which scientists and researchers toiled away, and in the meantime they managed to disarm Nature of her arsenal.
At this point, Nature—no longer possessed of the weapons for attaining victory, yet utterly embittered and still retaining her sense of self-esteem—decided to concede a Pyrrhic victory to man, but only in the most absolute sense of the term. During the entire war, Nature had maintained her peculiar connection to the enemy: they had both shared the same supply sources, they drank from the same springs and ate from the same fields. Regardless of the course of the war, a permanent position of constraint prevailed at this point; for just as much as the enemy had not succeeded in conquering the supply targets for himself, Nature likewise did not possess the capability to take these same targets out of the clutches of humanity. The only option left was the scorched earth policy, which Nature had already tested on a small scale during the microbe-phase of the war, and which she decided to carry through to the bitter end. Nature did not submit to defeat—she called it a draw, but at the price of self-immolation. Man wasn’t, after all, an external, autonomous enemy, but rather her very own tumour. And the fate of a tumour ordains that it must always die along with its host.
In the case of man—who sits atop the food chain, yet nevertheless ominously lacks the ability to sufficiently restrain his own population growth—it might appear that salvation would lie in the propensity for killing his fellow man. The characteristically human institution of war, with its wholesale massacre of fellow humanoids, would seem to contain a basis for desirable population control—that is, if it hadn’t been portentously thwarted, since there is no human culture where young females take part in war. Thus, even a large decrease in population as a result of war affects only males, and lasts only a very short time in a given generation. The very next generation is up to strength, and by the natural law of the “baby boom” even becomes oversized, as the females are fertilised through the resilience of just a very small number of males. In reality, the evolution of war, while erratic, has actually been even more negative: in the early stages of its development there were more wars of a type that swept away a moderate amount of civilians as well. But by a twist of man’s tragicomic fate, at the very point when the institution of war appeared capable of taking out truly significant shares of fertile females—as was intimated by the bombings of civilians in the Second World War—military technology advanced in such a way that large-scale wars, those with the ability to make substantial demographic impact, became impossible.
Originally published in Apocalypse Culture II, ed. Adam Parfrey (Venice, Cal.: Feral House, 2000).