Why voting should be reserved for those who’ve gotten control of themselves:
With some stretch of the imagination, then, it’s not unreasonable to ask a candidate whether he or she would smother a baby to death. It may seem abominable to pose such a question, but let me explain. Imagine we’re at war, and a group of people is hiding from the bad guys in a basement. The bad guys are upstairs, prowling the house for dissidents, when the baby in the basement begins to cry. Should the baby be smothered to death? If the baby is quieted, everyone else in the group lives. If the baby keeps crying, the bad guys find the group, and everyone dies, including the baby.
You may be able to understand rationally how it’s better to sacrifice the baby for the good of the group, but could you actually be the one to put your hand over its mouth? Do you want a president who is able to? We actually might not have that much choice in the matter, if some researchers are to be believed.
In 2001, a research team led by philosopher and neuroscientist Joshua Greene released a paper detailing their use of functional MRI to scan the brains of people wrestling with a moral dilemma.
The subjects in the study were presented with a scenario that involved killing a person with his or her own hands in order to save a large group of people, such as the crying-baby scenario above. … Several areas of the subjects’ brains lit up, including two parts of the frontal lobe. … This suggests that people weighed the benefit of saving the group against their emotions about killing an innocent baby.
Then the subjects were presented with a dilemma in which they didn’t have to get their hands dirty. The same person would die, but someone else would do it or a switch could be flipped to accomplish the task. In this scenario, only the reasoning part of the brain was active in scans. When people didn’t have to wrestle with their emotions about how they’d feel if they did something, they just completed a utilitarian analysis of what was best for the group.
Our problem as always is that we refer to ourselves in making such decisions. How do I feel? How do I look if I do this? And, what is my gut reaction?
Disgust over an unfair or immoral social situation is hard-wired into the human body as strongly as the reaction to a foul taste, according to research published today in the journal Science.
By studying the electrical activity of a muscle in the upper lip in both physically and morally offensive situations, scientists determined that disgust is equally strong in both cases.
“People use the term disgust in terms of morally offensive situations,” said Adam Anderson, a professor of neuroscience at the University of Toronto and a co-author on the study. “Our study looked at whether this reaction was genuine disgust or just a metaphor.”
Our animal reactions override our thinking, in many cases. For this reason, we’re better with “someone should” than “I will act to,” especially since the latter involves risk to ourselves.
While Cochran and Harpending don’t have much respect for Gould, their book serves to complement Jared Diamond’s much-touted 1997 bestseller Guns, Germs, and Steel, showing you what Diamond left out in his successful bid for political correctness.
So, what happened 10,000 years ago?
Farming changed everything. Planting crops and raising livestock allowed the human population to grow enormously.
A hundred-fold growth in world population from its pre-agriculture size to the 60 million alive during the Bronze Age 3,000 years ago meant a similar hundred-fold increase in the supply of new genetic mutations arising each generation: the mutation rate per genome didn’t change, but a hundred times more people means a hundred times more new mutations for selection to act on.
Moreover, agriculture dramatically changed the environment that selects which mutations turn out to be favorable. To flourish, farmers have to be harder-working than hunter-gatherers, more orderly in densely crowded locations, less susceptible to alcoholism, and more foresighted (farmers can’t eat the seed corn).
Different cultures bring about different genes.
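That hundred-fold claim is simple linear scaling, and it can be sketched in a few lines. This is a rough back-of-the-envelope: the pre-agriculture population figure and the per-birth mutation count below are illustrative assumptions, not numbers from the book; only the 60 million Bronze Age figure comes from the text.

```python
# Back-of-the-envelope: the supply of new mutations scales with population size.
# The per-genome mutation rate stays constant; only the head count grows.

MUTATIONS_PER_BIRTH = 100          # assumed new mutations per genome per birth (order of magnitude)

pre_agriculture_pop = 600_000      # assumed pre-farming world population (illustrative)
bronze_age_pop = 60_000_000        # ~60 million, per the text

def new_mutations_per_generation(population: int) -> int:
    """Total new mutations entering the gene pool each generation."""
    return population * MUTATIONS_PER_BIRTH

before = new_mutations_per_generation(pre_agriculture_pop)
after = new_mutations_per_generation(bronze_age_pop)

print(f"Pre-agriculture: {before:,} new mutations per generation")
print(f"Bronze Age:      {after:,} new mutations per generation")
print(f"Increase:        {after // before}x")
```

Whatever the exact per-birth figure, the ratio is what matters: a hundred-fold larger population supplies a hundred-fold more raw material for selection.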
Jonathan Haidt noted this debate is just warming up:
The most offensive idea in all of science for the last 40 years is the possibility that behavioral differences between racial and ethnic groups have some genetic basis. Knowing nothing but the long-term offensiveness of this idea, a betting person would have to predict that as we decode the genomes of people around the world, we’re going to find deeper differences than most scientists now expect. Expectations, after all, are not based purely on current evidence; they are biased, even if only slightly, by the gut feelings of the researchers, and those gut feelings include disgust toward racism.
A wall has long protected respectable evolutionary inquiry from accusations of aiding and abetting racism. That wall is the belief that genetic change happens at such a glacial pace that there simply was not time, in the 50,000 years since humans spread out from Africa, for selection pressures to have altered the genome in anything but the most trivial way (e.g., changes in skin color and nose shape were adaptive responses to cold climates). Evolutionary psychology has therefore focused on the Pleistocene era – the period from about 1.8 million years ago to the dawn of agriculture — during which our common humanity was forged for the hunter-gatherer lifestyle.
But the writing is on the wall. Russian scientists showed in the 1990s that a strong selection pressure (picking out and breeding only the tamest fox pups in each generation) created what was — in behavior as well as body — essentially a new species in just 30 generations. That would correspond to about 750 years for humans. Humans may never have experienced such a strong selection pressure for such a long period, but they surely experienced many weaker selection pressures that lasted far longer, and for which some heritable personality traits were more adaptive than others. It stands to reason that local populations (not continent-wide “races”) adapted to local circumstances by a process known as “co-evolution” in which genes and cultural elements change over time and mutually influence each other. The best documented example of this process is the co-evolution of genetic mutations that maintain the ability to fully digest lactose in adulthood with the cultural innovation of keeping cattle and drinking their milk. This process has happened several times in the last 10,000 years, not to whole “races” but to tribes or larger groups that domesticated cattle.
Recent “sweeps” of the genome across human populations show that hundreds of genes have been changing during the last 5-10 millennia in response to local selection pressures. (See papers by Benjamin Voight, Scott Williamson, and Bruce Lahn). No new mental modules can be created from scratch in a few millennia, but slight tweaks to existing mechanisms can happen quickly, and small genetic changes can have big behavioral effects, as with those Russian foxes. We must therefore begin looking beyond the Pleistocene and turn our attention to the Holocene era as well – the last 10,000 years. This was the period after the spread of agriculture during which the pace of genetic change sped up in response to the enormous increase in the variety of ways that humans earned their living, formed larger coalitions, fought wars, and competed for resources and mates.
The protective “wall” is about to come crashing down, and all sorts of uncomfortable claims are going to pour in. Skin color has no moral significance, but traits that led to Darwinian success in one of the many new niches and occupations of Holocene life — traits such as collectivism, clannishness, aggressiveness, docility, or the ability to delay gratification — are often seen as virtues or vices. Virtues are acquired slowly, by practice within a cultural context, but the discovery that there might be ethnically-linked genetic variations in the ease with which people can acquire specific virtues is — and this is my prediction — going to be a “game changing” scientific event. (By “ethnic” I mean any group of people who believe they share common descent, actually do share common descent, and that descent involved at least 500 years of a sustained selection pressure, such as sheep herding, rice farming, exposure to malaria, or a caste-based social order, which favored some heritable behavioral predispositions and not others.)
I believe that the “Bell Curve” wars of the 1990s, over race differences in intelligence, will seem genteel and short-lived compared to the coming arguments over ethnic differences in moralized traits. I predict that this “war” will break out between 2012 and 2017.
There are reasons to hope that we’ll ultimately reach a consensus that does not aid and abet racism. I expect that dozens or hundreds of ethnic differences will be found, so that any group — like any person — can be said to have many strengths and a few weaknesses, all of which are context-dependent. Furthermore, these cross-group differences are likely to be small when compared to the enormous variation within ethnic groups and the enormous and obvious effects of cultural learning. But whatever consensus we ultimately reach, the ways in which we now think about genes, groups, evolution and ethnicity will be radically changed by the unstoppable progress of the human genome project.
And Tom Wolfe noted how there is one exception, which is our ability to con each other with language and appearance:
Evolution came to an end when the human beast developed speech! As soon as he became not Homo sapiens, “man reasoning,” but Homo loquax, “man talking”! Speech gave the human beast far more than an ingenious tool. Speech was a veritable nuclear weapon! It gave the human beast the powers of reason, complex memory, and long-term planning, eventually in the form of print and engineering plans. Speech gave him the power to enlarge his food supply at will through an artifice called farming.
No evolutionist has come up with even an interesting guess as to when speech began, but it was at least 11,000 years ago, which is to say, 9000 B.C. It seems to be the consensus . . . in the notoriously capricious field of evolutionary chronology . . . that 9000 B.C. was about when the human beast began farming, and the beast couldn’t have farmed without speech, without being able to say to his son, “Son, this here’s seeds. You best be putting ’em in the ground in rows ov’ere like I tell you if you wanna git any ears a corn this summer.”
One of Homo loquax’s first creations after he learned to talk was religion.
Shall we take a look at the actual nature of the human beast: an artificial selection, 100% man-made?
Weber was well known in academia for his essay “The Protestant Ethic and the Spirit of Capitalism,” written after he toured the United States in 1904. It was the origin of the unfortunately non-Protestant cliché, “the work ethic.” He introduced the terms “charisma” and “charismatic” in their current usage; also “bureaucracy,” which he characterized as “the routinization of charisma.” He coined the term “style of life,” which was converted into the compound noun “lifestyle” and put to work as the title of a thousand sections of newspapers across the United States. But what caught my imagination was the single word “status.”
We naturally select ourselves according to status, because people grant those of higher status more leeway. We program ourselves as to what status is by trying to work around reality, and come up with an alternate explanation of what is valuable, such as morality or religion.
Language is a powerful tool, but also a reality-denial tool, and that can affect our ongoing human evolution. Of course, no one wants to talk about it that way. We like to think we’re born static like gods, immutable and forever “in control.”
“[Our] findings show that a natural, common mutation in the GRIK4 gene protects against bipolar disorder,” said Ben Pickard, lead author of a study in this week’s issue of the Proceedings of the National Academy of Sciences and a member of the department of medical genetics at the University of Edinburgh in Scotland. “If a natural mutation can result in protection, then this may offer clues as to how future drug treatments might be directed. . .”
The GRIK4 gene provides the genetic coding for the glutamate neurotransmitter receptor known as the KA1 kainate receptor. These kainate receptors are considered “excitatory,” because they generally make neurons more prone to firing signaling messages. The glutamate transmitter has been linked to different psychiatric disorders.
The deletion seems to be responsible for generating more glutamate receptors, thereby increasing glutamate signaling. “If kainate signaling can be stimulated, then that, too, might protect against bipolar disorder,” Pickard said. “However, one problem with modulating glutamate activity like this is that too much glutamate is also harmful.”
In nature, this means that when enough creatures without this mutation die off, it becomes a standard part of the human being — until it is no longer constantly being tested, for example, when we have drugs to keep bipolar people from killing themselves before they breed.
Of course, if these creatures are smart enough to get themselves to a source of a supplement that suppresses their bipolar tendencies, they may survive — but will have created a future line with dependencies on that supplement:
Bipolar disorder is a devastating condition that causes extremes of mood. More than 12 million Americans suffer from this disorder every year, including men, women, and children. For as yet unknown reasons, women are more likely to develop the disorder than men.
Folic Acid: Folic acid is found in fruits, such as oranges, and leafy green vegetables, like spinach. Folic acid tends to be found in low levels in people suffering from depression. A supplement may help alleviate depressive symptoms.
This shows you one of the many reasons it’s important to eat the diet of your ancestors.
Most of our “great” (but not really great) “art” comes from bipolar people trying to express themselves. I could live easily with their absence, in exchange for having people who are inherently indisposed toward bipolarity.
We warned you some time ago. Now, there’s some alarming news about the imminent water wars:
Dwindling water supplies are a greater risk to businesses than oil running out, a report for investors has warned.
Among the industries most at risk are high-tech companies, especially those using huge quantities of water to manufacture silicon chips; electricity suppliers who use vast amounts of water for cooling; and agriculture, which uses 70% of global freshwater, says the study, commissioned by the powerful CERES group, whose members have $7tn under management. Other high-risk sectors are beverages, clothing, biotechnology and pharmaceuticals, forest products, and metals and mining, it says.
“Water is one of our most critical resources – even more important than oil,” says the report, published today. “The impact of water scarcity and declining water on businesses will be far-reaching. We’ve already seen decreases in companies’ water allotments, more stringent regulations [and] higher costs for water.”
Droughts “attributable in significant part to climate change” are already causing “acute water shortages” around the world, and pressure on supplies will increase with further global warming and a growing world population, says the report written by the US-based Pacific Institute.
The loss of gasoline sounds worse, but only the most primitive of stupid monkeys thought that resource was infinite. Water? We’ll always have our two liters a day.
Institutional investors are urging companies to measure, disclose and reduce their use of water to reduce long-term financial risks as supplies dry up from overuse and as higher temperatures melt glaciers away.
“Companies need to be analyzing their water risk … and to find ways to conserve water and minimize the opportunities for literally having their business shut down,” Mindy Lubber, the president of Ceres, a Boston-based coalition of investors, said in an interview.
We may not. Water is required in abundance for our industry and infrastructure, not just personal consumption. But that’s out of sight, so out of mind. Rage on, you crazed monkeys.
Flannery, who has written eloquently about global warming, drove through the fire belt, and reported:
“It was as if a great cremation had taken place… I was born in Victoria, and over five decades I’ve watched as the state has changed. The long, wet and cold winters that seemed insufferable to me as a boy vanished decades ago, and for the past 12 years a new, drier climate has established itself… I had not appreciated the difference a degree or two of extra heat and a dry soil can make to the ferocity of a fire. This fire was different from anything seen before.”
Meanwhile, central China is experiencing the worst drought in half a century. Temperatures have been unseasonably high and rainfall, in some areas, 80% below normal; more than half the country’s provinces have been affected by drought, leaving millions of Chinese and their livestock without adequate access to water. In the region which raises 95% of the country’s winter wheat, crop production has already been impaired and is in further danger without imminent rain.
In our own backyard, much of the state of Texas—97.4% to be exact—is now gripped by drought, and parts of it by the worst drought in almost a century. According to the New York Times, “Winter wheat crops have failed. Ponds have dried up. Ranchers are spending heavily on hay and feed pellets to get their cattle through the winter. Some wonder if they will have to slaughter their herds come summer. Farmers say the soil is too dry for seeds to germinate and are considering not planting.” Since 2004, in fact, the state has yo-yoed between the extremities of flood and drought.
A good compilation of drought data there, although it lacks a global model to show that the water missing in these droughts is not just distributed elsewhere. However, common sense dictates: as temperature rises, there’s going to be less water around.
A mother’s life experience can affect the biology of her offspring, according to new animal research in the February 4 issue of The Journal of Neuroscience. The study shows that a stimulating environment improved the memory of young mice with a memory-impairing genetic defect and also improved the memory of their eventual offspring. The findings suggest that parental behaviors that occur long before pregnancy may influence an offspring’s well-being.
“While it has been shown in humans and in animal models that enriched experience can enhance brain function and plasticity, this study is a step forward, suggesting that the enhanced learning behavior and plasticity can be transmitted to offspring long before the pregnancy of the mother,” said Li-Huei Tsai, PhD, at Massachusetts Institute of Technology and an investigator of the Howard Hughes Medical Institute, an expert unaffiliated with the current study.
In the current study, Feig and his colleagues found that the offspring of mothers who had experienced environmental enrichment before adolescence also showed enhanced long-term potentiation (LTP), which is thought to form the cellular basis of memory, despite never experiencing the stimulating environment themselves. Offspring born to environmentally enriched mothers, but reared by other mice, showed enhanced LTP as well. These findings suggest that environmental enrichment’s enhancement of LTP is transmitted to the next generation before birth.
Interesting how the early debate over evolution plays itself out now that we can observe these things:
“Lamarckism” or “Lamarckianism” is now often used in a rather derogatory sense to refer to the theory that acquired traits can be inherited. What Lamarck actually believed was more complex: organisms are not passively altered by their environment, as his colleague Geoffroy Saint-Hilaire thought. Instead, a change in the environment causes changes in the needs of organisms living in that environment, which in turn causes changes in their behavior. Altered behavior leads to greater or lesser use of a given structure or organ; use would cause the structure to increase in size over several generations, whereas disuse would cause it to shrink or even disappear.
It’s a feedback loop. Darwin observed the negative side of the loop, or the culling of the unfit; Lamarck observed the positive side, which is that organisms respond actively to their environment and so direct their own evolution.
This new research describes the intermediate stage: epigenetics, hormones during birth, and past experience all contribute to the recombination of genes that produces a newborn.
And the headline? Well, it’s easy. If you live like a hipster or third-worlder, and really there’s not much difference except that the first-world people around you support you, your life experience is one of dumbing-down. Instant gratification. Cheap sex. Anti-intellectualism, yet intellectual posing. No direction, no struggle, just an easy life of avoiding obligation and struggle.
What do you think that passes on to your kids? A dumbing-down. But if you live a moral life, working hard to do what’s right and also prosper, and avoid the easy dissolution and glib self-justification of the hipster, you produce better kids.
Which is good, because they’re going to be the ones who have to gun down the millions of hipsters and grey people surging out of the cities as they fail.
You will not hear this in the mainstream media, because it’s socially unacceptable. That’s an even more basic level of human taboo than political correctness. It offends people who have no politics, because it offends their conception of themselves as “in control.”
You have a choice for human future:
- Genetic Engineering
- Eugenics
- Natural Selection
You can have eugenics and natural selection, but with genetic engineering, the other two go out the window.
I’m not talking about trivial stuff like aborting babies with bad genes. I’m talking about these same scientists who cannot cure cancer starting to throw together genetic combinations, thinking it’ll make everything better.
I don’t trust them because I don’t trust us because our conception of reality is rooted in having ourselves be “in control,” and the tail wags the dog for everything else.
The same genetic testing, called pre-implantation genetic diagnosis (PGD), has been used to test for inherited disorders such as cystic fibrosis and Huntington’s disease, life-shortening diseases with known single-gene causes.
The events might presage other screenings designed to create designer babies based on gender, IQ or athletic ability, some ethicists fear.
“There are many complex issues to take into account and the decision will finally come down to an individual’s personal ethics,” said Kath McLachlan, a clinical nurse specialist at the charity Breast Cancer Care.
Some fear the worst if laws are not crafted to corral the burgeoning field of “reprogenetics,” as it is called — combining reproductive technologies with genetic screening.
Aborting a cancer-bound baby is a good thing, if you ask me. Trying to play God with genetics we barely understand is not.
Big economic interests and subtle changes in terminology are helping spread a wider acceptance of eugenics, said Archbishop Rino Fisichella, president of the Pontifical Academy for Life.
“The term ‘eugenics’ seems something of the past and just mentioning the word elicits horror,” he said during a Vatican press conference Feb. 17.
But, he said, scientific progress must be accompanied by greater ethical awareness that respects the full dignity of every human person.
The introduction to the congress program said excesses in the field of genetics can “lead to so-called eugenics which, in its various forms, seeks to obtain the perfect human being,” which includes unethical means that violate respect of all forms and conditions of human life.
I think you monkeys are in denial about just how much trouble we’re in. Not surprising; you want to be “in control,” and you’ll bend everything else you think to fit that paradigm.
As supplies run out, the population burgeons, and water and fuel become scarce, some kind of bottleneck will occur.
If you’re lucky, it’ll be a combination of natural selection and eugenics. Kill off the criminals; set the smart people up to breed more, and then let nature sort the rest out.
If you’re not, our arrogant and often imbecilic scientists will start trying to create the new multicultural master race — and they’ll partially succeed. They will give a being some enhanced abilities, and none of the wisdom to use them.
Think about what a disaster a rifle is in the hands of someone without judgment, or compassion. Now imagine someone very, very smart and strong with that same lack of judgment and compassion.
We don’t understand enough about genes to play with this technology yet.
Most liberals are middle-class people living in cities, so they are unable to save any money and are too narcissistic to have families.
Their goal? To be the wrecking balls that do in the suburban upper middle class.
During the presidential campaign, Barack Obama tempered his pledge to substantially raise taxes for high earners with an important proviso: He’d simply restore rates to their levels during the Clinton Administration. The implication was that families in the upper brackets would see their total tax bite go back to the levels of the 1990s, but no higher.
Now, it sure looks like Obama is reneging on that promise. The burden will indeed go far higher than in the Clinton years via a technicality — one that will come as a rude shock even to the taxpayers already braced for a soaking.
The group that’s hit hardest are the taxpayers I call the HENRYs, for “High Earners Not Rich Yet.” The HENRYs are families who make between $250,000 and $500,000 a year. I wrote about the HENRYs in a Nov. 17 Fortune cover story, “Who Pays for the Bailout?” They’re among America’s most productive, hard-working citizens: our doctors, attorneys, architects, and entrepreneurs, the owners and builders of cleaning companies, delis and security franchises.
Though President Obama brands them as rich, they’re usually far from it. “Rich” means personal wealth, or net worth, not income. These HENRYs are already strapped by a combination of high income taxes, soaring property tax levies, and college savings for the kids. Their chance of accumulating the couple of million dollars needed to qualify as rich was virtually nil even before Obama took the stage.
Revenge, always the liberal motivation.
We need these people. They are the business owners who work hard, not the lazy slackers who just kind of let stuff fall apart.
They are generally highly-motivated, family-oriented, culture-supporting people.
In fact, they’re what keeps America from becoming a wasteland of whiners who contribute nothing but witty opinions.
While the Jews of today are connected historically and religiously to the Jews of ancient Israel, the DNA evidence also indicates that a significant amount of Jewish ancestry can be traced directly back to their Israelite/Middle Eastern ancestors. However, these ancestors represented a heterogeneous mix of Semitic and Mediterranean groups, even at their very beginnings.
While earlier studies focused on the Middle Eastern component of Jewish DNA, new research has revealed that both Europeans and Central Asians also made significant genetic contributions to Jewish ancestry. Moreover, while the DNA studies have confirmed the close genetic interrelatedness of many Jewish communities, they have also confirmed what many suspected all along: Jews do not constitute a single group distinct from all others. Rather, modern Jews exhibit a diversity of genetic profiles, some reflective of their Semitic/Mediterranean ancestry, but others suggesting an origin in European and Central Asian groups. The blending of European, Semitic, Central Asian and Mediterranean heritage over the centuries has led to today’s Jewish populations.
While the Canaanites were a Western Semitic people indigenous to the area, they appear to have consisted of a diverse ethno-cultural mix from the earliest times. It is from this diverse group that the evolution of the Israelites occurred. Although little is known about these groups, they probably included some of the following populations:
1. Amorites: Western Semites like the Canaanites. They were probably the pastoral nomadic component of the Canaanite people.
2. Hittites: A non-Semitic people from Anatolia and Northern Syria.
3. Hurrians (Horites): A non-Semitic people who inhabited parts of Syria and Mesopotamia. Many kings of the early Canaanite city-states had Hurrian names.
4. Amalekites: Nomads from southern Transjordan. Even inimical references to this group in the Hebrew Bible “tacitly” acknowledge that the Israelites and Amalekites shared a common ancestry.
5. Philistines: Referred to in ancient texts as “Sea Peoples.” They invaded and settled along the coasts of ancient Canaan. Their culture appears to stem from that of Mycenae.
(Dever 2003, pp. 219-220).
Ironically, however, many scholars believe the Ashkenazi population probably had its earliest roots in Rome, where Jews began to establish communities as early as the second century B.C. While some of these Jews were brought to Rome as slaves, others settled there voluntarily. There were as many as 50,000 Jews in and around Rome by the first century CE, most of whom were “poor, Greek-speaking foreigners” scorned for their poverty and slave status (Konner 2003, p. 86). Eventually, however, many of these slaves gained their freedom, continuing to live in and around Rome.
By 600 CE, Jews were present in many parts of Europe, with small settlements in Germany, France and Spain. More to the east, there were also small Jewish settlements along the Black Sea, as well as larger communities in Greece and the Balkans (Konner 2003, p. 110).
By the 12th-13th centuries CE, Jews were expelled from many countries of Western Europe, but were granted charters to settle in Poland and Lithuania (Ostrer 2001). The Ashkenazi Jewish population expanded rapidly in Eastern Europe, growing from an estimated 15,000-25,000 people in the 13th-15th centuries, to two million by 1800 and eight million in 1939 (Ostrer 2001, Behar 2004b). Thus, Jewish settlement in Eastern Europe became the dominant culture of the European Jews, and then of most Jews throughout the world.
For anyone like me who loves history told through genes, this article from 2005 is a complete goldmine. Newer data has come out that does not radically contradict anything found here.
Researchers have been racing against time to find a cure for the deadly facial tumour disease threatening the Tasmanian devil with extinction.
The problem is a lack of genetic diversity because of inbreeding.
“We’re also looking at ancient DNA from samples that have been collected from animal skins, collected from animals that have gone extinct on the mainland.
“Perhaps that will give us some indications of how much genetic diversity there was in the past.”
When humans encroach on their land, species are reduced to too low a breeding population to maintain any kind of internal health. Inbreeding is one problem; another is that when there are too few breeding partners, standards drop radically. As a result, the species appears to slip away rather than suddenly die, which is what modern morons would require in order to see that the cause is human.
People still do it for that sense of group approval:
Suicide attacks—today most often associated with acts against Americans or Israelis by Muslims—seem to be one aspect of a wider phenomenon in which collective religious ritual fosters a mindset known as parochial altruism, according to psychologists. Parochial altruism is a combination of negative attitudes toward another social group and sacrifice for one’s own.
Suicide attacks would be an extreme form of parochial altruism, said the psychologists who conducted the study, from the New School for Social Research in New York and the University of British Columbia. And when forms of parochial altruism other than suicide attacks were considered, the researchers found many cultures and religions followed the pattern identified in the Middle East.
This “parochial altruism” explains many dysfunctional behaviors in our own society, like ethnocide, which is probably why our media and pundits foam at the mouth when talking about Islam.