One man’s terrorist is another man’s freedom fighter. But, how can you tell the difference, and who influences your decision?
Asking these questions brings us into the marketplace of ideas, into the part of that market that involves shaping public opinion.
When the public sees terrorism it reacts one way. When it sees freedom fighters, it reacts differently.
Recently, acclaimed playwright Tony Kushner found himself at the center of a major brouhaha over the refusal (since reversed) of the City University of New York to grant him an honorary degree.
CUNY trustees had decided that Kushner’s antipathy toward Israel, his assertion that it had been founded on an act of “ethnic cleansing,” and his willingness to promote economic warfare against the Jewish state placed him beyond the academic pale.
Certainly, it should have placed him beyond being honored by CUNY.
Then, as you know, the $%&# hit the fan. Through the New York Times, in particular, the intelligentsia came out in force to support Kushner’s right to free speech.
New York's great minds took a few moments off from slandering Sarah Palin as stupid to offer us a true lesson in ignorance. They tried to explain that refusing to give out an honorary degree impinges on Kushner’s right to free speech.
Honorary degrees confer honor on a person for career accomplishments. Honorific titles grant authority to the work of said person.
People want to have honors because they covet the increased respect that such honors confer. Now that Tony Kushner has been re-accorded his honorary degree, we should take a brief glance at some of the ideas that the CUNY trustees have now decided are worthy of respect.
Begin with Kushner's oft-asserted notion that the state of Israel was founded on acts of “ethnic cleansing.” Kushner knows where the phrase comes from. He knows that he is branding Israel a Neo-Nazi enterprise. In effect, he is adopting the party line that animates Palestinian terrorism and the refusal of the vast majority of Middle Eastern countries to acknowledge Israel's legitimacy.
If America were to believe that Israel was founded on “ethnic cleansing,” our policies would be far different. As of now, the policies of both political parties have been based on the idea that Israel embodies the principles of Western democracy and free enterprise.
Kushner himself is not proposing anything more than economic warfare, but the nations, and, in particular, the terrorist groups that hate Israel have been engaged in open warfare since the day Israel was born. Why? Because they believe that it was founded on "ethnic cleansing."
If Israel was founded on an act of “ethnic cleansing” then Hamas and Hezbollah are freedom fighters. If the Palestinians are the victims of Western imperialism, as Kushner and left-thinking European intellectuals believe, then they have the right to do whatever it takes to destroy the Jewish state.
Kushner might not support all of those activities, but when you take strong public stands you are also taking sides. At that point, you do not have the leeway to accept only those consequences that please you.
How does a proud gay Jewish American like Tony Kushner accept the propaganda points of organizations that would happily have hanged him and his like from lampposts? Does anyone want to contemplate how Tony Kushner would fare under Sharia law?
What, then, animates Kushner's thought? I trust we would agree that there is nothing rational in it. It feels like it has been completely given over to blind fear. Tony Kushner's attacks on the state of Israel feel to me like pure, unadulterated cowardice. When you ally yourself with your enemies, you are showing that you are so afraid that they will hurt you that you are willing to give them anything they want.
Great minds don't always think alike. Sometimes they don't even think at all.
The Tony Kushner kerfuffle tells us that the liberal left, especially the New York version, is more concerned with toeing the party line than with intellectual seriousness.
More importantly, it is concerned with maintaining its control over the marketplace of ideas. Given that the New York Times has been losing that control-- the wails of anguish from West 41st St. fill the New York air-- it has happily lit on the case of Tony Kushner to show how powerful it still is.
But, Tony Kushner is the minor leagues compared with Noam Chomsky.
Think about it: Osama bin Laden was comforted by the thought that a serious and well-honored intellectual like Noam Chomsky was trying to steer public opinion toward bin Laden's cause.
Justly celebrated as a linguist, Chomsky transformed himself into an activist intellectual, of the radical leftist variety, during the Vietnam War. For lo these past decades he has worked hard to defame and slander America and Israel.
Regardless of what happens in the world, the fault, Chomsky has argued, always lies with America and its Middle Eastern ally.
While the top al Qaeda honchos were gloating about their successful attack on the World Trade Center, Chomsky was arguing that al Qaeda was innocent and that America deserved what it got.
Now that Navy SEALs have dispatched bin Laden to the nether depths of the Inferno, Chomsky has felt compelled to defend his intellectual benefactor, declaring that America violated international law when it executed bin Laden.
Yet, as Bret Stephens argues in the Wall Street Journal today, the important thing about a crackpot like Chomsky is that his ideas have gained status because he has been honored, even lionized, by the American academic world. Which ideas might these be? Link here.
In Stephens’ words: “Among the subjects of Mr. Chomsky's solicitude have been Holocaust denier Robert Faurisson (whom he described as a ‘relatively apolitical liberal’), the Khmer Rouge (at the height of the killing fields), and Hezbollah (whose military-style cap he cheerfully donned on a visit to Lebanon last year).
“In the West at least, the marketplace of ideas is also the great equalizer of ideas, blunting edges that might once have had the power to wound and kill.
“So it is that Mr. Chomsky can be the recipient of over 20 honorary degrees, including from Harvard, Cambridge and the University of Chicago. None of these degrees, as far as I know, was conferred for Mr. Chomsky's political musings, but neither did those musings provoke any apparent misgivings about the fitness of granting the award. So Mr. Chomsky is the purveyor of some controversial ideas about this or that aspect of American power. So what?”
Honorary degrees confer respectability, not just to Noam Chomsky the linguist, but to the ideas he has been promulgating for the past four decades.
Academic honors increase the value of his ideas in the marketplace, and allow you to drop them at New York cocktail parties. Instead of being shunned for trafficking in dangerous idiocies, you will be taken to be a serious thinker.
The imprimatur of a Chomsky (or even a Kushner) allows vile and pathological ideas to gain a foothold among New York’s wanna-be thinkers, but especially among those who run most of the major media operations.
Once Chomskyite thinking becomes respectable, you find it in the mind of a decidedly lesser intellectual light, a Columbia psychiatrist named Dr. Robert Klitzman. Link here.
Klitzman lost his sister in the 9/11 attacks… for which we offer our condolences… but he has since proceeded to establish his intellectual bona fides, that is, to show that he can think with the big boys, by explaining that the fault for the terrorist attack lay with America and its policies.
It’s not quite at the level of conspiracy theory. Klitzman is glad that bin Laden is dead. Yet, he still cannot resist blaming America.
In his words: “When the members of Al Qaeda attacked on 9/11, Americans wondered, ‘Why do they hate us so much?’ Many here believe they dislike us for our ‘freedom,’ but I think otherwise.
“There are lessons we have not yet learned. I feel [my sister] Karen would share my concerns that underlying forces of greed and hate persevere. American imperialism, corporate avarice, abuses of our power abroad and our historical support of corrupt dictators like Hosni Mubarak have created an abhorrence of us that, unfortunately, persists. We need to recognize how the rest of the world sees us, and figure out how to change that. Until we do that, more Osama bin Ladens will arise, and more innocent people like my sister will die.”
As Dennis Prager points out, if you follow this logic to its bitter end, you end up: “Asking what America did to elicit the hatred of Muslim terrorists is morally equivalent to asking what Jews did to arouse Nazi hatred, what blacks did to cause whites to lynch them, what Ukrainians did to arouse Stalin’s hatred, or what Tibetans did to incite China’s harshly repressive treatment of them.”
Of course, Klitzman is not a serious thinker. He is not a thinker at all. He is a poseur, someone who seeks to breathe the same rarified air as radical leftist thinkers.
The least we can say is that if terrorism is designed to provoke fear, then, it has succeeded beyond its dreams with Robert Klitzman. By removing ultimate responsibility from bin Laden, Klitzman adopted a posture that resembled... cowering in the corner and begging to be spared.
You might think that precious few Americans take Noam Chomsky very seriously. And yet, the marketplace of ideas works in strange ways. On the one hand, Chomsky’s ideas breed diluted versions like the one offered by Klitzman; on the other, they have effects that are more difficult to pinpoint.
Stephens has some trenchant remarks: “Dulled (and dull) as Mr. Chomsky's ideas might be in the West, they remain razors outside of it. ‘Among the most capable of those from your side who speak on this topic [the war in Iraq] and on the manufacturing of public opinion is Noam Chomsky, who spoke sober words of advice prior to the war,’ said bin Laden in 2007. He was singing the professor's praises again last year, saying ‘Noam Chomsky was correct when he compared the U.S. policies to those of the mafia’.
“These words seem to have been deeply felt. Every wannabe philosopher—and bin Laden was certainly that—seeks the imprimatur of someone he supposes to be a real philosopher. Mr. Chomsky could not furnish bin Laden with a theology, but he did provide an intellectual architecture for his hatred of the United States. That Mr. Chomsky speaks from the highest tower of American academe, that he is so widely feted as the great mind of his generation, that his every utterance finds a publisher and an audience, could only have sustained bin Laden in the conceit that his thinking was on a high plane. Maybe it would have been different if Mr. Chomsky had been dismissed decades ago for what he is: a two-nickel crank.
“Now bin Laden is dead. Yet wherever one goes in the Arab world, one finds bookstores well-stocked with Chomsky, offering another generation the same paranoid notions of American policy that mesh so neatly with an already paranoid political culture.”
According to bin Laden, Chomsky is useful because he helps to manufacture public opinion, thus creating a political environment in which America will be more likely to feel justified in walking away from the war on terror.
If so, then Chomsky’s mental drool has not yet had its desired effect in America.
Yet, Stephens also notes that a Chomsky gives these tyrants and despots and terrorists “an imprimatur,” a confidence that they are thinking clearly and precisely, that they have not made some grievous mental miscalculation.
Many of these great tyrants, whether bin Laden or, most especially, Mao Zedong, have seen themselves as great thinkers, great philosophical minds, as people who do god’s work.
Let's not think that these despots, real or virtual, do not harbor doubts about what they are pursuing. If they believe that the opinion of a great philosopher grants them the confidence to execute their plans, then perhaps these philosophers deserve something less than academic laurels.
Monday, May 9, 2011
The Cultural Causes of Anorexia
Unless you live there you are probably not too worried about the incidence of anorexia in Hong Kong.
Unless you have spent some time pondering the origin of psychoanalysis, you have probably not given too much thought to the incidence of hysteria among European women in the 19th century.
Unless you were affected, directly or indirectly, you have probably not spent too much time asking why there was an outbreak of bulimia in Great Britain during the 1990s.
Truth be told, epidemiology is not the most compelling subject.
Yet, epidemiology can give us a fascinating insight into certain types of mental illness.
Such was the wager that Ethan Watters accepted when he started researching his book, Crazy Like Us.
I have already posted about Watters here, but I am returning to his work because I consider it to be exceptionally important.
More than that, I think there is every risk that we will ignore its highly inconvenient truths.
In my view the book’s title does it little justice. Crazy Like Us sounds like the title of a stand-up comic’s memoir, lacking the seriousness that the subject deserves. The title is too glib, too tongue-in-cheek to communicate Watters’ concept.
If and when you read the chapter on how anorexia came to Hong Kong, you will see how a campaign to increase the awareness of the risk of anorexia provoked a rash of new cases. It may be counterintuitive, but the evidence is clear.
This means, Watters continues, that the psychiatrists and the media mavens who assert that increased public awareness can serve as a prophylactic measure against mental illness are in fact contributing to the problem that they believe they are solving.
When Watters asks psychiatrists how they feel about being the germ as much as the medicine, they become decidedly uncomfortable, even embarrassed.
Physicians are supposed to cure illness, not help it to spread. Given the embarrassment they feel, it should hardly be surprising that psychiatrists prefer to ignore the subject.
Surely, those who go on television talk shows to warn people about the dangers of anorexia or bulimia are not going to give up a great marketing technique. Even if they know that they are contributing to the problem, they are too self-interested to avoid the spotlight.
You can imagine the play of conflicting emotions that accompanies the realization.
Aside from the fact that good intentions do not always produce good results, and that we should never use our good intentions to excuse our failures, the root of the problem lies in the metaphor of mental illness.
Our insistence on thinking that mental illnesses are real illnesses has caused us to imagine that if increased public awareness can help people to avoid physical illnesses it can do the same with mental illness.
Being more aware of the risk of skin cancer leads people to use more sunscreen or to spend less time in the sun.
But, when we arrive at mental illnesses, the same rule does not pertain.
Watters reports on the following thesis. Anorexia, among other mental illnesses, begins with a feeling of anomie, of social dislocation, of rejection, of loneliness.
The person beset with anomie wants to belong. Being a member in good standing of a community, practicing its rules and rituals, its customs and mores, is the solution to the problem.
Those who do not feel that they can be part of the everyday social whirl have a last recourse; they seek to become part of the group of people suffering from mental illness. In order to be a member in good standing of that community, they adopt-- unconsciously-- the cluster of symptoms that the culture recognizes as meaningful.
Back in the old days anorexia was unknown in Hong Kong. Then one girl fell ill and died, and the media started writing about anorexia. The more the media and the medical community became aware of the illness, the more girls became anorexic.
Something similar happened in Europe during the Victorian era. In that case social mobility produced different forms of social dislocation and fed into an outbreak of what was then called hysteria.
In the 1990s, Princess Diana made clear to the world that she was suffering from bulimia. Perhaps she wanted to increase awareness of the affliction. What she did do was to provoke a rash of new cases of bulimia.
Watters explains: “A recent study by several British researchers showed a remarkable parallel between the incidence of bulimia in Britain and Princess Diana’s struggle with the condition. The incidence rate rose rapidly in 1992, when the rumors were first published, and then again in 1994, when the speculation became rampant. It rose to its peak in 1995 when she publicly admitted the behavior. Reports of bulimia started to decline only after the princess’s death in 1997.”
For those who had believed that Diana’s willingness to exhibit her emotional problems in public had something of a salutary effect, this information should produce anguish.
This view of symptom formation implies, quite clearly, that these psychiatric symptoms are NOT expressions of unresolved infantile mental conflict. They do not mean that the young woman was improperly breast fed or that she suffered from bad mothering or that she suffered a sexual trauma.
If a woman chooses her symptoms because they signal that the therapy world will take her seriously, those symptoms are not going to be resolved by assuming that they express an unspeakable thought or feeling or fantasy.
Forget about hidden meanings; forget about root causes. Girls stop eating for entirely different reasons.
When it comes to anorexia, the mental health profession has not been the sole, or perhaps even the most important, contributor to the new cases.
That honor belongs to feminism.
Watters writes: “To begin with, there is no doubt that anorexia became iconic, a cause celebre, within the feminist movement of the 1970s and 1980s. Whatever else can be said about the disorder, anorexia packs a wallop of a metaphoric punch. As the feminist philosopher Susan Bordo pointed out, anorexia calls attention to ‘the central ills of our culture.’ In various writings on the topic, anorexia has been used to decry unrealistic body image standards, patriarchal family structures, the subjugation of women by post-industrial capitalism, unrealistic ideals of perfection, and more.”
Being accepted among the psychiatric patient population is one thing. Imagine how much more powerful an incentive it was for young girls when they learned that having an eating disorder would make them martyrs to the feminist cause.
They could in their person assert the truth of the feminist vision of patriarchal oppression, the beauty myth, and society's campaign against women's orgasms.
Eating disorders do not derive, as feminists and other ideologues have stated, from the way the patriarchal, capitalist West treats women. They do not come from the way the culture defines female beauty. Nor do they arise from the culture’s obsession with dieting.
They have arisen out of feminist ideology, and especially out of the way feminism happily used women’s bodies to provide an irrefutable critique of a culture that it was trying to deconstruct.
Sunday, May 8, 2011
Al Qaeda Declines; the Muslim Brotherhood Ascends
As we cheer the demise of Osama bin Laden, we should not lose sight of the fact that the so-called Arab Spring has brought the Muslim Brotherhood within reach of political power in Egypt.
Whether or not this granddaddy of radical Islamist organizations gains power, it is certainly moving in that direction. After being outlawed under Mubarak, the Muslim Brotherhood has made a serious comeback.
Last Friday Ayaan Hirsi Ali reminded us not to allow the death of bin Laden to lull us into complacency about the continuing threat of Islamic radicalism.
The true danger, as she sees it, does not lie in acts of terrorism, but in the kind of slow and methodical infiltration of government institutions that has been so skillfully practiced by the Muslim Brotherhood. Link here.
Universities in Decline
Clearly, something is wrong with the American university system. Those who have found a professional home there are concerned.
Recently, William Deresiewicz wrote what appears to be a magnum opus on the subject, reviewing all of the recent literature. Link here.
Given that his article appears in The Nation, I expected to find the usual leftist boilerplate about how the current state of academia reflects the degradations of American capitalism and free enterprise.
I was not disappointed: “What we have in academia, in other words, is a microcosm of the American economy as a whole: a self-enriching aristocracy, a swelling and increasingly immiserated proletariat, and a shrinking middle class. The same devil’s bargain stabilizes the system: the middle, or at least the upper middle, the tenured professoriate, is allowed to retain its prerogatives—its comfortable compensation packages, its workplace autonomy and its job security—in return for acquiescing to the exploitation of the bottom by the top, and indirectly, the betrayal of the future of the entire enterprise.”
Deresiewicz declaims against everyone but those who are actually teaching the Humanities. Which tells you where the real fault lies.
He fails to notice that the universities have functioned as liberal fiefdoms, places where no outsider has been allowed to influence the way courses are taught or institutions administered.
If the states are considered to be the laboratories for democracy, why not consider American universities to be laboratories for liberal policies?
Deresiewicz is especially concerned with the decline of the Humanities. More and more students are taking courses in practical subjects, like engineering and finance. Fewer and fewer students are signing up for literature courses.
He insists that we need the Humanities to become better citizens. And yet, how many of those courses teach students to trash the nation, its traditions, and its heroes? And how many Humanities professors have made it their life mission to indoctrinate their students in the most fashionable politically correct thinking?
In his words: “Yet the liberal arts, as we know, are dying. All the political and parental pressure is pushing in the other direction, toward the ‘practical,’ narrowly conceived: the instrumental, the utilitarian, the immediately negotiable. Colleges and universities are moving away from the liberal arts toward professional, technical and vocational training.
“A liberal arts education creates citizens: people who can think broadly and critically about themselves and the world.”
I note that Deresiewicz seems to suggest that education will make students into better citizens of the world. He does not make a connection between citizenship and national identity.
And he does not seem to understand that citizenship involves taking pride in your country’s achievements, feeling loyal to its traditions. Critical thinking, the kind that politically correct professors teach, finds fault with America and often encourages disloyalty.
Deresiewicz seems to be perturbed by the possibility that parents, students, and taxpayers have a say in the process. He is perturbed by free people making free choices in a free market. He does not say who he wants to empower, but clearly, he is recommending that people like him should be making the decisions about who studies what.
It doesn’t matter whether the students want to study this or that subject. Universities should force students to take the courses that the universities consider to be in the best interest of the universities. Taxpayers should be forced to fund them and parents should be forced to pay for them.
Doesn’t this feel a bit like what happens when a labor union sets down work rules? It doesn’t matter whether the rules promote efficiency or enterprise. It doesn’t matter whether the workers are adding enough value to the company to justify their salaries and time off.
Deresiewicz sees humanists engaged in the pursuit of knowledge for the sake of knowledge. Either he has chosen to blind himself to what is going on in Humanities departments or he is employing a cheap rhetorical trick.
Take an example of what is being taught in California middle schools, imposed by the California regents.
According to Caitlin Flanagan, her thirteen-year-old sons are not really being taught American history. They are being trained to become political activists. Link here.
In her words: “Such a fine generation of young Americans—it's too bad that they are being systematically robbed of the great national story that is their birthright. Here in California, history classes are now required by law to include the contributions to the state and nation that have been made by Native Americans, African Americans, Mexican Americans, Asian Americans, Pacific Islanders, European Americans and persons with disabilities. Soon to be added to this list—provided that Senate Bill 48, which just passed in Sacramento, becomes state law—are the contributions of ‘gay, lesbian, bisexual and transgendered Americans.’"
Ostensibly, history is being rewritten because administrators have decided that certain children are bullied in school because the curriculum has overemphasized the influence of straight white males, and ignored the contributions of everyone else.
This is naïve beyond reason. It functions as a convenient rationalization for a larger agenda, which is, to indoctrinate young people in politically correct thinking.
Middle school students are a captive audience. They can be force fed any ideology the school board or the Regents decides it wants them to have. Given the power that teachers can exercise, pupils have every interest in thinking the way their teachers want them to think.
Some teachers are selfless disseminators of pure knowledge, but no small number of them have long since forgotten that ideal.
And that means that Humanities departments are hard at work teaching students to be the kinds of people you do not want to hire. Perhaps it is a way of undermining capitalism and free enterprise, but to companies, young people who are trained to be activists look like incipient troublemakers.
While the education establishment has the power to do what it pleases within school walls, the outside world, in the person of hiring officers, also has a say. And recently it has been saying that it does not want to hire young people who have been indoctrinated by so-called Humanists.
The older children become, the less power teachers have. Once children reach college age, and once their parents are forced to pony up tuition, they start asking themselves why they should be paying good money to have their children taught leftist ideology.
Learning to think critically, a hallmark of every leftist defense of the humanities, makes young people incapable of organizing projects, of working as part of a team, and of actually constructing something.
Being able to identify flaws does not tell you anything about how to make things run well.
If you are in college and want to have a career once you have graduated, you would do better to study engineering than classics. Not because classics are not useful, but because you run the risk of having your classics professors use their power to indoctrinate you in their ideology, Homer and Aristotle be damned.
It isn’t surprising that companies prefer hiring engineering students over literature majors. If academics want to know how this situation came to pass, the reason lies in the way they themselves have been teaching their subjects.
In my view, college students are voting against the Humanities because they prefer to major in subjects that involve numbers. They know that they have a better chance to learn about reality if there are quantifiable entities in play. And they know that reality testing will give them a better chance of being judged according to the quality of their work rather than the correctness of their thought.
Deresiewicz does offer a standard plaint about how the California university system is horribly underfunded, but he should have looked at how the kinds of policies he and his friends at the Nation have supported, the ones that have dominated California governance, have bankrupted the state, to the point where it can no longer support its wondrous university system.
(Victor Davis Hanson has an excellent article on how liberal politics has produced an economic depression in California today.) Link here.
If Deresiewicz is capable of self-criticism, that would be a good place to start.
It would be nice to see academics and former academics take some responsibility for the system that they created.
Saturday, May 7, 2011
"Know Thyself!"
Know thyself!
The phrase is so ambiguous that it can refer to just about anything.
Is it an injunction or an admonition? Is it an invitation or an imperative?
It comes down to us through Plato, but it was first inscribed in the forecourt of the Temple of Apollo at Delphi.
Addressed to someone who is about to enter the temple it seems to be telling him to know his place, to express proper humility in a sacred space, in a space that honors a god. By extension, it is saying that he does best not to mistake himself for a god.
As you know, therapists have been doing their best to put their own spin on the phrase. They have declared it to mean that you should gain self-knowledge, and that self-knowledge will make you healthy and happy.
I can pretty much guarantee that no therapist has ever imagined that knowing yourself meant knowing your proper place. Therapists have never been great proponents of propriety.
They favor introspection. Knowing yourself does not require extensive introspection. In truth, too much introspection will obscure your notion of who you really are. Better to know how others see you than about what you think or feel about yourself.
After all, you are publicly identified by your face. By something that everyone but you can see directly. The best you can do is to look at your face's reflection.
Putting your soul aside, much of what you can know about yourself is public knowledge. Start with your name. You know it; it isn’t a mystery. Since your name designates you as a member of a family, that too is something that you know about yourself. No mystery there.
Next, you can know your place in a network of family relationships. You are someone’s father or mother, son or daughter, brother or sister, uncle or aunt.
All of the either/or constructions involve gender differences. It is good to know which gender you belong to. Everyone else will likely know, so you would look foolish if you were the only one who didn’t.
All of that defines your social being, and you are nothing if not a social being. Trust me, it's better than being a mind.
Also, you know which community you belong to. It usually involves a local and a national community. Nothing mysterious there either.
You should also know your place within the community. Are you a pillar of the community or someone whose role is less august?
Entering the Temple of Apollo involves being sufficiently humble to receive whatever the god wants to communicate to you. For that you need to know that you are a human being, a social being, existing within a web of relationships.
I am confident that this version of self-knowledge is not the one you were expecting. Yet, if you do not know and grant sufficient weight to these objective realities about your social existence, you will have considerable difficulty navigating the world.
So far I have said nothing about your personality. I am not very interested in whether you are outgoing or contemplative, whether you are cheerful or somber, whether you are talkative or reticent.
Not one of these qualities changes your name or your place in the world.
Setting out to learn about your personality strikes me as a largely uninteresting quest.
Of course, there is more to you than this list. I mentioned it first because most people tend to trivialize it. Beware the therapeutic impulse to trivialize what really matters.
If you were worrying about whether I was going to leave a place for your individuality, fear not. Beyond the qualities I have just mentioned lies your most important human quality: your character.
Clearly, the Temple of Apollo was prescribing humility, and humility is an important character trait.
But, ask yourself this: do you know whether or not you have good character?
Are you trustworthy, loyal, reliable, responsible, and decorous? Do you keep your word? Are you respectful and courteous to others?
Most other people know whether or not they can trust you. Do you?
If you are content to answer that you are trustworthy and loyal some of the time, then you have some work to do.
A person with good character is always trustworthy, always responsible and reliable, always good to his word. A person with good character does not go back on his word when he can get away with it. He does not keep his word only when it is expedient.
If you know how good your character is or isn't, you can correct it. Even if you have the best character, there is always room for improvement. If you think that your character cannot get any better, then you are mistaking yourself for a god.
So far, I have excluded the therapist’s favorite quest: to discover your hidden motives, what makes you run or jump or lust. More than a few people have run aground over the therapist’s imperative to discover what you really, really want.
Following your bliss or your passion or your heart’s desire is a fool’s errand. As I’ve often said, what matters is not how badly you want it, but how good you are at it.
Knowing yourself does mean knowing what you are good at. Do not undertake enterprises for which you have no real talent.
And if you have real talent, be it for the cello or golf or accounting, you owe it to your talent, or to the divinity that gave you the talent, to guard it, to cultivate it, and to allow it to achieve excellence.
Friday, May 6, 2011
Socially Constructed Reality
The phrase has almost descended to the rank of slogan. College students and people who should know better repeat it as though it were a mantra: reality is socially constructed.
It is not a fact. It’s a theory. It purports to explain, not facts, but the way social groups construct belief systems. And how these same groups force everyone to accept their beliefs as facts.
To be brief, it’s a complex and thorny issue.
Stanley Fish has presented some of the basic ideas in two columns in the New York Times. I find them both puzzling and intriguing. Links here and here.
I have been pondering the questions for several days now, and, the more I think about them, the stranger they become.
At first, it feels like yet another mind-over-matter problem. Fish begins by explaining that some people believe that we only know reality because our minds process sensory and perceptual stimuli, and then identify objects. This theory assumes that, as a consequence, if we learn to think differently we can change the world.
Fish insists, correctly, that this is an exercise in epistemology, the philosophy of how we gain and acquire knowledge.
He asserts that you cannot get from epistemology to political action. Epistemology is descriptive; political action is prescriptive. The former pretends to show how our minds gain knowledge; the latter involves what we should do to change that reality.
I find his distinction to be germane, salient, and correct.
Just as hard science does not have a moral dimension, neither does epistemology.
Of course, using our minds to know reality is not the same as using our minds to construct reality.
Being involved in an exchange with real objects is not the same as creating or constructing those objects out of a blur of stimuli.
Normally, we develop ideas of reality and test them against real objects in the world. Thereby we affirm or deny the validity of our ideas.
The theory of mentally constructed reality makes us all artists. It says that when reality does not correspond to our ideas, we need to work harder to mold it into an aesthetically pleasing form.
Saying that reality is mentally constructed is not the same as saying that it is socially constructed. The latter, more prevalent theory says that we create reality by the way we talk about things.
It’s one thing to say that if we learn to think differently we can change the world. It’s quite another to say that if everyone speaks differently the world will be changed. In some circles the way we speak about things is called discourse.
The theorists who concocted this witch’s brew would not be content with changing the way we think. They want us to change the way we speak about things. And this cannot happen without indoctrination and thought reform. You cannot achieve this goal without policing speech.
Obviously, the theorists who are doing this work are not very worried about your ability to distinguish a plum from a peach, or a rock from a tree. They are more worried about how you group objects, how you place them in categories. Which is the next step beyond identifying them.
And they are not even very interested in whether a tomato is a fruit or a vegetable. They are more concerned with how people organize in society, how they belong to this group and not that one.
Fish states it quite succinctly: they are aiming to modify the structure of society itself, and especially of social groups. In principle, groups include some people and exclude others. In other words, groups discriminate. They organize the world into friends and foes.
And they have different reasons for including or excluding people.
Some groups value blood ties. These tribal groups confer membership by birth. You cannot become part of the group by work, by conforming to the group’s values, or by practicing the group’s rituals and ceremonies.
If a group values blood ties, you are generally obliged to support those with whom you are linked by blood, regardless of what they do or how they do it.
Groups organized by blood ties generally show more tolerance for the behavior of those who belong and less tolerance for the behavior of those who do not.
Tribal groups are not just limited to those who are linked by blood. People who join cults, which are pseudo-tribes, often feel obliged to excuse the bad behavior of fellow cult members and to denounce the bad behavior of anyone who does not belong.
Other groups value good conduct. They offer membership to those who affirm their values through the way they conduct their lives.
Following codes of good conduct sustains your membership in such a group. You are not obliged to defend the bad behavior of fellow group members; you are obliged to marginalize them.
So, groups include some and exclude others. That is the reality of group membership. If groups do not include some and exclude others there is no real value to membership.
Those who are excluded most often belong to different groups. Occasionally, someone will be ostracized, thus excluded from all groups.
On this account, groups are inherently discriminatory. Some exclude those who do not have the right bloodlines. Others exclude those who do not share their values and practice their rituals.
Of course, critical theorists are opposed to discrimination in all its forms. So they seem to want to alter the nature of groups so that everyone is included.
On this view, it doesn’t much matter how you got to America; you must be offered citizenship. It doesn’t matter how you behave yourself; you must be included in the group. If you are not, then society has suffered the effects of an incorrect way of speaking about outsiders.
The theory holds that people are outsiders because we say that they are outsiders, not because of their behavior, their own ability to respect the laws of the nation and to adopt the cultural norms.
In some cases outsiders are persecuted; in others they are allowed to practice local customs as long as they share the nation's values.
Of course, if everyone must be welcomed into a group, then it is not really a group any more. To say that humankind should be a big group that includes everyone is to sabotage group values and to foster anarchy.
Anyway, according to Stanley Fish, those who theorize that reality is socially constructed tend to embrace a liberationist ideology. We might even say that it leads to liberation theology.
Since Fish does not believe that epistemology can lead to a political program, he disputes this liberationist ideology. Yet, he points out how important it is on the political and academic left.
In his words: “And I say this even though each movement on the intellectual left — feminism, postmodernism, critical race theory, critical legal studies — believes that the thesis generates a politics of liberation. It doesn’t; it doesn’t generate anything. Consciousness-raising has always been a false lure, although changes in consciousness are always possible. It is just that you can’t design them or will them into being; there is no method that will free us from the conceptual limitations within which we make invidious distinctions and perform acts of blindness. The best we can do is wait for a tree to talk to us.”
Of course, there’s liberation and there’s liberation.
When theorists posit that reality is socially constructed they are saying that they do not need to submit their theories to the judgment of reality.
You are free to prattle on, to spin theoretical webs, to apply them willy-nilly, without having to care about whether or not they work.
Saying that reality is socially constructed allows you to do as you please and to explain away the bad consequences by insisting that reality does not matter.
If your policies do not work, that can only mean that you need to work harder to force people to think and to speak differently.
In itself this is not a new idea. It is an old idea revived and dressed up to look like something else.
All of the revolutionary and liberationist theories of the past century have done everything in their power to explain why their theories should never be judged according to whether or not they worked in reality.
In fact, critical theorists have an even clearer goal in mind. Fish identifies it, and he seems to sympathize with it. That goal is ridding the world of the ideological sins that accompany discrimination.
Those are racism, sexism, homophobia, lookism, ageism, and the like.
Once these are identified as sins, and once the people of America are told that it is a crime against a higher power to use language or to institute practices that discriminate on the basis of race, gender, and the like, we have entered a world that resembles a religion.
One that holds as an article of faith that America is a corrupt country that needs nothing more than to cleanse its sins.
As it happens, America has the distinction of being one of the few countries in the world, and certainly the most prominent nation, where you can enjoy the full rights of citizenship and gain national identity regardless of race, creed, or national origin.
America has rules for acquiring citizenship, but they do not involve race, gender, sexual preference, age, or appearance.
Making America the whipping boy for one’s ideological obsessions requires a willful ignorance of the reality of the politics of national identity.
Deciding that America ought to accept anyone who enters the country, legally or illegally, as worthy of citizenship undermines the effort to play fair and by the rules.
America has succeeded because it has insisted that everyone play by the same rules. When the playing field has been tilted in one or another direction, America has done everything possible to make it fair and level.
One might say that the effort has been flawed and imperfect, but its considerable success has flowed from a spirit of fair play and good sportsmanship.
It has not come about because we undertook cathartic mental exercises to rid our minds and hearts of negative emotions toward a group.
When you set about to solve problems by exercising imperious control over hearts, minds, and discourse, you are running a risk. You have made the mistake of thinking that reality is socially constructed, that what matters is not whether the game is being played fairly but what the outcome is. At that point, you will undoubtedly run up against the realities of the marketplace.
Those who believe that reality is socially constructed have no real use for the marketplace. They do not care for fair competition and free enterprise. Theirs is a world that ignores reality in order to impose an ideology on the world.
Thursday, May 5, 2011
Obama's Halo
Commenting on Barack Obama’s success in executing Osama bin Laden, Stephen Dubner suggested that an executive who succeeds is often seen to have gained a halo. Link here.
Thanks to this halo effect, the leader will be seen as competent and capable in other areas of responsibility. Being the captain of the winning football team induces other people to assume that you are great in math and history.
Unfortunately, the halo effect has very little staying power. If you assume that the quarterback is great at math, your assumption will be confirmed or refuted by his next math test.
Similarly, when an administration succeeds in executing Osama bin Laden, we naturally assume that it has gotten its act together. We assume that its communications and messaging operations will operate as efficiently as a platoon of Navy SEALS.
Yet, when someone who has never shown any exceptional leadership skills finally manages to display some, we will also be asking whether his success was an accident or whether it will be repeated.
Was success a singular occurrence, a mix of happenstance and good luck, or was it a sign of a high level of a hitherto unrecognized competence?
If the execution of bin Laden shows American resolve and was intended as a warning to future jihadis, then the question will be: if the leader had to do it over again, would he?
Was the operation an accident or a policy?
Thus, people must pay close attention to further evidence that would tell us and the world what to expect in the future.
Everyone wants to know, when the next crisis hits, who is going to show up: the intrepid commander or the bumbler.
Everyone agrees that Obama made a courageous and correct decision when he ordered the Navy SEALS to invade Osama bin Laden’s compound and execute the terrorist mastermind.
It was more risky than sending in a few bombs, but it provided us with a body, thus, with proof that bin Laden had been killed.
Shooting bin Laden in the face was a great touch, a loss of face, an extra humiliation. It is far better than shooting a man in the back.
At the least, you do not want your enemies to be able to save face.
I also agreed that disposing of the body in the sea was a fitting end to the episode.
I questioned whether or not bin Laden ought to have been accorded the respect involved in a memorial service. Some have suggested that the service was minimal, but others have countered that a man who was considered by our government to have hijacked Islam, thus, who was not a true Muslim, should not have been treated as one.
One commenter on this blog suggested that the funeral service on the USS Carl Vinson was minimal, which may well be the case. Others have taken a position closer to my original position, namely, that bin Laden did not deserve a proper funeral at all.
Some members of the Obama administration seem to feel that bin Laden deserved this respect. I find the assertion questionable.
Today Alan Dershowitz argued that the body should have been preserved for an autopsy so that we can best know what happened. Dershowitz recommended that we should have followed the procedure used in murder cases. Link here.
Here I disagree. Bin Laden was not a criminal; he was not going to be brought to trial, even by the Obama administration. Autopsy results might have been used to collect evidence against the SEALS who executed him.
American guilt culture does not need more fodder.
Obama was also correct to go on national television Sunday night to announce the operation’s success.
He erred, however, in drawing too much attention to himself and his role.
A great leader takes responsibility for errors and credits others for his successes. It’s a basic principle of leadership.
Obama had no real need to draw attention to himself. No one could have failed to notice that he was in charge. To say so was mildly insulting, too close to gloating.
Perhaps Obama felt that since he finally had gotten something right, he needed to pat himself on the back.
I think it right that Obama is traveling to Ground Zero today to lay a commemorative wreath. Ceremonial closure feels to me like the proper gesture.
And Obama did well to invite George W. Bush to attend the ceremony. From a man known for a singular lack of grace and respect toward his predecessor, this was clearly a step in the right direction.
I trust that this represents a fair appraisal of everything that the Obama administration got right.
Now, for the other side.
As Neil Munro just explained on The Daily Caller, the White House seems to have lost control of the public relations aspect of Obama’s most singular achievement. Link here.
Let us count the ways:
I do not think that Obama should have agreed to go on 60 Minutes this Sunday. It was bad enough, though excusable, for him to claim a large share of the credit for the success of the operation last Sunday. Appearing on 60 Minutes feels like gloating, like what he called, in an unfortunate metaphor, “a victory lap.”
In the same interview Obama claimed that he could not release the photo of bin Laden’s corpse because: “We don’t trot out this stuff as trophies.”
As John Podhoretz pointed out, these metaphors are strikingly inappropriate. Link here.
Sports metaphors, even mixed sports metaphors, do not fit the occasion. We are not dealing with NASCAR; we are talking about a war.
The sports metaphors detract from the seriousness and solemnity of the occasion. Executing bin Laden was not entertainment.
It sounds as though Obama was indulging in what he must have thought was cool adolescent argot. In so doing he made himself seem less a commander, more a spectator.
And then, Obama announced on 60 Minutes that he would not release photos of the dead Osama bin Laden.
One does understand that there are two sides to this question. Still and all, as many commentators have noted, if we sent in the SEALS in order to bring out a corpse, why hide its image?
Keeping the pictures secret has nothing to do with convincing the doubters who believe that bin Laden is still alive. Al Qaeda operatives have already declared that bin Laden is dead. Those who fail to believe now will never believe.
Some have argued that we must hide the photos because Muslims are an especially sensitive, that is, thin-skinned people. But, since when do we compromise our principles because someone might take grievous offense? Do they believe that Salman Rushdie's Satanic Verses should have been censored?
Some have suggested that humiliating photos would have incited Muslim violence. Others have countered that aspiring jihadis do not really need another motive to commit violence and mayhem. They have sufficient motivation in the death of bin Laden.
If Obama wasn’t worried about the “national security risk” in shooting Osama bin Laden in the face and dumping his body in the North Arabian Sea, why does he think that a photo will provoke a wave of Muslim violence?
If our government suspects it will, then Osama bin Laden must have been a respected figure in the Muslim community.
If bin Laden, as our government has been proclaiming for nearly a decade now, did not represent Islam, then moderate Muslims probably do not miss him. If they do not consider him to be one of them, they are not likely to be outraged by his ultimate humiliation and loss of face.
As Alan Dershowitz argued, there is no reason to believe that the image of the dead bin Laden would have incited any more violence than did the images of the dead Saddam Hussein.
Harold Evans suggested that showing the image of bin Laden would place us in the same moral universe as the Tudor monarchs who decapitated their enemies and displayed their heads and torsos around London.
Obviously, there’s a difference between visual images and suppurating body parts, but, as Evans notes, the picture of a dead and dishonored bin Laden might give some future jihadis pause.
It’s one thing to think that you can become a terrorist and be covered in honor and glory. Quite another to think that you are going to be publicly humiliated and that your mother will be exposed to a picture of you with half of your face shot off.
Now the administration has coupled the strong, courageous act of executing bin Laden with the weakish, retiring act of refusing to release his last portrait. In so doing it has compromised the message sent by the SEALS.
Public relations executive Eric Dezenhall suggested that the administration was projecting the view that: “displays of strength are immoral, that it is somehow immoral to defend yourself or neutralize an enemy with force.”
Clearly, that is not the message we ought to be sending.
By hiding the photo Obama is projecting regret, embarrassment, an unwillingness to take full responsibility for the consequences of his actions.
Failing to release the photos risks compromising what should have been Obama’s greatest success.
You want your messaging to state that the action was purposeful, intentional, and that, if you had to do it over again, you would.
Obama has both bragged too much about his success and hidden the evidence that would have shown his strength and power.
We are all willing to accord Obama the full measure of credit for his success. He himself seems to be projecting hesitation and embarrassment.
Labels:
Barack Obama