Sunday, May 31, 2015

India vs. China: Competing Cultures

China vs. India; India vs. China.

The great debate about liberal democracy will not be decided in debating societies or even in deliberative governmental bodies. It is being played out on a grand scale in the competition between India and China.

You see, China has achieved extraordinary economic growth over the past three decades without holding democratic elections. One notes that it has chosen a series of highly competent political leaders. The people who have been running China these many years are not vainglorious political hacks.

The notion of prosperity without democracy irritates idealists. They believe that liberal democracy and free enterprise must function in tandem.

For many years now, serious thinkers have been predicting that China will suffer an economic collapse, or at least a political rebellion from the oppressed masses. More sophisticated thinkers, people who do not really care who does or does not starve, will tell you that the Chinese people have been provided with jobs and opportunity and wealth and consumer goods… in order to dispossess them of their political freedom.

Besides, they will add, China is immensely polluted, filthy beyond human imagination or endurance. Rapid industrialization without liberal democracy has a price.

If China becomes the role model for developing countries around the world, the triumph of liberal democracy will not seem quite as inevitable as it once did.

Those who wish to promote an alternative to China often light on the largest democracy in the world: India.

For now, the per capita GDP of China is more than twice that of India, so the world’s largest democracy has some catching up to do.

Thus, I was amazed to read an article in the New York Times yesterday about the living conditions in India. The article’s author, Times New Delhi bureau chief Gardiner Harris, has just finished his tour in India. Harris, you see, is being reassigned. From the perspective of his family, not a minute too soon.

You see, the air and the water in New Delhi are polluted to the point that they have seriously damaged the health of Harris’s son, Bram.

In Harris’s words:

FOR weeks the breathing of my 8-year-old son, Bram, had become more labored, his medicinal inhaler increasingly vital. And then, one terrifying night nine months after we moved to this megacity, Bram’s inhaler stopped working and his gasping became panicked.

My wife called a friend, who recommended a private hospital miles away. I carried Bram to the car while my wife brought his older brother. India’s traffic is among the world’s most chaotic, and New Delhi’s streets are crammed with trucks at night, when road signs become largely ornamental. We undertook one of the most frightening journeys of our lives, with my wife in the back seat cradling Bram’s head.

When we arrived, doctors infused him with steroids (and refused to provide further treatment until a $1,000 charge on my credit card went through). A week later, Bram was able to return home.

Sadly, it was not an uncommon occurrence:

We gradually learned that Delhi’s true menace came from its air, water, food and flies. These perils sicken, disable and kill millions in India annually, making for one of the worst public health disasters in the world. Delhi, we discovered, is quietly suffering from a dire pediatric respiratory crisis, with a recent study showing that nearly half of the city’s 4.4 million schoolchildren have irreversible lung damage from the poisonous air.

How bad was it?

Sarath Guttikunda, one of India’s top pollution researchers, who moved to Goa, on the west coast of India, to protect his two young children, was unequivocal: “If you have the option to live elsewhere, you should not raise children in Delhi.”

These and other experts told me that reduced lung capacity in adults is a highly accurate predictor of early death and disability — perhaps more than elevated blood pressure or cholesterol. So by permanently damaging their lungs in Delhi, our children may not live as long.

And then there are nascent areas of research suggesting that pollution can lower children’s I.Q., hurt their test scores and increase the risks of autism, epilepsy, diabetes and even adult-onset diseases like multiple sclerosis.

But, we imagine, it must be much worse in China. After all, India is a democracy, and in democratic nations the people will never tolerate poisoned air and water. Besides, a democratic nation must have an environmental lobby. A capitalist bastion like China is not forced to respond to the will of the people... right?

Apparently, this is not the case. When it comes to pollution, India is in a class by itself. It far exceeds China… and China is certainly not a model of clean air and water.

Harris writes:

… the air and the mounting research into its effects [in India] have become so frightening that some feel it is unethical for those who have a choice to willingly raise children here. Similar discussions are doubtless underway in Beijing and other Asian megacities, but it is in Delhi — among the most populous, polluted, unsanitary and bacterially unsafe cities on earth — where the new calculus seems most urgent. The city’s air is more than twice as polluted as Beijing’s, according to the World Health Organization. (India, in fact, has 13 of the world’s 25 most polluted cities, while Lanzhou is the only Chinese city among the worst 50; Beijing ranks 79th.)

Of course, Harris is questioning whether he or any other expatriate ought reasonably to subject his children to the pollution that infests the air and water of New Delhi.

China might be better than India in this regard, but expatriates in Beijing still pose the same question.

Surely, it is an interesting question, but it implies that people have a choice in the matter. Other nations also have a choice. They can decide to emulate the Chinese example or the example set by Indian democracy.

Unfortunately, despite having the vote, the citizens of New Delhi probably do not find that a consolation for the conditions in which they are forced to bring up their children.

Liberty and Rights in the Magna Carta

Obviously enough, the history of the Magna Carta is significantly more complicated than what Daniel Hannan can recount in a newspaper column.

And yet, Hannan, a British representative to the European Parliament, is correct to emphasize how momentous this 1215 document has been.

Some believe that human culture is founded on dramatic events. Freud famously suggested that it all began when a band of sons murdered their father. Other modern versions see its origins in revolution, in a rebellion against unjust and oppressive authority.

For that reason, among others, it bears emphasis that British civilization began with a deal: a contract between King John and a group of rebellious feudal barons.

Hannan writes:

It was at Runnymede, on June 15, 1215, that the idea of the law standing above the government first took contractual form. King John accepted that he would no longer get to make the rules up as he went along. From that acceptance flowed, ultimately, all the rights and freedoms that we now take for granted: uncensored newspapers, security of property, equality before the law, habeas corpus, regular elections, sanctity of contract, jury trials.

History as negotiated compromise. Who would have imagined such a thing?

Hannan hastens to note that the Magna Carta did not create democracy. In fact, it was not really about democracy:

The very success of Magna Carta makes it hard for us, 800 years on, to see how utterly revolutionary it must have appeared at the time. Magna Carta did not create democracy: Ancient Greeks had been casting differently colored pebbles into voting urns while the remote fathers of the English were grubbing about alongside pigs in the cold soil of northern Germany. Nor was it the first expression of the law: There were Sumerian and Egyptian law codes even before Moses descended from Sinai.

Hannan takes it a step further when he waxes philosophical about what was achieved in 1215:

The rights we now take for granted—freedom of speech, religion, assembly and so on—are not the natural condition of an advanced society. They were developed overwhelmingly in the language in which you are reading these words.

When we call them universal rights, we are being polite. Suppose World War II or the Cold War had ended differently: There would have been nothing universal about them then. If they are universal rights today, it is because of a series of military victories by the English-speaking peoples.

Since we always assume that human rights are the natural condition of human beings, we are somewhat taken aback by the notion that they are an Anglo-Saxon invention, imposed by military conquest.

This does not make it an unnatural condition for human beings, but still, human rights did not flower naturally in all places and at all times.

Hannan adds that the rights enumerated in the Magna Carta were conceived in negative terms. They limited what the king or the government could do to individuals, what he could impose on them by fiat:

And indeed, Magna Carta conceives rights in negative terms, as guarantees against state coercion. No one can put you in prison or seize your property or mistreat you other than by due process. This essentially negative conception of freedom is worth clinging to in an age that likes to redefine rights as entitlements—the right to affordable health care, the right to be forgotten and so on.

Thus, at their inception, human rights were not entitlements.

Nor, Hannan reminds us, were they about universal democracy, taken to be the expression of the will of the majority.

As you know, at its inception, the United States was anything but a participatory democracy. At a time when only property-owning males could vote, America’s government did not express the will of the majority of the people.

Hannan writes:

Liberty and property: how naturally those words tripped, as a unitary concept, from the tongues of America’s Founders. These were men who had been shaped in the English tradition, and they saw parliamentary government not as an expression of majority rule but as a guarantor of individual freedom. How different was the Continental tradition, born 13 years later with the French Revolution, which saw elected assemblies as the embodiment of what Rousseau called the “general will” of the people.

When practiced by totalitarian dictatorships, the concept of the general will of the people became detached from democracy. Most Communist dictatorships called themselves democracies and insisted that their leaders embodied and expressed the general will of the population.

I suspect that we Americans are not ready to return to the old days of more republican governance, but still it’s intriguing to ask whether a government can express the will of the majority and can also guarantee the individual freedoms of everyone.

At the least, we should get over the idea that holding a democratic election is somehow a political panacea.


Saturday, May 30, 2015

A Long Hot Summer Begins

It’s too late to save the Obama presidency. The shadow of ignominious failure is falling on the first African-American president. From the Middle East to Asia to the American economy, the list grows.

Only Wall Street, the energy business and tech oligarchs seem to have flourished under Barack Obama.

Among those who did not flourish were the members of minority communities, the ones who voted in near unanimity for Obama. Those who had the greatest hope were the most disappointed. And now, they are angry.

America’s inner cities have floundered during the Obama years. By now, one cannot reasonably blame it on the Bush administration, so Obama and his satraps have found a new target: white police officers.

The problem is not that anyone commits a crime. The problem is that they get arrested.

You would almost think that the administration is trying to foment a race war. It’s almost as though they went out looking for a convenient scapegoat to blame for the Obama failure.

If Obama failed, the fault must lie with white America. Now, white America will have to pay… in the person of its police officers. If the net effect of this last effort at community organizing is a spike in crime in minority communities, well… you can’t have it all.

This morning Heather MacDonald reports on the crime wave that the Obama administration has unleashed on the people who voted for it:

In Baltimore, the most pressing question every morning is how many people were shot the previous night. Gun violence is up more than 60% compared with this time last year, according to Baltimore police, with 32 shootings over Memorial Day weekend. May has been the most violent month the city has seen in 15 years.

In Milwaukee, homicides were up 180% by May 17 over the same period the previous year. Through April, shootings in St. Louis were up 39%, robberies 43%, and homicides 25%. “Crime is the worst I’ve ever seen it,” said St. Louis Alderman Joe Vacarro at a May 7 City Hall hearing.

Murders in Atlanta were up 32% as of mid-May. Shootings in Chicago had increased 24% and homicides 17%. Shootings and other violent felonies in Los Angeles had spiked by 25%; in New York, murder was up nearly 13%, and gun violence 7%.

Those citywide statistics from law-enforcement officials mask even more startling neighborhood-level increases. Shooting incidents are up 500% in an East Harlem precinct compared with last year; in a South Central Los Angeles police division, shooting victims are up 100%.

MacDonald identifies the cause:

The most plausible explanation of the current surge in lawlessness is the intense agitation against American police departments over the past nine months.

In particular:

This incessant drumbeat against the police has resulted in what St. Louis police chief Sam Dotson last November called the “Ferguson effect.” Cops are disengaging from discretionary enforcement activity and the “criminal element is feeling empowered,” Mr. Dotson reported. Arrests in St. Louis city and county by that point had dropped a third since the shooting of Michael Brown in August. Not surprisingly, homicides in the city surged 47% by early November and robberies in the county were up 82%.

Similar “Ferguson effects” are happening across the country as officers scale back on proactive policing under the onslaught of anti-cop rhetoric. Arrests in Baltimore were down 56% in May compared with 2014.

And, of course, no one talks about the victims of black-on-black violence. Except MacDonald:

If these decriminalization and deincarceration policies backfire, the people most harmed will be their supposed beneficiaries: blacks, since they are disproportionately victimized by crime. The black death-by-homicide rate is six times higher than that of whites and Hispanics combined. The killers of those black homicide victims are overwhelmingly other black civilians, not the police. The police could end all use of lethal force tomorrow and it would have at most a negligible impact on the black death rate. In any case, the strongest predictor of whether a police officer uses force is whether a suspect resists arrest, not the suspect’s race.

Harassing Laura Kipnis

A short time ago Northwestern University professor Laura Kipnis wrote an article in the Chronicle of Higher Education. In it, Kipnis questioned today’s campus orthodoxy about rape culture and trigger warnings.

I dutifully posted on it here.

Now, the campus Brown Shirts and junior Red Guards have chosen to harass Kipnis over her ideas, the better to create a hostile intellectual atmosphere on college campuses. She compares it to being brought before the Inquisition. In that she is largely correct.

According to Kipnis, they have succeeded. The decline and fall of American education proceeds apace.

In a new article from the Chronicle Kipnis recounts her experience:

According to our campus newspaper, the mattress-carriers were marching to the university president’s office with a petition demanding "a swift, official condemnation" of my article. One student said she’d had a "very visceral reaction" to the essay; another called it "terrifying." I’d argued that the new codes infantilized students while vastly increasing the power of university administrators over all our lives, and here were students demanding to be protected by university higher-ups from the affront of someone’s ideas, which seemed to prove my point.

Clearly, the tactic is fascistic. Kipnis explains:

Marching against a published article wasn’t a good optic — it smacked of book burning, something Americans generally oppose. 

On what basis is Kipnis being assaulted? On the basis of Title IX and a new policy issued by the Obama Department of Education.

Score one for the zealots turned bureaucrats. They have no real interest in the criminal justice system. They want to impose their will on the nation and they have a special animus against free speech.

If you wanted to know where all of that government spending is going, you now know that the Obama administration has hired more bureaucrats and more lawyers to conduct kangaroo courts, administrative trials that circumvent due process and that shut down free inquiry on college campuses.

Kipnis explains:

Things seemed less amusing when I received an email from my university’s Title IX coordinator informing me that two students had filed Title IX complaints against me on the basis of the essay and "subsequent public statements" (which turned out to be a tweet), and that the university would retain an outside investigator to handle the complaints.

I stared at the email, which was under-explanatory in the extreme. I was being charged with retaliation, it said, though it failed to explain how an essay that mentioned no one by name could be construed as retaliatory, or how a publication fell under the province of Title IX, which, as I understood it, dealt with sexual misconduct and gender discrimination.

She continues:

Title IX was enacted by Congress in 1972 to deal with gender discrimination in public education — athletics programs were the initial culprits — and all institutions receiving federal funds were required to be in compliance. Over time, court rulings established sexual harassment and assault as forms of discrimination, and in 2011 the U.S. Department of Education advised colleges to "take immediate and effective steps to end sexual harassment and sexual violence." 

And also:

Apparently the idea was that they’d tell me the charges, and then, while I was collecting my wits, interrogate me about them. The term "kangaroo court" came to mind. I wrote to ask for the charges in writing. The coordinator wrote back thanking me for my thoughtful questions.

It’s big government at its worst:

The Title IX bureaucracy is expanding by the minute. A recent emailed update from my university announced new policies, programs, hires, surveys, procedures, websites, and educational initiatives devoted to sexual misconduct. What wasn’t quantified is how much intellectual real estate is being grabbed in the process. It’s a truism that the mission of bureaucracies is, above all, to perpetuate themselves, but with the extension of Title IX from gender discrimination into sexual misconduct has come a broadening of not just its mandate but even what constitutes sexual assault and rape.

Ambivalent sex becomes coerced sex, with charges brought months or even years after the events in question. Title IX officers now adjudicate an increasing range of murky situations involving mutual drunkenness, conflicting stories, and relationships gone wrong. They pronounce on the thorniest of philosophical and psychological issues: What is consent? What is power? Should power differentials between romantic partners be proscribed? Should eliminating power differences in relationships even be a social goal — wouldn’t that risk eliminating heterosexuality itself?

Stay tuned….


Friday, May 29, 2015

The World Is Watching

Conrad Black seems to belong to the crotchety-old-man, plague-on-both-their-houses school of thought. In that he follows David Goldman, aka Spengler, who has even-handedly attacked both Democrats and Republicans for their foreign policy follies.

Of course, being a crotchety old man does not disqualify anyone from offering a wise opinion. We do not ordinarily think of the elderly as possessing wisdom—we think of them as having lost their youth—but in the best cases wisdom accrues with age. Experience is an excellent teacher.

Better to heed the words of those older and wiser than to slobber over the empty thoughts of celebrities.

In a recent National Review article Black took up a question that I have often had occasion to opine on. He addressed the decline in American prestige around the world. It’s not about how much the world’s people like us or don’t. It’s about how much they do or do not respect us.

One hastens to mention that, in pondering your vote in the next presidential election, you should keep in mind that the world is watching and that the world is judging America by the quality of its leaders.

No nation can long function as world leader if it keeps electing presidents who do not command respect.

In the last two presidential elections America lost considerable prestige by electing an incompetent bumbler. If it next elects an equally incompetent bumbler, a woman who does not inspire respect, its prestige will take yet another tumble.

Polls show that a majority of Americans consider Hillary Clinton a strong leader. One must conclude that most Americans are suffering from a severe mental defect. Or else, that we, collectively, do not have a clue about what constitutes strength or leadership.

As for the current occupant of the White House, Black observes:

As President Obama and his entourage and imperishable following persevere in their conviction that this president’s benign championship of non-intervention, arms control, and giving rogue states the benefit of the doubt is winning hearts and minds to a new conception of a kindly, detached America, it is clearer every week that this administration’s foreign policy is contemplated with astonishment and contempt by practically everyone else.

Fair-minded to a fault, Black provides us with a list of the foreign policy failures of previous presidents, going back to Jimmy Carter:

President Carter was instrumental in removing the shah of Iran, the greatest ally the U.S. has ever had in the Middle East, not excluding Anwar Sadat and the Israelis, and the most enlightened leader in the 5,000-year history of Persia. President Reagan maintained civilized relations with Iraq in order to be on normal terms with one of the major Persian Gulf countries, and it may be a long time before there is agreement on exactly what Saddam Hussein concluded from his meeting with U.S. ambassador to Iraq April Glaspie on July 25, 1990, before subjugating Kuwait. (Ms. Glaspie’s next overseas posting was consul general in Cape Town.) President George H. W. Bush conducted a masterly coalition response to evict Saddam from Kuwait, but left him in power in Baghdad, massacring Kurds and putting on airs of triumph before the Muslim world. President Clinton pursued his zeal for nuclear non-proliferation to the point of imposing embargoes on both India and Pakistan when they acquired that capability, leaving the U.S. without any country to speak with in a normal and constructive manner all through South Asia between Jordan and Thailand. The initial campaign against terrorists, which attracted universal support and has been largely successful, mutated into overthrowing Saddam Hussein. George W. Bush promoted democracy to the point of destabilizing the friendly governments of Egypt and Pakistan and securing the democratic election victories of the anti-democratic Hamas in Gaza and Hezbollah in Lebanon. By dismissing the entire government and armed forces and police of Iraq, 400,000 men unemployed but retaining their weapons and ammunition, Bush ensured the country’s descent into an unorganized bloodbath, and President Obama, by his abrupt withdrawal, ensured the preeminence, in most of the country, of Iran, and a slugging match between Iranian proxies and the Islamic State death squads.

Unlike many of his predecessors, Barack Obama pursued a foreign policy driven by leftist ideology. Seeing America as the cause of most of the world’s ills, he apologized for the country, abased himself in front of other world leaders and withdrew from the international stage.

If America causes all that is bad, America’s absence can only be a force for good.

By supporting leftist causes and disparaging traditional American allies, Obama also sowed confusion.

Black explains it well, while also underscoring the fact that Republican foreign policy mavens did not seem to have any better ideas:

At the outset of his administration in 2009, Mr. Obama gave portentous addresses in Cairo and in Ghana that indicated that he thought all previous frictions that the U.S. had had with Middle Eastern and African countries could be laid at the door of the formerly entirely Caucasian and Judeo-Christian leadership, and that, given his more multiracial and multi-sectarian ancestry and orientation, these frictions had become obsolete. It was as if the president imagined that relations between states were ultimately determined otherwise than on the basis of their interests; that pigmentation and the religious and racial connections of ancestors could seriously influence interstate relations. Having abandoned George W. Bush’s sophomoric confidence in the panacea of democracy in countries inhospitable to it, Obama destabilized the Egyptian dictatorship of Hosni Mubarak while turning a blind eye to the brutal theft of reelection in Iran by Mahmoud Ahmadinejad. He thus completed the elevation of the Muslim Brotherhood in Egypt (the Arab world’s 900-pound gorilla for 75 years), which ransacked the Israeli embassy, poured sophisticated ordnance into Gaza for Hamas to use against Israel, and subverted the democratic constitution, Allende-style (Chile, 1973). Even when overthrown by the military high command that the Muslim Brotherhood had itself installed, the Muslim Brotherhood continued to be solicitously referred to not only by the Obama administration but by prominent Republican senators such as John McCain and Lindsey Graham.

It is a painful and notorious narrative. The Obama administration believes that U.S. involvement in the world has been largely harmful: Obama has decried the lack of alliance consultation by Franklin D. Roosevelt and Winston Churchill as they directed the war efforts of the Western Allies “brandies in hand.” He has apologized for President Truman’s use of the atomic bomb against Japan and for President Eisenhower’s approval of the removal of (the wildly incompetent if not mad) Mohammad Mosaddegh as leader of Iran and the return of the shah. This was an astonishing sequence of criticisms of three of America’s most distinguished leaders and arguably the most generally esteemed statesman in the world of the past 150 years (Churchill).

After decades’ worth of presidential incompetence, or, more precisely, voter insouciance, we are facing an ungodly mess.

Black concludes:

America’s traditional allies have lost all respect for American foreign-policy-making, and have certainly not replaced it with any sense of purpose of their own. As the U.S. cranks up to another presidential election, and rhetoric echoes loudly around the country about “the greatest nation in human history” (certainly a fair description in many respects), Americans should be aware of how the country is perceived by foreigners. Never mind the usual international-Left caricature of a police-run, coast-to-coast shooting gallery in the rubble heaps of many American cities (and there is unfortunately some truth to this version also), it has almost become, as President Nixon warned, “a pitiful, helpless giant.”

A Psychiatrist Becomes a Life Coach

Apparently, life coaching is coming of age. In the course of an article explaining how mental health professionals, and psychiatrists in particular, could learn from life coaches, Dr. Steven Moffic (via Dr. Joy Bliss at Maggie’s Farm) told of a psychiatrist who had turned to coaching:

A psychiatrist colleague recently retired but he turned to coaching as a second career because it emphasized the relationship he had valued most in his work as a psychiatrist. As an example, he finds that coaching is particularly relevant for dieting and exercise needed to reduce obesity. A late-career psychologist switched more and more to coaching techniques. Those who have had mental health care training can add depth to coaching that others may not be able to obtain.

Coaching seems to be a variant of what is called supportive psychotherapy. Back in the day, insight-oriented psychodynamic psychotherapy was considered the more prestigious treatment. It was obviously a derivative of psychoanalysis.

Moffic defined the two forms of therapy:

One of the overlapping skills these disciplines required was supportive psychotherapy. In contrast to insight-oriented, psychodynamic psychotherapy, supportive psychotherapy did not try to examine underlying conflicts that contributed to symptoms. Rather, it emphasized a casual and conversational interaction that focused on everyday life. The therapist could provide realistic praise, advice, guidance, and, at times, confrontation.

Supportive psychotherapy was most commonly provided for persons with severe and chronic mental illness in order to help such patients develop, or re-develop, life skills in relationships, work, and daily living. It was sometimes thought to be “second-rate” psychotherapy, especially compared with psychodynamic psychotherapy.

Moffic believes that an association with coaching might help psychiatry to overcome the stigma that is currently attached to it.

If stigma there is, it must have to do with two facts. First, that psychiatrists emphasize what is going wrong. They seem to believe that once they eliminate a mental pathogen all will be well. Second, psychiatrists increasingly limit their work to running a checklist and writing prescriptions. They do not really connect with their patients.

As Moffic noted, psychiatry directs its attention to mental disease and defects. Coaching works to help people to improve their ability to function in the world. It belongs to the realm of positive psychology.

In his words:

Curiously, mental health care services were available, but perhaps those seeking help felt stigmatized for therapy that seemed to address the normal challenges of life. Coaching focused on positive psychology rather than mental dysfunction.

Thursday, May 28, 2015

Why Are So Many College Students Suffering Anxiety?

More than a few American college students are suffering from anxiety. One might consider it a mental health issue, but perhaps it is more.

First, imagine that it’s about mental health.

A group of students that was brought up on a steady diet of unearned praise is lost and adrift, suffering from anxiety.

The students do not know how to cope. They do not know how to focus and concentrate. They are terrified of failure, because they were never told that they were not good enough and never learned how to deal with it. They never learned how to work hard and to work effectively. They lack resilience. They are filling up the offices of campus counselors.

The New York Times has the story:

Nearly one in six college students has been diagnosed with or treated for anxiety within the last 12 months, according to the annual national survey by the American College Health Association.

The causes range widely, experts say, from mounting academic pressure at earlier ages to overprotective parents to compulsive engagement with social media. Anxiety has always played a role in the developmental drama of a student’s life, but now more students experience anxiety so intense and overwhelming that they are seeking professional counseling.

As students finish a college year during which these cases continued to spike, the consensus among therapists is that treating anxiety has become an enormous challenge for campus mental health centers.

Nearly one in six was diagnosed. How many more were not diagnosed?

Why is this happening? The experts offer expert explanations:

Because of escalating pressures during high school, he and other experts say, students arrive at college preloaded with stress. Accustomed to extreme parental oversight, many seem unable to steer themselves. And with parents so accessible, students have had less incentive to develop life skills.

“A lot are coming to school who don’t have the resilience of previous generations,” Dr. Jones said. “They can’t tolerate discomfort or having to struggle. A primary symptom is worrying, and they don’t have the ability to soothe themselves.”

Surely their upbringing matters, but they also live in a world where they are pressured to enjoy all aspects of the college experience fully.

Curiously, over 20% of Harvard undergraduates did not have sex during their four years at the school.

Without knowing anything more, ask yourself whether these were the overachievers or the underachievers. Did this group flock to the counseling service, or were they holed up in the library? Were they the most popular kids, the kids who majored in partying?

One hesitates to blame everything on social media, but apparently the knowledge that other students are indulging in bacchanalian revelry bothers many students who are not.

The Times writes:

Social media is a gnawing, roiling constant. As students see posts about everyone else’s fabulous experiences, the inevitable comparisons erode their self-esteem. The popular term is “FOMO” — fear of missing out.

One cannot, on principle, doubt the value of the Times analysis.

And yet, ask the question that any good therapist would ask: if a patient is anxious, might there be a real reason for it?

Could it be that these students are not optimistic about their future because they have very little reason for optimism? Do they understand that America is in decline? Do they know that they are not being prepared to compete against their counterparts in other parts of the world?

Unless the world is about to redefine competition to give extra credit for decadence, that is.

Beyond that, are these students proud of their country? Are they proud to belong to this country?

If the nation’s heroes are forgotten while we obsess about its villains, if its proudest achievements are diminished by a relentless emphasis on the bad, children will not be allowed to feel pride in their country and pride in themselves.

Students who have been brought up on a steady diet of criticism, who have learned to be anguished about all of the ills America is visiting on its citizens and the world, will very likely feel demoralized.

Students who learn in school and in the media that America is a bad country, filled with people who harbor the worst criminal intentions—like racism, sexism, homophobia, Islamophobia, lookism, ageism—are not going to feel very much pride, in their country or in themselves. And they are not going to feel very optimistic about the future.

If that is the problem, then anxiety is the correct emotional response. 

An Alcoholic Commits Egotistical Suicide

What do you do when your husband decides to drink himself to death?

Can you stop him?

Can anyone stop him?

If you can’t stop him, are you an enabler?

Could you have done something?

Could something have been done?

Such questions assailed Paula Ganzi Licata after her husband died of alcoholic hepatitis.

She described her late husband’s problem:

My husband was a high-functioning alcoholic, which is a clinical-sounding way of saying no one knew he had Scotch before breakfast and urinated in the basement utility sink each night, too drunk to climb the stairs. His doctor estimated that Robert started drinking heavily only five or six years before his death. There was a sudden spiral, perhaps exacerbated by excessive amounts of Tylenol and unprescribed Xanax, in conjunction with a genetic predisposition. A perfect storm.

She reacted with waves of self-doubt, guilt and shame:

Could I have done more? Was I too harsh? Too easy? An enabler? Should I have kicked him out to scare him straight? Or driven him to an A.A. meeting every night? Should I have told more people? If I had left, would he have stopped drinking? Why did I stay? Hindsight is filled with “what if” scenarios, second-guessing every step of the past.

What about treatment?

Licata tried everything she knew. The physicians and the A.A. sponsors tried everything they knew. The problem was that her late husband refused to take advice. He was a very bad patient:

I was outraged by Robert’s denial and disregard; yet protective and heartbroken, wanting to save him from himself. In what I thought was the beginning of recovery, I accompanied a jaundiced Robert to his doctor where we were told his condition was reversible. He started an outpatient program, began seeing a psychologist specializing in addiction and attended only a few A.A. meetings, despite doctor’s orders that he go every day for the rest of his life.

I can’t say Robert “fell off the wagon,” as he never fully abstained. And I couldn’t force him into rehab: We lived in New York, where a person with an alcohol or substance abuse problem must voluntarily appear for treatment unless he presents an immediate threat to himself or others. The threat Robert presented wasn’t the kind they meant.

Six months later Robert was given a diagnosis of alcoholic hepatitis and given a 90 percent chance of dying within two weeks. All my anger and frustration vanished, replaced with heart-wrenching devastation.

I changed doctors to a specialist who offered better odds and a steroid program; arranged to have A.A. reps visit and tell their survival stories; coordinated bedside therapy sessions. Family, friends and professionals all tried to keep Robert focused. But he was a terrible patient. Robert’s phobias made him demanding and uncooperative, refusing dialysis, treatment rooms with low lighting or hospital rooms on a high floor. I slept beside him in a recliner. “Don’t be long,” he’d call out whenever I left the room.

After her husband died, Licata was enraged. Appropriately so: his slow-motion suicide was clearly an egotistical act, an assault on her moral being. (We owe the concept of egoistic suicide to French sociologist Émile Durkheim.)

Robert didn’t seem to care what happened to him, as long as he could hurt her. He did it by inflicting psychological torture, by showing that he was so powerful and so willful that he could do as he pleased. No one could stop him. His legacy to his wife would be a curse. She would feel shame for having failed to save him.

She wrote:

In the wake of Robert’s death, I began to process the past. What I’d come to accept — living separate lives with an alcoholic — was a wretched existence. Some surviving spouses are angry at God or at the cancer; I was angry at my husband. Hell hath no fury like a widow born.

If her husband was so hellbent on hurting his wife and those who cared for him, there was nothing she could do. One likes to imagine that the state should pass laws allowing us to commit such people involuntarily to rehab, but that seems farfetched.

Should Licata have abandoned her husband and left him to his bottles? In many cases this is the correct response.

And yet, if her departure appeared to have precipitated his descent into oblivion, she might have suffered more guilt, coupled with social opprobrium. After all, they had been married for nearly two decades when he began his death spiral.

If she had left him while he was apparently sick, she would also have risked social censure for abandoning a sick man.

Had she spent the rest of her life grieving for him and guilt-tripping herself, he would have succeeded.

In reality, Licata moved on. She did not allow his abuse to define her life. She found a new husband and concluded:

The scales tipped as surely as Robert’s last years of drunken selfishness, recklessness and verbal abuse obliterated our good years. 

Wednesday, May 27, 2015

"Escaping into Blank Mental Oblivion"

Too much of a good thing can easily become a bad thing.

I have not found a reason to oppose mindfulness meditation, but I also recognize—see this post—that too much meditation can produce negative consequences.

The therapeutic power of mindfulness has been widely reported and accepted. Still, a little skepticism is always a good thing.

It’s one thing to take a few minutes to rest one’s mind. It’s quite another to make clearing one’s mind a way of life.

Oxford historian Theodore Zeldin has recently issued a caution about mindfulness.

The London Telegraph explains:

Theodore Zeldin said too many people were avoiding using their brains and instead escaping into a state of blank mental oblivion.

One appreciates a nicely turned phrase: “escaping into blank mental oblivion” surely counts as one.

Zeldin believes that the mind should be engaged in thought, not in self-emptying. And he believes that instead of trying to get into our minds, we ought to get into the world, the better to develop and sustain relationships with other people.

The Telegraph writes:

But Dr Zeldin said the practice was distracting people from discovering more about other people and the world around them, and encouraged them to instead seek to make new relationships with those who shared different views. He said the world needed to move away from an era of self-discovery.

“It’s important not to think just about yourself,” Dr Zeldin told the Hay Festival. “You think that trying to avoid things by doing exercises which free the mind from thought will empty our minds.

“I think mindfulness and meditation are bad for people, I absolutely think that. People should be thinking.”

After all, you cannot engage in the marketplace of ideas all by yourself. To develop your thought you need to exchange ideas with other people.

Zeldin recommends that you look outward, not inward. He wants you to stop introspecting, stop trying to be whatever you want to be and stop engaging in voyages of self-discovery:

He said people should stop believing they could be anything they wanted to be and instead start appreciating the value of those around them.

“One of the beliefs of this time is you’ve got to be yourself and develop your own potential, but only thinking of oneself is a feeble and cowardly activity.

“Our potential on our own is very limited. We go to these motivational speakers and they say you can be anything you want to be. You can’t. Your potential is very limited.”



The Perils of Paid Family Leave

You remember the law of unintended consequences.

A zealous ideologue concocts a solution to a problem that mostly exists in his mind. Being an ideologue, he does not recognize that the problem merely manifests a reality that he denies. He also fails to see that, however pure his intentions, the outcome need not fulfill his wishes.

When the supposed problem exists within the labor market the zealous ideologue will want to use the power of government to impose a solution. He does not understand that government attempts to control the labor market have often come a cropper.

Today’s instance involves laws that provide new mothers with generous dollops of paid parental leave. In principle, these rules allow women to keep their jobs and their places on career tracks while giving them some quality time to bond with their newborns.

Sounds good, doesn’t it?

In practice, it ends up being bad for women.

The New York Times has the story:

In Chile, a law requires employers to provide working mothers with child care. One result? Women are paid less.

In Spain, a policy to give parents of young children the right to work part-time has led to a decline in full-time, stable jobs available to all women — even those who are not mothers.

Elsewhere in Europe, generous maternity leaves have meant that women are much less likely than men to become managers or achieve other high-powered positions at work.

Why is this happening?

For one thing, employers know that young women are more likely to have children and therefore are more likely to take extended leaves of absence. They also know that they will be paying these women their salaries during these absences.

The result: companies are less likely to hire young women. When they do hire young women they pay them less and grant them fewer responsibilities.

Why hire someone who is going to cost you more and will spend less time on the job when you can hire someone who will not be taking an extended family leave?

It doesn’t take advanced math.

Look at the situation in Spain:


Spain passed a law in 1999 giving workers with children younger than 7 the right to ask for reduced hours without fear of being laid off. Those who took advantage of it were nearly all women.

Over the next decade, companies were 6 percent less likely to hire women of childbearing age compared with men, 37 percent less likely to promote them and 45 percent more likely to dismiss them, according to a study led by Daniel Fernández-Kranz, an economist at IE Business School in Madrid. The probability of women of childbearing age not being employed climbed 20 percent. Another result: Women were more likely to be in less stable, short-term contract jobs, which are not required to provide such benefits.

“One of the unintended consequences of the law has been to push women into the lower segment of the labor market with bad-quality, unprotected jobs where their rights cannot be enforced,” he said.

The same is true in many other countries:

These findings are consistent with previous research by Francine Blau and Lawrence Kahn, economists at Cornell. In a study of 22 countries, they found that generous family-friendly policies like long maternity leaves and part-time work protections in Europe made it possible for more women to work — but that they were more likely to be in dead-end jobs and less likely to be managers.

Naturally, the zealots are trying to find a way to solve the problem they created.

Among their solutions: encourage men to take family leave when their wives give birth. Because, after all, if extended family leave is bad for a woman’s career, why not share the badness?

If both parents take parental leave, both will be consigned to lower paying jobs. Both will be knocked off the management track. 

Unless paid family leave is obligatory for men, those men who choose to take it will put themselves at a disadvantage. The zealots want men to take time off too because they believe that there is no real reason why it should be women who mother their children.

People who fail to understand why a mother’s presence is more important for a newborn should not be making public policy on these matters.

Tuesday, May 26, 2015

Lena Dunham Finds a Therapy That Works

As you know, Lena Dunham of the HBO series “Girls” has taken over from Woody Allen as the poster patient for psychotherapy.

By her own admission, Dunham began therapy as a child and has continued it ever since. She has dutifully taken the medications that were prescribed for her anxiety and depression.

Now, however, she has found a better way to treat her emotional problems: regular physical exercise.

The Daily Mail has the story. It quotes a remark Dunham posted on Instagram:

Last month the comedic-actress told her Instagram fans that regular exercise 'has helped with my anxiety in ways I never dreamed possible'

Not exactly a full-throated endorsement of therapy.

To that Dunham added:

'To those struggling with anxiety, OCD, depression: I know it's mad annoying when people tell you to exercise, and it took me about 16 medicated years to listen. I'm glad I did,' she captioned a photo of her figure in leggings and a yellow sports bra.

I couldn’t have said it better myself.

The Mental Health of Women Warriors

Apparently, the American military is so far superior to its adversaries that it can send women to the front lines. In the Iraq and Afghanistan wars, more women served than ever before.

After all, why not use a war as an exercise in gender-bending social engineering?

The dominant ideology dictates that there is no significant difference between men and women. Anything a man can do a woman can do, too. And vice versa. Thus, women soldiers should be allowed in combat or in combat support, lest they lose out on career opportunities.

As we know, people fight wars to advance their careers and to make ideological points.

Reporting on the experience of Courtney Wilson, a lieutenant who led a platoon in an engineering battalion, New York Times reporter Benedict Carey shows that Wilson had real difficulty bonding with male soldiers.

Naturally, no one is allowed to imagine that women are never going to enjoy male bonding and male camaraderie. It’s easier to think that the women and perhaps even the men just need more therapy.

In Carey’s words:

One of the biggest adjustments the United States military attempted during the Iraq and Afghanistan wars was cultural: the integration of women into an intensely male world. Women made up about 15 percent of the force during these two wars, compared with 7 percent in the Persian Gulf war of 1991, and they saw more combat in greater numbers than ever before.

Yet even though women distinguished themselves as leaders and enlisted soldiers, many of them describe struggling with feeling they do not quite belong. For men, the bonds of unconditional love among fellow combatants — that lifeblood of male military culture — are sustaining. But in dozens of interviews with women who served, they often said such deep emotional sustenance eluded them.

To today’s military officers, it’s a problem needing a psychological fix:

“Clearly these data beg us to account for why there’s this apparent surge in felt hopelessness and alienation among so many women service members during deployment,” said Dr. Loree K. Sutton, a retired brigadier general, a psychiatrist and the commissioner of the New York City Mayor’s Office of Veterans’ Affairs. “This is a critical endeavor, and it’s got to go beyond individual factors and look at group dynamics.”

Of course, everyone asserts that women are great soldiers and great officers. But, if these women cannot bond with their troops and if their troops do not respect them, how good can they really be?

Carey does not address another important issue, so we will. Do female soldiers undergo the same basic training as male soldiers? Are women required to undergo the same physical exertions and demonstrate comparable strength?

If the military changes the standards in order to accommodate women’s constitutional weakness, one understands why men would not treat them as one of the guys.

If Lt. Wilson is any indication, women soldiers do not feel that they belong. Thus, they are prone to interpret even good-natured ribbing as ridicule and contempt.

One hates to say it, but even though the dominant ideology forces everyone to agree that there are no significant differences between men and women, the facts remain and reality will out.

Considering how strict the army is about sexual harassment and sexual assault and even fraternization, the presence of women represents a threat to men’s careers.

Consider this:

In contrast, the women said, they got mixed messages. The Army bans most jewelry and makeup yet is institutionally protective toward women, at least out in the field. “You’re treated like a girl, and yet you can’t really be a woman — that’s the feeling,” Lieutenant Wilson said.

Mixed into this odd displacement were ever-present sexual undercurrents. Many women said that at night, on base, they would not go to the bathroom without an escort. Lieutenant Wilson said a noncommissioned officer in her unit continually made sexual jokes that made her so anxious she thought about reporting him. She decided against it, but the threat lingered.

In fact, almost any consorting with a male soldier was enough to feed an appetite for gossip that rivaled high school, veterans said. “What made it unbearable were the moments you felt you couldn’t socialize or bond with the men after a hard day — they were mostly hard days — because of the rumors that would fly,” said Susanne Rossignol, who served in Baiji, Iraq, in 2004 and 2005.

Lt. Courtney Wilson might have been tough as nails, but she could not deal with mockery and nearly became anorexic when someone called her fat.

Carey describes the problem:

“Lieutenant Wilson is a model officer whom I would trust with the most difficult mission,” her company commander wrote in September of 2010.

But she was less certain she inspired affection. “Courtney doesn’t have that laid-back humor a lot of guys have, so she’d get teased and didn’t know how to shrug it off,” said Lieutenant LaPonte, who became a close friend. “She took everything personally.”

Perhaps no more so than when a couple of soldiers cracked that she looked fat.

It was a bad joke, at best; a distance runner, she worked out whenever she could. Still, it got under her skin. “I was living on carrots and water,” she said. “I was down to 122 pounds, so skinny you could see my clavicle. It was crazy, but I felt I had to prove something to them.”

Weigh her words: she felt she had to prove something to them—prove what, exactly? That she was a slim, attractive woman?

When her efforts at self-medication failed, Lt. Wilson tried therapy:

Jack Daniel’s and Coke blunted the anxiety, but the relief did not last. She tried biofeedback, prayer, meditation and psychiatric medications. Finally, reluctantly, she began regular talk therapy with a psychologist at the Fort Hood military base in Texas.

 “She really struggled to connect with other people, and in part it’s because she was trying to be someone she was not,” Roger Belisle, a clinical psychologist at Fort Hood’s Resilience and Restoration Center, said in a phone interview.

It’s not just that she was trying to be someone she was not. It’s that the army was allowing her to pretend to be someone she was not.

This all raises an interesting question. America has chosen to ignore the fact that men and women are different. It has chosen to place more and more women in combat or combat support missions.

And yet, most men consider themselves duty-bound to protect women. Does this mean that as more and more women join the military the nation will feel less and less interested in fighting wars? Does it mean that with more and more women in the military America will prefer to see wars in terms of winning hearts and minds? Does it mean that the nation will feel queasy sending women into combat, unless it’s in a war against climate change?