Some people abhor self-control. Or at least they think they do. Their life goal is to let loose, to free themselves of all civilized constraint, and to overcome inhibition. They are more than willing to sacrifice their character on the altar of full and open self-expression.
And why not? Didn't Freud teach us that the effort to control your appetites, your instincts, and your very lust was an exercise in futility?
If human life is as Freud said it was, that is, a struggle to the death between your mind and your appetites, between your ego and your instincts, or between you and your cravings, then you are destined to lose.
All the recent research in this field tells us that you cannot control impulses or appetites by focusing your attention on controlling your impulses or appetites.
If you believe Freud, then, you should not waste your time trying to exercise self-control. You should join the winning team and let your instincts have more or less free rein.
I am well aware that Freud did not make this recommendation. That does not prevent it from inhering in the logic of his argument, and in the logic of the myth that underlies it.
For those of you who wish to become Counterfreudians, I recommend Jonah Lehrer's article outlining the latest research findings from the field of self-control. Link here.
When it comes to New Year's Resolutions Lehrer offers this utterly sensible advice: take one resolution at a time. If you are trying to change too many bad habits at once you will surely fail. The brain is not set up for drastic, instant transformations.
And Lehrer adds these two salient points: "... practicing mental discipline in one area, such as posture, can also make it easier to resist Christmas cookies. And when a dangerous desire starts coming on, just remember: Gritting your teeth isn't the best approach, as even the strongest mental muscles quickly get tired. Instead, find a way to look at something else."
It sounds like Lehrer has read his Aristotle. If you want to build discipline, you must, Aristotle would have had it, practice discipline. And it is not all that important where you start. Try making a habit of showing up on time or of returning messages promptly.
Surely, this is quite different from the notion that you can build discipline by discovering the hidden reason why you have none.
The second point, also via Aristotle, is simple. When you have a bad habit, it is fruitless to get into a struggle to control or contain it. The best, and perhaps the only, way to get rid of a bad habit is to replace it with a good one.
If it is just you and the Christmas cookies, you are going to lose. If you want to control your consumption of cookies, every time you start thinking about cookies, try thinking about carrots.
Image matters. Images that symbolize a presidency are often called iconic.
In prior posts I joined those who suggested that Obama's bows to the King of Saudi Arabia and the Emperor of Japan were iconic expressions of a deferential foreign policy.
A skilled propagandist like Michael Moore knows well the power of iconic images. When Moore was working to undermine the Bush administration he suggested that the image of Bush reading to children during the 9/11 attack should be an iconic image of the Bush terrorism policy.
It did not work. For a very simple reason. It was overwhelmed by the power of the image of George Bush standing in the rubble of the World Trade Center, his arm around a firefighter, announcing that the terrorists would soon hear from us.
That image held through the 2004 election. It defined Bush and his administration until Hurricane Katrina. Then it was drowned out by the image of Bush saying: "Heck of a job, Brownie."
That iconic image defined the last years of the Bush administration. Whenever a catastrophe happened, the public reaction was that Bush was detached and oblivious, that he had abandoned the nation to its fate.
It may not have been fair; it may not have been historically accurate; but it was surely the lens through which the public judged the last years of the Bush administration.
Now, Maureen Dowd, apparently suffering from buyer's remorse, has set out to identify the iconic image that is defining the Obama administration. Echoing George Bush's glib remark, she performs a deft ironic twist and offers her own judgment of Obama's response to the Christmas terror attack as: "Heck of a job, Barry." Link here.
Dowd began her column beautifully. Walking past the White House, she had an epiphany: no one is home. Meaning: no one is in charge. Absent presidential leadership, we are more vulnerable than before.
Bush was present at ground zero after the 9/11 attack. On Christmas day Obama was in Hawaii enjoying the sun and surf and golf. Obama was not going to let a small thing like the failure of our counter-terrorism policy interrupt his sun-filled vacation.
His first reaction was perfectly diffident. He seemed to be taking a few moments from his golf game to read a prepackaged statement. Emotionless and tieless Obama lapsed into dead-letter legalese, leaving everyone confused and in doubt.
Where Bush was resolute; Obama was legalistic. Where Bush was present; Obama was barely there. Where Bush was courageous; Obama was pusillanimous. Where Bush took command of the situation, Obama blamed his predecessor.
As a ploy, it did not work.
After measuring the universal public condemnation, Obama decided to restate his position. Only this time he phoned it in. As Dowd shrewdly notes, when Obama stepped forth to denounce the systemic failure of the government, not only did he not seem to recognize that he was the head of this government, but he did not even show his face.
As Dowd says, there was no video feed. For her it was a Spock moment: "In his detached way, Spock was letting us know that our besieged starship was not speeding into a safe new future, and that we still have to be scared."
What are we to make of the absence of a video feed? Does it mean that the administration does not want to risk producing an iconic image of diffidence?
An iconic image must be an image. Perhaps the Obama image-controllers outsmarted Maureen Dowd, leaving her and the rest of us still hungry for the right image to symbolize the Obama presidency.
Doubtless the first year of the Obama presidency will provoke a blizzard of commentary. Among the best, in regard to Obama's foreign policy, is Fouad Ajami's: "A Cold-Blooded Foreign Policy." It's an excellent read. Link here.
With the end of the decade fast approaching historians are ramping it up to find a concept to summarize the period. So says the eminent historian Simon Schama. Link here.
Schama informs us that the British have dubbed the decade the Noughties, a nice play on nothingness and decadence. In America they would be called the Aughties, but not, as Schama says, the Oughties.
I'm not sure why a decade filled with wars and capped off with an economic crisis should be defined in those terms, but, then again, I am not an historian.
Schama, however, takes the concept and draws the next and most logical conclusion. Having just passed through the Noughties or Oughties, he declares: "... if we end up calling the next decade 'the Teenies,' then, I must regretfully announce There Truly Is No Hope."
While I hate to correct such an eminent historian, I will this one time, in the interest of keeping a glimmer of hope alive.
Strictly speaking, next year will not begin "the Teenies;" it will usher in "the Pre-Teenies."
I am feeling a bit behind the times today. Last February Mary Eberstadt published a wonderful essay called: "Is Food the New Sex?" I hadn't heard of it until today when David Brooks gave it a well-deserved award. Link here.
Eberstadt's argument is elegant for its simplicity, and brilliant for its concision. It's very "high concept."
She begins by observing that we Americans have now arrived at the point where we can, for all intents and purposes, eat as much as we like and have as much sex as we like.
What happens when food consumption is no longer governed by laws regulating scarcity and when sex is plentiful and readily available?
Eberstadt is not just interested in our patterns of consumption, but even more in the moral codes we use to regulate our appetites, or not.
Eberstadt argues that our moral attitudes toward these two appetites have undergone something of a sea change. We used to eat what we liked, and did not worry about what other people ate. Food consumption used to be a matter of personal taste.
And yet, among the young and the educated, or better, among the higher classes-- as though it were a matter of class identification-- people believe that what we eat should be subjected to strict moral regulation.
In many enlightened circles a good person, a person of high moral standing, is someone who does not eat animal flesh or animal products.
More than that, he does not stop at the brown rice and tofu. He also feels that he must proselytize his gospel. He sets out on a mission to convert others to his beliefs and practices. Having appointed himself to the august position of savior of the planet and all its creatures, he feels a moral necessity to make sure that everyone consumes according to his example.
Whether or not he is aware of it, he is, as Eberstadt notes, a Kantian. He believes that his personal moral decisions should be universal laws.
As Eberstadt readily admits, much of what we now know about nutrition points us toward more healthy eating. But at the same time, much of what we know about sexual license points us toward a more judicious expenditure of our sexual resources.
Yet, the moral scolds among us have decided that we must all show the greatest respect for our alimentary canals, while we dare not show the least respect for our sexual being.
According to the new rules, when it comes to sexual appetite, the commonly-held moral wisdom says that anything, or just-about-anything, goes. When it comes to your appetite for food, nothing, or just-about-nothing, goes.
When someone is misbehaving sexually we dare not utter a word. When someone is mistreating his body by filling it with junk food or processed food or sugar or animal fat we feel that we are morally obligated to speak out and condemn his choices.
Eberstadt's argument should alert us to the simple fact that our moral reasoning has been infected with intemperance. This all-or-nothing moral thinking means that neither of our two appetites is submitted to rational control. Surely, we would all be better off if food consumption suffered less regulation and sexual consumption more.
Take the example of a borderline foodstuff... tobacco. See Claude Levi-Strauss' examination of South American Indian myths about tobacco consumption in his book: "From Honey to Ashes."
We do not ingest tobacco, but we consume the product of its combustion. And we do not do so any which way. Rituals and ceremonies involving smoking have long been commonplace. Their existence alerts us to the fact that tobacco has a social, if not biochemical, privilege. There was a time, within the memory of those of us who are older, when cigarettes were present at parties and business meetings.
We also know that tobacco smoke contains a powerful anti-depressant called nicotine. It is not an accident that so many people have been addicted to tobacco smoke.
Now, of course, we also know that tobacco is a poison. It is bad for our health, and it is something that most of us have quit or should quit.
Yet, the culture does not consider smoking a personal or a private matter. It is not just the smoker's business; it is presumed to be everyone's business.
The culture condemns smokers in the most strenuous terms. Smokers are not merely considered to be fools who are damaging their health. They are treated as moral lepers; they are shunned from polite society; they are treated as pariahs.
You might say that we all pay for the smoker's bad habit, through increased health insurance premiums, just as we all pay for the obese individual's overeating. Smoking makes people sick; overeating does the same.
This is surely true, but the same can be said about anonymous and promiscuous sex. How many young women have contracted STDs from such practices? What is the prevalence of herpes and PID among young women today? How many women have fertility problems because of the aftereffects of youthful sexual indiscretions?
I am not trying to moralize about sex. I am not even trying to bring back the bad old days of sexual repression. I am simply supporting Eberstadt's argument that bad health consequences alone do not explain why we feel that we have the right to condemn those who smoke or who eat the wrong foods, while we feel that we must not cast the least moral aspersion on any sexual practice performed by two consenting human beings.
One of our two appetites is subject to strict moral regulation; the other is subject to nearly none.
Eberstadt is well aware of the fact that many religions have sumptuary rules. These rules, she explains, told believers what to do with food once they got it. They did not tell people to dwell on food, to make it a quasi-religious experience, an assertion of their fundamental moral goodness.
Dietary rules have traditionally been invoked for reasons of health, but also to enhance religion. They make you part of an organized community, not an inchoate cult.
According to Eberstadt, the new mania about food is not part of a religion; it is supposed to replace religion.
As you read Eberstadt, one question looms above all else. How did we get to this point, and what does this moral deformity mean?
Her conclusion: in the hidden recesses of our brains we are uncomfortable with the excesses of the sexual revolution. Yet, we are so afraid to be considered sexually repressed or neurotic that we cannot bring ourselves to find fault with any form of sexual expression.
But if we deregulate one area of human behavior, we are still human, and we must still demonstrate that we have the capacity for following rules and for controlling our appetites. If we did not, we would be less than human, and surely less than ethical members of the human community.
Since the rules governing food consumption are only incidentally about food, they are far too rigid and, at times, irrational.
Yet, we follow them anyway, because our human dignity is at stake.
Psychological theorists have long explained the transition between childhood and adulthood in terms of the ability to delay gratification. Apparently, the younger among us go immediately for any object or action that would gratify our needs, while the more mature among us are capable of holding off until the moment is right.
Now, John Tierney has informed us of new research that proves, yet again, that it is possible to have too much of a good thing. Link here.
It seems that some people become too good at delay. They miss the moment for pleasure because they are constantly putting things off until tomorrow.
In Tierney's words: "Once you start procrastinating pleasure, it can become a self-perpetuating process if you fixate on some imagined nirvana. The longer you wait to open that special bottle of wine, the more special the occasion has to be."
The moral of the story: do not wait for the arrival of an unattainably ideal moment. You would probably do better not to drink that great bottle of wine the minute you receive it, but, if you are prone to delay your gratification excessively, it is best to keep in mind a maxim that Tierney uses to end his article: the moment you open that special bottle of wine is a special occasion.
Sounds rather Zen... in the good sense of the word.
Ask yourself what makes people happy and you will likely find the answer Prof. Sonja Lyubomirsky gave: good relationships. At least, that was her answer before she researched the question in depth.
After she and two of her colleagues did their research they discovered that people gained and sustained happiness, in part through relationships, but, more importantly, through ... work. Link here.
Anyone who has lived in our therapy culture will be shocked by this answer.
Not so Harvard psychiatrist Dr. Richard Mullican, who famously once said: "the best antidepressant is a job."
If you are looking for happiness, work will take you there. As Lyubomirsky explained in an article that accompanied the publication of her book, "The How of Happiness," work structures your life, it structures your relationships, it directs your efforts toward a productive end, it provides opportunities for success, and it allows you to fail.
You have to start with a job, but having a job is not enough. You have to be good at it; you have to work hard at it and to take it seriously; and you have to succeed at it.
Shockingly, happiness seems to involve competition!
When self-proclaimed self-esteem gurus decide that they do not want children to compete in spelling bees or dodgeball games-- lest the losers feel badly about themselves-- they are really, according to these research findings, making it more difficult for the children to be happy.
When people like your humble blogger make much about the value of work and the importance of following a work ethic, they are not trying to cause you to repress your instincts. They are trying to point you toward the road to happiness.
I will mention that Lyubomirsky's conclusions predate the advent of this blog.
While I was not at all surprised to read her conclusions, the point that struck me most forcefully was what she left out of the equation.
That was: all of the prescriptions offered by the therapy culture. Happiness did not involve the pursuit of pleasure for pleasure's sake. It did not privilege true love or romance. It cannot be attained by following your bliss or by getting in touch with your feelings. It does not involve acting according to your heart's desire, having a rich fantasy life, or living your dreams. It definitely has nothing to do with expressing your emotions freely and openly. And it does not require an interminable experience of psychotherapy.
If this is true, and I believe that it is, then the prescriptions that the therapy culture has been peddling as antidotes to the modern condition, or even as cures for what ails you, will do nothing more or less than distract you from the task at hand: working your way toward happiness.
A while back I confessed to being a fan of relationship coach Jag Carrao. Link here.
Now, having read two recent posts about relationship lies women tell themselves, I am even more of a fan. Links here and here.
Carrao is concerned that women have fallen into the habit of telling themselves lies about why their relationships are not working. In her view they feed off the false hope that these lies engender.
In her words: "These myths may feel comfy, but by insulating us from sometimes unpleasant realities, they undermine our ability to make rational decisions based on complete information, thus sabotaging our long-term romantic goals."
Of course, women tell these lies to keep hope alive, to try to sustain a relationship that is not working, to avoid the clear signs that it is going nowhere, and, above all else, not to have to feel rejected.
Or so everyone thinks. A very popular book and movie has suggested that women suffer because they cannot face the cold, hard truth that: "he's just not that into you."
Clearly, the authors have a point here. No one wants to embrace rejection, and no one wants to rush toward it when there might be other explanations for the rejecting behavior.
And yet, rejection implies fault, and I suspect that we would all be better off if we found a better way for people to make more rational decisions about their relationships.
As it happens, the good old days offer a suggestion. What if the relationship is not working because it just wasn't meant to be? It is one thing for a woman to have to feel that he is just not that into her, thus that he is rejecting her, but quite another for her to accept that the two of them were just not meant to be together.
Perhaps there are places where it is possible to make it work, but relationships do not seem to be one of those places.
For those of you who are interested in the latest and most interesting research in psychology, I have just added a link to a blog called Simoleon Sense. As we know much of the most interesting research into psychological motivation is coming from the intersection of economics and cognitive neuroscience.
The brainchild of Miguel Barbosa, Simoleon Sense offers links to important academic papers in the field of behavioral economics, mixed with some current columns on finance.
I myself was struck to read a report suggesting that social rejection really does cause physical pain. Link here. Since social rejection and isolation have everything to do with depression and general states of demoralization, it is valuable for us all to understand that when depressed people say that they are feeling pain, they are not using a metaphor. Severe depression really does hurt.
Or else, here is a link to an article by George Loewenstein about the visceral, or appetitive influences on behavior. Link here.
Or, this older paper by Loewenstein on the role of emotions in economic theory or behavior. Link here.
In an earlier post I noted that the Protestant Work Ethic was alive and well... in Israel. Link here.
As I suggested, if the work ethic has been killed off in America, much of the credit must go to the therapy culture. Having denounced the work ethic as a ploy designed to repress your sexuality and render you neurotic, it convinced far too many people that their life's goal should be the pursuit of pleasure.
God knows, I am not suggesting that people should experience any less enjoyment in life, but God also knows that success is the mother of pleasure, and that if you fail too often your capacity for pleasure will be seriously undermined.
All of this to introduce an article entitled "My Lazy American Students" by Professor Kara Miller. Link here.
Miller has had the enviable opportunity to teach college-level students from all over the world. She reports that American students are markedly less diligent, less assiduous, less focused, and less hard-working than students from the rest of the world.
Today's American college students are leading the world in sloth! They have no concept of getting their work in on time or of concentrating on the tasks at hand.
Miller judges her American charges to be more creative than their Asian and Latin American counterparts, but what use is their creativity when they are constantly falling behind on their assignments and are barely prepared for class?
Miller's conclusion echoes one that I have been arguing for some time now. If you believe that America's future lies in its youth, and if you believe that hard work, and only hard work, can dig us out of the financial hole we have gotten ourselves into, then there is little cause for optimism.
Apparently, students the world over understand that the world is a highly competitive place and that consistent success will only come to those who work hard and who follow the simple precepts of the Protestant Work Ethic.
Only American students are thoroughly clueless about this concept, and its practice.
It's nice to have an idea of what makes for a good or a bad apology-- surely I have my opinion on the subject-- but it is better to have something resembling objective data.
Thus, hats off to Michael Maslansky for convening a focus group to judge the best and the worst apologies of 2009. Better yet, he allows us to see the apologies-- via YouTube-- along with the reactions of the focus group participants. Link here.
If you're still interested in some apology theory, here are my posts on the topic. Link here.
Normally, global warming believers and advocates belong to the political left. Skeptics and deniers are constantly demeaned as members of the extreme right wing of the planetary polis.
That is what makes Alexander Cockburn's article on anthropogenic global warming so interesting. Cockburn is a card-carrying leftist; no one could ever challenge his credentials as a radical of the most leftist tinge.
Yet, here he is, in The Nation, of all places, concluding from his reading of the recently-divulged emails from the Climate Research Unit of the University of East Anglia that man-made global warming is a "farce." Link here.
Those of us who believe in rigorous public debate and discussion, and who revile the notion that people must all adhere to party-line dogma on global warming and other public policy issues will find Cockburn's article refreshing, even cause for rejoicing.
It feels deliciously quaint, even retrograde, to sit down with a fountain pen to write a thank-you note on your personalized note cards.
In this age of email and texting and tweets, people are probably communicating more than ever. But their communications are, in the words of Geoffrey Parker, branding consultant and great-grandson of the founder of Parker pens, cheapened for it. Link here.
When you dash off a text message filled with abbreviations and misspellings, it does not say thank-you. It says that you are careless and do not hold the other person in very high esteem.
The discussion feels like a throwback to a more formal age, but Parker still makes an important point, one that can easily be applied to different situations. If the gift is special, the note should look and feel special. We make impressions with these notes, so why not present yourself at your best?
If everyone is texting their thank-yous and you are sending a formal note, you will stand out from the crowd and will be showing the other person that you hold him or her in very high regard.
If you are about to stop reading this post, because you find it mildly insulting for anyone to encourage you to adopt such habits, keep firmly in mind that you will be called upon, after your next job interview, to write a thank-you note.
You may be burning with desire to get that job, but if you send off a cursory text message in lieu of a thank-you note you will be telling your potential employer that you do not care very much for him or his job.
I am not saying that you should never send an email, but note paper-- not a note card-- is a better way to say thank-you to someone who has interviewed you. For some interviewers it will not matter, but, all things considered, why would you want to take an unnecessary risk?
That being the case, Geoffrey Parker offers some excellent advice about composing thank-you notes.
First, do not just dash it off. Do not just wing it. Before committing ink to paper, try writing a draft. Then, put it aside for a while, and read it with fresh eyes.
If you are like me you will be shocked and awed about how many mistakes you have made, and how easy it was to ignore them in the thrill of writing the first draft.
Second, don't just say thank-you. By my lights, and Parker seems to be on the same page, a thank-you note should contain three sentences. It is insufficient to thank someone for the sweater and just leave it at that. And it is certainly insufficient to thank an interviewer for his time and leave it at that.
Similarly, a thank-you note is not the occasion to enter into a multi-paragraph disquisition on the state of your friendship. If you are saying thank-you after a job interview, you would not offer an extended critique of the interview or the company, any more than you would present several paragraphs worth of the good points that you forgot to mention in the interview.
According to Parker, a thank-you note should establish something like a conversation. He suggests that you should begin by referring back to the last time you saw each other. And you might continue by asking about an associate or a family member. Then you can express your gratitude for the gift, or for the opportunity to get to know the interviewer, and look forward to enjoying the gift or seeing the interviewer again.
Third, if this is a thank-you for a gift, then, as Parker recommends, you should sign it with your first name. Surely, this also means that you should address the other person informally, either as Jim or Jane or Aunt Sally.
If you are thanking someone for a job interview you should address the other person formally, as in Mr. or Ms. Smith, and should sign with your full name.
The rest of us, as Lehrer writes, consider ourselves to be empiricists. We pretend that our decisions are guided by facts, not fictions. We believe that trial-and-error is the best approach to life decisions. We consider ourselves rational individuals, shielded from the temptations of emotion, conducting our lives according to the same principles that scientists use when testing their hypotheses.
Apparently this is not the case, either among scientists or among the rest of us.
In Lehrer's words: "The fact is, we carefully edit our reality, searching for data that confirm what we already believe. Although we pretend we're empiricists-- our views dictated by nothing but the facts-- we're actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn't that most experiments fail-- it's that most failures are ignored."
As I said, we all believe that we think rationally and evaluate our decisions in relation to the evidence of their success or failure. We apply these principles to the way we bring up children, choose lovers, pursue careers, or trade the markets.
Much of the time, however, we are fooling ourselves. We are just as likely, perhaps more likely, to double down on a bad decision than to change our minds.
If scientists are blind to anomalous data, so too are lovers. When humans fall in love they become especially oblivious to any data that contradict their plans, dreams, or feelings. Most especially they are blind to anything that would challenge their idealized judgment of their lover.
Strong feelings seem to trump all other considerations. It's as though we assume that once we have been overwhelmed by a strong passion, we are necessarily in touch with a higher truth, one that will always trump facts.
But we do not just fall in love with other people. We also fall in love with our own ideas, our own theories, and our own beliefs. Here again the moment of inspiration feels so good and so right that we allow it to trump any facts that seem to contradict it.
It gets worse. Once we have become convinced of a lover's consummate wonderfulness or an idea's ultimate truth, we invest in them. We invest time and energy in our lovers and ideas; we become publicly identified by our investments.
Once we have altered our life to bring a lover into it, or altered our life plan to make it conform to a new idea, we have a stake in being right. This makes us less willing or able to admit error, to accept the verdict of reality.
Often enough we become so invested in a bad relationship or a bad idea that we get angry at anyone or anything that would cause us to doubt it.
Among scientists falling in love with an idea seems to be less important than another influence: peer group pressure. As Lehrer explains it, scientists often ignore non-confirming data because they belong to a culture or a department or a fraternity that has invested in a specific theory. When you are part of a group you try to be in mental harmony with other members of the group. Not only do you want to think as they think, but you feel honor bound to uphold the group's values and beliefs.
If you discover data that threatens your theory, you are putting yourself at odds with a group in which you have status and prestige. How many of us, in those circumstances, would sacrifice status and prestige for data that may or may not be conclusive?
Scientists with academic stature come to have what Lehrer calls "a vested interest in the status quo."
I would add that they also have some respect for tradition. While the will to conform inhibits new scientific discovery, it also makes for a more orderly field of study. The truths of past science might be superseded by new work, but they still deserve some respect.
Lehrer offers some advice about how people can become more open to new data and new evidence. After all, it is possible that the theory is wrong, but it is also possible that the data is wrong. How can anyone know?
For academics in a department, the solution involves breaking out of the closed world of their department, and discussing the problem with people from other disciplines. If people in the chemistry department veer toward a form of academic groupthink, they can best solve their research problems by conversing with biologists and medical professionals.
Once the study group has been expanded to include people who are unfamiliar with departmental jargon, everyone will have to reformulate their thoughts and rephrase their issues so that outsiders can understand them.
Such reformulations, occurring in conversation, often clarify issues and advance research. As Lehrer points out, such discussions are far more effective than going it alone.
In some ways this same point applies to the practice of life coaching, especially as it concerns business coaching. How can a business coach help someone when the coach does not have a background in business?
Applying the same principle, we can say that when a businessman has to explain his business to an outsider, he must articulate details and assumptions that he had taken for granted or taken on faith, because they are the lingua franca of his business, but that sound far less self-evident once they are presented to an outsider.
Compare this with the practice of psychoanalysis. For those few who still imagine that psychoanalysis has something to do with science, consider this: in psychoanalytic treatment the analyst presents a theory or interpretation. The patient is then required to produce confirming material, and to convince the analyst that this material has demonstrated the truth of the analyst's interpretation.
The psychoanalyst's theory is not subject to trial-and-error. If the patient discovers material that seems to contradict the interpretation the patient is labelled resistant, neurotic, and in need of much more treatment.
Feminism, as we know it, is both an ideology and a political movement. As ideology it wants women to identify themselves as feminists before all else. Read through the posts on the excellent DoubleX website and you will find frequent discussions about whether an action or an idea is consistent with feminist ideology.
This endless self-criticism leaves this blogger with the impression that the most important thing for women intellectuals is knowing that they are good feminists.
As I have mentioned before, being a good feminist is not the same thing as being a good woman or a good wife or a good mother or even a good girlfriend. Link here.
Some feminists refuse even to spell out the word-- woman-- because it implies, in their minds, subservience and oppression.
So, I have been intrigued by a series of recent articles asking why so many women today feel a need to undergo artificial and cosmetic procedures to enhance their beauty. You would almost think that modern women had not read Naomi Wolf! Links here and here and here.
After four decades of feminist ascent, women are more obsessed than ever with their appearance. The only difference-- which the authors and I agree is disturbing-- is the increased use of surgical enhancements.
It is one thing for a woman to feel that she needs to pad her bra; quite another for her to feel that she must have breast implants. It is one thing for a woman to wear cosmetics; quite another for her to believe that she has to have her face altered, almost beyond recognition, in order to look younger.
For whatever reason, women's relationships with their own bodies have suffered markedly during this latest feminist epoch. Cosmetic surgery is only a part of it. Eating disorders are another form of self-sacrifice, even to the point of self-punishment.
From Botox to Brazilian bikini waxes to bulimia... the modern feminist is almost pathologically insecure about her body. Not only is she not proud of her curves or her form, she is constantly finding fault with every aspect of her physical appearance.
Evidently, the world has known worse. Chinese footbinding was a horror that lasted for close to a thousand years. And female genital mutilation is still practiced today in many parts of the world.
But no one is forcing women to undergo extreme makeovers. They have earned their way, and they are exercising their freedom in buying these procedures.
When an independent, self-defining, autonomous woman spends all of her spare money on Botox, laser hair-removal and vaginal rejuvenation... you might imagine that something is wrong somewhere.
Naturally, feminism has an answer. It blames the patriarchy. Not so much because the patriarchy is forcing these women to mutilate themselves, but because feminism always blames the patriarchy. It is, after all, an ideology, and ideologies are never wrong. They never accept that any form of real evidence could ever throw doubt on their dogmatic beliefs.
None of these thinkers seems to imagine that feminism itself might have contributed to women's loss of confidence in their bodies. Is it possible that feminism has so thoroughly undermined women's pride in being women that it has caused them to feel bad about their bodies?
Feminists do not see things this way. They blame the culture, whatever that is, because the culture is the agent through which the patriarchy manipulates women.
To the authors of the articles I linked, it seems that after four decades of feminist thought-- thought which, if it means anything, tells women to distrust any and every piece of advice coming from a man-- women are systematically allowing themselves to become prey to male-dominant cultural forces that are punishing them for having careers.
Why is it that feminism failed women so completely on that score?
Is the patriarchal backlash really that powerful?
Feminists consider that women are suffering because the culture is looking askance at their bodies. The culture is forcing them to feel insecure and unattractive because it is criticizing their choices of plastic surgeon, hairdo, and pantyhose.
But whether the criticism comes through fashion magazines or gossip columnists or the woman in the next cubicle, do you really believe that these social channels are all controlled by men?
For all I know there might be a relationship between the relentless criticism and self-criticism that these women are subjected to and the fact that the theoretical soulmate of feminism is something called critical theory.
If you spend your college years learning how to criticize-- the better to firm up your ideological commitment-- why would you not turn this tool on yourself? It would almost be inevitable that you would.
Anyway, sophisticated theorists will tell you that what feminists call the culture is really just another term for the female gaze, for women looking at and criticizing other women.
Feminism held that the male gaze was tyrannically oppressing women by making them into sex objects. So feminism liberated women from the male gaze... only to see them get abused by the female gaze!
If, as often happens, women are competing against other women for male attention, I am not so sure that the female gaze is entirely objective in its appraisal.
While some past abuses of women might well be attributed to the male gaze-- footbinding comes to mind-- others-- genital mutilation, for instance-- cover up a profound anxiety about paternity and a mistrust of women.
That is not what is happening today. Today's feminine aesthetic has little to do with male taste. Surely, men prefer voluptuous women to stick-thin ones. They prefer women with facial expressions to women whose faces no longer move. And precious few men are enamored of artificial bodies.
Of course, there is more to the problem than the difference between male and female gaze. There is a sociosexual component that needs mention.
Feminism likes to pride itself on facilitating the entry of women into the workforce. Clearly, this has been a good thing. Yet, feminism also exercised a considerable cultural influence; it told women how best to manage their lives, in order to have successful careers and to be good feminists.
Feminism told women to postpone marriage and family, the better to get a head start on career development. This is certainly a defensible idea.
Yet, it produced an unintended consequence. Women who started looking to settle down in their thirties found themselves competing against younger women. Some of these young women were not even feminists.
These thirtysomethings felt old and insecure about themselves. They worried about their fertility, about their biological clocks. To compete with the younger crowd, they believed that they needed surgically enhanced, even exaggerated, female features... the better to attract men.
In the past, women who married could feel that they had overcome the woman-vs.-woman competition for male attention. Given the emancipation of women and the feminist attacks on marriage, this was no longer the case.
Recall that feminism believed that marriage was an oppressive institution invented by men to turn women into chattel slaves.
In the feminist mythos the miserable, anorgasmic suburban housewife would free herself from her oppressive husband and children and run off to graduate school. There she would meet the dreamiest lover and discover levels of satisfaction that had previously escaped her. See, for example, Marilyn French's "The Women's Room."
To facilitate the transformation of this scenario into reality, feminism had to destigmatize divorce.
Well and good. Except for the unintended consequences. Once divorce was destigmatized more men decided to liberate themselves from their marriages. They decided to trade in their aging wives for younger versions. No stigma; no problem.
This meant that married women could not stop competing for men. They were living under a more-or-less constant threat that their husbands would leave them for someone younger. Thus, they felt pushed to undergo surgical procedures that would at least keep them looking and feeling young.
But that is not all. In the new feminist world young women who had chosen to postpone marriage now entered the workforce in large numbers.
There have always been women in the workplace. For good or for ill they were often relegated to secretarial work, and were not considered to be of quite the same social class as their male bosses and managers.
Now, however, the young women who were beginning their careers were of exactly the same class as their male colleagues, bosses, and managers. They had gone to the same schools; they came from the same neighborhoods. Worse yet, they were both young and, in many cases, available.
A married mother living in the suburbs, even if she also pursued a career, felt that she was competing against the newest and youngest of the female hires. These young women were in constant contact with her husband.
At the very least, the wives and mothers would have felt somewhat insecure about seeing their husbands go off every day and spend the major part of their waking hours surrounded by nubile and available young females.
No one intended that this combustible situation come to pass. But, as they say, beware of unintended consequences.
The second is by Richard Fernandez of the Belmont Club. Link here.
One is tempted to say that the era of smart diplomacy is over. World leaders have taken the measure of Barack Obama and have chosen not to take him very seriously. The premier of China snubbed him in Copenhagen.
When your banker treats you as a person of little consequence, it is time to worry.
Perhaps the world's people still love Barack Obama more than they loved his predecessor. But unless you live in a Sally-Field world, where your life goal is to have people like you, you should be distressed to see the embarrassing performance of the man who represents us all on the world stage.
University of Pennsylvania psychologist Martin Seligman first defined depression as learned helplessness. Taking his cue from experiments in which dogs subjected to inescapable shocks eventually stopped trying to escape, Seligman applied the insight to depression. People who feel depressed often feel that they will not be rewarded no matter what they do. Thus, like most of the dogs in the experiment, they tend to give up.
Take an individual who comes to a fork in the road. He believes that he has nothing to gain by turning right and nothing to gain by turning left. He stops in his tracks, embraces inactivity, and becomes depressed.
If he withdraws from the world he may start feeling like a spectator watching his life unfold.
Seligman's conclusion also implies that depression can be learned behavior. In his terms, it was learned helplessness. And this implies that if you want to make someone depressed, you need but convince him that life is a tragedy and that he cannot do anything to change the outcome.
Depressed individuals are especially prone to see life as a drama, as a story with a predetermined outcome. The Shakespearean character who declared that "all the world's a stage" was famously melancholic.
Psychotherapy deriving from Freud does not try to cure depression. Lacking optimism, it assumes that depression cannot be treated effectively with therapy. Its alternative goal is to make the depression make sense.
Some forms of Freudian therapy work like literary criticism: they pretend to reveal the true story of your life, the unconscious narrative that has been making you do what you do. Then they allow you to critique the narrative.
Other, more recent forms pretend that you can rewrite the narrative, but that assumes it is desirable, or even possible, to induce all of your friends and family to live within your newly constructed narrative.
To my mind the way out of this Freudian dead end is to see life as a game. A game differs from a story because a game's outcome is not predetermined. When you play a game you are not following a script.
If a game is going on, you do not want to abstract yourself away and discover its meaning. You are better served by improving your ability to play it. In this context coaching has a manifest advantage over therapy.
When you learn how to play the game of life better, you become a player, not an actor performing a role or a passive spectator. As a player you participate actively in your life, you have a hand in shaping your future, and you become responsible for the outcome.
If you are acting a role in a play, you are surely not responsible for what happens.
Gamesmanship involves learned optimism. As Dr. Helen Smith points out in an illuminating post (link here), when the first experiments into learned helplessness were performed on dogs, not all of the dogs gave up. Some refused to accept that all actions were futile, and found ways around the obstacles that the researchers had created.
According to Dr. Helen, government policies create a culture that can either induce learned helplessness, by depriving us of freedom, or encourage us to act as free individuals who can alter the outcomes of our lives.
For her the government insurance mandate is another effort to deprive us of our freedom. We can add, as Dr. Helen has in other posts, that confiscatory taxation policies tend to deprive people of the freedom to choose how to spend, save, and invest their money. The result is that people work less; in her felicitous phrase, they go John Galt!
As a coda: if you were wondering how anyone can overcome the sense of futility you feel when you come to a fork in the road and believe that there is no right move to make, remember the immortal words attributed to Yogi Berra: When you come to a fork in the road, pick it up.