Strait the Gate, Narrow the Way

Critical reflection upon life, faith, and goodness in our modern world.

On Robin Williams, Tragedy, and Thumper’s Mommy’s Rule

Since the actor and comedian Robin Williams died two days ago, there have been a multitude of tributes aired on television networks and posted online. Mostly they extol his quick wit, his devastatingly satirical humor, and his dramatic presence onscreen. As of this writing, his death has been attributed to suicide resulting from depression, so others have used this opportunity to focus on that mental disease. Also, given that his death occurred during a time of violent conflict in the Middle East and heightened tensions with Russia, not to mention anticipation of an ideologically charged election a few months hence, other, less complimentary commentators have blown off Mr. Williams’ suicide as insignificant compared to larger events, or characterized it as cowardly, selfish, and particularly reprehensible considering his immense wealth and prestige. This latter vein of commentary is disturbing.

I understand the motivation to pay tribute to a popular figure. Through his movies and other public appearances, Mr. Williams influenced a lot of people–chiefly by making them laugh. Many of his jokes and one-liners have entered our common lexicon. People admired him, I guess, because his comedy uplifted their spirits. We sympathized with his confusedly righteous entertainer in Good Morning, Vietnam, we laughed at his comically entertaining everyman in Mrs. Doubtfire, and we drew wisdom from his portrayal of a counselor in Good Will Hunting. It’s no surprise that we should be shocked by his death, at his own hands, and apparently because of the omnipresent sadness, hurt, and anger of depression. The very nature of the event–popular and widely reported–gives us the opportunity to reflect on the role laughter, sadness, and death play in our own perception of our lives. I confess that his comedy seemed a little wacky to me, so I am (unfortunately) not as affected by his death as others are. But why spit on those who do, in fact, grieve?

Demeaning his death, or the attention lavished on it, sends a clear message that any grief felt for it is worthless. That is manifestly not true. Grief is the product of tragedy; any event which shocks us and provokes us to contemplate our own mortality, even vicariously, is tragedy. Mr. Williams’ death is one of many which happen every day, and perhaps one of the least gruesome. Certainly he did not die due to indiscriminate rocket fire, or beheading for being something other than a Muslim. The fate of nation-states does not hang in the balance because of his suicide. But his death is no less tragic for its seeming lack of context. Christian doctrine, to which I subscribe, teaches that every person has inherent dignity because they are intimately created, loved, and valued by God, and therefore Mr. Williams’ death, even at his own hands, and even if he was rich and famous, is objectively a diminution of all of us–just as much as the death of a non-Christian in Iraq, or a Palestinian in Gaza, or a Ukrainian soldier. The loss of a life is certainly much worse than a disliked piece of legislation or an unfavorable election result. As to his depression, I’ll be the first to agree that there are more immediately threatening issues before us–but the relative importance, for whatever reason, of other issues does not diminish the cause of eradicating or mitigating depression (or any other mental illness). I personally grieve for Mr. Williams, more so because I have known his contributions to our culture and laughed with him. That makes the tragedy of his death more present to me than the deaths of others, and so it has a greater impact on me. There’s no question that Mr. Williams’ death is a tragedy, and he–along with those who loved him, which include his family and his fans–deserves our pity and compassion by virtue of the humanity he shares with us.

The negative reactions to this event raise the question of why we sometimes disbelieve people when they tell us about themselves. I don’t mean when people boast, or curry sympathy, or otherwise seek attention–I mean when they tell us their experiences. Many people who suffer from depression have written about it, and psychologists and psychiatrists alike have documented the pattern of symptoms and outcomes that defines this mental disease. Apparently Mr. Williams suffered from it. It is ludicrous to contradict that diagnosis on the barest speculation, as some have done by pointing out that he was a comic, or that he was wealthy, or that he was influential. Those things, nice as they are, have no more bearing on mental illness than they do on cancer or the common cold. I won’t conjecture whether there’s a connection between comedians and depression, but I do question why some angrily reject that such mental illness can occur in certain people. Can’t they imagine anyone being depressed if they’re rich?

Whatever the reality, second-guessing the experience of others is odious. To use a well-documented issue as an example, some question whether homosexuals really experience same-sex attraction as part of their nature. Why wouldn’t we believe someone who says that about him- or herself? Unless we have a similar frame of reference–i.e. we’ve experienced same-sex attraction ourselves–we literally cannot understand what that’s like, and cannot judge the truth or falsehood of it. Any glib, ideologically-aligned causes we propose for homosexuality are mere speculation. In rejecting that aspect of another person, we demean them and all who share that experience by denying them personal agency and self-knowledge. Similarly, if one does not suffer depression, then rejecting Mr. Williams’ mental illness, or that it could cause suicide, is demeaning to him and all who suffer the same disease. That’s especially true for the self-styled academics who comfortably theorize that suicide is a selfish act and (if they’re religious) a sin. While the experiences of those afflicted with depression attest to both a physical aspect (i.e. a defect in the brain, or in the operation of the brain) and a mental/spiritual element, scientists and theologians alike admit they are very far from understanding the human mind. Therefore commentary on whether Mr. Williams’ suicide was a poor choice or an inevitable result of the disease is only more speculation. On top of that, who among us could say he or she knew Mr. Williams’ conscience, which seems more the point? God alone knows that. And finally, anecdotal evidence about someone falsely claiming depression–or any other sort of identity–in order to get attention is absolutely not sufficient reason to withhold compassion. Any number of people who play the martyr by claiming depression, or who whine about the pressures of a life of fame, do not diminish the real thing. The only creditable sources about Mr. Williams’ depression are Mr. Williams himself and those who were close to him. It seems logical that we would trust them.

No doubt those who profess themselves offended by this suicide, or by all the attention spent on it, will respond to this post (if they read it) by asserting their right to believe and say whatever they want. I don’t contradict that right. For my part, I’m certainly aware that I’m a poor source of information: I have no first-hand knowledge of Mr. Williams, nor could I improve upon the tributes written about him by better writers than I. I only remind the participants in this discussion that Mr. Williams had humanity and therefore dignity, as do all those saddened by his death. For that alone he and they are worthy of consideration and compassion. So please remember the rule of Thumper’s Mommy in Disney’s Bambi: If you can’t say something nice, don’t say anything at all–and leave those who grieve Mr. Williams’ death and reflect on their own mortality in respectful peace.

Reflections on the proper age of marriage

In all the many relationship discussions I’ve had and/or observed, it seems that age is considered one of the biggest factors in the decision about, and advisability of, the relationship–especially if the relationship is marriage. Whenever people talk about someone else’s marriage, they make the ages of the two married people a central issue. Maybe the commentary is positive–they married at the right time. Maybe the commentary is negative–they married too young, or (increasingly) they were too set in their ways; the second of which is a way of saying they waited too long, or maybe that they got too old.

Regarding age as a critical ingredient in marriage success (or relationship compatibility) has always ‘stuck in my craw’ a bit. It feels like one of the many blithe assumptions that come easy to us when explaining our own superiority, like the conventional belief that relationships in the 1950s and 1960s were all loveless, patriarchal shells of a family with an absent and philandering father. All right, maybe I exaggerate a bit there. Certainly few believe that all 1950s relationships (or any historical relationships) were loveless. Yet I suspect that many of us feel just a little bit lucky that we don’t live in the bad old days of arranged marriages, commercial exchanges to accompany weddings, and 14-year-old brides. Despite all that, however, I just can’t believe that the majority of relationships were unhappy or stilted. People loved each other back then, too. I’ll be careful here: I’m not saying that the bad old marriage conventions should be revived; I’m proud to live in an age where spouses choose each other freely and where either can be the breadwinner or caretaker as their fancy (and economic realities) take them. Even if we’ve made improvements socially since then, it doesn’t follow that our forebears were unhappy. In fact, there are reasons to believe people might have been happier in those benighted old days of crusty tradition, sexual repression (or, depending on who you ask, aggression) and male dominance. They worked shorter hours on average and slept more than we do–and our longer hours and shorter sleep cause us increased stress and health problems.

In any case, I’m unconvinced that we are better off socially in the 2010s than we were in the past. That’s the nice thing about the past, if you have a point to prove: it is easily molded into a structure fitting your preferred narrative. It’s easy to make a sweeping assertion that families were stronger back then, or that women were more repressed back then; both are true. Some things have improved; others have degraded. Comparisons are dangerous, because they are usually in the service of prejudices, like our particular prejudice about marrying young.

Statistics tell us that rural and/or less educated people marry younger than their urban, educated brethren, and that the average age of marriage has risen since those terrible (but healthier!) olden days. And I often hear (maybe I sense?) a high degree of self-congratulation about that fact, which is funny, because up until about the last 10 years, marriages were becoming steadily less successful, as indicated by a rising divorce rate. So we’re doing better because we’re marrying later, but our marriages are less successful? I’m not following. Certainly, some have argued that the rising divorce rate was a good thing, believing most marriages were unhappy because they were essentially coerced. Yet however the marriage went, a divorce is the breaking of a strong relationship that carried a lot of hope and promise, and so it seems likely that many (or most) divorces were bitter and painful. Maybe the practice of marrying older isn’t the social victory we think.

But wait! It would be ridiculous to go back to marrying upon graduation from high school. That is beyond doubt. Who is ready for something like that right after high school? I was certainly not ‘ready’ for marriage by the time I completed high school. If I’m honest with myself, I think it’s better to say that I was not even ‘capable’ of marriage. I was shockingly self-absorbed and my thoughts were consumed with a) whether I had the right friends and/or girlfriend, b) how I could get the most out of college (and we’re not just talking academically here), and c) agonizing about who I was. You know, the important things. Should I listen to Guster? Can I get away with just a T-shirt and jeans, because I think it’s so much more chill? Is it ok if I enjoy my classes, or should I make myself enjoy partying more? There was barely enough room in my life for myself, let alone a life partner.

Ridiculous indeed. I doubt anyone would dispute that. But by historical standards I was pretty immature for my age. I was 17 years old, able to drive, and almost able to vote. Do you think I was ready to help select the most powerful person in the world in an election? I can scarcely believe they let me vote, considering my mental state. But I was not alone. Nearly everyone I know was at a similar maturity level upon high school graduation. We had been kids for a long time, whose only real responsibilities were…nothing. Homework? Please. Most of us found ways out of it. Summer jobs? They don’t count, really–we were usually only making money to finance our weekend plans. For our entire lives thus far, in classes and sports teams and music and school plays, we had been totally isolated in a world made for kids. And we weren’t done yet: we still had college to attend. Most of us were big kids, intellectually adults but emotionally (and socially) very young.

By comparison, children who grow up in rural areas, or who grew up in a culture which emphasized community and family, such as the socially backwards past, were probably much more mature than we were at the same age. It’s more likely they were vital contributors to their families, either by helping out the breadwinner with his/her business, or caring for siblings, or doing serious chores (like home maintenance or farm work). They lived in smaller communities, and had more relationships with adults (friends’ parents, aunts and uncles, grandparents, neighbors, etc.). Those 18-year-olds occupied a much less age-stratified society, in which they had to become adults socially by their mid-teens. By the age of high school graduation they were active parts of a community, and certainly deserved the right to vote. More importantly, they could also be good partners in the community of a marriage.

There are great structural advantages to marrying at that tender age. Neurological studies have shown that one’s brain continues to develop until the mid-20s. More importantly, the cognitive functions of the brain usually finish development by about 16-18 (adulthood!), while the moral values and judgment functions develop afterward, finishing between the ages of 24 and 26. In fact, the reason teenagers believe they are invincible has been linked to the fact that their brains have not fully developed the capacity for judgment, which makes it harder for them to comprehend the risks they take. And while some, if not most, will argue that it’s irresponsible to marry when you haven’t even finished developing your personality, I will turn the argument on its head and suggest that the best foundation for a lasting relationship is to develop similar values together by shaping each other’s moral growth.

Biologically, people between the ages of 18 and 25 are at their most fertile. Males produce the most testosterone, and therefore the most sperm, at that age; females at the same age produce the most estrogen, and have the easiest time conceiving–which seems like a cruel joke, considering that we view that period of our lives as the most undesirable for marriage and starting a family. That age is for exploration, we say; it’s for discovering yourself! Partying! Traveling! There’s no doubt about it–all of those things are easy and fun when we’re in our early twenties. Do you remember how we thought nothing of going on little to no sleep, had no idea what a hangover was, and couldn’t understand the need to diet? We were beautiful, invincible, unstoppable; the world was our oyster. But as a parent in my 30s I will note wistfully that those physical advantages would be very helpful when dealing with children. When I’m chasing my toddler around, or when I have to get up to comfort the baby, I yearn for the energy I had in my 20s.

But of course it’s not a good idea to marry young these days. A college diploma (or at least a tech school certification) is more or less required to find work, and I can’t even imagine what college would be like as a newly-married person (and not just the social aspect; think about beginning a marriage with that kind of debt). But more practical–for the marriage part, anyway–is the fact that no high school graduate I’ve ever known is emotionally capable of marriage. The schooling process, along with popular media, has kept them from any sort of social or real responsibility and instilled in them the fervent, insidious belief that hedonism and wanton self-discovery are the essential components of a happy youth. Have fun! Enjoy college! Date many people! The expected result, of course, is that by a fun process of elimination these 20-somethings will find the perfect job and partner, and settle down happily and much later.

These are generalizations, of course. And I am not trying to write a fist-shaking “kids these days” rant. It’s long been fashionable to blame society for these developments, as if society were some kind of entity with intentions toward us. Unfortunately, however, society does not make us (or our kids) do things. It has no intentions or opinions. It just is. And it is made up of us. It is merely the institutions formed out of our cultural perspectives. We think it’s important for kids to be kids, so we have created institutions which keep our kids in school until they are 18, and take up their free time with sports and music and drama extracurriculars. We as a culture value self-discovery and self-actualization, so we institutionalize these things in the emphasis on college, the explosion of self-help books, and our worship of adventures and extreme sports. We value sexual actualization, too, so institutionally we accept ever more prominent sexuality and eroticism in television, music, and advertisements. The effect of these cultural perspectives is not the fault of our institutions (schools, media, etc.) any more than it’s the fault of a piece of wood that it was made into a chair. We share cultural perspectives; society results.

But frankly we needed our 20s. From my own experience, marriage requires contribution and unselfishness. I’m pretty sure the majority of my peers (and I) did not possess those virtues sufficiently in our 20s to have successful marriages. We still had to learn to support ourselves in the ‘real world,’ to be part of a work team, to rely on others. Until then, we had parents and teachers and college staff to back us up. We also had to learn by trial and error how to take care of another person, because the school pipeline insulated us somewhat from observing other successful marriages by keeping us in our own age groups. It’s certainly plausible that our parents and grandparents, or kids growing up in rural areas, learned all these things during their childhoods, in more integrated social groups. But not today. Today, we have our 20s for that.

The fact remains that to be successful in a relationship, we must develop a certain maturity. So those who argue the doctrine of waiting a bit, in order to mature, are wise. But maturity is not tied to a certain age. One may be less mature at 30 than some are at 18 (watch The Bachelor and see what I mean). And though everyone knows that maturity is only one piece of a great marriage–I’m not sure anyone has adequately explained the romantic longing, or fierce desire, or deep contentment with and for one other person that characterizes the love which leads to and sustains marriage without invoking Grace–I am concerned here with practicalities. Practically, marriages require partnership and respect. Maybe it would be nice to learn those things fully in our first 18 years, for then we could happily and successfully marry, deal with exhausting young children at the peak of our physical capabilities, and skip off to travel the world in our forties (which, in this day and age, practically constitute our healthiest decade of life!). Not an unpleasant prospect.

But our culture makes this near-impossible. So the point of all this rambling is: carry on. We all need a little growing before we marry (successfully). But from someone who has taken that step into marriage, I’ll tell you that it is much better than my early 20s. I’m glad I made it.

Aurora, Santa Barbara, and Waseca as an invitation to reflect

Last night my wife’s friend joined a news show panel on a big TV network, so of course we tuned in to “cheer her on” through the screen. The subject was John LaDue, the upper-middle-class, never-been-bullied, no-reason-to-ever-go-wrong almost-perpetrator of yet another violent, tragic school shooting.

He, of course, is only the latest in a line of demographically similar young men who have, for reasons yet under debate, become violent. The Aurora shootings shocked us because the location and event seemed vaguely symbolic: a movie theater, at the premiere of a much-anticipated movie claiming to delve into the darkness of the human soul. The Santa Barbara killings angered us because the killer wrote elaborate fantasies about being violent, especially toward the women who unfairly denied him sex and the men who received it in his stead. John LaDue’s planned violence stands out because the police stopped it–and because his matter-of-fact assertion that he felt mentally ill, that he wanted to kill his peers and hold out until taken down by SWAT, is a chilling glimpse into psychopathy.

The talking heads of the panel were all very unsympathetic towards young Mr. LaDue. They talked about how he was “simply evil,” “beyond rehabilitation” and the like, while the host sagely agreed. They may be right, of course, though I hesitate on principle to presume what someone might do, out of respect for certain legal protections on which the United States are founded. By and large, though, I agree with them: Mr. LaDue ought to be charged with all the crimes associated with planning such a terrible deed (conspiracy to commit murder comes to mind).

It was interesting that they referred to previous, similar crimes–which actually took place–almost as aggravating circumstances. As if the fact that similar spree killings occurred in the recent past somehow made his planned attack worse. It might just have been a trick of phrase; I’m fairly sure the commentators simply wanted to draw attention tangentially to this mystery of young men, from what we collectively consider to be “good” homes, who slowly and without concealment develop a rage and desire to kill, and then execute that desire despite a host of teachers, counselors, and peers who warn against them. I think it’s wonderful that the police caught Mr. LaDue, and if that was the result of a greater awareness of such crimes, then bravo to the talking heads. But the whole exercise in condemnation seemed to be dodging the main issue.

I suppose it’s natural to vent frustration on Mr. LaDue. He did, after all, plan to murder as many of his classmates as he could and (he hoped) some cops sent after him as well. And as a large portion of spree killers end up dead by their own hand, it’s satisfying to finally have someone to punish–especially if he is a better receptacle of our anger than James Eagan Holmes, the Aurora theater shooter, who presented convincingly as a complete psychopath, and who showed only amusement and no remorse during the court proceedings against him.

Yet I wonder how much of the anger directed at people like Mr. LaDue and Mr. Holmes serves to assuage our own consciences. I wonder how much of the condemnation and indignation, however superficially righteous, serves to draw a distinction between us and them; to say, in essence, “the spree killer is evil and I am not, therefore get him away from me into jail and then death.” Perhaps shock and anger sometimes mask the relief people feel that they know what is “bad” when they see these spree killers, and it is not them. Perhaps too much of the talk about such men–easy laments about the decline of our society, titillated surprise that the scions of upper-middle-class stability could turn violent, satisfying outrage at expressions of psychopathy and misogyny–is disassociation.

This bears some discussion. After all, the young men in question grew up among us. They received the same stimuli from media and from our pervasive culture as we have, and they had all the material things they needed. Clutching our pearls and wondering in bemusement how such criminals and terrible crimes could occur is the easy way out, a safe way to avoid hard questions about our own behavior–or at least our participation in a social behavior–which may have (at least) set the stage for a spree killing. Worse is to use these events to forward a philosophical or socio-political agenda, like the opposing crusades of the NRA (which seems to want to arm all teachers) and those who advocate total gun control. It’s ludicrous to think that arming teachers or taking away all guns would somehow solve the problem. The problem isn’t the weapons or lack thereof, it’s that young men decide to spree kill and then do it. They can do it with sticks, steak knives, home-made explosives, or bows and arrows. The problem is that they do it, and it’s our problem because in important ways the perpetrators are similar to us.

At this point I’m sure many readers have rejected this train of thought. They angrily proclaim that bad people exist, and that bad people will always exist, and that there’s absolutely no similarity between the sickos who spree kill in schools and the rest of us law-abiding Americans. They may angrily point out that only young men have ever committed spree killings, and so it’s not a problem for women in our society. They may passionately argue that if nobody had access to guns, nobody would be able to kill so randomly. Or they may simply bristle at the suggestion that they are anything like the monsters who kill, and decide they don’t really want to discuss it any further. But if so, these readers are taking the easy way out. They are disassociating. They are saying that the problem of spree killing is not their problem, because spree killers are wholly alien. They would rather be right, ultimately, than make the sacrifice of compassion to see if there is any way such killers could be reduced.

That nearly every recent spree killer has come from the same demographic makes a mockery of coincidence. Nearly every spree killer has come from, and targeted, the influential middle class. Nearly every spree killer has evinced rage, most notably the Santa Barbara killer, who (horrifyingly) seemed to actually believe that the mere fact of others having sexual relationships was a violation of his rights. And nearly every spree killer seems to want attention–they choose schools and movie theaters and prominent universities as their tableau, knowing that they will earn headlines and time on “The Situation Room” and endless panels of talking heads like the one I saw last night.

That, actually, may hold the key to the problem. Attention. Why do spree killers want attention? Attributing it to their generation, as many do, is doubtful–otherwise more entitled millennials (in full disclosure, I’m a millennial too) would turn to violence. No, I would guess that spree killers want attention for the same reason that normal people develop a need for attention: some kind of fundamental, developmental neglect.

Now before people break out the mocking tears and sneer about mommies and daddies not loving their children enough, consider: first, numerous studies have shown that young girls without a close relationship to their parents are statistically more likely to engage in promiscuity, drug use, and other risky behaviors; and second, studies of gang membership/affiliation (male and female) cite lack of dedicated parents as a prime causal factor. This isn’t about whining on a daytime talk show; it has been studied and documented that neglected children have a higher propensity towards clinically anti-social behavior. And I have unfortunately met too many middle-class or wealthy parents who are more interested in the next vacation destination, or the new episodes of Mad Men, or in their own jobs, than in their children. Though it looks like stay-at-home parenting is on the rise, the teenagers and young adults of today are perhaps the generation most commonly dumped into daycare so that parents could have satisfying careers and social lives.

What it comes down to for the males in all of this, the young men, is a sort of generalized neglect. Wait, hear me out. I know that across the board, women make less than men for similar work. I know that there exists an insidious “motherhood” penalty in the workplace. I think that as the gap between the wealthy and the rest of us has grown, life on the wealthy side of that gap has preserved and protected the old male-dominated social architecture. But back here, in real life, important changes are taking place: compared to men, women collectively get better grades in school, participate in more extracurricular activities (including sports), attend college at higher rates, and in many cases are more readily hired. These are all very good things, and hopefully a harbinger of true equality in the workplace.

Investigative journalism indicates, however, that laudable attempts to push women to higher social achievements have unintentionally marginalized men. “Socially acceptable” extracurriculars in high school have shrunk to a few high-profile sports as schools try to spend equally on women’s teams. Universities faced with a majority of female students have invested money in programs of study and student life infrastructure which cater specifically to women. Companies hoping to achieve a certain diversity actively pursue female employees. And I wonder if maybe developmental authority figures like teachers have become mostly female, and less interested (understandably) in focusing on traditionally male interests like war. None of this is to blame the system, but rather to suggest that the intersection of parental neglect and social neglect may be a place frighteningly devoid of normal social obstacles to psychopathy, narcissism, and spree killing.

Obviously not all neglected children turn to violence. And women almost never turn to violence, perhaps because they usually have less aggression due to lower testosterone (though there are exceptions, of course). But I think it no accident that most spree killers commit their deed(s) after puberty, and they all seem to be seeking attention and revenge. Attention, maybe because they never got it; revenge, likely against those who refused to pay attention to them (or suitable surrogates). And I also think it telling that spree killers are usually characterized as loners, and notably lack the comfort and restraint of a social group–a family or a team–to draw them towards good social relationships. Maybe they aren’t necessarily born loners, but are made loners by their development. I wonder if the anger and hatred that many women sense, in catcalls (check out #NotJustHello on Twitter) and sexual dominance (#YesAllWomen), isn’t rooted in this cauldron of socially marginalized young men. And I wonder whether a parent, a mentor, a teacher, a friend who cared about [insert name of spree killer] might not have made the difference.

I don’t advocate sympathy for any spree killer. It is for the good of society that they be charged and punished to the full extent of the law. I also don’t advocate some kind of large-scale enterprise or campaign to remedy social wrongs. I suspect that by the time spree killers start exhibiting the signs (posting YouTube rants, rage-filled blogs, and so on) it’s too late for intervention and time for police involvement. But I invite us all not to simply wring our hands, spit out righteous rhetoric, and go about our daily business, comfortably believing these events have nothing to do with us. I invite us to take the hard road and try to see the killers with compassion, and hopefully to see a way that we can, in the future, make a difference.

Some thoughts on the words “Faith” and “Religion”

I recently saw an article that claimed Islam wasn’t a religion. There have been high-profile debates between religious leaders and scientists about which perspective contains more truth. There have even been debates within faith communities, as between Christian sects who acknowledge gay marriage, and those who don’t. It seems that somewhere in the diatribes we’ve collectively lost an understanding of what it means to have “faith” or how to define a “religion.”

Religion indeed seems a difficult thing to define. Christians, by and large, regard it as a free exercise of will to believe. No matter where you come from, if you believe what’s written in the Gospels regarding Jesus Christ, you are a Christian. Certain more conservative groupings, however, treat Christianity as a sort of ‘social contract,’ binding those within the group to act and value certain things. My extremely limited experience with Judaism indicates that certain conservative Jews have exclusionary beliefs about their religion–namely, that it accrues only to the children of Jewish mothers. Less conservative Jewish sects appear to regard Judaism as more of an ethnic identity than a belief system, happily accepting agnosticism or downright atheism among their peers as long as the overarching identity remains.

If my understanding of Judaism is “extremely limited,” then my understanding of Islam is not even worth mentioning. The so-called “fundamentalists” (a charged word, in that it implies that the fundamental tenets of a religion are bad, instead of perhaps a tangential tenet of the religion) treat Islam as a socio-political system, in which laws protecting the status quo are given legitimacy by (it is believed) divine approbation. The status quo in many Islamic countries in the Middle East is, at least regarding the dignity and attendant rights of women and children, oppressive and even barbaric in light of our liberal ideals. Opposition to that system strikes me as more akin to opposing Communism or Fascism insofar as it’s a political system. Islam in that sense is very different than Christianity and Judaism, and rightly condemned.

In the sense of religion, on the other hand, the issue is murky precisely because we use the word “religion” to describe different things. There are Muslims who practice Islam as a free exercise of the will to believe in Allah and the teachings of the Koran. I’ve never read the Koran, so I don’t know if it is filled to the brim with hateful writings, loving writings, or (as is the case with the Jewish and Christian scriptures) a mixture of both. There are other Muslims who probably practice Islam as a social contract, a way of distinguishing their group from others. But using religion to describe the entire practice of Islam, Judaism, or Christianity confuses things, and probably lets unlawful behavior proceed under the First Amendment while simultaneously restricting legitimate religious practice.

By and large, the test for “freedom of religion” ought to be simple. If a behavior is lawful in a non-religious context, then it should be permitted as a religious practice. If I may display statues on my lawn, then I may display a Nativity scene at Christmas. If I may wear as much clothing as I’d like, as long as I’m not indecent, then I may wear a hijab or burqa. As a side note, some Middle Eastern Christian sects (including some which subordinate themselves to either the Pope or the Patriarch) and Jewish sects direct that female adherents wear hijabs. If assaulting someone is illegal, then I should not be able to stone or otherwise injure a person for engaging in lawful sexual behavior. It’s more difficult when trying to decide whether a person should be forced into religious participation, even tangentially. But that sort of question is why we have legislatures and courts.

The word “faith” seems misused as well. The dictionary defines faith as, “1) confidence or trust in a person or thing; 2) belief that is not based on truth; 3) belief in God or in the doctrines or teachings of religion; 4) belief in anything, as a code of ethics, standards, or merit.” I think the first definition hits closest to the mark on the intent of the word. A religious person, you might say, has confidence and trust in the tenets of his/her religion. The thing is, that attitude seems to apply to a lot of non-religious people too.

There are many voices trying to put faith and/or religion in the same category as ignorance and barbarism. That saddens me because I happen to be religious, of course, but it also strikes me as disingenuous and dishonest. As a Catholic I believe that Jesus Christ was the Son of God, and that He emptied Himself to become like us and share in our struggles on this earth, and that as He was killed He offered Himself as reparation for all our sins (past, present, and future), and that His offer was worthy because of His own perfection, and so I believe that if I follow Him I will be free of this earth and with Him in paradise. In analyzing that long narrative sentence it is immediately obvious that I could offer no empirical evidence of this. Even if I had a time machine and could record video of Jesus becoming incarnate in the womb of the Virgin Mary, then record all of His miracles, then record His crucifixion and leave the camera in the tomb recording the moment of His resurrection, there is still no way to see and record the thoughts of God, nor attach the camera to Jesus during His ascension into heaven and remotely view the video. My senses are unable to even gather that ‘behind the scenes’ evidence, even if I could prove by two chemical tests on controlled samples of water (for example) that it turned into wine. Therefore I must either have confidence that the narrative is true, or not.

This is not all that different, say, from belief in the Theory of Evolution. Nobody has a time machine that would enable them to bring back irrefutable evidence of evolution, perhaps by filming the birth and maturation of the first Cro-Magnon person with two Neanderthal parents (complete with genetic testing to compare to the remains of both species already cataloged). All we do have is snapshots of evidence, which we believe to be of a certain age, based on the belief that we can tell the age by extrapolating chemical deterioration, which only a few of us have ever observed with our own eyes under a microscope (and I’m not sure it’s even possible to observe radioactive decay directly). There is a narrative suggested by these snapshots of evidence–the oldest remains being more ape-like, the newer ones more human-like–but it is the invention of scientists and authors. Therefore I must either have confidence that the narrative is true, or not.

We’ve so far ignored the question of the chicken or the egg. Certain scientists, for example, claim that emotion is merely the work of certain hormones in a human brain. Feelings of arousal are due to the release of sex hormones, which (it is theorized) are triggered when the subject is presented with a set of conditions, like, say, a procreatively attractive human of the gender the subject finds attractive. Feelings of affection are due to the hormone oxytocin, which is triggered in certain situations as a hardwired social response, which our genes have developed to increase our rate of survival by causing us to work together. But that is a hypothesis. It is plausible, too. But it is also unprovable. It’s equally plausible (and possible) that such hormone activity is the result of emotions–the mechanism or vehicle by which feelings manifest themselves physically (as arousal or tears). None of us can go inside our brains to determine the exact causal order: whether the emotion comes first, or whether the hormones are released first. Therefore I must have confidence that either one narrative is true, or the other.

The scientist Neil deGrasse Tyson famously noted, “the good thing about science is that it’s true whether you believe in it or not.” With respect, I beg to differ. There were a great many scientists who believed in Eugenics between 1880 and 1945 (including Margaret Sanger), along with luminaries like H.G. Wells, Theodore Roosevelt, and George Bernard Shaw. Eugenic research was funded by the Carnegie Foundation and the Rockefeller Foundation. By “believe in Eugenics,” I mean its proponents believed that there was a genetic cause which disposed certain people toward poverty, retardation, sexual deviance (i.e. homosexuality), and antisocial behavior. Science was not true in that case, and we shouldn’t be so quick to conveniently compartmentalize that into “the funny old days when we had silly theories” and “the evil things Nazis did, from which we saved the world.” Science is only as true as the ethics and character of the people who do it, much like religion. One commonality between the two ‘sides’ is that authority figures in both realms–scientists and priests–are only human, and subject to the same propensity to self-deceive and enjoy attention as the worst Hollywood celebrities or politicians.

Ultimately, faith comes down to what inspires confidence. My experience has taught me confidence both in the religious salvation narrative and in the scientific narrative of the world. As another author pointed out, there is not much difference between the big bang theory and the Christian explanation that God said, “Let there be light.” In both cases, our fantastically complicated universe exploded into something without warning or apparent material cause. What does it matter whether one believes it happened randomly or at the will of an entity too big to imagine?

Understanding and meaningful engagement with others demands a certain rigor of thought. Proponents of rational explanations fall into hypocrisy when they succumb to the “blind faith” that others who disagree with their perspective are somehow less important because they are “religious,” and proponents of religious-faith-based explanations fall into hypocrisy when they fail to acknowledge the faith that rationalists have in science-based narratives. It might advance both sides of this odd little culture struggle if we all recognized our own “religious” and “faith” tendencies, including those of us with no affinity for, or an outright opposition to, an established religion.

Memorial Day Remembrance, 2014

I wrote this speech to deliver to the Village of Kohler, Wisconsin, as part of their 2014 Memorial Day parade and ceremony.

Memorial Day is dear to Americans because it isn’t about us. Simply put, if we are here to celebrate it, then it isn’t about us — because we are alive to remember. It honors the achievement and sacrifice of our countrymen and women whose service required their very lives.

To me as a Marine, the stories of my forebears who gave their lives in service are legendary. Nearly any Marine can tell you the story of Lieutenant Bobo. Quoting from his Medal of Honor citation: “When an exploding enemy mortar round severed Second Lieutenant Bobo’s right leg below the knee, he refused to be evacuated and insisted upon being placed in a firing position to cover the movement of the command group to a better location. With a web belt around his leg serving as a tourniquet and with his leg jammed into the dirt to curtail the bleeding, he remained in this position and delivered devastating fire into the ranks of the enemy attempting to overrun the Marines.” That occurred in Viet Nam in 1967.

A more recent example is Corporal Dunham. His Medal of Honor citation relates, “…[A]n insurgent leaped out and attacked Corporal Dunham. Corporal Dunham wrestled the insurgent to the ground and in the ensuing struggle saw the insurgent release a grenade. Corporal Dunham immediately alerted his fellow Marines to the threat. Aware of the imminent danger and without hesitation, Corporal Dunham covered the grenade with his helmet and body, bearing the brunt of the explosion and shielding his Marines from the blast.” This occurred in Iraq in 2004.

These young Marines, and their sacrifice, live on in the institutional memory of the service. I first encountered Lieutenant Bobo’s name in 2003, when I underwent Officer Candidate School in Quantico, Virginia. It was the name of our Chow Hall, a place of great importance to us candidates, and our Drill Instructors never wasted an opportunity to tell us the story of the hall’s namesake (usually as part of a larger diatribe regarding our worthlessness and general incapacity to become Marines. Ah, the sweet nurturing environment of Basic Training!). Enlisted Marines also learn about Lieutenant Bobo in their Boot Camp. I know that in time, buildings and roads on bases throughout the Marine Corps will bear the name of Corporal Dunham, and newer generations of Marines will learn about — and be inspired by — his heroic deeds as well.

These two stories from different wars show us that the decision to give what President Lincoln called “the last full measure of devotion” at Gettysburg (arguably the first Memorial Day celebrated by this nation) is not made in the moment of stress. Lieutenant Bobo would not have had the fortitude to resist evacuation and direct the fight after losing his leg unless he had already decided, in some deep unconscious center of his soul, that he would give his all for his country. Corporal Dunham could not have jumped on that grenade “without hesitation” and within the five-second fuse of such weapons, had he not already chosen — in the months and years of training and operations prior to that moment — that the success and integrity of his mission and his team were more important than his own life.

This day is set aside to celebrate our nation’s fallen, but not only their final heroic deed of service. It celebrates also their lives, for each of them had the character and courage to dedicate themselves wholly to the rest of us long before we collectively asked them to sacrifice themselves. They represent the best of these United States, the ones who have made our existence and prosperity possible: the Minutemen who faced British cannon and muskets in 1775; the 2nd, 6th, and 7th Wisconsin Volunteer Regiments who as part of the famed Iron Brigade defended the high ground west of Gettysburg on the first day of that battle, enabling the rest of the Union Army to emplace and finally score a victory which led to the preservation of our nation whole; the Soldiers and Marines who faced the unprecedented peril of amphibious landings at Normandy and throughout the Pacific; the heroes of Viet Nam and recent conflicts in the Middle East.

Today I remember the Marines I knew personally who died in service. Some, like Lieutenant Blue, died in battle. He was an outstanding officer, who routinely aced physical and tactical tests at The Basic School where we were classmates. He was also known as a “good dude” (in our lingo), which meant he was the kind of guy who would give up weekends to help his fellow students master testable skills, like marksmanship and compass navigation. He already had what the rest of us recent college graduates were struggling to develop: outstanding character. In training, he had all the talent and drive to graduate as the number one student, but chose instead to use his gifts to help his fellow students (and even so he graduated in the top 10% of our class). Our success was more important to him than his own. If anyone understood the importance of character and service at the tender age of 25, when he was killed by a roadside bomb in Iraq (2007), it was Lieutenant Blue. Word of his death spread quickly among his classmates, even to those like me who had limited interaction with him during our short time in school together. I believe he was the first of our class to die in the conflict, and he proved the old adage that “the good die young.”

I also remember Marines who died in training. A fellow fighter jock of mine, Reid Nannen, died this year [2014] when his F/A-18 Hornet crashed into the mountains of Nevada, where he was training at the Naval Fighter Weapons School (otherwise known as “Top Gun”). His callsign, or nickname, was “Eyore” because he was always comically pessimistic, but that pessimism belied his solemn, unwavering dedication to the craft of aerial combat and aviation ground support, which had earned him the rare and coveted spot at Top Gun in the first place. He was also known for his dedication to his family, and was survived by his pregnant wife and three children. Though he died in training rather than combat, it’s easy to forget that our service members assume serious risk, beyond what most non-military folks ever encounter, just in preparing for war. And it’s important to note that his family served our country in a way as well, suffering his absence when the country needed him to get ready for war as well as execute it, as he did in Afghanistan, and suffering his loss in the deepest way. Memorial Day is for them, too.

We celebrate the men and women who have died for us because we recognize that the highest and best use of freedom is in the service of others. Some wars we fought to carve out and preserve a spot of freedom on the earth to call home, these United States, and some wars we fought to bring freedom to others. But the men and women who died in our wars swore their lives to protect that freedom, firstly for us, but also for others less fortunate. I ask you all, as I would ask any of our countrymen, to enjoy this day as Americans — enjoy our freedom, our happiness, and our prosperity at the dawn of summer. Enjoy barbecues, enjoy some pick-up basketball games, and enjoy this time with your families. Enjoying our blessings is how I believe fallen service members want us to remember them.

But while enjoying this Memorial Day holiday, I will also honor the fallen with a quiet personal toast of my beer. I invite all of you to do the same.

Faith, Reason, and Debating the Existential “Big Questions”

I’m past college, and with those years has passed the incidence of earnest debate about things like religion and the meaning of life. That I attended a Catholic university and majored in a “Great Books” program meant that I fielded my share of challenges from those who believed something different than I did, and one of the most pressing questions that came up at that time was why.

Why do you believe?

There is something fantastic and mythological, certainly, about the story of a God coming to earth in order to offer Himself up as a perfect, spotless sacrifice in order to atone for every human sin, past and future, and reconcile the human race to Himself as God. The particulars of the story are indeed quaint and uncomfortably sentimental: a sweet young woman chosen to miraculously conceive God’s child; archetypal authority figures hatching dastardly plots and darkly scheming to stop this bright young hero; a set of bumbling accomplices; an impossibly evil death; and the most mythical and unbelievable thing of all: that he was killed and then came back to life.

To my friends, well-educated and mostly liberal humanists, the tale of Christ bears too many similarities to the quaint myths of many other cultures, and is only the biggest myth in a child-like narrative of the world with a stylized creation story and a lot of horrible barbarities. Compared to the sophisticated promise of modern disciplines like sociology, psychology, and specialized sciences, a primitive culture’s myth seems plainly archaic. How could anyone believe this, much less someone college-educated?

The challenge in answering this question is that it is ideological rather than academic. Those who ask it have a certain perspective which I don’t fully understand, but which seems to preclude the idea of the supernatural. Some profess to be humanists, who believe that continued enlightenment in the sciences will eventually conquer our social and personal afflictions. Others profess to be rationalists, believing only in those things that science has proved or theorized.

Such alternative belief systems are not, in and of themselves, ideological. They fall more truly into the existential category, defining who we are and why we exist. But they seem to come with a lot of ideological baggage these days. After all, elements of our society today are unabashed and even aggressive apologists for faith (professing the Christian doctrine of sola scriptura), and many of them speak in terms of condemnation, specifically condemning those who disagree with them to hell. They often stand for uncomfortably traditional values as well, like maintaining traditional gender and socio-economic roles. Now all of a sudden we aren’t talking about a different moral and existential perspective, we’re talking about an ideological opponent. And, to be fair, there are fundamentalist Christians who are offensive and judgmental in proselytizing their beliefs.

But to turn the tables, many so-called rationalists and/or humanists can be just as aggressive, and I am skeptical that their explanations of the world are actually more ‘rational’ than a faith-based one. It’s easy to talk about gravity or astronomical relations and say that we can “prove” real science empirically, but I doubt that many of us have empirically viewed the behavior of a virus, or the release of certain brain hormones causing affection or depression. We accept that viruses and brain hormones work a certain way because we have studied the effects of those things and measured them in actual humans, so we know they exist and that they affect, somehow, our health or mental state. We also believe people called “scientists” when those people tell us about viruses and brain hormones (and the behavior of chemical elements, and many other things), because we have faith that their education and certification make them intrinsically trustworthy on certain issues.

Whether you trust a scientist or a theologian (or a priest) is really the question, unless, of course, you can trust both. An Op-Ed in the Washington Post recently pointed out very thoroughly that the two sides are not mutually exclusive. I have little to add to the writer’s argument because I agree with him — I believe in the story of the Christ and yet also pursue understanding of scientific matters, because I want to know more about us and this world we inhabit. He ends with a marvelous paragraph worth quoting in full:

The problem comes when materialism, claiming the authority of science, denies the possibility of all other types of knowledge — reducing human beings to a bag of chemicals and all their hopes and loves to the firing of neurons. Or when religion exceeds its bounds and declares the Earth to be 6,000 years old. In both cases, the besetting sin is the same: the arrogant exclusive claim to know reality.

The answer to the question of why I believe the entirety of the Christian story, with its quaint mythological narratives about paradisiacal gardens and apples of knowledge of good and evil and floods and prophets and whales and the Son of God, is that I find it more plausible than any of the alternatives. It really makes more sense to me. Not necessarily in the physical particulars (“do you really believe that some prophet actually parted water to create a passage?”), but in the tale it tells of how humanity became prone to doing bad things and how God then came Himself to redeem humanity from its sinful nature.

The Christian tale is plausible to me mostly because of my own experiences of sin and redemption. The vast majority of these experiences are with my own sins and redemptions in my life so far, and a few of them are observations of other peoples’ sins and redemptions. On a precious few occasions I recall witnessing a miracle, or experiencing a beatific presence I attribute to the Christian God. These things are open to interpretation in an academic sense, of course. Rationalists might argue that my experiences of good and bad in myself and others are filtered through a strongly inculcated Catholic belief system. They might doubt that I, in fact, saw or experienced so-called “supernatural” things, and point to the demonstrated human tendency to manufacture memories that suit our subconscious perspectives. And as far as that goes, they may be right. I can’t transmit my experiences to others, so I can’t expect anyone else to believe my conclusions. And yet I can no more forget them than an astronaut could forget his view of a round earth from space, or an astronomer could forget the sightings and calculations showing that the earth and nearer bodies revolve around the sun in elliptical trajectories.

My point here is not to convince anyone of my beliefs. I don’t think that’s possible: neither a rationalist nor a faith-based belief system can be truly transmitted via dialectic. Any belief system has to be experienced to be believed, personally and deeply experienced. And for a human, that means engaging both the intellect and whatever part of the brain controls belief.

Someone who believes that human emotions like love and depression are a combination of neural and chemical activity in the brain has probably actively engaged the subject: he or she likely wondered why people experience love and other emotions, and pursued the answer until finding an explanation. That is the activity of the intellect. He or she also had to exclude other explanations for emotions (presuming any were found), such as the activity of a metaphysical soul, or instinctual behavior bred in by evolution; that exclusion is primarily a decision of faith. Does he or she trust neurologists who measure neuron activity and brain chemicals? Priests, philosophers, and/or wise men and women, who have reached a supernatural explanation through long experience in considering and observing human behavior? What about sociologists and/or biologists who study behavioral patterns and instinct?

Personally, I don’t believe that a scientist is intrinsically a better person than a priest or a philosopher. All three are human, which means they are subject to the same ideological myopia and vices, as well as the same inspiration and virtue, as the rest of us. No single person knows everything, and experience teaches that even if a person did, he or she would forget part of it, or hide part of it, or even use it to his/her advantage. Positing that it’s possible to know everything, and use that knowledge correctly, is coming dangerously close to positing God. Whether we follow to that conclusion, or stop short — and who/what we decide to trust and therefore believe — well, that’s just our obligation as rational beings. We each must individually decide what to believe.

It’s natural that each of us would seek like-minded friends in the world, so it’s easy to see how we gravitate toward those who believe the same things. So begins ideology, the pursuit of actualizing an ideal, which carried to the extreme forgets that ideas are not more important than people. Or so I argue as a Christian: individuals have the highest intrinsic value, and while ideas may be valuable, they are not worth more than life itself.

I plead that we not let this social instinct push us into prejudice. I and many people I know believe in the teachings of Christianity and yet also follow the progress of scientific knowledge; many of these people are scientists or doctors themselves. And likewise, I know people who hold that religious faith (Christian or otherwise) is irrational, and who yet do not reduce the human experience to the peculiar behavior of a peculiar animal, enslaved to instinct and evolutionary imperative.

So let’s not discuss these existential issues of faith, science, reason, and belief with a desire to win, especially to win by painting other belief systems in pejorative colors. Rather let’s do it to better understand ourselves and each other.

Restoring the Meritocracy, or addressing concerns about the US Officer Corps

Recently Mr. William Lind published his latest article, and as usual it was provocative. Titled “An Officer Corps that can’t score,” it argues that the United States military has lost the competitive edge in combat for the following reasons:

  • An ego problem, the apparent perception of US Officers that they oversee the best military that’s ever existed;
  • A personnel problem, that officers are punished for creative thinking and innovation (and the mistakes that invariably accompany such a mindset);
  • A staffing problem, which shortens command tours of duty so everybody on the bench gets a chance to play, if only for a short period of time; and worst of all,
  • A moral problem, in which officers support and perpetuate the status quo to protect their careers–notably a problem the US Military did not have after the Vietnam conflict (according to Mr. Lind).

Certainly these are serious accusations. Mr. Lind’s article sparked a great deal of response, too. Several active duty officers penned articles which asserted indignantly that there *is* a great deal of debate in the military regarding staffing, weapons acquisition, force structure, and other ‘big picture’ issues. What is conspicuously absent from the responses, however, is a critique of the personnel situation–which, as the lynchpin of Mr. Lind’s argument, probably deserves the most thoughtful consideration.

Mr. Lind’s own history plays a big part in his critique as well. I’ve never met the man, but if you’ll indulge me in a little amateur psychology, I would say that Mr. Lind very much has a dog in this fight. He was foremost among what he calls the most recent wave of “reformist innovators,” and he highly praises his contemporaries Col Boyd (USAF) and Col Wyly (USMC), with whom he generated much of the intellectual foundation of so-called Maneuver Warfare. He also helped introduce and develop the theory of Fourth-Generation Warfare, an extension of Col Boyd’s definitive and much-lauded omnibus theory of combat, “Patterns of Conflict.” Anyone who is a bit startled (and/or stung) by the opening line of his article, “The most curious thing about our four defeats in Fourth Generation War—Lebanon, Somalia, Iraq, and Afghanistan—is the utter silence in the American officer corps,” ought to at least realize that Mr. Lind is aggressively applying the theories of warfare he developed and championed to a very broad-brush statement about our apparently constant defeats.

The predictable–and justified–knee-jerk reaction by junior officers in the US Military is that Mr. Lind is wrong, and that there is anything BUT silence about the struggles and outcomes of these so-called “Fourth Generation Wars.” Indeed, in my own experience there is a lot of debate about technology (drones, bombs, tanks, and their efficacy) and tactics regarding the most recent conflicts in the Middle East. That is all very good. But I think Mr. Lind hits the nail on the head when he criticizes the military–particularly the officer–personnel system. And while there is a lot of debate about that issue as well, it’s usually conducted in hushed voices and away from field grade and higher officers.

Complaints about personnel issues usually center on field grade officers focused on achieving the next rank (and running their subordinates into the ground to get it), or general officers trying to maintain their reputation with their civilian masters through an ever-increasing administrative burden of annual training and paperwork accountability. To the uninformed, it just sounds like bitching, but hearing enough of it reveals that both types of anecdotes coalesce around one central issue: today’s officer cadre has neither the time nor the resources to focus on warfighting.

How has this come to pass? At the risk of theorizing ahead of data, I have some suggestions:

  • First, during the Iraq and Afghanistan conflicts we created a whole sub-combatant-command for each location, complete with Joint Force Commanders, Functional Component Commanders, Service Component Commanders, and associated staffs. This effectively doubled the requirement for staff officers in each of the four major service components. In addition to being top-heavy, it prevented the whole coalition from having any true cohesion as a unit, because new units were revolving in and out under a joint commander who, in addition to directing the whole campaign, also had to administer the vastly increased relief-in-place and transportation requirements of such an ad hoc system. Imagine if Patton had new armored and mechanized units rotating in and out of the 3rd Army throughout 1944 and 1945. Would he have been able to build such a successful and dynamic fighting force?
  • Second, as a corollary to the first, there are career requirements for officers appointed to joint commands. The demand for those officers has forced the services to cut career billeting corners to get enough qualified officers to meet the demand. That is a recipe for “check-the-box” leadership and careerism from start to finish.
  • Third, most services made a decision to shorten deployment times in order to ease the burden on servicemembers’ families. This was a social decision, and it may not have been a bad one. However it did create a ‘revolving door’ in nearly every unit in the military, as whole combat units turned over from year to year and had to be assigned places in the supporting establishment, which in turn was bloated beyond needs and suffered the same ‘revolving door’ effect. The Army alone experimented with year-long deployments in the hopes that more time in country would allow greater innovation and success in the counterinsurgency fight; I’d be curious to see if there were any positive results.
  • Finally, Congress has micromanaged the benefits of servicemembers to the point of restricting officers from shaping their force. I doubt anyone in the military, including me, would complain about pay increases, money earmarked for better base gyms and housing (including ‘in country’), or a reduction in sexual assault and/or suicide. The problem is the way Congress has enacted these changes. Forcing them down the military’s throat creates a culture of ‘yes-men’ who must “support and defend” the Constitution by bowing to each new decree of a prime Constitutional institution, Congress, no matter what that does to already scarce military resources. Sergeant Major Barrett’s comments, while tactless and insensitive, demonstrate the frustration of many military leaders that servicemembers need meaningful combat training, expensive as it is, more than they need administrative sexual assault training and fast-food joints on base.

The prevailing sentiment among junior officers is that the military is directionless, or more specifically that it suffers the pull of too many ‘missions’ at once. There’s Congress, forcing social changes and shutting down the government. There’s the so-called “War on Terror,” which carries real danger but no real reward–neither Congress nor the Services themselves seem to care much about it anymore. There’s the Administration, preaching a “pivot to the Pacific” and a drawdown, which ominously promises more tasks for the military to accomplish with fewer people. And there’s the innate sense of honor in the services themselves, which expects the officer cadre to keep all these masters happy and still field fighting units.

In this context, I will speak heresy to the die-hards and state that there’s small wonder junior officers in particular keep their heads down and try not to screw up (i.e. bring all their servicemembers back alive with comparatively little regard for ‘the big picture’). It also explains why so many veterans of the recent conflicts look back nostalgically on the simpler world of their combat tours, when they had a single direct mission and a feeling of accomplishment.

So what sort of reform would make Mr. Lind happy? I’m not sure, as he simply bemoans US Officers’ lack of creativity and moral fiber, but I have some suggestions on that score as well. First, though, I’ll point out that some of the best ideas have come from far more credible sources than me. Go there, and explore.

My ideas are pretty simple. There is a romantic conception floating around that the military is a meritocracy–in other words, that the officers who are best at their jobs should be the ones who get promoted. The shortened command tours, vast administrative requirements, and glut of officers in the services effectively obscure the good officers among the mediocre, lowering morale and motivation. I believe that the best leaders in today’s military truly seek a chance to lead and to show their mettle, so I propose the military make a few structural changes to recover a merit-based promotion system.

  • Lengthen command tours, including the tours that are required for command screening, to 3 (or 4) years. This would first of all require existing commanders to put a lot of thought into the junior officers they promote, knowing that the officers they evaluate highly will eventually control a combat unit for three years (instead of 18 months), and would allow existing junior officers a lot more time to develop and lead their troops under the guidance of one Commanding Officer. 
  • Longer tours help mitigate the ‘zero-defect mentality,’ a colloquialism which refers to the reality that one mistake in an officer’s career is enough to prevent him/her from making it to the next step, because he/she will always be compared to other officers with no such mistakes. It’s a lazy way to evaluate, because the positive effects of the officer with the mistake may be greater than those of his/her peers, and may indicate greater potential. But at least with a full 3 years of observed time, officers will be able to recover from mistakes–and their seniors will be forced to consider which of their subordinates are best suited for further opportunities, knowing that maybe only one will have the opportunity.
  • Longer command tours also permit greater unit stability, which will increase esprit de corps, has been shown to reduce things like suicide and sexual assault, and will certainly increase combat effectiveness.
  • Increasing tour length will be essentially meaningless if officer staffing remains high, because right now it seems like every officer gets the chance to move on regardless of his/her performance against peers. As part of the draw-down, the military as a whole should reduce officer staffing to the minimum level required for service administration, starting with Generals and working down the rank structure (and this reduction should occur before any enlisted personnel cuts, in accordance with good leadership practices). The military should also eliminate the additional joint force staffs located in Iraq and Afghanistan. This will be an unpopular step, as many generals will be forced into retirement, many more field grade officers will be forced into early retirement, and many junior grade officers will not have the opportunity to continue in the military past their first tour. It would help ensure, however, that only the best officers in each rank will remain–reinforcing the idea of the military as a meritocracy.

Actual, active duty officers have much more specific lists of things which need to change, most of which revolve around their ability to train their servicemembers. And we should listen to them. But we can’t force current officers to change their way of thinking–most of them have been shaped by the questionable leadership environment that Mr. Lind notes for the entirety of their careers. We can, however, collectively change the game–we can stop pretending that ‘everybody gets a chance’ and start giving our officers the space and responsibility to fully lead their men and women. That’s why most of them sought a commission in the first place.

These kinds of changes will force leaders at all levels to focus on quality, not qualifications; they will force officers to make tough evaluation decisions after years of watching their subordinates develop. Ultimately, only the top 20-30% will continue their careers after each tour, which will ensure that only the most effective officers run our military.

When our nation’s security and American lives are at stake, isn’t that what we want?

Reflections on Veteran’s Day 2013

This article was published first by Military Spouse Magazine. Please check out their site!

The full text is below:

The 11th of November is recognized around the world as “Armistice Day,” and was first celebrated in 1918 at the cessation of the First World War. Since that day, the combatant nations have developed their own traditions about the day, the most common being a 2-minute silence observed at 11:00 AM (the eleventh hour) with the first minute dedicated to the 20 million people who died in the fighting and the second minute dedicated to those they left behind, specifically their families and friends (who were recognized also as victims of the war).

In the United States, Armistice Day was renamed Veteran’s Day. Its purpose was changed, too, because the United States already had a day of remembrance for those who died in combat. Instituted after our deadliest war, the Civil War, Memorial Day falls on the final Monday in May and is dedicated to all Americans who died in battle. Our Veteran’s Day, however, is meant to recognize not specifically those who died for our nation, but all those who stepped up to take that risk.

The importance of this holiday lies in the nature of our own democracy. Whereas colonial powers in the 18th century chiefly fought with professional armies and mercenaries, the nascent United States chose to ask its civilians to bear the hardships and risks of military service. The founding fathers reasoned that citizens, who were aware of their value to the state and invested in its continuance, would both best defend the country and prevent tyrants backed by professional armies from threatening their freedom. And so the idea of a citizen-soldier came into being.

We all contribute to our national defense mostly by paying taxes that finance our military. During the Second World War, we collected scrap metal, scrap rubber, and planted victory gardens. We may post social media statuses in support of our military, or advocate better care for those suffering the physical and emotional wounds of conflict, or put a supportive sticker on our car. And those are great and appreciated acts, especially considering the many voices that vilely condemn and degrade our service members.

But what separates Veterans from the rest of Americans is their oath to support and defend the Constitution—and, by extension, both the people it represents and the institutions it created—even unto their own death. The Veterans willingly chose to give up some of their inalienable rights for the sake of military discipline, to give up the comfort and safety of family, friends, and society, to practice and execute wildly dangerous tasks necessary for the defense of our nation. They risk their lives, not just in all the conflicts we’ve fought since the ceasefire in Compiègne, France in 1918, but in their daily existence: they train in all weather, risking heat stroke and hypothermia; they service and operate engines, pushing ground and air vehicles to the very edge of design capability; they practice using firearms and explosives. They also forego the luxury of leisurely self-discovery in their service of a higher cause, as well as suffer deployments which take them away from their loved ones during holidays, birthdays, anniversaries, births, deaths, and all the other little life events that are markers for memories in a relationship.

For most Veterans, service is largely enjoyable. It bestows confidence, meaningful achievement, strong friendships, and unforgettable experiences. But many Veterans also bear scars from their service. They remember comrades who died, or terrible hate in the faces of their enemies, or the price of a second’s neglect, perhaps on the trigger of a gun or in the cockpit of an airplane. That is often the price of military service, though it mostly gets little press or attention, and most Veterans bear such anguish stoically because they know they “signed up for it” and are unwilling to demean their sacrifice by making it the burden of another.

And finally, let us not forget that the privation and suffering of Veterans are shared by their families and friends, who are often left alone and bereft during deployments or training, and who do not have the military support structure of discipline and camaraderie. Service members’ families also receive far less emotional support from our society than military men and women. As they share the burden, so also should they share recognition on this day.

On November 11th, we remember that what Veterans—and those who love them—have done, what they have risked, is special to our country. It continually validates our democracy and our society, recognizing that our nation’s will is truly of the people and by the people. So for those people who take the risk imposed by their oath to defend this country, and who bear the burdens of military service, we (whether we are Veterans or not) offer our thanks and appreciation.

Thank you for your service.

On Proselytizing

A recurring discussion in our democratic world is the role of religion in society. The issue is divisive, and it gets applied to all manner of tangential questions–immigration and abortion are the first that come to mind.

Historically, Christianity was the dominant religion. By the numbers, it still is, though at one point there was a sizeable Jewish minority and there are growing Islamic and atheist communities. The percentages have changed with immigration and secularization, which is the process by which a growing number of Americans who were raised in a religious tradition leave it behind as adults.

Theoretically, none of this matters. The First Amendment ought to make the United States a nation in which all religions can be practiced freely, as well as a nation which does not endorse one religion over another (or, perhaps, any religion at all). But historical oppression of Jews, Catholics, Evangelicals, and Muslims by a majority has illustrated the fragility of the First Amendment in the face of the mob. I say a “mob” because the United States is not meant to be governed by ‘majority rule’; it is a country which purports to protect minorities–whether racial, religious, intellectual, or based on sex or sexual orientation–from the tyranny of the majority.

But of course it matters. Research in psychiatry and psychology has noted that humans, as social creatures, respond most positively when they are a part of something larger than themselves–little wonder, then, that religion is so dominant a perspective in our lives. I use the word “religion,” but I mean any articulated belief system (and yes, a rationalist perspective advancing the supremacy of science is an example of an articulated belief system, and may be described in this manner as a religion). After all, we can understand and have complete faith in the efficacy of gravity (it has literally never been disproven, as far as humanity knows) without knowing the why; postulating that the why is irrelevant–or, for that matter, speculating on the why at all–is literally an act or statement of faith. In any case, what matters to a discussion on these belief systems is the uncomfortable fact that each of us tends to have beliefs that we regard as truth, such as: that Christ died for our sins; that there is one God and Jews are His chosen people; that good people are rewarded in heaven, and bad ones condemned to hell; that God is the ‘opiate of the masses’ and humanity is steadily progressing toward a socialist communal lifestyle of rational equality; or even that there is no God at all, and He was invented as a panacea for the terrible greatness and apparent unpredictability of our world, and that we have developed so far as to understand that, and may eventually understand all things. Such beliefs are examples, of course. They are probably facile and archetypal. I don’t pretend to speak for any person, though I suspect that some kind of core belief lies at the bottom of every human’s value system.

From the standpoint of meaning, there is no connection between recognizing the importance of belief in human cognition and self-regard, and the truth or fiction of the beliefs themselves. I am a Catholic, and I believe Jesus Christ was a real person who was also really God, who suffered a horrible death made worse by my sins, and who in doing so redeemed humanity. That belief has informed my entire perspective on the world, mostly unconsciously–I bet it’s ingrained upon my soul in ways I will never, ever comprehend (even after a lifetime of reflection). It is almost certainly the primary source of my values and therefore of my interaction with the world. So, from a psychological standpoint I actually need that belief, because without it there is no foundation for my perspective. I need that belief the way a Jew, a rational humanist, a principled atheist, or really anyone else needs their own core beliefs. Yet the need for such a belief does not–cannot–imply that such beliefs are either true or false. Just because something is necessary does not mean it is manufactured. Truth or falsehood is another matter entirely, and one for the theologians and philosophers.

Truth and falsehood are also very touchy subjects. I know many an atheist who would bristle at the very suggestion that his or her rational world-view is in fact a sort of faith (religion) with its own doctrine and structure. By making that faith-based critique, I am quite literally attacking the foundation of his/her self, the place of his/her most deeply held beliefs. It’s important to remember that. When encountering a news story about a legal decision favoring a religious group, or perhaps an email or social media anecdote about one person getting the better of another, or even lamenting or lauding a decision regarding Christmas Nativity scenes or the phrase “under God” in the Pledge of Allegiance, most people will react strongly because the perceived ‘victory’ or ‘defeat’ is either very self-affirming or very threatening.

On that note, I’ll remind my fellow Christians that “proselytize” is a negatively connoted word, implying that the proselytizer is representing or practicing their beliefs in a way that intrudes upon the victim. Comments like “God is watching!” and “God would be so happy!” on a story, picture, or other piece of internet media are proselytizing comments. They actually condemn (though indirectly) those who don’t share the commenter’s particular perspective. In the New Testament, Jesus pretty explicitly forbids condemnation, as does the Apostle Paul. I think the Christian community (and the world) would be much better served with comments like “I disagree with…” or “I think this is wonderful…,” mostly because such comments establish a perspective based on values, and therefore acknowledge the dignity of other people’s values, without intruding upon them.

Traditional Catholics may point out that one of the seven Spiritual Works of Mercy is to ‘admonish the sinner.’ I agree, though I would add that admonishing the sinner does not excuse one from respecting their essential dignity as a free, rational human being, and I would beg to recall Jesus’ own comment about the mote in one’s neighbor’s eye with respect to the beam in one’s own. Besides, disagreeing with someone out of conviction does not amount to condemnation of their world-view. And for those who find value in the act of asserting themselves (the “I won’t apologize if someone else is offended!” set), I would reiterate Jesus’ call to Charity and turning the other cheek, and remind them that causing offense, even if unwarranted, is a sure way to cause further division.

Finally, I’ll say that no-one should have to tiptoe around their beliefs. I certainly don’t. But if the First Amendment’s promise is to come true in our society regarding religion, then we all need to practice our own belief system–rational or faith-based (or both)–with courtesy and respect for others. A good first step for this is to stop proselytizing and to engage others’ values instead of their beliefs, and by extension their essential personhood.

On Pope Francis and the teachings of the Catholic Church

Pope Francis has really shaken up the Catholic Church this time. He affirmed that people of good will, even if they are atheists, are close to Christ. He condemned the, er, widespread condemnation of homosexuals. He has called for the Church to be more open and welcoming, which is seen by many as a hint that he will relax the mandates of the Catholic Magisterium. His “stance,” a ridiculously vague term which implies his worldview, agenda, and perspective, has prompted much spilling of ink on these controversial subjects from news media and Catholic commentators. And yet he has affirmed Church teaching as well. What could be going on?

I think the commentators may be missing the point. Pope Francis wouldn’t be the first pope to institute sweeping changes, but until that happens I’m going to assume that he is doing what his predecessors have done, which is teach the faith. And so far, his comments affirm what the Catholic Church has always taught, though perhaps not with so much emphasis: that humans have free will gifted to them by God, with which others should not interfere; that they are called to follow their conscience and be people of good will; and that they must treat others as they would treat themselves.

These teachings mirror, as far as I can tell from my own scriptural study, what Jesus himself taught. Remember the Good Samaritan, the woman charged with adultery, the tax collector at the temple, the centurion, and the Prodigal Son? In each of these stories, Jesus chooses forgiveness over condemnation. He also identifies unlikely protagonists, namely Samaritans, Roman soldiers, tax collectors, and outright, confessed sinners. Most homilies and sermons I’ve heard on these scriptural passages emphasize that the humble, the lost, and those seeking goodness are the ones close to God–and that the spiritual authorities (the Pharisees) are outside God’s favor.

Notably, the Gospels have little good to say about the Pharisees. They constantly try to trick Jesus into blaspheming against the law, they grumble about him associating with enemies of the Jews, and Jesus himself condemns them pretty stridently, calling them “hypocrites.” The reason for this, I think, is that they are scrupulous, a word which used to mean “overly concerned with rules.” Put simply, a scrupulous person is so dedicated to following the letter of laws and dictates that he or she fails to accomplish the good for which those laws and dictates were instituted in the first place.

Pope Francis seems to have evoked this element of Gospel teaching in his recent statements, speeches, and interviews…and it is not surprising that he has caused a furor in doing so. The Gospel’s challenge in this regard is a very personal one, and it strikes at the core of each person’s unconscious, but deeply held, beliefs and convictions. Left to each of our own devices, I’m sure we would each live a good life according to our own perception and experience. But such is not our world: we are immersed in society, and so we contact nearly infinite other perspectives and experiences. What are we to do when another perception, or someone else’s experience, challenges our own?

This question is not a Catholic one; it is a universal one. Who among the people of the earth today has never been shamed out of an opinion by someone else’s story, never had a rival in love or cause for jealousy, or never violated his or her own values? These essential human conflicts we can resolve in one of three ways: first, we can ignore the conflict by becoming a hermit–either separate from the world, or existing within it yet unwilling to challenge our fellows; second, we can identify one rigid set of values to give our interactions structure, and never change them no matter what additional experience we receive; third, we can interact dynamically with our world, seeking understanding of others, though we are aware that we might hurt them. The third option requires love, humility, compassion, generosity, and forgiveness, and it is the Catholic answer, for the Gospel teaches it.

Through its distinguished scholarship on sin, objective evil, and the elements of a good life, the Catholic Church has (probably inadvertently) created the makings of a rigid set of values for those who choose the ease of such a moral compass. Yet other members of the Church (and our society) seem to withdraw from the difficulty of true compassion and generosity, preferring a simpler course of benevolently accepting all experiences–and, along the way, granting themselves license to ignore the idea of “true values” in favor of no values.

I don’t mean to generalize here. I realize that few, if any, people live their lives entirely on one side of the spectrum or the other. In truth, I suspect that each of us has beliefs about which we are scrupulous, and others which we choose not to engage. But Pope Francis seems to be guiding Catholics toward the third way, reminding them that they have a responsibility to dynamically engage the world, seeking to love and care for all people–even (or perhaps especially) those who seem most offensive and pharisaical to them. Whatever sins we abhor in our neighbors, remember that Jesus calls us foremost to love them anyway.

And should we be tempted to pass judgment when dynamically engaged, I submit that we remember, whether we are progressive or traditional Catholics, that we know God’s perspective only imperfectly through the Bible and Church teaching, and that our own perspective is necessarily limited to the tiny fraction of humanity we happen to know. In the end we surely must admit that we cannot know the heart of God, as the Bible reminds us through Isaiah and through Christ himself.

So let us not commit the sin of the Pharisees and use our religion to condemn and degrade others, whether by accusing them of a lack of love or by passing judgment on them for failing to sufficiently respect the rules. Let us not pridefully arrogate God’s province of salvation and condemnation to our imperfect human understanding. Let us remember Jesus’ warning about the child and the millstone, and remember that when we fail to love and show compassion for someone who appears to us as sinful, and instead treat them with disdain, we may very well be the agent of their stumbling in their relationship with God.

I believe that is the message of Pope Francis. It can be boiled down to this: love your neighbor, whether he or she is scrupulous or progressive, tends toward the Latin Mass or the vernacular, is gay, has had an abortion, or is divorced. Loving your neighbor doesn’t mean condoning what they do, but there’s a catch here. If one perceives a sin in someone else, couching a correction (or condemnation) in loving words is not the same thing as loving them. In fact, it is usually the opposite. And anyway, we are pretty specifically told to focus on our own sins, which–if we presume to pretend that we’re morally better than others–may trend toward the deadly.

As far as how to love our neighbor, I suppose our best teacher is Jesus, who wants us to feed our neighbors, clothe them, care for them, and visit them in prison, in metaphorical ways as well as material ones. Pope Francis, bless him, provides us an excellent example of this, and the crowing or panicked pundits of his papacy only indicate how much we need both his reminder to love and his leadership.
