Wednesday, October 20, 2010

Live Webcast today! Ethics lecture/discussion/chat


Lecture/Discussion/Webcast/Chat:


"When is a Pile a Heap? How to deal with moral vagueness."
  Richard Gilmore
IPPL Regional Fellow
Professor of Philosophy, Concordia College

October 20, 6 p.m. central

Watch online and chat with the speaker and audience at:
http://www.philosophyinpubliclife.org/Live.html


 Or, if you are in Grand Forks, join us at the University of North Dakota Bookstore, 775 Hamline St,
Grand Forks, ND. Attendance and parking are free!



The discussion is for general audiences,
and no previous knowledge of philosophy is required.

---

Abstract:
"Although some situations are morally clear (the right or wrong thing to do is obvious), most are ethically ambiguous. How can we act properly when the right thing to do is so vague? In this discussion, Richard Gilmore, a Professor of Philosophy at Concordia College, will discuss the search for moral clarity by focusing on a classic philosophical problem: the paradox of the heap. How many grains of sand make a pile? Twenty? One hundred? When does the pile become a heap? After a thousand? Two thousand? Gilmore hopes to show that, in many ways, ethics is like this; there are no moments of absolute precision. But, he will argue, through trying to define the heap, we can also clarify what it means to accurately define the right thing to do."
---
Please contact: ippl@und.edu for any questions
---

RSVP on Facebook here

Monday, October 18, 2010

How does one know when gender negatively affects day-to-day interactions?

(Image is from the great online comic strip xkcd)

So, I walk into the student union to get a sandwich before my night class and the place is slow and empty. There are two grad-student-aged women waiting by the deli counter, where I plan to order, and there are workers mingling in various corners of the food court, but no one is attending to the women or the deli. I stand behind them for a minute and say (in a friendly tone of voice), “you have the look of people who are being ignored.” They had been shifting around watching the employees go by, pleading with their eyes to be noticed. The woman nearest me agreed that they had been ignored, so, a few moments later, when a worker rolled a cart past us from about twenty feet away, I leaned towards him and called out: “Excuse me. They’ve been here a while, and no one has been here to help them. Can you please ask someone to come to the counter so we can all get some food?” The guy looked at me for a moment, then said sure and disappeared behind the scenes.

The woman I had already spoken with expressed appreciation, and I remarked that it helped to be assertive (and I might have made a joke about being taller, but I’m not sure). She responded by saying that of course I would get more attention; I was male. I, in turn, suggested that it was probably my professorial demeanor that caught his eye. Then she and I had a nice conversation while a worker came and took our orders. We all went our separate ways.

That’s what happened verbally. Here’s what I thought in my head when she made the remark about being male: “it has nothing to do with being a man. It has to do with being assertive. If you had called out for someone instead of standing there waiting quietly to be served and digging holes in the back of people’s heads with your angry eyes, maybe you would have gotten served too.” But I wouldn’t have said that, and frankly, as you can imagine, I don’t know that it’s true.

Being assertive is a male characteristic. I tend to think of it as a “New York” attribute, since that’s where I’m from and since most of my daily comparisons are between passive North Dakota and my aggressive city of birth. But the local culture doesn’t change the fact that men are socialized to be more forceful in just these kinds of situations, and that even if the worker didn’t notice me because I was male, he might have ignored them because they were female. In the end, gender, if it did play a role, might have influenced the surface interaction or it might have gone much deeper. None of this even considers tangential questions such as whether, were I not a professor on my own campus, I would have refrained from talking with them in the first place for fear that I might appear to be hitting on them. It’s complicated.

In fact, it’s so complicated that I do not know if I am a reliable interpreter of when gender is a factor and when it isn’t. My wife and I rarely argue, but some of our tensest conversations are ones that, at root, revolve around gender difference. And then there’s this: a new blog about the experience of women in professional philosophy. I have always been super conscious about making sure that women are represented equally in conversations in my classes. My Introduction to Ethics class is basically a feminism class, I incorporate women authors when I can in my other courses, and I am insistent that Why? Radio and IPPL events don’t sell women philosophers short. I knew that there was some subtle but pervasive sexism in philosophy, but I had no idea – no idea – of the extent that women in professional philosophy still had to deal with brute misogyny, sexual harassment, exclusion, abuse, insult, and delegitimization. And if this stuff can go on around me so obviously and I am just unconscious of it, then what does this mean about my own abilities to detect when others are being sexist, and when I am being sexist?

It is reasonable to assume that if I have difficulties determining when gender is a factor in social relations (broadly understood), then other men do as well. And, frankly, it is reasonable to assume that women have analogous difficulties. Just because the woman in the story above thinks my male-ness was the dominating factor in our attention-getting, doesn’t mean that it was. People sometimes see discrimination when there is none. Of course, just because a woman sees an act as sexism-free doesn’t mean it is egalitarian either. As J.S. Mill tells us so forcefully in On Liberty, none of us are infallible. But this goes both ways. We may be wrong when we identify prejudice, but we may be just as wrong when we don’t see any.

There are long-standing conversations about the value of giving marginalized individuals and groups the benefit of the doubt when they point out discriminatory actions. My favorite comes from Feminist Standpoint Theory; it argues, basically, that those discriminated against are able to see things, by virtue of their perspective, that give them a kind of “expertise” in social science research and inquiries into justice. But I don’t need the theory to understand this. As the faculty advisor to the UND Jewish Student Organization, and the most vocal advocate for the Jewish students on campus, I am well aware of how blind non-Jews can be to the anti-Semitic things that they do daily. But this doesn’t change the fact that my ability to see discrimination against women has clearly been compromised (or that I may see anti-Semitism when there is none). So, what other knowledge do I think I have that I really don’t?

This is, obviously, an instance of the more general epistemological problems that come from being a human in the world with limited information. It again recalls Mill’s assertion of human fallibility, but it is also the theoretical foundation for why intersubjective research is so important to modern inquiry. Thus, with the need for social inquiry in mind, I leave all of you with a version of the question I am asking myself: how does one know when gender is a relevant factor in social interactions and when it isn’t? And if philosophers are so bad at seeing it, then does this mean that philosophy is much less effective in this regard than it claims to be? All evidence suggests that the answer to the second question is yes; indeed, it is much worse than most philosophers like to think. I’m hoping all of you can help me with the first one.

Monday, October 11, 2010

Is “selective” government possible?


 The United States is in a frenzy these days about taxes. Despite the fact that we have the lowest average tax burden since 1970, many people want to pay fewer taxes, and one of the consequences is that states and counties are no longer providing universal coverage for different services. In response, some might suggest that people could choose to pay for what they want to support. Some might refuse to pay for the military; others may not want to pay for public schools. It is therefore worth considering what a government might look like that has an a la carte menu.

Look no further. A Tennessee man recently watched his house burn down right in front of firefighters. They eventually put out the fire once it spread to his neighbor’s house, but they just watched his own burn. The reason: he hadn’t paid his subscription fee.

His county, it turns out, doesn’t offer fire service, but residents could pay a nearby county $75 for fire protection. He didn’t. His neighbor did. When the fire started, his neighbor’s house was saved, his wasn’t, and despite his offer during the blaze to pay any amount of money, the fire department refused.

There are a whole host of philosophical questions that arise here. First and most obviously: was the fire department morally correct in refusing? Some might say they weren’t since a person’s house was at stake – why not make an exception, and why not charge him afterward? Others might retort that the homeowner is a consenting adult, and if the firefighters made an exception in his case, no one would pay the fee, and everyone would expect coverage anyway. On the one hand, fire departments could make a killing charging for their services when they are most necessary… say $7500 instead of $75. On the other hand, this might be considered gouging. Is this exploitation or just capitalism at its finest?

For me, the more interesting question, however, is the very nature of a la carte governance. John Rawls famously wrote that part of what it means to live in a community is to “share one another’s fate.” Is this shared fate possible if each person has different coverage? Additionally, a la carte governance means, as usual, that wealthier citizens have a better chance of comprehensive coverage than poorer ones. This seems like a recipe for factionalism and injustice. (It is also how the world actually is, so some would ask whether this scenario is meaningfully different from the world as we find it.)

On the other hand, there are many government services that are voluntary right now – we in North Dakota can choose whether or not to have federal flood insurance – and saving this money may actually help the poor. It’s all a matter of risk, one might say, no different than the millions of Americans who choose not to have health insurance. (As with health insurance, some might object that the benefit to the poor is short term, but in the long run, it’s more harmful.)

So, the question before us is whether justice is possible in a world of different governmental coverage and whether managing risk and saving money are suitable alternatives to forcing taxpayers to pay for something they might not want. Finally, should freedom to choose trump mandatory tax laws? As always, I’m curious as to your opinions.

Friday, September 17, 2010

Bonus Post: Inevitable Comments on Qur'an Burnings



The Dakota Student, the UND student newspaper, asked me to write a guest column on the rise of anti-Islamic sentiment in the U.S. I felt uncomfortable about sticking my faculty nose in a student paper, but I agreed. (This month's WHY? monologue is about my trepidation.) And since it was for the student paper, I decided to, in the words of Malcolm X, "talk right down to earth, in a language that everybody here can easily understand." (To hear his voice, and an awesome 80's song that samples it, go here.) So, for those of you who are interested, here is the column:


On Burning Books and Terror Babies

Some people are just dicks. I recognize that as a philosophy professor, one might expect a more sophisticated response to the Reverend Terry Jones’s desire to burn Qur’ans, but there isn’t much more to say about him. He’s a charlatan – an attention-whore – and he’s using tragedy and politics to gain his fifteen minutes of fame. He’s not an agent of God. He is a purveyor of adolescent angst. Don’t give him any attention and he’ll go away.

The people who condemned the book burning – politicians ranging from Barack Obama to Sarah Palin – did so by citing the imminent danger to American service personnel. I certainly don’t disagree. But they neglect the greater danger to the Baptist church in Lawrence, Kansas; the Pentecostal community in Boston, Massachusetts; the Mormons in Lubbock, Texas. The terrorists can’t tell the Christians apart – they can’t even tell Americans apart in general – and when they strike back, they’re not going to bomb Jones’s church, they’re going to bomb someone else’s. They might bomb a synagogue, or a Chuck E Cheese. They believe all Americans think the same way, have the same values, and share the same religious points of view. They’re bigots.

Of course, so is Jones, and so are many of us. Lots of Americans can’t tell Muslims apart. They think that the folks who bombed the World Trade Center share the same religion as those who want to build a cultural center a few blocks from ground zero, but those two groups have as little in common as White Pride skinheads do with Episcopalians. Yes, they share the same book. But so does the guy who shoots the abortion doctor and the priest who distributes clean needles to heroin addicts. If we shouldn’t paint all those people with the same brush, then why should we paint Muslims as such?

The worst offender, to my mind, is not the moronic Reverend Jones, but Texas Representative Louie Gohmert, who claims that there are “terror babies” in our midst. Muslims, he claims, are having children in the U.S. just so they can gain citizenship and train them to blow us all up. He is not a dick. He is much worse, although my professional judgment prevents me from using the appropriate words to describe him accurately. His message can be boiled down to a familiar racist accusation, “they don’t love their children the way we love ours,” and its logical extension, “they are not human.”

For Gohmert, Muslims are lesser creatures who willingly procreate to develop an army. They suckle their babies smiling, not with love, but with the vision of their child’s body parts spraying across the landscape taking as many people with them as they can. He thinks they are monsters, that they must be stopped.

In the nineteenth century there was a debate as to whether “negroes” were able to love. Black people were not regarded as fully human, and scholars wondered if they were capable of this most human emotion. This debate ought to make us all nauseous, but it is a short walk from it to Gohmert’s position, one enabled by Jones’s and by all those who can’t tell Muslims apart any more than the fanatics of Al Qaeda can distinguish individual Americans – Americans, by the way, who themselves may be Muslim and wholly loyal to our democracy.

Terry Jones is a dick and the response to him is simple. Everyone should just walk up to him and say “stop being a dick.” But Gohmert and the rest are anti-American bigots and the response to them must be to tirelessly point out their ignorance. This requires learning more about the world and the people who inhabit it. It means accepting, once and for all, that Islam is not a four-letter word, and that assuming it is makes all of us bigots too. So stop it. 


The article has now been posted on the Dakota Student site. There is a comments section there too; as of yet, no one has responded.

Friday, August 13, 2010

What ought we do when we learn the horrors of human history?

Painting: Anselm Kiefer

In my reading, movie viewing, talking with people, and day-to-day life, I encounter a lot of horror stories -- terrible things foisted upon people by nature, politics, and war. When they are presented compellingly -- when they move me -- I always have the same reaction: "I have to tell my students about this." This happened today.

I subscribe to a story-telling podcast called "The Moth," which I enthusiastically recommend; a recent episode focused on a survivor of the Khmer Rouge regime in Cambodia. (All stories on the Moth are true, by the way, and told without notes.) If you don't know about this period of history, the Killing Fields, or of Pol Pot, I suggest you take this opportunity to read up on them all. It is yet another chapter of a horrifyingly genocidal twentieth century. They're important to know about and interesting in their own right.

But even if you don't seek out more info, I would encourage you to listen to the podcast I have embedded into this entry. It's horrifying, naturally, but it is also very human and connects you to a person who you would never otherwise meet, an ordeal which, hopefully, you will never approximate. I'm not one for pornographic violence -- there's nothing about the story that is titillating or glorifies suffering. Some may call the end uplifting, but I'm not sure I can use such a word in this context. DON'T listen to it around children:



You can imagine all of the questions I had because of this episode. Why are people so horrible? Will we ever learn? What does genocide accomplish? What makes such a woman so strong? Would I have the capability to do what she did -- or to even survive?  I will, of course, never think of pedicures and manicures the same way again.

But the other question that comes to mind is what my moral responsibility is now that I've heard this woman's story. This is a different question from what should be done in the face of current horrors. All the events of this story took place in the past. They can't be undone. This story is information, and asking what to do with knowledge is a different kind of question.

As a teacher, my instinct is always to pass information on to others. I do this in class often, and I did it this season for the IPPL Art & Democracy film festival when I showed War Child. I felt it was a movie everyone should see. Some were less moved than I; others were less compelled to act; but at least people were exposed to the movie and talked about it. That, I felt, was my part.

This desire to tell others is, I think, the legacy of post-Holocaust prescriptions to "never forget" and to tell the victims' stories. I believe that such information helps us understand that other people are real, that political strife is real, and that the suffering of others matters, even when they are unknown or unconnected to us. It gives name to the nameless, recognition to the unrecognized. (Metaphysicians might ask about the worth of recognizing the dead if it does not impact them in any way, but that's a separate question.) However, I do not believe that "those who don't know the past are doomed to repeat it," because plenty of people repeat what they know and plenty of others are too good to commit atrocities that they had never heard of.

Surely, there must be more to my moral obligation than passing on stories. Information can be passed from person to person in perpetuity without changing behavior, so telling stories just doesn't seem to be enough. So, what else is there? Maybe I learn that if I were ever in absolute power, I shouldn't be like Pol Pot. Maybe I will now avoid electing someone like Pol Pot or assisting in their acquisition of power. But if these are the lessons, they are either useless or trite. I'll never have absolute power and I already know not to vote for a dictator. So, what else is there?

There are other answers, of course: give money to a related organization, go to Cambodia and offer my services, be grateful for what I do have and for what I or people I know haven't had to endure. Yet all of these are options; there is no moral obligation to do any of them. Thus, the question I pose to all of you is: are there any moral obligations or moral duties that follow from hearing such a story? I don't have an answer. I'd be curious to see if any of you do.

Bonus Teaching-related Post



The folks at Teaching Thursdays, a blog at the University of North Dakota, asked me to write a piece reflecting on teaching and the Institute for Philosophy in Public Life. For those of you who are interested in such things, I figured I ought to provide a link on PQED. So:

Monday, August 2, 2010

Is spending a lot of money immoral?


As everyone in the world seems to know, Chelsea Clinton got married this past weekend. And, as everyone also seems to know, the estimated cost of her wedding was 3 million dollars. That’s a lot of money, and the media (both new and old) are challenging the morality of spending that much on a party, however important it might be to the bride, groom, and family. Just think, philosopher Michael LaBossiere asks us to consider, how much good that money could do for those who need it.

This is a standard argument against extravagant spending. Some things are just too expensive. It rests on the notion of opportunity cost – those sacrifices one makes by doing one thing instead of doing something else. So, the opportunity costs of spending 3 million dollars on a wedding might be one million textbooks that never get purchased, or hundreds of thousands of malaria inoculations that never get distributed, or what have you, whatever the Clintons would have chosen if they hadn't chosen the wedding. Of course, everything has an opportunity cost, not just money. The time I spend writing this blog has an opportunity cost because I could be working on an academic book, or jogging, or hanging out with my daughter, or feeding the poor at a soup kitchen. And hanging out with my daughter has an opportunity cost – writing this blog, for example, or jogging, or feeding the poor, or searching for enlightenment. Anytime you do one thing as opposed to another, or anytime you spend money on one thing or another, the opportunity cost is the other thing you could/would have done or bought if you hadn't chosen what you chose. Most of the time, we think that the cost is worth the price – this blog is important enough to me to sacrifice an hour of Adina-time every few days, but it might not be if, God forbid, she had terminal cancer or something. LaBossiere is therefore suggesting that the opportunity cost is simply too high to justify spending 3 million dollars on a wedding; many people seem to agree.

The problem with this argument, however, is that it assumes money has an absolute value – that three million dollars is three million dollars period, and that every person’s three million dollars is the same. Especially in the Clintons’ case, this just doesn’t seem to be the case. Hillary Clinton’s net worth is estimated to be $34.9 million; Bill is estimated to be worth $200 million. Consider, for example, what Money Magazine tells us:

Although she takes in $165,200 a year as a senator, these days Bill is breadwinner-in-chief. His presidential pension is $201,000 a year, and he grabbed a $12 million advance for his 2001 memoir, "My Life." (Her "Living History" won an advance of $8 million and $7 million in royalties.)

But it's been Bill's great gift for gab that has really feathered the Clintons' nest. He earned an astounding $41 million speaking to groups and corporations in the first six years since he left office. Standard fee: $150,000. The fact that he may be married to the next President can only burnish his star power.

So, it seems to me that the question isn’t whether 3 million dollars is too much, but rather whether someone’s only child’s wedding is worth roughly 1.3% of his or her net worth, and that’s ignoring future earnings. Asked this way, the amount seems negligible, to say the least. When my wife and I were married, our wedding cost about 12% of our combined annual salaries. (Debt from student loans kept our net worth in the negative, but there’s no point getting into that.) So, unless money has an absolute value – again, three million dollars is just three million dollars, no matter who spends it – then Kim and I are much more immoral than Bill and Hillary.
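For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch. The net-worth and wedding-cost numbers are the estimates quoted above, not verified financial data:

```python
# Back-of-the-envelope check of the figures quoted in this post
# (all amounts are the estimates cited above, not verified data).
hillary_net_worth = 34_900_000   # estimated $34.9 million
bill_net_worth = 200_000_000     # estimated $200 million
wedding_cost = 3_000_000         # estimated $3 million

combined = hillary_net_worth + bill_net_worth
share = wedding_cost / combined  # wedding as a fraction of combined net worth

print(f"Wedding as share of net worth: {share:.2%}")  # prints "Wedding as share of net worth: 1.28%"
```

The same division applied to our own wedding (about 12% of a single year's combined salary, against a negative net worth) makes the comparison in the paragraph above concrete: the ratio, not the raw dollar figure, is doing the moral work.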

One might argue that no one should be as rich as the Clintons, or that they don’t spend enough on charity. Perhaps, but again, those are different discussions. Instead, the question is whether, given how rich they actually are, and given how much money they will continue to make, it is immoral for them to spend so much money on this particular wedding.

It’s easy to blame the rich, but it’s even easier to forget that most everyone reading this blog counts as rich. In terms of U.S. dollars, the average person in Zimbabwe makes ten cents a year. Folks in the Republic of Congo earn (on average) $334 per year; in Liberia they earn $379. (There’s a list of the ten poorest countries and their per capita income here.) Outside of Africa, Afghans make an average of $800 per year, and according to many sources, Haitians make $2 per day or $730 per year. So, you know what’s really immoral by LaBossiere’s standard? My Netflix membership. In one year, I could feed 2176.8 Zimbabweans for the amount that I spend to not watch the DVDs I have piled up on my television for three months at a time. I could feed a Congolese person for almost a year. I could save lives, real people, with real need, and even if the aid organizations were imperfect, I could still have substantial impact. Thus, if opportunity cost, based on absolute value, is the standard by which we are to judge whether someone is spending too much money on something, I have a lot to answer for. And so does, I expect, LaBossiere. So do you.
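The opportunity-cost arithmetic above is easy to reproduce. A caveat: the annual Netflix spend below is an assumption inferred from the "2176.8 Zimbabweans" figure (2176.8 × $0.10 = $217.68), not a real subscription price; the per-capita income numbers are the ones quoted in the paragraph:

```python
# Opportunity-cost arithmetic using the per-capita income figures quoted
# above. The annual spend is an ASSUMPTION inferred from the post's
# "2176.8 Zimbabweans" figure, not an actual Netflix price.
annual_spend = 217.68      # implied by 2176.8 people x $0.10/year
zimbabwe_income = 0.10     # dollars per year (figure quoted above)
congo_income = 334.0       # dollars per year (figure quoted above)

people_fed_zimbabwe = annual_spend / zimbabwe_income
years_fed_congo = annual_spend / congo_income

print(round(people_fed_zimbabwe, 1))  # prints 2176.8
print(round(years_fed_congo, 2))      # about two-thirds of a year
```

The point the numbers make is the one in the paragraph: by an absolute-value standard, an ordinary Western subscription fee carries enormous opportunity costs, which is exactly why the standard implicates nearly everyone.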

Peter Singer agrees, incidentally, and he argues that if it costs $30,000 for an American family to live, then any penny more than that ought to be donated to the poor. That too is a conversation for another time, but it seems to be the logical consequence of LaBossiere’s position.

I intend to take a different tack. What saves us from being horribly immoral people? The answer is hinted at by a misunderstanding that LaBossiere waves away. He writes, “Then again, perhaps this is an excellent example of trickle down economics: the fabulously wealthy Clintons spend $3 million on a wedding and this trickles down money to those involved, such as the waiters who will be working there and the folks making the cake. Also, people who are out of work and poor can enjoy watching news of the event on the TV in the local coffee shop, thus lifting their spirits.”

First of all, what he describes isn’t trickle-down economics. Trickle-down economics is a tax theory that argues that the wealthy should pay fewer taxes because the wealthier they are, the more money they spend, and that this spending has a disproportionately positive impact on the economy. What LaBossiere waves away, instead, is basic capitalist theory, that is, that the more goods purchased, in general, the more labor there is, and when there is more labor, there is a higher standard of living for everyone. Adam Smith called this increase in standard of living "universal opulence." Sure, trickle-down economics is a variation of this, but so is every other market-based economic theory. I will let Adam Smith explain the point further, since he does it better than anyone else, but before I do, let me warn you that this is a REALLY REALLY long quote, and I’m including it all to counter LaBossiere’s caricature that the Clinton wedding helps out a few select people materially and a few more psychologically. Certainly, if the wedding helped out a handful of people only, it would be hard to justify. But Smith’s point is that something like this does not just help out a few people; it helps out lots and lots and lots and lots.

You don’t have to read the whole thing if you don’t want to. Read the first few sentences and then skip ahead to my text, if you prefer, but if you want to really understand the power of Smith’s argument, you should take the time to get the picture Smith is drawing. Anyway, here is Smith’s description of all of those who benefit from the purchase of a simple coat. It's from The Wealth of Nations:

    Observe the accommodation of the most common artificer or day-labourer in a civilized and thriving country, and you will perceive that the number of people of whose industry a part, though but a small part, has been employed in procuring him this accommodation, exceeds all computation. The woollen coat, for example, which covers the day-labourer, as coarse and rough as it may appear, is the produce of the joint labour of a great multitude of workmen. The shepherd, the sorter of the wool, the wool-comber or carder, the dyer, the scribbler, the spinner, the weaver, the fuller, the dresser, with many others, must all join their different arts in order to complete even this homely production. How many merchants and carriers, besides, must have been employed in transporting the materials from some of those workmen to others who often live in a very distant part of the country! how much commerce and navigation in particular, how many ship-builders, sailors, sail-makers, rope-makers, must have been employed in order to bring together the different drugs made use of by the dyer, which often come from the remotest corners of the world! What a variety of labour too is necessary in order to produce the tools of the meanest of those workmen! To say nothing of such complicated machines as the ship of the sailor, the mill of the fuller, or even the loom of the weaver, let us consider only what a variety of labour is requisite in order to form that very simple machine, the shears with which the shepherd clips the wool. The miner, the builder of the furnace for smelting the ore, the feller of the timber, the burner of the charcoal to be made use of in the smelting-house, the brick-maker, the brick-layer, the workmen who attend the furnace, the mill-wright, the forger, the smith, must all of them join their different arts in order to produce them. 
Were we to examine, in the same manner, all the different parts of his dress and household furniture, the coarse linen shirt which he wears next his skin, the shoes which cover his feet, the bed which he lies on, and all the different parts which compose it, the kitchen-grate at which he prepares his victuals, the coals which he makes use of for that purpose, dug from the bowels of the earth, and brought to him perhaps by a long sea and a long land carriage, all the other utensils of his kitchen, all the furniture of his table, the knives and forks, the earthen or pewter plates upon which he serves up and divides his victuals, the different hands employed in preparing his bread and his beer, the glass window which lets in the heat and the light, and keeps out the wind and the rain, with all the knowledge and art requisite for preparing that beautiful and happy invention, without which these northern parts of the world could scarce have afforded a very comfortable habitation, together with the tools of all the different workmen employed in producing those different conveniencies; if we examine, I say, all these things, and consider what a variety of labour is employed about each of them, we shall be sensible that without the assistance and co-operation of many thousands, the very meanest person in a civilized country could not be provided, even according to what we very falsely imagine, the easy and simple manner in which he is commonly accommodated. Compared, indeed, with the more extravagant luxury of the great, his accommodation must no doubt appear extremely simple and easy; and yet it may be true, perhaps, that the accommodation of an European prince does not always so much exceed that of an industrious and frugal peasant, as the accommodation of the latter exceeds that of many an African king, the absolute master of the lives and liberties of ten thousand naked savages.

There is a lot that follows from Smith’s quote, including, obviously, an implicit argument for globalization. There is also language in the last sentence that rubs us moderns the wrong way – eighteenth-century norms allowed people to speak of Africans as “naked savages”; we certainly don’t do that. However, we can talk about whether Smith was a racist another time. (He wasn’t.) Instead, Smith’s point about African tribes at the end is that the income differential in a free-market society is never as large as the income differential in a tribal society in which no one owns anything but those in charge. Is that less true these days with Bill Gates around? I don’t know. But if we take away the handful of multi-billionaire exceptions, it seems to be the case.

Smith's writing is, of course, an argument for free-market theory, and the reason why I bring it up is to show that Labossiere’s attack on the Clinton wedding is, in essence, an attack on capitalism in general. Now, it's okay if you want to attack capitalism. I just don't like the idea of someone sneaking the critique under the door without acknowledging it.

(Incidentally, if you are interested in these debates, I'm teaching a graduate seminar on philosophy of economics, including the morality of markets and globalization in the Fall. Come join us. There are still spaces available.)

Smith's point is that consumption helps the general population; the Clinton wedding does too. However, my point – or rather, the philosophical question before us now – is whether we can measure the value of something by an absolute standard or whether we should consider it in the context in which the value is determined. Everything I have written suggests that value is contextual, whereas Labossiere seems to argue otherwise. What do you think? Does the Clinton fortune mitigate the amount they spent on Chelsea’s wedding, or is it simply too much? If it is too much, then what standard are you using to decide? Where does absolute value come from, and if you rely on it, why aren’t you just as immoral as the Clintons or me?

UPDATE: As always, when I walk away from the blog I realize ten things I wanted to say. So, I'm back to add one clarification. My point is not that spending is always good. I am not arguing for rampant materialism or conspicuous consumption. (Smith doesn't either.) Instead, I am asking about the standards we use to determine whether spending money on something specific is moral or not. Are the criteria derived from context or from the absolute value of the money? Obviously, I suggest the former, but this doesn't mean that some purchases still can't turn out to be immoral.

Wednesday, July 14, 2010

Are we morally obligated to keep other people's secrets?


The other day I found a phone that someone left behind in our local dog park. I looked through the contacts, called “Mom” to tell her that I found her child’s phone, and then texted someone listed as “grlfrnd.” It was she who eventually responded, and the owner of the phone called me as well. The whole thing took about two hours, and Grlfrnd came to my house to get the phone. Everyone was very nice. But there was one question that kept going through my head: do I look at the pictures?

I’m a curious guy. Nosy, at times, and I’m certainly as voyeuristic as the next person. So I had someone’s phone, and I was curious what secrets it held. But I also knew that in looking, I would be violating someone’s privacy. And what if I found something I shouldn’t? In addition to the variety of things I didn’t want to see based on personal taste, I could also imagine the media-fueled stories of child porn, and the obligation to go to the police if I found some of that. So was I willing to open that Pandora’s box?

People will respond by saying that you shouldn’t put something on a phone that you don’t want others to see, and this is good advice. But it is a strategy, not a moral position. The question I was asking myself was instead about my own ethical behavior. Just because I was curious – perhaps pruriently so – does that mean it’s morally OK to look?

As always, my questions dovetailed with other things, most particularly a column by The Ethicist in the New York Times Magazine. The column, for those of you who don’t know, is a pseudo-intellectual Dear Abby written by Randy Cohen. People write in with moral dilemmas, and Cohen offers his advice. He is not a philosopher and does not appeal to theory all that often, but it’s a fun read, mostly because I, like many other readers, am curious whether his answers agree with mine. They do a fair amount of the time.

So, an orthodox Jewish woman writes in to ask whether she should “out” a guy she went on a date with for being transgender. They had a nice time, she googled him, found out that he was born female, but she also indicates that she has special research skills and found out about him in ways others would not be able to. Presumably, his identity would stay a secret if she didn’t tell anyone. She broke it off with him, knows he dates other women in the community, and wants to know whether she should “urge” the rabbi to tell others. Cohen’s response is that while she can certainly tell her friends in everyday conversation, she shouldn’t tell the rabbi.

Thus we have the question: is she obligated to keep his secret? Now, we’re not sure what special research skills she has, and if she somehow accessed private medical records, then she shouldn’t tell – accessing the records would itself be illegal and likely immoral, and passing on the information would be as well. But let’s assume she found the information via public records. What then? Again, Cohen’s answer seems fair at first.

However, Cohen misses at least one point: in a tight-knit community like an orthodox Jewish community, there are few secrets. Telling one person something is, in essence, telling everybody. Yes, Jewish communities have prohibitions against gossip, but we all know how well those work in any group. So, we have to assume that there is no real distinction between telling one person and telling the rabbi, and thus Cohen’s solution falls apart. She either tells no one, or she tells everyone.

Jos, on the blog Feministing, has another objection. She writes that outing a transgender person often leads to violence and thus telling people is subjecting the man to great personal risk. This is an interesting objection because, at least in part, it offers a criterion: we do not tell people’s secrets when it could be dangerous to do so. But Jos too misses the point: for the orthodox woman, being transgender is a profound moral wrong – it is unnatural – and thus people need to be “warned.” Would Jos make the same claim if the woman found out he was a convicted child molester?

That last sentence will likely make Jos's head explode. She will likely claim that I am comparing being transgender to molesting children. (I am assuming from the context that Jos is female, but the profile link is broken, so I may be incorrect about this.) I am not. I personally do not find anything morally wrong about being transgender, I don’t think there are victims involved, and I don’t think that anyone has an obligation to tell partners that they were of a physically different sex at some point unless, perhaps, there is a committed relationship involved. Furthermore, the latter, for me, has more to do with intimacy and being a part of a joint project than with the “trans panic” that Jos warns of or the distaste of people who are offended by fluid sexualities.

The point is that for many conservative people, being transgender is like being a child molester in that he or she is someone the community needs to be protected against. Perhaps the act is immoral, perhaps it is unnatural; it is not for me, in this blog, to settle that question for these people. But since the woman thinks this, and truly, honestly thinks this, the objection that it might be dangerous to the “perpetrator” doesn’t hold moral weight, in just the same way that arguing against outing a child molester because it might be dangerous to the child molester doesn’t hold any moral weight. Protecting children comes first, and, for the orthodox woman, presumably, protecting the community comes first.

Jos might argue, of course, that the orthodox woman is just wrong, that there is nothing wrong or dangerous about being transgender, and thus the woman, by telling of the man’s past, is committing a dangerous act based on false assumptions. Holding such a critique of orthodox Judaism would certainly be within someone’s rights, but it still doesn’t solve the problem at hand. That orthodox Jews need to be educated about real morality and the way their religion distorts justice and reality is certainly a common opinion held by those on the left and the right alike, but it isn’t going to convince any orthodox Jews. It isn’t going to convince the woman who wrote in. Were Cohen to suggest it, his answer would have been akin to the statement: “this is a moral dilemma only because your religion is wrong and bad. Leave it and you won’t have this problem anymore.” I hope everyone can see why this is not a viable response. (For a wonderful and sympathetic documentary about growing up gay in the Orthodox community, see Trembling before G-d.)

As a side note, I wonder whether the many gay activists who encourage outing closeted people for political reasons feel the same way about transgender folks. Do they think that the orthodox woman has a moral obligation to tell her rabbi and community, not to protect the community but to force the hand of the transgender person? Consistency would suggest that such activists would have to claim just this.

In short, there are compelling reasons to tell and not to tell, some of which I would agree with, some of which I wouldn't. In the end, though, the issue is that the transgender man in question wants his secrets kept just as the person whose phone I found presumably wants his photos to remain private. Am I morally obligated to refrain from looking at the photos? If I do look, am I obligated to keep what I found to myself (even if there is nothing illegal or immoral)? Is the fact that a person desires their secret to remain a secret enough to compel the rest of us not to tell it?  This was a long entry. What's your opinion? And was Cohen or Jos right? Or neither? What would you suggest?

Monday, July 12, 2010

Can "American" be an ethnic term?


Years ago, my friend Mike O’Connor told me that what made us a unified American culture is that we “all know who Michael Jordan is, and we have all eaten Big Macs.” The quote stuck with me, especially as I moved to different regions seeking a tenure-track position. (Mike, it should not be surprising, ended up getting a Ph.D. in American Studies.) The quote came back to me yet again when I found this video of a woman in a German grocery store coming upon an “America Ethnic Foods” section.



The selections should not be too surprising: there are marshmallows and baking mixes (I used to bring these products as gifts to Austrian friends who had visited America, since they did not exist in their country); Hershey’s chocolate syrup, Campbell’s Soup, Hellmann’s Mayonnaise (known as “Best Foods” west of the Rockies, by the way), and others. I won’t ruin the whole video.

The video does bring up the question of whether “American” can be seen as an ethnicity or just a label for a conglomeration of people with differing ethnicities. There are philosophical reasons to suggest the latter. Being an American (citizen) is a purely political distinction. A person who is naturalized is as much of an American the moment they become a citizen as someone whose family came to the new world on the Mayflower. A child of any parents, if born on American soil, is considered an American. So, history, culture, loyalties, political perspective, and other such things are irrelevant to the categorization. Yet, at the same time, we do want to think that there is something akin to American culture. I think we would feel hollow without it. But what would it be? Is it just the standard notion that we are all participating in the great American experiment? Or, perhaps, it is whatever we absorb by living within the borders, regardless of which part of the country we live in.

If Mike was right – if this video is right – that American ethnicity is somehow defined by its consumer culture, then we encounter an even deeper connection to capitalism than those who claim that it is just democracy that goes hand-in-hand with free-markets. (I do not claim this postulate is true, by the way, just that some people claim it.) It suggests that capitalism is itself a culture and an ethnicity – a heritage in itself.

P.S.

I found an interesting definition for “ethnic” on Answers.com: “Relating to a people not Christian or Jewish.” That seems odd to me, although it does recall Frasier Crane, on Cheers, declaring that he wished he had an ethnicity.

Someone on Urbandictionary.com clearly agrees with Answers.com, and defines ethnicity as “Something white people should never try to relate to.”

Clearly it is a loaded and political term.

Friday, July 9, 2010

Blog Hiatus almost over -- here is an article about WHY? from Humanities Magazine


Greetings all. As you may have noticed, I have taken the month off to exhale and recharge. I was traveling and participating in a wonderful NEH institute on the history of political economy at Duke University. But the blog should be up and running starting next week -- once I turn my attention away from Sunday's episode of Why? Do tune in. Paul Sum returns to discuss his year-long research trip to Romania. His work focuses on analyzing America's efforts to aid that country's emerging democracy. Paul's episode before the trip can be found here.

In the meantime, here is a nice piece on Why? that appeared in the NEH magazine, Humanities. It's IPPL's and Why?'s first national exposure. We are, as you can imagine, thrilled.

Saturday, June 12, 2010

On Second Thought - The Philosophy Issue is now available online!


The Philosophy Issue of On Second Thought, a joint publication from IPPL, Why? and The North Dakota Humanities Council is now available online. You can read it on the web, download your own copy, or request a physical copy from the NDHC. Click here. It's Free!!!

Table of Contents
 

Philosophy and its Public:
Mediating the War between Philosophers and Everyone Else
By Jack Russell Weinstein

Wild Business: A Philosopher Goes Hunting
By Lawrence Cahoone

Yield: Taking Measure of the Land
By Scott Stevens

Fireflies
By Jonathan Twingley

Why Philosophers Should Study Sports
By Paul Gaffney

Breaking all the Rules: A North Dakotan Philosopher
By Elizabeth M.K.A. Sund

Bibliomancy
By Heidi Czerwiec

A Lesson in Living
By Jessie Veeder Scofield

Advice
By Nancy Devine

Philosophy Goes Popular
By Clay Jenkinson

Again, click here for the magazine.

Tuesday, May 25, 2010

IPPL is now accepting applications for 2-week Visiting Fellowships!




Applications for 2010-2011 Visiting Fellowships at the Institute for Philosophy in Public Life are now being accepted.
The deadline for applications is July 1, 2010.
The Institute for Philosophy in Public Life is dedicated to two projects: cultivating philosophy amongst the general public and bridging popular philosophy with academic research. This includes not only providing resources and opportunities for those interested in engaging with general audiences but also providing a venue for the presentation of their work. IPPL hopes to advance public philosophy by advocating the position that such work ought to count towards tenure and promotion.




IPPL Fellowships are both awarded by invitation from the director and chosen via open competition. Any interested party is encouraged to apply, and prospective applicants are welcome to contact the director informally to ask for advice or to "test the waters" regarding their suitability and competitiveness.

An IPPL Visiting Fellowship is intended for philosophical professionals who seek an intensive short-term period to work on a specific project free from the intrusions of daily work and family responsibilities, and who wish to translate that same project into language easily understood by general audiences. Visiting fellows are in residence at the institute for two weeks. They receive travel, meal, and housing allowances; a $1,000 stipend; access to the University of North Dakota library and all relevant university resources; a $500 grant to purchase research materials to be housed within the UND Chester Fritz Library; and an office within which to work. In exchange, visiting fellows are expected to make at least two public presentations suitable to lay audiences and write a ten- to fifteen-page article for publication either online or in the North Dakota Humanities Council magazine On Second Thought. Normally, IPPL grants three to four visiting fellowships per year.

Regional applicants are encouraged to apply, but are not exempt from the two-week residence requirement.



International applicants may only have a portion of their airfare paid, but are eligible to receive all other benefits. 




Click here for application information
http://www.philosophyinpubliclife.org/Instute/fellowshipapplication.html

For more information, contact ippl@und.edu.

Is “it just makes sense” evidence for an argument?



Political races in Alabama appear to be heating up, emphasizing issues related to immigration and otherness, evolution, and thugs in government. (Thanks to The Faculty Lounge, for calling attention to these ads.) Perhaps one of the strongest attacks comes from gubernatorial candidate Tim James who decries the fact that Alabama driver’s licenses are given in twelve languages:



What concerns me is not any alleged xenophobia or whether English should be a state’s official language, although these are important issues and need to be discussed. Instead, PQED readers may want to ask about the nature of his argument itself. Is there an argument at all? Here is the text:

“I’m Tim James. Why do our politicians make us give driver’s license exams in 12 languages? This is Alabama. We speak English. If you want to live here, learn it. We’re only giving that test in English if I’m governor. Maybe it’s the businessman in me, but we’ll save money, and it makes sense. Does it to you?”

The subordinate argument is that giving tests in only one language saves money. Fair enough. That sounds like it could be true, although the savings would probably be minimal, especially compared with the court costs of prosecuting those who drive without licenses because they cannot read the test. The major argument, however, is encapsulated in the phrase “and it makes sense.” This is a classic rhetorical move that sounds like it is saying something but really isn’t. Is there an argument there? I don't think so.

The thing about appeals to intuition is that they attach themselves to possible rather than actual arguments. In other words, whatever the audience intuits the argument to be is the argument that justifies the conclusion. Yet, there is no reason to think that James’s reason for opposing multi-lingual testing is the same as someone else’s.

What could those arguments be? Will English-only testing stop foreign-language speakers from driving? Probably not. Will it force them to go “home” to their country of origin because there they can drive? Probably not. Will it inspire them to learn English faster so they can drive legally? I suppose one might claim so, but I would have to see significant evidence for that position. In fact, what a law like this would do is, as indicated above, promote driving without licenses and increase the poverty and suffering of those who need cars to work, shop, and be good citizens. It would also punish them for not speaking English, and that, in the end, is what James's argument appeals to – the desire to hurt those who are different. Why? Because the punishment will likely have no consequences other than the punishment itself. The suffering is all there is. Punishment without purpose is retribution, not rehabilitation.

Appeals to intuition are inherently conservative – and I don’t mean this in the political sense of Republican vs. Democrat. I mean, instead, that they preserve the status quo. It is my daughter’s intuition that some “new” food is yucky before she tries it, it was the slave-owner’s intuition that slaves were not full human beings long before any of them experienced an equal playing field, and it is most everybody’s intuition that the familiar is more comfortable than the different. (Unless, of course, the familiar is so unpleasant that any change would be better.) Most people who enjoy newness and difference enjoy the adventure, excitement, and intellectual challenges of the new experience. Familiarity is easy. As Edmund Burke tells us, change causes tremendous disruption, and thus there must be a compelling reason for anyone to want to endure it.

I would argue, therefore, that James makes no argument, and, in the end, he piles on another fallacy: the appeal to the bandwagon. It makes sense to him – “Does it to you?” And herein lies the problem: the audience is not forced to ask why they believe what they believe or whether what they believe is right. Instead, they are only faced with the question of whether his appeal to emotion is enough to motivate them to vote for James. In short, there are a lot of non-arguments here, but very few reasons to support James's position that multilingual driving tests ought to be prohibited. Or am I wrong? Are there arguments in the ad that I have missed? Does James, in fact, make sense to you? I'd like to know.

Tuesday, May 18, 2010

What are taxes for?


Today’s Grand Forks Herald (PQED’s local newspaper) has a story about the city council approving community gardens. The city itself has a notoriously bad track record for approving new projects – it took us five years to get a dog park – but after some controversy, and by a narrow vote, the gardens were approved. The article reports that some residents opposed the gardens for two different reasons. The first is their claim that it would be “unsightly and bring strangers into the neighborhood.” I lived near community gardens in Boston for close to four years. They were beautiful, and more importantly, the attentive and caring gardeners made the area more safe, not less. There were always people around, people who had a sense of ownership and affection for the area. It was, indeed, a safe haven. And, of course, this underscores the important realization that strangers are not bad. Most newcomers are assets to communities – variety is, after all, the spice of life.

It is, however, the second reason that is the occasion for this post: a comment made by Ms. Sherri Brossart, a local resident. She is quoted as saying, “This is my neighborhood. I did not pay taxes to have that community area.” And here is where the philosophical issue is most explicit. She claims she does not pay taxes to support a community, and I would contend just the opposite: that this is precisely why she paid them in the first place.

In the last few decades, Americans have come to believe two false things about taxes. The first and less dangerous notion is that we can pick and choose what our money contributes to. The government does not work like this, and it shouldn’t. Large-scale projects are only possible when citizens pool their wealth to create large amounts of capital. If we could select what we funded, there would be inadequate funding for the military, drug enforcement, schools, or highway repair. Everyone would support only their pet projects, and few of them would have enough to succeed. But the more dangerous attitude about taxes is that we pay the money for ourselves. People regularly claim “I pay taxes so my streets will be safe,” or “I pay taxes so my child can go to school.” But this is not what taxes are for, and this is where Ms. Brossart makes her mistake. We pay taxes so that everyone’s streets will be safe, and we do so to let other people’s children go to school, not just our own. That is what makes us a community, a society of interwoven individuals with a common national project. It is what allows John Rawls to claim that a just society is one where people acknowledge that they all share the same fate.

If Ms. Brossart wants to say that her taxes cannot go to someone else’s community garden – and by the way, I have a backyard and no interest in participating in the garden; this is not a self-serving blog post – then theirs (and mine) shouldn’t go to her street lamps. If she can’t support a community endeavor like the garden, then I don’t want my money going to the police that protect her and the firefighters that hose down her house. Those things don’t help me, so why should my money serve her interests? It’s my money, after all, not hers.

But, of course, I do want her to have those protections. I want her to be safe, secure, and happy in her home, just as I want that for every American. And this is why I pay taxes to the community and not to a given project. Ms. Brossart just doesn’t understand what it means to live under common governance, and it is unfortunate that our civic education has let this attitude prevail.

Adam Smith told us that one of the necessary roles of government is to provide people with things to do. This helps educate the citizenry, keep people from being self-destructive, curb factionalism, and create public space for community cohesion. The community garden is precisely an example of a cohesive activity. This is also why Smith thought taxation was permissible and not theft, as some falsely claim. The question now is whether he was right, and whether I am right in my interpretation. Are taxes for services to ourselves, or are they to assist the community? Answering this philosophical question will go a long way toward solving some of the most divisive issues in our country today.

Wednesday, May 5, 2010

How do we evaluate evidence and belief?




Dave Barry was interviewed in this past Sunday’s New York Times Magazine. In the process, he described his pet peeve:

“One thing that is not in my fridge is ketchup and mustard. You know why? Because you don’t have to put them in the fridge! Too many Americans are putting cold ketchup on nice, hot hamburgers. And I ask those Americans, When you go to the diner, where is the ketchup? Sitting out on the table.”

Now, this made my wife and me laugh because it’s true: we have our condiments in the refrigerator, and it never occurred to either of us not to. Yet, Barry is right. We never see ketchup anywhere but on the table at restaurants. We joked about it for a bit, clearly resolved to take them out of the fridge, and then Kim got the bright idea to look at the labels. Lo and behold, each has very explicit instructions to refrigerate after opening.

I would be hesitant to suggest that we were committing the appeal to authority. We weren’t persuaded by the fact that it was Dave Barry who told us to move our ketchup and mustard; we were persuaded by the evidence. His argument seems irrefutable. Every day in restaurants, millions of people eat ketchup and mustard from the table, and they don’t get sick. How can that be wrong? One might argue that restaurant condiments get used much more quickly than those at home, and that over the long term it's best to refrigerate them, but this doesn’t seem to be the case. The point remains: it seems like you don’t have to keep them cool at all.

So, there is a philosophical problem here: not only whom do we believe, but how do we evaluate the argument? First, we can’t believe Dave Barry just because he is Dave Barry; that would be an appeal to authority. But we can’t believe the ketchup and mustard people just because they are the ketchup and mustard people. That would be an appeal to authority as well. Then again, the ketchup and mustard people presumably have some expertise that makes them more reliable than Dave Barry, so maybe believing them is not an appeal to authority after all. And yet the evidence that we have – years and years, case after case – verifies that Barry is right. Refrigeration is unnecessary. How do we determine whom to believe?

There must be some standard of objectivity to evaluate the evidence, and presumably, that evidence has to be larger than our experience. Is it enough that millions of restaurants continue the practice, or do the tests have to be conducted in a laboratory? And if the latter is true, does this discredit the experience that we each rely on day-to-day?

Obviously, the most basic question is where we should keep our condiments, but the deeper one is how we evaluate the evidence of arguments. What counts as enough to regard a proposition as true? Dave Barry, you opened a Pandora’s box, and I’m not making this up.

Saturday, May 1, 2010

How much does truth matter to political opinions?


This morning, a friend of mine sent me a link to this article documenting Rush Limbaugh’s recent comments about the water quality in Prince William Sound. Discussing oil spills, he claimed that the pristine water quality of the Sound is proof that we need not worry about cleaning water after such accidents. Nature, he asserted, cleans itself. Unfortunately, according to experts, "one of the most stunning revelations of Trustee Council-funded monitoring over the last 10 years is that Exxon Valdez oil persists in the environment and in places, is nearly as toxic as it was the first few weeks after the spill." So, Limbaugh was just wrong. (I remain agnostic on whether he was lying or just mistaken since I have no interest in either attacking or defending Limbaugh.)

Coincidentally, I just finished writing an article in which I reference last year’s debate about the morality of death panels, emphasizing that health care legislation never created death panels and that no one suggested they should. Add to these, the persistently false claims that President Obama never showed his birth certificate or that the one he showed was a forgery and countless other examples, and I am forced to ask whether truth is relevant to political opinion anymore. There are plenty of good reasons to be conservative, there are plenty of good reasons to be moderate or liberal, yet as a philosopher, I am always baffled by those who seek to win arguments through fabrication. If the goal of politics is simply to gain power, then what does that say about the role of honesty, character, and truth in modern democracies? (Perhaps, we should all read Thucydides’s The History of the Peloponnesian War together and consider his view on the role of power and the fickle public in Athenian democracy.) If one wins an argument by persuading people of falsehood, then I am Socratic enough to think that everyone involved lost and no one won anything.

There are plenty of psychological reasons why people make mistakes of judgment; many of them have to do with the human desire to preserve their own beliefs. A phenomenon called confirmation bias explains how people selectively register evidence for their point of view rather than notice facts that contradict their opinion. Simply put, if someone is of the political opinion that all Muslims are terrorists or that all terrorists are Muslim, then he or she will neither notice nor remember the news stories about non-Muslims engaging in terrorist acts. If someone is suspicious that his (heterosexual) wife is cheating on him, then he will be hyper-sensitive every time he sees her talking to a man but not register the many instances of her talking to a woman, even if she talks to women significantly more often than men.

However, confirmation bias concerns individual experience, not news reports that must be vetted by groups. The news reports I cited above are all the products of tremendously complex processes involving reporters, writers, producers, and others, all of whom acquiesce to the false claims in the story. Granted, maybe Limbaugh specifically acted without consultation, but if so, he is likely unique in doing so. In other words, I am arguing that to maintain such falsehoods the media must do so intentionally – I claim that they know they are lying and that they are okay with it. It is neither new nor noteworthy for me to observe that many news services preach to their respective choirs: there are conservative, moderate, and liberal news outlets, and they attract conservative, moderate, and liberal audiences respectively, although certain media labeled as conservative or liberal may not actually be so. (I’m thinking here of NPR specifically but also of the New York Times.)

The question I am now asking is whether truth has any relevance to political beliefs – at least as far as the media is concerned – and I ask you: what would it take for you to change news channels or even political allegiances? Does knowing that death panels were never proposed, or that Obama’s birth certificate has been viewed and confirmed, make a difference to those who don’t like Obama? If it doesn’t make a difference, why doesn’t it, and if you refuse to believe these facts, then why do you reject them? They’re true. Several years ago there was a widely believed e-mail documenting findings that George W. Bush had the lowest IQ of all American presidents and that Republican presidents in general had lower IQs than Democratic ones. Many people I know reveled in it even though the email was a hoax and no such study ever existed. Did knowing it was a hoax change their perception that Bush is stupid? No. So what, if anything, could change their minds about that?

In the end, I’m asking not just about human political psychology, but also about the role of truth in itself. Perhaps I’m becoming a crotchety old man (I can hear my wife in the background respond with “perhaps?!?!?!”), but I’m wondering if truth holds any place in public discourse or if, in the end, all anyone wants is to win or to believe that they are right. If so, is there anything we can do about it?

Thursday, April 29, 2010

When is the genetic fallacy not a fallacy?


Rachel Maddow has an interesting and compelling condemnation of the new immigration law in Arizona. The law states that the police have a legal obligation to stop anyone who looks like an illegal alien, and people can sue the police if this is not enforced. As many of you will know by now, this raises tremendous issues, including questions about racial profiling, unlawful search, constitutionality, and states making “foreign policy.” This, at least, is what the critics say. Maddow takes a different tack. She shows that the people who wrote and endorsed the law have ties to neo-Nazi groups, commitments to white supremacy, and generally racist motivations. You don’t have to watch her piece, but it is worth considering if you have the time:



Her argument brings up a question about the genetic fallacy. A fallacy is a common mistake in logical reasoning, and the “genetic” variation is committed when an individual condemns something because of its origin. So, if I refuse to consider an idea simply because it was suggested by a student, or if someone refuses to eat ice cream because it was first invented by Arabs or the British (aaahh, Wikipedia, is there anything you can’t tell us), then the genetic fallacy has been committed, because the worth of the idea or the taste of the ice cream is not dependent on its origin. (This is different from saying that an idea is true because it was put forth by the Pope, for example, since conservative Catholics regard the Pope as infallible. Under conservative Catholic theology, the origin – the messenger – is relevant to the truth. Anything the Pope asserts – under certain conditions – is by definition true.)

So, Maddow is suggesting that the law is immoral because it was written by neo-Nazis and racists. Does this, in fact, make the law so? When I first considered it, I regarded her attack as a classic example of the genetic fallacy. But the issue is a bit more complicated. Because the law involves issues of race and ethnicity, and because it is designed, at least according to the information she reveals, to cleanse Arizona of “fertile” non-whites, the origin of the law suddenly seems relevant. Her reporting helps reveal the intention behind the law, and questionable motives may very well indicate immorality.

On the other hand, the intentions behind the law would not necessarily condemn it if we are to judge the law by its consequences. If the law is effective, if it stops illegal immigration, if illegal immigration ought to be stopped, if the law doesn’t actually violate people’s civil rights, and if the law is indeed constitutional (this is a heavy list of conditions), then one might be able to argue that the law is moral even though its authors and original supporters are not.

I will refrain from stating my own opinion on the law. I ask you, as always, what you think, and wonder, not whether the law is a good or bad law per se, but if, in this case, it can be condemned because of its originators. Is the Arizona law immoral simply because those who wrote it hold immoral beliefs?

Sunday, April 11, 2010

Fighting Sioux Part 2: Is tradition a good in itself?



As one might imagine, the retirement of the UND Fighting Sioux nickname has gotten a lot of attention, and my last blog entry inspired numerous responses both on the blog and on Facebook. One of the most common is the argument from tradition – the team name represents a long-standing athletic tradition at UND and deserves to be kept in order to honor the history of excellence it represents.  
This argument can’t be dismissed outright because the logo itself is steeped in questions about tradition. Some claim that the logo is meant to honor the Indians after whom it is named, recognizing their achievements and culture. Others claim exactly the opposite, that it is a stereotype that degrades the Indians and dishonors their traditions. These are both positions that are well worn, and I won’t address the controversies here.
What I shall ask instead is the more basic question: is tradition a good in itself? Is “because we have always done things this way” a good reason to keep doing the same thing? The best defense of tradition comes from the father of modern conservatism, Edmund Burke, who argues that change is so destructive that, all else being equal, traditions should remain consistent. In law, this principle is known as stare decisis – judges have an obligation to obey precedents and respect “settled law.” Supreme Court justices who believe in respecting the original intent of the Constitution take this to an extreme. So, in short, conservatives seek to conserve (or preserve) the past – all else being equal, things should stay the same.
Liberals, on the other hand, believe that all else being equal things can change. They presume that change makes things better and that the positive effects of change (or, at least of trying new things) outweigh the negative impact of the change. Thus, liberal judges believe that legal interpretation should reflect current standards rather than original intent. Whether the liberals are more correct than the conservatives is a matter of great controversy, of course. Slavery would likely still exist under a purely conservative philosophy; inheritance might disappear under a purely liberal one. (See, for example, John Rawls’s argument against inheritance in A Theory of Justice.)  
I’m oversimplifying a great deal of philosophical and legal argument here, but my point is to get to the root question: is keeping the name “The Fighting Sioux” justified by the fact that the team has been so named for a long period of time? Is tradition worthy of defense in itself? Obviously, the place where this question is most explicit is in religion: anyone who is religiously observant willingly attaches themselves to tradition and regards that tradition as good. For example, conservative Catholics think Catholic policy should remain the same (in the late nineteenth century they would have rejected the notion that the Pope was infallible; now they would support the belief), while liberal Catholics want the system to evolve. Reform or Reconstructionist Jews change their liturgy and the meaning of their rituals to reflect discoveries about justice and science, while Orthodox and Hasidic Jews seek to return to an older time with an older philosophy. Every religion has this battle, from Buddhism to Islam to Zoroastrianism, and this battle has existed from religion’s inception. Feelings about sports in America are similar to feelings about religion in many ways, and it therefore does not surprise me that these days, at UND, we are, in essence, fighting a religious battle in the name of a logo and nickname.
So, again, here is the question: independent of questions of racism, representation, money, stereotype, or anything else, is the fact that the Fighting Sioux name represents a long tradition of athletics at UND a relevant factor in the debate about its continued existence? Is it a determining factor? And if it is, how are we to justify any change at the university, when change, by definition, involves a partial (if not complete) rejection of tradition?