Monday, December 30, 2013

Inflation and the Dragon

One of the hardest things for many people to grasp during the Great Recession has been the idea that inflation is too low. We generally talk about inflation as pure economic evil, something that could never possibly be too low. But it is.

If you say inflation is too low, some people will bring up the high inflation of the 1970s or, more hysterically, the hyperinflation in Weimar Germany during the rise of the Nazis as proof that Inflation Is Bad. But that doesn't really make sense. Inflation is bad when it gets too high, but that doesn't make a modest amount of inflation bad. The sun is bad in Death Valley when it's 130 degrees, but that doesn't make sunshine a universal menace. 15% inflation would be a very bad thing, but that doesn't mean 1.5% inflation is a good thing. 130 degrees Fahrenheit is murderous, but so is 13 degrees. A lot of our public debate about inflation is like trying to treat a case of frostbite while people keep shouting that heat is a terrible thing and then angrily tell you a long story about forest fires.

Some of the people warning against any inflation under any circumstances either should know better or actually do. They have various political or ideological motives. Some are under the spell of fringe economic theories, like Hayek's. Some are simply seeking short-term advantages for particular business interests, such as the banking sector, that benefit directly from low inflation although the wider economy might suffer. Some, including a healthy slice of libertarians, take their economic thinking from science-fiction or fantasy media and games. The enthusiasm in some quarters for the virtual currency Bitcoin is partly driven by genre-fiction economics. Bitcoin imitates gold to the degree that the process of creating it is called "mining" and there is a fixed maximum that can be generated, in imitation of the old gold standard, so that eventually the Bitcoin money supply will become inflexible and incapable of expansion. This will make Bitcoin immune to inflation (assuming anyone accepts it at face value), and in fact make the currency deflationary. Inflation and deflation are about how much money there is compared to how much stuff there is to buy with the money; when the money supply grows too fast, prices grow too fast. If the amount of goods and services money could buy kept growing, but the money supply didn't because all the money had already been created, as in the Bitcoin plan, then the existing money would become more and more valuable as prices kept dropping, as in the Great Depression. Bitcoin enthusiasts think this is a good idea, partly because they read books like Neal Stephenson's Cryptonomicon and partly because World of Warcraft has been on the gold standard for years.
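
To make that money-versus-stuff arithmetic concrete, here's a rough sketch in Python of the mechanism described above. The numbers (a hard-capped money stock, an economy that produces 3% more goods each year) are invented purely for illustration:

```python
# Toy model of the deflation mechanism described above: a hard-capped money
# supply in an economy whose real output keeps growing. All figures are
# invented for illustration; the crude assumption is that the price level
# tracks money relative to goods (quantity-theory style).

money_supply = 21_000_000   # fixed forever, in the spirit of Bitcoin's hard cap
real_output = 1_000_000     # goods and services available in year 0
output_growth = 0.03        # the economy makes 3% more stuff each year

for year in range(11):
    price_level = money_supply / real_output
    print(f"Year {year:2d}: price level = {price_level:.2f}")
    real_output *= (1 + output_growth)   # more goods next year, same money

# The price level falls every year. A money supply that cannot expand in a
# growing economy is deflationary: each coin buys more and more over time.
```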

So I'm going to stoop to the fantasy-example level. Let me use The Hobbit to illustrate the dangers of an inflation-free world.

Tolkien's world, like most fantasy worlds, seems to feature virtually no inflation. A piece of gold is a piece of gold, with value that never ebbs. (This kind of tidiness and solidity is part of the appeal to many digital goldbugs, who like fixed numbers and find the arbitrary and negotiable nature of money unsettling.) In fact, Tolkien's world is probably deflationary, in that ancient treasures seem only to appreciate in value. Treasure just gets more precious with time because, as in most heroic fantasy set in an idealized pre-industrial world, there is virtually no economic progress.

The Hobbit of course features a dragon, Smaug, who is sitting on a vast hoard of gold and jewels which represents basically the entire money supply for several hundred square miles. Smaug is quite literally wallowing in his wealth. He has made a big pile of it and is sleeping with his belly on it, while everything else around him for miles and miles is a wasteland. This is all sensible enough draconian behavior because there is no inflation, and therefore Smaug has nothing to lose.

In fact, a deflationary world is excellent for Smaug. The money underneath his scaly belly only gains in value as he naps. If prices in the rest of the economy keep falling, then Smaug's gold will actually buy more this year than it would have last year, and buy more next year than it would this year. He doesn't have to worry about investing his money, or making more, because the money he has keeps gaining in value. The rich get richer by doing nothing.
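
Here's that compounding as back-of-the-envelope arithmetic, with invented numbers: if prices fall a few percent a year, a hoard that never moves buys more every year it sits there.

```python
# Purchasing power of an idle hoard under steady deflation.
# The hoard size and the 3% deflation rate are invented for illustration.

hoard = 1_000_000      # gold pieces under Smaug's belly, never spent or invested
deflation = 0.03       # prices fall 3% a year
price_level = 1.0

for year in range(11):
    purchasing_power = hoard / price_level
    print(f"Year {year:2d}: the hoard buys {purchasing_power:,.0f} units of goods")
    price_level *= (1 - deflation)   # everything is cheaper next year

# After ten years the same pile of gold buys roughly a third more than it
# did at the start. Smaug gets richer in real terms by taking a nap.
```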

But this is the problem. Deflation creates an incentive not to invest money, and not to spend it. So that money, and the economic activity it could fuel, gets sucked out of the economy. In deflation, you should never buy anything before you have to, because it will get cheaper the longer you wait. And you don't need to bother investing, because money just gains value by sitting there on the floor. Deflation rewards you for becoming, in the most literal sense, a hoarder. Maybe all that saving sounds virtuous. But if no one ever buys anything, then no one makes any money either. And if no one invests their money, no new businesses can grow. In fact, there is no new money; there's just the old money that gets more and more valuable while everyone else becomes poorer and poorer.

And so the area around Smaug is a wasteland, not simply because he's set it on fire at one point but because no one else can make any money or do any business. Nobody mines any more gold, or works gold into objects. Nobody grows any food. Respectable hobbits turn to lives of crime. No business can take place, because there is no capital. Capital is an accumulation of resources set aside for further investment; money that just gets piled up in a cave for years is not capital. And in fact, Smaug could only burn the area down because he had no further economic need for it. He'd grabbed all of the existing wealth and had no interest in anyone creating more, because his wealth would grow in value by itself. The Desolation of Smaug is actually the Depression of Smaug. And it's the platonic ideal of a deflationary economy: an enormous hoard of money with virtually no goods or services worth buying.

But let's imagine the basic economic conditions changing just a little. Let's say that Mirkwood, Long Lake, and the areas to their east actually have an annual inflation rate of, say, 5%. Now Smaug is still enormously wealthy with his ill-gotten gold, but he's not actually getting richer. In fact, he's getting a little poorer every year he holds onto that gold without doing anything with it. Its value is slowly leaking away. This sounds terrible and unfair to some people, who respond by inventing dumb things like Bitcoin, but in fact this leakage moves people to more economically virtuous behavior.

What is a dragon to do? He could just be satisfied with his diminishing net worth, but let's face it: he got where he is because of his overpowering greed. So he has to do something. The only thing to do is to make more money. And the quickest way to do that is to leverage the money he has. If inflation is slowly eroding the value of Smaug's gold, Smaug needs to invest his gold for a rate of return higher than inflation. 

So Smaug, with 5% inflation nibbling at his tail, wants to make a 7% to 10% annual return on his gold. Let's say he hires some dwarves, Thorin and Company, to reopen the mining shafts in the Lonely Mountain and to work new gold into new, value-added cups, rings, and whatnot. He tries to sell off some of his existing inventory of goldsmithery to the local Elvenking, or to the men of Long Lake, in exchange for other investments. Naturally, the dwarves don't work for free, and neither men nor elves willingly make deals that lose them money. Smaug has to work out arrangements that are profitable for everybody, so that Thorin et al. make enough to keep them motivated while Smaug nets the 7%-10% he's looking for. And suddenly, we have capitalism. The gold is no longer piled up doing nothing, but actively fueling more enterprise; it has become capital. (The "saving" Smaug indulged in under the other scenario may sound virtuous to those who equate saving and virtue, but it is literally the least capitalist behavior possible.)
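
The arithmetic from Smaug's side of the ledger, again as a rough sketch with illustrative numbers: under the 5% inflation assumed above, gold left in the pile loses real value, while gold invested at the low end of his 7%-10% target comes out ahead.

```python
# Real (inflation-adjusted) value of the hoard after ten years of 5% inflation,
# idle versus invested. The 5% and 7% rates come from the scenario above;
# the hoard size is invented for illustration.

hoard = 1_000_000
inflation = 0.05
nominal_return = 0.07    # the low end of Smaug's 7%-10% target

idle = hoard             # gold that stays in the pile
invested = hoard         # gold put to work with Thorin and Company
price_level = 1.0

for _ in range(10):
    invested *= (1 + nominal_return)
    price_level *= (1 + inflation)

print(f"Idle hoard, real value after 10 years:     {idle / price_level:,.0f}")
print(f"Invested hoard, real value after 10 years: {invested / price_level:,.0f}")

# The idle pile has lost almost 40% of its real value; the invested one has
# gained about 2% a year in real terms (roughly 1.07/1.05 - 1), which is why
# even a dragon is pushed toward putting his capital to work.
```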

Now, Smaug's various employees, partners, and customers are also facing 5% inflation, so they are also going to want to build their money into more money by investing in new things. And they also have to eat, so some of their wages and profits are going to be consumed. But money someone spends is money someone else earns. The area around the Lonely Mountain will have to become less lonely, because all of those people are going to need places to eat, sleep, buy new shoes, and so on. Bilbo Baggins moves to town and starts selling everyone second breakfast. And Smaug needs all that to happen, because his business can't survive without those things around. He's not going to burn it down again. Instead, his gold is going to circulate out into the community, through many hands, and fuel growth. Pretty soon, you have a bustling Lonely Mountain Economic Zone.

And in fact, this is pretty much the happy ending in Tolkien; once the hoard gets broken up and distributed into many different hands, rather than re-hoarded by Thorin, peace, love, and commercial industry abound.

Of course, if inflation gets too high, the economy suffers. If inflation is devaluing your money faster than you can make it, the economic incentives break down pretty seriously. But deflation also wrecks the incentives and ruins the economic system. Inflation in moderate doses provides a compelling reason to make more money from your money, and money making more money is what makes the economic world go round. Moderate inflation is good for nearly everyone. Deflation is strictly for dragons.

cross-posted from Dagblog

Tuesday, December 24, 2013

Eating the Turkey Soup: A Christmas Story

One December when my brother and I were around ten and twelve years old, our mother enlisted us in a holiday good deed she was doing. She wouldn't tell us who we were doing it for, and after we got caught up in our task itself we stopped wondering. When we were finished, we went back to thinking about other things. But on the afternoon of Christmas Eve someone came by our house with a pot of turkey soup to thank our mother, and we realized who we'd been doing that small good deed for.

The first lesson from that moment was immediate and overwhelming: we didn't think it so much as feel it, like a powerful physical reflex. We both knew right away that we were never talking about this again to anyone, ever: not even to each other. We were sorry that we knew. (If the version of the story I told above is vague and lacking in detail, well, good.) People are entitled to their privacy and their dignity. I could not have explained that in words, then, but I understood. And I remembered that lesson later, not because I talked about it, but because it provided the example that kept me from speaking about other things when I should not.

The other lesson I learned that week came more slowly, and I tried to refuse it. My brother and I did not want to eat the turkey soup.

We didn't especially like the soup. We preferred our mother's chicken soup (still a gold standard as far as I'm concerned), and there was no shortage of things to eat in our parents' house during the holidays. And anyway, the person who had given us the soup would never know whether or not we ate it.

But our parents insisted. Of course, we had to eat the soup. It was not because we liked it, or needed it ourselves. It was not because the person who had given it to us would know. It was because we would know. Eating that soup was part of our obligation.

I was nowhere close to understanding this then, but that turkey soup taught me the difficult lesson of gratitude. And I owed the person who gave us the soup gratitude. I owed that person the opportunity to give something back. That wasn't merely a social obligation; it could not be satisfied through a polite pretense. It was a moral obligation and it had to be made real. I really had to eat the soup. I had to take it into my body, and accept the gift honestly.

It is better to give than to receive, they say. But you do an unkindness when you do not allow someone else to give to you. And when the person doesn't have much to give, you do them wrong to refuse their gift or deny them gratitude. It's not always an obvious lesson, and I didn't find it an easy one, but I am grateful to the person who taught it to me.

Happy holidays.

cross-posted from Dagblog

Wednesday, December 18, 2013

In Praise of the Late Term Paper

It's that time of year again, or actually one of the two times each year, when semesters end and bleary-eyed college professors scale mountains of ungraded papers and exams. One of my friends claims that he can track the academic calendar by the crescendo of professors griping on Facebook and Twitter about bad papers, worse excuses, and outrageous examples of student entitlement. Some of this is necessary foxhole camaraderie, some of it verges on the unprofessional, and some does a lot more than verge. Too many lame papers and excuses will put most people in an ugly mood. But I want to give two cheers to one group of students who never get any love at this time of year: the students whose papers are late because they take the assignments seriously.

I like an on-time paper as much as the next person. Meeting deadlines is an important adult skill that students should be learning. Of course, I admire the excellent students who always do their best work by the stated deadline. That is intrinsically admirable. And when every student is late, it becomes impossible to help any of them; the greatest obstacle to rescuing students from their last-minute emergencies is the sheer number of other last-minute student emergencies.

But all that said:

I've read some papers in my time that should have been late. I have read papers that have been turned in on the due date or earlier but that the writer hadn't even begun to work on seriously. Oh, those papers were presentable enough. They weren't full of comical errors. There was nothing to quote on Facebook. The margins were correct. But the papers were nothing. The writers had done as little work on them as they felt they could get away with, and avoided most of all the labor of thinking hard about anything.

Some of those papers would have been good papers at a lower level. The writers just stuck with what had worked before, handing in a polished introductory-class paper in an advanced class, or a meticulous high-school paper in a college class. Faced with the problem of an assignment that explicitly demanded a rather different paper, some worked tirelessly to misconstrue the assignment and find some loophole that would justify writing a simpler, more familiar paper. And then, hoping for extra points, the writers handed those easier pieces of writing in early. They preferred to be judged on promptness rather than thoughtfulness, and many of them reasoned that there was no more room to improve their essays, so spending a few more days wouldn't help. The saddest part is, they were right. They had set themselves elementary writing tasks, using skills they mastered years before, and executed those tasks well. It is like watching a high school senior filling in a coloring book, or listening to a forty-five-year-old playing "Chopsticks" on the piano. There is no way to do those tasks better, which is why I did not assign those tasks in the first place.

Those are the most demoralizing papers that I read. The mess and chaos of students trying to write something that they are not yet quite capable of bringing off does not bother me. But the orderly, sealed-off neatness of a paper that refuses to learn or grow makes me ask myself what I'm doing in the first place. That refusal is polite but insistent and unbendable. And sometimes the only thing that breaks through that stubborn insistence is a grade that makes the student upset.

On the other hand, some of the students who do accept the assignment and try to do it honestly find themselves struggling. They are trying to work out new skills, in response to new demands, and that doesn't happen on a predictable timeline. The work is messy. Progress is non-linear. So sometimes the deadline rolls around while the student is still up to her or his elbows in wet clay, trying to find the piece's shape. Those students aren't late because they're lazy. They're late because they are working hard. Giving them a few extra days to complete an assignment is productive, because they will use that time productively. Their papers will genuinely be much better a few days after the deadline than they could have been on the appointed day. An extension leads to a better product.

Not that every student who needs such an extension will ask for one. Some do not feel entitled to one, and some students will simply abandon an entire class in despair because they don't have a paper written on time. Of course, the same class will contain some squeaky wheels who are trying to get themselves as greasy as possible, and who will have no qualms about asking for all kinds of special arrangements. Some of the more demanding students prompt eye-rolls, but the only real harm they do is distract the professor from the students who are suffering in silence. It's important to shake your head clear at the end of the semester and look for the students who are in danger because they haven't asked you for anything. Many times, those students are the ones who generally enjoy less privilege in their daily lives: more likely to be the first in the family to go to college, more likely to have gone to a troubled high school, more likely to find tuition a major burden. Those students don't expect to get any breaks because they usually haven't gotten any. They read the rules in your syllabus, which some of their more affluent classmates simply view as initial negotiating positions, and take those rules seriously. If they can't meet a deadline, they just assume they're done for, because that's consistent with their previous experience. The only way to persuade them differently is to show them differently, and you can't wait for them to come to you.

Saturday, December 07, 2013

Keeping Christmas at Home

Last Sunday was the first day of Advent, which means in the most traditional sense possible the beginning of the Christmas season. Of course, Retail Christmas Season began five minutes after Halloween ended, prompting me to some bleak reflections in my last post. But the truth is, I love Christmas, no matter how much this year's commercial display may be getting me down. Last Saturday I bought a wreath and a bunch of assorted greenery. My spouse made an Advent wreath from some of it, and decorations from the rest. Christmas lights frame our living room window, and I've got some nice holiday jazz on the stereo. I enjoy this holiday a lot. But the Christmas-Industrial Complex defeats the season's purpose.

This is not a post about the meaning of Christmas. "The Meaning of Christmas" is one of those phrases that has been used in so many slippery ways that it has become hard to use straightforwardly, even with the best of intentions. This is a post about the uses of Christmas.

Christmas belongs to you. It does not belong to The Man, whoever The Man might be. It is not meant for you to work harder on your boss's behalf. It is not meant for you to spend more in order to make Wall Street happy. It is meant for you to refresh yourself, physically, spiritually, and emotionally, using whichever form of refreshment you need. What it was never, ever meant to be was an excuse for the Powers That Be to squeeze more work or more money out of you.

Like every medieval holiday, Christmas began as a break from daily labor, an interruption in the grind of ordinary time. Of course that break had an official religious purpose; every holiday was originally a holy day. Religious observance was the only acceptable excuse for a day when you didn't work. And of course, people did a lot of other things besides praying and churchgoing on those official holy days. Calling the "true meaning of Christmas" religious is not the whole truth, because the religious celebration (no matter how genuine, or how personally important to me) has never been the whole story.

In a comedy by one of Shakespeare's contemporaries, a character who's been tricked into marrying a prostitute (which is considered hilarious) consoles himself:

Marry a harlot, why not? . . . if none should be married but those who are honest [i.e. chaste], where should a man seek a wife after Christmas?

17th century Christmas was not a festival of childhood innocence. It was a holiday for adults to feast, drink, and make merry in adult ways, like a snowier Mardi Gras. (This is one of the reasons that the Puritans despised it, and did not celebrate it in early New England.) If it seems odd to have a holiday from the church calendar celebrated with drinking and sex, remember that Mardi Gras comes from the church calendar, too.

And to say that the "meaning of Christmas" is "the children" ignores most of the holiday's history. The 19th century re-invented Christmas as a family holiday centered on kids, but since Christmas has been celebrated in various ways since the 4th century AD, that's about 1500 years of Christmas not being about the children. (It was also the Victorians who turned Christmas into a shopper's festival to boost end-of-the-year consumption. The focus on children and the materialism arrived hand in hand.)

So I'm not going to tell you that modern Christmas is too secular or too materialistic. It has always been secular and excessive, just as it has always been spiritual and reflective. It has never been one or the other. The meditation and the merry-making are part of the same thing.

Christmas is a break from ordinary time: an annual pause. It is a day to suspend our everyday concerns and our anxious labor. That has been its use and purpose for more than a thousand years. Those who have no interest in its religious symbolism can still profit from that break. The body needs it. The mind and the soul need it. And it comes at the most necessary time, the darkest hour of the year, when nature, including our own nature, cries out for a rest and a respite. It is an annual snow day, scheduled in advance because we all need it.

There is nothing wrong with buying things for Christmas. There is something wrong with the pressure to buy things for Christmas, with the relentless campaign to make Yuletide consumers prop up our sagging economy, with the annual barbaric spectacle of Black Friday. There is something wrong with turning an annual day of rest into a fount of stress, competition, and financial anxiety. Christmas is meant to be a step out of our daily grind. This is true whether you spend the day drinking, giving gifts, or singing hymns: they are all faces of the same fundamental, necessary break from the everyday. And that purpose can be served by December traditions that have nothing to do with Christianity. The traditional Manhattan Chinese-restaurant-and-movie combination is a seasonal respite par excellence, specifically designed by and for non-Christians.

What Christmas was never meant to be is an intensification of the everyday routine, the rat race writ large. And that, no matter how many sentimental pieties it comes wrapped in, is deadly. It is inhumane at its heart. It takes the bleak midwinter and makes it bleaker.

Take a break this December. Your body and your spirit need it. Celebrate the official holiday any way you like, even if Jesus isn't involved. (It's not his actual birthday; he won't mind.) Some of you will have your batteries recharged by connecting with family. Some will feel better after a little quiet prayer. Some will benefit most from a new sweater and a stiff egg nog. But whatever it is, make sure you end the season more rather than less rested. The holiday is meant to nourish you, not the other way around. And there's a lot of winter left to go. This is the moment to turn on some lights and relax.

cross-posted from Dagblog


Saturday, November 30, 2013

The Long, Cold Christmas

My morning commute these days takes me through a shopping center; the train lets me off underneath it. It's been Christmas in the mall since the first day of November. That's no surprise. Christmas has become the crutch our retail economy leans on. Many stores run in the red for eleven months and see Christmas put them in the black for the year. A bad year calls for a big Christmas, and a string of bad years calls for bigger and bigger Christmases. If shoppers don't keep finding more and more money for Christmas presents, the whole economy shrinks. It doesn't sound sustainable, but I don't blame local merchants for wanting to start Christmas early and hoping to extend that sweet jolt of retail steroid. We are out of other ideas.

Part of the shopping center was once a department store, famous in our area and featured in a well-known Christmas movie. People in this city are enormously proud of that movie, and proud that it was shot here. I have been urged, while standing in checkout lines, to visit the house where many of the film's family scenes were filmed. We don't have too many other things to be proud of. But the store shown in the movie has not been a department store for many years; it closed before I came to this city. Still, the store windows have traditional Christmas displays, all Dickensian scenery and nodding, grinning mechanized dolls. We will go through the motions of Yuletide joy. But there are no children's toys in the space beyond those windows. Last year the old department store became a casino.

The casino's advertising mixes with the lights and Christmas wreaths and the carols piped onto the sidewalk. Gamblers stand outside, smoking. It's hard to see anyone who looks like a high roller. Most of the gamblers seem to be lower-middle-class, hard-working people who have come to gamble. They are not on vacation, and this is not a resort. The casino's posters are pathetic come-ons, designed to flatter the gamblers that they are daring and glamorous: "Risk Takers Wanted" and "We Turn Gamblers Into Legends." I never see anyone who looks like a legend, and I find the posters sad. No one should be played for a sucker so plainly and publicly. Nobody's self-deceptions should be used to con them in the public square.

We have been voting on whether to allow this casino ever since I arrived in this city. It feels like I have voted against it every two or four years. The casino was always voted down until 2012. The long, sluggish recession finally made it impossible for the rest of the voters to say no. We had to do something, anything, to bring some money back to downtown. We are out of other ideas.

Christmas is a cold season, and it gets longer every year. The only part of Christmas that never starts any earlier is charity. The lights are on commercial trees and the seasonal music is on the radio long before the Salvation Army appears. People ask you to spend early and often. They're slower about asking to give. Christmas charity still starts on the old schedule, reflecting older values. As for the rest, the bleaker things get, the more Christmas we all seem to need: more shiny lights and temporary discounts, more glossy commercials and easy sentiment. The harder it gets for everyone, the more we are prodded to celebrate and to spend. We watch endless fantasies of bleak, Dickensian London, and those not making sufficiently merry are told not to be like Scrooge. Most Americans are in no danger of being like Scrooge; saving too much of our enormous wealth is not really our big problem. We may be in a remake of A Christmas Carol, but we're not playing the lead. Instead it's the Bob Cratchits who are asked to be the founders of their employers' feasts, to spend money they don't have, to work themselves a year older but find themselves not an hour richer, all to leave an annual profit in the boss's stocking. Dickens's Scrooge finally vows to keep Christmas all year round; our Scrooges would probably do the same thing if they dared.

cross-posted from Dagblog

Monday, November 18, 2013

Jonathan Martin Does Not Need Your Nonsense

A few weeks ago, an NFL player named Jonathan Martin, offensive left tackle for the Miami Dolphins, walked off the team and sought counseling for emotional health issues. This has led to the suspension of his teammate, the incongruously named Richie Incognito, on charges of outlandish workplace harassment; an official NFL investigation into the team, now extending to the coaches' behavior; and the kind of publicity you just can't buy. Plenty of NFL players, sports pundits, and armchair tough guys have denounced the 6'5", 312-pound Martin as soft and weak and proclaimed that outsiders just can't understand what goes on in an NFL locker room. While I don't accept that locker rooms are sacred spaces that outsiders have no right to criticize, it is true that we don't have a great sense of what is going on inside them. Maybe Miami's locker room is unusually dysfunctional, and maybe it's dysfunctional in ways that every NFL locker room is. But in either case, Martin may not be any weaker than the other players. He may simply have more options.

Martin played for Stanford and majored in classical history. He's a 6'5" behemoth who's fluent in Latin. His parents are attorneys who met at Harvard, and he turned down a chance to be a fourth-generation Harvard student in order to play for nationally-ranked Stanford. (It's worth noting that Martin is bi-racial and the longest Harvard legacy in his family is on the African-American side, going back to a great-grandfather in the 1920s.)

Those facts may lend themselves to a familiar narrative about the privileged kid who isn't used to doing things the hard way. But playing football for Stanford is not really the easy way. Martin played on a top-ten, nationally ranked team; that team could not have competed that successfully unless the football training was extremely rigorous. Playing football for Harvard, on the other hand, would be much easier. The football part of playing football for Stanford isn't significantly different than playing football for Nebraska, Mississippi, or UCLA. What is different is that Stanford football players (and excuse my pride here) also have an extremely rigorous academic program, comparable to any Ivy League school's. At Stanford, football practice is like football practice at Nebraska and class is like class at Harvard. Which part of that sounds like the easy way?

But Martin's background does give him options that many other NFL players don't have. I don't think Martin is the only player who's felt like he can't take the abuse any more. The difference is that most other players have to take it anyway, even when they can't. Where else are they going to go?

If Jonathan Martin never plays football again, he will certainly not see NFL-style money for many years, and possibly never earn as much over the course of his working life. The NFL is paying him seven figures a year. But the total sacrifice of lifetime income may actually not be that large, and there is a real possibility that over the next thirty years Martin could make as much outside the game as he would if he stays in the NFL. He is one of the players most likely to replace his football income by other means. That money would come slower and be less glamorous. But if he went back to Stanford to finish his classics degree and then went into law or finance, he might make upper-middle-class money or better very quickly, and potentially pull down seven figures a year near the end of his career. Instead of making millions for a few years in his twenties, and then some lesser income, he could make a lesser income and then a huge one. Or he could decide that making a six-figure salary in a workplace where no one calls him a "half nigger," talks about "shit[ting] in his mouth," threatens to gang-rape his sister, or shakes him down for $15,000 is better than making millions while being relentlessly extorted and degraded. Jonathan Martin can afford, in the most literal sense, not to put up with that shit.
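
For what it's worth, the lifetime arithmetic is easy to sketch. Every number below is hypothetical (nobody outside Martin's camp knows the real figures); the only point is that a front-loaded football income and a slower-building professional income can land in the same neighborhood over thirty years.

```python
# Two hypothetical 30-year earning paths. All figures are invented purely to
# illustrate the shape of the comparison; they are not Martin's actual salary
# or prospects.

def football_path():
    # A short NFL career at seven figures, then a modest income after football.
    return [2_000_000] * 4 + [150_000] * 26

def professional_path():
    # A slower start in law or finance, with income growing over a long career.
    salary, growth, earnings = 150_000, 0.06, []
    for _ in range(30):
        earnings.append(salary)
        salary *= (1 + growth)
    return earnings

print(f"Football path, 30-year total:     {sum(football_path()):,.0f}")
print(f"Professional path, 30-year total: {sum(professional_path()):,.0f}")

# With these made-up numbers the two totals come out roughly even, which is
# the point above: the real choice is not millions versus nothing.
```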

The average NFL player, on the other hand, does in fact need to take that shit. A great many NFL players cannot afford to walk away. They do not have the choice between making their NFL salaries and making a dignified living as an affluent professional somewhere else. They have the choice between making millions of dollars, no matter the cost to them personally, and making working-class money. If they left the game, many of them would simply be large men without college degrees. For many, their partial college educations were not a genuine preparation for any other field. For most players, there is no other plan. It's the NFL or nothing.

Nothing is more human than resenting someone who is free to walk away from abuse that you have to accept. It isn't reasonable or fair, but it is very, very human. If you have to take as much as your coaches and teammates dish out without showing that it bothers you, and you've developed whatever coping strategies you need to keep accepting that mistreatment, there's no one you hate more than the guy who doesn't have to take it. You can't afford to hate the people who actually beat up on your body and your mind. You have to tell yourself you don't. So there's lots of resentment waiting to be transferred to the guy who decides he's too good to take the abuse that you have to take. What makes him special? It isn't fair. And actually, it isn't, although it's not the guy who's refusing the mistreatment who's to blame.

And let's be very clear: people defending Incognito's behavior or denigrating Martin's have talked about the hazing and bullying, always in vague generalizations, as necessary "toughening." But when you get down to the specific behaviors involved, it's never about making the younger players tougher. It's about making them compliant and docile to authority, beating them down and forcing them to accept whatever abuse any petty tyrant in the locker room decides to dish out. Being "tough" in this context means being a better victim. No one who demands that you give them $15,000 so they can take a trip to Las Vegas is doing that to build your character. And forking over the money does not make you strong. It makes you a chump. That is what the hazing in Miami's locker room was designed to do: not to make the younger players strong, but to break their will, to make them pushovers, terrified to miss a "voluntary" extra practice, afraid to do anything that would displease a coach.

Most of our public conversation about the NFL lately has been about concussions and brain damage. But we cannot have any serious conversation about player health when the players themselves have been cowed this way. An NFL culture that prizes obedience above anything else, that demands players accept any abuse they receive in silence, can never protect its players' health. The whole system is designed to destroy them.

cross-posted from Dagblog

Monday, October 28, 2013

Dead Man's Name Tag

I've been away at an academic conference for nearly a week, leaving blog posts unfinished, e-mail unanswered, and campus office untenanted. I had a wonderful time with a bunch of scholars and actors at the American Shakespeare Center's reproduction of Shakespeare's Blackfriars playhouse. (If you'd like to see some excellent theater, a trip to see the ASC's company in Staunton, Virginia, is a great idea.) But I also bumped up against a small problem that's begun to follow me wherever I go professionally: the problem of my (real) name.

I am not the only Shakespeare scholar with my name. That's not surprising. My real name is quite common, not "John Smith" but a dirt-common first name with a vanilla-ethnic surname, so I bump into nominal doppelgangers all the time. I've gotten other people's phone calls and in one memorable case another person's subpoena. For years I went to a father-son barber shop, right across the street from my workplace, where both barbers shared my name and the younger barber, the one who cut my hair, even shared my middle name. So I wasn't surprised to discover that there was an older Shakespearean with my name, someone who began university teaching while I was still in grade school. And I knew what to do about it.

I have always been careful to use my middle initial in my academic byline, especially in my publications but also when registering for conferences, so that conference programs and name tags identify me as "Cosimo P. Cleveland." This is the simplest (and likely the only) way to differentiate myself from the earlier scholar who published as plain "Cosimo Cleveland" or, occasionally, "Cosimo T. Cleveland." Using the middle initial feels a little fussy and overly-formal, and it wouldn't be my preference in an ideal world. I certainly don't introduce myself with my middle initial when I'm shaking hands; in fact, I use a less formal nickname, equivalent to "Cozmo" or "Coz." But leaving the initial out of my byline would be sloppy and disrespectful. It would also verge on filial disrespect for my own father, a published spy novelist whom we can call "Cosimo C. Cleveland," but there's not much chance my work will be confused with Dad's; books by that Cosimo Cleveland tend to be about fearless Mossad agents and books by this one tend to be about Elizabethan actors marking up their scripts. In any case, the middle initial is on my book and on all my published articles so far, so there's no changing it. My byline is now my byline.

But when I go out in the professional world, that fussy little middle initial has been increasingly dropping away. The problem isn't that people confuse me with the other Cosimo Cleveland. The problem is that he's getting erased from history.

Cosimo T., who got his first job teaching college a quarter-century before I got mine, published much less than I have. This isn't a reflection on him, and certainly isn't a reflection on me, but an indication of how much our profession changed between his generation and mine. Professors hired in the seventies did not publish nearly as much as professors must today, because professors then were not expected to publish the way we are today. I published more articles before I got tenure than Cosimo T. published in his entire career, not because I am smarter or more industrious than Cosimo T. was, but because one of the requirements for me to get tenure in the 21st century was to publish more than Cosimo T., who taught at a somewhat better school than I, published over his three-decade career. If I hadn't out-published him by the end of year five, I would have been fired. This is true everywhere. All academics of my generation have to produce much more research than the older generation did. Those are just the facts of our business.

And, unlike Cosimo T., I have published a book.  That's partly about generational expectations as well. But it means, inevitably, that more people have heard of the younger Cosimo than of the elder. So they don't necessarily see the point of my fussy little middle initial, except as something pointlessly fussy. They don't see it as differentiating me from the other Cosimo Cleveland, because they don't know there ever was another Cosimo Cleveland.

So I unfailingly send in my conference registration paperwork as "Cosimo P.," but sometimes I show up and open my program to find "Cosimo Cleveland" on the schedule. Not always, of course, but twice in the last month. And I get a name tag identifying me as simply "Cosimo," so that I have walked around a conference hotel for a long weekend wearing a retired man's name, and now, I have come to fear, a dead man's name. I googled the other Cosimo Cleveland this morning and saw an ambiguous reference to his death. But when I search "Cosimo Cleveland Shakespeare obituary" Google just gives me a bunch of links that refer to me. "You've totally eclipsed that guy," one of my conference friends told me six months ago when I explained this problem. "Everyone who talks about 'Cosimo Cleveland' means you." But being part of someone else's eclipse, even unintentionally, is not a good feeling.

And while I can usually insist on the middle initial in print, I can't make people remember it when they cite my work. It's very common to leave out middle initials by accident, or to misremember the initial, when writing footnotes. I've made both mistakes myself, meaning no disrespect to the scholars I was quoting; before I had any work of my own to footnote, I had not thought about why people might prefer a specific form of their names. (I apologize to those scholars here, and will happily do so again in person.) There's no great conspiracy at work, but the error is clearly related to the earlier Cosimo falling out of academic memory. If my name were Stephen X. Greenblatt, I would not be having this problem.

There's no way to insist on the middle initial or to make any kind of fuss when it gets dropped. That wouldn't revive Cosimo T.'s reputation, but would surely give Cosimo P. one. All I can do is to scrupulously use that initial myself, because the fact of that earlier career deserves to be acknowledged. History is part of my work. If I spend much of my time trying to retrieve lost details four centuries gone, I should not consent to forgetting the recent history of my own guild. And as someone who will die someday, I find it sad to see another person's life being forgotten. Devouring time may blunt the lion's jaws, but it is also devouring the memory of one Cosimo Cleveland, a dedicated teacher and scholar, and in due course it will come for me. There isn't even malice involved. People simply stop knowing. The name tag hanging around my neck in this or that hotel ballroom gives no testimony to that earlier Cosimo's work or life, but I have to read it as a reminder: memento mori.

cross-posted from Dagblog

Monday, October 14, 2013

A Plague on False Centrists

“A plague on both houses!” I've seen that line from Romeo and Juliet quoted repeatedly for the last two weeks, as pundits and bloggers devoted to “balance” argue that the Democrats and Republicans share the blame for the current government shutdown and the looming threat of default. The line itself is a cliche, but quoting Shakespeare makes you sound learned, and that is too often the major aim of both-sides-do-it journalism: making the journalist seem wise and above the inconvenient facts of the fray. Shakespeare was a poet, not a pundit, more interested in dramatic complexity than sound bites, but if we’re going to mine his plays for lessons, we should remember what we’re quoting. Saying “a plague on both your houses” does not solve political conflict in Romeo and Juliet or in the real world. It accelerates a destructive feud, and it's meant to. Those who curse both houses are not trying to make peace. They are egging on a brawl.

The line “a plague on both your houses” is of course spoken by Romeo’s friend Mercutio, mortally wounded in a street duel. Taken out of context, it sounds like an accusation that the Montague and Capulet families are both equally violent and equally blameworthy. But the audience can see for itself that this is not true. Romeo steadily refuses to fight. He and the other Montague on stage, his cousin Benvolio, work tirelessly to head off the duel, and when that fails Romeo physically throws himself between the fighters to stop them. Romeo and Juliet is a play about dangerous civil disorder, and the leaders of the Montague and Capulet houses do share the blame for disrupting the peace. The senior leaders who should rein in each house’s servants and young men, the clowns and hotheads, instead actually encourage aggression. But Mercutio pretends that there is no difference between the play’s most violent characters and its handful of peacemakers, no difference between starting a fight and trying to stop it. His pretense of neutrality is worse than an empty pose; it actively promotes violent conflict.

Mercutio deliberately goads one of Juliet’s cousins to combat, and when Romeo refuses to be baited Mercutio jumps in to start the fray. He not only wants a fight; he insists. “A plague on both your houses” is the cry of a man killed with a sword in his hand. It is meant to spur Romeo to further bloodshed, accusing him of causing the death he tried to prevent, and it succeeds. Every death in Romeo and Juliet comes after the "plague on both houses" line. Blaming both sides equally undermines the peacemakers but empowers the hostile and unreasonable, freeing them from any public responsibility. If every time you kill a Montague (or a Capulet), people shake their heads and blame both the Montagues and Capulets, you might as well kill as many Montagues or Capulets as you can; your victims split the blame with you. And if you have to take half the blame every time you try to make peace and get attacked with a sword instead, you will never manage to make peace. Apportioning half the guilt for every crime to the criminal and half to the criminal’s sworn enemy is not an act of moderation. It promotes and rewards extremism.

The American media has unerringly taken the side of confrontation and brinksmanship and discord, through years of mounting political disputes and manufactured crises. The press has done this while pretending neutrality with sad, wise shakes of the head, lamenting the unreasonableness of both sides. That head-shaking is not neutrality. It is active intervention on behalf of the unreasonable. Unprecedented acts of obstructionism are treated as routine tactics. Partisans abusing the legislative process to extract concessions are awarded the same stature and coverage given to national leaders seeking compromise. Worse, those who work for conciliation and those who work against it are portrayed as equally partisan, as if deal-making and deal-breaking were simply two sides of the same coin. Individual journalists may be liberal or conservative, but the political media itself is clearly biased toward confrontation: indifferent to policy results but hungry for drama, always looking for more juicy showdowns and shutdowns and crises. So the press has played Mercutio, standing in the street hoping to see a fight, scoring the “winners” and “losers” of every pointless showdown. The media poses as an objective bystander because it only forgives half of every crime and only slanders half of every good deed. But absolving the cheats and brawlers is not objectivity. Cursing the peacemakers is not standing honest witness. The press has not been a bystander. It has acted, scene after pointless scene, to build more conflict. Our political journalists have helped to write the sorry drama our nation must now play out. They should take their bows.

cross-posted from Dagblog

Thursday, October 10, 2013

How Much Do You Have to Write to Stay Sane?

Flavia has a post about her writing process, with many thought-provoking comments from her readers, and Dame Eleanor Hull posts a great deal about the academic writing life. I find that I can't give a clear account of my writing process right now, if by "writing process" we mean my composition process. But I have learned, through difficult trial and error, that I need three things to keep my writing going well:

1. Something accepted but not yet in print.

2. Something submitted but not yet accepted.

3. Something new that I'm actively working on.

I know these sound like results, or productivity targets. But I don't think of them that way. The goal isn't necessarily to have x amount of work accepted in y amount of time. When I do manage to have all three of these things at once (and I certainly have not always done so), they operate as a security blanket. They allow me to write, because they keep me from worrying about my writing.

Having at least one article in press, one out for review, and one on the boil, when I can manage that trick, keeps me from obsessing about the response to any individual piece of writing. Otherwise the danger is that too much energy goes into worrying about one specific piece. That's not healthy because no one piece of writing defines you as a writer, and not healthy because the things you're worried about are beyond your control. If you're an academic writer, your articles can get swept around in the unpredictable weather of peer review, or becalmed for months and months at a journal that's going through organizational problems. You can't control that. Once it gets published, people will read it or not, like it or not, cite it or not. You can't control that either. And while you're working alone at your desk, you can lose your way worrying about whether or not what you're doing is any good at all, meaning whether anyone will like it. Anyone who's written a dissertation knows how easy it is to despair over a piece of writing that you've spent too much time working on by yourself.

But if something's out in the mail and something else is in press and something is getting worked on steadily at my desk, it's a lot easier not to worry about the one in the mail, or pin too many hopes on the piece you have coming out next summer. And it's easier to let the thing you're working on be itself, and let the worries about venues and reviews come in due time. Most of all, no single thing starts to feel like the barometer of your success. Yes, some things, especially the book you're working on, are more important. But having more than one piece of writing at various stages is, at least for me, a wonderful psychological buffer.

My three-things rule is especially suited for academic writing, where it takes months for a response to come back after submitting an article, and often a year and more between acceptance and publication. But fiction writing works on the same schedule, and literary fiction perhaps a slower one. Literary magazines can take more than six months to give you any response. The other reason to try to have pieces in various stages of acceptance, submission, and preparation is that you cannot afford to stop working for the three to four months it takes to hear back about each piece. When an article or a story is out the door, you need to work on another article or story. You don't have that many months to waste. And if a fiction editor takes a pass on a first story but asks to see another, you had better have another story, better than the first, to show her.

You're not a writer because you have one story you're proud of, or one article you think is important, or even one book manuscript that you hope will win some prize. You can't afford to let your sense of yourself as a writer be tied to the fate of that one piece as it tries to find a home. Or maybe you can. I certainly can't. I need to feel that if I'm a writer, there's more where that came from. If I approach my work that way, I can afford setbacks to this or that particular piece, and you have to be able to afford the setbacks if you want to write, because sooner or later they're coming.

Writing is a public act performed in a private place, something you do alone at your desk for the widest audience you can manage. If you give too much weight to what other people think, or give too much weight to your own private anxieties, you will have trouble writing at all. You give up hope after too many rejections in a row, or begin undermining your work in a misguided attempt to give people what you think they want. Or you will wrestle with yourself endlessly in private and pin yourself, never putting anything in the mail because it's never "finished" and never finishing anything because you refuse to show it. Either way, you get lost, and the work suffers. Staying sane enough to write means positioning yourself somewhere between your inner voices and the outside world, where you are able to listen to both clearly, because you need to listen to both, but where neither gets the last word.

cross-posted from Dagblog

Tuesday, October 08, 2013

The Tragedy of the Will

Twenty years ago, while I was talking politics with my friend Mike, he said that Reagan's great achievement was what he called "the Nietzschification of the Right." I didn't grasp what he meant at first, since I typically encountered Nietzsche quoted by leftist literary critics. Mike's point was that Reagan had transformed American conservatism from a stodgy, rationalist enterprise into an emotional, charismatic movement like the New Left of the 1960s. Main Street conservatism gave way to Movement Conservatism, founded upon passionate emotion and conviction. I've thought of that conversation a lot over the last two decades, through the rise and fall of Newt Gingrich, the second Bush Presidency, and the flood tide of the Tea Party. Mike's case has gotten stronger year by year. Mike himself, now a government regulator, has been furloughed in the government shutdown.

Part of the right wing's Nietzschification has been its emphasis on the will as the decisive force in events. The current version of conservatism has become convinced, more and more thoroughly, that any reality can be reshaped by a sufficiently powerful imposition of one's will. Nothing is impossible if you just believe. But this turns out not to be true. In the actual world, reality takes belief's lunch money on a regular basis. Movement conservatism as practiced by Reagan was still largely the art of the possible; he was empowered by his movement's fervor, but mostly did what he could get through Congress and what his military forces could manage. When he did unrealistic things, like raising the deficit sky-high with tax cuts that were sold as likely to pay for themselves, the consequences either got shunted to the future (because Reagan's huge national debt would eventually be someone else's problem, i.e. ours) or borne by people without any political muscle to fight back, such as the mentally ill or the homeless. He ignored the consequences he could afford to ignore. But when he lost some Marines in Beirut, he pulled the Marines out. He didn't try to will the situation to his preferred result.

By the second Bush presidency, much of the Republican party had lost its ability to make that distinction. The Iraq War is nothing if not the disaster of policy makers who felt they could reshape the world simply by willing it. This is the period during which a White House source talked derisively about the "reality-based community" and ranted about how the Administration was "creating new realities." That's the force-of-will worldview right there. And you heard an enormous amount about will during the Bush II years. Military strategy was often cast as about demonstrating sufficient amounts of will, as if once our enemies realized we were serious, nothing else would matter. (This of course leaves out the possibility that our military enemies might themselves bend intense willpower toward achieving their goals. Since our primary enemies were hardened religious fanatics, that was more than a possibility.) This led Matt Yglesias to coin his phrase "the Green Lantern Theory of Foreign Policy," after a comic book superhero who could do anything with sufficient willpower. The last decade demonstrated just how poorly that theory worked.

Now the conservatives in the House are not merely trying to impose their will over policy realities, but over the reality of the political process itself, as if they could guarantee a victory over Obama simply by being more committed to the goal. They have made demands and not gotten what they demanded, and they have no plan but to stick to those demands. That's it. They ultimately believe Obama will cave because the power of their belief itself will make him cave. They don't have any other plan, and they have no endgame. Recently, some Republican senators from swing states angrily asked Ted Cruz what his strategy was, and he answered, apparently unconcerned, that he did not have a strategy. When this provoked his fellow Republicans to vocal rage, Cruz allegedly responded by calling them "defeatists." Think about the mindset that reveals. Someone with no game plan at all, someone who has no idea of how to try to win, takes the suspicion that he will therefore not win as a sign of a character flaw. Those who expect to lose simply because they cannot see any possible way to win are defeatists. Winners, evidently, do not need plans in Cruz's view of the world. They just need to believe in themselves.

That the Republicans, and especially the Tea Party wing of the Republicans, might actually suffer a political defeat seems to strike them as inconceivable. Their plan is to will themselves to victory. The fiscal and political health of our nation is in the hands of people too unrealistic even to calculate their own selfish chances. They are not unrealistic by chance, but by design. They are not simply poor gamblers, bad at estimating their odds. They are opposed to realism on principle. Realism is just defeatism. They are committed, more than anything, to the primacy of will over reality. That is the beating heart of their value system. To accept facts that they cannot change would be a betrayal of their most important principle. To do so would leave them lost and rudderless. Of course they can't make concessions to reality, let alone to Barack Obama. They cannot bring themselves to concede that "reality," as we know the term, even exists.

cross-posted from Dagblog

Wednesday, September 18, 2013

Larry Summers Is Not the Main Problem

I'm as pleased as anyone that Larry Summers has withdrawn from consideration as the next Chair of the Fed. I thought he would do a terrible job. But Summers himself was never the real problem. His candidacy was only a symptom. The real problem is that we have a President who wanted to nominate Summers in the first place. Obama does not understand what's wrong with the American economy, and five years into his term, he persists in some basic misunderstandings.

There are two basic Democratic narratives to explain the 2008 financial meltdown, and they contradict each other. When Obama took office, he had to choose which story to believe. The first story is that the economy thrived under Clinton, and Bush's people screwed it up. I'll call that the Democrat-vs.-Republican story. It's partisan, but not ideological.

The other story is that Clinton's economic policies led to a short-term boom, but set us up for the long-term bust that started in 2008. The toxic securities that crashed the system in 2008 were deregulated under Clinton. Deregulation of banks started under Clinton. Clinton thought Alan Greenspan was a genius. The list goes on. The Bush people, at worst, only exaggerated what Clinton's people had already been doing. Their basic emphases (favoring investors over workers, worrying more about inflation than unemployment, etc.) were the same. Call this the Left-vs.-Right story. It's ideological, but not partisan.

You can't believe both of these stories if you're going to actually come up with a plan to improve the economy. You have to pick one. If the Democrat-vs.-Republican story is the right one, the best thing to do is to put Clinton's old economic advisers back in charge. But if the Left-vs.-Right story is true, then putting the old Clinton guys back in charge is the LAST thing you should do. Clinton's economic policies, devised by Robert Rubin and the so-called "Rubinites" associated with him, are either the way out of our country's economic mess or a way further into that mess. It can't be both.

Obama clearly chose the "Clinton knew how to run the economy" story at the outset of his first term. That makes sense. Obama had never had a strong personal vision for economic policy. (Read the economy chapter in The Audacity of Hope and you'll see what I mean.) He was immediately forced to take responsibility for a national economic crisis that had hit late in his election campaign, giving him almost no time to think our economic problems through or develop new policy ideas. And he had to stop the bleeding somehow. Going with the Democrat-vs.-Republican story gave Obama a ready-made team to put in charge and a set of basic policies to follow. (Larry Summers, Clinton's old Treasury Secretary, is one of the main Rubinites.) Going with the Left-vs.-Right narrative would have meant coming up with a completely new team and a completely new set of ideas. But who would he have picked? How would he have distinguished good policy advice from bad? Accepting the Left-vs.-Right narrative meant moving into uncharted territory during a national emergency. Throwing out the old playbook and starting over is a much riskier move, and Obama hates unnecessary risks. Electing Hillary Clinton instead of Obama would not have avoided this problem. Hillary would have relied on Bill's old economic advisers, too.

While Obama's original choice might have been reasonable at the time, it has also turned out to be wrong. Five years later, growth is still sluggish, unemployment still high, and income inequality more rampant than ever. We've had five years of the Rich Man's Recovery, where the tiny fraction at the top have started growing even richer than they were in the Bush II years, but the rest of the country is still nowhere close to getting back to economic health. Not only is that not success, it's potentially a recipe for much bigger failure. The high levels of inequality make the whole system less stable and more prone to catastrophe.

Sure, we are almost certainly better off than we would have been if McCain, rather than Obama, had been calling the shots, and better off than we would be under President Romney. A move to the kind of Austrian economics that people like Rand Paul favor would have been a disaster. Obama understandably wants credit for keeping the economy from going off the rails completely and for whatever recovery has taken place over the last five years. He's committed on some level to defending his earlier decisions, and doesn't feel he has any room to maneuver on his left. He's right as far as that goes: his centrist policies are surely healthier than hard-right economic ideology would be. But "better than crazy" is not good enough. And while Obama's policies fit reality better than the right wing's do, the actual economic reality is still far to Obama's left.

Centrism is almost never the long-range solution to a fundamental crisis. A major crisis is usually a sign that a set of policies has major underlying problems. Sticking to the middle of the road makes sense in the good times, but disasters as big as 2008 are reality's way of telling you that you are on the wrong road. Proceeding cautiously down the wrong road and obeying a reasonable speed limit only changes how fast you get lost. To actually get out of trouble, you have to turn around and go in a different direction. That Obama wanted to put Larry Summers, the chief advocate of deregulating the exotic securities that caused the 2008 crisis, in charge of the Federal Reserve shows that Obama still thinks that he can keep going down the Clinton/Bush economic road and it will all be okay if he just drives carefully enough. That he wanted to have Larry Summers riding shotgun with him is bad. But even if Summers isn't officially navigating, Obama is still following the wrong directions.

cross-posted from Dagblog

Thursday, September 12, 2013

Why Obama Won't Make College Cheaper, Part 1

Education reform in America is always an attempt to get something for free. It has been that way for at least twenty-five years. No matter what the scheme of the hour is (charter schools, Teach for America, No Child Left Behind, Race to the Top) or whether you're talking about K-12 or college, every reformer makes one of two promises. Either they promise to make education better without spending any more money, or they promise to make education better while spending less money. Education reformers basically say, "Four dollars is too much to pay for a hamburger. Bring me a three dollar steak."

If you ask why the reformer expects this strategy to work, or how it could, you will be told to "be realistic." It is of course deeply unrealistic to expect taxpayers to increase spending on education. So the only "realistic" course of action is to create excellence through funding cuts. Since no solution but a three dollar steak is acceptable, we just have to figure out a plan that gets us an especially juicy, delicious, and healthy steak for three dollars. There's always a clever new three-dollar-steak scheme, no matter what happened with the last one, and you just have to give the new idea a chance. Twenty-five years on, and education reformers are still talking about the newest, hottest plan. That's because none of the old plans did much good.

President Obama's new college affordability plan is one of these reform efforts. He plans to make college cheaper for everyone, and to do this without spending any more money. Instead, he will shift around the money the federal government already spends to create incentives. College costs are a real problem, and Obama is right to want those costs lower. His plan will not actually do that.

Why not? There is the problem of getting the plan through Congress. There is the problem that this plan (like No Child Left Behind) relies on crude and oversimplified metrics that pretend to measure complicated and slippery things, and then penalizes schools based upon that statistical pretense. But the real problem is that you can't set prices for things that you don't buy. Obama's plan won't change college prices, or colleges' underlying costs, because there simply isn't enough money on the table to make a difference.

The federal government is the biggest individual player in American higher education, because it's the only entity that contributes, directly or indirectly, to nearly every college or university in the country. That means a lot of money, in absolute terms: the country has nearly 4500 colleges, universities and community colleges, with well over twenty million students. But the federal government isn't the most important contributor to any of those 4500 college budgets. It isn't the second-biggest contributor to those budgets, either. (The only exception is the service academies. The United States government pays for the entire budget at West Point and Annapolis, and they get to set the tuition: free.) We don't have a federal education system. Washington doesn't run our colleges, and doesn't pay for them. Instead, we have a vast decentralized system with thousands of schools competing in a free market.

The federal government does pay enough money to influence colleges' actions. Title IX, for example, is enforced by a threat to strip all government grants and funds from schools that don't comply. Losing ten or twelve percent of your annual budget really hurts; think what a ten-percent pay cut would do to you. So paying ten (or twelve, or six) percent of every school's budget gives you a lot of say. What it doesn't give you is the power to make schools give up larger sources of revenue. Tuition is a bigger part of the budget than federal funding, hands down. A deep cut in tuition could very quickly add up to more revenue lost than the federal government adds. Even if you threaten to pull all of your financial contribution (and in practice the government would only be diminishing it, in gradual stages), that's not enough to make a school forgo MORE than what you're paying. If you tell a restaurant that they have to cut the price of a steak dinner in half or you'll stop leaving a fifteen percent tip, what do you think will happen?
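To put rough numbers on that leverage problem, here is a minimal sketch in Python using invented figures; the ten percent federal share, fifty percent tuition share, and $100 million budget are assumptions for illustration only, not data about any actual school:

```python
# A rough, back-of-the-envelope sketch of the leverage problem described above.
# All numbers are invented for illustration; they are not real budget figures.

budget = 100_000_000        # hypothetical annual operating budget: $100 million
federal_share = 0.10        # assume federal grants and aid cover ~10% of it
tuition_share = 0.50        # assume tuition covers ~50% of it
proposed_tuition_cut = 0.5  # suppose Washington demands tuition be cut in half

max_federal_penalty = budget * federal_share                         # $10 million at risk
revenue_lost_to_cut = budget * tuition_share * proposed_tuition_cut  # $25 million given up

print(f"Worst-case federal penalty:      ${max_federal_penalty:,.0f}")
print(f"Revenue lost to the tuition cut: ${revenue_lost_to_cut:,.0f}")

# Even the maximum threat (losing every federal dollar) costs the school far less
# than the concession being demanded, so the threat has no real force.
```

Whatever the exact percentages, the logic is the restaurant tip all over again: the penalty you can threaten is smaller than the sacrifice you are demanding.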

The reason no restaurant will sell you a steak for three dollars is that no restaurant can buy a steak for three dollars. People don't generally sell things for less than those things cost them. In fact, the only thing for sale in America for less than it costs is a college education. Colleges are already discounting tuition as much as they feel they can afford. Tuition never covers the full cost of operating a college, and the wealthier the school, the more it spends in excess of tuition. (The Harvards and Princetons of the world have the luxury of spending much more than they charge.) So college tuition is already being set significantly below cost. Tuition grows because costs grow. Why?

If you want to be a successful education reformer, meaning you want to make a good living peddling your five-point plan for three-dollar steaks, your answer should be cast in moral terms. Costs are high because someone lacks character! High costs can only be a sign of laziness and corruption! After all, that's what makes those costs amenable to "reform." And of course, the moral explanation is satisfying, because it allows you to attack anyone who disagrees with you as a bad person.

But if you look around America's 4500 colleges, you see almost all of them behaving the same way. There isn't a group of "virtuous" colleges holding down costs and another group of decadent spendthrifts charging 50% more than other schools. There are only slight differences between institutions. It is not that all 4500 colleges happen to be run by weak and depraved characters. When you see thousands of independent institutions behaving the same way, it's a sign that there are actual economic reasons in play. Colleges act the way they do because they're responding to real pressures that "character" will not make go away. Colleges don't spend money because somebody at the college was raised wrong. Colleges spend money because they believe they have to. Colleges spend what they need to spend to survive.

Those costs keep growing for reasons that I'll try to explain in Part Two.

cross-posted from Dagblog

Wednesday, August 21, 2013

My Neighborhood, Times Two

I was back in my old neighborhood a couple of weekends ago, walking toward the farmer's market, when I passed a little knot of people who were looking up and gesturing toward the dignified brick apartment buildings that line one of the boulevards. They were all clearly from somewhere else, and one of them was explaining the handsome buildings, which apparently struck them as odd, to the others:

"I think they're pretty dumpy on the inside, but they look good from out here," he said.

I thought that was pretty remarkable, because the guy wasn't actually claiming to have been inside any of the buildings he was talking about. He just thought they were run-down dumps inside. All he could actually see were the buildings' admittedly impressive outsides, but he didn't or couldn't permit himself to be impressed by them. So he assumed that the handsome buildings were all squalid inside.

He was dead wrong. I should know. He was pointing at my old building.

I lived in that place for seven years, in a big pre-war apartment with hardwood floors, and the only thing that was ever remotely squalid in that place was my bachelor housekeeping. It was a nicer place than I really should have rented right out of graduate school; my excuse is that I'd come straight from California, where the rent on even a shabby studio was always basically all the money you had, and so my big beautiful apartment with the fancy view seemed like a steal. And having an apartment like that made me feel like I was finally, after so many years of school, a middle-class grownup. I only left that building when I got married and began my weekly interstate commute, because I needed a place closer to my office when I was in town.

Now, there may theoretically be an apartment building on that street that isn't well maintained on the inside. Maybe they weren't all as nice as mine. But I've been in lots of those buildings, either as a prospective renter or while visiting a friend, and I've never seen any of the dumpy apartments this guy was talking about.

The guy explaining how terrible the apartments on that street were (don't let the fancy outsides fool you!) wasn't saying that because he knew it to be so. He apparently believed that those buildings were all concealing slum conditions because he wanted (or needed) to believe that. I don't know about you, but when I'm in a place I haven't been before I generally assume that the houses I'm looking at are pretty much the way they look, with the insides roughly as shabby, shiny, or well-kept as the outsides. I would never look at a house with its paint peeling off and say, "I bet it's an absolute palace inside," or presume that a fancy-looking house on the lake is secretly a dump. But for whatever reason, these strangers were not ready to accept that my neighborhood actually was the way it looked. So they had to invent facts not in evidence, the dumpy apartments secretly hidden inside impressive buildings, rather than deal with the reality staring them in the face. Those nice-looking buildings just couldn't be what they looked like, because they weren't supposed to be there.

(And actually, it was a little bit worse than that. As my spouse pointed out to me later, that guy had to actively ignore evidence he could see, namely the carefully-maintained landscaping around the buildings he was calling dumpy. In his world, the landlords have let those beautiful old buildings run to complete ruin but also meticulously landscaped them.)

Why not just accept the evidence in front of their eyes? One possible explanation is what I'll call suburbanite bias: the conviction that life in the Big Dirty City is just one long squalid nightmare. I don't just mean preferring to live outside the city yourself. I mean the insistence that living anywhere in the city is so hopelessly awful that anybody would count themselves blessed to "escape" to the suburbs. I should admit that I've never viewed the suburbs as a place to which I would eagerly escape; there's a reason that my blog name isn't "Doctor Pepper Pike, OH." But I see that some people might like a suburb better than the city. What I'm talking about is the belief that everyone in the city, except maybe a handful in luxury high-rises, must be living in a horrifying slum. Call it Urban Derangement Syndrome.

It could also have been about the specific part of the city my neighborhood is in. The skeptical visitors might simply have been unable to believe the sight of lovely vintage buildings in the black part of town. The neighborhood is actually mixed-race; I've spent years there, and I'm so white I'm nearly translucent. African-Americans are a plurality rather than a majority. And it's also a mixed-income neighborhood, with a healthy share of working-class homeowners but a bunch of doctors and classical musicians too. But the neighborhood has enough African-Americans that visitors from a racially unmixed area might view it as a "black neighborhood." (In this case, that which is not all-but-completely white becomes "black.") They may have refused to believe in the impressive apartment buildings they were seeing because they were under the impression that they were in The Ghetto, where all African-Americans live in miserable tenements and have The Blues. If you can buy decent soul food, it must be a slum. The Ghetto, in this case, is positively full of endocrinologists and cellists, but this isn't about the details. It's about the Big Picture, where all black people live in Bad Neighborhoods. How can there be nice apartments in a Bad Neighborhood? It makes no sense.

A slightly different version of this problem would be that the visitors viewed an entire side of town, the stereotypically "black" side, as one vast undifferentiated expanse of The Ghetto, and could not process that the "black" portion of a city actually has all kinds of neighborhoods, good, bad, and in-between. One way or another, the outsiders' refusal to accept what they were seeing as real is about a refusal to accept complexity. It's refusal to accept the variety that messes with easy simplifications. The "black side of town" is no more one single place than a city or a neighborhood is one place: they contain multitudes.

My other neighborhood, in the city where I own a home with my spouse, is also probably misunderstood by some outsiders. That neighborhood too is economically and ethnically mixed, and also viewed as the scary desperate city by suburbanites with Urban Derangement Syndrome. Our house was built in the 1920s, and has no room for a huge lawn or huge attached garage. And it's only a few blocks from a high school with a large proportion of African-American students. ZOMG! Black teenagers! It are an urban jungle! Every night, my spouse and I lock our vintage leaded-glass windows and huddle by the working fireplace in terror.

Neither neighborhood is an exclusive bougie enclave. They have petty crime; you need to lock your doors, you shouldn't leave valuables in the car, and you shouldn't believe that everyone buttonholing you on the street is telling you their real story. When I first moved into my old apartment, my morning newspaper kept getting stolen. In other words, they are neighborhoods in cities, where you should take basic sensible precautions and generally not be an idiot. Does that make them "high crime" neighborhoods? Depends on how you're counting. Are they "dangerous" neighborhoods, where random pedestrians will be waylaid by a bunch of extras from The Wire? No. The scary thugs only live in the secret slum apartments hidden inside nice buildings. They never come out.

The thing about a city is that no neighborhood is very far from a different neighborhood; a good city doesn't sprawl. A city that does is a collection of suburbs on steroids. That boulevard of brick pre-war apartment buildings is only a block or two in one direction from a street full of blue-collar single-family homes. Half a mile in another direction is a shady street lined with what I can only call minor mansions. One nearby street is a depressed and dispiriting commercial strip. Another nearby street is filled with antiques dealers. Half a mile's run in yet another direction takes you to a park filled with live deer. It's a neighborhood. It neighbors other things. That's the point.

I left that neighborhood, but I didn't "escape" it. In fact, on the morning that I passed the guy explaining how all the apartments were actually dumps, I was in the neighborhood because I was moving back. My spouse has taken a year's leave from her job, so I gave up the bachelor pad near my office and moved with her back to another big pre-war apartment in another of those handsome buildings that the guy considered dumps in disguise. (Meanwhile, we rented the house in our other urban neighborhood to a group of classical musicians. Mostly string section. You know: animals.) So my old neighborhood is also my new neighborhood, at least for a year. And if the apartments in the neighborhood are secretly dumpy, well, I just rented another. Its dumpiness is still secret.

After I passed those confident visitors I went to the farmer's market and then back to my new apartment where my wife and my unpacked boxes were waiting. Then I stood at the counter in my newly-renovated kitchen and ate an organic peach. Just another day in the hood.

cross-posted from Dagblog

Monday, August 05, 2013

A Tale of Two Newspapers

Everyone's talking about Jeff Bezos buying The Washington Post. But it's also been a dramatic week for two newspapers close to my heart in different ways: The Boston Globe and The Cleveland Plain Dealer. Two days ago, The Globe, like the WaPo, was sold to an individual billionaire with a high profile. Today the Plain Dealer, which has not been sold, stopped daily home delivery. The paper will still be printed every morning, but it will only be delivered three days a week. Nearly one third of its reporters were laid off on Wednesday. It's not the first round of buyouts or layoffs at the PD, and it's not the second either. The newsroom is now down to about a third of what it was in the 1990s.

The Plain Dealer will be "digital-first" from now on. On the first day of this bold step into the future, naturally, the electronic version of the newspaper crashed. Digital-first means you try to log into the website first, and when that doesn't work you go out and see if the drugstore will sell you an actual copy of the paper. But hey, you get what you pay for, right? And the web side of the business is strictly non-union.

Now, plenty of people will tell you that this is an inevitable consequence of our modern age. The newspaper has to die, because the internet demands it! Also, video killed the radio star, which is why you no longer own a radio. But apparently, it's not inevitable everywhere. The Boston Globe got sold to John Henry, the principal owner of the Boston Red Sox, for a Filene's-bargain-basement price of $70 million, even though other bidders offered more. Henry isn't talking layoffs. Boston is going to stay a two-newspaper town, and the premier newspaper is going to keep competing hard.

(Anyone who thinks that Henry bought the newspaper to get more or nicer coverage for the Red Sox, by the way, has no idea how the Boston sports media works. The Red Sox are not going to get more attention from Henry's Globe, because it is literally impossible for the Red Sox to get any more attention than they already do. And the coverage is not going to get nicer or softer, because the readers won't read that. The gold standard for Boston sports coverage remains a complex brew of idolatry and hostility. Boston baseball reporters don't play softball.)

Starting today, Cleveland is a less-than-one-newspaper city, with a Plain Dealer that is somewhat less than a newspaper. And that brings Cleveland one step closer to becoming something less than a city. It is part of a great city's death. Boston, a city that has thrived in the information age, will keep the major newspaper that a major city needs to function and thrive. The New York Times seems to have deliberately sold The Globe to an owner whose other enterprises are tied to Boston's civic health, and who seems motivated to protect the city's basic ecosystem. That's a choice on the Times Company's part. And the way Henry eventually runs his newspaper (or Jeff Bezos runs his) is a choice, just as the Newhouse family's decision to gut The Plain Dealer was a choice.

There's more than one route to profitability here. John Henry is not stupid with money. But he might have to be content with a lower direct return. A good newspaper in the internet age is only modestly profitable. A gutted newspaper is more profitable for a while, as long as you keep cutting costs faster than you lose revenue, but that's not a sustainable business model. That is selling copper wiring from abandoned houses.

Cleveland is a city being slowly run to death by economic rationalists whose business model adds up to sheer madness. It illustrates American business's suicidal focus on cost containment, with everyone trying to run the leanest operation in a city suffering economic famine. It is cheaper to lay off workers, and so stores no longer have enough consumers. The stores scale back, and their suppliers go hungry. It no longer "makes sense" to run a department store downtown. It no longer "makes sense" to run a daily newspaper.  And then you are trying to attract new enterprises to a city you have to leave to find a Macy's, trying to recruit employees to a city where you can't get a newspaper delivered on Monday or Tuesday. It makes no sense.

Cleveland is a city of assets whose value has been allowed to decline, and assets whose value is ignored. It is a city of grand buildings left unrenovated and unoccupied, because no one chooses to value them. It is a city where a great newspaper is allowed to become a part-time enterprise, because no one chooses to see its value. And little by little it has become a place torn down by its owners, stripped down to be sold for parts by people who do not live there, who do not need or wish for the city to thrive. The car that could be rebuilt is sold for scrap metal. The business that could be made profitable is closed.

Don't cry for the Washington Post. There are worse things than being bought. The worst thing that happened to The Plain Dealer was its out-of-town owners keeping it.

cross-posted from Dagblog

Wednesday, July 31, 2013

Tribal Knowledge

Fox News's hostile interview with Reza Aslan has lit up the internet. (See Michael Maiello and Historiann for two of the smarter takes.) Obviously, interviewer Lauren Green's insistence that something must be very wrong for a Muslim to write a book about Jesus, and that such a book must be wrong, is a problem. But Green (and her producers) are simply peddling a toxic version of an idea that lots of us entertain in various forms. The idea that a non-believer cannot understand (or worse, should not be allowed to speak about) a belief is only a more aggressive outgrowth of the common conviction that being a believer, or identifying with some specific group, gives you a special insight or understanding denied to outsiders. The conviction that no Muslim could write a "fair" book about Jesus grows from the belief that Christians, by virtue of being Christians, understand Jesus better than anyone else possibly could. That sense of privileged understanding comes from one's social identity, not from actual knowledge, and can be actively hostile to such knowledge. In the Reza Aslan example, faith in Jesus Christ might be construed as more important than, for example, the ability to read Biblical Greek. To a certain kind of Christian, a Muslim who can read New Testament Greek represents not one but two problems. Aslan's scholarly accomplishment is perceived as a threat to "knowledge" derived by other means.

This is by no means limited to Christians, or to religious believers. I happened to read Janet Malcolm's In the Freud Archives the week before the Aslan interview. In that book, Malcolm deals, fairly indulgently, with the common claim among Freudians that only someone who has undergone Freudian analysis is entitled to an opinion about Freud. Only insiders, only properly initiated believers, can be authorities on the subject. (Replace "Freud" here with "scientology" and "Freudian analysis" with "dianetics counseling" and see how that sounds.) And this insistence that outsiders could not possibly understand led important Freudians to restrict access to factual evidence. (Malcolm also weirdly treats this as not-unreasonable.) The custodians of the Freud Archive refused to allow scholars to see biographical evidence about Freud, including letters in his own hand, unless they were sure that the scholar was a true believer who would stick to the current Freudian orthodoxies. The facts are a threat to  belief.

A less toxic but more annoying version of this behavior that I run into a lot is some people's conviction that they have a special insight into certain literature because of their ethnic background. There are, unfortunately, Irish-American students who are profoundly convinced that they have special insight into James Joyce or W. B. Yeats because of their "Irishness." This is observably untrue. Being named O'Shaughnessy doesn't give you any special insight into Ulysses, or guarantee that you'll understand any of it. Likewise, you occasionally run into Brits who are convinced that no one who is not English or (if the person in question is Scottish or Welsh) not British could ever properly understand Shakespeare. There's much more I would like to know about Shakespeare's works, but I am willing to match my current knowledge of the subject against David Beckham's at any time. He can bring along Posh Spice for help.

But followed to its conclusion, this identification with the literature of one's tribe can shade into racism; the presumption of privileged understanding is, after all, a presumption of privilege. An English racist gives himself credit for Shakespeare's works, despite not having been much help writing them, and presumes that this puts him one up on, say, a Pakistani immigrant who got at least as much Shakespeare in school as the Englishman did. The English racist may not understand a quarter of Henry V, but he takes credit for it because it was produced by "his people." Indeed, literary scholarship once actively promoted this racialist approach. You can still find old anthologies with titles like "Poems of the English race." Similarly, the Irish-American undergraduate who gives himself credit for Joyce, Yeats, and Heaney is in danger of slipping into the belief that his ethnic group possesses certain kinds of innate superiority. The more seriously he takes that belief, the uglier it has the potential to get.
 
There's nothing wrong with taking a particular interest in something because of your tribal affiliations. (I've got my Yeats and Heaney on the bookshelf.) But believing that you have special access to understanding it is a problem, not least because it keeps you from doing the work required to actually know about something.

I'm a practicing Christian and Reza Aslan is not. But Reza Aslan can read the Christian Gospels in the original and I cannot. That means he has a lot of things to say that I'm interested to hear. I'm going to make my own decisions about my beliefs at the end of the day. (I also know a translator of the New Testament and I acknowledge his superior learning, but I don't go to the same church he does on Sundays.) I'm perfectly happy to admit that Reza Aslan knows things about my religion that I do not know myself. If I refused to admit that, I would cut myself off from learning more.

In the end, tribal knowledge isn't about knowing. It's about believing one knows. That is a very different thing.  And at a certain point, actual information starts to feel like a threat to one's tribal certitude. Nothing is more dangerous to the illusion of knowledge than facts. There's more than one reason that Lauren Green didn't let Reza Aslan talk about his book. She might have learned something about the historical Jesus, and that would have been intolerable.

cross-posted from Dagblog

Thursday, July 25, 2013

Larry Summers Is Bad with Money

So, apparently Larry Summers is now the leading candidate for Chairman of the Federal Reserve. This is a bad idea, for lots of reasons, not least of which is that Summers's sudden ascendancy is a sign that The Usual Suspects are talking him up, and it's The Usual Suspects who not only got our economy into this mess but made our government's top priority not getting out of the mess "too quickly." Summers himself was one of Obama's leading economic advisers during the first term, and neither his advice nor Obama's first-term policy was effective in turning the Great Recession around. The result of Summers's advice was always too little, too late. It was Summers who insisted on asking Congress for a smaller stimulus package than the economy needed, on the theory that the smaller package would get passed. Of course, Congress took that smaller package and cut it down even more.

Larry Summers is also responsible for doing major financial damage to America's largest educational non-profit. People mainly remember Summers's stint at Harvard for the way it ended, with Summers making stupid and self-destructive remarks that cost him the job. That's a real problem; without wading into everything problematic about that speech, it displays a lack of discipline that may be disqualifying. But people generally don't focus on the dire financial consequences of Summers's leadership. Summers's bad economic decisions cost Harvard a staggering amount of money. His main legacy at Harvard is an enormous hole in the ground.

During the bubble/boom years, Summers decided to put billions of dollars of Harvard's endowment into complex financial derivatives, mainly interest rate swaps. He got his way. After all, wasn't he an economist? Hadn't he been Treasury Secretary? Surely, he knew what he was doing. But Summers put three and a half billion dollars into some of the most toxic and illiquid investments possible. When the crash happened in 2008, those investments got hammered: a billion-dollar loss for starters, followed by hundreds and hundreds of millions more in interest and bankers' fees as the school had to borrow money at a disadvantage to meet margin calls on all those toxic securities. All told, Harvard lost nearly $11 billion of its endowment in the crash. It survived; going from $37 billion to $26 billion is not the end of the world. But it is an enormous waste, and Summers made it worse by billions. That money has not been made back. And Washington power players are now seriously talking about putting this man in charge of our national bank. Really.

[UPDATE: Here's a piece from Bloomberg setting all of this out in greater detail.]

But that's not all. Back in the heady days of the bubble, Larry Summers decided to bend one of Harvard's oldest fiscal rules. Harvard, which is rich in part because it has traditionally been cautious with its money, has a long-standing rule about not starting construction on any new buildings until after it has raised the cash. First you get the money, then you build the building. It's not complicated, and it works. (As I've said elsewhere, one of the biggest differences between a fiscally sane university and a university headed for financial trouble is that a healthy university raises the capital for new buildings and an unhealthy university borrows it.) Larry Summers was impatient with that rule. And he wanted to build a huge new science building across the river from Cambridge, in Boston's Allston neighborhood. So he broke Harvard's money-in-the-bank rule for new construction. He had the pledges for the money he needed, so he gave the go-ahead. Then the 2008 bust hit, and the donors who had promised money no longer had the money they promised.

Harvard's contractors had dug the hole for the foundation of that big, ambitious new building when the money dried up. So they stopped work. Harvard (and the neighborhood) was left with nothing but a five-acre hole in the ground. That hole is still there. It's been there longer than it takes to get a Harvard degree; the students who graduated last month have never known a Harvard that did not own a massive hole in the ground. Harvard hopes to do something about that hole next year, maybe.

Is the five acre hole in the middle of a major city going unused? Of course not. Rats are using it.

That's what Larry Summers's fiscal mind brought to the richest university on earth: a gaping five-acre pit. That is the genius being proposed as leader of the Fed. Because here's the truth: the American economy is another huge hole, even bigger than the one in Allston, dug in 2008 and still, all these years later, not filled in. The work has not even begun. Larry Summers is not the man to get us out of that hole. He's one of the men who dug it.

cross-posted from Dagblog