Every year around opening day I try to read a few books about baseball. This year I am reading Lee Lowenfish’s excellent biography of Branch Rickey. Let me quote a passage:
On the morning of March 13, 1945, Branch Rickey was drinking coffee and reading the newspaper in the spring training lodging at Bear Mountain [where the Brooklyn Dodgers had their very chilly spring training during the war]. Suddenly, he looked up from his paper with an animated expression on his face. “What’s wrong, dear?” Jane Rickey asked her husband, wondering what now was bothering her easily agitated mate. “It was in the paper, Mother, that Governor Dewey has just signed the Ives-Quinn Law!” he exclaimed. “They can’t stop me now.”
And what “they” couldn’t stop Rickey from doing, of course, was signing a Negro ballplayer to the Dodgers, which Rickey would do later that year, when he announced the signing of Jackie Robinson to a minor league contract and changed baseball forever. With the passage of the Ives-Quinn Law, New York became the first state to ban discrimination in employment. The law was a state version of the Fair Employment Practices Commission, an executive agency created by FDR in 1941 that, owing to southern hostility and much northern indifference, was allowed to lapse shortly after the end of the war. Ives-Quinn was controversial, and many business groups (and racial conservatives like Robert Moses) opposed it; Dewey was on the fence, but was eventually persuaded to sign it.
The Ives-Quinn Law was certainly within the radical penumbra of the New Deal: the creation of a whole new class of rights, pointing toward a fundamentally new relation between government and business, in which private hiring and admissions practices were subject to government review. Branch Rickey was not a New Dealer. He strongly opposed the New Deal, and not just around the breakfast table. He was a prominent Republican in Missouri (where he worked for the St. Louis Cardinals before joining the Dodgers in 1941), was considered for the US Senate in 1940, and actively campaigned for Republican candidates using strongly anti-New Deal rhetoric: that FDR was a tyrant, was destroying the basis of the free enterprise system, and so on. He was very frightened by the Communists, and by what he saw as their demagoguery on racial matters, and frightened by racial radicals of all sorts who, he felt, wanted to do too much too soon.
But Rickey was a complex man. According to Lowenfish, Rickey’s favorite book in 1944 was Gunnar Myrdal’s An American Dilemma, the basis of many animated conversations with friends and family. His favorite book in spring training in 1947 (the year Jackie Robinson joined the Dodgers) was Frank Tannenbaum’s classic Slave and Citizen. He interpreted both books as having the same message, to quote Lowenfish: “that with the passage of time, increased education, the rise of a black middle class, and greater proximity between races, the American racial dilemma would be ameliorated, if not totally solved.”
Racial gradualism generally has a bad press, and deservedly so, because moving toward racial equality with “all deliberate speed” is generally a euphemism for doing nothing at all. This obviously was not the case with Rickey, who poured tremendous care and effort into integrating baseball (and also reaped the rewards, as the Dodgers became the first major league team to tap into the resources of the Negro Leagues).
You sometimes hear that too much is made of Jackie Robinson, and that in the bigger scheme of things as pertaining to the search for racial justice in America, it was attention-grabbing but not all that significant. In some ways this is so, certainly. It would be twelve years until the last major league club, the always evil Boston Red Sox, hired their first black player. (Although this lifelong Yankees fan must acknowledge that the boys in the Bronx weren’t much better, not hiring their first black player until 1955, and supposedly missing the chance to sign Willie Mays because they felt he wasn’t “Yankee” material. Anyway, here's hoping that baseball returns to the natural order of things, starting today, with the Red Sox staring up at the Yankees.) But to return to the point of this post, it is certainly true that in many ways the Jackie Robinson story is an isolated one; one cannot say that it really was the start of a general trend toward ending discrimination in employment.
And perhaps this is the point. The Ives-Quinn law was never intended, certainly not by many of its more timid backers, like Gov. Dewey, to start a process whereby private businesses would be compelled to integrate, but to act as a form of moral suasion, exerting pressure to do the right thing. Rickey moved the announcement of Robinson’s signing up from his original intention of November, in the depths of the college football season, to the end of the 1945 baseball season, after LaGuardia, who sort of found religion on racial matters after the 1943 Harlem riots, wanted a pledge from the city’s three teams that they would abide by the Ives-Quinn law. The law, as Rickey said when he first read about it, would not compel him to hire black players, but it would make it impossible for anyone (such as the commissioner of baseball) to cite either written or unwritten laws to stand in his way.
The limitations of voluntary desegregation are obvious, which was why in the 1960s the advocates of civil rights passed a series of measures, with far more teeth than the Ives-Quinn Act, that would compel private businesses to integrate. (Rickey, Republican to the end, backed Goldwater in 1964, feeling his civil rights stance was preferable to “national degradation” under Johnson.)
But Rickey’s desegregation of baseball does need recognition for what it did accomplish. First, it needs to be seen as something that probably could, in 1945, ’46, and ’47, only have happened in New York State, and was one of the incidental triumphs of the Ives-Quinn Law. And it also needs to be seen as perhaps the greatest triumph of what racial moderates devoutly hoped would happen in post-war America: that private businesses and parties would see the evil of their ways, adopt Myrdal’s American creed, and desegregate. Branch Rickey felt that integrating baseball was fully in keeping with his deeply conservative and profoundly Republican values. If there had been more people like him, the history of post-war America would have been vastly different.
Monday, March 31, 2008
Sunday, March 30, 2008
Democrats and Working-Class Voters
For Democrats, 2008 marks two anniversaries: their party's ascent in 1933 with the inauguration of FDR and its crackup in 1968. Working-class Democratic voters play central roles in the stories of both years--first as members of a vigorous labor movement that animated FDR's New Deal and later, so the story goes, as blue-collar conservatives who defected to the Republicans.
But participants in a panel on the usable past of liberalism at the Organization of American Historians convention this week suggested that working-class voters are not the conservatives that many observers make them out to be. And that is good news for the Democrats.
Dorothy Sue Cobble of Rutgers argued that since 1968, the defection of the white working class from the Democrats has been confined to the South: elsewhere, it amounts to a loss of only one percent. And even in the South, the great Republican gains have been among middle class and upper class voters.
Indeed, she reminded us, income predicts voting: lower income voters vote Democratic, middle income voters are mixed in their preferences, and high-income voters tend to vote Republican. If the votes of lower-income voters were all that counted, we would have been spared the Bush presidency.
There's a counterargument to this, of course: that the cultural extremism of the Democratic party (on gay rights, for example, as some see it) drives away culturally conservative working-class voters. Thomas Edsall, who was on the panel with Cobble, makes this point.
The way to reconcile these different perspectives is that they look at two different forms of data: polls on cultural attitudes and voting in elections. Each method of counting produces different results. There is inevitably some statistical murkiness in all of this, but at the end of the day the lower your income the more likely you are to vote Democratic.
Of course, with the percentage of unionized workers at an awful low, it would be a mistake to expect a resurgent labor movement to carry the Democrats to the White House. But the growing and politicized number of Hispanic workers are likely to vote Democratic in the fall.
Just as immigrant, working-class voters helped lift FDR and the New Deal to victory, Hispanic workers can help send a Democrat to the White House in 2008.
Friday, March 28, 2008
Clair de Lune
The most interesting article in the Times yesterday was surely the announcement of the discovery, in France, of a sound recording made in 1860, nearly two decades before the first Edison phonograph. The recording was made on a phonautograph, consisting of a recording horn and a stylus, which made an impression on a soft surface, much like an early wax cylinder. Although recordings made on a phonautograph were intended only to be looked at and studied as the visual representation of sound, using modern recording technology the grooves and indentations can be played back, and a somewhat ghostly voice can be heard singing the French folk song “Au Clair de la Lune.”
I have long been fascinated by the oldest generation of recordings: voice recordings made by Sir Arthur Sullivan and William Gladstone in the 1880s; a horrible-sounding piano roll, with a few barely audible notes, containing all we have of Brahms at the piano; the recordings of Adelina Patti, the most famous soprano of the second half of the 19th century, with barely any voice left, but enough to give some sense of its former splendor; the earliest brass band recordings from the early 1890s; and so on, all with the crackle and distortion that, like the imperfections of a daguerreotype, give them their charm and fascination. In listening to old recordings one hears not only a particular recording but the birth of an entirely new way of listening to the world, emerging in its dim, noisy, static-filled birth cry.
The discovery of this recording just underlines for me one of the great frustrations in the history of technology. Why, oh why, wasn’t the phonograph invented and developed earlier? As I understand the technological issues, there was absolutely no reason that by the 1840s, the time of the invention of the telegraph, the best tinkerers in Europe or America couldn’t have invented the prototype of a recording device. Give it a decade or so to develop, and by the 1860s we could have had recordings of performances conducted by Verdi or Wagner, of Liszt or Clara Schumann at the piano, of slave spirituals made during or immediately after the Civil War, and then, somewhat later, blues recordings from the 1890s, or Buddy Bolden on the cornet. And old recordings take one back further than their recording date—recordings made around 1900, like those of Patti, give evidence of what voices sounded like a generation earlier, so in listening to Patti one hears operatic technique c. 1860. And if sound recording had truly been possible in 1860, one could hear the voices of people born around 1800 who played with Beethoven, or a few arias by Jenny Lind, or a few songs by Dan Emmett and other pioneers of the minstrel stage, perhaps accompanied by Stephen Foster on the piano.
There was a fifty- or sixty-year period, from about 1840 to 1890, when it was possible to record images but not sound, and I have long wondered what it was like to live during this period of recorded asymmetry, when sounds were still assumed to be completely ephemeral and transient while images were increasingly seen as something that could be locked and fixed. But somehow the transition from painting to photography, as epochal as it was, seems less astounding than what happened in 1860, when sound went from being forever ephemeral and transient, fading and lost almost immediately, to something that could be captured, controlled, and, eventually, reheard.
A New Deal Moment?
Wall Street is in a crisis. Home foreclosures are on the rise. The Republican presidential nominee argues that "it is not the duty of government to bail out and reward those who act irresponsibly," while the Democratic rivals for the nomination debate the best government response. Could it be that a New Deal moment is at hand in American politics?
While "Intervention Or a Bailout?" in the Times argues that both parties recognize the important role of government in solving the problems of housing and financial markets, the similarity ends there. Republicans, as William A. Niskanen of the Cato Institute observes, are likely to protect markets. Democrats are likely to protect individuals.
That's an old and defining difference between Democrats and Republicans, one that Democratic politicians rode to victory for decades after the New Deal. While it would be too much to expect a wholesale return to the regulatory state of the New Deal, it is heartening to see that the economic issues and core ideas at play in the coming election will play to the Democrats' strengths and the Republicans' weaknesses.
Paul Krugman makes a case that Hillary Clinton's policy responses to the crisis of housing and finance are more progressive than Barack Obama's. But whatever the differences between Obama and Clinton, it looks like the Democratic nominee will run by arguing that the federal government must help ordinary Americans. John McCain will offer the typical Republican medicine of preserving free markets.
In this year's political climate, that should give Democrats a big advantage. The party of Obama and Clinton has an opportunity to do something that Democrats haven't done in a long time: make a persuasive case that their party's philosophy of government best represents the interests of ordinary Americans. If the Democrats can make this point stick, they might even be able to turn a 2008 presidential victory into a new Democratic majority.
Wednesday, March 26, 2008
The Art of Being Found Out
I borrow the title from a fascinating article in the latest issue of the London Review of Books by Colm Tóibín on the way in which some figures, in literature or in real life, seemed determined, whether consciously or not, to out themselves, reveal their secrets, and, often, mire themselves in scandal and obloquy. This phenomenon seems to have crested in late 19th-century Britain, with Oscar Wilde and Dr. Jekyll perhaps the best-known examples of the factual and the fictional self-destructive self-exposer. Most of the article is devoted to an interesting consideration of Ford Madox Ford’s The Good Soldier, a novel Tóibín’s article convinced me I should add to my ever-surging “ought to read” list. As it is, the only thing I know about The Good Soldier is its famous opening line, “This is the saddest story that I have ever heard.” And this would be a fine description of New York State politics over the last month.
So tell me again, why is it that we are supposed to be more outraged about Eliot Spitzer’s whoring than David Paterson’s numerous marital infidelities, his creative financing of his Days Inn rendezvous, and his former girlfriends turning up in very high positions in the current administration? I will let the casuists out there parse the differences, but I suppose the real difference comes down to two words: Joe Bruno. Democrats will obviously fight to the death to prevent Joe Bruno from becoming governor, and unless a howitzer-sized smoking gun emerges, Paterson is in like Flynn until 2010. It is increasingly clear that Joe Bruno and his minions largely orchestrated the downfall of Spitzer—I hope the Democrats in Albany aren’t too intimidated by what happened in last year’s investigation of Bruno to look fearlessly into any Republican plotting of Spitzer’s political demise. Alexander Cockburn in the Nation, never one to pass up the opportunity to monger a conspiracy theory, suggests that Spitzer’s downfall, if greased by his own indiscretions, was perhaps planned as payback by Wall Street. I don’t know, but I would not at all be surprised if there is more to l’affaire Spitzer than the reported tracking of financial irregularity.
Paterson is proving to be what he probably is: a mediocre machine politician, a third-generation product of the last great Democratic political machine in New York City, J. Raymond Jones’s operation in Harlem, which spawned, among others, Charles Rangel, David Dinkins, Percy Sutton, and Basil Paterson (David’s dad). Like many machine politicians, he combined occasional worthwhile public service with getting along with the powerful and looking out for number one. This is why, I suppose, Paterson has been so successful in the state senate, which is filled with people just like him.
It is perhaps becoming clearer what we lost when we lost Eliot Spitzer. Spitzer had his faults, goodness knows, but he was a crusader, a reformer. Paterson, and I hope I have to eat my words, along with crow and humble pie, seems to be a hack. I guess the most mysterious question about Spitzer is why he proved to be such a lousy reformer. Given the state’s history of great reforming governors, among them TR, Al Smith, and FDR, Spitzer, even before his afternoon delights were exposed, seemed to be an utter failure.
Why? I’m not sure. I welcome further commentary on this. I would say that crusading governors need to make allies as well as enemies, need to have clear reform agendas, need to figure out ways to make the legislature approve the curtailing of its prerogatives by presenting a popular program it has no choice but to endorse, and need, to follow Machiavelli, to be either feared or loved, and probably a bit of both. Spitzer did none of these things. And he never learned, as Bruno has, to let underlings do the real dirty work and develop plausible deniability.
As time goes on, his personal tragedy might become our collective loss. New York State politics remains a fetid dung heap, and no one seems up to the task of cleansing the stables. The last guy who tried tripped on his muckrake.
Monday, March 24, 2008
Playing Games at the Library
The main lobby of the New York Public Library at 42nd Street and Fifth Avenue was buzzing Friday with kids playing video games. The library thinks it’s creating a new learning environment and bridging a generation gap. To me, as a teacher and lifelong library user, it looks like a road to nowhere.
The problem isn’t multimedia, the Web, or educational video games. At their best, they dramatically expand students’ intellectual experiences. And I’ve worked on Web-based learning projects and use them in my own classes.
The problem lies in the relentlessly commercial games offered at the event, “Game On @ The Library!” “Guitar Hero” and “Super Smash Brothers Brawl” don’t connect easily with the riches in the rest of the library. And they don’t encourage the habits of mind—close reading, independent analytic thinking, and strong writing--that students need to succeed in the classroom and in life.
The library staffers I spoke with didn’t share my concerns. This is only the beginning of a dialogue, they said. While the kids play the games they’re already learning strategy, calculation, and visual sense.
But where’s the bridge to the rest of the library? They assured me it will come.
I looked around the hall, though, and didn’t see anything to lure the kids into books, films, or the library’s vast digital holdings.
The library’s Web page promoting “Game On” says video games “make learning fun” and lower “the emotional stakes of failing.” They encourage kids to make their own discoveries. They’re an active experience. They overcome language barriers because you don’t have to speak English to participate.
Maybe, but good teachers already know how to make these things happen in a classroom or on the Web. As for serving library visitors who don’t speak English, the library already has a heroic history of lending books to immigrants in the many languages of our world.
The library, which is increasingly circulating video games throughout its system, is confusing the world of work and the world of play. Both have a place in learning, but they don’t always overlap. And making the connection between the two is difficult.
At Rutgers-Newark, where I teach many students who are immigrants or the first in their family to attend college, we devote a lot of reading, writing and class discussions to subjects such as the Supreme Court decision in the Pentagon Papers case. It’s hard, detailed work. But it’s worthwhile because we develop the ideas that help us live as citizens in a democracy. The talents you acquire racing cars in “Burn Out” don’t prepare you to do that.
The library staff might say that “Burn Out” is only a visitor’s first stop at the library, not his last. But the programmed experiences available yesterday don’t compare to open-ended explorations in a library.
In the Seventies, as a teenager in New Jersey, I spent days wandering in a local library. One day, I picked up a collection of newspaper columns by Pete Hamill; one piece was about the Abraham Lincoln Brigade, the Americans who fought in the Spanish Civil War. One thing led to another: I talked with my dad about men he knew from East New York who fought in Spain; I wrote my first college term paper on the Lincoln Brigade; I befriended one of the volunteers. Today I write on the Spanish Civil War and serve on the board of the Abraham Lincoln Brigade Archives. It all began with a trip to the library and it continues with visits to the library. I just can’t imagine anything comparable growing out of a session with “Excite Truck.”
Of course, as I have learned from my children, their friends, and my students, the enjoyment of video games is perfectly consistent with a life of literate learning. It’s just that one doesn’t automatically lead to the other. Despite the claims of the good people at the library, I fear that we are on the edge of dividing our young people into those who can read, write and play video games and those who can just play video games.
The real problem here is the library promoting games produced to entertain and make money for corporations. The library might have exhibited the best new digital learning exercises created by educators. It might have put the real effort of education in the foreground. It might have done the work of leading visitors from the games to the kind of learning that goes on in the rest of its collections. Instead, it turned its great hall into an ad for the likes of “College Hoops 2K7.”
The problem isn’t multimedia, the Web, or educational video games. At their best, they dramatically expand students’ intellectual experiences. And I’ve worked on Web-based learning projects and use them in my own classes.
The problem lies in the relentlessly commercial games offered at the event, “Game On @ The Library!” “Guitar Hero” and “Super Smash Brothers Brawl” don’t connect easily with the riches in the rest of the library. And they don’t encourage the habits of mind (close reading, independent analytic thinking, and strong writing) that students need to succeed in the classroom and in life.
The library staffers I spoke with didn’t share my concerns. This is only the beginning of a dialogue, they said. While the kids play the games they’re already learning strategy, calculation, and visual sense.
But where’s the bridge to the rest of the library? They assured me it will come.
I looked around the hall, though, and didn’t see anything to lure the kids into books, films, or the library’s vast digital holdings.
The library’s Web page promoting “Game On” says video games “make learning fun” and lower “the emotional stakes of failing.” They encourage kids to make their own discoveries. They’re an active experience. They overcome language barriers because you don’t have to speak English to participate.
Maybe, but good teachers already know how to make these things happen in a classroom or on the Web. As for serving library visitors who don’t speak English, the library already has a heroic history of lending books to immigrants in the many languages of our world.
The library, which is increasingly circulating video games throughout its system, is confusing the world of work and the world of play. Both have a place in learning, but they don’t always overlap. And making the connection between the two is difficult.
At Rutgers-Newark, where I teach many students who are immigrants or the first in their family to attend college, we devote a lot of reading, writing and class discussions to subjects such as the Supreme Court decision in the Pentagon Papers case. It’s hard, detailed work. But it’s worthwhile because we develop the ideas that help us live as citizens in a democracy. The talents you acquire racing cars in “Burn Out” don’t prepare you to do that.
The library staff might say that “Burn Out” is only a visitor’s first stop at the library, not his last. But the programmed experiences available yesterday don’t compare to open-ended explorations in a library.
In the Seventies, as a teenager in New Jersey, I spent days wandering in a local library. One day, I picked up a collection of newspaper columns by Pete Hamill; one piece was about the Abraham Lincoln Brigade, the Americans who fought in the Spanish Civil War. One thing led to another: I talked with my dad about men he knew from East New York who fought in Spain; I wrote my first college term paper on the Lincoln Brigade; I befriended one of the volunteers. Today I write on the Spanish Civil War and serve on the board of the Abraham Lincoln Brigade Archives. It all began with a trip to the library and it continues with visits to the library. I just can’t imagine anything comparable growing out of a session with “Excite Truck.”
Of course, as I have learned from my children, their friends, and my students, the enjoyment of video games is perfectly consistent with a life of literate learning. It’s just that one doesn’t automatically lead to the other. Despite the claims of the good people at the library, I fear that we are on the edge of dividing our young people into those who can read, write and play video games and those who can just play video games.
The real problem here is the library promoting games produced to entertain and make money for corporations. The library might have exhibited the best new digital learning exercises created by educators. It might have put the real effort of education in the foreground. It might have done the work of leading visitors from the games to the kind of learning that goes on in the rest of its collections. Instead, it turned its great hall into an ad for the likes of “College Hoops 2K7.”
Obama and Supersessionism
Scooped! Yale University law professor Akhil Reed Amar in Slate today put forward an idea that I have been thinking about but never got around to posting on: the ideal way to solve the current Democratic dilemma over who should head the ticket, without utterly pissing off the partisans of the other candidate, would be to alternate. Let either Hillary or Obama get the presidential nomination, with a public agreement beforehand that halfway into the term the president would resign, the VP would become president, and then promptly name the other to the vacated VP slot. (Although, as befits a law professor, Amar spun out the implications of this at greater length than I had considered.)
It seems unlikely that this idea will go anywhere. (Perhaps a bit more likely would be an agreement for the winning candidate to name the losing candidate to the first opening on the US Supreme Court—by the way, in case any Democratic party leaders read this and are thinking of doing me a solid, I would much rather be a Supreme Court justice than vice president.)
I suppose the idea of a co-presidency won’t go very far, but it is of course the very stratagem the oldest republic of them all, the one in Rome, used to avoid the problems of monarchy, by having two consuls elected annually to guide the affairs of Rome. Having two people share supreme power is a most effective way to deflate the charismatic aura that surrounds leadership, which tends to make those in power want to hold onto their position as long as possible. The consul system worked pretty well from 509 BCE to about 100 BCE, when the pressure of Roman imperialism made the system unwieldy, and eventually Pompey and Julius Caesar came along to exploit its limitations. In any event, we have become so supremely hierarchical in our views of power and human nature that most of us assume any true power-sharing arrangement is doomed to failure, though I don’t know of any study (not that I have looked) that shows that executive power sharing is inherently more unworkable than the single executive model. Whatever one thinks of the Roman Republic, they were not a bunch of starry-eyed utopians, or lacking in the requisite executive toughness to make hard decisions.
But the main reason why a co-presidency would not work, in the current situation, is that one of the candidates, Barack Obama, has ascended to the status of front runner on the basis of a charismatic afflatus that would have made Jeremiah or Isaiah envious, and the routinization of that charisma in a co-presidency would not satisfy his many followers.
Holy Week having been recently concluded (and a happy Easter Monday to all, especially those of you milling about outside the GPO on O’Connell Street), I was reading an interesting book about Jesus, and a word I picked up, a new one to me, is supersessionism, the act of superseding, which is what Jesus claimed to do (or rather what later followers of Jesus claimed that he had done): superseding God’s covenant with Israel with a new dispensation, based on his expiation.
Anyway, it occurred to me that Obama is one of the most supersessionist presidential candidates in recent American history. His whole aura bespeaks the creation of a new era, the turning of a page, the belief that we are making a rendezvous with a much-needed transformation. Supersession is not abandoning the past, but fulfilling it, as Christians believe the New Testament fulfills the promise of the Hebrew Bible. He is sometimes compared, unfairly (even by himself), to Reagan, which misses the point. By the time Reagan became president, the “conservative revolution” had been gathering for over a decade, and Reagan just put it into practice. Obama’s transformation is far more inchoate (he is not the tribune of an obvious social or political movement, but the beneficiary of a mood, the conviction that things have gone down a disastrous path and need changing).
Obama’s wonderful speech on race represents his supersessionism at its purest. Jeremiah Wright was a good man, but labored under the old dispensation, one that believed race and racism were the fundamental principles around which American society is ordered. Have things changed? In some ways they have, in some ways they haven’t, but the promise of his candidacy is precisely that if you believe in it, the new era moves that much closer, and the promise of overcoming racial divisions has moved that much further along. Oh, I suppose I am still an Old Testament man myself, and I remain unconvinced that we have made as much racial progress as some of us seem to think. Too many people in recent years have proclaimed too easily and too facilely that we have turned a corner on race relations for me to be easily convinced. But the promise of Obama is that he can finally fulfill the promise of true racial equality in this country. Only time will tell if his supersessionism will bear the fruit it seems to promise.
Friday, March 21, 2008
Make Mine a Spitzer
The great charm of Eisenberg's Sandwich Shop on Fifth Avenue at 22nd Street is its artful blend of old and new. The decor doesn't seem to have changed since the place opened in 1929, but the new sandwich, advertised on a sidewalk billboard, is "The Spitzer"--fresh hot tongue on rye.
Owner Josh Konecky invented the sandwich soon after the story broke; he says it was a way to get tongue on the menu.
I stopped by Eisenberg's for lunch today and walked to the far end of the counter. There, I sat down on a stool and ordered tuna on rye with lettuce and a lime rickey. Eisenberg's is one of the few places left in New York where the staff don't get confused when you order an old standard like a lime rickey. Or an egg cream.
But one of the best things about my lunch was the man who waited on me. Severiano-- eight years working at Eisenberg's, a whirlwind of energy switching back and forth from English to Spanish, took my order and shouted to another counterman, "tuna, lettuce, whiskey down."
Eisenberg's counter may have the comfortable look of the past, but don't let that fool you. "The Spitzer" is a reminder that the smart, tart New York sense of humor endures. And Severiano's easy use of "whiskey down" for toasted rye bread proves that lunch counter slang has reached a new generation of working New Yorkers.
Monday, March 17, 2008
A Bearless Market
First, I promise to spell, in this post and every subsequent post in which he is mentioned, David Paterson’s name correctly, with one ‘t’ and not two. In his excellent swearing-in speech today, Paterson made reference to the remarkable collapse of Bear Stearns, which was acquired by J.P. Morgan for a mere $2 a share, a tenth of what it was trading for on Friday. J.P. Morgan, reprising its role as a white knight, a role it has played on Broadway since the 1890s, has once again come to the rescue of a foundering investment firm. There are many aspects of the deal that remain uncertain, and I do not claim to fully understand the complex and intricate financial arrangements of the past week. What happens to the Fed’s guarantee to vouch for up to $30 billion of Bear Stearns’s troubled securities? What happens if more dominoes start to fall? One reason the price for Bear Stearns was so low was that no one knows what sort of dangers lurk for the acquirer of its debt; J.P. Morgan has reportedly put aside $6 billion to cover lawsuits and losses from its new acquisition.
The role of the Fed in all of this remains somewhat mysterious to me. Whatever Bear Stearns is in the hermaphrodite financial culture that has existed since the end of Glass-Steagall, it is not a commercial bank, the supervision of which was the primary and original responsibility of the Fed. But since large investment firms have taken over the former role played by commercial banks (the very name JPMorgan Chase is an indication of the change in corporate culture), who else is the Fed supposed to rescue if the economy starts to falter?
I would second the comment of a commentator who pointed out that one of the chief abilities of a government is to declare a crisis, and then argue that the exigencies of the crisis require that normal legal protections be relaxed or entirely obviated, in the interest of addressing the matter at hand. We have seen the Bush administration do this, in spades, after 9/11 and in the run-up to and aftermath of the invasion of Iraq, when anyone who questioned the legality of various acts, from warrantless wiretaps to torture to the legal rationale of the invasion itself, was derided as a trifler, speaking of legal piffle while Rome burned. In a modern state, the bureaucracy almost always trumps the legal system, which at best is reduced to the role of playing perpetual catch-up. In recent months, the Fed has been declaring a crisis, and then trying to figure out how to assume the powers appropriate to crisis management.
Now, I am not saying there is not a crisis in the financial markets, but the very act of declaring a crisis has loosened an already gossamer-like regulatory apparatus, giving the Fed the right, so it seems, to intervene in financial markets in almost any way of its choosing. The Fed’s actions in recent weeks have had an improvisatory character, trying one thing after another, to apparently little avail. I am ever more convinced that there will be no calming of the financial markets until there is a comprehensive rethinking of their regulatory framework, and this will not happen until Democrats place the question of financial regulation front and center in their platform.
Sunday, March 16, 2008
Lessons from the Spitzer Case, Especially for Journalists
The surveillance regimen that opened the door to Eliot Spitzer's downfall, and federal officials' release of salacious details in an affidavit delivered to the news media early in the case, are reminders of the distance traveled in American government and journalism since the Clinton impeachment and the aftermath of 9/11. Together, they pose interesting questions for Spitzer, the government, and the press.
The astonishing question about Spitzer, a man who presumably knows the law even if he has trouble obeying it, is why he chose such a risky way of booking and paying for a prostitute, even though surveillance of financial transactions has tightened since 9/11. Spitzer's actions were bound to leave a trail. Indeed, his actions were so incriminating that people have been asking whether he somehow wanted to get caught.
I'll leave speculation about Spitzer's desire to be apprehended to psychologists. But just as he should have anticipated getting caught, he should have foreseen his humiliation. The initial details--such as client number nine's aversion to using condoms--were astonishing and steamy. But the affidavit, as former New Jersey attorney general John Farmer pointed out in a good Times op-ed, is part of a growing trend in prosecutorial strategy. For me, the released information carried echoes of the Starr Report released during the Clinton impeachment proceedings. In both cases, the prurient details were humiliating, beyond the bounds of legal necessity, and almost certainly included to put the defendant on the defensive in an ugly situation.
The key political difference between the situations of Clinton and Spitzer is that the president still commanded significant loyalty in the Democratic Party and the voters at large. Spitzer, whose term as governor before the scandal amounted to one waste of political capital after another, had very few friends to stand with him when the bad news erupted.
Spitzer's speedy acknowledgment of his wrongdoing validated the evidence against him. But there are elements of this case that trouble me, especially when it comes to reporters.
Prosecutors don't always get things right. It is easy to imagine how, on another day, in another case, they might release lurid and damaging details about a suspect who turns out to be innocent. Reporters making news out of such information need to think hard about what to report and when they are being used. The next target might not be as deserving as Eliot Spitzer.
Friday, March 14, 2008
To the Top of the Greasy Pole, the Hard Way (Forgive the Double Entendres)
One of the most significant American contributions to constitutionalism was the creation of the vice presidency, a governmental position with almost no assigned duties, whose main function is simply to stand and wait, next in line to the chief executive, ready to step up should the president be unable to continue to serve. As far as I know, no other country had a similar institution before 1787. Monarchies, with a firm order of succession, do not require it, and neither do parliamentary democracies, which elect governments, not individuals; the ruling party can address a crisis with the office and person of prime minister simply by means of a party caucus. The US Constitution had no such option, and to avoid circumscribing the role of the president, left the position of vice president a nullity, without any clear responsibilities.
State governments had similar problems, and created, to serve with governors, the empty position of lieutenant-governor, its title a reminder of the military origins of the position of governor. If anything, the position of lieutenant-governor in New York State has been even more of a dead end than the federal vice presidency. Over the last half century, the vice presidency has become more significant, with numerous vice presidents either becoming president (Nixon, Johnson, Ford, George H. W. Bush), running for president (Hubert Humphrey, Walter Mondale), or being recognized as substantial power brokers while in office (Al Gore, Dick Cheney). Nothing similar has happened for New York State’s lieutenant-governor, and most of the recent occupants of the office have been mired in near total obscurity or had little or no influence on state policy (George DeLuca, Mary Anne Krupsak, Betsy McCaughey Ross, Alfred DelBello, Stan Lundine, Mary Donohue). The two exceptions are Malcolm Wilson, Rockefeller’s longtime lieutenant-governor, who succeeded him in 1973, and Mario Cuomo. But there is little indication that the position has undergone an upgrade, or that lieutenant-governors have been tasked with greater responsibility.
Much of the same-old seemed to be happening with David Paterson, who had been almost completely out of the news since taking office last January. It was never quite clear why he agreed to take the position. Perhaps he was tired of the powerlessness that comes with being minority leader in a body in which the minority is granted few if any rights or privileges, or perhaps, as some have speculated, he was gambling that Hillary Clinton would be elected president and that Spitzer would name him to the Senate. If that was the gamble, it paid off, somewhat differently but certainly more spectacularly than Paterson ever could have imagined.
It is interesting to speculate what might have happened if any of our recent governors had gone through as spectacular a crack-up as befell Eliot Spitzer. We might, very plausibly, have had the first woman governor of New York State. Or, since the position of lieutenant-governor is often used to balance the upstate/downstate divide, we might have had the first true upstate governor since 1922, with a governor from Amsterdam [Krupsak], Jamestown [Lundine], or Troy [Donohue]. (This is assuming that Westchester, even the northern Westchester city of Peekskill, really ought to be considered downstate.) But Spitzer broke with the recent pattern of female and upstate lieutenant-governors. Instead we have the first African American governor in the state’s history, and, by the luck of the draw, the best-prepared governor, in terms of prior legislative experience in state government, since Al Smith. All New Yorkers wish David Paterson the best.
Thursday, March 13, 2008
Say it Ain't So, Eliot
Well, I go on vacation for a week, I return home, and New York State has a new governor! I have a lot of blogging to do on the Spitzer crack-up, and I will be writing about this for several days at least, but I will spread out my posts so as to avoid an opining glut.
I guess a standard response to the foibles of humanity and our innate penchant for corruption is to quote Capt. Louis Renault from Casablanca: “I’m shocked, shocked to find that gambling is going on in here!” But sometimes, something sharp enough can cut through our habitual cynicism, and rather than being “shocked, shocked” we are simply left dumbfounded, and just plain shocked, silent upon a peak in Darien, contemplating the human capacity for self-destructive behavior.
Count me as one of those who was shocked by the revelations that Spitzer had paid numerous visits to a high-priced call-girl service, and one of those who felt immediately that he had no choice but to resign. (And if you ask me why I spent a year arguing that Bill Clinton ought not to resign in the face of similar sexual indiscretions--though obviously the fact that Clinton did not use prostitutes makes a difference, I suppose, in terms of the prima facie legality of the acts--I can’t really tell you. This seems worse, and Spitzer rightly resigned.) By the way, the Spitzer/Clinton comparison makes me suspicious of the oft-made statement in the last few days that it was Spitzer’s shuddering plummet from the heights of moral rectitude that made his resignation inevitable. It seems to me the general consensus, from his rise to prominence, was that Bill Clinton was a poontang hound of rare assiduousness, and certainly after the Monica revelations few (certainly no Republicans) made the argument that no one should care if he added a few more entries to his extensive catalogue; sex scandals envelop and destroy the upright and the dissolute alike, even Don Juan, who, as Mozart tells us, had sexual relations with 1,003 women in Spain alone, before complications with the 1,004th led to his being dragged to hell.
But sex scandals can make armchair psychologists of us all, and I would like to resist the temptation. What drove Spitzer to his acts, I do not know, but if he is interested I know a few good therapists in the city he might want to talk to. I found myself in sympathy with Gail Collins’s column in the Times this morning, speculating on how well we know any politician. I know all too well, from recent tragedies in my family, that you can discover, the hard way, that people you thought you knew well you really didn’t know at all in some fundamental way. The 10% you don’t know can, in the right circumstances, destroy the 90% you do. I believe there is a death instinct, and an instinct for self-humiliation and degradation, and when Spitzer’s dark side spoke, Spitzer answered the call.
I guess the ultimate unknowability of human personality is a particular problem in a political system that tends to devalue ideological issues in favor of presenting “character,” “personality,” and “leadership” as the key traits to evaluate in potential officeholders. Beyond the reality that political personality and character have become a marketable commodity, one that can be manipulated and advertised like any other commodity, and can be utterly falsified, like our current president’s affability or leadership ability, there is the question of whether anyone can ever really know anyone else’s personality well enough to trust them as much as we need to trust our elected officials. Currently, Hillary Clinton is running an ad asking whom you would rather trust at 3 o’clock in the morning to make crucial split-second decisions, arguing that it should be her. Beyond the fear-mongering this ad invokes, I think it is profoundly misguided. I don’t trust anyone to make wise decisions at 3 o’clock in the morning, and sometimes we experience our “dark nights of the soul” in broad daylight, when, as Client 9, we go to Room 871 of the Mayflower Hotel in Washington. More to follow.
Tuesday, March 11, 2008
Governor Spitzer's Sad and Unusual Scandal
The scandal that engulfs Governor Eliot Spitzer is not only shocking in its revelations of corruption, it is almost without precedent in twentieth-century New York, according to Gerald Benjamin, political scientist and dean of liberal arts and sciences at SUNY-New Paltz.
Only one New York governor has been impeached and removed from office: William Sulzer, a Tammany Democrat who had some of his fellow braves investigated for corruption. Sulzer was subsequently impeached by the state assembly at the urging of Tammany leader Charles Murphy.
Particularly damning for Sulzer was the charge that he had filed false campaign records: he failed to report $60,000 in contributions and spent $40,000 of this sum playing the stock market. Sulzer was convicted in 1913 and removed from office, but the air of Tammany vindictiveness that surrounded the case diminished the odium against Sulzer. He was elected to the state assembly a year later on the Progressive Party ticket, but never won higher office and was done with active politics by 1916.
Since the nineteenth century, public officeholders in New York have resigned under pressure, but the Spitzer case is unusual both for involving the governorship and for the distinctive qualities of this governor's scandal. There is a jarring contrast between the "sleazy" quality of Spitzer's actions and his "rhetoric of rectitude," said Benjamin.
The New York City tabloids are already having a field day with the story: the headline in the Post is "Ho No!" The Daily News led with "Pay for Luv Guv." Thus do both papers get to indulge the delicious tabloid vice of expressing shock and outrage about sex in order to, among other things, write about sex. But however it works out there is something larger at stake in this story, Benjamin says.
"It's sad because New York had a moment to rise above mediocrity," said Benjamin. At 63, he can't conceive of a comparable opportunity to address the structural issues that distort New York State politics. His greatest concern is not for the "groupies" who can be found around many politicians, but for the people who saw the advent of the Spitzer administration as a moment to set aside solid careers and work "to help New York achieve its potential greatness."
Whatever the specific consequences for Spitzer and the state, Benjamin sees the scandal producing the kind of cynicism that undermines the legitimacy of the political system. "It's the unmeasurable consequences that are most troubling," he concluded.
Wednesday, March 5, 2008
New Histories of the Recent Past
The changes in postwar New York City have been so swift that it is possible to overlook the spirited activism that surrounded some of the most contentious issues in the city. In a forum tonight at the CUNY Graduate Center, "Recovering Community History: Puerto Ricans and African Americans in Postwar New York City," three speakers described projects that will recover important facets of the history of this period, from Puerto Rican rights to school integration to urban renewal.
Lillian Jimenez screened scenes from her soon-to-be-finished film, Antonia Pantoja: Presente! Pantoja was an activist, educator, founder of ASPIRA, and recipient of the Presidential Medal of Freedom. Her life, vividly recalled in Jimenez' film, is a reminder of the Puerto Rican activism that was a transforming presence in New York City in the Sixties and Seventies.
Craig Wilder, a historian at Dartmouth, described his current book project, The High. The book will be a history of twentieth century Brooklyn through the lens of Boys' and Girls' High--a school that has known excellence, integration, and segregation over the course of its history.
Marci Reaven, a historian and director of Place Matters, spoke on two efforts that grew out of Place Matters' survey of significant places in New York: a multi-year effort to document the Latin music scene in New York, culminating in the film From Mambo to Hip-Hop and many public history projects, and an exploration of the community activism that began against a Robert Moses redevelopment plan for Cooper Square. The movement not only defeated Moses, but continues to shape the neighborhood around Cooper Square today. Reaven's doctoral dissertation in progress at New York University, "Citizen Participation in City Planning, 1945-1975," explores the Cooper Square opposition and related themes.
All these inspiring stories raise a difficult question: why isn't there more spirited activism in the city today, especially over the levels of inequality and displacement that are remaking our neighborhoods? It is too early to provide a definitive answer to this question, but these three valuable projects--presented under the auspices of the American Social History Project, the Center for Media and Learning, and the Gotham Center--all open valuable windows on a recent past that is very different from our city's present.
Monday, March 3, 2008
Not Again!
"Barack Obama yesterday lashed out at political enemies who are spreading false rumors that he is a closet Muslim," the lead story in today's New York Post announced. The headlines read "'O' My God" and "Hounded Obama: 'I Pray to Jesus." Inside the paper, beneath the infamous picture of Obama in a turban, is a paragraph that includes the sentence, "Obama was never a Muslim and has been a member of the same Christian church for the last 20 years." Thus does the Post perpetuate the smearing of Obama, provide a veneer of balance for the story, and shrink from the only honorable answer to questions about Obama's religion: So what?
Despite the phrase "false rumors" in the lead, the Post story and photograph keep the smear campaign alive. If the allegation that Obama is a Muslim is false, it simply doesn't deserve the front-page play it gets in the Post. But the Post, I suspect, enjoys having it both ways: expressing outrage about the smear, but then publishing the photo and tormented quotes that keep the story alive. The Post, to paraphrase George Orwell, sells its soul in headlines and photos and buys it back in the news columns.
Ultimately, as Naomi Klein observes in The Nation, the problem is treating Islam as a religion that is somehow suspect for anyone who would run for the presidency. Obama's response that he is a Christian, while factually correct, doesn't do enough to slap down the basic premises of this religious slur.
The attacks on Obama, she observes, recall the attacks on the Polish presidential candidate Aleksander Kwasniewski on the grounds that he was really Jewish; Daniel Singer, the Nation correspondent, called the problem what it was. Reports Klein: "What perturbed me," Singer wryly observed, "was that Kwasniewski's lawyers threatened to sue for slander rather than press for an indictment under the law condemning racist propaganda."
I'm inclined to judge Obama less harshly than the Post on this one, but the core issue remains the same: Islam should not be demonized as a religion unfit for a presidential candidate.
Saturday, March 1, 2008
A Sorry State for Newsday
The announcement that Newsday will cut about 120 jobs is bad news for journalism in general and Long Island in particular. Given all of the bad news that has come out of the paper's headquarters in Melville, LI in recent years, it has become easy to forget that Newsday was once a moneymaker and a great regional paper with an international reach.
I can't claim neutrality on this one. I worked at Newsday in the mid-1980s, as did my wife. But long before I labored there, I had admired its intelligent reporting and its economic clout.
At its best, in its Long Island and New York editions, Newsday was much better than even a good regional newspaper. For a while, its publisher was Bill Moyers. My old boss at Newsday, Bernie Bookbinder, created a sociological investigative team that did in-depth reporting on issues like race and politics. Its roster of columnists in the Eighties--which included Murray Kempton, Pete Hamill, Jonathan Schell, Les Payne and Sydney Schanberg--was hard to beat. And its local coverage was solid, a trait the paper brought from Long Island to its New York edition that invigorated coverage of the city's neighborhoods.
As a business, the paper was profitable in its heyday. In the late 1970s, fresh out of college, I interviewed for a job with a smart bunch of guys who were planning to open an alternative weekly newspaper on Long Island. They backed off from starting the paper, they later explained, because Newsday was just too big and powerful to compete with on Long Island.
Of course, some of the paper's wounds were self-inflicted. As a near-monopoly on Long Island, the paper could be complacent.
But ultimately Newsday's failings were not in its journalism, but in its business side. Mark Willes, the foolish publisher who killed off New York Newsday, never had the brains to match the paper's editorial strengths with a good business plan. And the circulation scandal that recently rocked the paper cannot be blamed on sloppy reporting or editing.
With the new budget cuts, Newsday will become an almost unrecognizable version of the regional paper that once published columnists of national stature, ran a Washington, DC bureau and maintained a respectable roster of international bureaus. In the new age of the Web, some will say, that isn't a bad deal: let Newsday cover local news and let the Times and the BBC tell us about the rest of the world.
There are three problems with this approach.
One, the Times is suffering cutbacks of its own. And the BBC and Associated Press can't bear the burden of international reporting all by themselves.
Two, as traditional Anglo-Saxon thinking on the press goes, the best test of a truth is its ability to get itself accepted in the marketplace of ideas. But if the number of competing voices gets smaller in journalism, the competition weakens. And that makes it easier for bad ideas to prevail.
Three, in an increasingly global world there is a need to be local without being parochial. The Newsday that I knew was plugged into the rest of the world. It kept Long Islanders aware of how their lives fit into the big picture and saved them from being narrowly absorbed in their own backyards.
But the Newsday that will appear after these latest budget cuts will have trouble staying smart and relevant on Long Island. In an age when events on the other side of the world can reverberate in our own backyard, that's a bad development. And a sorry turning point for a once-great newspaper.