The big talk in Albany this week, at least in the part of Albany that I know best, the Cultural Education Center, home of the NYS Archives, Library, and Museum, is not the impending state primary, the fortunes of Eliot Spitzer, or the prospective fortunes of the Giants in the Super Bowl, but the terrible tale of Dan Lorello, a long-time employee of the NYS Archives, who has admitted to stealing and selling at least 400 items from the Library and the Archives, among them signed documents from founding fathers and rare copies of valuable books. (He was caught by an Argus-eyed, self-described "history buff" who noticed that some John C. Calhoun documents for sale on eBay belonged, according to the extant records, to the NYS Archives. He contacted the Albany police, the matter was quickly exposed, and Dan Lorello confessed to his disgraceful crimes.)
When I was working on the Encyclopedia of NYS I met Dan Lorello a few times, though we barely knew each other. We asked him if he was interested in writing about NYS Civil War regiments for the Encyclopedia, and he turned us down cold. (Lorello had the reputation of being the most knowledgeable expert anywhere on NYS’s Civil War military history.) I suppose we weren’t paying enough money, and needless to say, I am now glad he didn't write for us. One person I spoke with this week said that when he was a researcher in the archives he found Lorello curt and unhelpful, and went out of his way to avoid working with him, but this seems to be a minority opinion. Let me give the last word to a friend of mine who works extensively in the archives, who sent me an email about it today, and who thinks that what is publicly known about Lorello’s pilferings might be the tip of the iceberg:
“I would equate the atmosphere in the library and archives to that of a funeral. I knew Dan very well, and like you always hear, he was "the nicest guy on the face of the earth. You would never suspect him." The problem for me is that his deposition stated he stole about 400 documents in 2007 - if you believe him. I do not. I bet he took much more. He claimed he went in on weekends and state holidays to get items! I bet he stole more, and to do that since 2002 - there is no way to calculate what or how much he stole. Several Albany maps are still missing, and Simeon DeWitt [surveyor-general of NYS in the early decades of the 19th century] items, which I needed about a year ago, are still unfound. The eBay part is the only traceable way to find out what he stole, but the buzz is that he went to trade shows and antiquarian book and ephemera fairs - so time will tell. Because of the non-eBay items... who knows how much he took. He admitted to trading privately, but what he took for cash will never be known. How about the Bozo attorney who bought items for cash from Lorello in the parking lot outside the museum? Idiot!
I do feel the loss is immeasurable - and to me it feels worse than the 1911 fire, because it was intentional. [The 1911 fire in the State Capitol Building, then the location of the State Library, resulted in the loss of some 275,000 documents, one of the worst library fires in American history.] The value of what he stole cannot be reclaimed; the loss is permanent. We lost half of the Rev War documents in the fire and this Bozo goes and sells what we have left! Nice!”
Thursday, January 31, 2008
Wednesday, January 30, 2008
Giuliani Missed His Googoos
Rudy Giuliani's failure in the Republican primaries, as an excellent piece in today's Times by Michael Powell and Michael Cooper points out, was the product of bad strategy and the lingering odor of Bernard Kerik. The ex-mayor's recent and decisive loss in Florida was also due to the absence in that state of a sufficient number of New York good government liberals, a critical element in Giuliani's political success in Gotham.
New York liberalism is anything but a monolith. Going back to the Thirties, it has always been an unstable amalgam.
One important tendency in New York liberalism when Giuliani ran successfully for mayor in 1993, and again in 1997, was the presence of voters who might be socially liberal but who like a clean and efficient city government. Philosophically, such voters trace their history to anti-Tammany good government activists--the googoos, to use political slang.
Giuliani had a certain appeal for such voters--who can be found, in New York, in the Democratic and Republican parties. It wasn't just his relative liberalism on social issues, I think, but his reputation for running an orderly government.
History may not be kind to Giuliani's reputation for executive expertise. Reports about Kerik have undermined faith in Giuliani's judgment, and there may be more to come in Kerik's trial.
At times, in his primary speeches, Giuliani seemed to mock the very New Yorkers who were part of his municipal base of support: socially liberal good government voters. He shouldn't have. The breadth of his appeal--to moderates and even some liberals--was part of his success in New York. That couldn't be repeated in the conservative atmosphere of the GOP primaries.
Tuesday, January 29, 2008
75th Anniversaries
The New York Times has an article today on the 75th anniversary of Hitler’s ascension to power, and on how Nazism and the Holocaust remain at the center of German historical consciousness, though the article oddly wonders why “Germany seems unendingly obsessed with Nazism.” Is there some other event in German history that Germans should be obsessed with? The death of Frederick Barbarossa or the Diet of Worms? I suppose a country so genuinely concerned with remembering its mistakes, as a way of measuring how far it has come, is unusual. Certainly the unambiguous way in which Germans have acknowledged the very worst in their national history is a standard against which other nations should be measured.
But this is a topic for another post. I am reminded that this is also the 75th anniversary of the Times' coverage of Hitler’s ascension to power. What is striking about it is their concern to minimize its significance. They did give the event a banner headline, but seemed determined to take away with one hand what they gave with the other: “Hitler Made Chancellor of Germany But Coalition Cabinet Limits Power.” This line of interpretation was continued in the notorious lead article by the Berlin correspondent, Guido Enderis, which starts accurately enough, “Adolf Hitler, leader of the National Socialist Party, today was appointed Chancellor of Germany,” but contains, in the third paragraph, the single worst prediction and bit of news analysis in the august 150-year-plus history of the New York Times: “the composition of the cabinet leaves Herr Hitler no scope for gratification of any dictatorial ambition.”
Perhaps there were political reasons behind this coverage, or just myopic reporting, but I suspect the Times was doing what it has always tended to do: turn the emotional register down a notch, and try not to be swayed by events that seem dramatic, which often turn out to be, in Daniel Boorstin’s term, “pseudo-events,” hyped to the hilt while the real story is actually somewhere else.
The Times got this spectacularly wrong, but I think skepticism in the face of seemingly transformative events is generally justified. One cannot but remember the “everything is changed” mantra that circulated so widely after 9/11, referring to the new threat from Islamic terrorism, or a new sense of national unity, or the overcoming of the debilitating effects of irony, or something along these lines. These stories were over-hyped, and served to cloak the real news that was emerging in post-9/11 America: the stealthy planning of an aggressive war by the Bush administration.
But sometimes everything does change, as when Yeats wrote of the 1916 Irish rebellion that “all changed, changed utterly” (though his famous line about a “terrible beauty” being born, in reference to the IRA, strikes me as a horrible aestheticizing of violence). And in fairness to the New York Times, an appropriate headline for Hitler’s ascension to power would have been impossible to write on January 31, 1933. An accurate headline would have had to read something like “Hitler Made Chancellor of Germany; The Single Most Catastrophic Event in the History of the World; All is Changed, Changed Utterly.”
Sunday, January 27, 2008
Newly-Recovered Capa Negatives in New York City
The great international war photographer Robert Capa was in and out of New York City from World War II until his death; his aura lingers at the Magnum photo agency that he helped to found and at the International Center of Photography (ICP). Now, according to the New York Times, thousands of Capa negatives from the Spanish Civil War--long thought to have been destroyed in Paris during World War II--have been delivered to the ICP. There, they can be studied to better understand his career--and perhaps to clear up the lingering controversy around one of his greatest pictures.
As Randy Kennedy describes it in the Times, the story of the negatives is rich in the kind of mystery and drama that surrounded Capa's life. The photographer assumed they were destroyed in Paris after he left the city in 1939; in fact, they were sent to Marseilles and then to Mexico, where they remained for years until they were recently shipped to ICP.
There's no guarantee what we'll learn from the negatives, but perhaps they'll settle claims that Capa's famous "Falling Soldier" photograph of a Loyalist soldier at the moment of his death in the Spanish Civil War was staged. (Richard Whelan, the Capa biographer, makes a good case that the photograph is legitimate.)
Still, more research can't hurt. And there's also the possibility that the negatives will tell us more about the work of Gerda Taro, Capa's companion and fellow photographer who died covering the war in Spain. Both were recently the subjects of exhibits at ICP.
Here's hoping that study of these negatives leads research in interesting directions--and that we get to see prints of them exhibited in public before too long.
Saturday, January 26, 2008
More on Women in New York Politics
The absence of a "deep bench" of women in New York City politics since the Seventies, as noted in this week's earlier post and TAP exchange, is central to Gotham's politics today. There are many possible explanations, but I'd offer three: the decline of the anti-war movement, the decline of the civil rights movement, and the absence of a strong political party structure that could move women up the political ladder to run for high office. The roots of all three can be traced to the Seventies.
The women who were a distinguished presence in city politics during the Seventies and Eighties--Bella Abzug, Elizabeth Holtzman, Shirley Chisholm, Geraldine Ferraro, Ruth Messinger--were all different politicians. Each followed her own route into the pursuit of elected office. Each had her own kind of identification with feminism.
Still, it is striking to see how the civil rights movement and anti-war movement invigorated the reform Democratic circles from which Abzug, Messinger, Holtzman and Chisholm drew support.
But the end of the Vietnam War brought an end to the anti-war mobilization that had inspired an early generation of reform Democrats. And the election of Ed Koch in 1977--and the relatively conservative tone of city government after that, at least in comparison to the Lindsay years--tamped down the city's civil rights movement.
Not until the late Eighties and the Jackson presidential primary run, and the Dinkins administration, would civil rights activists be close to the center of city politics and power.
But perhaps most important is a factor that nags all reform movements in city politics: the lack of an enduring political structure that can maintain a presence in the city, provide a home for defeated candidates, and groom new candidates. Without that, no political movement in the city lasts long.
This problem is not confined to women alone. The highly popular Fiorello LaGuardia had no real successor because he was a liberal, even radical, Republican, who governed without the benefit of a strong party organization beneath him to lift up a successor.
The anti-party impulse among New York reformers--an inheritance from the days of anti-Tammany insurgencies--is today compounded by a weakening of the significance of parties in New York City politics. The decay of the Liberal Party as a viable party also contributes to this problem. Democrats and Republicans alike operate in a political system that encourages freelance candidates far more than city politics did fifty years ago.
Mayor Michael Bloomberg, for all his fulminations against political parties, will confront the same problem if he wants to see a candidate of his stripe run for city hall. Unless there's another billionaire out there to run and claim the Bloomberg mantle, the mayor will be hard-pressed to point to any coherent way to perpetuate his vision in city hall.
At that point, the anti-party mayor will be undone by his own lack of a party. More on that another day.
Friday, January 25, 2008
Thoughts about Heath Ledger
Very recently, my brother died, by his own hand, in his apartment in Lower Manhattan. Along with a profound and deep sense of grief, such as I have never experienced before, and never expect to experience in the future (my own death will be a piece of cake after this), I have learned a great deal about suicide.
It was with a sense of grim familiarity that I have read this past week of the death, in circumstances that are still unexplained, of the actor Heath Ledger, in an apartment in Lower Manhattan. (Let me extend my deepest sympathy and empathy to his family and friends. Beyond everything else, he was one of the finest actors of his generation, and his performance in Brokeback Mountain will be watched and admired as long as people go to the movies.)
Within hours the Times (which presumably was more discreet than other news sources) was describing how the body was found (naked, face down on the bed, surrounded by empty pill bottles). Within a day there were excerpts from the medical examiner's report in news accounts, with all sorts of lurid questions about what he was doing and with whom on his last day on earth. I soon turned my attention to subjects in the news that were less upsetting to me, like the war in Iraq and the coming global recession.
When people who had no reason or right to know asked me about the circumstances of my brother’s death (that is, how he “did it”), I told them, politely, that it was none of their business. But of course my brother wasn’t famous. If he had been, all the circumstances of his death, idle speculation about its causes, and quotations from anonymous sources on whether there was a suicide note and what it might say would have appeared in the papers.
Why do we not give those who die from suicide the same dignity and privacy we afford other deaths, both for the sake of the victim and for the victim’s survivors? We manage this even for famous people whose deaths are newsworthy. When, for instance, Ronald Reagan was dying of Alzheimer’s and related diseases, the coverage was extremely discreet--no articles about his incontinence or drooling, no news reports on his condition or on what he looked like on his death bed--and this of course is how it should be.
We still view suicide as a form of crime, and as an occasion for public and journalistic prurience. Very often, as in the case of my brother, the first people the immediate survivors speak to are not medical personnel or grief counselors but police officers. (Immediately after my brother’s body was discovered, my surviving brother and I, the family members on the scene, were interviewed by NYPD detectives, separately, like the opening segment of a Law and Order episode.) Suicide victims are no threat to anyone but themselves, and suicide should be treated like any other form of death, like cancer or heart disease. All that is important is that Heath Ledger has died, all too soon, at the age of 28. The rest should be private information, known only to his immediate survivors. I do not see why the elaborate hierarchy of medical privacy that we have erected, epitomized by the recent and extremely restrictive HIPAA act, should be ignored in cases of suicide. We need to treat the victims of suicide with the same respect and dignity we afford any other cause of death.
Thursday, January 24, 2008
Ode to "Greater New York"
(Wherein and whereby I prove to myself that constructing the little stanzas of doggerel that poetaster Calvin Trillin provides in each issue of The Nation is harder than it looks)
Rob and I really don’t want to boast
But this is Greater New York’s 100th post
We are happy to have written so much
About New York City, State, and such
We are proud critics and nay-sayers
Skewerers of the city’s mayors
And we use our native stubbornness
To offer critiques of its wretched mal-governance
And as for Spitzer and the rest of the state
Things could be better, they aren’t great
Albany is just a pool of cess
And we will endeavor to clean the mess
Thanks gentle readers, at your computers
Skeptics, cynics, and refuters
Fellers of forests of falsehoods, like enraged loggers
For supporting the work of two humble bloggers
Save The Triborough!
There was an article in the Times the other day about a misguided effort to rename the Triborough Bridge. Eliot Spitzer wants to rename it after Robert F. Kennedy. Other names were suggested, including Andrew Haswell Green, the founding father of the Consolidation of Greater New York, so I guess those of us at the Greater New York blog should approve, but I don't. If I were to rename the Triborough Bridge, I would name it after the person who created it and made it a model of urban transportation, Robert Moses, who really does not have anything named after him in NYC worthy of his legacy as a master builder--but I suppose that is a non-starter.
But I don’t want the name to change. The Triborough is my favorite name among NYC’s bridges and tunnels. Most of the geographic names are one-sided, named after one side of the bridge but not the other. (The Brooklyn, Manhattan, Williamsburg, Whitestone, Queensboro, and Throgs Neck Bridges, for starters.) Others are named after people who have a fairly tangential connection to NYC (Verrazano, Lincoln, and George Washington, though he did fight a major battle in Upper Manhattan) or are hopelessly obscure (Holland, up there with Major Deegan). Only the Queens-Midtown and Brooklyn-Battery Tunnels have names that are fair to both ends of the crossing.
But of course the Triborough doesn’t go from point A to point B; it goes from point A to B to C, and its name captures that. Much better than naming it after Andrew Haswell Green, it captures the sense of New York City as a composite of its boroughs, in a bridge that connects the three most important parts of the city (sorry, Staten Island fans): Manhattan, Long Island, and the lower extension of the Hudson Valley.
And the Triborough is also part of what is most distinctive about New York City’s naming pattern: naming streets and other geographical locations after numbers, not people, which has saved the city from endless debates about renaming streets (though it hasn’t prevented the ersatz naming of streets after prominent people, names that are completely irrelevant to what the streets are actually called).
If we must rename streets, let’s rename them after favorite numbers, not people. I would love to see my favorite number, 65,536, or 2^16, given proper recognition on some street somewhere, perhaps in Queens where no one will notice another confusing number. Or why not rename the diagonal between 1st Street and 2nd Street the √2 crossing? In any event, let’s keep the Triborough the Triborough!
Women in New York Politics
Let me share with Greater New York readers a fascinating post by Mark Schmitt on the TAPPED blog on women in New York politics. If I were a blogger worth my salt I would know how to link to the post. Instead, here it is below, in its entirety:
Addie Stan's question about Senator Clinton, "whether she possesses an 'Inner Bella [Abzug]'?" reminds me of a point I was going to make as an addendum to Holly Yeager's excellent column last week about the absence of a "bench" of women ready to run for president.
The point being that for some number of women, particularly those of Clinton's generation in New York (of whom I know a few), her success is inseparable from the particular tragedy of women in New York politics. Starting in the 1970s, New York was the birthplace of successful women running for office independently. The bench was deep: There was Abzug; there was Liz Holtzman, who took down the chair of the House Judiciary Committee in a primary in 1972 and went on to make her own mark in that committee's hearings on the Nixon impeachment; there was Carol Bellamy, a skillful politician who won the then-powerful position of City Council President in 1977; there was Geraldine Ferraro, elected to Congress in 1978. One could probably add the names of Shirley Chisholm, the first black woman elected to Congress, and Ruth Messinger, who became Manhattan borough president a little later. New York has sent 22 women to Congress, second only to California's 31.
From this talent pool, surely senators, governors and mayors would emerge. That was the hope. And then what followed was disappointment compounded by disappointment:
• Abzug lost the 1976 nomination to the U.S. Senate by 10,000 votes, largely because the New York Times endorsed the then-neoconservative Daniel Patrick Moynihan, a move so controversial that the editorial page editor, John Oakes, overruled by the publisher, his cousin Arthur Sulzberger, was forced to convert his endorsement of Abzug to a short letter to the editor (himself).
• Abzug toned down her style and missed the runoff in the 1977 mayoral primary, one of the greatest yet nastiest political campaigns in history, a tale told well in Jonathan Mahler's Ladies and Gentlemen, The Bronx is Burning.
• Holtzman won the 1980 Democratic Senate primary, but the ailing incumbent Republican Jacob Javits, having lost his party's primary to Al D'Amato, insisted on staying in the race on the Liberal Party line and took 11% of the vote, costing Holtzman victory by a tiny margin.
• Ferraro reached the pinnacle, a place on a national ticket in 1984, but her selection (which briefly boosted the Mondale-Ferraro ticket to parity in the polls), was overshadowed by controversy about her husband's finances.
• In the 1992 Senate primary, for the chance to take on a weakened D'Amato, Holtzman and Ferraro effectively destroyed each other with personal attacks, letting Robert Abrams win the primary and lose the general election. Bitterness over that race carried over into Holtzman's 1993 race for reelection as New York City Controller, which she lost.
• Bellamy's political career fizzled with a defeat in the 1985 mayoral primary, just as Messinger's did in 1997.
Mark Schmitt
I guess the question this poses is whether all of this was bad luck or fate. Were there reasons why New York politics in the 1970s and 1980s was not really ready for a dominant woman? Much has been written about the unrepeatability of the success of black mayors in major cities, such as David Dinkins, Harold Washington, and Tom Bradley. For the most part, women have not had even that much success, and probably not enough has been written about Hillary Clinton as the most important female politician in the state’s history.
Is there a common thread that explains the failures of Abzug, Holtzman, Ferraro, Bellamy, and Messinger (who, by the way, lost to Giuliani in the general election in 1997, not the primary)? Perhaps it was running against self-proclaimed tough guys like Koch, D’Amato, and Giuliani, and the presumption that somehow these liberal women weren’t tough enough to make hard decisions, and stupid blowhard crap like that. (Or, in Abzug’s case, the treatment of her feisty and difficult personality as a liability in a way that, say, Giuliani never had to deal with.) Perhaps Hillary has learned this lesson, and has become, for better or worse, the sort of acceptably “tough liberal” who can neutralize the electorate’s apparent presumption that only men have enough rage and vindictive spleen to be effective leaders. Whether this is a good thing or a bad thing is a topic for another post.
Monday, January 21, 2008
Senator Clinton and "People Like Us"
In a recent conversation with my friends of generally liberal politics, who tend to support Barack Obama for president, I once again heard a phrase invoked to explain why Hillary Clinton is allegedly unelectable in the fall: "And if people like us don't vote for her...."
But all of this depends on who "people like us" are. In my limited experience, people who say this to me are economically secure Democrats who would typically vote for liberal Democrats in the primaries. But what if Senator Clinton is really a moderate who appeals to more numerous Democratic centrists, including union members and party stalwarts?
In that case, the discomfort of liberal Democrats with the Clinton candidacy may not be a measurement of her unelectability. Instead, it is the discomfort of liberals in a party with a lot of centrists.
In truth, I think the domestic policy differences of Obama and Clinton are modest (although I'm convinced by Paul Krugman that her health plan is better.) Internationally, he would present a fresh face to the world and has a better record on Iraq and Iran.
So far, the primaries have revealed both Democratic front runners to be strong candidates. That's good for the Democrats, who in any case need a "big tent" strategy to win in November.
But as I've said before, it is best for citizens, candidates and reporters to concentrate their questions on how the candidates would govern--not their supposed electability.
At best, discussions of electability are a fuzzy distraction from examining what candidates really plan to do. At worst, they just lift criticisms from the Republican playbook.
Friday, January 18, 2008
Wanted: Jobs to End Poverty
Mayor Bloomberg's "State of the City" address this week trumpeted his plans to lift New Yorkers out of poverty with education and better public health. What the mayor ignored, however, was one of the knottiest causes of economic inequality in our city and state: a lack of good-paying jobs.
About one fifth of the people in New York City live in poverty. Indeed, as the mayor's own commission pointed out, the city's poverty rate has consistently exceeded the nation's since 1975. (In upstate cities, the problem is even worse: poverty rates there run at 30 percent.)
In a time of low wages, working doesn't always get you out of poverty: as the Fiscal Policy Institute (FPI) has pointed out, in New York State in 1990 the percentage of people in working families who were poor was 6 percent. By 2005, it had climbed to 10 percent.
When Mayor Bloomberg promotes the tourism industry as a way to create "tens of thousands of jobs for those on their way up the economic ladder," he's dreaming. Tourism jobs are mostly low-paid, dead-end work. (Unless, of course, New York tourism workers unionize the way Nevada's casino workers did.)
As James Parrott of the FPI points out, "You can't have a middle class without middle class jobs."
Bobby Fischer
I have always been fascinated by chess, though my own talent is strictly rudimentary. I enjoy reading the chess column in the Times, and watching expert commentators who blithely discuss the next ten moves on both sides, and recommend a course of action that makes no sense to me—why not take that rook that seems to be undefended?
I learned about the legend of Bobby Fischer when I was very young. He was one of those people adults told you about—why when Bobby Fischer was your age, he was already defeating grandmasters. And of course he was, like me, a New Yorker, and a Jew (though of course we didn’t know he would go on to become a raging anti-Semite) and I was proud that he had, so early in his life, accomplished great things.
I guess when you think of archetypal New Yorkers, Bobby Fischer is not the first person to come to mind, but in many ways Fischer’s drive and ambition, his instinct for the jugular, his pride in his eccentricities, and his utter indifference to what people might think of him make him a model for a certain type of New York ruthlessness. (His later misadventures aside, I would much rather admire Bobby Fischer than, say, Donald Trump.) Perhaps Bobby Fischer can best be compared to those other remarkably focused talents that came of age in Brooklyn in the 1950s and early 1960s, Woody Allen and Barbra Streisand.
Bobby Fischer certainly had his share of inner demons, and they later derailed him, and perhaps in some ways destroyed him. But for one summer, one brief shining moment, in Iceland in 1972, he managed to keep them sufficiently under control to show everyone what he had always felt himself to be, and what he undoubtedly was: the greatest chess player in the world.
Thursday, January 17, 2008
Through Glass-Steagall, Darkly
There is much talk of bipartisan stimulus packages in Washington these days, touted by Nancy Pelosi, Ben Bernanke, and even Mr. Harvard-MBA-in-Chief himself. Oh, I don’t know. Some stimulus to the economy, in the form of increased spending, probably can’t hurt. Whether further tax breaks can do anything other than increase income inequality in this country seems doubtful. And it isn’t clear to me how the Fed cutting interest rates, to fix a crisis that was basically caused by the low interest rates that catalyzed excessive mortgage lending, is anything other than pouring oil on a fire.
In general I have nothing against Keynesianism (though I must say in this case I have my worries about its efficacy), but the mere application of stimuli, which is all the major political actors in Washington seem to want to do, doesn’t deal with what seems to me the real cause of the current crisis: the structure of the financial services industry in this country, and its lack of effective regulation. What made the New Deal reforms work was not mere pump priming, but acts of inspired statesmanship like the creation of the SEC--some hard and fundamental thinking about how finance was working in this country, and some fundamental changes to that structure.
In particular, the New Deal reforms called for the separation and partitioning of the financial industry into constituent parts, so that a crisis in one part would not necessarily bring down the entire structure. The Glass-Steagall Act of 1933, which separated underwriting and investment banking from other sorts of lending, was central to the New Deal reforms, and it worked pretty well, stabilizing the commercial banking industry and giving investment banking space to lick its wounds and eventually recover, and it played an important role in providing a pattern for the post-World War II economic success of the United States.
But by the 1960s and 1970s, disintermediation and the breaking down of barriers was the cry of the financial services industry. Why shouldn’t commercial banks be able to underwrite directly? And why shouldn’t investment banks be able to lend money for mortgages? Bankers started to complain that they were being held hostage to an archaic New Deal reform that, contrary to accepted principles, put roadblocks in the way of the free and unfettered operation of the market. And in little bits, really starting in 1986, when commercial banks were given the right to underwrite municipal bonds, Glass-Steagall was dismembered, until in 1999, in one of the Clinton administration’s inglorious moments, it was repealed altogether, allowing copulation between investment banks and commercial banks to thrive. This in turn has contributed to (so far) two financial crises: the dot-com crisis (where the so-called “Chinese walls” meant to separate stock sales from underwriting were largely ignored, in the name of disintermediation) and the current housing crisis, where banks first made reckless loans and mortgages, and then created incredibly complex derivatives (a form of security) to try to spread the risk on sub-prime loans throughout the financial industry. These derivatives did their job so well, and spread the risk so widely, that none of the banks that underwrote or purchased these securities has a real sense of its exposure--hence the multi-billion-dollar losses reported by Citibank, Merrill Lynch, Morgan Stanley, et al.
Look, no one is saying that Glass-Steagall is a panacea, or that its reintroduction would solve all the current problems. But there is a need for Democrats (and I suppose Republicans) to take a hard and critical look at how the financial industry is structured, and then to suggest and institute sweeping reforms. Alas, because of the closeness of Wall Street to the Democrats, and the need to raise endless amounts of cash, little of this is going on among the three major candidates, beyond some justifiable moralizing about predatory lending. But let the Democrats take a page from the New Deal, and use this occasion to think hard about how to use government to strengthen the economy and better the lives of our citizens.
In general I have nothing against Keynesianism (though I must say in this case I have my worries about its efficaciousness) , but the mere application of stimuli, as the major political actors in Washington seem to want to do, really doesn’t deal what seems to me the real cause of the current crisis; the structure of the financial services industry in this country, and its lack of effective regulation. What made the New Deal reforms work was not mere pump priming, but acts of inspired statesmanship like the creation of the SEC, some hard and fundamental thinking about how finance was working in this country, and some fundamental changes to that structure.
In particular the New Deal reforms called for separation and partitioning of the financial industry into constituent parts, so a crisis in one part would not necessarily bring down the entire structure. The Glass-Steagall Act of 1933, which separated underwriting and investment banking from other sorts of loaning activities, was central to the New Deal reforms, and it worked pretty well, stabilizing the commercial banking industry, and giving investment banking space to lick its wounds and eventually recover, and it played an important role in providing a pattern for the post World War II economic success of the United States.
But by the 1960s and 1970s, disintermediation and the breaking down of barriers was the was the cry of the financial services industry. Why shouldn’t commercial banks be able to directly underwrite? And why shouldn’t investment banks be able to loan money for mortgages. Bankers started to complain that they were being held hostage to an archaic New Deal reform, that contrary to accepted principles, put road blocks into the free and unfettered operation of the market. And in little bits, really starting in 1986, when commercial banks were given the right to underwrite municipal bonds, Glass-Steagall was dismembered, until in 1999, in one of the Clinton administration’s inglorious moments, it was repealed altogether, allowing copulation between investment banks and commercial banks to thrive, and this in turn has contributed to (so far) two financial crises, the dot-com crisis (where the so called “Chinese walls” to separate stock sales and underwriting were largely ignored, in the name of disintermediation) and the current housing crisis, where banks first made reckless loans and mortgages, and then created incredibly complex derivatives (a form of security) to try to spread the risk on sub-prime loans throughout the financial industry. These derivatives did their job so well, and spread the risk so widely, that none of the banks that underwrote or purchased their securities have a real sense of their exposure, and hence the multi-billion dollar losses reported by Citibank, Merrill Lynch, Morgan Stanley, et al.
Look, no one is saying that Glass-Steagall is a panacea, or that its reintroduction would solve all the current problems. But there is a need for Democrats (and I suppose Republicans) to take a hard and critical look at how the financial industry is structured, and then to suggest and institute sweeping reforms. Alas, because of the closeness of Wall Street to the Democrats, and the need to raise endless amounts of cash, little of this is going on among the three major candidates, beyond some justifiable moralizing about predatory lending. But let the Democrats take a page from the New Deal, and use this occasion to think hard about how to use government to strengthen the economy and better the lives of our citizens.
Wednesday, January 16, 2008
Speaking to the Dead
Why is it that large numbers of people started to claim the ability to talk to the dead, really for the first time in human history, or at least in western Christian civilization, in upstate New York in the 1840s? Certainly people had tried speaking to the dead before, and speaking to ghosts and the summoning of spirits have a long history. But with the rise of spiritualism, this became a pursuit not left to occultists and would-be practitioners of the dark arts, but a common middle-class practice. Pay a spiritualist, attend a séance, rap on a table, and chat with your dead grandfather or mother.
Why not? It seems to me that religion comes down to two basic principles. One, that there is a God, a ground of all being, a principle of unity for all things, or what have you. And two, that somehow and in some way, death is not final, and that its barrier is permeable. And most religions offer some promise of continued existence, bodily or spiritually, after our physical beings are no more. But the possibility of life after death is of no use to people who are still alive, and immortality, for the vast majority of us who want to stay alive as long as possible, is a rather meager compensation for dying. But survival after death gains a practical utility for the first time when you gain the ability, while still living, to speak to those who already have died. And since, when you get down to it, most people don’t really care what happens to them after they are dead (after all, you’re dead), an ability to speak to the dead while still alive is one of the greatest consolations that religion can provide, giving some evidence that death is a transition, and that its awful, terrible finality can be tempered. As for me, God seems like a paltry thing, a glorified philosophical argument, and one I would gladly forsake, if I had one more chance to speak to those, beloved to me, who have departed this life.
But despite a promising start, spiritualism never really caught on, and it remained marginal to American religious culture. The standard histories provide several reasons for this. Spiritualism saw itself as basically a post-Christian phenomenon, a rational way to demonstrate immortality. This raised the ire of the churches, and it never quite became incorporated into mainstream American Protestantism. It operated basically as a “fee for service” therapy (a forerunner of psychotherapy and similar therapies paid for by individuals for private or group sessions) rather than as a church, and it never developed a strong institutional basis. Its claims were easily debunked and never proved, despite the heroic efforts of psychic researchers, and it was plagued by charlatans. For all these reasons, spiritualism has retreated to a small and not particularly respected corner of American religious and spiritual consciousness, and perhaps deservedly so.
Oh, I don’t really think that we live on as spirits after we die, but I certainly don’t believe that it is any more implausible than the belief that there is a superintending intelligence that is planning and watching everything that has ever happened or ever will happen, and I am annoyed that certain types of spiritual beliefs are regularly ridiculed (New Age spirituality, Mormonism), while if we are unsympathetic to the core beliefs of Christian evangelicalism (biblical inerrancy, creationism) we are criticized for our secularism, and told we need to hold our noses and get along with conservative Christians.
The science section of the Times yesterday had a pretty incomprehensible article on the latest cosmological theories of multiverses and big bang inflation, and I read it three times and still didn’t understand it, but the upshot seemed to be that some cosmological models favor the existence of disembodied spirits and reincarnation. Perhaps, but in any event, I think the possibility of some form of continued existence after death is probably the original religious belief, and perhaps the core and most fundamental of all religious beliefs. Proclaiming that people can speak to the dead is perhaps New York State’s single most important contribution to the world’s store of metaphysical speculation, and as a loyal New Yorker I hope (though not on the basis of any logical or rational knowledge or conviction) that the spiritualists knew what they were talking about.
Tuesday, January 15, 2008
Construction Worker Who Died on the Job Deserved More from the Post
Tabloid newspapers achieve greatness when they find the drama in ordinary people's lives: the crossing guard who jumps in front of a moving car to save a child, the quiet neighbor who commits a murder, the long-lost lovers reunited by chance. That's why it was infuriating to see today's New York Post devote its front page to Britney Spears when the Daily News recognized the more important story of the day: the death of a Ukrainian immigrant worker, married with children, in an accident at a Trump construction site.
The Post's claim to represent the working people of New York always rested on the dubious premise that right-wing politics and celebrity news are just what regular folks want. Of course, Post readers are quite capable of seeing through that game. But the paper's editors owe them more.
As the Post, News and Times (which put the story on the front page of its Metro section) all noted, the job site in Soho where the man died has a history of violations. Worse, as the Times pointed out, one of the subcontractors on the job has mob connections.
Construction is dangerous work. In an age when labor is little honored in our politics and culture, a story that illuminates dangers on the job--and their human cost--deserves front-page coverage. Especially in a paper that claims to represent the average New Yorker.
The Particulars of the Primaries
Our posts on primaries and conventions, along with the prospect of directing my students' reporting on the New Jersey primary, have strengthened my interest in the nomination process.
In my efforts to educate myself, I've found the sites and pieces below very helpful.
For a general introduction to the primaries, check out this page at about.com. It is by no means definitive, but it gives a good overview of the large issues (open versus closed primaries, methods of allocating delegates) that you need to understand.
In the New York Sun, "Electoral Quirks Are Poser in Nominating Process" takes you one step deeper into the issue and lays out the party rules and state-by-state complications that define the process.
For a great summary of how the particularities of party regulations shape the primaries in New Jersey, go to the Star-Ledger for "A Primer on New Jersey's Presidential Primary."
And finally, to keep track of it all, the New York Times' "Election Guide 2008" offers state-by-state profiles of the primary races and more.
For what it is worth (my opinion is based on the interesting analysis in the Sun), the closeness of the Democratic race so far--and the Democrats' system of apportioning delegates proportionally, according to how many votes each candidate gets in the primary--raises the possibility that the Democratic race could go all the way to the convention.
If that happens, Senator Clinton, with her strong party ties dating to her husband's presidency, would likely enjoy an advantage because she would presumably get the votes of the party's super delegates--delegates to the convention selected from party leaders and activists without regard to primary results.
Remember, at this point the Clinton/super delegate question is hypothetical, but it illustrates how party nominating procedures do play a role in determining the strengths and weaknesses of different kinds of candidates.
It also illustrates the need for the most reliable information you can find to understand this process. If you have any suggestions, don't hesitate to chime in.
Sunday, January 13, 2008
New Light in an Old Synagogue
When I discovered the Eldridge Street Synagogue in the early 1980s, it felt as if I had encountered a remnant of an all-but-vanished civilization. But thanks to careful restoration by the Eldridge Street Project, the building now presents itself as a sturdy bridge between past and present.
When I first ventured inside I was a graduate student in history planning walking tours on the Lower East Side of Manhattan. The aging sexton (Mr. Markowitz, if I remember correctly) not only showed me the downstairs study where a few congregants still worshipped, but let me venture up a rotting staircase into the main sanctuary. There, the Moorish revival architecture, the ornate woodwork and the dust-covered chandeliers were at once breathtaking and forlorn.
For years, I meditated on the irony of the building's history: it was clearly built to last when it opened in 1887, but by the 1930s it was losing membership and headed into a long decline.
So much of immigration history back then seemed to be about the retention of customs from the Old World, but here was a sign that even the most permanent-looking of buildings had outlived its heyday in less than fifty years. I increasingly concluded that Jewish life on the Lower East Side was less a story of permanence and more a story of adapting to changes that came faster than anyone anticipated. Indeed, the restoration of the synagogue as a museum (with a small space for worship in the downstairs study) helps visitors understand just that process.
Thanks to the reopening of the building in December 2007, visitors can explore the changes and continuities that define Jewish life on the Lower East Side. As the Project's Web site points out,
Our restoration philosophy is attuned to the history, stories and aesthetics of an old building. The Project's architectural master plan calls for the restoration of the Synagogue to its original grandeur while leaving intact elements and areas that evidence the building's history. The building's original gas fixtures will remain, as will floorboards worn down by decades of prayer. In addition, there will be areas within the sanctuary that are not aesthetically restored and pay testament to the building's decline as its congregation left the Synagogue and the Lower East Side for more affluent neighborhoods.
My one visit to Eldridge Street left me with just enough time to savor the Project's thoughtful restoration and too little time for one of its regular guided tours. I'll be back for one before too long.
Conventional Wisdom
A modest proposal for fixing the primary system: abandon primaries altogether. We are living through another cycle in which state after state tries to make itself “relevant” in the nomination process by going as early as possible in the primary calendar, leading to a remarkable overload of primaries in January and February, accompanied by much handwringing about the inanity of the process, along with numerous suggestions for revamping the system, many of which sound intriguing, and none of which, for various reasons, has a tinker’s damn of a chance of being acted upon.
The problem with the current system is that, as the primaries proceed sequentially, their importance diminishes over time, so every state wants to be at the front of the queue. There is a readily available alternative, by which every state gets to voice its opinion at the same time, and one that has a long history in this country. It is called a nominating convention. As primaries have expanded, conventions have become increasingly vestigial, and the primaries that replaced them, as most agree, give the illusion but not the reality of “choice.” Instead we are held hostage to “invented traditions” like the New Hampshire primary, whose significance in American political history goes all the way back to 1952.
Look, parties in most other democracies, Britain among them, choose their own leaders. It works fine. If you don’t like the candidate they pick, vote for another candidate. If you don’t like any of the candidates, create your own party. It is the job of the convention to pick candidates it thinks will win, in keeping with the ideological bent of the party. It is the job of the people, in the run-up to the presidential election, to form issue-oriented campaigns or third-party candidacies, on immigration, the war in Iraq, or whatever. It is the job of the party leaders to examine the political terrain, co-opt as many of the issues as they can, and pick whom they think will be the best candidate. It is no accident that the heyday of the convention system, let’s say 1868 to 1932, marks the high tide for both the ideological coherence of the two major parties and the proliferation of third parties.
Primaries are a failed progressive reform that, in the interest of increasing popular participation in elections, has been so subverted and traduced that hardly any of the initial impetus remains. We have not taken money or party leaders out of the system; they simply operate behind the scenes. Rather than choosing candidates, the old convention system gave average people, over the course of years and in local elections, the ability to choose the state’s political leaders, who then came together in a convention to choose the candidates. This is probably the best we can hope for. Eliminating primaries and returning to conventions would increase the ideological component of our politics, increase the visibility of third parties, minimize the importance of personality and “narrative,” and restore the selection of candidates to where it should be, a few months before the general election. The fog of primaries is no better than the old smoke-filled rooms, and in many ways it is far worse.
Friday, January 11, 2008
Candidate with a Thousand Faces
Let me just add something to Rob’s excellent post on narrative and the primaries. I have felt for a while that the purpose of American presidential campaigns is not to select politicians but to create heroes, on the model of the typologies set out by Joseph Campbell in his classic work of comparative mythology, The Hero with a Thousand Faces. The archetypal campaign story is that a candidate, born in obscurity, recognizes his or her special gifts and heeds a call to service. Then a series of trials and tribulations follows, in which the hero gains experience from success and wisdom from failure, until the candidate is of sufficient stature to kill a dragon, rescue a princess (or prince), or run for president.
This campaign mythology comes in several versions: the rags-to-riches story (Lincoln remains the model for this version), the conquering military hero (Washington to McCain), the overconfident insider overcoming adversity and discovering humility (JFK, George W. Bush), and the clever outsider overcoming the bias against those who don’t seem to belong (Obama); doubtless there are many more paths to glory, each with a chapter in America’s epic version of the Kalevala and Nibelungenlied. Unfortunately for Hillary, there is no place for the loyal spouse in the standard categories of American heroism.
And of course the reductio ad absurdum in this creation of faux presidential heroes is W., who evidently really believed that because he was elected president he had to lead a war to burn the topmost towers of Ilium, or someplace in that general vicinity, despite having an Achilles heel that stretched from his ears to his toes. If we want to elect presidents who will try to be heroes, the current primary system, or some equally unworkable version, will do just fine. To get what we need, campaigns that focus on important issues, in which the life story of the candidates is at best a tertiary concern, we need to rethink the presidency and its constitutional framework as a whole.
Thursday, January 10, 2008
Enough With the Narratives
Two intelligent observers of politics and journalism have used the same concept to describe coverage of the presidential primaries: scripted. While I don't think that means the news media are pulling strings to elect candidates, the word helps us understand the formulaic reporting that defines so much political journalism.
Certainly the major reporters and pundits were surprised by Hillary Clinton's victory in New Hampshire. But what was not surprising was the way this was immediately cast as a new "narrative" for understanding the campaign. If the narrative of the moment after Iowa was "young leader rises from obscurity to redeem his nation," the narrative that followed New Hampshire was "the comeback of a stumbling front runner."
Narrative is a fancy word for story, and there are only so many formulas for telling a good one. While these narratives are supposed to be deeply revealing about character, in fact they are fairly transferable. They can be pinned on almost any politician in the right circumstances. Worst of all, they tell you very little about what we most want to know: how would these people govern?
When you add to this the reporting about the polling process itself (how did everyone get New Hampshire so wrong?) and market research-style reporting (this just in: working class New Hampshire voters went to Clinton), the tidal wave of hype and hyperventilating that characterizes our accelerated primary season is nearly overwhelming.
In a primary season, and especially in this ridiculously compressed year of primaries, we need more coverage that encourages us to slow down, study, think hard, and make informed choices. A chart published in the Times before the Iowa caucuses, for example, did a great job of comparing candidates' records. Unfortunately, too little of our journalism helps us do that.
Tuesday, January 8, 2008
A Tragic Death and a Ghost Bike
As I rushed past the entrance to the Manhattan Bridge Sunday afternoon, I glimpsed a white bicycle chained to a signpost at the intersection of Canal and Bowery. The sorrowful woman standing near the bike, and the bouquet placed before it, made me curious. I asked her what was happening and learned about an extraordinary memorial project and the tragically short life of Sam Hindy.
His aunt, Sarah Foote, told me the story. Sam, a 27-year-old computer engineer from Brooklyn, rode his bike to Chinatown on the night of November 16, 2007 to meet a friend coming in on the bus from Boston with his own bike. They met and pedaled back toward Brooklyn.
Sam was an experienced cyclist, but his familiar route in and out of Brooklyn was across the Brooklyn Bridge. The two became confused on the entrance ramp to the Manhattan Bridge --something that has happened to me in broad daylight--and wound up in traffic on the upper level of the bridge. They turned around and started to head back, but Sam hit a barrier and fell to the lower roadway--where a car struck and killed him. His companion survived.
The incident was vaguely familiar to me from the news, but Sam's aunt told me stories that brought the young man to life: his birth in Beirut where his father was a foreign correspondent, his education in New York public schools and at Northeastern, his "wide vistas," his love for his friends and family--and the Thanksgiving that he enlivened, when he was fully grown, by leading a gaggle of exuberant kids in a round of dancing. To describe him, she reached for one of his favorite superlatives: "Sam was just an awesome kid."
As we spoke, cyclists arrived at the white bicycle and left flowers. Sunday, it turned out, was the day of the Street Memorial Project's "Third Annual Memorial Ride and Walk." The international project commemorates the death of a cyclist by installing a white "ghost bike" at the spot. There are ghost bikes in cities as different as Chicago, London and Prague, with 41 in New York City.
The events of the day also included a memorial walk across Brooklyn Bridge in honor of pedestrians killed over the past year. The project calculates that 23 bicyclists and more than 100 pedestrians died in accidents in New York City last year. Official figures from the New York City Department of Transportation for the year have not yet been released, but a DOT study computed that 225 bicyclists were killed in New York City from 1996 to 2005.
The short memorial service that followed was sad but principled. Caroline Samponaro of Transportation Alternatives, which works for "better bicycling, walking and public transit, and fewer cars," spoke in honor of Sam and recalled all the cyclists and pedestrians killed by cars in New York City. Sam's father Steve--a journalist at Newsday who went on to found the Brooklyn Brewery--thanked the organizers of the event and described Sam as a man who loved rollerblading, skateboarding and biking. "Sam's preventable death has made us all see the importance of the work of Transportation Alternatives," he said. He added that our dependence on fossil-fueled vehicles is bad for the planet and a cause of the war in Iraq.
The assembled cyclists lifted their bikes in a salute and rode off to a rally at City Hall. The ghost bike, which had looked spectral with one bouquet before it, was now covered in a rainbow of blossoms. But that didn't heal the wound in his aunt's heart: "There's just a great hole where Sam used to be."
As for me, Sam Hindy is now more than a name in a headline. And I'll think hard the next time I pass a ghost bike.
Sunday, January 6, 2008
Agreement, Disagreement and the Weight of the Bush White House
As "We Agreed to Agree, and Forgot To Notice" observed in today's "Week in Review" section of the Times, there may well be a growing sense of agreement among Americans on contentious issues such as gay rights and the death penalty. But Kirk Johnson's interesting article omits on important explanation for this phenomenon: the Bush administration's habit of governing in an authoritarian manner with a radically conservative agenda. The White House and the Republican Party have exploited the post-9/11 environment and the powers of the presidency to govern in a way that obscures the widely-shared beliefs and decencies of most Americans.
While I'm skeptical of claims that there is a huge liberal majority out there waiting to elect an "authentically" liberal candidate--a charge that some Democratic activists level against "big tent" Democrats open to compromise with moderates--there's no doubt that Bush has governed in a manner to the right of the American mainstream. Even allowing for the shift to the right in recent decades, Bush has gone far beyond what was expected when he campaigned in 2000 as a unifying "compassionate conservative."
His tax policies, after all, disproportionately benefit only the wealthiest Americans. And after 9/11, when the country wanted to be brought together in an effective response to terrorism, Bush chose to demonize Democrats as soft on defense, shred the constitution, and invade Iraq.
The president's incredibly low approval ratings are proof of at least one thing: it is Bush who is out of the mainstream. Once you look beyond the not-insignificant powers of this office you can see considerable agreement among Americans on the deep flaws in his policies.
None of this means that the conservative movement is over. If nothing else, the emergence of Mike Huckabee is a reminder that the Christian right is an important part of the Republican Party base that can still produce a viable candidate, albeit one with more charm and a populist style.
The political landscape has not been redrawn, and there is every reason to expect that whoever the Democrats nominate will face a Republican firestorm in November. But so far in this primary season, voters who have had a chance to speak have expressed anything but satisfaction with a continuation of the Bush administration's status quo.
That's good news for the Democrats. And it's more proof that President Bush is out of touch with the deepest yearnings of the American people.
Friday, January 4, 2008
Obama's Sense of History
Senator Barack Obama's victory speech in Iowa last night, rightly lauded for its eloquence and idealism, was lifted up by a sense of history. His ability to maintain that will go a long way in determining whether he wins the Democratic nomination.
As with much of Obama's candidacy, his knack for presenting himself as the candidate of historical destiny is presented in broad strokes. He's not always explicit on his details, and he has yet to be tested by the kind of firestorm that Republicans will throw at him if he wins the nomination.
Still, in his person--and in his rhetoric--Obama lets everyone know that his candidacy represents both a fulfillment of American ideals and a turning point in recent history. Last night, when he invoked the American Revolution (but not the Civil War) and the Civil Rights Movement, he reminded Americans of a national promise yet to be fulfilled. He also presented his own campaign as the fulfillment of that promise. And he reminded Americans of his desire to restore our soiled reputation in the world. Chris Matthews, for all his bombast, was onto something when he described Obama's victory as a Lexington and Concord moment.
The contrast between Obama's speech and Senator Hillary Clinton's was telling. If Obama operated on a grand strategic level, Clinton was far more tactical. She reassured her supporters that her candidacy was still viable, reminded people that the caucuses were proof of an invigorated Democratic Party, and laid out her policy agenda.
Yet, unlike Obama, who skillfully presents himself as both a Black candidate and a racial unifier, Clinton didn't find a way--however implicit--to present herself as the first viable female presidential candidate. Perhaps that's because her primary message has been experience and competence--two traits that don't lend themselves to a platform of running against sexist barriers. Or is it because the legacy of racism is a more attractive target to most Americans than the weight of sexism?
Obama brilliantly recognizes that most Americans, above all white Americans, want to be redeemed of the sin of racism. (How much they might sacrifice to do that is another matter.) In a nation where Martin Luther King is sometimes honored more for his dreams than for his real-world activism for social justice, Obama will have to work hard to avoid falling into the same trap.
It is far too early to pronounce the outcome of the primary season for the Democrats, especially for us New Yorkers, who will not get to weigh in until February. But this much is clear: Obama has succeeded in presenting himself as the candidate who is lifted up by currents of history, a man of the present who can lead us to transcend the worst stains of our past. Hillary Clinton must do the same if she is to prevail.
Wednesday, January 2, 2008
Our Love is Here To Stay
Who are the most remarkable brothers in American history? There are political brothers, who often obtain their position through a dynastic inheritance of sorts: Jack, Robert, and Ted Kennedy (or the lesser Kennedy-related brother pair of McGeorge and William Bundy), or Jeb and George W. (There must be earlier examples of famous political brothers in American history before the Kennedys, but they are not immediately coming to mind, except for John and William Tecumseh Sherman, and, speaking of Tecumseh, the American Indian pairs of Tecumseh and Tenskwatawa, “the Shawnee Prophet,” and the Seneca half-brothers Handsome Lake and Cornplanter.)
There are numerous brother pairs in business and invention, with Wilbur and Orville Wright perhaps the most famous duo, but there are also Walt and Roy Disney, and in the area of finance and investment banking, which until the 1970s tended to be dominated by dynastic partnerships, there were numerous examples of brother-dominated firms, such as Lehman Brothers, Brown Brothers Harriman, and Goldman Sachs, along with some famous European examples, such as the Baring Brothers, and the most famous financial brothers of them all, the five Rothschild boys.
But in business as well as politics there is an element of luck and of greatness being thrust upon people born into the right place at the same time. Brothers who achieve success in cultural areas are perhaps the most remarkable brother teams of all. There are the five Marx Brothers, all New York City born, who found renown as a brother-act in vaudeville and later films, and the Six Brown Brothers (not all of whom were members of the Brown family), who in the pre-jazz 1910s were the most important saxophone ensemble in the United States, and played an important role in introducing the saxophone into American popular music.
But even here, luck played a role. As everyone knows, Groucho, Harpo, and Chico were more talented than Gummo or Zeppo. What is perhaps most rare are brothers who are independently creative on the highest level. One such duo is William and Henry James, arguably, respectively, the greatest philosopher and novelist this country has ever produced. And what is perhaps most striking about them, despite some common themes, notably the exploration of consciousness, is how different they were, especially in their prose styles: William’s direct, pithy, and apothegmatic, while Henry’s was gloriously circumlocutory in its nuanced complexity. After William James died in 1910, Henry James wrote his autobiography, Notes of a Son and Brother, not to tell his own life story of growing up in New York City, but to “attempt to place together some particulars of the early life of William James and present him in his setting, his immediate native and domestic air.” Like most accounts of growing up in New York, there is much about real estate and how much things have changed in the intervening decades.
But Henry and William James basically worked separately. If I were to nominate my choice for the most remarkable brother pair in American history, it would be George and Ira Gershwin, both native New Yorkers. George was probably the most gifted musician the United States produced in the 20th century, and like Schubert and Mozart, George died far too young and on the cusp of even greater things. Ira Gershwin has tended to be overlooked, but they were a vital team, writing almost all of their great songs together. I think that of all the great lyricists of the middle decades of the 20th century, Ira Gershwin is my favorite, the cleverest and the most versatile, producing funny, sexy, sad, and sentimental songs in equal measure. And unlike his two nearest peers as lyricists, Cole Porter and Lorenz Hart, Ira seems to have been something of a mensch, someone you would like to know. (And Ira’s book, Lyrics on Several Occasions, is the greatest book ever published on the subject of lyric-writing.) I never could have been George Gershwin, but sometimes I imagine that I might have been able to write lyrics like Ira.
Ira continued to write lyrics after George’s death in 1937, perhaps most notably with Kurt Weill on “Lady in the Dark,” but he never really recovered from George’s passing, and he retired from songwriting in the mid-1950s, devoting the remaining thirty years of his life (he died in 1983) to keeping alive the legacy of his brother.
George and Ira were working on the mediocre film The Goldwyn Follies when George suddenly died of a brain tumor. One of the last things George wrote was the melody of a song that Ira completed after his passing. Although seemingly about romantic love, as many commentators have noted, it is really about the tragic and untimely death of George, and the depth and profundity of Ira’s love for his departed brother. It is perhaps the greatest song ever written about brotherly love:
The more I read the papers
The less I comprehend
The world and all its capers
And how it will all end,
Nothing seems to be lasting,
But that isn’t our affair;
We’ve got something permanent---
I mean, in the way we care
It’s very clear
Our love is here to stay
Not for a year
But ever and a day
The radio and the telephone
And the movies that we know
May be passing fancies---
And in time may go
But oh, my dear
Our love is here to stay
Together we’re
Going a long, long way
In time the Rockies may crumble,
Gibraltar may tumble
(They’re only made of clay)
But—our love is here to stay.