Monday, September 28, 2009

Obama at the UN: In Search of FDR's Vision

The structure of world peace cannot be the work of one man, or one party, or one nation. . . . It cannot be a peace of large nations – or of small nations. It must be a peace which rests on the cooperative effort of the whole world.

--Franklin Delano Roosevelt

When Franklin Roosevelt spoke these words in 1944 – the world still at war, embroiled in a struggle between the forces of freedom and tyranny – the United Nations was but a vision resting on the ideals of international cooperation and the belief that the nations of the world could work together to promote peace and foster human rights. Although Roosevelt did not live to see the liberation of Europe and the Allied victory over Japan, his vision of international cooperation was realized on October 24, 1945, when the UN Charter – drafted at the San Francisco Conference and signed by the organization’s 51 founding nations – entered into force, formally establishing an organization whose purpose was to maintain international peace and security, develop friendly relations among nations, and promote respect for human rights. Since its founding, the UN has often fallen short of its ideals, but the international activism of the UN has nevertheless accomplished many positive things consistent with its Charter, including extraordinary acts of humanitarianism – alleviating hunger, preventing disease, and mending broken nations – and contributing greatly to the decline in armed conflict, particularly since the end of the Cold War.

Since its first peacekeeping operation in 1948, when it monitored the Arab-Israeli ceasefire, the UN has implemented 63 peacekeeping missions, many of which have brought stability to previously volatile regions. Its international peacekeeping efforts collectively won a Nobel Peace Prize in 1988, and a 2005 study by the RAND Corporation found that seven out of eight UN peacekeeping missions have successfully maintained the peace, or enforced ceasefires, in areas of major conflict.

The humanitarian efforts of the UN have been even more impressive, reflecting a model of collective world action in efforts to alleviate poverty, hunger and disease through such UN entities as the World Food Programme (which helps feed more than 100 million people a year in 80 countries), the World Health Organization (which implements mass vaccination programs), UNICEF (United Nations Children’s Fund), UNESCO (United Nations Educational, Scientific and Cultural Organization), and UNAIDS (which treats and helps prevent HIV infections in countries the world over). The UN also provides food and shelter for international refugees and displaced persons rendered homeless due to civil war and famine; monitors international elections in newly democratic states; and promotes economic development in the Third World through such institutions as the International Monetary Fund and the World Bank.

Not surprisingly, as an organization dependent on the funding and actions of 192 member nations, some of which use the UN for political obfuscation, it has a mixed historical record. Subject as it is to internal politics and bureaucratic bungling, it has at times been a frustrating, corrupt, and incompetent institution that has disappointed as much as it has helped. Its glaringly unfair treatment of Israel – the only member country ineligible to sit on the United Nations Security Council – has compromised its institutional effectiveness on the Israeli-Palestinian dispute and resulted in a tolerance of anti-Semitism within its walls. And infighting within the Security Council has rendered the UN negligent in several major international crises, its inaction failing to prevent the genocides in Rwanda and Darfur, the massacre in Srebrenica, and mass starvation in Somalia.

Due in part to a philosophical opposition to its mission by conservative elements of the Republican Party, the United States has not always been a strong supporter of the UN; indeed, for eight years under President George W. Bush – who selected as UN Ambassador John Bolton, a man who openly advocated the organization’s demise – and prior to that, under President Reagan in the 1980’s, the United States failed to pay its bills and worked at times to undermine the UN’s mission. These actions, unfortunately, eroded the moral authority of the United States on many issues of importance, and caused us to lose diplomatic influence and effectiveness on the world stage.

Its imperfections notwithstanding, the UN has done much good in the world, and lacking anything better, it is humanity’s best hope for promoting peace and understanding among nations. It was thus refreshing to hear President Obama’s remarks before the UN General Assembly last Wednesday, in which he defended the ideals of the UN and articulated a vision of international cooperation consistent with FDR’s. Obama suggested that “the time has come for the world to move in a new direction,” one that embraces “a new era of engagement based on mutual interest and mutual respect.” He acknowledged that many nations had recently “come to view America with skepticism and distrust,” believing that the United States, on certain critical issues, had “acted unilaterally, without regard for the interests of others,” and he promised that the United States is “determined to act boldly and collectively on behalf of justice and prosperity at home and abroad.” Although the United States will not apologize for defending its interests, both military and economic, “in the year 2009 – more than at any point in human history – the interests of nations and peoples are shared,” whether in the fight against global terrorism, poverty, and climate change, or in joint efforts to promote economic growth and prosperity. Obama’s words bespoke a true spirit of cooperation: while “we come from many places . . . we share a common future.”

Obama embraces an idealism that envisions hope and collective action prevailing over conflict and despair, but he is not naïve. He recognizes that the UN has “sadly, but not surprisingly” become at times “a forum for sowing discord instead of forging common ground; a venue for playing politics and exploiting grievances rather than solving problems.” Obama rightly believes, however, that:

[i]n an era when our destiny is shared, power is no longer a zero-sum game. No one nation can or should try to dominate another nation. No world order that elevates one nation or group of people over another will succeed. No balance of power among nations will hold. The traditional divisions between nations of the South and the North make no sense in an interconnected world; nor do alignments of nations rooted in the cleavages of a long-gone Cold War.

Obama thus imagines “a generation that chooses to see the shoreline beyond the rough waters ahead; that comes together to serve the common interests of human beings, and finally gives meaning to the promises embedded in the name given to this institution: the United Nations.” In his speech to the General Assembly, he identified four pillars that are “fundamental to the future that we want for our children” and which the world must collectively work towards:

• Stopping the spread of nuclear weapons and seeking a world without them. With this, Obama expressed a bold vision that must begin with a strengthening of the Nuclear Non-Proliferation Treaty, ratification of the Comprehensive Test Ban Treaty, and collective efforts to combat nuclear smuggling and theft, for “we must never allow a single nuclear device to fall into the hands of a violent extremist.”

• The pursuit of peace. Obama declared that we must “recognize that the yearning for peace is universal, and reassert our resolve to end conflicts around the world.”

• Preserving our planet. The United States in particular must confront the reality of climate change and enact policies that cut emissions, promote renewable energy and efficiency, and share new technologies around the world.

• Creating a global economy that advances opportunity for all people. Economic growth and prosperity cannot be limited to the rich nations of the world, and we must “set our sights on the eradication of extreme poverty in our time.”

Consistent with his past emphasis on individual responsibility, Obama appealed to the moral imperative of national responsibility, the principle that each nation shares in the obligation to make the ideals underlying the UN Charter a reality. Unlike President Bush and other naysayers of international cooperation, the imperfections of the UN and its past struggles to live up to the principles of its founding are not a reason to walk away, but “a calling to redouble our efforts.” Only by working to create the conditions for genuine dialogue among nations – founded upon mutual respect for the dignity of all human beings and cultures – may the world ever be in a position to resolve outdated grievances, alleviate poverty and hunger, and recognize and enforce universal principles of human rights, including equality, freedom of speech and religion, and the right of people everywhere to determine their own destiny.

As the President reminded the General Assembly, the concept of the United Nations “was rooted in the hard-earned lessons of war; rooted in the wisdom that nations could advance their interests by acting together instead of splitting apart.” I could not agree more. We live in a dangerous world, where the seeds of discontent and conflict are deeply embedded in a complex web of history, religion, culture, and economics. From the threat of a nuclear Iran, to the continued menace of al-Qaeda; from threats of genocide, mass starvation, and civil wars in Africa, to revolutionary fervor in Latin America; from rising HIV infections in Russia and Asia, to the specter of nuclear confrontation between India and Pakistan, the threats we face are daunting and, at times, overwhelming. But if we truly seek to rid the world of nuclear weapons, eliminate global terrorism, wipe out poverty and oppression, and advance the cause of human rights in all nations, we must recognize that none of these goals stands a chance if we act alone and fail to acknowledge our shared humanity. Only if the international community is united in purpose and in action may FDR’s dream of world peace and security ever be realized.

Monday, September 21, 2009

On Norman Borlaug and the Sad Priorities of a Celebrity-Obsessed Culture

The forgotten world is made up primarily of the developing nations, where most of the people, comprising more than fifty percent of the total world population, live in poverty, with hunger as a constant companion and fear of famine a continual menace.
--Norman Borlaug

Saturday, September 12, 2009, marked the passing of Norman Borlaug, an American plant pathologist who won the 1970 Nobel Peace Prize for his work in dramatically increasing food production in developing nations. The father of the “Green Revolution,” Borlaug is responsible for much of humanity’s progress in defeating world hunger over the past half-century. As the Nobel Committee declared in awarding Borlaug the Peace Prize, “More than any other single person of this age, he has helped provide bread for a hungry world.” Some believe that Borlaug over his lifetime saved as many as one billion people from starvation and hunger. Thanks to the worldwide efforts of Borlaug and his colleagues, food is more plentiful and affordable than at any time in history.

A practical humanitarian, Borlaug began in the 1940’s to apply scientific research in genetics, plant breeding, plant pathology, agronomy, soil science, and cereal technology to revolutionize wheat production in Mexico. His research and development of high-yield technologies enabled Mexico to become self-sufficient in wheat production and paved the way for rapid gains in grain production in many other countries throughout the developing world. Borlaug’s techniques nearly doubled food production in India and Pakistan during a five-year period in the 1960’s, and his applied research has continued to avert famine and hunger in these densely populated countries (forty years later, wheat production in India has increased nearly tenfold from when Borlaug first applied his techniques there).

Borlaug believed that "the first essential component of social justice is adequate food for all mankind" and that "[f]ood is the moral right of all who are born into this world." Following his successes in Mexico, India, and Pakistan, Borlaug began programs throughout Latin America, the Middle East, and other parts of Asia. In the 1980’s, his work greatly increased wheat yields in more than a half dozen countries in sub-Saharan Africa. A professor at Texas A&M University in his later years – he worked out of a windowless office and never sought great privilege or fame – he was a genuine American (and world) hero.

Although Borlaug was a remarkable man whose life and accomplishments should be celebrated, I confess that until I came across his obituary last week, I had never heard of him. And based on a few anecdotal conversations with some well-educated friends of mine, I suspect that 99 out of 100 Americans have never heard of him either. I cannot help but wonder, though, if the obscurity of Norman Borlaug is but a reflection of American culture – of an educational system that de-emphasizes science, math, and basic knowledge of major world trends; a society that shows little concern for the plight of other nations, particularly those enmeshed in poverty and whose inhabitants do not share a European heritage; a media that glamorizes narcissistic Hollywood celebrities, self-centered sports heroes, anorexic bikini-clad models, and drug-addicted rock stars, yet pays little attention to academic and scientific achievement; and a consumer culture that values the accumulation of wealth and material acquisitions, yet ignores the causes of world hunger, disease, and human suffering.

In a January 1997 article in The Atlantic Monthly, Gregg Easterbrook noted that, “[t]hough barely known in the country of his birth, elsewhere in the world Norman Borlaug is widely considered to be among the leading Americans of our age.” Why is this so? Why are Americans so woefully ignorant of the developing world? Why do the American media pay so little attention to the lives and achievements of people like Norman Borlaug? When Michael Jackson died, we were inundated for days with stories of candlelight vigils held in his memory, shrines created in his honor, and spontaneous outbursts celebrating his life and music. We idolize celebrities in a manner closely resembling religious worship, as if a pilgrimage to Graceland, or a shrine to Michael and Princess Diana can replace that lost portion of our soul no longer fulfilled by faith in God. We live in a celebrity-obsessed culture that values fame and wealth and glitz over substantive achievement, the virtues of public service, and selfless acts of charity.

It is as if our celebrity culture has become a replacement for family and a sense of community; as if Americans need to connect with celebrities on a spiritual level. We feed the paparazzi with an insatiable hunger for information on the most intimate aspects of the private lives of our public figures. We want to know about their sex lives, their personality quirks, their tastes in food and music; we judge the clothes they wear to every award ceremony, and devour magazines that critique hairstyles and shoe selections. Entire television channels and websites are devoted to celebrity worship; we read about them in People and Us and National Enquirer; watch them on Entertainment Tonight and the Biography channel; listen to them spill their guts on late-night television and Oprah. Yet we ignore the desperate cries of a poverty-stricken world, one full of human suffering and in need of repair; the same world Norman Borlaug devoted his life to improving, by alleviating hunger and disease and bringing a sense of justice and compassion to a viciously cruel, over-populated sea of humanity.

We rarely glamorize the works of our great scientists and cancer researchers, international assistance workers, and those who run our shelters and soup kitchens. Instead, we deify our celebrities; place them on pedestals until they fall back to earth, as they inevitably do, when their age increases and physical beauty declines, when their luck runs out or they do something to prove they, too, are human, full of imperfections, personality defects, and emotional insecurities.

In Revolutionary times, our heroes were George Washington and Benjamin Franklin, John Adams and Thomas Jefferson, men who symbolized the essential worthiness of our struggling democracy, standing as the embodiment of national virtue. Today, public virtue has been replaced by superficial notions of character and personality, glamour and packaged imagery; we measure success by photo opportunities and face time. The American attention span is said to hover around 9.8 seconds, making us the epitome of the sound-bite society; it affects our tastes in culture, the arts, and how we spend our time, and unduly influences our news reports and presidential campaigns.

Norman Borlaug died last week and, if only for a few minutes, a small portion of the world took note. There were likely no street stoppages and spontaneous dance parties; no candlelight vigils and makeshift shrines. But the lives of hundreds of millions of people were saved in part because Norman Borlaug devoted his life to improving the living conditions of others. He may have never appeared in People magazine or been interviewed on Hollywood Insider, and he may not be a household name in American schools, but his legacy -- that of Norman Borlaug, the forgotten benefactor of humanity -- lives on throughout the world.

Friday, September 18, 2009

A Baseball Quandary: Good Statistics, Bad Character, Broken Rules

There is perhaps no game that harmonizes so perfectly with statistical measurement as baseball. Virtually every aspect of the game is judged by a number or an equation: batting average (hits divided by at-bats), slugging percentage (total bases divided by at-bats), earned run average (earned runs multiplied by nine, divided by total innings pitched), strikeout-to-walk ratio, lefty-righty breakdowns – you name it, baseball probably has a stat for it. Statistics measure a player’s performance from week to week and year to year; they resolve salary arbitration disputes, help managers determine matchups and set lineups, and permit fans to debate the value of trades and compare players of different generations. When age and physical limitations finally betray a star player, his career statistics become his resume for future consideration at Cooperstown.
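For readers who like to see the arithmetic spelled out, the three formulas above can be sketched in a few lines of Python (the season line here is purely hypothetical, chosen only to illustrate the calculations):

```python
def batting_average(hits, at_bats):
    # Batting average: hits divided by at-bats
    return hits / at_bats

def slugging_percentage(total_bases, at_bats):
    # Slugging percentage: total bases divided by at-bats
    return total_bases / at_bats

def earned_run_average(earned_runs, innings_pitched):
    # ERA: earned runs multiplied by nine, divided by total innings pitched
    return earned_runs * 9 / innings_pitched

# Hypothetical season: 190 hits and 320 total bases in 600 at-bats;
# a pitcher allowing 65 earned runs over 210 innings
print(round(batting_average(190, 600), 3))      # 0.317
print(round(slugging_percentage(320, 600), 3))  # 0.533
print(round(earned_run_average(65, 210), 2))    # 2.79
```

A .317 average, as the example shows, means a hit roughly once every three at-bats – which is exactly why Jeter’s career number is so impressive.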

Thus, on September 11, 2009, when Derek Jeter slapped an opposite-field single into right field – marking his 2,722nd career hit and surpassing Lou Gehrig’s 70-year-old record as the all-time Yankees hit leader – his future entry into the Hall of Fame was all but assured. That Jeter is a superb baseball player is not disputed, in large part because his statistics bear it out; a career .317 hitter, ten-time All-Star, three-time recipient of the Gold Glove and Silver Slugger Awards, he is a genuine five-tool player, combining hitting, fielding, arm strength, power, and speed. At 35, Jeter remains at the top of his game, approaching his sixth 200-hit season and leading the Yankees to what will likely be their 40th American League pennant. By all accounts, he also is a man of exemplary character and professionalism, an ambassador of baseball, and a highly respected teammate. If Jeter retired today, he would deservingly be voted into the Hall of Fame in his first year of eligibility.

It is of some irony, then, that Jeter’s record-breaking hit occurred exactly 24 years after Pete Rose surpassed Ty Cobb as baseball’s all-time career hit leader. Rose finished his career with 4,256 hits, the most ever in the history of major league baseball, and 1,500 hits more than Gehrig (and Jeter at present). In a career that spanned 24 years (1963 to 1986), Rose finished with more than 200 hits ten times (and 198 hits in two other seasons), the most of any player in history. He was a three-time National League batting champion, a 17-time All-Star selection, and over several seasons led the league in runs (1969, 1974, 1975, 1976), hits (1965, 1968, 1970, 1972, 1973, 1976, 1981), and doubles (1974, 1975, 1976, 1978, 1980). Nicknamed “Charlie Hustle” because he always played hard – he sprinted to first base every time he was walked and broke up double plays as well as anyone who ever played the game – Rose also holds the all-time National League records for runs scored (2,165), doubles (746), and longest consecutive game hitting streak (44). He was no slouch defensively either, as he played five positions and twice won the Gold Glove (as an outfielder in 1969 and 1970). That Rose was one of baseball’s all-time greatest players is, statistically at least, difficult to refute. Unlike Derek Jeter, however, Pete Rose may never enter the Baseball Hall of Fame, not for lack of baseball merit, but for a lack of character, integrity, and contrition.

In 1988, the Director of Security for Major League Baseball received reports that Rose had placed bets on games played by the Cincinnati Reds while he played for and managed the team. If true, Rose would have violated baseball’s most sacrosanct prohibition, which originated in the aftermath of the Black Sox scandal of 1919, when players from the Chicago White Sox disgraced the game by accepting bribes from big-time gamblers to fix the World Series. The rule is now codified as Major League Rule 21(d), and is posted prominently in every team clubhouse: “Any player . . . who shall bet any sum whatsoever upon any baseball game in connection with which the bettor has a duty to perform shall be declared permanently ineligible.” Betting on any other baseball game in which the bettor has no duty to perform results in one year of ineligibility.

Rose denied under oath ever betting on professional baseball, but an investigation conducted in 1989 by John Dowd, a former mob prosecutor hired by the Office of Commissioner to investigate the gambling allegations, found otherwise. In his 225-page Report to the Commissioner (“The Dowd Report”), released on May 9, 1989, Dowd determined that “Pete Rose bet on baseball, and in particular, on games of the Cincinnati Reds Baseball Club, during the 1985, 1986, and 1987 seasons." Dowd found no evidence that Rose ever bet against the Cincinnati Reds (he bet on them to win), and there was no evidence of game fixing. Nevertheless, the report detailed overwhelming evidence that Rose violated the capital crime of Major League Baseball by betting on games in which he had a duty to perform.

I have studied the Dowd Report and, from my vantage point as a former prosecutor, there is no doubt that Rose violated Major League Rule 21(d). Although much of the evidence consists of the testimony of two convicted drug dealers – Rose’s former friend, Paul Janszen, and bookie Ron Peters – their testimony was independently corroborated by betting slips and Rose’s phone and bank records, which documented the flow of money and the activity described. Dowd also uncovered evidence of other “runners” used by Rose to place his bets (he did not place any bets directly), as well as various bookies utilized by Rose, several of whom had ties to organized crime (Dowd discovered hundreds of questionable financial transactions between Rose and these bookies). In his deposition, Rose admitted gambling extensively on college and professional basketball and football games (which did not violate Rule 21), and he acknowledged using runners to place the bets, so as to protect his privacy. He denied knowing many of the key witnesses against him (including Ron Peters), but when confronted with evidence that he had called them, or left tickets for them, or written checks to them, Rose came up with a variety of very weak explanations, such as failed real estate deals and sponsorship of card shows, none of which made much sense in context. With the assistance of an FBI gambling expert, who examined Rose’s financial records, Dowd showed that Rose placed nearly $15,000 in bets on a daily basis. Among the documentary evidence discussed in the report were three original betting slips in Rose’s handwriting (two of the betting slips also contained Rose’s fingerprints, although this evidence was withheld by the FBI and not made available to Dowd). In its totality, the evidence demonstrated conclusively that Rose bet on professional baseball games, including games played by the Reds.

That Rose, in fact, violated Rule 21(d) is thus not in issue for me, and I believe that his lifetime banishment from Major League Baseball was an appropriate sanction, necessary to protect the integrity of the game and to deter others tempted to repeat Rose’s mistakes. It is just and proper that Rose never again be allowed to participate in professional baseball in any official capacity – not as a coach, manager, hitting instructor, or motivational speaker.

Does Rose nevertheless belong in the Baseball Hall of Fame? I believe that he does, and that his plaque’s absence from the museum’s Gallery dishonors the game.

The Baseball Hall of Fame in Cooperstown, New York, is an independent, privately funded museum, which operates under its own rules, independent of Major League Baseball. To be elected to the Hall of Fame, a player must have played at least ten seasons in the major leagues and have been retired for five years. Hall of Fame Rule 5 states that voting (by the Baseball Writers Association of America) “shall be based upon the player’s record, playing ability, integrity, sportsmanship, character, and contributions to the team(s) on which the player played.” If Rose’s entry into the Hall were determined strictly by these criteria, he would have been selected a long time ago. While his character and integrity are certainly lacking, a cursory glance at the plaques lining the museum Gallery suggests that these traits are accorded little weight in determining membership. Ty Cobb, who held the all-time hits record until Rose broke it, was an unabashed racist who once slapped a black elevator operator he deemed “uppity,” then stabbed the night watchman (who also was black) when he tried to intervene (the matter was settled out of court in 1912). On May 15, 1912, Cobb severely beat a handicapped fan who heckled him during a game; he routinely spiked opposing players, and regularly used profanity and racial epithets; and he was despised by his own teammates. But in 24 years with the Detroit Tigers, Cobb won eleven batting titles and retired with 4,191 hits and a career .366 batting average. On the field, Cobb was one of the greatest players in the game’s history – his statistics bear that out – and so he is appropriately in the Hall of Fame. In baseball, if not in life, on-field play and statistical accomplishment rule.
Babe Ruth is in the Hall of Fame, not because he was a man of great integrity and character – he was a notorious drinker and womanizer – but because he ended his career with 714 home runs, 2,217 RBIs, a .342 batting average, and was one of the best players ever to play the game (he even won 94 games as a pitcher with a career 2.28 ERA). Gamblers, womanizers, boozers, insensitive rogues – there are many to be found among the men whose plaques fill the walls of Cooperstown. There also are many fine men among the list of Hall of Fame inductees, but all of them are there because of their on-field accomplishments.

It is not Rule 5’s references to integrity and character, then, which have kept Rose out of the Hall of Fame, but rather Rule 3e, which declares that “[a]ny player on Baseball’s Ineligibility List shall not be eligible for the Hall of Fame.” This rule did not exist, however, until 1991, just prior to when Rose first became eligible for consideration to the Hall. Following release of the Dowd Report, Rose sought to enjoin then-Commissioner A. Bartlett Giamatti from disciplining him; when a federal judge dismissed his suit, Rose and Giamatti reached a settlement. Rose agreed to be placed on the list of those “permanently ineligible” with the understanding that he would be allowed to apply for re-instatement after one year, and that Giamatti would consider such a request in good faith. The Office of Commissioner agreed to make no findings or determinations on whether Rose bet on baseball, and the settlement agreement said that “[n]othing in this agreement shall be deemed either an admission or denial by Peter Edward Rose of the allegation that he bet on any major league baseball game.”

Nevertheless, at the press conference announcing the agreement, a reporter asked Giamatti if he personally believed Rose had bet on baseball. Giamatti replied, “In the absence of a hearing and therefore in the absence of any evidence to the contrary, I am confronted by the factual record of Mr. Dowd. On the basis of that, yes, I have concluded that he bet on baseball” (as reported by James Reston, Jr., in Collision at Home Plate: The Lives of Pete Rose and Bart Giamatti, Harper Perennial, 1991, p. 307). Rose understandably was shocked and dismayed at what appeared to be a public retraction by the Commissioner and a breach of the settlement agreement, the precise language of which took several days to negotiate. Absent Rose’s understanding that Giamatti would make no formal findings, Rose never would have accepted “permanent ineligibility” status. As it happens, Giamatti died of a heart attack eight days later and never had an opportunity to clarify, correct, or expand on his comments.

Fay Vincent, who succeeded Giamatti as Commissioner until he resigned in 1992, apparently blamed Rose for Giamatti’s death and never forgave him. He has always been, and remains, publicly opposed to Rose’s re-instatement (see “The Confessions of Pete Rose,” by Fay Vincent, The New York Times, Op-Ed, January 2, 2004). In February 1991, 18 months after the Rose settlement and Giamatti’s death, and by some accounts at Vincent’s urging, the Hall of Fame’s Board of Directors enacted Rule 3e, which forbids Rose’s selection as long as he remains on Major League Baseball’s permanent ineligibility list.

Rose applied for re-instatement in 1997, but Commissioner Bud Selig has sat on the application for 12 years. No formal hearing has ever been scheduled and Selig has stated publicly that he has not seen anything that would convince him to overturn the original agreement.

Many people contend, quite reasonably, that Rose is his own worst enemy, that if he would simply admit his wrongdoing and show genuine contrition, all would be forgiven. Rose has made an admission of sorts; on March 14, 2007, during a sports radio program, Rose reportedly stated, “I bet on my team to win every night because I love my team. . . I made a big mistake. It’s my fault. It’s nobody else’s fault.” Whether this was a contrite admission or a cynical ploy to win favor does not, in my mind, make much difference. No ruling ever was made that Rose bet on baseball – by either the Commissioner’s Office or a court – and the Commissioner expressly agreed not to make any findings, nor to require an admission from Rose. It is contrary to Major League Baseball’s agreement with Rose, and lacks basic fairness, for another Commissioner to now require a full admission and undefined acts of contrition by Rose, when the Commissioner’s Office specifically agreed to “not make any formal findings or determinations on any matter including without limitation the allegation that Peter Edward Rose bet on any major league baseball games.”

Lawsuits and disputes are settled all the time, for reasons good and bad; settlements allow litigants to compromise and move forward. The Rose agreement avoided what promised to be a long, drawn-out court fight that would have harmed Major League Baseball. It resolved the case with an acceptable compromise: banishing Rose from baseball for life, which was and remains important for the game’s integrity, while implicitly leaving Rose the hope and expectation that he might eventually receive the Hall of Fame recognition his play justly deserved. Changing the rules after the fact – which is precisely what the 1991 enactment of Hall of Fame Rule 3(e) accomplished – smacks of unfairness and is more than a touch vindictive.

I was four years old when Rose got his first hit in the Major Leagues, and I was 28 when he retired from the game; I grew up and became a man during the baseball career of Pete Rose. Although I never particularly liked him – he was far too brash and cocky for me – I always respected his ability, talent, and competitive spirit. And he was sure fun to watch. He may have acted foolishly, and I am certain he bet on baseball, but his play on the field was never compromised. Rose holds the major league record for most winning games played (1,972) not simply because he played for 24 years, but because he was all about winning. There may have been wagers placed, but there was no game fixing by Rose – his Hustle, on the field if not off, was all “Charlie.”

Over the years, Rose has remained bitter and unrepentant. It is hard not to feel some sadness, though, for there are few American figures who have achieved so much and fallen so far. Rose’s story is full of peaks and valleys – he achieved greatness and glory on the baseball diamond and became a national celebrity and a hero to thousands, only to throw it all away through senseless greed and stubborn pride. Rose has many personal failings – the list is long – but his baseball performance on the field from 1963, when he won the Rookie of the Year Award, to 1986, when he finally set aside his glove and cleats, was not one of them.

Peter Edward Rose belongs in the Hall of Fame. To ignore his accomplishments, to treat him as a pariah forever, is to ignore a significant piece of baseball history. Keeping him out of the game of baseball is necessary to protect the game’s integrity; keeping him out of the Hall of Fame devalues the historical record. As James Reston, Jr., asked rhetorically in Collision at Home Plate, “If Cooperstown truly represents the history of baseball, how can it overlook the game’s all-time hit leader and the game’s all-time tragic figure?” Include a permanent exhibit explaining why gambling is baseball’s capital crime and why Rose’s gambling damaged not just his own integrity, but that of professional baseball. Museums, after all, should educate as well as entertain. But let us once and for all properly recognize and celebrate Rose’s on-field play, which was never compromised by his gambling activities. Don’t dishonor the game by excluding from the Hall of Fame one of its greatest and most accomplished players.

Wednesday, September 9, 2009

The President Can Talk To My Kids Anytime

They warned of socialist indoctrination, political propaganda, and all manner of sorcery and witchcraft. School districts throughout the country, particularly in Texas and Virginia, declined to show the speech. In those districts that did show it, some parents demanded that their children be excluded. As reported in today's New York Times, a fourth-grade girl at an elementary school in Greensboro, North Carolina, sat alone eating her lunch as the rest of her classmates watched the speech in an adjoining classroom. Some parents even kept their children home from school rather than -- God forbid -- let them watch a fifteen-minute inspirational talk by the President of the United States.

Given the hype and controversy leading up to the speech, one would expect that it was a real humdinger, perhaps a flowery manifesto in support of Big Government, Death Panels, Income Re-Distribution, and Taxes. Perhaps an attempt to brainwash our children into becoming servants of the State, followers of Karl Marx, and believers in socialized medicine. Well, not quite. As it turns out, after several days of right-wing conspiracy theories and talk radio venom, the President in the end spoiled their day.

Instead, the President spoke of the importance of education and of every student's individual responsibility to attend school, "pay attention" to teachers, listen to parents and grandparents, and "put in the hard work it takes to succeed." He instructed that, "no matter what you do in life, I guarantee that you'll need an education to do it. . . . You cannot drop out of school and just drop into a good job. You've got to train for it and work for it and learn for it."

He spoke from the heart of the emotional pain of growing up as a fatherless child; of how he "missed having a father" and recalled "times when I was lonely and I felt like I didn't fit in." He understood that some students do not have adults in their lives to provide an affirming presence and much-needed support; that some children live in neighborhoods "where you don't feel safe, or have friends who are pressuring you to do things you know aren't right." But no matter one's lot in life, the President admonished that "none of that is an excuse for talking back to your teacher, or cutting class, or dropping out of school. There is no excuse for not trying." He recited inspirational examples of students who have overcome tremendous odds -- language barriers, cancer, foster homes and tough neighborhoods -- and through hard work and perseverance have gone on to college and, in one case, medical school. These students overcame great obstacles because they "chose to take responsibility for their lives, for their education, and [to] set goals for themselves. And I expect all of you to do the same."

He spoke of the need to experience failure before one can experience success, using as examples J.K. Rowling, whose first Harry Potter book was rejected by 12 publishing companies before it was finally accepted, and Michael Jordan, who was cut from his high school varsity basketball team as a sophomore. "You become good at things through hard work. . . . You've got to practice. . . . You might have to do a math problem a few times before you get it right. You might have to read something a few times before you understand it. You definitely have to do a few drafts of a paper before it's good enough to hand in." He said that no one should be afraid to ask questions or "for help when you need it. I do that every day."

"The story of America," the President said, is not "about people who quit when things got tough. It's about people who kept going, who tried harder, who loved their country too much to do anything less than their best." He urged all of the students to put their best efforts into everything they did, and that he expected great things from all of them. "So don't let us down. Don't let your family down or your country down. Most of all, don't let yourself down."

If this is socialist indoctrination, then please dish out some more. I feel sorry for the fourth grader in Greensboro, for it is not her fault that she was cursed with such closed-minded, intolerant parents, so full of fear and paranoia that they cannot permit their daughter to listen to the President of the United States speak about hard work and responsibility. But as for the school children in Philadelphia, and Camden, and the Bronx, and throughout most of America, I am really pleased that they sat and listened to the President's talk. It is these students for whom the President's message may have the most resonance, students who understand that the President's heart is in the right place, that he really does want all of them -- black or white, rich or poor, male or female -- to succeed; that he wants America to be a better country than we are now or have ever been. He believes in America's youth, because he believes in America. As one young middle schooler from Denver said, "I think it was pretty cool to have the leader of the country speaking to us." As for my kids, the President can speak to them any time his heart desires.

Sunday, September 6, 2009

Growing Doubts On Afghanistan

During a national nightly news broadcast in February 1968, Walter Cronkite questioned the legitimacy of America’s continued involvement in the Vietnam War. It was a defining moment. Following the broadcast, President Johnson famously told an aide, “If I’ve lost Cronkite, I’ve lost middle America.” He was right. American support for the war substantially declined and, soon thereafter, Johnson announced that he was not running for re-election.

On September 1, 2009, conservative columnist George Will published an op-ed piece in The Washington Post entitled “Time to Get Out of Afghanistan.” Although Will may not rank with Cronkite as the “most trusted man in America,” his dissent from the ranks of pro-war sentiment is nevertheless significant and potentially influential. Will is a model of civil discourse – a thoughtful, intelligent, and well-respected commentator who appeals to reason at a time when the news media is dominated by shouting pundits and short on critical thinking.

Will correctly notes that the United States has been entangled in Afghanistan for eight years – longer than its combined involvements in the two world wars of the Twentieth Century. He contends that our stated policy of “clear, hold, and build” – that is, clear various regions of Taliban control, hold U.S./Afghan control of those areas, and build effective local, district and provincial governments – is doomed to failure; “nation-building would be impossible even if we knew how, and even if Afghanistan were not the second-worst place to try.” According to the Brookings Institution, only Somalia ranks below Afghanistan among the world’s weakest nation-states. Will contends that Afghanistan has never had an effective central government and, citing recent commentary in The Economist, states that the regime of President Hamid Karzai is “so ‘inept, corrupt and predatory’ that people sometimes yearn for restoration of the warlords, ‘who were less venal and less brutal than Mr. Karzai’s lot.’” We presently have 68,000 American troops in Afghanistan (bringing the coalition total to 110,000), and President Obama is considering adding thousands more; yet most experts believe that the counterinsurgency effort needed to protect the population would require “hundreds of thousands of coalition troops, perhaps for a decade or more. That is inconceivable.”

Although most Republicans, including the editorialists of Fox News and The Wall Street Journal, continue to beat the drum for military escalation, Will is not alone on the right in espousing a more cautious approach. Former Republican Senator Chuck Hagel of Nebraska, for one, also advocates withdrawal of U.S. troops: “Bogging down large armies in historically complex, dangerous areas ends in disaster.” Hagel contends that the United States must recognize that every great threat to our country also threatens our global partners, including former adversaries China, Russia, India, and Turkey. “We need a clearly defined strategy that accounts for the interconnectedness and the shared interests of all nations.” Hagel suggests that we should not “view U.S. involvement in Iraq and Afghanistan through a lens that sees only ‘winning’ or ‘losing.’ . . . There are too many cultural, ethnic and religious dynamics at play in these regions for any one nation to control.”

Like many Democrats, I have until recently believed that Afghanistan was the “good war” in our fight against global terrorism. Afghanistan is where the attacks on the United States of September 11, 2001, were planned and executed. The Taliban forces who supported Osama bin Laden and provided a safe haven for al-Qaeda were legitimate targets of U.S. military might. I continue to believe that we were right to invade Afghanistan in October 2001, and that we were wrong to invade Iraq, a war entered under false pretenses that diverted our military and security resources to a country that had not attacked us and posed no direct threat to us. I agreed with President Obama when he asserted during the campaign that U.S. resources should be increased in Afghanistan and decreased in Iraq.

So it is with some reluctance that I express doubts about the President’s strategy and acknowledge the legitimacy of questions raised by George Will, Chuck Hagel, and an increasing number of more liberal commentators. As Bob Herbert wrote in Saturday’s New York Times, “We’re fighting on behalf of an incompetent and hopelessly corrupt government in Afghanistan. If our ultimate goal, as the administration tells us, is a government that can effectively run the country, protect its own population and defeat the Taliban, our troops will be fighting and dying in Afghanistan for many, many years to come.”

Although Vietnam analogies can be tiresome, and while liberals too easily equate all American military incursions with Vietnam, some comparisons between Afghanistan and Vietnam are indeed apt.

• Like President Johnson during the Vietnam War, President Obama appears eager to demonstrate his toughness by vowing to do what it takes to “win” in Afghanistan – even though what is meant by winning is far from clear. When Obama’s special representative to Afghanistan, Richard Holbrooke, was asked last month to define “success” in Afghanistan, he replied, “We’ll know it when we see it.”

• As was the case with South Vietnam, Afghanistan is a deeply divided, semi-failed state with an incompetent, corrupt government that is considered illegitimate by a large portion of its population.

• As with Vietnam, the United States is embroiled in a nation whose culture, history, and language we barely understand.

• Similar to Vietnam, Afghanistan has an inhospitable geography, with mountainous terrain, snowy winters, and numerous caves and escape routes that provide off-limits sanctuaries across 9,000 miles of borders. It is ideal terrain for indigenous resistance to foreign invaders, providing the Taliban in certain areas with a distinct advantage.

• As with Vietnam’s resistance of French colonialism prior to the arrival of U.S. forces, Afghanistan successfully resisted military incursions by the British and the former Soviet Union.

• As with LBJ and Vietnam in the 1960’s, the conflict in Afghanistan threatens to derail President Obama’s efforts to reshape America at home. A military escalation in Afghanistan can only serve to divert much needed resources away from the President’s attempts to reform a troubled health care industry, revive a broken economy, prevent global warming, and restore America’s standing in the world.

None of this should be taken as criticism of the brave U.S. military forces stationed in Afghanistan. The ability and skill of our armed forces are unmatched. But many years after U.S. forces had completely withdrawn from Vietnam, Col. Harry Summers, a military historian, said to a counterpart in the North Vietnamese Army, “You never defeated us in the field.” The NVA officer replied, “That may be true. It is also irrelevant.” The Viet Cong knew that one day U.S. forces would leave and, until that day arrived, they would outlast us. At the peak of the Vietnam conflict, LBJ confided in Senator Richard Russell that he knew we could not win the war in Vietnam, but he felt compelled to stay the course so as to avoid becoming the first American president to lose a war. Johnson’s pride and political calculations cost the lives of tens of thousands of America’s finest young men. These should be warnings to President Obama. While the number of U.S. casualties in Afghanistan will likely never approach Vietnam War levels, President Obama risks committing to a war that has no end in sight and no apparent upside.

While the goal of General Stanley McChrystal, the commander in Afghanistan, is to establish a reasonably noncorrupt Afghan state that will partner with America in keeping Afghanistan free of the Taliban and al-Qaeda, it is clear that he is talking about nation building in one of the poorest, most tribalized countries in the world. As Thomas Friedman contends in today’s New York Times:

It would be one thing if the people we were fighting with and for represented everything the Taliban did not: decency, respect for women’s rights and education, respect for the rule of law and democratic values and rejection of drug-dealing. But they do not. Too many in this Kabul government are just a different kind of bad. This has become a war between light black — Karzai & Co. — and dark black — Taliban Inc. And light black is simply not good enough to ask Americans to pay for with blood or treasure.

Obama has framed Afghanistan as a war of necessity, not a war of choice. No one disputes that there are people and organizations committed to harming America, or that strong measures are needed to protect us from these threats. But how and when we use force is a decision that must not be made in the mere hope that it will succeed.

If we still have not learned that it is virtually impossible to defeat an enemy we do not understand on terrain we cannot control, then President Obama is destined to repeat the failures of history. There is much more to this debate, and the issues are complex and not easily boxed into the Vietnam analogy. But I cannot help but believe that a coordinated policy of containment and deterrence of the Taliban and al-Qaeda, coupled with strategic development, military training, and economic aid to the Afghan government, will do more to keep us safe and to prevent a resurgence of Taliban influence in the Pashtun regions of Pakistan than will a military strategy of "winning" at all costs. As Senator Hagel said, “Relying on the use of force as a centerpiece of our global strategy, as we have in recent years, is economically, strategically and politically unsustainable and will result in unnecessary tragedy – especially for the men and women, and their families, who serve our country.”