Friday, November 28, 2008

Roy Zimmerman: "Christma-Hanu-Rama-Ka-Dona-Kwaanza"



This is Roy Zimmerman performing his all-purpose holiday tune “Christma-Hanu-Rama-Ka-Dona-Kwaanza” on KPIG Radio's "Please Stand By" with host John Sandidge.

Wednesday, November 26, 2008

Stan Getz: “Out of Nowhere” (1960)



The Stan Getz quartet performs the Edward Heyman & Johnny Green tune, “Out of Nowhere”. This clip was recorded in Düsseldorf on March 28, 1960, with Jan Johansson (piano), Ray Brown (bass) and Ed Thigpen (drums).

Stanley Gayetzky (1927–1991), usually known by his stage name Stan Getz, was an American jazz saxophonist. His parents were Ukrainian Jews who immigrated to the United States in 1903. Getz began playing professionally in 1943 and became popular in the 1950s playing cool jazz. In the 1960s he became a central figure in bossa nova and Brazilian jazz.

He performs two of his most popular recordings, “Desafinado” and “The Girl from Ipanema”, here and plays with John Coltrane here.

The Thanksgiving tradition reveals more about what we have forgotten about the past than what we remember

Thanksgiving is a tradition that likely evolved from ancient harvest festivals common in agricultural societies around the globe. It was not a formal holiday in the United States but was observed from time to time by presidential proclamation. Only after Abraham Lincoln’s proclamation in 1863 did the last Thursday in November become the Thanksgiving Day we have observed ever since.

Thanksgiving, like Christmas, has its own creation story: in 1621, Pilgrims and Indians sat down together at a feast to celebrate a successful harvest. According to the myth, a tradition was born. Yet the truth is a little more complex, as Karl Jacoby explains in the L.A. Times:
When Americans sit down to our annual Thanksgiving meal with family and friends, we like to imagine that we are reenacting a scene that first took place in 1621. That year, having made a successful harvest after a brutal winter that killed half their number, the 50 or so surviving Colonists in Plymouth "entertained and feasted," in the words of one, a visiting delegation of nearby Wampanoag Indians, led by "their greatest king," Massasoit.

American holidays, however, sometimes reveal more about what we have forgotten about the past than what we remember. Historical records indicate that the parties dined on venison and corn rather than on the stuffing, cranberry sauce and pumpkin pie Americans have come to associate with Thanksgiving, and that the feast probably took place in the early autumn rather than November. Moreover, it is not even clear that the Pilgrims referred to their 1621 celebration as a thanksgiving. To devout Pilgrims, a day of thanksgiving was usually a solemn religious undertaking, marked by worship and, often, fasting. It was not a day spent gorging on wild deer and engaging in "recreations" with one's Indian neighbors.

Although there were sporadic local Thanksgiving days in Colonial and early America, it was not until the middle of the Civil War -- 1863 -- that President Lincoln issued a proclamation making the last Thursday in November a national holiday of Thanksgiving. Lincoln's statement suggested that thanks were being given as much for "the advancing armies and navies of the Union" as for a bountiful harvest, and the president urged special prayers for "all those who have become widows, orphans, mourners or sufferers in the lamentable civil strife in which we are unavoidably engaged."

Not surprisingly, few at the time viewed Thanksgiving as a private, family occasion. Instead, Northern civilians donated turkey and cranberries to feed Union troops, while Jefferson Davis declared separate Thanksgiving holidays for the Confederacy.

During Reconstruction, many Southerners initially expressed reluctance at celebrating what they saw to be a Yankee holiday. And yet it was at this moment, as the recently rejoined United States struggled to reconcile its populace after a divisive Civil War, that it became useful to reinvent the history of Thanksgiving. Most Americans found it far more pleasant to imagine this American holiday as originating not during the traumas of the 1860s but rather during the more distant past of the early 1600s. To partisans of the Union and the Confederacy alike, the image of Pilgrims and Indians sitting down together to a shared meal offered a comforting vision of peace between potential rivals.

Yet this new image of Thanksgiving not only allowed Americans to gloss over the deep divisions that had led to the Civil War, it also overlooked much of the subsequent history of the Pilgrims' relations with their Indian neighbors. About 50 years after Massasoit and his fellow Wampanoags enjoyed their harvest meal at Plymouth, the Colonists' seizures of Wampanoag land would precipitate a vicious war between Plymouth Colony and the Wampanoags, now led by Massasoit's son, Metacom.

Most of the other peoples in New England at first tried to avoid the conflict between the onetime participants in the "first Thanksgiving." But the confrontation soon engulfed the entire region, pitting the New England Colonies against a fragile alliance of Wampanoags, Narragansetts, Nipmucs and other Native American groups. Although these allies succeeded in killing hundreds of Colonists and burning British settlements up to the very fringes of Boston itself, the losses suffered by New England's indigenous peoples were even more devastating. Thousands died over the two years of the war, and many of those captured were sold into slavery in the British West Indies, including Metacom's wife and 9-year-old son.

Metacom met his end at the hands of a Colonial scouting party in August of 1676. His killers quartered and decapitated his body and sent Metacom's head to Plymouth, where for two decades it would be prominently displayed on a pike outside the colony's entrance. That same year, as the violence drew to a close, the colony of Connecticut declared a "day of Publique Thankesgiving" to celebrate "the subdueing of our enemies."

Perhaps it is not surprising that we choose to remember the Thanksgiving of 1621 and to forget the Thanksgiving of 1676. Who, after all, would not prefer to celebrate a moment of peaceful unity rather than one of bloody conflict? But if our public holidays are meant to be moments for self-reflection as well as self-congratulation, we should think of Thanksgiving not as a perpetual reenactment of the "first Thanksgiving" of 1621 but instead as a dynamic event whose meaning has shifted over time.

We need not forget Massasoit's pleasant experiences dining with the Pilgrims in order to remember the more troubling fate of his son at the hands of the Pilgrims' descendants. Indeed, commemorating all the many reasons Americans have expressed thanks over the centuries allows us to come to a more complete and more honest understanding of our history. For while we cannot change events in the past, we do have the power to decide what we wish to be thankful for now and in the future.

Tuesday, November 25, 2008

The foreclosure crisis



Communities crushed by the foreclosure crisis are dealing with hundreds, and sometimes thousands, of abandoned and deteriorating houses. As local governments scramble to grapple with the problem, Congress, which bailed out Wall Street to the tune of $700 billion, has provided little relief. Congress has set aside $4 billion to help communities buy up and repair foreclosed houses, but will it be enough? According to Mary Kane in the Washington Independent:
Places hit hard by the foreclosure crisis, like Prince William County, are dealing with hundreds, and sometimes thousands, of abandoned and deteriorating properties like the Irongate townhouse — the damage left behind by the subprime mess. Unlike banks, insurance companies and others that have gotten a piece of the $700-billion rescue bill to help with their credit crisis problems, cities and suburbs are mostly on their own.

It wasn’t supposed to be this way. Politicians in Washington crowed this summer about helping homeowners with a mortgage rescue bill that included $300 billion in guarantees for refinanced mortgages and $4 billion for communities to buy up and repair foreclosed houses.

But since the program launched in October, the Federal Housing Admin. has received only 42 applications to refinance mortgages. That’s a far cry from the 400,000 or so homeowners expected to avoid foreclosure with the lower payment loans.

The issue is that the program is strictly voluntary for lenders. Congress could have made taking part in it a condition of getting money from the Treasury rescue plan — but it didn’t. In an effort to address this omission, government officials announced last week they would make it easier for borrowers to qualify for the loans, in order to draw more applicants.

The idea behind the mortgage rescue bill during the summer had been to combine those refinanced mortgages with the $4 billion for foreclosed properties, and make a dent, on the ground, in the foreclosure crisis, according to Danilo Pelletiere, research director of the National Low Income Housing Coalition. Instead, foreclosures grew at record levels. Refinancings faltered. Now there’s just the $4-billion piece.

Communities have to finish their plans for the money by Dec. 1. The U.S. Dept. of Housing and Urban Development is going to approve the proposals, and give out the funds in February, at the earliest. By contrast, the Treasury bailout plan was approved in two weeks.

“There is no question that you are throwing a small amount of money at a very big problem,” Pelletiere said. “The way this thing has panned out is that it’s a really small amount of help. It really looks pretty wimpy.”
(The video clip above is from the American News Project.)

Tuesday, November 18, 2008

Who is to blame for the disaster in the Democratic Republic of the Congo?

Who is to blame for the disaster in the Democratic Republic of the Congo? There is plenty of blame to go around, including not only the actors on the scene but also the international community, which does nothing. Humanitarian rhetoric flows from the international community, but effective intervention does not. In the meantime, death and destruction spread. Erin A. Weir of Refugees International has these observations:
Violence re-erupted in the North Kivu province of the Democratic Republic of the Congo on the evening of October 26th and the redisplacement of tens, and then hundreds of thousands of people began.

By Wednesday, October 29th, the untrained, unpaid, uncontrolled Congolese military had abandoned their post and were actively terrorizing the population in Goma and throughout the province.

By the end of the week the world had caught on to the disaster. Foreign ministers and senior diplomats flooded into Goma, and with them the international press corps. The hand wringing and finger pointing had begun in earnest.

And who was to blame? Laurent Nkunda and his rebel forces were certainly first in line, as well as their backers in Rwanda, and the weak and ineffectual Congolese army. Most worrying though was the failure of the UN forces themselves. Why had “the world’s largest peacekeeping force” failed to protect the people of Congo?

The answer, now three weeks into this crisis, should be abundantly clear to anyone paying attention. For all the expressions of concern and support, for all the press conferences from the “front lines” of the DR Congo, the member states represented on the UN Security Council have persisted in doing absolutely nothing.

Long before the violence reignited, the UN Mission in the DR Congo (known by the acronym MONUC) had been requesting additional troops and other resources in order to carry out the many complex responsibilities that the Security Council has placed on their shoulders. With the high-profile of this most recent crisis, and the very public attention that it has received from the highest echelons of power, one might be led to believe that this time the politicians in Washington, London and Paris, might actually come through with the material and – more importantly – the political support that MONUC needs to get the job done.

Unfortunately, true to form, the most powerful members of the Security Council seem content to just be seen to be paying attention, and are not at all bothered by the total lack of any concrete action. Of course, the crisis is not entirely off their radar. The Council has, after all, penciled in a slot to discuss these matters… on November 26th, mind you, a full month after the crisis began. They very well may authorize reinforcements on that date, but it will take four to five months to get these forces on the ground. This, it seems, is what they meant when, in an October 29th statement, the Council promised to “study expeditiously” the matter of additional resources.

The trouble is that having failed to reinforce MONUC before the crisis ignited, the simple addition of troops after the fact is not going to cut it. MONUC forces need reinforcement, but they also need some time to regroup, and the hundreds of thousands of displaced and terrorized people in North Kivu need to see an ounce of stability so that they can begin to rebuild the lives that have been ripped to pieces in the last three weeks. UN deployments, however effective, take time, and time is not a luxury that the world can afford in North Kivu.

Angola, the Congo’s southern neighbor, has offered to enter the breach and fight alongside Congolese forces as they have done in the past. But this risks even more involvement of other regional actors like Rwanda, which is less inclined to fight on the side of the Congolese government. As bad as the situation continues to be in North Kivu, the prospect of a fully fledged regional war is far worse still.

Monday, November 17, 2008

Atheists in foxholes

The Christianization of the military is something that should worry Americans. American soldiers and sailors represent the United States, not one religion or another. The military is an instrument of foreign policy and national defense, not a religious sword that exists to pry open other nations for the sake of spreading the gospel. Nothing positive has ever come of other nations' armed forces falling under the sway of a particular religion, and there is no reason to believe the United States military is any different.

Yet stories abound of officers and enlisted personnel being targeted by evangelical Christians, and of American armed forces becoming more hostile to Americans who don’t share their particular form of Christianity. This has led to litigation by soldiers who face retaliation for not marching in lockstep with evangelical Christianity.

Does this organized expression of Christianity have a long tradition in the American military? Not really, according to Jonathan Herzog in the Washington Independent:
So there are atheists in foxholes after all.

Last week, on the eve of Veterans Day, the Secular Coalition for America and the Military Assn. of Atheists and Freethinkers held a news conference in Washington to present an open letter to President-elect Barack Obama. Citing a report that found 21 percent of those in the armed forces identifying themselves as atheists or having “no religion,” the groups called on the new administration to pursue a military policy more open to nonbelievers.

The action follows on the heels of a much-publicized legal case involving atheism and the military. Jeremy Hall, 23, a U.S. Army specialist, grew up a Bible-reading Baptist in rural North Carolina. But his faith in God did not survive the battlefields of Iraq. Since disclosing his atheism, Hall claims he has become a target of insult and scorn — labeled “immoral,” “devil worshiper” and, curiously enough, gay — by fellow GIs and superior officers. But the pith of his complaint runs deeper than personal insult.

In his lawsuit, filed in Kansas last year, Hall and his co-plaintiff, the Military Religious Freedom Foundation, accuse the military establishment not only of prejudice against nonbelievers but of blatant favoritism toward Christianity. As the suit challenging the place of religion in the armed forces lumbers toward a constitutional showdown, Hall and the Secular Coalition for America have sparked a national conversation about one of the military’s least discussed shibboleths.

The battle lines are already drawn. Critics depict Hall’s complaint as a campaign to destroy the spiritual foundation that the nation’s military has depended on for centuries. (“His right to spew his lying hot air cannot be allowed to decrease the morale of soldiers in combat,” writes one Christian blogger.) Meanwhile, the latest crop of best-selling atheists grant Hall some form of secular sainthood.

U.S. martial leaders have long prayed before and after battle: George Washington at the close of the Revolutionary War; George Dewey after his victory against the Spanish fleet at Manila; and Dwight D. Eisenhower on the eve of D-Day. Chaplains have also been key components of U.S. fighting forces, from the ragtag colonial militias to the highly professional units of today.

So when Americans learn that soldiers are being evangelized on military bases, that religious materials are often circulated among troops and that depictions of Washington kneeling in prayer are ubiquitous in military circles, they might likely see all this as an organic part of a venerable tradition.

But these incidents are anything but organic — and not nearly as deeply rooted as one might imagine. In fact, they are largely the residue of a forgotten footnote to U.S. military history during the late 1940s and 1950s — a time when civilian and military leaders attempted to imbue the armed forces with religious zeal and purpose.

At issue today, however, is not the place of religion in the military. Rather, it is the official sanction that government gives it. While this matter is given special weight by those who see America in the midst of a modern holy war against terrorism, it has precedent in the nation’s last great quasi-religious crusade — the battle against atheistic communism.

More than 60 years ago, when the Cold War was menacing but still unnamed, U.S. leaders faced the luckless dilemma of picking their own poison. If they demobilized the military after World War II, as their predecessors had done after previous wars, the Soviet threat might become unmanageable. But maintaining a large standing military would betray a national principle. It was considered profoundly un-American to maintain a powerful armed force in a time of peace. According to a long line of patriots, from Samuel Adams on down, standing armies threatened liberty and smothered virtue.

Added to this dilemma was a spiritual wild card. While Americans today would probably define communism as a political or economic philosophy, decision-makers in the 1940s and 1950s viewed it as a quasi-religion. It had prophets and prophecy, missionaries and martyrs, and a belief in the ultimate perfectibility of mankind through inevitable historical process.

National-security analysts fretted over the almost “messianic” devotion of Soviet citizens. Military leaders worried that physical force alone might be insufficient in the emerging Cold War. “Over and over again, gigantic concentrations of physical power have gone down in defeat before a lesser strength propelled by conviction,” warned one brigadier general in 1949. “The Goliaths have perished at the hands of the Davids.”

President Harry S. Truman decided to run the risk of America maintaining a sizable standing military. But to many, his cure looked worse than the disease. In 1938, only one in five servicemen was younger than 21. Ten years later, they made up more than half the military and accounted for 70 percent of all enlistments. America’s new standing army was regarded as puerile, impressionable and naïve.

Military leaders wondered if they stood on the verge of creating a potential Frankenstein monster. Their plan needed a fail-safe. So they decided not to pull the plug on their monster — but to give it a soul instead. To this end, religion became indispensable.

Military leaders vigorously blended the martial with the sacred to foster virtue and create spiritual warriors immune to the siren songs of communism. In the Fort Knox Experiment of 1947, the army toyed with the idea of simultaneously running new recruits through a physical and religious boot camp. When this proved too blatantly unconstitutional for Army-wide adoption, the “Fort Knox methods” lived on in the Army’s commitment to develop the spiritual side of its troops.

Truman thought so highly of this mission that, one year later, he created the President’s Committee on Religion and Welfare in the Armed Forces, the first presidential commission devoted to religion. Its members designed campaigns to encourage soldiers to attend church; to urge local religious groups to invite servicemen to their congregations; and to revitalize the military chaplaincy.

While the military brass had no stomach for mandatory religious services, it did authorize, beginning in the late 1940s, various “character guidance” programs run by the reorganized chaplaincy. New recruits attended a minimum of six hours of chaplain lectures on such topics as the sacredness of marriage, the relationship between democracy and religion, and the dangerous faith of communism. All other personnel had to attend similar lectures once a month.

Among other things, soldiers learned that in the Cold War, the United States, a “covenant nation” due to its reliance on God, confronted the “demonic nation” of the Soviet Union. In a contest between God and Satan, military leaders bet on the home team.

This was tame compared to the religious programs of the newly independent Air Force. Under Maj. Gen. Charles I. Carpenter, the Air Force project consisted of lay retreats, on-base preaching missions by religious groups and the confiscation of obscene materials.

Carpenter also believed in the power of religion to solve the personal problems of Air Force personnel. Consider one case cited by a U.S. Air Force report. A military surgeon reported treating an airman suffering from a nervous breakdown. The diagnosis: neurosis stemming from religious confusion. The prescription: a session with the base chaplain, who set up a “systematic plan” of religious treatment.

Nor did Carpenter stop there. In late 1948, he struck a deal with the Moody Bible Institute of Science, an evangelical organization devoted to repairing the damage done to religion by Darwinism. Soon, airmen across America and throughout the world were watching films like “God of Creation” and “Duty or Destiny.” The Air Force even provided the representatives of the Moody Institute with a fully crewed B-25. By 1951, nearly 200,000 Air Force personnel were watching Moody films each year.

Nonbelievers like Hall must have existed in the 1950s, or, at the very least, troops uncomfortable with the idea of religious training. But few spoke up. It took a 1962 decision by the U.S. Supreme Court to end the 15-year period of officially sanctioned military sacralization.

In the wake of Engel vs. Vitale, the Supreme Court ruling that deemed prayer in public schools unconstitutional, the Washington director of the American Civil Liberties Union brought grievances of “religious indoctrination” directly to Army Sec. Cyrus R. Vance. Vance responded quickly. In March 1963, he ordered Army chaplains to create a new, secular version of character guidance — outside chapels and without sermonizing. The other services did the same.

As long as the United States remains a religious country, there will be religion in the military. And while the outcome of Hall’s lawsuit is uncertain, it has sparked a worthwhile conversation about faith and the uniform.

Understanding why the military was allowed to craft its own religious imprimatur 60 years ago takes no large stretch of the imagination. During an era when the truly religious could not be communists, the truly irreligious could not be Americans. This axiom rang particularly true for those on the front lines of the Cold War.

Those lamenting Hall’s lawsuit today should consider this slice of military history. From Puritan dreams to evangelical rallies, religion has remained a constant force in our national journey — the military’s in particular.

But the official sanctions afforded it have been anything but constant. Few today realize just how much of the military’s current positions toward religion, far from being longtime American attitudes, are merely vestiges from the Cold War era.

Those cheering Hall’s case should appreciate the extent to which the military has grown more secular over the past few decades. Where once the U.S. Air Force supplied airplanes to evangelists, it now officially insists that commanders “not take it upon themselves to change or coercively influence the religious views of subordinates.”

During the struggle against atheistic communism, comments like those of the Army’s Lt. Gen. William Boykin — who in 2004 called the war on terror a battle against “Satan” — were not only common but celebrated. Today, they are decried by the command structure, including President George W. Bush.

Throughout history, the Davids have sometimes slain the Goliaths. But more often, the stronger, better-equipped force prevailed — with or without the blessings of the Almighty.

Maybe this is what Hall means when he says that while he doesn’t believe in God, he does “believe in Plexiglas.” Whether he wanted to or not, Hall may have stumbled on the ultimate form of “coming out” in the military, and this may require the consideration of military leaders, an appreciation for the military’s religiously sanctioned past and perhaps even a decision from the next commander-in-chief.

If nothing else, it would give a new meaning to the policy of “Don’t ask, don’t tell.”

Sunday, November 16, 2008

Afghanistan: It’s still winnable, but only just

David Kilcullen is a former Australian Army officer who now advises the United States Department of State on counterinsurgency. In 2007 he served in Iraq with the Multi-National Force on the staff of General David Petraeus. He is currently a senior fellow at the Center for a New American Security.

George Packer interviewed Kilcullen regarding the deteriorating situation in Afghanistan:
The White House briefed both campaigns on Afghanistan before the election. Apparently that’s how little time we have to turn things around. So how bad is it?

It’s bad: violence is way up, Taliban influence has spread at the local level, and popular confidence in the government and the international community is waning fast. It’s still winnable, but only just, and to turn this thing around will take an extremely major effort starting with local-level governance, political strategy, giving the Afghan people a well-founded feeling of security, and dealing with the active sanctuary in Pakistan. A normal U.S. government transition takes six to nine months, by the time new political appointees are confirmed, briefed, and in position. But nine months out from now will be the height of the Afghan fighting season, and less than a month out from critical Presidential elections in Afghanistan. If we do this the “normal” way, it will be too late for the Obama Administration to grip it up. I think this is shaping up to be one of the smoothest transitions on record, with the current Administration going out of its way to assist and facilitate. That said, the incoming Administration has a steep learning curve, and has inherited a dire situation—so whatever we do, it’s not going to be easy.

It sounds like you’re proposing classic counterinsurgency strategy: a combination of offensive and defensive military operations, political and economic development, and diplomacy. Isn’t that what we’ve been doing these past seven years? Have we just not been doing enough of all these? Or do we need to change strategy to something fundamentally new?

Well, we need to be more effective in what we are doing, but we also need to do some different things, as well, with the focus on security and governance. The classical counterinsurgency theorist Bernard Fall wrote, in 1965, that a government which is losing to an insurgency isn’t being out-fought, it’s being out-governed. In our case, we are being both out-fought and out-governed for four basic reasons:

(1) We have failed to secure the Afghan people. That is, we have failed to deliver them a well-founded feeling of security. Our failing lies as much in providing human security—economic and social wellbeing, law and order, trust in institutions and hope for the future—as in protection from the Taliban, narco-traffickers, and terrorists. In particular, we have spent too much effort chasing and attacking an elusive enemy who has nothing he needs to defend—and so can always run away to fight another day—and too little effort in securing the people where they sleep. (And doing this would not take nearly as many extra troops as some people think, but rather a different focus of operations).

(2) We have failed to deal with the Pakistani sanctuary that forms the political base and operational support system for the Taliban, and which creates a protective cocoon (abetted by the fecklessness or complicity of some elements in Pakistan) around senior al Qaeda and Taliban leaders.

(3) The Afghan government has not delivered legitimate, good governance to Afghans at the local level—with the emphasis on good governance. In some areas, we have left a vacuum that the Taliban has filled, in other areas some of the Afghan government’s own representatives have been seen as inefficient, corrupt, or exploitative.

(4) Neither we nor the Afghans are organized, staffed, or resourced to do these three things (secure the people, deal with the safe haven, and govern legitimately and well at the local level)—partly because of poor coalition management, partly because of the strategic distraction and resource scarcity caused by Iraq, and partly because, to date, we have given only episodic attention to the war.

So, bottom line—we need to do better, but we also need a rethink in some key areas starting with security and governance.

***

On the Pakistani sanctuary, this seems to be the cancer in the bones of Afghanistan, and no one has a good answer. Both air power and special-forces incursions have drawn the wrath of the Pakistani government and people, but their efforts, as you say, have been weak at best and two-faced at worst. Our diplomats and development workers are being systematically targeted, and there’s a question how well we can spend $750 million in the northwest. Is there a way to clear out this sanctuary, that doesn’t cause the problem to metastasize?

You’re right. Pakistan is extremely important; indeed, Pakistan (rather than either Afghanistan or Iraq) is the central front of world terrorism. The problem is time frame: it takes six to nine months to plan an attack of the scale of 9/11, so we need a “counter-sanctuary” strategy that delivers over that time frame, to prevent al Qaeda from using its Pakistan safe haven to mount another attack on the West. This means that building an effective nation-state in Pakistan, though an important and noble objective, cannot be our sole solution—nation-building in Pakistan is a twenty to thirty year project, minimum, if indeed it proves possible at all—i.e. nation-building doesn’t deliver in the time frame we need. So we need a short-term counter-sanctuary program, a long-term nation-building program to ultimately resolve the problem, and a medium-term “bridging” strategy (five to ten years)—counterinsurgency, in essence—that gets us from here to there. That middle part is the weakest link right now. All of that boils down to a policy of:

(a) encouraging and supporting Pakistan to step up and effectively govern its entire territory including the FATA [Federally Administered Tribal Areas], and to resolve the current Baluch and Pashtun insurgency, while

(b) assisting wherever possible in the long-term process of state-building and governance, but

(c) reserving the right to strike, as a last resort, at al Qaeda-linked terrorist targets that threaten the international community, if (and only if) they are operating in areas that lie outside effective Pakistani sovereignty.

During the campaign, McCain talked about transferring the surge from Iraq to Afghanistan. We’ve discussed the military side. On the political side, is there any possible counterpart to the Sunni Awakening in Afghanistan—perhaps local Taliban disenchanted with foreign influences on their leadership? Should part of our political strategy be to talk to Taliban leaders who might be prepared to negotiate with us?

Well, I doubt that an Anbar-style “awakening” is likely in Afghanistan. The enemy is very different from A.Q.I. and, in any case, Pashtun tribes have a very different makeup from Arab tribes. So even if an awakening happened it would likely play out differently from Iraq. Rather than talking about negotiations (which implies offering an undefeated Taliban a seat at the table, and is totally not in the cards) I would prefer the term “community engagement.” The local communities (tribes, districts, villages) in some parts of Afghanistan have been alienated by poor governance and feel disenfranchised through the lack of district elections. This creates a vacuum, especially in terms of rule of law, dispute resolution, and mediation at the village level, that the Taliban have filled. Rather than negotiate directly with the Taliban, a program to reconcile with local communities who are tacitly supporting the Taliban by default (because of lack of an alternative) would bear more fruit. The Taliban movement itself is disunited and fissured with mutual suspicion—local tribal leaders have told me that ninety per cent of the people we call Taliban could be reconcilable under some circumstances, but that many are terrified of what the Quetta shura and other extremists associated with the old Taliban regime might do to them if they tried to reconcile. So, while an awakening may not happen, the basic principles we applied in Iraq—co-opt the reconcilables, make peace with anyone willing to give up the armed struggle, but simultaneously kill or capture all those who prove themselves to be irreconcilable—are probably very applicable.

You spoke of Iraq’s effect in draining our energy and focus away from Afghanistan. President-elect Obama has made it clear that he plans to alter the balance significantly. But, as you say, he doesn’t have much time. If you had his ear, what would be your basic advice?

Well, I don’t have his ear, and I don’t envy the pressure he must be under. But if I did have his ear, I think I would argue for the four major points we discussed above. First, the draw-down in Iraq needs to be conditions-based and needs to recognize how fragile our gains there have been, and our moral obligation to Iraqis who have trusted us. As I said, we don’t want to un-bog ourselves from Iraq only to get bogged in Afghanistan while Iraq turns bad again. Second, our priorities in Afghanistan should be security, governance, and dealing with the Pakistan safe haven—and we may not necessarily need that many more combat troops to do so. Third, the Afghan elections of September 2009 are a key milestone—we can’t just muddle through, and the key problem is political: delivering effective and legitimate governance that meets Afghans’ needs. And finally, most importantly, this is a wartime transition and we can’t afford the normal nine-month hiatus while we put the new Administration in place: the war in Afghanistan will be won or lost in the next fighting season, i.e. by the time of the September elections.

The situation in Afghanistan is dire. But the war is winnable. We need to focus our attention on the problem, and think before acting. But we need to think fast, and our actions need to involve a major change of direction, focussing on securing the population rather than chasing the enemy, and delivering effective legitimate governance to the people, bottom-up, at the local level. Do that, do it fast, and we stand an excellent chance of turning things around.
You can read the entire piece here.

Friday, November 14, 2008

Bessie Smith: “St. Louis Blues” (1929)



This is Bessie Smith in her only film appearance, a 1929 two-reeler titled “St. Louis Blues” based on W. C. Handy’s song by the same name. She sings the title song accompanied by members of Fletcher Henderson’s orchestra, the Hall Johnson Choir, pianist James P. Johnson, and a string section.

Bessie Smith (1892 – 1937) was an American blues singer and the most popular female blues singer of the 1920’s and 1930’s. She was a major influence on subsequent jazz vocalists. Billie Holiday, Mahalia Jackson and Janis Joplin are among singers who credit her with influencing their careers.

As a way of earning money for their impoverished childhood household, Smith and her brother Andrew began performing on the streets of Chattanooga as a duo, she singing and dancing, he accompanying on guitar; their preferred location was in front of the White Elephant Saloon at Thirteenth and Elm streets in the heart of the city's African-American community.

Her oldest brother, Clarence, had joined a musical group in 1904, but she was too young to go with him. By 1912, however, Clarence had arranged an audition for his sister. She was hired as a dancer rather than a singer, because the company also included Ma Rainey.

By the 1920’s, Smith was starring in theatrical music productions and soon became a recording artist. Within a few years she became the highest-paid black entertainer of her day, making 160 recordings for Columbia Records.

Smith died from injuries suffered in an automobile accident in the fall of 1937. The funeral was attended by seven thousand people, but her grave remained unmarked because of her estranged husband. In August 1970 a tombstone was finally placed, paid for by singer Janis Joplin and by Juanita Green, who, as a child, had done housework for Smith.

The 2008 election and the dawning of a new liberal order

Peter Beinart examines in Time magazine this week the history of modern liberalism from the early 20th Century through the present (using Grant Park during the 1968 Democratic Convention and Grant Park on election night 2008 as bookends for its temporary demise) and the dawning of a new liberal order under President-elect Obama:
In America, political majorities live or die at the intersection of two public yearnings: for freedom and for order. A century ago, in the Progressive Era, modern American liberalism was born, in historian Robert Wiebe's words, as a "search for order." America's giant industrial monopolies, the progressives believed, were turning capitalism into a jungle, a wild and lawless place where only the strong and savage survived. By the time Roosevelt took office during the Great Depression, the entire ecosystem appeared to be in a death spiral, with Americans crying out for government to take control. F.D.R. did — juicing the economy with unprecedented amounts of government cash, creating new protections for the unemployed and the elderly, and imposing rules for how industry was to behave. Conservatives wailed that economic freedom was under assault, but most ordinary Americans thanked God that Washington was securing their bank deposits, helping labor unions boost their wages, giving them a pension when they retired and pumping money into the economy to make sure it never fell into depression again. They didn't feel unfree; they felt secure. For three and a half decades, from the mid-1930s through the '60s, government imposed order on the market. The jungle of American capitalism became a well-tended garden, a safe and pleasant place for ordinary folks to stroll. Americans responded by voting for F.D.R.-style liberalism — which even most Republican politicians came to accept — in election after election.

By the beginning of the 1960s, though, liberalism was becoming a victim of its own success. The post-World War II economic boom flooded America's colleges with the children of a rising middle class, and it was those children, who had never experienced life on an economic knife-edge, who began to question the status quo, the tidy, orderly society F.D.R. had built. For blacks in the South, they noted, order meant racial apartheid. For many women, it meant confinement to the home. For everyone, it meant stifling conformity, a society suffocated by rules about how people should dress, pray, imbibe and love. In 1962, Students for a Democratic Society spoke for what would become a new, baby-boom generation "bred in at least modest comfort," which wanted less order and more freedom. And it was this movement for racial, sexual and cultural liberation that bled into the movement against Vietnam and assembled in August 1968 in Grant Park.

Traditional liberalism died there because Americans — who had once associated it with order — came to associate it with disorder instead. For a vast swath of the white working class, racial freedom came to mean riots and crime; sexual freedom came to mean divorce; and cultural freedom came to mean disrespect for family, church and flag. Richard Nixon and later Reagan won the presidency by promising a new order: not economic but cultural, not the taming of the market but the taming of the street.

Flash forward to the evening of Nov. 4, and you can see why liberalism has sprung back to life. Ideologically, the crowds who assembled to hear Obama on election night were linear descendants of those egg throwers four decades before. They too believe in racial equality, gay rights, feminism, civil liberties and people's right to follow their own star. But 40 years later, those ideas no longer seem disorderly. Crime is down and riots nonexistent; feminism is so mainstream that even Sarah Palin embraces the term; Chicago mayor Richard Daley, son of the man who told police to bash heads, marches in gay-rights parades. Culturally, liberalism isn't that scary anymore. Younger Americans — who voted overwhelmingly for Obama — largely embrace the legacy of the '60s, and yet they constitute one of the most obedient, least rebellious generations in memory. The culture war is ending because cultural freedom and cultural order — the two forces that faced off in Chicago in 1968 — have turned out to be reconcilable after all.

The disorder that panics Americans now is not cultural but economic. If liberalism collapsed in the 1960s because its bid for cultural freedom became associated with cultural disorder, conservatism has collapsed today because its bid for economic freedom has become associated with economic disorder. When Reagan took power in 1981, he vowed to restore the economic liberty that a half-century of F.D.R.-style government intrusion had stifled. American capitalism had become so thoroughly domesticated, he argued, that it lost its capacity for dynamic growth. For a time, a majority of Americans agreed. Taxes and regulations were cut and cut again, and for the most part, the economic pie grew. In the 1980s and '90s, the garden of American capitalism became a pretty energetic place. But it became a scarier place too. In the newly deregulated American economy, fewer people had job security or fixed-benefit pensions or reliable health care. Some got rich, but a lot went bankrupt, mostly because of health-care costs. As Yale University political scientist Jacob Hacker has noted, Americans today experience far-more-violent swings in household income than did their parents a generation ago.

Starting in the 1990s, average Americans began deciding that the conservative economic agenda was a bit like the liberal cultural agenda of the 1960s: less liberating than frightening. When the Gingrich Republicans tried to slash Medicare, the public turned on them en masse. A decade later, when George W. Bush tried to partially privatize Social Security, Americans rebelled once again. In 2005 a Pew Research Center survey identified a new group of voters that it called "pro-government conservatives." They were culturally conservative and hawkish on foreign policy, and they overwhelmingly supported Bush in 2004. But by large majorities, they endorsed government regulation and government spending. They didn't want to unleash the free market; they wanted to rein it in.

Those voters were a time bomb in the Republican coalition, which detonated on Nov. 4. John McCain's promises to cut taxes, cut spending and get government out of the way left them cold. Among the almost half of voters who said they were "very worried" that the economic crisis would hurt their family, Obama beat McCain by 26 points.

The public mood on economics today is a lot like the public mood on culture 40 years ago: Americans want government to impose law and order — to keep their 401(k)s from going down, to keep their health-care premiums from going up, to keep their jobs from going overseas — and they don't much care whose heads Washington has to bash to do it.

That is both Obama's great challenge and his great opportunity. If he can do what F.D.R. did — make American capitalism stabler and less savage — he will establish a Democratic majority that dominates U.S. politics for a generation. And despite the daunting problems he inherits, he's got an excellent chance. For one thing, taking aggressive action to stimulate the economy, regulate the financial industry and shore up the American welfare state won't divide his political coalition; it will divide the other side. On domestic economics, Democrats up and down the class ladder mostly agree. Even among Democratic Party economists, the divide that existed during the Clinton years between deficit hawks like Robert Rubin and free spenders like Robert Reich has largely evaporated, as everyone has embraced a bigger government role. Today it's Republicans who — though more unified on cultural issues — are split badly between upscale business types who want government out of the way and pro-government conservatives who want Washington's help. If Obama moves forcefully to restore economic order, the Wall Street Journal will squawk about creeping socialism, as it did in F.D.R.'s day, but many downscale Republicans will cheer. It's these working-class Reagan Democrats who could become tomorrow's Obama Republicans — a key component of a new liberal majority — if he alleviates their economic fears.

Obama doesn't have to turn the economy around overnight. After all, Roosevelt hadn't ended the Depression by 1936. Obama just needs modest economic improvement by the time he starts running for re-election and an image as someone relentlessly focused on fixing America's economic woes. In allocating his time in his first months as President, he should remember what voters told exit pollsters they cared about most — 63% said the economy. (No other issue even exceeded 10%.)

In politics, crisis often brings opportunity. If Obama restores some measure of economic order, kick-starting U.S. capitalism and softening its hard edges, and if he develops the kind of personal rapport with ordinary Americans that F.D.R. and Reagan had — and he has the communication skills to do it — liberals will probably hold sway in Washington until Sasha and Malia have kids. As that happens, the arguments that have framed economic debate in recent times — for large upper-income tax cuts or the partial privatization of Social Security and Medicare — will fade into irrelevance. In an era of liberal hegemony, they will seem as archaic as defending the welfare system became when conservatives were on top.

There are fault lines in the Obama coalition, to be sure. In a two-party system, it's impossible to construct a majority without bringing together people who disagree on big things. But Obama's majority is at least as cohesive as Reagan's or F.D.R.'s. The cultural issues that have long divided Democrats — gay marriage, gun control, abortion — are receding in importance as a post-'60s generation grows to adulthood. Foreign policy doesn't divide Democrats as bitterly as it used to either because, in the wake of Iraq, once-hawkish working-class whites have grown more skeptical of military force. In 2004, 22% of voters told exit pollsters that "moral values" were their top priority, and 19% said terrorism. This year terrorism got 9%, and no social issues even made the list.

The biggest potential land mine in the Obama coalition isn't the culture war or foreign policy; it's nationalism. On a range of issues, from global warming to immigration to trade to torture, college-educated liberals want to integrate more deeply America's economy, society and values with the rest of the world's. They want to make it easier for people and goods to legally cross America's borders, and they want global rules that govern how much America can pollute the atmosphere and how it conducts the war on terrorism. They believe that ceding some sovereignty is essential to making America prosperous, decent and safe. When it comes to free trade, immigration and multilateralism, though, downscale Democrats are more skeptical. In the future, the old struggle between freedom and order may play itself out on a global scale, as liberal internationalists try to establish new rules for a more interconnected planet and working-class nationalists protest that foreign bureaucrats threaten America's freedom.

But that's in the future. If Obama begins restoring order to the economy, Democrats will reap the rewards for a long time. Forty years ago, liberalism looked like the problem in a nation spinning out of control. Today a new version of it may be the solution. It's a very different day in Grant Park.

Thursday, November 13, 2008

Taliban targets teenage girls with acid and government office with car bomb



Since the U.S. invasion of Iraq, Afghanistan has become the neglected war for American policy makers. However, the violence of the unresolved conflict continues unabated and the Taliban operates quite freely in many parts of the country.

In Kandahar, two men on a motorbike attacked teenagers yesterday as they were walking to their all-girls high school. They threw acid in the faces of the girls, injuring up to 15 of the young women. The Taliban have opposed education for women, and the establishment of girls’ schools has been considered one of the significant achievements of the new government. The attack emptied the school the girls attended.

Reporters for Al Jazeera English were working on the story of the attack on the girls when a Taliban car bomb exploded next to a government building during a provincial council meeting, killing six and wounding at least forty-two Afghan citizens. The bomb ripped through the council office and flattened nearby homes.

Wednesday, November 12, 2008

Coleman Hawkins: “Stoned” (1964)



This is the Coleman Hawkins Quintet (Hawkins on tenor saxophone, Harry “Sweets” Edison on trumpet, Jimmy Woode on bass, Charles Thompson on piano, and Papa Jo Jones on drums) performing “Stoned” in 1964.

Coleman Randolph Hawkins (1904 – 1969) was a prominent American jazz tenor saxophonist. He is commonly regarded as the first important and influential jazz musician to use the tenor saxophone. He began playing professionally in the 1920’s and was associated with swing music but played a role in the development of bebop in the 1940’s and avant-garde jazz in the 1950’s and 1960’s.

He was nicknamed "Hawk" and sometimes "Bean".

Something about Sarah

As the GOP recovers from the Sarah Palin candidacy for Vice President in 2008, she is making moves to be the Republican nominee for President in 2012. It’s time to sit back and give some thought not only to Palin but to the whole Palin phenomenon. Andrew Sullivan does, and is quite blunt in his assessment:
Let's be real in a way the national media seems incapable of: this person should never have been placed on a national ticket in a mature democracy. She was incapable of running a town in Alaska competently. The impulsive, unvetted selection of a total unknown, with no knowledge of or interest in the wider world, as a replacement president remains one of the most disturbing events in modern American history. That the press felt required to maintain a facade of normalcy for two months - and not to declare the whole thing a farce from start to finish - is a sign of their total loss of nerve. That the Palin absurdity should follow the two-term presidency of another individual utterly out of his depth in national government is particularly troubling. 46 percent of Americans voted for the possibility of this blank slate as president because she somehow echoed their own sense of religious or cultural "identity". Until we figure out how this happened, we will not be able to prevent it from happening again. And we have to find a way to prevent this from recurring.

It happened because John McCain is an incompetent and a cynic and reckless beyond measure. To have picked someone he'd only met once before, without any serious vetting procedure, revealed McCain as an utterly unserious character, a man whose devotion to the shallowest form of political gamesmanship trumped concern for his country's or his party's interest. We need a full accounting of the vetting process: who was responsible for this act of political malpractice? How could a veep not be vetted in any serious way? Why was she not asked to withdraw as soon as the facts of her massive ignorance and delusional psyche were revealed?

The Palin nightmare also happened because a tiny faction of political professionals has far too much sway in the GOP and conservative circles. This was Bill Kristol's achievement.

It was a final product of the now-exhausted strategy of fomenting fundamentalist resentment to elect politicians dedicated to the defense of Israel and the extension of American military hegemony in every corner of the globe. Palin was the reductio ad absurdum of this mindset: a mannequin candidate, easily controlled ideologically, deployed to fool and corral the resentful and the frightened, removed from serious scrutiny and sold on propaganda networks like a food product.

This deluded and delusional woman still doesn't understand what happened to her; still has no self-awareness; and has never been forced to accept her obvious limitations. She cannot keep even the most trivial story straight; she repeats untruths with a ferocity and calm that is reserved only to the clinically unhinged; she has the educational level of a high school drop-out; and regards ignorance as some kind of achievement. It is excruciating to watch her - but more excruciating to watch those who feel obliged to defend her.

North Kivu is quite possibly the worst place to be a child

The increasing violence in the eastern regions of the Democratic Republic of Congo is taking a terrible toll on civilians. The most vulnerable of these civilians are the children. Reuters has this report from North Kivu Province (Province du Nord-Kivu), which is located on the DRC’s eastern border:
Packed into squalid refugee camps or roaming in the bush, hundreds of thousands of Congolese children face hunger, disease, sexual abuse or recruitment by marauding armed factions, aid workers said on Tuesday.

Weeks of violence have forced more than 250,000 people from homes or ramshackle camps where they had taken shelter, bringing to over 1 million the number of internal refugees from years of fighting in Democratic Republic of Congo's North Kivu province.

Most are children.

"North Kivu is quite possibly the worst place to be a child. There is no question that children have been the most severely affected by the recent conflict," said George Graham, spokesman for Save the Children in the provincial capital, Goma.

Fighting between Tutsi rebels and pro-government troops and militia fighters has subsided into sporadic clashes in recent days as African leaders staged summits and leant on both sides to avert a repeat of Congo's devastating 1998-2003 regional war.

"When children flee fighting they become more vulnerable to contracting diseases, to becoming malnourished, and vulnerable to predators like sexual abuse, exploitation, violence and recruitment into armed groups," U.N. Children's Fund (UNICEF) spokesman Jaya Murthy told Reuters in Goma.

Sixty percent of the 1.1 million displaced are children, he said. "We estimate that there's around 2,000 to 3,000 children in armed groups and recruitment is going on right now."

"This has been a silent emergency for children for the last five years, only now it is re-exploding -- again."

Fighters on both sides have attacked, looted, raped and murdered civilians in raids the U.N. peacekeeping force in Congo, known as MONUC, says include war crimes.

U.S.-based Human Rights Watch quoted local sources and civilians as saying at least 50 civilians were killed last week in Kiwanja, 70 km (45 miles) north of Goma.

Nyrarukundo Rivera, 42, told Reuters she lost her children when fleeing violence in Kiwanja and hadn't seen them since.

Tuesday, November 11, 2008

Freddie Hubbard: “Children of The Night” (1963)



This is Freddie Hubbard playing “Children of the Night” with Art Blakey, Wayne Shorter, Curtis Fuller, Cedar Walton, and Reggie Workman at the 1963 San Remo Jazz Fest.

Frederick Dewayne Hubbard (born 1938) is an American jazz trumpeter known primarily for playing the bebop, hard bop and post bop styles of jazz. Hubbard played mellophone and then trumpet as a teenager. He moved to New York in 1958 at the age of 20 and quickly astonished fans and critics alike with the depth and maturity of his playing, working with veteran jazz artists Philly Joe Jones (1958-59, 1961), Sonny Rollins (1959), Slide Hampton (1959-60), J.J. Johnson (1960), Eric Dolphy, his room-mate for 18 months, and Quincy Jones, with whom he toured Europe (1960-61).

In 1961 he joined Art Blakey’s Jazz Messengers (replacing Lee Morgan), having quickly established himself as an important new voice in jazz. While earning a reputation as a hard-blowing young lion, he developed his own sound, distancing himself from the early influence of Clifford Brown and Miles Davis, and won Down Beat’s “New Star” award on trumpet.

He remained with Blakey until 1966, leaving to form his own small groups, which over the next few years featured Kenny Barron and Louis Hayes. Throughout the 60s he also played in bands led by others, including Max Roach and Herbie Hancock.

He achieved his greatest popular success in the 1970s with a series of crossover albums on CTI Records. Although his early 70s jazz albums Red Clay, First Light and Straight Life were particularly well received (First Light won a Grammy Award), this period saw Hubbard emulating Herbie Hancock and moving into jazz fusion. However, he sounded much more at ease in the hard bop context of his 1977 tour with the V.S.O.P. quintet, the band which recreated an earlier quintet led by Miles Davis and brought together ex-Davis sidemen Hancock, Wayne Shorter, Ron Carter and Tony Williams, with Hubbard taking the Davis role. In the 80s Hubbard was again leading his own jazz group, attracting very favorable notices for his playing at concerts and festivals in the USA and Europe, often in the company of Joe Henderson, playing a repertory of hard-bop and modal-jazz pieces. He played with Woody Shaw, recording with him in 1985, and two years later recorded Stardust with Benny Golson. In 1988 he teamed up once more with Blakey at an engagement in Holland. In 1990 he appeared in Japan headlining an American-Japanese concert tour, and he played the Warsaw Jazz Festival in 1992.

Health problems since 1992 have slowed down his performance schedule.

Some sticky issues should be a little less sticky by the time Obama enters the White House

There is no question Barack Obama will face a number of challenges when he takes office in January. No small number of those challenges will be products of the Bush administration’s governance over the past eight years – governance in many cases that could reasonably be classified as malpractice.

Still, there are some hopeful signs that certain issues in foreign and military policy about to be dropped in Mr. Obama’s lap may be more workable than they would have been just a few months ago. Fred Kaplan reviews some in Slate:
Iraq. Just a few days after Obama's victory, the Iraqi political factions seemed much more disposed to sign a new Status of Forces Agreement with the United States. The SOFA, which is set to expire at the end of the year, outlines the conditions under which U.S. troops are permitted to remain in the country. One condition that Iraq has been demanding is the complete withdrawal of U.S. combat troops by 2011. Several Iraqi parties have been reluctant to ratify the accord even then, doubting that George W. Bush—or, had he won, John McCain—would really withdraw. But they believe that Obama will. So they're suddenly more eager to finalize an accord. Some factions are also more keen to settle their internal differences to avoid a political collapse or a renewed civil war once the Americans leave. Obama knows that early in his presidency he'll have to figure out a way to mount a major withdrawal from Iraq while minimizing the chance that the Baghdad government falls apart. This new tenor in Iraqi politics somewhat eases the task.

Iran. After refusing to talk with Iran for seven years, on the grounds that "we don't negotiate with evil, we defeat it" (as Vice President Dick Cheney once put it), the Bush administration is preparing to set up a U.S. interests office—not quite an embassy but the beginning of renewed diplomatic relations, a forum for communiqués, anyway—in Tehran. If Obama is prepared to offer more elaborate negotiations, as he should be, a forum will exist for doing so. At the same time, a smart-sanctions campaign run out of the U.S. Treasury Department for the last two years—in which international banks have been persuaded to stop doing business with Iran—seems to be having some effect. Meanwhile, plunging oil prices have slashed Tehran's cash flow. And President Mahmoud Ahmadinejad, who has been riding high on this flow, is losing popularity at home. In short, the time may be ripe for a game of carrot-and-stick diplomacy with Iran, in which the carrot may be welcome and the stick might really hurt.

Russia. President Dmitry Medvedev's recent rumblings—his threat to place short-range missiles in Kaliningrad if the United States proceeds with its plan to install missile-defense batteries in the Czech Republic and Poland—may, if played right, redound to Obama's benefit. Obama clearly doesn't share Bush's misplaced enthusiasm for the missile-defense program; he has said several times that he would deploy a system if it were proved workable—a condition that's not likely to pan out. So Obama now has a good reason to drop the deployment plan—but with a caveat. He should reiterate Bush's point (whether or not it's entirely true) that the batteries in Eastern Europe were designed to shoot down Iran's missiles, not Russia's, and if he's going to let down our guard on that front, Russia has to help him keep Iran from building nuclear weapons in the first place—in other words, Russia has to stop assisting the Iranian nuclear program and join the sanctions initiated by the United States, the European Union, and the U.N. Security Council. If this trade can be made, other avenues of cooperation can also be reopened.

Efforts to revive relations with Russia—crucial for dealing with such vital issues as terrorism, nuclear proliferation, and stability in the Middle East—might also be boosted by the latest news from Georgia. Independent military groups monitoring Russia's withdrawal are reporting that Georgia might not have been the purely innocent victim of Russian aggression, after all. The evidence, though still tentative, seems to suggest that Moscow was responding to the Georgian military's indiscriminate rocket and artillery barrage against the semi-autonomous enclaves of Abkhazia and South Ossetia. This finding doesn't exonerate Medvedev or Putin from the brutality of their counterinvasion, nor should it prompt an abandonment of concern for Georgian independence. But it does create an opening for rapprochement with Moscow—for hardheaded national-security reasons—without seeming craven.

North Korea. After six years of refusing to talk seriously with the North Koreans about their nuclear program—for the same reason that he refused to talk with the Iranians about anything—Bush finally signed an accord that at least stopped North Korea's plutonium project. However, this was one case in which the North Koreans' obstinacy was justified. The deal signed last year was a multiphase arrangement. As part of the second phase, the North Koreans were to present data on their nuclear program—at which point the United States was to take North Korea off the list of nations supporting terrorism. The North Koreans submitted the data; Bush officials then demanded that the United States be allowed to verify the information through on-site inspections. The North Koreans protested—correctly—that verification is a matter to be taken up in the third phase. When Washington kept refusing to take them off the list—largely at the instigation of officials in Cheney's office—the North Koreans threatened to cancel the whole agreement. Finally, Secretary of State Condoleezza Rice sent Assistant Secretary Christopher Hill to Pyongyang, and the deal was straightened out. The point is this: In 2000, Bill Clinton left George W. Bush on the verge of signing a far-reaching agreement with North Korea on nuclear weapons and missiles—and Bush tore it up and threw it away. Now Bush is leaving Obama with a much less satisfying deal—during Bush's no-talking period, Pyongyang built and tested an atomic bomb and thus gained considerably greater leverage—but Bush is leaving Obama something to take to the next level without sparking (too much) partisan rancor.

Military spending. According to a story by Bryan Bender in the Boston Globe, the Defense Business Board, a senior advisory group appointed by Secretary of Defense Robert Gates, recommended huge cuts in the military budget, noting that the current level of spending on weapons is "unsustainable." Several private and congressional defense analysts have been making this point for a few years now; the U.S. Government Accountability Office recently calculated that the Pentagon's 95 largest weapons systems have accumulated cost overruns amounting to $300 billion (that's just the overruns, not the total cost, which amounts to many hundreds of billions more). It's also clear, from the Pentagon's own budget analyses, that well over half of the $700 billion-plus budget has little if anything to do with the threats the United States faces now or in the foreseeable future. The past seven years have been a free-for-all for the nation's military contractors and service chiefs; the canceled weapons projects can be counted on one hand, and otherwise they've received nearly all the money they've asked for. Even many of the beneficiaries realize that the binge is coming to an end; the nation simply can't afford it. Obama's good fortune is that he can order the cuts, invoking not his own preferences but the sober-minded urgings of a business advisory group in the Bush administration.

Monday, November 10, 2008

Miriam Makeba: “Pata Pata” (1979)


This is Miriam Zenzi Makeba (1932 – 2008) singing one of her most famous songs "Pata Pata" during a performance in the VARA TV-studios in Holland, 1979. Joining her on stage towards the end of the song is her granddaughter Zenzi. Zenzi's mother (the late Bongi Makeba) is one of the backing vocalists (in the blue dress). Makeba was known affectionately as "Mama Africa." She died earlier today.

Her professional career began in the 1950s, when she sang a blend of jazz and traditional South African melodies.

Makeba then travelled to London where she met Harry Belafonte, who assisted her in gaining entry to and fame in the United States. She released many of her most famous hits there including "Pata Pata", "The Click Song" ("Qongqothwane" in Xhosa), and "Malaika". In 1966, Makeba received the Grammy Award for Best Folk Recording together with Harry Belafonte for An Evening With Belafonte/Makeba. The album dealt with the political plight of black South Africans under apartheid.

She discovered that her South African passport had been revoked when she tried to return in 1960 for her mother's funeral. In 1963, after she testified against apartheid before the United Nations, her South African citizenship and her right to return to the country were revoked. Over her life she held nine passports and was granted honorary citizenship in ten countries.

Nelson Mandela persuaded her to return to South Africa in 1990. Makeba continued to perform onstage and record new albums. She was a proud United Nations goodwill ambassador and also set up a school for destitute young girls in South Africa.

On 9 November 2008, she became ill while taking part in an anti-Mafia concert in southern Italy. Makeba suffered a heart attack a few minutes after her performance and died soon after. In his condolence message, former South African president Nelson Mandela said it was “fitting that her last moments were spent on a stage, enriching the hearts and lives of others - and again in support of a good cause.”

Obama’s chances of success depend largely on whether his short-run economic plans are sufficiently bold

Much has been made of comparisons between the economic situation faced by Franklin Roosevelt’s incoming administration in 1933 and the state of the economy about to land in the lap of Barack Obama’s new administration. The economy is a mess, and a recession is setting in that will affect all Americans to one degree or another.

In today’s New York Times, Paul Krugman says Obama should learn from Roosevelt’s failures, which stemmed from excessive caution, as well as from his successes:
… Barack Obama should learn from F.D.R.’s failures as well as from his achievements: the truth is that the New Deal wasn’t as successful in the short run as it was in the long run. And the reason for F.D.R.’s limited short-run success, which almost undid his whole program, was the fact that his economic policies were too cautious.

About the New Deal’s long-run achievements: the institutions F.D.R. built have proved both durable and essential. Indeed, those institutions remain the bedrock of our nation’s economic stability. Imagine how much worse the financial crisis would be if the New Deal hadn’t insured most bank deposits. Imagine how insecure older Americans would feel right now if Republicans had managed to dismantle Social Security.

Can Mr. Obama achieve something comparable? Rahm Emanuel, Mr. Obama’s new chief of staff, has declared that “you don’t ever want a crisis to go to waste.” Progressives hope that the Obama administration, like the New Deal, will respond to the current economic and financial crisis by creating institutions, especially a universal health care system, that will change the shape of American society for generations to come.
But the new administration should try not to emulate a less successful aspect of the New Deal: its inadequate response to the Great Depression itself.

Now, there’s a whole intellectual industry, mainly operating out of right-wing think tanks, devoted to propagating the idea that F.D.R. actually made the Depression worse. So it’s important to know that most of what you hear along those lines is based on deliberate misrepresentation of the facts. The New Deal brought real relief to most Americans.

That said, F.D.R. did not, in fact, manage to engineer a full economic recovery during his first two terms. This failure is often cited as evidence against Keynesian economics, which says that increased public spending can get a stalled economy moving. But the definitive study of fiscal policy in the ’30s, by the M.I.T. economist E. Cary Brown, reached a very different conclusion: fiscal stimulus was unsuccessful “not because it does not work, but because it was not tried.”

This may seem hard to believe. The New Deal famously placed millions of Americans on the public payroll via the Works Progress Administration and the Civilian Conservation Corps. To this day we drive on W.P.A.-built roads and send our children to W.P.A.-built schools. Didn’t all these public works amount to a major fiscal stimulus?

Well, it wasn’t as major as you might think. The effects of federal public works spending were largely offset by other factors, notably a large tax increase, enacted by Herbert Hoover, whose full effects weren’t felt until his successor took office. Also, expansionary policy at the federal level was undercut by spending cuts and tax increases at the state and local level.

And F.D.R. wasn’t just reluctant to pursue an all-out fiscal expansion — he was eager to return to conservative budget principles. That eagerness almost destroyed his legacy. After winning a smashing election victory in 1936, the Roosevelt administration cut spending and raised taxes, precipitating an economic relapse that drove the unemployment rate back into double digits and led to a major defeat in the 1938 midterm elections.

What saved the economy, and the New Deal, was the enormous public works project known as World War II, which finally provided a fiscal stimulus adequate to the economy’s needs.

This history offers important lessons for the incoming administration.
The political lesson is that economic missteps can quickly undermine an electoral mandate. Democrats won big last week — but they won even bigger in 1936, only to see their gains evaporate after the recession of 1937-38. Americans don’t expect instant economic results from the incoming administration, but they do expect results, and Democrats’ euphoria will be short-lived if they don’t deliver an economic recovery.

The economic lesson is the importance of doing enough. F.D.R. thought he was being prudent by reining in his spending plans; in reality, he was taking big risks with the economy and with his legacy. My advice to the Obama people is to figure out how much help they think the economy needs, then add 50 percent. It’s much better, in a depressed economy, to err on the side of too much stimulus than on the side of too little.

In short, Mr. Obama’s chances of leading a new New Deal depend largely on whether his short-run economic plans are sufficiently bold. Progressives can only hope that he has the necessary audacity.
And Matthew Yglesias emphasizes that moderate and conservative members of Congress need to do what’s right for the country:
… It’s those members in the most marginal districts who are most likely to look at plans for an expansive agenda and start getting queasy, but in truth it’s those very same members who aren’t going to win re-election unless dramatic action is taken. The bulk of the most liberal members have safe seats and will be congressmen for life if they want to be. But a lot of the more vulnerable Blue Dog types out there are going to need to choose between their instinct for trimming and the objective demands of their political survival — to say nothing of the country.

Good policy and good politics often don’t align, and they rarely align perfectly, but in this case the overwhelming political and substantive requirement of the new congress is big action to prevent a long and deep recession.

Friday, November 07, 2008

The end of evangelical foreign policy

The United States has suffered globally under the leadership of George W. Bush. The international community rallied to the United States after the September 11th attacks, yet seven years later that goodwill has clearly been squandered by unprecedented arrogance, guided by ideological blinders and undercut by incompetence and dishonesty. All of this has been justified (or rationalized) by an evangelical zeal that sees the world in black and white, divided between the forces of good and the forces of evil. Nuance is an alien concept. This evangelical worldview can be (and has been) detrimental to a foreign policy that promotes the best interests of both this nation and the international community. Ironically, virtues such as human rights and political self-determination are shunted aside when they conflict with that worldview.

Fortunately for the United States and the rest of the world this policy will end on January 20th with the incoming Obama administration. Andrew J. Bacevich contrasts Barack Obama’s foreign policy perspectives with those of George W. Bush in today’s Boston Globe:
With Barack Obama's election to the presidency, the evangelical moment in US foreign policy has come to an end. The United States remains a nation of believers, with Christianity the tradition to which most Americans adhere. Yet the religious sensibility informing American statecraft will no longer find expression in an urge to launch crusades against evil-doers.
Like our current president, Obama is a professed Christian. Yet whereas George W. Bush once identified Jesus Christ himself as his favorite philosopher, the president-elect is an admirer of Reinhold Niebuhr, the renowned Protestant theologian.

Faced with difficult problems, conservative evangelicals ask WWJD: What would Jesus do? We are now entering an era in which the occupant of the Oval Office will consider a different question: What would Reinhold do?

During the middle third of the last century, Niebuhr thought deeply about the complexities, moral and otherwise, of international politics. Although an eminently quotable writer, his insights do not easily reduce to a sound-bite or bumper sticker.

At the root of Niebuhr's thinking lies an appreciation of original sin, which he views as indelible and omnipresent. In a fallen world, power is necessary, otherwise we lie open to the assaults of the predatory. Yet since we too number among the fallen, our own professions of innocence and altruism are necessarily suspect. Power, wrote Niebuhr, "cannot be wielded without guilt, since it is never transcendent over interest." Therefore, any nation wielding great power but lacking self-awareness - never an American strong suit - poses an imminent risk not only to others but to itself.

Here lies the statesman's dilemma: You're damned if you do and damned if you don't. To refrain from resisting evil for fear of violating God's laws is irresponsible. Yet for the powerful to pretend to interpret God's will qualifies as presumptuous. To avert evil, action is imperative; so too is self-restraint. Even worthy causes pursued blindly yield morally problematic results.

Niebuhr specialized in precise distinctions. He supported US intervention in World War II - and condemned the bombing of Hiroshima and Nagasaki that ended that war. After 1945, Niebuhr believed it just and necessary to contain the Soviet Union. Yet he forcefully opposed US intervention in Vietnam.

The vast claims of Bush's second inaugural - with the president discerning history's "visible direction, set by liberty and the Author of Liberty" - would have appalled Niebuhr, precisely because Bush meant exactly what he said. In international politics, true believers are more dangerous than cynics.

Grandiose undertakings produce monstrous byproducts. In the eyes of critics, Abu Ghraib and Guantanamo show that all of Bush's freedom talk is simply a lie. Viewed from a Niebuhrean perspective, they become the predictable if illegitimate offspring of Bush's convictions. Better to forget utopia, leaving it to God to determine history's trajectory.

On the stump, Obama did not sound much like a follower of Niebuhr. Campaigns reward not introspection, but simplistic reassurance: "Yes, we can!" Yet as the dust now settles, we might hope that the victor will sober up and rediscover his Niebuhrean inclinations. Sobriety in this case begins with abrogating what Niebuhr called "our dreams of managing history," triggered by the end of the Cold War and reinforced by Sept. 11. "The course of history," he emphasized, "cannot be coerced."

We've tried having a born-again president intent on eliminating evil. It didn't work. May our next president acknowledge the possibility that, as Niebuhr put it, "the evils against which we contend are frequently the fruits of illusions which are similar to our own." Facing our present predicament requires that we shed illusions about America that would have offended Jesus himself.

Obama has written that he took from reading Niebuhr "the compelling idea that there's serious evil in the world" along with the conviction that evil's persistence should not be "an excuse for cynicism and inaction." Yet Niebuhr also taught him that "we should be humble and modest in our belief we can eliminate those things." As a point of departure for reformulating US foreign policy, we could do a lot worse.