8 THE LAST CRUSADE
In March 1975, a State Department official filed a brief regarding the rapidly developing situation in Cambodia. The pro-American yet corrupt and incompetent Lon Nol regime was falling into the hands of rebels calling themselves the Khmer Rouge (KR).
In a remarkably short time the rebels seized control of the country’s limited resources, arms, and governmental apparatus. The last stronghold was the capital city, Phnom Penh. “The Communists,” the official warned, “are waging a total war against Cambodia’s civilian population with a degree of systematic terror unparalleled since the Nazi period—a clear precursor of the blood bath and Stalinist dictatorship they intend to impose on the Cambodian people.”[1]
This time the warning was not too late. The capital had not fallen; American forces still could intervene and prevent what President Gerald Ford correctly predicted as a “massacre.” US military commanders had extensive familiarity with the terrain, a legacy of the 1970 invasion during the Vietnam War. Nor could they claim to be unaware of the severity of Khmer Rouge tactics; in these final days, journalists and American envoys remained to witness the bloodshed themselves.
But the State Department memo disappeared into a void. After Vietnam, after Watergate, the American people had been lied to once too often. They had been cajoled into believing the Vietnam War was a humanitarian mission, not an exercise in cold realpolitik. The disillusionment of the early 1970s, coupled with the stunning downfall of President Nixon, ingrained a cynicism into the American mind that remains to this day. Even NGOs like Amnesty International refused to believe the worst. As Samantha Power writes: “To the extent that the apocalyptic warnings of US government officials were sincere, many Americans believed they stemmed from the Ford administration’s anti-Communist paranoia or its desire to get congressional backing for an $82 million aid package for the Lon Nol regime.”[2]
In mid-April, the Khmer Rouge took Phnom Penh and immediately began evacuating the city.
Ultimately millions of inhabitants, the city’s entire population, were forced out of their homes and sent on death marches into the countryside. Embassies were abandoned. Prince Sirik Matak was offered a seat on an American evacuation plane but chose instead to remain with his people. He wrote to the US ambassador: “As for you and in particular your great country, I never believed for a moment that you would [abandon] a people which had chosen liberty…. If I shall die here on this spot in my country that I love…I have only committed the mistake of believing in you, the Americans.”[3] Before the letter reached Washington, Matak was dead.
The pace of the genocide quickened past all comprehension in the weeks and months that followed. The Khmer Rouge ordered the execution of all military personnel down to the rank of lieutenant “and their wives and children.” Clerics of every faith—Buddhists, Hindus, Christians, Muslims, Jews—were targeted as “class enemies.” So too were those the KR considered intellectuals: writers, artists, journalists, academics, and indeed anyone with an education or even, incredibly, persons who wore glasses. Industrialists, aristocrats, farm owners, bureaucrats, schoolteachers…the list went on and on. Disloyalty was punishable by instant death and could take virtually any form; even the tiniest infraction of KR law—flirting, praying, reading, reminiscing—meant execution. “To keep you is no gain,” the Khmer Rouge told its people, “to kill you is no loss.”[4]
This genocide, the greatest since 1945, played out against a worldwide backdrop of disbelief and indifference. Nightly news programs devoted it scant attention, as did most American and European newspapers. By 1976, only fragmentary reports emerged of life and death within Cambodia; the KR had effectively shut down all contact with the outside world. In a move that has come to be axiomatic of impending atrocity, journalists and foreigners were expelled. But even the momentary glimpses that emerged were enough to horrify those who still bothered to pay attention.
By early 1978, the administration of President Jimmy Carter had assembled enough evidence to spark an impassioned declaration, three years exactly since the fall of Phnom Penh: “America cannot avoid the responsibility to speak out in condemnation of the Cambodian government, the worst violator of human rights in the world today…. It is an obligation of every member of the international community to protest the policies of this or any nation which cruelly and systematically violates the right of its people to enjoy life and basic human dignities.”[5]
This Grotian response, belated as it was, seemed to herald a new age. The Carter administration was in many ways better positioned to take up the banner of human rights than its predecessors: untarnished by the legacies of Vietnam or Watergate, it could (and did) reassert the image of the United States as a moral force. Carter’s declaration drew prominent allies, some more expected than others. Democratic senator and former presidential candidate George McGovern came out boldly in favor of military intervention to stop what he described as a “genocide.” McGovern, with his strong sense of duty, correctly understood Cambodia’s predicament as a direct outgrowth of US presence and policy in Southeast Asia. The United States had a moral obligation to help. Moreover, he argued, such intervention would be swift and effective: the KR, despite their ferocity, were weak. Conservative commentator William F. Buckley agreed. Channeling the internationalist voice that still held some currency in the Republican Party, Buckley urged the president to assemble a coalition of neighbor states and intervene under US command.[6]
In the end, nothing came of these plans. It was Vietnam, not the United States, that finally invaded Cambodia, and its motives were anything but humanitarian. Just as McGovern predicted, the Khmer Rouge collapsed and fled. By that time, however, the Carter administration had done another about-face and condemned the Vietnamese for violating state sovereignty.
The opportunity to reestablish the United States as preeminent defender of natural law principles and redress decades of stagnancy had come and gone before anyone was fully aware of the fact.
Cambodia demonstrated that, in spite of the Nuremberg Principles and the repeated pledge “Never forget, and never again,” a state could employ a policy of unlimited destruction against its own citizens as the international community stood idly by. On the one hand it was a colossal failure of diplomacy and international law, a refutation of the principle of progress that underlay the Nuremberg Tribunal. On the other, it was a wake-up call. The Carter administration was not solely responsible for American failure to intervene—causes extended all the way back to the Johnson and Nixon eras—but it was the first to receive a complete, devastating account of the atrocity. “The drama of human rights,” wrote historian Samuel Moyn, “is that they emerged in the 1970s seemingly from nowhere.”[7] Not from nowhere: from Cambodia.
The Cambodian genocide also coincided with a proliferation of NGOs. Amnesty International had been founded in the early 1960s but remained for over a decade a lone voice for human rights outside the usual government channels. For nearly thirty years, the dialog of rights had been consumed by the Cold War. Then, in 1978, a new organization was founded in New York City to ensure the Soviet Union obeyed the protocols of the Helsinki Accords. Helsinki Watch, as it was originally known, broadened its mandate and was retitled Human Rights Watch. It now has several thousand members and an annual budget of over $75 million.
These developments came in accordance with a renewed commitment from the Carter administration. Indeed, author and activist Stephen Hopgood describes Jimmy Carter’s inauguration as “the turning point.”[8] In 1978, Carter directed the State Department to begin publishing annual “country reports” detailing the human rights record of nations receiving US assistance.
Congress later amended this list to include nearly all major states. The president also created the office of assistant secretary of state for human rights, ostensibly to oversee the creation and dissemination of these reports. The State Department complied reluctantly. Still staffed with holdovers from the Nixon and Ford administrations, it regarded such compilations as a distraction at best and, at worst, a danger to valuable international relationships. This was particularly the case when President Carter openly criticized numerous Latin American dictators who had allied themselves with the United States, including Augusto Pinochet of Chile, Romeo Lucas Garcia of Guatemala, and Anastasio Somoza of Nicaragua, as well as the right-wing death squads in Argentina. It must be admitted, however, that despite Carter’s condemnation, US aid continued (for the most part) to flow to these countries. Nevertheless, the backlash among Republicans in Congress was fierce.[9]
In many ways, President Carter’s position was untenable. As long as the United States remained locked in global combat with the Soviet Union it could not abandon its rightist allies; the best Carter could hope for was to use what diplomatic pressure he had to curb their worst excesses. It is interesting to speculate what the ultimate results might have been had the American electorate granted him a second term. As it was, however, the now-former president channeled his energies and global vision into an NGO of his own, the Carter Center. Founded just as he left office, it espouses “a fundamental commitment to human rights and the alleviation of human suffering…to prevent and resolve conflicts, enhance freedom and democracy, and improve health.”[10] Thus it may certainly be said that Jimmy Carter has done more as an ex-president to promote human rights than the vast majority of his peers achieved while in office—including himself.
Ronald Reagan’s election in 1980 was a victory for those within his party who regarded Carter’s emphasis on human rights as naïve, wrongheaded, and antithetical to American interests.
Acolytes of Henry Kissinger and Dean Acheson, they summed up their foreign policy as, “He may be a scumbag, but at least he’s our scumbag.” Jeane Kirkpatrick, chosen by President Reagan to represent the United States at the United Nations, was vociferous in her condemnation of Carter’s foreign policy as responsible for the replacement of the corrupt Somoza and the shah of Iran with the infinitely worse Sandinistas and Ayatollah Khomeini. Under the “Kirkpatrick doctrine,” the United States would willingly align itself with established dictatorships on the assumption that “traditional authoritarian governments are less repressive than revolutionary autocracies.” Accordingly, President Reagan assumed office under a mandate to abandon human rights in favor of cold-eyed self-interest. The State Department’s country reports became parodic, obviously weighted in favor of “friendly” nations and unnecessarily harsh toward leftist or revolutionary ones. The administration also turned a blind eye to atrocity whenever convenient, as in the case of a US-trained military force in El Salvador that brutally slaughtered over a thousand civilians in December 1981. Testifying before Congress, administration officials blithely denied the validity of the victims’ accounts.[11]
But Reagan went further. In a speech before the British Parliament in June 1982, just as that nation emerged victorious from the Falklands War, he called for an international “crusade for democracy.” The choice of words is revealing. A crusade, by definition and practice, is a holy war waged against an infidel enemy. The United States was thus pledging itself to spreading democracy around the world by any means necessary. As Human Rights Watch founder Aryeh Neier later recalled:
Reagan equated advances in the direction of electoral democracy with human rights…. Whatever the initial inspiration for the speech, the way forward that it enunciated has had a profound impact. In the three decades since that speech, every American administration has committed itself in significant measure to the promotion of democracy on the international stage. Also, though the extent to which the promotion of democracy with the promotion of human rights has varied to some degree, in general the two are not distinguished by those who have spoken for the US government in the period since Reagan’s address at Westminster. [Note: Neier wrote these words in 2012, prior to the election of Donald Trump.][12]
The conflation of human rights with Western democratic values did not originate with Ronald Reagan, but never had the ideological battle between democracy and communism been recast in such Manichaean terms. Reagan’s natural law philosophy was very simple: for people to enjoy their natural rights to life and property, they must first be free; for them to be free, they must enjoy democratic government. Ironically, by predicating all other rights on the establishment of a particular form of government, his views matched exactly those of the communist delegation at the Human Rights Commission in 1948.
Like any good salesman, Reagan passionately believed his own pitch. In his inaugural address, the president invoked John Winthrop’s depiction of America as a “shining city on a hill,” a beacon and a model for peoples around the world. He returned to this image again in his farewell address eight years later: “I’ve spoken of the shining city all my political life, but I don’t know if I ever quite communicated what I saw when I said it. But in my mind it was a tall, proud city built on rocks stronger than oceans, wind-swept, God-blessed, and teeming with people of all kinds living in harmony and peace; a city with free ports that hummed with commerce and creativity.”[13] Or, as historian Nicolas Guilhot has written, Reaganite “human rights were primarily based on a set of values embedded in existing national political institutions and legal structures, of which the United States were at once the best historical example and the model.”[14]
If the United States had at least been consistent in its objectives, this politicized definition of human rights might have produced dividends. During his second term, Reagan echoed the language of his Democratic predecessor by condemning human rights abuses in Haiti, Chile, and the Philippines. Moreover, his administration was quick to trumpet any incremental advances in US-allied nations toward democracy, however halting or illusory. Yet Reagan, while condemning human rights abuses in Grenada and using them as a justification for armed invasion, pointedly ignored them in South Africa—ultimately provoking Congress to pass sanctions against the apartheid regime over the president’s veto. Thus the legacy of the Reagan era for natural law and human rights is twofold. First, his “crusade for democracy” was both the logical successor to Cold War containment theory and an evolution. It was not enough to protect democracy where it existed; the United States must henceforth place itself at the fore of a global democratic movement. This bound the United States to an internationalist foreign policy even as it recast objectives in starkly moralistic terms. Reagan’s view of the United States’ place in the world was a unique fusion of natural law, Western liberalism, and tub-thumping patriotism combined with even greater abstractions of Duty, Honor, and Right. Fittingly, it was best expressed at the unveiling of a restored Statue of Liberty on July 3, 1986:
For love of liberty, our forebears—colonists, few in number and with little to defend themselves—fought a war for independence with what was then the world’s most powerful empire. For love of liberty, those who came before us tamed a vast wilderness and braved hardships which, at times, were beyond the limits of human endurance. For love of liberty, a bloody and heart-wrenching civil war was fought. And for love of liberty, Americans championed and still champion, even in times of peril, the cause of human freedom in far-off lands.
“The God who gave us life,” Thomas Jefferson once proclaimed, “gave us liberty at the same time.”…We are the keepers of the flame of liberty. We hold it high tonight for the world to see, a beacon of hope, a light unto the nations.[15]
Second, while Reagan’s full-throated rejection of isolationism was laudable, in other respects his democratic view of human rights was deeply flawed. The ethical mandate of a “crusade” required consistency in application, but consistency was impossible. The United States could not champion democracy as the harbinger of human rights and at the same time turn a blind eye to abuses in democratic or protodemocratic nations. The charge leveled by Ambassador Kirkpatrick against Jimmy Carter could apply equally to her boss: a foreign policy predicated on moral absolutes was bound to be undone by the essential amorality of statecraft. Moreover, the conflation of rights and democracy reinforced postcolonial suspicions that “human rights” were simply neoimperialism writ large—a charge often made by the Soviet Union that, in the Reagan years, carried more than a kernel of truth.
But suddenly everything changed. The Berlin Wall came down, and the Soviet Union collapsed under the weight of its own bloated bureaucracy. Geopolitical realities that had defined human rights discourse for decades evaporated. These events were viewed, not without merit, as a triumph of democracy generally—and specifically of the policies of Presidents Reagan and George H. W. Bush. The “crusade for democracy” had, figuratively speaking, reached Jerusalem.
Contemporary observers, many of whom could not remember a world before the Cold War, were quick to assign messianic properties to the moment. Political scientist Francis Fukuyama famously termed it an “end of history,” even as others—notably Henry Kissinger—cautioned that a unipolar world was an unsustainable model. But this was dismissed as so much carping. Victory in the West opened a clear path for the ultimate triumph of Western rights. As Peter Stearns describes it, “The collapse of communism and the advances of global capitalism without question reduced attention to social and economic rights. These rights had never secured an absolutely fixed place on the human rights agenda, but now they definitely trailed off.”[16] For a brief moment it seemed as though the celestial clock had been turned back to 1945; once again the United States had triumphed over its ideological foes. But this time there was no other superpower sharing the stage; by 1990 it looked as though the United States would enjoy many decades of unchallenged hegemony. The opportunity to finally implement its vision of global justice had never seemed so close.
Yet, as is so often the case, the signal traces of a countervailing force were already there for those who could see them. In the same week in June 1989 when Poland held its first democratic elections in half a century, tanks rolled into Beijing’s Tiananmen Square. Over a thousand pro-democracy protesters were massacred. Even as the Soviet Union fell, China began its slow but steady rise as both market competitor and ideological foil to the West. This was not Maoist China, hidebound in its own political philosophies, but a nimble authoritarian regime that embraced the less controversial aspects of capitalism while still maintaining an iron grip on its people.
It was not long after that a warning came that would reverberate in the years to come. A conference of Asian states including China, Malaysia, Indonesia, and Singapore produced a startling critique of Western rights in an article entitled “Asia’s Different Standard.” “Many East and Southeast Asians,” it began, “tend to look askance at the starkly individualistic ethos of the West in which authority tends to be seen as oppressive and rights are an individual’s ‘trump’ over the state.” It then went on to lay out its indictment with devastating candor:
The hard core of rights that are truly universal is smaller than many in the West are wont to pretend…. It is not only pretentious but wrong to insist that everything has been settled once and forever. The Universal Declaration is not a tablet Moses brought down from the mountain. It was drafted by mortals. All international norms must evolve through continuing debate among different points of view if consensus is to be maintained.[17]
The plea for recognition of communitarian values echoed Dr. Chang and others at the drafting of the Universal Declaration. But there was also something new. In direct contrast to the Reagan model, which predicated all other rights on democratic government, the article argued the reverse: “Order and stability [are] preconditions for economic growth, and growth is the necessary foundation for any political order that advances human dignity.” Freedom of the press, democratic elections, the right to political protest—these might be laudable, but they were not universal. This language was practically guaranteed to raise a storm of protest in the West, from human rights NGOs to neocon internationalists.
But it was not, ultimately, a rejection of all human rights. The proposition that the list of universal rights was “smaller” not only reaffirmed that such essential rights exist but was in fact correct. What, then, were “universal” rights in Asia? Life, liberty, and property, wrote Singaporean diplomat Bilahari Kausikan, speaking for the conference. Assaults on the essential life, freedom, or security of the person—as typified by genocide, slavery, or torture—were necessarily criminal regardless of the politics of the state undertaking them. “The West has a legitimate right and moral duty to promote those core human rights.”[18]
Far from a relativist justification for tyranny, this was a reasoned and necessary corrective to decades of Western liberalism; better still, it was based entirely on the principles of natural law. Had it been heeded, the consensus that eluded the drafters in 1948 might actually have been reached, and a genuinely universal law might have emerged as a result.
It was not heeded. The fall of communism was universally accepted as the triumph of Western democratic values, a neat syllogism that was not, in fact, true. Yet it had an attractive simplicity and a great deal of political mileage. Having “won” the Cold War, the United States saw its success as a moral mandate to continue the crusade until the whole world was democratized. “Flushed with victory,” as Justice Jackson would have said, few realized the extraordinary opportunity available to them. For the first time in almost fifty years (or, arguably, almost eighty) there was no enemy, no competing ideology, no geopolitical necessity to peddle democracy around the world. In victory, the United States was liberated. It could finally engage with the community of nations in a real dialog about natural and universal law with no political blinkers on any side.
Inasmuch as this opportunity was perceived at all, it remained a dream of academics. There was virtually no support for a nuanced view of human rights in any governmental quarter. Neoconservatives saw the triumph of Reagan’s crusade for democracy as laying the political and diplomatic groundwork for events up to the Second Iraq War. Liberals saw a golden opportunity to translate American hegemony into global change—social, political, economic—a benign Pax Americana spreading democracy, human rights, and Coca-Cola throughout the world. The result was a masterpiece of perversity. At the exact moment when the United States was most able to bring workable universal law into reality, it was least inclined to do so.
Nor, indeed, were the other Western powers so inclined. The 1993 Vienna Declaration and Programme of Action, quoted earlier in this book, was a recommitment to the principles of the Universal Declaration, a kind of “amen” at the end of the 1948 prayer. “Invoking the spirit of our age,” it began, “and the realities of our time which call upon the peoples of the world and all States Members of the United Nations to rededicate themselves to the global task of promoting and protecting all human rights and fundamental freedoms so as to secure full and universal enjoyment of these rights.” The opportunity was there, but the next passages squandered it. “All human rights are universal, indivisible and interdependent and interrelated. The international community must treat human rights globally in a fair and equal manner, on the same footing, and with the same emphasis.”[19] “All human rights” meant all provisions of the Universal Declaration.
Incredibly, the international community was stating that every right was equal and codependent—the right to life and the right to royalties, freedom from torture and freedom of cultural expression. It was as though they had learned nothing from the forty-five-year trajectory of the Universal Declaration; or, more likely, they regarded the fall of communism as an opportunity to undo several decades of stalemate. There is an almost mystical connection invoked between the 1948 and 1993 declarations, as though drafters of the latter were willfully trying to erase the intervening time.
But by ignoring past errors they were condemned to repeat them. The global community was no closer to adopting all thirty provisions of the Universal Declaration than it had been in 1948, and not even a unipolar US could compel it to do so (not to mention, of course, its own reservations about several articles). Article 8 of the Vienna Declaration compounded the folly:
Democracy, development and respect for human rights and fundamental freedoms are interdependent and mutually reinforcing. Democracy is based on the freely expressed will of the people to determine their own political, economic, social and cultural systems and their full participation in all aspects of their lives. In the context of the above, the promotion and protection of human rights and fundamental freedoms at the national and international levels should be universal and conducted without conditions attached. The international community should support the strengthening and promoting of democracy, development and respect for human rights and fundamental freedoms in the entire world.[20]
Once again, overconfidence and a politicized vision of rights had won out. By the end of the decade, observers spoke of the so-called “globalization era” as one of retrenchment rather than progress. As Michael Ignatieff described it, this neoimperialism was not “built on colonies, conquest and the white man’s burden” but was instead “a global hegemony whose grace notes are free markets, human rights and democracy, enforced by the most awesome military power the world has ever known.”[21]
Nevertheless, the “unipolar moment,” lasting from 1991 to 2008, saw significant advances in one field of international law: accountability. The year of the Vienna Declaration also saw the establishment of the first international tribunal since Nuremberg: the International Criminal Tribunal for the former Yugoslavia. Created to adjudicate crimes committed during the Yugoslav wars, it ultimately indicted a total of 161 defendants between 1993 and 2017, ranging from head of state Slobodan Milosevic to ordinary soldiers and members of the security services. The UN was justly proud of its success, boasting that the tribunal “irreversibly changed the landscape of international humanitarian law, provided victims an opportunity to voice the horrors they witnessed and experienced, and proved that those suspected of bearing the greatest responsibility for atrocities committed during armed conflicts can be called to account.”[22]
Success in adjudicating the crimes of Yugoslavia was coupled with colossal failure to stem genocide in Rwanda, occurring less than a year after the Vienna Declaration. The response of the international community, including the United States, Belgium, France, and the UN itself, has been the subject of widespread condemnation. The Canadian general leading the UN peacekeeping force, Romeo Dallaire, accused his employers of willful and cynical disregard of the unfolding crisis. In a speech delivered in Kigali four years after the massacre, President Bill Clinton admitted, “The international community, together with nations in Africa, must bear its share of responsibility for this tragedy, as well. We did not act quickly enough after the killing began.” General Dallaire’s response twenty years later is succinct: “Most of that is crap. A month before the genocide, [Clinton] produced a presidential directive that stated that the United States will not engage in any humanitarian operation, unless it’s in its self-interest. He had instructed his staff…not to tell him what the hell was going on.”[23]
It became a truism of the 1990s and early 2000s that nations that failed to prevent atrocities proved much more adept at adjudicating them. The International Criminal Tribunal for Rwanda was established only a few months after the genocide ended and remained in operation until December 31, 2015. In 1997, the Cambodian government formally requested UN assistance in creating a tribunal for the crimes of the Khmer Rouge; first convened in 2003, it exists to this day. The Special Court for Sierra Leone was established in 2002, in the aftermath of a bloody civil war resulting in thousands of civilian deaths. Nearly thirty defendants have been tried under its auspices, including former Liberian president Charles Taylor.
Most famous, and with arguably the most lasting impact, was the creation of an International Criminal Court in 2002. The court represented the fulfillment of a promise made during the Roosevelt administration, but the horrors of ethnic cleansing in the 1990s gave new urgency to the project. As M. Cherif Bassiouni has written: “If the lessons of the past are to instruct the course of the future, then the creation of a permanent system of international criminal justice with a continuous institutional memory is imperative.”[24] The “institutional memory” to which Bassiouni referred is a combination of foundational law (in this case the Nuremberg Principles, given concrete form as the Rome Statute of 1998) and case law. Bassiouni correctly understood that individual tribunals convened for a specific set of offenses would never have the same weight in precedent as the decisions of a sitting judicial body.
President Clinton, himself a graduate of Yale Law School, agreed. Haunted perhaps by the memory of Rwanda, he pushed especially hard for a permanent international tribunal for war crimes and crimes against humanity. Yet, predictably, his internationalist zeal ran headlong into isolationist opposition. Old fears of surrendering American sovereignty, nearly identical in form and content to those of 1919 and 1945, rose once again. As a result, under the presidency of George W. Bush the United States declined to recognize the court it helped establish. To this date it remains something of an orphan on the international scene, with fifteen cases “under investigation” yet none resolved. Scholarly opinion on the tribunal—indeed, all tribunals—is mixed. Human Rights Watch founder Aryeh Neier has been optimistic: “Augusto Pinochet and Slobodan Milosevic are symbols of the cruelty and barbarity of the last third of the twentieth century, but they also inspired the most significant advances in international accountability for the authors of great crimes.”[25] His fellow activist Stephen Hopgood disagreed. “International criminal tribunals,” he wrote dismissively, “are grand ritualized spectacles that symbolize authority and power by dramatizing the archetypal myth of the hero defeating existential threats to the community.”[26] That is a harsh judgment, surely. If civilization is once again the complaining party at the bar, it was not the prosecutors but the defendants who put it there.
Still and all, the list of former heads of state tried since 1990 for crimes against their citizens is growing. Milosevic and Taylor join the likes of Jean Kambanda of Rwanda, Jorge Videla of Argentina, Alberto Fujimori of Peru, Khieu Samphan of Cambodia, and, of course, Saddam Hussein of Iraq. Add to this list of luminaries the hundreds of other lesser-rank individuals who worked for them, also successfully tried by the same tribunals. Indeed, if one wishes to find evidence of Dr. Peabody’s “upward path” of social progress, accountability for international crimes may be the best available. Prosecutor David Crane opened his remarks at the 2004 Sierra Leone tribunal by invoking the “sober and steady climb upwards towards the towering summit of justice.”[27]
For all these successes, it remains indisputable that the great powers—Russia, China, and the United States—are effectively immune from international justice. This is troubling given the brutal treatment of minorities in the former two countries—homosexuals in Russia, numerous ethnic and religious sects in China—but the United States itself is likewise tainted. This was especially the case following the events of September 11, 2001. Al Qaeda was an enemy unique in the American experience: lacking any state authority or territory, recognizable military force, or—crucially—codes of conduct.[28] It was impervious to all the usual measures undertaken to chastise a state: shaming, sanctions, etc. The word most commonly (if controversially) used to describe Al Qaeda’s nature and methods was “barbaric.” Barbarism, by this understanding, meant not playing by the rules. In previous centuries a barbaric enemy was presumed not to comprehend “civilized” war; here, however, it was the terrorist organization’s knowledge of the limitations of conventional warfare and diplomacy that made it so formidable a foe.
War with a barbaric enemy brought out the barbarism in ourselves. Less than a week after the attacks, Vice President Dick Cheney was typically unsparing in his assessment of this new form of conflict. “It is a mean, dangerous, dirty business out there, and we have to operate in that arena…sort of on the dark side, if you will.”[29] What Cheney meant by “dirty business” was not yet clear, but it is certain the vast majority of Americans would have supported striking back at Al Qaeda by any means. The United States, which had not experienced an invasion or attack on its mainland since the War of 1812, had grown complacent. From this complacency it had criticized other nations for their directed use of torture, suspension of civil liberties, and absence of due process—nations like Israel and the United Kingdom, which had been combating terrorist organizations for decades.
The US response to direct attack was no different; indeed, not having become accustomed to this kind of warfare, it was arguably worse. A 2006 report listed nearly one hundred deaths in US extralegal prison camps, of which at least thirty-four were probable homicides. Strangulation, asphyxia, and blunt-force injuries were cited as causes.[30] Prisoner abuse was graphically revealed in the circulation of the Abu Ghraib photographs, but it was much more common than anyone knew—in fact, more than we may ever know. Consumed by its War on Terror, and ultimately by Afghanistan and Iraq, the United States disengaged from the global dialog on human rights. Even before its own abuses were revealed, the Bush administration adopted the position that the promotion of rights was a luxury enjoyed in peacetime but a distraction during war. To the extent that it mentioned them at all, it did so in the familiar refrain of spreading democracy and liberty; fundamentalist regimes were a convenient stand-in for the “red menace” of the Comintern. This fierce retrenching filtered its way through American society, even academia. A 2006 study titled “Forging a World of Liberty under Law” by Princeton scholars G. John Ikenberry and Anne-Marie Slaughter reads as if it could have emerged intact from archaic debates over the Bricker Amendment: “The basic objective of US strategy must be to protect the American people and the American way of life. This overarching goal should comprise three more specific aims: 1) a secure homeland…2) a healthy global economy…and 3) a benign international environment grounded in security cooperation among nations and the spread of liberal democracy.”[31] There was no mention of natural rights at all.
Marcus Tullius Cicero declared, “The security of the people is the highest law.” He wrote these words as a former consul who, during the Catiline conspiracy, suspended Roman law and ruthlessly pursued the conspirators. Little has changed. As a Supreme Court justice wrote, challenges to our rights rarely occur in times of placidity. The American response to 9/11 illustrated (or reaffirmed) that no nation is immune from passions that may trump fundamental liberties in the name of collective security. This extends even to the presidency of Barack Obama and his decision to summarily execute Osama Bin Laden. It was the exact opposite of Nuremberg: instead of “staying the hand of vengeance,” the United States allowed itself the pleasure of an assassination. Few would argue Bin Laden didn’t deserve it; then again, few would have argued Goering or Kaltenbrunner didn’t either. But the decision—made by a former University of Chicago law professor—transformed what might have been the most significant trial of the century (and an ennobling end to the War on Terror) into a squalid little scuffle in the dark. A trial, wrote human rights barrister Geoffrey Robertson, “would have been the best way of demystifying this man, debunking his cause and de-brainwashing his followers. In the dock he would have been reduced in stature…as a hateful and hate-filled old man, screaming from the dock or lying from the witness box.”[32]
The legacy of 9/11 for American engagement in rights is profound. From Carter to Clinton, the United States spent over two decades reestablishing its position as global exemplar and authority; in eight years this legacy was undone. The American people themselves would come to feel a sense of shame for Abu Ghraib and other abuses, as well as outrage at the Bush administration for allowing them to continue. This reflects a curious duality: even as the government pursued its policies, donations poured into organizations like the ACLU at an unprecedented rate. A similar phenomenon occurred in the 1980s, leading to the establishment of several human rights watchdogs to monitor the American government. Partly this can be accounted for by the partisanship of the electorate, but it also reminds us that not all understandings of rights emerge from governments. In present-day America it is most often NGOs, corporations, or even individuals who have carried the torch after the Trump administration laid it down.
The last decade has witnessed the most abrupt about-face on human rights in American history, from pragmatic engagement and cautious optimism to near-complete rejection. As with most historical phenomena, its roots lie deep. In 2002, President Bush’s under secretary of state (later his ambassador to the UN, and Donald Trump’s future national security adviser), John Bolton, formally withdrew the United States’ signature from the treaty establishing the International Criminal Court, an act he described as “my happiest moment at State.”[33] Three years later, a global summit convened by the UN unanimously declared a “responsibility to protect” peoples in imminent danger from their governments for crimes against humanity. Hitherto the only convention to have a so-called “trigger mechanism” was the Genocide Convention of 1948, which pledged signatories to intervene once a genocide was formally recognized to have begun. The reluctance of states to do so is illustrated by the fact that neither Cambodia, Rwanda, Yugoslavia, nor any other major atrocity until the Sudan was formally declared a genocide by the United Nations. Not surprisingly, this new responsibility to protect, or “R2P,” came under withering fire from Asian, African, and Middle Eastern nations that regarded it as an open license for Western states to launch preemptive wars in the name of humanity—as, indeed, the United States was then doing in Iraq.[34]
The aftermath of the Iraq War accelerated a process that had been underway for some time: the concurrent rise of non-Western views of human rights (or lack thereof) and the decline of American hegemony. Syria, China, Brazil, Russia, Myanmar, and countless other states openly defied US warnings and pursued brutal and criminal policies against their own citizens. “American influence is weakening,” Stephen Hopgood cautioned. “The distribution of power in the international system is shifting from unipolarity to multipolarity, and the institutions that are built on liberal hegemony are fatally at risk…. Our world will finally kill the European dream.”[35] Hopgood wrote these words in 2013, at a time when—contrary to his pessimistic views—it appeared that the United States under President Obama was attempting to recapture some measure of the moral authority it had enjoyed during the Clinton years. But Hopgood must be credited as among the first scholars to sound the warning bell for the rise of neonationalism and the fragmenting of traditional liberal institutions, which became all too apparent by decade’s end.
Respect for natural rights had always been predicated on some overarching ideal of civilization, from Grotius and Blackstone to Wilson and FDR. Yet a condescending paternalism lay at the heart of the mission: if some states must “civilize” others, how was this any different from good old-fashioned nineteenth-century imperialism? President Clinton had been able to chart a narrow path, asserting America’s engagement in human rights without overemphasizing its raw power post-Cold War. President Obama was forced to navigate a very different geopolitical reality: a resurgent Russia and China that frequently aligned themselves with criminal regimes and themselves engaged in criminal conduct against their citizens. “In other words,” Hopgood declared, “the last great hope for humanist internationalism is an illusion. The United States, democratic, liberal, and successor to the great heritage of Western enlightenment, has avoided becoming embroiled in alliances and commitments that would restrain it…. R2P is a meaningless doctrine without US support, and the ICC is a European vanity project.”[36]
This was not entirely fair. The European Union, the Organization of American States (OAS), the African Union, and the Association of Southeast Asian Nations would certainly take umbrage at the description of the United States as “the last great hope” for human rights, and they would be right to do so. Even as America’s voice on the international stage began to fade, others came forward. Globalization and the interlocking of nations’ economic, social, and even political structures have introduced a new arsenal of weapons for enforcing human rights, and a new army to use them. Oil-rich nations, nations commanding critical trade routes, large corporations, and political blocs like the EU and OAS have been able to bring pressure to bear on recalcitrant states. While their record may be mixed, it is not without its victories—and indeed it is debatable whether the mobilization of world action and opinion might not have greater success than a hegemonic state power enforcing its will.
Yet it is undeniable that the old pas de deux between Western liberalism and cultural relativism continues. Even as new nations joined (and eventually eclipsed) the United States in their call for universal rights, other voices rose in protest. This was a classic instance of the pillow effect: as new claims for gender and sexual equality arose, they produced a push-back among religious, ethnic, and political bodies that zealously guarded their bias as a cultural heritage. International debates over burqas and marital enslavement at the beginning of the decade broadened to include state codes that penalize homosexuality (many of them legacies of colonialism) by the end. And in this new era of global movements and oversight, political action can cut both ways. One example is illustrative. In 2014, the Ugandan Parliament passed a sweeping anti-homosexuality law whose original draft had mandated the execution of persons convicted of homosexual acts. It was then duly signed by longtime Ugandan president Yoweri Museveni. International outcry was immediate. The European Union, along with the governments of Sweden, Germany, France, Norway, Denmark, the United Kingdom, and the United States, all pledged to halt aid to Uganda. The World Bank withheld a $90 million loan. The so-called “Kill the Gays” bill was ultimately ruled unconstitutional on procedural grounds. In 2019, however, it returned. Faced with the same outrage and threats of cutting aid, the Ugandan government was intransigent. It later transpired that religious organizations within the United States, most notably the National Christian Foundation (NCF), had been actively financing the bill’s reemergence and had aided in its drafting. These organizations, in turn, were financed by conservative corporations, including the fast-food giant Chick-fil-A.[37] While the outcome of the Ugandan bill remains unclear, it demonstrates how global opinion and financing can both promote human rights and hinder them.
One must also reckon with the failure of the Obama administration to respond effectively to the gravest humanitarian crisis during its tenure, the Syrian civil war. The war had dragged on for years, a grinding unequal struggle between rebel forces and the oppressive regime of Bashar al-Assad. In August 2012, a reporter asked President Obama what circumstances might lead him to authorize military force in Syria. “We have been very clear to the Assad regime,” he answered, “that a red line for us is we start seeing a whole bunch of chemical weapons moving around or being utilized. That would change my calculus.” In fact the regime had already begun stockpiling such weapons and distributing them to allies, including the terrorist organization Hezbollah. A year after President Obama issued his public warning, intelligence reports confirmed that a sarin attack in Damascus had killed over one thousand people. Until that time, the balance of opinion within the Obama White House favored caution. According to one account, the chairman of the Joint Chiefs of Staff, Martin Dempsey, had argued that Syria was a “slippery slope, with little chance of success” and warned against being drawn into another Iraq situation. But after the Damascus attack, that thinking changed. “Now he said that something needed to be done even if we didn’t know what would happen after we took action.” The red line had been crossed.[38]
President Obama, careful and deliberative by nature, sought the advice of those around him. The consensus favored air strikes, but even this limited step was flagged as potentially bringing the United States into another war. Instead, the president began sounding out allies for an international coalition. This, after all, was the embodiment of the UN ideal: nations banded together to preserve and protect human rights norms. But the response was chilly. Chancellor Angela Merkel of Germany told the president, “I don’t want you to get into a situation where you are left out on a limb,” implying other European nations might not be quick to follow the US lead. Having pledged Britain’s enthusiastic support, Prime Minister David Cameron was forced to apologize and withdraw after his own parliament vetoed the air strikes, 285-272.
Nor did the president fare any better with Congress. House speaker John Boehner was personally in favor of action but cautioned he could not move his caucus to support it. Senator Mitch McConnell was even more duplicitous, refusing to endorse the move and then criticizing the president afterward for not making it. “Real profiles in courage,” Obama commented acerbically. Rank-and-file Republicans, many of whom had enthusiastically joined George W. Bush’s call to arms in 2003, found various ways of ducking responsibility now. Even Democrats were gun-shy, the fear of another war overwhelming their humanitarian impulses. “People always say never again,” President Obama lamented. “But they never want to do anything.” In the end, the president chose to pursue a somewhat dubious diplomatic channel through Russia, and the prospect of military action receded. Ben Rhodes, a former deputy national security adviser, was fatalistic. “Thousands of tons of chemical weapons would be removed from Syria and destroyed,” he wrote, “far more than could have been destroyed through military action.”[39] That might be true, but it was also true that the president had promised retribution if his “red line” was crossed, and none came. The precedent was set.
Despite its faults, the Obama administration must be credited with reestablishing international bonds that had frayed under George W. Bush and providing a steady and reassuring presence on the world stage. In 2016, as the outgoing administration prepared to hand over the reins, they could congratulate themselves that the next president would be well positioned to continue American stewardship of human rights. That person was all but certain to be Hillary Clinton, who had served as President Obama’s secretary of state and could be relied upon to protect his legacy and continue his policies abroad. The possibility of a Trump presidency was terrifying but remote: his reactionary, incendiary, and boneheaded comments seemed to make the outcome foreordained. “Yes, I think the Republican nominee is unfit to serve as president,” Obama said during a joint press conference with Singapore’s prime minister, Lee Hsien Loong, in August 2016. “I said so last week, and he keeps on proving it. The notion that he would attack a Gold Star family [the name for a family whose relative has died in service] that had made such extraordinary sacrifices on behalf of our country, the fact that he doesn’t appear to have basic knowledge around critical issues in Europe, in the Middle East, in Asia means that he’s woefully unprepared to do this job.” He went on:
I think I was right and Mitt Romney and John McCain were wrong on certain policy issues but I never thought that they couldn’t do the job. And had they won, I would have been disappointed but I would have said to all Americans: this is our president and I know they’re going to abide by certain norms and rules and common sense, will observe basic decency, will have enough knowledge about economic policy and foreign policy and our constitutional traditions and rule of law that our government will work and then we’ll compete four years from now to try and win an election.
But that’s not the situation here. And that’s not just my opinion. That is the opinion of many prominent Republicans. There has to come a point at which you say enough. The alternative is that the entire party, the Republican Party, effectively endorses and validates the positions that are being articulated by Mr. Trump.
These words now ring with prophecy.