7 THE CAROUSEL OF PROGRESS
On the shores of the Pool of Industry, at the intersection of the Avenue of Commerce and the Court of the Universe, was Progressland. A collaborative effort between Walt Disney and General Electric, it quickly became one of the most celebrated pavilions at the 1964 World’s Fair.
Guests entered a saucer-shaped structure that looked vaguely like a hubcap and were briskly guided through a series of turnstiles before arriving at its heart—the Carousel of Progress. The concept was novel: a theater placed on a revolving turntable like a giant record player, moving audiences past audio-animatronic scenes of daily life from past, present, and future. Here technology and democracy, symbiotically linked, advanced mankind toward utopia. Disney commissioned the Sherman brothers, fresh off their triumph in Mary Poppins, to provide an appropriate song. The result was so catchy, so emblematic of the age, that it became Walt Disney’s personal theme and plays even now throughout his vast entertainment empire:

“Oh there’s a great, big, beautiful tomorrow,
shining at the end of every day,
Yes there’s a great, big, beautiful tomorrow,
just a dream away!”
It is unlikely that many visitors to Progressland paused to consider the architectural absurdity of a pavilion devoted to forward-moving progress that revolved instead in an endless repetitive gyre, thus borne back, like the metaphorical boats in The Great Gatsby, ceaselessly into the past. Disney himself certainly never did: the attraction proved so popular that it was moved first to Disneyland and eventually to Walt Disney World, where it remains—creakily intact despite annual threats of closure—to this day. Like a time capsule, it perfectly captures that volatile admixture of optimism, vainglory, and myopia that animated the fair itself and the entire postwar world.[1] It was this same spirit that produced the Universal Declaration of Human Rights.
The impetus toward some form of international bill of rights had been gathering steam since the end of the Second World War. The Nuremberg Principles, enacted in August 1945, codified the crimes of the tribunal and pledged the international community to uphold them sine die. But these applied only to atrocities committed during war. For months, religious organizations and NGOs had been pressuring the State Department to include protections for human rights in the draft charter of the United Nations. Ultimately they were successful, albeit to a limited degree. The UN charter pledged to “reaffirm our faith in fundamental human rights, in the dignity and worth of the human person, in the equal rights of men and women and of nations large and small.”[2] But such high-sounding language was so vague as to be practically meaningless.
Events began to shift when President Truman appointed Eleanor Roosevelt as a delegate to the first meeting of the UN General Assembly in London in January 1946. She was to serve as the first chair of the newly created Human Rights Commission. Both the commission and its chairmanship were likely meant to be sinecures; Secretary of State Edward Stettinius had little interest in human rights. But if Stettinius or Truman expected Eleanor to regard the post in the same symbolic light, they misjudged badly. Despite feeling unqualified for the role of diplomat, Eleanor embraced the commission as an opportunity to consolidate her husband’s legacy.[3] The General Assembly met in the Methodist Central Hall of Westminster, a symbolic vortex of Protestant dissent and natural law philosophy. Drawing inspiration from her surroundings, Mrs. Roosevelt pressured the assembly to commission a new bill of rights. As she put it, “Many of us thought the lack of standards for human rights the world over was one of the greatest causes of friction among the nations, and that recognition of human rights might become one of the cornerstones on which peace could eventually be based.”[4] Counting the number of conditional tenses in the preceding statement gives one a fairly accurate idea of the scale of the project.
Bland, opaque platitudes about human dignity were easy enough; to create an actual list of rights was quite another matter. Which rights, whose rights, and how should they be enforced? These became the paramount questions of the Human Rights Commission.

The work began with laudable circumspection. The commission was determined not to replicate the disastrous wartime bill of rights produced by the State Department—a document, as will be recalled, that was little more than a restatement of the US Bill of Rights. Universal rights had to be universal and therefore multinational; anything less would be justly condemned as imposing an American lex imperatoria on the entire world. To that end, the commission sent a questionnaire to the foremost legal minds of dozens of nations, soliciting their views. The response was enthusiastic.[5] Nearly every respondent affirmed their culture’s recognition of basic human rights, but it soon became clear that “right” had a subjective meaning. Philosopher Chung-Shu Lo affirmed that “the idea of human rights developed very early in China, and the right of the people to revolt against oppressive rulers was very early established,” but demurred from the Western conception of law as the protection of individual rights. “The basic ethical concept of Chinese social political relations,” he wrote, “is the fulfillment of the duty to one’s neighbor, rather than the claiming of rights.”[6] Mohandas Gandhi agreed. “I learned from my illiterate but wise mother that all rights to be deserved and preserved came from duty well done. Thus the very right to live accrues to us only when we do the duty of citizenship of the world.”[7]
Fulfilling one’s role within a community entitled one to rights, not the other way around. Implicit within this construction was not only a rejection of Western legal norms but of several centuries’ colonial domination. Echoing a much earlier argument about Queen Victoria’s proposed bill of rights for India, Bengali poet Humayun Kabir wrote pithily that the “fundamental flaw in the Western conception of human rights” was the West’s failure to live up to them.
“In practice [human rights] often applied only to Europeans, and sometimes only to some Europeans.”[8]

This polite but pointed clapback hinted at a much larger phenomenon taking shape beyond the walls. The Philippines had already gained its independence from the United States; Indonesia was on the cusp of breaking off from the cash-strapped Netherlands. Within the British Empire, Transjordan had been freed, and negotiations were underway for the momentous dissolution of the Raj. Before the end of the decade, India, Pakistan, Israel, Burma, and Ceylon would join the growing list of independent nations. This explosion of independent states set the stage for the cataclysms of the 1950s and 1960s, when decolonization spread throughout the world and the last vestiges of European domination were swiftly, often violently, overthrown. Though the Human Rights Commission could not have known it, Kabir’s critique would come to dominate the language of postcolonial resistance to human rights.
Even within the committee itself there were fractures. The communist delegation, voiced by Yugoslav representative Vladislav Ribnikar, rejected any form of individualism in favor of the collective. Human liberty, said Ribnikar, is “the perfect harmony between the individual and the community,” whose best and only guarantor was—not surprisingly—the state. “The psychology of individualism has been used by the ruling class in most countries to preserve its own privileges; a modern declaration of rights should not only consider the rights of the ruling classes.”[9] Beneath the Marxist-Leninist dogma was an arguable point: Western rights were both political and negative, enjoining the state from certain actions against the individual. The century-long impact of socialism had introduced a counterpoint: actions that the state was required to ensure, including the right to work and a decent standard of living. But the obvious danger was of a state pursuing these goals at the expense of natural rights—as the Soviets had done since 1917.
This provoked an outraged response from French jurist René Cassin: “The deepest danger of the age,” he warned, was a collectivist state that demanded “the extinction of the human person as such in his own individuality and inviolability.”[10]

It was inevitable that Cassin, with his encyclopedic knowledge of French law from the Rights of Man to the Code Napoléon, should emerge at the conference as the most ardent proponent of natural law. His draft for a preamble, which eventually became Article 1 of the declaration, included a familiar restatement of the Ciceronian credo: “All men are endowed by nature with reason and conscience and should act towards one another in a spirit of brotherhood.” Then the storm broke. The Catholic element, voiced by Brazil, vehemently demanded that the secular language be replaced with “all human beings are created in the image and likeness of God.”[11] The Soviet response was apoplectic. Chinese delegate P. C. Chang warmly endorsed the brotherhood aspect as reflective of both Confucian and Enlightenment thought but suggested that any reference to nature or God was too culturally specific. It was enough, he said, to recognize that each person was endowed with reason; the agency was immaterial. Eleanor Roosevelt, who had hitherto remained largely silent on the matter, agreed. As she later explained:
Now, I happen to believe that we are born free and equal in dignity and rights because there is a divine Creator, and there is a divine spark in men. But there were other people around the table who wanted it expressed in such a way that…left it to each of us to put in our own reason, as we say, for that end.[12]
There is an aura of unreality to these exchanges. Western and Eastern notions of right and justice, Grotius and Confucius, Judeo-Christian natural law and nineteenth-century socialism all collided and combined, embodied in the delegates seated around the table at Lake Success, New York. Probably not since the Synods of Antioch had so many divergent and combustible views been aired within a single room.
This heterogeneity came to be reflected in the list of rights ultimately chosen. The delegates had two choices before them: include only those rights upon which universal agreement could be found, or include everything but the kitchen sink. From the outset it became clear how difficult the former option would be. In January 1947, Eleanor invited Lebanese delegate Charles Malik, Canadian John P. Humphrey, and P. C. Chang to her Washington Square apartment for tea. Dr. Humphrey had been tasked with drawing up a preliminary list of rights; Malik suggested he need look no further than the precepts of natural law. Dr. Chang swiftly objected, as Mrs. Roosevelt relates:

The Declaration, he said, should reflect more than simply Western ideas and Dr. Humphrey would have to be eclectic in his approach. His remark, though addressed to Dr. Humphrey, was really directed at Dr. Malik, from whom it drew a prompt retort as he expounded at some length the philosophy of Thomas Aquinas…. I remember that at one point Dr. Chang suggested the Secretariat might well spend a few months studying the fundamentals of Confucianism![13]
Teatime wrangles were just the beginning. Months of debate produced little consensus, but rather a hardening of respective positions. The original forty articles in the Universal Declaration of Human Rights were reduced by only ten. The final product was thus not one list of rights but two. Articles 1–21 were a grab bag of Western legal and political theory, mandating everything from democratic government (Article 21) to the presumption of innocence (Article 11)—this last despite the fact that only common-law nations like the United Kingdom and the United States had it in their legal systems. The language of the articles revealed obvious linkages to everything from the Magna Carta to FDR’s Four Freedoms. Articles 22–29 were expressly social (and, in some cases, socialist), and many commentators would question whether some, like Article 24, had ever been considered “rights” at all: “Everyone has the right to rest and leisure, including reasonable limitation of working hours and periodic holidays with pay.” There was nothing in the document to indicate which rights were most fundamental (except possibly that political rights came numerically first, and there were twice as many of them) or what would happen if two or more articles conflicted. There was no mechanism for enforcement. This was a particular grievance of Mrs. Hansa Mehta, the Indian representative. A Brahmin and prominent jurist in her own right, she had spent decades challenging Britain’s illegal detention of Indian subjects and confiscation of their property.[14] The British themselves denounced the list as “highly unsatisfactory” on related grounds; how, they wondered, was the international community supposed to compel a state to grant its citizens paid vacation?
One might argue that consensus among the delegates was impossible, and therefore a “kitchen sink” list of rights was the only practicable outcome. P. C. Chang, addressing the General Assembly on the day of ratification, acknowledged this problem. “Preoccupations of a political nature” and “uncompromising dogmatism,” he admitted, meant that any list of rights that did not include the controversial political elements—democratic and socialist—could be upheld by the UN only with the use of force. “But however violent the methods used, equilibrium achieved in that way could never last.”[15] The failure of the declaration, however, was already apparent even as the document was being drafted: with so many rights, and no differentiation between them, each state would choose for itself which articles to follow and ignore the rest. Thus the Universal Declaration was not so much a blueprint for the future as a reflection of the moment, particularly the divergent ideological poles of democratic and socialist nations. It was neither binding nor enforceable; not a single state was in compliance with all its provisions or wished to be. That being the case, one has to wonder what, exactly, the delegates and Mrs. Roosevelt hoped to accomplish.
In a word: progress. To examine the cold record of committee discussions and disagreements today is to reckon without the incandescent optimism that infused the proceedings. That quicksilver sense of limitless possibility was everywhere. Utopianism seems to come in historical waves; one can find evidence of a swelling of enthusiasm in the sixteenth, eighteenth, and late-nineteenth centuries. Now its cause was clear: war had completely devastated the structural, social, and political order. A new world would have to be built. Emergent nations had already begun clamoring for some form of international human rights law to protect them from aggressors. The Organization of American States (OAS), representing Latin America, produced its own 1948 American Declaration of Rights and Duties that in many ways mirrored the optimistic language of the Universal Declaration. All over the world new constitutions were being drafted with economic and social liberties jointly enshrined. For the briefest of moments, from 1945 to the early 1950s, it truly seemed as if a new world order was not only possible, but necessary. The Human Rights Committee—and Eleanor Roosevelt in particular—believed they stood at a moment of historic flux. But overconfidence bred hubris. The consensus among historians and political scientists is that the committee overreached: in attempting to hasten utopia, they crafted a document that was too broad and too weak.[16] It would be unfair to blame or credit Mrs. Roosevelt for a declaration that was the collaborative effort of dozens of brilliant legal minds. It is inarguable, however, that the finished product would not have pleased her late husband as much as it did her. FDR, the consummate realist, would have seen at once the fatal flaws within a document of so many conflicting and controversial provisions. His own vision of postwar human rights was no less radical but considerably more focused.
Nor would he have countenanced a bill of rights without any enforcement mechanisms. The record of his views and actions suggests, on balance, that Roosevelt would have pressed for a shorter and more fundamental list of rights and backed them up with the full weight of American political and military force. Had the United States done so, the record of human rights advancement and enforcement for the next seventy years might have looked very different indeed.
As it was, however, the Human Rights Commission believed they had achieved a compact for the ages. One can hear the faint echo of trumpets in Eleanor Roosevelt’s address to the United Nations on December 10, 1948:
We stand today at the threshold of a great event both in the life of the United Nations and in the life of mankind…. This Declaration may well become the international Magna Carta of all men everywhere. We hope its proclamation by the General Assembly will be an event comparable to the proclamation of the Declaration of the Rights of Man by the French people in 1789, the adoption of the Bill of Rights by the people of the United States, and the adoption of comparable declarations at different times in other countries.[17]
If these words read now like starry-eyed idealism, one must blame the times, not the author. There were glimmering, fleeting moments when it seemed as though it could almost have worked. A global community shattered by war and surrounded by the toppled idols of totalitarianism; dozens of new nations clamoring to be born; a transcendent desire for order and decency after a decade of chaos and barbarism; and, at the apex of it all, a vivified democratic nation with more raw power than any empire in history—never before and probably never again would the kaleidoscope of human events fall into such a perfect pattern for universal law.[18] A few months after the declaration was introduced, the State Department released a pamphlet reaffirming “the establishment of the methods and precedents of a world rule of law in which disputes among nations would be resolved just as most disputes among individuals are resolved today—through recourse to a proper and established court of justice.”[19] The Universal Declaration of Human Rights must certainly be recognized as the founding charter for such a court. Notwithstanding its flaws, it was the first significant attempt to extend natural law principles beyond state parameters and make them truly universal. Article 1, quoted above (although “men” was fortunately changed to “human beings”), established the natural basis of law reaffirmed by scholars from Cicero to Kant. Article 3 echoed Grotius and Blackstone in its declaration that “everyone has the right to life, liberty and the security of person.” Even the language of the preamble, cobbled together from such natural law texts as the Declaration of Independence, the Rights of Man, and FDR’s Four Freedoms, was in its own way a reaffirmation of the legal foundation underlying them:
Whereas recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world,
Whereas disregard and contempt for human rights have resulted in barbarous acts which have outraged the conscience of mankind, and the advent of a world in which human beings shall enjoy freedom of speech and belief and freedom from fear and want has been proclaimed as the highest aspiration of the common people,
Whereas it is essential, if man is not to be compelled to have recourse, as a last resort, to rebellion against tyranny and oppression, that human rights should be protected by the rule of law.
This language, steeped in Western liberalism, was inevitably going to produce a backlash. But the first sign of resistance came from an unexpected quarter closer to home. Isolationism in the United States held a long and distinguished pedigree, from George Washington’s Farewell Address in 1796 to the Red Scares of the early twentieth century. It reached its zenith in the months preceding the attack on Pearl Harbor, represented by such powerful organizations as America First and public figures like aviator Charles Lindbergh and endorsed by a significant portion of Congress. By the end of the decade, however, isolationism had become a tattered standard, frequently (and not unfairly) derided as xenophobic, obtuse, even fascist. Nevertheless, as universal law moved tantalizingly close to reality, the prospect of a “world court” reawakened old fears of foreign entanglements. Perversely but predictably, these domestic antagonists couched their critique in the language of their opponents: “The American people want to make certain that no treaty or executive agreement will be effective to deny or abridge their fundamental rights. Also, they do not want their basic human rights to be supervised or controlled by international agencies over which they have no control.”[20]
The idea that international covenants designed to safeguard human rights could themselves be instruments of denying them was delusional on an Orwellian scale. Moreover, the isolationist horror of “international agencies” was oxymoronic: exercising its hegemony after the war, the United States had effectively fashioned a universal law in its own image. The same treaties and agencies that so terrified the isolationists were crafted, sponsored, staffed, and maintained by the Americans and their allies. Objective reality, however, has never been a serious hindrance to politics. In 1951, Ohio senator John W. Bricker introduced an amendment to Article 6 of the Constitution, certifying that no international treaty or covenant could alter domestic law unless specifically endorsed by congressional legislation. “My purpose in offering this resolution,” Bricker said candidly, “is to bury the so-called covenant on human rights so deep that no one holding high public office will ever dare to attempt its resurrection.” Debate raged for nearly three years until the amendment was ultimately defeated by a razor-thin margin of two Senate votes.[21] This was hardly a victory for the universalists: in truth, the amendment died not because it was unpopular but because it was redundant. The American government did not need a constitutional amendment prohibiting it from doing something it had no intention of doing in the first place. “This whole damn thing is senseless,” President Eisenhower complained.[22]
Senseless perhaps, but comprehensible. Beneath the convoluted logic of the Bricker amendment was a solid bedrock of fear. Its staunchest supporters were likewise enemies of unionization, social welfare, and—most particularly—desegregation. Far from curtailing basic rights, the real danger was that the Universal Declaration might ultimately compel the United States to recognize them. Someone, Bricker raved, had to “slow the State Department in its mad pursuit for a World Bill of Rights.”[23] In actuality, the State Department had already begun applying the brakes. Secretary of State John Foster Dulles made a special trip to the Senate Judiciary Committee to assure them that the president would never seek ratification of any human rights covenants—not even the Genocide Convention of 1948. Sounding remarkably like a cultural relativist of the mid-1990s, Dulles opined that such agreements would force “one part of the world to impose its particular social and moral standards upon another.”[24] So much for the Four Freedoms.
The intransigence of the Eisenhower Administration had a chilling effect on the United Nations itself. Since 1948 a draft committee within the UN had been doggedly attempting to transform the articles of the Universal Declaration into binding agreements. But even the friendlier Truman government had balked at adoption of “socialistic” protections for workers and families, and Republicans in Congress fumed at the “eager beavers” in the UN “grinding out treaties…which have an effect upon the rights of American citizens.”[25] By 1952 it was abundantly clear that no single covenant would gain the support of both the United States and the Soviet Union, or their respective blocs. Accordingly, drafters codified reality by officially splitting the articles into two separate covenants: the International Covenant on Civil and Political Rights (ICCPR) and the International Covenant on Economic, Social and Cultural Rights (ICESCR). Then both drafts stalled, hors de combat within an escalating Cold War that made grand international agreements all but impossible. It would not be until 1966 that the two covenants even reached the General Assembly for a vote; another decade passed before ratification. The United States formally and finally ratified the ICCPR in 1992 (with numerous reservations); it has never ratified the ICESCR.
Even as the United States retrenched, other nations seemed eager to pick up the torch of natural law and liberalism. The postwar decolonization movement followed a pattern broadly similar to the French and Russian Revolutions, beginning with constitutional restraint but rapidly devolving into violence and extremism. The blame may be laid in part on the colonial powers themselves and more broadly on the bipolar geopolitics of the era. The Atlantic Charter of 1941, discussed in chapter 4, pledged the Allies to a policy of self-determination for all nations. The language was Roosevelt’s; Winston Churchill signed on reluctantly, with fingers crossed behind his back. Less than a year later, when Churchill learned that FDR was secretly negotiating with Gandhi’s Congress Party for postwar independence, he exploded with rage. “Anything like a serious difference between you and me would break my heart,” he wrote to FDR, “and would surely deeply injure both our countries at the height of this terrible struggle.”[26] Roosevelt took the hint and desisted.
But the matter was merely deferred. The Atlantic Charter had raised hopes around the world of a postwar dissolution of empire, and by 1945 emergent nations were ready to collect on the bond. In Vietnam, Ho Chi Minh began talks with American espionage agents for a pledge not to allow the resumption of French imperial rule. “He kept asking me if I could remember the language of our Declaration [of Independence],” one OSS operative recalled. “The more we discussed it, the more he actually seemed to know more than I did.”[27] Not surprisingly, the Vietnamese Declaration of Independence of September 2, 1945, declared: “All men are created equal, they are endowed by their Creator with certain unalienable rights; among these are Life, Liberty and the pursuit of Happiness.” Ho wrote the draft himself and added an observation of his own: “In a broader sense, this now means all the peoples have the right to live, to be happy and free.” By this single phrase, dusty old natural law was wedded to the very new right of cultural self-determination.[28]
India quickly followed suit. The Indian Constitution of 1949 listed six “fundamental rights.” The first three were distinctly Rooseveltian: freedom of speech and association, freedom of religion, and the right to due process of law. (One might also make the case that the fourth, freedom from want, was implied by the constitutional guarantee of a “right to employment and profession” regardless of caste.) The latter three were more culturally specific: prohibition of forced and child labor; the right to education; and the rights of minorities to their culture, language, and self-governance. After the partition, Pakistan’s constitution contained nearly identical language, with the added provision that no law could contradict the teachings of Islam.[29]
Similar developments could be found on the African continent. Former British colonies including Ghana and Nigeria melded colonial law—which had often promised equal rights de jure while denying them de facto—with the new language of individualism. Thus natural law rights, as per Blackstone, were augmented with freedom of worship, speech, and protection of minorities. Cameroon, which achieved independence from France in 1960, invoked both the UN Charter and Universal Declaration in providing for the “inalienable rights” of its citizens.[30]
It should not be surprising to find such expressly Western language in these new constitutions. Like the American founding fathers, the architects of the postcolonial world were trained and immersed in the law of their colonizers. Gandhi studied law at University College London at roughly the same time that Mohammad Ali Jinnah was finishing his legal pupillage at Lincoln’s Inn. Many African leaders attended universities in the United States on scholarships funded by American religious or political organizations, including Nnamdi Azikiwe of Nigeria and Kwame Nkrumah of Ghana, and thus returned to their countries well versed in Western liberalism.[31] Ho Chi Minh lived in France, England, and the United States and had even petitioned Georges Clemenceau and Woodrow Wilson to recognize Vietnamese independence at the Versailles peace conference in 1919.
The response of the Western powers to these constitutional efforts, however, was anything but receptive. In the short term, Churchill’s vision of status quo ante bellum won out. Despite Ho Chi Minh’s protests and those of other former colonies, the British quickly restored French Indochina and most of its possessions in West Africa to France, even as they attempted to salvage their own waning empire. The Congo, which had supplied uranium crucial for the making of the first atomic bomb, was likewise handed back to Belgium. Throughout it all the United States remained silent, as did the UN. This naked power grab, which occurred at precisely the same time drafters hammered out provisions for the Universal Declaration of Human Rights, must be understood in context. The Soviet Union had already engulfed the whole of Eastern Europe and was recommitting itself to the old Comintern idea of world socialism. American forces still occupied Japan and were busily drafting a Showa constitution intended to demilitarize the nation and place it forever within the Western liberal democratic sphere. By the end of 1945 the world was divided into spheres of influence, distinct but not wholly different from the empires of the previous century.
The American position, expressed by Presidents Truman and Eisenhower, was that it was preferable to maintain old colonial systems if they kept their colonies in the Western bloc—by force, if necessary. This betrayal was deeply disillusioning for those, like Ho Chi Minh, who had taken FDR’s pledge to heart. As historian Samuel Moyn has written, after 1945 “Ho, who initially begged his American interlocutors to live up to the Atlantic Charter’s promise of self-determination rather than allow the French to return, stopped asking and never again made even declaratory rights central.”[32] Humayun Kabir’s warning about the “fundamental flaw” in human rights had tragically come to pass.
The failure of Western liberalism provided an opening for its nemesis. Many anti-imperialists already identified themselves as Marxists, or at least sympathetic to communistic theories. Spurned by the West, Ho Chi Minh, Nkrumah, and others accepted aid and support from the Soviet Union. The second wave of independence movements from the mid-1950s through the 1960s thus emerged in a Manichaean geopolitical climate which effectively forced them to align with one side or the other. In contrast to individual human rights, the Soviet Union championed the collective “right of self-determination,” which implicitly included wholesale rejection of the colonizer’s legal, political, and cultural norms. The Declaration on the Granting of Independence to Colonial Countries and Peoples, proposed at the UN by Nikita Khrushchev in 1960, stated baldly that self-determination was not only an absolute human right, but a superseding one. Amílcar Cabral, a Guinean anti-imperialist (and covert agent of Czechoslovak state security) crowed, “The colonial system…is now an international crime.”[33]
What the Soviets had done, in fact, was dust off and update Xenophon’s definition of law as the will of the state. If the primary right of all peoples was self-determination, this was tantamount to declaring that no law could exist above the will of the local state. In the guise of deferring to local custom (and in contrast to centuries of Western meddling under the banner of “civilization”), the Soviets were openly challenging the basis for all human rights—natural law. Its precepts, along with several centuries of Western liberalism, were collectively labeled imperialist oppression. Rejection of human rights was thus recast as an act of nationalism, even patriotism. Nuremberg might never have happened. From this admixture of anti-imperialism, positivism, and rejection of Western values germinated cultural relativism.
To the extent that the United States or any Western power could still claim to be invested in human rights, it was only in opposition to the perceived authoritarianism and brutality of their Soviet counterparts. Thus natural law was bound up in a kind of package deal with democratic government, free speech, and the free market, peddled to the postcolonial world with the same brio as Coca-Cola and Chryslers. The sales pitch was brilliantly on display at the 1964 World’s Fair, where visitors passed through the General Foods Arch along United Nations Avenue to see Japan, Austria, and Seven-Up.
Having returned, so to speak, to the fair, we may pause and consider in sum the two decades that preceded it. They began with enormous promise, embodied by the late President Roosevelt’s efforts to secure universal law under a United Nations. In the early postwar years, the Truman administration and its State Department seemed genuinely committed to fulfilling Roosevelt’s vision, and none more so than his widow. With her trademark tenacity, earnestness, and sense of duty, Eleanor devoted herself to the cause of human rights. But while she certainly shared her husband’s passion, it is arguable that she lacked his political acumen. The same might be said of the Human Rights Commission itself, composed as it was primarily of scholars and jurists. Its achievements were titanic and world-altering, but so too were its mistakes.
In hindsight, there were three principal faults. First, instead of narrowing its focus to those rights universally agreed upon across cultures—rights that also, correspondingly, had centuries of precedent, application, and evolution in every nation’s laws—the committee chose a broad list of “rights” that were in fact culturally specific norms: democratic government, presumption of innocence, the right to join trade unions, and so on. In other words, rather than tackle the admittedly difficult question of what constitutes a universal right, it deferred to every ideological conception and incorporated the whole lot into the document. This is not to suggest that such goals are not laudable, or that the world would not be better if they were universally adopted. But the world of 1948 was not one in which such a list could ever be realized; nor, for that matter, is ours today.
For the committee members that was immaterial. The declaration was meant to be aspirational, a foundation rather than a blood oath. But a foundation must be built upon solid ground, and the declaration was not. The lack of consensus within committee meetings should have been a warning of how the document would be received internationally. In fact, its trajectory mirrored the tribulations of its drafting: it was split into two covenants, one “democratic” and one “socialist,” that withered on the vine for decades until their relevance was moot. The inclusion of expressly political elements in the document guaranteed this outcome; it was, in effect, terming a host of rights “universal” that were not universal at all, nor likely to be. Mrs. Roosevelt compared the Universal Declaration of Human Rights to the Magna Carta, the Declaration of the Rights of Man, and the US Constitution. But a code, constitution, or charter does not allow its signatories to pick and choose which provisions they wish to follow. The declaration was drafted with the full knowledge that states would do exactly that. That it was adopted by nearly every nation in 1948 was evidence not of its acceptance but of its impotence.
This leads us to the second fault: lack of enforcement. Mrs. Mehta was correct: declarations are words on paper, but laws are a combination of word and force. The simple truth was that there were far too many provisions in the declaration for any kind of enforcement mechanism. How could the international community or any of its members force a state to provide paid medical leave? The proposed remedy was to transform the unenforceable declaration into a series of treaties or covenants. But states could still choose whether to sign—choose, in effect, whether these “universal” rights applied to them. Clearly those states that signed the covenants were already in compliance with their provisions or expected soon to be. This meant that the covenants were not an impetus to further progress but yardsticks for how far each state had already come. Moreover, they listed no penalties for states that fell short—in other words, no enforcement.
Third, and most fatal, was the lack of differentiation between rights. A code of laws, as discussed above, requires equal compliance. This means that whether the law against murder is No. 1 or No. 2339, it is still enforced. Differentiation exists not within the code but in the penalties applied for breach—hence the difference in prison sentences, for example, between first-degree murder and involuntary manslaughter. But without enforcement, all we have left is the words. And here words were the problem. Nowhere in the Universal Declaration of Human Rights is it stated or implied that the right to life is more fundamental than the right to patents and royalties. This would not matter if the declaration were law, but it is not law. Stripped to its essence, it is a statement by the community of nations that there are certain things it must do and others it must not. The list was already too long, too controversial, and too divorced from geopolitical reality. Compounding these failings was a conspicuous moral vacuum. If states would ultimately choose which rights to uphold, what was to prevent a state from claiming that the right to unemployment insurance trumped the right not to be tortured? Some logical division would be helpful, and William Blackstone’s eighteenth-century distinction between “rights” and “liberties” is as sensible as any:
For the principal aim of society is to protect individuals in the enjoyment of those absolute rights, which were vested in them by the immutable laws of nature, but which could not be preserved in peace without that mutual assistance and intercourse which is gained by the institution of friendly and social communities. Hence it follows, that the first and primary end of human laws is to maintain and regulate these absolute rights of individuals…. And, therefore, the principal view of human laws is, or ought always to be, to explain, protect, and enforce such rights as are absolute, which in themselves are few and simple: and then such rights as are relative, which, arising from a variety of connections, will be far more numerous and more complicated…. Thus much for the declaration of our rights and liberties.[34]
In sum, the abandonment of natural law principles for a catchall list with no enforcement mechanism left the Universal Declaration permanently crippled—broad and shallow rather than narrow and deep. The goodwill and fervor that led to its creation dissipated quickly in the wake of Cold War realities. Less than a decade after the destruction of Nazism, the world was consumed once again by a political, military, and ideological tug-of-war that subordinated all else. Even philosophic constructs of right, law, and duty were refracted through its narrow prism. At the World’s Fair, visitors tripped happily from the UNICEF Pavilion (sponsored by Pepsi) humming its irritating signature theme: “It’s a small world after all…” Across from them rose the glittering striated hubcap of Progressland.
Yet what no one realized was that the Carousel of Progress had done a complete revolution and was back at the beginning once again.