
Module 4: The Cold War

The wartime alliance of the Soviet Union, the United States, and Great Britain was the result of confronting a common enemy they feared more than they distrusted each other. Once the common foe was vanquished, the underlying hostility surfaced quickly as the Soviet Union moved to expand its borders. In March 1946, Britain’s Winston Churchill proclaimed that an “iron curtain” had fallen across Europe, separating east and west. Although the curtain was merely symbolic, it would effectively divide the world into two camps until late 1989, when the most visible embodiment of that iron curtain—the Berlin Wall—was dismantled, signaling the end of the Cold War.

I. The Cold War Begins

II. Baby Boom in Suburbia

III. The Elvis Era

IV. Fighting Communism at Home

I. The Cold War Begins

When World War II ended, it seemed, briefly, as though the world might be headed toward a new era of cooperation. The plans for the new United Nations were drawn up in the spring of 1945 and the organization was officially established in October of that year, once the five permanent members of the Security Council (the United States, Great Britain, France, China, and the USSR) had ratified the treaty.

Flash program with voiceover and archival film about the beginning of the UN, its 20th anniversary, and major decisions, including speeches by Truman, Molotov, Lyndon Johnson, and others.

This time, there was little hesitation on the part of the U.S. Senate and the treaty was quickly ratified. In part, this was in recognition of the recent death of Franklin Roosevelt, who had championed the creation of the UN from the beginning of World War II. Americans had also, however, clearly learned a lesson from the experience after World War I and there were few within the country who advocated a return to isolationism. The general feeling was that the United States had an obligation to lead the post-war world. The first step toward global leadership was to rebuild war-torn Europe. The second was to create an infrastructure, through the United Nations, that would allow for peaceful resolution of conflicts and, hopefully, avoid another world war. (Click on the thumbnail United Nations.)

In just a matter of months, however, the shaky truce between the United States and the USSR began to collapse. The Soviets saw the United States, the self-proclaimed leader of the capitalist world, as the natural enemy of any Communist nation and sought to surround themselves with friendly buffer states. The United States felt that the Soviet government and Communism were a threat to Europe and, potentially, all democratic governments.

Flash program with voiceover and archival film about the threat of Communism in France, Iran, E. Germany, China and Korea. It includes a cartoon showing the “Duck and Cover” defense against nuclear attack, a segment on Cuba becoming Communist, the hydrogen bomb threat by Russia, and the Cuban Missile crisis.

To Western eyes, the creation of Soviet-style regimes in Hungary, Romania, and Bulgaria, and soon thereafter in Yugoslavia and Albania, confirmed the West’s greatest fears that Stalin would not be content until the entire continent of Europe—and perhaps, the entire globe—was under his control. When the Soviets blocked open elections in Poland in 1946, fears grew that the Nazi swastika that the Allies had just fought so hard to defeat would simply be replaced with the Soviet hammer and sickle (Jones 2001, 220–1). George Kennan, a U.S. State Department employee stationed in the Soviet Union, sent what would become known as the “Long Telegram.” Kennan cautioned that the Soviets were unlikely to compromise in their territorial aims. The response of the U.S. would have to be one of persistent containment, countering the USSR at any point on the globe where it attempted to expand its sphere of influence (Jones 2001, 232). This principle became the cornerstone of U.S. policy throughout the Cold War. (Click on the thumbnail The Red Menace.)

The Cold War grew chillier in 1947 as a Communist movement within Greece seemed likely to overthrow the Greek monarchy, an ally of Great Britain. The British, still severely weakened from World War II, were not able to aid the Greek government. This was one of several events that led President Truman to develop the Truman Doctrine. Congress authorized a massive military and economic aid package of $400 million for Greece and for Turkey, which was facing Soviet pressure of its own (Jones 2001, 240). The Marshall Plan was also approved, with the hope that an infusion of economic assistance to European countries would help to stabilize the existing governments and make it less likely that hungry and deprived citizens would embrace the alternatives being suggested by Communist groups.

Flash program with voiceover and archival 1949 British documentary film detailing the Allied effort to supply Berlin during the Communists' Berlin blockade.

From the Soviet perspective, however, the Marshall Plan was another clear example of U.S. economic aggression. Stalin’s response was to attempt to force the Western powers out of Berlin. The post-war division of Germany into four spheres of influence, under Britain, France, the United States, and the USSR, had similarly divided the German capital city, which was physically located within the Soviet section of the country. In the summer of 1948, the USSR cut off rail and road access to the Western sections of Berlin. Truman’s response, the Berlin Airlift, affirmed the policy of containment when the Soviet Union backed down and reopened supply routes in May 1949 (Behrman 2007, 245). (Click on the thumbnail Berlin Airlift.)

Spheres of economic influence were clearly important to both sides in the late 1940s, but military security issues were also of vital concern. In April 1949, the United States oversaw the creation of a collective security organization, the North Atlantic Treaty Organization (NATO), whose members agreed that an attack against one member would be viewed as an attack against all. Stalin responded by tightening the Soviet bloc: the economies within the Soviet sphere had been united under the Council for Mutual Economic Assistance (COMECON) in January 1949, and in early 1950 the USSR concluded a formal alliance with the new People’s Republic of China (PRC).

The United States moved quickly to contain the spread of Communism in both Europe and Asia in 1950, supporting the rearmament of West Germany and offering aid to the French, who were fighting a Communist insurgency in French Indochina (part of which would later become Vietnam). At the end of World War II, Korea (which had been occupied by Japan) was divided into two separate states: the U.S.-backed Republic of Korea in the south and the Communist-led Democratic People’s Republic of Korea in the north. The United States withdrew its troops in 1949, and the North Koreans invaded South Korea a year later. The United States responded by returning troops to the Korean peninsula and requested assistance from the United Nations. The USSR was boycotting the UN over the Security Council’s refusal to seat the representative of the People’s Republic of China, so there was no opposition to the proposal, and UN forces joined the effort (Lens and Zinn 2003, 377).

Flash program with voiceover and archival film about the Korean War from its inception to its end, including the confrontation between President Truman and General MacArthur.

Troops led by General Douglas MacArthur pushed North Korea back to the previous border and beyond, with the goal of creating a united Korea. China launched a counterattack in late 1950, causing heavy U.S. casualties. MacArthur’s request that he be allowed to invade China was denied by Truman. MacArthur repeatedly and publicly denounced Truman’s decision, and in 1951 President Truman dismissed him for insubordination (Halberstam 1994, 114). The troops would hold in a stalemate until 1953, when President Dwight D. Eisenhower engineered a cease-fire. North and South Korea are still, technically, at war, separated near the 38th parallel by the Demilitarized Zone (or DMZ), a buffer roughly two and a half miles wide. (Click on the thumbnail Korea.)

Throughout the 1950s, the nuclear arms race cast a shadow over every aspect of relations between the U.S. and the Soviet Union. The fact that the United States had nuclear weapons and the Soviets did not was one of the key reasons for Stalin’s aggressive actions after WWII. The Baruch Plan, a United Nations proposal that would have put all nuclear weapons under international control, was dismissed by the Soviets as an obvious attempt by the United States to prevent the USSR from acquiring the bomb, and the Soviet nuclear program moved forward after the war. In August 1949, the USSR tested its first nuclear weapon.

In 1950, the National Security Council issued a secret document, known as NSC-68, that committed the United States to meet the challenge of a nuclear-armed Soviet Union through a massive increase in military spending. Truman also urged the U.S. Atomic Energy Commission to move forward with development of the hydrogen bomb. By 1954, hydrogen weapons that researchers estimated at 1,000 times the strength of the atomic bomb dropped a decade earlier on Hiroshima were being tested at Bikini Atoll in the Pacific (Horowitz 1965, 144). The Soviets soon tested their own hydrogen weapons and the arms race that followed would continue at various levels of intensity throughout the Cold War.

II. Baby Boom in Suburbia

After World War II, the United States was without question the wealthiest country in the world. The economy quickly adjusted from war-based production to consumer goods, as Americans lined up to purchase items that had not been available due to the war. They could now choose from a wider array of products, many of which were based on scientific innovations resulting from the war effort. Household appliances—refrigerators, dishwashers and clothes dryers—were at the top of the list. Production increased and employment opportunities were abundant. Government spending remained high, as the Truman administration continued many programs launched through the New Deal. Military spending was also increased as the Cold War developed and the Marshall Plan expanded European markets for American goods.

Flash program with voiceover and archival film showing Redbook Magazine’s 1957 marketing film depicting life in the suburbs at that time.

To the generations of Americans that had lived through the Depression and WWII, this new prosperity was staggering. By 1960, the median family income had nearly doubled and an increasing number of American families now considered themselves part of the middle class (O’Neill 1989, 288). Home ownership was now within the reach of most families, especially those who could qualify for assistance under the GI Bill. The vast majority of these new homes were built in the suburban areas that began springing up around every major city in the 1950s. The move to the suburbs meant that automobile production increased, and by 1960, three-quarters of all families owned a car and one-fifth of them owned two (ibid.). (Click on the thumbnail The Suburbs.)

Flash program with voiceover and archival film featuring a 1957 baby-food commercial, the opening of the Kaiser Maternity Hospital, a baby race in Los Angeles, and a diaper derby.

Many couples had delayed starting a family during the war. During the early 1950s, the birth rate rose steadily. In 1957, when the “baby boom” peaked, an American child was born every 7 seconds. Dr. Benjamin Spock’s book on baby and child development was a national bestseller throughout the 1950s (Patterson 1997, 77). (Click on the thumbnail Baby Boom.)

Television sets were also increasingly common in the American home. Stations had begun broadcasting in some American cities prior to WWII, but the number of households with a TV set skyrocketed during the 1950s—from fewer than one million sets in 1949 to over 44 million in 1959 (O’Neill 1989, 83). There were 600 broadcasting stations by the end of the decade. Large cities noted a drop in water pressure during popular programs such as I Love Lucy, as viewers across the city rushed to the bathroom during the commercial break (ibid.).

Flash program with voiceover and archival film clips from early TV shows including, among others, Dragnet, the Mickey Mouse Club, Howdy Doody, Frank Sinatra, Ozzie and Harriet, and the Honeymooners. It also has ads for the 1959 Chevrolet, Coca Cola, Winston cigarettes, and others.

Shows like I Love Lucy and other situation comedies, which had moved easily from radio to television, frequently focused on the suburban wife and motherhood. A record 44 million viewers tuned in to view the 1953 episode where Lucy Ricardo gave birth to “Little Ricky,” paralleling the real-life pregnancy of the actress, Lucille Ball. Only half as many viewers would watch the inauguration of President Eisenhower the next day (Halberstam 1994, 198).

Another staple of early television programming was the soap opera, which earned the name because the shows were initially underwritten by a single sponsor, often a company like Procter & Gamble that specialized in cleaning products for the home. As the number of television households grew, so did the impact of advertising. Programming was often shaped by the desires of advertisers and most shows focused on the middle-class white suburban families who were most likely to purchase the products being advertised. (Click on the thumbnail Early Television.)

The middle class was larger than ever before, but twenty percent of Americans still lived below the poverty level (O’Neill 1989, 21). This latter segment of the population was diverse, including the urban poor, migrant workers, the elderly, and residents of isolated rural communities. The middle-class flight to the suburbs meant that the inner cities were increasingly made up of the poor and, in particular, minorities, who were the most likely to be affected by poverty. Between the end of the war and 1960, over five million African Americans from the rural South moved into the Northern cities being vacated by the middle class. Many of the Mexican guest workers who had poured into the United States during World War II to take the place of workers who had joined the military remained illegally after the war. Operation Wetback was devised in 1954 by the Immigration and Naturalization Service (INS) to question “Mexican-looking” people in towns along the border and deport those who could not prove their citizenship status (Patterson 1997, 380).

There was, however, some movement on the civil rights front after the war. In 1948, Harry Truman desegregated the armed forces and ended discrimination in the hiring of federal workers, two things he was able to accomplish via executive order. On other issues, however, Truman was less successful. His efforts to end poll taxes and to pass an anti-lynching bill required action on the part of Congress and were blocked by a coalition of Republican opponents and southern Democrats, many of whom would break away from the Democratic Party to form the Dixiecrats in the 1948 election. Progressive Democrats also broke away to form a third party. Despite this—and despite newspaper headlines heralding his defeat—Truman was narrowly reelected.

A coalition of Republicans and conservative southern Democrats in Congress opposed several core policies of Truman’s agenda, which he called the “Fair Deal,” including a plan for national health-care coverage. Republicans viewed the Fair Deal as a warmed-over New Deal, and they had not been fond of the dish when it was initially served by Franklin Roosevelt. Truman was, however, able to extend Social Security and obtain an increase in the minimum wage.

Flash program with voiceover and archival film about the 1952 Stevenson vs. Eisenhower presidential campaign, Eisenhower’s diplomatic activities during his first term, excerpts from a 1956 election campaign ad, and Eisenhower’s reelection in 1956.

The election of Eisenhower in 1952 did not result in major changes on the legislative front. Eisenhower, who was a moderate Republican, continued a variety of New Deal programs and also oversaw a massive new federal project, the creation of the interstate highway system, as well as the establishment of the National Aeronautics and Space Administration (NASA). (Click on the thumbnail I Like Ike!)

While “Ike” was considered to be only a lukewarm supporter of civil rights, he did appoint Earl Warren as Chief Justice of the Supreme Court. The Warren Court marked a dramatic shift in U.S. judicial history, beginning with one of its first major cases, Brown v. Board of Education, which declared segregated schools unconstitutional. The next year, the court ordered the states to move as quickly as possible to integrate public schools. Many Southern states resisted, including Arkansas. In 1957, Governor Orval Faubus used the state’s National Guard to block the integration of Little Rock Central High School, and Eisenhower reluctantly federalized the Guard and sent in federal troops to ensure that the Supreme Court ruling was carried out.

Integration of the public schools was a long, slow process and the goal would not be fully realized in some parts of the South until the mid-1960s. Many white parents, especially in the South, were concerned that merging the black and white schools would lower the quality of education—and given that African-American schools had been so poorly funded, this was probably not an unfounded fear. Others were equally concerned that integration would accelerate the cultural changes that they were seeing in the younger generation—especially in their choice of music.

III. The Elvis Era

Prior to World War I, the “teenager” as we know it did not exist. The actual term was coined in the late 1940s to describe a demographic with growing economic power (Palladino 1997, 52). Advertisers soon realized that they ignored the teenage market at their peril, since teenage girls would soon be women with control over household purchases.

The teenager did not, however, emerge overnight. From the 1920s through the early 1940s, a small group of privileged young adults had spare time on their hands and a bit of money to spend. Some of these were the young people who sneaked into speakeasies in Harlem. Others were the young middle-class girls—the “bobby-soxers”—who swooned over Frank Sinatra and Rudy Vallee (Schrum 2004, 122).

Most young people during the Depression and World War II, however, had to take life pretty seriously. A growing number of young people were continuing their education through high school, but the boys knew that they would soon be in the military or working and many already held part-time jobs. Many girls would also enter the workforce briefly, but the general pattern was to marry early and become a housewife. College education was available only to the lucky few whose parents could afford it.

After World War II, however, the United States was enjoying a much greater level of prosperity, and an increasing number of families joined the middle class. Privileges that had previously been available only to the children of the relatively wealthy were now common for average families. Parents pushed their children to finish high school and a greater number of graduates continued to college. Young adults were more likely to be given an allowance and to have free time to spend with their peers.

Flash program with voiceover and archival film clips on the causes of juvenile delinquency, including excerpts from Rebel without A Cause; I Was A Teenage Werewolf; Ask Me, Don’t Tell Me; and a discussion of the influence of teen clubs.

Teenagers who had both free time and a bit of spending cash were instrumental in the culture shift that began in the 1950s. The advent of radio and television also gave teenagers an easy way to learn what was hip—and advertisers an easy way to influence that decision. Parents were suddenly alarmed at the “generation gap” that appeared as teens began to try to differentiate themselves from the culture embraced by their parents. The news reported an increase in juvenile delinquency and schools began to show cautionary films to keep teenagers on the straight-and-narrow path. (Click on the thumbnail Juvenile Delinquents.)

Flash program with voiceover and archival film discussing black and white performers including film clips of performances by Pat Boone, Fats Domino, Little Richard, Elvis Presley, the Beatles, and others.

The music that teenagers were listening to was often touted as the strongest evidence of the decline of morality among youth, and racism was clearly an additional element in parental concerns. There could be little doubt that the roots of rock and roll were firmly planted in African-American blues. Black artists saw their record sales begin to move upward as white teenagers began to purchase more rhythm and blues records in the early 1950s. These were marketed to white teens as a new category of music, “rock and roll,” popularizing the label first used by disc jockey Alan Freed (aka “the Moondog”) in 1952. Freed played music by black artists and his show was widely syndicated in the mid-1950s. He also hosted a television show, The Big Beat, but it was cancelled in 1957 when ABC affiliate stations in the South complained about a segment in which black singer Frankie Lymon danced with a white girl. Dick Clark, whose show American Bandstand would also premiere in 1957, was careful not to make the same mistake (Palladino 1997, 133). (Click on the thumbnail Shake, Rattle & Roll.)

The solution that record labels hit upon was to find white artists to cover the songs that white teenagers wanted to hear. While some of the original recording artists were annoyed that their sound was being co-opted (and watered down) by the white performers, others recognized that this usually resulted in greater sales for their own records, as teenagers sought out the wilder original versions of their favorite tunes. Singers like Pat Boone and Ricky Nelson, who were less threatening to white parents, had major hits with songs that were initially recorded by Little Richard, Chuck Berry, and Fats Domino.

Elvis Presley, however, fell into a different category. Presley’s music seems rather tame to modern ears, but for the white parents of teenagers in the mid-1950s, he was shocking. Unlike Pat Boone and Ricky Nelson, Presley didn’t “sound white” and the moves he made on stage were sexually suggestive. An appearance by Elvis on Milton Berle’s popular variety show in 1956 showed teenage girls screaming at every shake of his hips. The Catholic Church published an article warning the faithful to “Beware Elvis Presley” and psychiatrists cautioned that seeing Elvis perform could result in young girls becoming sexually promiscuous (WGBH 1999). Ironically, teenage boys seemed less convinced of this—at one performance, a teen gang bombed the singer’s car, apparently jealous of the attention he was receiving (Carr and Ferren 1989, 11). Elvis embraced the “rebel image” and could reportedly quote from memory every line that James Dean’s character had uttered in the 1955 film Rebel without a Cause (Halberstam 1994, 487).

Other performers—both white and black—began to adopt elements of the Elvis style. As Presley’s popularity grew, his managers began trying to make his work more palatable to mainstream America. Ed Sullivan, who had initially said he’d never hire Presley, only a few months later signed the singer to a three-show contract. In addition to the usual moves that drove the girls wild, Presley included ballads and a gospel song. Ed Sullivan closed the last performance by declaring Elvis a “decent, fine boy,” but many parents remained unconvinced (Halberstam 1994, 509). The more that parents and schools cautioned against or even banned rock and roll music, however, the more popular it became.

IV. Fighting Communism at Home

Rock and roll was not the only cultural menace of the 1950s, however. As the Cold War grew more intense, the U.S. government became increasingly concerned about the possibility of Communist infiltration. Initially, the key concern was government employees, but during the 1950s, suspicion would spread to include writers and performers.

The Smith Act of 1940 prohibited any group from advocating the violent overthrow of the government. The act was used to convict several leaders of the Communist Party and Socialist Workers Party in the late 1940s and early 1950s. Some of the fears were legitimate—the Communist Party claimed its largest membership ever during World War II and some party members held positions in the federal government.

The House Un-American Activities Committee (HUAC) began a series of investigations in the late 1940s. Several high-profile cases remain controversial. In 1948, a Time magazine editor, Whittaker Chambers, was called before the committee to testify. He initially denied having worked for the Soviets, but claimed that he had knowledge that a former State Department official, Alger Hiss, had passed classified material to the USSR in the 1930s. Hiss, who was a rising star in the FDR administration and had prominent contacts in the government, sued Chambers for slander.

During the trial, Chambers changed his story, claiming that both he and Hiss had been Soviet operatives and that Hiss had passed microfilmed documents to him in the 1930s. Chambers then led investigators to a hollowed-out pumpkin on his farm in Maryland, where he had hidden rolls of undeveloped film that he claimed had been passed to him by Hiss. The documents were promptly dubbed “The Pumpkin Papers” by the press (Kessler 2003, 87). Representative Richard Nixon touted the film as solid evidence against Hiss, but Hiss could not be prosecuted for espionage because the statute of limitations had expired; he was instead convicted of perjury and served several years in prison. Hiss steadfastly maintained his innocence and later sued for the release of the “Pumpkin Papers,” and while the contents were not as damning as Nixon suggested, most scholars remain convinced that Hiss was indeed a Soviet spy (Powers 2004, 88–9).

In the same year that Hiss was convicted, Julius and Ethel Rosenberg were charged with passing nuclear secrets to the Soviet Union. The two were executed in 1953. While the evidence that Julius Rosenberg did pass secrets to the Soviets is strong, documents released after the end of the Cold War suggest that the Soviets believed the information was of little use. Ethel, whose brother David Greenglass worked at Los Alamos, was probably guilty only of being Rosenberg’s wife and a fellow Communist, although KGB records suggest that she knew of Rosenberg’s activities. Greenglass testified that he had seen her transcribing his notes from Los Alamos on a typewriter, but later admitted that he lied about this to protect himself and his wife (Kessler 2003, 95).

Suspicions of disloyalty soon spread to the entertainment industry. A growing list of actors, directors, and writers were summoned to appear before HUAC. Some refused and were imprisoned. Those who were even suspected of ties to the Communist Party frequently found themselves blacklisted and unable to obtain work in Hollywood.

Flash program with voiceover and archival film showing the McCarthy hearings on communists in government and industry, opinions voiced by his opponents, and his eventual censure in 1954.

Republican Senator Joseph McCarthy was a central figure in the hearings. In early 1950, McCarthy claimed in a speech to have a list of hundreds of Communists who were employed by the State Department (Whitfield 1996, 29). He was unable to support the claim, but gained popular support, charging that Truman and the Democratic Party were “soft” on Communism (Whitfield 1996, 229). Later that year, Congress passed the McCarran Internal Security Act, which set up a special board to investigate Communist infiltration in the government. (Click on the thumbnail McCarthyism.)

Fear of Communism was also a key factor in the 1952 elections. The Republican presidential candidate, Dwight Eisenhower, avoided accusations against the Democrats, but his running mate, former HUAC member Richard Nixon, was less reticent. He charged that electing Adlai Stevenson, the Democratic candidate, would result in “more Alger Hisses, more atomic spies” (Cochran 1973, 383). Eisenhower won a solid victory in 1952 and was easily re-elected in 1956.

McCarthy eventually overstepped his bounds. During the 1952 campaign, McCarthy made headlines by denouncing General George Marshall as a Communist. Eisenhower, a close, long-time friend of Marshall, was infuriated by the charges. He planned to denounce McCarthy in a campaign speech but deleted that section when he was cautioned that it could hurt his chances in the election. Eisenhower nonetheless remained determined to see McCarthy fall. The senator did not realize that his usefulness to the Republican Party was over now that it held the White House, and he continued with his charges, prompting the televised Army-McCarthy hearings of 1954. By the conclusion of those hearings, McCarthy had lost much of his public support, and he was censured by his Senate colleagues at the end of the year. The only Democrat who did not vote for the censure was Senator John F. Kennedy, who was hospitalized at the time and never recorded a position (Whitfield 1996, 209).

The end of McCarthy was not, however, the end of McCarthyism. Throughout the country, books were banned as “subversive” and the slightest connection to a left-leaning group could cost writers, academics, and entertainers their jobs. As was the case with McCarthy, the charge was often not against the content of the work or any concrete actions on the part of the individual, but simply a matter of guilt by association. FBI director J. Edgar Hoover had been a close ally of McCarthy, but he cut those ties when it became clear that the senator’s political career was at an end. Hoover continued to work closely with HUAC as it investigated a wide variety of entertainers, writers, and government workers. In 1959, former president Harry Truman denounced HUAC as the “most un-American thing in the country today” (Whitfield 1996, 124). Although HUAC lost influence during the 1960s, the committee was not formally disbanded until 1975.

Module 4: The Cold War

Without doubt, “the sixties” left their mark on American history and culture. To some, the era is seen as a time of violence and chaos, when riots and assassinations scarred the political scene and a costly war divided the nation. Others look back nostalgically on the decade as a time when the country took its first real steps toward ensuring racial equality and when a small segment of society believed that a peaceful, more holistic society was within reach. Sex, drugs, and rock and roll all had starring roles and the decade continues to have an impact on many political and social debates.

I. Nuclear Politics and the Space Race

II. The Great Society

III. Fighting Jim Crow

IV. Vietnam

V. Make Love, Not War

I. Nuclear Politics and the Space Race

Flash program with voiceover and archival film about the competition to be preeminent in space between Russia and the U.S. It shows film clips of the first satellites—Russia’s Sputnik and the U.S.’s Explorer and Challenger—followed by images of missiles with live animal cargo, then the first men in space, Yuri Gagarin and John Glenn.

In 1957, the USSR launched Sputnik, the first artificial satellite. This achievement was in many ways more alarming to Americans than the first Soviet nuclear tests, as it indicated that the USSR was significantly ahead of us in the arena of science and technology. The United States launched its own satellite in January 1958, and later that year, NASA was formed to oversee civilian space exploration, but there was a clear sense that we were significantly behind in the new “space race.” The pattern was repeated in 1961, when the Soviets announced that cosmonaut Yuri Gagarin had successfully orbited the earth. A U.S. astronaut, John Glenn, followed suit in 1962. (Click on the thumbnail The Space Race.)

Flash program with voiceover and archival film showing ads for the 1960 election, clips of the Nixon-Kennedy debates, newsreels of the voting and Kennedy's subsequent win, and newsreels of Kennedy's acceptance speech and swearing-in speech.

Americans were not accustomed to coming in second. In the 1960 election, Democratic presidential candidate John F. Kennedy channeled that frustration and defeated the Republican candidate, Vice-President Richard Nixon. Kennedy’s vision of a “New Frontier” for America, both at home and abroad, was inspiring to young people, who flocked to join the newly created Peace Corps. In keeping with that theme, he set an ambitious goal for space exploration—an American on the moon within ten years. In terms of foreign policy, however, Kennedy’s administration was very similar to that of Eisenhower, continuing the policy of containment. Kennedy significantly raised defense spending, increased our nuclear arsenal, and expanded our involvement in Vietnam. (Click on the thumbnail 1960 Election.)

During Kennedy’s first years in office, the containment policy would be put to one of its most stringent tests, as the Cold War very nearly turned hot. In 1959, a rebellion led by Fidel Castro had resulted in the overthrow of the Cuban government. Castro moved quickly to nationalize businesses, in an effort to narrow the gap between the rich and poor in Cuba. This included the takeover of several U.S.-owned businesses. The failed Bay of Pigs invasion in 1961 convinced Cuba that it needed a stronger foreign ally, and Soviet Premier Nikita Khrushchev agreed to set up a nuclear missile base in Cuba. The 1962 Cuban Missile Crisis pushed the U.S. and USSR to the very brink of nuclear confrontation. The close call was, however, an impetus for Kennedy and Khrushchev to take some tentative steps toward slowing the pace of the nuclear rivalry, including the 1963 Limited Test Ban Treaty.

II. The Great Society

Flash program with voiceover and archival film showing TV reports of the Kennedy assassination, including a re-enactment of the shooting narrated by Walter Cronkite, and a discussion of the Warren Report.

Kennedy’s New Frontier launched several new initiatives, including the Peace Corps, and he made some initial steps toward civil rights legislation. Other initiatives were still in the planning stages when he was assassinated in November 1963. (Click on the thumbnail Kennedy Assassination.) The new president, Lyndon B. Johnson (LBJ), a Texan with a long career in Congress, took up the domestic agenda with a vengeance. In early 1964, Johnson declared a “War on Poverty”—a multifaceted plan for education, training, and direct assistance to the poor. Programs included Volunteers in Service to America (VISTA), the Job Corps training program, and Project Head Start.

Flash program with voiceover and archival film showing President Johnson’s address declaring a war on poverty and showing excerpts from an educational film depicting poverty in rural America, produced as part of the war on poverty.

The 1964 election returned Johnson to the White House and he expanded his domestic agenda. His program, which he called “The Great Society,” continued many of the War on Poverty initiatives. The establishment of Medicare and Medicaid, providing medical care for the elderly and the poor, respectively, was a central component of Johnson’s agenda. Other programs sought to improve the environment, protect endangered species, and provide federal aid to impoverished school districts. (Click on the thumbnail War on Poverty.)

The domestic reforms of the New Deal era had faced both congressional and judicial opposition. Congress was still an occasional impediment to Johnson’s ambitious agenda, but the president had spent many years as Senate Majority Leader and was adept at finding ways to secure passage of his key programs. The judiciary was also much more sympathetic than the one that FDR had faced. Four new justices joined the Supreme Court during the Kennedy and Johnson years, including the first African-American justice, Thurgood Marshall. These new appointments resulted in a solid liberal majority that handed down a series of landmark rulings in the 1960s. The court moved quickly against a variety of laws that allowed for continued segregation, limited the ability of communities to censor movies and reading material, and declared organized school prayer and various state restrictions on interracial marriage and the distribution of contraceptives to be unconstitutional. Other decisions, including Miranda v. Arizona and Gideon v. Wainwright, expanded the rights of the accused.

III. Fighting Jim Crow

Flash program with footage from a low-budget film, starring William Shatner, dramatizing the desegregation of a small high school in the South. It also features a report by John Cameron Swayze on desegregation in St. Louis, Mo., and a clip about the John Birch Society.

One of the primary areas of reform during the Johnson administration was civil rights, but the legislative and judicial gains of the 1960s were the product of slow, steady pressure by civil rights groups throughout the 1940s and 1950s. The NAACP pushed a variety of cases through the courts in an attempt to overturn the 1896 “separate but equal” ruling in Plessy v. Ferguson. When the Brown decision was handed down in 1954, however, it was clear that there would be considerable resistance in the states. Even after the 1955 order that states move “with all deliberate speed,” Southern states continued to block enrollment by black students. (Click on the thumbnail School Desegregation.)

Television cameras were, however, beginning to bring the civil-rights struggle directly into living rooms across the country. They covered the federal troops who ushered nine black students into the classrooms at Little Rock Central High School and they covered the 1955 bus boycott in Montgomery, which was organized when Rosa Parks refused to vacate her seat on the bus to a white passenger. The minister who helped to organize that boycott was Martin Luther King, Jr., and he would be at the forefront of a new organization created in 1957, the Southern Christian Leadership Conference (SCLC).

Television cameras were also rolling as young people expanded the movement. In February 1960, four black students from North Carolina A&T University staged a sit-in at segregated lunch counters, patiently waiting to be served. The protest quickly spread throughout the South and a separate organization, the Student Nonviolent Coordinating Committee (SNCC), was formed to help organize activities, including sit-ins, where protesters met the violence of the authorities in the South with nonviolent resistance. SNCC and the Congress of Racial Equality (CORE) encouraged both black and white passengers to ignore the segregated seating rules and participate in “Freedom Rides” throughout the South.

Although Kennedy expressed support for civil rights, it was not an issue he pursued vigorously during the first two years of his presidency. Prior to 1963, his strongest statement had been authorizing federal troops to quell the riots that erupted when James Meredith, an African American and former soldier backed by a federal court order, enrolled in the University of Mississippi. Alabama was also a focal point that year, as Governor George Wallace personally blocked the doorways to the University of Alabama and police in Birmingham used violent tactics against civil rights proponents, including fire hoses, attack dogs, and cattle prods. Also in Birmingham, Ku Klux Klan members bombed a church on September 15, 1963, killing four African-American girls who were attending Sunday School.

In the summer of 1963, Kennedy proclaimed that ensuring civil rights was a moral issue that the country could no longer delay. As Congress debated the civil rights legislation he proposed, massive demonstrations were organized including the March on Washington in August 1963, where more than 200,000 people heard Martin Luther King tell the world of his dream for an America where racial hatred and violence no longer divided the nation (Zinn 2003, 457).

Kennedy’s assassination in November 1963 led many to assume that the hope for civil rights legislation was dead, especially since Johnson was from the South. LBJ surprised them by not only endorsing but expanding the proposed legislation and pushing Congress to enact the Civil Rights Act of 1964 as a memorial to Kennedy.

Several other advances for racial equality followed. The ratification of the 24th Amendment in 1964 meant that states could no longer assess a poll tax for participation in federal elections and, in 1966, the Supreme Court declared poll taxes invalid for any election. The Voting Rights Act of 1965 further expanded suffrage rights for African Americans by outlawing literacy tests that had been used for generations to keep black voters from participating. As a result of these measures, the number of black voters who were registered in the South increased threefold (Zinn 2003, 456). Johnson also signed Executive Order 11246, which required organizations affiliated with or doing business with the federal government to take positive steps—or “affirmative action”—to correct the existing imbalance in economic opportunities for African Americans and other groups (including women) that had been the victims of past discrimination.

Flash program with voiceover and archival film featuring Reverend Martin Luther King describing the "New Negro." It includes a speech by President Kennedy on the topic, other civil rights stories, speeches by James Farmer, Malcolm X, and Martin Luther King’s "I have a dream" speech. It also shows President Johnson’s speech regarding the march on Selma, Alabama and his subsequent signing of the voting rights act.

For many, however, these long-delayed actions were too little and too late. In August 1965, a week of rioting in Watts, a predominantly black area of Los Angeles, resulted in 34 deaths and massive property loss (Zinn 2003, 459). Rioting spread to cities throughout the nation between 1966 and 1968. The assassinations of Nation of Islam leader Malcolm X in 1965 and of Martin Luther King in 1968 fueled the argument of those who claimed that American society would never accept African Americans as equal citizens. Younger and more radical civil rights groups clashed directly with older groups like the NAACP and SCLC, arguing that the emphasis should be on black power and separatism, not an integrated society. (Click on the thumbnail Civil Rights Protests.)

IV. Vietnam

The domestic programs that LBJ pursued during his administration were not popular with conservatives, but his foreign policy was even less popular with liberals. Foreign policy experts in both the Eisenhower and Kennedy administrations had been convinced that it was critical to hold the line in Vietnam in order to avoid a Communist takeover of Asia. France had withdrawn from Vietnam in 1954 and the country was divided, much like Korea, between a Communist government under Ho Chi Minh in the northern half of the country and a non-Communist government in the south, where a Communist insurgency (the National Liberation Front, or NLF, often known as the Viet Cong) fought to overthrow the government.

Kennedy continued the policies of the Eisenhower administration in supporting the South Vietnamese and dramatically increased both financial support and the number of military advisors in the country. Just months before Kennedy’s assassination, the United States supported a successful coup in South Vietnam that overthrew the country’s president, Ngo Dinh Diem.

Flash program with voiceover and archival film showing Secretary of Defense Robert McNamara discussing the progress of the war in Vietnam and addresses by Presidents Johnson, Nixon, and Ford on the subject.

Communist forces were gaining influence in South Vietnam, and Johnson was faced with a difficult decision. Increased aid for South Vietnam would inevitably impact funding for his domestic programs, which he felt were needed for the long-term social and economic growth of the nation. On the other hand, failing to increase support for South Vietnam might mean that the nation would fall to the Communists and, even though the United States had never officially declared war, Johnson knew that this would be perceived as a loss for the U.S. It would also be a loss for the Democrats, who could not risk being labeled as “soft” on Communism—especially with an election on the horizon. (Click on the thumbnail Vietnam War.)

In August 1964, several months before the presidential election, the North Vietnamese attacked U.S. warships in the Gulf of Tonkin. In response, Johnson authorized limited bombing campaigns against North Vietnamese targets and requested that Congress pass the Gulf of Tonkin Resolution, which gave the president the authority to use armed force in Vietnam. The resolution was supported by all but a few members of Congress.

During the 1964 election, LBJ campaigned on a peace and prosperity platform—easily defeating conservative challenger Barry Goldwater. After the election, Johnson used his power under the resolution to dramatically increase U.S. troop presence in Vietnam and escalate the bombing raids. By 1968, U.S. forces in Vietnam totaled over half a million (Zinn 2003, 477).

Despite the escalation, there was no formal declaration of war. The goal of the U.S. military was to break the North Vietnamese by inflicting steady and sustained losses, but they were not easily broken. In 1968, the North Vietnamese launched the Tet Offensive, which resulted in heavy U.S. and South Vietnamese casualties and included an assault on the grounds of the U.S. Embassy in Saigon (Zinn 2003, 499). The North Vietnamese were eventually pushed back, but the casualty count and the war debt continued to grow and the sense among the American public was that we were accomplishing very little in return.

V. Make Love, Not War

Flash program with voiceover and archival film clips from movies and TV, including among others, Gunsmoke, Psycho, The Andy Griffith Show, Mr. Ed, To Kill a Mockingbird, Goldfinger, Dr. Strangelove, The Addams Family, Gilligan’s Island, Star Trek, and 2001: A Space Odyssey.

As the 1960s began, a rebellion was brewing in colleges across the nation. A growing number of young people were now attending college and, as is often the case, they were convinced that the previous generation had made serious mistakes. The suburban lifestyle that their parents had gladly embraced after the Depression and World War II seemed too commercial and crass. (Click on the thumbnail Growing Up in the 60s.)

The younger generation was also stuck fighting a war that it increasingly viewed as avoidable, and many felt that the United States was the aggressor. The news reports of mass bombings, chemical weapons, and constant death—both of Vietnamese civilians and U.S. soldiers—led to an increasing sense of dissatisfaction with both the U.S. government and all aspects of American society, especially among younger Americans.

Flash program with voiceover and archival film about the peace marches opposing the U.S. involvement in Vietnam. Clips show draft-card burning, antiwar marches in the U.S. and abroad, love-ins, sit-ins, hippies, and the Kent State protest.

Some of these young adults had volunteered with civil-rights organizations in the late 1950s and early 1960s, and the new organizations followed a similar pattern of growth and development. In 1962, Students for a Democratic Society (SDS) held a convention at Port Huron, Michigan. The Port Huron Statement that emerged from that meeting criticized U.S. capitalism and emphasized individual rights, demanding economic justice and a variety of social reforms. It became a focal point for student groups around the country, which began to hold nonviolent sit-ins and “teach-ins” to protest university policies, the war, and a wide range of other issues. (Click on the thumbnail Peace Movement.)

In 1967, the place to be, if you were young and seeking change, was the Haight-Ashbury area of San Francisco. In January, a local artist had organized a “Human Be-In” in Golden Gate Park. The event was billed as a new type of celebration that would help to raise consciousness so that “revolution can be formed with a renaissance of compassion, awareness, and love, and the revelation of unity for all mankind” (Perlstein 2008, 177). A song written by John Phillips of the Mamas and the Papas and sung by Scott McKenzie raced to the top of the charts in the U.S. and Great Britain—the lyrics advising, “if you’re going to San Francisco, be sure to wear some flowers in your hair”—and, despite cautions from the mayor of San Francisco to stay away, nearly 100,000 “flower children” flocked to the city during the spring and summer of 1967, which became known as the “Summer of Love” (Lytle 2006, 220). One draw was the music—the Monterey Pop Festival, held in June, featured Janis Joplin, Jimi Hendrix, the Byrds, the Grateful Dead, and the Who, among others.

Violence swept the nation during the next few years. Ironically, as the Summer of Love was taking place in San Francisco, massive race riots were erupting in Newark and, later, in Detroit, where LBJ ordered federal troops to restore the peace (Lytle 2006, 223). In a 1968 speech given the day after Martin Luther King was assassinated, the Democratic Party’s frontrunner in the presidential race, Robert F. “Bobby” Kennedy, spoke of the need to move beyond the pain of the moment: “…this much is clear; violence breeds violence, repression brings retaliation and only a cleansing of our whole society can remove this sickness from our soul.” Two months later, Kennedy would be assassinated as well, at a victory rally after winning the California primary.

Violent clashes between police and protestors outside the 1968 Democratic convention in Chicago left hundreds injured. The 1969 Stonewall Riots in Greenwich Village occurred when gays, who had long tolerated the inevitable police raids of homosexual clubs, decided that they were not going to go peacefully this time. That same year, seventy-nine Native Americans occupied the island of Alcatraz near San Francisco, where the federal prison (built on land they considered sacred) had been closed. They demanded rights to the land and the funds to set up a cultural center and university, but were eventually forced from the island when the government cut off the supply of power and water. The event was, however, a catalyst for the “Red Power” movement of the 1970s (KQED 2002).

Flash program with voiceover and archival film about the influence of social drugs including LSD and marijuana. Film clips are shown of artists of the era, including Jethro Tull, the Rolling Stones, John Lennon, the Beatles, Jefferson Airplane, and Janis Joplin.

The violence encouraged many young people to simply escape—mentally, if not literally. Music and drugs, often in combination, played a key role in the counterculture. Folk musicians like Pete Seeger, Bob Dylan, and Joan Baez spoke out against the war and established society. Psychedelic music, heavily influenced by British bands like the Beatles, encouraged expanding the mind through drugs and mysticism. The music festival at Woodstock, in 1969, was for many the quintessential event of the 1960s, when nearly 400,000 people—six times the anticipated attendance, but only a fraction of those who later claimed to have been there—spent three days on a muddy farm in upstate New York, celebrating free love and plentiful drugs to the sound of electric guitars (Patterson 1997, 710). (Click on the thumbnail Psychedelic Sixties.)

As the nation entered the 1970s, however, most of those who had accepted counterculture icon Timothy Leary’s advice to “turn on, tune in, drop out” were gradually pulled back into the mainstream culture. By 1980, the vast majority had steady jobs, mortgages, and families to support—much the same as their parents’ generation.

Module 4: The Cold War

The Vietnam War had a lasting impact on the nation. It was the first war that the United States had “lost” and that did not sit well with many Americans. Our withdrawal from Vietnam also seemed to signal that we were losing the battle against Communism, although it became apparent by the end of the 1980s that Soviet might was largely an illusion.

I. Nixon and Watergate

II. Feminism and the ERA

III. Oil and Turmoil

IV. The Reagan Revolution

I. Nixon and Watergate

Beginning with FDR and the New Deal, the balance of power in government tilted increasingly toward the executive branch. The Cold War and growing government secrecy meant that much of what the government was doing was hidden from the average citizen. Covert operations were used increasingly in the 1950s and 1960s, due in part to the realization that any open actions against the USSR or China could push us toward a nuclear confrontation that both sides had a great interest in avoiding. The entire conduct of the Vietnam War was a tacit recognition of this fact: although we were clearly engaged in a war, and most people called it a war, a state of war was never declared.

Many historians and political analysts have labeled this shift of power toward the executive as the “imperial presidency”—a tendency to avoid consultation with Congress, employ secret tactics, ignore constitutional restrictions, and claim executive privilege. For several decades, this term was most closely associated with the administration of Richard M. Nixon.

Flash program with voiceover and archival film of Robert Kennedy delivering the news that Martin Luther King was assassinated, the resulting riots, and R. Kennedy’s assassination. Promotional trailers for the 1968 election are shown.

Nixon rose to national political prominence in the midst of McCarthyism, as a prominent member of the House Un-American Activities Committee (HUAC). He served as vice-president during Eisenhower’s two terms and then lost the 1960 election by a very narrow margin to John F. Kennedy. Many believed that his loss was due to the fact that, unlike Kennedy, he was not telegenic, coming across as nervous and sweaty during the first televised presidential debate. Another factor in Kennedy’s favor was that the country seemed to be looking for change in 1960 and Nixon was pretty firmly attached to the politics of the past (Lens and Zinn 2003, 427–8). (Click on the thumbnail 1968 Election.)

In 1968, however, the country was in the midst of one of the most tumultuous years in its memory and many voters were thinking of the past with a great deal of nostalgia. The assassination of the Democratic frontrunner, Bobby Kennedy, just prior to the Democratic national convention resulted in disarray in the Democratic Party, and the violence of the past few years had pushed many Americans to embrace a more conservative perspective. Kennedy entered the Democratic primaries late and, a few weeks after he joined the race, Johnson announced his withdrawal from the campaign. It is hard to say whether Robert Kennedy would have won the nomination or election if he had survived, but his death was a blow to many liberals who saw him as the last hope for a badly divided nation (Patterson 1997, 693).

The eventual Democratic nominee was Johnson’s vice-president, Hubert Humphrey, who was not exactly appealing to many younger party members, given his support for Johnson’s war policies. Southern Democrats were also increasingly disaffected with the national party, and Alabama governor George Wallace mounted a strong third-party challenge, winning several Southern states on a platform that appealed to the staunchly conservative—and, frequently, racist—views of Southern white voters.

During the campaign, Nixon kept his statements on the war purposefully ambiguous, vowing to end the war in Vietnam in a way that would “win the peace” and secure “peace with honor.” To some, that implied a speedy end to the war. To others, it stated that we would only end the war when we were assured of victory. Nixon would attribute his victory to the “silent majority” of Americans, who were disenchanted with the liberal policies of the 1960s, but did not join in protest movements or actively participate in political affairs (Patterson 1997, 751).

In 1969, Nixon put forward the statement that would become known as the Nixon Doctrine. The United States would continue to offer assistance to Asian countries fighting against Communist insurgencies, but would no longer commit troops to those battles. His goal of “Vietnamization” stressed that the responsibility for actual combat should rest with the countries in Asia. The fact that he was simultaneously expanding the war into Cambodia was not immediately known and it was not until 1970 that he ordered ground troops into the country. Protests were staged at campuses around the country, including Kent State University in Ohio, where four students were shot and killed by the National Guard (Patterson 1997, 770–1). Congress reacted to the president’s actions by barring the use of U.S. ground troops in Cambodia and, in 1973, passing the War Powers Act to limit the president’s ability to wage undeclared war. These measures did not, however, expressly forbid bombing, so the aerial attacks on Cambodia continued well into 1973.

Flash program with voiceover and archival film about the 1969 Apollo 11 flight to the moon and its landing.

There were, however, several major policy successes during the Nixon administration. The goal of putting a man on the moon, set during the Kennedy years, was finally achieved. (Click on the thumbnail Man on the Moon.) In matters of foreign policy, Nixon relied heavily on the advice of Henry Kissinger, who served first as his national security advisor and then as secretary of state beginning in 1973. Kissinger’s view of international relations moved beyond the ideas of containment in which the U.S. and USSR were the only key players, to establish the policy of détente, in which the two sides worked to reduce the level of tension. He urged Nixon to establish diplomatic relations with China and was a key player in negotiating the Strategic Arms Limitation Treaty (SALT I) between the U.S. and USSR in 1972.

Flash program with voiceover and archival film about the Watergate break-in, the following investigation, and events leading to Nixon’s impeachment, his resignation as President, and his pardon by President Ford.

Although the country was suffering from inflation in 1972, the situation seemed to be improving and Nixon’s Vietnamization policy meant that more U.S. troops were returning home. Nixon would win reelection by a comfortable margin that November, but he and his political advisors apparently wanted a bit of extra insurance. In June 1972, they authorized five men to break into the headquarters of the Democratic National Committee, which was located in the Watergate complex. The burglars, several of whom were tied to a secret White House unit nicknamed the “plumbers” (because its job was to plug leaks of information), were caught. The incident did not receive much press coverage at the time, but two reporters, Carl Bernstein and Bob Woodward of the Washington Post, received a tip that led them to start digging a bit deeper (Patterson 1997, 773). The story they uncovered was one of political sabotage, unauthorized wiretapping, a cover-up that led all the way to the Oval Office, and a concerted effort to obstruct an official investigation. (Click on the thumbnail Watergate.)

Nixon denied involvement and claimed executive privilege when he was asked to produce tape recordings he had made of conversations in the White House. In July 1974, he was ordered by the Supreme Court to release the tapes. By the end of July, the House Judiciary Committee had approved three articles of impeachment—obstructing justice, abuse of power, and failure to comply with the Judiciary Committee’s subpoenas (Patterson 1997, 794). The committee had assembled a large team of lawyers for the impeachment inquiry, including a twenty-seven-year-old Hillary Rodham (Zelizer 2004, 643). Years later, as First Lady, she would see her husband, President Bill Clinton, impeached on articles drafted by the same committee.

Nixon was advised by several close colleagues that the evidence against him was strong and he would most likely be convicted if impeached. Nixon announced his resignation on August 9, prior to the scheduled vote. His first vice-president, Spiro Agnew, had resigned in October 1973 due to pending corruption charges, so it was Gerald Ford, who was not even on the presidential ticket in 1972, who became president (Patterson 1997, 792). One of Ford’s first official acts was to pardon Nixon, arguing that the country would be further divided if the former president faced criminal charges.

For liberals, Watergate became synonymous with the abuse of presidential authority and violation of public trust. For conservative supporters of Nixon and many members of his administration, including Dick Cheney, it was an indication that presidential authority was gradually being eroded. To the general public, the hearings were evidence that the entire national government was too wrapped up in Beltway politics and intrigue. Although the next two presidential elections would result in leaders from different parties, with dramatically different views on how to run the nation, they would have one thing in common—neither could in any way be considered a Washington insider.

II. Feminism and the ERA

Flash program with voiceover and archival film showing discussions by proponents and opponents of the women’s liberation movement, including George Gilder, Betty Friedan, Ann Richards, and Phyllis Schlafly.

In 1972, Congress passed the Equal Rights Amendment, with Nixon’s endorsement. The amendment, which was first proposed in 1923 by a young Alice Paul, was then sent out to the states for ratification. This was seen by many as the culmination of a decade of slow but steady progress toward equality. Early in his presidency, Kennedy had established the Commission on the Status of Women, which reported in 1963 that women were systematically underpaid and faced pervasive discrimination in employment. Discrimination on the basis of sex was one of the elements that had, after much debate, been included in the 1964 Civil Rights Act. Gender discrimination was, however, one element that the Equal Employment Opportunity Commission (EEOC) had failed to enforce. The National Organization for Women (NOW) was founded in 1966, largely as a reaction to that failure. The first head of the organization, Betty Friedan, had written a widely read book several years earlier called The Feminine Mystique, in which she argued that women could achieve satisfaction outside of their roles as wives and mothers (Evans 1997, 237). (Click on the thumbnail Women’s Liberation.)

The main goal of NOW was to end gender discrimination in the workplace and achieve legal equality for women. Another, more philosophical, side of feminism also emerged during the late 1960s. Based on small consciousness-raising groups that had, in many cases, developed as an offshoot of women’s activities in the civil rights and peace movements, these women frequently argued that the entire culture was based on the premise of patriarchy. Many urged women to separate themselves from a society that glorified violence and objectified women.

While the two sides of the feminist movement were often at odds, they did agree on many issues and were able to work together toward several major goals. One was the Women’s Strike for Equality in 1970, where women gathered to mark the fiftieth anniversary of gaining the right to vote and to urge, among other things, congressional action toward the passage of the ERA and less restrictive state laws on abortion (Evans 1997, 288). This latter goal would be met in 1973, when the Supreme Court issued its landmark decision in the case of Roe v. Wade.

At the center of the 1970 celebrations was the grande dame of the women’s movement, Alice Paul, who, at age 85, was still quite active in the work toward the ERA. When Congress passed the amendment in 1972, many of its proponents assumed that the battle was won. Like their grandmothers, they had marched in protest, picketed, and finally convinced Congress to take action. Alice Paul, however, was very worried. She knew all too well that the ratification of the suffrage amendment in 1920 had been due only in part to the parades and picketing. The suffrage amendment would never have been ratified if it had not been for the support of women across the political spectrum, including many relatively conservative women’s organizations. The network that had been built up over decades had frayed over time and the gap between conservative women and their liberal sisters was wider than it had been in the 1910s. In addition, unlike the suffrage amendment, the ERA carried a seven-year time limit (later extended to ten years) for ratification (Sparks 2000, 201–2).

Alice Paul died in 1977, so she did not live to see the most heated days of the ratification battle, though they probably would not have surprised her. As had been the case in 1920, conservative opponents of the ERA flocked to a few key battlegrounds in an effort to defeat the amendment. By 1982, thirty-five states had ratified the amendment, and three more were needed (Felder 2003, 272).

The election of Ronald Reagan in 1980 gave the conservative movement a morale boost, and Reagan made his opposition to both the ERA and Roe v. Wade well known. Phyllis Schlafly’s Eagle Forum and STOP ERA campaigned relentlessly in the remaining unratified states. Newspaper editorials began to appear raising doubts about the wisdom of an amendment that would put men and women on equal legal footing. Some noted that men might not be required to pay alimony when they left their wives. Others noted that young women would also be required to register for the Selective Service and, if the draft were ever reinstated, would be equally subject to conscription. Still others claimed that housewives would be forced to work outside the home and that unisex restrooms would be required by law. The federal government, it was argued, would require states to pay for abortions, rape would no longer be a crime, and children would be raised in federally run day-care centers instead of in private homes (Evans 1997, 304–5). Most of these assertions had no credible legal basis, but they were very effective in shifting support away from the ERA.

Some feminist supporters of the ERA also hurt their own cause without realizing it. They would appear en masse in rural and more conservative areas, often matching the stereotype the residents had of feminists—no bra, no make-up, and little apparent tolerance for men or traditional families. Women’s rights advocates within these communities tried to caution the national office of groups like NOW that these tactics were unlikely to work in their area and asked that local feminists be allowed to determine the best tactics for their state or community. In most cases, their pleas were ignored (Evans 1997, 209). When the time limit expired, the amendment was still three states short of the required number for ratification. Feminists went back to square one, trying to convince Congress to again pass the ERA, just as Alice Paul and her colleagues had done every year for five decades. As of 2009, they have yet to succeed.

Women’s rights did, however, expand during this era, especially in the areas of employment and education. Title IX of the Education Amendments of 1972 prohibits sex discrimination by any school or program that receives federal funds. Athletic programs for women expanded rapidly as a result. Military academies began to accept female cadets. Women began to pursue a wider array of employment, and the gap between the numbers of men and women in colleges and in the professional fields grew narrower.

III. Oil and Turmoil

The fall of South Vietnam in 1975 and rising economic problems plagued Gerald Ford’s short time in office. Inflation was rising sharply and the country faced the worst recession it had seen since the Great Depression. The 1973 oil embargo imposed by the Arab members of the Organization of Petroleum Exporting Countries (OPEC), which resulted in drastic increases in oil prices, worsened the overall economic outlook. Reluctant to impose wage and price controls, Ford instead launched a voluntary campaign that he called Whip Inflation Now (WIN). It did not win public support.

Ford narrowly survived a primary challenge by former California governor Ronald Reagan, but lost the general election to another former governor, Jimmy Carter of Georgia, who campaigned as a Washington outsider untainted by the capital’s pervasive corruption. Carter’s campaign focused on issues of trust and openness, partly because many Americans were troubled by the fact that Ford had pardoned Nixon. After the election, Carter found himself facing the same economic challenges that had confronted Ford. Efforts to steer the economy back on course were unsuccessful, and inflation tripled during his administration. Unemployment rose to around 8 percent and interest rates soared (Zinn 2003, 570).

Flash program with voiceover and film clips of popular movies, TV shows, and music of the 1970s. It includes clips of The Brady Bunch, Hawaii Five-O, All in the Family, The Jeffersons, Happy Days, The Godfather, One Flew over the Cuckoo's Nest, Jaws, Rocky, and Sesame Street.

The public nostalgia for simpler and more prosperous times was reflected in popular culture. Happy Days, a show that idealized the late 1950s, was the number one series in 1977 and remained near the top of the ratings for several years (Brooks and Marsh 1999, 421). The 1950s fear of juvenile delinquency was played for laughs as the character of leather-clad motorcycle tough guy Arthur Fonzarelli, aka “the Fonz,” was shown to be both loveable and wise beneath his rough exterior. (Click on the thumbnail Those 70s Shows.)

Human rights were a primary focus of Carter’s foreign policy, and he pushed for improvements in the USSR, South Africa, and Latin America. He was also instrumental in mediating peace talks between the leaders of Egypt and Israel at Camp David in 1978 (the resulting agreements became known as the Camp David Accords). A treaty with the USSR to limit nuclear weapons stockpiles, SALT II, had been signed by both parties (although not yet ratified by the U.S. Senate) when the Soviets invaded Afghanistan in 1979. Carter asked the Senate to hold off on considering the treaty and announced a U.S. boycott of the Olympic Games scheduled for Moscow in 1980.

Flash program with voiceover and archival film footage of various discussions about President Carter’s attempt to rescue the 52 hostages taken during the 1979 Iranian seizure of the U.S. embassy in Tehran, and the hostages’ later release.

In early 1979, Islamic revolutionaries overthrew the government of the Shah of Iran, who fled the country. That fall, when the Shah was admitted to the United States for cancer treatment, militants attacked the U.S. embassy in Tehran and took 52 Americans hostage, demanding that the Shah be handed over in exchange for their release. Carter responded by freezing Iranian assets and, after several months, backed a rescue operation that ultimately failed. He finally agreed to release several billion dollars in frozen Iranian assets, money that Iran badly needed now that it was at war with Iraq. The hostages were not freed, however, until after the 1980 election was over. Ronald Reagan defeated Carter in a landslide with 489 electoral votes (Foner and Garraty 1991, 348). The Iranians finally released the hostages on the day Reagan was inaugurated, after 444 days of captivity. (Click on the thumbnail Hostage Crisis.)

IV. The Reagan Revolution

Several changes in the U.S. political landscape converged to bring about Reagan’s dramatic victory. One major change was that the shaky coalition of Northern liberals and Southern conservatives that had made up the Democratic Party for the past several decades finally crumbled. White Southern conservatives had increasingly become Democrats in name only, breaking with the party in national elections and supporting third-party challenges like the Dixiecrats and George Wallace’s American Independent Party. The only group in the South that continued to provide reliable support for Democratic candidates was African-American voters, and their voting strength was not enough to guarantee the Democratic presidential candidate a win in the region. Population trends had also shifted, which meant that as the South moved toward the Republican Party, it was also gaining electoral clout.

A general resurgence of conservative political beliefs was also underway, and Reagan’s core principles appealed to that constituency. Religious conservatives, including members of Jerry Falwell’s Moral Majority, were drawn to the fact that Reagan opposed legalized abortion and the Equal Rights Amendment and supported a constitutional amendment to allow school prayer. Fiscal conservatives liked his view (summed up in Reagan’s first inaugural address) that “Government is not the solution to our problem; government is the problem.” They also liked his promises to reduce government regulation of business and his policy of supply-side economics.

Flash program with footage of Ronald Reagan’s speech at the Republican National Convention in 1980, his address on the space shuttle Challenger disaster, and his Brandenburg Gate address, which included his famous exhortation, “Mr. Gorbachev, tear down this wall!”

An additional reason for Reagan’s victory was that he was, simply put, a candidate who was made for television. He had, in fact, spent years as a television host, on General Electric Theater and later Death Valley Days, and Americans were comfortable with him in their living rooms. Reagan projected a warm, folksy image that appealed to voters, and even those who disagreed with many of his policies admitted that he came across as a likeable father figure. After the tumultuous 60s and 70s, he was exactly what many voters wanted: a familiar face with a steady voice who could help to simplify a complex world. (Click on the thumbnail The Great Communicator.)

Flash program with footage of popular TV shows, movies, and musical artists of the 1980s.

Despite cuts in government spending for social programs, military spending rose dramatically under Reagan. On the surface, the economy seemed healthy. Inflation and unemployment were both significantly lower by 1984, which helped Reagan win reelection with over 500 electoral votes (Foner and Garraty 1991, 348). The gap between the rich and the poor, however, grew steadily, and the impact was worsened by the fact that budgets for many social programs for the poor had been slashed. The apparent health of the economy also masked the fact that the nation was, in essence, paying its monthly bills by credit card. Madonna’s hit song and music video “Material Girl” (1985) satirized the era’s focus on wealth and spending. (Click on the thumbnail The 80s.) The increased revenue that supply-side economists had promised would follow the tax cuts never materialized, and the government ran up major deficits in the 1980s. During the eight years of the Reagan administration, the country went from being a creditor nation to being the world’s largest debtor (Cannon 2000, 235–6).

After several administrations that had moved away from Cold War foreign policies, Reagan sharply reversed course and reinstated a policy of containment. He referred to the USSR as the “evil empire,” a label that resonated with a generation of voters who had flocked to the theatres in record numbers to watch George Lucas’s Star Wars trilogy. The Reagan administration’s ambitious Strategic Defense Initiative (SDI), dubbed “Star Wars,” was announced at almost the same time.

Neither speech met with approval from the Soviet leader, Yuri Andropov. Just as the Galactic Empire of the Star Wars films had been no match for the Rebel Alliance, however, the USSR began to show signs of weakening. Andropov was sixty-eight when he took over in 1982 and died fifteen months later; his successor, seventy-two-year-old Konstantin Chernenko, died in just over a year. It was clear to the Soviets that it was time to pass the torch to a younger generation of leaders.

In 1985, a fifty-four-year-old reformer, Mikhail Gorbachev, became the leader of the Soviet Union, setting forth his new ideas of perestroika and glasnost. The Soviet economy had been steadily declining, due in part to massive military spending, and Gorbachev knew that the nation would either have to become a full participant in the global economy or collapse. The USSR also relaxed its control over Eastern European nations, announcing that it would no longer require those countries to maintain Communist regimes. Reagan worked in concert with Gorbachev, but in his famous 1987 speech at the Brandenburg Gate in Berlin, he implied that talk was cheap and urged Gorbachev to show his commitment to reform by tearing down the Berlin Wall (Talbott 2008, 251).

Flash program with footage of various aspects of the Iran Contra cover-up, including the public reaction to the hearings, Oliver North’s testimony, and President Reagan’s statements about the affair.

The Iran-Contra Affair, which first came to light in 1986, revealed a more clandestine side of the Reagan presidency, one his critics had long denounced. As the details of the complicated plot emerged, Reagan simply claimed that he did not know about the illegal activity, and his vice president, George Bush, also stated that he had been “out of the loop” (Cannon 2000, 657–8). Despite the fact that this showed a president who was either untruthful or woefully out of control of his own foreign policy, the scandal did not have a lasting effect on Reagan’s legacy. (Click on the thumbnail Iran-Contra.)

Reagan would easily have won a third term in office, a fact that probably made Republicans regret their insistence, in 1947, on the constitutional amendment (ratified in 1951 as the Twenty-second Amendment) limiting presidents to two elected terms. Vice President Bush lacked Reagan’s charisma, and he had, at best, lukewarm support from religious conservatives who suspected that he secretly supported abortion rights and other liberal causes, but he was elected by a fairly comfortable margin in 1988. Although Reagan would receive the credit, the Soviet empire collapsed on Bush’s watch. During 1989, a string of Eastern European states threw off their Communist governments, and in November, East and West Germans united to begin tearing down the Berlin Wall.

Module 5: New Global Threats and Challenges

The Cold War had shaped the cultural, economic, and political landscape of the United States for more than half a century, and the end of the conflict was the subject of a wide array of scholarly analyses. Some writers believed the change would be a great advance toward a more peaceful world and would provide the opportunity to focus on domestic needs that had been neglected due to military spending. Others, however, were less certain—noting that the international system had existed for some time in a fairly stable, bipolar balance. The world was entering an uncertain period of transition and might, therefore, face an even greater likelihood of war.

I. The End of the Cold War

II. A Changing Nation

III. Diversity in the Workplace

I. The End of the Cold War

Flash program with voiceover and film showing the debates between Michael Dukakis and George Herbert Walker Bush and between Lloyd Bentsen and Dan Quayle. It includes examples of the negative TV campaign ads employed and concludes with a Saturday Night Live segment satirizing the debates.

The end of the Cold War was, in some ways, anticlimactic. Ronald Reagan, the quintessential Cold Warrior, was no longer in the White House in 1989, when a series of rapid and sweeping changes occurred within the Communist bloc, although many of these changes began during his two terms in office. Reagan’s successor, George Herbert Walker Bush, continued many of his policies, but he lacked Reagan’s eloquence and public appeal. Bush’s rather easy defeat of Massachusetts Governor Michael Dukakis in the 1988 election was due largely to the weakness of the Dukakis campaign. The Democratic nominee had fought a divisive primary battle. The presumed frontrunner, Senator Gary Hart, left the race due to charges of adultery with a former model, reentered the race a few months later, and then dropped out again when it became clear that he had been rather badly hurt by the negative publicity. Dukakis also made several key missteps during the campaign and these were highlighted in negative ads used by the Bush campaign. Bush was able to paint Dukakis as weak, both on law enforcement and national security (Campbell 2008, 69–70). (Click on the thumbnail The 1988 Election.)

Flash program showing a documentary about the end of East Germany, including people fleeing to West Germany, demonstrations by East Germans against the government, the confused official announcement that East Germans would be allowed to leave the country, and the fall of the Berlin Wall.

As a former ambassador to China and former head of the Central Intelligence Agency, Bush had foreign-policy credentials that were highlighted in the campaign. Despite a relaxation of tensions with the Soviet Union, the Communist Party remained in control there and the Cold War was still a concern for many voters. The reformist Mikhail Gorbachev had come to power in 1985 and attempted to halt the economic decline that had been worsening in the USSR for over a decade. His twin policies of glasnost (social and political openness) and perestroika (economic restructuring) heralded a more democratic government and the beginnings of capitalism in the Soviet Union. Gorbachev’s decision not to continue propping up Communist allies in Eastern Europe resulted in the rapid fall of Communist governments in Poland, Czechoslovakia, East Germany, Bulgaria, and Romania. By the end of 1989, the Berlin Wall that had divided the city since 1961 was down, and the two German states were officially reunited in 1990. (Click on the thumbnail The Berlin Wall.) However, any hopes that China might quickly follow suit were dashed when the Chinese government crushed the country’s pro-democracy movement in a brutal attack at Tiananmen Square in June of 1989 (Hastedt 2006, 69–70).

Another key focus of the Bush campaign in the 1988 election was the war on drugs. As the USSR was crumbling, Bush authorized the first major use of American forces since World War II that was not related to the Cold War: Operation Just Cause. The Bush administration suspected that Panamanian dictator Manuel Noriega was facilitating the transport of drugs from South America to the U.S. and Europe, and this was one of the justifications given for the December 1989 invasion. U.S. troops removed Noriega from power, capturing him after a dramatic standoff at the Vatican’s embassy in Panama City, where he had sought asylum. Both the invasion and the apprehension of a foreign leader were condemned by the United Nations, as was Bush’s decision that Noriega be tried in the United States for his role in drug trafficking (Hastedt 2006, 69).

Flash program with voiceover and video clips depicting the U.S. invasion of Panama in 1989 and the hunt for Noriega and Saddam Hussein’s Iraqi invasion of Kuwait that resulted in the Persian Gulf War of 1991.

In the summer of 1990, Iraqi leader Saddam Hussein’s forces invaded Kuwait. Bush persuaded the United Nations to authorize a U.S.-led coalition to remove Iraqi troops from Kuwaiti territory; the campaign opened in January 1991 with heavy bombing of the occupying forces. The Gulf War was quick and decisive, although some critics felt that the troops should have continued on to Baghdad in order to remove Saddam Hussein. Iraq was instead made subject to a no-fly zone, and a United Nations commission was established to oversee the disarmament of the nation. Iraq’s failure to comply with United Nations resolutions resulted in additional U.S. bombing attacks on the country toward the end of the decade (Hastedt 2006, 71). (Click on the thumbnail Just Cause & Desert Storm.)

The Gulf War earned Bush considerable public approval, but a faltering economy pushed that from the minds of voters long before the election in November of 1992. The relatively brief war had caused a spike in oil prices and, although allies eventually shared a considerable portion of the expense for that war, it also put pressure on the U.S. budget. Bush had campaigned on the slogan, “Read my lips: no new taxes,” but the economic situation led him to renege on that promise. His relationship with the Democratic-led Congress was not strong and they frequently clashed over how to strengthen the economy, reduce the national deficit, and pay down the debt that had accrued during the 1980s (Campbell 2008, 68–9).

II. A Changing Nation

With the end of the Cold War, many Americans felt that the administration should shift its focus to domestic problems. The country grew rapidly in the 1990s; the 2000 census revealed a dramatic increase of nearly 33 million people during the previous decade. Nearly one-third of the growth was attributable to a single demographic group, Hispanic Americans (U.S. Census Bureau 2002, 75–79). This was due in part to higher birth rates within existing Hispanic communities, but also to increased immigration. By 1990, fewer than ten percent of immigrants were European, largely because immigration policy no longer granted the preferential treatment that immigrants from that region had once received. About forty percent of the recent immigrants came from various parts of Asia, and a slightly larger percentage were of Hispanic origin (U.S. Census Bureau 2002, 85).

Flash program with voiceover and film about the problems caused by immigration, both legal and illegal. It includes statistics about the part of the U.S. population composed of immigrants, the various immigration laws, and views pro and con regarding immigration.

The Immigration Reform and Control Act of 1986 attempted to reduce the flow of illegal immigrants into the country by fining employers who hired them and by tightening control of the border. The act also offered amnesty to any who could prove that they had lived in the United States for the prior four years (Daniels and Graham 2001, 52). A decade later, the Illegal Immigration Reform and Immigrant Responsibility Act gave the government even more power to deport those trying to enter the country without permission (Daniels and Graham 2001, 60–1). Despite these efforts, illegal immigration continued to be a major concern, especially in the handful of states, like California, Texas, and Florida, where immigrants were most likely to settle. Anti-immigrant sentiment increased dramatically in California, which was home to about half of the illegal immigrants in the nation, prompting mass protests and the eventual passage of Proposition 187 in 1994 (Daniels and Graham 2001, 52). In 1998, California voters enacted Proposition 227, which eliminated most bilingual education programs in the state in favor of English-only education (Schmid 2001, 159). (Click on the thumbnail America for Americans.)

Many of those who favored tightening the immigration laws (both in terms of limiting legal immigration and preventing illegal entry) and ending bilingual instruction were concerned that immigration was threatening to fundamentally alter the culture of the United States (Daniels and Graham 2001, 90). Many newly arrived immigrants wanted to preserve traditions of their own culture and thus resisted the notion of the “melting pot” from which all immigrants of the past had (theoretically) emerged simply as “Americans” (Schmid 2001, 159). The many Americans who favored multiculturalism felt that America should aim instead to be a mosaic (or, some suggested, a “tossed salad”) in which various cultures could coexist harmoniously without altering their cultural heritage or beliefs. Critics, however, contended that cultural pluralism only increases conflict and does not give new immigrants the tools (especially in terms of language skills) necessary to succeed financially in the United States.

The growing number of immigrants increased opposition to affirmative-action programs because such programs, initially designed to remedy past discrimination against African Americans, also covered women and other minority groups. Some white males argued that these programs benefited other groups at their expense, resulting in reverse discrimination. In the 1978 case Regents of the University of California v. Bakke, the Supreme Court held that the rigid racial quota system that had kept Bakke, a white applicant, out of the university’s medical school was unconstitutional. The Court stressed, however, that while setting specific racial quotas is unconstitutional, institutions can consider race as one of several factors when making admissions decisions. The Supreme Court reaffirmed that position in 2003, in the case of Grutter v. Bollinger, but noted that racial preferences would eventually need to be phased out (Pohlman 2004, 35).

Another demographic shift stemmed from the post-war baby boom. As the “baby boomers” approached retirement age and life expectancy continued to rise, the solvency of the Social Security and Medicare programs became less certain, and many younger Americans began to question whether the system they were paying into would still be able to pay benefits when their own generation retired.

III. Diversity in the Workplace

The gender issues that had emerged in the late 1970s and 1980s were increasingly at the forefront of domestic policy debates, especially those that pertained to working women. By the end of the century, women made up nearly half of the total labor force. Despite this, they were generally paid less for work requiring similar degrees of skill or education, and there were fewer opportunities for advancement (National Organization for Women 2007). In the 1990s, the National Organization for Women and other women’s advocacy groups pushed the government to enact laws to ensure pay equity and shatter the “glass ceiling,” but these efforts were largely unsuccessful. More women in the workforce also meant a greater need for high-quality, affordable childcare and for parental leave policies.

A bill to provide up to twelve weeks of unpaid family leave per year was vetoed by President Bush in 1990, but a similar measure was finally signed into law by President Clinton in 1993. Although the law was hailed as a step in the right direction, giving workers the right to take short periods of leave for childbirth, adoption, or family illness without risking the loss of their jobs, it did not apply to businesses with fewer than 50 workers. By comparison, a 2004 Harvard study of 168 countries found that 163 guaranteed paid parental leave for mothers and 45 provided similar benefits for fathers (The Project on Global Working Families 2004, 7). Because the leave is unpaid, most families opt for the time to be taken by the lower-paid parent, usually the mother. This tends to perpetuate the view that employers cannot rely as fully on female workers and, ultimately, it widens the gender gap in wages.

The growing number of women in the workforce also meant increased pressure for changes in the working environment. Women in traditionally male fields had long been forced to simply “grin and bear it” when male co-workers made suggestive comments or displayed sexually explicit material in the workplace. The federal government began to establish policies against sexual harassment in the 1980s, based on provisions of the Civil Rights Act of 1964, and the courts began to support women who sued companies that were in violation of the law. The issue received considerable publicity in 1991, when Clarence Thomas was nominated by President Bush for a seat on the Supreme Court. Anita Hill, an attorney who had previously worked for Thomas, testified that he had sexually harassed her on several occasions. Despite her testimony, the Senate voted to confirm his appointment (Skaine 1996, 202–3).

Module 6: New Global Threats and Challenges

Although the Cold War was over, the weapons cache of the former Soviet Union remained, and there was some uncertainty about exactly where all of the stockpile might be. The threat of major nuclear war receded, but Americans grew increasingly concerned about smaller attacks from underdeveloped nations. In the last decade of the twentieth century, the world was increasingly divided between developed industrial nations, with high levels of literacy and standards of living, and underdeveloped nations, which were marked by poverty, disease, and illiteracy. The spread of mass communications to even these less developed countries (LDCs) meant that the poorer nations of the world were increasingly aware of the disparity.

I. The Clinton Administration

II. Globalization

III. The Internet Revolution

I. The Clinton Administration

In 1992, the Democratic presidential nominee, Arkansas Governor Bill Clinton, stressed the economy and successfully portrayed President Bush as out of touch with the average voter and consumer. Clinton also avoided the pitfalls of the Democratic candidates of the 1980s by positioning himself as a moderate “New Democrat” who embraced many centrist positions, including a balanced budget. A strong third-party challenge by Texas businessman Ross Perot, who drew nearly a fifth of the popular vote, complicated the campaign, but it was difficult to determine whether he ultimately took more votes from Bush or from Clinton.

Clinton entered the White House with the advantage of a Democratic Congress, but immediately set out to tackle one controversial goal, ending the ban on gays and lesbians in the armed forces, and one extremely difficult goal, reforming health care. The ballooning cost of health care added to the economic challenges of many Americans, especially the millions without employer-sponsored insurance. Those lucky enough to have coverage often found their employers switching to health maintenance organizations (HMOs) in an effort to reduce expenditures. HMO plans were somewhat effective in lowering insurance costs, but were widely criticized for limiting patient autonomy and lowering the quality of care. Although health-care reform was widely seen as an issue that needed to be addressed, the public was skeptical of Clinton’s proposal, which political opponents claimed was socialized medicine in disguise.

In response to Clinton’s initiative to end the ban on gays and lesbians in the armed forces, the military shifted to a policy of “Don’t Ask, Don’t Tell,” under which service members were no longer required to reveal their sexual orientation, but the policy was viewed as too little change by those on the left and too much by those on the right. The public reacted in the 1994 midterm elections by voting in a Republican Congress, the first time since 1953 that the House of Representatives had not been in Democratic hands (Campbell 2008, 114). Republicans, led by Newt Gingrich, had campaigned on the idea of a “Contract with America,” promising lower taxes, a reduction in the size and scope of government, job creation, tort reform, and the abolishment of welfare (Bader 1996, 13–14).

Clinton also favored some degree of welfare reform, so this was one area where action was possible during the remainder of his first term in office. The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 set a five-year lifetime cap on welfare payments and included work requirements for most recipients (U.S. Department of Health and Human Services, 2005). The economy was also growing rapidly, which allowed Clinton to eliminate the annual deficits that had swollen during the Reagan years and to balance the federal budget. These centrist accomplishments served Clinton well in the 1996 election, and he easily won a second term against his Republican challenger, Senator Bob Dole.

The primary focus of the Clinton administration was domestic, but the emergence of the United States as the sole superpower and the ongoing chaos in the former USSR forced the U.S. to assume a leadership role in efforts to maintain global peace. Clinton had inherited the U.S. involvement in Somalia from the Bush administration, and the public outcry when American soldiers were killed in a 1993 battle with Somali warlords made Clinton reluctant to commit U.S. forces to combat. Genocidal violence erupted repeatedly in the 1990s, first in Rwanda in 1994, where the U.S. and the United Nations took no concerted action despite a death toll of at least 800,000 (Harris 2006, 128). In 1998, evidence emerged of ethnic cleansing in the Kosovo province, carried out with the authorization of Yugoslav leader Slobodan Milosevic. This time, the U.S. was heavily involved, leading NATO forces in a bombing campaign against the Federal Republic of Yugoslavia, which eventually dissolved into separate states along ethnic lines (Harris 2006, 348).

Terrorism emerged as a key concern during the Clinton administration. The 1993 car-bombing of the World Trade Center, organized by Islamic extremists, resulted in the death of six people and more than a thousand injuries (9/11 Commission 2004, 88). The goal of the attack had been to level the first tower and cause it to crash into the second. When a massive explosion demolished the Murrah Federal Office Building in Oklahoma City in 1995, killing 149 adults and 19 children, the first assumption was that this attack was also the work of Islamic terrorists, but two American citizens, Timothy McVeigh and Terry Nichols, were eventually convicted of planning and carrying out the attack (Harris 2006, 179–80).

Flash program with voiceover and film about the 1992 presidential election, featuring Bill Clinton’s role. It includes his saxophone performance; the Gennifer Flowers affair, which was examined on the Arsenio Hall TV show; Clinton’s impeachment; and the Monica Lewinsky affair.

Clinton’s relations with Congress were combative, and Republican legislators launched a variety of investigations and probes into alleged wrongdoing by the president as well as by First Lady Hillary Clinton and several other Clinton administration officials. In 1998, one of the investigations finally yielded results. Clinton was forced to admit an extramarital affair with White House intern Monica Lewinsky, after initially denying it under oath. Congress moved forward with impeachment in late 1998, on charges of lying under oath and obstructing justice. In February 1999, Clinton was acquitted by the Senate on both charges (Harris 2006, 346–47). (Click on the thumbnail Clinton vs. the Right Wing.)

II. Globalization

As the twentieth century drew to a close, the business and economic buzzword was globalization, a term that generally denotes the movement toward a world in which knowledge, goods and services, money, and even people flow more freely across international borders. Although globalization has been occurring for centuries, the pace of change accelerated rapidly during the last half of the 1900s, due to advances in transportation and communication. There are both positive and negative effects of globalization, and the United States experienced each variety in the 1990s.

It is increasingly difficult to analyze national economies separately from the global economy. This is especially true for the United States, as international trade accounts for over half of the nation’s economic production (Scheve and Slaughter 2007). The efforts during the past few decades to integrate the world’s economies are based on the theory that a free global market, with limited trade barriers, will allow for greater competition and increased opportunities for economic development throughout the world. Organizations such as the International Monetary Fund (IMF) and the World Trade Organization (WTO) were established to manage global economic development and international trade. Smaller, regional arrangements such as the European Union (EU) and the North American Free Trade Agreement (NAFTA) are designed to integrate the economies of member states.

Although most economists argue that the impact of economic globalization is a positive one, many in the general public are less enthusiastic. In the United States, globalization has critics across the political spectrum. Critics on the left note that most of the benefits to date have accrued to the developed nations, while the developing nations have seen little growth and, in some cases, have suffered economically. The environment in these developing countries has also suffered, owing to the absence of strict environmental laws. Critics on the right argue that the true aim of globalization is the abolition of the nation-state and the eventual rise of a single world government (Germain 2000, xiv–xv).

For many American workers, however, globalization was the unseen hand that snatched their well-paying factory jobs and sent them south of the border. Major corporations laid off hundreds of thousands of workers in the 1990s, claiming that this “downsizing” was the only way to remain competitive in the global economy (Korten 1995, 216). The United States was moving toward a post-industrial economy, based mostly on services. Some of these service jobs were well paid, including management positions, and many required an extensive skill set and higher education. Other service positions required little training but were generally low paid. Many factory workers who had been downsized in the early 1990s found themselves in these lower-paid service positions, which were rarely unionized. This usually meant a significant reduction in both income and benefits.

The rapid spread of personal computers and the advent of the Internet paved the way for a major economic boom in the late 1990s, and the gains continued into the next decade: productivity grew approximately 20 percent between 2000 and 2007, due in part to the increased use of computers in the workplace (Bernstein and Mishel 2007).

Flash program with voiceover and film with excerpts from "The One Percent," and other sources that explore the income gap in the U.S. and the attitudes and opinions of those on both sides of the gap.

Wages at the bottom, however, remained stagnant, resulting in the largest gap between the economic classes since the 1920s (Scheve and Slaughter 2007). In 1990, the chief executive officer (CEO) of the average company earned $63 for each dollar earned by the average schoolteacher in America—nearly twice as much as in 1960. By 2001, however, the average CEO was earning $264 for every dollar earned by the average schoolteacher (Sklar 2002). In 2004, the richest 1 percent of households held 34.3 percent of the country’s assets—more than the amount shared by the bottom 90 percent combined. The 400 richest Americans (the Forbes 400) saw their shared personal wealth triple, from $483 billion in 1995 to $1.54 trillion in 2007 (Institute for Policy Studies 2008, 4). (Click on the thumbnail A New Gilded Age?)

Having seen little personal gain from globalization in the 1990s, many working Americans favored protectionist policies that would weaken the global economy as a whole and, likely, result in similar measures from other nations. Some political scientists and economists warned that the only way to reduce the protectionist tide would be to ensure that the benefits of globalization are more equitably distributed to workers as well as CEOs (Scheve and Slaughter 2007).

The pace of globalization is unlikely to slow in the coming decades, as many problems faced today require global solutions. Environmental issues are the most obvious, as we share a single atmosphere, and weak environmental regulations in any part of the globe have an impact on the whole. Likewise, communicable diseases can easily spread across international borders. The rapid spread of HIV/AIDS, beginning in the 1980s, illustrated the need for cooperation in developing global solutions to such epidemics.

III. The Internet Revolution

The booming economy of the late 1990s was due in large part to the rise of the Internet. Business was revolutionized, allowing workers to communicate and share documents around the globe as easily as they could with the person in the next cubicle. Proponents of the new World Wide Web applauded it as the greatest invention of the century, while detractors dismissed it as a fad.

Although it seemed to appear overnight, the Internet had been developing gradually since the Eisenhower administration. The Advanced Research Projects Agency (ARPA) was created in February 1958 in response to the recent Soviet launch of its first Sputnik satellite. One project that began in the early 1960s, known as ARPANET, aimed to link computers at several universities and defense research institutes on the West Coast. By 1970, the link extended to a handful of East Coast institutions as well, and by the end of the decade, a complex web of connections was developed (Waldrop 2008). Former Vice President Al Gore is somewhat unfairly chided for having claimed credit for “creating the Internet,” but he began holding hearings on the issue as a senator as early as 1986 and he was the author of the bill that transitioned ARPANET from a defense tool into the National Information Infrastructure, which is the underpinning of the World Wide Web (Tuomi 2006, 156).

In the era when ARPANET was being developed, home computers were nonexistent, as the average computer would have occupied much of the living space in the average home. By the 1980s and 1990s, however, computers for personal use had become manageably sized and priced, and the number of families who invested in home computers grew steadily. The capacity and speed of these computers grew rapidly as well. For the most part, the pre-Internet home computer was used for word processing or games. With the advent of the Internet, however, the computer was no longer limited by the software that you loaded into the system. Your machine was now, theoretically, connected to the world.

Flash program with voiceover and film giving opinions about the impact of the Internet on communication and merchandising from the early days to current times. It includes a look at the collapse of the dot-com industry in 2001.

At first, that world was rather sparsely inhabited, but this situation was short-lived. Thousands of new “dot-coms” emerged as investors lined up to fund a wide array of projects, and e-commerce was at the top of the list. Although consumers seemed reluctant to shop online initially, the market grew steadily, especially as search engines became more reliable and more products became available. Grocery store chains began offering online shopping and delivery in many major markets and banks encouraged online bill payment and direct deposit. The expansion of major online retailers such as Amazon soon meant that shoppers could find most items online—and could even resell them later on eBay. Total online retail sales were estimated at more than $133 billion in 2008, accounting for 3.3 percent of all retail transactions (U.S. Census Bureau 2009). (Click on the thumbnail Dot Com.)

Within less than a decade, the educational and entertainment opportunities offered by the Internet exceeded the expectations of all but the most prescient. Wireless devices enabled users to download content via laptops and cell phones anywhere they could find a strong signal. Answers to questions could be obtained instantly—and if you avoided the unreliable sites, they might even be accurate. A new generation of students who were at ease with technology began to gravitate toward online social networks and even online education.

Bar chart indicating the percentage of households with computers and Internet access.

This new age of instant information and education has not been equally distributed, either globally or within the United States, and by the mid-1990s concerns grew over the “digital divide.” Although access grew steadily between the early 1980s and 2008, access (both at home and at school or work) was much higher among the middle and upper classes and these groups were much more likely to have broadband connections enabling them to more fully utilize online content. Several government and non-governmental programs created to address this disparity have narrowed the gap and ensured greater availability, especially within schools and libraries, but affluence is still a crucial element in determining the ease and extent of access. Rural Americans are less likely to have broadband available as an option, and this has resulted in the United States slipping to number 17 in the rankings for “connected countries” in a recent international survey by the International Telecommunications Union (2009, 22).

Others have used the term “digital divide” to refer to the digital knowledge gap, much of which is generational. In the 1990s, parents often found themselves asking teenagers and even children—the new generation of “digital natives”—for computer advice or help in navigating the strange, interconnected world of the Internet. Typing skills, once taught mostly in high school, shifted gradually to middle school and then to the earliest levels of elementary school. The challenge for educators became less one of helping students locate information than of helping students to analyze the quality of the information located.

Globally, the digital divide between developed and developing nations is narrowing, but it is a slow process. Several nonprofit organizations have started programs to provide inexpensive laptops to schoolchildren throughout the world, complete with Wi-Fi connections that enable them to access the Internet from their classrooms. In remote and impoverished areas or areas simply with limited access to printed material, these programs offer children an array of educational opportunities far beyond the dreams of previous generations. These programs have, however, met with criticism from those who fear that the primary purpose of providing children with Internet access is to indoctrinate them in Western culture and values (Borsch 2008).

Module 5: New Global Threats and Challenges

The United States entered the new century with an air of optimism. The economy was doing remarkably well. The country even enjoyed a budget surplus during the last few years of the 1990s, and leaders were discussing ways to pay down the national debt. Before the decade reached the halfway mark, however, the nation faced major challenges in rapid succession—one of the most divisive elections in its history, a terrorist attack of unprecedented proportions, a massive (and increasingly unpopular) war, and the first signs that the nation’s economic health was in serious jeopardy.

I. The 2000 Election

II. September 11th

III. The War in Iraq

IV. The 2008 Election

I. The 2000 Election

Despite scandal and impeachment, President Clinton left office with a high approval rating. As had been the case with Eisenhower in 1960 and Ronald Reagan in 1988, he would likely have won a third term had it not been forbidden by the Twenty-second Amendment, ratified in 1951. Vice President Al Gore lacked Clinton’s charisma, and there was a reluctance to rely heavily on Clinton in the campaign, due to the enduring taint of the Lewinsky scandal. Gore’s theme was “Prosperity and Progress,” as the public was generally satisfied with the economy under the Clinton administration, although there was evidence that the boom of the late 1990s was fading, especially after the “dot-com bubble” burst in March 2000. Gore also stressed the need to strengthen the nation’s schools and protect the environment (Ceaser and Busch 2001, 117).

The Republican opponent, Texas Governor George W. Bush, identified himself as a “compassionate conservative” who could work across party lines. He was well liked by the Religious Right, which had given only lukewarm support to his father, President George Herbert Walker Bush (1989–1993). This was due both to the younger Bush’s professed faith as a “born again” Christian and to his support for conservative social positions (Ceaser and Busch 2001, 78). He also promised a tax cut, which (as always) appealed to many voters.

Flash program with voiceover and film clips showing the Bush-Gore debates and the presidential race, including the Florida debacle, Bush’s subsequent win, and the protests at his inauguration.

The election of 2000 was like none other. Gore won the popular vote by a margin of over half a million votes (Ceaser and Busch 2001, 162). The winner of the electoral vote, however, would be decided by the state of Florida, where Bush seemed to be ahead by a few hundred votes, even though exit polls showed Gore with the lead. Numerous voting irregularities had been reported in the state. The fact that the governor of Florida, Jeb Bush, was the brother of one of the candidates, and the fact that Florida’s secretary of state, Katherine Harris (who was in charge of certifying the vote), was also the co-chair of the state’s Elect Bush campaign, were troubling to many Democrats. A confusing ballot in one district also resulted in many elderly voters claiming they had been misled into voting for third-party candidate Patrick Buchanan (Ceaser and Busch 2001, 210–11). Legal challenges and recounts went on for five weeks, with Bush’s margin narrowing and many ballots still in dispute, when the U.S. Supreme Court ruled against any additional recounts in a 5–4 decision. The addition of Florida’s electoral votes gave Bush the presidency, with 271 electoral votes—just one above the minimum needed (Campbell 2008, 179–80). (Click on the thumbnail The 2000 Election.)

Gore conceded the election. Millions of voters were angered by the Supreme Court’s decision, feeling that their votes had been negated, but power transferred peacefully from Clinton to Bush. There were protests as Bush took office—including lines of silent citizens who stood along Pennsylvania Avenue in an icy January rain, with their backs to the inaugural procession. For a few months, there were discussions of the need to reform the voting process and possibly eliminate the electoral vote, but little concrete action was taken.

The first few months of the Bush administration were focused on passage of his $1.35 trillion tax cut, which was to be spread over the decade, and the initial work on his education program, No Child Left Behind. Bush had indicated during the campaign that his focus would be on domestic issues and although not exactly isolationist, he had made it clear that the country, now the world’s lone superpower, would have minimal involvement in “democracy building” and multinational organizations (Ceaser and Busch 2005, 35–6). Bush appeared ready to follow through on that agenda, until the events of September 11th made it clear that this was no longer a viable option.

II. September 11th

Flash program with voiceover, film, and newscasts about the events of 9-11. It also shows addresses by President Bush concerning the hunt for Osama bin Laden and the memorial at the World Trade Center.

Most Americans who were alive and of an age to be at all aware of their surroundings can tell you where they were when they heard the news of President Kennedy’s assassination in 1963. The morning of September 11, 2001 has become similarly burned into our collective memory. People throughout the nation were glued to their televisions (and computer screens), watching a video clip of a commercial airliner crashing into one of the World Trade Center towers, which ran in what seemed to be an endless loop. As a more complete story emerged, the nation learned that terrorists had hijacked four jets, crashing two of them into the Twin Towers in New York and a third into the Pentagon, just outside of Washington, DC. A fourth plane had crashed in a remote area of Pennsylvania after passengers apparently learned of the hijacking and attempted to regain control of the plane. The eventual reported death toll of the 9/11 attacks surpassed that of Pearl Harbor in 1941, the last time the nation had suffered a major attack on its own soil. Including the passengers in the planes, nearly 3,000 people died (Ceaser and Busch 2005, 39). (Click on the thumbnail September 11th.)

Air traffic ceased, stock exchanges closed, and the cities of New York and Washington, DC, were put under National Guard protection. Messages of support poured in from around the world, punctuated by the occasional video of Arab nationalists cheering the success of the attacks. Responsibility was quickly traced to the terrorist organization known as al-Qaeda, led by Osama bin Laden, a reclusive and wealthy Saudi businessman. Although not officially tied to any national government, al-Qaeda’s main base of operations was Afghanistan, where it had the tacit backing of the country’s ruling Taliban.

During the weeks after the 9/11 attacks, the Bush administration worked to build consensus among allies, both for retaliation against Afghanistan and for the broader and more nebulous “war against terror.” In early October, NATO troops joined the United States in a series of bombing raids against al-Qaeda training camps and a variety of military targets in Afghanistan. The Marines followed a few weeks later. Numerous al-Qaeda leaders were captured or killed, but the primary target—Osama bin Laden—was believed to have escaped into the mountains of neighboring Pakistan (Ceaser and Busch 2005, 43–5).

As the war against Afghanistan began, the country was again on alert due to traces of a powdered form of anthrax bacteria that were found in letters mailed to various parts of the country. Five people died, raising fears that this was another al-Qaeda attack. Investigations would eventually point, however, to a U.S. citizen who was employed at a bio-defense lab in Maryland (Willman 2008). This increased concern over domestic security enabled President Bush to push through the USA PATRIOT Act (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorists) with little debate or opposition from Congress. A few months later, legislation was signed creating the new Department of Homeland Security (DHS).

The PATRIOT Act gave the executive branch much broader powers in detaining and monitoring terrorist suspects. As the panic died down, however, many in Congress began to fear that some provisions of the PATRIOT Act were a serious risk to civil liberties (Ceaser and Busch 2005, 43). It took several months for opposition to grow, as opponents feared being labeled “unpatriotic” by those who supported the new measures.

Efforts to assess why the United States had been caught by surprise on September 11 found plenty of blame to go around. A congressional report released in 2003 noted a general failure of the intelligence community, under both the Clinton and George W. Bush administrations, to “fully capitalize on available, and potentially important, information” that might have prevented the attacks (U.S. Congress 2003). Much of the problem lay in the lack of communication and information sharing between the various intelligence agencies in the network—several of them held key bits of information that would have raised an alert if they had been put together in advance of the attacks. A similar report from a bipartisan independent commission would propose a radical restructuring of U.S. intelligence agencies and the appointment of a single individual, the National Intelligence Director, who would be in charge of coordinating the efforts of the various organizations (9/11 Commission 2004).

III. The War in Iraq

By the spring of 2002, the Taliban had been removed from power in Afghanistan, and many assumed that the military stage of involvement in that country was winding down. The focus now shifted toward Iraq. Throughout 2002, the Bush administration referenced Iraq as a supporter of terrorism and insisted that its leader, Saddam Hussein, was still in possession of weapons of mass destruction. Iraq was generally uncooperative with UN inspectors, and in late 2002, Congress passed a resolution that authorized Bush to use military force against the nation in order to ensure compliance.

Flash program with voiceover and film including the alleged link between Saddam Hussein and al-Qaeda, the subsequent invasion of Iraq, the war in Afghanistan, Abu Ghraib, Bush’s justification of the wars, and a Saturday Night Live satire of Bush’s opinion of the axis of evil.

In efforts to gain support for military action, Bush administration officials raised the specter of a “mushroom cloud,” arguing that after 9/11, the United States and its allies could no longer tolerate rogue nations that showed intent to harm. Administration officials also repeatedly insinuated that Iraq was a state supporter of al-Qaeda and thus tied to the attacks of 9/11. Although Bush acknowledged in September 2003 that there was no evidence tying Hussein to the September 11 attacks, opinion polls showed that nearly half of the American public still believed there was a connection in 2005 (Harris Interactive 2005), and one-third continued to believe it as late as September 2007 (Angus Reid Global Monitor 2007). (Click on the thumbnail Iraq & Afghanistan.)

The Bush administration sought UN support for the military action against Iraq, but several key nations (including three with veto power in the Security Council) questioned the evidence presented by the United States and argued that inspections should be allowed to continue. On March 20, 2003, a coalition composed mostly of U.S. and British troops invaded Iraq. The actual war against the Iraqi military was quick and decisive. By the end of April, the capital city of Baghdad had been taken. On May 1, Bush appeared aboard the USS Abraham Lincoln to proclaim “mission accomplished” to a group of sailors who had assembled to greet him.

In the months ahead, however, it would become clear that the real battles were far from over. Major conflict between factions in Iraq would result in mounting casualties, as thousands of coalition troops and tens of thousands of Iraqis fell victim to guerrilla warfare and sectarian violence throughout the country. The Bush administration shifted its focus to building a stable and democratic government in Iraq when the weapons of mass destruction, which were the original justification for the war, never materialized. A 2004 report issued by the U.S. weapons-inspections team concluded that Hussein had discontinued his weapons program, but probably did not want to reveal this due to ongoing rivalry and previous wars with neighboring Iran (Kaplan 2004).

The economy was showing signs of recovery, and despite flagging support for the war, Bush won re-election in 2004 against Democratic challenger Senator John Kerry of Massachusetts. One of Bush’s key campaign issues had been Social Security reform, and he moved quickly to promote a plan to partially privatize the program by allowing younger workers to invest a portion of their contributions in the stock market. The plan was not popular with either party in Congress and was never enacted, despite having been the key focus of Bush’s 2005 State of the Union address.

Bush’s second term was marked by major controversies. The poor handling of relief efforts in the wake of Hurricane Katrina in August 2005 gave many Americans the sense that Bush was unconcerned with the fate of the thousands who lost their homes in one of the worst natural disasters ever to strike the nation. In 2006, there were revelations of a secret surveillance program directed at U.S. citizens and of the use of torture in U.S. military installations. Despite Bush’s appointment of two new, conservative Supreme Court justices, the high court handed down several decisions suggesting the administration had overstepped its constitutional limits. By early 2006, Bush’s approval rating had dropped to 34 percent (Roberts 2006). With congressional elections on the horizon, the Republican Party attempted to distance itself from the president. The effort failed, however, and the Democrats won control of both houses of Congress.

Over the next two years, scandals forced the resignation of several key administration officials, including Attorney General Alberto Gonzales, and Bush’s approval ratings continued to decline, bottoming out at 20 percent just prior to the 2008 elections—the lowest public-approval level of any president since such polling began in 1938 (CBS News 2009). International opinion of Bush’s leadership was even more negative. A public-opinion survey on confidence in international leaders, conducted in 20 countries, ranked Russia’s Vladimir Putin and China’s Hu Jintao ahead of Bush. In fact, only Iran’s Mahmoud Ahmadinejad ranked lower than Bush—by a single point (Tepperman 2008).

IV. The 2008 Election

The 2008 campaign began early for both parties, and it was clear from the beginning that it would be a divisive—and expensive—election. The Democrats were in the unusual position of having an advantage over the Republicans in fundraising (OpenSecrets 2008). By the spring of 2008, the Republican nominee had been determined, as Senator John McCain had easily garnered enough delegates to win the nomination. The Democratic nomination, however, was still being hotly contested, as Hillary Clinton (senator from New York and former First Lady) and Barack Obama (senator from Illinois) had each won a significant share of delegates. The final few states, however, gave the edge to Obama, and he emerged as the nominee, selecting Senator Joe Biden as his running mate.

The Democratic primary was bitterly contested and the fact that many women voters were dismayed at missing the chance for the first female president was not lost on the Republican Party. McCain, whose support was weak both among women and Christian conservatives, selected Alaska Governor Sarah Palin, a favorite of the Religious Right, as his running mate. As the press frequently noted, the 2008 campaign was destined to make history regardless of the outcome. Either Barack Obama would become the first African-American president or Sarah Palin would become the first female vice president in the nation’s history.

It seemed initially that foreign policy would be the primary focus of the campaign, as Obama strongly supported ending our involvement in Iraq as quickly as possible and McCain asserted that we should remain in Iraq as long as it took to establish a stable democratic government. In September 2008, however, a crisis that had been brewing for several years shifted the focus sharply to the state of the U.S. economy.

The Federal Reserve had warned earlier in the year that the country appeared to be headed for a recession. The reasons for this economic slump were complex, and included a combination of the rising cost of oil, a sharp drop in housing prices (which were overvalued as a result of the housing bubble), and the growing income gap between economic classes. Another key factor was the cost of the wars in Iraq and Afghanistan. In the past, wars had been funded by increasing taxes, but in this case, the wars were funded the same way many Americans were funding their own personal spending—on credit. A $200 billion loan program for major banks was approved by the Federal Reserve in March 2008, in hopes that this would improve investor confidence (Lawder 2008).

It became clear by early autumn that the economic problems were far worse than previously expected. The two major mortgage lending institutions, Freddie Mac and Fannie Mae, were essentially bankrupt, and they were placed under government conservatorship. Soon thereafter, several major banks declared bankruptcy or announced consolidations, and the Federal Reserve stepped in to rescue American International Group (AIG), purchasing an 80-percent stake in the company for $85 billion (ibid.). Although this was an unprecedented step for the U.S. government, it represented a middle ground between simply bailing out the company and nationalizing it. The hope was that taxpayers might, after recovery, recoup at least some of the investment.

By early October, analysts were beginning to refer to the crisis as the worst economic downturn since the Great Depression. The effects were not confined to the United States alone, as foreign investment firms around the world held shares in the U.S. mortgage market. Uncertain of the future, banks around the world began to stop lending. Congress eventually passed a $700 billion bailout package and authorized expansive powers for the Secretary of the Treasury, Henry Paulson, to distribute those funds at his discretion, with limited congressional and regulatory oversight (Fitzgerald 2008).

Flash program with voiceover and film showing the Republican and Democratic primary debates; the presidential debates, including the Joe-the-Plumber discussions; interviews with Sarah Palin; Tina Fey’s Saturday Night Live satire of Sarah Palin; and Obama’s win and inaugural speech.

Even with these measures in place, concerns about the global economy played a major role in the election. Voters generally gave Obama the edge in surveys that asked which of the candidates was more likely to make "the right decision" about economic issues. This may have been a decisive factor in the election. The race appeared to be fairly tight in early October, prior to the bailout, with a slight advantage to Obama, but his margins increased in the weeks that followed. Obama won a decisive victory, with 53 percent of the popular vote and 365 electoral votes. Democrats also picked up seven seats in the Senate and twenty-one in the House of Representatives (CNN 2008). (Click on the thumbnail The 2008 Election.)

As Americans watched the inauguration of the first African-American president, many were optimistic about the future. Those concerned about the role of the United States in the world were also optimistic, as Obama's understanding of Islam offered a chance to improve relations in a region of the world where the American image was at an all-time low. Other, more cautious voices noted that Obama would, as president, inherit two wars and one of the worst financial crises on record.

There was, however, one point upon which most Americans could agree. Few of those who listened to Martin Luther King Jr. proclaim to the nation that he had dreamed of a better America, where people were not limited by the color of their skin, dared to imagine that forty-five years later the country would elect an African American to the highest office in the land. Regardless of what the future might hold for his administration, the election of Barack Obama was a pivotal moment in the nation's history.
