Isolationism refers to America's longstanding reluctance to become involved in European alliances and wars. But American isolationism did not mean disengagement from the world stage. To the question of whether America could remain isolationist, the answer is no, for several reasons.

The United States remained politically isolated throughout the 19th century and into the early 20th, an unusual feat in Western history. During the 1920s, American foreign affairs took a back seat, and America tended to insulate itself in terms of trade as well: tariffs were imposed on foreign goods to shield U.S. manufacturers.

The year 1940 marked a final turning point for isolationism. German military successes in Europe and the Battle of Britain prompted a nationwide American rethinking of the country's posture toward the war. If Germany and Italy established hegemony in Europe and Africa, and Japan swept East Asia, many believed that the West might be next. By the autumn of 1940, many Americans believed it was necessary to help defeat the Axis, even if it meant open hostilities.

Everything changed when Japanese naval forces launched a surprise attack on Pearl Harbor on December 7, 1941. Germany and Italy declared war on the United States four days later, and America galvanized itself for full-blown war against the Axis powers.

The isolationist point of view did not completely disappear from American discourse, but never again did it figure prominently in American policy and affairs. Countervailing tendencies that would outlast the war were at work: during the war, the Roosevelt administration and other leaders inspired Americans to favor the establishment of the United Nations, and after the war, the threat embodied by the Soviet Union under Joseph Stalin dampened any revival of isolationism.