As countries develop, they often expand, and like many nations, the United States found its way onto a path of expansionism. Though this occurred throughout the country's early history, the late nineteenth and early twentieth centuries proved that the U.S. remained an expansionist power. However, there is also evidence that the U.S. slowly departed from its expansionist ways.
Imperialism in the U.S. referred to its military and economic influence over other countries. Expansionism is generally born of imperialism, and many people disagreed with what the world powers were doing at the time. The newspaper artist Thomas Nast published a drawing in Harper's Weekly that portrayed the world's major powers picking up the regions they wished to possess and dropping them into their 'grab-bags.' This exaggeration captured the mood accurately, for at the time many Americans were upset with the United States' continued expansion. Not only did it disrupt foreign policy, it also enraged many citizens who were anti-imperialists, Nast among them.
Imperialism in the U.S. dates all the way back to the Louisiana Purchase, arguably the first of its expansionist acquisitions. The concept gained popularity during the presidency of James K. Polk, who led the U.S. into the Mexican-American War of 1846. Polk continued to act on imperial ambitions with the acquisition of California and other western territories, all of which came through the Treaty of Guadalupe Hidalgo and, later, the Gadsden Purchase. This expansionism and lust for territory grew out of a belief called Manifest Destiny, the idea that America was destined to stretch from the Atlantic to the Pacific Ocean. Josiah Strong, an author from New York, argued that God expected the U.S. to expand until it had fulfilled this destiny, and he attempted to convince American citizens that...