However, the idea of an American empire was a novel-sounding one prior to the Spanish-American War. It was this war that almost certainly started America down the road of imperialism and colonialism. Imperialism in this sense means the control of foreign countries and peoples, not necessarily the lands of Native Americans. This distinction must be made, because otherwise the nation's history since the end of the Revolution and its founding could be read as a tale of imperialism, what with Manifest Destiny and the Monroe Doctrine.
The U.S. would not allow continued colonization or aggressive action by European countries, but this did not stop the U.S. from pursuing its own imperialistic ambitions. This policy was used to great effect when the U.S. warned Great Britain to keep out of Hawaii, which the U.S. then annexed. One could argue the United States was already dabbling in imperialism prior to the Spanish-American War; however, the ceding of certain Spanish colonies and territories to America expanded the fledgling empire. Aside from gaining colonies and holdings through military action, the U.S. also acquired colonies and land via "dollar diplomacy" and by buying countries outright.