Throughout the 16th, 17th, 18th, and 19th centuries, the United Kingdom acquired lands, territories, and dominion over many nations, making the British Empire one of the most formidable colonial powers in the world. By the 1920s, however, the Empire had begun losing control of its colonial lands. This greatly affected social politics around the world as a growing sense of injustice spread throughout the colonies. Webster's Dictionary defines colonialism as control by one power over a dependent area or people. Many countries today originated from post-colonization under British rule, America, or the thirteen colonies, being one of the most famous examples. When a country colonizes another, the reasons are clear: to exploit the colonized land for its resources, to expand the colonizer's territories in order to gain power, or, in some twisted way, to bring "civilization" to supposedly uncivilized people. After many years, many of these colonies chose to fight for their freedom and entered what we call the post-colonization phase. Dictionary.com defines post-colonialism as the era after colonization.
Before colonization, America was a land of lush, well-kept woodland inhabited by Native American tribes. Before the colonization process began, Native Americans had a complex and sophisticated way of living and of communicating with other tribes throughout North America. Most native tribes shared common currencies but mostly traded to obtain goods. Most tribes, if not all, believed in animism, the belief that everything from a rock to a dog has a spirit and that this spirit should be appeased at all times. There was little dispute over which tribe owned a given piece of land, because Native Americans believed the land did not belong to any one person or group but to everyone as a whole.