The War of 1812 resolved many of the issues over which the U.S. fought, and it had several positive consequences for the American nation. The U.S. declared war on Britain on June 18, 1812, because Americans were tired of being taken advantage of by the European powers and wished to gain respect and assert their place among the other nations. America had remained neutral in every previous European conflict, and both France and Britain had taken its neutrality for granted, trading with the U.S. while at the same time confiscating American ships and their cargoes. The British seized more American ships than the French did, in part because many sailors who were treated cruelly in the British navy escaped to the U.S. and found work on American vessels; the British would then stop an American ship and reclaim their deserters. Britain was also arming Native Americans, helping them resist the expansion of American settlers. All of these grievances kindled the fire of the War of 1812.

At the end of the war, Britain and the U.S. signed the Treaty of Ghent. It was more an armistice than a treaty, because neither side lost or gained any territory. Having quelled Native American resistance in the West and South, the U.S. resumed its westward expansion. Realizing the need for a strong army and navy, the government increased military spending. America also gained respect from the European nations, having proved itself a proud and formidable country. The American people emerged from the War of 1812 with a new sense of nationalism, forged by the united front needed to oppose the mighty British Empire. A new sense of nationalism, a stronger military, and newfound respect from Europe were all positive consequences of the War of 1812.