When World War I erupted in Europe in the summer of 1914, most Americans wanted no part in the war. Neutralism, a policy of not getting involved with either side in a dispute, was an important part of
United States foreign policy at the time. We took care of business in our own hemisphere, period. The United States' eventual entry into the war forever changed our foreign policy. We were no longer a country that minded its own business; we were a world power that looked out for the rest of the world.
What do you think of the policy of neutralism? Should the United States go back to that policy? Tell why or why not. How do you feel about the policy of neutralism in your personal life? When should you mind your own business? When should you get involved in other people's disputes?