I was wondering if any of you who live in the United States (whether you were born here and have lived here your whole life, or came from another country and have lived here for many years) have been indoctrinated, if you will, into the notion that anything the U.S. does is for the greater good of the world, or that anything we do is for peace? I've always been told this by the media, teachers, the majority of my friends, and even other family members: whatever the U.S. does, it does for peace, and if you're against the U.S., then by definition you're against peace.