I personally think that what the United States has become is an absolute disgrace. I truly hate living in a country that can't even remember the morals it was founded on, yet insists that everything we do is right and everyone else is the bad guy. In my opinion, the entire government should be fired and replaced with completely new people.