As some of you know, I was in Crete (a Greek island) last week, and there were loads of anti-American slogans all over the place. How can America improve its worldwide image? Do the American people and/or the leadership not care about the rest of the world... unless oil is involved, that is? Most members on this site are American, so what do you guys think? If you're from the rest of the world, please reply as well.