Besides being born in the USA, what makes a person uniquely American? I had a conversation with a Finnish friend of mine over dinner last weekend, and he scoffed at the idea of America having any semblance of culture. He said most Europeans generally think our "culture" is a joke. He surmised that nothing cultural holds our country together and that the love of money is the only thing people here have in common. The only "culture" we have, he argued, has been manufactured in the pursuit of wealth. He cited Las Vegas and Disneyland.

I had to think about this for a while, but I was wondering if you all had any thoughts on it. What do you think binds this country together?

I for one think there is a very strong American culture, because something definitely happens to new immigrants here that makes them very different from the people they left behind. They become American.