This is one of the stupidest things I've ever heard. Americans are responsible for their "cultural imperialism"? Here's an idea: if you don't want "American culture" in your country, stop importing it. Nobody else is to blame. I'd love to hear about the damage this "imperialism" is doing. I'm sure listening to The Black…