One difference I have noticed between America and other countries is that doctors elsewhere largely do not make the kind of promises American doctors tend to make. In America, pain relief more often than not means "taking the pain away." Most of us seem to believe that "pain relief" means a total…