Why is it that nurses are viewed and treated as uneducated in society? Is it down to media portrayal (i.e., nurses barely exist in hospital dramas, and when they do appear they are drug addicts or "sluts"), or is it because nursing is a female-dominated profession? I looked after a patient recently whose family treated me like total…