I think it’s obvious that men have been indoctrinated (more or less) into believing their behavior is acceptable, just as women have. Is this anyone’s fault? Nobody is born into a vacuum, and no child shapes the culture they inherit. That’s why it’s frustrating to see men constantly being made into an enemy around…