Why is it always white authors who are looking to find hidden racism in media and declare how terrible it is? Sometimes it feels like white authors are trying to explain racism to the Black community and tell people how offended they should be. Can you imagine how annoying it would be if men were always trying to tell…