A better question might be why American TV is so afraid of the vagina. It is never shown, never even mentioned. A female commentator once noted, somewhat gleefully, that male nudity is not so much shown as 'brandished.' So why not brandish the female sex? You know what: personally, I wouldn't want that to happen to women.…