I asked my sister what would be a good topic to talk about for this series, and she said that Black women don’t perform, or don’t like to perform, oral sex. Although I had never really thought about it, she was on to something.
When I got to college I was amazed at how many of my white female peers were proudly giving oral sex. Where I come from, “she sucked my dick” was the biggest insult a boy could give you. You couldn’t be known for anything worse than that. Now, as an adult, my Black friends all admit to performing oral sex, but it seems to be something reserved for a special person or done as a returned favor. For example, on Braxton Family Values, Trina admitted that she had performed oral sex on a man who was not her husband. Trina’s sisters were more shocked by the fact that she had performed oral sex without receiving anything sexual in return than by the fact that this man was not her husband. Her sister even went on to say, “Black girls don’t just go around doing that.”

I don’t remember ever having a talk with my parents that specifically addressed oral sex. I just remember that it was always a kind of taboo subject, even in college, and the taboo hasn’t gone away. Maybe people just don’t like to talk about it, because I know for a fact that Black women are doing it, and maybe not every day, but not just on special occasions with special people either. I know that our attitudes differ on the matter, especially culturally, but I don’t think there is anything wrong with performing oral sex, and if you’re going to do it, do it well and enjoy it. Nobody ever died from sucking dick. At least as far as I know.
So my question is why is performing oral sex that big of a deal?
What is it that we are taught as Black women that makes it so taboo?