Women know a thing or two about horror. In fairy tales, women are always stolen away by nefarious villains. Little girls are raised by witches who pretend to be their mothers. I can’t think of a single fairy tale where the man pricks his finger, is smothered by a piece of apple caught in his throat, or opens the forbidden door only to find his wife’s seven murdered husbands inside.
We’re taught that it’s unsafe to travel alone. Self-defense classes tend to enroll more women than men. Our boyfriends walk us to our door; it’s seldom the other way around. They are fairly sure they can get home unmolested.
When we’re young, we’re warned that our bodies will gush blood every month for the rest of our lives. It will hurt. It will make us ill. It has the potential to be humiliating. The junior high choir/band/drama competition is five hours long and we’re forced to wear white dresses. That is true horror.
The blood prepares us for pregnancy, where something grows within our bodies and ripples under our skin. The baby either bursts out of our bellies or is sliced out with surgical tools. In the case of the latter, the smell of cauterized flesh accompanies the birth. We look at the first pictures of baby and realize we’re staring at our own flayed-open abdomen, at our own guts. We continue to bear the scars.
Yet there is often a disconnect when somebody hears the words “women” and “horror” together. Women are stereotypically known as the fairer sex. It’s supposed to be our job to beautify things. We’re supposed to soothe fevered brows, take a few walls and make a home out of them, and throw stardust and glitter everywhere. Women are, again stereotypically, supposed to be beings of love and light. It is often unseemly to mention the darkness.
But it’s there. It’s always been there, and we have a front row seat. We write what we know, and we know loveliness. We know want, and desire, and bloodshed. We know joy, of course, and we especially know horror.