Why Don’t We Tell Women What’s Making Them Miserable?
With so many feminist advances, women should be getting happier instead of just more medicated.
Over the last five decades, feminism has made a long march through American culture, culminating in the first female vice president of the United States. But it isn’t clear that feminism’s fruits are helping everyone. The metrics of well-being confirm that women are struggling: Suicides, depression, substance abuse, and sexually transmitted infections have all increased dramatically over the same period. Women aren’t becoming happier, just more medicated. A 2020 Pew report found that over 50 percent of liberal white women under 30 reported having a mental-health condition. That statistic alone is stunning enough to indicate that something is going very wrong for the modern woman, despite the steady uptick of feminist advances.
I recently had dinner with two friends who work at a crisis-pregnancy center. They told me some far-from-unusual stories about the challenges their residents face and the rough living most have experienced: being trafficked, pimped, addicted, incarcerated, abused, and on and on. Crisis-pregnancy homes, despite their recent mischaracterization in The Handmaid’s Tale, do amazing work to protect new mothers and help rebuild their lives. But it is striking that we as a culture do precious little to help women avoid these situations in the first place. Only when women get this far down the road and into this much trouble can mentors step in and say, “Something has to change.” And those in crisis are willing to listen, because they have tried everything else. As lives are rebuilt, basic changes to behavior are taught, though it is an uphill battle in the absence of cultural support.
But most American women with money, degrees, or connections will never hear that our culturally prescribed feminist lifestyle is the source of their unhappiness, struggles, and feelings of emptiness. It seems that we simply allow women to free-fall into truly awful states, without so much as the rapid-fire warnings about side effects required of pharmaceutical commercials. (Imagine how that might sound: “Side effects may include sexually transmitted diseases, debilitating depression, loneliness, despair, substance abuse, and suicide.”)
The regnant belief is that human nature is plastic enough that we can do whatever we want, consequence-free, but so many devastated lives paint a different picture. The progressive solution, recycled for decades, has been to patch up the problems with more government assistance and programs. Remember Julia? The imaginary woman who never needed a man? This unintentionally dystopian portrait, dreamed up during the Obama administration, was meant to assure us that government is there to supply our every need, from birth to death, without placing any demands on our behavior.
Rarely is it suggested that women have been sold a poisonous lifestyle, and that the behaviors that lifestyle implies are what actually need changing. Instead, we have a steady diet of articles such as “Anal Sex: Safety, How Tos, Tips, and More” or “How Summer Camp Gave Me the Freedom to Explore My Queerness” at Teen Vogue, which markets itself as “the young person’s guide to saving the world.” College freshmen, being oriented to life away from home as the school year begins, are particular targets, initiated into a savage new world where anything goes as long as there is consent and maybe a mask. Heavy doses of gender exploration and safe-sex practices, plus heaps of contraceptives, are all part of the welcome on most U.S. college campuses.
But what if there is another way of living, one that doesn’t lead down the predictable road of confusion and despair? Efforts or individuals that shed light on what actually helps women are met with leftist cries of “victimization,” with bullying, or with blame heaped on the patriarchy, hardly the reason-based argumentation radical feminism was supposed to provide.
What, then, should we be conveying to women of every economic and ethnic stripe to help us have fulfilling lives? There are basics, such as: Don’t sleep around, don’t do drugs, don’t have abortions, stop blaming the patriarchy, find a purpose outside of yourself, cover up some of that skin, don’t overspend, and figure out what is truly good, not just what celebrities say. None of these suggestions is revolutionary, especially if one looks honestly at history. Or human nature. Or psychology.
These elements, which were obvious throughout most of human history, are the real remedy for so much that afflicts all of us. But they are things radical feminists don’t want spoken out loud. Over the last five decades, a closed system has been carefully constructed so that anything outside its boundaries is almost unthinkable. Hollywood, universities, politics, the fashion industry, magazines, daytime television, and book publishing generate enough ideological tale-weaving to make sure that there is only one narrative in town. Faith and family are about the only outliers that might let the slip show. Is it any wonder that these, too, are under attack?
But more satisfying ways to live do exist: ways where dignity is honored, health is truly valued, body parts aren’t ignored or rendered useless, and relationships, which lie at the true heart of most women’s lives, aren’t fleeting or shallow, merely useful or convenient, but deep, abiding, and life-giving. If only we could find a way to tell this to every woman.