When algorithms prey on vulnerability
How social media systems identify our weaknesses and turn them into engagement
When I was in the depths of a dark relationship with the internet, Instagram was my biggest vice. I was level-headed (and scared) enough to never join TikTok, and while I kept my Facebook around for the local parent and school groups, I didn’t spend much time on it. Instagram was, more or less, my weakness.
After cutting Instagram, I wasn’t prepared to fill the void with something healthy, and instead I found myself scrolling on Facebook like it was 2009 again. This was arguably worse than being on Instagram because the content was 97% trash.
I realized my mistake, set up some new healthy boundaries, and pivoted to only accessing my parent group pages on my laptop browser, maybe 3x a week. It worked.
That is, until I had a newborn and succumbed to the “nap trap.” Despite my attempts to manifest a “unicorn baby” who would magically sleep in her bassinet or crib, my third baby was just like my first two — a cuddler.
I tried to prepare myself for this inevitability by having my Kindle handy, picking up a new hobby, and enjoying time just talking with my big kids. And largely, I did those things.
But, as it happens, there were times when I was just bored and stuck. No book or needlepoint handy, no kids nearby to tell me about cowboy dinosaurs on Saturn. I found myself spending more time than I wanted on Facebook.
I’d try to stick to my main groups (a tactic I used pre-baby), but from time to time, I’d find myself on the main feed.
A targeted algorithm
The targeted algorithm on Facebook, at first, made me laugh. I was served ad after ad for nursing bras and tanks, for new milk storage vessels, for sleep training programs (hah!), for baby carriers, and even a few mistaken ads for maternity clothes, the tech not yet 100% sure whether I’d delivered the baby I had been Googling about for the last 9-10 months.
There was a time in my life when I appreciated this type of targeted advertising. How convenient that the ads I’m being shown are things I really need! But as my bank account dwindled and the useless junk piled up, I figured out that I actually only needed 1% of what I was being served. I changed my mindset, and I figured out how to tune out these deceptively on-target ads.
So, ignoring the nursing bras and milk collectors was easy. But after just a few weeks back on Facebook, a new kind of targeted content started appearing in my feed — stories of parents losing children and children losing parents.
When the algorithm senses weakness
Here I was, mere weeks after delivering my third baby, and I was being targeted with stories about death. I was served sponsored posts from local news stations in tiny towns I’d never heard of, tabloid stories about shocking one-in-a-million devastations, and stories from random news blogs, pop culture magazines, and big-name publications.
Frankly, it was relentless.
I’m lucky: this postpartum season has been an absolute joy. Not everyone has the same experience. Nearly 14% of postpartum women experience postpartum depression, and 18% experience postpartum anxiety (source). And those are just the reported numbers.
Scrolling on Facebook, with my weeks-old baby on my chest, it felt as if my news feed was trying to trigger PPA or PPD. As if Meta preferred for me to be depressed.
And the stories did start to nag at me. I’d think about them all the time. The rare times I got my baby to sleep in a bassinet, I wouldn’t leave her side. At night, worst-case scenarios started popping into my head. At a dentist appointment, I started spiraling when I realized I’d taken the car with the car seats and my husband was home with the kids. What would he do in an emergency?
It was that evening, when I opened up Facebook and saw yet another sponsored post about a tragic death, that I realized what a mistake I was making.
I deactivated my account and went cold turkey. Parent groups be damned.
Profit is king
In those few weeks, I had been trading tiny dopamine hits for a slide into postpartum anxiety that I didn’t need to have. It didn’t take long for the anxiety to slip away, but I was left with anger.
Why is it okay for Meta to do this? What is going on in their algorithm that makes it acceptable to serve stories of death and tragedy to vulnerable women?
I know Meta (and other tech companies) have sketchy tech practices, yet I still fell victim to their targeting. The Wall Street Journal gives an in-depth explanation of how Facebook’s news feed algorithm works:
Facebook takes into account numerous factors — some of which are weighted to count a lot, some of which count a little and some of which count as negative — that add up to a single score that the news feed algorithm generates for each post in each user’s feed, each time they refresh it. That score is in turn used to sort the posts, deciding which ones appear at the top and which appear so far down that you’ll probably never see them. That single all-encompassing scoring system is used to categorize and sort vast swaths of human interaction in nearly every country of the world and in more than 100 languages.
So what’s happening, as I experienced, is that Facebook’s algorithm has calculated that women who have just given birth are more likely to pay attention to certain types of sponsored content: baby carriers and tragedies.
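To make that description concrete, here’s a minimal sketch of what weighted, engagement-based ranking looks like. The signals, weights, and post attributes below are hypothetical stand-ins (Meta’s real model uses thousands of features), but the shape matches the WSJ description: weighted factors, some positive and some negative, summed into a single score per post, then sorted on every refresh.

```python
# Hypothetical sketch of engagement-based feed ranking.
# All signals and weights are illustrative, not Meta's actual values.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_comment: float  # model's guess you'll comment (0-1)
    predicted_click: float    # model's guess you'll click (0-1)
    predicted_hide: float     # model's guess you'll hide it (0-1)
    topic_affinity: float     # how strongly the topic matches your profile

def score(post: Post) -> float:
    # Some factors count a lot, some a little, and some count as negative.
    return (
        5.0 * post.predicted_comment
        + 1.0 * post.predicted_click
        + 2.0 * post.topic_affinity
        - 3.0 * post.predicted_hide
    )

def rank_feed(posts: list[Post]) -> list[Post]:
    # One all-encompassing score per post, highest first.
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("Tragedy strikes local family", 0.30, 0.40, 0.10, 0.90),
    Post("Cousin's vacation photos", 0.05, 0.10, 0.00, 0.20),
])
for post in feed:
    print(f"{score(post):5.2f}  {post.title}")
```

Notice what a scoring function like this can’t see: whether the attention it’s optimizing for is curiosity or dread. A tragic story a frightened new mother can’t look away from scores just as well as one she genuinely wants to read.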
Meta is not only risking the mental health of vulnerable women; they are profiting from it.
Target audience: the vulnerable
I’m not the only mother who was targeted with tragedy. Politico shares the story of a woman, Joanna, who found herself in the same situation I did:
“More than half the posts pushed to Joanna by Facebook’s algorithm were related to her fears, according to an analysis done over two months by Panoptykon Foundation, a digital rights nonprofit, with an academic specialized in algorithm audits, Piotr Sapieżyński, and shared exclusively with POLITICO. Telling Facebook to hide the suggested posts only seemed to lead to a slight increase in the amount of disturbing posts, the group said.”
This is exactly the experience I had. I would consistently click the little “Not Interested” button, and I even began blocking tabloid content, only to get sponsored stories from XYZ News station in ABC small town in a state I had never even visited.
And it’s not just postpartum women who are experiencing this. According to Facebook’s own research, “vulnerable communities, including Black, elderly, and low-income users, are among the groups most harmed by the prevalence of disturbing content on the site.” (Time Magazine)
Let me break down exactly how Meta is making money by manipulating its users’ vulnerabilities.
Social media platforms are, at face value, free. The real product is attention. The longer a user spends on a social media platform, the more advertisements they will see and click on, and the more money the platform will make. So these platforms have one primary goal: keep you there as long as possible.
Unfortunately, morals aren’t at play here, and these companies will do anything to keep you engaged. Negative, emotionally driven content tends to capture more attention, and their algorithms know this.
So in my case, their algorithms understood that postpartum women were more likely to be struck by these fear-based stories. And while the responsible thing for a corporation to do would be to remove these stories from a postpartum woman’s feed, instead they pumped out more and more similar content. They were counting on my spiral into anxiety, depending on me to get hooked on my feed, click a few sponsored posts, and generate more profit for their shareholders.
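That dynamic is a feedback loop, and it’s worth spelling out. Here’s another hypothetical sketch (the update rule and numbers are mine, purely illustrative): every second of attention a topic captures raises its affinity score, which raises its ranking, which serves more of it. Nothing in the loop asks whether the engagement is healthy.

```python
# Hypothetical sketch of the engagement feedback loop.
# Dwelling on a topic, even anxiously, raises its affinity,
# which raises its ranking, which serves more of the same.

topic_affinity = {"baby_gear": 0.50, "child_tragedy": 0.10}

def record_engagement(topic: str, seconds_viewed: float) -> None:
    # Attention is the only signal; anxious scrolling counts as interest.
    topic_affinity[topic] += 0.01 * seconds_viewed

def next_feed(topics: list[str]) -> list[str]:
    # Higher affinity means higher placement on the next refresh.
    return sorted(topics, key=lambda t: topic_affinity[t], reverse=True)

# A postpartum reader lingers on one tragic story for 45 seconds...
record_engagement("child_tragedy", 45)

# ...and the next refresh leads with more of the same.
print(next_feed(["baby_gear", "child_tragedy"]))
# ['child_tragedy', 'baby_gear']
```

One lingering read is enough to flip the ordering, and each subsequent serve creates another chance to linger. That’s the spiral.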
The Youth Endowment Fund (YEF) conducted a study in England and Wales that shows the direct effect of these dangerous algorithms on children. From The Guardian:
One in four teenagers who see real-life violence, including fist fights, stabbings and gang clashes, online are being served the clips automatically by algorithmic recommendation features, according to the study done by the Youth Endowment Fund (YEF) and shared with the Guardian. Only a small minority actively searched for the violent content.
…Black boys were the most likely to have seen violent content online (78%) while white children were less exposed (69%), the survey found. Children with special educational needs were almost as likely to have seen violence as black boys – 10 percentage points more than children without SEN.
These findings are jarring. And the sad result is that we are creating fear and anxiety that’s unfounded.
The Guardian continues to explain, “The amount of violence the teenagers see appears to far outweigh the actual risk of violence. Only one in 20 teenagers said they carried a weapon, but one in three saw weapons on social media.”
The end result is an endless loop: anxiety and depression lead to more time spent on social media, which leads to more anxiety and depression.
The end result is depressed, anxious, fearful, and angry users.
The end result is corporate profit.
When will it be enough?
I recently read an interview in the NYT with Pinterest chief executive Bill Ready. While I abhorred the majority of the article (blatant defense of AI slop, a company model reliant on its users buying more things than they need), there was one bit that really resonated with me. Ready says, “Why can’t we create a world where social media companies compete on their safety records, the same way the auto manufacturers compete on theirs?”
It circles back to the idea of cruel optimism. You might read this essay and come to the conclusion: If social media makes you depressed, that’s not the platform’s problem, it’s yours.
But that’s not enough. Government exists for a reason. Regulations exist to protect consumers. If we can have safety regulations in other industries, why don’t they exist in an industry that consumers use for two and a half hours every day?
I have to maintain hope that real change can and will happen. That the congressional hearings will lead to something, that the Anxious Generation movement will continue to grow into a deafening roar.
Because right now, hope might be all we have.
Can’t wait on government regulations or corporate policy changes to finally quit your doom-scrolling habit? Join the Digital Detox program that helps you reset your tech habits and learn to use your phone as a tool. You don’t need a dumbphone; you just need a mindset shift.