BY RINDALA ALAJAJI | September 17, 2025
This is the second installment in a ten-part blog series documenting EFF's findings from the Stop Censoring Abortion campaign. You can read additional posts here.
During our Stop Censoring Abortion campaign, we set out to collect and spotlight the growing number of stories from people and organizations that have had abortion-related content removed, suppressed, or flagged by dominant social media platforms. Our survey submissions have revealed some alarming trends, including this one: without a personal or second-degree connection at Meta, your chances of restoring your content or account drop significantly.
Through the survey, we heard from activists, clinics, and researchers whose accounts were suspended or permanently removed for allegedly violating Meta’s policies on promoting or selling “restricted goods,” even when their posts were purely educational or informational. What the submissions also showed is a pattern of overenforcement, lack of transparency, and arbitrary moderation decisions that have specifically affected reproductive health and reproductive justice advocates.
When accounts are taken down, appeals can take days, weeks, or even months (if they're resolved at all, or if users are even given the option to appeal). For organizations and providers, this means losing access to vital communication tools and being cut off from the communities they serve. This is highly damaging since so much of that interaction happens on Meta’s platforms. Yet we saw a disturbing pattern emerge in our survey: on several occasions, accounts were swiftly restored once someone with a connection to Meta intervened.
The Case Studies: An Abortion Clinic
The Red River Women's Clinic is an abortion clinic in Moorhead, Minnesota. It was originally located in Fargo, North Dakota, and for many years was the only abortion clinic in that state. In early January, the clinic’s director heard from a patient who mistakenly believed the clinic offered only procedural (surgical) abortions and not medication abortion. To clarify for other patients, the clinic posted on its page that it offered both procedural and medication abortions, attaching an image of a box of mifepristone. When staff tried to boost the post, the ad was flagged and the clinic's account was suspended.
They appealed the decision and initially got the ad approved, yet the page was suspended again shortly after. This time, multiple appeals and direct emails went unanswered until they reached out to a digital rights organization that connected them with Meta staff who stepped in. Only then was their page restored, with Meta noting that their post did not violate its policies but warning that future violations could lead to permanent removal.
While this may have been a glitch in Meta’s systems or a misapplication of policy, the suspension of the clinic’s Facebook account was detrimental to the clinic. “We were unable to update our followers about dates/times we were closed, we were unable to share important information and news about abortion that would have kept our followers up to date, there was a legislative session happening and we were unable to share events and timely asks for reaching out to legislators about issues,” shared Tammi Kromenaker, Director of Red River Women's Clinic. The suspension also prevented the clinic from starting an Instagram page. “Facebook has a certain audience and Instagram has another audience,” said Kromenaker, “we are trying to cater to all of our supporters so the loss of FB and the inability to access and start an Instagram account were really troubling to us.”
The Case Studies: RISE at Emory University
RISE, a reproductive health research center at Emory University, launched an Instagram account to share community-centered research and combat misinformation related to reproductive health. In January of this year, they posted educational content about mifepristone. “Let's talk about Mifepristone + its uses + the importance of access”, read the post. Two months later, Meta suddenly suspended the account, flagging it under the company’s policy against selling illegal drugs. Their appeal was denied, and the account was permanently deleted.

Screenshot submitted by RISE to EFF
“As a team, this was a hit to our morale,” shared Sara Redd, Director of Research Translation at RISE. “We pour countless hours of person-power, creativity, and passion into creating the content we have on our page, and having it vanish virtually overnight took a toll on our team.” For many organizational users like RISE, their social media accounts are a repository for resources and metrics that may not be stored elsewhere. “We spent a significant amount of already-constrained team capacity attempting to recover all of the content we’d created for Instagram that was potentially going to be permanently lost. [...] We also spent a significant amount of time and energy trying to understand what options we might have available from Meta to appeal our case and/or recover our account; their support options are not easily accessible, and the time it took to navigate this issue distracted from our existing work.”
Meta restored the account only after RISE was able to connect with someone there. Once RISE logged back in, they confirmed that the flagged post was the one about mifepristone. The post never sold pills or directed people where to buy them; it simply provided accurate information about the drug’s use and efficacy.
This Shouldn’t Be How Content Moderation Works
Meta spokespersons have admitted to instances of “overenforcement” in various press statements, noting that content is sometimes incorrectly removed or blurred even when it doesn’t actually violate policy. Meta has insisted to the public that they care about free speech, as a spokesperson told The New York Times: “We want our platforms to be a place where people can access reliable information about health services, advertisers can promote health services and everyone can discuss and debate public policies in this space [...] That’s why we allow posts and ads about, discussing and debating abortion.” In fact, their platform policies directly mention this:
Note that advertisers don’t need authorization to run ads that only:
- Educate, advocate or give public service announcements related to prescription drugs
Note: Debating or advocating for the legality or discussing scientific or medical merits of prescription drugs is allowed. This includes news and public service announcements.
Meta also has policies specific to “Health and Wellness,” where they state:
When targeting people 18 years or older, advertisers can run ads that:
- Promote sexual and reproductive health and wellness products or services, as long as the focus is on health and the medical efficacy of the product or the service and not on the sexual pleasure or enhancement. And these ads must target people 18 years or older. This includes ads for: [...]
  - Family planning methods, such as:
    - Family planning clinics
    - In Vitro Fertilization (IVF) or any other artificial insemination procedures
    - Fertility awareness
    - Abortion medical consultation and related services
But these public commitments don’t always match users’ experiences.
Take the widely covered case of Aid Access, a group that provides medication abortion by mail. This year, several of the group’s Instagram posts were blurred or removed, including one with tips for feeling safe and supported at home after taking abortion medication. Only after multiple national media outlets contacted Meta for comment were the posts and account restored.
So the question becomes: If Meta admits its enforcement isn’t perfect, why does it still take knowing someone, or having the media involved, to get a fair review? When companies like Meta claim to uphold commitments to free speech, those commitments should materialize in clear policies that are enforced equally, not only when a case is escalated through personal relationships with Meta personnel.
“Facebook Jail” Reform
There is no question that the enforcement of these content moderation policies on Meta platforms, and the length of time people spend in “content jail” or “Facebook/Instagram jail,” have created a chilling effect.
“I think that I am more cautious and aware that the 6.1K followers we have built up over time could be taken away at any time based on the whims of Meta,” Tammi from Red River Women’s Clinic told us.
RISE sees it in a slightly different light, sharing that “[w]hile this experience has not affected our fundamental values and commitment to sharing our work and rigorous science, it has highlighted for us that no information posted on a third-party platform is entirely one’s own, and thus can be dismantled at any moment.”
At the end of the day, clinics are left afraid to post basic information, patients are left confused or misinformed, and researchers lose access to their audiences. But unless your issue catches the attention of a journalist or you know someone at Meta, you might never regain access to your account.
These case studies highlight the urgent need for transparent, equitable, and timely enforcement that is not dependent on insider connections, as well as accountability from platforms that claim to support open dialogue and free speech. Meta’s admitted overenforcement should, at minimum, be coupled with efficient and well-staffed review processes and policies that are transparent and easily understandable.
It’s time for Meta and other social media platforms to implement the reforms they claim to support, and for them to prove that protecting access to vital health information doesn’t hinge on who you know.
This is the second post in our blog series documenting the findings from our Stop Censoring Abortion campaign. Read more in the series: https://www.eff.org/pages/stop-censoring-abortion