Is Instagram’s Sensitive Content Filter Censoring Marginalized Communities?

Where exactly is the margin for the marginalized? Instagram recently introduced a Sensitive Content Control option that lets users filter out content the platform may deem inappropriate. It is becoming increasingly clear what the Sensitive Content Control wants you to see and believe.

Sensitive Content Control or Controlling the Narrative?

Controlling the narrative matters, and this content control option wants you to think these subjects are out of line. But that doesn’t mean we should stop discussing these topics or exclude ourselves from reading about them. Taking an “out of sight, out of mind” approach also takes away the space for civil discourse. This is exactly the kind of content we need to see if we want any understanding of what is going on in our world and what we can do to make a difference.

Limitations like these try to force us to turn a blind eye to our own communities and actively take something away from us: an open line of communication, which we need in order to work cohesively and productively.

The new functionality resembles censorship. Historically, internet censorship has been used by dictatorships and abusive governments to control the narrative and hinder freedom of speech. The internet is supposed to be a space for people to connect and share, but there are those with agendas who try to skew narratives in their favor. The question shouldn’t be what we’re censoring, but why. Why don’t we want people to see this, who does it affect, and in what way?

The internet, and more specifically social media, has been a great tool for marginalized communities such as Black, Brown, LGBTQIA+, and Indigenous communities to create their own safe spaces and have their voices heard.

Social media played a crucial role in the response to violence against the Black community, especially after the murder of George Floyd, mobilizing millions of users to contribute to the Black Lives Matter movement. More recently, we witnessed its power during the rise in violence against Palestinians in Palestine and in the support it gathered around the world.

Social media is also a place where many suffer bullying, trolling, and other forms of harassment, yet big tech companies have taken no effective measures to counter online aggression. Instead, Instagram has become a perpetrator of “algorithm aggression,” and this is not the first time. As recently as 2020, Instagram faced controversy over discriminatory features, such as a nudity policy that unfairly censored plus-size women and sexualized nipples in breastfeeding content. Is that sensitive content? Many creators on social media raise awareness about sexual education and womxn’s bodies, and many of them could be heavily affected by the automated Sensitive Content Control feature.

How to Disable Instagram’s “Sensitive Content Control” Feature

Instagram has enabled this feature on everyone’s account as a default setting. While the Sensitive Content Control might help some users filter out content they deem offensive, if you would like to disable the feature, author Frederick Joseph provides a step-by-step how-to below. You can watch it in video format, or keep scrolling to find the instructions written out for you.

  1. First, on your profile page, tap the three horizontal bars in the upper right-hand corner.
  2. Next, select “Settings,” then tap “Account.”
  3. Finally, scroll down to “Sensitive Content Control.” You’ll be presented with a page offering three options: “Allow,” “Limit (default),” and “Limit Even More.” Select “Allow,” and when asked, “Allow sensitive content?” tap “OK.”

Many social and digital activists have taken to their keyboards to denounce the abusive feature and the issues the Sensitive Content Control raises.

Activist Ashlee Marie Preston Speaks Out

Ashlee Marie Preston took to her social media last week to denounce the fact that Instagram deems socio-political commentary, international crises (e.g., the Middle East), and trans, queer, feminist, Black, Brown, and Indigenous content centered on injustice as inflammatory and potentially “not viewer-friendly” or “upsetting.”

I had the honor of chatting with Ashlee Marie Preston about her unique perspective on the Sensitive Content Control. Ashlee Marie Preston is an acclaimed media personality, cultural commentator, social impact strategist, political analyst, and civil rights activist. She is the first trans person in the U.S. to become editor-in-chief of a national publication, and the first openly trans person to run for state office in California. Ashlee Marie was named one of The Root 100’s “Most Influential African Americans of 2017,” profiled as one of LOGO/NewNowNext’s 30 Most Influential LGBTQ Influencers of 2017 and 2018, PopSugar’s Top 40 LGBTQs of 2017, and OUT Magazine’s OUT100 of 2018. She made her TED talk debut in September 2018, and was chosen as one of Coca-Cola’s “Next Generation LGBTQ Leadership” influencers of 2018. In addition, she was honored as one of Essence Magazine’s “WOKE 100” of 2019, made the BlackList100 of 2020, and was a national surrogate for Elizabeth Warren’s 2020 presidential campaign.

Amid the COVID-19 crisis, Ashlee Marie launched YouAreEssential.org, a national fund to benefit grassroots organizations serving vulnerable communities disproportionately impacted by the pandemic. In 2021, the organization began providing educational tools and resources to help grassroots organizations achieve growth and sustainability. In addition, YAE curates experiences centered on radical rest and healing, affording activists, organizers, and community stakeholders the opportunity to become active beneficiaries of their labor.

Here’s what she had to say about the whole Instagram situation.

Muslim Girl: How would you say the sensitive content control affected marginalized communities? For example, the BLM movement, support for Palestine, or the LGBTQIA+ community?

AMP: When it comes to systemic racism and xenophobia, BIPOC identity and experiences are often deemed “sensitive” subjects because of the nature of the violence we endure. So how are social justice movements supposed to build coalitions and move messaging if what’s happening to them is censored? The simple answer is that we aren’t supposed to. Social media operates in the exact same ways as systems of power offline. In fact, the executives of companies like Instagram and Facebook pour millions into lobbyists who will shape laws in their favor. We have the power to disrupt that, and we should before it’s too late.

MG: What kind of impact does this have on you personally? Why is this important to you? Have you ever been a victim, or do you know anyone that has been a victim of algorithm aggression?

AMP: I began using the term “algorithmic aggression” to describe my experiences on Facebook and Instagram, not to imply that AI is inherently “bad,” but to underscore the reality that it is programmed and developed by individuals who don’t look like you or me, and who don’t care about our survival or wellbeing.

As a Black trans woman, every layer of my identity is hyper-sexualized, fetishized, or villainized. So algorithms trained on data sets that don’t accurately provide context for what it means to be someone like me are automatically taught to see me as a danger or a threat. That means I’m disproportionately scrutinized, surveilled, and censored simply for being visible.

That is not an accident. If it were, Instagram and Facebook would be more intentional about diversifying their training data and making sure it reflects not only identity markers, but lived experiences as well.

MG: What do you think is the underlying issue?

AMP: The ultimate beast we’re battling is capitalism. Instagram and Facebook are prioritizing profit over people, and they will block, ban, censor, or mute anyone who stands in the way of the almighty dollar. This is why it’s important to push for policies and legislation that enforce oversight and regulatory governance, break up monopolies, and limit their influence in politics. That’s ironic, considering how much big tech loves to pretend it’s “apolitical.”

Special thanks to Ashlee for taking the time to answer these questions and educate us on this important issue!