Artificial Intelligence (AI) is a profound technological breakthrough, capable of performing many of the same tasks as humans, and sometimes more. AI is becoming prominent in mobile apps, as demand grows for tools like Lensa, a mobile app that generates avatars of a person based on a photo they upload.
While this has been a fun activity for many people on social media, it has caused an uproar in the Muslim community. Many Muslim women who used Lensa reported that the avatars the app generated of them were hypersexualized. Although they uploaded a photo wearing a hijab, many of the avatars generated were hijab-less and immodest. This does not come as a surprise. The Guardian, in partnership with the Pulitzer Center’s AI Accountability Network, issued an exclusive story that confirmed, “AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved.”
Lensa is hypersexualizing women
Lensa uses artificial intelligence to create avatars from an uploaded image. How exactly is this done? First, it’s important to know that artificial intelligence relies on machine learning, a set of algorithms that allows computers to learn from data. Unfortunately, the data being used is curated from social media content, which is gender biased. The datasets do not embody diversity, and most are curated by men. Hence, many artificial intelligence apps are marginalizing women and reproducing societal biases.
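To see how skewed training data leads to skewed outputs, consider a deliberately simplified sketch. This is not Lensa’s actual pipeline (which uses large image-generation models); it is a toy "model" that simply learns the most common label for each group in its training data. The subjects and labels below are invented for illustration only:

```python
from collections import Counter, defaultdict

# Toy training set: (subject, label) pairs standing in for a skewed
# image dataset, mirroring the biased social-media data described above.
# These labels are hypothetical, chosen only to illustrate imbalance.
training_data = [
    ("woman", "sexualized"), ("woman", "sexualized"),
    ("woman", "sexualized"), ("woman", "neutral"),
    ("man", "heroic"), ("man", "heroic"),
    ("man", "heroic"), ("man", "neutral"),
]

def train(data):
    """Learn the most frequent label for each subject group."""
    counts = defaultdict(Counter)
    for subject, label in data:
        counts[subject][label] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

model = train(training_data)
print(model["woman"])  # the imbalance in the data becomes the prediction
print(model["man"])
```

The point of the sketch is that the model contains no explicit rule about gender; the bias comes entirely from the frequencies in the training data, which is the same mechanism researchers point to in real image-generation systems.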
The avatars that Lensa generates are a direct reflection of society’s norms regarding women’s sexuality. Social media platforms encourage the hypersexualization of women. The immodest, hypersexualized Lensa avatars of Muslim women are proof that companies are set on altering women’s physical features to fit society’s standards.
There is a clear bias in AI
Men who used the app did not report the same sexualization. In fact, many men who shared their avatars were depicted as fierce leaders, caped and fully clothed. AI is not isolated from bias, which means that new technological advancements are not designed to protect women’s sexuality and identity. Clearly, the datasets that companies such as Lensa use attest that these algorithms carry a built-in gender bias. Companies need to do better when it comes to guidelines for AI technology.
Have you experienced hypersexualization of your photos on Lensa or a similar AI photo app? You can send in your responses to email@example.com to get featured!