(Beirut) – Meta should do more to protect the safety of lesbian, gay, bisexual, and transgender (LGBT) people and all its users, in particular on Facebook and Instagram, Human Rights Watch, Social Media Exchange (SMEX), INSM Foundation for Digital Rights, Helem, and Damj Association said in the #SecureOurSocials campaign launched today.
The campaign, building on the work of various human rights organizations, is based on a February 2023 Human Rights Watch report, “‘All This Terror Because of a Photo’: Digital Targeting and Its Offline Consequences for LGBT People in the Middle East and North Africa.” Human Rights Watch examined the use of digital targeting by security forces and its far-reaching offline consequences—including arbitrary detention and torture—in five countries: Egypt, Iraq, Jordan, Lebanon, and Tunisia. The findings showed that security forces use social media platforms, including Facebook and Instagram, to entrap and harass LGBT people, as well as to gather and create evidence to prosecute them.
“As the largest social media company in the world, Meta should be a global leader in making social media safe for everyone,” said Rasha Younes, acting LGBT rights deputy director at Human Rights Watch. “When LGBT people, who already face insecurity offline, use Facebook and Instagram for connection and organizing, they deserve certainty that Meta is doing everything in its power to ensure their security.”
Based on Human Rights Watch research and civil society recommendations, the #SecureOurSocials campaign aims to press Meta to make Facebook and Instagram more transparent and accountable by publishing meaningful data on its investment in user safety, including content moderation, in the Middle East and North Africa (MENA) region and around the world.
LGBT people Human Rights Watch interviewed reported losing their jobs; being subjected to family violence, including conversion practices; having to change their place of residence and even flee their country; and experiencing severe mental health consequences as a result of being targeted online, including on Facebook and Instagram.
Human Rights Watch interviewed dozens of LGBT people who said they had reported harassment, doxxing, outing, and abuse on Facebook and Instagram, but in all these cases, Meta either did not respond to their complaints or found that the reported content did not violate its policies, and the content remained online.
To raise awareness about the issue, Human Rights Watch partnered with Lebanese drag pioneer Anya Kneez to create an explainer video and developed an awareness guide with tips on how LGBT people can stay safe when using social media applications like Facebook and Instagram.
While Meta’s policies and standards prohibit many forms of online abuse, the company frequently falls short in consistently applying these rules on its platforms, Human Rights Watch said. As a result, content targeting LGBT people sometimes remains on Facebook and Instagram even when it violates Meta’s policies, while Meta removes other content, including documentation of human rights abuses.
In a December 2023 report, Human Rights Watch documented various forms of censorship on Instagram and Facebook affecting posts and accounts documenting and condemning human rights abuses and raising awareness in support of Palestine and Palestinian human rights.
#SecureOurSocials provides a variety of solutions for Meta to keep LGBT people safe on its platforms and asks Meta to disclose its annual investment in user safety and security, including reasoned justifications explaining how trust and safety investments are proportionate to the risk of harm, for each region, language, and dialect in the Middle East and North Africa. Human Rights Watch also published a question and answer document that details the campaign’s objectives and recommendations to the company and explains its focus on Meta.
Human Rights Watch has been in discussions with Meta staff about its concerns for months. Before publishing its digital targeting report, Human Rights Watch also sent a letter on February 2, 2023, to Meta’s human rights department, posing specific questions stemming from the research and outlining the report’s findings. Meta declined to provide a written response, though it continued to engage with Human Rights Watch on these issues.
On January 8, 2024, Human Rights Watch sent another letter to Meta to inform the company of the campaign and solicit its perspective.
Social media companies have a responsibility to respect human rights, including the rights to nondiscrimination, privacy, and freedom of expression. They should avoid infringing on human rights. They should also identify and address human rights impacts arising from their services, including by providing meaningful access to remedies, and communicate the steps they take to address these impacts.
When moderating content on its platforms, Meta’s responsibilities include taking steps to ensure its policies and practices are transparent, accountable, and applied in a consistent and nondiscriminatory manner. Meta is also responsible for mitigating human rights abuses perpetrated against LGBT people on its platforms while respecting the right to freedom of expression.
As powerful as social media companies are, governments are the primary duty bearers responsible for protecting human rights, Human Rights Watch said. Governments in the MENA region should respect and protect the rights of LGBT people instead of criminalizing their expression and targeting them online. They should introduce and enforce laws protecting people against discrimination on the grounds of sexual orientation and gender identity, including online.
“Meta has underinvested in user safety and underestimated the role its platforms play in facilitating abuses against LGBT people in the region,” Younes said. “Meta should always be accountable for the security of users on its platforms, but especially when it can protect them from egregious harm.”