Hidden Hate

How Instagram fails to act on 9 in 10 reports of misogyny in DMs

We conducted a series of case studies in partnership with women with large Instagram followings to reveal how Meta, through its continued negligence and disregard for the people using its platforms while churning out record profits, has created an environment where abuse and harmful content are allowed to thrive. This denies those being abused the ability to express themselves freely online.


About

After reporting on public gender-based violence and misogynistic abuse directed at high-profile women through posts, CCDH researchers have turned to an under-studied and even less regulated facet of online abuse: the direct message (DM). This report uncovers a side of Instagram that is often unseen, yet frequently experienced first-hand by women who use social media: harassment, violent threats, and image-based sexual abuse can be sent by strangers, at any time and in large volumes, directly into your DMs without consent, while platforms do nothing to stop it.

Instagram claims that it acts on hate speech, including misogyny, homophobia, and racism; nudity or sexual activity; graphic violence; and threats of violence. But our research finds that Instagram systematically fails to enforce appropriate sanctions and to remove those who break its rules.

CCDH has conducted a series of Failure to Act reports over several years – on platforms’ failure to act on Covid-19 misinformation, identity-based hate, climate denial, and more. This report documents one of the worst failure rates we have ever recorded.

Instagram failed to act on 90% of abuse sent via DM to the women in this study.

Further, Instagram failed to act on 9 in 10 violent threats sent over DM and reported using its tools, and it failed to act on any image-based sexual abuse within 48 hours.

Also, check out our Coalition Letter to Meta demanding action, signed by 27 major civil society groups and activists.