When it was discovered that people were using a specific hashtag on Instagram to facilitate the exchange of child pornography, Instagram hesitated to take action. Teen meme creators took matters into their own hands.

A group of teens operating meme accounts on the platform discovered that users were exploiting the seemingly innocuous hashtag #dropboxlinks to find and share explicit photos of children. According to The Atlantic:

The alleged child-porn-trading users set up anonymous accounts with throwaway usernames or handles such as @dropbox_nudes_4_real (which has since been removed). The accounts that @ZZtails and other memers surfaced, which can also be found on the hashtag, contain blank posts with captions asking users to DM them for Dropbox links, which allegedly contain child porn or nudes.

When they reported the accounts to Instagram, the platform responded that the users had not violated its rules. Dissatisfied with Instagram's inaction, the teen meme creators began spamming the hashtag with unrelated memes to deter the spread of the illicit photos.

Instagram eventually responded to the complaints, blocking the hashtags #dropboxlinks and #tradedropbox and stating:

Keeping children and young people safe on Instagram is hugely important to us. We do not allow content that endangers children, and we have blocked the hashtags in question. [We are] developing technology which proactively finds child nudity and child exploitative content when it’s uploaded so we can act quickly.

Hopefully Instagram will implement this new technology soon. Many users remain unconvinced that the platform is doing enough to combat and respond to the sharing of illicit content.