Meta has announced a series of measures to combat the rise of 'nudify' apps, which use artificial intelligence to create fake non-consensual nude or sexually explicit images. The company reiterated its longstanding rules against such content and detailed new steps to address these concerns.
"We remove ads, Facebook Pages and Instagram accounts promoting these services when we become aware of them," Meta stated. Additionally, the company blocks links to websites hosting these apps and restricts related search terms on its platforms.
Meta is taking legal action against Joy Timeline HK Limited, the entity behind the CrushAI apps, which allow users to create AI-generated nude images of people without their consent. Meta has filed a lawsuit in Hong Kong to prevent Joy Timeline from advertising on its platforms after Joy Timeline repeatedly attempted to bypass Meta's ad review process.
"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta emphasized.
Beyond removing these apps from its own platforms, Meta plans to collaborate with other tech companies through the Tech Coalition's Lantern program. Since March, Meta has shared over 3,800 URLs of violating apps and websites so that participating companies can investigate them.
Meta describes the fight against nudify apps as adversarial, with financially motivated actors evolving their tactics to evade detection. To counter this, the company has developed technology designed specifically to identify such ads, even when the ads themselves contain no nudity, and uses matching technology to remove copycat ads more quickly. It also applies tactics honed against coordinated inauthentic behavior networks to find and disrupt the networks promoting these services.
Meta expressed support for legislation that combats intimate image abuse across the internet. The company is working to implement the U.S. TAKE IT DOWN Act and supports laws that give parents oversight of their teens' app downloads.