Meta, the parent company of Facebook, Instagram and WhatsApp, has imposed a requirement for political advertisers to disclose any digital alterations made to their advertisements. The decision was announced amid growing concerns about the potential misuse of artificial intelligence (AI) technology, and it aims to combat the rise of deepfakes and misinformation in the lead-up to the 2024 US presidential election.
Global policy to combat deepfakes and misinformation
Under the new policy, advertisers must openly acknowledge when they digitally create or modify political ads that portray real individuals saying or doing things they did not actually say or do. Likewise, if an advertisement features fabricated events or individuals, that must be clearly indicated. Altered footage of real events, digitally generated audio or video depicting fictional occurrences, and even content related to social issues must all be disclosed.
This global policy is set to come into effect in the upcoming year. Notably, Meta recently introduced generative AI tools that permit advertisers to automatically generate new backgrounds, alter text, and adjust ad sizes to fit various formats. However, political advertisers will not have access to these tools, a decision initially reported by Reuters.
The increased utilization of generative AI technology by social media platforms has raised concerns about its potential to disseminate election-related misinformation and disinformation, particularly in the context of the upcoming US election. US President Joe Biden recently issued an executive order directing the commerce department to develop guidance for labelling AI-generated content, including deepfakes, to address issues of “fraud and deception.”
Political advertisers to face penalties for non-disclosure
In September, Google’s YouTube became the first major digital advertising platform to require advertisers to prominently disclose synthetic content depicting real or realistic-looking individuals or events.
In October, US Senator Amy Klobuchar and House member Yvette Clarke wrote a letter to Meta’s CEO, Mark Zuckerberg, inquiring about the company’s efforts to combat threats to free and fair elections.
Linda Yaccarino, CEO of X (formerly Twitter), was also contacted and has engaged with lawmakers on the issue. As of now, however, X has not implemented any policy changes in response to these concerns.
Advertisers who repeatedly violate Meta’s disclosure rules could face penalties. It’s important to note that these requirements do not apply to alterations that are inconsequential or immaterial to the claims made in the advertisement, such as simple image cropping or sharpening.