Social media platforms have been under increasing scrutiny in recent months for their failure to police content effectively. The latest example comes from Twitter, where several major advertisers have pulled their ads after learning that they were appearing alongside tweets soliciting child pornography. This is a serious problem, and one that CMOs need to be aware of.
Ecolab, Dyson, and Mazda are just a few of the companies that have suspended their marketing campaigns or removed their ads from parts of Twitter because their promotions were appearing alongside tweets soliciting child pornography.
Twitter has long been plagued by fake accounts and bots, which exist for the sole purpose of spreading spam or posting inflammatory content. In recent months, the platform has taken steps to crack down on these accounts. Despite those efforts, however, child pornography appears to have become rampant on the platform.
According to a new report from Ghost Data, ads for at least 30 brands have appeared on the profile pages of Twitter accounts that contain links to exploitative material.
The brands range from Walt Disney Co, Comcast Corp’s NBCUniversal, and Coca-Cola Co to a children’s hospital. This is a huge problem, not just for the affected companies, but for society as a whole.
It’s still unclear exactly how the ads ended up on these accounts.
It’s possible that the advertisers were targeted by scammers who set up fake accounts in order to collect payouts for clicks on the ads.
It’s also possible that the ads were placed manually by someone working for the advertiser who wasn’t aware of the nature of the content. Either way, it’s unacceptable.
This is a serious issue, and one that could damage Twitter’s reputation irreparably if not addressed quickly and effectively. CMOs need to understand the brand-safety risks of advertising on platforms like Twitter, and to weigh honestly whether the benefits outweigh those risks.
In a statement to Reuters, Twitter said it was “sorry” for the situation and is taking steps to prevent it from happening again in the future.
Twitter also said that it is working on evolving its policies so that this type of content will be flagged more easily. However, some are skeptical about how effective these measures will be, given that Twitter has been struggling with moderating offensive content for years.
As a corporate advertiser, it is your responsibility to be aware of the content your ads are appearing next to. A recent Reuters review found that some tweets included keywords related to “rape” and “teens” and appeared alongside promoted tweets from corporate advertisers. In one example, a promoted tweet for shoe and accessories brand Cole Haan appeared next to a tweet in which a user said they were “trading teen/child” content.
“We’re horrified,” David Maddocks, brand president at Cole Haan, said after learning that the brand’s ads had appeared next to such content. “Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.”
This comes only months after child-safety advocates blasted Twitter and lined up to support a lawsuit alleging that the social network declined to remove videos depicting the sexual exploitation of minors despite pleas from a victim and his family. The lawsuit, filed in federal court in San Francisco, claims Twitter “knowingly profited from child sex abuse” by allowing the videos to be disseminated on its platform.
The lawsuit was brought by the parents of two minors who were sexually abused and whose abuse was captured in videos shared on Twitter. The parents allege that Twitter knew about the videos but refused to take them down despite multiple pleas from the family. The family is seeking damages for negligence and intentional infliction of emotional distress.
The National Center for Missing and Exploited Children, which works closely with the federal government to fight the sexual exploitation of children, supports the lawsuit because of Twitter’s alleged refusal to take down the videos despite the family’s pleas. In an email to The Examiner, the NCMEC said, “The facts in this case are especially egregious because the electronic service provider was aware of the child victims’ graphic sexual images and refused to remove the videos from the platform.”
This latest news is sure to give pause to any advertiser who is considering running ads on Twitter. After all, no company wants to be associated with tweets about rape or pedophilia, no matter how accidental the placement may be. In the age of social media, where one misstep can quickly turn into a full-blown PR disaster, advertisers have to be extra careful about where their ads appear.