This week’s revelation that a pornographic video briefly made the “Editor’s Picks” recommendations for the new Twitter app Vine is raising questions about whether the ultra-short-form video-sharing app is at risk for more such content — and whether this could scare users away from the service.
The app, now available for iOS devices, was recently purchased by Twitter. It took just days after the app’s release for videos deemed not safe for work (NSFW) to show up.
“A human error resulted in a video with adult content becoming one of the videos in Editors Picks, and upon realizing this mistake we removed the video immediately,” said Twitter spokesperson Carolyn Penner. “We apologize to our users for the error.”
Was the clip on Vine long enough to damage the brand?
“To their credit, Twitter was quick to react to the ‘human error,’” said Greg Verdino, marketing futurist of Greg Verdino LLC. “While mistakes are inevitable anywhere, not just on new social apps, the fact that adult content made it to the top of the Editor’s Picks points to the need for tighter processes around governance.”
Not Safe For Marketers?
Regardless of whether “human error” was at fault, users don’t need an editor’s pick to find Vine content not safe for work or family. Searches for #nsfw, #porn, and #sex will bring up quite graphic results.
The videos are only six seconds long, but that’s apparently long enough to cause concern for marketers who may have considered using Vine for quick teases about brand products or services.
“Talk about a quickie,” said Josh Crandall, principal analyst at NetPop Research. “What an unfortunate glitch for Twitter’s new Vine app. It’s ironic that for an industry that uses sex appeal to sell, advertisers will get skittish about six seconds of porn. But they will, because the brands they represent must be shielded from the dirty ways of human behavior.
“Twitter, of course, has taken this issue very seriously,” Crandall told the E-Commerce Times. “They will set up new workflows and protections to make certain that a mistake like this doesn’t happen again. They will parade the new technology and ultimately, over time, brands that are more adventurous will take advantage of Vine and take a spin. Companies that cater to male audiences — think GoPro, sports franchises, video games — will lead.”
Six Seconds to Ruin a Reputation
Many individuals have tweeted or posted something they’d later regret, and celebrities and politicians routinely end up in the spotlight for the wrong reasons. Vine’s video stumble, however, could highlight even greater dangers in the pursuit of six seconds of fame.
“It reveals the flaws and vulnerabilities in the way these sites manage content and keep porn segregated from ‘legitimate’ content,” said Greg Sterling, principal analyst at Sterling Market Research. “Having porn be so closely associated with a new site like Vine could scare away brands and other marketers who wouldn’t want to be tainted by the association.
“If this is a one-time event it won’t have a lasting negative impact on Vine, but if porn keeps ‘leaking’ then it could compromise Vine and even impact parent Twitter,” Sterling told the E-Commerce Times.
Things could have been worse from a marketing perspective if Vine had been running traditional display ads.
“In that case, a brand would rightfully be concerned that it would appear that they’re ‘sponsoring’ adult material,” said Verdino. “This harkens back to the early days of social media, when many marketers were concerned about the appropriateness and often even just the quality of the consumer-generated content they’d run adjacent to. And this is the ChatRoulette or Omegle challenge — that display advertisers buy audience through a network and, often unbeknownst to them, find themselves slotted into inventory running opposite NSFW chats.”
Risky Content
While this incident highlights the most obvious example of what could be shown in six seconds, porn isn’t the only subject that could be grounds for concern.
“Porn, spam, hate speech and other inappropriate content are a major inherent risk for businesses and services dependent on user-generated content,” said Billy Pidgeon, senior analyst at Inside Networks. “Filters and real identity requirements can help mitigate the problem to a degree, but human moderation is necessary, and that can be costly.”
Enabling user reporting will help, but problem content can’t become too prevalent, or business partners won’t get involved and large numbers of users will avoid the service. Google already spends considerable effort making sure this sort of content doesn’t slip through on YouTube, but moderating for all sorts of content remains an expense, Pidgeon told the E-Commerce Times.
“It is relatively early in the cycle to be concerned about it, but Twitter is going to have to reassure people this thing can’t happen regularly. Overall you have to make sure that the portal is free of this content.”
The question now is whether one bad clip in the Editor’s Picks will be enough to kill marketing efforts on the Vine, or whether Twitter can rebound.
“Painful yes, but debilitating, not so much,” said Crandall. “The advertising industry flirts with provocative messages all of the time. They push the limits and extend the bounds.
“When companies are dealing with millions of six-second content clips, something like this was bound to happen. It may not happen again soon, but it’s bound to occur more in the future as the fire hose of user-generated content continues to fill new channels of technology.”