Facebook is apologizing for what it says is a “bug” that caused people to see the vomit sticker when searching for something to represent “liberals” or “feminism” on its platform. The company attributed the problem to its search algorithm and said it is fixing it.
“We have guidelines for tagging Stickers in place, so this appears to be a bug in our search algorithm which is being corrected and should not appear after today. We apologize if this caused any concern,” said a Facebook Messenger spokesperson in a statement.
Go into @Facebook #stickers and search for #feminism or #liberals. Go ahead, I'll wait. How was this approved?? pic.twitter.com/EDvUyuBigr
— Serena Ehrlich (@Serena) August 31, 2016
Reports surfaced on social media, and were verified by VentureBeat, that searching for stickers related to “liberals” or “feminism” returned only the vomit sticker. Most people likely never noticed, but the episode does call into question how stickers are tagged within Facebook’s system. The company declined to comment further, but since the problem stemmed from its search algorithm, it’s possible the algorithm parses conversations to better understand how people use stickers.
This could all be an isolated incident that Facebook is working to rectify, but in light of the impact the political season has had on the company, it’s another small blemish that leads people to believe something underhanded took place. Earlier this year, the company was heavily criticized after a report alleged political bias in its Trending Topics; naturally, Facebook denied that its human editors were biased. Just last week, it announced that it had stopped using humans to write Trending Topics descriptions.
On the other hand, this whole episode might simply be a case of the company’s efforts to surface relevant stickers going awry. After all, it’s nearly impossible for Facebook’s team to tag every sticker with every conceivable label that would make it easy to find at the right moment, which is likely why the search algorithm comes into play. Still, Facebook admits that this process failed to meet its own tagging guidelines, so what will the company do to improve its checks and balances?
Right now the sticker may seem relatively innocuous, but the larger issue remains: What happens if you search for “African American” or “women” or anything else and something more vulgar or offensive shows up? How will Facebook deal with that?