Tumblr went missing from Apple’s App Store over the weekend, leaving users with a lot of unanswered questions.
Although the company said it was working on a fix, it didn’t disclose why the app vanished. Now, we’re starting to get answers.
Last week, Tumblr users began reporting issues with the search feature of the app while ‘Safe Mode’ was off. To fix the issue, some deleted the app and then tried to reinstall it.
However, many of them couldn’t find the app within Apple’s store.
The company released a vague statement, explaining it was “still working on the issue with the iOS app.” That led some to speculate Apple removed the app due to inappropriate content, something the tech giant has done to numerous apps before.
Apple’s App Store guidelines require any app that hosts user-generated content to include some form of filter for screening out inappropriate material. With Tumblr’s ‘Safe Mode’ malfunctioning, that explanation seemed plausible.
Tumblr’s latest announcement, made this morning, notes it’s still working to restore the app to the iOS store.
We continue to work to restore our app to the iOS app store. Follow along in our help center for more info: https://t.co/GM4hj6Jkhh
— Tumblr Support (@Tumblrsupport) November 20, 2018
The reason Tumblr’s app went missing
However, Download.com has since learned what really happened, and the speculation turned out to be right.
According to the site, Apple removed the app from the App Store after child sexual abuse material was discovered on the platform.
“We’re committed to helping build a safe online environment for all users, and we have a zero tolerance policy when it comes to media featuring child sexual exploitation and abuse. As this is an industry-wide problem, we work collaboratively with our industry peers and partners like [the National Center for Missing and Exploited Children] (NCMEC) to actively monitor content uploaded to the platform,” a Tumblr spokesperson tells Download.com in a statement.
“Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform. A routine audit discovered content on our platform that had not yet been included in the industry database. We immediately removed this content,” the statement continues. “Content safeguards are a challenging aspect of operating scaled platforms. We’re continuously assessing further steps we can take to improve and there is no higher priority for our team.”