> Last month, Tumblr said its iOS app was pulled from the Apple Store over child pornography found on the platform. According to a Twitter user who claimed to have found the illegal content, Tumblr had initially failed to take down the images, despite repeated complaints about the problem. (As of Monday, the Tumblr app is still unavailable on Apple's App Store.)
I think you can also look at the various discussions of this issue online. The user perception was that it was inordinately difficult to report abusive content.
I also recall an instance of finding problematic content and not knowing how to report it. It didn’t occur to me to think “this person might be underage, better hit the share button”. TikTok, for example, makes it incredibly easy to report offensive, unsafe, or illegal content.
Whether or not Tumblr management _intended_ to have very poor controls against CSAM doesn’t change the fact that they did in practice. And when push came to shove, instead of protecting user expression on their platform, they banned a whole category of content rather than building the tools and policies needed to make it work.
Worst case, they could have, instead of deleting all adult content, removed the toggle for iOS users and made them go to the browser for filtered content.
Even if you hate Apple for censoring their platform, it was Tumblr’s choice to apply that policy to Android and web users.
This response and earlier comments also fail to address the fact that the iOS app had a toggle to block NSFW content. It was allowed in 2017, and it was allowed in 2022. So I’m not sure why they had to purge non-CSAM content and remove the toggle in 2018. Source: https://techcrunch.com/2022/01/11/tumblr-sensitive-content-t...
Is there insider info that challenges any of that?
You're making a ton of incorrect assumptions about what approaches Tumblr tried, what solutions Apple rejected as insufficient, what resourcing decisions were made by Tumblr vs parent company, and so many other critical details here.
It's pretty messed up for you to say "Tumblr just didn’t want to put the effort" when you have no idea who or what was actually responsible for any of the things you're describing. I don't feel you are addressing this topic in good faith, so I'm not going to continue discussing this with you.
You’re saying that from the perspective of the Tumblr people, having been purchased by Yahoo/Verizon, there were limited options. And maybe negotiations with Apple came into play once the CSAM problem was already so bad the app was delisted.
What I’m saying is that from the consumer perspective, it doesn’t really matter who in the company made the choice (Tumblr people or Verizon people), a business choice was made to remove a significant part of the users and use case for the platform.
Similarly by the time Apple banned the app for CSAM, maybe it truly was too late to keep NSFW content on the platform for Apple users, and maybe it was seen by ownership as too expensive to implement a partial solution. Regardless of ownership, failing to invest in effective moderation and flagging tools at every point in Tumblr’s journey was a business choice.
Maybe it was too expensive to keep CSAM off the platform. I don’t see evidence that Tumblr made those investments. An easy reporting option for users is the bare minimum. Maybe Tumblr would have bankrupted themselves policing that content - but choosing to not go down that road is still a business decision and still not something that can realistically be pinned on Apple.
In other words there are two versions of the story:
- Excuse: we had to take down NSFW content platform wide because Apple hates adult content
- Explanation: we didn’t have the resources to police an NSFW platform effectively, so we made a business decision to drop that segment of our business
The second comment on this thread reports that even the moderation that was taken was ineffective. https://old.reddit.com/r/OutOfTheLoop/comments/a2r4h4/whats_...