Last Thursday, popular messaging app Telegram and its experimental counterpart, Telegram X, abruptly vanished from the iOS App Store. Though the apps returned within hours, no reason was given for their removal at the time.

Telegram founder Pavel Durov said in a tweet that the apps were removed for containing inappropriate content. "We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store," Durov explained at the time. "Once we have protections in place we expect the apps to be back on the App Store."

What exactly that inappropriate content was remained unclear until now. According to 9to5Mac, one of the outlet's readers emailed Phil Schiller, the Apple executive who oversees the App Store, asking for the reasoning behind Telegram's sudden removal.

Schiller responded by saying the App Store team was "alerted to illegal content, specifically child pornography" within the apps. After verifying the claims, the App Store team worked with Telegram to ban the users involved and pulled the apps from the store until safeguards preventing a recurrence could be rolled out.

The relevant portion of Schiller's response is as follows:

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

As 9to5Mac noted, it's unlikely that the illegal media was detected within regular one-on-one messages, given the privacy-protecting encryption in place for the apps' users. Rather, the content was likely being served via a third-party plugin used by Telegram, which would explain the quick fix on the developer's part.