The study, conducted between June and July 2020, found that as of early June, as many as 110 of the 182 WhatsApp group links shared with the social media company in 2019 were “still active and operating”.
CPF had conducted two similar investigations in 2019 and found that WhatsApp and Telegram had active groups and channels sharing such material. “Many of these groups that were directly reported to the platform either comprised adult pornography groups or had group icons that were pornographic,” the organisation said in its latest report. “Almost all groups that had a clear CSAM (child sexual abuse material) group icon or description were removed while groups with obscene pictures and names…remain active.”
CPF selected 29 groups from a pool of 1,084 adult pornography groups as part of its research. Of these, 15 groups were found to be disseminating CSAM and were reported to the platform. However, most users in these groups remained active, the report said.
“While none of the groups were removed, it was later reported to the research group that many offending users were banned. 25 of the 29 users who were reported (along with screenshots of them having uploaded CSAM on a group) remain active on the date of issuance of this report,” it said.
The report was first issued on August 30. In an update, Nitish Chandan, who leads the technology, law and policy research group at CPF, said these groups continued to remain active as recently as September 10. “There remain many gaps in reporting and in ban implementation on such content,” he told ET. Telegram did not respond to specific queries on the CPF report, while queries sent to WhatsApp remained unanswered till press time.