Instagram users have told the BBC of the "extreme stress" of having their accounts banned after being wrongly accused by the platform of breaching its rules on child sexual abuse.
The BBC is in contact with three people who were told by Instagram's parent company, Meta, that their accounts were being permanently disabled, only to have them reinstated shortly after their cases were highlighted to journalists.
"I've lost endless hours of sleep, felt isolated. It is horrible, not to mention having an accusation like that hanging over my head," one of the men told BBC News.
Meta refused to comment.
More than 100 people have contacted BBC News claiming they have been wrongly banned by Meta.
Some talk of a loss of earnings after being cut off from their business pages, while others highlight the pain of losing access to years of pictures and memories. Many say it has affected their mental health.
More than 27,000 people have signed a petition accusing Meta's moderation system, powered by artificial intelligence (AI), of wrongly banning accounts and then operating an appeal process that is not fit for purpose.
Thousands of people are also discussing the issue on Reddit forums dedicated to the subject, and many users have posted on social media about being banned.
Meta has previously acknowledged a problem with Facebook Groups, but denied that its platforms were more widely affected.
‘Derogatory and vile’
The BBC has changed the names of the people in this piece to protect their identities.
David, from Aberdeen in Scotland, was suspended from Instagram on 4 June. He was told he had not followed Meta's community standards on child sexual exploitation, abuse and nudity.
He appealed that day, and was then permanently disabled on Instagram, along with his associated Facebook and Facebook Messenger accounts.
David found a Reddit thread where many others were posting that they had also been wrongly banned over child sexual abuse.
"I've lost 10 years' worth of messages, photos and posts due to a completely derogatory and vile accusation," he told BBC News.
He said Meta's handling of the case was "an embarrassment", with AI-generated answers and replies to his questions. He still does not know why his account was banned.
"I've lost endless hours of sleep, extreme stress, felt isolated. It is horrible, not to mention having an accusation like that hanging over my head.
"Although you can speak to people on Reddit, it is hard to go and speak to a family member or colleague. They probably don't know the context that there is a ban wave going on."
The BBC raised David's case with Meta on 3 July, as one of several people who said they had been wrongly banned over child sexual abuse. Within hours, his account was reinstated.
In a message sent to David, and seen by the BBC, the tech giant said: "We're sorry that we've got this wrong, and that you weren't able to use Instagram for a while. Sometimes, we need to take action to help keep our community safe."
"It is a massive weight off my shoulders," said David.
Faisal was banned from Instagram on 6 June over alleged child sexual abuse and, like David, found his Facebook account suspended too.
The student, from London, is embarking on a career in the creative arts, and was starting to earn money through commissions on his Instagram page when he was suspended. He appealed, insisting he had done nothing wrong, and his account was then banned a few minutes later.
He told BBC News: “I don’t know what to do and I am really upset.
"[Meta] is accusing me of a crime that I have never committed, which also damages my mental state and health, and it has put me into pure isolation over the past month."
His case was also raised by the BBC with Meta on 3 July. About five hours later, his accounts were reinstated. He received an identical email to David's, with an apology from Meta.
He told BBC News he felt "quite relieved" when he heard the news. "I am now trying to limit my time on Instagram."
Faisal said he was distressed by the incident, and is now worried that the ban could come up if a background check is ever carried out on him.
A third user, Salim, told BBC News he had also been falsely banned over child sexual abuse violations.
He brought his case to journalists' attention, saying that his appeal had been "largely ignored", that professional accounts were being affected, and that AI was wrongly labelling ordinary people as criminals.
About a week after he was banned, his Instagram and Facebook accounts were reinstated.
What is going wrong?
When asked by BBC News, Meta declined to comment on the cases of David, Faisal and Salim, and did not answer questions about whether it had a problem with wrongly accusing users of child abuse offences.
It does, however, appear to have acknowledged a wider issue in one part of the world.
The BBC has learned that the chair of the Science, ICT, Broadcasting and Communications Committee at South Korea's National Assembly said last month that Meta had acknowledged the possibility of wrongful suspensions for people in his country.
Dr Carolina Are, a blogger and researcher in social media moderation at Northumbria University, said it was hard to know what the root of the problem was because Meta was not being open about it.
However, she suggested it may be down to recent changes to the wording of some community guidelines and the lack of a functioning appeal process.
"Meta often doesn't explain what it is that has triggered the deletion. We are not privy to what went wrong with the algorithm," she told BBC News.
In a previous statement, Meta said: "We take action on accounts that violate our policies, and people can appeal if they think we've made a mistake."
Meta, in common with all big technology firms, has come under increasing pressure from regulators and authorities to make its platforms safe places.
Meta told the BBC it uses a combination of people and technology to find and remove accounts that break its rules, and said it was not aware of a spike in erroneous account suspensions.
Meta says its child sexual exploitation policy relates to children as well as to "non-real depictions with a human likeness", such as art, content generated by AI or fictional characters.
Meta also told the BBC a few weeks ago that it uses technology to identify potentially suspicious behaviours, such as adult accounts being repeatedly reported or blocked by teen accounts.
Meta says that when it becomes aware of "apparent child exploitation", it reports it to the National Center for Missing & Exploited Children (NCMEC) in the US. NCMEC told BBC News it makes all of those reports available to law enforcement around the world.