Brian Tamaki’s video grabs Facebook’s attention

In the video, which was quickly pulled by Facebook, the head of Destiny Church made offensive remarks about Sharia law.

A video posted by controversial Destiny Church leader Brian Tamaki was investigated by police and removed by Facebook following the LynnMall terror attack.

This isn’t the first time Tamaki’s social media has come under scrutiny.

In the aftermath of the Christchurch Mosque terror attacks, Facebook deleted a post in which Tamaki, a fundamentalist Christian, described Islam as a “rapid rampant social invasion”.

The latest incident took place earlier this month, when Tamaki posted a video on his Facebook page referring to a mosque in West Auckland and making offensive remarks about Sharia law.

Ahamed Aathill Mohamed Samsudeen, a Tamil Muslim, injured seven people when he used a knife from LynnMall’s Countdown supermarket to attack his victims.

The terrorist was shot within minutes by officers from the police Special Tactics Group, who had been monitoring Samsudeen around the clock in Auckland.

Police received a complaint about Tamaki’s video, which was posted on Sunday, September 5 – two days after the terror attack.

An investigation found that the video did not meet the threshold for criminal prosecution.

A police spokesperson told Newsroom that the contents of the video had been reviewed and that “although many of the comments were wrong and very hurtful to many, no criminal offenses were identified.”

Police were in the process of removing the video, but Facebook stepped in and proactively removed it first.

It is understood that the video was removed the same day it was posted.

Facebook’s transparency policies define hate speech as “a direct attack against people – rather than against concepts or institutions on the basis of what we call protected characteristics.”

These characteristics are race, ethnicity, national origin, disability, religious affiliation, caste, sexual orientation, sex, gender identity and serious illness.

“We define attacks as violent or dehumanizing speech, harmful stereotypes, statements of inferiority, expressions of contempt, disgust or dismissal, name calling and calls for exclusion or segregation,” according to Facebook.

Between April and June of this year (the most recent statistics available), 97.6% of content that broke hate speech rules was detected by Facebook itself, while the remaining 2.4% was reported by users.

In the days following the LynnMall attack, Police Commissioner Andrew Coster issued a statement urging the public to “exercise caution when receiving unverified information” about the attack on social media platforms.

“Police have been made aware of false information being shared on social media platforms. I urge anyone who comes across this type of information to know that much of what is circulating on social media platforms is either false or inaccurate,” Coster said.

Security Intelligence Minister Andrew Little told Newsroom on Friday that agencies, including the police, were always on the lookout for threats.

“There are a lot of people who will say things and express conspiracy theories, and all of that is difficult to intervene in on a legal basis,” he said.

Deputy Prime Minister Grant Robertson told Newsroom that a number of new protocols and tools had been developed and were being used by police and social media companies in the wake of the Christchurch mosque attacks.

Tamaki did not respond to Newsroom’s request for comment.

