Facebook executive says tech companies need tighter regulation

The tech industry “needs regulation” because it shouldn’t be left alone to make the rules on issues such as harmful online content, a Facebook official said.

Monika Bickert, vice president of content policy at Facebook, believes that “government regulation can set standards that all businesses should meet.”

Her comments come as tech companies and some of their most ardent critics appear before parliament this week to discuss new rules to tackle harmful content online.

Among those who will testify before MPs and peers is Frances Haugen, a former product manager at Facebook who leaked tens of thousands of internal documents.

The documents contain claims that the social media juggernaut knew its products harmed teenagers’ mental health and fuelled ethnic violence in countries such as Ethiopia.

They also allege that Facebook employees repeatedly raised concerns before and after the election, as Trump sought to overturn Joe Biden’s victory with false claims of fraud. According to the New York Times, a company data scientist told colleagues a week after the election that 10% of all U.S. views of political content were of posts falsely claiming the vote was fraudulent.

Writing in the Sunday Telegraph, Bickert said: “While there will no doubt be different views, we should all agree on one thing: the tech industry needs regulation.

“At Facebook, we have advocated for democratic governments to set new rules for the internet in areas such as harmful content, privacy, data and elections, because we believe companies like ours shouldn’t be taking these decisions by themselves.

“The UK is one of the leading countries, with far-reaching proposals on everything from hate speech to child safety, and while we don’t agree with every detail, we are pleased to see the online safety bill moving forward.”

Culture Secretary Nadine Dorries said online hatred had “poisoned public life” and the government had come under pressure to reconsider its upcoming online safety bill in light of the death of MP Sir David Amess in his constituency.

Dorries said Amess’s death might not have been prevented by a crackdown on online abuse, but it highlighted the threats people were facing.

There have been calls for social media companies to hand over data more quickly and to take down content faster themselves. The bill is also expected to force platforms to stop amplifying hateful content through their algorithms.

Bickert wrote in the newspaper that “once parliament passes the online safety bill, Ofcom will ensure that all tech companies are held to account.”

She suggested that “companies should also be judged on how their rules are applied.”

Facebook has released figures on how it deals with harmful content, including the amount of content viewed and deleted, over the past three years. The firm is also independently audited.

Bickert wrote: “I spent over a decade as a criminal attorney in the United States before joining Facebook, and for the past nine years, I have helped our company develop its rules on what is and is not allowed on our platforms.

“These policies aim to protect people from harm while also protecting freedom of expression.

“Our team includes former prosecutors, law enforcement officers, counterterrorism specialists, teachers and child safety advocates, and we work with hundreds of independent experts from around the world to help us find the right balance.

“While people often disagree on exactly where to draw the line, government regulation can set standards that all businesses should meet.”

She said Facebook has a business incentive to remove harmful content from its sites because “people don’t want to see it when they use our apps and advertisers don’t want their ads next to it.”

The prevalence of hate speech seen on Facebook was found to be about five views per 10,000 as its detection improved.

Bickert said: “Of the hate speech we removed, we found 97% before anyone reported it to us, up from just 23% a few years ago. While we still have some work to do, the enforcement reports show that we are making progress.”

Earlier this week, a report found that an international lobby group that spread false and conspiratorial claims about Covid-19 more than doubled its average number of interactions on Facebook in the first six months of 2021.

Pages belonging to the World Doctors Alliance, a group of current and former medical professionals and academics from seven countries, received 617,000 interactions in June 2021, up from 255,000 in January, according to the Institute for Strategic Dialogue.

The World Doctors Alliance includes prominent members who have falsely claimed that Covid-19 is a hoax and that vaccines cause widespread damage.
