“May 14” Statement from Meta – Last Minute Technology News

Meta, the owner of Facebook, Instagram and WhatsApp, announced that it will activate an Election Operations Center to identify potential threats in real time and respond more quickly ahead of the presidential and 28th-term parliamentary elections to be held in Turkey on May 14.

Meta’s written statement provided information about its preparations for the elections on Sunday, May 14.


As election day approaches, the Turkey Election Operations Center, which focuses on detecting potential threats in real time and taking swift action, will be activated. This large-scale Meta initiative brings together engineers, lawyers, researchers and analytics experts who work in different departments of the company.

The Community Standards publicly specify what content is and is not allowed on Facebook and Instagram. They cover a wide range of election-related areas, including policies on harassment and incitement to violence, as well as detailed hate speech policies covering attacks on people on the basis of characteristics such as ethnicity or religion. The statement reminded that content violating these rules is removed as soon as it is detected.


The statement noted that the company is investing in personnel and technology to reduce the spread of false information and remove malicious content from its apps, helping the May 14 general election to be conducted safely and securely. It added that Meta has also worked to improve digital literacy by launching training programs on elections and online misinformation.

The statement announced that, under certain circumstances, false information that could contribute to potential violence or physical harm, or that aims to suppress voting, is removed entirely from Facebook and Instagram, including false claims about voting dates, places, times and methods.

The statement said that Meta has partnered with more than 90 independent third-party fact-checking organizations, including Turkey’s Doğru Payı and Teyit, and that it works to direct people to information from authoritative sources.

The statement said that when the fact-checkers Meta works with rate a piece of content as “false”, its distribution is significantly reduced so that fewer people see it.

It was also stated that people who share, or had previously shared, such content are warned, and that warning labels linking to the fact-checker’s article explaining why the information is unfounded are added to the content.

When content is rated “false”, “partly false” or “altered”, warning labels are added to it and its distribution is reduced so that fewer people see it.
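The labeling rule described above can be illustrated with a small hypothetical sketch. The rating names follow the article; the function, field names and demotion mechanism are invented for illustration and are not Meta’s actual systems or API:

```python
# Hypothetical sketch of the moderation rule described in the article:
# content rated "false", "partly false" or "altered" by a fact-checker
# gets a warning label and reduced distribution; other content is untouched.

DEMOTED_RATINGS = {"false", "partly false", "altered"}

def apply_fact_check(content: dict, rating: str) -> dict:
    """Attach a warning label and demote distribution for flagged ratings."""
    if rating in DEMOTED_RATINGS:
        content["warning_label"] = rating    # label shown to viewers
        content["distribution"] = "reduced"  # fewer people see the post
    else:
        content["distribution"] = "normal"
    return content

post = apply_fact_check({"id": 1}, "partly false")
print(post["warning_label"], post["distribution"])  # partly false reduced
```

The sketch only captures the decision rule the statement describes: which ratings trigger a label and demotion.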


The statement said that when a message on WhatsApp looks suspicious or inaccurate, people are encouraged to double-check it with fact-checking partners such as Doğru Payı and Teyit to confirm its accuracy.

The statement noted that a campaign titled “How to Fight False Information” has been launched to help users detect and act on false news online, and that, as part of this campaign, partnerships have been established with local radio stations to run advertisements explaining how to detect false information and what to do about it.

Referring to viral messages on WhatsApp, the statement reminded that, under a limit introduced last year, any message that has already been forwarded once can only be forwarded to one group at a time.

It added that a similar limit was applied to highly forwarded messages in 2020, and that the number of such messages sent on WhatsApp subsequently dropped by more than 70 percent.
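The forwarding limits described in the two paragraphs above amount to a simple rule: a message that has already been forwarded can only be passed on to one group at a time. A minimal hypothetical sketch of that rule follows; the function name and the default cap for never-forwarded messages are assumptions, not WhatsApp internals:

```python
# Hypothetical sketch of the WhatsApp forwarding limits described above.
# The default cap of 5 for fresh messages is an assumption.

def max_forward_targets(times_forwarded: int) -> int:
    """How many chats a message may be forwarded to in one action."""
    if times_forwarded >= 1:
        # Already-forwarded (and highly forwarded) messages:
        # one group at a time, per the limits in the statement.
        return 1
    return 5  # assumed cap for messages never forwarded before

print(max_forward_targets(0), max_forward_targets(3))  # 5 1
```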


The statement said that advertisers who want to run ads about elections or politics on Meta’s platforms must go through a verification process to prove who they are and which country they live in, and that, in line with transparency principles, labels identifying the advertiser are added to these ads.

It added that additional controls help people see fewer ads about elections and politics, and that users who turn on these controls will not be shown such labeled ads.


The statement emphasized that specialized teams have also been set up to prevent election interference. These teams focus on coordinated abuse, that is, pages, groups and accounts that use sophisticated networks to try to distort public debate.

It was reported that more than 200 such networks around the world have been taken down since 2017 for coordinated abuse, and that the company takes action whenever it detects this kind of activity on its platforms, especially in connection with elections.

The statement said that detected coordinated abuse networks will continue to be monitored and removed, and that details will be shared publicly.
