FB Removes Misinforming Content In New Measures For, And Ahead Of Ethiopia’s General Elections

Facebook has outlined its election integrity measures ahead of Ethiopia’s June 21st General Elections to curb the spread of untrue information


Facebook has shared details of its ongoing election integrity work in Ethiopia, as the country prepares for general elections scheduled for June 21st, 2021. The measures will continue in the lead-up to, during, and after the voting, and build on Facebook’s longstanding efforts to understand and address the way social media is used in Ethiopia.

The guidelines include the company’s efforts to detect and remove hate speech and content that incites violence, its ongoing work to reduce the spread of misinformation, its efforts to improve digital literacy, and the steps the company is taking to make political advertising more transparent.

Facebook’s election integrity work in Ethiopia includes:

1. Activating Facebook’s Elections Operation Center
Facebook opened its first Elections Operations Center in 2018, ahead of the elections held that year in the United States and Brazil. Since then, Facebook has run operations centers for major elections around the world, including the upcoming elections in Ethiopia.


At the onset of the COVID-19 pandemic, the Elections Operation Center transitioned from a physical to a virtual workspace. However, Facebook is still bringing together subject matter experts from across the company – including from threat intelligence, data science, engineering, research, operations, policy, and legal teams – to enable the company to respond in real time to potential problems and abuses it sees emerging in Ethiopia.

2. Tackling Hate Speech and Other Harmful Content
Facebook’s Community Standards – which set out what is and isn’t allowed on Facebook – cover a number of areas relevant to elections, including policies against harassment and incitement to violence, as well as detailed hate speech policies that ban attacks on people based on characteristics like ethnicity or religion. Facebook removes any content that it’s made aware of that violates these rules.

Facebook has significantly improved and simplified its reporting tools to make it easier for Ethiopians to report violating content when they see it, enabling Facebook to investigate. To further broaden awareness of its policies in Ethiopia, Facebook has run online ad and radio campaigns in the country and held training sessions with activists, civil society organizations, small and medium-sized business owners, government agencies, and members of the local media. The company has also established dedicated reporting channels for specialized international and local human rights and civil society organizations, so that problematic content they identify can be quickly reviewed for possible violations. It continues to work with local partners who provide feedback that Facebook incorporates into its policies and programs.

Alongside these efforts to improve reporting, Facebook also invested in proactive detection technology that helps catch violating content before people report it to Facebook. Facebook is now using this technology to proactively identify hate speech in Amharic and Oromo, alongside over 40 other languages globally.


Over the last few years, Facebook has tripled the size of its global team working on safety and security to over 35,000 people and hired more content reviewers who are native speakers of Amharic, Oromo, and Somali, while also maintaining the capacity to review content in Tigrinya.

These investments are having an impact: between March 2020 and March 2021, Facebook removed 87,000 pieces of hate speech in Ethiopia, about 89 per cent of which were detected proactively.

Facebook is taking additional temporary steps ahead of, and during, the election to reduce the distribution of content and comments that its proactive detection technology identifies as likely containing hate speech, or violence and incitement, while its teams investigate. This content will be removed if Facebook determines that it violates its policies.

In addition to Facebook’s standard practice of removing accounts that repeatedly violate the company’s Community Standards, Facebook is also continuing to reduce the distribution of content posted from accounts that have recently and repeatedly posted violating content in countries facing heightened risks. This means fewer people in Ethiopia will see content from these repeat offenders located in those countries.


3. Combating Misinformation and False News

It’s important for people to see accurate information on Facebook and Instagram, which is why Facebook is working to fight the spread of misinformation on its services in Ethiopia. Facebook removes the most serious kinds of misinformation, such as content that is intended to suppress voting or that could cause violence or physical harm. For content that doesn’t violate these particular rules, Facebook has partnered with independent third-party fact-checking partners in Ethiopia – Pesa Check and AFP – to ascertain whether something is misinformation or false news. When they review and rate a piece of content as false, Facebook reduces its distribution so fewer people see it and adds a warning label with more information for anyone who does see it. In general, when a warning screen is placed on a post, people don’t click past it 95 per cent of the time.

4. Addressing Misleading, Outdated, or Out-of-Context Imagery
Often people try to deceive, abuse, or cause harm by sharing outdated images or news articles that are taken out of context on Facebook. To address this, Facebook has launched tools that notify people when a news article they’re about to share is more than 90 days old. People will also see a message when they attempt to share specific types of images, including photos that are over a year old, warning them that the image they are about to share could be harmful or misleading.

5. Improving the Transparency of Political Advertising
Facebook believes political discussion and debate should be transparent to every voter, which is why over the past few years the company has introduced a number of tools that provide more information about political ads on Facebook and Instagram. In March this year, Facebook made these political ads transparency tools mandatory in Ethiopia. As a result, anybody who wants to run political ads in Ethiopia must now go through a verification process to prove who they are and that they live in Ethiopia.

Facebook then runs additional checks to ensure compliance with its policies. Political ads in Ethiopia will be labeled with a “paid by” disclaimer, so people can see who paid for them. Facebook also puts political ads that run in Ethiopia into its Ads Library so that everyone can see what ads are running, what types of people saw them, and how much was spent. This fully searchable archive will store these ads for seven years. In addition to providing more transparency, earlier this year Facebook also announced that it’s rolling out new controls so that people can choose to see fewer social issue, electoral, and political ads.

When people use these controls, they’ll no longer see ads that run with a “Paid for by” disclaimer. These changes mean that political advertising on Facebook and Instagram is now more transparent than other forms of election campaigning, whether that’s billboards, newspaper ads, direct mail, leaflets, or targeted emails.

6. Supporting Digital Literacy
Finally, Facebook is also investing in digital literacy in Ethiopia by working with local partners. Facebook partnered with the Center for African Leadership Studies to implement “My Digital World”, a series of live webinars through which Facebook has engaged with over 7,000 people in the country on topics such as online safety, privacy, digital citizenship, news, and media literacy. The company also rolled out a media literacy campaign aimed at educating people on how to detect potential false news, and ran billboard advertising campaigns across Addis Ababa – the first of their kind in Africa – focused on informing and educating people on how to stay safe online and use social media responsibly.

Commenting about the work, Mercy Ndegwa, Facebook Head of Public Policy for East and Horn of Africa said, “At Facebook, we are committed to election integrity, we are delighted that our elections integrity efforts in Ethiopia are informed by the conversations Facebook is having with human rights groups, NGOs, local civil society organisations, and regional experts within Facebook. These efforts are being implemented by a team purpose-built to focus on the Ethiopian election. Local understanding is critical to doing this work effectively, therefore the team includes a number of people in and from Ethiopia including experts in topics like misinformation, hate speech, elections, and disinformation.”

Do you have a story that you think would interest our readers? Write to us.