Introduction
Facebook is today’s largest social media platform, with the largest number of active users of any such platform. According to the latest statistics for this year (2023), Facebook has 2.966 billion monthly active users around the world, meaning that about 37% of the world’s population uses it. If we consider that China (the world’s second most populous country) blocks Facebook within its territory, the proportion of the platform’s users to the population of the remaining countries is, in fact, considerably higher.
With such magnitude of size comes a comparable magnitude of impact, which, unlike size, is difficult to capture in precise figures but can be estimated roughly. Whatever approach is used to measure the extent of Facebook’s influence, and perhaps its power, it is certain that if we view it as a media outlet, it is the most important and most influential one not in a single country but in most countries of the world, and not only in our time but throughout history.
Nearly three billion people, as well as private and governmental bodies of all kinds, use the platform to exchange ideas, opinions, accurate or inaccurate information, true or false news, and text, audio, and video material in a huge number of the world’s languages. Meta, the company that runs the platform and has the power to allow or block content published on it, wields unprecedented influence over communication between people and over the movement of data and information, influence that no party of any kind has ever held before. It also has influence over the extent to which the billions of people who use the platform enjoy their basic rights and freedoms, foremost among them the right to freedom of expression and the right to privacy.
This degree of influence remains unprecedented even when compared to states that have the ability to block a huge portion of the Internet as a whole, or to block Facebook’s services altogether. Such a detailed level of control over the dissemination and exchange of expressive content can only be compared to that of other social platforms, which remain smaller and less influential than Facebook.
The real-world impact of Facebook’s moderation of content on its platform is undeniable. It can be seen in the civil war in Ethiopia, in US presidential and congressional elections, and in the spread of misconceptions about the Covid-19 pandemic.
It is no surprise, then, that Meta is the company at the center of an endless number of issues stirring global public opinion, a subject of behind-the-scenes discussions among governments around the world, and a topic of debate in legislative assemblies, especially the US Congress and the European Parliament.
The question on everyone’s mind is: how can Facebook (and the other very large online social media platforms) moderate the content posted on it? This question has necessarily preoccupied Facebook’s own officials, including Mark Zuckerberg, founder and CEO of Facebook and then of its current parent company, Meta, which, in addition to Facebook, runs another platform, Instagram. The idea Zuckerberg came up with, perhaps to pre-empt states regulating his platforms against his will, was to create an independent entity whose impartiality people could reasonably trust, and to give it the final say on high-profile problems with the company’s moderation of content on its platforms. The idea, put forward by Zuckerberg in November 2018, the year in which the crisis over Facebook’s role in the ethnic cleansing in Myanmar erupted, developed over the next two years and by the end of 2020 had become a practical reality in the form of the Oversight Board.
The Oversight Board is the subject of this paper, which raises a set of questions about it: Why did Facebook establish this board? How was it created? What are its competencies, powers, composition, working methods, and legal form? What impact does the Board have on Facebook’s content moderation policies and, through them, on the right to freedom of expression and the right to privacy? What is its impact on Facebook’s politically biased policies on issues such as the Palestinian-Israeli conflict and resistance to occupation in the Occupied Palestinian Territories? Finally, to what extent can this type of entity play a real role in solving the thorny and complex problems of content moderation on social media platforms?
Why did Facebook create the Oversight Board?
The question of why Facebook established the Oversight Board has two meanings. In the first sense, it refers to the circumstances that led Facebook (later Meta) to need such a board. In the second, it refers to the goals the company seeks to achieve by establishing it. The following two sections attempt to answer the question in both senses; the third section reviews the steps taken to establish the Board.
Facebook in crisis
In 2018, Facebook faced sharp criticism focused on its failure to deal with an outbreak on its platform of hate speech directed against the Rohingya Muslim minority in Myanmar. The significant role of speech published on Facebook in the escalation of violence that amounted to ethnic cleansing, according to a United Nations report, was evident beyond denial. Facebook later admitted its responsibility for failing to moderate the content on its platform. The crisis prompted an investigation by a committee of the US House of Representatives, which summoned Mark Zuckerberg, founder and CEO of Facebook, to appear before it in April 2018 and questioned him about his company’s failure to stop hate speech directed against the Rohingya minority in Myanmar.
Towards the end of the same year, in November 2018, Zuckerberg published a note titled “A Blueprint for Content Governance and Enforcement.” This note is the founding document behind the idea of establishing an Oversight Board. In it, Zuckerberg again admits that Facebook was too slow to start dealing with the hate speech rampant on its platform, which helped escalate violence against the Rohingya minority in Myanmar to the point of ethnic cleansing.
The Myanmar crisis and Facebook’s role in escalating it was not the only crisis the company faced. It was obvious that governments and their legislative bodies had begun to sense the threats posed by leaving social media platforms to operate without any regulatory oversight. Calls to regulate the work of social media platforms, and in particular their moderation of the content published through them, multiplied and rose to the highest levels of the executive and legislative authorities.
The idea of creating an Oversight Board was thus, in part, a pre-emptive step: an effort by Facebook (Meta) to build for itself a credibly independent regulatory framework to oversee the operations of its two social media platforms, Facebook and Instagram.
A search for legitimacy and a disclaimer of responsibility
Facebook’s Transparency website published an article titled “Creating the Oversight Board.” In it, the company says that the simple idea behind creating the Board is that “Meta should not alone make so many important decisions about freedom of expression and safety.” This explicit statement is repeated in many documents issued by Facebook and later by Meta, and specifically by Mark Zuckerberg. It asserts that having the final say in content moderation, a process that affects billions of users who create content daily on Facebook, in a huge number of languages and from a wide range of cultural backgrounds, exceeds the technical capabilities of the company. The final word should therefore rest with those who are more knowledgeable, specialized, and experienced in judging what should be allowed to be published and what should be withheld. There is also an indication that the possible consequences of some of these decisions, and their effect on the balance between respecting the right to freedom of expression and protecting people’s safety, may be so great that the company should not bear sole responsibility for them.
In the note mentioned above, Zuckerberg stresses the need to “build a sense of legitimacy” into the mechanisms used to enforce the rules governing content moderation. Such legitimacy requires that the supervisory authority that has the final word at the end of the appeals process against decisions to remove or retain content enjoy a sufficient degree of independence from the management structure of the company that owns the platform.
The establishment of the Oversight Board
The creation of the Oversight Board took about two years from the publication of Zuckerberg’s note in November 2018 until the Board became operational in October 2020. During this period, Facebook took several steps:
- On January 28, 2019, Facebook released a first draft of the Oversight Board charter, including more details about the company’s conception of the Board’s composition. The announcement of the draft stated that the company had conducted preliminary consultations and discussions before putting forward the Board’s basic scope and structure. Decisions not yet made at that time concerned the proposed number of Board members, the length of their terms, and the mechanism for selecting the cases to be examined. The announcement also said that, over the following six months, the company would hold workshops around the world to which it would invite experts and organizations working in the fields of “freedom of expression, technology and democracy, procedural fairness and human rights.”
- According to a subsequent announcement by Facebook, the company sought the opinions of its critics and supporters alike. The announcement said that the company held workshops and round tables for consultation on a global scale in 88 countries, in which more than 650 people participated.
- The Oversight Board Charter, the main document regulating the establishment and work of the Board, was issued in September 2019, accompanied by a letter from Mark Zuckerberg explaining the purpose of establishing the Board and the goals it would seek to achieve.
- On January 28, 2020, Facebook announced the first version of the Board’s Bylaws and published a diagram showing the process for submitting appeals against Facebook’s content removal decisions to the Board for review and decision. At the same time, the company announced the appointment of the Board’s first Director.
- On May 6, 2020, Facebook announced the appointment of the Board’s first cohort of 20 members. The announcement said the new members represent a wide range of perspectives and past experiences, come from 27 different countries, and speak at least 29 languages. It added that the process of nominating and appointing new members would continue until the target of 40 Board members is reached, and confirmed that members contract directly with the Board, are not employees of Facebook, and cannot have their work terminated by the company. After the members were appointed and trained, the Board began its work in October of the same year (2020).
Oversight Board: Formation, Competencies, and Operation
The two main documents governing the formation of the Board, its competencies, and the way it works are the Board’s Charter and its Bylaws, which also include the code of conduct that members of the Board must abide by. In addition, there are two complementary documents: the Book of Rules and the General Standards.
Board bodies and composition
At the top level there are three entities with different roles in establishing and managing the Oversight Board. Firstly, there is the Trust through which the Board is funded. This Trust is administered by Trustees consisting of a chairperson and four members.
The second major entity is the Oversight Board LLC, a limited liability company set up by the Trust. The company’s responsibility is to handle administrative and logistical matters for the Oversight Board, including providing the Board with a headquarters and with staff to assist in its work and perform the administrative functions it needs.
Finally, the third main entity is the Board itself, which consists of no fewer than eleven and no more than forty members.
Board membership
A Board member’s term is three years, renewable for a maximum of two additional terms; the Trustees decide whether to renew the membership of members whose terms have expired. When the Board was first constituted, seats were deliberately filled in stages so that members’ terms expire on different dates. At any given time, therefore, replacements are needed for only a limited number of members, ensuring continuity in the Board’s work as longer-serving members continue alongside new ones.
A group of co-chairs is drawn from the Board’s members (currently three), and those chosen act as the link between the Board, the Trustees, and the LLC. They also lead the work of the Board’s committees and bear administrative responsibilities, including selecting new members from among the candidates to fill vacant seats on the Board, and choosing the cases the Board will consider from among those submitted by Facebook and Instagram users or referred by Meta for review.
When the Board was first formed, Meta selected the group of co-chairs, who then, in cooperation with the company, chose the remaining members. In all cases, the Trustees formally appoint new members, and they alone have the power to terminate a member’s membership, and only if the member violates the code of conduct attached to the Board’s Bylaws. No member’s membership may be terminated on account of any Board decision he or she participated in issuing.
Competencies of the Oversight Board
The Board’s Charter specifies the primary purpose of its establishment as protecting the right to freedom of expression. The Board is meant to achieve this purpose by issuing decisions about significant content. Content is considered significant if Facebook’s decisions about it touch on one of the main issues related to freedom of expression, on hate speech directed at groups vulnerable to persecution, or on one of Facebook’s controversial policies, meaning that the treatment of this content reflects a settled policy or a consistent pattern applied to a large volume of content published daily on the platform. Under Facebook’s commitments, its obligation to implement the Board’s decision on any piece of content extends to amending the company’s previous decisions on similar content or content to which the same standards apply. In addition to issuing these decisions, which are binding on Facebook, the Board may issue advisory opinions on Facebook’s content moderation policies. These opinions are not binding on Facebook, but the company is obligated to respond to the Board and clarify what actions it will take regarding them.
The competencies and powers of the Board, as set out in the Charter, are:
- The Board may ask Facebook/Meta to provide any information the Board deems necessary for its deliberations, within an appropriate period and in a transparent manner.
- The Board has the right to interpret the Facebook Community Standards in light of the core values formulated by Meta.
- The Board may issue binding decisions directing Facebook/Meta to allow or delete content.
- The Board may direct Meta to approve or change the designation of content, which commits the company to applying the content moderation rules and standards that correspond to that designation.
- The Board shall promptly issue written explanations of its decisions when they are taken.
- The Board may provide advisory opinions on the company’s content moderation policies, either within a published decision on a specific case or upon a request from Meta.
Oversight Board operations
Examining requests for review
The Board receives requests to review content moderation decisions made by Meta on Facebook and Instagram. Most content moderation decisions on either platform are made by automated algorithms, with human reviewers handling the rest. In many cases, the decision originates from automated scanning of posts, in which an algorithm determines that a post violates the rules of Facebook or Instagram. The post is then hidden by blocking access to it, and a penalty is imposed on the user who posted it, usually a temporary suspension for a specified period. In other cases, the decision is made after a user submits a request to remove a post, identifying the platform policy they believe it violates. In a third type of case, a decision to delete a given piece of content is applied to posts that reshare it or share similar content.
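To make the routes just described easier to follow, the minimal Python sketch below models them as a single data structure and a single illustrative decision step. The type names, fields, and the seven-day suspension are hypothetical placeholders chosen for illustration only, not a description of Meta’s actual systems.

```python
from dataclasses import dataclass
from enum import Enum, auto


class DecisionSource(Enum):
    AUTOMATED_SCAN = auto()   # an algorithm flags the post as violating
    USER_REPORT = auto()      # another user reports the post, citing a policy
    MATCHED_CONTENT = auto()  # the post reshares or resembles already-removed content


@dataclass
class ModerationDecision:
    post_id: str
    source: DecisionSource
    violated_policy: str      # e.g. "hate speech" (example label only)
    removed: bool
    suspension_days: int      # temporary penalty imposed on the posting user


def moderate(post_id: str, source: DecisionSource, violated_policy: str) -> ModerationDecision:
    """Illustrative only: hide the post and apply a temporary suspension."""
    return ModerationDecision(
        post_id=post_id,
        source=source,
        violated_policy=violated_policy,
        removed=True,
        suspension_days=7,  # placeholder duration; real penalties vary
    )
```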
The grievance system against removal decisions, or against the failure to respond to a removal request, allows the affected user to request a review of the decision. The decision issued as a result of this re-review is final as far as the platform’s internal systems are concerned; with it, the user exhausts the grievance channels that Facebook or Instagram provide through the platform itself.
With the establishment of the Oversight Board and the start of its work in October 2020, a last resort now exists, akin to a court of cassation or a supreme court in judicial systems. The affected user can ask the Board to review the platform’s final decision, whether it removed their post or declined to act on their request to remove another user’s post. The second source of content the Board considers and decides on is Meta itself, which can ask the Board to review a decision made by one of its algorithms or one of its employees. The cases Meta refers to the Board usually relate to an issue that has aroused public interest and drawn widespread criticism of the company. In exceptional cases, Meta can ask the Board to expedite one of the cases it submits, without exhausting the time limit the Charter sets for examining cases under normal circumstances; such cases concern content whose retention or deletion could have immediate real-life consequences. In all cases, the co-chairs have the right to accept or reject such a request.
In one of Meta’s clarifications about the Oversight Board, the company said it has established a system for prioritizing the types of content it will submit to the Board, based on two criteria: significance and difficulty. Significance relates to the expected direct impact of the content on real life. It is determined by severity, meaning the content involves a threat to the life or safety of a person or group; by reach, meaning the content is expected to reach or affect a large number of people or is trending on Facebook; or by relevance to public discourse, meaning it is likely to generate a large-scale public debate. Difficulty means that the decision about the content puts the moderation policies or their enforcement into question, because there are strong arguments for both deleting the content and retaining it. Difficulty is determined by contention, where there is disagreement about the soundness of the decision or of the Facebook policies that support it; by doubt, where it is uncertain whether the decision or the content is compatible with Facebook’s policies; and by competition, where there is tension between two or more of Facebook’s values of equal importance, each of which would lead to a different outcome.
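As an illustration only, the two criteria can be restated as simple boolean checks, as in the Python sketch below. The field names, the or-combinations within each criterion, and the assumption that a case must satisfy both criteria to be referred are interpretive choices made for clarity, not a reproduction of Meta’s actual prioritization system.

```python
from dataclasses import dataclass


@dataclass
class CandidateCase:
    # Significance factors: expected direct impact on real life
    severe: bool            # threat to the life or safety of a person or group
    wide_reach: bool        # expected to reach or affect many people, or trending
    public_discourse: bool  # likely to generate large-scale public debate
    # Difficulty factors: the decision puts policy or enforcement in question
    contention: bool        # disagreement over the decision or the supporting policy
    doubt: bool             # uncertainty whether the content fits the policy
    competition: bool       # tension between equally weighted platform values


def is_significant(case: CandidateCase) -> bool:
    return case.severe or case.wide_reach or case.public_discourse


def is_difficult(case: CandidateCase) -> bool:
    return case.contention or case.doubt or case.competition


def refer_to_board(case: CandidateCase) -> bool:
    # Assumption: both criteria must hold for a case to be prioritized.
    return is_significant(case) and is_difficult(case)
```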
When the Board receives a request for review, the co-chairs consider it and decide whether to accept it. If they decide to accept the case, they form a panel from among the Board’s members; the Charter stipulates that at least one panel member must come from the region to which the content relates and be fluent in the language used in it. In most panel formations so far, the co-chairs have also been keen to ensure that the panel is gender diverse. The membership of the panel for each case is confidential, although the membership of the Board itself is public, in order to allow panel members to reach their decisions impartially and without external pressure of any kind, and to protect them from any possible consequences of their decisions, many of which may touch on sensitive social or political issues.
Under the Board’s current Bylaws, which have been amended several times, a panel has 90 days from the start of its work to finish examining its case. The Charter requires panel members to try to reach a decision by consensus; if that is not possible, the decision is issued by a majority of the panel’s votes, and in that case the published text of the decision may include the opinions of those who voted against it, so that their point of view is presented and their voices heard by the public. Once a decision is reached, it is circulated to all Board members before it becomes final and before it is published; if a majority of the Board decides to reconsider the decision, another panel is formed to re-review it within a shorter period, and that panel’s decision is final.
The Board may include advisory opinions and recommendations for Meta in its decisions, and in all cases it publishes its decisions immediately upon issuing them. In practice, the Board also publishes a list of the cases it has selected for review and allows the public, organizations, and other parties to submit comments on a case, which the panel may draw on in reaching its decision. The applicant is also allowed to submit a written explanatory statement. In all cases, the Board’s decision is binding on Meta, which must implement it either by keeping the content in question or by deleting it. Meta is not, however, obligated to implement the recommendations attached to the decision, although it must provide the Board with a detailed response to them.
Advisory opinions and recommendations
As explained in the previous section, the Board may include recommendations for Meta in its decisions. It may also issue opinions and recommendations to the company on its own initiative, independently of its decisions. In both cases, these opinions and recommendations relate to the company’s content moderation policies, whether they concern the rules for classifying content under any of the company’s policies on unacceptable content, the exceptions to those policies, or their implementation.
Unlike the Board’s decisions on deleting or retaining content, or on designating content under classifications that entail its deletion or retention, which are binding on Meta and must be implemented, the Board’s opinions and recommendations on Meta’s content moderation policies are not binding on the company. The Charter does, however, oblige the company to provide adequate responses clarifying its position on each of them. The positions the company can take are limited to three alternatives: accepting the recommendation and implementing it; agreeing to assess the feasibility of implementing it, in which case, as with acceptance, the company commits to providing periodic updates on the progress of implementation or assessment; or rejecting the opinion or recommendation. In all cases, the company must provide explanations and justifications for its response.
Statistics on reviews submitted to the Board
The Oversight Board’s quarterly report covering the last quarter of 2022 (October-December) provides several statistics on the review requests submitted to it by users of Facebook and Instagram, as well as on the extent of Meta’s cooperation with the Board. Among the notable figures, the report states that the Board received 193,137 cases from Facebook and Instagram users in the last quarter of 2022, a 29% decrease from the number of cases received during the previous quarter (July-September 2022). In total, from the start of its work in October 2020 until the end of December 2022, the Board received about two and a half million cases from users of the two platforms.
In terms of the regions from which cases submitted to the Board came, North America (the United States and Canada) leads with 47% of cases, followed by Europe at 22%, Latin America and the Caribbean at 13%, and Asia and the Pacific at 11%. Notably, the Arab region (Middle East and North Africa) and sub-Saharan Africa share the remaining 7%, a very small percentage relative to the population of the two regions and the number of Facebook users there.
The most frequent categories of violations on whose basis Meta deleted content that users then asked the Board to review were violence and incitement at 42%, hate speech at 23%, and adult nudity and sexual activity at 10%. Notably, the share of cases related to violations of Facebook’s policy on so-called dangerous individuals and organizations doubled over 2022, from 4% at the beginning of the year to 8% at its end. This controversial and, until recently, little-known policy has been a subject of contention between the Board and Meta.
The most frequent categories of violations cited by users who requested the removal of content and then took their grievance to the Board were hate speech at 27%, bullying and harassment at 24%, and adult nudity and sexual activity at 13%.
As for cases submitted by users from the Arab region specifically to restore deleted content, the most frequent categories were incitement to violence at 30%, hate speech at 26%, dangerous individuals and organizations at 21%, and adult nudity and sexual activity at 9%. For grievances against the failure to respond to requests to delete content, the most frequent categories were hate speech at 42%, bullying and harassment at 15%, and adult nudity and sexual activity at 7%.
Oversight Board’s relationship with Meta and its policies
In the last quarter of 2022, the Board issued five decisions, upholding Meta’s decision in two cases and overturning it in three. During the same period, the Board published one policy advisory opinion on one of Meta’s content moderation policies, the so-called cross-check program, under which content published by high-profile users receives additional care and special review. The opinion examined this policy in light of Meta’s human rights commitments and its stated values, and raised a number of questions about how Meta treats its most influential users.
The Board submitted 101 questions to Meta about the decisions published in the last quarter of 2022; it received full answers to 90 of them and incomplete answers to six, while Meta did not answer five. In general, however, the report says that Meta provided the Board in that quarter with more information about its approach to content moderation at scale, and shared with it technical information about its operations in more detail.
Among the policies about which, according to the report, Meta provided more information is a controversial and, until recently, unknown policy of great importance: the country-tier policy, under which Meta allocates disparate content moderation resources to different tiers of countries. Another matter of great importance on which the report says the Board received more information is the company’s relationship with law enforcement institutions in different countries.
Regarding the recommendations the Board made to Meta about its content moderation policies and their implementation, the report states that the Board issued 12 recommendations in the last quarter of 2022. Meta said it implemented four of them fully and four partially, is considering the possibility of implementing three, and declined to take any action on one.
Overall, the report says that Meta has committed to implementing, or has already implemented, most of the 140 recommendations the Board has submitted to it since it began work in October 2020. The Board has verified through published information that 24 of these recommendations have been fully implemented; it estimates that 11 others have been partially implemented, while Meta says it has made progress in implementing 53 more, which the Board is still verifying.
Some issues pertaining to the Arab region
Endorsement of Meta’s reversal of its deletion of content related to the Palestinian resistance
On September 14, 2021, the Board upheld Facebook’s reversal of its earlier decision to delete content in which an Egyptian user had reshared a story from the Al Jazeera channel’s page featuring a picture of the official spokesperson of the Izz al-Din al-Qassam Brigades, the military wing of the Islamic Resistance Movement, Hamas. Facebook had removed the content under its policy on dangerous individuals and organizations, and reversed the removal when the Board decided to select the grievance for review. In its decision, the Board held that deleting the content did not reduce any real-world risk while it restricted the right to freedom of expression on an issue of public interest.
The Board attached a set of recommendations to its decision, the most important of which was that Meta should engage an independent party, unconnected to either side of the Palestinian-Israeli conflict, to examine whether Facebook’s moderation of content in both Arabic and Hebrew, including its use of algorithms, has been carried out without bias. Facebook responded to this recommendation and commissioned an independent company to prepare a report on the extent to which Facebook’s content moderation practices are biased against Arabic content and content supporting the Palestinian people’s right to resistance in general.
Approval of maintaining content depicting violence against civilians in Sudan
On June 13, 2022, the Board published a decision upholding Meta’s decision to keep up a Facebook post depicting violence against a civilian in Sudan. The Board based its decision on the fact that the content raises awareness of human rights violations and is of great public-interest value. It added that Meta’s reliance on the newsworthiness allowance to retain the content is insufficient, because escalating content for newsworthiness review imposes requirements that do not apply to a large amount of content that deserves to be kept for other reasons. The Board therefore recommended that Meta create an explicit exception to its policy of deleting content on the grounds of violence, so that content documenting human rights violations is exempted, especially when it comes from countries whose regimes are accused of systematically violating those rights.
What is the Oversight Board’s impact on the right to freedom of expression and the right to privacy?
It should first be noted that several features of the process of establishing the Board, before it began its work, can serve as indicators of how Meta perceives the Board’s relationship with human rights in general and the right to freedom of expression in particular:
- Facebook was among the companies that voluntarily announced their adoption of, and commitment to, the United Nations Guiding Principles on Business and Human Rights. The effect of this on the work of the Oversight Board is that these principles became a major reference on which the Board relies in its decisions and recommendations on Facebook’s content moderation policies.
- Among the parties Facebook consulted during the preparations for establishing the Board were a large number of human rights organizations.
- Facebook’s initial selection of four members to launch the Board and help it choose the remaining members included a number of human rights practitioners. In particular, Facebook chose a human rights defender, Thomas Hughes, former executive director of ARTICLE 19, an organization concerned with the right to freedom of expression, to be the Board’s first Director.
- Facebook commissioned a specialized firm to prepare a report on the Board’s relationship with human rights, including recommendations on working rules that would support the Board’s commitment to them.
Secondly, we should understand the limits of the Board’s influence on Facebook’s content moderation decisions and policies, and through that, the limits of its ability to affect users’ exercise of their rights on the platform in general. The Board has real power over the cases brought to it in which it chooses to issue decisions. Analyzing the decisions the Board has issued to date, its commitment to the right to freedom of expression and the right to privacy is quite clear. This means that, in these particular cases, the Board has a direct impact on the enjoyment, by the users associated with and affected by the content, of their right to freedom of expression and their right of access to information. The broadest scope of influence lies in the Board’s recommendations concerning Facebook’s general policies, which affect all users of the platform. These recommendations are not binding on Facebook, but so far its record of putting them into practice has been reasonably positive. Some of the recommendations Facebook has adopted have been reflected in its policies, and these can significantly affect the ability of groups subject to systematic discrimination, such as women, the LGBT community, and opponents of authoritarian regimes, to exercise their right to freedom of expression and raise awareness of their issues.
On the other hand, there are no guarantees that Facebook will continue to comply with the Board’s recommendations, including those it has accepted and begun to implement, since nothing obliges it to do so. Several thorny issues also remain, closely tied to politics and to the company’s relationships with governments and their institutions, on which Facebook does not respond adequately to the Board: the special-treatment policy for influential users, the country-tier policy under which Facebook allocates disparate resources to moderating content from each tier, and the secret list of dangerous organizations and individuals.
In conclusion, the Board’s work does have a positive impact, but in its direct form this impact is limited to the number of cases the Board can handle, which as of the last quarter of 2022 did not exceed five cases in three months. In its broader form, the impact can be acknowledged to carry significant weight, but again only if Facebook voluntarily chooses to abide by the Board’s recommendations, something that depends on many factors and is backed by few guarantees.
Is replicating and expanding the Oversight Board’s experiment the solution?
The question that needs to be answered first is: what is the problem we are seeking to solve? This paper argues that the problem lies mainly in the fact that a privately owned entity, whose policies and their implementation are not subject to legislative regulation or to accountability before democratically elected institutions, wields an enormous amount of influence over the ability of a huge share of the world’s population to exercise basic rights and freedoms such as the right to freedom of expression and the right to privacy, and over the protection of their interests, their safety, and perhaps even their lives. In some respects, this exceeds the power of states over their own citizens. If this is the description of the problem, then the Oversight Board does not solve it in any way. The Board itself is, in the end, an entity whose members are not chosen democratically and who are accountable only to the Trustees, who are appointed by Meta.
If the problem, as formulated by governments and decision-making circles in most countries of the world, is the failure of major companies such as Facebook, Google, and Twitter to rein in so-called harmful speech on their platforms, with unacceptable political and security consequences, then the Board’s experiment is not the solution either. It is difficult to imagine that the Board, within the limits of its direct power and advisory capacity, can help Facebook prevent a repetition of its platform’s role in a near-genocidal catastrophe like the one in Myanmar, or the manipulation of an electoral process anywhere in the world, even in the oldest and mightiest democracies. The European Union’s resort to a package of laws regulating providers of digital data-handling services, including social media platforms, is evidence that governments do not believe technology companies can offer a better alternative. This does not mean that laws imposed by states, with effects extending beyond their borders, are an acceptable solution, but they are ultimately more effective than anything entities such as the Oversight Board can achieve.
The possibility of finding a democratic alternative that genuinely and equally protects the rights of social media users remains elusive and requires an approach different from the experiments presented so far. Such an alternative should, first, take into account the nature of the Internet as something that crosses state borders and escapes states’ sovereignty over their territories, and, second, recognize the absence of political and moral legitimacy in any alternative offered by privately owned entities. This means that the alternative cannot be offered by individual states or groups of states, nor by individual companies or alliances between them.
Conclusion
This paper has examined Facebook/Meta’s experiment in establishing the Oversight Board to act as the final arbiter in appeals against the company’s content moderation decisions on Facebook and Instagram. The paper presented the reasons that prompted Facebook to establish the Board and the objectives behind it, described the steps taken to build it, and reviewed the general features of its work to date. It also discussed the impact of the Board’s work on Facebook users’ enjoyment of their basic rights and freedoms, concluding that this impact is mostly positive but limited in its direct form, with no guarantees that it will materialize or persist in its broader, indirect form. As for the experiment’s prospects of addressing the vexing problems of guaranteeing that users of social media platforms can exercise their rights, and of dealing with hate speech and the forms of political and economic exploitation carried out by disseminating misinformation on these platforms, the paper concluded that it cannot in any way be expected to succeed.
In the end, it should be said that the Oversight Board experiment is certainly full of good intentions, including on the part of Facebook itself, but its most successful aspect so far is arguably the publicity it generated, which Facebook sought to exploit as much as possible by widely publishing details of the Board’s establishment, its work, and so on, across a huge number of documents, blog posts, and pages.