SAN FRANCISCO - It's been four months since former president Donald Trump was last allowed to post on Facebook, after CEO Mark Zuckerberg said he was banned "indefinitely."

Now the Facebook Oversight Board, an outside group funded and created by Facebook to review the social media giant's thorniest policy choices, has made a decision on the case. It is expected to announce on Wednesday whether Facebook can uphold its suspension of Trump or if it has to allow him back on the site.

The board will announce its decision on this case - its most significant by far - at approximately 9 a.m. Wednesday. The ruling is being closely watched by politicians around the world, as well as social media researchers and other tech companies that similarly banned Trump in January.

Facebook was the first major social media platform to suspend Trump indefinitely in the wake of the Jan. 6 attack on the Capitol, and its controversial decision was met with praise by many critics who believed the company had let him dodge its normal rules and policies. But others decried the decision as "censorship" and said it set a dangerous precedent for how world leaders communicate online.

Facebook, the Oversight Board and Trump have periodically outlined their thinking in blog posts, news releases and on social media. Here's everything you need to know about the upcoming ruling.

What is the Facebook Oversight Board?

The Facebook Oversight Board is a group created by Facebook to which users can appeal important company decisions. Though it is funded by a $130 million trust created by Facebook, the board says it is an independent and neutral third party. Its goal is to review moderation decisions made by the company and decide whether they were "made in accordance with its stated values."

First proposed in a 2018 blog post by Zuckerberg, the Oversight Board is the company's attempt to have an outside authority handle difficult decisions. It formally started deliberating in October 2020 and has also been called "Facebook's Supreme Court," though it has no government affiliation or legal standing. The board is currently made up of 20 people from around the world with expertise in fields such as journalism, misinformation, freedom of speech and extremism, though only 19 are participating in this case. The original goal was 40 members in total, and more will continue to be added.

The board was created to appease critics who thought power over the world's largest social network and its 3.45 billion monthly users (including Facebook, Instagram and WhatsApp) was too concentrated in a group of Facebook executives, specifically Zuckerberg. However, critics say it outsources individual decisions without creating meaningful internal change and shields Facebook from responsibility for difficult decisions.

So far the board has ruled on Facebook moderation decisions around blackface, threats of violence and covid-19 misinformation. It has overturned Facebook's decisions six times, upheld them twice, and was unable to complete a ruling once.

When will we know if Trump is allowed back on Facebook?

The board's decision should be announced publicly Wednesday morning. Facebook first referred the decision to the board on Jan. 21, saying Trump's suspension would remain in place during deliberations. In addition to asking the board to rule on the ban, Facebook asked it for any "observations or recommendations" on how to handle other world leaders on the site.

The board typically has a 90-day window to reach a decision, but it announced April 16 that it was going to need an extension in this case. The decision on Trump's ban will be its most high-profile to date. If the board overturns Facebook's ban, the company will have seven days to unlock and give Trump control of the page. There is no way for it or Trump to appeal the decision.

How does the board's decision work?

First a case has to be referred to the board, either by Facebook itself or through direct submissions from users who disagree with Facebook taking down their content or leaving someone else's up. The board selects a panel of five of its members, including at least one person from the country where the case is based. They are not named publicly so they cannot be lobbied. Members have all gone through training for the job, which is not full time, and approach the decisions as precedent-setting legal cases, even though the process is not part of any legal system.

The panel meets over Zoom, considers Facebook's own lengthy Community Standards bylaws, and consults with outside experts and organizations. The affected account holder can also submit a statement, and there is a public comment period during which anyone can weigh in. The Trump case received more than 9,000 public comments, almost as many as all the board's past cases combined.

The panel tries to reach a unanimous decision, but technically it needs only a simple majority. It then presents its decision to the full board, which can overrule the finding if a majority of board members disagree with it.

In the Trump ruling, the board's decision will have two parts. First it will say whether Facebook can continue to ban Trump or must let him back on the site. That ruling is binding, according to the board's bylaws. But the board also goes further, making broader policy suggestions to Facebook. Those suggestions - which can include asking the company to add policies around issues such as hate speech or bullying, or to spell out whether world leaders get different treatment - are not binding, and the company does not have to follow them. So far, however, Facebook has been open to them: in its first ruling, the Oversight Board made 17 recommendations, and Facebook said it was "committed to action" on 11.

The board will post a written version of the decision to its website that will include a detailed explanation of what it considered and how it reached its conclusion, as well as the public comments.

How did we get to this point?

Tension had been building between Trump and Facebook for nearly six years before the company indefinitely suspended him after the Jan. 6 attack on the Capitol. In 2015, then-candidate Trump posted a video calling for a ban on Muslims entering the United States. In a controversial decision, Facebook declined to remove it. That internal decision eventually led to the company's "newsworthiness" policy, which allowed some posts that violated its guidelines to remain online because they carried public-interest value.

Facebook's policies were constantly tested throughout 2020, when Trump posted misleading information about the coronavirus and bombastic statements about protests taking place across the country. In a May post, Trump referred to protesters as "THUGS" and wrote, "Any difficulty and we will assume control but, when the looting starts, the shooting starts."

Though Twitter was Trump's go-to social media site, the former president also regularly used Facebook to spread messages and often cross-posted on both Twitter and Facebook.

Twitter labeled a similar tweet on its site with a public interest notice, but Facebook left the post untouched. Employees and advocates called for Facebook to take harsher action, and in June Zuckerberg announced the company would label posts that violated hate speech and other policies, even from politicians. It would also remove posts that attempted to incite violence or suppress voting, with no newsworthiness exception.

Facebook did begin labeling some of Trump's posts, but it faced mounting pressure from critics saying it wasn't doing enough, as well as from some conservative politicians and pundits who called its actions "censorship."

The breaking point came Jan. 6 when Trump posted a video on Facebook and Instagram, and other social media sites, telling rioters to go home. But in the video he also said, "We love you, you're very special." Facebook suspended the president for 24 hours. The next day, Zuckerberg announced the suspension would be indefinite, saying, "We believe the risks of allowing the President to continue to use our service during this period are simply too great."

Later that month, Facebook said it would refer the decision to the Oversight Board to make the final call. "Many argue private companies like Facebook shouldn't be making these big decisions on their own," the company wrote at the time. "We agree."

What does this mean for other tech companies?

Twitter and YouTube took similar action on Trump's account soon after Facebook. Trump's account remains available on YouTube, but he's blocked from uploading new videos. YouTube's suspensions usually last only a week for a "first strike," but the company will keep Trump's in place until the "risk of violence has decreased," CEO Susan Wojcicki said in March. YouTube's analysts will determine when the risk is low enough by looking at government statements, whether there are police buildups and the level of violent rhetoric elsewhere on YouTube, Wojcicki said.

Trump had millions of views and followers on YouTube, but the platform wasn't used as directly as Twitter was. Instead his campaign used the site to post official videos that were shared around the Web by supporters. The campaign also bought prime ad space on YouTube's homepage the week of the election.

Unlike with Twitter, Trump probably did not personally control the YouTube channel.

Twitter, on the other hand, has made no bones about its plans: Trump is banned permanently, regardless of what other companies decide or whether he runs for office again.

"The way our policies work, when you're removed from the platform, you're removed from the platform, whether you're a commentator, you're a CFO, or you are a former or current public official," Twitter CFO Ned Segal said during an interview with CNBC in February.

What does this mean for Trump?

Trump has lost much of his direct online connection with supporters since the major social media networks kicked him off in January. He has continued to send out news releases and messages to supporters, however, and to give television interviews.

But being reinstated on Facebook would be a major win for Trump, restoring a way to speak directly to supporters in his own, unfiltered words.

Trump senior adviser Jason Miller confirmed to Fox News that the former president is building his own social media network. Creating a social media site with broad appeal is a significant technological, social and financial undertaking. Even if the network successfully launches, Trump will have a tough time building a similar audience from scratch. The former president had more than 88 million followers on Twitter when his account was banned.

The Washington Post's Cat Zakrzewski, Gerrit De Vynck and Elizabeth Dwoskin contributed to this report.