Facebook CEO Mark Zuckerberg speaks in Berlin, Feb. 25, 2016. Kay Nietfeld/AFP/Getty Images

The pitchforks are out for Facebook, which is under increasing scrutiny for the role humans take in deciding what stories merit inclusion in its "Trending Topics" feature. As a First Amendment matter, it's an open-and-shut case: Facebook has the right to exercise editorial judgment — by human editors or algorithms — in any manner it chooses.

"It’s like complaining that the New York Times doesn't publish everything that's fit to print or that Fox News is conservative,” said professor Eugene Volokh of the University of California, Los Angeles, School of Law. “Facebook is allowed to publish anything it wishes.”

That doesn’t mean the pervasive social networking giant won’t find itself in legal trouble. For one thing, Facebook was hardly transparent about how the system worked. It only opened up about "Trending Topics" after anonymous claims that staffers routinely suppressed conservative news sources — claims that prompted the U.S. Senate Committee on Commerce, Science and Transportation to call for the company to reveal the process.

Now Facebook's editorial guidelines, published in detail Thursday, offer a clearer picture of which news outlets might have cause to complain. The guidelines state that Facebook's small staff of editors relies on just 10 outlets to determine whether a story is of national importance: “We measure this by checking if it is leading at least five of the following 10 news websites: BBC News, CNN, Fox News, the Guardian, NBC News, the New York Times, USA Today, the Wall Street Journal, Washington Post, Yahoo News or Yahoo.”

Beyond that top-10 selection, Facebook's more extensive list of source websites is far more diverse, but it still leaves out plenty of publishers, even some that are verified on the network.

The omission of other sources could theoretically form the basis of a legal claim, but news outlets would have the burden of proving that Facebook promised something it didn't deliver. An antitrust or anti-competitive “claim could be Facebook has a dominant position in presenting news for the public and they're putting ahead competitors,” said Joel Reidenberg, a professor at Fordham Law School in New York City.

When it comes to the dissemination of news, Facebook has become dominant. At least 30 percent of Americans say they get news from Facebook, according to a Pew Research Center study, and among the site's users, more than 60 percent turn to it for news.

Former Facebook news curators who were in charge of the trending topics section, speaking anonymously to Gizmodo, said the company uses editorial judgment to select the stories that appear there. Guidelines for the team show that judgment calls are made at nearly every step of the operation.

For instance, the guidelines show that Facebook does not select stories where the headlines have profanity or are "sensationalized."

Prior to the stories in Gizmodo last week and Facebook's release of the guidelines Thursday, little was publicly known about "Trending Topics." The word “trending” is fairly vague, and the description in Facebook’s Help Center notably does not disclose that editors are involved.

“Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors, including engagement, timeliness, pages you've liked and your location,” the page reads.

“That really does not give, in any way, an appropriate degree of detail as to what is actually going on. I think it is misleading. At the very least it’s disingenuous,” said Frank Pasquale, a professor at the University of Maryland Carey School of Law.

These human curators may have biases that skew their judgment of what counts as “popular.” An anonymous source told Gizmodo that conservative news has been suppressed in some cases, including reports on 2012 Republican presidential candidate Mitt Romney and stories from right-leaning news outlets, allegations that Facebook has denied.

The Help Center's wording shows that Facebook did not previously provide a strict definition for how it chooses what its users see within the trending topics section. “It doesn't say that topics are based solely on the popularity, and they’re not promising that all the popular ones will be included,” Volokh said.

The Senate Committee on Commerce, Science and Transportation, whose members include former Republican presidential candidates Ted Cruz and Marco Rubio, had called on CEO Mark Zuckerberg in the wake of the Gizmodo reports to clarify the “rigorous guidelines” the company claimed to have and to produce a detailed list of all trending topics included and omitted.

The Senate letter is simply a request for Facebook to provide information by May 24. Nothing about it is legally binding yet, and nothing may ever be at the federal level. “There's something troubling about a government agency pressing an organization for news dissemination,” Reidenberg said.

It has happened before. In 1974, in Miami Herald Publishing Co. v. Tornillo, the Supreme Court considered whether a newspaper could be required to give political candidates equal editorial space. The court struck down the requirement on First Amendment grounds.

For Facebook, a case could emerge at the state level on potential claims of unfair acts and practices. Facebook was recently sued on that theory in a federal court in San Jose, California, its home turf in Silicon Valley, for targeting ads based on users' visits to cancer-related websites.

Objectivity is not Facebook’s business. Its algorithms choose topics to show users based on how they use the network, so perhaps “trending for me” or “trending in my network” would be a better label, noted Lauren Henry Scholz, a media scholar at Yale Law School’s Information Society Project.

Regulatory agencies demand such clarifying labels for advertisements; that’s why a “sponsored” or “promoted” tag appears next to paid trends or searches on Twitter and Google. Facebook does not offer such an advertising option within "Trending Topics."

In 2011, the Federal Trade Commission did step in, charging Facebook with deceiving users about their privacy settings. Facebook settled the case and created the "privacy dinosaur" to explain privacy settings to users. It is also required to submit to third-party audits of its systems.

Some critics say the cartoon dinosaur for privacy and the cartoon owl for the Help Center are not nearly enough. “It’s just condescending, the little owl and the privacy dinosaur. It’s time for them to stop treating us like children,” Pasquale said.

A call for greater transparency about Facebook's role as a news organization might bring bad publicity, but it would carry no legal force. "There may be a call for greater transparency, accountability, but now you’re running right up against the First Amendment," said Andrew Sellars, the Dunham First Amendment Fellow at Harvard University's Berkman Center for Internet & Society.