Facebook’s live video-streaming platform has become a major media powerhouse, although not in the way the social giant may have expected or desired.

The ability for anyone with a smartphone to become a live broadcaster has changed the media equation, as gruesome deaths have gone viral on Facebook Live and spread across traditional media. On Wednesday night, Minnesota woman Diamond Reynolds live-streamed video on the social service after her boyfriend, Philando Castile, was shot and killed by police.

That came a day after the killing by police of a man in Baton Rouge, La. — captured on video that was uploaded to Facebook and Instagram. The two deaths touched off nationwide protests over police violence against black citizens and led to what officials said were the racially motivated killings of five white police officers in Dallas Thursday evening; at least one bystander, Michael Kevin Bautista, shared Facebook Live video of the chaos as shooting erupted.

With social outlets providing instantaneous access to newsworthy events, Facebook, Twitter and other Internet companies are having to refocus on how they treat such content, weighing free speech and privacy against questions of when to allow disturbing and violent material to remain on their services and when to remove it. The process of drawing those lines is still in flux, and the real-time nature of live video makes the task even more urgent.

Facebook is trying to strike a balance between enabling shared experiences and providing a safe and respectful environment, a company rep said. “We believe the vast majority of people are coming together to share experiences and moments on Facebook Live,” the spokesperson said. “That said, we take our responsibility seriously and we are committed to improving it.”

Tech companies like Facebook prefer to portray themselves as open platforms, not media gatekeepers controlling the flow of information. And all Internet companies have set policies about what, generally speaking, is unacceptable for users to post — but those rules are subject to interpretation and judgment.

With recent events, Facebook’s practices regarding what is and isn’t allowed are drawing more intense scrutiny. Facebook denied a report Friday by U.K. tech site the Register that police were responsible for removing Reynolds’ video Wednesday night. The live video was unavailable for about an hour before it was restored with a “warning: graphic video” message. The video dropped offline because of a technical error on Facebook’s part, according to the company. As of Friday afternoon, the video had been viewed more than 5 million times.

How are the decisions made about what Facebook’s more than 1.6 billion users worldwide are able to see? In this case, it would be helpful if Facebook provided more details about its handling of the video, said Emma Llanso, free expression director at the Washington, D.C.-based Center for Democracy and Technology.

“It’s not clear to people if [the Reynolds video] violates any terms of Facebook’s service. I think leaving it up as a matter of public interest is the right call,” Llanso said. “But it points to the difficult questions in this area, because it comes down to definitions and in what context do you understand the content to be acceptable.”

Facebook says it has a clearly outlined policy in its “community standards” section. Among other things, those standards prohibit content that glorifies or celebrates violence, while disturbing content such as the death of a human being may be allowed but will be flagged as “graphic,” with video-autoplay disabled and the post made unavailable to users under 18. Facebook says it has a global team dedicated to monitoring, around the clock, requests about content that violates those standards, and that in the majority of cases it responds within 24 hours to such requests. The company also says it has the ability to interrupt a live stream if it violates the policy.

In another area recently, Facebook’s role as information gatekeeper flared into the public spotlight after the company was accused of political bias, with reports alleging that its Trending Topics section downgraded conservative-leaning stories. The company responded by trying to provide greater transparency into its inner workings: CEO Mark Zuckerberg and other execs met with conservative leaders to discuss the issue and posted an explanation of the process. Facebook also now mandates, as standard policy, that all employees undergo training on identifying political bias.

For better or worse, Facebook, as well as Twitter, YouTube and any other company that allows user-shared content, must rely on algorithms and individual users flagging material that is inappropriate or illegal. “If you think about the amount of content that’s uploaded to these sites, any pre-publication review is just impossible,” Llanso said. What’s important is for users to have the right to appeal a decision, because mistakes will inevitably happen and content-flagging systems will be subject to abuse, she added.

As a legal matter, Facebook is protected by the First Amendment and has the right to allow or disallow the distribution of content according to its own policies, said David Greene, civil liberties director at the Electronic Frontier Foundation.

“Facebook has a near-absolute right to this platform, and its users have a near-absolute right to transmit a live feed,” Greene said. He added that the company may have a contractual duty to comply with its terms of service. In addition, Internet companies are beholden to copyright laws to remove any infringing content that’s posted without authorization.

In the case of Diamond Reynolds’ video, the fact that she was broadcasting it live was important, Greene said, because if police officers had confiscated her phone, the video might never have seen the light of day.

“What you’re getting with Facebook Live, Periscope and other services is you don’t need to be a professional journalist,” Greene said. “It’s just citizens broadcasting live on the Internet. People seeing and sharing newsworthy information is a good thing.”