
Senators Blast Tech Giants: ‘I Don’t Think You Get It’ on Russian Influence


WASHINGTON — Facebook, Google, and Twitter continued to face the criticism and frustration of senators on Wednesday, as lawmakers expressed anger that the companies didn’t do more to curb Russian and other foreign influence in the 2016 election and voiced doubts that they have taken adequate steps to limit it in the future.

Sen. Dianne Feinstein (D-Calif.), a member of the Senate Intelligence Committee, told the companies, “I don’t think you get it.”

“You have a huge problem on your hands. … You bear the responsibility. You created these platforms and now they are being misused.”

She added, “You have to be the ones to do something about it, or we will.”

The company representatives were appearing before the Senate Intelligence Committee in the second of three public hearings this week in which they are testifying on Russian influence on the presidential election.

They have already disclosed the results of their own investigations into the extent to which Russian-linked sources used their platforms to spread disinformation during the 2016 election. Facebook, for instance, said that the Internet Research Agency, a Russian company linked to the Kremlin, posted 80,000 pieces of content that reached 126 million Facebook users.

Sen. Ron Wyden (D-Ore.) also admonished the companies, telling them that it was “self-evident” that “in relation to the power your platforms now have, in the past election you failed.” He pointed to a provision of the Communications Decency Act that gives internet firms legal immunity, under a set of conditions, for the content that users post on their sites.

The company representatives — Facebook general counsel Colin Stretch, Twitter acting general counsel Sean Edgett, and Google general counsel Kent Walker — outlined steps they have taken to limit deceptive content and inflammatory posts.

“We could have done more, and we are doing more today,” Walker said.

But it was clear that a number of lawmakers were not satisfied. Sen. Susan Collins (R-Maine) pointed out that inflammatory posts about Maine’s governor, Paul LePage, continued to appear on Facebook as recently as August.

“The Russians continue to push this kind of divisive rhetoric to this very day,” Collins said.

Sen. Angus King (I-Maine) said he wished the companies had sent their CEOs rather than their general counsels, although he was appreciative of some of their responses.

“This is a sophisticated worldwide strategy that applied here in 2016,” King said, noting that the Russian influence effort reached other countries — Spain and Scotland, most recently. He said Russian sources have also continued to try to foment dissent on divisive issues in the U.S., such as the NFL player protests.

“They were tweeting on both sides of the NFL dispute, in order to exacerbate the divisions,” he said. “It hasn’t stopped and it won’t stop, and we have to figure out what to do about it.”

The companies are caught between trying to assure lawmakers that they are doing all they can to prevent the influence of further Russian disinformation campaigns, and maintaining their open platforms that have driven their growth into some of the world’s most powerful information tools.

On Tuesday, the company representatives, in oral and written testimony, outlined the results of their investigations so far into the extent to which Russian sources were able to purchase ads and infiltrate the platforms through other methods, such as bots and so-called “troll farms.”

Google’s Walker said the company plans a “transparency report” to identify the sources of political ads and how much was paid for them, along with labels in search results to flag information suspected of being fake news.

Twitter’s Edgett talked of “information quality teams” working to keep “bad actors from amplifying their message.” He also said the company was coordinating more closely with law enforcement and working with its engineering teams to improve detection of abusive content and misinformation on the platform.

Facebook’s Stretch said the company plans to double its safety and security teams to 20,000 people by the end of 2018, increase ad transparency, require documentation from ad buyers, and tighten its ad policy to “limit divisiveness.”

Other highlights:

More to Be Done. Sen. Mark Warner (D-Va.), the ranking member of the Senate Intelligence Committee, isn’t convinced that the companies have fully examined the scope of Russian influence.

He said that Facebook “has more work to do to see how deep this goes,” noting that the company has pinpointed the work of just the Internet Research Agency, and he has doubts that they are the only “trolls” out there. He also said that Twitter was “greatly underestimating” the number of “bots” and faux sources pushing out content, as the company has said that it was less than 5%. Google’s search algorithms “continue to have problems” identifying fake news, Warner said. He also said that YouTube has now become the “go-to” platform for RT, the state-sponsored Russian news agency.

The companies said that they will continue to investigate.

But Warner later was testy with Stretch when he asked him whether some of the accounts that Facebook took down before the French elections were also involved in efforts to target users in the United States. Stretch said that he did not know.

The reps for Twitter and Facebook said that the companies became aware in 2015 that Russian-linked sources were involved in cyber threats.

Feinstein also had questions about why it took Google until only recently to remove RT from the list of “premium” YouTube channels whose content it sells to advertisers. On Tuesday, another Google executive, Richard Salgado, told lawmakers that the removal was due to a loss in viewership, not to RT’s propaganda.

But Walker said that while Google “recognize[s] the concerns expressed” about RT’s “slanted coverage,” the company has “carefully reviewed its content” and found no violations. “But we continue to look,” he said.

He said that Google is taking steps to identify the government-funded nature of RT, and is looking for ways to do that on YouTube.

Others asked whether Facebook could alert users who were potentially exposed to Russian disinformation, an interesting concept but, according to Stretch, something that would be difficult to do.

Targeting Discord. Sen. Richard Burr (R-N.C.), the chairman of the Senate Intelligence Committee, criticized media reports suggesting that the Russians sought out online platforms with the express intent of favoring one candidate over the other. (He didn’t mention Donald Trump by name.)

Instead, he tried to show how Facebook was used by Russian sources to sow discord. He showed an ad from a fake group called Heart of Texas, promoting an event on May 21 to protest at an Islamic center, as well as another faux organization, United Muslims of America, staging a counter-protest.

“What neither side could have known is Russian trolls” were promoting the event, he said. “In this case, people were brought together to foment conflict.”

Instagram Influence. Instagram, another Facebook platform, was also used by Russian sources for 120,000 posts, which Stretch said reached about 16 million users. But that review only went back to October 2016, and the Russian content may have reached four million more users before that.

The companies, though, have tried to show that they did not reap a big windfall from Russian activity. Facebook has said that the Russian-linked political ads it has identified so far cost about $100,000, compared to the $81 million that the Hillary Clinton and Trump campaigns spent.

A Platform or a Publisher? Some senators questioned why Facebook couldn’t make it easier for readers to discern whether a post was real or fake, perhaps by adopting practices that newspapers use. For instance, Collins asked, why doesn’t Facebook require authors to disclose their geographic location, just as news outlets offer their own datelines?

“There are occasions where that can be problematic,” Stretch said. Facebook has been used, for instance, by dissidents who want to disseminate information about repressive regimes.

What was clear, though, was that the three corporate representatives still do not consider Twitter, Facebook, and YouTube to be publishers, but rather platforms primarily for user-generated content.

Sen. Joe Manchin (D-W.Va.), though, said that when it comes to foreign influence, “we are getting hit from every way you could possibly imagine. You are one of the largest distributors of news.” The implication was that such a role comes with a whole other set of responsibilities, just as it would for a TV network or newspaper.

But the companies made it clear that there was a distinction, as they are not producing the content itself.

“That may well be a distinction that is lost on most of us,” said Sen. John Cornyn (R-Texas).

Antitrust Exemption? Burr made it pretty clear that the companies needed to do a better job of making sure that foreign money wasn’t being used to buy political ads — something that already is a violation of election law. He said that even if there was a suspicion that foreign money was being used, they shouldn’t run a spot until it could be verified.

He also opened up the idea of the companies getting an antitrust waiver to collaborate with each other on ways to prevent the spread of foreign-backed influence campaigns on their platforms.

Burr concluded the hearing with a bit of an admonition, telling the executives that they were the “first line of defense” against foreign influence. “Don’t let nation states disrupt our future,” he said.