In 2004, when Mark Zuckerberg was a Harvard undergrad working on a skunkworks project called The Facebook, a friend asked him how he’d managed to obtain more than 4,000 emails, photos and other bits of personal info from fellow students.
“People just submitted it. I don’t know why. They ‘trust me,’” Zuckerberg wrote in an instant-messaging exchange leaked to website Silicon Alley Insider. The guy who would become CEO of one of the world’s biggest tech giants added, “Dumb f—s.”
Zuckerberg later expressed regret for those disdainful remarks in a 2010 New Yorker interview, saying he had “grown and learned a lot.”
Today, the Facebook co-founder, now 33, is facing his toughest grown-up test yet — steering his global enterprise through a data-privacy crisis that has angered users, led panicked investors to sell the stock and drawn intense scrutiny from lawmakers and regulators worldwide.
The scandal in a nutshell: Cambridge Analytica, a U.K.-based political data analytics firm, illicitly procured the data of 50 million Facebook users — without their knowledge or consent — and then used it to inform voter-targeting strategies for Donald Trump’s presidential campaign. It wasn’t a hack per se. Both Facebook and Cambridge Analytica claim they were duped by the researcher who originally harvested the data; his innocuous-seeming personality quiz, launched in 2013, also pulled in info on the friends of everyone who took it. That was possible because of Facebook’s then relatively lax privacy protocols.
The controversy landed like a Category 5 hurricane, to Facebook’s evident surprise. Ultimately, it may prove to be a watershed moment in how governments — the U.S. in particular — decide internet companies should be regulated after decades of laissez-faire policies.
Facebook was caught asleep at the wheel, says Daniel Ives, chief strategy officer and head of technology research at GBH Insights. “The Cambridge Analytica debacle has been the darkest chapter in Facebook’s 14-year history,” he says. “We view this as a seminal moment that’s going to change the nature of privacy, content and ad transparency.”
The regulatory aftershocks could rattle companies beyond Facebook. In the big M&A deals in play in the media sector — AT&T’s bid for Time Warner, Comcast’s pending acquisition of Sky, Disney’s proposed takeover of 20th Century Fox — streaming media is front and center. And everyone wants to use big data to serve up highly targeted ads over the internet, just as Facebook does. “Hundreds of billions of dollars [in digital ad spending] are at stake over the next several years over this issue,” Ives estimates.
What Zuckerberg must do now, observers say, is keep apologizing for the mess and show that Facebook is committed to restoring its users’ trust. As part of his mea culpa tour, Zuckerberg reportedly has decided to testify before Congress, including at a Senate Judiciary Committee hearing scheduled for April 10. But in terms of regulatory expectations, the tide has irreversibly turned. In the wake of the Facebook/Cambridge Analytica revelations, the conventional wisdom is that new rules for online data are coming. And the chorus of calls for regulations curtailing the use of personal data is also sounding from within Silicon Valley itself.
Apple CEO Tim Cook blasted Facebook’s handling of the situation, saying that “privacy to us is a human right” and noting that Apple could “make a ton of money” if it chose to monetize its customers’ data. “This is not something that we just started last week when we saw something happening. We’ve been doing this for years,” Cook said in an interview with MSNBC. While his preference is for industries to self-regulate, he said, “I think we’re beyond that here.” Responding to Cook, Zuckerberg said in an interview with Vox that he found the claim “that somehow we can’t care about you to be extremely glib” and “not at all aligned with the truth.”
NBCUniversal chief Steve Burke also piled on, suggesting there’s a need to apply regulatory standards to Facebook similar to those under which the TV business has long operated. “Facebook is a gigantic business, and gigantic businesses have big responsibilities to the world,” Burke said at NBCU’s Innovation Day event for advertisers in New York. “I think the problem is exacerbated by the fact that Facebook’s business model is based on capturing data that many, many times, people don’t know they’re giving. And then selling that data, very often electronically to a buyer who Facebook doesn’t even know is a buyer.”
Facebook has confronted a number of controversies in recent years. It’s been accused of failing to thwart Russia’s attempts to manipulate the 2016 U.S. election and enabling the rapid spread of “fake news.” The Facebook Live streaming service was used to broadcast a spate of shocking murders and suicides. And the tech giant is a regular punching bag for critics who say it has amassed far too much market and cultural power.
But nothing before has threatened Facebook like the Cambridge Analytica scandal. “The power of data is sometimes a hard thing to understand, and this incident woke people up to that,” says Jamie Williams, staff attorney at the Electronic Frontier Foundation. It’s dawning on people, she says, that Facebook “knows more about you than they would if they were wiretapping your phone.”
Clearly, the Trump connection cranked up the story’s volume. Facebook had already been hauled before Congress over Russia’s disinformation campaign. The exposés about Facebook data being improperly used to help Trump win the White House kicked the football squarely onto the D.C. playing field, touching off a new round of congressional hearings.
“The media saw that it involved Trump, and that made it super-duper catnip,” says Scott Cleland, chairman of NetCompetition, a lobbying group funded by cable and telco providers. Cleland, a longtime critic of internet companies like Google and Facebook, believes the Cambridge Analytica matter could result in “a big leap in the direction of more accountability” for an industry he characterizes as having enjoyed an extremely light regulatory touch.
It’s evident that Facebook didn’t immediately know how to respond — or that execs thought the bad PR would blow over. On March 16, the company disclosed that it had suspended Cambridge Analytica and parent company SCL Group from accessing Facebook’s platform. The story became embedded in the news cycle after reports in The New York Times and The Guardian revealed that Cambridge Analytica hadn’t entirely disposed of the ill-gotten data, as it had claimed. “We’ve been caught flat-footed,” Facebook head of news partnerships Campbell Brown said at a conference less than a week later.
When Zuckerberg finally spoke publicly, after four days on the sidelines, he stressed that these are past sins the company is addressing. He said Facebook was deceived, and that the company in 2014 changed its policies to block third-party apps’ ability to access friend data (which would have prevented the collection of the vast trove of info that wound up on Cambridge Analytica’s servers). Still, the CEO acknowledged: “It was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.” (Facebook didn’t make Zuckerberg or other senior execs available for interview for this story.)
The costs to Facebook, in both financial terms and reputation, are potentially huge. Facebook is the target of a Federal Trade Commission probe into whether it broke a 2011 agreement requiring user consent for sharing data, which carries penalties of up to $40,000 per violation. A bipartisan coalition of 41 U.S. state attorneys general last week formally requested details from Facebook about the Cambridge Analytica case, to determine if it violated consumer-protection laws. The company also faces multiple lawsuits from shareholders and users over the unauthorized data leak. Meanwhile, the selloff sparked by the news sent Facebook’s stock price down more than 17% in less than two weeks, representing a loss of more than $90 billion in market value.
In the past two weeks Facebook has detailed several steps it’s taking to set things right. The company says it will investigate all apps that had access to large amounts of information prior to 2014. It has announced new restrictions on developers’ data access, and it has plans to adopt “rigorous policies and terms” for third-party app developers. Facebook also says it will shut down a program over the next six months that has let marketers use data brokers like Acxiom, Epsilon and Experian to target ads based on consumers’ offline profiles. The Partner Categories program, launched in 2013, has provided ad targeting based on users’ purchasing activity or other data points (like homeownership). “While this is common industry practice, we believe this step … will help improve people’s privacy on Facebook,” product marketing director Graham Mudd said in a statement.
In another move to assuage users’ fears, the company has announced a series of updates to its privacy tools that are designed to make them easier to find and use. The updates, slated to roll out in the next few weeks, include a new section called Access Your Information that will provide a simpler way for users to manage, delete or download their posts or personal profile info. In addition, Facebook says, users will be able to more easily see info that the company uses to serve targeted advertising.
For some, though, Facebook’s response is too little, too late. In a worrisome trend for the company, the momentum behind the #DeleteFacebook movement has gained steam. Last week, actor-comedian Will Ferrell and Playboy separately said they were terminating their Facebook accounts over the Cambridge Analytica issue. A week earlier, tech billionaire Elon Musk summarily axed the Facebook pages for two of his companies, Tesla Motors and SpaceX, after Twitter users called his attention to their existence. And Brian Acton, a co-founder of messaging app WhatsApp — which Facebook acquired in a deal valued at $19 billion — called for a boycott in a tweet: “It is time. #deletefacebook.”
All the same, most Wall Street analysts believe Facebook will make it through the tempest without serious long-term damage. Some even see the stock plunge as a golden buying opportunity for a company that throws off tons of cash and has continued to grow at an astonishing rate. For 2017, Facebook posted revenue of $40.7 billion (up 47%) and net income of $15.9 billion (up 56%).
“I don’t think it’s fatal,” says media analyst Michael Nathanson, who maintains a “buy” rating on Facebook. “They will be subject to new regulations … but I don’t see it affecting the core business model.”
Only about 5% of Facebook’s total revenue is generated by large brand advertisers who may be motivated to pull their spending, according to Jack Myers, chairman of ad industry research company MediaVillage. Up to 80% of Facebook’s sales come from the 6 million small and midsize businesses that buy direct-marketing ads on the social service, while an estimated 15% is generated through programmatic transactions.
“I don’t want to minimize the danger to Facebook,” Myers says. “If there’s anything being hidden that surfaces, that could have an impact. But right now, I don’t think the issues are going to have a material effect on their ad revenue.”
Investors will be watching for signs of significant user and advertiser flight when Facebook reports first-quarter 2018 earnings in early May.
The biggest risk to Facebook — and the digital-ad business overall — would be a wide-ranging privacy-protection law on the order of the 2010 Dodd-Frank Act in the banking sector. That established the Consumer Financial Protection Bureau, designed to keep predatory lenders in check, along with a host of new regulations. “If you have financial services-style regulation, in an industry that hasn’t really seen regulation since its inception, that would be a cyclone for Facebook and the industry,” says GBH’s Ives.
However, the introduction of sweeping data-privacy legislation at the federal level is unlikely in 2018, given the focus on midterm elections, says Michael Morgan, a partner with McDermott Will & Emery and leader of the law firm’s global privacy and cybersecurity practice. “There aren’t many signs coming out of this Congress that would suggest they want to focus a lot of attention on consumer privacy.”
But Zuckerberg has seen the writing on the wall. “I’m not sure we shouldn’t be regulated,” Zuckerberg said in a March 21 interview with CNN. “I actually think the question is more, ‘What is the right regulation?’” The focus for Facebook will be to ensure whatever laws come to pass don’t significantly disrupt its bottom line. In the interview, he also said Facebook supports ad-transparency tools that would provide the types of disclosures required by pending Senate bills.
Regulatory officials examine files at Cambridge Analytica’s London offices. (Sipa USA via AP)
When companies conduct a cost-benefit analysis on whether to adopt stronger user-privacy controls or other enhanced security, they often opt to do nothing — as long as the potential fines or remediation costs are in a tolerable range, according to Gregory Garrett, head of international cybersecurity for consulting firm BDO: “Companies typically do as little as they can, to maximize profitability and reduce costs of complying with regulations.”
In fact, Facebook is boosting spending to combat bad actors on its platform, and Zuckerberg warned investors last November that increased spending on “preventing abuse on our platforms” would “significantly impact our profitability going forward.” As part of that, Facebook has said it expects to double the size of its safety and security-monitoring team by the end of 2018, to 20,000 staffers.
Whether Facebook’s actions will be enough to satisfy all constituents — including a passel of congressional reps — that it has plugged its privacy holes and is a good corporate citizen remains to be seen. “We’re living in a really politicized time,” says Michael Priem, CEO of ad agency Modern Impact. “The market’s freaking out about Facebook. But it really comes down to one thing: It’s testing the thesis of how much trust there is between consumers and platforms.”
In the near term, Facebook and other tech companies will be forced to comply with stricter European privacy protections. The European Union’s General Data Protection Regulation, which goes into effect May 25, requires companies to obtain consent before collecting data from consumers. It also gives consumers the right to learn what companies know about them and even ask a company like Facebook to delete photos or any other data it may have.
Noncompliance can be expensive: Fines can run as high as 4% of a company’s global annual revenue, which would be about $1.6 billion for Facebook, based on the company’s 2017 results.
Technically, GDPR applies only to services offered to EU residents. But given the global nature of many online services as well as the increased focus on privacy, it’s likely that some of the new rules will be implemented for U.S. users. It’s also worth noting that GDPR, first drafted in early 2012, passed despite heavy lobbying against it by big tech companies.
Eventually, American legislators may take their cues from the EU. “The U.S. government is likely to draft a watered-down version of GDPR after the upcoming midterms,” Barclays analyst Ross Sandler predicted in a recent research note.
Before that happens, some states may move forward with more rigorous data-handling laws. In California, voters will decide the fate of a privacy-focused ballot measure in November. The California Consumer Privacy Act would make data collection more transparent and give consumers a way to opt out of companies selling their data. “We’re trying to do a commonsense reform,” says real estate developer Alastair Mactaggart, who’s spearheading the initiative.
The California Chamber of Commerce, which is organizing the opposition to the ballot initiative, argues that it would pose “grave risks to the business community and potentially the state’s economy.” Mactaggart dismisses the claims: “They’re just defending business as usual.”
Some contend that these days, “business as usual” should factor into the potential costs of data breaches — regardless of the letter of the law. “It boggles the mind,” says Garrett, “that a company that should be super sophisticated at information management would let personal information on 50 million users be accessed by an unauthorized party.”
Janko Roettgers contributed to this report.