On Tuesday, The New York Times’ investigation of a study into how Facebook promoted anti-refugee violence in Germany galvanized discussion about how even normal political speech on the platform can drive users to extremes. On Wednesday, the report was criticized on the grounds that it may have unfairly linked correlation to causation, drawing more dramatic conclusions than can be supported by the evidence.
The case against the piece goes like this:
- The study, which you can read here, has not been peer-reviewed.
- The study authors could not measure actual Facebook usage, which is private, so they relied on problematic proxies. Their proxy for average, non-ideological usage of Facebook was the Nutella Germany page, with 32 million followers — but they managed to collect data on only 21,915 users who interacted with the page, and whose German location could be verified.
- Data from the study is charted week by week, rather than in the moment. As Ben Thompson and others have pointed out, it seems just as possible that anti-refugee violence inspired Facebook posts as that Facebook posts inspired violence.
- The article reported that Facebook was linked to a 50 percent increase in attacks on refugees; an update to the study revised that figure downward, to 35 percent.
- The article represents a case of confirmation bias. People (like me!) who shared it tend to be sympathetic to the idea that heavy usage of Facebook can be corrosive inside democracies, and so we accepted it without appropriate skepticism.
Some of these criticisms seem fairer to me than others. (The week-by-week chronology issue bothers me the most; I’ve reached out to the study’s authors, and will share anything I hear back from them in this space.) But none turns the original article on its head — or acknowledges that the Times journalists, Amanda Taub and Max Fisher, bolstered the study’s findings with their own on-the-ground reporting. (Thompson did note the additional reporting.)
And while the study hasn’t been peer-reviewed, the Times authors did seek input from other experts, who called the findings “credible, rigorous — and disturbing.” It also seems worth noting that The Economist covered the study when it was first published earlier this year, and drew similar (if somewhat less agitated) conclusions.
In any case, I can’t imagine anyone reading the study and, even accounting for its flaws, not believing that further inquiry is warranted. “More study is needed” is perhaps the most common conclusion to be drawn from any study, and this one is no exception.
But as lots of folks noted online today, further study is difficult, because Facebook data is private by default. As New York’s Max Read put it: “The frustrating thing about the justified quibbles around this Facebook hate-crimes study is that Facebook itself could, in a couple hours, pull together a comprehensive data report that would answer all of the questions.”
The data in question is generally private for good reason — make it public, and you’ve got a Cambridge Analytica situation on your hands. But given the urgency of the question — does Facebook push normal political speech to extremes, inciting violence even in developed nations? — I wish Facebook would find a way.
Thompson doubts the company will:
Of course at best this sort of study will be done for internal consumption; I suspect it is more likely it won’t be done at all. Facebook has publicly buried its head in the sand about filter bubbles at least twice that I can remember, first in 2015 with a questionable study whose results were misinterpreted and last year on an earnings call.
The reason why seems clear: unlike fake news or Russian agents, which involve a bad actor the company can investigate and ban, the propagators of filter bubbles are users ourselves. To fix the problem is to eliminate the temporary emotional comfort that keeps users coming to Facebook multiple times a day, and that is if the problem can be fixed at all. Indeed, perhaps the most terrifying implication of this study is that, if true, the problem is endemic to social networks, which means to eliminate the former necessitates the elimination of the latter.
On the second point, I fear Thompson is right. And on the first — that Facebook will ignore studies like this — I can only hope he’s wrong.
Someone is still trying to hack the Democratic National Committee, Sheera Frenkel and Jonathan Martin report:
A cybersecurity researcher from a firm called Lookout contacted the D.N.C. on Tuesday about the attempted intrusion, said two officials briefed on the matter who were not authorized to speak publicly.
The F.B.I. is investigating, according to one of the officials. But the attempted hack, which was described as sophisticated, was not successful, the committee said.
Facebook updated its post from yesterday, in which it revealed that it had removed more than 600 accounts and pages identified as part of Iranian and Russian influence campaigns, adding examples of their posts.
Facebook has now suspended 400 apps as part of its post-Cambridge Analytica audit, including one from the University of Cambridge, which had 4 million users, Facebook’s Ime Archibong says:
It’s clear that they shared information with researchers as well as companies with only limited protections in place. As a result we will notify the roughly 4 million people who chose to share their Facebook information with myPersonality that it may have been misused. Given we currently have no evidence that myPersonality accessed any friends’ information, we will not be notifying these people’s Facebook friends. Should that change, we will notify them.
Facebook co-founders Mark Zuckerberg and Dustin Moskovitz are backing campaigns to support housing and criminal justice reforms in November, David McCabe reports:
Both organizations have given $1 million each to a group supporting an Ohio ballot initiative that would institute criminal justice reforms, including reclassifying drug possession crimes from felonies to misdemeanors.
The Chan Zuckerberg Initiative gave $250,000 to support a ballot measure in California that would fund affordable housing projects.
Facebook shut down API access for a data-mining firm named Crimson Hexagon last month after the Wall Street Journal reported that it was working with the US government and a nonprofit tied to the Kremlin, in violation of its policies. But it’s back on the platform now, reports Alex Pasternack:
The reinstatement, which began earlier this month, followed “several weeks of constructive discussion and information exchange,” said Dan Shore, Crimson’s chief financial officer. But the companies didn’t specify the results of the inquiry or explain why access was restored, raising more questions about how Facebook and other platforms police third parties like Cambridge Analytica and Crimson Hexagon.
China is shutting down WeChat news channels if they publish news about the blockchain.
Sarah Frier gets some rare interview time with Evan Spiegel, who uses it to promise everyone that he is studying very hard to be a CEO. The story is full of good details, but the heart-shaped “talking piece” geode is the part you want to read:
On the second floor of the new headquarters of Snap Inc. in Santa Monica, Calif., is a room dedicated to helping employees open up. It’s round and lined with potted plants. “Speak from the heart,” reads a framed sign on the wall. “Listen from the heart.” Employees show up in groups of about a dozen, sit cross-legged on black cushions, and take turns with the “talking piece,” a heart-shaped purple geode that gives the bearer the right to confidentially share deep thoughts.
This is the inner sanctum for what Snap calls “Council,” a sort of New Age corporate retreat that uses a technique Chief Executive Officer Evan Spiegel learned in childhood. It was also where I found myself on a Friday morning in July. Council meetings, I’d been told by the company’s communications chief, are “sacred.” They’re also a real-life example of what Spiegel wants people to do with his smartphone app, Snapchat: share intimately, without fear of judgment from the outside world.
Facebook plans to pull its data-security app Onavo from the App Store after Apple complained that it violated its data collection policies, Deepa Seetharaman reports. Data from Onavo helps Facebook monitor the growth of competing social networks and has been seen as a major competitive asset:
Apple’s decision widens the schism between the two tech giants over privacy and is a blow to Facebook, which has used data gathered through the app to track rivals and scope out new product categories. The app, called Onavo Protect, has been available free download through Apple’s app store for years, with updates regularly approved by Apple’s app-review board.
My colleague Ashley Carman has new details on the latest Tinder co-founder lawsuit:
Meanwhile, a source close to Tinder says that Rad actually sold a great deal of stock following the merger between Tinder and Match Group, and suggested that the co-founder didn’t have much faith in the future of the dating app and that Match’s valuation was accurate. According to SEC filings, Rad exercised about half of his stock options in Match on August 4th and 6th, which Match repurchased for a net pay out of $94,413,552.06 based on a closing price of $18.89 per share. His other half was exercised on August 9th and he received net 816,805 shares of IAC stock.
Facebook’s next virtual-reality headset will be a high-end version of the recently released Oculus Go, David Jagneaux reports:
The headset is designed to function on its own without the need for a PC, similar to Oculus Go, but with cameras added for inside-out tracking of 6DOF head movement and two Oculus Touch-style controllers. The last time we went hands-on with Santa Cruz was at the Oculus Connect 4 conference last year. The release window lines up with the two year anniversary of the original Rift’s launch at the end of Q1 2016, March 28th.
Facebook is trying to give a boost to Instant Games, which feel like they’re going nowhere, at least in the United States:
At the outset, Facebook said developers would receive 70 percent of the Instant Games revenue, with 30 percent going to Facebook. But on Android, developers also had to share 30 percent of their revenue with Google. In fact, Google took 30 percent of the total, and then Facebook took 30 percent of what was left. Developers were left getting only 49 percent of the total revenue on games they had created.
After evaluating this, Facebook has decided to roll back its revenue share, so the developers only have to pay Google on Android.
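The 49 percent figure above is just the compounding of the two 30 percent cuts. A minimal sketch of the arithmetic (the function names are illustrative, not anything from Facebook’s documentation):

```python
def developer_share_old(revenue: float) -> float:
    """Old Android flow: Google takes 30% of the total,
    then Facebook takes 30% of what remains."""
    after_google = revenue * 0.70    # developer keeps 70% after Google
    after_facebook = after_google * 0.70  # then 70% of that after Facebook
    return after_facebook            # 0.70 * 0.70 = 49% of the total


def developer_share_new(revenue: float) -> float:
    """New flow: Facebook waives its cut on Android,
    so only Google's 30% applies."""
    return revenue * 0.70


if __name__ == "__main__":
    print(developer_share_old(100.0))  # developer's cut under the old split
    print(developer_share_new(100.0))  # developer's cut after the rollback
```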
Move over, lemonade stands: the hot new trend for teens is sponcon, Taylor Lorenz reports:
With “jobs you need to do a lot of training,” says Lucy, a 13-year-old in Pennsylvania who asked to be referred to by a pseudonym. “Then you have to, like, physically go out and do the job for hours a day. Doing this, you can make one simple post, which doesn’t take a while. That single post can earn you, like, $50.” Last month, she started posting brand-sponsored Instagrams for her more than 8,000 followers. So far, she says, she’s earned a couple hundred dollars.
Sharing an Instagram account with a lover sounds like a nightmare, but some lost souls are doing it, reports Emily Dreyfuss, who is now one of those souls. (Twist: so is Taylor Lorenz, who is featured in this story!)
Yes, I still feel a twinge of embarrassment about sharing an account with Seth sometimes. But so far, my tiny hang-up is the only real downside to our new joint-account life. If you’re considering it and you’re sensitive to the judgment of others, you should know that when I asked on Twitter whether anyone knew people who did this, the common response was “ew” and “I assume anybody who replies to this in the affirmative gets arrested.” But you know what? Lock me up, folks, because I love love and I love our joint Instagram account.
Shannon Liao updates us on one of Facebook’s internet connectivity efforts, this one in Tanzania:
Facebook gave an update yesterday on its efforts to expand Express Wi-Fi, an app that lets unserved communities pay for internet service. The company is still working on efforts to reach the 3.8 billion people in the world who don’t have internet access, in order to grow its potential market.
Alex Stamos, who just left his job as Facebook’s chief security officer, vents at the United States government for failing to do more to improve the nation’s cybersecurity defense mechanisms:
If the weak response of the Obama White House indicated to America’s adversaries that the U.S. government would not respond forcefully, then the subsequent actions of House Republicans and President Trump have signaled that our adversaries can expect powerful elected officials to help a hostile foreign power cover up attacks against their domestic opposition. The bizarre behavior of the chairman of the House Permanent Select Committee on Intelligence, Rep. Devin Nunes, has destroyed that body’s ability to come to any credible consensus, and the relative comity of the Senate Select Committee on Intelligence has not yet produced the detailed analysis and recommendations our country needs. Although by now Americans are likely inured to chronic gridlock in Congress, they should be alarmed and unmoored that their elected representatives have passed no legislation to address the fundamental issues exposed in 2016.
And finally …
Patrick Gerard has your algorithmic failure of the day:
Facebook is pushing that “share a memory” junk where they make custom videos out of your old photos to boost engagement and I just literally got shown a bunch of happy cartoon characters dancing on my mom’s grave. pic.twitter.com/6NKIXqqq9I
— Patrick Gerard (@PatrickGerard01) August 21, 2018
Talk to me
Send me tips, questions, comments, and alternate theories about violence against refugees: firstname.lastname@example.org.