Hello everyone and welcome to the third edition of Plaintext. I hope this newsletter has become another good reason to look forward to Friday. (I now look forward to Friday because it means the newsletter is done and I don't have to worry about the next one until … Sunday?) If you're reading this on the web and wondering why it's not in your inbox, fix that by subscribing to WIRED at a discount so steep you may need to decompress afterward. You'll get the print edition, unlimited web access and, of course, this newsletter.
Last weekend I attended an event called Social Science Foo Camp, an "unconference" where attendees spontaneously schedule discussion sessions to build a lively agenda. The venue was Facebook's headquarters in Menlo Park, California. One of the most interesting sessions I attended concerned a project called Social Science One.
Social Science One is an effort to put the Holy Grail of data sets in the hands of outside researchers. That Holy Grail is information from Facebook. Yes, the same staggeringly massive trove that brought us Cambridge Analytica.
At the Foo Camp session, Nate Persily of Stanford Law School, co-director of Social Science One, said that after 20 months of negotiations, Facebook was finally releasing the data to researchers. (The researchers had originally expected the whole process to take two months.) A Facebook data scientist on the team dedicated to the project confirmed it. Sure enough, the official announcement came a few days later.
It is an unprecedented data drop, involving a set of 10 billion numbers. The information centers on URLs shared by Facebook's billions of users: specifically, the 38 million of them that were shared more than 100 times on Facebook between January 1, 2017 and July 31, 2019. Researchers can isolate URLs by features such as whether they were fact-checked or flagged as hate speech, and they can see, in aggregate, who viewed them, liked them, shared them, or even shared the links without viewing them. "This data set allows social scientists to study some of the most important questions of our time about the effects of social media on democracy and elections with information they have never before had access to," said Social Science One's press release.
The reason it took so long is that Facebook, understandably, wanted to protect the privacy of its users. Simply aggregating the information so that no individual's activity could be identified was not enough for Facebook, which insisted on also processing the data with a technique called differential privacy. It is an excellent way to protect privacy, but because it works by adding statistical noise to the data set to avoid exposing individuals, the technique limits the research that can be done. Social Science One thinks Facebook is being overly cautious. "But I'm not the one who got a $5 billion fine from the FTC," Persily concedes, referring to the penalty assessed on Facebook last summer for its privacy sins.
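For the curious, the noise-adding idea behind differential privacy can be sketched in a few lines. This is a hypothetical illustration of the classic Laplace mechanism applied to a single count query, not the pipeline Facebook actually built; the function names and parameters here are my own.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    Any one person can change a count query's answer by at most 1, so the
    sensitivity defaults to 1. A smaller epsilon means stronger privacy
    guarantees but noisier, less useful answers.
    """
    return true_count + laplace_noise(sensitivity / epsilon)
```

Any single noisy answer can be well off the true value, but statistics computed across many releases remain accurate, which is exactly the trade-off that constrains what the researchers can study.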