Facebook apologized to misinformation researchers for providing them with flawed, incomplete data for their work examining how users interact with posts and links on its platform, the New York Times reported. Contrary to what the company told the researchers, the data Facebook provided apparently only included information for roughly half of its users in the US, not all of them.
The Times reported that members of Facebook’s Open Research and Transparency team held a call with researchers on Friday to apologize for the mistake. Some of the researchers questioned whether the error was intentional, meant to sabotage the research, or simply a case of negligence.
The flaw in the data was first discovered by a researcher at Italy’s University of Urbino, who compared a report Facebook released publicly in August against the data it had provided exclusively to the researchers. The data sets didn’t match, according to the Times.
Facebook didn’t immediately respond to a request for comment from The Verge on Saturday, but a spokesperson told the Times that the mistake was the result of a technical error, and that the company had proactively notified affected partners and was working quickly to resolve the issue.
The report from August 18th that the University of Urbino researcher used in his comparison was released in the interest of “transparency,” showing the most-viewed content in Facebook’s public News Feed between April and June of this year, its second quarter. However, the Times discovered that Facebook had shelved a report about its first quarter that portrayed the company in a much less flattering light. Facebook eventually released the shelved report.
Also in August, Facebook banned academic researchers from New York University’s Ad Observatory project from its platform, after the group’s Ad Observer browser plug-in surfaced problems with the company’s ad disclosures. The project’s research found that Facebook had failed to disclose who paid for some political ads on its site.