Researchers say Facebook has shut down German research into the Instagram algorithm.

In a chilling echo of the NYU Ad Observatory shutdown, researchers at AlgorithmWatch say they were threatened by Facebook and forced to end their monitoring of the Instagram algorithm. The Berlin-based project went public with the conflict in a post on Friday morning, citing Facebook’s recent ban on the NYU Ad Observatory.

The post says there are probably more instances of this kind of bullying than are publicly known, and the researchers hope that more organizations will come forward to share their stories.

“THE COMPANY CANNOT BE TRUSTED”

AlgorithmWatch launched in March 2020, offering a browser plugin that let volunteers collect data from their own Instagram feeds and give researchers insight into how the algorithm prioritizes photos and videos. The project regularly published findings from that data, including that the algorithm favored pictures showing bare skin and ranked photos with faces higher than screenshots of text. Facebook disputed the methodology but took no further action against AlgorithmWatch during the project’s first year.
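AlgorithmWatch’s own plugin code is not shown here, but as a rough illustration only, a data-donation extension of this kind might run a content script along the lines of the sketch below. The endpoint URL, the DOM selector, and the field names are hypothetical placeholders, not the project’s actual implementation.

```typescript
// Hypothetical sketch of a feed "data donation" content script, NOT AlgorithmWatch's code.
// It inspects only the volunteer's own rendered Instagram feed and reports minimal,
// non-identifying metadata (rank and media type) to a research endpoint.

const RESEARCH_ENDPOINT = "https://research.example.org/donate"; // placeholder URL

interface DonatedPost {
  position: number;                         // rank in the feed as shown to this volunteer
  mediaType: "photo" | "video" | "unknown"; // what kind of media the post contains
  capturedAt: string;                       // ISO timestamp of the snapshot
}

function snapshotFeed(): DonatedPost[] {
  // "article" is a stand-in selector for however feed posts are marked up.
  const posts = Array.from(document.querySelectorAll("article"));
  return posts.map((el, i) => ({
    position: i,
    mediaType: el.querySelector("video") ? "video"
             : el.querySelector("img")   ? "photo"
             : "unknown",
    capturedAt: new Date().toISOString(),
  }));
}

async function donate(): Promise<void> {
  const payload = snapshotFeed();
  if (payload.length === 0) return; // nothing rendered yet
  // Deliberately omits captions, usernames, and other people's content to limit
  // what leaves the volunteer's browser.
  await fetch(RESEARCH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}

// Sample once a minute while the volunteer browses their own feed.
setInterval(() => { void donate(); }, 60_000);
```

The point of such a design is that only what Facebook renders in the volunteer’s own browser ever leaves it, which is central to the dispute described below.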

The researchers say Facebook requested a meeting with the project’s leaders in May and accused them of violating its terms of service. The company also objected that the project broke the GDPR, since it collected data from users who had not consented to take part.

The researchers counter that they collected data only about content Facebook displayed to the volunteers who installed the add-on; the plugin’s users could access only their own feeds, which they chose to share with the project for research purposes.

The researchers decided to end the project because they feared Facebook would sue them if they continued.

A Facebook representative confirmed that the meeting took place but denied that the company had threatened to sue the project, saying Facebook was open to privacy-preserving methods that would let the research continue.

The representative said the company had concerns about the project’s practices and had contacted the researchers several times about how they could continue their research in compliance with its terms, as it does with other research groups when similar concerns arise. “We will continue to work with independent researchers but in ways that don’t put people’s privacy or data at risk,” the representative added.

Because Facebook is inherently social, it is difficult to isolate any single user’s data: even when volunteers opt in, their feeds are populated with content from other users, most of whom have not consented. Facebook has been especially sensitive to research projects ever since the Cambridge Analytica scandal, in which data gathered for academic research was used for political and commercial manipulation.

The larger pattern remains troubling. The algorithms that manage Instagram’s and Facebook’s news feeds are extremely powerful yet poorly understood, and Facebook’s policies make it difficult for outside researchers to study them objectively. The NYU Ad Observatory was recently banned from the platform for tracking political advertising, amid allegations of data scraping. In November, the company made similar legal threats against the Friendly browser, which let users reorder their feeds chronologically. CrowdTangle, another popular tool for Facebook research, was acquired by the company in 2016.

Facebook does offer ways for researchers to collect data directly, through its Ad Library and the Social Science One partnership, but AlgorithmWatch argues that data supplied by the company itself cannot be relied on for this kind of research.

The researchers write that they cannot trust data handed over by Facebook because the company itself is not trustworthy, and that there is no reason to believe Facebook would provide usable data if researchers had to depend on the company instead of collecting it independently.
