Researchers who rely on CrowdTangle to find the worst social media content are concerned about what happens when it disappears.
On May 17, as several states held their primary elections, Jesse Littlewood searched the internet using a tool called CrowdTangle to spot the false narratives he knew could change perceptions of the results: damaging stories about ballots being collected and delivered in bulk by unauthorized people, whom disinformation peddlers called “ballot mules.”
Littlewood, the vice president of campaigns for the voter advocacy group Common Cause, easily found dozens of posts featuring a “wanted” poster falsely accusing a woman of being a ballot mule in Gwinnett County, Georgia. He flagged the posts to Facebook and Twitter. “This was going to lead to threats and intimidation of this individual, who may be a poll worker, and there was no evidence that this person was doing anything illegal,” Littlewood said. “It had to be removed.”
The search tool Littlewood used is owned by Meta Platforms Inc.’s Facebook, which has kept its plans for CrowdTangle quiet for months. Meta has been winding down its support for the product and is expected to eventually scrap it, though it has declined to say when. Not knowing the future of CrowdTangle, or what Meta will replace it with, jeopardizes planning for future elections, Littlewood said. His group has thousands of volunteers who work in shifts to identify false information online, primarily with CrowdTangle.
Erin McPike, a spokeswoman for Meta, said the company will continue to support researchers, with plans to create “even more valuable” tools for them. In response to researchers’ concerns, she said the company would keep CrowdTangle alive through at least this year’s US midterm elections.
Election officials and voting rights advocates are bracing for a repeat of the spate of misinformation that engulfed the 2020 presidential race online and contributed to real-world violence during the Jan. 6 insurrection. Kate Starbird, an associate professor at the University of Washington and co-founder of the Center for an Informed Public, said that if Facebook must shut down CrowdTangle, she hopes the company will “create a viable alternative,” which she said doesn’t exist so far, and “give researchers and journalists time to redesign their workflows around the new tool.” Failure to provide one would “significantly limit” researchers’ ability to help others counter misinformation in real time and could lead to voter manipulation.
Common Cause’s work “would be impossible to do without a tool that looks through Facebook,” Littlewood said. CrowdTangle also provides information about posts on Instagram, Twitter, and Reddit. “And we all know that the midterm elections are a testing ground for 2024, when the level of misinformation will be even higher.”
Researchers depend not only on the tool but also on the companies to act on the reports of harmful content they submit. Twitter removed the misinformation Littlewood flagged in May; on Facebook, which did not respond to his warning, at least 16 of the posts remained as of mid-June. Facebook removed them after media outlets including ProPublica and Bloomberg News inquired about them.
McPike, the spokesperson, said that “the CrowdTangle product experience for the 2022 midterm election remains the same as it was for the 2020 election.” But researchers are already seeing a difference, pointing to a buggy experience as the company has shifted support away from the tool in recent months.
In February, Meta began a formal internal process to shut down CrowdTangle, but halted the plan as the Digital Services Act, a landmark European law that aims to provide transparency into how Facebook, YouTube, and other internet services amplify divisive content, gained momentum, according to a person familiar with the matter. CrowdTangle is still on track to eventually shut down, the person said, with some Facebook engineers tasked with dismantling it.
Meta bought CrowdTangle in 2016, saying at the time that it wanted to help news publishers see how their content was performing on Facebook and Instagram so they could improve their strategies. A few months later, the company disclosed Russia’s campaign to influence the 2016 election through posts on social media. As the public debated the spread of false information online, CrowdTangle became a tool for insight not only into social media strategy but also into manipulation. That was awkward for Meta; the company often tried to publicly challenge the conclusions that journalists and others drew from CrowdTangle data. Executives grew reluctant to support a product that generated so many public relations crises for Meta.
The CrowdTangle team within Meta disbanded in the summer of 2021, with its dozens of employees resigning or receiving new assignments elsewhere in the company. Meta also rescinded a $40,000 grant that was intended to help two research partners use CrowdTangle data to understand public discussion around the Covid-19 pandemic. Brandon Silverman, the former CEO of CrowdTangle, left Facebook in October. And in January of this year, Meta “paused” granting new users access to CrowdTangle, citing what it said were staffing limitations. It has not resumed adding new partners to the service.
Recently, fewer than five engineers from Facebook’s London-based integrity team were working to keep CrowdTangle afloat, a person with knowledge of the matter said. That leaves little support for the tens of thousands of organizations that use the tool in their work, including major fact-checking organizations around the world such as Agence France-Presse in France and VERA Files in the Philippines, along with hundreds of academics, researchers, media outlets, and human rights activists in places like Myanmar and Sri Lanka.
No new features have been added to CrowdTangle in over 16 months. Before the team’s disbandment, it released updates several times a month and major new products every six months. Researchers are concerned that product instability could worsen during major events, as computing load increases, said Cody Buntain, an assistant professor and social media researcher at the University of Maryland. “I expect this load to change during midterms,” Buntain said. “There is legitimate concern about whether it will hold steady in the important time frame.”
Cameron Hickey, director of the Institute for Algorithmic Transparency at the National Conference on Citizenship, said his group is currently putting together a comprehensive watchlist of all the candidates on the ballot in 2022, and that this list, which thousands of voter-advocacy volunteers across the country have access to, lives in CrowdTangle. Meanwhile, Facebook has kept CrowdTangle closed to groups dedicated to combating misinformation about topics newly surfacing in the news, such as advocacy groups that want to fight abortion misinformation on the eve of a major Supreme Court ruling that may overturn Roe v. Wade, he said.
“For an investigative and transparency tool, Facebook is not adding necessary enhancements that would benefit the transparency and investigative community,” Hickey said. He cited longstanding bugs in the platform and missing features, such as the ability to filter out posts that have already been fact-checked by Facebook.
Meta said that when it learns of a potential issue with CrowdTangle, it addresses it as quickly as possible. The company added that it provides another dedicated tool for its third-party fact-checkers to analyze its social media apps and flag content that may be misleading.
Silverman, CrowdTangle’s former CEO, said the research community the team worked with had long seen how impactful data sharing was, but that CrowdTangle had “struggled” with how to tell that story in a broad, meaningful way, even within Meta. “In the last few months, I think that has started to change,” he said in an interview. “There is a greater recognition that getting to some baseline transparency has to be one of the first steps forward.”
The company has tried to promote its other transparency reports, such as the widely viewed content report it publishes every quarter, which was originally released as a rebuttal to CrowdTangle data suggesting that far-right personalities consistently dominate the platform. But researchers say a polished Meta report isn’t nearly as revealing as a tool they can use to ask their own questions. The company shelved the first content report it compiled as Facebook executives, including Alex Schultz, the company’s chief marketing officer, debated whether it would cause a public relations problem, according to the New York Times.
Most likely, experts say, Facebook will release a tool that mimics some of CrowdTangle’s features without giving users full access to its original capabilities. The company said it assigned its data transparency team to work on a replacement tool in a privacy-safe way. So far, those efforts fall short, researchers say. Those who have access to a separate search tool built for academic research say it is much less user-friendly. Buntain, the University of Maryland researcher, said researchers who want to use it must know how to code to extract analysis from the dataset, and academics have no idea how Meta compiles the data it provides.
In fact, researchers previously spotted an error by Facebook when they found a discrepancy between the data it provided to its research community and the data it published publicly through its widely viewed content report. The data provided to the researchers had left out about half of US Facebook users, those who engaged enough with political Pages to make their political leanings clear. That incident showed “the value of multiple points of view in data,” Buntain said.
CrowdTangle is unmatched in “its ease of use and the speed with which you can get insights,” Buntain added. “That can’t be overstated.”