The Debunkers of a Gay Marriage Study Just Re-bunked It, Sort Of
In 2014, a young researcher named Michael LaCour published some remarkable results in the journal Science. His study, written with the well-respected political scientist Donald Green, looked to see if a short, personal conversation with door-to-door canvassers could change people’s minds about gay marriage. It did, in a big way—especially when the canvassers were gay themselves.
Soon another young researcher, David Broockman, started looking into the canvassing data, hoping to replicate these exciting results. But when he started digging along with another UC Berkeley student, Joshua Kalla, they realized that LaCour had made it all up. It was blatant fraud, and the scandal was followed by a very hasty and very public retraction. A year after blowing the whistle, Broockman (now at Stanford) and Kalla have made good on the original goal: to extend the impressive (and impressively bogus) research.
Their study, also in Science, looked to see if a short, personal conversation with door-to-door canvassers could change people’s opinions about transgender people. It did. In a big way.
Nobody expected things to work out like this.
The overwhelming opinion in psychology and political science was that persuasion—real, substantive change of entrenched opinions—was pretty much impossible. Political strategists preferred to mobilize supporters, to get out the vote, rather than convert people on the other side of an election. “Those people who are 90 percent likely to vote for your candidate but only 50 percent likely to vote, you call them up 3,000 times,” says Andrew Gelman, a statistician and political scientist at Columbia. LaCour and Green’s results contradicted that established knowledge. That’s why Broockman and Kalla wanted to follow up on it in the first place.
It’s surprising, then, that their results—for now—have panned out. “It’s a little odd to talk about replicating a paper that has already been retracted,” Kalla says. But the fact that he and Broockman followed through on their plan is a thrilling moment for proponents of transparency and replication in the social sciences. Especially because it worked.
It’s not a coincidence that Broockman and Kalla were the ones to uncover LaCour’s fraud: They’re part of a new generation of scientists in fields from physics to psychology who have had ideas about transparency and replication ingrained in them. Both went to Yale, where they worked with some of the most prominent political scientists in the country—who also happened to be strong proponents of data transparency. “People started talking about these ideas when I was in undergrad,” Kalla says.
In 2009, Donald Green—one of Kalla’s professors and LaCour’s coauthor—and others founded Evidence in Governance and Politics. Among other things, it functions as a repository for preanalysis plans—essentially statements of intent for an upcoming research project. “There’s a lot of ways to analyze data, lots of judgment calls, and the ones that seem innocent can quickly build up,” Kalla says. “We submit our plans to tie our hands in advance.”
Baobao Zhang graduated from Yale undergrad in 2013—she overlapped with Broockman and Kalla—and she too remembers those early lessons in transparency. “There was a class that we all took with Alan Gerber,” she says. “We all learned to write these preregistration plans before we did our projects.” Kalla and Broockman submitted plans for this study, of course; LaCour seemingly fabricated evidence of that process.
Those practices won’t stay stuck inside Yale’s walls for long. In 2012, Green and Gerber published Field Experiments: Design, Analysis, and Interpretation, which includes many of the basics of preregistration and data sharing that are part of the standard curriculum in political science at Yale. That book, and others like it, has the potential to democratize ideas that until recently seemed like heresy in the social sciences. “When presenting the early drafts of the book to an assortment of political science departments,” Green writes in an email, “I found that it was often the first time that audiences had heard or thought about registration.” If they had heard about it, they were likely to express concerns that it could let people scoop their work. But the ideas are spreading.
The Next Generation
When Broockman first noticed discrepancies in the original Science paper, he tried to surface those issues in the political science forum poliscirumors.com. “But no one stood up, because they didn’t take this forum seriously,” says Zhang. “It was sort of a trolly place.” Still, it was one of the only places to discuss political science papers beyond the closed doors of individual offices and labs.
Zhang and her Yale PhD advisor Allan Dafoe want to make it easier to have those conversations by creating a web forum—tentatively named Science Commons—for post-publication peer review. “That way, mistakes can be discovered really quickly and then addressed, rather than waiting for months or years,” Zhang says. Similar sites like PubPeer have already led to multiple retractions based on comments.
A forum like that could also help organize replication attempts. For all the pats on the back Broockman and Kalla will be getting today, their study is still a single data point. The fact that persuasion worked in this one case—in South Florida, tackling transphobia—doesn’t mean it will work in other places. “It’s not like yesterday we were ignorant and today we know everything,” Broockman says. Field research is full of hidden variables, and political science has yet to tackle some even bigger problems about how to replicate studies full of so much human messiness.
Broockman points to a recent study that measured how effective call centers were at mobilizing voters. Some were great; others had no effect at all. “That kind of quality is this ineffable thing, and it’s not often reported,” says Broockman. “We don’t even know how to report it, and that should strike fear in our hearts.”
Clearly, it has. In the paper, he and Kalla call for replication attempts—desperately asking other researchers to try their methods out in other populations, on other ingrained opinions. The students have become the teachers.