The following comes from a May 25 New York Times article by Benedict Carey and Pam Belluck:
He was a graduate student who seemingly had it all: drive, a big idea and the financial backing to pay for a sprawling study to test it.
In 2012, as same-sex marriage advocates were working to build support in California, Michael LaCour, a political science researcher at the University of California, Los Angeles, asked a critical question: Can canvassers with a personal stake in an issue — in this case, gay men and women — actually sway voters’ opinions in a lasting way?
He would need an influential partner to help frame, interpret and place into context his findings — to produce an authoritative scientific answer. And he went to one of the giants in the field, Donald P. Green, a Columbia University professor and co-author of a widely used text on field experiments.
“I thought it was a very ambitious idea, so ambitious that it might not be suitable for a graduate student,” said Dr. Green, who signed on as a co-author of Mr. LaCour’s study in 2013. “But it’s such an important question, and he was very passionate about it.”
Last week, their finding that gay canvassers were in fact powerfully persuasive with people who had voted against same-sex marriage — published in December in Science, one of the world’s leading scientific journals — collapsed amid accusations that Mr. LaCour had misrepresented his study methods and lacked the evidence to back up his findings.
On Tuesday, Dr. Green asked the journal to retract the study because of Mr. LaCour’s failure to produce his original data. Mr. LaCour declined to be interviewed, but has said in statements that he stands by the findings.
The case has shaken not only the community of political scientists but also public trust in the way the scientific establishment vets new findings. It raises broad questions about the rigor of the rules that govern a leading academic’s oversight of a graduate student’s research, and about the peer review of that research conducted by Science.
New, previously unreported details have emerged that suggest serious lapses in the supervision of Mr. LaCour’s work. For example, Dr. Green said he had never asked Mr. LaCour to detail who was funding their research, and Mr. LaCour’s lawyer has told Science that Mr. LaCour did not pay participants in the study the fees he had claimed.
Dr. Green, who never saw the raw data on which the study was based, said he had repeatedly asked Mr. LaCour to post the data in a protected databank at the University of Michigan, where they could be examined later if needed. But Mr. LaCour did not.
“It’s a very delicate situation when a senior scholar makes a move to look at a junior scholar’s data set,” Dr. Green said. “This is his career, and if I reach in and grab it, it may seem like I’m boxing him out.”
But Dr. Ivan Oransky, a co-founder of “Retraction Watch,” which first published news of the allegations and Dr. Green’s retraction request, said, “At the end of the day he decided to trust LaCour, which was, in his own words, a mistake.”
Critics said the intense competition by graduate students to be published in prestigious journals, weak oversight by academic advisers and the rush by journals to publish studies that will attract attention too often led to sloppy and even unethical research methods. The now disputed study was covered by The New York Times, The Washington Post and The Wall Street Journal, among others.
Mr. LaCour approached Dr. Green with his idea after class at a summer workshop on research methods.
His proposal was intriguing. Previous work had found that standard campaign tactics — ads, pamphleteering, conventional canvassing — did not alter core beliefs in a lasting way. Mr. LaCour wanted to test canvassing done by people who would personally be affected by the outcome of the vote.
His timing was perfect. The Los Angeles LGBT Center, after losing the fight over Proposition 8, which barred same-sex marriage in California, was doing just this sort of work in conservative parts of the county and wanted to see if it was effective. Dave Fleischer, director of the center’s leadership lab, knew Dr. Green and had told him of the center’s innovative canvassing methods.
“Don said we were in luck because there was a Ph.D. candidate named Mike LaCour who was interested in doing an experiment,” Mr. Fleischer said.
Money seemed ample for the undertaking — and Dr. Green did not ask where exactly it was coming from.
“Michael said he had hundreds of thousands in grant money, and, yes, in retrospect, I could have asked about that,” Dr. Green said. “But it’s a delicate matter to ask another scholar the exact method through which they’re paying for their work.”
The canvassing was done rigorously, Mr. Fleischer said. The LGBT Center sent people into neighborhoods that had voted against same-sex marriage, including Boyle Heights, South Central and East Los Angeles. The voters were randomly assigned to either gay or straight canvassers, who were trained to engage them respectfully in conversation.
Mr. LaCour’s job was to track those voters’ attitudes toward same-sex marriage multiple times, over nine months, using a survey tool called the “feeling thermometer,” intended to pick up subtle shifts. He reported that 12 percent of participants completed surveys, a response rate so high that Dr. Green insisted the work be replicated to make sure it held up.
The LaCour-Green findings electrified some in the field. Joshua Kalla, a Ph.D. candidate at the University of California, Berkeley, saw the study presented before it was published.
“It was very exciting, and partly because it wasn’t just theoretical, it was something that could be applied in campaigns,” he said.
He and a fellow student, David Broockman, who will soon be an assistant professor at Stanford, decided to test the very same approach on another political issue, also working with the Los Angeles LGBT Center. Mr. Fleischer of the center said the issue was transgender equality in Florida. Mr. Kalla and Dr. Broockman paid participants as they thought Mr. LaCour had, but their response rate was only 3 percent.
“We started to wonder, ‘What are we doing wrong?’ ” Mr. Kalla said. “Our response rate was so low, compared to his.”
There are now serious questions about whether Mr. LaCour achieved the high response rate he claimed. He has acknowledged that he did not pay participants as he had claimed, according to Dr. Green and Dr. Marcia McNutt, the editor in chief of Science.
Dr. Green asked Mr. LaCour for the raw data after the study came under fire. Mr. LaCour said in a letter to Dr. McNutt that he had erased the raw data months earlier, “to protect those who answered the survey,” Dr. McNutt said.
She said that it was possible some voters had responded to some surveys, but that it was most likely that too few had done so to provide enough data to reach persuasive conclusions.
Survey data comes in many forms, and the form that journal peer-reviewers see and that appears with the published paper is the “cleaned” and analyzed data. These are the charts, tables, and graphs that extract meaning from the raw material — piles of questionnaires, transcripts of conversations, “screen grabs” of online forms. Many study co-authors never see the raw material.
Mr. Kalla, trying to find out why he and Dr. Broockman were getting such a low response rate, called the survey company that had been working with Mr. LaCour. The company, which he declined to name, denied any knowledge of the project, he said.
“We were over at Dave’s place, and he was listening to my side of the conversation, and when I hung up, we just looked at each other,” he said. “Then we went right back into the data, because we’re nerdy data guys and that’s what we do.”
On Saturday, they quickly found several other anomalies in Mr. LaCour’s analysis and called their former instructor, Dr. Green. Over the weekend, the three of them, with the help of an assistant professor at Yale, Peter Aronow, discovered that statistical manipulations could easily have accounted for the findings. Dr. Green called Mr. LaCour’s academic adviser, Lynn Vavreck, a professor at U.C.L.A., who confronted Mr. LaCour.
Dr. McNutt of Science said editors there were still grappling with a decision on retracting.
“This has just hit us,” she said. “There will be a lot of time for lessons learned. We’re definitely going to be thinking a lot about this and what could have been done to prevent this from happening.”