When I saw this article on NPR's website, I immediately recognized how it applies to projects as well as politics. Brendan Nyhan, a health policy researcher at the University of Michigan, published "When Corrections Fail: The Persistence of Political Misperceptions." I suggest you check out the entire article, but I've posted some of my favorite exchanges from the interview below to give you a taste of the goodness it contains:
Mr. BRENDAN NYHAN (Robert Wood Johnson Scholar in Health Policy Research, University of Michigan): Thanks for having me.
CONAN: And when facts are readily available, why are they not enough to change people's minds?
Mr. NYHAN: Well, the problem is, you know, as human beings, we want to believe, you know, the things that we already believe. And so when you hear some information that contradicts your pre-existing views, unfortunately, what we tend to do is think of why we believed those things in the first place.
And, you know, so when, you know, we get these corrections, we tend to say I'm right, and I'm going to stick with my view. And the thing that my research, which is with Jason Reifler at Georgia State University, found is that in some cases, that corrective information can actually make the problem worse...
CONAN: This is a phenomenon described as backfire. You say it's a natural defense mechanism to avoid cognitive dissonance...So this isn't a question of education, necessarily, or sophistication. It's really about, it's really about preserving that belief that we initially held.
CONAN: And you define sophistication, as I read your piece, you define it as somebody who is right a lot of the time, but the 10 percent of the time they're wrong, boy, they stick to being wrong...
Brendan Nyhan, why is it that highly partisan issues seem to be most subject to this backfire phenomenon?
Mr. NYHAN: Well, I think they're the cases where people care most about the actual outcome of the debate....
Mr. NYHAN: ...one of the things that I've advocated is holding elites responsible for putting this information out there in the first place, and the kind of fact-checking that Dana has done is one way to do that. And even if it isn't effective at changing people's minds out there, maybe it makes people think twice about putting the information out there in the first place.
CONAN: And Dana, that suggests that there is a, you know, the shame factor: If you can hold the elites responsible for putting out information that's just, well, flat wrong, maybe they can then tell their supporters I might have missed that one.
Mr. MILBANK: Well, I think it's useful to attempt to do that. I'm kind of doubtful that it actually has any effect....
I don't necessarily agree with Brendan that 'shaming' can be used successfully, especially on a project where people have to keep working with one another every day after such an event, but this type of research has some fascinating implications for how we, as project people, work with each other and our stakeholders.