I have a tab on this blog for “marketing science”. The concept: scientifically validated research should tell us which marketing models will work best, and which will not. Presumably, if we follow the science, rather than our hidebound traditional practices or just flying by the seat of our pants, we should achieve much better results.
Alas, scientific researchers have dropped a rather big fly into that ointment. They’ve discovered that as many as six of 10 psychological science experiments cannot be replicated. In other words, even following the criteria of the original experiment as closely as possible, the researchers could not duplicate the findings.
And the story may be worse than that.
“Replication” is one of science’s main tenets. You publish your work publicly, and then others should be able to reproduce the same results. But it isn’t very sexy to spend your time replicating experiments. You don’t get any credit for proving the original researcher correct. And if you find things don’t match up (and the original research carries big names or any sort of fame behind it), you face an uphill battle showing that the fault lies with the original study, not with your replication.
The journal Science published what is described as a “landmark study” by more than 270 researchers, who came together to replicate 100 recent findings from top psychology journals.
Especially problematic: if this data is correct, things may be much worse than they seem.
First, some 50 studies were eliminated because the replicators deemed them too difficult to even try to replicate. Second, the replicators co-operated and co-ordinated their research with the original study authors. This could lead to selective data decisions and other choices that would skew the results in favor of a successful replication. Finally, the original studies were published in major peer-reviewed journals; the scientists presumably should have known the basics of research design and study validity before publishing their results.
Does this mean you should ignore the scientific observations when you plan your marketing strategies? I think not, especially since even if only 40 per cent of the studies hold up, they still could offer a worthy edge over nothing but winging it. As well, I expect that if you apply some scientific discipline, including measuring your results, in your marketing work, you’ll be able to gather useful data to improve and enhance your actual results.
Equally, however, if you read some “wow” marketing science, take it with a grain of salt. Odds are, the experiment cannot be repeated.