A fascinating experiment into measuring dishonesty

September 22, 2016

Is peer review a major deterrent in keeping science honest?

Scientific misconduct is a serious issue that has resulted in hefty fines, ruined reputations and even prison sentences.

If a scientific paper is a building block in the foundation of knowledge of a specific subject area, a fraudulent paper has the potential to destabilize anything built upon it.

But just how dishonest are we as humans? Can you even measure dishonesty? And do existing peer-review and ethics guidelines deter researchers from heading down this path?

With these questions in mind, Darren Sugrue, Marketing Communications Manager, Elsevier, reached out to Professor Dan Ariely and Yael Melamede (producer of the Oscar-winning Inocente) and asked them to share some of their findings from making the recent documentary (Dis)Honesty: The Truth About Lies.

Darren: How do you even begin with measuring something as abstract as dishonesty?

Dan: At the beginning of 2002, we started a series of studies called The Matrix Experiments. We gave people 20 simple math problems: each one was a matrix of numbers where people had to find the two numbers that added up to ten. It was a simple enough exercise that anyone could do, but we didn't give them enough time. At the end of five minutes, people had to put their pencils down and write on another piece of paper how many they had solved correctly. They then put the original test paper in the shredder, so nobody would know the true number they had solved. They received $1 for each problem they claimed to have solved correctly.

Shredder with a twist

What they didn't know is that we had modified the shredder! It only shredded the sides of the page, while the body of the page remained intact. Over 40,000 people, from all walks of life, participated in The Matrix Experiments.

What did we find?

  • On average, people solved four problems but reported solving six.

  • Nearly 70% cheated.

  • Only 20 out of the 40,000 were “big cheaters”, people who claimed to have solved all 20 problems. They cost the experiment $400.

  • We also found more than 28,000 “little cheaters” who cost the experiment $50,000.

So although there are some big cheaters out there, they are very rare and their overall economic impact is relatively low. On the other hand, there are a lot more “little cheaters” out there and their economic impact is incredibly high.
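As a rough back-of-the-envelope check (not part of the original study analysis), the quoted figures hang together: 20 big cheaters claiming all 20 problems at $1 each comes to $400, and the $50,000 attributed to the little cheaters works out to under $2 of over-claiming per person, consistent with the roughly two-problem gap between what people solved and what they reported. A minimal sketch of that arithmetic, using only the numbers stated above:

```python
# Back-of-the-envelope check of the Matrix Experiments figures quoted above.
# All counts and dollar amounts come straight from the interview; the
# per-person breakdown is just arithmetic on those reported numbers.

PAYOUT_PER_PROBLEM = 1      # $1 for each problem a participant claimed to solve
BIG_CHEATERS = 20           # participants who claimed all 20 problems
LITTLE_CHEATERS = 28_000    # "more than 28,000" modest over-reporters
LITTLE_CHEATER_COST = 50_000  # total cost attributed to little cheaters

big_cheater_cost = BIG_CHEATERS * 20 * PAYOUT_PER_PROBLEM
print(f"Big cheaters cost: ${big_cheater_cost}")  # $400, as stated

avg_overclaim = LITTLE_CHEATER_COST / LITTLE_CHEATERS
print(f"Average over-claim per little cheater: ${avg_overclaim:.2f}")
# ~ $1.79 each, in line with the reported gap of roughly two problems
# (solved four on average, reported six).
```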


Darren: As a researcher yourself, do you believe that this cheating behavior is occurring on a similar scale within scholarly publishing or do peer-review and ethics guidelines deter most researchers?

Dan: So I think that the problem replicates itself entirely in the world of academic publications. I don’t think that anybody who wants to maximize their lifetime income would pick academia as the path for this process – it’s not a profitable path. But people behave dishonestly for all kinds of reasons. There’s something called “pilot data”, and the moment you call something “pilot data” you have a different process around it.

There are pressures for funding. Imagine you run a big lab with 20 people and you're about to run out of funding – what are the pressures of taking care of the people who work with you? And what kind of shortcuts would you be willing to take? Dishonesty can permeate a system and show up not because of selfish interest but because of a desire to help.

With regard to research, I don't think that most people think long term and imagine that somebody will try to replicate their results and find that they don't work. People often tend to "tweak" data and convince themselves that they are simply helping the data show its true nature. There are lots of things in academic publications that are manifestations of our ability to rationalize.

We need to question and evaluate the funding of science. So for example, should we have these big grants that run out every three or five years? Or should we fund things in a different way and maybe have different funding models? Should we allow people to be funded by big pharma or non-commercial companies?  In essence, I think that the pressures of publication, funding, helping the group, and reputation are very much present in academic publications. They clearly play a significant role in the challenges that are facing academic research.

Combating dishonesty

Inspired by the extraordinary conversations around ethics sparked by the film over the past year, Dan and Yael are now working to transform ethics education in schools, corporations and organizations of all kinds.

"Ethics is like health and therefore something we need to invest in, monitor, be mindful of and continuously consider - as individuals and as a community," says Dan. "If we only exercised once a year, it would not be helpful. So how can we make ethics a more salient part of our day-to-day?"

(Dis)Honesty – The Truth About Lies is a documentary feature film that explores the human tendency to be dishonest. Inspired by the work of behavioral economist Dan Ariely, the film interweaves personal stories, expert opinions, behavioral experiments, and archival footage to reveal how and why people lie. Watch the trailer.

“Brilliant” – Psychology Today

"Illuminating" - New York Times

“A deep-think doc animated by the researcher at its center” – The Hollywood Reporter
