What Crisis? Management Researchers’ Experiences with and Views of Scholarly Misconduct

  • Original Paper
  • Published in: Science and Engineering Ethics

Abstract

This research presents the results of a survey regarding scientific misconduct and questionable research practices, elicited from a sample of 1215 management researchers. We find that misconduct (research that was either fabricated or falsified) is not encountered often by reviewers or editors. Yet there is a strong prevalence of misrepresentations (inadequate methods, omission or withholding of contradictory results, dropping of unsupported hypotheses). When it comes to potential methodological improvements, those who are skeptical about the empirical body of work being published see merit in replication studies. Yet a sizeable majority of editors and authors eschew open data policies, which points to hidden costs and limited incentives for data sharing in management research.

Notes

  1. An August 2017 search of the Web of Science for articles with the term “replication” in the title found only 125 articles in journals included in the FT 45 journal list.

  2. See http://onlinelibrary.wiley.com/doi/10.1002/smj.2016.37.issue-11/issuetoc.

  3. Not all replication studies in the SMJ special issue reach findings similar to those of the original work. Tsang and Yamanoi (2016) point out inconsistencies in hypothesis development, along with a lack of generalizability, based on a sample from Barkema and Vermeulen’s (1998) study. Park et al. (2016) fail to replicate the major findings of the three studies they sought to replicate.

  4. As the questionnaire design and analysis took place outside the US, no university institutional review board was involved in the oversight of this research. The US-based co-author was not involved in data collection and had no access to identifiable data.

  5. In 2016, the FT 45 added five journals to become the FT 50.

  6. To reflect on the other side of the process, we also asked journal editors whether manuscripts should contain statistically significant results. This question is based on the work of Devaney (2001). Of the 191 respondents to this question, 131 responded that manuscripts should indeed contain significant results. This may, in part, relate to the perceptions of authors and reviewers reported here.

  7. One question (following Devaney 2001), addressed only to those with editorial responsibilities, asked whether replication studies were appropriate for publication. An overwhelming majority of editors (84%) responded that such studies were appropriate.

  8. Respondents received a link to the journal list so that they could verify that the journals they had published in appeared on the list used in the study.

  9. When it comes to differences across the groups, three out of four editors report at least one FT 45 publication, and one third of editors have more than five FT 45 publications. Among those in non-editor roles, more than 40% report zero FT 45 publications. Among those reviewing for FT 45 journals, 82% report at least one FT 45 publication and 20% have more than five, while two out of three of those not reviewing for FT 45 journals report zero FT 45 publications.

  10. In some instances, though, replication has helped to identify fraudulent behavior, as evidenced by Broockman et al. (2015).

References

  • Andreoli-Versbach, P., & Mueller-Langer, F. (2014). Open access to data: An ideal professed but not practised. Research Policy, 43(9), 1621–1633.

  • Antes, A. L., English, T., Baldwin, K. A., & DuBois, J. M. (2018). The role of culture and acculturation in researchers’ perceptions of rules in science. Science and Engineering Ethics, 24(2), 361–391.

  • Asendorpf, J. B., Conner, M., De Fruyt, F., De Houwer, J., Denissen, J. J., Fiedler, K., et al. (2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27(2), 108–119.

  • Azoulay, P., Bonatti, A., & Krieger, J. L. (2017). The career effects of scandal: Evidence from scientific retractions. Research Policy, 46(9), 1552–1569.

  • Azoulay, P., Furman, J. L., Krieger, J. L., & Murray, F. (2015). Retractions. Review of Economics and Statistics, 97(5), 1118–1136.

  • Azoulay, P., Graff Zivin, J., & Wang, J. (2010). Superstar extinction. The Quarterly Journal of Economics, 125(2), 549–589.

  • Barkema, H. G., & Vermeulen, F. (1998). International expansion through start-up or acquisition: A learning perspective. Academy of Management Journal, 41(1), 7–26.

  • Bebeau, M., & Davis, E. (1996). Survey of ethical issues in dental research. Journal of Dental Research, 75(2), 845–855.

  • Bedeian, A. G., Taylor, S. G., & Miller, A. N. (2010). Management science on the credibility bubble: Cardinal sins and various misdemeanors. Academy of Management Learning and Education, 9(4), 715–725.

  • Berghmans, S., Cousijn, H., Deakin, G., Meijer, I., Mulligan, A., Plume, A., et al. (2017). Open data: The researcher perspective-survey and case studies. New York: Mendeley Data.

  • Borgman, C. L. (2012). The conundrum of sharing research data. Journal of the American Society for Information Science and Technology, 63(6), 1059–1078.

  • Borgman, C. L. (2015). Big data, little data, no data: Scholarship in the networked world. Cambridge, MA: MIT Press.

  • Brandt, M. J., IJzerman, H., Dijksterhuis, A., Farach, F. J., Geller, J., Giner-Sorolla, R., et al. (2014). The replication recipe: What makes for a convincing replication? Journal of Experimental Social Psychology, 50(1), 217–224.

  • Brodeur, A., Lé, M., Sangnier, M., & Zylberberg, Y. (2016). Star wars: The empirics strike back. American Economic Journal: Applied Economics, 8(1), 1–32.

  • Broockman, D., Kalla, J., & Aronow, P. (2015). Irregularities in LaCour (2014). Working paper, Stanford University.

  • Bülow, W., & Helgesson, G. (2018). Criminalization of scientific misconduct. Medicine, Health Care and Philosophy. https://doi.org/10.1007/s11019-018-9865-7.

  • Camerer, C. F., Dreber, A., Forsell, E., Ho, T. H., Huber, J., Johannesson, M., et al. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351(6280), 1433–1436.

  • Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T., Huber, J., Johannesson, M., et al. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644.

  • Chambers, C. (2017). The seven deadly sins of psychology: A manifesto for reforming the culture of scientific practice. Princeton: Princeton University Press.

  • Chang, A. C., & Li, P. (2015). Is economics research replicable? Sixty published papers from thirteen journals say “Usually Not”. Finance and Economics Discussion Series 2015-083. Washington, D.C.: Board of Governors of the Federal Reserve System.

  • Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99.

  • Devaney, T. A. (2001). Statistical significance, effect size, and replication: What do the journals say? The Journal of Experimental Education, 69(3), 310–320.

  • Devereaux, P. J., Guyatt, G., Gerstein, H., Connolly, S., & Yusuf, S. (2016). Toward fairness in data sharing. The New England Journal of Medicine, 375(5), 405–407.

  • Eastwood, S., Derish, P., Leash, E., & Ordway, S. (1996). Ethical issues in biomedical research: Perceptions and practices of postdoctoral research fellows responding to a survey. Science and Engineering Ethics, 2(1), 89–114.

  • Evanschitzky, H., Baumgarth, C., Hubbard, R., & Armstrong, J. S. (2007). Replication research’s disturbing trend. Journal of Business Research, 60(4), 411–415.

  • Fanelli, D. (2009). How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS ONE, 4(5), e5738.

  • Fanelli, D. (2013). Only reporting guidelines can save (soft) science. European Journal of Personality, 27(2), 124–125.

  • Fecher, B., Friesike, S., & Hebing, M. (2015). What drives academic data sharing? PLoS ONE, 10(2), e0118053.

  • Frank, M. C., & Saxe, R. (2012). Teaching replication. Perspectives on Psychological Science, 7(6), 600–604.

  • Gardner, W., Lidz, C. W., & Hartwig, K. C. (2005). Authors’ reports about research integrity problems in clinical trials. Contemporary Clinical Trials, 26(2), 244–251.

  • Gigerenzer, G. (2018). Statistical rituals: The replication delusion and how we got there. Advances in Methods and Practices in Psychological Science, 1(2), 198–218.

  • Gorman, D. M., Elkins, A. D., & Lawley, M. (2017). A systems approach to understanding and improving research integrity. Science and Engineering Ethics. https://doi.org/10.1007/s11948-017-9986-z.

  • Hambrick, D. C. (2007). The field of management’s devotion to theory: Too much of a good thing? Academy of Management Journal, 50(6), 1346–1352.

  • Harley, B., Faems, D., & Corbett, A. (2014). A few bad apples or the tip of an iceberg? Academic misconduct in publishing. Journal of Management Studies, 51(8), 1361–1363.

  • Hartshorne, J. K., & Schachner, A. (2012). Tracking replicability as a method of post-publication open evaluation. Frontiers in Computational Neuroscience, 6, 8.

  • Honig, B., Lampel, J., Siegel, D., & Drnevich, P. (2014). Ethics in the production and dissemination of management research: Institutional failure or individual fallibility? Journal of Management Studies, 51(1), 118–142.

  • Hubbard, R. (2015). Corrupt research: The case for reconceptualizing empirical management and social science. Los Angeles: Sage Publications.

  • Hubbard, R., Vetter, D. E., & Little, E. L. (1998). Replication in strategic management: Scientific testing for validity, generalizability, and usefulness. Strategic Management Journal, 19(3), 243–254.

  • Ioannidis, J. P., & Khoury, M. J. (2014). Assessing value in biomedical research: The PQRST of appraisal and reward. Journal of the American Medical Association, 312(5), 483–484.

  • Jasny, B. R., Chin, G., Chong, L., & Vignieri, S. (2011). Again, and again, and again…. Science, 334(6060), 1225.

  • John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532.

  • Karabag, S. F., & Berggren, C. (2016). Misconduct, marginality and editorial practices in management, business and economics journals. PLoS ONE, 11(7), e0159492.

  • Kattenbraker, M. (2007). Health education research and publication: Ethical considerations and the response of health educators. Doctoral thesis, Southern Illinois University, Carbondale.

  • Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196–217.

  • Koole, S. L., & Lakens, D. (2012). Rewarding replications: A sure and simple way to improve psychological science. Perspectives on Psychological Science, 7(6), 608–614.

  • Leung, K. (2011). Presenting post hoc hypotheses as a priori: Ethical and theoretical issues. Management and Organization Review, 7(3), 471–479.

  • Levelt, C., Noort, C., & Drenth, C. (2012). Falende wetenschap: De frauduleuze onderzoekspraktijken van sociaal-psycholoog Diederik Stapel [Flawed science: The fraudulent research practices of social psychologist Diederik Stapel]. Tilburg: Tilburg University.

  • Lichtenthaler, U. (2010). RETRACTED: Determinants of proactive and reactive technology licensing: A contingency perspective. Research Policy, 39(1), 55–66.

  • List, J. A., Bailey, C. D., Euzent, P. J., & Martin, T. L. (2001). Academic economists behaving badly? A survey on three areas of unethical behavior. Economic Inquiry, 39(1), 162–170.

  • Longo, D., & Drazen, J. (2016). Data sharing. New England Journal of Medicine, 374(1), 276–277.

  • Martinson, B. C., Anderson, M. S., & De Vries, R. (2005). Scientists behaving badly. Nature, 435(7043), 737–738.

  • McCullough, B. D., McGeary, K. A., & Harrison, T. D. (2008). Do economics journal archives promote replicable research? Canadian Journal of Economics/Revue canadienne d’économique, 41(4), 1406–1420.

  • Mellers, B., Hertwig, R., & Kahneman, D. (2001). Do frequency representations eliminate conjunction effects? An exercise in adversarial collaboration. Psychological Science, 12(4), 269–275.

  • Merton, R. K. (1942). A note on science and democracy. Journal of Legal and Political Sociology, 1, 115.

  • Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631.

  • O’Boyle, E. H., Banks, G. C., & Gonzalez-Mulé, E. (2017). The chrysalis effect: How ugly initial results metamorphosize into beautiful articles. Journal of Management, 43(2), 376–399.

  • Open Science Collaboration. (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7(6), 657–660.

  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.

  • Park, U. D., Borah, A., & Kotha, S. (2016). Signaling revisited: The use of signals in the market for IPOs. Strategic Management Journal, 37(11), 2362–2377.

  • Pashler, H., & Harris, C. R. (2012). Is the replicability crisis overblown? Three arguments examined. Perspectives on Psychological Science, 7(6), 531–536.

  • Pashler, H., & Wagenmakers, E. J. (2012). Editors’ introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7(6), 528–530.

  • Retraction Watch. (2016a). http://retractionwatch.com/category/by-author/ulrich-lichtenthaler/. Accessed 29 Nov. 2018.

  • Retraction Watch. (2016b). http://retractionwatch.com/category/by-author/walumbwa/. Accessed 29 Nov. 2018.

  • Rousseau, D. M. (2006). Is there such a thing as “evidence-based management”? Academy of Management Review, 31(2), 256–269.

  • Schmidt, S. (2009). Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Review of General Psychology, 13(2), 90.

  • Schooler, J. (2011). Unpublished results hide the decline effect. Nature, 470(7335), 437.

  • Seifert, B., & Gasser, T. (2004). Local polynomial smoothing. In S. Kotz, C. B. Read, N. Balakrishan, B. Vidakovic, & N. L. Johnson (Eds.), Encyclopedia of statistical sciences. Hoboken: Wiley.

  • Silberzahn, R., Simonsohn, U., & Uhlmann, E. L. (2014). Matched-names analysis reveals no evidence of name-meaning effects: A collaborative commentary on Silberzahn and Uhlmann (2013). Psychological Science, 25(7), 1504–1505.

  • Silberzahn, R., & Uhlmann, E. L. (2013). It pays to be Herr Kaiser: Germans with noble-sounding surnames more often work as managers than as employees. Psychological Science, 24(12), 2437–2444.

  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366.

  • Steen, R. G. (2011). Retractions in the scientific literature: Is the incidence of research fraud increasing? Journal of Medical Ethics, 37(4), 249–253.

  • Stroebe, W., & Strack, F. (2014). The alleged crisis and the illusion of exact replication. Perspectives on Psychological Science, 9(1), 59–71.

  • Tsang, E. W., & Yamanoi, J. (2016). International expansion through start-up or acquisition: A replication. Strategic Management Journal, 37(11), 2291–2306.

  • Walumbwa, F. O., Luthans, F., Avey, J. B., & Oke, A. (2011). Retracted: Authentically leading groups: The mediating role of collective psychological capital and trust. Journal of Organizational Behavior, 32(1), 4–24.

  • Zigmond, M. J., & Fischer, B. A. (2002). Beyond fabrication and plagiarism: The little murders of everyday science. Science and Engineering Ethics, 8(2), 229–234.

Author information

Corresponding author

Correspondence to Christian Hopp.

About this article

Cite this article

Hopp, C., Hoover, G.A. What Crisis? Management Researchers’ Experiences with and Views of Scholarly Misconduct. Sci Eng Ethics 25, 1549–1588 (2019). https://doi.org/10.1007/s11948-018-0079-4
