ABSTRACT

The methods used to analyze data in organizational research are increasingly complex and diverse; a survey of all articles published over two years in two leading journals reveals a wide mix of analytic methods applied to what are often similar questions. Although complex data analysis methods sometimes allow researchers to tackle questions that would evade simpler methods, the likelihood that complex analyses are performed correctly and, more important, interpreted correctly decreases as complexity increases. As methods have departed from the familiar frameworks of ANOVA and OLS regression, they have become increasingly dependent on null hypothesis tests and less likely to report interpretable effect size measures. I illustrate the trade-off between complexity and interpretability by showing how the key components of these significance tests (i.e., standard error terms) become increasingly difficult to understand as analyses become more complex. I describe ways researchers could make better use of descriptive statistics to make data analyses more meaningful and more interpretable, and I lay out the analytic tools that are most likely to prove useful to organizational scientists who wish to understand what their data mean.
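To make the abstract's point about standard error terms concrete, the sketch below (my illustration, not drawn from the article) computes the classical OLS coefficient standard errors by hand. The closed form SE(beta_j) = sqrt(sigma_hat^2 * [(X'X)^{-1}]_{jj}) is transparent in a way that the approximation- or simulation-based standard errors of more complex models (e.g., multilevel or structural equation models) typically are not. All data and variable names here are hypothetical.

```python
# Minimal sketch, assuming simulated data: the OLS standard error term
# that the abstract describes as the interpretable baseline case.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 predictors
beta_true = np.array([1.0, 0.5, -0.3])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

# Classical OLS estimates and standard errors:
#   beta_hat = (X'X)^{-1} X'y
#   SE(beta_hat_j) = sqrt( sigma_hat^2 * [(X'X)^{-1}]_{jj} )
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])  # residual variance estimate
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))    # one standard error per coefficient

for j, (b, s) in enumerate(zip(beta_hat, se)):
    print(f"beta_{j}: estimate = {b:.3f}, SE = {s:.3f}, t = {b / s:.2f}")
```

Every quantity in this computation (residual variance, the diagonal of (X'X)^{-1}) has a direct substantive reading; that transparency is what tends to be lost as the analysis grows more complex.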