A few years ago, as I started researching my book about the science of exercise recovery, I found something curious: the methodological flaws that have roiled psychology were also lurking in sports science. The problems were hiding in plain sight. As I plowed through the sports and exercise science literature, I saw many studies with small sample sizes, a journal system that appeared biased toward publishing studies showing that a treatment or regimen improves performance (over those that find no effect), and studies that collected multiple measures in a way that could make it tempting for researchers to fish around for a favorable result. What I wasn’t seeing, though, was much open discussion of these issues. Until now.
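To give a sense of scale for the sample-size problem, here is a back-of-the-envelope power calculation (my own illustrative sketch using the statsmodels library, not a figure from any of the studies; the effect size and group sizes are assumptions I chose for illustration):

```python
from statsmodels.stats.power import TTestIndPower

# Power of a two-group comparison to detect a medium effect
# (Cohen's d = 0.5) at alpha = 0.05, for various group sizes.
analysis = TTestIndPower()
for n in (10, 20, 50, 100):
    power = analysis.power(effect_size=0.5, nobs1=n, alpha=0.05)
    print(f"n = {n:>3} per group -> power = {power:.2f}")

# n = 10 per group gives power of roughly 0.18: more than 80% of
# such studies would miss a real, medium-sized effect entirely.
```

A study that small is far more likely to miss a genuine effect than to find it, and when it does declare a “significant” result, that result is more likely to be a fluke or an overestimate.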
Today at SportRxiv, a site where researchers can share their unpublished studies (so-called preprints) to get feedback before peer review, 36 researchers have released an editorial urging the field to adopt practices that have been gaining traction in the social sciences to combat “questionable research practices” such as p-hacking. P-hacking happens when researchers adjust the parameters of an analysis until they get a statistically significant p-value, a widely misunderstood number that’s often misused to imply a finding couldn’t have happened by chance.
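A toy simulation makes the problem concrete. In the sketch below (my own hypothetical illustration, not anything from the editorial), every “study” measures a treatment with zero true effect on five separate outcomes, then reports only the outcome with the smallest p-value:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def phacked_experiment(n_subjects=10, n_outcomes=5):
    """Simulate a null study: a 'treatment' with zero true effect,
    measured on several outcomes. Return the smallest p-value."""
    pvals = []
    for _ in range(n_outcomes):
        treatment = rng.normal(0, 1, n_subjects)  # no real effect
        control = rng.normal(0, 1, n_subjects)
        _, p = stats.ttest_ind(treatment, control)
        pvals.append(p)
    return min(pvals)  # report only the "best" outcome

n_studies = 10_000
false_positives = sum(phacked_experiment() < 0.05 for _ in range(n_studies))
print(f"False-positive rate: {false_positives / n_studies:.1%}")

# With 5 independent outcomes, roughly 1 - 0.95**5, or about 23%,
# of these null studies yield at least one result at p < 0.05.
```

Run it and roughly a quarter of these do-nothing studies clear the conventional p < 0.05 bar, even though nothing real is going on. Collecting many measures isn’t the sin; quietly reporting only the flattering one is.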
The goal is to bring more transparency, openness and rigor to the field, much as the open science movement has done in psychology. Judging by our reporting on a dubious statistical practice in sports science, it’s a movement that’s badly needed.
The researchers involved in the editorial are looking to psychology for a road map. They’ve formed the Society for Transparency, Openness, and Replication in Kinesiology (STORK), modeled after the Society for the Improvement of Psychological Science, which has brought psychology researchers together to develop better research practices.
STORK is beginning by focusing on two approaches to improving research practices: preregistration and registered reports. With preregistration, researchers submit their hypotheses in advance and commit to a specific methodology and analysis plan, which they post in an independent registry. This prevents researchers from playing around with different ways of looking at their data until they get an appealing result, said lead author Aaron Caldwell, a graduate student in exercise science at the University of Arkansas, Fayetteville.
Registered reports, on the other hand, let researchers submit their studies to journals to be accepted or rejected based on the rigor of their methodology, rather than on the sexiness of their results. With a registered report, scientists submit their methods and their plan for data collection and analysis to the journal before they do the study. Peer reviewers then have a chance to comment and offer suggestions on the methods, and the paper is given conditional acceptance based on the design. “Registered Reports provide authors peace of mind that publication is not dependent on results,” Caldwell and his colleagues write.
Right now, exercise scientists don’t have many resources for putting these two tools into practice. The idea for the new editorial came out of a discussion in a Facebook group where sport and exercise scientists talk methods. Caldwell asked if anyone knew of any journals in the field that accepted registered reports. The answer was no, so he suggested that the group draft a “call” for registered reports.
The reaction so far has ranged from “This is good — it’s how science should be operating” to “Why are you trying to make science harder to do? It’s already hard enough,” Caldwell said. A few people have said, “That’s fine, but I’m not interested.” But overall, said Andrew Vigotsky, the paper’s corresponding author, “the field has been pretty receptive” to this conversation about improving methodology. “It helps that people see these other fields (psychology, social sciences) going through changes, so the push is not too surprising.”