Mental Health Centre Copenhagen, Denmark
Increased focus on replicability in science has led to legislation and regulation aimed at minimizing the file drawer phenomenon. An alternative approach could be to encourage authors to write papers with impact rather than papers in high-impact journals. Based on personal experience, this essay suggests a systematic framework developed to facilitate the extraction of valuable knowledge from a “failed” trial. First, “negative” results should be differentiated into inconclusive, neutral, negative, and statistically significant but clinically irrelevant. Second, to avoid cherry-picking of references, a systematic search should be performed when the results are integrated into current research. Third, acknowledging that the tested hypothesis might be wrong can initiate de-implementation in clinical practice and suggest that further research should look for an alternative approach.