Quality of Research Evidence

Few of us would argue with the importance of evidence-based practice (EBP) in sports medicine, but to develop such a culture we must have ‘quality’ research literature (1). Research is simply a way of solving problems. Questions are raised, and methods are devised in an attempt to answer them (2). Research enables us to convert theory into practice to solve clinical problems. Research in medicine and sciences has developed within a framework of thinking that is known as the ‘scientific method’. This framework has become the predominant model for rigorous research investigation.


The quality issue

Unfortunately, owing to the vast quantity of research material now available, searching for and locating relevant literature can be difficult and often very time-consuming. Furthermore, the ‘quality’ of research evidence cannot always be relied upon for EBP. This article will attempt to identify some of the typical pitfalls and discuss the research skills required for EBP.



Evidence-based practice (EBP) is a process of turning clinical problems into questions, then systematically locating, critically appraising, and using robust contemporary research as the basis for clinical decisions (3). It is widely accepted that reading peer-reviewed research articles is essential for EBP and vital to continuing professional development (CPD) (4, 5, 6). Journal reading enables the therapist to keep up to date with current thinking (7). However, research evidence should never be accepted blindly, and certainly not taken at face value (8). Research therefore needs to be critically evaluated for both quality (validity) and relevance (9) to help the clinician make better use of the evidence (10) and so inform clinical decisions and practice. Rothstein (11) realised that the survival of his profession (physical therapy) depended not on the quantity of dubious research but on the quality of focused and meaningful research. As Rosenberg and Donald (3) put it, “we are confronted by a growing body of information, much of it invalid or irrelevant to clinical practice”.


Many authors share this view. Marshall (8), for example, reminds us of the need to read literature with a critical mind, even that published in peer-reviewed journals. More drastically, Greenhalgh (12) suggests that most published articles belong in the bin, and should certainly not be used to inform practice. According to Del Mar (13), most research papers are written as communications from scientists to scientists; relatively few have immediate clinical relevance, and most of the remainder are not rigorous enough to warrant applying clinically. Consequently, the proportion of useful information is very small. Greenhalgh (12) reported that many papers published in medical journals have potentially serious methodological flaws; when deciding whether a paper is worth reading, therefore, you should base that decision on the design of its methods section. In support of Greenhalgh, Sheldon et al. (14) suggest that when designing studies investigators should consider how, and by whom, their results will be used: the design should be sufficiently robust, the setting sufficiently similar to that in which the results are likely to be implemented, the outcomes relevant, and the study size large enough for the results to convince decision makers of their importance. Although textbooks play an important role in providing basic information to learners, the drawback with all textbooks is staying current (15). Some concepts described in textbooks can lag behind the empirical evidence by as much as 10 years, a lag attributable in part to the longer publication cycle for textbooks than for journal articles. In summary, treatment interventions should be chosen from the most relevant, scientifically sound and rigorous evidence currently available. Fortunately, the use of hierarchies of evidence can assist clinicians in this process.



The need for effective, high-quality research evidence has arisen from evidence-based practice itself, driven fundamentally by patients’ expectations and their ever-increasing demands, and rightly so. Indirectly, the need has arisen from increasingly stringent legislation and the looming threat of civil action resulting from negligence and malpractice. Regardless of the driving forces involved, it remains abundantly clear that high-quality evidence is a necessity now and will remain so in the future. Notwithstanding the reasons above, the development of research skills remains a key issue and constitutes a suitable component of future professional development. Understanding the concept of, and the components within, the hierarchy of evidence represents a good starting point.


  1. Bleakley C, & MacAuley D. (2002). The quality of research in sports journals. British Journal of Sports Medicine, 36:124-125.

  2. Thomas JR, & Nelson JK. (2001). Research methods in physical activity, 4th edition. Human Kinetics.

  3. Rosenberg W, & Donald A. (1995). Evidence based medicine: an approach to clinical problem-solving. British Medical Journal, 310:1122-1126.

  4. Bury T, & Mead J. (1998). Evidence-based healthcare: a practical guide for therapists. Butterworth Heinemann.

  5. Turner PA, & Whitfield AW. (1999). Physiotherapists’ reasons for selection of treatment techniques: A cross-national survey. Physiotherapy Theory and Practice, 15(4):235-246.

  6. Turner PA. (2001). Evidence-based practice and physiotherapy in the 1990s. Physiotherapy Theory and Practice, 17:107-121.

  7. Alsop A. (1997). Evidence-based practice and continuing professional development. British Journal of Occupational Therapy, 60(11):503-550.

  8. Marshall G. (2005). Critiquing a research article. Radiography, 11:55-59.

  9. Straus SE, & Sackett DL. (1998). Using research findings in clinical practice. British Medical Journal, 317:339-342.

  10. Cape J. (2000). Clinical effectiveness in the UK: Definitions, history and policy trends. Journal of Mental Health, 9(3):237-246.

  11. Rothstein JM. (1996). Outcomes and survival. Physical Therapy, 76(2):126-127.

  12. Greenhalgh T. (1997). How to read a paper: getting your bearings (deciding what the paper is about). British Medical Journal, 315:243-246.

  13. Del Mar C. (2005). Clever searching for evidence. British Medical Journal, 330:1162-1163.

  14. Sheldon TA, Guyatt GH, & Haines A. (1998). When to act on the evidence. British Medical Journal, 317:139-142.

  15. Steves R, & Hootman JM. (2004). Evidence-based medicine: what is it and how does it apply to athletic training? Journal of Athletic Training, 39(1):83-87.
