Evidence-based Practice in Orthoptics

Orthoptists, like all other health professionals, are increasingly required to ensure their practice is based on robust evidence. Evidence-based practice relates to clinical decision making and is defined as ‘the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients’.* Essentially, it requires orthoptists to integrate the best available evidence with their clinical expertise when managing an individual’s vision disorder.

Strength of Evidence
An integral part of applying evidence-based practice to clinical decision making is evaluating the strength of the scientific evidence. Strength of evidence relates to the study design and the degree to which bias has been eliminated. In general, the strongest evidence is considered to be a systematic review of randomised controlled trials, whilst the weakest is considered to be case studies or expert opinion. Various organisations and institutions have developed guidelines for evaluating the quality and validity of studies, together with hierarchies or levels of evidence. The National Health and Medical Research Council in Australia, for instance, ranks evidence as follows:

I        Evidence obtained from a systematic review of all relevant randomised controlled trials.
II       Evidence obtained from at least one well-designed randomised controlled trial.
III-1    Evidence obtained from well-designed pseudo-randomised controlled trials (alternate allocation or some other method).
III-2    Evidence obtained from well-designed comparative studies with concurrent controls, such as non-randomised trials, cohort studies, case-control studies or interrupted time series with a control group.
III-3    Evidence obtained from comparative studies with historical control, two or more single-arm studies, or interrupted time series without a parallel control group.
IV       Evidence obtained from case series, either post-test or pre-test and post-test.

It is, however, important to acknowledge that such grading systems can be misleading, as levels alone do not always reflect the strength of evidence. The synthesis, appraisal and interpretation of the literature are a significant part of deciding whether there is sufficient evidence for application in practice. For clinicians aiming to provide the best possible care for their patients, understanding clinical evidence and integrating it with clinical expertise and patient preferences is a complex but paramount skill in clinical decision making.

Resources

Cochrane Collaboration
There are various sources of information to inform clinical practice. One of the key resources is the Cochrane Collaboration, an international organisation that produces and disseminates systematic reviews of healthcare interventions and promotes the identification of clinical trials and studies providing evidence. The Collaboration maintains the Cochrane Library, a collection of regularly updated evidence-based healthcare databases, which include:

  • The Cochrane Database of Systematic Reviews (CDSR) – Cochrane Reviews
  • The Database of Abstracts of Reviews of Effectiveness (DARE) – Other Reviews
  • The Cochrane Central Register of Controlled Trials (CENTRAL) – Published controlled trials

The Cochrane Collaboration also consists of various Review Groups, one of which is the Cochrane Eyes and Vision Group (CEVG). CEVG prepares, maintains and promotes access to systematic reviews of interventions used to prevent or treat eye diseases and/or vision impairment. The CEVG protocols and reviews are published in the CDSR database of the Cochrane Library. A list of reviews and protocols can also be found on the CEVG webpage.

The Society for Clinical Ophthalmology
The Society for Clinical Ophthalmology primarily aims to support eyecare practitioners by providing timely information on clinical developments in ophthalmology. Its website includes updates and discussion forums, among a number of other resources.

* Sackett DL, Rosenberg WMC, Muir Gray JA, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn't. British Medical Journal 1996;312:71-72.