Evidence-based Practice: Some Tools to Find It

CSES, HORT, AECT, PLPA, HESC, POSC, AMPD, FDSC Agriculture Librarian

Necia Parker Gibson
Contact:
Mullins Library 220N

Email is the best way to contact me. neciap@uark.edu

I've set my office phone to forward to my cell phone. We'll see how that works.

I'm working from home starting 3/16/2020 until further notice. Email me or use the text number below. If you want me specifically, ask for me.

Text a librarian: 479-385-0803

For students, faculty, and staff I will:
Answer your questions via email, phone, Skype, or FaceTime (until we are back to face-to-face).
Recommend databases for your topic.
Meet individually to work out your topic or discuss research strategies.

For faculty, I will:
Provide library instruction tailored to your class, or research guides tailored to your class, given some lead time.

Meet with your students individually or in small groups.
Track down tricky citations.
Purchase books and other materials, as funds allow.

I do consultations via email, Skype, or FaceTime (as well as face-to-face, when we can again).
Email me for an appointment.
479-575-8421
Website

Chat with Us

What is Evidence-based Practice?

The purpose of evidence-based practice is to ground professional decisions in the literature of the discipline and in the context of the patient(s), subjects, or other situations. Appropriate sources may vary by discipline; if in doubt, consult your advisor/professor.

A key feature of reviews for evidence-based practice is their structure. What rules did the authors follow in conducting the meta-analysis or systematic review? There should be a summary of findings and details on where, when, and how they searched; how the studies were selected; how many were included and of what quality; and whether the search was limited to English-language sources, among other criteria. Did they cover the relevant, important literature in the field? Is the number of studies sufficient? How did they define a useful study, and does that definition make sense? Did randomized controlled trials make up at least part of their selection? Did they suggest implications of the results for practice and research? Are recommendations or protocols included, or are you directed to them in another document?

Some Databases with Systematic Reviews

Levels of Evidence

Example Levels of Evidence

 

Classification   Sources of Evidence

1A   Meta-analysis of multiple well-designed controlled studies
1    Well-designed randomized controlled trials
2    Well-designed non-randomized controlled trials (quasi-experiments)
3    Observational studies with controls (retrospective studies, interrupted time-series studies, case-control studies, cohort studies with controls)
4    Observational studies without controls (cohort studies without controls and case series)

Robey, R. R. (2004, April 13). Levels of Evidence. The ASHA Leader. http://www.asha.org/Publications/leader/2004/040413/f040413a2.htm

Other hierarchies exist; this is a commonly used one.

Another Hierarchy for EBM

Rating System for the Hierarchy of Evidence: Quantitative Questions

Level I: Evidence from a systematic review of all relevant randomized controlled trials (RCTs), or evidence-based clinical practice guidelines based on systematic reviews of RCTs

Level II: Evidence obtained from at least one well-designed randomized controlled trial (RCT)

Level III: Evidence obtained from well-designed controlled trials without randomization (quasi-experimental)

Level IV: Evidence from well-designed case-control and cohort studies

Level V: Evidence from systematic reviews of descriptive and qualitative studies

Level VI: Evidence from a single descriptive or qualitative study

Level VII: Evidence from the opinion of authorities and/or reports of expert committees

Above information from "Evidence-Based Practice in Nursing & Healthcare: A Guide to Best Practice" by Bernadette M. Melnyk and Ellen Fineout-Overholt (2005), p. 10. Via: http://researchguides.ebling.library.wisc.edu/content.php?pid=325126&sid=2940230

In addition:

Practice based on empirical research is more likely to be sound. Looking for systematic reviews of the literature gives you some confidence that the recommended practices are based on more than just a few patients. Systematic reviews that include meta-analyses of the data in the included studies are more likely to be reliable.

Things to consider: Did they ask a good question, the 'right' question? Did they make their methods explicit? Was their search detailed? Was it comprehensive? What did they miss, and why?

A different model for evaluation that I also like is the rhetorical triangle: the three points of the triangle are author, audience, and purpose (Laura Wukovitz, http://researchguides.hacc.edu/milex). It brings into consideration that there are often social dimensions to even the most data-driven science (sometimes summed up as "Whose ox is being gored?", another way of asking who benefits from a particular publication, and in what way).

Even the Best Studies May Have Flaws