Evidence-Based Practice. The term evidence-based practice (EBP) was used initially in relation to medicine but has since been adopted by many fields, including education, child welfare, mental health, and criminal justice. The Institute of Medicine (2001) defines evidence-based medicine as "the integration of best researched evidence and clinical expertise with patient values" (p. 147). In social work, most agree that EBP is a process involving creating an answerable question based on a client or organizational need, locating the best available evidence to answer the question, evaluating the quality of the evidence as well as its applicability, applying the evidence, and evaluating the effectiveness and efficiency of the solution. EBP is thus a process in which the practitioner combines well-researched interventions with clinical experience, ethics, client preferences, and culture to guide and inform the delivery of treatments and services.
Evidence-Based Practices, Evidence-Based Treatments, Evidence-Based Interventions, and Evidence-Informed Interventions are phrases often used interchangeably. Here, for consistency, we will use the term evidence-based treatments (EBT). In contrast to the evidence-based practice process described above, one definition of an evidence-based treatment is any practice that has been established as effective through scientific research according to a set of explicit criteria (Drake et al., 2001). These are interventions that, when applied consistently, reliably produce improved client outcomes. Some states, government agencies, and payers have endorsed specific evidence-based treatments, such as cognitive behavioral therapy for anxiety disorders and assertive community treatment for individuals with severe mental illness, and thus expect that practitioners are prepared to provide these services.
Evaluation of Research on Practice Interventions. Randomized controlled trials (RCTs) are frequently viewed as the gold standard for the evaluation of interventions. However, it is not always possible or ethical to conduct RCTs in social, health, and human services, so research evidence of that type is lacking for some interventions provided by social workers. Qualitative research can complement quantitative research and help us better understand cultural issues and contexts related to interventions.
Some view research as falling into a hierarchy, with systematic reviews and meta-analyses representing the strongest evidence. From this perspective, the next levels of evidence, from highest to lowest, are: RCTs; quasi-experimental studies; case-control and cohort studies; pre-experimental group studies; surveys; and qualitative studies (McNeece & Thyer, 2004). A number of organizations have attempted to develop objective evidence grading systems to rate the strength of evidence for interventions. For example, the California Evidence-Based Clearinghouse for Child Welfare (www.cachildwelfareclearinghouse.org) has developed a detailed six-level system. The Institute of Medicine (IOM) has convened a multidisciplinary roundtable on evidence-based medicine that is exploring multiple issues, including the lack of consistency in assessing the strength of evidence regarding what works. For more information, visit www.iom.edu/ebm.
The Campbell Collaboration conducts and promotes systematic reviews of research because such rigorous analysis "endeavors to minimize bias in the identification, assessment and synthesis of research results" (Littell, 2006, p. 9). In these systematic reviews, the review process and decision-making criteria are transparent and established in advance.
While there is no consistent agreement on the hierarchy of best available research, a common perspective on a hierarchy of evidence might be:
• Surveillance data;
• Systematic reviews of multiple intervention research studies;...