Perspectives on Evidence-Based Research in Education: What Works? Issues in Synthesizing Educational Program Evaluations

Research output: Contribution to journal › Article

Publication details

Journal: Educational Researcher
Date: Published - Jan 2008
Issue number: 1
Volume: 37
Number of pages: 10
Pages (from-to): 5-14
Original language: English

Abstract

Syntheses of research on educational programs have taken on increasing policy importance. Procedures for performing such syntheses must therefore produce reliable, unbiased, and meaningful information on the strength of evidence behind each program. Because evaluations of any given program are few in number, syntheses of program evaluations must focus on minimizing bias in reviews of each study. This article discusses key issues in the conduct of program evaluation syntheses: requirements for research design, sample size, adjustments for pretest differences, duration, and use of unbiased outcome measures. It also discusses the need to balance factors such as research designs, effect sizes, and numbers of studies in rating the overall strength of evidence supporting each program. (Contains 1 table.)

Bibliographical note

Database: ERIC

Record type: New.

Language: English

DataStar source field: Educational Researcher, 2008, vol. 37, no. 1, p. 5-14, pp. 10, 63 refs., ISSN: 0013-189X.

DataStar update date: 20090101

Research areas

  • evidence-based reform, meta-analysis, research review, What Works Clearinghouse
