About the Continuum
The Clearinghouse Continuum of Evidence (Continuum) is an interactive, searchable database of evidence-based programs that address a wide variety of family and mental health issues, such as healthy parenting, financial literacy, nutrition and physical activity, stress, anxiety, and depression.
The Placement Process
The Clearinghouse uses a rigorous process to review and categorize programs. To determine a program's placement on the Continuum, the Clearinghouse considers the criteria below.
- Significant Effects
- Sustained Effects
- Study Design
- External Replication
- Additional Criteria
  - Representative Sample
  - Modest Attrition
  - Practical Significance
  - Outcome Measures
To ensure scientific rigor, only evaluations published in peer-reviewed journals are considered. View our Continuum Overview (PDF) to better understand the review criteria we use in our placement process. To learn how the Clearinghouse for Military Family Readiness at Penn State (Clearinghouse) differs from other organizations that review programs, please review our Full Comparison Matrix (PDF) or our Condensed Comparison Matrix (PDF).
Program Placements
A program may be assigned to one of seven possible placements on the Continuum.

Effective (RCT)
- Significant effects
- Effects lasting at least one year from program completion
- Randomized controlled study design
- At least one successful external replication
- Meets all additional criteria

Effective (Quasi)
- Significant effects
- Effects lasting at least one year from program completion
- Quasi-experimental study design
- At least one successful external replication
- Meets all additional criteria

Promising
- Significant effects
- Effects lasting at least six months from program completion
- Randomized controlled or quasi-experimental study design
- No evidence of successful external replication
- Meets at least two additional criteria

Unclear (+)
- Significant effects with potentially promising features
- Effects lasting less than six months from program completion, OR
- Pre-test/post-test study design, OR
- Meets 0 or 1 additional criteria

Unclear (∅)
- No evaluations, or mixed results across more than one evaluation

Unclear (−)
- No significant effects or significant negative effects, with potentially ineffective features

Ineffective
- A program that meets all criteria for an effective placement but fails to demonstrate a significant effect, or demonstrates significant negative effects
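Read together, the placement criteria above amount to a rule-based decision procedure. The sketch below illustrates how those rules combine, written in Python for concreteness. The field names, the single design field (the real criteria distinguish the designs of the original and replication studies), and the function itself are illustrative assumptions, not the Clearinghouse's actual review tooling.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    """Summary of a program's peer-reviewed evaluation evidence (illustrative)."""
    significant_effects: bool      # statistically significant positive effects
    negative_effects: bool         # significant negative effects observed
    months_sustained: int          # months effects persisted after program completion
    design: str                    # "rct", "quasi", or "pre_post" (a simplification:
                                   # Effective (RCT) requires RCTs for both the
                                   # original and the replication study)
    external_replication: bool     # independent, successful external replication
    additional_criteria_met: int   # of four: representative sample, modest attrition,
                                   # practical significance, outcome measures
    evaluated: bool = True         # at least one eligible evaluation exists
    mixed_results: bool = False    # conflicting results across evaluations

def place(e: Evidence) -> str:
    """Map summarized evidence to a Continuum placement (illustrative only)."""
    if not e.evaluated or e.mixed_results:
        return "Unclear (∅)"       # no evaluations, or mixed results
    rigorous = e.design in ("rct", "quasi")
    meets_effective_bar = (rigorous and e.months_sustained >= 12
                           and e.external_replication
                           and e.additional_criteria_met == 4)
    if not e.significant_effects or e.negative_effects:
        # Otherwise meets the effective bar -> Ineffective; else Unclear (−)
        return "Ineffective" if meets_effective_bar else "Unclear (−)"
    if meets_effective_bar:
        return "Effective (RCT)" if e.design == "rct" else "Effective (Quasi)"
    if rigorous and e.months_sustained >= 6 and e.additional_criteria_met >= 2:
        return "Promising"
    return "Unclear (+)"           # significant effects, potentially promising features

# Example: significant effects sustained 12 months, an RCT design, one external
# replication, and all four additional criteria -> Effective (RCT)
print(place(Evidence(True, False, 12, "rct", True, 4)))
```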
The sections below describe the criteria a program must meet in order to reach each placement.
Effective

An effective program has been shown to have positive results in an evaluation using a randomized controlled trial (RCT) study design or a well-matched quasi-experimental study design. This evaluation must show a significant and sustained effect. To demonstrate a significant effect, a rigorous statistical analysis of the data must show a statistically significant change in a highly desired outcome. To demonstrate a sustained effect, the effect must last for at least one year beyond the end of the program or at least two years after the beginning of the program.
To be placed as an effective program, positive results (as described above) must have also been demonstrated in at least one other study conducted by researchers who were not involved with the original successful study. In other words, there must be independent confirmation of positive program outcomes. Effective programs also have evaluations that meet high scientific standards in research design and adequately address the following criteria:
- A representative group of participants;
- Adequate outcome measurements using reliable and valid assessments;
- Indication of practical (vs. statistical) significance; and
- Relatively few people leave the study before it is completed, so you can have more confidence in the conclusions.
The Clearinghouse assigns effective programs to one of the following two categories:
Effective (RCT) - Both the original and the replication study must be RCTs.
Effective (Quasi) - One or both of the evaluations can be quasi-experimental.
Promising

A promising program has at least one study that shows statistically significant effects and uses an RCT or quasi-experimental design. A promising program shows sustained effects at least six months after the end of the program or at least one year after the beginning of the program; these timing windows are sketched below. These programs do not have to show evidence of replication. Evaluations of promising programs must address at least two of the following criteria:
- A representative group of participants;
- Adequate outcome measurement using reliable and valid assessments;
- Indication of practical (vs. statistical) significance; and
- Relatively few people leave the study before it is completed, so you can have more confidence in the conclusions.
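The sustained-effect windows for the effective and promising placements differ only in their thresholds: one year past program completion (or two years past program start) versus six months past completion (or one year past start). A minimal sketch of that check, assuming follow-up timing is recorded in months and using a hypothetical tier argument:

```python
def is_sustained(months_after_end: int, months_after_start: int, tier: str) -> bool:
    """Check the sustained-effect window for a placement tier (illustrative).

    "effective": effects last >= 12 months after program end, or >= 24 after start.
    "promising": effects last >=  6 months after program end, or >= 12 after start.
    """
    end_min, start_min = (12, 24) if tier == "effective" else (6, 12)
    return months_after_end >= end_min or months_after_start >= start_min

# Example: effects measured 8 months after a program ended (14 months after it began)
print(is_sustained(8, 14, "promising"))   # True: 8 >= 6
print(is_sustained(8, 14, "effective"))   # False: 8 < 12 and 14 < 24
```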
Unclear

An unclear program is one that has been evaluated with a quasi-experimental design that lacks sufficient methodological rigor, a pre-test/post-test design without a comparison group, or a purely descriptive evaluation (e.g., a case study). An unclear program may also have no evaluations or mixed results across evaluations. The Clearinghouse assigns unclear programs to one of the following three categories:
Unclear (∅) - No evaluations performed, or mixed results across two or more studies.
Unclear (+) - The program does not quite qualify for a Promising placement but has potentially promising features.
Unclear (−) - The program does not quite qualify for an Ineffective placement but has potentially ineffective features.
Currently, many of the programs placed on the Continuum are categorized as unclear programs. This does not mean that they are ineffective, but rather that evaluations either have not been conducted or have not been sufficiently rigorous.
Ineffective

An ineffective program is one that meets all of the criteria for an effective placement; however, its evaluations have failed to demonstrate a significant effect or have produced significant negative effects.
Partnerships and Funding
The Clearinghouse for Military Family Readiness at Penn State is the result of a partnership between the Office of the Deputy Assistant Secretary of Defense for Military Community and Family Policy and the USDA’s National Institute of Food and Agriculture, funded by the Department of Defense through a cooperative agreement with Penn State. This work leverages funds from the USDA’s National Institute of Food and Agriculture and Hatch Appropriations.