Hi, I’m Dr. Jennifer Karre, research scientist at the Penn State Clearinghouse for Military Family Readiness. My background is in Developmental Psychology. Before we get started, I’d like to take a minute to thank my colleagues here at the Clearinghouse for their help and support in the development of this presentation. Okay, let’s get started.
One of the services that the Clearinghouse provides is to vet programs and place them on our continuum of evidence. We vet prevention and treatment programs that may be useful for practitioners who work with Military families.
We have five evaluation criteria that help determine whether a program gets placed as effective, promising, unclear, or ineffective. These criteria are significant effects, sustained effects, external replication, study design, and a set of additional criteria. First, I will talk about how each of these criteria relates to the effective, promising, and unclear categories, then I will discuss our ineffective category.
First is significant effects. In order to qualify for the effective or promising categories, a program must show statistically significant evidence of a change in behavior. There must also be no negative effects found. If effects are unclear due to mixed results or questionable evaluation design, then it qualifies for the unclear placement.
Next is sustained effects. In order to qualify for the effective placement, a program must show that its effects last either two years from the beginning of the program, or one year from the end of the program. For a program to qualify as promising, it must show effects one year after the beginning of the program or six months from the end of the program. If the program has not assessed effects at least one year from the beginning of the program or six months from the end of the program, then it qualifies for placement in unclear.
Third is external replication. Positive effects must be demonstrated twice in order for a program to qualify for the effective category. Two randomized controlled trials must find sustained effects, and one of those studies must be conducted by someone not affiliated with the program. If the program has not been evaluated by someone other than the program developer, or if the external evaluation has a limited design or limited sustainability of effects, then it qualifies for promising or unclear.
The fourth criterion is study design. A program must be evaluated using a randomized controlled design for it to qualify for the effective placement. If the program has at least a quasi-experimental design, it qualifies for the promising placement. If a program uses a poorly controlled quasi-experimental design, a pre-test post-test design, or a purely descriptive design, then it qualifies for the unclear placement.
There are several additional criteria that we consider but do not carry as much weight as the other criteria. These criteria form the last category. They are: Having a representative sample; providing a description of the intervention; using adequate outcome measurement; having practical significance; identifying important adverse effects; having modest attrition; using an intent-to-treat approach; and accurately interpreting results.
Once we have determined where the program falls on each criterion, we determine the final placement. The final placement is the lowest placement that has been made on any of the criteria. For example, if the program qualifies for an effective placement for significant effects, sustained effects, study design, and the additional criteria, but only qualifies for promising on external replication, the program will be placed in promising.
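The "lowest placement wins" rule described above can be sketched in a few lines of code. This is only an illustration of the logic in the transcript: the ordering list, function name, and criterion keys are assumptions for the example, not the Clearinghouse's actual tooling, and the ineffective placement is left out because it is determined by its own separate conditions rather than this rule.

```python
# Placements ordered from lowest to highest; this ordering is an
# illustrative assumption based on the vodcast, not official code.
PLACEMENT_ORDER = ["unclear", "promising", "effective"]

def final_placement(criterion_placements):
    """Return the lowest placement earned on any criterion."""
    return min(criterion_placements.values(), key=PLACEMENT_ORDER.index)

# The example from the vodcast: effective on four criteria,
# promising on external replication.
ratings = {
    "significant_effects": "effective",
    "sustained_effects": "effective",
    "study_design": "effective",
    "additional_criteria": "effective",
    "external_replication": "promising",
}

print(final_placement(ratings))  # prints "promising"
```

Because `min` compares the criteria by their position in `PLACEMENT_ORDER`, a single lower-rated criterion pulls the whole program down to that placement, which is exactly the behavior the example describes.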
For a program to be placed as ineffective, the evaluation must be as rigorous as that of a promising or effective program. A well-designed study must find negative effects or fail to find significant positive effects, have no evidence of successful external replication, and address at least half of the issues in the additional criteria category.
We hope that this vodcast has given you a better understanding of how we place programs on the continuum of evidence. Having strict criteria for how programs get placed helps ensure that you can get the most from the service we provide. Thank you for taking the time to watch this vodcast. If you have any suggestions for a vodcast that you would like to see or topics you would like for us to cover please contact us here at the Clearinghouse.