
EVALUATION IN PRACTICE
Developing an Evaluation Plan With Scientific and Cultural Rigor
An interview with Native American Professional Parent Resources, Inc. (NAPPR) staff

What is your evaluation question?
Do Native families participating in tribal home visiting who receive a culturally enhanced version of Parents As Teachers (PAT), which adds parent-child activities and family group connections, demonstrate increases in cultural self-efficacy, cultural interest, and cultural connectedness compared with Native families who receive standard (non-culturally enhanced) PAT through Early Head Start?
How did you balance cultural and scientific rigor when developing your evaluation plan?
First, it took time to develop internal evaluation capacity and mutual understanding among university evaluators and NAPPR staff. It was important for us to allow ample time to form trusting relationships and build shared ownership and investment in the research process. It also took time for the program to stabilize so that outcomes could be evaluated effectively.
We had to find the right study focus and research question. Once we determined that our outcome of interest would be "cultural connectedness," we had to decide how we were going to measure such a complicated construct. We chose to develop our own measure, consulting with and drawing on the work of other researchers. In the process of developing cultural enhancements, we had to navigate tribal governance systems: Who has authority to call a culturally enhanced activity "Pueblo"? How can someone get that authorization? Consulting with our home visiting model developer about enhancing the curriculum to include culturally tailored home visit and group activities took time as well.
Throughout the process, we consulted with our program's community advisory board, parent advisory group, and staff. There was an ongoing feedback loop with these groups. We wanted their input and consultation at every stage of development, so the study became a regular item on meeting agendas.
How did your commitment to balancing cultural and scientific rigor influence decisions you made about the evaluation?
Balancing cultural and scientific rigor was a study-long process. At each step, we asked what we could do to be more culturally responsive. For example, because we serve a population that is tribally diverse, we decided against developing tribal-specific cultural activities. Instead, we developed intertribal activities that would appeal to participants from different tribes, with prompts for families to share their own tribal values and traditions. Because the intervention was intertribal, we decided that our home visitors would act not as teachers but as facilitators of cultural activities. This was important for our evaluation, because it meant the intervention would vary somewhat from family to family. Having a tribally diverse population also meant the definition of "cultural connectedness" could vary among participants. We worked hard to develop survey language relevant to participants from a range of tribes. We also built focus groups into our evaluation design, in addition to surveys, to capture the diverse ways participants perceive and experience cultural connectedness.
How did TEI help?
TEI helped us understand federal expectations and supported us in finding the right evaluation focus for our program and outlining a preliminary evaluation plan. TEI also supported us in achieving a good balance of cultural and scientific rigor, often by asking questions that prompted us to rethink proposed approaches and reach for greater rigor, but also by acknowledging our progress and successes along the way.