Southwest Educational Development Laboratory
The evaluation of your NIDRR grant's dissemination activities can help you in:
Unfortunately, many grant projects collect only "tracking" information. This type of data collection is sometimes referred to as "bean counting." Examples of this type of tracking data include such things as:
Such tracking data are limited in their ability to describe the level of impact your dissemination efforts have had. These data can be very useful, however, in establishing the level of interaction your project staff members are having with your target audience(s). A low volume of incoming or outgoing contacts may point to a potential dissemination limitation, while extensive contacts in both directions may indicate an active information-sharing exchange that supports dissemination.
Tracking information is typically collected through the use of forms. These forms can be paper or electronic. The development and use of electronic forms can be very beneficial in promoting easier aggregation of data.
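As an illustration of how electronic forms ease aggregation, tracking records exported from such a form can be tallied with a short script. The field names and records below (date, direction, contact_type, topic) are hypothetical, not part of any prescribed NIDRR format; a real export would use whatever fields your own form defines.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical export from an electronic tracking form; real field
# names will depend on how your own form is designed.
sample_export = """date,direction,contact_type,topic
2004-01-05,incoming,phone,spinal cord injury
2004-01-06,outgoing,mail,newsletter
2004-01-09,incoming,phone,newsletter
2004-01-12,incoming,email,spinal cord injury
"""

def tally(rows, field):
    """Count tracking records by one field (e.g., contact type)."""
    return Counter(row[field] for row in rows)

rows = list(csv.DictReader(StringIO(sample_export)))
print(tally(rows, "direction"))     # incoming vs. outgoing contacts
print(tally(rows, "contact_type"))  # phone, mail, email
```

Counts like these answer the "bean counting" questions directly (how many contacts, of what kind, in which direction), which is exactly the level-of-interaction picture described above.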
Forms are often used to collect information relevant to evaluation questions. Forms can be used to collect information from project staff, user groups, or the general public. A few considerations in developing such forms can make their results more predictable. Briefly, these include:
There are four main types of evaluation activities that seem to be of general use in measuring dissemination-related grantee efforts. A brief description of these follows.
Formative Evaluation occurs during the design period of materials development or activity implementation. These types of evaluation data address the effectiveness and usefulness of approaches. It is applied to materials or activities before they are considered "completed." Formative evaluation data can help redirect or refine activity strategies and/or material development efforts. The data source for formative evaluation efforts is frequently an appropriate sub-sample of the intended user audience that is not involved in the project's development activity. To be most effective, formative evaluation requires clear alternatives, with "potential users" commenting on what works best.
Impact Evaluation is intended to provide information about the long-term result or impact of a project or activity. This type of evaluation measures actual changes (for example, the number of spinal cord injuries in an annual period) rather than more subtle attitude or behavior changes that may or may not have cause-and-effect links to results. Depending upon the nature of the result, impact evaluation can be complex. It is often avoided because of the difficulty, in many human-intensive, time-limited efforts, of separating the effects of project activities from the effects of "outside" variables in producing the measured impact.
Outcome Evaluation is designed to measure the effects of project activities upon identified target audiences. Outcomes can be measured in terms of changes/increases in awareness, shifts in attitudes, changes in behaviors, and increases in knowledge, among others. Outcome evaluation is helpful in identifying and measuring how your project activities affected various segments of your audience. This type of measurement usually establishes baseline data before project activities are initiated, and then periodically assesses for changes over time in these same data areas.
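A minimal sketch of this baseline-then-reassess pattern follows. The audience segments and awareness scores are invented for illustration; in practice they would come from your own baseline and follow-up data collection.

```python
# Hypothetical awareness scores (percent of a sampled audience segment
# aware of the project's materials) at baseline and at a follow-up.
baseline = {"researchers": 40.0, "consumers": 15.0, "journalists": 10.0}
followup = {"researchers": 55.0, "consumers": 35.0, "journalists": 12.0}

def outcome_change(before, after):
    """Change over time in the same data areas, per audience segment."""
    return {group: after[group] - before[group] for group in before}

changes = outcome_change(baseline, followup)
for group, delta in sorted(changes.items()):
    print(f"{group}: {delta:+.1f} percentage points")
```

Comparing the same measures at each point in time is what distinguishes an outcome evaluation from a one-time snapshot: without the baseline, a 35% awareness figure among consumers says nothing about whether project activities produced a change.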
Process Evaluation usually takes place while new activities are being implemented. Process evaluation is designed to help determine changes in the efficiency and effectiveness of the implementation process. This type of evaluation analyzes the extent to which planned activities occur, their timeliness, their cost, and the extent to which they reach intended audiences.
All too frequently, proposals do not include information about specific dissemination goals, goal-related dissemination activities, dissemination budget, clearly defined dissemination target audiences, or dissemination-related evaluation activities. Without this framework, dissemination efforts become nothing more than distribution plans. Evaluation of a project's dissemination process and outcomes is not possible unless goals, strategies, and expected outcomes have been conceptualized.
Despite the lack of proposal plans regarding dissemination, it is possible to devise an effective dissemination plan as soon as funding decisions are made and initial implementation is occurring. It is also frequently possible in ongoing projects to make adjustments so that the impact of dissemination is more clearly understood.
As stated previously, your project's ability to clearly identify and demonstrate its effect on people (other researchers, journalists, consumers, consumers' family members, or others) will strengthen the credibility and perceived effectiveness of the strategies, interventions, and/or materials you have used or developed.
It is important to be able to answer as many of the following dissemination-related evaluation questions as possible:
While this list is by no means comprehensive, it does indicate areas in which evaluation of your project's efforts should be considered.
Focus groups can be extremely helpful in gaining perspectives from your target audience. Basic tips on planning and implementing focus groups include:
Your evaluation planning should involve identifying specific evaluation questions to answer. After this has been done, however, you will need to clarify several other "pieces" of the evaluation activity that must be considered. These details of the evaluation planning process apply to each evaluation question you plan to answer. These include such things as:
There are many alternatives in both what and how evaluation activities will be focused. Data collection strategies will be influenced by the type of evaluation you are attempting to conduct. Many of your dissemination-related evaluation efforts will necessarily involve sampling your designated target audience(s).
Strategies for collecting evaluation data should be considered individually for each evaluation question. Some alternatives may be more time-consuming, costly, difficult, or desirable than others. Tailor your collection strategies to what you can realistically do. Examples of collection activities you might consider include:
NIDRR grantees that have produced informational materials for others should use some form of field testing to determine how the materials meet the needs of the intended user/target groups. Data from field testing can be collected in a variety of ways such as those highlighted in the previous section, Collecting Dissemination-Related Evaluation Data. You should outline questions that you want answered as a function of the field test activity. Questions should be specific enough to provide information you can use to make modifications to the materials.
Your field testing should also capture information about the medium you have selected to communicate. Provide the information in several formats, for example print, electronic disk, and World Wide Web page. Have your field test participants comment on which format they used to receive the information and what they thought about the functional capacity of the other formats. Not all formats are equal in their power to communicate, and you should be aware of the difference these formats can make with your intended user groups.
This is also important if you are doing outreach to new user groups or using general media to create broad-based awareness of the availability of your information. Building awareness of your project as an informational source, and of the specific information topics that can be obtained by contacting your project, can be the focus of wider public campaigns. If such a campaign is part of your dissemination plan, it should be subject to field testing and to further evaluation after it is conducted.
It is important to document your evaluation efforts and results. This is often done in the form of a report. Such reporting provides an opportunity to disclose not only your dissemination planning, methods, and evaluation results, but also what the project did with the information. Any ways in which evaluation data helped you make modifications, enhance effectiveness, or improve the utility of your results should be reported. Consider the following points to include in reporting your dissemination activity evaluation effort:
The evaluation of your dissemination activities and outcomes is an important part of your NIDRR project effort. It is through such evaluation that you and NIDRR management can learn more about the best strategies to accomplish specific outcomes. The process of dissemination is so varied and complex that evaluation on an ongoing basis is needed to help in determining the level and nature of your progress in achieving outreach to various potential user groups. Without good evaluation of your dissemination efforts, you will never really know the extent of your impact.
Each grantee must be prepared to respond to the often-asked question,
Copyright ©2004 Southwest Educational Development Laboratory