
Southwest Educational Development Laboratory



Dissemination Evaluation Strategies and Options

The evaluation of your NIDRR grant's dissemination activities can help you in:

  • establishing credibility of your project's unique approaches and outcomes,
  • demonstrating the effectiveness of your project's strategies, and
  • learning more about what does and does not work so that your efforts can be continually refined.

Bean Counting

Unfortunately, many grant projects collect only "tracking" information. This type of data collection is sometimes referred to as "bean counting." Examples of tracking data include:

  • number of telephone contacts received per day;
  • time of telephone contacts made per day;
  • telephone caller descriptive information;
  • number of items sent out by U.S. Mail or electronic mail;
  • number of items received by U.S. Mail, electronic mail, or fax;
  • number of products ordered or purchased;
  • how those making contact learned about your project; and
  • how much time was spent in responding to each contact.


Such tracking data are limited in their ability to describe the impact of your dissemination efforts. They can be very useful, however, in establishing the level of interaction between your project staff and your target audience(s). Very few incoming or outgoing contacts may signal a potential dissemination limitation, while extensive contacts in and out may indicate an active information-sharing exchange that supports dissemination.

Tracking information is typically collected through the use of forms. These forms can be paper or electronic. The development and use of electronic forms can be very beneficial in promoting easier aggregation of data.

Developing an Effective Evaluation Form

Many times forms are used to collect information relevant to evaluation questions. Forms can be used to collect information from project staff, user groups, or the general public. There are a few things to consider when developing such forms that can make their intended result more predictable. Briefly, these include:

  1. Tailor your form to the specific data measures that are appropriate.
  2. Design your form for specific data sources, recognizing a need for special adaptation or alternatives in format such as audio or Braille versions and the possible need for non-English versions.
  3. Consolidate items as much as possible.
  4. Pre-test (and/or field test) your form with a sample of your target audience prior to full implementation. Make changes that seem warranted, and consider the benefit of a second field test.
  5. Configure your form and possible responses in such a way that allows results to be easily entered into a computer database, aggregated and compared.
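To illustrate item 5, the following is a minimal sketch of how coded responses from an electronic tracking form might be aggregated by computer. The field names and response codes are illustrative assumptions, not part of any specific NIDRR form.

```python
# Sketch: aggregating coded responses from an electronic tracking form.
# Field names and response codes below are hypothetical examples.
from collections import Counter

# Each record represents one completed form, with closed-ended
# (coded) responses that are easy to aggregate and compare.
responses = [
    {"contact_type": "phone", "heard_from": "newsletter", "format": "print"},
    {"contact_type": "email", "heard_from": "colleague", "format": "braille"},
    {"contact_type": "phone", "heard_from": "newsletter", "format": "print"},
]

def tally(records, item):
    """Count how often each coded response appears for one form item."""
    return Counter(r[item] for r in records)

print(tally(responses, "heard_from"))
# -> Counter({'newsletter': 2, 'colleague': 1})
```

Because the responses are pre-coded rather than open-ended, the same tally can be run on any form item (contact type, referral source, format requested) without hand-sorting narrative answers.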

Dissemination-Related Evaluation

There are four main types of evaluation activities that seem to be of general use in measuring dissemination-related grantee efforts. A brief description of these follows.

Formative Evaluation: occurs during the design period of materials development or activity implementation and addresses the effectiveness and usefulness of approaches. It is applied to materials or activities before they are considered "completed." Formative evaluation data can help redirect or refine activity strategies and material development efforts. The data source for formative evaluation is frequently a sub-sample of the intended user audience that is not involved in the project's development activity. To be most effective, formative evaluation requires clear alternatives, with "potential users" commenting on what works best.

Impact Evaluation: intended to provide information about the long-term result or impact of a project or activity. This type of evaluation measures actual changes (for example, the number of spinal cord injuries in an annual period) rather than more subtle attitude or behavior changes that may or may not be causally linked to results. Depending upon the nature of the result, impact evaluation can be complex. It is often avoided because, in many human-intensive, time-limited efforts, it is difficult to separate the effects of project activities from the effects of "outside" variables in producing the measured impact.

Outcome Evaluation: designed to measure the effects of project activities upon identified target audiences. Outcomes can be measured in terms of changes or increases in awareness, shifts in attitudes, changes in behaviors, and increases in knowledge, among others. Outcome evaluation helps identify and measure how your project activities affected various segments of your audience. This type of measurement usually establishes baseline data before project activities are initiated and then periodically assesses changes over time in these same data areas.

Process Evaluation: usually takes place while new activities are being implemented and is designed to help determine changes in the efficiency and effectiveness of the implementation process. This type of evaluation analyzes the extent to which planned activities occur, their timeliness, their cost, and the extent to which they reach intended audiences.

Plan to Evaluate

All too frequently, proposals do not include information about specific dissemination goals, goal-related dissemination activities, dissemination budget, clearly defined dissemination target audiences, or dissemination-related evaluation activities. Without this framework, dissemination efforts become nothing more than distribution plans. Evaluation of a project's dissemination process and outcomes is not possible unless goals, strategies, and expected outcomes have been conceptualized.

Even when a proposal lacks a dissemination plan, it is possible to devise an effective one as soon as funding decisions are made and initial implementation begins. It is also frequently possible in ongoing projects to make adjustments so that the impact of dissemination is more clearly understood.

As stated previously, your project's ability to clearly identify and demonstrate its effect on people (other researchers, journalists, consumers, consumers' family members, or others) will increase the credibility and perceived effectiveness of the strategies, interventions, and materials that you have used or developed.

It is important to be able to answer as many of the following dissemination-related evaluation questions as possible:

  • Were the project's materials delivered in the quantities and in the formats desired by the target audience?
  • Do members of your target audience report using your materials or information?
  • Did your project receive and respond to requests from the target audience in a timely manner?
  • How much did the dissemination efforts cost and was that adequate to achieve the planned-for outcome?
  • Which of several options or alternatives is most effective in meeting the expressed needs of specific target audiences?
  • Did the project make a difference and, if so, what are the dimensions of that difference?
  • What changes in knowledge, attitude, behavior, or condition have occurred in the target audience(s) as a result of project activity?
  • To what extent has each objective of the project's dissemination plan been accomplished?

While this list is by no means comprehensive, it does indicate areas in which evaluation of your project's efforts should be considered.

Using Focus Groups in Evaluation

Focus groups can be extremely helpful in gaining perspectives from your target audience. Basic tips on planning and implementing focus groups include:

  1. Carefully select 8 to 10 individuals who reflect the characteristics of your designated target audiences and who have not been involved in your project's planning or implementation efforts. If your audiences are very diverse, you may need multiple focus groups.
  2. Carefully detail the information you wish to learn from the focus group members.
  3. Do not tell potential focus group members about the topic or the identity of other group members in advance of the meeting. Tell contacts if you are paying a nominal fee for their participation in the focus group.
  4. To increase the validity of your focus group results, conduct multiple focus groups aimed at the same information collection.
  5. Present material or other information in the same way to each focus group.
  6. Consider recording the audio and/or video portions of the group discussion.


Planning Details

Your evaluation planning should begin with identifying the specific evaluation questions you intend to answer. After this has been done, you will need to clarify several other "pieces" of the evaluation activity. These details of the evaluation planning process apply to each evaluation question you plan to answer, and include:

  • data sources: identify as clearly and precisely as possible where the information relevant to answering the evaluation question is located.
  • data measures: specify the items (data) that will be collected in the evaluation effort to answer the evaluation questions.
  • timing of data collection: determine whether some items may require weekly or monthly collection while others may be collected initially and annually thereafter.
  • use of the information: consider how the answers to each of the evaluation questions will be used by project staff, funding agency staff, persons in the community, or others.
  • reporting timelines: the results of dissemination-related evaluation efforts are usually reported, and it is helpful to establish the extent and frequency of such reporting in advance.

Collecting Dissemination-Related Evaluation Data

There are many alternatives in both what and how evaluation activities will be focused. Data collection strategies will be influenced by the type of evaluation you are attempting to conduct. Many of your dissemination-related evaluation efforts will necessarily involve sampling your designated target audience(s).

Strategies for collecting evaluation data should be considered individually for each evaluation question. Some alternatives may be more time-consuming, costly, difficult, or desirable than others. Tailor your collection strategies to what you can realistically do. Examples of collection activities that you might consider include:

  • Follow-up telephone calls,
  • Project tracking records of materials distribution,
  • User-completed feedback cards included within materials,
  • Survey of segments of your designated target audience(s),
  • Focus groups,
  • Project tracking of requests for materials or services from the designated target audience (including format, as appropriate),
  • Audit of materials and associated alternate formats developed by the project,
  • Project tracking of budget expenditures in dissemination activities,
  • User-related data collected as a part of field test or pilot test procedures,
  • Newspaper clippings or other popular press documentation of target audience changes, and
  • Number of "hits" or contacts made to your project's World Wide Web site.

Field Testing Your Materials

NIDRR grantees that have produced informational materials for others should use some form of field testing to determine how well the materials meet the needs of the intended user/target groups. Data from field testing can be collected in a variety of ways, such as those highlighted in the previous section, Collecting Dissemination-Related Evaluation Data. You should outline the questions that you want answered by the field test. Questions should be specific enough to provide information you can use to make modifications to the materials.

Your field testing should also capture information about the medium you have selected to communicate. Provide the information in several formats (for example, print, electronic disk, and World Wide Web page). Have your field test participants comment on which format they used to receive the information and what they thought of the other formats. All formats are not equal in their power to communicate, and you should be aware of the difference these formats can make with your intended user groups.

This is also important if you are doing outreach to new user groups or using general media to create broad-based awareness of the availability of your information. Wider public campaigns can build awareness both of your project as an informational source and of the specific information topics available through contacting your project. If such a campaign is part of your dissemination plan, it should be subject to field testing and to further evaluation after it is conducted.

Sharing Your Evaluation Results

It is important to document your evaluation efforts and results, often in the form of a report. Such reporting provides an opportunity to disclose not only your dissemination planning, methods, and evaluation results, but also what the project did with the information. Any ways in which evaluation data helped you make modifications that enhanced the effectiveness or utility of your results should be reported. Consider the following as points to include in reporting your dissemination activity evaluation effort:

  • Describe both the strengths and the weaknesses that you discovered through the evaluation;
  • Identify any specific outcomes that were a result of your project activities;
  • Recommend changes or modifications, or further data collection that may be appropriate;
  • Address the extent to which funding and staff time commitment affected the activities and/or results of the project;
  • Identify specific ways in which your designated target audience was impacted by your project's dissemination efforts;
  • Identify the extent to which your project's dissemination results varied according to user group, content, context, medium, and information source; and
  • Describe any "next steps" that seem to be appropriate, based upon your evaluation.

Summary

The evaluation of your dissemination activities and outcomes is an important part of your NIDRR project effort. It is through such evaluation that you and NIDRR management can learn more about the best strategies to accomplish specific outcomes. The process of dissemination is so varied and complex that evaluation on an ongoing basis is needed to help in determining the level and nature of your progress in achieving outreach to various potential user groups. Without good evaluation of your dissemination efforts, you will never really know the extent of your impact.

Each grantee must be prepared to respond to the often-asked question,

What difference
did your
project make?



Copyright ©2004 Southwest Educational Development Laboratory
