How?
How will you follow up?
Answering this question is often not as easy as it may seem. Learners do not always give notice before they leave LBS programs, and their contact information often changes. Past learners, community partners and the public in general are busy and have their own priorities, making it difficult to get responses from them. These issues cause problems for LBS agencies trying to complete their follow-up evaluations. Some suggestions from LBS practitioners to forestall these problems are:
- Talk to the learners early in their association with your program (either during assessment while developing the learner plan or early in the training process) about their future exit and the importance of the exit interview to your program and future learners.
- Make sure contact information is up to date. The more ways of contacting the learner, the better (home, cell and alternate phone numbers, email and mailing addresses). Ask which is most convenient and what time of day is best for calls.
- Keep the lines of communication open. This helps you get ongoing feedback to identify issues. It may also flag a potential exit. You may be able to assist the learner through referrals, etc. and avoid the exit. Even if you can’t, you will still have the information required for the exit interview.
- Try to get an active email address from your learner. Email addresses may not change as frequently as telephone numbers.
- See if the learner is willing to give you the name and number of someone you can reach if you are unable to contact the learner directly. You may also get permission to talk to a referral partner or another community service that may have updated contact information. (This also gives you an opportunity to talk to the referral partner about your services.)
- Set up a Facebook account that can be “friended” by learners and community organizations. This provides an opportunity for them to post comments about your services and is a way to keep contact with learners after they exit your program.
- Make it a practice to encourage learners to call with updates or come in to visit after they have completed their programming. Invite them to luncheons or special events.
- Ask tutors to let you know in advance if a student discusses exiting.
- Try to find opportunities for some one-on-one time with front-line workers from partner agencies. People are more likely to share specific issues, praise and concerns one-on-one than in a group. Going to their office or inviting them to yours for a brown bag lunch can be a good option.
- Instead of just sending an email with a survey or questionnaire attached, ask in the email whether there is a time you could call to discuss the questions and responses. You are more likely to get a response, and it will allow for valuable discussion.
- Respect people’s time. Make your evaluations relevant, brief and not too frequent.
- Consider texting instead of phone calls. Learners on a limited budget may not have the minutes on their plans to make or answer calls during the day (most free minutes are after business hours). Use an agency cell phone to encourage texting.
- Keep your ears open! Many learners know each other and will tell you if they hear from a student who was once in your program.
Probably the most common way to get customer feedback is through surveys or questionnaires, which you design to suit your needs and the needs of recipients. There are a number of ways of circulating surveys:
- in person
- on the phone
- by mail
- by email
- on your website
- through survey sites such as SurveyMonkey
To get the most responses, make your survey adaptable to a variety of distribution methods.
Another option for gathering feedback is the focus group. “A focus group is a form of qualitative research in which a group of people are asked about their perceptions, opinions, beliefs, and attitudes towards a product, service, concept, advertisement, idea, or packaging. Questions are asked in an interactive group setting where participants are free to talk with other group members.” (Wikipedia, http://en.wikipedia.org/wiki/Focus_group) The discussion that takes place in a focus group can be advantageous because:
- Information and perceptions shared by one group member can spark ideas and remind other participants of their own experiences.
- The security of a peer group provides a safe setting, thus enabling group members to voice their opinions.
- Group members may have had similar experiences, which provide “validation” to the participants.
CLO’s and LLSC’s Developing a Culture of Evaluation website, especially its module on Collecting Data, discusses a number of ways to gather data and other valuable information. You can also view Webinar #3 – Collecting Data: Beyond Survey Monkey in the site’s Webinars section.
Community Literacy of Ontario’s (CLO) SmartSteps to Organizational Excellence provides a chart of a variety of methods for gathering evaluation information in its Program Evaluation section. Beside each method are its purpose and the pros and cons of its use.
How will you manage it?
When you are planning and designing your follow-up plan, you must be realistic about what you can accomplish and over what period. Consider the costs involved in things like paper and printing, postage, phone, travel and computer software. Perhaps the greatest concern is the availability of human resources. Taking on too many tasks sets you up for failure; it is better to start small and increase your capacity over time.
Customer satisfaction and program success can be difficult to measure for several reasons:
- Both satisfaction and success can be difficult to quantify and may vary among stakeholders.
- You have to count on learners and other stakeholders to give not only feedback but also their honest opinions.
- Many people, when satisfied, feel no need to let you know.
- Some will grumble to others but never voice their complaints to you.
- Requirements for and degrees of satisfaction can be unique to each individual.
- Unless they are extremely upset, many people won’t bother to complain.
There are no easy solutions to these difficulties, but keep them in mind as you plan and design your follow-up evaluation and as you go on to execute and monitor it.
Executing Your Follow-Up Evaluation Plan
Once you have considered all the necessary questions, it is time to put your plan in place. A well-developed plan will not be difficult to carry out. Nevertheless, it is worth having a document that lays out what will be done, when and by whom. Many find a work plan template valuable for detailing and tracking progress. Work plans can take a number of formats; one common design is a table with headings such as those shown below.
| Activities | Informants | Resources | Timeline | Responsible | Comments |
| --- | --- | --- | --- | --- | --- |
The headings can vary as you choose. Some other examples are listed below, followed by a sketch of one way to generate such a template:
- Inputs
- Outputs
- Resources
- Tasks
- Constraints
- Start/End Dates
- Objectives
- Completed/Done
- Indicators
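If it helps to start in a spreadsheet, a few lines of code can generate a blank work plan template with whatever headings you settle on. The sketch below is a minimal, hypothetical Python example: the file name, headings and sample row are invented for illustration, so substitute your own.

```python
# Minimal sketch: write a work plan template (with one invented
# example row) to a CSV file that opens in any spreadsheet program.
import csv

HEADINGS = ["Activities", "Informants", "Resources",
            "Timeline", "Responsible", "Comments"]

# This row is purely illustrative -- replace it with your own activities.
example_row = {
    "Activities": "3-month follow-up calls",
    "Informants": "Exited learners",
    "Resources": "Agency cell phone, call script",
    "Timeline": "First week of each month",
    "Responsible": "Program coordinator",
    "Comments": "Text first if there is no answer",
}

with open("work_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=HEADINGS)
    writer.writeheader()
    writer.writerow(example_row)
```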
You will also need methods to compile the information you gather, whether it is numerical or commentary. Storing the data in a well-thought-out way will make it easier to locate specific information when you need it. Organized data is also easier to use for statistical analysis, spotting trends or other work. Spreadsheets and databases are excellent options: they allow you to sort and filter your data or display it in tables, graphs or charts.
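Once responses are compiled in a spreadsheet exported as CSV, even a short script can sort, filter and tally them. This is only a sketch under assumed conditions: the file name (responses.csv) and column names (exit_date, outcome, satisfaction) are hypothetical stand-ins for whatever your own survey collects.

```python
# Minimal sketch: compile follow-up responses for simple analysis.
# Assumes a hypothetical responses.csv with columns "exit_date"
# (ISO format, YYYY-MM-DD), "outcome" and "satisfaction" (1-5).
import csv
from collections import Counter

with open("responses.csv", newline="") as f:
    responses = list(csv.DictReader(f))

# Sort by exit date (ISO dates sort chronologically as text)
# to make trends over time easier to spot.
responses.sort(key=lambda row: row["exit_date"])

# Filter: learners who moved on to employment or further education.
moved_on = [r for r in responses
            if r["outcome"] in ("employment", "further education")]
print(f"{len(moved_on)} of {len(responses)} respondents moved on")

# Tally satisfaction ratings to see the overall distribution.
print(Counter(r["satisfaction"] for r in responses))
```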
The Analyzing Data module on CLO’s and LLSC’s Developing a Culture of Evaluation website has some suggestions that could help.
Monitor
As you work through your plan, you need to constantly monitor how it is unfolding and working. It is a good idea to include regular reviews in your plan and place reminders in your organization’s day planner or calendar. Here are some of the things you may want to check:
- Are you on schedule?
- Are your resources still sufficient?
- Is the workload reasonable for staff/volunteers?
- Are you expecting too much from those you are surveying?
- Is there a better way to get the information you need?
- Is the information you are getting accurate? Is it of value? Can you respond to it?
- Are your recording and compiling systems working?
- Are you responding to the information in a timely manner?
- Are you disseminating the information to stakeholders?
Monitoring is a continuous cycle: you monitor the plan, make changes, continue with the modified plan, and monitor again.
Results
Once you have made the plan, executed it, compiled the responses and monitored the process, you have the results.
The most important thing you can do with the information you receive from your follow-up evaluation is to act on it. Whether the feedback is from clients, learners (current and exited), or community partners, you need to
- consider their suggestions
- investigate and fix (if possible) the things they have complained about
- improve in the areas that matter most to them
- maintain the things that they like
When disseminating the results, ensure that the methods used are appropriate for the chosen audience: an executive summary for Board members, a newsletter article for participants, and so on.
(Organizational Assessment and Project Evaluation Workshop, presented to Atlantic Canada Literacy Coalitions, 2004, by Nishka Smith and Julie Devon Dodd)
You should also act on feedback by sharing the information with all the stakeholders. Remember the “Who” in your plan and let them know the results of your follow-up evaluation. How you let each stakeholder know the results will vary. You might
- let a learner know in a three or six-month follow-up interview how you improved your program in response to their comments at exit
- send an email to or have a chat with your Ministry of Training, Colleges and Universities (MTCU) Employment and Training Consultant (ETC)
- publish statistics in your annual report or comments from satisfied learners on your web page
- produce a report to share at a staff or Board of Directors meeting
- thank partners for their responses at an interagency meeting in your community and advise them of changes made as a result of their feedback
- hold an interpretation workshop with key stakeholders to collaboratively understand the data and to gather recommendations for improvement
Positive data or comments may be useful in promoting your LBS program, e.g., the percentage of learners who move on to employment or further education, or quotes from satisfied learners. The CAD Centre training centre website from the United Kingdom has done well at using learner feedback to promote itself.
When you report, it is a good idea to make the information tell a story with a conclusion: what were the major findings, and what did you do about them? Tailor each “story” to its audience, adjusting the details to suit their level and interests.
CLO and LLSC’s Developing a Culture of Evaluation website modules Taking Action and Communicating the Results, along with the webinars Getting Heard in A Noisy World and Using Digital Media to Tell Your Evaluation Story, are highly recommended. One interesting option on the website is the recorded clinic Failure Is An Option, which looks at failure as a learning tool that can help guide you in making your programs, projects and services stronger and better than before.
Questions and Activities for Reflection
- If the Ministry did not require follow-up, would you still do it? Reflect on why you would or would not do follow-up.
- Would you say your present Follow-Up service has a well-considered plan and design?
- How does your program use the feedback received from follow-up activities to inform change or program promotion?
- If you were not a literacy practitioner but simply a taxpayer, what would you want to see measured through LBS Follow-Up?
- Consider your stakeholders. What plan do you have in place to get feedback from current learners? Staff members? Referral partners? The public?
- Review the current survey questions your agency uses for exiting learners and for 3-, 6- and 12-month follow-ups. Does each question provide quantitative or qualitative results that could help your program improve, grow or develop?