Developing Your Follow-Up Service

This module reviews follow-up materials to highlight how literacy can make a difference in people’s lives, encouraging learners to participate in lifelong learning opportunities.

As with any program or service you undertake, there are several stages in developing a Follow-Up service plan:

  • Knowing your objectives

  • Planning and design

    • Who?

    • What?

    • Where?

    • When?

    • Why?

    • How?

  • Executing

  • Monitoring

    • Is the plan working?

    • Are you getting the type of information you need to evaluate your services?

    • Should you re-design the plan or the process?

Let’s consider the various aspects of developing your follow-up process.

Objectives – Learner Status and Evaluation of LBS Services

Why do you need to do follow-up? The Ministry tells us that follow-up exists

  • to “document outcomes” and “to evaluate the training activities”

  • to demonstrate the “value and effectiveness of the other four services” (Information and Referral, Assessment, Learner Plan Development and Training)

  • as a means of “receiving ongoing feedback and information from other service providers in the community, employers and learners.”

When you study these requirements, you can see that your objective is actually twofold:

  1. discover and document what the learner is doing at exit and at 3, 6 and 12 months after exit

  2. find out how efficient, successful and useful learners and the community found their experience with your agency, from start to finish

The first purpose is merely fact-finding. What is the learner doing now – working, volunteering, in further education, on benefits, etc.? The second purpose, however, is to gather important feedback in order to evaluate your program. It is important that we remember these two objectives when we undertake the Follow-Up service.

As we noted in the Ministry of Training, Colleges and Universities Requirements section of this module, the Follow-Up service is essentially an evaluation of your program and of the other four LBS services. Evaluations of any kind take time and resources from your other activities. Even the simplest follow-up may seem hard to justify in tight fiscal and human resource situations. Before you start a follow-up plan, you should ask yourself, “Why are we doing this at all?” You could go back to the MTCU requirements and say you do follow-up because you must or, instead, you could explore the value of follow-up.

“Value” is the root of the word “evaluation.” When you evaluate, you measure and assess the value of your program or service. Evaluation provides important input on a program, and measuring the value of programs and services gives you the evidence you need to improve them. If the answer to the question “why evaluate?” is that the results will lead to actions that improve the teaching, learning and customer satisfaction within your program, then all the effort is worthwhile.

Plan and Design

Any good service has a good plan with a well-considered design. Of course, the objectives are your starting point, but there are many questions to ask yourself beforehand, e.g., who is this for, what do you want to find out and what changes will take place in response to the results? All the Who, What, Where, When, Why and How questions need asking. You have already answered “why” as you set out your objectives, so we will move on to discuss the other four W’s and “How?”

Who?

There are several “Who?” questions to consider:

  • Who wants to know and who gains?

  • Who will do the follow-up?

  • Whom will you get feedback from?

Who wants to know and who gains?

MTCU tells us that LBS agencies must “ensure accountability to all stakeholders by providing literacy services that are effective and efficient” (LBS Service Provider Guidelines). Both your program and MTCU want to know the results of the follow-up, but you should consider whom else the evaluation would benefit.

Let’s start with learners, since they are the reason for the LBS Program. The OALCF is learner-centred. LBS Service Providers respect learners and provide supportive learning environments. They help learners set achievable goals and provide plans and training to help learners move toward those goals, including supports during exit and follow-up. A follow-up plan that addresses learners’ concerns and what is important to them is integral to a learner-centred approach. You should involve learners in the development of the plan, discuss the plan with them and explore their suggestions for both its design and delivery. By doing this, the resulting evaluation will focus on discovering how to improve your LBS program to satisfy the identified needs of the learners.

Your program itself is a stakeholder. It is likely that your program is interested in knowing

  • the value of your program (effectiveness)

  • about program accessibility, delivery and content (quality and efficiency)

  • the extent of customer satisfaction

  • how successfully you provided transition-oriented, learner-centred services

  • other things specific to your organization

MTCU, as the funder, wants to know that they are getting value for money. They want to know that

  • they are getting a quality program that is efficient, effective and satisfactory to customers

  • your program coordinates with other services to support learners and ease transition to their goals

  • your program justifies its costs

MTCU, in turn, needs to summarize the results of all their LBS-funded programs to show value to taxpayers.

There are other stakeholders who benefit from the results of your evaluations. Potential learners and community partners who could refer them to you benefit from an improved, valuable program to meet their future needs.

When you are developing and delivering your follow-up evaluation plan, you should take all of the stakeholders and their concerns into consideration. However, it isn’t always possible to satisfy all the needs of all the stakeholders all at one time. Remember that your program has the biggest investment in the evaluation, so you must set the priorities.

Who will do the follow-up?

Who will be responsible for each of the tasks involved?

  • Who will tell the learners about the Follow-Up service before they leave the program?

  • Who will complete the exit interview?

  • Who will complete the 3, 6 and 12-month learner interviews?

  • Who will create questionnaires/surveys for other service providers, employers and learners to complete?

  • Who will record and compile the results of the interviews, questionnaires and/or surveys?

  • Who will review the results?

  • Who will utilize the results to inform change or program promotion?

It is a good idea, if possible, to involve a committee when devising your follow-up evaluation plan. The committee should include the various stakeholders within your organization, such as board members (if applicable), management and service delivery staff (paid or volunteer). Consider including learner representation, too.

Whom will you get feedback from?

With whom will you follow up? It is important to your plan that you get feedback from all the stakeholders.

  • Clients and Learners: Follow-Up as an LBS service includes getting feedback from all clients and learners, throughout all of the other four services (Information and Referral, Assessment, Learner Plan Development and Training) and beyond, at exit and for 12 months after.

  • Your Program: There will be things that you want to evaluate that are specific to your program. There will be things your staff members will want to know, too.

  • Other Stakeholders: One stakeholder group that you shouldn’t forget is your community. Within your community, you have referral partners and the public. The opinions of both these groups matter as they have the potential to affect the number of learners you have in the future.

What?

What do you need to know for the Ministry of Training, Colleges and Universities (MTCU) and what does your agency itself want to know?

For MTCU

We know from the Ontario Transfer Payment Agreement that you have to complete the LBS Exit and Follow-Up Form, which has sections to complete at exit and at 3, 6 and 12 months after the learner exits your program.

What questions does your agency need answered?

Designing your follow-up is about asking effective questions, then coming up with ways to get useful answers. Take time to determine which questions are the “right” ones. Consider the “who”, “what”, “where”, “why”, “when” and “how” and develop appropriate questions. There is no point asking questions if you do not intend to react to or use the responses. Inappropriate or unrealistic questions will produce ineffective and useless answers. You should ask questions that will provide positive quantitative or qualitative results to help your program improve, grow or develop.

In order to get the best information for comparison and tallying, you should design the answers to the questions on a weighted scale (as in a scale of 1 to 5, with 1 indicating complete dissatisfaction and 5 indicating complete satisfaction).
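
To illustrate how scaled answers make tallying straightforward, here is a minimal Python sketch; the responses and the “rated 4 or 5” cut-off are hypothetical examples, not part of any MTCU requirement:

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to one survey question, on a 1-5 scale
# (1 = complete dissatisfaction, 5 = complete satisfaction).
responses = [4, 5, 3, 4, 5, 2, 4, 5, 5, 3]

counts = Counter(responses)  # how many respondents gave each rating
print("Distribution:", dict(sorted(counts.items())))
print(f"Average rating: {mean(responses):.1f} out of 5")

# Share of respondents who chose 4 or 5 (an assumed definition of "satisfied")
satisfied = sum(1 for r in responses if r >= 4) / len(responses)
print(f"Rated 4 or 5: {satisfied:.0%}")
```

Because every answer is a number on the same scale, results from different months or different learner groups can be compared directly.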

Community Literacy of Ontario’s (CLO) and Literacy Link South Central’s (LLSC) Developing a Culture of Evaluation website has just about everything you could want to know about evaluation. In its Collecting Data module there is a section devoted to Asking the Right Question.

Community Literacy of Ontario’s (CLO) SmartSteps to Organizational Excellence has samples of both learner and community partner surveys in its Program Evaluation section. These may assist you in developing a suitable survey for your agency's needs. CLO’s Capacity Plus: Organizational Capacity Resource Guide for Ontario’s Community Literacy Agencies has sample customer satisfaction questions for learners, volunteers and other community stakeholders in its Customer Service Management chapter. You can download these valuable resources from CLO’s Publications webpage.

Asking the Questions that Enable You to Act (YouTube) with Dr. Andrew Taylor (United Way of Peel and Region of Peel) challenges you to consider creating “powerful questions.” He encourages you to look past your regular satisfaction surveys and obvious questions to ask braver questions. He maintains that “The heart (of evaluation) isn’t in the asking. It’s about putting yourself out there to learn…”

Where?

Where will you record the results?

Once you get the facts and feedback from your Follow-Up Plan, where will you record the results?

The Ministry-required information on the LBS Exit and Follow-Up Form is entered into EOIS-CaMS. Also, every closed learner paper file must include one of these forms. (For convenience, you can download the LBS Exit and Follow-Up Form as a Microsoft Word document from the Forms section of the Employment Ontario Partners’ Gateway (EOPG) Tools link and print it.)

It is up to you to decide where you will keep the follow-up results that are specific to your agency. Sheets kept in individual learner files are not easy to refer back to when it comes time to complete your program’s evaluation. Many programs use a spreadsheet or database to compile all or some of the results. Others keep all the results in one file, while still others use tracking forms to record information that they will respond to later.
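
If you choose the spreadsheet route, even a very small script can keep one running file of results. The Python sketch below appends each interview’s results to a single CSV file that opens in any spreadsheet program; the file name, column headings and sample entry are all hypothetical, so adapt them to whatever your agency actually tracks:

```python
import csv
from pathlib import Path

TRACKING_FILE = Path("follow_up_results.csv")  # hypothetical file name
FIELDS = ["learner_id", "interview", "date", "status", "satisfaction", "comments"]

def record_result(row: dict) -> None:
    """Append one follow-up interview result to the running tracking file."""
    is_new = not TRACKING_FILE.exists()
    with TRACKING_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()  # write the column headings once
        writer.writerow(row)

# Hypothetical 3-month interview result
record_result({
    "learner_id": "L-0042", "interview": "3-month", "date": "2016-05-15",
    "status": "employed", "satisfaction": 4,
    "comments": "Found part-time work; interested in evening upgrading",
})
```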

Again, we recommend referring to CLO and LLSC’s Developing a Culture of Evaluation website, especially the module on Analyzing Data, which discusses how to prepare and organize the different types of data for analysis. You can also view their Webinar #3 – Collecting Data: Beyond Survey Monkey in the site’s Webinars section.

Imagine Canada’s Project Evaluation Guide for Nonprofit Organizations has a section on Managing Data Collection that you may find helpful.

Where will you use the results?

Where you use the results of the follow-up facts and feedback depends on the type of information you have collected. Some results will be positive, while others may call for changes or improvement. Imagine Canada’s Project Evaluation Guide for Nonprofit Organizations suggests ways you can use the results of project evaluation. Since a good part of the Follow-Up service is project evaluation, we have selected the ways from Imagine Canada’s list that transfer well to LBS service providers:

  • identify ways to improve or shift your project activities

  • facilitate changes in the project plan

  • prepare project reports (e.g., mid-term reports, final reports)

  • inform internal and external stakeholders about the project

  • plan for the sustainability of the project

  • learn more about the environment in which the project is being or has been carried out

  • learn more about the target population of the project

  • present the worth and value of the project to stakeholders and the public (e.g., 98% of learners who use our programs are extremely satisfied with our programming)

  • compare projects to plan for their futures

  • make evidence-based organizational decisions

  • demonstrate your organization’s ability to perform evaluations

  • demonstrate your organization’s accountability concerns for implementing its plans, pursuing its goals and measuring its outcomes

Information generated in MTCU’s Report 60D, LBS – All Data – Outcomes/Follow-ups, can be used for outreach purposes. All of the exit and follow-up information can be used to create an infographic; there are many samples available on the Internet (try googling “student satisfaction infographic”). Infographics can easily be displayed at rack card or postcard size.

In the Sample Forms section of this module, we have included Connections Adult Learning Centres’ Monthly Client Survey Action Plan Form to track action required due to survey responses.

When?

When will you do the follow-up? An evaluation plan is not just a single event. It should involve a number of inputs that take place over an extended length of time. Your plan may be for three months, a year or be part of a three or five-year strategic plan. When you do the various activities in your plan will depend on why you are doing them, who is doing them, with whom and what they are doing. You may decide to survey community partners once a year, carry out random interviews with active learners at a different time in the year, and complete exit and post-exit interviews with learners monthly.

To meet MTCU requirements, you must complete a learner follow-up interview at exit and at 3, 6 and 12 months after learners exit your program. Some service providers conduct these interviews in batches on a weekly or monthly basis, as they become due, while others contact each learner on the exact day 3, 6 and 12 months after exit.

So, how do programs remember when it is time for these follow-up interviews? While some set up reminders in electronic calendar software or track them on spreadsheets, most programs use the reports and reminder systems generated by the Employment Ontario Information System-Case Management System (e.g., print the “Pending Reviews” page from EOIS-CaMS and use that as a tangible reminder of who needs to be contacted). Some programs also find it helpful to pick a specific date, such as the 15th of the month, and complete all due follow-ups around that day.
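
For programs that track due dates themselves rather than relying on EOIS-CaMS reminders, the date arithmetic is simple enough to automate. The Python sketch below is our own illustration (the add_months helper and the exit date are hypothetical, not from any MTCU tool); it computes the three due dates from a learner’s exit date, clamping to the end of shorter months:

```python
import calendar
from datetime import date

def add_months(start: date, months: int) -> date:
    """Return the date `months` after `start`, clamped to the month's last day."""
    month_index = start.month - 1 + months
    year = start.year + month_index // 12
    month = month_index % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

exit_date = date(2016, 1, 31)  # hypothetical learner exit date
for m in (3, 6, 12):
    print(f"{m}-month follow-up due: {add_months(exit_date, m)}")
# 3-month follow-up due: 2016-04-30 (April has no 31st, so the date is clamped)
```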

How?

How will you follow up?

Answering this question is often not as easy as it may seem. Learners do not always give notice before they leave LBS programs and often do not maintain the same contact information. Past learners, community partners and the public in general are busy and have their own priorities, making it difficult to get responses from them. These issues cause problems for LBS agencies trying to complete their follow-up evaluations. Some suggestions made by LBS practitioners to forestall these problems are:

  • Talk to the learners early in their association with your program (either during assessment while developing the learner plan or early in the training process) about their future exit and the importance of the exit interview to your program and future learners.

  • Make sure contact information is up to date. The more ways of contacting the learner, the better (home, cell and alternate phone numbers, email and mail address). Ask which is most convenient and when would be the best time of day if calling.

  • Keep the lines of communication open. This helps you get ongoing feedback to identify issues. It may also flag a potential exit. You may be able to assist the learner through referrals, etc. and avoid the exit. Even if you can’t, you will still have the information required for the exit interview.

  • Try to get an active email address from your learner. Emails may not change as frequently as telephone numbers.

  • See if the learner is willing to give you the name and contact information of someone you may reach if you are unable to reach the learner directly. You may also get permission to talk to a referral partner or other community service that may have updated contact information. (This also gives you an opportunity to talk to the referral partner about your services.)

  • Set up a Facebook account that can be “friended” by learners and community organizations. This provides an opportunity for them to post comments about your services and is a way to keep contact with learners after they exit your program.

  • Make it a practice to encourage learners to call with updates or come in to visit after they have completed their programming. Invite them to luncheons or special events.

  • Ask tutors to let you know in advance if a student discusses exiting.

  • Try to find opportunities for some one-on-one time with front-line workers from partner agencies. People are more likely to share specific issues, praise and concerns one-on-one than in a group. Going to their office or inviting them to yours for a brown bag lunch can be a good option.

  • Instead of just sending an email with a survey or questionnaire attached, ask in the email if there is a time that you could call to discuss the questions and responses. You are more likely to get a response, and it will allow for valuable discussion.

  • Respect people’s time. Make your evaluations relevant, brief and not too frequent.

  • Consider texting instead of phone calls. Learners on a limited budget may not have minutes on their plan to make or answer calls during the day (most free minutes are after business hours). Use an agency cell phone to encourage texting.

  • Keep your ears open! Many learners know each other and will tell you if they hear from a student who was once in your program.

Probably the most common way to get customer feedback is through surveys or questionnaires, which you design to suit your needs and the needs of recipients. There are a number of ways of circulating surveys:

  • in person

  • on the phone

  • by mail

  • by email

  • on your website

  • through surveying sites such as SurveyMonkey

To get the most responses, it is best to make your survey adaptable to a variety of distribution methods.

Another option for gathering feedback is the focus group. “A focus group is a form of qualitative research in which a group of people are asked about their perceptions, opinions, beliefs, and attitudes towards a product, service, concept, advertisement, idea, or packaging. Questions are asked in an interactive group setting where participants are free to talk with other group members.” (Wikipedia http://en.wikipedia.org/wiki/Focus_group) The discussion that takes place in a focus group can be advantageous because:

  • Information and perceptions of one group member can stimulate ideas and experiences in other participants.

  • The security of a peer group provides a safe setting, thus enabling group members to voice their opinions.

  • Group members may have had similar experiences, which provide “validation” to the participants.

CLO’s and LLSC’s Developing a Culture of Evaluation website, especially the module on Collecting Data, discusses a number of ways to gather data and other valuable information; you can also view their Webinar #3 – Collecting Data: Beyond Survey Monkey in the site’s Webinars section.

Community Literacy of Ontario’s (CLO) SmartSteps to Organizational Excellence provides a chart of a variety of methods for gathering evaluation information in its Program Evaluation section. Beside each method are its purpose and the pros and cons of its use.

How will you manage it?

When you are planning and designing your follow-up plan, you must be realistic about what you can accomplish and over what period. Consider the costs involved in things like paper and printing, postage, phone, travel and computer software. Perhaps the greatest concern is the availability of human resources. If you set yourself too many tasks, you will set yourself up for failure. It is better to start small and increase your capacity over time.

Customer satisfaction and program success can be difficult to measure for several reasons:

  • Both satisfaction and success can be difficult to quantify and may vary among stakeholders.

  • You have to count on learners and other stakeholders to give not only feedback, but also their honest opinion.

  • Many people, when satisfied, feel no need to let you know.

  • Some will grumble to others but never voice their complaints to you.

  • Requirements for and degrees of satisfaction can be unique to each individual.

  • Unless they are extremely upset, many people won’t bother to complain.

There aren’t really any solutions for these difficulties, but you should keep them in mind while you plan and design your follow-up evaluation and as you go on to execute and monitor it.

Executing Your Follow-Up Evaluation Plan

Once you have considered all the necessary questions for the development of your plan, it’s time to put the plan in place. A well-developed plan will not be difficult to carry out. Nevertheless, it is worth having some sort of document that lays out what will be done, when and by whom. Many providers find a work plan template valuable for detailing and tracking the progress of the plan. Work plans can take a number of formats. One common design is a table with headings such as those shown below.

  • Activities

  • Informants

  • Resources

  • Timeline

  • Responsible

  • Comments
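
To make the template concrete, one completed row of such a work plan might look like this (all entries are hypothetical):

  • Activities: Mail satisfaction survey to community partners

  • Informants: Referral partners

  • Resources: Survey form, postage

  • Timeline: April

  • Responsible: Program coordinator

  • Comments: Follow up by phone with non-respondents in May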

The headings can vary as you choose. Some other examples are:

  • Inputs

  • Outputs

  • Resources

  • Tasks

  • Constraints

  • Start/End Dates

  • Objectives

  • Completed/Done

  • Indicators

You will also need methods to compile the information you gather, whether it is numerical or commentary. Storing the data in a well-thought-out way will make it easier to locate specific information when you need it. Organizing your data will also make it easier to run statistical analyses, look for trends or work with it in other ways. Spreadsheets and databases are excellent options: they allow you to sort and filter your data or display it in tables, graphs or charts.
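
As a small illustration of the database option, the Python sketch below loads a few follow-up records into SQLite and then filters and summarizes them with a single query; the table layout and records are hypothetical stand-ins for whatever your agency actually collects:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path instead to keep the data
conn.execute("""CREATE TABLE follow_up (
    learner_id TEXT, interview TEXT, status TEXT, satisfaction INTEGER)""")
conn.executemany(
    "INSERT INTO follow_up VALUES (?, ?, ?, ?)",
    [  # hypothetical records
        ("L-0042", "3-month", "employed", 4),
        ("L-0043", "3-month", "further education", 5),
        ("L-0044", "3-month", "seeking work", 3),
    ],
)

# Filter to the 3-month interviews, then count outcomes and average satisfaction
query = """SELECT status, COUNT(*) AS n, AVG(satisfaction)
           FROM follow_up WHERE interview = '3-month'
           GROUP BY status ORDER BY n DESC"""
for status, n, avg in conn.execute(query):
    print(f"{status}: {n} learner(s), average satisfaction {avg:.1f}")
```

The same counts could then be charted in a spreadsheet or fed into an infographic, as discussed in the Where? section above.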

CLO’s and LLSC’s Developing a Culture of Evaluation website module on Analyzing Data has some suggestions that could help.

Monitoring

As you work through your plan, you need to constantly monitor how it is unfolding and working. It is a good idea to include regular reviews in your plan and place reminders in your organization’s day planner or calendar. Here are some of the things you may want to check:

  • Are you on schedule?

  • Are your resources still sufficient?

  • Is the workload reasonable for staff/volunteers?

  • Are you expecting too much from those you are surveying?

  • Is there a better way to get the information you need?

  • Is the information you are getting accurate? Is it of value? Can you respond to it?

  • Are your recording and compiling systems working?

  • Are you responding to the information in a timely manner?

  • Are you disseminating the information to the stakeholders?

As you continuously monitor your plan, you make changes and continue with modified execution, which, in turn, will be monitored.

Results

Once you have made the plan, executed it, compiled the responses and monitored the process, you have the results.

The most important thing you can do with the information you receive from your follow-up evaluation is to act on it. Whether the feedback is from clients, learners (current and exited), or community partners, you need to

  • consider their suggestions

  • investigate and fix (if possible) the things they have complained about

  • improve in the areas that matter most to them

  • maintain the things that they like

When disseminating the results, ensure that the methods used are appropriate for the chosen audience – e.g. executive summary for Board members, newsletter article for participants, etc.

(Source: Organizational Assessment and Project Evaluation Workshop, presented to Atlantic Canada Literacy Coalitions, 2004, by Nishka Smith and Julie Devon Dodd)

You should also act on feedback by sharing the information with all the stakeholders. Remember the “Who” in your plan and let them know the results of your follow-up evaluation. How you let each stakeholder know the results will vary. You might

  • let a learner know in a three- or six-month follow-up interview how you improved your program in response to their comments at exit

  • send an email to or have a chat with your Ministry of Training, Colleges and Universities (MTCU) Employment and Training Consultant (ETC)

  • publish statistics in your annual report or comments from satisfied learners on your web page

  • produce a report to share at a staff or Board of Directors meeting

  • thank the partners for their responses at an interagency meeting in your community and advise them of changes you made due to their feedback

  • hold an interpretation workshop with key stakeholders to collaboratively understand the data and to gather recommendations for improvement

Positive data or comments may be useful to promote your LBS program, e.g., the percentage of learners who move on to employment or further education, or positive quotes from satisfied learners. The CAD Centre training centre website from the United Kingdom has done well at using learner feedback to promote itself.

When you report, it is a good idea to make the information tell a story that has a conclusion: what were the major findings and what did you do about them? Each “story” should consider the audience and adjust the details to suit their level and interests.

CLO and LLSC’s Developing a Culture of Evaluation website modules Taking Action and Communicating the Results, along with their webinars Getting Heard in A Noisy World and Using Digital Media to Tell Your Evaluation Story, are highly recommended. One interesting option on the website is their recorded clinic Failure Is An Option, which looks at failure as a learning tool to help guide you, making your programs, projects and services stronger and better than before.

Questions and Activities for Reflection

  1. If the Ministry did not require follow-up, would you still do it? Reflect on why you would or would not do a follow-up.

  2. Would you say your present Follow-Up service has a well-considered plan and design?

  3. How does your program use the feedback received from follow-up activities to inform change or program promotion?

  4. If you were not a literacy practitioner but simply a taxpayer, what would you want to see measured through LBS Follow-Up?

  5. Consider your stakeholders. What plan do you have in place to get feedback from current learners? Staff members? Referral partners? The public?

  6. Review the current survey questions your agency uses for an exiting learner and for 3, 6 and 12-month follow-up. Does each question provide quantitative or qualitative results that could help your program improve, grow or develop?

