Evaluation of Public Legal Education And Information: An Annotated Bibliography
Appendix A. The Art and Science of Evaluation*
Allison Mackenzie, Law Society of Alberta
PowerPoint presentation
The Art & Science of Evaluation
What Is Evaluation?
- Using science to answer "How did we do? What can we do better?"
- Systematic program evaluation is the foundation for demonstrating value.
- We need to embrace rigorous program evaluation in order to demonstrate value.
Why Evaluate?
- Gather data to justify programming.
- Identify new needs.
- Determine what's being done right and, importantly, what needs to be fine-tuned.
Why Evaluate?
- Measure program impact or effectiveness with respect to goals and objectives.
- You know what you are presenting and intending; do you know what is actually being delivered?
Setting the Stage for Evaluation
- Do you have a formal plan for overall business operations and for each initiative?
- Have you established program objectives and outcomes to provide a basis for measuring true results?
Setting the Stage for Evaluation
- Are your project goals linked to organizational goals? Have you identified your purpose / messages, target groups and desired outcomes?
- PLE is difficult to measure because it involves assessing changes in attitudes, beliefs and perceptions.
Tools?
- There is no single tool for measuring PLE effectiveness.
- Use a combination of measurement and evaluation tools and techniques to establish program benchmarks.
- Determine success in "moving the needle."
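The "set a benchmark, then re-measure" idea behind moving the needle can be sketched in a few lines of Python. The indicator and all figures below are invented for illustration:

```python
# Hypothetical illustration: compare a baseline benchmark against a
# follow-up measurement to see whether the needle moved.

def needle_movement(baseline: float, follow_up: float) -> float:
    """Percentage-point change from the baseline benchmark."""
    return follow_up - baseline

# e.g. share of respondents who know where to find legal information
baseline_awareness = 42.0   # % before the program (invented figure)
follow_up_awareness = 55.0  # % after the program (invented figure)

change = needle_movement(baseline_awareness, follow_up_awareness)
print(f"Awareness moved {change:+.1f} percentage points")
```

The same pre/post comparison works for any benchmark the program sets, as long as the follow-up uses the same question and scale as the baseline.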
Techniques
- Media content analysis
- Cyberspace analysis
- Trade show and event measurement (always after, never during)
- Polls and surveys
- Focus groups
When to Use Surveys
- Need to explain the motivations and attitudes driving the public's behaviour.
- Want to establish a baseline of information to measure the effectiveness of a program.
When to Use Surveys
- Survey construction is an art. You can be confident asking respondents program-related questions; keep it simple. Use the same survey repeatedly.
- To create a survey, go online and do some research; many examples of good surveys are available online to suit most needs.
Surveys
- Quantitative (telephone, mail, in person, email, internet) provides overall information.
- Qualitative (focus groups, in-depth one-on-one interviews) provides in-depth information.
Simple Survey
- Most common are telephone and exit surveys, or follow-up mail surveys.
- A combination of open- and closed-ended questions.
- A scale of 1 to 5 is measurable.
- A few, but not many, open-ended questions (interpretation is complex).
Questions
- Simple, clear, direct, and short.
- One concept per question. For difficult concepts, reword and ask again in another way, positioned well away from the first asking.
Questions
- Open-ended questions solicit top-of-mind awareness, thoughts and opinions.
- Closed-ended questions provide a range of responses; respondents are asked to select one.
To Survey or Not to Survey?
- You can do simple surveys.
- Keep it simple: 10 to 15 closed-ended questions on a numerical scale of 1 to 5.
- End with a final question asking for comments.
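Tallying the closed-ended answers from a survey like this takes very little code. A minimal Python sketch, with the questions and responses invented for illustration:

```python
# Minimal sketch: summarize closed-ended survey answers on a 1-to-5 scale.
# Questions and response values below are invented for illustration.
from collections import Counter
from statistics import mean

responses = {
    "Q1: The session was useful":        [5, 4, 4, 3, 5, 4],
    "Q2: The materials were clear":      [3, 4, 2, 4, 3, 4],
    "Q3: I would recommend the program": [5, 5, 4, 4, 5, 3],
}

for question, scores in responses.items():
    dist = Counter(scores)  # how many 1s, 2s, ... 5s each question received
    avg = mean(scores)      # average rating on the 1-to-5 scale
    print(f"{question}: mean {avg:.2f}, distribution {dict(sorted(dist.items()))}")
```

Reporting both the mean and the distribution matters: a mean of 3 can hide a split between very satisfied and very dissatisfied respondents.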
Focus Groups
- Most common research method used.
- Ethnographic research: observation, participant observation.
- Provide a basis for sound decision making.
Focus Groups
- Need a specially trained facilitator.
- Provides qualitative results about thoughts and feelings.
- As a follow-up to a survey, provides more detail on issues and helps you understand why participants answered as they did.
Informal Focus Groups
- Gather a group of volunteers (community league, friends) and ask them to review your brochure or materials and provide their opinions.
- Talk to them and find out what works and what does not work.
- Create an online focus group.
Informal Focus Groups
- Set a topic and a time frame.
- Ask an informed colleague to facilitate.
- Invite key stakeholders to participate.
- Monitor; ask the facilitator for clarifications.
- Results can be used for evaluation or needs assessment.
Interviews
- Talk to people: what do they think or feel? Prepare questions in advance.
- Ask people involved in delivery, staff and participants how they perceive an issue.
- Ask for understanding on issues.
- One-on-one interviews (long form) can be especially appropriate.
Content Analysis
- Content analysis is a form of systematic analysis using clearly outlined factors for analysis.
- For any issue you'll be delivering on, monitor your local newspapers.
- Do online research: what are the commonly held misperceptions? What do you intend to change? How much coverage do these issues get (e.g. perceptions regarding gun control)?
- What words/rhetoric are used to describe the topic?
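The word/rhetoric tracking described above can be automated with a simple frequency count over the coverage you collect. A sketch in Python, with the article snippets and keyword list invented for illustration:

```python
# Simple content-analysis sketch: count how often pre-chosen words or
# rhetoric appear in collected coverage. Article text is invented.
import re
from collections import Counter

articles = [
    "New rules spark debate over rights and safety.",
    "Critics say the rules threaten rights; supporters cite safety.",
]

# The clearly outlined factors for analysis (hypothetical choices)
keywords = {"rights", "safety", "rules", "debate"}

counts = Counter(
    word
    for text in articles
    for word in re.findall(r"[a-z']+", text.lower())
    if word in keywords
)
print(counts.most_common())
```

Because the factors are fixed before counting begins, the same script run on later coverage yields directly comparable numbers, which is the "systematic" part of systematic analysis.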
Other Quantitative
- Phone calls
- Email responses
- Web site hits
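Indicators like website hits can be tallied directly from server logs. A minimal sketch, assuming a simplified "date path" log format invented for illustration:

```python
# Illustrative sketch: tally web page hits from server log lines.
# The log format and entries below are made up for this example.
from collections import Counter

log_lines = [
    "2024-03-01 /brochures/wills",
    "2024-03-01 /brochures/tenancy",
    "2024-03-02 /brochures/wills",
]

# Count hits per page path (second field of each line)
hits = Counter(line.split()[1] for line in log_lines)
print(hits.most_common())
```

Phone calls and email responses can feed the same kind of tally, giving cheap quantitative trend lines between formal surveys.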
5 Tips for Evaluation
#1
- Carefully set your program objectives.
- Remember that evaluation begins when planning starts.
- Evaluation will be based on more than assessing whether the program was delivered: did participants gain knowledge? Change opinions?
- Did it do what was intended?
#2
- Consider getting someone else to do the evaluation.
- Colleague or coworker.
- You want impartial, independent feedback.
#3
- Evaluation does not have to cost $$$.
- Conference or trade show: a simple exit survey like the one we'll use today.
- Evaluating printed materials: a readership survey with a small prize.
- Websites: seek visitor input.
#4
- Evaluate at the time of running the program or within the yearly planning cycle.
- Results can be considered in planning process.
#5 - Commit to Act on Results
- Commit to act on evaluation outcomes.
- Meaningless if not acted upon.
- Crucial if used to hone, learn and fine-tune.