Project Evaluation

Why evaluate?

Project evaluation is not only a required component of the reporting process for TLEF projects, but it is also an important tool to help you understand what is and isn’t working with your project and which areas may need more attention. Ongoing and iterative evaluation will help you to improve project design and implementation, make informed decisions, identify ways to sustain your project after closure, and ultimately reach your intended goals.

Writing your application

You will be asked to provide a project evaluation plan as part of your TLEF application. When developing your evaluation plan, align each of the impacts/benefits you’ve listed (under the Project Impact section of your proposal) with a means for evaluating whether those impacts are being achieved.

Evaluation involves periodically gathering data and reviewing it to determine whether the objectives listed in your application are being met. Here is a step-by-step resource to help you develop your evaluation plan: TLEF Evaluation plan worksheet

Here is an example of how this might look in your evaluation section:

Project Objective (What do you plan to do?) and Intended Impact (What effect do you hope it will have?): A short sentence that includes the intended impact(s) of the project objective.

Example: “We will develop and implement a training module on academic speaking in Psychology. Two anticipated outcomes of the training module are that students will demonstrate greater confidence in academic speaking [intended impact #1] and knowledge of how to improve their speaking skills [intended impact #2] after completing it.”

Data Sources (How will you know if the impact was achieved?): A few sentences that describe how you will measure each impact.

Example: “To determine if confidence in academic speaking increases after using the training module [intended impact #1], we plan to measure change in student confidence via a 3-item survey at the start and end of the term. We will assess whether students understand how to improve their speaking skills [intended impact #2] by comparing teaching assistant observations and grade performance on academic speaking activities before and after the training module is implemented in the class (early in the term versus later in the term). We will also hold a student focus group to gather feedback on the module: specific items they found useful, how they implemented those speaking skills, and anything they felt was unclear.”
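If your team plans to analyze pre/post survey data like this quantitatively, the sketch below shows one way the comparison might look in practice. It is illustrative only: the file name (pre_post.csv), the column names, and the choice of a paired t-test are assumptions for the example, not part of the TLEF process, and a simple paired comparison is just one of several reasonable analyses.

  # Illustrative sketch only: analyzing a hypothetical 3-item pre/post confidence survey.
  # The file name and column names (pre_1..pre_3, post_1..post_3) are assumptions.
  import pandas as pd
  from scipy import stats

  # One row per student, with matched pre- and post-term responses (e.g., a 1-5 scale).
  df = pd.read_csv("pre_post.csv")

  # Average the three items into a single confidence score at each time point.
  df["pre_score"] = df[["pre_1", "pre_2", "pre_3"]].mean(axis=1)
  df["post_score"] = df[["post_1", "post_2", "post_3"]].mean(axis=1)

  # Paired t-test: did the same students' confidence change from start to end of term?
  t_stat, p_value = stats.ttest_rel(df["post_score"], df["pre_score"])
  mean_change = (df["post_score"] - df["pre_score"]).mean()

  print(f"Mean change in confidence: {mean_change:.2f}")
  print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

Qualitative sources, like the focus group in the example above, complement this kind of numeric comparison rather than replacing it.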

Do:

– Consider project team experiences as part of evaluation: What worked well? What will you do differently as a team next year?

– Consider different sources of data you might use to corroborate feedback (i.e., a mixed-methods approach). For example, you may distribute a broad survey to get an overall idea of what is happening, then follow up with a focus group to dig deeper into students’ experiences.

– Remember that evaluation is an ongoing process! Results/feedback from evaluation in the first term or year of the project should be incorporated into the following offering and re-evaluated.

Don’t:

– Use enrollment/usage counts as an evaluation metric. While these numbers are important to document, they do not capture the impact of your project.

– Use dissemination as an evaluation metric. While we encourage dissemination of your findings, dissemination is not a measure of the impact of your project.

– Worry about needing a baseline or control group for comparison. It can be challenging to achieve this in educational evaluation contexts.

Conducting your Evaluation

CTLT’s Research and Evaluation team can help with all aspects and stages of evaluating your TLEF project. We offer 10 hours of in-kind support for both Large and Small TLEF projects. We can help you:

  • Develop a project evaluation plan,
  • Identify the best method(s) to collect evidence,
  • Collect and analyze data to determine your project’s impact(s).

The Research and Evaluation team has also put together resources for getting started with evaluation.

Final Report

After the funding period ends, you are required to complete a final report on your project, which includes a discussion of your evaluation process and outcomes. Be sure that your description of the project evaluation includes: the objective(s) of the project, the impacts/benefits achieved, the measurement technique used to determine that these were achieved (e.g., interviews, surveys, etc.), and what was learned from this process. Please refer to the TLEF final report page for more details. Below are a few examples of clear evaluation summaries:

“In order to assess impact, we engaged in three evaluation strategies, developed for each of the three groups involved … We conducted interviews with 6 faculty partners to assess their overall experience with the project and specifically, the Library’s involvement in OERs. We also asked what they would hope to see for ongoing sustainable support. Several faculty commented that the Library had played an important role in their work in issues such as copyright, CC licensing, publishing and dissemination platforms, and serving as cross-disciplinary connectors with others doing similar work. Most expressed the hope to see some sort of sustained Library support … To assess [student partner] experience, we administered a Qualtrics survey, meant to gauge their learning, work experience, and value of this exposure to open approaches. Students consistently expressed what a valuable learning experience this had been for them. This was especially true for students involved in creating new content or resources. … To evaluate the impact of the open resources [for students in the classroom] we disseminated a Qualtrics survey to capture student awareness of OER, their sense of cost savings, and attitudes towards using OER as alternatives to traditional materials in their classes. Students’ feedback about their OER class experience was very positive.”

Leonora Crema, Library Support for Open Textbook and Open Educational Resource (OER) Creation

“We created a beginning-of-term/end-of-term survey for students to probe their knowledge of/understanding of signed languages … [sample survey link and graphs included] … [The response data shows] that the majority of beginning students in Ling 100 thought that yes, ASL is based on English (at least somewhat), which is incorrect. Advanced students at the beginning of a term were much more likely to correctly say “no,” but even then, a handful thought yes. By the end of the term, only one maintained this misconception, and several of the others actually had an even more nuanced view (indicated by the handful of “other” responses). The second set of graphs shows responses to the cultural questions “Is being deaf a medical condition, and should it be considered a disability?” Almost everyone in Ling100 thought the answers to both questions were “yes.” The students in Ling 447 started out with an interesting dichotomy: the majority thought that it is definitely a medical condition, but were split on whether it’s a disability. Meanwhile, a sizeable minority thought it was definitely NEITHER a medical condition NOR a disability. By the end of the term, almost all the students had a much more nuanced view of the situation, with most of them choosing “other” (because the question especially of disability is largely a cultural and personal decision by people who are d/Deaf).”

Kathleen Hall, Integrating Sign Languages into the Linguistics Curriculum

“…eChIRP has been offered for free as the primary text resource in CHEM 123 (all sections, all terms), saving students significant money compared to the traditional combination of textbooks ($125) … survey results suggest that we successfully created an easy-to-use resource, with students (n=1102) rating the ease of navigation 75/100 on average (0=very difficult to navigate; 50=neutral; 100=very easy to navigate) and 90% of students rating the ease of navigation above 50 (neutral) … We wrote xAPI statements to track how students use the eChIRP … students were actively engaged with these components because we recorded an average of over 15,000 xAPI statements per day (50 interactions per active student) and up to 90,000 per day before the final exam (see figure 1) … In a controlled study comparing interactive videos to their traditional counterparts, we found that students who watched the interactive video self-reported that they felt significantly more engaged (p<0.001) compared to students who watched the non-interactive videos (see figure 2). Students also indicated a strong preference for interactive videos over non-interactive videos, as shown in figure 3 below … We have some evidence that the eChIRP has been particularly helpful for students at risk of failing CHEM 123. For example, failure rates in CHEM 123 at Vantage College have dropped since the introduction of the eChIRP. In S2015 (before the introduction of the eChIRP), 9% of students failed. After a small pilot release of the organic half in S2016, the failure rate dropped to 6%. In S2017, after the full organic half of the eChIRP was released, the failure rate decreased again to 4%. Finally, after the full pilot release of the eChIRP in S2018, the failure rate was just 1.5%. This trend suggests that having a resource such as the eChIRP may be helping more students successfully complete CHEM 123.”

Kayli Johnson, Development of an electronic Chemistry Integrated Resource Package for CHEM 123

Support Available

The CTLT Research and Evaluation team offers bi-annual evaluation workshops, as well as workshops related to ethics for teaching and learning projects, survey design, and focus groups/interviews. You can view a listing of upcoming events related to the evaluation of teaching and learning projects.

Do you need more support with evaluation? We can help! Our team of experts is available to help you with the evaluation process throughout your project’s lifecycle. Please contact Trish Varao-Sousa (trish.varao-sousa@ubc.ca) for a consultation.