Supporting Science Students’ Problem Solving in Organic Chemistry

Title: Supporting Science Students’ Problem Solving in Organic Chemistry
Faculty/College/Unit: Science
Status: Completed
Duration: 1 Year
Initiation: 04/01/2006
Completion: 03/31/2007
Project Summary

Rationale: Undergraduate science programs strive to help students learn science content, think about and use science as scientists do, and improve methods for self-directing their learning. Despite considerable theoretical and empirical research on these topics, there is an important gap: we know little about how students bring prior knowledge to bear in preparing to solve a problem, and there is scant data that directly reflects the methods they use to solve problems. For example, general conclusions from research are that learning objectives can enhance problem solving, that students benefit from worked examples, and that prior experience with successful problem-solving methods will transfer to subsequent problems. However, students do not always examine learning objectives available to them (Winne & Jamieson-Noel, 2002), often do not know how to create their own helpful objectives (Morgan, 1985), do not always benefit from worked examples (Kalyuga, Chandler, Tuovinen, & Sweller, 2001), and do not always reuse a tactic that has proven successful in a subsequent, similar task (Rabinowitz, Freeman, & Cohen, 1992).

Instructional practices that guide teaching and learning in higher education rest on the implicit assumption that teachers and experimenters can effectively manipulate students' cognitive engagement with content to foster learning, enhance motivation, and elevate achievement (Winne, 2004). One explanation for the observed variance in these intended outcomes is that learners construct their own interpretations of the features of problems presented to them (Butler, 2002; Winne, 1982). They set goals for learning from the problem, and they choose tactics and strategies for solving it based on their predictions about how each may support progress toward those goals. As they work on problems, they may adjust their approaches. This process of setting goals and working toward a plan is called self-regulated learning (Butler & Winne, 1995; Winne & Jamieson-Noel, 2001). However, it does not necessarily follow that students are effective at the self-regulation that is intrinsic to learning and solving problems (Winne & Jamieson-Noel, 2002).

In this context, a significant challenge facing undergraduate science instruction at UBC (and everywhere science is taught) is determining how students learn and solve problems as they strive to master content. While common assessment tools such as homework sets and examinations provide summative (final) measures of student learning, they offer little to no insight into the dynamics of how students apply and improve strategies for solving problems and for transferring those skills to new problems they encounter. Yet it is precisely this kind of information that is critical for improving instruction (Ercikan, 2006). One clear instance of this challenge arises in UBC's CHEM 233, Organic Chemistry for the Biological Sciences. CHEM 233 is an important prerequisite for continuing studies in the life sciences. Regrettably, for the past four years, the failure rate in CHEM 233 has been high (approximately 18-29% after scaling). Our project will investigate the reasons why students in introductory chemistry classes are not meeting expectations, develop meaningful solutions, and validate these empirically.

Objectives: The ultimate goal of this project is to improve students' learning in chemistry by helping them develop more effective skills in studying disciplinary content and solving chemistry problems. To approach this goal, we will take advantage of newly developed, leading-edge software: gStudy (Winne et al., in press), a system implemented by Winne and his team in the Learning Kit Project. We chose gStudy because it provides unique methods for gathering data that reveal how students study and how they approach problem solving. At the same time, because the studying and problem-solving tools provided by this software are grounded in research in the learning sciences, we strongly expect students will develop skills for learning and for self-regulating their learning to improve it over time.

We will create content modules, called learning kits, that students can work on using gStudy. Learning kits are packages of richly hyperlinked multimedia information presented as texts, graphical displays, and audio and video clips. Using tools provided in gStudy, students can, for example: (1) build a glossary of terms and search it to review definitions of key terms in a problem statement; (2) search organic chemistry content to locate and review knowledge and procedures essential to solving problems; (3) annotate a solved problem to preserve information about how it was solved and how those methods can be applied to similar problems in the future; (4) label problems according to the procedures useful in solving each, then re-examine those previously solved problems as worked examples when tackling a new problem; (5) study worked examples provided by the instructor; and (6) record notes for distinguishing when a principle applies to various kinds of problems.

Objective 1. To develop learning kits for CHEM 233 and CHEM 123 that serve the dual purposes of (a) providing material that can help students learn key content plus problem solving skills as they (b) use gStudy's tools to reveal how they study and approach problems.

Objective 2. To work with students in CHEM 233 and CHEM 123 to identify the processes they use to study and solve problems of varying complexity, thereby generating data for modeling students' problem solving processes in organic chemistry.

Objective 3. Based on empirically validated models of students' studying and problem solving and their relations to achievement, to develop guidelines for designing assessment procedures that: (1) provide valuable information about students' cognitive processes and growth; (2) guide assessment and revision of instructional and assessment methods; and (3) provide specific and timely feedback to students so that they can monitor and regulate their own learning more effectively.

Methods: We will develop first versions of the organic chemistry learning kits in gStudy in May-June 2006. These will be pilot tested in the summer session to ensure their effectiveness in capturing students' studying and problem solving processes. In fall 2006, students will use the gStudy kits to complete CHEM 233 homework problems. In addition to the data on studying and problem solving processes that gStudy collects, we will gather achievement data (mid-term and final exams) to investigate the effectiveness of varying study tactics and problem solving strategies. CHEM 123 (the first-year organic chemistry course) data will be collected in winter 2007. Data analysis will run alongside data collection. During the semester, we will provide descriptive feedback to students and publicize preliminary findings to the teaching and research community at UBC. In-depth analyses of the data will proceed apace and will be completed by August 2007.

Based on these results, in fall 2007 we will begin developing assessment items and guidelines that organic chemistry instructors can use to frame problems more effectively, in ways that help or challenge students' use of prior knowledge. We also will conduct a comparative analysis of methods used for assessing the different components of problem solving in chemistry. After gathering feedback from professors on preliminary products of this work, we will translate our findings into clear online design specifications for developing problem solving assessments in chemistry, along with recommendations for adapting these specifications to other science domains. We will pilot test the specifications with a representative sample of chemistry instructors in winter 2008, collecting student data as well as students' opinions on the perceived value of the new assessment forms. Based on these results, design guidelines and models of assessment practices will be completed by May 2008.

Funding Details
Project Year: Year 1
Funding Year: 2006/2007
Project Type: Small TLEF
Principal Investigator: Jaclyn Stewart
Funded Amount: 50,583
Team Members:

Jaclyn Stewart, Chemistry, Faculty of Science
Deborah Butler, Associate Professor, Faculty of Education
Kadriye Ercikan, Associate Professor, Faculty of Education
France Gagnon, Ph.D. Student, Faculty of Education
Edward Grant, Professor / Head, Chemistry, Faculty of Science
Patricia Lau, President, Science Undergraduate Society
Annie Prudhomme Généreux, Lecturer, Coordinated Science Program, Faculty of Science
Philip H. Winne, Canada Research Chair, Self-Regulated Learning and Learning Technologies, Simon Fraser University