Viewpoints discussion board | University of Illinois

National Institute for Learning Outcomes Assessment

  • Embedded Assessment and Evidence-Based Curriculum Mapping: The Promise of Learning Analytics

    Assessing desired learning outcomes effectively is a complex endeavor. At the risk of over-simplification, I propose we accept the definition offered by Randy Swing (2010): “Assessment is change management.” If we grant that premise and partner it with “Data are resources; people are the agents of decisions and change” (Kramer, Hanson, & Olsen, 2010, p. 44), then the locus of control for assessment, and therefore for change, can reside with the people generating the data. In my school, the Wegmans School of Pharmacy at St. John Fisher College in Rochester, New York, that means the faculty.

    At the Wegmans School of Pharmacy, we have adopted an embedded assessment approach to curriculum mapping and to data collection on student learning outcomes achievement. In essence, we capture the data from the course-level exams that our faculty members craft to measure student learning. We have accepted the approach proposed by Linda Suskie (2009, p. 5): “Assessments that are embedded into individual courses can often provide information on student achievement of program goals, general education goals, and institutional goals.” Furthermore, “…a more deliberate use of existing measures of student success can provide incremental evidence of student learning and move us toward meeting the call of accountability” (McCarthy, Niederjohn, & Bosak, 2011, p. 81). With this in mind, we tag every question on all course-level exams with multiple codes corresponding to program outcomes, course learning outcomes, and the level of Bloom’s Taxonomy. The resulting data are generated entirely from information embedded in our coursework. No additional testing is needed, which addresses Janet Fontenot’s (2012) concern that faculty are leery of the additional time that assessment activities require of them. Furthermore, with embedded assessment the faculty members are both the principal source of data and the people with the most control over the management of change that the data analysis suggests.
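
    To make the tagging idea concrete, here is a minimal sketch of how a single exam question might be coded; the field names and outcome codes are illustrative assumptions, not our actual records or software.

        # Illustrative sketch only: field names and outcome codes are hypothetical,
        # not the school's actual tagging system.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class TaggedItem:
            item_id: str                 # exam question identifier
            course: str                  # course in which the item was administered
            program_outcomes: List[str]  # program-level outcome codes, e.g., ["PO-3"]
            course_outcomes: List[str]   # course-level outcome codes, e.g., ["CLO-2.1"]
            bloom_level: str             # "knowledge", "application", or "synthesis"
            pct_correct: float           # percent of students answering correctly

        item = TaggedItem(
            item_id="PHAR511-midterm-Q12",
            course="PHAR 511",
            program_outcomes=["PO-3"],
            course_outcomes=["CLO-2.1"],
            bloom_level="application",
            pct_correct=86.5,
        )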

    At the end of the semester, longitudinal reports are generated that map the frequency of questions related to each course- and program-level outcome. This density table appears alongside student achievement for each of those outcomes. For example, the data reveal that students were questioned on immunology 45 times with an average percentage correct of 86.5. From a curricular perspective, we create an evidence-based curriculum map. Instead of declaring that we address program outcome X in course Y, we now demonstrate that outcome X was tested 45 times with a result of 86.5% student achievement. We also analyze the percentage of questions asked at the knowledge, application, and synthesis levels. Based on the data, we recommend change.
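
    For readers who want a concrete picture, the following is a hedged sketch of how such a density table might be rolled up from tagged item results; the column names and numbers are placeholders, not our actual schema or data.

        # Sketch: roll tagged item results up into an outcome-level "density table"
        # (question count plus mean percent correct). Placeholder data only.
        import pandas as pd

        items = pd.DataFrame([
            {"outcome": "Immunology",   "pct_correct": 92.0},
            {"outcome": "Immunology",   "pct_correct": 81.0},
            {"outcome": "Calculations", "pct_correct": 70.0},
        ])

        density = (
            items.groupby("outcome")["pct_correct"]
                 .agg(questions="count", mean_pct_correct="mean")
                 .reset_index()
        )
        print(density)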

    At the course level, faculty track coverage of their individual course outcomes. If student achievement is low in one area, the instructor knows where to spend time in class. The validity of the questions is checked through a peer-review process. The faculty member also has descriptive statistics for individual test items, which can help identify poorly performing questions that can then be modified or retired. Faculty members also capture reports of student performance on their course learning outcomes to include in their dossiers as evidence of effective teaching.
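
    Item difficulty (the proportion of students answering correctly) and a point-biserial discrimination index are two common descriptive statistics used for this purpose; the sketch below computes both on a toy response matrix, with data invented purely for illustration.

        # Hedged illustration of item-level statistics: difficulty (proportion
        # correct) and point-biserial discrimination (correlation between an
        # item and the total score). Toy data only.
        import numpy as np

        # rows = students, columns = exam items; 1 = correct, 0 = incorrect
        responses = np.array([
            [1, 1, 0, 1],
            [1, 0, 0, 1],
            [1, 1, 1, 1],
            [0, 0, 0, 1],
            [1, 1, 0, 0],
        ])

        total_scores = responses.sum(axis=1)
        for j in range(responses.shape[1]):
            difficulty = responses[:, j].mean()
            discrimination = np.corrcoef(responses[:, j], total_scores)[0, 1]
            print(f"item {j + 1}: difficulty={difficulty:.2f}, "
                  f"discrimination={discrimination:.2f}")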

    At the student level, the bi-semester longitudinal reports reveal areas of strength and weakness. For example, one student presented with a 90+ average and appeared to have no areas of concern. After reviewing her longitudinal report, we saw that on the questions involving calculations, spanning all of her coursework, she earned a 70% average. She now knows where to direct her efforts before taking the North American Pharmacist Licensure Examination. After students are presented with their longitudinal reports, they are required to write a reflection on their performance and briefly describe the steps they will take to address areas of concern. In this way, we encourage students to take responsibility for their learning.
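
    A minimal sketch of the kind of per-student roll-up that surfaces such a pattern follows; the tags and scores are invented and only mirror the example above, in which a strong overall average hides a 70% result on calculation items.

        # Sketch: per-student roll-up by outcome tag. Invented data mirroring the
        # pattern above: an overall average above 90 hides a 70% on calculations.
        import pandas as pd

        student_items = pd.DataFrame([
            {"tag": "pharmacology", "pct_correct": 95.0},
            {"tag": "pharmacology", "pct_correct": 96.0},
            {"tag": "therapeutics", "pct_correct": 97.0},
            {"tag": "immunology",   "pct_correct": 98.0},
            {"tag": "calculations", "pct_correct": 70.0},
        ])

        print("overall average:", student_items["pct_correct"].mean())  # 91.2
        print(student_items.groupby("tag")["pct_correct"].mean().sort_values())  # weakest tag first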

    In the NMC Horizon Report: 2013 Higher Education Edition (Johnson et al., 2013), learning analytics is placed on the two- to three-year horizon. The report states, “The promise of learning analytics is actionable data relevant to every tier of the educational system…A key outcome of learning analytics pertains to the student on an individual level…” (p. 24). I would suggest that the seeds of learning analytics have already been planted. While we still have far to go, we have begun mining large banks of data to do precisely what is suggested in this report: to use the data sets to target areas for improvement. We use our data to inform change at the program, course, and student levels.

    Overall, our embedded assessment and evidence-based curriculum mapping approach has been well received in the school. We are realizing advantages at the programmatic, course, faculty, and student levels. We believe acceptance of this new approach is due in large part to keeping it simple, consistent with Swing’s advice that assessment is change management, and to ensuring that subsequent changes in curriculum and assessment emanate from close to the data source.

    References

    Fontenot, J. (2012, July 11). Faculty concerns about student learning outcomes assessment [National Institute for Learning Outcomes Assessment Web log comment]. Retrieved from http://illinois.edu/blog/view/915/76774?displayOrder=desc&displayType=none&displayColumn=created&displayCount=1

    Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). NMC Horizon Report: 2013 Higher Education Edition. Austin, TX: The New Media Consortium.

    Kramer, G. L., Hanson, C., & Olsen, D. (2010). Assessment frameworks that can make a difference in achieving institutional outcomes. In G. L. Kramer & R. L. Swing (Eds.), Higher education assessments: Leadership matters (pp. 27-56). Lanham, MD: Rowman & Littlefield.

    McCarthy, M. A., Niederjohn, D. M., & Bosak, T. N. (2011). Embedded assessment: A measure of student learning and teaching effectiveness. Teaching of Psychology, 38(2), 78-82.

    Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass. 

    Swing, R. L. (2010). Supporting assessment: Cost/benefit considerations. Presented at the New England Association of Schools and Colleges annual meeting and conference, Boston, MA.