Viewpoints discussion board | University of Illinois

National Institute for Learning Outcomes Assessment


Results for "November, 2012"


  • Measuring Success in Internationalization: What are Students Learning?

    Over the past 30 years, various blue-ribbon commissions, association reports, and studies have highlighted U.S. students’ woeful lack of foreign language competency and literacy about world geography, politics, and history. The events of September 11, 2001, gave new urgency to the message, providing a wake-up call about the importance of educating Americans about the rest of the world and our inextricably entwined fates. This is not to say that U.S. higher education has been totally inactive with respect to internationalization. Many institutions have long-standing international partnerships and study abroad programs, and have hosted impressive numbers of international students over time. Yet institutional internationalization efforts are rarely strategic or coherent, or considered central to an institution’s academic mission or definition of quality. As is always the case, there is tremendous variation in the quantity, quality, and coherence of internationalization across campuses. But as the race to internationalize intensifies, it becomes all the more important to proceed with intentionality.

    For most institutions, “success” in internationalization is judged by a series of widely used indicators of institutional performance, such as the number of students going abroad, the number of international students, or the number of courses offered with an international or global focus. And as institutions around the world take up the challenge of internationalization, a robust literature has emerged outlining institutional indicators that help institutions judge their progress and benchmark their performance (see, for example, www.impi-project.eu, www.impi-toolbox.eu, www.nuffic.nl/international-organizations/services/quality-assurance-and-internationalization, www.nuffic.nl/mint, and http://www.iau-aiu.net/content/global-surveys).

    What these institutional activities mean for student learning is a different matter. Although many institutions cite producing “global citizens” as a goal, few have a clear set of learning outcomes associated with this label, a map of the learning experiences that produce such learning, or an assessment plan in place to determine whether they have achieved their goals. Clearly, the institutional performance and student learning perspectives can be related to each other, but one cannot assume causality in either direction. As anyone who has been engaged in assessing student learning knows all too well, the presence and quality of a given set of institutional activities, or the participation rates in various courses and programs, do not tell you anything about what students are learning.

    The field of education abroad has begun to engage seriously with the question of outcomes. It is no longer deemed acceptable in the field to cite the “it changed my life” argument as the self-evident truth of the positive impact of education abroad. The rapid growth of short-term education abroad programs has thrown into sharp relief how the learning achieved in these experiences relates to different program durations and pedagogies. As students go abroad for shorter periods of time, and are more likely to do so in a faculty-led program in the company of fellow U.S. students, it becomes even more important to determine the impact of these experiences on subject-matter learning, global awareness, and the development of intercultural skills. The same questions must be asked of longer, more conventional programs, for there is no guarantee that “being there” produces learning, let alone “transformation.” The good news is that institutions are taking up this challenge, increasingly using pre- and post-tests, journals, and portfolios to capture student learning in education abroad (for a list of useful research and resources, see http://www.nafsa.org/resourcelibrary/Default.aspx?id=31791).

    Although education abroad receives a great deal of attention nationally, it is not synonymous with international or global learning. While it is difficult to estimate the proportion of students who study abroad for credit during their undergraduate years, we do know that only 270,000 students out of more than 20 million enrolled in postsecondary education studied abroad in 2010 (IIE, 2011; NCES, 2012). Thus, the key question for higher education institutions is how the overwhelming majority of students who do not go abroad will learn about the world and develop the intercultural skills they will need as citizens and workers. To address this question, institutions will need to be very clear about what knowledge and capacities students must learn, where and how they will learn them, and what constitutes evidence of such learning.

    Many institutions begin this work by including global learning among their stated goals of liberal education. And they need not reinvent the wheel in crafting a specific set of goals. The Association of American Colleges and Universities includes intercultural learning as one of the 15 essential learning outcomes of its VALUE initiative (http://www.aacu.org/leap/vision.cfm) and also provides specific goals for liberal education and global citizenship (Musil, 2006). The American Council on Education has also developed a list of global learning goals with institutional examples, drawn from the literature and categorized under knowledge, skills, and attitudes (see Olson, Green, and Hill, 2006). It is important to note, however, that these first steps of stating global learning as a goal and crafting more specific goals are only the beginning of an ongoing process.

    Identifying which courses and programs actually enable students to acquire these skills and competencies is more difficult work. Having a global or international requirement as part of the general education sequence is one common way to ensure that every student gets at least a small dose, but it is certainly not the only one. Institutions also need to look at majors, programs, and individual courses to map which ones address specific global learning goals.

    The next step involves assessment. It is through assessment that institutions can find out whether they are really producing “globally competent” graduates, “global citizens,” or graduates who can navigate multicultural situations. And finally, institutions must take the crucial step of “closing the loop” (Banta & Blaich, 2011) by applying what they learn from assessment to improving curriculum and teaching. I recently produced a detailed guide (Green, 2012) on the steps in assessing global learning, with examples of good practice.

    As long as success in internationalization is measured largely or solely by institutional performance, colleges and universities will be missing the mark.  Although internationalization, alas, is increasingly a matter of numbers, profile, and branding, the real measure of success should be how well students are equipped to live and work in a rapidly changing global environment.

     

    References:

    Banta, T. W., & Blaich, C. (2011). Closing the assessment loop. Change: The Magazine of Higher Learning, 43(1), 22–27.

    Green, M. (2012). Measuring and assessing internationalization. Washington, DC: NAFSA: Association of International Educators. Retrieved from www.nafsa.org/epubs

    Institute of International Education (2011). Open Doors 2011: Fast Facts. Retrieved from http://www.iie.org/en/Research-and-Publications/Open-Doors

    Musil, C. (2006). Assessing global learning: Matching good intentions with good practice. Washington, DC: Association of American Colleges and Universities.

    National Center for Education Statistics (2012). Fast Facts: Enrollment. Retrieved from http://nces.ed.gov/fastfacts/display.asp?id=98

    Olson, C., Green, M., & Hill, B. (2006). A handbook for advancing comprehensive internationalization: What institutions can do and what students should learn. Washington, DC: American Council on Education.

  • Demonstrating How Career Services Contribute to Student Learning

    Career services are too often thought of as separate from the core learning activities in which students engage in classrooms, laboratories, and studios. But dividing what students gain from college into academic learning (what happens in the classroom) and personal development (what happens outside of the classroom) is a byproduct of the historical, physical structures of higher education institutions. Thinking about student accomplishment as bifurcated in this way does not serve either students or institutions well, as students grow, develop, and learn in a holistic fashion.1

    Career services professionals are well positioned to bridge the gap between academic learning and personal development outcomes. They are educators who help students learn how to explore career options, make career decisions, and develop career management skills that they will use throughout their lives. Career interventions are the medium through which career professionals provide these learning opportunities to students. We intentionally use the term “career interventions,” as opposed to “programs, services, and resources,” as the latter are static entities that focus attention on the activities carried out by career professionals. Interventions, on the other hand, focus on the process of helping students change, develop, or move from point A to point B – essentially, to help students learn.

    Viewing career interventions as learning opportunities sets a high bar for evaluating the effectiveness of career services on higher education campuses. If education and learning are to be at the core of career services, then learning outcomes must be measured to evaluate success. Fortunately, many career professionals in higher education are no strangers to the process of collecting evidence to show the influence of their work. They have been involved in gathering and interpreting data since the earliest days of the profession. Historically, the most common data-driven strategies for demonstrating the influence of career services have been counting participants, measuring satisfaction, and tracking placement rates. These data tell important parts of the story of career interventions – which students take part, how satisfied students are with their experiences, and where they go after college.

    However, these types of data fall short of what contemporary times demand. When participation rates, satisfaction, or placement numbers are low, few insights can be gained regarding how or what to improve. Additionally, these strategies can encourage a “more is better” focus on increasing quantity with little attention to quality. A different approach is needed for career professionals to demonstrate the quality of their career interventions and the difference those interventions make in students’ lives. Career professionals must build upon past data collection and analysis experiences to rise to the challenge of conducting rigorous, meaningful learning outcomes assessments.

    Conducting learning outcomes assessments can be daunting for many career professionals, as it is for many other higher education faculty and staff. One approach is to focus on well-defined programs and services that have clear boundaries – a resume review, a career exploration workshop, career counseling appointments, and so on. Doing so can help keep assessment efforts manageable. Additionally, career professionals can often clearly identify interested parties – such as prospective and current students, families, institutional administrators, faculty, and student affairs staff – who want to know whether career interventions make a difference in students’ lives. Understanding what these audiences want to know offers career professionals clues regarding the types of learning outcomes to focus their assessment efforts on, as well as with whom to share the results.

    Furthermore, career services professionals often draw upon established guidelines (e.g., CAS Standards, National Career Development Guidelines)2 and theories (e.g., Holland’s Typology, Planned Happenstance, Social Cognitive Career Theory, Super’s Career-Life Roles)3 to inform their career interventions. For example, the National Career Development Guidelines offers specific outcome statements regarding the mastery of career management skills. One such skill, related to career decision making, is that career development clients should be able to take into consideration how personal priorities, culture, beliefs, and work values affect their decision making by: (a) recognizing the role that these influences play in decision making, (b) showing examples of how these influences have affected them in the past, and (c) evaluating the impact of these influences in current career decision-making processes (p. 10). Attending to the content of guidelines and theory documents such as these helps career professionals clearly express desired milestones or outcomes of career interventions that can be measured in assessment efforts.

    Many career professionals are well positioned to demonstrate the value of their services and career interventions through learning outcomes assessment. A continuing challenge is for career professionals to find a way to embrace the assessment of student learning in their day-to-day practice. Our advice is to think of assessing learning in terms of building a house. That is, have an overall plan, start with activities of reasonable scale, and gradually build up, brick by brick. Focus on laying one brick at a time, no matter how small, to build a strong foundation – a rich body of evidence. Small, positive experiences with learning outcomes assessment can teach useful skills, build confidence and capabilities, and motivate future learning outcomes assessment efforts.4

    More information on how career professionals can get started with learning outcomes assessment is available in a recent monograph from the National Career Development Association, Learning Outcomes Assessment Step-by-Step: Enhancing Evidence-Based Practice in Career Services (http://tinyurl.com/7sy6krl).

    1 Ideas presented in this paragraph are influenced by scholarship such as: 

    American College Personnel Association. (1996). The Student Learning Imperative. Washington, DC: Author. Retrieved from http://www.acpa.nche.edu/sli/sli.htm  

    American College Personnel Association, & National Association of Student Personnel Administrators. (2004). Learning reconsidered: A campus-wide focus on the student experience. Washington, DC: Authors.

    2 References for sample professional guidelines:

    Dean, L. A. (Ed.). (2009). CAS Professional Standards for Higher Education (7th ed.). Washington, DC: Council for the Advancement of Standards in Higher Education.

    America’s Career Resource Network. (2004). National Career Development Guidelines. Retrieved from http://associationdatabase.com/aws/NCDA/asset_manager/get_file/3384?ver=16587

    3 References for sample career development theories:

    Lent, R. W., Brown, S. D., & Hackett, G. (1994). Toward a unifying social cognitive theory of career and academic interest, choice, and performance [Monograph]. Journal of Vocational Behavior, 45, 79–122. 

    Mitchell, A., Levin, A., & Krumboltz, J. D. (1999). Planned happenstance: Constructing unexpected career opportunities. Journal of Counseling and Development, 77, 115–124.

    Reardon, R. C., & Lenz, J. G. (1998). The Self-Directed Search and Related Holland Career Materials: A practitioner’s guide. Odessa, FL: Psychological Assessment Resources, Inc. 

    Sampson, J. P., Jr., Reardon, R. C., Peterson, G. W., & Lenz, J. G. (2004). Career counseling and services: A cognitive information processing approach. Belmont, CA: Brooks/Cole.

    4 Ideas presented in this paragraph are influenced by scholars such as:

    Keeling, R. P., & Associates. (2007, June). Putting Learning Reconsidered into practice: Developing and assessing student learning outcomes. Workshop presented at the National Association of Student Personnel Administrators’ Learning Reconsidered Institute in St. Louis, MO.

    Schuh, J. H. (2009). Assessment methods for student affairs. San Francisco, CA: Jossey-Bass.

    Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco, CA: Jossey-Bass.