ESSAY 4: Educational Quality: Student Learning, Core Competencies, and Standards of Performance at Graduation

As mentioned in Essay 3, CSUSM is defined in part by our nontraditional and diverse student population. The percentage of Underrepresented Minority (URM) undergraduate students (African American, Pacific Islander, Hispanic, and Native American) grew from 31% in Fall 2009 to 40% in Fall 2013 (IPA, Appendix 1). With nearly a third of our students being first-generation college students, CSUSM is both a Hispanic Serving Institution (HSI) and an Asian American Native American Pacific Islander Serving Institution (AANAPISI). Our campus strives to graduate students who not only meet the needs of the region but also excel in the essential learning outcomes identified by the Association of American Colleges and Universities’ Liberal Education and America’s Promise (LEAP) initiative. These learning outcomes are the foundation for our institutional Undergraduate Learning Outcomes (ULOs) and our General Education Program Student Learning Outcomes (GEPSLOs). The framework of five core competencies—written communication, oral communication, information literacy, critical thinking, and quantitative reasoning—comprises the essential knowledge, skills, and abilities that students should have at graduation. This essay further explores educational quality at CSUSM, highlighting examples of student learning at both the undergraduate and graduate levels, as well as the learning centeredness embedded across the institution.
Core Competencies and General Education Student Learning Outcomes
In its ongoing commitment to support teaching and learning and further institutionalize assessment, CSUSM has brought together faculty, staff, and administrators at a Discovery Café, an Institutional Learning Outcomes Task Force, a Quality of the Degree Team, and a Core Competency Team (CCT). The CCT comprises a former WASC Accreditation Liaison Officer (ALO) and General Education Assessment Coordinator; the Director of General Education Writing (GEW); the Librarian and Director of the Information Literacy Program at the CSUSM Library; the General Education Oral Communication Coordinator; the Director of First-Year Programs; and faculty from Philosophy and Mathematics. During academic year 2013-14, the CCT designed and implemented a plan to assess the five core competencies (written communication, oral communication, information literacy, critical thinking, and quantitative reasoning) in General Education (GE) and major courses.
CSUSM’s seniors participating in the National Survey of Student Engagement (NSSE) consistently identify clear and effective writing as a distinguishing characteristic of their education. This result reflects CSUSM’s university-wide writing requirement. Consequently, the CCT’s assessment plan began with written communication in AY 2013-14, with the assessment occurring in Spring 2014. The cycle for the remaining core competencies is as follows: oral communication (Fall 2014), critical thinking/information literacy (Spring 2015), and quantitative reasoning (Fall 2015).
When developing the rubrics used to score the essay samples and the speeches and oral presentations for the written and oral communication assessments, the CCT held rubric-building sessions with faculty from various disciplines. A different group of faculty was enlisted to score the essays. This level of participation at both the planning and implementation stages helps broaden faculty understanding of the University’s assessment efforts. The total sample size for this first round of core competency assessment was 122 papers (39 from GE courses, 83 from senior-level major courses). Overall, the majority of students met the minimum standard for each criterion, with the greatest strengths in the purpose and audience/voice criteria (Appendices 2 and 3).
The CCT shared the assessment data with various entities across campus, including all college deans and the University Assessment Committee (UAC). Conversations are beginning about how this snapshot of CSUSM’s graduating seniors’ written communication skills can help faculty revisit their own classroom practices, aid departments and programs in examining how they support writing in their curricula, and help the University reevaluate the curricular structures that support writing across the disciplines. In other words, CSUSM is looking at “closing the loop” as it moves forward with assessment of the remaining core competencies. For the oral communication assessment in Fall 2014, the CCT recruited an even larger sample of 241 in-class student presentations (see Appendix 4 for the rubric).
Furthermore, in keeping with California State University Executive Order 1065 on the GE Breadth Requirement, our General Education Committee (GEC) brought General Education Student Learning Outcomes to the Academic Senate in Spring 2014. These GEPSLOs are rooted in the LEAP core competencies mentioned above, with GEPSLOs #3, 5, and 6 on written communication, oral communication, and information literacy already required in all GE courses (Appendix 5). GEC’s next step is to begin curriculum mapping of these GEPSLOs across the GE courses—specifically, upper-division courses—to demonstrate that these student learning outcomes are indeed being addressed across the curriculum.
Evidence of Undergraduate Student Learning
The University uses a wide range of strategies to confirm that students meet key learning outcomes, particularly at the programmatic level. One example is the Biology Department’s work to ensure that its majors learn to “Apply quantitative reasoning to analyze and solve complex problems” (ULO 2b). Supported by a National Institutes of Health MARC Curriculum Improvement Grant from 2008 to 2013, CSUSM faculty modified a total of 17 Biology courses, 6 Chemistry courses, 3 Mathematics courses, 2 Physics courses, and 2 Computer Science courses to increase quantitative and computational concepts and analyses related to the Biological Sciences. All of the modified courses are requirements or electives for Biology majors, ensuring that students in the major are introduced to quantitative reasoning and analysis early in their college careers and that key concepts are reinforced multiple times during their coursework. A summary of the quantitative and computational modifications made to a single course (BIOL 210) is included in Appendix 6. Although evaluation of project assessment data is ongoing, student knowledge surveys have suggested substantial gains in student confidence with quantitative and computational concepts after completing modified coursework (Appendix 7). Beyond individual course modifications, another outcome of this project was the development of a new Quantitative and Computational Biology Minor, offered for the first time in Fall 2014. Thus, targeted and thoughtful efforts are underway to strengthen quantitative reasoning and problem solving in the Biological Sciences, and these efforts are representative of a broader University-wide commitment to ensure that our students meet key learning outcomes.
Another programmatic example is our College of Business Administration, which, in conjunction with 7 other CSU campuses, uses a Business Assessment Test (CSU-BAT) to assess student learning outcomes in business classes from accounting to management and marketing. In Spring 2012, the annual assessment report for our BS in Business Administration noted that the average scores on the CSU-BAT were the highest CSUSM students have received since we began participating in the assessment in 2005. A further demonstration of student learning is the capstone Senior Experience for Business students. In an intensive, integrated course, teams of students apply their classroom-based knowledge to complete a consulting project addressing a real-world business problem, proposing solutions for their sponsoring business. For example, Senior Experience teams identified areas to increase the sustainability efforts of the local eco-conscious Stone Brewery.
Additional evidence of student learning is gathered at the institutional level through three national surveys CSUSM administers: the Freshmen Survey (administered to incoming first-year students), the Senior Survey (administered to graduating students), and the National Survey of Student Engagement, or NSSE (administered to freshmen and graduating seniors in their spring term). These surveys provide insight into students’ self-reported written and oral communication skills and into time spent writing and presenting. NSSE responses illustrate our students’ assessment of their writing and speaking skills compared to students at other participating California State Universities. For NSSE 2014 results, see Appendices 8, 9, and 10 (“Snapshot,” “High Impact Practices,” and “Engagement Indicators”).
Because our campus administers both the Freshmen Survey and the Senior Survey, the Higher Education Research Institute is able to provide responses to both surveys by the same students at these two different points in their college careers. As was the case in 2009 and 2011, the results for the Spring 2013 Senior Survey show that the percentage of CSUSM respondents who rated their written and oral communication as “above average/highest 10%” increased substantially between their freshman and senior years:
Writing ability: 48% of freshmen vs. 66% of seniors
Public speaking ability: 33% of freshmen vs. 42% of seniors (Appendix 11).
As part of a system-wide initiative, CSUSM also administers the Collegiate Learning Assessment (CLA) to incoming freshmen and graduating seniors each year. The CLA is designed to “measure an institution’s contribution, or value added, to the development of higher-order skills” such as critical thinking and written communication. In Spring 2014, 65 seniors took the test, and the results show that CSUSM’s total “Value-Added Percentile Rank” was better than that of 80% of other participating campuses (Appendix 12). The data from these surveys, in particular when comparing freshmen to seniors, indicate growth and student learning at graduation. And we continue to look for areas in which we can further institutionalize assessment from the programmatic level on up, a topic addressed more fully in Essay 6.