Core Area Assessment Report Set 1 - D1: Philosophy & B2: Natural or Laboratory Science

History of Core Assessment Effort

The Core Assessment Working Group (CAWG) is a committee formed in 2015 by the Core Advisory Committee (CAC) in response to College of Arts and Sciences Dean Marcelo Camperi's call for an assessment of the Core Curriculum. The CAC is made up of the department chairs who represent each Core area, faculty representatives from the School of Management and the School of Nursing and Health Professions, and the Associate Dean of Academic Effectiveness, and it acted with the support of the College Council. Dean Camperi's call was issued in response both to faculty concerns about the growth and coherence of the Core and to the Western Association of Schools and Colleges (WASC) requirement that colleges and universities engage in regular curricular assessment in order to retain their academic accreditation; the USF Core Curriculum had not been assessed since its inception in 2002. In the spring of 2015, the CAC, with the guidance of Associate Dean for Academic Effectiveness June Madsen Clausen, asked that a committee be created to investigate procedures, design materials, and establish a timeline for assessing the Core. Dean Camperi then constituted CAWG with a representative from each area of the College's Core Curriculum. Its initial membership was Tracy Benning (Sciences), Christine Young (Arts), Yaniv Stopnitzky (Social Sciences), and Ronald Sundstrom (Humanities, CAWG Chair). In the spring of 2017, Joshua Gamson replaced Prof. Stopnitzky; in the fall of 2017, Ryan Van Meter replaced Prof. Sundstrom; and in the spring of 2018, Eve-Anne Doohan replaced Prof. Gamson. The current membership of the committee is thus Tracy Benning (Sciences, CAWG Chair), Christine Young (Arts), Ryan Van Meter (Humanities), and Eve-Anne Doohan (Social Sciences).

CAWG, with Associate Dean Clausen, created a timeline for assessing the Core (see Appendix A for the Core Area Assessment Master Timeline and Appendix B for the D1 and B2 Assessment Process Timeline), and concurrently began to investigate and design materials to support an assessment of the Core. The group conferred with a consultant, Carol Gittens (Associate Dean, Santa Clara University). Based on Gittens's recommendation, CAWG consolidated the 48 learning outcomes from the 11 Core areas (A1 through F) into a simplified and more measurable set of Higher Order Learning Goals (HOLGs) corresponding to each Core area (see Appendix C). The HOLGs were then used to design a draft rubric for each Core area, with the goal of developing rubrics specific enough to offer a meaningful measure of student learning in relation to Core learning outcomes, yet general enough to be applied to student work products from a variety of courses and disciplines within a Core area.

The Core areas were divided into five sets of 2-3 Core areas, with each set due to be assessed once during a five-year period. This assessment process is broken into five phases, with staggered start dates for the different Core area sets. The process includes the following steps: 1) faculty in a Core area are asked to align their Core courses with the respective Core learning outcomes; 2) rubrics for each area are developed with input from faculty teaching in the relevant Core area, and assessable student work products are identified; 3) workshops are conducted both to inform faculty about the assessment process and to recruit faculty raters; 4) student work products are gathered and rated by paid faculty raters; 5) the results are interpreted by CAWG and shared with faculty and administration (see Appendix D for Core Assessment Reporting Protocol).

Process and Methodology for Core Area Assessments - Set 1

Core Areas D1 (Philosophy) and B2 (Natural or Laboratory Science) were selected as the first set to be assessed because the CAWG members representing the Humanities and the Sciences believed their Core area syllabi were already well aligned with their Core learning outcomes. This allowed CAWG to skip the semester-long syllabi alignment process and begin rubric development immediately. All full-time and part-time faculty teaching D1 and B2 classes were invited to attend rubric feedback sessions in their Core area, to ensure that the rubrics remained true to the intentions of the existing Core learning outcomes, would make sense to raters, would reflect the language and practices of the Core area, and, when applied to student work products, would provide an accurate measure of whether and to what degree the learning outcomes were achieved. Each rubric was reviewed by faculty teaching in the Core area during two rubric feedback sessions before its final approval by the CAC in March 2017 (see Appendix E for the rating rubrics). In consultation with Core area faculty, CAWG reviewed D1 and B2 syllabi to determine what types of student work products would be available and useful for assessment. For D1, final papers from all D1 courses were collected; for B2, both an exam and a lab report were collected. Student work products were then randomly sampled using a stratified approach based on overall course enrollments, one form of which is sketched below. However, we were not able to reach our target numbers for every course, because many of the submitted work products had to be eliminated from rating for various reasons, such as being incomplete or illegible or lacking a corresponding key (see Appendix F for details on the numbers of courses, student work products, and sampled student work products).
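
To make the sampling step concrete, the following is a minimal sketch of one way an enrollment-proportional stratified sample can be drawn. It is illustrative only: the course names, data structures, function name, and target size are hypothetical, not the actual tooling used for this assessment.

```python
# Illustrative sketch only; not CAWG's actual sampling tool.
# Draws a per-course sample whose size is proportional to course enrollment.
import math
import random

def stratified_sample(products_by_course, enrollments, target_total, seed=2017):
    """Return a sample of work products, stratified by course enrollment."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    total_enrollment = sum(enrollments.values())
    sample = []
    for course, products in products_by_course.items():
        share = enrollments[course] / total_enrollment
        n = min(len(products), math.ceil(target_total * share))
        sample.extend(rng.sample(products, n))
    return sample

# Hypothetical example: three D1 sections with different enrollments.
products = {"PHIL-110-01": [f"paper_{i}" for i in range(40)],
            "PHIL-110-02": [f"paper_{40 + i}" for i in range(35)],
            "PHIL-240-01": [f"paper_{75 + i}" for i in range(25)]}
enrollments = {"PHIL-110-01": 40, "PHIL-110-02": 35, "PHIL-240-01": 25}
print(len(stratified_sample(products, enrollments, target_total=33)))  # 35 (ceiling rounds each course's share up)
```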

Rating workshops, in which the assessment goals and rating methodology were reviewed, were held in April 2017. All D1 and B2 faculty were invited to apply to serve as faculty raters in a daylong assessment of student work products in their Core area, for which they received a $250 honorarium. Six D1 faculty and four B2 faculty participated in rating sessions on May 30, 2017 (D1) and June 6, 2017 (B2) (see Appendix B for a list of participants). Rating was preceded by a calibration process, in which participants rated the same student work products and discussed any discrepancies in their application of the rubric. A portion of the work products was also rated by a second faculty rater to check inter-rater reliability (this procedure is explained later in this report). In total, raters assessed about 25% of the submitted D1 work products and about 16% of the submitted B2 work products.

Assessment Results - Set 1

D1 Philosophy

Results shown in Figures 1-5 reveal that a solid majority of students leave D1 courses able to identify key philosophical concepts, issues or positions and to articulate, interpret and evaluate them using philosophical methods. A little more than half leave with the capacity to explain the significance of concepts, issues or positions, and their inter-relations. A little less than half leave with the capacity to critically assess the concepts, issues or positions. A minority of students demonstrate the ability to apply philosophical content to self or the world, considering multiple perspectives and why they matter.

Student performance is strongest on Criteria 1 and 4, moderate on Criteria 2 and 3, and quite weak on Criterion 5. The failure rate is highest on Criterion 5, followed in descending order by Criteria 3, 4, 2, and 1. Specifically:

  • Criterion 1: Identifies key philosophical concepts, issues or positions. Around two-thirds of the students were rated to be meeting or exceeding expectations in this area, and the remaining one-third were below expectations. About one-tenth failed to meet expectations altogether.

    Figure 1. Distribution of rating scores for Criterion 1 (vertical axis: number of students, 0-80; horizontal axis: rating score). Score 1: 14 students; score 2: 30; score 3: 55; score 4: 35.

  • Criterion 2: Explains significance of concepts, issues or positions, and their inter-relations. Roughly half the students were rated to be meeting or exceeding expectations in this area, and roughly half were below the expected performance standard. About 14% failed to meet expectations altogether.

    Figure 2. Distribution of rating scores for Criterion 2 (vertical axis: number of students, 0-80; horizontal axis: rating score). Score 1: 19 students; score 2: 46; score 3: 42; score 4: 27.

  • Criterion 3: Critically assesses the concepts, issues or positions within their appropriate context (i.e., debates, problems, theories). More than two-fifths of the students were rated to be meeting or exceeding expectations in this area, and a little less than three-fifths were below the expected performance standard. About 18% failed to meet expectations altogether.

    Figure 3. Distribution of rating scores for Criterion 3 (vertical axis: number of students, 0-80; horizontal axis: rating score). Score 1: 24 students; score 2: 49; score 3: 38; score 4: 23.

  • Criterion 4: Articulates content analysis, interpretation, and evaluation using philosophical methods. Nearly 60% of the students were rated as meeting or exceeding expectations in this area, and the remaining two-fifths were below the expected performance standard. About 17% of the students failed to meet expectations altogether.

    Figure 4. Distribution of rating scores for Criterion 4 (vertical axis: number of students, 0-80; horizontal axis: rating score). Score 1: 23 students; score 2: 32; score 3: 56; score 4: 23.

  • Criterion 5: Applies content to self or the world, considering multiple perspectives (e.g., comparative, historical, methodological) and why they matter. About two-fifths of the students were rated to be meeting or exceeding expectations in this area, with three-fifths below the expected performance standard. More than a fifth of the students failed to meet expectations altogether.

    Figure 5. Distribution of rating scores for Criterion 5 (vertical axis: number of students, 0-80; horizontal axis: rating score). Score 1: 29 students; score 2: 50; score 3: 42; score 4: 13.

Table 1. Percentage of students meeting expectations on assessed work. The percentage is the number of work products rated 3 or higher divided by the total number of rated products.

Criterion    Percentage of Students Scoring 3 or Above (D1)
1            67.2
2            51.5
3            45.5
4            59.0
5            41.0
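
As a check on the arithmetic, the short script below reproduces the Table 1 percentages from the score counts reported in Figures 1-5; the Table 2 values for B2 follow from Figures 6-8 in exactly the same way. The script is ours for illustration; this report does not specify the committee's actual tabulation tool.

```python
# Recomputes Table 1 from the D1 score counts shown in Figures 1-5.
# "Meeting expectations" means a rating of 3 or 4.
d1_counts = {  # criterion -> number of work products rated 1, 2, 3, 4
    1: [14, 30, 55, 35],
    2: [19, 46, 42, 27],
    3: [24, 49, 38, 23],
    4: [23, 32, 56, 23],
    5: [29, 50, 42, 13],
}

for criterion, counts in d1_counts.items():
    total = sum(counts)              # 134 rated D1 products per criterion
    meeting = counts[2] + counts[3]  # ratings of 3 or 4
    print(f"Criterion {criterion}: {100 * meeting / total:.1f}%")
# Output: 67.2, 51.5, 45.5, 59.0, 41.0 -- matching Table 1.
```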

B2 Natural or Laboratory Science

Results shown in Figures 6-8 reveal that, overall, a high percentage of Core Area B2 students are able to explain scientific concepts and principles and to conduct investigative analyses using scientific principles. In addition, almost two-thirds of students are able to apply scientific content to self or the world, considering multiple perspectives and why they matter. Student performance is strongest on Criteria 1 and 2 and moderate on Criterion 3, but overall the results indicate fairly strong performance in this Core area. Specifically:

  • Criterion 1: Explains scientific concepts and principles. More than 80% of the students were rated to be meeting or exceeding expectations in this area, and the remaining students (about 18%) were below expectations. About 2% failed to meet expectations altogether.

    Figure 6. Distribution of rating scores for Criterion 1 (vertical axis: number of students, 0-70; horizontal axis: rating score). Score 1: 2 students; score 2: 14; score 3: 57; score 4: 18.

  • Criterion 2: Conducts an investigative analysis using scientific methodology. Nearly 75% of students were rated to be meeting or exceeding expectations in this area, with the remaining 25% below the expected performance standard. In addition, almost 17% of the students sampled failed to demonstrate any competency for this criterion. Several raters noted that, in some cases, a score of "1" resulted from the absence of a "conventional" laboratory product to assess, such as a lab report or lab worksheet reflecting an investigative analysis. If a product did not demonstrate the use of investigative analysis techniques, such as applying the scientific method (in whole or in part) to address a scientific topic, issue, or problem, it was rated 1. This may explain the higher failure rate for this criterion.

    Figure 7. Distribution of rating scores for Criterion 2 (vertical axis: number of students, 0-70; horizontal axis: rating score). Score 1: 15 students; score 2: 8; score 3: 62; score 4: 6.

  • Criterion 3: Applies content to self or the world, considering multiple perspectives (e.g., comparative, historical, methodological) and why they matter. About two-thirds of students were rated to be meeting or exceeding expectations in this area, with about one-third below expectations. About 5% of the sampled student work products failed to meet expectations altogether. Raters discussed the interpretation of this criterion at length during the calibration process. While there was some ambiguity in how specific work products demonstrated competency, raters agreed on a broad interpretation of how this criterion would be applied.

    Figure 8. Distribution of rating scores for Criterion 3 (vertical axis: number of students, 0-70; horizontal axis: rating score). Score 1: 5 students; score 2: 28; score 3: 43; score 4: 15.

Table 2. Percentage of students meeting expectations on assessed work. The percentage is the number of work products rated 3 or higher divided by the total number of rated products.

Criterion    Percentage of Students Scoring 3 or Above (B2)
1            82.4
2            74.7
3            63.7

Reflections on Assessment Results - Set 1

D1 Philosophy

The Core D1 curriculum and instruction are quite successful in teaching students to identify philosophical arguments and to communicate about them with solid argumentation, composition, technical skill, clarity, and appropriate academic style. The more critical and analytical aspects of the D1 Core are less successfully achieved. These data cannot tell us whether this is because these more advanced skills are under-emphasized in courses, because they are being delivered less effectively than they might be, because the student work products selected for assessment were not the most appropriate demonstrations of student learning in these areas, or because mastery of these advanced skills is more appropriate to expect of students in the major than of students in Core classes. We recommend examining that question, because the actions required to address these learning outcomes differ depending on its answer. For example, the Philosophy department is primarily responsible for offering Core D1 courses to our students, so it may decide to commit to the delivery of the criteria that received the weakest results through the enforcement of pedagogical standards, or it may endeavor to align assignments more closely with learning outcomes so that they more accurately reflect student learning. Alternatively, if the department determines that the expectations are not truly appropriate for a Core D1 class, it may propose to the Core Area D committee and the CAC that the weakest-scoring criteria be eliminated or reduced in scope.

Clearly, student performance is weakest in the application of philosophical content to self or the world. This finding may indicate either that students are not achieving this outcome or that this element of the D1 Core is not currently being emphasized in courses. Because raters were instructed to rate as a 1 any student work product in which there was no evidence of the criterion, it is hard to tell whether low scores reflect poor student performance or assignments in which the element was absent altogether. As the Core D1 Area is revisited in the future, the question may become whether this is a criterion that instructors need to be encouraged to emphasize more thoroughly or whether it should instead be removed from the D1 Core learning outcomes. Because this ambiguity makes the relatively high failure rates on some D1 criteria difficult to interpret, we recommend instituting in future assessments a mechanism for distinguishing between "not proficient" and "not addressed" in the rating of learning outcomes.

B2 Natural or Laboratory Science

The Core B2 curriculum and instruction, as assessed, are highly successful in teaching students scientific concepts and principles, as well as aspects of investigative analysis using scientific methodology. In many ways, the assessment of these two criteria is straightforward and consistent because the HOLGs closely mirror the original Core Learning Outcomes (CLOs). Thus, as assumed at the beginning of our work, the results confirm that most B2 Core courses are well aligned with the CLOs and the HOLGs derived from them. However, it is problematic that Criterion 2 also had the highest number of work products that did not meet expectations; this was an unexpected result given the nature of Core B2 courses. The laboratory component of B2 courses should be a point of emphasis, with work products that clearly demonstrate the use of scientific methodology in an investigative manner. At a minimum, the laboratory portion of these courses should provide an authentic laboratory and/or field experience with an appropriate product. Criterion 3 is also an area of concern because it had the lowest competency rate overall. Clearly, this criterion is easier to address in some scientific disciplines than in others. However, even though raters used very broad interpretations of how this criterion could be met, the results were not at the level we would expect. Departments should reexamine how to address this criterion better, or whether it should be modified in future assessments.

Inter-Rater Reliability Analysis for Set 1 Raters

Inter-rater reliability is a numerical estimate of the degree of agreement among raters assessing the same work product. It was examined to ensure that the assessment process was both accurate and consistent. We used a basic two-rater model to calculate percent agreement, with both exact and adjacent ratings scored as agreement. Of the 134 work products rated for D1, a sample of 26 work products (19.4%) was rated twice; of the 91 total work products rated for B2, a sample of 13 work products (14.3%) was rated twice.
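
The sketch below illustrates the agreement measure just described, under the stated rule that exact and adjacent ratings (scores differing by at most one point on the 4-point scale) both count as agreement. The sample rating pairs are invented for illustration.

```python
# Two-rater percent agreement, counting exact and adjacent scores as agreement.
def percent_agreement(pairs, tolerance=1):
    """pairs: (rater1, rater2) score tuples for each double-rated product."""
    agree = sum(1 for a, b in pairs if abs(a - b) <= tolerance)
    return 100 * agree / len(pairs)

# Invented example: six double-rated products on one criterion.
pairs = [(3, 3), (2, 3), (4, 2), (3, 4), (1, 1), (2, 2)]
print(f"{percent_agreement(pairs):.0f}% agreement")  # 83% (5 of 6 pairs agree)
```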

A rule-of-thumb benchmark for rater data with four or fewer categories (as in both the D1 and B2 datasets) is that a percent agreement of 90% or higher constitutes high agreement, while 75% is considered minimal agreement. Using these benchmarks, rater agreement on all D1 criteria surpassed the minimal standard, with Criterion 2 having the highest percent agreement at 92% (Table 3).

Table 3. Inter-rater reliability for D1 criteria based on a two-rater method of agreement.

Criterion    Percent Agreement (D1)
1            88.0
2            92.0
3            77.0
4            88.0
5            81.0

Percent rater agreement for B2 criteria ranged from 92% to 100%, constituting high agreement on all criteria (Table 4). For both of the assessed areas, we concluded that the calibration process was sufficient to train the raters and that the results presented here fall within an acceptable reliability range.

Table 4. Inter-rater reliability for B2 criteria based on a two-rater method of agreement.

Criterion    Percent Agreement (B2)
1            100.0
2            100.0
3            92.0

Next Steps for Core Assessment Reports - Set 1

Following the release of the Core assessment report and any department-specific data, departments and programs will be required to offer their interpretations of the results and to specifically evaluate the Core Learning Outcomes (CLOs) as they apply to the courses they teach. Departments and programs will be required to provide feedback on the current set of CLOs for their Core Area and to comment both on strategies to address deficiencies identified in the assessment process and on whether CLOs should be modified as a result of the assessment. Potential outcomes at the department or program level include, but are not limited to: 1) determining that modification of current CLOs is necessary, 2) identifying more appropriate student work products for the assessment process, 3) suggesting modifications to the rubric used for the assessment, and 4) identifying changes to specific Core courses to better align them with CLOs. This information will be collected in a simple Google form designed to capture faculty sentiment at the department or program level after discussion of the assessment results. Once submitted, the form will be sent to the relevant Core Area Chair, the Core Advisory Committee (CAC) co-chairs, and the Associate Dean of Academic Effectiveness. A timeline for reporting feedback will be established to align with other required assessment activities and reports for the College. Information from these reports will inform Core Area Chairs and the CAC about the state of the Core curriculum and should provide the data necessary to guide any subsequent changes to the Core curriculum.

Appendix A: Core Area Assessment Master Timeline

Core Assessment Timeline

Appendix B: D1 and B2 Assessment Process Timeline

FT = Full-time, PT = Part-time, PHP = Preferred Hiring Pool, ADAE = Associate Dean of Academic Effectiveness

  • 9/21/16: CAWG Meeting – D1 Rubric Review - Participating faculty: Gerard Kuperus (FT Tenured), Marjolein Oele (FT Tenured), Jeffrey Paris (FT Tenured)
  • 10/5/16: CAWG meets with external consultant Carol Gittens to discuss Rubric and Collection Methodology
  • 10/19/16: CAWG Meeting – B2 Rubric Review - Participating faculty: Louise Goupil (PT)
  • 10/27/16: Philosophy Dept Meeting – D1 Rubric Review - Led by Ron Sundstrom with a quorum of FT and PT faculty
  • 10/27/16: CAWG attends CAC Meeting to give update on D1 Rubric status
  • 11/2/16: CAWG Meeting – D1 Rubric Review - Participating faculty: Rebecca Gordon (PT PHP), Jeffrey Paris (FT Tenured)
  • 11/16/16: CAWG Meeting – B2 Rubric Review - Participating faculty: None
  • 12/17/16: D1 and B2 Rubrics Approved at CAC Meeting
  • 3/10/17: Faculty Rater Applications Due
  • 4/11/17: D1/B2 Rating Workshop Orientation - Led by ADAE June Madsen Clausen with support from CAWG committee
  • 4/12/17: D1/B2 Rating Workshop Orientation - Led by ADAE June Madsen Clausen with support from CAWG committee
  • 5/30/17: D1 Rating Workshop - Led by ADAE June Madsen Clausen with support from CAWG committee. Participating faculty: Alexi Angelides (PT), Katherine Black (PT PHP), Greig Mulberry (PT PHP), Marjolein Oele (FT Tenured), Laurel Scotland Stewart (PT PHP), Ron Sundstrom (FT Tenured)
  • 6/6/17: B2 Rating Workshop - Led by ADAE June Madsen Clausen with support from CAWG committee. Participating faculty: Tracy Benning (FT Tenured), Amalia Kokkinaki (FT Term), Cary Lai (FT Untenured), Scott Nunes (FT Tenured)

Appendix C: Higher Order Learning Goals (HOLGs)

Created by CAWG to simplify the Core Area Learning Outcomes into more consistent and more measurable terms.

Note: the Higher Order Learning Goals (HOLGs) do not replace the Core Learning Outcomes (CLOs). CLOs will still be used to evaluate whether a course should receive Core designation. HOLGs will be used exclusively for assessing student learning in Core courses.

For each Core area, the associated WASC Core Competencies, Institutional Learning Outcomes (ILOs), HOLGs, and CLOs are listed below.

A1: PUBLIC SPEAKING
  • HOLG: Analyze, interpret, and evaluate, using rhetorical concepts and principles, the effectiveness of their own and others' communication in both academic and civic contexts, and identify ethical problems in public address. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 1, 3; CLOs A1:4, A1:5)
  • HOLG: Compose and present well-organized speeches, and well-reasoned, appropriately supported oral arguments. (WASC: Written Communication, Oral Communication; ILO 4; CLOs A1:1, A1:2, A1:3)

A2: RHETORIC AND COMPOSITION
  • HOLG: Analyze, interpret, and evaluate linguistic and rhetorical strategies used in a variety of texts, and connect multiple texts in an argumentative essay, making comparisons and contrasts between them. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 1, 3; CLOs A2:1, A2:2)
  • HOLG: Compose sophisticated research questions and arguments in response to those questions, conducting library research, and using academic documentation methods. (WASC: Information Literacy; ILOs 5, 6; CLOs A2:3, A2:4, A2:5)

B1: MATH
  • HOLG: Design and implement mathematical solutions to algebraic, algorithmic, statistical, numerical, or computational problems. (WASC: Critical Thinking, Quantitative Reasoning, Information Literacy; ILOs 3, 6; CLOs B1:1, B1:2)
  • HOLG: Evaluate the validity of a solution and its relevance to the original problem using quantitative reasoning as the norm for decision making. (WASC: Critical Thinking, Quantitative Reasoning, Information Literacy; ILO 6; CLO B1:3)

B2: SCIENCE
  • HOLG: Demonstrate literacy in the content and principles of a scientific discipline. (WASC: Critical Thinking, Quantitative Reasoning, Information Literacy; ILO 3; CLO B2:1)
  • HOLG: Conduct laboratory or field procedures that explore content, principles and application of scientific disciplines in a socially responsible manner. (WASC: Critical Thinking, Quantitative Reasoning, Information Literacy; ILOs 1, 3, 6; CLOs B2:2, B2:3, B2:4)

C1: LITERATURE
  • HOLG: Analyze, interpret, and evaluate the historical, social, and cultural influences that inform diverse literary works. (WASC: Critical Thinking, Information Literacy; ILOs 1, 3; CLOs C1:1, C1:2, C1:3)
  • HOLG: Articulate the ideas, plural meanings, moral and social implications, and formal features of literary works. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4; CLOs C1:2, C1:3, C1:4)

C2: HISTORY
  • HOLG: Analyze, interpret, and evaluate a significant span of history over a wide geographic area, and the histories of past societies and civilizations using the values and standards of their own contexts and times. (WASC: Critical Thinking, Information Literacy; ILOs 1, 3; CLOs C2:1, C2:3)
  • HOLG: Articulate how significant historical forces shape the development of societies and civilizations, and use historical thinking to consider ethical issues in the past and present. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4; CLOs C2:2, C2:3, C2:4)

D1: PHILOSOPHY
  • HOLG: Analyze, interpret, and evaluate central philosophical issues. (WASC: Critical Thinking, Information Literacy; ILOs 1, 2, 3; CLOs D1:1, D1:2)
  • HOLG: Articulate, using philosophical methods, primary philosophical themes and issues found in the writings of the major philosophers. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4, 5; CLOs D1:3, D1:4)

D2: THEOLOGY AND RELIGIOUS STUDIES
  • HOLG: Analyze, interpret, and evaluate how religion, theology, and spirituality underlie and correlate with a broad range of human experience. (WASC: Critical Thinking, Information Literacy; ILOs 1, 2, 3; CLOs D2:1, D2:2)
  • HOLG: Articulate the similarities and differences among diverse religious traditions and their ethical and social implications. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4, 5; CLO D2:3)

D3: ETHICS
  • HOLG: Analyze, interpret, and evaluate central ethical issues concerning right and wrong; good and bad; and equality, justice, and rights. (WASC: Critical Thinking, Information Literacy; ILOs 1, 2, 3; CLOs D3:1, D3:2, D3:3)
  • HOLG: Articulate ethical theories and values and apply them in professional and personal decision making. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4, 5; CLOs D3:4, D3:5, D3:6)

E: SOCIAL SCIENCES
  • HOLG: Analyze, interpret, and evaluate issues regarding humans and the processes that shape their relationships, institutions, and interactions with their environments. (WASC: Critical Thinking, Information Literacy; ILOs 1, 3; CLOs E:1, E:2, E:3)
  • HOLG: Use qualitative or quantitative data, analysis, or theory to evaluate causal arguments in the social sciences. (WASC: Critical Thinking, Quantitative Reasoning, Information Literacy; ILOs 3, 6; CLOs E:2, E:3, E:4)
  • HOLG: Articulate social science arguments that recognize connections between the social, economic, political, and environmental spheres of human life in a socially responsible manner. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4; CLOs E:1, E:5, E:6)

F: VISUAL AND PERFORMING ARTS
  • HOLG: Analyze, interpret, and evaluate the aesthetic, historical, socio-political, and cultural influences that inform diverse art works. (WASC: Critical Thinking, Information Literacy; ILOs 1, 3; CLOs F:1, F:2)
  • HOLG: Apply conceptual and technical skills related to an artistic discipline by engaging in creative and scholarly processes. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 6; CLOs F:2, F:3)
  • HOLG: Articulate the ethical and socio-political significance of the content and form of artistic works and the processes used to create them. (WASC: Written Communication, Oral Communication, Critical Thinking; ILOs 3, 4; CLOs F:1, F:2, F:4)

Appendix D: Core Assessment Reporting Protocol

  1. In the reporting process, CAWG prepares a draft report. If a Core Area is made up of more than one department, then individual department data will be reported separately as an addendum, and only aggregated data for the Core Area will be included in the main report.
  2. The draft main report and addenda reports will be shared with CAC, and CAC will provide feedback to CAWG about the reports.
  3. CAWG will incorporate feedback and submit final drafts of the main report and addenda to the CAC for approval.
  4. Once approved, the CAC co-chairs will submit the final reports to the CAS Dean for approval.
  5. Once approved, the CAC co-chairs will share the main report with all Arts and Sciences faculty. However, the addenda, with department-specific data, will be shared exclusively with each respective department, ideally via an in-person meeting with members of CAWG, in order to facilitate the "closing the loop" process.
  6. Departments will share a written response to the main report and appropriate addendum with the Core Area Chair, the CAC co-chairs, and the ADAE, to capture their interpretations of the findings and their action plans for completing the “closing the loop” part of the assessment process.
  7. Requests for course-specific data reporting from departments may be approved by the CAC if faculty within the department consent to its release using voting procedures outlined in their department bylaws. If any faculty member disagrees with, or has concerns about, the department consent to release course-specific reporting data, they may submit a statement of concern to the CAC along with the report of the department vote.

Developed 11/7/17 by CAWG Committee. Approved by CAC 11/29/17. Revised 2/2/18 by CAWG; revision approved by CAC 2/2/18.

Appendix E: Rating Rubrics

D1. Philosophy Higher Order Learning Goals (HOLGs)

Students will:

  1. Analyze, interpret, and evaluate central philosophical issues.
  2. Articulate, using philosophical methods, primary philosophical themes and issues found in the writings of the major philosophers.
D1 (Philosophy) HOLG Rubric

Criterion 1: Identifies key philosophical concepts, issues or positions.
  • Exceeds Expectations (4): Identifies key content and related elements with exceptional specificity and accuracy.
  • Meets Expectations (3): Identifies key content and related elements with acceptable specificity and accuracy.
  • Needs Improvement (2): Identifies some key content and related elements with limited specificity or accuracy.
  • Below Expectations (1): Did not identify key content, or articulates content with excessive errors.

Criterion 2: Explains significance of concepts, issues or positions, and their inter-relations.
  • Exceeds Expectations (4): Explains significance of content and inter-relations with exceptional clarity and accuracy.
  • Meets Expectations (3): Explains significance of content and inter-relations with acceptable clarity and accuracy.
  • Needs Improvement (2): Explains significance of content and inter-relations with limited clarity or accuracy.
  • Below Expectations (1): Did not explain significance of content and inter-relations, or articulates significance with excessive errors.

Criterion 3: Critically assesses the concepts, issues or positions within their appropriate context (i.e., debates, problems, theories).
  • Exceeds Expectations (4): Critically assesses content within its appropriate context with exceptional understanding and insight (e.g., depth of analysis, astuteness, originality).
  • Meets Expectations (3): Critically assesses content within its appropriate context with acceptable understanding and insight.
  • Needs Improvement (2): Critically assesses content within its appropriate context with limited understanding or insight.
  • Below Expectations (1): Did not critically assess content within its appropriate context.

Criterion 4: Articulates content analysis, interpretation, and evaluation using philosophical methods.
  • Exceeds Expectations (4): Articulates with exceptionally effective argumentation, composition, technical skill, clarity, and appropriate academic style.
  • Meets Expectations (3): Articulates with mostly effective argumentation, composition, technical skill, clarity, and appropriate academic style.
  • Needs Improvement (2): Articulates with partially effective argumentation, composition, technical skill, clarity, or appropriate academic style.
  • Below Expectations (1): Did not articulate with effective argumentation, composition, technical skill, clarity, or appropriate academic style.

Criterion 5: Applies content to self or the world, considering multiple perspectives (e.g., comparative, historical, methodological) and why they matter.
  • Exceeds Expectations (4): Applies content to self or the world, considering multiple perspectives and why they matter, with exceptional insight (e.g., depth of analysis, astuteness, originality).
  • Meets Expectations (3): Applies content to self or the world, considering multiple perspectives and why they matter, with acceptable insight.
  • Needs Improvement (2): Applies content to self or the world, considering multiple perspectives and why they matter, with limited insight.
  • Below Expectations (1): Did not apply content to self or the world, considering multiple perspectives and why they matter.

Developed by CAWG Committee - October 2016. Approved by CAC - December 2016.
Edited - March 2017.

D1. Philosophy Core Learning Outcomes (CLOs)

Students will:

  1. Understand the value of thinking philosophically by reflecting on the meaning of one's own life, the conceptual foundations of human actions and beliefs, the nature of the self and of human responsibility. (Criterion 1)
  2. Understand and discuss coherently the central philosophical issues, such as the problem of evil, the existence of God, free will, the mind/body relation, human knowledge, and the question of being. (Criterion 3)
  3. Demonstrate an ability to identify and articulate, both orally and in writing, the primary philosophical themes and issues found in the writings of the major philosophers. (Criteria 1 and 2)
  4. Demonstrate an ability to evaluate philosophical arguments critically, both orally and in writing, using philosophical methods that have been developed by either historical or contemporary philosophers. (Criterion 4)

B2. Laboratory Science Higher Order Learning Goals (HOLGs)

Students will:

  1. Demonstrate literacy in the content and principles of a scientific discipline.
  2. Conduct laboratory or field procedures that explore content, principles and application of scientific disciplines in a socially responsible manner.
B2 (Laboratory Science) HOLG Rubric

Criterion 1: Explains scientific concepts and principles.
  • Exceeds Expectations (4): Accurately explains scientific concepts while demonstrating understanding and insight (e.g., depth of analysis, cleverness, originality, thoroughness).
  • Meets Expectations (3): Accurately explains scientific concepts.
  • Needs Improvement (2): Explains scientific concepts with limited accuracy.
  • Below Expectations (1): Did not explain scientific concepts, or makes excessive errors.

Criterion 2: Conducts an investigative analysis using scientific methodology.
  • Exceeds Expectations (4): Accurately conducts an investigative analysis while demonstrating understanding and insight (e.g., depth of analysis, cleverness, originality, thoroughness).
  • Meets Expectations (3): Accurately conducts an investigative analysis.
  • Needs Improvement (2): Conducts an investigative analysis with limited accuracy.
  • Below Expectations (1): Did not conduct an investigative analysis, or makes excessive errors.

Criterion 3: Applies content to self or the world, considering multiple perspectives (e.g., comparative, historical, methodological) and why they matter.
  • Exceeds Expectations (4): Applies content to self or the world, considering multiple perspectives and why they matter, with exceptional insight (e.g., depth of analysis, astuteness, originality).
  • Meets Expectations (3): Applies content to self or the world, considering multiple perspectives and why they matter, with acceptable insight.
  • Needs Improvement (2): Applies content to self or the world, considering multiple perspectives and why they matter, with limited insight.
  • Below Expectations (1): Did not apply content to self or the world, considering multiple perspectives and why they matter.

Developed by CAWG Committee - October 2016. Approved by CAC - December 2016. Edited - March 2017.

B2. Laboratory Science Core Learning Outcomes (CLOs)

Students will:

  1. Demonstrate understanding of and literacy in the content and principles of a scientific discipline. (Criterion 1)
  2. Perform laboratory or field procedures that explore the content and principles of these disciplines. (Criterion 2)
  3. Carry out scientific procedures in a socially responsible manner. (Criterion 3)
  4. Accurately observe, record, analyze, and report data collected in the scientific laboratory or the field. (Criterion 2)

Appendix F: Expanded Methodology and Numbers for Assessment Process

In Spring 2017, the assessment time frame for D1 and B2, a total of 629 students were enrolled in 18 sections of 9 different D1 courses, and 916 students were enrolled in 18 sections of 12 different B2 courses. At the beginning of the semester, a call went out to all D1 and B2 faculty for student work products to be assessed. Corie Schwabenland Garcia, Academic Data and Assessment Analyst for the Office of Academic Effectiveness, was charged with collecting all of the student work products submitted by faculty. A total of 544 student work products were submitted for D1, and 1404 total products were submitted for B2. A summary of enrollment numbers and overall assessment metrics is presented in Table F1. For B2 work products, Corie then paired the submitted exams and lab reports for each student, resulting in 569 sets (1138 products total) of B2 work products; a sketch of this pairing step appears below. Approximately one-third of the submitted D1 work products (180) and a little over one-third of the B2 student work products (216 pairs) were randomly selected to be available for assessment. For all work products, student names and any identifying information were redacted, and any grading that appeared on the products was removed. Out of the available pool, six faculty raters assessed 134 D1 student work products (24.6% of the total D1 products submitted) on May 30, 2017, and four faculty raters assessed 91 pairs of B2 student work products (16% of the total B2 products submitted) on June 6, 2017. A reliability check was performed during each rating session by having a subset of work products evaluated by two faculty raters: for D1, 26 student work products (19.4%) were double-rated, and for B2, 13 products (14.3%) were double-rated. Those ratings were then used to perform an inter-rater reliability check, as detailed in the report.
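
The pairing step described above, matching each B2 student's exam with their lab report and dropping incomplete sets, might look like the following sketch. The record fields and matching rule are our assumptions for illustration, not the analyst's actual procedure.

```python
# Hypothetical sketch of pairing each B2 student's exam with their lab report.
from collections import defaultdict

def pair_b2_products(records):
    """records: dicts with hypothetical 'student_id' and 'kind' fields."""
    by_student = defaultdict(dict)
    for rec in records:
        by_student[rec["student_id"]][rec["kind"]] = rec
    # Keep only complete sets; a lone exam or lab report cannot be assessed
    # as a pair, which is one way submitted products drop out of the pool.
    return [(p["exam"], p["lab_report"]) for p in by_student.values()
            if "exam" in p and "lab_report" in p]

records = [
    {"student_id": "s1", "kind": "exam"},
    {"student_id": "s1", "kind": "lab_report"},
    {"student_id": "s2", "kind": "exam"},  # no lab report: dropped
]
print(len(pair_b2_products(records)))  # 1 complete pair
```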

Table F1. Core D1 and B2 assessment metrics for student work products and faculty compliance (Spring 2017). Student work products were requested for every student enrolled in Core B2 and D1 courses; two work products were requested for each B2 student, for a potential 1832 B2 products.

Area    Student Enrollment          Work Products Submitted    % Compliance    Assessable Work Products    Revised % Compliance
D1      629                         544                        86.8%           544                         86.8%
B2      916 (2 products/student)    1404                       76.6%           1138                        62.1%