Philosophy and process
Unanimously updated and reaffirmed by LIS faculty 11 January 2022
Overview: Not all of our strengths can be captured in numbers. For program-level assessment, we prefer an approach where many voices can be heard and considered, over a longer period of time than a single semester. When issues arise with a course, instructor or student, those are addressed individually by the LIS Program Director. When issues persist and recur across multiple courses and semesters, then we feel we are justified in claiming an area of strength, or in devoting resources to addressing areas where improvement is needed.
Lived experience: The most organic form of assessment is our lived experience, where we enact our values. While it would be inappropriate to attempt to quantify these experiences, our graduating student surveys tell us that the sense of community within the LIS Program is by far what our students value most. As part of our teaching, research, service and informal interactions, we share our experiences throughout the year, documented in meeting minutes and year-end reflections.
Critical refusal: While acknowledging the usefulness of assessment metrics for some purposes, our philosophy also encompasses critical refusal of harmful data regimes, as summarized in the Feminist Manifest-No.
Course evaluations: As a formative assessment of student learning and faculty instruction, each student evaluates each course in terms of content, workload, delivery and assessment methods, and instructor effectiveness. Course evaluation data are collected by the Program Coordinator and reviewed by the Director each semester. Instructors whose evaluations fall below 4/5 meet with the Director to debrief. Adjuncts who consistently fail to reach the 4/5 standard, or who receive very low ratings, are not invited back to teach. Course evaluation data, viewed with an appropriately critical eye, are used to assess and evolve individual teaching and the curriculum as a whole.
Course SLO assessments: We are in the process of reviewing our course SLO assessments. Previously, as a formative assessment of student achievement of program SLOs, each instructor identified the number of students who exceeded, met, approached or did not meet the standards associated with the primary SLO of the course via a major assignment, or a set of related assignments. This was thought to be a more direct, formative measure of SLO mastery than an overall course grade, but the results did not help faculty answer relevant questions (e.g., are students improving SLO mastery over time?). This is one example of how we practice critical refusal: we are trying to make SLO assessment "… intelligible, tangible, and controllable," following the Feminist Manifest-No.
Culminating experience assessments: Students completing the ePortfolio produce an artifact and reflective essay associated with each SLO, which is assessed on a Pass/Revise scale by two faculty members. Results are aggregated by the seminar instructor as a summative measure of student achievement by SLO. We use this and other data to revise both the curriculum and the ePortfolio requirements. Thesis students are assessed on specific elements of their research, including their problem statement, literature review, method, results, and the quality of their written thesis and oral defense; the evaluation form can be found on p. 7 of our thesis policy and FAQ.
Graduating student surveys: A summative assessment of the overall student experience, given each semester to graduating students. Response trends are reviewed by the Director and brought to faculty for potential programmatic changes as appropriate.
Alumni surveys: A reflective assessment of the student experience and of professional skills needed and acquired, administered every 5 years. Response trends are reviewed by the Director and brought to faculty for potential programmatic changes as appropriate.
Employer surveys: A merged assessment of UHM LIS alumni skills, job performance and needed skills within each employer organization, administered every 5 years. Response trends are reviewed by the Director and brought to faculty for potential programmatic changes as appropriate.
Committee work and event documentation: Each faculty member leads one or more standing committees, task forces or initiatives, and most advise one or more student organizations. As evidence of student experience, stakeholder engagement and faculty service, the work of committees and student organizations is documented in announcements, event documentation, faculty meeting minutes and faculty CVs.
Year-end meetings: The faculty year-end reflection meetings are under review. Initially, to contextualize assessment data and to systematically collect and compile stories, reflections and other issues, we imagined the LIS Associate Director would meet with each faculty member, either in person or virtually, at the end of the academic year, to review and elicit their reflections on:
- Opportunities or challenges with their courses for formative assessment of student achievement.
- Their assessment of the performance of the LIS Director and Program Coordinator.
However, year-end meetings were suspended in 2020 due to the pandemic and were also not held in 2021, apart from the LIS Director assessment. Our goal is a lightweight assessment process that invites qualitative reflection on student achievement over the previous academic year (for example, by identifying areas of strength or areas where students struggled), allows those reflections to be shared among faculty and staff, and integrates them into students' subsequent courses.
Retention and persistence
According to the Mānoa Institutional Research Office (MIRO), student retention and persistence have been above 90% every year from 2005 to 2020, ranging from a high of 99% in 2012 to a low of 91.5% in 2019.
Time to degree
From 2006 to 2020 (405 graduates), the mean time to degree was 2.29 years.
In our 2018 Alumni survey, 60 graduates were contacted and 21 responded (35%). 81% of respondents reported that they were working in an LIS-related professional position within 12 months of graduation, primarily in academic (29%), public (29%), archives (14%), special (14%) or school (10%) libraries. Most alumni (62%) work in Hawaii, with 24% working on the US mainland.
Additional job placement data are available in the 2021 Library Journal Placements and Salaries Survey.
Job performance and advancement
We survey employers of our graduates every 5 years, most recently in 2019. We ask employers to indicate the extent to which UH LIS graduates are able to demonstrate skills in three broad areas. Over 70% of respondents agreed or strongly agreed with all three of these statements about our graduates' abilities:
- Provide information services for a range of users’ needs
- Practice ethical responsibilities of the profession in providing services and resources
- Manage and work effectively in collaborative problem solving and team projects
Respondents also indicated that UHM LIS graduates demonstrate consistent strengths in the following:
- Inclusion of indigenous and local knowledge to inform library practices
- Passion for the field and a deep interest in serving patrons
- Critical thinking and ability to analyze data
- Reference services and information retrieval
- Web design and digital preservation
- Community engagement and customer service skills
- Connections with professional communities
- Ability to meet a range of user needs
- Work ethic
- Overall professionalism
Program and outcomes assessment
Assessment reports submitted to the Mānoa Assessment Office, and previously to the Vice Chancellor for Academic Affairs and Graduate Division, are also available for review. Note: no assessment was conducted in 2019 due to the pandemic, and reporting then moved to a two-year cycle:
- Assessment Report 2022 (available early 2023)
- Assessment Report 2020
- Assessment Report 2018
- Assessment Report 2017
- Assessment Report 2016
- Assessment Report 2015
- Assessment Report 2014
- Assessment Report 2013
- Assessment Report 2012
- Assessment Report 2011
- Assessment Report 2010
- Assessment Report 2009