Program Learning Outcomes Process and Procedures


This page describes the process and procedures for the annual reporting of program learning outcomes assessment at Ohio University. Frequently asked questions are answered at the bottom of the page.


Assessment Clearinghouse Background

The Assessment Clearinghouse was developed in 2016. Prior to its development, program learning outcomes assessment reporting resided within each academic college, department/school, or co-curricular unit and was stored on their respective websites, making it arduous to measure progress and promote good practice institution-wide. In collaboration with the Teaching, Learning, and Assessment Committee and the Assessment Liaisons, Institutional Effectiveness & Analytics (formerly Institutional Research) developed the Assessment Clearinghouse reporting model based on the National Institute for Learning Outcomes Assessment (NILOA) Transparency Framework. The NILOA framework was determined to be the best fit for OHIO because it offers best practice for program learning outcomes assessment as well as flexibility in applying a centralized reporting model to OHIO’s decentralized assessment processes.

The reporting of program outcomes assessment requires all academic and co-curricular programs to articulate student learning outcomes, assess them, and use the results from the assessments to improve student learning within the program.  In addition to supporting academic program review and the continuous improvement process, program learning outcomes assessment is required by the Higher Learning Commission (HLC), Ohio University’s institutional accreditor.  This requirement is set forth in HLC’s Criteria for Accreditation as follows:

Criterion 4. Teaching and Learning: Evaluation and Improvement

The institution demonstrates responsibility for the quality of its educational programs, learning environments, and support services, and it evaluates their effectiveness for student learning through processes designed to promote continuous improvement.

4.B. The institution engages in ongoing assessment of student learning as part of its commitment to the educational outcomes of its students.

  1. The institution has effective processes for assessment of student learning and for achievement of learning goals in academic and cocurricular offerings.
  2. The institution uses the information gained from assessment to improve student learning.
  3. The institution’s processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty, instructional and other relevant staff members.

Assessment Liaisons

Prior to the development of the Assessment Clearinghouse, the Teaching, Learning, and Assessment Committee worked with each college to identify an Assessment Liaison. The Assessment Liaison is the person within each college or school who facilitates a working relationship between Institutional Effectiveness & Analytics’ (IEA) Assessment Clearinghouse reporting process and the academic programs within their college. Initially, the Assessment Liaisons provided vital feedback to IEA in the development of the Assessment Clearinghouse structure and reporting processes. After implementation, the Assessment Liaisons continue to provide paths for communication and feedback, assist with the collection of annual assessment reports, and share good assessment practices with other Liaisons and academic programs within their colleges and departments/schools.

Current Liaisons

College of Arts & Sciences: Brian McCarthy, Senior Associate Dean

College of Business:

  • Jim Strode, Assessment Liaison
  • Rebecca Dingus, Undergraduate AOL Coordinator
  • Barry Hettler, Graduate AOL Coordinator
  • Ting Ting Lin, Accounting AOL Coordinator
  • Kristin Hepworth, Undergraduate Certificate Assessment Coordinator
  • Janna Chimeli, Graduate Certificates & Graduate Program Concentrations Assessment Coordinator
  • Kelley Walton, Sports Management Graduate Assessment Coordinator
  • Melissa Davies, Sports Management Undergraduate Assessment Coordinator

College of Fine Arts: David LaPalombara, Professor, School of Art & Design

Graduate College: Greg Newton, Interim Associate Dean

College of Health Sciences & Professions: Kimberly Ephlin, Assistant Dean for Academic Affairs

Heritage College of Osteopathic Medicine: Mary Wurm-Schaar, Director, Institutional Assessment and Accreditation

Honors Tutorial College: Mary Kate Hurley, Interim Associate Dean

Center for International Studies: Cat Cutcher, Associate Director, Center for International Studies

Patton College of Education: Zacharay Schabel, Director of Assessment & Academic Improvement

Russ College of Engineering and Technology: David Juedes, Associate Dean of Academics

Scripps College of Communication: Aimee Edmondson, Associate Dean for Graduate Studies, Research and Creative Activity

University College: Julie Cohara, Director of Degree Programs

Voinovich School of Leadership and Public Service: Joseph Wakeman, Manager of Enrollment and Student Success

Academic Programs Required to Submit Annual Reports

For Assessment Clearinghouse (program learning outcomes assessment) purposes, an academic program is a credit-bearing, organized course of study that results in a degree or certificate.

All undergraduate, graduate, professional degree programs and certificate programs are required to participate.

Within degree programs, the focus of learning outcomes assessment is the major.  Minors offered to degree-seeking students may be assessed separately at the discretion of the dean or chair, but the results do not need to be reported to the Assessment Clearinghouse at this time.

A program with multiple degrees at the same level and a common core curriculum (e.g., BA and BS in Biology) may submit one assessment plan and report but should include some learning outcomes and assessment measures unique to each degree.

Programs with concentrations or tracks may submit one assessment plan and report but are required to include at least one learning outcome and assessment measure unique to each concentration or track.

Programs with residential and distance education versions of the same degree may submit joint or separate plans and reports; whichever approach is taken, assessment reports should include comparisons between the knowledge and skills attained by graduates of the different modes of delivery.

Cocurricular Programs Required to Submit Annual Reports

Co-curricular Program Definition

Coming soon! 

Cocurricular Checklist:

Coming soon!

Assessment Clearinghouse Reporting Components

The Assessment Clearinghouse reporting structure is aligned to the National Institute for Learning Outcomes Assessment’s (NILOA) Transparency Framework and has four required reporting components:

  1. Program Learning Outcomes

Program learning outcomes describe what a student should know or be able to do upon completion of the program. The number of student learning outcomes for each program is the decision of the program faculty, but it is recommended to focus on identifying the 3-5 most important. Factors that may affect the number of student learning outcomes for each program include the discipline, subject matter, degree level, the credential the student is being prepared for, and professional or disciplinary/programmatic accreditation requirements. Regardless of the number, program learning outcomes should:

  • focus on what students can demonstrate that they have learned appropriate to the degree or certificate sought,
  • be observable and measurable,
  • include what specific knowledge or skills will be obtained by students upon completion of the program, and
  • use action verbs to define the outcomes that students will demonstrate.
  2. Assessment Plans

An assessment plan describes the program’s process for carrying out its assessment of each program learning outcome. Assessment plans typically contain information about curriculum mapping, the assessment methods and tools that will be used to assess each program learning outcome, the frequency with which the program will measure each outcome, and the student achievement goals/targets for each outcome. It is also recommended that a copy of each assessment tool that is developed, e.g., a rubric, be included in the assessment plan.

Curriculum mapping is the process by which program faculty explicitly and intentionally align program learning outcomes to the program’s curriculum, identifying where within the curriculum students are given opportunities to demonstrate achievement of each learning outcome.

  3. Evidence of Student Learning

The Evidence of Student Learning section in the annual report should describe the aggregated results for each of the assessments conducted during the prior academic year. The program should indicate whether each achievement goal/target was met and then summarize the quantitative data. Qualitative results can be reported by describing any themes identified. Individual student results should never be provided in an annual report.

  4. Use of Student Learning Evidence

The Use of Student Learning Evidence section is the most important section in the annual report. Information in this section should document how the assessment results reported in the Evidence of Student Learning section have been reviewed by faculty and used to improve the program. It should describe actions taken or improvements initiated to advance student learning within the program, and those actions should correspond to the results and related issues reported. If the assessment results are recent, describe any action plans for using the results.

Reporting Requirements, Cycle, and Deadlines

Academic and cocurricular programs, including certificate programs, are required to report all four components (1. Program Learning Outcomes, 2. Assessment Plans, 3. Evidence of Student Learning, and 4. Use of Student Learning Evidence) to the Assessment Clearinghouse annually at the close of fall semester, typically mid-December, covering the program’s assessment activities during the prior academic year.

For existing academic programs, the program learning outcomes and assessment plans components do not need to be reported annually unless these reporting components have been updated from the prior reporting year. If the program learning outcomes and assessment plans have not been updated from the prior year, then only the evidence of student learning and use of student learning evidence components would need to be reported. For example, reporting of the evidence of student learning and use of student learning evidence that was collected during the 2022-23 academic year would be due to the Assessment Clearinghouse on December 15, 2023.

A new program must report its program learning outcomes and assessment plan on the Assessment Clearinghouse reporting deadline for the academic year in which the program first appears in the University’s Undergraduate or Graduate Catalog. Evidence of student learning and use of student learning evidence components would be due on the Assessment Clearinghouse reporting deadline for the following academic year.

For example, for a new academic program first included in the University Catalog in 2023-24:

  • Program Learning Outcomes and Assessment Plan due: December 13, 2024
  • Evidence of Student Learning and Use of Student Learning Evidence due: December 12, 2025

Upcoming reporting cycles and due dates are as follows:

2021-22 academic year – December 16, 2022

2022-23 academic year – December 15, 2023

2023-24 academic year – December 13, 2024

2024-25 academic year – December 12, 2025

For More Information or Assistance

For questions or assistance with the Assessment Clearinghouse Program Outcomes Assessment process or with completing assessment reports, please contact:

Joni Wadley, Senior Director for Institutional Effectiveness, Institutional Effectiveness & Analytics

schallej@ohio.edu

For questions or assistance with developing or revising program learning outcomes, assessment plans, or assessment methods, please contact:

Wendy Adams, Associate Director for Assessment, Center for Teaching, Learning and Assessment

adamsw1@ohio.edu

Additional FAQs

Why is it important to do program learning outcomes assessment?

The purpose of program learning outcomes assessment is continuous improvement. Programs can use learning outcomes assessment results to inform curricular changes, pedagogical change or professional development initiatives, program policy changes, or advising practices. Assessment results might also help identify or address unmet student needs, improve the assessment plan itself, or further explore questions raised by the data. Programs should also use the results of program learning outcomes assessment to demonstrate the need for any resource allocations that would help the program meet its assessment targets. The value of the assessment process is in yielding results that faculty find valuable and in using those results to improve learning.

To whom do we submit our annual assessment reports?

Each college has a designated Assessment Liaison charged with collecting and evaluating assessment reports. Please see the Assessment Liaisons information above.

How do we know what to assess within a program?

Each program should have its own program learning outcomes and assessment plan. Programs must assess whether students are achieving each program learning outcome. If you need copies of your unit’s assessment plans, they can be found on the Assessment Clearinghouse website. For additional assistance, please contact Wendy Adams (adamsw1@ohio.edu), Associate Director for Assessment in the Center for Teaching, Learning and Assessment.

How detailed or exhaustive must program assessment plans be?

Plans and assessments scale with the program’s type and scope. For instance, a certificate program assessment plan may contain only two program learning outcomes (PLOs), while an undergraduate degree program’s plan would generally require more. Similarly, a minor program assessment plan may be more generalized than a BA or BS plan and may have fewer PLOs. Graduate program assessment plans are generally similar in size to undergraduate program assessment plans but will differ in their PLOs.

Can programs use the same learning outcomes?

Each program should have its own unique learning outcomes, but some learning outcomes may be shared by programs. Programs with concentrations or tracks may share some learning outcomes, but at least one learning outcome and direct assessment measure should be unique to each concentration or track.

Can we assess programs together or does there need to be a separate assessment for every program?

Each program should have its own unique learning outcomes and therefore its own assessment plan, evidence of student learning, and use of student learning evidence. For programs with concentrations or tracks that share learning outcomes, the report may be combined, but the results for both the shared and the unique learning outcomes should be disaggregated by concentration or track.

What students should we include when performing annual program learning outcomes assessment?

Program learning outcomes assessments are performed on students who are currently enrolled in the program, or in courses tied to specific program learning outcomes (e.g., capstone courses), during the relevant academic year. If no students were enrolled in a program during the academic year under consideration, document that information in your annual assessment report under the evidence of student learning and use of student learning evidence sections.

What information should be included in the Use of Student Learning Evidence reporting component?

Programs can use learning outcomes assessment results to inform curricular changes, pedagogical change or professional development initiatives, program policy changes, or advising practices. Documenting the results to improve learning is important in the context of providing evidence to HLC to demonstrate that programs are meeting each of the core components for Criterion 4: Teaching and Learning: Evaluation and Improvement.  Programs should consider the following as ways to make it easier to tell the story of how the program is improving based on the assessment results:

  • Include the rationale for a decision that you are making. It is helpful to include who was involved in the decision, what data or information was considered, and what was decided.
  • When new assessment results are finalized, share them across your department/unit and discuss and document relevant findings.
  • Use the assessment results to record your program decisions AND to review progress on what you decided to implement the prior year.
  • Programs should document the use of the program learning outcomes assessment results to request any resource allocations that would help the program meet its assessment targets.

Where can I find example assessment plans and annual assessment results?

All program learning outcomes assessment plans and results can be found on the Assessment Clearinghouse’s Program Learning Outcomes Assessment webpage. Keep in mind that some assessment plans and results will be much more detailed than others because of highly specific disciplinary accreditation standards/requirements for those programs. Please reach out to Joni Wadley (schallej@ohio.edu), Senior Director for Institutional Effectiveness, for examples to help meet your program’s specific needs.

Can we use voluntary student satisfaction or opinion polls as our evidence?

The focus of an assessment plan should be direct assessment of student learning: assessments of student work products (papers, performances, presentations, research, etc.) evaluated using a rubric or other set of agreed-upon standards to rate the level of achievement of the stated program learning outcomes. Student satisfaction surveys are indirect assessments. These surveys and other forms of indirect assessment, such as focus groups, self-assessments, and interviews, can generate important information from student perspectives and can supplement, validate, and help explain the results of direct assessment.

Other supporting measures that do not measure student learning directly, such as course evaluations, course grades, student outcomes (e.g., job placement rates), student retention and completion rates, course completion rates, and course D/F/W rates, can still provide useful feedback and inform program effectiveness processes. Often these types of measures are included in UCC’s academic program review process, but it may be useful for programs to review these data on an annual basis.

What if our assessments yield data that are too small to analyze?

Describe the data that you have, describe what they suggest, and point out limitations. If the number of students is too low, the program may consider collecting data annually but reporting aggregated results every other year by combining two years of student assessment results.