...

It is worth noting that the SACSCOC principles do not, in this case, require that administrative departments provide evidence of seeking improvement. Administrative support services may find success in articulating standards of service and consistently achieving them.

See Appendix 1 for lists of EMU’s academic programs and services and its administrative departments as identified for PACE assessment purposes.

Assessment Timeline and Data Collection Cycles

...

  • Fall Semester - Collection of data from fall courses/activities as per program/department assessment cycle requirements; survey administration per university survey calendar
  • Spring Semester - Collection of data from spring courses/activities as per program/department assessment cycle requirements; survey administration per university survey calendar
  • April - September - Window for data analysis and outcomes assessment reporting in SPOL
  • Summer - Collection of data from summer courses/activities as needed
  • August - September - Develop assessment follow-up planning objectives for the upcoming year to seek improvement in student learning on the basis of assessment findings

Special Considerations

It is important to note that academic programs and services and administrative departments need not collect and analyze data for each of their outcomes every year. A program/department may instead develop a multi-year (ideally 2-4 year) cycle over which it assesses each of its outcomes. Such a cycle allows data collection over multiple years and lets programs/departments give more focused attention to a subset of their outcomes in any given year. Beginning with the rollout of the updated academic program review cycle in 2020-21, academic programs are expected to have collected and analyzed data for each student learning outcome at least twice between program reviews. See below for further details on program review.

Further, to ensure the sustainability and validity of assessment, small academic programs may extend assessment data collection over several years in order to achieve sample sizes adequate to support analysis. Programs that take this approach should collect data for all outcomes each year but may focus analysis on a subset of student learning outcomes in any given year.

PACE - Executive Reporting

...

In addition to the PACE process, all academic programs that do not undergo an external accreditation review are reviewed on a six-year cycle (adapted as needed to specific program considerations). See Appendix 2 for more details on the review process, and see the Program Review Cycle for details on when each program is reviewed. This comprehensive review is conducted by a faculty task force, is overseen by the provost council, and includes consideration of:

...

Evaluation Team: The evaluation team will typically consist of two external consultants from the discipline(s) of the program cluster to be reviewed and one internal consultant. Ideally, one external reviewer will be from an institution comparable to EMU, and the second will be from an “aspirational” institution. The internal consultant will typically be a tenured faculty member from a different program cluster; the internal consultant will be offered a measurable reduction in workload during the academic year in which the review is conducted (such as release from a committee or a dean’s hour). The program cluster should contact nominees for the evaluation team prior to submitting names to determine interest and availability. The program cluster should submit the names of at least three potential external reviewers, along with vitae, to the dean, with additional comments or a ranking if desired.

Outline of

...

Academic Program Self-study Report

A.  Academic Program

  1. Describe the program.
    1. List the majors and minors (if undergraduate) or programs and certificates (if graduate).
    2. Provide the program cluster’s mission statement.
    3. Describe how the program supports the mission of the university.
  2. For undergraduate programs, describe how the majors support the liberal arts within the university.
    1. Note how courses in program cluster majors are interwoven with general education requirements.
    2. Discuss courses offered as “service courses” for the liberal arts curriculum, and note the typical enrollments in these courses. [IR will provide summarized data for recent years regarding service course enrollments.]
    3. Describe how the program interacts with other majors.
  3. Describe any admission-to-program requirements for students.
  4. Provide a list of required courses for the majors/programs and note any recommended electives. 
  5. List the student learning outcomes, along with the measures and criteria/benchmarks, for each major or program and any minors or certificates that are not “miniature majors/programs.”
  6. Create or review and update the student learning outcomes, curriculum map, and assessment plan for the current curriculum. [IR can provide guidance on creating and revising SLOs, curriculum maps, and assessment plans.]
  7. Assess the general philosophy of the program in comparison to current practices in the discipline. 
  8. Reflect on the future potential of the program, as determined by external forces such as market demand.
  9. Analyze the instructional and informal environment in the program cluster. 
    1. Assess the amount and quality of contact between students and faculty. [IR will provide the most recent results of the Student Satisfaction Inventory (SSI) and/or Adult Student Priorities Survey (ASPS) for the program cluster’s majors/programs, including data on satisfaction with advising.]
    2. Describe how support, collaboration, and cooperation among students is encouraged.
    3. Assess how active learning experiences are encouraged.

...