Program Review Glossary of Terms
Academic Charter: Articles 11 and 12 of BGSU’s Academic Charter mandate, respectively, that schools and academic departments conduct an evaluation of their programs, plans, and objectives at least once every five years. These Articles call for schools and academic departments to apply the highest possible standards in evaluating their own performance. Please visit the Academic Charter’s website.
Academic Programs: An academic program is defined as a program of study sharing common student learning goals and served by a specific group of faculty and support staff. An academic program may comprise degree programs, certificates, continuing education, professional development, and other related academic offerings, and may lead to a certificate or an associate, bachelor’s, master’s, or doctoral degree.
Academic Program External Review Findings Response Template: Programs (or clusters of programs) use this template to respond to findings by external reviewers. The template is submitted to the Dean/A-Dean and the Office of Institutional Effectiveness (OIE). The template comprises two sections. In the first, the program (or cluster of programs) copies each recommendation from the external reviewers and responds in agreement or disagreement, supported by a brief rationale and a note on any impact on the initially submitted Action Plan. In the second, the program (or cluster of programs) reflects briefly on the entire program review process, including gains and challenges, and responds to prompt questions by providing input.
Academic Unit Head: The primary leader of an academic program and all its associated degree programs, certificates, and other related academic offerings. The academic unit head may be locally identified as a department chair, director, or by some other title.
Accreditation: Accreditation is a voluntary process of higher education oversight that serves to assure the public of the institution’s quality and to promote continuous institutional improvement. BGSU is accredited by the Higher Learning Commission (HLC). In addition, many programs are accredited by disciplinary organizations; for example, the Nursing programs are accredited by the Commission on Collegiate Nursing Education (CCNE). These program-level accreditations are commonly referred to as specialized accreditation. Please visit the specialized accreditation website. See also: Specialized accreditation.
Action Plan: At BGSU, the Comprehensive Program Review process results in the development of a measurable, feasible, actionable Action Plan for continuous improvement of the program (or cluster of programs) within a specified timeframe. The program (or cluster of programs) annually reports progress made on the Action Plan which is vetted by program faculty and Dean/A-Dean. Please see the Program Review Handbook for guidelines and template.
Adequacy of Resources: Examples may include: Laboratory/computer facilities • Faculty offices • Classrooms • Support staff, number and qualifications • Enrollment capacity, etc.
Alignment: Relationship of the program (or programs) and its goals, objectives, outcomes, and strategic plan with the University’s broader mission, vision, strategic plan and initiatives. Course learning outcomes (or student learning outcomes) should also be aligned with program learning outcomes. At the course level, alignment is also understood as the agreement between a set of content standards and an assessment used to measure those standards. When strongly aligned, standards and assessments bring clarity to the education system by providing a coherent set of expectations for students and educators. Educational researchers distinguish between horizontal alignment and vertical alignment.
APS: Academic Performance Solutions (APS) is a decision-support platform that enables individuals across institutional departments to easily access data and peer benchmarks around course offerings, faculty workload, course completion rates, department-level costs, and other key performance indicators. For undergraduate programs, APS is the primary data source for Program Vitality Analysis (PVA) which serves as an annual program review.
Applied, Admitted and Enrolled: The number of first-time, first-year (or graduate) students indicating an intended major in the program who (1) completed an application, (2) were subsequently admitted to, and (3) ultimately enrolled at the University.
Assessment: Assessment is a systematic, integrated, collaborative, ongoing process emphasizing student learning. Assessment focuses on: establishing clear, measurable learning outcomes; providing learning opportunities for students to achieve those outcomes; assessing student learning by systematically gathering, analyzing, and interpreting evidence to determine how well student achievement matches expectations; and using resulting information (i.e., assessment results) to review and enhance student learning (Suskie, 2009).
Attempted SCH per Instructor: The ratio between the total number of attempted student credit hours (SCH) taught and the total full-time equivalent instructors. Attempted credits are weighted by the percent responsibility associated with each instructor per section.
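The weighting described above can be sketched as follows. This is an illustrative calculation only, not BGSU’s official computation; the field names (`attempted_credits`, `pct_responsibility`) are assumptions for the example.

```python
def attempted_sch_per_instructor(sections, total_fte_instructors):
    """Weighted attempted student credit hours divided by FTE instructors.

    Each section's attempted credits are weighted by the instructor's
    percent responsibility for that section (hypothetical field names).
    """
    weighted_sch = sum(
        s["attempted_credits"] * s["pct_responsibility"] for s in sections
    )
    return weighted_sch / total_fte_instructors

# Example: two sections, the second team-taught at 50% responsibility.
sections = [
    {"attempted_credits": 90, "pct_responsibility": 1.0},
    {"attempted_credits": 60, "pct_responsibility": 0.5},
]
ratio = attempted_sch_per_instructor(sections, total_fte_instructors=2.0)
# (90*1.0 + 60*0.5) / 2.0 = 60.0 attempted SCH per FTE instructor
```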
Authentic Evidence: Authentic evidence refers to assessment which measures a student’s ability to apply his or her knowledge in real world applications. See also: Direct Evidence, Indirect Evidence.
Average actual time to degree: The mean total number of years taken by students from time of first enrollment (first-time student or first-time transfer) through the time of graduation.
Average actual credits to degree: The mean number of credit hours completed by students at the time of graduation. This measure includes the total number of credits earned at the degree-granting institution as well as credit transferred in after the first semester at the institution.
Average number of graduate advisees: The estimated mean number of program majors advised per program faculty member, by semester.
Average thesis/dissertation advising load: The estimated mean number of graduate thesis/dissertation committees chaired per program faculty member.
Benchmark: This involves using comparisons to make meaning of empirical data. A benchmark is a point of reference, a standard for measuring, evaluating or assessing other things of the same type.
Bottleneck Courses: Courses with course-level fill rates of 90% or higher.
Capstone: The capstone is a culminating project or experience that generally takes place in the student’s final year of study and requires review, synthesis, and application of what has been learned over the course of the student's college experience. The result may be original research, an innovative design, an art exhibit, or a performance. The capstone can provide evidence of assessment of a range of outcomes.
Closing the Loop: This is an iterative, ongoing four-step process: 1) defining learning outcomes; 2) choosing a method or approach and using it to gather evidence of learning; 3) analyzing and interpreting the evidence; and 4) using this information to improve student learning. The cycle must be completed and repeated to see whether the changes have produced the desired result.
Co-Curricular Activities/Experience: Activities, experiences, and/or initiatives primarily existing outside of formal degree completion content, processes, and/or disciplines that support the achievement of the University Learning Outcomes (with emphasis on Integrative Learning, i.e., integrate, apply, and reflect, and on Engagement, i.e., personal and social responsibility and engaging with others). Please visit the Office of Academic Assessment’s website for information on co-curricular activities assessment.
Cohort: A specific group of students established for tracking purposes, such as first-time first-year students or transfer students entering in a specific semester, or doctoral students entering in a specific academic year.
Course Completion Rate: The percentage of attempted student credit hours that were earned. Only registered, gradable, and non-transfer coursework records are included.
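The definition above reduces to a single ratio. A minimal sketch, assuming credit-hour totals have already been filtered to registered, gradable, non-transfer records:

```python
def course_completion_rate(earned_sch, attempted_sch):
    """Percentage of attempted student credit hours that were earned."""
    if attempted_sch == 0:
        return 0.0
    return 100.0 * earned_sch / attempted_sch

# Example: 1,140 of 1,200 attempted credit hours were earned.
rate = course_completion_rate(earned_sch=1140, attempted_sch=1200)  # 95.0
```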
Comprehensive Program Review (CPR): The Comprehensive Program Review process provides faculty with an opportunity for long-range reflection on the quality, cohesiveness, and effectiveness of academic areas of study. Comprehensive Program Review constitutes a “big picture look” at programs’ achievements, challenges, potential for growth, and ongoing resource needs by reviewing multiple years of annual program reviews (APR). Comprehensive Program Review informs long-range planning, budgeting, and accreditation, and provides faculty and administrators with the opportunity to assess the relative value of academic programs in terms of viability, productivity, and quality. Several aspects comprise the Comprehensive Program Review process: • Academic Program Self-Study • Feedback from the Program Review Advisory Committee and/or Office of Institutional Effectiveness (OIE) • Review Team Site Visit & Report • Institutional Response, Planning, and Collaboration. At BGSU, Comprehensive Program Review occurs every 5 years at a minimum (results of cumulative annual program reviews or program vitality analysis may dictate otherwise).
Continuous Improvement: Continuous improvement is an applied science that emphasizes innovation, rapid and iterative cycle testing in the field, and scaling in order to generate learning about what changes produce improvements in particular contexts. The outcomes of each cycle inform the revision, development, and fine-tuning of practices. The purpose of continuous improvement is to help improve organizational processes, policies, and practices. Therefore, continuous improvement should be embedded in day-to-day work, in a systemized, organic, and formative way. At the heart of continuous improvement is the principle of Systems Thinking. Continuous improvement for systems change should: • address pressing problems • begin when new strategies, initiatives, and programs are being conceptualized • be an integral component of strategic improvement plans • take place at different levels of a system. Universities intentionally serve communities and make an impact on students’ lives when they achieve systems transformation in meaningful ways. A culture of continuous improvement emphasizes: Strategic visioning and planning • Problem identification and root cause analysis • Developing a theory of action • Identifying measures or KPIs for continuous improvement • Creating action plans for implementation cycles and data collection • Facilitating collaborative data study • Implementing innovations • Creating and maintaining a culture of collaboration and generating a sense of common purpose and ownership • Fostering a culture of critical reflection, inquiry, and problem solving • Facilitating communities of practice • Contributing to shared decision-making that ensures transparent involvement of multiple perspectives and stakeholders • Supporting systems alignment and cohesion of practices and policies (Source With Permission: Education Development Center, Guidebook and Toolkit, 2019).
Contribution to Institutional Mission/Strategic Plan/Values: A Unit/School or program’s contribution to BGSU’s Mission, Strategic Plan, and Values may be exemplified by its: Program mission/vision • Program distinctiveness • Centrality to institution • Relationship to other programs • Social benefits • Service to continuing education • Fit with strategic vision • Student demand • Employer/Market demand • Growth over time, etc.
Cluster: A cluster model is intended to support a culture of collaboration which extends beyond the program to share resources or support common standards in like programs across the University.
Credit Hour Production: Number of student credit hours generated by program faculty teaching in program courses over the academic year, encompassing summer, fall and spring.
Culture of Evidence: A culture wherein indicators of performance are regularly developed and data collected to inform decision-making, planning, and improvement. A culture of evidence is one in which decisions are based on facts, research, and systematic analysis. For example, in a culture of evidence, faculty are constantly asking questions about how their programs can better serve students’ learning goals, and program and curricular decisions are based on data and systematic observations. A culture of evidence relies on effective evidence. Effective evidence is: Relevant (i.e., reflecting the underlying concept of interest); Verifiable (i.e., documentable and replicable); Representative (i.e., typical of the underlying situation or condition); Cumulative (i.e., obtained from multiple sources using multiple methods); and Actionable (i.e., providing guidance for action and improvement).
Curricular Quality: Curricular quality may be exemplified by (but not limited to) any of the following: Planning processes • Quality control mechanisms • Learning outcomes • Requirements for degree • Congruence of courses with curricular goals • Course coordination • Prerequisite patterns • Balance between depth and breadth • Percentage of courses involving active learning • Uniformity across multiple course sections • Availability of electives • Advising procedures • Role in service courses • Use of adjunct faculty • Use of student portfolios, competency capstone courses • Curricular revision procedures, etc.
Curriculum Mapping: Process of systematically indexing, diagraming or aligning learning activities, student learning outcomes and assessment with the course and program goals or outcomes. Curriculum mapping entails both individual course mapping and program mapping. Like course mapping, program mapping can reveal gaps, redundancies, areas of misalignments, and areas of over-concentration which may result in the need for a curriculum revision for purposes of improving the overall coherence of a course of study and, by extension, its effectiveness.
Data: Information, facts, or numbers collected to be examined, considered, and used to help decision-making. There are two types of data: 1) Quantitative data can be expressed in numerical values, making it countable and amenable to statistical analysis; these data are also known as numerical data. 2) Qualitative (or categorical) data cannot be measured or counted in the form of numbers; qualitative data describes people’s perceptions. At BGSU, the Office of Institutional Research (OIR) serves as the official source of institutional quantitative data for external reporting. Please visit the Office of Institutional Research (OIR)’s website for available, updated institutional data dashboards.
Demographics: Demographics is the collection and analysis of general characteristics about groups of people and populations, such as age, gender, ethnic background, etc. The Office of Institutional Research (OIR) makes available demographics data dashboards for BGSU faculty, students, and staff.
Destination Program: A program that a student was not enrolled in during the previous fall term (e.g., a student enrolled in BA-Mathematics this fall term who will switch to BA-Biology next fall term; BA-Biology is the destination program).
Direct Cost per Student Credit Hour: The cost for an academic division to produce a single attempted credit hour, calculated by using the total direct costs divided by the total attempted credit hours produced.
Direct Evidence: Evidence gathered from a performance-based observation or sampling of student work. Example: using a rubric to evaluate the quality of student papers. Direct evidence may include locally- developed tests, performance appraisal, oral examinations, simulations, behavioral observations, portfolios, external examinations of student work, and other course activities assessed with rubrics. See also: Authentic Evidence, Indirect Evidence.
Diversity: Diversity is the range of human differences, including but not limited to race, ethnicity, gender, gender identity, sexual orientation, age, social class, physical ability or attributes, religious or ethical values system, national origin, and political beliefs, etc. Diversity involves multiple perspectives and the representation and recognition of people of different backgrounds and points of view in the various constituencies of the university: students, faculty, staff, and administration.
DFWU Rate: The DFWU rate is the percentage of students in a course or program who receive a D, F, or U (unsatisfactory) grade or who withdraw (W).
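As a hedged illustration, the rate can be computed from a list of course grades. The grade codes below are common conventions, not a statement of BGSU’s grading scheme:

```python
def dfwu_rate(grades):
    """Percentage of grades that are D, F, U, or W (withdrawal)."""
    dfwu = {"D", "F", "U", "W"}
    if not grades:
        return 0.0
    hits = sum(1 for g in grades if g in dfwu)
    return 100.0 * hits / len(grades)

# Example: 4 of 10 grades fall in the DFWU set.
grades = ["A", "B", "D", "F", "W", "C", "A", "B", "U", "A"]
rate = dfwu_rate(grades)  # 40.0
```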
Educational Effectiveness Indicators (EEIs): EEIs are also known as educational performance indicators and are a list of direct and indirect ways in which the University examines student learning.
External Review: The review of the program which is conducted by an outside expert in the field or discipline.
Efficiency: Efficiency can be determined by (but not limited to) any of the following: Trends in unit costs • Faculty/student FTE • Faculty/staff FTE • Student credit hours/faculty FTE • Revenues/student credit hours • Operating budget/faculty FTE • Research expenditures/faculty FTE, etc.
Executive Summary Template: The Executive Summary Template is completed by external reviewers and attached to their full report of findings. Both the Executive Summary Template and full report are sent electronically to the Program Review Provost Designee in the Office of Institutional Effectiveness (OIE). While the external reviewers’ full report is expected to be more detailed and actionable in nature for the direct benefit of the program faculty/coordinator, the Executive Summary Template provides a synopsis useful for the compilation of master reports and/or presentations.
Faculty Productivity: Examples of faculty productivity may include: Research funding • Faculty publications • Scholarly awards • National standing of program • Teaching loads • Student credit hours taught • Dispersion of faculty FTE • Theses advised, chaired • Students supervised • Service contributions • Academic outreach • Collaboration with other units/ programs, etc.
Fall-to-Fall Program Retention Rate: The ratio of the number of students enrolled in the selected program during the selected Fall term to the number of students who were enrolled in the selected program during the previous Fall term and did not graduate.
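A minimal sketch of this ratio, assuming sets of student IDs; the interpretation that the numerator counts students enrolled in both fall terms is an assumption for the example, not official policy:

```python
def fall_to_fall_retention(prev_fall, graduated, current_fall):
    """Percent of non-graduating prior-fall students enrolled again this fall.

    prev_fall, graduated, current_fall: sets of student IDs (assumed inputs).
    """
    eligible = prev_fall - graduated       # enrolled last fall, did not graduate
    if not eligible:
        return 0.0
    retained = eligible & current_fall     # still enrolled in the program
    return 100.0 * len(retained) / len(eligible)

# Example: 5 students last fall, 1 graduated, 3 of the remaining 4 returned.
prev_fall = {"s1", "s2", "s3", "s4", "s5"}
graduated = {"s5"}
current_fall = {"s1", "s2", "s3", "s6"}
rate = fall_to_fall_retention(prev_fall, graduated, current_fall)  # 75.0
```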
Fiscal Year: Funding year that includes July 1 through June 30. For example: FY 2018 refers to the fiscal year from July 1, 2017 through June 30, 2018.
Full-Time Equivalent (FTE) Student: 1 Undergraduate FTE = 15 credit hours; 1 Graduate FTE = 15 credit hours.
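The conversion above is a simple division; a sketch using the 15-credit-hour divisor stated in this glossary:

```python
# 1 FTE = 15 credit hours for both undergraduate and graduate students,
# per the definition above.
CREDITS_PER_FTE = 15

def student_fte(total_credit_hours):
    """Convert total enrolled credit hours to FTE students."""
    return total_credit_hours / CREDITS_PER_FTE

# Example: 450 enrolled credit hours correspond to 30.0 FTE students.
fte = student_fte(450)
```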
FTE Instructed Students (FTEIS): A measure of instruction offered and consumed, regardless of the majors of the students taught. Credits are divided by a full-time load to calculate full-time equivalency for courses taught in each discipline.
Full-Time Equivalent (FTE) Faculty: Each full-time faculty member is counted as one FTE. For part-time faculty, FTE is generally based on their HR appointment.
Full-Time Faculty/Staff: Faculty/staff employed 100% time; those on unpaid leave are excluded.
Full-Time Student: Undergraduates: Students registered for 12 or more credit hours at the census date; Graduate Students: Students registered for 8 or more credit hours at the census date.
Goals: Goals are achievable outcomes that are typically broad and long-term. Goals frame the “big picture” of general, abstract intentions and ambitions. See: Objectives.
Grade-Point Average (GPA): The sum of grade points a student has earned divided by the number of courses or credits taken. High school GPAs are sometimes weighted to award additional points for grades in advanced or honors courses.
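A credit-weighted version of this calculation can be sketched as follows. The 4.0-scale point values are a common convention and an assumption of the example, not BGSU-specific policy:

```python
# Conventional 4.0-scale grade points (illustrative assumption).
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(courses):
    """Credit-weighted GPA; courses is a list of (grade, credit_hours)."""
    total_points = sum(GRADE_POINTS[grade] * credits for grade, credits in courses)
    total_credits = sum(credits for _, credits in courses)
    if total_credits == 0:
        return 0.0
    return total_points / total_credits

# Example: A (3 cr), B (3 cr), C (4 cr) -> (12 + 9 + 8) / 10 = 2.9
value = gpa([("A", 3), ("B", 3), ("C", 4)])
```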
Graduation Rate: The percentage of students from an entering cohort who graduate within a specific time frame at the reporting institution. For example, a first-time, full-time cohort of students in a bachelor’s degree program would typically be assessed at 4, 5, and 6 years out.
Graduation Rate After 60 Institutional Credits: The percentage of students who graduated from the program within three years of attaining "junior status." "Junior status" is defined as having sixty cumulative credits from the institution by the selected academic year/term.
Graduation rate of entering cohort finishing within/outside the program (UG programs): The percentage of first time, full-time degree-seeking students entering with a declared major in the program who (1) graduate with that major, or (2) graduate in another major.
Graduation rate of finishing cohort starting within/outside the program (UG programs): The percentage of first time, full-time degree-seeking students with a last declared major in the program, who graduated with that major and either (1) entered with that major or (2) entered undeclared or with another major.
Graduation Survey: The Graduation Survey is reported by academic year (summer, fall, and spring). The data are gathered around the time of commencement and six months after commencement by the BGSU Office of Academic Assessment (OAA). Please visit the Office of Academic Assessment’s website.
Growth: Academic program growth may be exemplified by growing enrollments, maintaining relevancy in the curriculum, meeting employer demand, demonstrating an institutional Mission fit, and differentiating the institution. For undergraduate programs, growth opportunities may feature graduate outcomes by design, creative alternatives to traditional degrees, and varied experiences in the curriculum.
High-Impact Practices (HIPs): HIPs include first-year seminars, writing-intensive courses, collaborative assignments, undergraduate research, service learning, internships, capstones, and international programs.
ICA: Independent Contractor Agreement. External Reviewers who conduct program reviews for any of BGSU’s programs are external contractors. Independent Contractor Agreements are required for all individual contractors performing services at or for BGSU.
Indirect Evidence: Evidence that assesses the perceptions of students or faculty. This evidence may be collected through student surveys, questionnaires, focus groups, archival records, interviews, and other indirect methods. Example: using a survey to assess how students perceive the quality of your discipline’s writing curriculum. See also: Authentic Evidence, Direct Evidence.
Instructional FTE: Calculated for each instructional staff member, the ratio of credit hours taught to the full-time instructional workload. This metric does not take into account non-instructional responsibilities, like research, committee work or advising. This value is configured at the term level.
Instructional Workload: The number of credit hours taught by a full-time instructor at the institution.
IPEDS: The Integrated Postsecondary Education Data System (IPEDS) is a system of interrelated surveys conducted annually by the U.S. Department of Education’s National Center for Education Statistics (NCES). IPEDS consists of institution-level data that can be used to describe trends in post-secondary education at the institution, state, and/or national levels. IPEDS provides an annual Data Feedback Report that compares BGSU to its peer institutions across a number of indicators, based on the results of survey data. Please visit the Office of Institutional Research (OIR)’s website for available IPEDS reports.
Key Performance Indicators (KPIs): See Metrics.
Metrics: Metrics (or performance metrics) are measurable data that track achievement against an established business activity or process. Program metrics are standard measurements for monitoring, controlling, evaluating and benchmarking an ongoing program. Metrics measure performance or progress of strategic activities or processes within a specific department. Key Performance Indicators (KPIs) measure performance of goals or objectives that are applicable across departments. All KPIs are metrics but not all metrics are KPIs. KPIs are the metrics by which critical initiatives, objectives, or goals are evaluated. KPIs act as measurable benchmarks against defined goals. While KPIs measure progress toward specific goals, metrics are measurements of overall business health and they may be loosely tied to targeted objectives. See also: Key Performance Indicators.
Minimum Program Hours Required: The number of credit hours required in the major.
MoU: Memorandum of Understanding. The Memorandum of Understanding (MoU) serves as a mechanism for the program faculty, Program Review Coordinator, College Dean(s)/Associate Dean(s), OIE Provost Designee, and Provost to outline the primary focus or foci for the program review. The MoU is developed by the faculty and program review coordinator of the program (or cluster of programs) under review in collaboration with the Dean/A-Dean as a part of “Closing the Loop” and outlines the guidelines, expectations, and plans for program improvement over the next five years (or until the next comprehensive program review).
National Survey of Student Engagement (NSSE): This is a nationally normed, widely administered survey that asks students about their behaviors: how often they ask questions in class, use the library, consult a professor outside of class, and how many hours they study. NSSE surveys first-year and senior students to assess their levels of engagement and related information about their experience at their institution. The survey is not a direct measure of student learning. Please visit the Office of Academic Assessment’s website for recent NSSE data and/or reports.
No-Conflict of Interest Form: External reviewers who conduct program reviews for any of BGSU’s programs must fill out, sign, and return a No-Conflict of Interest form, kept on file in the Office of Institutional Effectiveness. The attestation, vetted by BGSU’s General Counsel, exists to ensure an objective, diligent, and unbiased review of self-studies.
Number of Graduates: The number of students graduating with a given major during an academic year. An academic year includes August, December, and May graduation dates.
Number of Enrollments (or Program Enrollment): Unduplicated headcount enrollments (majors) in the selected program over the academic year, including summer, fall and spring.
Objectives: Objectives are measurable, specific local actions a program or unit/school needs to implement to meet a broader University or College goal. Objectives are generally formulated for the short-term. See: Goals.
Office of Academic Assessment (OAA): The Office of Academic Assessment (OAA) was created in 2013 as an infrastructure for academic assessment at BGSU. OAA provides systematic support for the assessment of University, general education (BGP), and programmatic learning outcomes, and coordinates the collection and distribution of institutional and programmatic assessment data. Please visit the Office of Academic Assessment’s website.
Office of Institutional Research (OIR): The Office of Institutional Research (OIR) serves as the official source of institutional data for external reporting. OIR provides data analytics, intuitive dashboards and information that supports the decision-making process, complies with reporting requirements of external agencies, and responds to ad hoc requests for information. Please visit the Office of Institutional Research’s website. Data dashboards are primary data sources for comprehensive program reviews.
Pedagogical Quality: Pedagogical quality may be exemplified by (but not limited to) any of the following: Process for evaluation of teaching and advising • Engagement in collaborative teaching • Class size • Pedagogical innovation • Characteristics of course syllabi • Strategies for promoting active learning • Procedures for setting academic standards • Adoption of technology, etc.
Peer Review: A process of evaluation of programs/self-studies by a group of experts external to BGSU who scrutinize programs/self-studies and provide feedback against best practices for the purpose of continuous improvement.
Persistence: See Retention.
PRAC: Program Review Advisory Committee. The Program Review Advisory Committee (PRAC) aids the Office of Institutional Effectiveness (OIE) in fostering a culture of program and department review and a culture of evidence by developing, refining and supporting its processes, resources and activities.
Prerequisite: A course students must complete before taking a more advanced course in the discipline.
Prioritization: Academic program prioritization is the activity or process in which an academic institution assesses and prioritizes its programs for the purpose of more strategically allocating its funding and resources. The task of program prioritization is a data-driven process that helps identify and recommend possible changes to ensure that programs are aligned with future enrollment trends and fiscal realities.
Program Learning Outcomes (PLOs): Program Learning Outcomes are broad statements of the skills, competencies, and “big ideas” students should be able to articulate, put into action, or utilize (theoretically or pragmatically) after the completion of a degree or certificate.
Program Review Coordinator (College/School/Unit-based): Individual who leads the program review process for a specific program (or cluster of programs) within a specific school or College. Please see the Program Review Handbook for a description of roles and responsibilities.
Program Review Process at BGSU: At BGSU, the program review process comprises 8 steps: 1) Data collection, 2) SWOT analysis and approval of Memorandum of Understanding, 3) Self-study composition, 4) Approval of external reviewers, 5) External review visit and report, 6) Submission of revised action plan, 7) Action plan implementation, 8) Yearly progress report. Please visit the Program Review Step-by-Step Infographic’s website.
Program Review Provost Designee: Individual from the Office of Institutional Effectiveness (OIE) who facilitates initial program review meetings with Dean/A-Deans, College/School-based program review coordinator and faculty, serves as a resource throughout the program review process, troubleshoots as needed, and coordinates policies and procedures for program review at BGSU.
Program Vitality Analysis (PVA): The Program Vitality Analysis (PVA) includes analyses of enrollment, migration, retention, and completion data. The Dean/A-Dean in consultation with the Provost determine the organization of the Program Vitality Analysis (PVA). Specializations, concentrations, or other sub-divisions within an academic program may be included in a single PVA report.
Provost/Senior Vice President for Academic Affairs: Per Article 6 of the Academic Charter, the Provost has the overall responsibility for the operation and development of the academic areas within the University insofar as this responsibility is delegated by the President and the Board of Trustees through its Bylaws. As educational leader for the faculty and the administrators of academic areas, the Provost serves in planning, developing, and maintaining the quality of all the instructional and research programs, institutes, and centers of the University, including oversight of a systematic process of review and academic policies, procedures, guidelines, and regulations. The Provost works to ensure cohesiveness and high quality across the University and promote growth.
Retention (Persistence) Rate: A measure of the rate at which students persist in their educational program, expressed as a percentage. This generally represents the percentage of first-time first-year students in a given cohort who enrolled from fall semester to fall semester.
Rubrics: an evaluation tool or set of guidelines used to promote the consistent application of learning expectations, learning objectives, or learning standards in the classroom, or to measure their attainment against a consistent set of criteria.
SAAC Forms: Forms associated with the Student Achievement Assessment Committee (SAAC). Please visit the Office of Academic Assessment’s website for descriptions of the short-form and long-form purposes.
Section Fill-Rate: The proportion of total enrollment to set class capacity for each course section at the time of the last posted enrollment date.
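As an arithmetic sketch of this definition (the `section_fill_rate` helper and the enrollment figures are hypothetical):

```python
def section_fill_rate(enrolled, capacity):
    """Proportion of seats filled in one course section at the last posted enrollment date."""
    if capacity <= 0:
        raise ValueError("capacity must be positive")
    return enrolled / capacity

# Hypothetical section: 27 students enrolled against a set capacity of 30.
fill = section_fill_rate(27, 30)
# fill == 0.9
```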
Self-Study: The self-study document is both a data-based description and an analysis of important aspects of an academic program under review. The self-study serves to address and document progress on specific areas and/or questions from an initial Memorandum of Understanding (see MoU) by providing salient evidence and pertinent data analyses. Further, the self-study includes findings from a Strengths/Weaknesses/Opportunities/Threats (SWOT) analysis and sets forth a 5-year action plan informed by the evidence and data reviewed. The self-study (not including the evidence and Action Plan) should not exceed 50 pages, should stand alone as a document, and should follow the guidelines and format established in the Program Review Guidebook. To present its arguments concisely, the program (or programs) under review should draw from the data and other documents or evidence provided in appendices. The program (or cluster of programs) under review should clearly identify the source of any data provided in the narrative. The self-study is reviewed by the College Dean, the Program Review Advisory Committee (PRAC), and/or the Office of Institutional Effectiveness (OIE). OIE issues a Self-Study Approval Status letter (SSASL) notifying the program (or programs) under review of needed edits and/or recommendations to enhance the quality of the self-study prior to its dissemination to external reviewers. Following review of the self-study and a site visit by the external reviewers, faculty of the program (or cluster of programs) under review prepare a response to the external review findings and edit the action plan (as applicable) for resubmission to and approval by the College Dean.
Service Majors: Students who, in a given term or academic year, take a course offered by a college or department not associated with their declared major.
Specialized Accreditation: See: Accreditation.
Student Credit Hour (SCH): A credit hour is the unit of measurement used to indicate the amount of instructional and learning time required to achieve the student learning outcomes of a college-level course. See also: Attempted SCH per Instructor.
Student Learning Outcomes (SLOs): Student Learning Outcomes are statements that specify what students will know, be able to do, or be able to demonstrate when they have completed a course. SLOs specify an action by the student that must be observable, measurable, and able to be demonstrated. While relating to the PLOs, Student Learning Outcomes (SLOs) should specifically define what students should be able to do upon completion of the course. SLOs are the basis for selecting the course materials, activities, assignments, and assessments, and are generally shared with students in course syllabi. Student Learning Outcomes encompass both direct measures (e.g., evidence of mastery of generic skills, student achievements, accomplishment of learning outcomes, performance in capstone projects, and performance on licensing/certification exams, standardized tests, etc.) and indirect measures (e.g., processes for evaluating learning, student cognitive development, student satisfaction, student placement, employer satisfaction, alumni satisfaction, etc.).
Student Migration: Migration into a program is the number of students enrolled in the selected program during the fall of the selected academic year who were not enrolled in the program during the prior fall term; migration out of a program is the number of students who were enrolled in the selected program during the fall of the prior year but are not enrolled in the program during the fall term of the selected academic year.
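The two migration counts amount to set differences between the program's rosters in two consecutive fall terms. A minimal sketch with a hypothetical `migration` helper and made-up rosters:

```python
def migration(current_fall, prior_fall):
    """Counts of students migrating into and out of a program between two fall terms.

    current_fall -- set of student IDs enrolled in the program this fall
    prior_fall   -- set of student IDs enrolled in the program the prior fall
    """
    migrated_in = current_fall - prior_fall   # enrolled now, not the prior fall
    migrated_out = prior_fall - current_fall  # enrolled the prior fall, not now
    return len(migrated_in), len(migrated_out)

# Hypothetical rosters: "a" is new to the program; "d" and "e" have left it.
in_count, out_count = migration({"a", "b", "c"}, {"b", "c", "d", "e"})
# in_count == 1, out_count == 2
```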
Student Productivity: Student productivity may be exemplified by (but not limited to) any of the following: enrollment patterns, demands on students, student effort, retention/graduation rates, degrees awarded, time to degree, and student involvement in program activities.
Student Quality: Student quality may be demonstrated by (but not limited to) any of the following: recruitment strategies, entrance exam scores, acceptance ratio, monetary support, and demographic diversity.
Student Success (or Success of Program Graduates): The number and percentage of students graduating in a given year who sought and attained employment or admission to graduate/professional school within one year after graduation.
SWOT: Strengths, weaknesses, opportunities and threats analysis.
Time-to-Degree: The average time required to complete an undergraduate or graduate program of study in terms of years to graduation.
Trends: A trend is the general direction or pattern of a data set over time. Trends are recognizable upward or downward shifts which may inform decision-making, predictions, or planning.
Updated: 11/17/2022 02:57PM