Program Review Guidebook for Academic Programs (Updated Fall 2022)
Bowling Green State University (BGSU) is committed to the comprehensive review of all academic programs as an essential part of improving educational programs for effective student learning, continuous improvement, and data-driven strategic planning.
Program review is intended to be helpful, meaningful, collaborative, and useful to program faculty. The primary goal of program review is to gather feedback and engage in a comprehensive analysis and evaluation of programs to inform strategic action planning. Program review is therefore rooted in the use of evidence and data to analyze and evaluate specific, critical questions and/or issues identified by programs. The results inform ongoing goals, priorities, and actions that address those questions and/or issues, improve program quality, and align with institutional priorities.
This handbook outlines the guidelines, general policies, and processes for self-study and program review, incorporating elements from the Higher Learning Commission (HLC), the Ohio Department of Higher Education (ODHE), and the Regents’ Advisory Committee on Graduate Study (RACGS) Guidelines for Graduate Program Review. Per BGSU’s Academic Charter, undergraduate and graduate programs that are not accredited by external specialized accreditation agencies must be reviewed every five (5) years.
At Bowling Green State University (BGSU), systematic program review provides a vehicle to ensure:
▪ evidence of educational quality and consistency with national trends;
▪ documentation of student performance, including enrollment, retention, and completion rates, and achievement of stated program outcomes;
▪ evaluation of resources, including student support, faculty, and space;
▪ improvement of educational quality and strategies for improvement;
▪ contributions to the University’s Mission;
▪ a self-reflective and evaluative process that identifies strengths, weaknesses, opportunities, and threats with a forward-looking projection; and
▪ an emphasis on actions or action plans focused on continuous improvement in alignment with BGSU’s strategic plan.
BGSU’s Provost has emphasized programmatic and curricular accountability and has contextualized program review as intrinsically linked to initiatives #1, #2, #3, #10, #12, #13, and #14 of the University’s strategic plan, Forward.
▪ Initiative 1: Right programs that are sustainable
▪ Initiative 2: Intensive focus on outcomes
▪ Initiative 3: Differentiating the traditional undergraduate experience
▪ Initiative 10: Teaching and service excellence
▪ Initiative 12: Culture of innovation and accountability
▪ Initiative 13: Enhancing value
▪ Initiative 14: Efficiency and alignment
The Office of Institutional Effectiveness (OIE), reporting to the Provost, coordinates and monitors program review for the University at large. Its responsibilities include making the program review schedule widely available, updating it on an ongoing basis, and assisting with the logistics and details of the program review process.
However, the primary responsibility for implementing the program review process lies with the College Dean (or Associate Dean) who has administrative responsibility for the program(s). The College Dean (or Associate Dean) may assign a Department Chair, School Director, or faculty member as “Program Review Coordinator” to more efficiently lead the program review process locally. It is recommended that a small committee at the Department/School/College level be created to spearhead data collection and analysis and writing responsibilities.
Program review is an opportunity for faculty of one program or a cluster of programs to conduct a self-assessment to evaluate and strengthen the quality of a program or cluster of programs by examining specific and critical issue(s) and/or question(s).
Program review is intended to be a focused, formative, and ongoing process, although a formal five-year cycle has been established by the University.
Program review is an activity designed to engage all faculty, staff, students, and other stakeholders (when appropriate) in a guided appraisal of the programmatic mission, student learning outcomes, faculty, and activities within the context of the College(s) and institution to address issue(s) and/or question(s) important to the program or cluster of programs.
Program review is an opportunity for closely related programs to identify novel opportunities for collaboration, consider new interdisciplinary programs, strengthen signature programs, investigate joint hires, explore program consolidation, repositioning, and/or closure to align with new opportunities, and/or strengthen petitions for maximizing existing or new resources. In this way, program review serves as a tool toward greater efficiency, innovation, and maximal use of resources.
The Higher Learning Commission (HLC), the Ohio Department of Higher Education (ODHE), and the Regents’ Advisory Committee on Graduate Study (RACGS), along with BGSU’s Academic Charter, require a periodic review of academic (undergraduate and graduate) programs to ensure that academic programs maintain quality, rigor, and currency. The review of undergraduate and graduate programs is an institutional responsibility. The process is designed to provide information to faculty and administrators at the local level, so that necessary changes can be made to maintain program quality and sustain a culture of continuous improvement. Program review at BGSU is not meant to be used to compare programs across the University System of Ohio or to determine state funding of undergraduate and/or graduate programs, but it should:
▪ Be evaluative and forward-looking.
▪ Be fair and transparent as well as distinct from other reviews.
▪ Result in action.
Every September, an annual report of all graduate program reviews must be submitted to the Chancellor and RACGS detailing the review process for the prior academic year as well as any updates and/or changes at the graduate level.
Ohio Board of Regents Advisory Committee on Graduate Study. (2012, November 30). Guidelines and procedures for review and approval of graduate degree programs (p. 23). Retrieved from https://www.ohiohighered.org/files/uploads/racgs/documents/RACGS_Guidelines_113012.pdf (hereafter RACGS).
RACGS, p. 23.
Article 11- Section E
Each school shall prepare, at least once every five years, an evaluation of its problems, plans, and objectives. Specific performance objectives shall be established prior to each evaluation period and shall be used to determine the extent to which the goals are being met. The school shall endeavor to apply the highest possible standards in evaluation of its own performance. This report shall be prepared by the Director with input from the faculty, graduate students, and undergraduate majors of the school as applicable. If deemed appropriate by the Dean or the school, persons outside the school may be invited to participate in the evaluation process. Attention shall be given to the effectiveness of personnel, the quality of the academic programs, efficiency in utilization of existing resources, the research and service activities of the school, adequacy of physical facilities, long-range plans and objectives, adequacy of monetary support, and appropriateness of internal organization and communication. The report shall be presented to the school faculty and filed by the Director with the appropriate Dean; a copy also shall be submitted to the Provost, the CAA, and the President of the University.
Article 12- Section D
Each academic department shall prepare, at least once every five years, an evaluation of its problems, plans, and objectives. Specific objectives shall be established prior to each evaluation period and shall be used to determine the extent to which the goals are being met. The department shall endeavor to apply the highest possible standards in evaluating its own performance. If deemed appropriate by the Dean or the department, persons outside the department may be invited to participate in the evaluation process. A report shall be prepared by the Chair based upon contributions from the faculty, graduate students, undergraduate majors, and perhaps others as well. Attention shall be given to the effectiveness of personnel, the quality of academic programs, efficiency in utilizing existing resources, the research and service activities of the department, adequacy of physical facilities, long-range plans and objectives, adequacy of financial support, and the appropriateness of internal organization and communication. The Chair shall present the report to the departmental faculty and to the Dean or Deans; the Chair shall forward copies of the report to the Provost, the Undergraduate Council, and the President.
The schedule for academic program review is set by the Office of Institutional Effectiveness (OIE) with input from College Deans/Associate Deans, the Program Review Advisory Committee (PRAC), and the Provost, and in compliance with BGSU’s Academic Charter (e.g., program reviews must occur at least once every five years). Once the schedule is set, OIE helps to administer it. While a comprehensive program review should occur every five years, various circumstances may signal the need for a program to be reviewed earlier than scheduled, such as concern about the performance of a program, a decrease in the number of degrees awarded, leadership changes, or staffing concerns. A program that has already participated in a review may be asked to be reviewed again outside of its typically scheduled review to assure that progress toward improvement is being made. The program review process itself should span no more than three academic semesters (Fall/Spring/Fall). When needed, a request for a Spring/Fall/Spring timeline may be submitted to OIE, which will consult with the Provost’s Office for final determination.
Major steps in the program review process include:
(1) Request of institutional data from the Office of Institutional Research (OIR) and the Office of Institutional Effectiveness (OIE).
(2) Completion of a Strengths/Weaknesses/Opportunities/Threats (SWOT) analysis.
(3) Completion of a Memorandum of Understanding (MoU) highlighting four to five questions/issues that emerged from the SWOT analysis and warrant more in-depth examination in the self-study report, inclusive of an action plan.
(4) Completion of a self-study report, inclusive of an action plan, with input from faculty, staff, and students and institutional data.
(5) Submission of a No-Conflict-of-Interest form and Independent Contractor Agreement (ICA) by each external reviewer.
(6) Completion of a site visit (virtual or in-person) conducted by a team of external reviewers, to include class observations, a tour of facilities, interviews of students and faculty, etc.
(7) Provision of an external review report and executive summary by the team of external reviewers.
(8) Review of the external findings by the program and submission of a response to the College Dean/Associate Dean.
(9) Revision of the program action plan in light of the external findings and approval of the revised action plan by the College Dean/Associate Dean.
(10) Implementation of the final action plan and follow-up.
See the program review infographics webpage for a visual of the process flow.
Funding in support of program review expenditures is the prerogative of the Provost’s Office, which is responsible for its oversight, recommendation, and allocation. Funding from the Provost’s Office may cover program review honoraria, per diems, and travel (airfare, hotel, etc.). OIE is charged with communicating updates on the budget breakdown as appropriate.
The Office of Institutional Effectiveness (OIE) is charged with contacting College Deans/Associate Deans to determine which programs should be included in the program review schedule. Once input from College Deans/Associate Deans has been secured, the schedule is submitted to the Provost for final review and approval. The approved schedule is then confirmed with the College Deans/Associate Deans and posted on OIE’s website.
• A program can self-nominate for review at any time with the approval of the dean(s).
• For graduate programs, both the Dean/Associate Dean of the College in which the discipline resides and the Dean of the Graduate College must grant approval for program review.
• For programs that are not administered through a College, the administrator to whom the program reports should perform the responsibilities identified herein as those of the College Dean/Associate Dean.
If a program needs to delay its review for any reason, a written request must be made by the College Dean/Associate Dean to the Office of Institutional Effectiveness, which will notify the Provost’s Office. The letter must be received at least one semester before the review is set to begin. If a delay is sought by the College Dean/Associate Dean, the respective program and/or College/School will not benefit from the Provost’s funds in support of program review. All costs incurred from a non-approved, delayed program review are the sole responsibility of the College/School.
The following roles and responsibilities are non-exhaustive suggestions for faculty involved in the program review process, Program Review Coordinators, College Deans/Associate Deans, Provost Designee in the Office of Institutional Effectiveness, and Provost. The College Dean(s)/Associate Deans are responsible for identifying the Program Review Coordinators in their respective College/School.
Program Faculty:
▪ Engage in a Strengths/Weaknesses/Opportunities/Threats (SWOT) analysis of the program.
▪ Engage in the program review process and identification of issue/question for investigation.
▪ Participate in meetings and events to develop the program Self-Study and Action Plan.
▪ Assist in the collection of data for program Self-Study.
▪ Provide feedback and comments on review team questions, program Self-Study and Action Plan.
▪ Meet with review team members during site visit.
▪ Collaborate to create a response to the review team report and assist in the creation of a final Action Plan.
▪ Actively participate in the implementation of program Action Plan.
▪ Regularly participate in semester and annual reviews of progress on Self-Study and Action Plan items.
Program Review Coordinator (College/School/Unit-based):
▪ Communicate data needs to the Office of Institutional Effectiveness (OIE) and the Office of Institutional Research (OIR) and facilitate the collection of additional data needed for the program Strengths/Weaknesses/Opportunities/Threats analysis and for the Self-Study.
▪ Lead a Strengths/Weaknesses/Opportunities/Threats analysis of the program.
▪ Develop the Self-Study and Program Review Memorandum of Understanding (MoU) and submit to College Dean(s)/Associate Dean(s) for approval. The next levels of MoU approval include Provost Designee from the Office of Institutional Effectiveness, Graduate College Dean (if graduate programs are involved), and Provost.
▪ Submit the program Self-Study and Action Plan by the agreed-upon due date to the College Dean(s)/Associate Dean(s) and to the Provost Designee in the Office of Institutional Effectiveness (OIE).
▪ As pertinent, make any changes/edits/updates to the Self-Study and Action Plan based on recommendations in the Self-Study Approval Status Letter (SSASL) issued by OIE (Note: There may be more than one SSASL issued as OIE may require multiple reviews of the Self-Study and Action Plan prior to its submission to external reviewers).
▪ Submit a list of potential external and internal reviewers by the agreed-upon due date to the College Dean(s)/Associate Dean(s) and Provost Designee in the Office of Institutional Effectiveness.
▪ Contact potential reviewers to arrange for site or virtual visit once a visit schedule is approved by OIE.
▪ Communicate with external reviewers to secure a signed No-Conflict of Interest form, copies of their resumes/CVs and Independent Contractor Agreements – and forward a copy of all documentation received to the Provost Designee in the Office of Institutional Effectiveness.
▪ Create site visit schedule for the review team in consultation with program faculty, College Dean(s)/Associate Dean(s), and Provost Designee (Provost as needed).
▪ Distribute materials to external review team members at least one (1) month before the site/virtual visit is scheduled.
▪ Finalize visit logistics with respective external reviewers (i.e., travel, reimbursement, lodging, etc.).
▪ Revise program Action Plan to incorporate comments/feedback from external review team and College Dean(s)/Associate Dean(s) and submit to the College Dean(s)/Associate Dean(s), Provost Designee, and Provost by the agreed-upon due date.
▪ Follow up with OIE concerning updates on the Action Plan.
College Dean(s)/Associate Dean(s):
▪ Review and approve program MoU that will drive the review and action planning process.
▪ Identify a program designee (i.e., Program Review Coordinator) to oversee the program review process, write the Self-Study and Action Plan, coordinate site/virtual visit by external reviewers, and submit finalized Action Plan.
▪ Review and approve program internal and external review team members in consultation with the Provost Designee in the Office of Institutional Effectiveness.
▪ Provide and approve fiscal support for the program review process including the site/virtual visit by the review team members as needed. (See note above concerning College endorsement of costs associated with delayed program review).
▪ Provide feedback and comments on the program Self-Study and Action Plan, external review team findings report and executive summary, and final program Action Plan.
▪ Inform Provost Designee in the Office of Institutional Effectiveness of any changes impacting the program review process.
Office of Institutional Effectiveness (OIE) Provost Designee:
▪ Facilitate initial meetings, serve as a resource throughout the program review process, and troubleshoot as needed.
▪ Review and sign off on the program MoU, Self-Study, and Action Plan, the selection of external reviewers, and the visit agenda.
▪ Generate Self-Study Approval Status Letters and follow-up on changes/edits made to Self-Study and Action Plan to ensure actionability, feasibility and alignment with institutional plan and goals. The OIE Provost Designee may request multiple reviews of Self-Studies and Action Plans until such meet quality standards for program review. (See checklist and rubric for Self-Study and Action Plan evaluation by OIE and PRAC in Appendices F and G).
▪ Organize meeting(s) to discuss data needs with the Program Review Coordinator.
▪ Keep signed copies of all program review documentation (i.e., MoU, Program Self-Study & Action Plan, Final Action Plan, External Review Team Findings Report and Executive Summary, Annual Action Plan Updates, Independent Contractor Agreements, No-Conflict of Interest form, etc.) for accreditation purposes.
▪ Assist College Program Review Coordinators with data retrieval, visit logistics, etc.
▪ Sit in on the exit session of all external site/virtual visits.
▪ Communicate with external reviewers to follow up on timely submission of comprehensive findings report and executive summary. Once received directly from external reviewers, share these documents with Program Review Coordinators and College Deans/Associate Deans.
▪ Share copies of all program review documentation with the Graduate College in July (for inclusion in the September Ohio Board of Regents (OBOR) report) and with the Provost as requested.
▪ Communicate with College Deans/Associate Deans to determine and/or update the master program review schedule.
▪ Create/manage/monitor/update program review reporting platforms, portals, websites, and/or tools/forms/templates as needed.
▪ Compile program review reports annually for Provost.
▪ Promote program review best practices.
▪ Facilitate the Program Review Advisory Committee.
Provost:
▪ Review and approve program MoU that will drive the review and action planning process.
▪ Review and approve program external review team members once provided by OIE.
▪ Make final determinations/recommendations on programs based on feedback from College Deans/Associate Deans and external reviewers’ findings.
▪ Approve the master program review schedule for final dissemination by OIE.
Program Review Advisory Committee (PRAC) Members:
▪ Provide recommendations for improving and refining the program review process for greater efficiency and efficacy.
▪ Provide recommendations for identifying or developing professional development, support, and supplemental materials to maximize outcomes associated with program review.
▪ Advocate for the application of program review findings in college- and campus-level programmatic planning and decision-making processes in alignment with the University’s strategic plan.
▪ Provide a second read of program review documents prior to external reviewers’ visits and provide feedback to programs as needed at various stages of the program review process.
▪ Provide guidance/assistance with the development of MoUs, data literacy, and other program review steps or materials.
▪ Recruit faculty members with recent experience of program review at BGSU to serve as PRAC volunteers who assist in training other faculty or facilitating program-level evaluation and alignment under the supervision and guidance of the Program Review Advisory Committee.
Step-by-Step Program Review Process
An initial meeting is scheduled with the College Dean and/or Associate Dean designee, Program Review Coordinator, program chair(s), and the OIE Provost Office Designee to establish a timeline for program review and to discuss the following:
▪ Steps in the program review process;
▪ Forms, templates, reporting platforms and protocol;
▪ Data needs;
▪ Guidelines for SWOT analysis, MoU questions/critical issues, Self-Study format/content, Action Plan, selection of external reviewers, visit logistics, funding, etc.
If a graduate program or cluster of programs with a graduate program is being reviewed, the Dean of the Graduate College (or designee) must also attend the initial meeting. The OIE Provost Office Designee facilitates the initial meeting.
The OIE Provost Designee and Program Review Coordinator meet to discuss data needs to complete a Strengths/Weaknesses/Opportunities/Threats (SWOT) analysis of the program or cluster of programs under review. Since results from the SWOT analysis inform the critical questions and/or issues to be articulated in the formal Memorandum of Understanding (MoU) for the Self-Study, data categories ought to reflect those to be included in the Self-Study (See Appendix D for possible list of data).
A SWOT analysis is an assessment technique that helps units identify and discuss all factors that impact the unit. By surfacing factors that are favorable and unfavorable to the unit’s future improvement, effectiveness, and efficiency, SWOT fosters a collaborative environmental scan in which all can participate. The goal of SWOT is to provide meaningful information in each category to inform future decision-making. Coupled with assessment, SWOT helps units self-evaluate and determine future direction relative to unit goals and objectives, in alignment with institutional mission and goals.
A SWOT analysis is a planning tool that, when used properly, provides an overall view of the most important factors influencing the future of the program. The SWOT analysis is the foundation for a reflective self-assessment of how a unit is performing its mission. The SWOT analysis results form the basis for articulating critical questions, issues, or salient foci of interest for inclusion in the Memorandum of Understanding (MoU) that will guide the goals, directions, and contents of the program review Self-Study.
A SWOT analysis is a simple, but powerful, framework for leveraging the unit’s strengths, improving weaknesses, minimizing threats, and taking the greatest possible advantage of opportunities. Included below are examples of questions that can be employed to jumpstart data-based conversations in the unit.
Strengths (Internal): Positive attributes of the unit
• What does the program do well?
• What are the program’s advantages?
• What do others see as the program’s strengths?
• What could the program boast about its operation?
Weaknesses (Internal): Negative attributes of the unit
• What can be improved?
• What should be avoided?
• What could be done more effectively and efficiently in the program?
• What is the program not doing that it should be doing?
Opportunities (External): Conditions external to the unit that have a positive effect on achievement
• What are the opportunities facing the program?
• What are some current trends that could have a positive impact on the program?
Threats (External): Conditions external to the unit that have a negative effect on achievement
• What obstacles does the program face?
• How are changing resources, technology, or external requirements affecting the program’s ability to provide services?
• What are some current trends that could have a negative impact on the program?
How Can Results of a SWOT Analysis Be Used for Program Review Purposes?
A SWOT analysis is a subjective assessment of data organized into a four-dimensional SWOT matrix, similar to a basic two-column list of pros and cons.
What the unit does very well internally:
(1) Strengths and externally favorable conditions (opportunities): Pursue opportunities that are a good fit with the program’s strengths.
(2) Strengths and externally unfavorable conditions (threats): Identify ways the program can use its strengths to reduce its vulnerability to external threats.
What the unit performs less well internally:
(1) Weaknesses and externally favorable conditions (opportunities): Overcome weaknesses to pursue opportunities.
(2) Weaknesses and externally unfavorable conditions (threats): Establish a defensive plan to prevent the program’s weaknesses from making it highly susceptible to external threats.
To develop initiatives (strategies) that take the SWOT profile into account, the Program Review Coordinator can help faculty translate the four lists into a matrix (see above) that associates strengths (maintain, build, and leverage), opportunities (prioritize and optimize), weaknesses (remedy), and threats (counter) with actions that can be agreed upon and owned by the unit. For completed samples, feel free to email email@example.com
Non-Exhaustive List of Areas for Consideration in Conducting the SWOT Analysis
- NEED: Is the need for the program/unit increasing, decreasing or staying the same?
- COST: Are programs/unit costs increasing, decreasing or staying the same? In what areas is the budget adequate/inadequate?
- PROGRAM EFFECTIVENESS: Questions will vary. Examples can include:
- Student achievement: Are graduation/transfer rates a concern?
- Student retention: Are course completion rates higher/lower than average?
- Student outcomes: Why do students transfer out? How do students do on the job market? What are the DFW rates? What specific skills/outcomes result from graduation from the program?
- Access: Is the gender/diversity of the students a concern? For instance, in STEM, what efforts exist toward gender equity of access? Is the program offered in different yet equal modalities? Is the number of sections/services appropriate to meet needs/demand? Can students get timely service?
- Curriculum: With what does the curriculum align? How is it benchmarked against best practices, and how often? What are the needed updates in the discipline? Is it aligned with BGSU University Learning Outcomes?
- Instruction and Technology: What creativity exists in this area, including technology? What accommodations exist and/or are necessary? How does the delivery of instruction reflect best practices in the field? How is technology included in the varied modes of teaching? What impediments exist? How can technology resources be maximized? Which ones are needed for equal access and outcomes?
- Faculty/Staff Development: In what area(s) are faculty strong? Where does faculty development need to focus? What are field-specific best practices to help faculty grow?
- Diversity: What are markers of diversity among faculty/staff/students? How are they assessed? How are results utilized for continuous improvement in regard to diversity?
- Student Satisfaction and Transfer: What do we know about students’ level of satisfaction with the program and its services? Where do students transfer to? Why? Are transfer institutions making any changes or offering services that we should consider? Are we providing quality services/programs? How do we know? What are measurable indicators of quality? How are they assessed, and how often?
We recommend the following to maximize engagement in the SWOT process at the program/unit level:
▪ Encourage all unit/department faculty/staff/students to participate in the SWOT. If possible, set aside a few days to conduct the SWOT and propose an agenda. It might be useful to set up a SWOT calendar in advance, with invitations to faculty and stakeholders listing available “SWOT slots” they can freely join and contribute to.
▪ During the sessions, make sure to document in writing every input, suggestion, or comment.
▪ A facilitator should be appointed to direct the SWOT meeting days and keep the group focused on task. This can be one of the tasks of the Program Review Coordinator.
▪ Contact OIE to facilitate the retrieval of data sets from OIR and APS at least three weeks prior to the SWOT dates set with your respective unit/department. Note: Prior to submitting a Data Request Form with the Office of Institutional Research (OIR), review the data dashboards available on their website. If the information you are looking for is not readily available, then submit the Data Request Form.
▪ Contact OIE to schedule an introductory meeting with your respective unit/department to explain why the SWOT analysis is needed and fits into the overall program review process. Check Appendix A for further examples as well.
The Memorandum of Understanding (MoU) serves as a mechanism for the program faculty, Program Review Coordinator, College Dean(s)/Associate Dean(s), OIE Provost Designee, and Provost to outline the primary focus or foci for the program review. In addition to assessing overall academic program quality, the program review process is an opportunity for a program to self-assess and gain feedback on action plans that address critical issue(s) and/or question(s).
Examples of potential issue(s)/question(s) in alignment with institutional strategic objectives (when appropriate) follow:
• Redefining student success: Provide undergraduate and graduate students (traditional and post-traditional) a demonstrably superior and innovative learning experience that intentionally prepares them to lead meaningful and productive lives (BGSU Strategic Objective I).
• Increasing and connecting research and creative activities for public good: Support and focus BGSU’s research and creative activities to serve the public interest and support commitment to the public good (BGSU Strategic Objective II).
• Empowering and supporting faculty and staff to achieve excellence: Support faculty and staff in building a quality learning community that fosters diversity, inclusion, collaboration, creativity, and excellence (BGSU Strategic Objective III).
• Advancing impact through engagement: Expand domestic and international engagement and partnerships to benefit students, academic programs, research, and outreach (BGSU Strategic Objective IV).
• Aligning for excellence and value: Enhance the quality and value of a BGSU education by developing a physical, organizational, academic, and financial infrastructure that ensures the University’s short- and long-term vitality and success (BGSU Strategic Objective V).
• Telling our story: Raise BGSU’s profile as a national, comprehensive research university that drives the social, economic, educational, and cultural vitality of the Ohio region, nation, and world (BGSU Strategic Objective VI).
• Strategically strengthening the relationships among and contributions to other programs and the Mission of BGSU.
• Increasing the vitality and sustainability of the program.
• Developing and increasing the regular use of program assessment data to drive program improvements.
Additional questions can be found in Appendix B.
A template for the MoU can be found in Appendix C.
The MoU is never to be developed as a stand-alone, contextually self-serving document; it should always be aligned with the strategic plans of the respective College, Department/School, and University. In this sense, the MoU serves to confirm that the program under review contributes to BGSU’s Mission, Vision, and Strategic Plan. The Program Review Coordinator is never to develop and submit a MoU in isolation. The Program/Cluster Self-Study MoU must be approved by the College Dean(s)/Associate Dean(s) and by the Dean of the Graduate College if graduate programs are included in the MoU, reviewed by the OIE Provost Designee, and signed by the Provost. Once all signatures have been secured through Adobe Sign, the fully executed Program MoU is kept on file in the Provost’s Office and in the Office of Institutional Effectiveness.
The Self-Study process is an opportunity to engage faculty, staff, administrators, and students in the self-assessment and improvement of the program or cluster of programs so that it meets the changing needs of students, and to revise program goals and priorities as needed based on data, trends, or other internal and external factors (e.g., best practices, new regulations/laws at the state or University level).
The program faculty, under direction of the Program Review Coordinator, is responsible for preparing the Self-Study and Action Plan. The College Dean(s)/Associate Dean(s) will work with the program faculty, Program Review Coordinator, and Department/Program Chair or School Director (as pertinent) to ensure that the Self-Study and Action Plan are completed in a timely manner. The Dean of the Graduate College (or designee) will provide assistance with graduate programs as needed. The Provost’s Designee from the Office of Institutional Effectiveness will also be available for support/troubleshooting and to monitor progress to ensure due dates and timelines are met.
The Self-Study (not including attachments and the Action Plan) should not exceed 50 pages. Supporting evidence for claims made in the Self-Study may be attached and/or uploaded separately, provided it is clearly referenced in the narrative and a consistent nomenclature is followed. Organization, focus on addressing the MoU’s questions/issues, data-based claims and findings, and the coherence of the Self-Study as a whole are essential. Results from the Self-Study and Action Plan should be incorporated into future annual strategic planning at the Unit and College levels. The Self-Study should be viewed as an opportunity for the program/Unit to “tell its story” with the necessary backing data, facts, and artifacts. See Appendix B for prompt questions to assist with development of the Self-Study narrative.
The program review process should result in the development of a measurable, feasible, and actionable Action Plan for continuous improvement of the program or cluster of programs within a specified timeframe. The program review process should focus on improvements that can be made using institutional and external resources. A program review conclusion/finding may be that the program or cluster of programs ought to focus on generating additional external revenues in support of its future strategic activities. Consideration may be given to proposed program improvements and expansions requiring additional institutional resources, such as new faculty lines. In such cases, the need and priority for additional resources must be strongly data-based (e.g., enrollment/retention gains, impact on scholarly and instructional productivity, partnerships, innovation, public good, student success, etc.), clearly justified, and incorporated into demonstrated annual strategic planning at the Unit/School/College levels. See Appendix E for the template to follow for the Action Plan.
Data should be used to inform and support claims in the narrative of the Self-Study. The Office of Institutional Research (OIR) is the primary source of data. Additionally, OIE may supplement these with data from Academic Performance Solutions (APS). The Office of Academic Assessment (OAA) may also provide data on Bowling Green Perspective outcomes, student achievement outcomes, graduate surveys, co-curricular outcomes, etc. See Appendix D for the types of data to include in the Self-Study. These data may be requested by the Program Review Coordinator. During the initial meeting among the OIE Provost Designee, College Dean/Associate Dean, and Program Review Coordinator, the type of data, who will collect the data, and additional data needs of the program may be discussed and coordinated.
Outline and Content for the Self-Study Report and Action Plan
The connections among the elements of the Self-Study should be planned carefully. In all sections, strengths as well as areas needing improvement should be noted, especially as they relate to the MoU critical issues and questions.
I. Introduction and Overview (Approx. 5-10 pages)
a. Executive Summary (Program Self-Assessment)
b. Program Description, History, Mission, Vision, Strategic Plan/Goals
i. Past Program Goals
ii. Current Program(s) Learning Outcomes (must correspond to Catalog)
iii. History of the Program within the Context of the Institution
II. Critical Issue(s) and/or Question(s) from the MoU (Approx. 10-25 pages)
a. Data presentation/analysis/discussion and Narrative addressing the Critical Issue(s) and/or Questions from the MoU
III. Program Action Plan (Approx. 5-8 pages)
a. Prioritized Goals for Program to Address Critical Issue(s)/Question(s) by Area
i. Completed Action Plan Tables for each goal (Use Template in Appendix E)
1. Identify Action Steps/Strategies to Reach Goals (What will the program/cluster do to reach goals?)
2. Identify Responsibilities/Person or Party (Who will be responsible for making sure that actions are completed? Example: Name of Individual or Group within the Program)
3. Identify Existing Resources (What support is currently available to assist in the action steps/strategies identified? Example: Office of Academic Assessment. What external/institutional resources are needed?)
4. Identify measurable Metrics (What key performance indicator will be used to gauge progress/success of identified goal/strategy?)
5. Action Timeline (When is the program going to complete each identified action step/strategy? Example: Spring 2018)
IV. SWOT Analysis Reporting (3-5 pages)
a. Program Strengths
b. Program Weaknesses
c. Program Opportunities or Aspirations
d. Program Threats or Results
V. Concluding Remarks (2-3 pages)
VI. Attachments
a. Program Data (i.e., OIR, APS, OAA, Unit/Department data, etc.)
b. Prior Program Strategic Plans (and Self-Study if available and/or as needed)
c. SAAC Assessment Plans & Assessment Reports
d. Other as needed
The Program Review Coordinator submits the Program Self-Study and Action Plan to the College Dean(s)/Associate Dean(s) as well as to the OIE Provost Designee for feedback and review. The College Dean(s)/Associate Dean(s) and the OIE Provost Designee review the Self-Study and Action Plan and provide feedback on goals, priorities, and the feasibility of the Action Plan within the context of the College and University.
Prior to review of the Self-Study and its evidence by external reviewers, a first layer of internal review occurs. The internal review is completed by the OIE Provost Designee, who works collaboratively with PRAC members serving as “second readers.” In conducting their internal evaluation of program self-studies and action plans, the OIE Provost Designee and PRAC “second readers” use the Self-Study and Action Plan Checklist and Evaluation Rubric (See Appendices F and G). The OIE Provost Designee issues a Self-Study Approval Status Letter (SSASL) with recommendations for improvement of the Self-Study to the Program Review Coordinator. The OIE Provost Designee, in consultation with PRAC members, may require multiple rounds of review of the Self-Study to ensure it is refined prior to dissemination to the external reviewers and/or Provost. In this iterative process, there may also be multiple SSASLs. OIE approves the final version of the Self-Study (and accompanying evidence) to be shared with external reviewers and explains next steps.
Three months in advance of the external review visit, the Program Review Coordinator will work with program faculty to identify and submit the names and credentials of 4 to 6 recognized peers from similar programs at other universities (or professional sectors when appropriate). At this stage, the Program Review Coordinator may send informal email inquiries directly to potential external reviewers to confirm willingness and availability (See Appendix J for sample email inquiries). Working from this slate of potential reviewers, the OIE Provost Designee will communicate agreement on a list of at least three (3) potential reviewers: two required, plus a third in case a replacement becomes necessary. Budget allocations from the Provost’s Office will cover only two external reviewers; Departments/Units/Schools/Colleges that wish to host additional external reviewers are responsible for covering all additional costs incurred. All parties will be sensitive to issues of conflict of interest at all levels. Once two external reviewers have been confirmed by the Provost’s Office, a No-Conflict of Interest Form (See Appendix I), a current C.V./resume, and an Independent Contractor Agreement must be executed and archived with the Office of Institutional Effectiveness (OIE).
Review visits can be conducted on site or virtually. Programs requiring the use of laboratories must arrange for an on-site visit. The Program Review Coordinator is responsible for scheduling and coordinating all aspects of the external reviewers’ site/virtual visit, including meetings with the College Dean(s)/Associate Dean(s), the Dean of the Graduate College (as applicable), and the OIE Provost Designee, who must be included in the visit exit session. Opportunities should be arranged for reviewers to meet with faculty members of the program (individually, if possible), Department Chairs or School Directors, program staff, a sampling of undergraduate and graduate students, and, if possible, alumni. The schedule should give the external reviewers time to confer and to work individually and as a team to assemble preliminary findings. The visit should also be planned to give reviewers the opportunity to tour facilities and meet with the Director of the Office of Academic Assessment (as applicable) and/or the Office of Institutional Effectiveness. When creating a site visit schedule, it is important to allow breaks for the reviewers between back-to-back meetings. It is also important to give colleagues enough lead time to confirm their availability to meet with the reviewers. The length of time the team is on campus will vary with the size and complexity of the program. The Program Review Coordinator, in consultation with the external reviewers, will determine whether the scope of review activities warrants one day or a day and a half. The Program Review Coordinator will communicate with external reviewers to secure logistical arrangements (including travel, transportation, accommodations, etc.) and ensure a high standard of hospitality. A copy of the finalized visit schedule is due to the Provost Designee in the Office of Institutional Effectiveness (OIE) six (6) weeks prior to the visit.
The OIE Provost Designee facilitates the exit session of the visit, which provides an overview of preliminary findings. A sample visit schedule is available from OIE.
At least one month prior to the scheduled visit, the Program Review Coordinator should provide the following materials to each external reviewer:
1. The program’s Self-Study & Action Plan and all accompanying evidence;
2. Copy of Bowling Green State University’s Mission, Vision, and Strategic Plan;
3. Copy of the finalized visit schedule for the visit (if virtual, all Zoom or virtual links should be included with special attention to time zone);
4. Copy of Sample Questions for Use by External Reviewers During the Visit (See Appendix K)
5. Copy of the Executive Summary Template (See Appendix L);
6. Copy of the program’s fully executed Memorandum of Understanding (MoU).
If these materials are distributed via electronic mail, the OIE Provost Designee should be copied on that communication. If these materials are uploaded in a dynamic e-platform (i.e., Microsoft TEAMS, Trello, webpage, etc.), login access should be provided to the external reviewers and the OIE Provost Designee ahead of time to test the technology. If, following the visit, outstanding questions or items remain, external reviewers, the Program Review Coordinator, and program faculty are free to continue communicating.
After the visit, external reviewers will be asked to provide specific recommendations and commendations on the program based upon the Self-Study and accompanying evidence. Specifically, external reviewers are asked to evaluate how well the critical questions and issues of the program’s MoU have been addressed. The external reviewers’ evaluation of the program Self-Study and evidence, along with their findings following the site visit, will contribute to the final programmatic Action Plan focusing on the critical issue(s) and/or question(s) articulated in the program’s MoU. In their reports of findings, the external reviewers should cite specific evidence/data in support of claims and identify strengths, weaknesses, threats, and potential opportunities for program improvement.
The external reviewers’ final consolidated report of findings along with their Executive Summary (See Appendix L) are due three (3) weeks after their visit and should be sent directly to the OIE Provost designee who will forward these documents to the Program Review Coordinator and College Dean(s)/Associate Dean(s). These documents are kept on file in the Provost’s Office and in the Office of Institutional Effectiveness.
There are no page limits for the external reviewers’ final report of findings; however, the Executive Summary is not to exceed 5 pages. In drafting the final report of findings, external reviewers may follow the critical questions/issues of the MoU as an outline or adopt the outline found in the Sample Questions for Use by External Reviewers During the Visit document (See Appendix K). Another outline option is:
I. Primary questions being explored by the program as defined in the MoU
II. Core strengths of the program
III. Main challenges the program faces
IV. Feedback and recommendations regarding the program’s plans for continuous development and improvement
V. Discussion of findings relative to the Action Plan
VI. Concluding remarks
Following receipt of the external reviewers’ final report of findings and Executive Summary, the Program Review Coordinator shall meet with faculty and the College Dean/Associate Dean to discuss the review outcome. If graduate programs were included in the review, the Dean of the Graduate College shall also be included in this meeting. If there are any factual errors in the report, the Program Review Coordinator should call these errors to the attention of the College Dean/Associate Dean and the OIE Provost Designee in writing as soon as they are identified. Within two weeks of the meeting to discuss the report, the Program Review Coordinator shall complete the Response to External Findings template (See Appendix M) and forward the document to the College Dean/Associate Dean and the OIE Provost Designee. The program faculty response should focus on the recommendations in the external review report, particularly any recommendations that do not seem likely to lead to improvement for the program or cluster of programs. The faculty response to the external reviewers’ report that is agreed upon by the Program Review Coordinator and the College Dean/Associate Dean will be formally integrated into the finalized Program Action Plan to be implemented the following semester.
Program/cluster faculty may decide to make changes, incorporating feedback of external reviewers, to the Program Action Plan. A final Program Action Plan shall be submitted to the College Dean/Associate Dean and the OIE Provost Designee.
The College Dean/Associate Dean is responsible for writing a report that synthesizes the information in the Self-Study and the external report of findings and is informed by and responsive to input from the Program Review Coordinator, the Dean of the Graduate College (as applicable), and the OIE Provost Designee. The College Dean/Associate Dean’s response reflects both the external reviewers’ report and the program faculty response; in particular, it focuses on points of disagreement between those documents. The focus of the College Dean/Associate Dean’s report should be a set of concrete, action-oriented recommendations cast within a specific timeline. These recommendations are guided by the College Dean/Associate Dean’s understanding of the following:
▪ The quality and importance of the program to the mission of the College and the University;
▪ The contribution of the program to the University’s strategic plan and the Graduate College strategic plan (as applicable); and
▪ The program strategic plans, as submitted as part of the University’s strategic planning process and described in the final Action Plan.
The response from the College Dean/Associate Dean commits the College to a course of action. The College Dean/Associate Dean’s response could endorse the program review and/or program/cluster report as written; it could commit to only specified parts of the reports; it could adopt revisions suggested in the program response; or it could add recommendations overlooked in both documents.
The final Program Action Plan, with the College Dean/Associate Dean’s response, is to be signed by the Program Review Coordinator, the Dean of the Graduate College (as applicable), and the Provost, and one copy will be kept in the Provost’s Office. Annual reports by the units will be based on the recommendations in the College Dean/Associate Dean’s response and, by reference, on the recommendations in the external reviewers’ or unit report.
Following consultation with the Provost and the Dean of the Graduate College (for graduate programs, as applicable), the College Dean/Associate Dean will meet with faculty and administrators from the program to discuss the program’s final Action Plan. The discussion should include aspects of the review that concern how the program contributes to other units of the University and how its activities and goals relate to College and University strategic plans. Following this meeting, the final Action Plan must be implemented by the program the following semester.
Ongoing program faculty conversations should occur on the implementation of the Action Plan. The Program Review Coordinator will provide annual follow-up progress reports (endorsed by the College Dean/Associate Dean) in a platform and/or reporting venue set up by the Office of Institutional Effectiveness (OIE).
As BGSU engages in the development and refinement of an institution-wide effectiveness infrastructure, the concept of a Program Vitality Analysis (PVA) has emerged as a value-added, shorter annual reporting tool that could serve to complement the more extensive 5-year program review cycle. The purpose of this uniform but decentralized mechanism is to strengthen ownership of the overall program review process by the respective departments, colleges, and schools while informing overall unit strategic planning and resource management on an incremental, year-by-year basis leading to the comprehensive program review at the five-year mark. Specifically, the PVA will lead programs to annual, in-depth analyses of the key metrics of program enrollment, student retention, and student graduation. The Office of Institutional Effectiveness is charged with the overall orchestration of the two processes.
Questions regarding program review should be directed to the Dean or Associate Dean within your College or the Provost Designee in the Office of Institutional Effectiveness (OIE).
Questions may be sent to firstname.lastname@example.org
Updated: 10/17/2023 10:05AM