Video Script: Program Review FAQ

[Music in Background]. Welcome to this presentation brought to you by the Office of Institutional Effectiveness at Bowling Green State University. In this presentation, we would like to offer a few insights into frequently asked questions about program review at BGSU. We hope this brief walk-through of questions that have emerged from colleagues who have recently conducted a program review will be useful to you as you embark on this process for your own program.

1)      I am a new department chair, and I am new to program review. This is my first time hearing any of this.

Not a problem. Early in the process, the Office of Institutional Effectiveness schedules an introductory meeting with all new program review coordinators. The purpose of this introductory meeting is to go over details of the process, including step-by-step action items, the program review handbook, associated forms, the technology involved, data types and data sources, etc. This is a chance to bring along any questions – and no, there are no “silly” questions; this is a learning process for all. In addition to the designated program review coordinator, the department chair, associate dean, dean, and other impacted faculty, staff, or students may also attend.

2)      Should I benchmark my program?

While benchmarking is not among the requirements you will read about in the program review handbook, we strongly recommend that you consider including a measure of benchmarking in your overall program analysis. Benchmarking is an opportunity to gauge your program’s performance against peers, understand developing national and local trends, and identify strengths and areas for improvement. More generally, benchmarking is a way to keep up with current trends, including the demands and needs of specific fields or careers, that can inform continuous improvement of your BGSU program.

3)      Should I use student data for this?

The overarching goal of program review is to improve programs in order to maximize student success and learning. In other words, students are at the very core of any program review endeavor. So, yes, definitely, student data ought to be the basis of any analysis and suggestions for improvement for any program. Further, integrating student data into the program review self-study helps build a culture of improvement rather than a culture of “assessment for the sake of assessment.” We encourage programs to view data as a “live” measure: just as students are “alive,” data are also “alive” and tell a story. Program review is the process whereby data stories can emerge and influence future storytelling.

4)      OIR and APS gave us these sets of data but we have different data. Also, I couldn’t find that data in the data file you gave me.

The Office of Institutional Research (OIR) is the official institutional source for all data. APS, or Academic Performance Solutions, data may be used as a secondary source for triangulation and/or confirmation. Because these platforms use different data pull methods and/or timelines, small discrepancies in the data should be expected. However, if discrepancies appear too significant, the Office of Institutional Effectiveness will facilitate a “data conversation” early in the process with the Office of Institutional Research and program review coordinators. This ensures that questions are answered in a timely fashion to best inform the articulation of the program’s Memorandum of Understanding, or MOU.

We know and understand that reading, creating, making sense of, and communicating data sets may not be everyone’s strong suit. Data literacy entails certain competencies for working with data, and many data conversations may be necessary as data literacy needs arise. The Office of Institutional Research creates and updates a program review Tableau dashboard. If you are having difficulty locating and/or retrieving a specific data category on the dashboard, please never hesitate to reach out to the Office of Institutional Effectiveness, which will facilitate joint data meetings with the Office of Institutional Research as needed.

5)      It fell off my radar.

The Office of Institutional Effectiveness puts together a program review schedule in conjunction with, and with feedback from, the designated Associate Deans of each College. The schedule is updated periodically and disseminated on our website and broadly to the various College administrations. In general, the program review timeline is built to allow several weeks or months before the next program review action item is due, spreading the process over two semesters (Fall and Spring). In addition, the Office of Institutional Effectiveness is happy to work with a two-week buffer whenever a specific action item is due (for example, the submission of a Memorandum of Understanding, or MOU). The key, though, is to reach out to the Office of Institutional Effectiveness as soon as possible, or as soon as you become aware that a due date or deadline might be challenging to meet. If you know you will be away for a while, we strongly encourage you to designate a secondary, temporary program review coordinator so that the work does not stall in your absence.

6)      This process has been very fruitful for us because it has helped us change the way we do grading. Also, as a result of this process, we came up with a new metric – which we never knew we needed.

Claims such as these are fabulous! They illustrate how the program review process can indeed serve to leverage change and continuous improvement while supporting the program’s commitment to quality and effectiveness. No change is too small, even if it is, for example, the revamping of a grading rubric for a key assignment used to showcase student learning against critical program learning outcomes. Clear and measurable metrics are critical for making data-informed decisions focused on continuous improvement and on maximizing existing resources. In the grand scheme, metrics help provide accountability and transparency to all stakeholders.

7)      These are two different cluster programs. It is like comparing apples and oranges.

The cluster model for program review was a legacy inherited in 2020. Clusters are no longer required, although programs that find they have many programmatic affinities may consider a combined program review to augment partnership or collaboration efforts, or to combine resources for better efficiency.

8)      What do you mean “time-to-degree”?

Time-to-degree refers to the mean total number of years students take to complete their degree, counted from the time of first enrollment. The general formula is: time-to-degree = (graduating year – entering year) + ((graduating semester + entering semester) – 1). For additional definitions, consult the Glossary of Terms compiled by the Office of Institutional Research at the link on the slide:  https://www.bgsu.edu/institutional-research/GlossaryOfTerms.html#t
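For illustration only, the quoted formula can be sketched in a few lines of Python. The semester encoding below (Fall = 0.5, Spring = 1.0) is purely an assumption made for this example; the official coding used by the Office of Institutional Research may differ, so treat this as a sketch of the arithmetic rather than the authoritative calculation.

```python
# Hypothetical sketch of the time-to-degree formula quoted above.
# ASSUMPTION: semesters are encoded as fractions of a year
# (Fall = 0.5, Spring = 1.0); confirm the official coding with OIR.

def time_to_degree(entering_year, entering_semester,
                   graduating_year, graduating_semester):
    """Apply the quoted formula:
    (graduating year - entering year)
      + ((graduating semester + entering semester) - 1)
    """
    return (graduating_year - entering_year) + (
        (graduating_semester + entering_semester) - 1
    )

# Example: a student entering in Fall 2018 (0.5) and graduating in
# Spring 2022 (1.0) under this assumed encoding:
print(time_to_degree(2018, 0.5, 2022, 1.0))
```

Under these assumed encodings, the example student would show a time-to-degree of 4.5 years; again, the real dashboard value depends on OIR’s official semester coding.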

9)      No matter what data trends look like, tell your story!

The program review process includes an opportunity to conduct a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. As a result of this analysis, it is expected that areas for improvement and threats will be identified. This is not a bad thing. The Action Plan (an embedded requirement of the self-study) will serve as a tool for programs to address weaknesses, threats, and other opportunities they wish to develop in support of the program’s foreseeable growth over the next 5 years. So, for example, if enrollment in your capstone course has been stagnant, or you are noticing a decrease in enrollment or graduation rates over the last 3 or 5 years, make a note of the trend, but then capitalize on problem-solving and on actionable, feasible strategies you can begin implementing now to redress the situation or give a new impetus or direction to your program. Program reviews are not intended only for programs at their apogee. Program reviews cover it all: the good, the bad, and the ugly, with the intentional goal of moving forward, of learning from deficits as needed, and of pivoting creatively and courageously to forge a new trail where none was before.

10)   Then COVID hit! I work primarily remotely.

At BGSU, the program review process can be completed in person or virtually and remotely. The Office of Institutional Effectiveness sets up a Microsoft Teams and Trello platform and provides access and training to any newly onboarding program review coordinators. Little secretarial load is required. The visit by external reviewers may also be mutually decided upon as a virtual visit (with the caveat that if a program requires labs, at least one of the two primary external reviewers should arrange an in-person visit and inspection for some or all of the visit duration).

11)   I have been here for 40 years. Why are we doing this? Isn’t this only for compliance?

It is true that BGSU’s Academic Charter and the Higher Learning Commission (HLC), the accrediting agency of the institution, both require institutional program reviews. However, program review best fulfills its intrinsic purpose when it is understood in terms of added value: programs, having put on analytical lenses, proactively engage in identifying problems or weaknesses, testing possible solutions or strategies, verifying effectiveness, establishing new goals, and implementing best practices. Prior to 2020, the practice of program review at BGSU may not have followed a “regular” pattern. Since then, however, the Office of Institutional Effectiveness has been charged with spearheading the effort and guiding the institution toward a more systematic and efficient approach to program reviews.

12)   A curriculum map, isn’t that the same as those SAAC reports?

A curriculum map is a matrix that shows where (i.e., in which courses) and how specific curriculum learning outcomes are addressed and assessed. The Student Achievement Assessment Committee (SAAC) reports do a bit more than that: they are a vehicle through which programs receive feedback and strategies on how to improve student achievement of learning outcomes. SAAC reports identify and describe the programmatic learning outcomes assessed in a given academic year and how results will be applied to improve programs. Key to exemplary assessment is for programs to demonstrate how the student achievement data they have collected are actively being used to improve; SAAC reports document that this is done. If your unit or program has already submitted SAAC reports, there is no need to reinvent the wheel: just use what you currently have as supporting evidence for your program review self-study. We know that program review materials will most likely include reports already compiled for other institutional purposes, such as SAAC and Program Viability, and this is accepted, as these internal processes should, in the end, serve as multiple sources of triangulation in support of programmatic performance.

13)   Does my dean/associate dean need to read this?

Yes, your dean/designated associate dean must read and approve all program review-related documents that you submit to the Office of Institutional Effectiveness. This ensures transparency and buy-in, as well as alignment with your College’s specific goals and strategic plan. Further, sharing these documents with your local College administration fosters communication, collaboration, and support. A shared-task mindset is critical to upholding a positive and nurturing environment. For example, your dean or associate dean may have access to fiscal information that you do not, or may be privy to upcoming hiring, restructuring, or curricular changes of which you have not yet been apprised. Open and transparent communication is vital to the collegial process, and these routine internal interactions will increase the effectiveness of the program review process overall. Program review is intended to be a highly collaborative process.

14)   I know our program aligns with the College’s strategic plan. I just don’t know what the College’s strategic plan is or where to find it. Also, do you have an old self-study for our program that I can work from?

Most Colleges will have their mission and vision statements as well as their strategic plans posted on their website. Start there! In some instances, these key documents may be under revision. Contact your dean and/or designated associate dean for the most current copies. In some instances, the Office of Institutional Effectiveness may also be able to assist in retrieving archived samples if they exist.

15)   How much evidence can I have?

We strongly encourage quality over quantity. The types and amount of evidence are highlighted in the program review handbook. A non-exhaustive list of sample metrics is also available at the link on the slide: https://www.bgsu.edu/institutional-effectiveness/institutional-effectiveness/program-review/summary-of-program-review-metrics-sample-for-consideration0.html

In general, evidence can be furnished in appendices to the self-study narrative; it may also be hyperlinked within the narrative (with the caveat that all hyperlinks be accessible outside of BGSU). Your program review Microsoft Teams/Trello platform will feature a tab where you can upload all of your evidence (no size limit). In a few instances, programs have creatively designed a unique, password-protected mini program review “website” for the sake of user-friendliness and access for external reviewers. While creativity is welcomed, please note that the Microsoft Teams/Trello portal remains the primary archive for all of your program review material uploads.

[Music in Background]. Thank you for joining us for this presentation. Should you have any questions, do feel free to email us at institutionaleff@bgsu.edu

Updated: 03/05/2026 04:53PM