University-wide Course Evaluation of Teaching and Learning Instructor Guide

Introduction

The University-wide Evaluation of Teaching and Learning is composed of six common course evaluation questions, which are asked across all courses every semester. These six questions are referred to as the common questions and were developed by an ad hoc committee that included faculty, administrators, union representatives, and undergraduate and graduate students. The questions were piloted in 2017-2018 and fully implemented in Fall 2018.

When students complete their course evaluations, they are presented with the six common questions first, followed by any additional targeted questions used by their college or academic unit, and then the three Ohio Senate Bill 1 (SB1) questions. When targeted questions have not been assigned, students are presented with the common questions followed by the SB1 questions. In classes that are co-taught, students are presented with drop-down lists that display their instructors’ names. Students can use these drop-down lists to evaluate each instructor in their course separately for any instructor-specific questions.

Common Questions

The six common questions focus on continuous improvement, and each begins with the statement, “The instructor....” The questions align with four categories, as summarized below:

Category 1: Expectations

1.      The instructor clearly explains course objectives and requirements.

2.      The instructor sets high standards for learning.

Category 2: Feedback and Assessment

3.      The instructor offers helpful and timely feedback throughout the semester.

Category 3: Support for Student Success

4.      The instructor provides opportunities and/or information to help students succeed (for example, tutoring resources, office hours, mentoring, research projects, etc.).

Category 4: Engagement

5.      The instructor encourages student participation (for example, by inviting questions, having discussions, asking students to express their opinions, or other activities).

6.      The instructor creates an environment of respect.

Students are provided the following response options for all six common questions:

Response                       Score
Strongly Disagree              1
Disagree                       2
Neither Agree nor Disagree     3
Agree                          4
Strongly Agree                 5

Students also can provide comments on each of the six questions.

Accessing the Course Evaluation System

The University-wide Evaluation of Teaching and Learning is administered through the Watermark Course Evaluations & Surveys (CES) platform. Instructors can access CES to track response rates for active projects and access results for previous projects.

The system can be accessed directly using BGSU’s single sign-on or through Canvas.

Accessing via BGSU Single Sign-On

Use the BGSU single sign-on link to access Course Evaluations: https://bgsu.evaluationkit.com.

When you log in using the BGSU single sign-on link, the default account level is “Student/Respondent.” You will need to use the dropdown in the upper-right corner to switch to “Instructor” to access your evaluation data.

Screenshot of the Watermark Course Evaluations & Surveys login screen showing the Student Course Evaluation Dashboard. The page displays a 'My Surveys' heading. In the upper-right corner, the user-type dropdown currently reads 'Student/Respondent' and is expanded to reveal two additional role options: Administrator and Instructor. A callout annotation points to the dropdown with the instruction 'Click on the User-type dropdown and select Instructor.' The logged-in user is shown as Freida Falcon. A red warning banner at the bottom of the screen advises users that the application does not fully integrate with BGSU Single Sign-On and to close their browser upon completion.

Accessing through Canvas

To access course evaluation results via Canvas:

1.      Click on Account.

2.      Click on Settings.

3.      Click on Course Evaluations.

Note: When logging in through Canvas, the system may automatically determine your role (Instructor or Student) based on your Canvas account. If the system does not default to Instructor, use the dropdown in the upper-right corner to switch roles.

Course Evaluations Homepage

After logging into the system as an instructor, the home screen displays three main features:

Most Recent Project Results: A quick-access list of up to five recent course evaluation projects with their end dates, results start dates, and results status. Click on any Project Name to access its evaluation results.

Response Rate Tracker: Displays response rate information for any deployed, in-progress evaluation project. When no project is currently in progress, this widget displays “No Project Found.”

My Surveys: Displays any active surveys assigned to your Student account, if you are taking any courses.

Screenshot of the Course Evaluations & Surveys homepage after logging in as an Instructor. The left side of the screen displays the Most Recent Project Results section, showing a list of courses from Fall 2025 and Fall 2024 with their course codes, titles, and unique IDs. The right side displays the Response Rate Tracker widget showing Spring 2026 with a 0.00% response rate and 0 out of 111 responses. A 'My Surveys' button is also visible, which displays any active surveys assigned to the user's Student account. Three callout annotations label each feature: 'Project Results — Displays courses from the most recent project results,' 'My Surveys — Displays any active surveys assigned to your Student account,' and 'Response Rate Tracker — Displays Response Rates for any surveys that are currently in progress.'

Best practice: To access evaluation results, click the Results menu item in the top navigation bar, then select Project Results to view and search across all available projects.

When Results Become Available

At the end of each semester, results are available for department administrators, office staff with Report Administrator account access, and instructors to review the day after the Registrar posts final grades in CSS.

All results are released at the same time, regardless of when the course’s evaluation period concluded. Results for courses taught during early-term sessions (e.g., 6, 7, or 8-week) are not available until the end of the semester. As grades for those courses can be modified through the end of the semester, evaluation results are withheld until grades are due.

Accessing Evaluation Results

Navigating to Results

To view course evaluation results, after logging into the course evaluation system:

1.      Click on Results.

2.      Click on Project Results.

3.      Select a semester evaluation project to access the course evaluation data.

Screenshot showing how to navigate to the Project Results page in the Course Evaluations & Surveys system. The Results dropdown menu in the top navigation bar is expanded, revealing menu options including Results Home, Response Rate Tracker, Project Results (highlighted), Instructor Results, Report Builder, and Results Feedback. Below the menu, the Project Results page is visible with a Search Projects form. The Project Results table lists evaluation projects including Fall 2025 and Summer 2025. Three numbered callout annotations guide the user: '1. Select Results,' '2. Select Project Results,' and '3. Select a Project.'

Instructors are provided with a list of all courses in which they were evaluated as an instructor of record for the selected project.

Screenshot of the Project Results page for Fall 2025 showing an instructor's course list. The Project Results table displays three course sections with their Course Code, Title, Unique ID, and a Report download icon in the rightmost column. Course codes shown are FALC10001001, FALC10001002, and FALC1000401W. A Batch Report link appears above the table. No callout annotations are present.

Downloading Reports

Reports are downloaded from the Project Results page either individually by clicking on the Report Icon, or as a group using the Batch Report link.  The course evaluation system provides the same set of reports for each course, regardless of which method is used.

Screenshot of the Project Results page for Fall 2025, highlighting the two methods available for downloading reports. The same three-course list is displayed. Two callout annotations label the interface elements: 'Batch Report — Combined results for selected sections' pointing to the Batch Report link above the table, and 'Report Link — Generates reports for individual course sections' pointing to the download icon in the rightmost column.

Generating Individual Instructor Reports

To generate a course evaluation report for a single class section, click the Report Icon and select a report format from the dropdown menu.  Descriptions of the reports are provided in the Report Formats section of this manual.

Screenshot of the Project Results page for Fall 2025, showing the report format dropdown menu that appears after clicking the Report icon for a single course section. The dropdown menu reveals six format options: Detailed Report, Detailed Report + Comments (highlighted in purple), Short Report, Short Report + Comments, Raw Data, and Feedback. A callout annotation reads '4. Click the View Icon to generate individual reports.'

Batch Reports

The Batch Reporting feature can be used to download multiple reports simultaneously or to aggregate data from more than one section or course into a single report.

Using the Batch Report Feature

To combine results, first select the courses from which data are required, and then select a batch report type.

1.      Select courses to be included in the report by checking the box next to their name.

2.      Click the Batch Report link.

3.      Provide a Report Name.

4.      Select a Report Type.

5.      Select a Delivery Type.

6.      Click Go to generate the report.

Screenshot demonstrating the complete Batch Report workflow on the Project Results page for Fall 2025. The upper-left portion shows the Project Results table with two courses selected with orange checkboxes: PSYC10108101 and PSYC10108301. The Batch Report dialog box is displayed in the center of the screen. The dialog contains a Report Name field (populated with 'Falcon, Frieda Fall 2025'), a Report Type dropdown (set to 'Detailed Report'), and three delivery type radio buttons: 'Download Multiple Reports as ZIP File for Selected Courses' (selected), 'Merge Multiple Reports into one PDF for Selected Courses,' and 'Aggregate Data for Selected Items into One Report.' Cancel and GO buttons appear at the bottom. Six numbered callout annotations walk through each step: '1. Select Courses to combine in the Batch Report,' '2. Click Batch Report icon,' '3. Name the report,' '4. Select a Report Type,' '5. Select a Delivery Type,' and '6. Click Go.'

The report types available for batch reports include Detailed Report, Detailed Report + Comments, Short Report, Short Report + Comments, and Raw Data.

Three delivery types are available:

1.      Download Multiple Reports as Zip File: Separate PDF reports will be produced for each of the selected courses and saved into a .zip file.

2.      Merge Multiple Reports into one PDF: Separate reports will be created for each of the selected courses or instructors and combined one after the other to form a continuous, multi-page PDF file.

3.      Aggregate Data for Selected Items into One Report: A single summary report will be created by combining data from each of the selected courses, sections, or instructors to form a composite. This option will combine the data obtained from each of the course evaluation questions to create aggregated frequencies, percentages, means, standard deviations, and medians. This report also will provide the means, standard deviations, and medians for BGSU overall and the college for each of the six university-wide questions.
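The third delivery type can be illustrated with a short sketch. This is not the CES implementation; the section frequencies below are made up, and the sketch shows only the idea named above: response frequencies from each selected section are pooled first, and the summary statistics are then computed from the combined counts.

```python
from collections import Counter

# Hypothetical response frequencies (score -> count) for two sections.
section_a = Counter({5: 10, 4: 5, 3: 1})
section_b = Counter({5: 6, 4: 8, 2: 1})

# Aggregate delivery type: pool the frequencies across sections,
# then compute a single composite mean from the combined counts.
combined = section_a + section_b
total_responses = sum(combined.values())
aggregate_mean = round(
    sum(score * n for score, n in combined.items()) / total_responses, 2
)
print(combined, aggregate_mean)
```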

Note: When batch files are requested, the course evaluation system generates an email that provides a link to the results. 

Understanding and Using Reports

Report Formats

Five types of report formats are available from the Project Results page:

1.      Detailed Report — A PDF report which provides summary statistics obtained for each question in the course evaluation, including frequencies, standard deviation, median, and mean scores shown at the course, college, and university levels.

2.      Detailed Report + Comments — A PDF report which includes student comments in addition to the summative statistics described above.

3.      Short Report — A PDF report that displays summary statistics, including mean scores for each question in a condensed format at the course, college, and university levels.

4.      Short Report + Comments — A PDF report that includes student comments in addition to the summative statistics provided in the condensed report described above.

5.      Raw Data — An Excel data file which includes anonymized results at the individual response level. The raw data file can be used in Microsoft Excel or Power BI to generate reports that go beyond the four packaged reports which are outlined above.

In all reports, results for the six common questions are shown first, followed by any targeted questions the college or department/unit might be using, and then the three SB1 questions.

Note: The Detailed Report + Comments report is recommended as the default report for both instructors and administrators because it includes student comments in addition to question-level response frequencies and mean scores. The six common questions all include an open-ended Comments response option, and those data are viewable only in the “+ Comments” report options. For colleges and departments/units that use open-ended targeted questions, the “+ Comments” report options must also be used to view the qualitative results.

Detailed Report Format

There are two types of Detailed Report formats (Detailed Report and Detailed Report + Comments), both of which may be downloaded from the evaluation system as PDFs. Each format produces a report that includes a header section, a results section, and a mean of means section.

The header section displays the name of the course, identifies the instructor of record, and shows the overall response rate (number of students who participated in the course evaluation divided by the total number of students enrolled in the course).
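The header's response rate can be reproduced with one line of arithmetic. The figures in this sketch are taken from the sample screenshots used throughout this guide (16 of 34 students):

```python
# Overall response rate as shown in the report header:
# number of respondents divided by total course enrollment.
respondents, enrolled = 16, 34  # figures from the sample screenshots
rate = round(respondents / enrolled * 100, 2)
print(f"{respondents}/{enrolled} ({rate}%)")  # 16/34 (47.06%)
```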

Annotated screenshot of the header section from a Detailed Report + Comments PDF. The header displays 'Bowling Green State University Fall 2025' in the upper-left with the BGSU logo in the upper-right. Below the header, the report shows Course: FALC10001001 with the full Canvas course title, Instructor: Falcon, Frieda, and Response Rate: 16/34 (47.06%). Three callout annotations identify key elements: 'Evaluation Period' pointing to the term and year, 'Canvas Course Title & Instructor Name' pointing to the course and instructor lines, and 'Unduplicated Response Rate' pointing to the response rate value.

Note: When courses are co-taught, all instructor names appear in the Instructor line and the instructor for whom the report is being generated is indicated by an asterisk. The response rate shown in the header reflects the percentage of enrolled students who submitted a response for the specific instructor.

The results section summarizes response statistics for each course evaluation question, including the question text, response options, response option frequencies (number of students selecting each response option), question response rate (number of students who answered the question divided by the total number of students in the course), mean score (average), standard deviation (measure of variability), and median (middle value). Summary results for each item are displayed side-by-side at the course, university, and college levels. Student comments are included only when the Detailed Report + Comments report has been selected.

Annotated screenshot of the results section from a Detailed Report + Comments PDF, showing data for question 1: 'The instructor clearly explains course objectives and requirements.' The display includes a response option table with columns for Response Option, Weight (1–5), Frequency, Percent, and Percent Responses (with a horizontal bar chart). Below the table, course-level summary statistics show Response Rate (16/34, 47.06%), Mean (4.50), STD (0.82), and Median (5.00). A bar chart compares the means across Question (4.50), BGSU (4.34), and College (4.28). University-level and college-level comparison statistics are shown in separate rows. At the bottom, a student comment reads 'Dr. Falcon clearly explained her expectations for what we would be learning and when assignments would be due.' Callout annotations identify each section: 'Question Text,' 'Response Options with Frequencies & Percents,' 'Unduplicated Response Rate,' 'Course-level Summary Statistics,' 'Comparison Statistics shown at the University and College levels,' and 'Student Comments.'

The mean of means section is shown at the bottom of the Detailed Report and provides a combined mean score for all six common questions.

Screenshot of the Mean of Means section from the bottom of a Detailed Report + Comments PDF. The section is labeled 'DETAILED REPORT + COMMENTS MEAN OF MEANS SECTION' and contains a single-row table with an orange header bar. The row labeled 'Common Question Mean (Questions 1 - 6)' shows a Mean value of 4.49.

Short Report Format

As with the Detailed Report, there are two types of Short Report formats (Short Report and Short Report + Comments). The header section of the Short Report identifies the name of the course, instructor of record, and response rate. The results section presents results for each question in a linear fashion with response frequencies first, followed by mean scores obtained at the university and college levels, and then the mean, standard deviation, and median score obtained at the course level.

Annotated screenshot of a complete Short Report + Comments PDF. The header section displays 'Bowling Green State University Fall 2025' with the BGSU logo, followed by the course code, title, instructor name (Falcon, Frieda), and response rate (16/34, 47.06%). The results section shows a condensed table format for question 1, displaying both counts (n) and percentages for response options 1 through 5. Columns B1 and B2 show the university average (4.34) and college average (4.28) respectively. The rightmost columns display the course-level Mean (4.50), Std (0.82), and Median (5.00). A scale legend identifies B1 as BGSU and B2 as College. A student comment appears at the bottom. Callout annotations identify each element: 'Evaluation Period,' 'Canvas Course Title & Instructor Name,' 'Unduplicated Response Rate,' 'Response Frequencies,' 'B1 = University Average,' 'B2 = College Average,' 'Question-level Summary,' and 'Student Comments.'

Raw Data Report

Raw data output is available for each course or selection of courses using the Raw Data Report. When this report format is selected, the course evaluation system produces an Excel file with anonymized evaluation data. Each row in the raw data file represents a single student’s response to a single instructor within a single course section.  In co-taught courses, a student who evaluates both instructors will have two rows—one for each instructor evaluated—so the total row count may differ from the enrollment figure for that course.

The raw data file contains three categories of columns:

1.      Course and instructor identification: These columns identify the context for each response, including the hierarchy path, hierarchy level name, course code, course title, unique course ID, survey start and end dates, instructor name, and instructor username.

2.      Course-level summary statistics: These columns—including Enrollments, Respondents, and ResponseRate—represent totals at the course level. These values are the same on every row for a given course and reflect the true course-level counts. They are not affected by co-taught courses; even when a student evaluates both instructors, the enrollment and respondent counts are not duplicated in these columns.

3.      Individual response data: These columns contain the student’s question-by-question responses, including quantitative ratings and open-ended comments. Each question occupies its own column, labeled Question 1, Question 2, and so on. The raw data file includes a QuestionMapper tab that maps each column label to its full question text, making it easy to identify which responses correspond to the common questions, targeted questions, and SB1 questions.
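Because each row is one student's response to one instructor, short scripts can reproduce the packaged summary statistics from the raw data file. The sketch below uses only Python's standard library and made-up rows; the column labels ('Instructor', 'Question 1') are illustrative, so check the QuestionMapper tab in your own file for the actual labels, and note that CES may use a different standard-deviation convention (sample vs. population) than the population formula used here.

```python
from statistics import mean, median, pstdev

# Hypothetical rows mimicking the Raw Data export: one row per
# student response per instructor. Column names are illustrative.
rows = [
    {"Instructor": "Falcon, Frieda", "Question 1": 5, "Question 2": 4},
    {"Instructor": "Falcon, Frieda", "Question 1": 4, "Question 2": 4},
    {"Instructor": "Falcon, Frieda", "Question 1": 5, "Question 2": 3},
]

def question_summary(rows, question):
    """Mean, population standard deviation, and median for one question."""
    scores = [r[question] for r in rows if r.get(question) is not None]
    return {
        "mean": round(mean(scores), 2),
        "std": round(pstdev(scores), 2),
        "median": median(scores),
    }

print(question_summary(rows, "Question 1"))
```

In a real workflow the rows would come from the Excel file itself (for example via a spreadsheet library), filtered by the Instructor column first when a course is co-taught, so that each instructor's statistics reflect only the rows addressed to them.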

Common Question, Targeted Question, and SB1 Reporting

Because the common questions are included in all course evaluations, the comparison statistics shown on the Detailed and Short Reports at the course, university, and college levels reflect different sets of respondents and produce meaningful comparisons. Most targeted questions, however, apply only to specific colleges or departments/units, so the university- and college-level comparison values for those questions are limited to respondents within the assigning unit. As a result, the university and college comparisons may appear identical because both reflect the same pool of respondents.

Annotated screenshot from a Detailed Report showing the results section for a targeted question (question 8: 'The instructor clearly explained complex material'). The display shows the same statistical layout as the common questions, with response option frequencies and comparison statistics. A callout annotation highlights that the university-level and college-level comparison values (both showing 630 respondents, Mean 4.36, STD 0.88, Median 5.00) are identical, with the note 'University and College comparisons are identical when reported for targeted questions,' illustrating that because targeted questions are only assigned to specific colleges or departments, the comparison statistics at the university and college levels reflect the same pool of respondents.

The SB1 questions are a special case. While they are technically implemented as a targeted survey, they are assigned to all courses university-wide. Therefore, the university and college comparison values for SB1 questions reflect all courses, unlike other targeted surveys where comparisons are limited to the assigning unit.

Mean of Means Calculation

A Mean of Means calculation is provided at the end of the Detailed and Short Report formats to show the combined mean score obtained on the evaluation for all six common questions.

Screenshot of the Mean of Means section from the bottom of a Detailed Report, showing a single-row table with an orange header bar. The row labeled 'Common Question Mean' displays a Mean value of 4.86. This section demonstrates that only the six common questions are included in the Mean of Means calculation; targeted questions and SB1 questions are excluded.

Note: The Mean of Means calculation averages only the six university-wide common questions. It does not include results from any targeted questions, including the SB1 questions.
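The calculation itself is a simple average of the six question-level means. In this illustrative sketch the values are made up and are not from any real report:

```python
# Mean of Means: average the six common-question means only.
# Targeted and SB1 question means are deliberately excluded.
common_question_means = [4.50, 4.34, 4.28, 4.62, 4.55, 4.48]  # Questions 1-6
targeted_question_means = [4.36, 4.21]  # not part of the calculation

mean_of_means = round(sum(common_question_means) / len(common_question_means), 2)
print(mean_of_means)
```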

Response Rate Tracker

The Response Rate Tracker allows instructors to monitor response rates in real time during an evaluation project and encourage students to complete course evaluations.

Accessing the Response Rate Tracker

Navigate to Results → Response Rate Tracker from the top navigation bar.

1.      Select Response Rate Tracker from the Results menu.

2.      Select a Project from the Response Rates table.

3.      Select a Hierarchy from the Response Rates table.

4.      Review the resulting response rates by course section.

Annotated screenshot showing how to access and use the Response Rate Tracker. The top portion shows the Results dropdown menu in the navigation bar with 'Response Rate Tracker' highlighted. Below, the Project Response Rates page is displayed with a Search form and a Response Rates table showing the Spring 2026 project. Beneath that, a second view shows the drill-down after selecting the Spring 2026 project: the Node Response Rates table displays hierarchy levels with their enrollments, responded counts, and response rates. Two hierarchy levels are visible: Bowling Green State University (1,499 enrollments, 29 responded, 1.93%) and CCP (40 enrollments, 0 responded, 0%). Three numbered callout annotations guide the user: '1. Select Response Rate Tracker from the Results menu,' '2. Select a Project,' and '3. Select a Hierarchy Node.'

Response rates for any in-progress evaluation project can also be accessed using the Response Rate Tracker widget on the home screen.

Viewing Response Rates

When viewing a project in the Response Rate Tracker, you can view the project start and end dates, overall survey enrollments, the number of respondents who have submitted evaluations, and the overall response rate (number of enrolled students who have submitted a response divided by total enrollments).

Screenshot of the Response Rate Tracker at the course level, showing individual course response rates after drilling down into a hierarchy node. The Courses table displays columns for Level, Code, Title, Unique ID, Instructor, Enrollments, Responded, Response Rate, and View. Four course sections are visible, all taught by Frieda Falcon within the Bowling Green State University hierarchy level. Course-level data includes FALC10001001 (42 enrollments, 22 responded, 52.38%), FALC10001002 (18 enrollments, 7 responded, 38.89%), FALC1000401W (29 enrollments, 0 responded, 0%), and FALC20201001 (50 enrollments, 0 responded, 0%). Each row includes a View icon in the rightmost column.

How Evaluation Projects Are Built and Administered

This section provides background on how course evaluation projects are created, how courses and instructors are assigned, and how evaluations are administered each semester. While instructors do not manage these processes directly, understanding them can help clarify how courses and instructors appear in the evaluation and how results are generated.

Project Creation and Deployment

A new course evaluation project is created each semester. Much of the project setup work (i.e., creating the project shell, configuring communications, setting up surveys, and establishing evaluation dates) begins before census date. The course and enrollment import occurs after the 15th day of classes to ensure that instructors have had time to create their Canvas course shells and assign the appropriate instructors of record. Project setup includes the following tasks:

  • Importing courses, instructors of record, and student rosters from Canvas.
  • Processing course data through the Office of Academic Assessment’s Access Application, which assigns courses to the correct hierarchy levels and identifies courses to be excluded from the evaluation.
  • Generating Department Acceptance Reports (DARs) and distributing them to department administrators for review.
  • Reviewing and processing corrections to course and instructor assignments based on feedback from department administrators.
  • Assigning targeted survey questions to courses.
  • Specifying the administration dates during which students will complete course evaluations.
  • Defining the date, after grades have posted, on which course evaluation results will become available to administrators and instructors.
  • Configuring project communications.

Course, Instructor, and Student Import

Course Import: Courses are imported once (and only once) from Canvas when the evaluation project is created. This initial import establishes the mapping between CES and Canvas that enables daily enrollment refresh. Following this initial import, courses are manually updated based upon information provided to the Office of Academic Assessment by administrators from the academic offices. Course adjustments may include reassigning courses to the correct hierarchy node, removing courses that should not be evaluated, and adding courses that were not captured in the initial import.

Instructor Import: Any individual assigned the “Instructor” role in a Canvas course will be imported as an instructor of record into the course evaluation project, including all co-instructors in co-taught courses. Faculty are advised to assign additional helpers in their courses a role other than Instructor—such as Observer—to prevent them from being pulled into the evaluation as an instructor of record. Following the initial import, instructor assignments are manually updated based upon information received from department administrators. During the DAR review, administrators typically request corrections such as removing instructors who should not be evaluated for a given course. For College Credit Plus (CCP) dual-enrollment courses, instructor assignments are obtained from the Office of Admissions and added after the initial import.

Note: The Office of Academic Assessment uses Canvas to pull instructor information. Because instructors sometimes grant the Instructor role in their Canvas shells to individuals who are not teaching the course, and because instructor assignments in CSS do not always reflect who is actually teaching, it is important that instructors carefully review all their course evaluation messages to ensure that the lists of courses for which they will be evaluated are accurate. Only instructors of record are assigned to courses in the University-wide evaluation. Teaching assistants are not included.

Student Import: Rosters of enrolled students are loaded into the course evaluation system following the 15th day of courses. Student enrollments are refreshed daily from Canvas up until the completion of the evaluation period for each course. The daily enrollment refresh is configured per course and is tied to the Canvas enrollment roster, which means students who are added to or dropped from a course in the Campus Solutions System (CSS), and subsequently reflected in Canvas, will automatically appear in or be removed from the evaluation.

Students who drop a course are removed from the course evaluation list once the drop has been fully processed through CSS and reflected in Canvas. Students who withdraw from a course, however, are still invited to complete an evaluation. It is expected that students who have withdrawn will restrict their comments and evaluation to the period during which they were actively attending the course.

Instructor Course Review

Approximately five days before evaluations begin, instructors receive an automated email listing all courses for which they will be evaluated as instructor of record. This email lists all courses assigned to the instructor for the entire term, including courses in sessions that have not yet begun. This is by design so that instructors can review their full course list and report any errors before evaluations begin.

Instructors should review this message closely and contact their academic office, or email the Office of Academic Assessment (assessment@bgsu.edu), if they notice any errors in their list of assigned courses. Corrections to instructor assignments must be complete before the administration period begins.

Note: Changes to courses, instructors, and targeted question assignments must be made prior to the start of the evaluation period. Once students begin submitting evaluations for a course, its settings, including the instructor of record and assignment of targeted questions, cannot be modified.

Evaluation Administration Dates

Students may complete course evaluations only during prescribed time periods, which depend on the semester and the course session. Evaluations are administered on the following timelines:

Fall and Spring Semesters

  • 15-week sessions: 2 weeks prior to the final exam week.

Note: Only the 15-week session has a designated finals week.

  • 11-week sessions: Last 2 weeks of the session.
  • 7-week sessions: Last week of each session.

Summer Semesters

  • 6, 8, or 12-week sessions: Last week of each session.
  • Sessions shorter than 6 weeks (3 or 4 weeks): Condensed evaluation period.

Some programs operate on non-standard academic calendars and may have evaluation timing that differs from the standard schedule. The Office of Academic Assessment coordinates with these departments individually to determine appropriate evaluation dates.

Course Evaluation Administration

The University-wide Evaluation of Teaching and Learning is administered online. Students may access their evaluations by any of the following methods:

  • Following links in their announcement emails.
  • Logging in through their Canvas accounts.
  • Navigating directly to https://bgsu.evaluationkit.com and logging in with their BGSU single sign-on credentials.

Note: While course evaluations are available, students receive a reminder every time they log into Canvas.

Messaging

Email messages are configured to send automatically through the evaluation project based upon course evaluation administration dates. The following communications are sent to instructors and students throughout the evaluation cycle:

Upcoming Course Evaluation Email: Instructors receive an announcement message 5 days prior to each evaluation start date listing all courses assigned to them for the entire term, including courses in sessions that have not yet begun, so that they can review their full course list and report any errors. Corrections to instructor assignments must be complete before the administration period begins.

Course Evaluation Email: Students and instructors receive an announcement message on the first day that evaluations become available for each session.

Non-Respondent Emails: Non-responding students receive up to three reminder messages during each evaluation period. The first reminder is sent 2 days after the evaluation start date, the second is sent 2 days before the end date, and the final reminder is sent 1 day before the end date. For summer sessions with shorter evaluation periods, the reminder schedule is condensed.
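
The standard reminder schedule above can be expressed as simple date arithmetic. The sketch below is illustrative only (the function name is an assumption, and it does not model the condensed summer schedule):

```python
# Illustrative sketch of the standard non-respondent reminder schedule:
# 2 days after the start date, 2 days before the end date, and 1 day
# before the end date. Not the evaluation system's actual code.
from datetime import date, timedelta

def reminder_dates(start: date, end: date) -> list[date]:
    """Return the three reminder dates for a standard evaluation period."""
    return [start + timedelta(days=2),
            end - timedelta(days=2),
            end - timedelta(days=1)]

# Example: a two-week evaluation window.
print(reminder_dates(date(2026, 4, 27), date(2026, 5, 10)))
# [datetime.date(2026, 4, 29), datetime.date(2026, 5, 8), datetime.date(2026, 5, 9)]
```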

Certificate of Completion Email: A certificate of completion is emailed to students after they submit each course evaluation. Only one certificate email is sent per evaluation, and it cannot be re-sent if the student deletes or loses it; however, the Office of Academic Assessment can send the student a confirmation email if needed.

Note: When desired, students may forward their Certificate of Completion email to their instructors as evidence that they have completed an evaluation of their course. The Certificate of Completion does not include information that would identify responses as belonging to an individual respondent.

Results Notification Email: Instructors receive an email stating that their evaluation results are available. This message is typically sent the day after grades are due.

Resetting Student Responses

In some situations, a student may need to revise or redo a course evaluation after it has been submitted. For example, if a student accidentally selects “Disagree” when they meant to select “Agree”, the Office of Academic Assessment can reset or re-open the student’s response at their request, provided the request is made during the course evaluation period.

Two options are available:

1.       Re-open: Makes the survey active again for the student without deleting their previous responses. This option can be used during the active evaluation period, when the student needs to make minor changes or additions to an evaluation that has already been submitted.

2.       Reset: Deletes the student’s responses for all questions on the course evaluation and allows them to retake the entire survey. This option can be used during the active evaluation period, when the student needs to start over completely for a given course.

Students cannot edit their responses after the evaluation period for their course session has closed. If students have concerns about their responses after the course evaluation period has ended, they can contact the Office of Academic Assessment and request that their responses be deleted. Course evaluation responses can only be deleted before results are made available to faculty and administrators. Students who wish to provide feedback about a course after an evaluation period has expired can contact their department directly, with the understanding that this is not an anonymous way of providing such feedback.

Students who need a response reset or re-opened during an active evaluation period should contact the Office of Academic Assessment (assessment@bgsu.edu). Department administrators and instructors can also forward student requests to that address.

Student Anonymity

The anonymity of students who respond to the course evaluation is protected to the degree possible. Several measures are in place to safeguard student anonymity.

Enrollment thresholds: Course sections with fewer than 4 students enrolled may be removed from the evaluation to protect the anonymity of respondents. Departments are given the opportunity to review these sections through the Department Acceptance Report and can request their removal or retention. When evaluating results from course sections with small enrollments, the best practice is to use the batch reporting feature to aggregate results across multiple sections, which helps protect the anonymity of respondents.

Cross-listing combined sections: Starting in Summer 2023, combined courses are cross-listed in the evaluation to protect student anonymity in smaller sections. Cross-listing is applied when the course sections meet all of the following criteria: same career, same subject, same four-digit catalog number, same instructor, and same meeting details, and the smaller section has fewer than 4 students enrolled. Evaluations are always administered separately to students in each section regardless of whether a cross-list is created. Cross-listing controls how results are presented: when a cross-list is created, results from both sections are aggregated for anyone outside the Office of Academic Assessment. When both sections have 4 or more students enrolled, no cross-list is created, and results are reported separately. The Office of Academic Assessment can view each section’s results separately for purposes of internal review. If departments would like cross-listed results separated by section, the Office of Academic Assessment can provide those results upon request.
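
The cross-listing criteria above amount to a simple eligibility check. The following sketch is illustrative only; the Section type and its field names are assumed for explanation, not the evaluation system's actual data model:

```python
# Hedged sketch of the cross-listing criteria described above.
# The Section dataclass and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Section:
    career: str
    subject: str
    catalog_number: str   # four-digit catalog number
    instructor: str
    meeting_details: str
    enrollment: int

def cross_list(a: Section, b: Section) -> bool:
    """True when results from two combined sections would be aggregated.

    All identifying fields must match, and the smaller section must
    have fewer than 4 students enrolled.
    """
    same_course = (a.career == b.career
                   and a.subject == b.subject
                   and a.catalog_number == b.catalog_number
                   and a.instructor == b.instructor
                   and a.meeting_details == b.meeting_details)
    return same_course and min(a.enrollment, b.enrollment) < 4

# Example: identical sections where the smaller has 3 students.
small = Section("UG", "MATH", "1230", "Smith", "MWF 9:30", 3)
large = Section("UG", "MATH", "1230", "Smith", "MWF 9:30", 22)
print(cross_list(small, large))  # True
```

When both sections have 4 or more students, the check returns False and results are reported separately, matching the rule described above.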

Results timing: Course evaluation results are not made available to administrators or faculty until after grades have been posted at the end of each semester.

Ohio Senate Bill 1 Questions

Beginning in Summer 2025, three additional questions were added to every course evaluation in response to Ohio Senate Bill 1 requirements. These questions were developed by the Ohio Department of Higher Education Chancellor and must be asked exactly as worded by the state. The SB1 questions are included on every course evaluation across all colleges, departments, and terms.

SB1 Question 1:

Does the faculty member create a classroom atmosphere free of political, racial, gender, and religious bias?

Response options: Yes, No

SB1 Question 2:

Are students encouraged to discuss varying opinions and viewpoints in class?

Response options: Yes, No, Not applicable

SB1 Question 3:

On a scale of 1–10, how effective are the teaching methods of this faculty member? With 1 being not effective at all and 10 being extremely effective.

Response options: 1–10

Note: The SB1 questions are implemented as a targeted survey assigned to all hierarchy nodes. Although they appear alongside other targeted questions in evaluation reports, they are required on every evaluation and cannot be removed. Because SB1 questions use different response scales (Yes/No and 1–10) than the common questions (1–5 Likert scale), they are reported separately in summary statistics and are not included in the Mean of Means calculation.
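
The exclusion described in the note can be illustrated with a small calculation: the Mean of Means averages only the per-question means of the common (1-5 Likert) questions, leaving the differently scaled SB1 items out. The function, labels, and values below are made up for the example:

```python
# Illustrative Mean of Means calculation excluding SB1 items, as
# described in the note above. Question labels and values are
# hypothetical examples, not real evaluation data.

def mean_of_means(question_means: dict[str, float],
                  sb1_questions: set[str]) -> float:
    """Average the per-question means, skipping SB1 questions."""
    included = [m for q, m in question_means.items()
                if q not in sb1_questions]
    return sum(included) / len(included)

# The SB1 item (on a 1-10 scale) is excluded from the average.
means = {"Q1": 4.0, "Q2": 4.5, "Q3": 5.0, "SB1-3": 8.7}
print(mean_of_means(means, sb1_questions={"SB1-1", "SB1-2", "SB1-3"}))
# 4.5
```

Including the 1-10 SB1 item in the same average would inflate the result, which is why the scales are reported separately.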

Quick Reference

Key Contacts

  • Course or instructor assignment errors: your academic office or assessment@bgsu.edu
  • Reset or re-open a student response: department administrator forwards to assessment@bgsu.edu
  • Technical issues: assessment@bgsu.edu
  • General questions: assessment@bgsu.edu

Common Instructor Tasks

  • Review your course evaluation assignments (when you receive the Upcoming Course Evaluation email): check the list of courses and report any errors to your academic office or assessment@bgsu.edu.
  • Monitor response rates (during evaluation periods): log in to CES and check the Response Rate Tracker.
  • Encourage student participation (during evaluation periods): dedicate class time, explain the value of feedback, and remind students that responses are anonymous.
  • Access evaluation results (after grades are posted): log in to CES and navigate to Results > Project Results.
  • Combine results from multiple sections (after grades are posted): use the Batch Report feature in Project Results.

Troubleshooting

I can’t see my course evaluation data: After logging in, check that your role is set to Instructor (not Student) using the dropdown in the upper-right corner. If you still cannot see your data, contact the Office of Academic Assessment at assessment@bgsu.edu.

A course is missing from my evaluation list: The course may have been auto-deleted based on institutional rules (e.g., independent studies, labs, practica), or it may not have been assigned to you as instructor of record in the system. Contact your academic office or the Office of Academic Assessment.

I am listed as the instructor for a course I am not teaching: Contact your academic office or the Office of Academic Assessment before the evaluation period begins. Instructor assignments cannot be changed after students have begun submitting evaluations.

A student needs to redo or revise their evaluation: The student’s response can be re-opened or reset, but only while the evaluation period for their course is still active. After the evaluation period has closed, responses can only be deleted upon the student’s request—not edited. Contact the Office of Academic Assessment with the student’s name, course, and whether a re-open, reset, or deletion is needed.

A student says they didn’t receive a Certificate of Completion: Certificate emails are sent automatically upon submission and cannot be re-triggered. Advise the student to check their email inbox (including spam/junk folders) for the certificate. Only one certificate is sent per evaluation.

Additional Information

Additional information or a Word version of this guide may be obtained by contacting BGSU’s Office of Academic Assessment at assessment@bgsu.edu.

Other course evaluation resources are available at https://www.bgsu.edu/institutional-effectiveness/office-of-academic-assessment/evaluation-of-teaching-learning.html

Updated: 04/23/2026 02:05PM