Video Script: Data Analysis and Interpretation
Welcome to this video on data analysis and interpretation in relation to Program Review at Bowling Green State University. This video offers practical tips for engaging in relevant and meaningful data analysis and interpretation for successful program reviews. Specifically, we will walk you through:
1. An overview of the general steps involved in data analysis and interpretation;
2. Then, we will briefly cover two data analysis and interpretation protocols or tools – specifically (a) the Data Driven Dialogue Protocol and (b) the ATLAS Looking at Data Protocol.
3. We will conclude this session with a description of common issues related to data clarity and possible solutions.
SOME DEFINITIONS
Before diving into the bulk of this presentation, let us establish a few definitions.
Poor Data Quality
Causes of poor data quality include system problems, human error, and outdated data, systems, software, or repositories.
Real-Time Data
Real-time data is presented as soon as it is acquired. This type of data is useful when decisions require up-to-the-minute information.
Structured versus Unstructured Data
Structured data comprises clearly defined data types with patterns that make them easily searchable.
Structured data usually resides in relational databases (RDBMS). Fields store length-delineated data like phone numbers, Social Security numbers, or ZIP codes. Data may be human- or machine-generated, as long as the data is created within an RDBMS structure. This format is eminently searchable, both with human-generated queries and via algorithms using types of data and field names, such as alphabetical or numeric, currency, or date.
Unstructured data is “everything else” – data that is usually not as easily searchable. Unstructured data has an internal structure but is not organized via predefined data models or schemas. It may be textual or non-textual, and human- or machine-generated. Examples include emails, audio, video, social media posts, websites, etc.
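To make the structured/unstructured contrast concrete, here is a minimal sketch of how structured data becomes searchable by field name, using Python's built-in sqlite3 module. The table, names, and values are invented for illustration only:

```python
import sqlite3

# Fictitious example: structured data lives in typed, named fields in a
# relational database, so it can be queried precisely by field name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, zip TEXT, phone TEXT)")
conn.executemany(
    "INSERT INTO students VALUES (?, ?, ?)",
    [("Ada", "43403", "419-555-0100"), ("Ben", "43402", "419-555-0101")],
)

# A human-written query or an algorithm can filter on any field directly.
rows = conn.execute("SELECT name FROM students WHERE zip = '43403'").fetchall()
print(rows)  # [('Ada',)]
```

An unstructured source such as an email body offers no such named fields; finding the same ZIP code there would require text search or parsing rather than a field-level query.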
BIG Data
Big data is defined as a huge data set that continues to grow at an exponential rate over time. The four fundamental characteristics of big data are volume, variety, velocity, and variability. Put simply, big data is larger, more complex data sets, especially from new data sources. These data sets are so voluminous that traditional data processing software just can’t manage them.
VOLUME refers to the amount of data. With big data, you’ll have to process high volumes of low-density, unstructured data. This can be data of unknown value, such as Twitter data feeds. For some organizations, this might be tens of terabytes of data. VELOCITY is the fast rate at which data is received and acted on. VARIETY refers to the many types of data that are available. For VARIABILITY, consider this: traditional data types were structured and fit neatly in a relational database. With the rise of big data, data comes in new unstructured types.
Data Silos
The lack of a single source of truth may result in data silos or disparate collections of information not effectively shared. A data silo is a repository of data that's controlled by one department or business unit and isolated from the rest of an organization, much like grass and grain in a farm silo are closed off from outside elements. Siloed data typically is stored in a standalone system and often is incompatible with other data sets. That makes it hard for users in other parts of the organization to access and use the data. FOR EXAMPLE, at BGSU, you may receive data sets from different software pulls or sources, including Academic Performance Solutions (APS), the Office of Institutional Research (OIR), PeopleSoft, HCM, CSS, FMS, Slate, Canvas, Tableau, PowerBI, data collected at departmental or college level, etc. At times, these sources may present discrepancies which may lead to a need for data clean-up.
Data cleaning is the process of fixing or removing incorrect, corrupted, incorrectly formatted, duplicate, or incomplete data within a dataset. When combining multiple data sources, there are many opportunities for data to be duplicated or mislabeled. If data is incorrect, outcomes and/or results may be unreliable, even though they may look correct. Effective data governance can break down data silos and enable organizations to obtain purposeful value from their data.
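The data-cleaning steps described above can be sketched in a few lines of Python. This is a hypothetical illustration with invented records and field names, not an actual BGSU process; it drops incomplete rows, fixes formatting, normalizes capitalization, and removes duplicates:

```python
# Fictitious merged record set: one duplicate, one incomplete row,
# and inconsistent formatting.
raw = [
    {"id": "001", "name": " Ada Byron ", "email": "ada@bgsu.edu"},
    {"id": "001", "name": "Ada Byron", "email": "ada@bgsu.edu"},   # duplicate
    {"id": "002", "name": "Ben Ito", "email": None},               # incomplete
    {"id": "003", "name": "CARLA DIAZ", "email": "carla@bgsu.edu"},
]

def clean(records):
    seen, out = set(), []
    for r in records:
        if not all(r.values()):                      # remove incomplete rows
            continue
        r = {k: v.strip() for k, v in r.items()}     # fix stray whitespace
        r["name"] = r["name"].title()                # normalize capitalization
        if r["id"] in seen:                          # remove duplicates by id
            continue
        seen.add(r["id"])
        out.append(r)
    return out

print([r["name"] for r in clean(raw)])  # ['Ada Byron', 'Carla Diaz']
```

Real clean-up across institutional sources is of course more involved, but the same logic applies: define what counts as incomplete, duplicated, or incorrectly formatted, then repair or remove those records before analysis.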
Generally, data collected for program-level assessment fall into two categories: quantitative and qualitative.
· Quantitative data relies on numerical scores or ratings and is helpful in evaluation because it can provide quantifiable results that are easy to calculate and display.
· Qualitative data consists primarily of words and observations, rather than numbers. Qualitative data can come from a variety of sources including open-ended survey questions, focus group notes, essay responses, and student portfolios. Qualitative data can be useful for answering “why” and “how” questions about student performance, satisfaction, motivation, engagement, or experience.
Broadly speaking, why are we concerned with data analysis and interpretation in the first place? Believe it or not, data analysis and interpretation are not a higher-ed invention. Businesses of all kinds use and analyze data to improve decision-making and forecasting, enhance business performance and competitiveness, maximize sales and marketing effectiveness, streamline operational processes, create better customer experiences, drive business agility, lower costs and reduce waste, and raise quality standards. These benefits are applicable to higher education. Specifically, at BGSU, consider how these reasons align with our Strategic Plan:
· To improve decision-making and forecasting → Foundational Objective IV. Initiative 14: Efficiency and alignment. We will align processes, organizational structures and financial budgets to minimize administrative costs.
· To enhance business performance and competitiveness → Foundational Objective IV. Initiative 13: Enhancing value. We will reduce the net cost of a BGSU education for our students by streamlining curriculum, adopting innovative academic affordability initiatives and implementing innovative scholarship programs. OR Initiative 12: Culture of innovation and accountability - We will create a culture of innovation and accountability within our community to empower our people to drive the success of the University in a post-COVID-19 world.
· To maximize sales and marketing effectiveness → Strategic Objective I. Initiative 4: Broadening access. We will extend what we have learned during the global pandemic, and we will leverage technology to expand our ability to meet the needs of students. We will make a BGSU degree more accessible to broader groups of qualified students through partnerships, pathways and innovative programs. OR Strategic Objective II. Initiative 8: Telling our story. We will implement a coordinated and aligned strategic marketing and communication plan that communicates the relevance and importance of BGSU to all constituencies. This will include highlighting our current exemplars of excellence.
· To streamline operational processes → Foundational Objective IV. Initiative 14: Efficiency and alignment. We will align processes, organizational structures and financial budgets to minimize administrative costs.
· To create better customer experiences → Strategic Objective I. Initiative 3: Differentiating the traditional undergraduate experience. We will recognize our position as a comprehensive, residential university and we will strategically leverage our unique strengths as a high-research university with a strong tradition of high-impact practices and student engagement. We will transform our curriculum to ensure mastery of content and application of knowledge, and we will empower our students to intentionally design an educational experience that allows them to take full advantage of all that BGSU has to offer. OR Strategic Objective II. Initiative 7: Innovative engagement. We will creatively engage our extended community of alumni, friends, government officials, families and others to demonstrate the value of BGSU to our region, state and world, as well as to build support for the University.
· To drive business agility → Strategic Objective I. Initiative 1: Right programs that are sustainable. We will continue to evaluate our undergraduate and graduate academic programs to ensure we are providing opportunities for our students that meet their needs, as well as society’s needs. We will continue to support our strong existing programs and develop new programs that are in great demand, in areas such as healthcare and applied sciences.
· To lower costs and reduce waste → Foundational Objective IV. Initiative 14: Efficiency and alignment. We will align processes, organizational structures and financial budgets to minimize administrative costs.
· To raise quality standards → Foundational Objective III. Initiative 10: Teaching and service excellence. We will support faculty and staff in achieving excellence in their jobs to significantly improve the quality of teaching and learning by leveraging technology and innovative pedagogy, as well as professional development of our staff to support their efforts to enhance the University.
Now, what of the relationship between data analysis, interpretation and integration within the program review process at BGSU?
Program review is designed to enhance planning within and among academic units to effectively use campus resources and advance the University’s priorities and mission. The overall goal of program review is to allow academic units, within their normal activities, to articulate their goals and objectives in relation to the university’s priorities and mission through a regular process of internal and external review of qualitative and quantitative information about program activities, demonstration of progress toward achievement of goals, and the use of outcomes for continuous program improvement. A whole set of processes is involved once the goals, objectives, and data are collected. Perhaps the most important of these involve analysis of data and data-driven reflection by members of the unit on their progress toward achieving goals and objectives, and on the appropriateness of continuing certain goals and objectives versus others or the need to develop new ones.
Assessment data provide a means to look at student performance to offer evidence about student learning in the curriculum, provide information about program strengths and weaknesses, in other words, program health and vitality, and guide decision-making.
The analysis and interpretation of data should support the fundamental purpose of assessment. Analysis and interpretation of data and evidence has several components, including:
· To 'read' and understand what has been collected. Depending on the skills and experience of those having to engage with the data and the complexity of the data or evidence, this may require access to professional learning support. This is where the Office of Institutional Effectiveness at BGSU can provide guidance and support.
· To make sense of data in light of the intended learning and associated curriculum achievement standards. Sense making can be supported by structured learning conversations with colleagues and students to enable learning from and with each other. At BGSU, a good example would be the Student Achievement Assessment Committee (SAAC), which is charged with developing and coordinating assessment activities and fostering a culture of assessment that is faculty-driven and administratively supported.
· To clarify what the evidence clearly indicates students know, understand and can do, what can be inferred from the evidence, and what needs further investigation in order to make a confident judgement based on a mix of data and evidence.
→ With these three key components of the data analysis process in mind, we now come to the nitty-gritty question: How is assessment data analyzed?
First, and foremost, it’s important to remember that data exist in context. Context gives meaning to the information collected and is essential to appropriately utilize and communicate the assessment results.
Analyzing data includes determining how to organize, synthesize, interrelate, compare, and present the assessment results. These decisions are guided by the assessment questions asked in the Memorandum of Understanding that guides the program review process, the types of data that are made available (e.g., through data dashboards provided by OIR, OIE, etc.), comparisons with existing findings (such as previous similar assessments, baseline data, benchmark data, etc.), as well as the needs and wants of the audience/stakeholders. Since information may be interpreted in various ways, it may be insightful to involve others in reviewing the results. Discussing the data in groups will often result in greater understanding through different perspectives.
Disseminating the assessment findings is an important part of a comprehensive program review process. Programs will need to identify the stakeholders or audience interested in the program review outcomes. Typically, at BGSU, this could include members of the Administration (Academic Affairs, Provost, President), the Board of Trustees, Alumni, Office of Institutional Effectiveness, Accrediting agencies, College committees or task forces, Office of Academic Assessment, Office of Institutional Research, Department faculty, etc.
On this next slide, we’ve provided an example of various methods to compare data at course-level.
Considering that 55…., we want to answer the question…. Where do we look to find that answer…. To answer this question, we can look at …
On the next two slides, we’ve provided visuals of how you might consider displaying program-level assessment results in a simple table that includes a comprehensive alignment of program mission and goals, results, analysis, recommendations, stakeholders, and action plan. These tables can be adjusted as preferred in a vertical or horizontal format.
Data analysis and interpretation by themselves are not sufficient. There has to be a closing of the loop.
What does it mean to “close the loop”? “Closing the loop” simply means utilizing assessment results for meaningful, data-driven program change and continuous improvement. An effort to “close the loop” might result in the articulation or drafting of a program improvement or action plan. The formulation of an action plan is a requisite step in the program review process at BGSU. Its intent is to have programs report yearly on progress against the goals and objectives they have set towards continuous improvement, based on the findings of the first round of program review.
What is a program improvement or action plan? A program improvement or action plan is intended to provide programs a format for translating the recommendations made into actions for improvement or maintenance. The plan also identifies who is involved and when the action steps are to be achieved. Such a plan serves as an accountability measure towards facilitating ongoing continuous improvement in an intentional, data-driven and meaningful rather than haphazard, hit-and-miss way.
The table on this slide is an example of elements that can be included in a program improvement or action plan. Please refer to the BGSU Program Review Guidebook for the actual template that the University uses.
TOOLS
We turn now to the second major part of our session, in which we would like to briefly discuss two data analysis and interpretation protocols. The first is known as the Data Driven Dialogue Protocol, developed by the Teacher Development Group in 2002 and based on the work of Nancy Love, author of “Using Data/Getting Results”.
“Dialogue comes from the Greek word dialogos. Logos means ‘the word,’ or in our case we would think of the ‘meaning of data’. A dialogue can be among any number of people, not just two. Even one person can have a sense of dialogue within themselves, if the spirit of dialogue is present. The picture or image that this derivation suggests is of a stream of meaning flowing among and through us and between us. This will make possible a flow of meaning in the whole group, out of which will emerge some new understanding. It’s something new, something creative. And this new shared meaning is the ‘glue’ or ‘cement’ that propels continuous improvement.”
The structured approach of this protocol, with clear norms and expectations for conversation, creates a safe space for all participants. This protocol supports equity of voice and allows all members to describe the data, make inferences, and share implications for future work.
This protocol builds awareness and understanding of the participant’s viewpoints, beliefs, and assumptions about data while suspending judgments. All participants have equal voice. The 4 phases of data-driven dialogue assist groups in making shared meaning of data. This dialogue is a useful tool to help replace hunches and feelings with data-based facts, examine patterns and trends of performance indicators, and generate “root-cause” discussions that move from identifying symptoms to possible causes of student performance and program quality. So, what are those 4 phases?
Phase I: Predictions – Surfacing perspectives, beliefs, assumptions, predictions, possibilities, questions, and expectations.
→ (entails) Reflecting privately and recording preliminary thoughts about the data. One or more of the following thought-starters may be helpful: I assume… I predict… I wonder… My questions/expectations are influenced by… Some possibilities for learning that this data may present…
Phase II: Go Visual – Re-create the data visually.
→ (entails) Re-creating the data visually, on large sheets of paper, on a data wall, through software such as PowerBI or a dashboard, etc. Participants mark up the data so they better understand it (i.e., highlighting trend lines in different colors, doing math calculations and charting them, color coding parts of the data that relate to each other). Participants might create visuals individually or in pairs. Depending upon the amount of data, it might be helpful to divide them into subsets and identify who in the group will work with different subsets.
Phase III: Observations – Analyzing the data for patterns, trends, surprises, and new questions that “jump” out.
→ (entails) Engaging with the actual data and noting only the facts that can be observed in the data. Conjectures, explanations, conclusions, and inferences are off-limits. You would make statements about quantities (e.g., Over half the students…), the presence of certain specific information, and/or numerical relationships between ideas (e.g., Over 90% of the students achieved below standard in Problem Solving; OR Compared to last year’s data, the percentage of students performing at the advanced and on-standard levels in Skills increased by 8%…). At this phase, remember: just the facts! The following prompts should help: I observe that… Some patterns/trends that I notice… Are there any anomalies? Are there any significant gaps in trends over time? If so, how can these be explained (which would be addressed in Phase IV, making inferences)?
Phase IV: Inferences – Generating hypotheses, inferring, explaining, and drawing conclusions. Defining new actions and interactions and the data needed to guide their implementation. Building ownership for decisions.
→ (entails) (a) Generating multiple explanations for your Phase III observations; (b) identifying additional data that may be needed to confirm/contradict your explanations; (c) proposing solutions; and (d) identifying data needed to monitor their implementation. You would reflect privately, using one or more of the following thought starters to prompt your thinking: I believe the data suggests… because… Additional data that would help me verify/confirm my explanations is… I think the following are appropriate solutions/responses that address the needs implied in the data… Additional data that would help guide implementation of the solutions/responses and determine if they are working…
Here is a fictitious data set for a fictitious major: religious studies. If we were to apply the Data-Driven Dialogue Protocol to it, we would come to make the following observations: in 2015, there were 10 students enrolled in the program, while only 4 were enrolled in 2020. Then, we could generate inferences or hypotheses to explain this decrease in enrollment.
a) Internally, we might ask:
§ Maybe students have graduated and have not been replaced in equal numbers?
§ Maybe students have migrated away or transferred to another BGSU major?
§ In defining new actions, we might wonder: Have student program/course satisfaction surveys been conducted and their results examined for possible reasons?
b) Externally, we might ask:
§ Is there a job market for graduates of a straight master’s degree in religious studies (without any specializations)?
§ Is there a demand for the major?
§ Is the curriculum rich, relevant, up-to-date, innovative, attractive to a broader generation?
§ Are delivery methods conducive to a healthy enrollment? Are there student pools we are not tapping into? Are there partnership with local stakeholders that might help increase enrollment?
§ Who are our competitors in the field? Have we benchmarked our religious studies master’s program against peer/similar programs? Is our program up to date?
§ Do we have the resources (financial, faculty, etc.) to offer a top-notch program?
§ What are scholarly and scholarship opportunities? What are opportunities for student internships? Study abroad? Research? Etc.
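The Phase III observation behind these questions can be quantified with simple arithmetic on the fictitious enrollment figures above (10 students in 2015, 4 in 2020). A minimal sketch:

```python
# Fictitious enrollment figures from the religious studies example above.
enrollment = {2015: 10, 2020: 4}

change = enrollment[2020] - enrollment[2015]
pct_change = change / enrollment[2015] * 100
print(f"Enrollment changed by {change} students ({pct_change:.0f}%).")
# Enrollment changed by -6 students (-60%).
```

Stating the observation as a fact ("enrollment declined by 60% between 2015 and 2020") keeps Phase III grounded in the data; the possible causes belong to Phase IV.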
The following slide includes an example of a Data-Driven Dialogue Protocol Facilitation Plan.
The ATLAS Looking at Data Protocol, drawing from work by Steve Seidel and Evangeline Harris-Stefanakis at Harvard University, is yet another tool that can help us learn from data.
The protocol comprises four steps: identifying facts, interpreting what the data suggest, drawing implications for the work at hand, and formulating high-leverage next steps toward improvement. The table on this slide includes a few class-level examples for each step. For a program review at BGSU, we might see that data show a significant D and F rate in specific religious studies courses, perhaps over 80% for three subsequent offerings of the courses. What might these data suggest? The high failing rate might be associated with excessive academic rigor, perhaps lack of student attendance, or perhaps lack of instructional engagement in the delivery format or pedagogical diversity, and so on. What would this mean for the religious studies program? It could mean an overwhelming attrition rate, with students migrating out to other majors, and thus a poor retention rate. It might also mean that faculty need to provide more interactive or innovative modes of instructional engagement, which would entail course content revision or an at-large curriculum update. Finally, what would be our next step? An example would be to conduct a student satisfaction survey or a series of focus groups to gauge student interest and motivation in the major. Another step might be partnering with the related majors and/or disciplines to which students migrate the most to enhance retention overall. An example of an ATLAS Looking at Data Facilitation Plan is provided on the next slide. While the example is provided at the classroom level, it can be adapted to the program level to meet program review data analysis and interpretation needs.
RELATIONSHIP WITH PROGRAM REVIEW.
You might still be wondering: What would all that we’ve talked about thus far look like when applied to the program review process? The questions below are provided as starting prompts to guide you through the process of working with and analyzing data for your program review.
Data alone are not sufficient. Telling a story with data is the most straightforward and powerful way to contextualize your program review self-study. On this slide, we’ve provided a few final tips that can help ensure your success with data when it comes to writing the self-study.
DATA CLARITY AND DATA ISSUES
In conclusion, we would like to present a few issues that commonly emerge when dealing with data analysis, interpretation and use.
1. First, a common issue resides in misunderstanding or confusion about the data population being presented.
For example: Some of the enrollment data presented only juniors, seniors, and second majors rather than all students enrolled in the program. Similarly, other data points, such as graduation rates, covered only first-time freshmen.
This could simply be due to human error in the compilation of the data set.
2. Another frequent issue is the lack of proper definitions of the measurements used in formulating the data.
For example: The inclusiveness data do not define what a first-generation student is (e.g., whether neither parent attended college or neither completed college) or what low-income students are (what level of income defines low). → A solution here would be to consult OIR’s website for a glossary of terms and more detailed definitions of data and data formulations.
3. Another issue concerns the ability of faculty to confirm the validity of data sources that are familiar to them.
For example: In the surveys, many faculty members were concerned with inaccuracy of data (i.e., data not matching departmental and course rosters, or data from Canvas or other University platforms). Departments should be allowed to provide a list of data for verification. When areas are questioned by departments, a solution might be to set up a process that allows for verification.
4. A very salient issue resides in the relevance of the data being presented for program reviews and their relationship with the critical questions defined in the program Memorandum of Understanding.
5. Last, but not least, we often hear that data look as if programs are made to compare “apples to apple slices”. For example, one data source might break data down by CIP codes while another uses a different grouping system. Therefore, to obtain an accurate data comparison summary, one would have to perform some calculations.
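One common form of such a calculation is rolling up counts from a fine-grained classification (e.g., individual CIP codes) to the broader grouping used by the other source before comparing. Here is a hypothetical sketch; the CIP codes, group names, and counts are invented for illustration:

```python
from collections import defaultdict

# Source A reports headcounts by individual CIP code (fictitious values).
cip_counts = {"38.0201": 6, "38.0101": 3, "16.0501": 5}

# A crosswalk mapping each CIP code to the broader grouping used by source B.
group_of = {
    "38.0201": "Philosophy & Religious Studies",
    "38.0101": "Philosophy & Religious Studies",
    "16.0501": "Foreign Languages",
}

# Roll the CIP-level counts up to the shared grouping so the two
# sources describe the same "apples".
rolled_up = defaultdict(int)
for cip, n in cip_counts.items():
    rolled_up[group_of[cip]] += n

print(dict(rolled_up))
# {'Philosophy & Religious Studies': 9, 'Foreign Languages': 5}
```

Once both sources are expressed at the same level of grouping, discrepancies that remain point to genuine data issues rather than classification differences.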
We know that a combination of timing issues, definition differences, and the “garbage in, garbage out” problems associated with human error recorded in systems means that data differences will undoubtedly occur. However, they can always be reconciled.
This concludes our video session on program review and data analysis and interpretation. If you have any questions, feel free to email us at institutionaleff@bgsu.edu.
Updated: 03/05/2026 04:09PM