Student Achievement Assessment Committee
Report on Undergraduate Assessment
At the completion of baccalaureate degree studies in Psychology, students will:
• Exhibit broad knowledge about human behavior from a variety of psychological perspectives (e.g., biological, cognitive, developmental, social).
• Have the necessary skills in research and other forms of inquiry in order to develop new knowledge about behavior.
• Be able to communicate their knowledge of psychology to others.
• Have the necessary skills and content knowledge to be an informed and critical consumer of existing knowledge.
• Be prepared for post-baccalaureate studies in psychology or related disciplines, or for entering the workforce in areas related to their training.
1. Learning (or Service) Outcomes assessed this year:
The assessment activities initiated during 2004-05 were (a) developing a skills-based inventory used to assess learning outcomes, and (b) pilot-testing the new inventory. Recall that the instrument that we have used in the past, the Assessment of Student Learning, measured students’ perceptions of what they learned in various courses. Although student perceptions are useful in their own right, they lack the objectivity that would allow us to determine which skills are in fact acquired at what points in the curriculum. To borrow an example from last year’s report, a first-year student taking an introductory psychology class (PSYC 101) -- which typically devotes a lecture or two to statistics -- may indicate that she has ‘learned a great deal about statistics.’ If her first-ever exposure to statistics was in PSYC 101, her rating may be appropriate. A fourth-year student taking our advanced statistics course might also indicate that she has ‘learned a great deal about statistics,’ as she undoubtedly has, but her level of knowledge would far exceed that of our first-year student even though the ratings are identical. Accordingly, we felt that establishing objective criteria -- sets of verifiable skills and experiences -- would allow us to determine what our majors learn rather than what they think they learn. Last year we conducted focus groups to get a sense of what a small sample of psychology majors learned in their courses of study. This year, we aimed to extend our reach by getting a sense of what a larger sample of our majors are learning over their courses of study.
As an aside, we note that our skills-based approach to assessment is entirely consistent with the trend toward emphasizing skill acquisition in BG Perspective courses.
2. Assessment Methods and Procedures:
Our new instrument, the Psychological Skills Inventory, is a 57-item survey organized at two different levels. At the higher level are dimensions of skills that reflect the learning outcomes described above: Written/Oral Communication, Research Skills, Engagement, Critical Thinking/Problem Solving, Ethics/Values, and Integration of Knowledge. Within each dimension, items that reflect common sub-skills are clustered together in groups of three. For example, within the dimension “Written/Oral Communication,” three items that refer to speaking in public are grouped together, as are three items that refer to writing skills and three referring to the ability to convey scientific information coherently. The format of items is shown below (the complete inventory is available in a Word document attached to this report):
A. Speaking in public:
__ I have contributed to discussions in my psychology classes at least once a week.
__ I have given a formal oral presentation in a psychology class.
__ I have made a psychology-related presentation outside of the classroom (e.g., at a conference).
B. Writing Skills:
__ I have taken a psychology exam where a large portion of the grade was based on essays.
__ I have written an APA style literature review paper.
__ I have written an APA style research (i.e., empirical) paper.
C. Conveying scientific psychological information coherently:
__ I have explained scientific psychological concepts to friends and family.
__ I have presented (orally or in writing) the results of a psychological study.
__ I have created a poster detailing the results of a psychological study.
Students are asked simply to check the items in each skill area that apply to them. The exact instructions are: “Please read the items in each skill area and check those that apply to you. NOTE: You are not expected to have acquired all of the following skills, particularly if you’ve taken only one or two psychology classes. Thus, you may find that you’ll check a lot of items in some areas, fewer in others, and possibly none in still other areas.”
Students are also asked to indicate which psychology courses they have taken, their year in school, GPA, major, and minor. (Not all students who take psychology classes are our majors.)
Two comments before we proceed further:
First, our Psychological Skills Inventory is a modified version of the skills inventory described by Kruger and Zechmeister (2001). Ours is shorter (57 items across six dimensions rather than 90 items across 10 dimensions), and two of our dimensions (Engagement, Integration of Knowledge) are unique to learning outcomes at BGSU.
Second, many of the items within a cluster show a progression from lower- to higher-order accomplishments, the implicit assumption being that the more sophisticated the accomplishment, the higher the skill level. This scaled aspect of the inventory is what we hope will allow us to discriminate beginning from more experienced cohorts of psychology majors. It also has the added benefit of letting beginning psychology majors know what sorts of skills they might expect to acquire before graduation.
Our hope was to pilot test the instrument using an “extreme groups” approach, comparing first-year with fourth-year psychology majors. Between April 21 and April 27, 2005, the Psychological Skills Inventory was distributed to psychology majors in a subset of laboratory classes, which are taken mostly by fourth-year students, and in introductory psychology sections, where we hoped to obtain data from first-year students. Because the majority of first-year psychology majors take introductory psychology in the fall rather than the spring, the response rate here was lower than we had anticipated. Accordingly, we decided to collect additional data from a psychology elective class (PSYC 303 Psychology of Child Development). Overall, data were obtained from 88 students, of whom 41% were fourth-year, 33% were third-year, 11% were second-year, and 9% were first-year students (note: totals do not equal 100% due to missing data and rounding). Seventy-eight percent of the respondents were women. Students completing surveys listed between 1 and 16 psychology classes taken, averaging 7.8 classes across all respondents.
3. Inferences from Assessment
Because of the low response rate from the first-year students, we combined the data from first- and second-year students in order to compare their data with those obtained from fourth-year students. The total number of skills endorsed in each category was compared for fourth-year students (n = 37) and the combined group of first- and second-year students (n = 18) in a series of independent t-tests. As can be seen in the table below, fourth-year students endorsed significantly more skills than first- and second-year students in all skill dimensions except Engagement (which showed a trend in the right direction). For example, of the 9 items in the “Written/Oral Communication” dimension that could be endorsed, first- and second-year students endorsed (on average) only 3.22 whereas fourth-year students endorsed over twice as many. Similarly, fourth-year students endorsed nearly 13 of the 18 items in the “Research Skills” dimension, two-and-a-half times as many as the first- and second-year students.
| Skill dimension | Mean for 1st- and 2nd-Year Students | Mean for Fourth-Year Students | Number of Items |
| Number of Classes Taken/In Progress | 3.06 | 9.92* | N/A |
| Critical Thinking/Problem Solving | 4.00 | 6.30* | 9 |
| Integration of Knowledge | 3.06 | 4.19* | 6 |
Note. * indicates that mean for fourth-year is significantly higher than that of first- and second-year students at the p < .05 level, + indicates a difference at the p < .10 level.
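The extreme-groups comparison above can be sketched in a few lines of code. This is an illustrative sketch only: the endorsement counts below are invented placeholders (not the report’s data), and scipy’s `ttest_ind` stands in for whichever statistics package was actually used.

```python
# Sketch of one independent t-test from the extreme-groups comparison.
# All endorsement counts are hypothetical placeholders, not report data.
from scipy import stats

# Hypothetical numbers of items endorsed in one skill dimension
first_second_year = [2, 4, 3, 5, 2, 3, 4, 3]   # combined 1st/2nd-year group
fourth_year = [6, 7, 5, 8, 6, 7, 6, 5]         # 4th-year group

# Independent-samples t-test: is the 4th-year mean reliably higher?
t, p = stats.ttest_ind(fourth_year, first_second_year)
print(f"t = {t:.2f}, p = {p:.4f}")
```

In practice one such test would be run per skill dimension, with significance flagged at p < .05 as in the table above.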
Class standing (i.e., first-, second-, third-, and fourth-year student) is actually a proxy for the number of psychology classes our majors have taken, and not a very good one at that. After all, transfer students may come in as third-year students but they may have taken fewer psychology classes at BGSU than some of our “native” second-year students. In addition, there is a wide range in the number of classes that fourth-year psychology majors have taken. Some have completed only the 8-9 classes required for the 30-hour major whereas others have taken many more. Consequently, in order to explore further the relationship between taking psychology classes and the development of relevant skills, we examined the correlations between the number of classes taken and the number of items endorsed in each of the six skill dimensions. The results are shown in the table below:
| Skill Dimension | Correlation with No. of Classes Taken |
| Critical Thinking/Problem Solving | .30* |
| Integration of Knowledge | .33* |
In contrast to the consistent pattern of relationships shown above, additional analyses indicated that self-reported GPA correlated only with Critical Thinking/Problem Solving [r(86) = .23, p < .05] and Integration of Knowledge [r(86) = .23, p < .05], indicating that students with higher grades tend to endorse more items in those skill dimensions than those with lower grades. These results make intuitive sense: better students do more critical thinking and knowledge integrating.
That only two of the six skill dimensions correlate with self-reported GPA is particularly encouraging: it suggests that taking psychology classes provides all students -- not just the better students -- with skills that we deem important. Indeed, when we look at the correlations between the number of psychology classes taken and the number of items endorsed on the six skill dimensions while controlling (statistically) for self-reported GPA, we get almost identical values to those shown above. We are clearly measuring more than general scholarly aptitude (as indexed by self-reported GPA).
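The control analysis described here (correlating classes taken with items endorsed while holding self-reported GPA constant) amounts to a partial correlation. The sketch below uses invented placeholder data and computes the partial correlation by residualizing both variables on GPA; the specific numbers and the `residualize` helper are illustrative assumptions, not the committee’s actual procedure.

```python
# Sketch of a partial correlation: number of classes vs. items endorsed,
# statistically controlling for GPA. All data are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 88  # same sample size as the pilot, data otherwise hypothetical
gpa = rng.uniform(2.0, 4.0, n)
classes = rng.integers(1, 17, n).astype(float)
endorsed = 0.8 * classes + 0.5 * gpa + rng.normal(0, 2, n)

def residualize(y, x):
    """Residuals of y after regressing it on x (with an intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Correlate the GPA-free parts of both variables
r_partial = np.corrcoef(residualize(classes, gpa),
                        residualize(endorsed, gpa))[0, 1]
print(f"partial r = {r_partial:.2f}")
```

If the partial r stays close to the zero-order correlation, as the report finds, the classes-to-skills relationship is not merely a reflection of general scholarly aptitude.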
One of the skill dimensions, Engagement, deserves special mention. The mean number of items endorsed is the lowest for this dimension: our fourth-year students endorsed an average of only two out of nine possible items. What this means isn’t exactly clear. It could mean that this area is one where we need to focus more energy, especially if we discover, for example, that Engagement activities are highly valued by employers and graduate school admissions committees (see below). On the other hand, it could mean that in writing the items for this dimension, we may have set the bar too high for the typical psychology major. One of the clusters in this dimension, for example, is “Doing research in an applied setting,” and it may be that exceedingly few of our students get such opportunities. The weak correlation between Engagement and the number of classes taken suggests a trend for students involved in engagement activities to be further along in school, which both makes sense and is desirable, as we wouldn’t want unskilled students working out in the community.
4. Actions Taken/Program Improvements
We have yet to take any direct actions as a result of the past year’s assessment activities. Having pilot tested the Psychological Skills Inventory in 2004-05, our goal for 2005-06 is to run it through the equivalent of a beta test by gathering data from many more than 88 students. We plan to obtain data from first-year psychology majors by administering the survey during the mandatory advising sessions. (This ought to get us more than the 8 first-year students that were in the pilot sample.) Our goal is to obtain enough data to improve the inventory (if needed) and to look for ways we might change the curriculum as a result of what we learn from the larger sample of students. We can envision using the data from the inventory in the following ways:
• We can ask the BGSU psychology faculty to indicate which of the 57 skills they would expect an incoming graduate student to have. In last year’s report we noted that the focus-group participants felt that one of the perceived weaknesses of the department was that we provided insufficient information about career and graduate school planning. By asking faculty to indicate which skills are important for graduate students to have, using the same list that the students have access to, we should be able to provide better guidance for students who hope to attend graduate school in psychology, and ascertain whether our students are appropriately prepared for graduate study should they choose to pursue it.
• We can ask the students to identify which courses (if any) were taken at BGSU versus those taken at two-year colleges. We could then look to see if there is value added – in terms of acquiring important skills -- to coming here for three or four years instead of only one or two. Based on anecdotal experience, we believe that students who are here for only two years often don’t get the opportunity to acquire research skills outside of the classroom, which we feel are important skills to have if one plans to go to graduate school. It would be useful to obtain data that speak to this point.
• We will be able to do fine-grained analyses of what skills are acquired in which courses. This could result both in changes to the curriculum (for example, we may need to develop or modify courses to promote the acquisition of Engagement skills) and to skills-centered advising.
• We will analyze the inventory itself to explore the relationships among the six skill dimensions we have proposed. These dimensions were derived rationally, and although the inventory has reasonable face validity, it has not been analyzed from a psychometric standpoint. Doing so would enhance our confidence in its validity.
• We can explore using the inventory to form the backbone of an electronic portfolio for our majors, by incorporating it, for example, as the matrix of learning outcomes.
As you can see, we are excited about developing the Psychological Skills Inventory further and using it to find out more about what skills and experiences our majors obtain as they progress through the curriculum. We stress that the inventory is not content-based; it is not a list of “What Psychology Majors Should Know.” Rather, it is a list of “What Psychology Majors Should Know How To Do.” Cognitive scientists (e.g., Anderson, 1982) maintain that proper cognitive skill acquisition requires conjoining factual knowledge with knowledge of what to do when. Implicit in a skills-based approach to assessment is the assumption that the appropriate factual knowledge must be learned before a skill can be properly demonstrated. By assessing skills that can be applied in nearly all domains of psychology, as we believe we are doing with the Psychological Skills Inventory, we can assess relevant content knowledge without having to specify what it is.
Report on Graduate Assessment
For the past six years, our program has been assessing graduate student outcomes at both an individual level and a group level.
At the individual level, toward the end of the spring semester, students are asked to complete a “Graduate Student Academic and Professional Development Update.” Information is requested in five areas: coursework, thesis/dissertation progress, teaching experiences, scholarly outcomes (presentations given at professional conferences, publications in peer-reviewed journals, research recognitions), and service activities (e.g., non-paid services provided at the department, university, and community levels). In each of these five areas, students list their experiences, self-evaluate their progress, and map out future goals. The student then meets with his or her faculty sponsor to develop joint academic and professional goals for the coming year. Performance in these five areas is used to assess the following learning outcomes, progress toward which is rated by both the student and the faculty member:
• coursework performance;
• research progress;
• non-required research contributions (e.g., involvement in independent research projects, publications, presentations);
• written communication skills;
• oral communication skills;
• performance in assistantship duties;
• performance in teaching experiences;
• performance in practicum experiences (e.g., clinical skills; consultation skills depending on the student’s area of specialization);
• ability to interact with faculty, peers, clients, others.
Sample forms (i.e., the Graduate Student Academic and Professional Development Update, the Student Self-Rating of Performance form, and the Sponsor Rating Form) are included in the Appendix. At the time of the preparation of this report, 64 of the current 93 students (69%) in residence in the doctoral program had returned the Graduate Student Academic and Professional Development Update form. Nearly all students and their sponsors have rated the students’ performance on these nine learning outcomes as “exceeds expectations” or “meets expectations.”
Presenting research at professional conferences and through publications is an indication that students are meeting various learning outcomes (e.g., research progress, oral and written communication skills, ability to interact with faculty and students involved in research projects). At the group level, 72% of the students have given at least one conference presentation since August of 2004 and 36% have been included at least once as a co-author of a published article in a peer-reviewed journal or book chapter. As one would expect, these results vary by year of student, with more senior students being more productive. For example, in the past year, 23% of first year students versus 61% of students in the fourth year and beyond have been a co-author on a published manuscript. It is important to note, however, that a majority of first year students (53%) have presented their research at professional conferences. Venues for dissemination of data at national and international professional conferences included: the American Psychological Association, the American Psychological Society, the Association for the Advancement of Behavior Therapy, Society for Behavioral Medicine, Society for the Scientific Study of Religion, the Cognitive Science Society, the Society of Industrial and Organizational Psychology, the 5th Annual Pilot Research Project Symposium for the University of Cincinnati NIOSH-supported Education and Research Center, the Society for Research on Child Development, Society for Research on Adolescence, Society of Judgment and Decision Making, the International Society for Gerontechnology, the Cognitive Aging Conference, the International Society for the Study of Behavioural Development, Psychonomics, National Harm Reduction Conference, Society for Community Research and Action, and the National Academy of Neuropsychology. 
Venues for dissemination at regional or local conferences included: the Midwestern Psychological Association, the Ohio Psychological Society, the Ohio Academy of Sciences Research meeting, the Research Symposium in Psychiatry, Psychology, and Behavioral Sciences at the Medical College of Ohio, and the annual BGSU Research Conference. Publications have appeared in scientific journals such as: Aggressive Behavior, the Journal of Clinical Psychology, Eating Disorders, Psychology and Aging, Educational and Psychological Measurement, Journal of Applied Psychology, Encyclopedia of Applied Psychology, Journal of Vocational Behavior, Personnel Psychology, Studies in Higher Education, Journal of Applied Animal Welfare Science, and Personality and Individual Differences.
Several learning outcomes are met through students’ involvement in engagement or service activities. These experiences contribute to students’ oral and written communication skills, non-required research contributions, and their ability to interact with professionals outside of the university setting. As part of their required coursework, 90-100% of the Clinical psychology students in their third and fourth years are placed at area hospitals and mental health agencies to provide clinical counseling and assessment services; 100% of first and second year Clinical psychology students provide assessment and intervention services to the community (children, adults, families) through the department’s Psychological Services Center. In addition, nearly all of the Industrial-Organizational psychology students provide consultation and research expertise to businesses in the region through the department’s Institute of Psychological Research and Application. Finally, we are beginning to track graduate students’ volunteer (i.e., non-paid, not part of their assistantship or coursework) service activities at the department, university, and community levels. Based on the sample of Graduate Student Academic and Professional Development Updates, 42% of the students listed a non-paid service activity. Department service activities included membership in a variety of committees (e.g., Student Representatives, by Specialization Area; Graduate Student Recruitment Committee; Brown Bag (colloquium) Committee). University service activities included: President of the Fellowship of Christian Graduate Students, Library Advisory Committee, Student Legal Services Board, Student Housing Action Committee, and Graduate Student Senate. Volunteer service in the community included: peer counselor at the Bowling Green Pregnancy Center, Bowling Green Senior Citizens Center, and the Sterling House Residential Center.
At the group level, we evaluate success in coursework and research progress as two important learning outcomes for graduate students. During the year covered by this report (August 2004-May 2005), 16 students received the MA degree and 18 students received the Ph.D. in the Department of Psychology. Nearly all students are meeting the required GPA. To gain a broader picture of degree progress, we gather program-level data every five years; the most recent large-scale data collection was completed in December 2001. First, we examine degree completion rates. For students entering our doctoral program in the years 1989-1994 (6 cohorts), 91% received their MAs and 64% received their Ph.D.s. Of those who received their MAs, 70% went on to receive their Ph.D.s. Among Ph.D. recipients, average time to degree was 5.81 years (SD = 1.49); excluding the one-year internship for the clinical students in the sample, average time to the Ph.D. was 5.36 years (SD = 1.44). These results are comparable to the data reported in our 1995 OBOR review.
An additional method of assessing student learning outcomes at the group level is to examine job placement rates. Based on the December 2001 5-year survey, we obtained data for 70 (64%) of the 110 students awarded Ph.D.s from 1995-2000. One hundred percent of these alumni have obtained jobs in the field of psychology. Thirty-six percent obtained employment in the field of academia (e.g., University of Alaska-Anchorage, Minnesota, Nebraska, Washington; Oregon Health Sciences Center; Trinity College in Dublin, Ireland; Concordia College in Canada; Psychological Institute of Tubingen, Germany; Ohio colleges including the Medical College of Ohio, Mercy College, University of Dayton); 24% gained employment in the industrial-business sector (e.g., Kraft Foods, Dow Chemical, Microsoft, Procter & Gamble, Motorola, Ernst & Young, Owens Corning); 14% obtained employment in purely clinical positions (e.g., U.S. Air Force, Veterans Administration Medical Center in Cincinnati, private practice sites); 14% gained employment in clinical-research settings (e.g., Boston VA, Children’s Hospital in Ottawa, Canada, Geisinger Medical Center, Cancer Prevention Center in Rhode Island); and 12% gained employment in purely research positions (e.g., Ohio Department of Mental Health, Wright-Patterson Air Force Base, Los Angeles Tobacco Control Program, post-doctoral positions at Harvard, University of Pittsburgh).