Assessment at Purdue: A Brief History[137]

In pursuit of its educational mission to facilitate effective learning and teaching, Purdue has established activities for assessing student learning, devoted resources and strategies to their implementation, and documented both its efforts and its future directions.

The goals for student learning outcomes are clearly stated for each educational program and make effective assessment possible. Learning expectations at Purdue are defined by eight University core competencies for undergraduate students and four core competencies for all PhD students. In many cases, competencies established for PhD programs have also been adopted by master's programs. Moreover, the vast majority of undergraduate and graduate programs report learning outcomes at the program level.

Purdue's Assessment Framework[138]

Purdue's basic framework for assessment is constructed upon four key components — Define, Facilitate, Assess and Improve. These key components guide faculty in completing the assessment loop by asking what they want their students to learn (Define); how they will help their students to learn (Facilitate); how they will know if and why their students have or have not learned (Assess); and how they will use assessment information to improve their students' learning (Improve).

The assessment framework has been enhanced by creating the Boilermaker Accreditation and Learning Outcomes Tracking Site (BALOTS), which serves as the central repository for assessment reports. BALOTS is an important tool for documenting assessment efforts in a systematic, structured and unified manner.

Assessment Leadership[139]

One of the challenges presented by the University's decentralized structure is coordination and direction of assessment efforts at the campus level. Purdue reinforced the importance of centrally coordinated assessment by forming the Student Learning Outcome Assessment Workgroup (SLOAW) group, with representation from all colleges/schools (undergraduate and graduate programs) and other relevant campus units (Academic Advising, the Graduate School, Information Technology and the Office of International Programs). The University also created a campus-wide Director of Assessment position, whose responsibilities include leading the SLOAW group and leading learning outcomes assessment, a task previously delegated to administrators with many other responsibilities.

Assessment leadership is also provided at the college level and in other units on campus. Several of the colleges/schools have created assessment positions. Each academic program on campus has at least one person responsible for documenting the program's learning outcomes assessment on the BALOTS website.

The assessment of student learning occurs at both the institutional and program levels. The responsibility for establishing and assessing learning outcomes within and beyond the parameters of the core competencies rests with the faculty. In keeping with Purdue's decentralized structure, the faculty of each school define and implement their own assessment programs, but do so within an institutional model and a set of common principles set forth in the general University plan. Assessment methods at the institutional level include surveys such as the Cooperative Institutional Research Program, GSLOS and the National Survey of Student Engagement. Course Signals, described earlier, is an innovative assessment program.

A review process is crucial for ensuring that student learning is monitored on an ongoing basis and that assessment evidence is used for continuous improvement of student learning. Most academic programs include some or all of the following elements in their review process to ensure that the assessment loop is closed:
    • Assessment evidence is analyzed and translated into findings, and desired changes are implemented to improve student learning.
    • New evidence is then gathered to assess those changes and to continuously monitor learning.
    • Responsibilities for the review process are shared by key stakeholders, and effective mechanisms are in place for implementing the process.


Best practices regarding assessment are communicated and shared in a variety of ways. These include assessment-related workshops offered by the CIE, training sessions for BALOTS users, workshops and consultations available to academic units and faculty upon request, and online resources integrated into the BALOTS website. Feedback is another channel for disseminating assessment knowledge and practices: programs receive general feedback about common problems that emerge as BALOTS reports are reviewed.

Purdue is also a member of the Higher Learning Commission's Academy for Assessment of Student Learning. The academy provides assistance with learning outcomes assessment and opportunities to learn best practices from fellow members.

Addressing the uneven progress in assessment will be a focus area in the coming years. The provision of additional resources, better communication and integration into existing practices are key strategies for advancement in the future. As described earlier, in an effort to integrate assessment into existing practices, Purdue revised Form 40, which is used for proposing new courses and course revisions. In the new 2009 version, faculty are asked to provide learning outcomes for the courses. Adding an assessment report to the college dean's annual progress report to the provost is also under consideration.

The Assessment Landscape Today

Overall, Purdue does a good deal of work in assessing the overall student experience, including in the first year. However, the decentralized nature of these efforts and the lack of coordinated dissemination mean that the campus community has made poor use of the valuable information produced. Additionally, findings suggest that even when data and assessment results are available, there is still a lack of actionable steps and data-driven decisions derived from those efforts.

Decentralized Assessment Efforts and Use of Assessment Data

Very good assessment activities occur both centrally and within individual units, but a coordinating unit would enable a more strategic and unified effort for assessment of the first-year experience.

Many departments at Purdue conduct their own assessment efforts, including the Discovery Learning Research Center (DLRC), CIE, OIR, Enrollment Management Analysis and Reporting (EMAR), Student Analytical Research (SAR), the Provost's Assessment Office, data arms within the Office of the Vice President for Student Affairs, SATS, Housing and Food Services and the academic departments. The data indicate a great deal of assessment activity within these programs, but there is also evidence that the findings are not broadly exposed to the campus community.[141]

Among Committee members, opinions on assessment performance at Purdue were highly polarized, most likely attributable to each member's access to assessment research and results, as well as knowledge about the use of those programs on campus. The evidence on the use of assessment at Purdue showed that individuals who had direct experience with assessment results being used (e.g., CCO, Learning Communities and STAR) rated the programs higher. However, responses clearly indicate that the majority of the Committee did not have that perspective. Additionally, a brief survey of Deans and Department Heads had an extremely low response rate; results from those who did respond indicated that several decisions have been based on beliefs within the department rather than on research results. For many programs, the assessment that has been done appears to be heavily oriented toward outcomes and student satisfaction; far less work has been done on the process of continuous improvement.

Dissemination of Assessment Data

The Committee felt overall that the University produces a lot of data about the first-year student, but the larger challenge is how faculty and staff receive and process that information. Factors such as ease of access to information can have a significant impact. For instance, a great deal of information is presented at the University level rather than at the departmental or course level, where it might be more meaningful or useful.

Additionally, assessment information is often delivered to faculty and staff in the lowest-cost method (i.e., posted on websites) and the effectiveness of that delivery mechanism depends largely on the willingness of individuals to engage in or seek out the data. The Foundations of Excellence Faculty and Staff Survey results clearly demonstrate that there is a split between faculty's beliefs and administrators'/staff's beliefs, with faculty consistently rating the University lower on what it assesses, how it disseminates those results and the use of those results.[142]

The Committee feels that, overall, what the University does with aggregate data and reporting is very different from what can be done at the individual course level. We have data on many components but do not disseminate it at a unit-record level, possibly because of interpretations of privacy laws (FERPA) or because of the University culture. For instance, Signals has records of unit-level data, but it is currently used in less than 10 percent of courses.[143]

Understanding of Assessment Data

The fundamental question is this: even if attendance at professional meetings, reading of journals and other such efforts are very high, how well is that information being utilized back on campus? Additionally, these efforts are carried out by individuals and appear to rarely influence activities at the unit or institutional level.

Some Committee members wanted to further emphasize that rigorous assessment needs to be implemented for all programs within the University, particularly those making claims for achieving student success.

Given the large number and variety of assessments and the uneven use and dissemination of their data, we recommend the creation of an area within the University that focuses on the first-year experience, a key component of which would be a department focused on assessment overall, and first-year assessment in particular, providing a clear message and direction on assessment. This area would coordinate and organize the priorities related to first-year assessment.


  1. Create a first-year coordinating unit, as part of a broader unit that is focused on the first-year experience, which would coordinate first-year (and broader University) assessment efforts.
    • The unit could be a new department, a repurposing of an existing department or a task force of existing departments. The key is that the unit be empowered to coordinate the current disparate assessment practices.
    • The unit would meet periodically with an advisory board of faculty and representatives from key offices (SATS, OIR, Housing and Food Services, Enrollment Management, Office of the Dean of Students, Office of the Vice President for Student Affairs).
    • The unit would discuss what research needs to be done regarding the first-year experience, decide what information needs to go on the master repository website and what information should be in the annual first-year report.
    • One of the first tasks would include meeting with key faculty members to discuss what information, data and metrics they need in regards to the first-year experience and what data they need specifically at the course level.
    • The unit would be tasked with meeting with and coordinating the efforts of the assessment units that are currently embedded in various administrative and academic departments. This would allow for dissemination of results throughout the University and allow for units to be involved in the discussion of what changes should be implemented based on the results. This would also allow for a closing of the loop on how we use data to make improvements on campus, thus helping to develop a culture of improvement based on evidence throughout the University. It would also allow for a coordination of efforts and the elimination of redundancies, such as the large number of overlapping survey efforts conducted on campus.
    • The unit would need to immediately link to the faculty advisory board that is currently in discussions with the Institutional Review Board (IRB) about the current IRB process at Purdue. This would allow for a discussion to occur with the IRB regarding at what point the review board should be involved in campus internal assessment efforts to ensure that said efforts are not delayed by moving through the IRB process.
  2. Develop a four-pronged approach to disseminating assessment results.
    • Create a website to serve as a master repository of assessment information on the first-year experience. The site would not just be a place for data and results; it would also gather current instruments and information about assessment efforts on campus so that others could view and use the resources. One option is to implement the Student Voice Web platform, which provides a single point of secure storage for assessment data as well as a portal through which faculty and staff can access the information. The creation of the master repository would need to be a collaborative effort among multiple offices (OIR, SAR, Enrollment Management, Housing and Food Services, EMAR, SATS, Office of the Vice President for Student Affairs, academic units). By linking to the Student Voice platform, the site would gain a database capability allowing Purdue members to search and manipulate data from first-year assessment efforts. Ideally, the master site would also allow for better targeted messaging to specific constituencies on campus (faculty, advisors, students, etc.), be accessible from mobile devices and support mobile applications that list available services and remind students of important events during the first year.
    • Distribute an annual print publication on the first-year experience on campus (including assessment results) to all faculty and staff, potentially mailed to home addresses. It should be a professional, glossy account of the University's efforts.
    • Create a subscriber-based email function that would allow individuals to opt into periodic messaging on research, workshops, webinars and training. It should be built with the ability to link to social media such as Facebook.
    • Make course-level data access available through an infrastructure that allows for delivery of key course metrics (e.g., distribution of majors, academic preparation, student demographics) to faculty members teaching the course. Ideally these metrics would be made accessible via the "Faculty Tab" in myPurdue for all faculty.
    • The maintenance of these efforts would be within the purview of the first-year coordinating unit (Recommendation 1). Given the scope of the coordinated communication plan it could be staged in its roll-out with the website being an early focus.
  3. Establish a set of research studies on the first-year experience/student success.
    • The Provost's Office will establish a set of research studies that are carried out on a regular basis and are focused on the first-year experience and student success. The coordinating unit (Recommendation 1) will work with the existing assessment offices and interested faculty to conduct the research series. This research will ultimately provide a set of useful longitudinal data on student success at Purdue. The research should range from individual course level to the overall University level, encompassing both outcomes assessment as well as socio/attitudinal assessment, including a mechanism that allows for capturing student perspectives of the first-year experience. Some examples of initial research that should be examined include:
      • An examination of what predicts student success in courses. This effort needs to consider social cognitive factors beyond the traditional academic preparation metrics that have been examined in the past. An example of these efforts can be found in First-Year Engineering's Student Attitudinal Success Instrument (SASI) efforts. These additional metrics then need to be part of the distribution of course-level data to faculty.
      • An examination of the relationship between students' use of time in certain activities (e.g., attendance, studying, socializing, social media, etc.) and student success. Some data on students' use of time already exists in Cooperative Institutional Research Program (CIRP), NSSE and the Foundations of Excellence surveys, but those results need to be correlated with student success.
      • A master effort across the University examining exactly what the first-year student experience is, conducted with only first-year students, similar to the Your First College Year (YFCY) survey. Currently the University does not have a regular and ongoing assessment of the students' first-year experience.
  4. Endorse a consistent assessment message. There must be explicit administrative support of the assessment message from the President, Provost and Deans to create a consistent and sustained messaging effort at Purdue. An example would be a monthly report, of which the first page would be high-level University metrics that the President's and Provost's offices feel are important for the University to be tracking in regard to student success, with the remainder of the document being Academic School- or College-specific information. The deans could then review this data/assessment report with their department chairs, gather feedback on what information is relevant, needed and useful, and make decisions on what improvements are needed. This top-down effort would help promote a culture of improvement based on evidence at the University.
  5. Present an institutional assessment forum/workshop at least annually, if not every semester, to allow individuals who conduct research and assessment efforts on the first-year experience and student success to present and discuss their work with colleagues at the institution. This would allow both for a presentation of the work being conducted on campus and time for individuals who are doing similar work in different areas on campus to meet, discuss and collaborate on their efforts. This would enhance assessment knowledge and skills and promote a culture of assessment. Additionally, campus initiatives to improve student success could be discussed (e.g., current IMPACT efforts, Supplemental Instruction, etc.).
  6. Establish a culture of improvement based on evidence tied to job responsibilities. A method to allow for this is to endorse the engagement in and use of assessment data as an integral part of every staff and faculty member's job responsibilities and annual departmental reviews. These efforts by faculty/staff should be reflected in the annual job performance evaluation process as well as the departmental budget discussions.

137. 2010 Re-accreditation Self-Study Report for the Higher Learning Commission of the North Central Association of Colleges and Schools: Reaching New Heights. Purdue University. (accessed July 12, 2012)

138. Ibid.

139. Ibid.

140. Ibid.

141. Evidence Library #99: HLC/NCA Advising Assessment Report; #101: University Advisor Assessment Tool; #339: Data Digest Student Orientation and Learning Communities; #434: Career Counseling Contact Hours; #435: Career Counseling Survey Summary; #436: Career Counseling Survey Presentation; #437: Data Digest Career Counseling Page; #438: Career Counseling Peer Comparison Presentation; #478: Dept Heads' Responses to Use of Data; #480: Deans' Responses to Use of Data; #481: Purdue Promise Assessment Plan; #482: Purdue Promise FYE Course Feedback; #483: Purdue Promise FYE Course Evaluation; #485: Purdue Promise Cohort Goals; #486: Purdue Promise FYE Course Pre-Test; #487: Purdue Promise First-Year Evaluation; #488-489: Purdue Promise Academic Coaching Evaluation (and student form); #490: Purdue Promise Mentoring Evaluation; #492: Purdue Promise Signals Intervention; #493: Purdue Promise Sophomore Focus Group; #494-495: Purdue Promise Student Leader Training Pre-Test and Post-Test; #496: Purdue Promise Mentoring Focus Groups; #497-498: Purdue Promise Four-Year Graduation Plan and Presentation; Student Survey Questions: 85, 80; Faculty/Staff Survey Questions: 89, 90, 101, 11. Purdue University.

142. Evidence Library #55: Fall 2011 Retention and Graduation Rates; #68: Examination of Four-Year Baccalaureate Completion Rates; #304-305: SON entering class profile (Fall 2009-2011); #306: SON retention data; #336: Data Digest Academic Preparation of New First-Year Students; #337: Data Digest Headcount of New First-Year Students; #338: Data Digest Retention and Graduation Rates of New First-Year Students; #341: Data Digest Enrollment by Age; #342-344: Common Data Set Section B, C and D; #347: OIR New Beginners Report; #363: Fall Housing Summary; #364: Resident/Nonresident Comparison; #422: Admissions Dashboard; #423: School Profiles; Faculty/Staff Survey Questions: 9, 10. Purdue University.

143. Evidence Library #1: National Survey of Student Engagement (NSSE); #4: 2010 Student Importance & Satisfaction Survey; #5: 2011 Graduating Student Learning Outcomes Survey Report; #11-12: Eduventures 1 and 2; #339: Data Digest Student Orientation and Learning Communities; #356: BOT Governance Report on Student Success; #364: Resident/Nonresident Comparison; Faculty/Staff Survey Questions: 11, 52, 53, 54. Purdue University.