Frequently Asked Questions

Institutional Data

Where can I find institutional data?

Purdue University’s institutional data can be accessed via the following tools:

  • Data Cookbook
    Metadata & Definitions
  • Data Digest
    A collection of public-facing dashboards that provide comprehensive quantitative information on the major dimensions of the university, including students, faculty, staff, and budget, through interactive visualizations.
  • Cognos
    Purdue’s enterprise reporting tool, an IBM product supported by the Business Intelligence Competency Center (BICC). Data has been grouped by subject matter, summarized into standard reports, and is available for custom reporting. Faculty and staff with a business need for data at a granular level should visit the BICC’s website.

What reports and dashboards should I use?

What is the difference between live (warehoused) versus frozen data?

  • Do I need to see how things stand now? Use live data.
  • Do I need to see how things were when the event happened? Use frozen data.

Does Purdue have a Factbook?

How do I access the restricted filters on the Data Digest?

The restricted filters can be accessed by those affiliated with Purdue who hold a current FERPA certification. It may take up to 24 hours after completion of the certification before access is granted.

What certifications are required for me to have access to Student data, Human Resources data, or Finance data?

How do I get access to Cognos?

Where can I find training resources for Cognos?

What is a census date and why is it important?

Which other offices on campus also collect and provide data on students, faculty, and staff?

There are several offices on campus that collect data on Purdue University students, faculty, and staff, and each has separate and vital reporting responsibilities. Human Resources Data and Analytics collects and provides data on faculty, staff, and graduate research/fellowships. Academic Data Managers provide college- and departmental-level information.

When are enrollment data updated for the current academic year?

Why is my number different than yours?

  • Different assumptions
    For example, you included lecturers in your faculty count, but I excluded them.
  • Different data sources
    For example, you’re looking at live data and I’m looking at frozen data (see live vs. frozen data above).
  • Different systems
    For example, you’re looking at faculty counts from Elements whereas I’m looking at faculty counts from Success Factors.

What are the steps for adding students to a cohort?

Peer Comparisons/Benchmarking

What is the Common Data Set (CDS)?

What is the Integrated Postsecondary Education Data System (IPEDS)?

Who are Purdue’s peer institutions? 

  • Expanded Big Ten:
    • University of California Los Angeles
    • University of Illinois
    • Indiana University
    • University of Iowa
    • University of Maryland
    • University of Michigan
    • Michigan State University
    • University of Minnesota
    • University of Nebraska-Lincoln
    • Northwestern University
    • Ohio State University
    • University of Oregon
    • Pennsylvania State University
    • Rutgers University-New Brunswick
    • University of Southern California
    • University of Washington
    • University of Wisconsin-Madison
  • Association of American Universities (AAU) Non-Medical Peers:
    • California Institute of Technology
    • Georgia Institute of Technology
    • Massachusetts Institute of Technology
    • University of Colorado-Boulder
    • Carnegie Mellon University
    • University of Texas-Austin
    • Princeton University
    • University of California Berkeley
    • Brandeis University
    • Rice University
    • University of California Santa Cruz
    • University of California Santa Barbara
    • University of Oregon
    • University of Notre Dame
    • University of Illinois (has new medical school)
    • Arizona State University (has new medical school)

How do I get access to Peer Benchmarking Data?

Many items are self-service and available publicly on the web. 

  • ICHE/CHEDSS (Indiana Commission for Higher Education / Commission for Higher Education Data Submission System) 
  • AAUDE (Association of American Universities Data Exchange)
  • CSRDE (Consortium for Student Retention Data Exchange)
  • AAUP (American Association of University Professors)
  • UIA (University Innovation Alliance)
  • Academic Analytics
  • OSU FSS (Oklahoma State University Faculty Salary Survey) 

To what other organizations does Purdue University report its student and faculty data?

Purdue University provides student and Human Resource (HR) data on a scheduled basis to the groups outlined below.

Government Agencies/Mandated Reporting:

  • IPEDS (The Integrated Postsecondary Education Data System)
  • ICHE/CHEDSS (Indiana Commission for Higher Education / Commission for Higher Education Data Submission System)
  • NCAA (National Collegiate Athletic Association)
  • CRRSAA: HEERF (Coronavirus Response and Relief Supplemental Appropriations Act Higher Education Emergency Relief Fund)

Rankings:

  • US News and World Report
  • Times Higher Education (THE) World University Rankings

Data Exchanges/Consortiums:

  • CDS (Common Data Set Exchange)
  • AAUDE (Association of American Universities Data Exchange)
  • CSRDE (Consortium for Student Retention Data Exchange)
  • AAUP (American Association of University Professors)
  • UIA (University Innovation Alliance)
  • Academic Analytics
  • NC-SARA (National Council for State Authorization Reciprocity Agreements)
  • OSU FSS (Oklahoma State University Faculty Salary Survey)
  • College Board

Is there a location to explore peer institutions’ websites and data resources?

The Association of American Universities Data Exchange (AAUDE) provides a tool for exploring participating peer institution websites to retrieve a variety of information quickly and efficiently. This Tableau viz of AAUDE Institutional Links provides an easy access point for visiting these webpages.

What metrics are available for comparison?

IDA+A Products and Services

Who can request work from IDA+A?

  • Academic Data Managers
  • Athletics
  • Board of Trustees
  • Center for Instructional Excellence
  • CILMAR
  • Graduate School
  • Purdue Information Technology
  • Purdue Online
  • Student Life
  • Student Success
  • Summer Session
  • Treasury and Operations
  • Vice Provost of Diversity, Inclusion and Belonging
  • Vice Provost of Faculty Affairs
  • Vice Provost of Teaching and Learning
  • Undergraduate Curriculum Council

How do I make a data request, general project request, or survey project request?

What is the cost for IDA+A services?

IDA+A provides services to the campus community and partners at little to no cost. We are service-based internal consultants who are passionate about using data to inform decision-making across campus.

What is the average turnaround on a data request?

IDA+A works to meet unit/local deadlines whenever possible. Turnaround time varies based on complexity of the request.

The Data Digest does not have the information I need. Who can I contact?

I need help building a dashboard/report, who can I contact?

Where can I find an overview on Data Digest?

How can I connect with Purdue’s other data offices?

Data Governance

How is data governance defined at Purdue?

Data governance is the process of managing data quality, consistent data definitions, business logic, and reporting practices to allow for efficient operations and strategic decision-making. 

What is Purdue University’s Data Governance structure?

Who decides what access I have to data?

What steps do I need to take to share Purdue data externally?

Metadata

Where can I find more information on the university’s metadata? Is there a data dictionary available?

What is Data Cookbook? 

Data Cookbook software is the enterprise source for Purdue’s data definitions, standard reports, and dashboards. It allows the institution to discuss, agree upon, document, and share definitions and report specifications. Having a well-documented data dictionary and catalog of official, standard reports allows for the efficient use of the abundance of institutional data.

Who has access to Data Cookbook?

Where can I find training on Data Cookbook?

How do I add definitions and specifications to Data Cookbook?

How can I add feedback or ask a question regarding a Data Cookbook definition or specification?

Definition and Specification pages include a ‘History and Comments’ section at the bottom of the page. If a user has a question or would like to provide feedback to improve the definition, they can add a comment. Comments are routed to the functional area for review.
Please note, all comments are publicly visible to everyone with a Cookbook account.

Assessment

What is assessment?

In essence, assessment is a process of purposeful and intentional planning aimed at collecting and analyzing data and making meaningful connections from it to inform decisions on effectiveness and drive continuous improvement. It is synonymous with continuous improvement, which you can learn more about in the Assessment 101 training video.
Typically, ‘assessment’ implies an emphasis on student learning; however, at Purdue, assessment describes gathering and analyzing data for “institutional, departmental, divisional, or agency effectiveness” (Upcraft & Schuh, 1996). At its core, assessment guides good practice (Erwin, 1991).

Why do we assess?

Assessment is a process of improvement; it is primarily a mechanism that informs practitioners and serves to advance the practices encompassed in learning and development (across individuals or groups [students, faculty, staff, etc.], programs, institutions or campuses, and organizations). By assessing, we identify areas of improvement and solutions that will lead to better outcomes. Assessment is also a form of accountability, demonstrating that the practice of teaching and its outcomes, namely learning and development, reflect best practices and effectively contribute to equitable achievement.
“In this sense, assessment for improvement is essentially an internal matter. In contrast, assessment data collected for the purpose of accountability are used primarily to demonstrate that the institution is using its resources appropriately to help students develop the knowledge, skills, competencies, and dispositions required to function effectively in the 21st century. The information is typically intended for external audiences” (Ewell, 2009).

What assessment services does IDA+A offer?

  • Assistance to academic and co-curricular programs with evaluation of their curriculum, specialized and regional accreditation activities, and assessment of student learning achievements.
  • Support for the measurement of cognitive outcomes (e.g., critical thinking, civic literacy, problem solving, quantitative reasoning), dispositional or affective outcomes, and skill-based outcomes.
  • Guidance on survey methodology and assistance with securing a sample of Purdue constituents to survey.
  • Assistance with, or management of, the collection, analysis, and reporting of qualitative data (written responses to open-ended questions, interviews, focus groups, text, etc.) to inform your decision-making needs.

What data collection support does IDA+A’s assessment team offer?

  • IDA+A supports the collection of data via methods that occur outside of typical business intelligence mechanisms or information technology.
  • IDA+A offers expertise in both quantitative and qualitative methodologies, with a staff that brings a diversity of ontological, axiological, and epistemological views.
  • IDA+A assists our colleagues in combating survey or measurement fatigue by providing custom samples of students, faculty, or staff. IDA+A draws these samples to ensure that researchers get a sufficient list of their target population while also ensuring that no individual on campus is contacted for surveys too frequently. Please use our project request form to request our services in this area.
  • IDA+A provides expertise on survey methodology and measurement in education topics, such as, but not limited to: inference and error in surveys; target populations, sampling frames, and coverage errors; sample designs and sampling errors; methods of data collection; messaging strategies; nonresponse in sample surveys; questions and answers in surveys or questionnaires; post-collection processing of survey data; assessment of qualitative results; and principles and practices related to ethical survey and measurement in higher education.
  • IDA+A supports survey design and administration in Qualtrics. We can help ensure a survey will obtain unambiguous results, and we can add desired features to keep the survey engaging and clear for respondents.
  • IDA+A maintains a marketplace of survey metadata and contact information. Researchers wanting data about any topic can look there and see what exists and who to contact to learn what the previous survey found. 
  • Additionally, IDA+A supports the collection, secure storage, analysis, and reporting of qualitative data. IDA+A knows the value of this rich source of data and is happy to support you in garnering insights from it.
  • IDA+A does not make judgments or recommendations about the propriety of any specific survey protocol. That is the job of the Institutional Review Board (IRB), and IDA+A defers to the IRB.

Why can’t we rely on grades?

In short, measuring student achievement with grades is like measuring distances with unstandardized pieces of string. Course grades are designed or decided separately by hundreds of different instructors to match expectations for hundreds of different courses and hundreds of different learners, at any given point in time.
Academic departments often work to ensure that all sections of a course have similar content, requirements, and expectations, but there is no comparable effort to ensure that all courses across Purdue University demand similar knowledge gain, effort, or subject mastery to earn an ‘A’.

Why can’t we use student survey data in assessment?

Surveys are a method to indirectly assess student learning; via surveys or questionnaires, students can self-assess their own learning achievement or developmental progress. However, students cannot be relied on to objectively evaluate the degree to which they have gained or demonstrated the necessary knowledge, attributes, and/or skills to earn a certain credential (e.g., degree, certificate) or succeed in a profession. Instead, objective standards or direct measures (rubrics, tests, quizzes, etc.) need to be established, maintained, and applied in non-subjective ways by experts using valid measures and methods to ascertain achievement of outcomes or competencies.

How is assessment data used?

Assessment is used for the continuous improvement of the organization engaging in the assessment activity or activities. Those who work with IDA+A can make decisions, modify practices, or take other actions based on our analysis of their data. For those at the beginning of their assessment journey, assessment data may be used simply to gather a baseline from which to inform future assessment.
At the outset of any project, IDA+A works with campus partners to understand and agree on how and with whom assessment data collected during the project is shared.

What is the difference between assessment and evaluation?

See above for the definition of assessment. Evaluation differs from assessment principally in its focus on the program or process rather than on the outcomes (whether they be learning, developmental, affective, kinesthetic, or behavioral). As defined by Russ-Eft and Preskill (2009), “Evaluation is a form of inquiry that seeks to address critical questions concerning how well a program, process, product, system, or organization is working.”
In other words, the question is not so much “To what degree did the outcome occur?”, but is instead “Is the program or process functioning to its highest and best capacity?”

What is the difference between assessment and research?

See above for the definition of assessment. Research is defined as “…a truth-seeking activity which contributes to knowledge, aimed at describing or explaining the world (Coryn, 2006, p.1).” 
As opposed to assessment or evaluation, research focuses almost solely on contributing to a field or discipline of knowledge, rather than on making recommendations for bettering practice (namely, teaching/instruction) and, therefore, outcomes from it (namely, learning and development). Thus, the role of the assessment professional is nearly always more prescriptive than that of the researcher; in other words, the ethical assessment professional is required to suggest options for a course of action based on findings.
Further, as opposed to research, assessment is more likely to be constrained by considerations of time/resource limitations, and to be focused on “… utility, feasibility, propriety, and inclusion of stakeholders (Mathison, 2008).”

What is the difference between assessment and institutional research?

See above for the definition of assessment. “Institutional research is research conducted within an institution of higher education to provide information which supports institutional planning, policy formation and decision making” (Saupe, 1990). A variety of models exist for institutional research (IR) units, which can include IR functions such as research and planning of institutional resource allocation, evaluation and assessment activities, leadership in institutional data management and use, and federal and state reporting.
IDA+A performs research, statistical and predictive analysis, and reporting for campus leaders and decision makers in support of evidence-based planning, evaluation, and assessment.
As opposed to assessment, IR is an encompassing term that includes using institutional resources for the methodological inquiry into organizational structure, function, and operations.

Is there a minimum number of individuals needed to report across various demographics?

The minimum number depends on the proportion of individuals belonging to a demographic and/or on the total number of respondents with a certain demographic, descriptor, or qualifier. To protect the confidentiality of individuals belonging to demographics that are under-represented in the responses to a survey or other data collection effort, IDA+A recommends erring on the side of caution by not reporting publicly on demographics when relatively few individuals (typically, anything between 4 and 10) can be counted or grouped within a specific demographic, or set of demographics, and/or comprise the total number of respondents. When data may reveal instances of inequity or discrimination, results may be reported to those who need to be aware, even when the guidelines above are not met. Please consult with a member of IDA+A’s Assessment Team if you are in doubt.
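
For teams that prepare such reports programmatically, the guideline above can be applied as a simple masking step before results are shared. The sketch below is illustrative only: the threshold, file name, and column names are assumptions, and the appropriate cutoff for a given report should be confirmed with IDA+A’s Assessment Team.

```python
# Illustrative small-cell suppression: mask counts below a chosen threshold
# before reporting by demographic group. Threshold, file, and column names
# are hypothetical.
import pandas as pd

MIN_CELL = 10  # err on the side of caution for under-represented groups

responses = pd.read_csv("survey_responses.csv")          # hypothetical extract
counts = responses.groupby("demographic_group").size().reset_index(name="n")
counts["reported_n"] = counts["n"].astype(object).where(
    counts["n"] >= MIN_CELL, other=f"< {MIN_CELL}"
)
print(counts[["demographic_group", "reported_n"]])
```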

Where can I find the most common annual survey questionnaires used at Purdue? 

Where can I find an annual calendar of survey projects currently being administered across our campus?

How can I add a survey to the calendar?

Are surveys at Purdue confidential?

Do all survey projects have to be submitted to the IRB for review?

No. Whether an IRB application needs to be submitted is based on whether the proposed project meets the federal definition of research; it is not based on the research or data collection method.

Who decides whether an IRB application or protocol should be submitted?

Accreditation

What is accreditation? Why should a university be accredited?

What role does IDA+A play in accreditation?

Where should I direct questions or comments about Purdue’s HLC accreditation? 

When was Purdue last reviewed for accreditation?

Purdue’s most recent accreditation review by the Higher Learning Commission of the North Central Association of Colleges and Schools (HLC) was in October 2019. As a result, Purdue received formal notification of its continued 10-year, unrestricted (open-pathway) accreditation. The lengthy process included a written self-study report with input from a large number of Purdue faculty and staff, and an extensive visit from an HLC external peer review team. Purdue has been accredited continually since 1913 and will be evaluated again in 2029-30.

Data Science & Engineering

What are predictive analytics?

Predictive analytics is the use of data, statistical algorithms and machine learning techniques to identify the likelihood of future outcomes based on historical data. The goal is to go beyond knowing what has happened to providing a best assessment of what will happen in the future.
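
As a concrete, purely illustrative example, the sketch below fits a model on historical records with known outcomes and then scores the likelihood of that outcome for a current cohort. The file names, predictor columns, and outcome are assumptions, not an actual IDA+A dataset or model.

```python
# Illustrative predictive-analytics workflow: learn from historical data with
# known outcomes, then estimate the likelihood of the outcome for new records.
# All file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

history = pd.read_csv("historical_cohort.csv")            # past terms, outcomes known
X = history[["hs_gpa", "credits_attempted", "midterm_gpa"]]
y = history["retained_next_fall"]                          # 1 = retained, 0 = not

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

current = pd.read_csv("current_cohort.csv")                # outcome not yet known
current["retention_likelihood"] = model.predict_proba(current[X.columns])[:, 1]
```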

What is PDAP?

PDAP stands for Purdue Data Analytics Platform. It comprises two platforms:

  • The Greenplum Database
  • The High-Compute Research Platform Zeus

These platforms facilitate faculty research and cutting-edge data science projects.

What is Greenplum?

Greenplum is a massively parallel processing (MPP) distributed database designed for analytics. It serves as secondary storage for several datasets across campus, allowing IDA+A to quickly and easily combine disparate data sources together to complete novel projects.
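
Because Greenplum is PostgreSQL wire-compatible, analysts can typically reach it with standard PostgreSQL client libraries. The sketch below shows one way this might look from Python; the host, credentials, schemas, and table and column names are placeholders, not actual PDAP objects, and access is governed by the request process described elsewhere in this FAQ.

```python
# Illustrative query against a PostgreSQL-compatible MPP database such as
# Greenplum, joining two hypothetical sources. Connection details and schema
# names are placeholders only.
import psycopg2

conn = psycopg2.connect(
    host="greenplum.example.purdue.edu",   # placeholder host
    dbname="pdap", user="analyst", password="***",
)
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT e.student_id, e.term, COUNT(s.swipe_ts) AS dining_swipes
        FROM student.enrollment AS e
        LEFT JOIN card.swipes AS s USING (student_id)
        GROUP BY e.student_id, e.term
    """)
    for row in cur.fetchmany(5):
        print(row)
conn.close()
```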

What is Zeus?

Zeus is a high-compute machine with 4 NVIDIA A100 GPUs, 96 processors, and 1 TB of RAM. It is used for cutting-edge experimentation, including generative AI with Large Language Models (LLMs), as well as faculty research.
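
Before launching GPU-heavy work on a machine like Zeus, it is common to confirm which accelerators a session can actually see. The sketch below does that with PyTorch; it assumes PyTorch with CUDA support is installed and that the GPUs are exposed to the session.

```python
# Illustrative check of the GPUs visible to a session before starting an
# LLM or other GPU-heavy job. Assumes PyTorch with CUDA support is installed.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GB")
else:
    print("No CUDA devices visible; check drivers or the session allocation.")
```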

How does Purdue make Artificial Intelligence decisions and ensure that data is used ethically?

What types of data science projects does IDA+A do?

The data science team has a broad mandate. Our team has worked with academic units to predict student success, with energy & utilities to help predict utility usage, with the registrar to build degree requirements as complex graphs, and more. If you have a data project with a predictive component, or one that seems more novel or complicated than something the industry has done before, we can likely help in some way.
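
To make the “degree requirements as complex graphs” idea concrete, the sketch below models a few prerequisite relationships as a directed graph. The course codes and edges are made up for illustration, and it assumes the networkx package is available.

```python
# Illustrative prerequisite graph: each edge points from a prerequisite to the
# course that requires it. Courses and edges are invented for this example.
import networkx as nx

reqs = nx.DiGraph()
reqs.add_edges_from([
    ("MA 161", "MA 162"),
    ("MA 162", "MA 261"),
    ("CS 180", "CS 240"),
    ("CS 240", "CS 251"),
])

# One valid order in which the courses could be taken:
print(list(nx.topological_sort(reqs)))
# Everything that must precede CS 251, directly or indirectly:
print(nx.ancestors(reqs, "CS 251"))
```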

What data sources are available in PDAP?

The Greenplum database stores data from several sources.  Some of the heavily used data sources are:

  • Student Systems
    Data from the Banner Student System pertaining to courses, instruction, student applications, admissions, grades, and profiles.
  • Student Housing
    Data pertaining to student residence halls and housing options. 
  • Retail Dining
    Data pertaining to retail dining choices in residential halls and other university locations.
  • Card Services
    Data pertaining to card swipes for accessing academic buildings and the RecWell Center, purchasing meals or supplies in dining halls, and On-the-GO locations.
  • Wireless Access Logs
    Data pertaining to mobile and other devices connected to Purdue’s Wi-Fi network access points.
  • Learning Management Systems
    Blackboard Learn and Brightspace
  • Contact Tracing
    Contact Tracing data shared with Protect Purdue Health Center during COVID.
  • COVID Testing
    COVID test result data from COVID tests administered by various vendors for Purdue.
  • Degree Audit Data
    Data from Degreeworks and Edunav pertaining to the progress a student has made toward a degree and the courses required to graduate.
  • UniTime
    Course scheduling data.
  • Space 
    Data pertaining to the various spaces on the Purdue campus, including, but not limited to, their location, size, and purpose.
  • Slate – Undergraduate and Graduate
    Undergraduate and graduate pre-entry data pertaining to potential new students and their applications.
  • PREMIS (Purdue Registration and Event Management Information System)
    Registration data for Virtual Student Transition, Advising, and Registration (VSTAR) and Boiler Gold Rush (BGR) orientation programs for incoming students.
  • Student Organizations (BoilerLink)
    Data pertaining to student organizations.
  • Boilerconnect
    Data from BoilerConnect – a student success system that links staff, faculty, and students in a coordinated support network.
  • Surveys
    SERU – Student Experience in the Research University
    Student Orientation
  • Curriculog
    Data from the tool used to submit, review, and approve curriculum and course proposals.

References

  • Erwin, T.D. (1991). Assessing Student Learning and Development: A Guide to the Principles, Goals, and Methods of Determining College Outcomes.
  • Ewell, P. T. (2009, November). Assessment, accountability, and improvement: Revisiting the tension. (Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).
  • Lundquist (n.d.). https://www.utep.edu/student-affairs/_Files/docs/Assessment/Campus-Labs-Assessment-Evaluation-Research-Definitions-Handout.pdf
  • Palomba, C. and Banta, T.W. (1999) Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. Jossey-Bass, Inc., San Francisco.
  • Saupe, J. L. (1990, March). The Functions of Institutional Research, 2nd Edition. The Association for Institutional Research.
  • Upcraft, M.L., & Schuh, J.H. (1996). Assessment in Student Affairs: A Guide for Practitioners. The Jossey-Bass Higher and Adult Education Series.