Assessment, Research & Evaluation

Tools of the Trade

In crafting an assessment plan, variety is key to achieving the most successful results. Using both qualitative and quantitative methods will enhance the trustworthiness of overall findings and conclusions. Further, different types of data serve different functions, so combining techniques will deliver more, and better quality, data to analyze and use to guide programming decisions.

Different assessment techniques also have their own distinct merits and limitations. Regardless of the type of program, there are assessment techniques that will meet your specific needs.

Quantitative Techniques:

Checklists

Participants or program staff complete a list of criteria or factors relating to a program or activity.

+ Easy to use

+ Reduce chances of omitting important criteria or factor

- Criteria need validation

- Data may be superficial

Validated Instruments

A validated instrument is administered to participants before and after, or just after, their involvement in a program.

+ Rigorously documents learning gains

+ Data analysis is easy

+ Facilitates comparisons across programs and populations

+ Can reliably measure the magnitude of learning gains

- Creating and validating new instruments is impractical for most programs

- An existing instrument may not be available that measures a program's specific outcomes

- Unanticipated outcomes are missed

Quantitative Surveys

Data is collected from participants using quantitative question items. Instruments can be mailed out, completed on site, or administered in structured interviews.

+ Easy to administer to large groups

+ All individuals have an equal opportunity to respond

+ Facilitates comparisons across programs and populations

- Developing instruments takes time and skill

- Unexpected themes and outcomes are missed

- Item wordings can bias responses

- Return rates can be low

Qualitative Techniques:

Journals

Recording of events, feelings and observations over time, revealing the personal perspectives of the writer.

+ Easy to initiate

+ Can reveal an insider's evolving views of a program

+ Can identify unanticipated themes

- Data is unreliable if not systematically recorded

- Data can be difficult to analyze

- Writers can be biased or untruthful


Open-Ended Surveys

Participants respond in writing to focused and open-ended questions.

+ Easy to administer to large groups

+ Unanticipated ideas or outcomes may emerge

- Data can be thin or superficial

- Return rates can be low

Focus Groups

A moderator facilitates a group discussion of pertinent issues and topics.

+ Gather in-depth data

+ Group interactions can produce rich data

+ More time effective than individual interviews

+ The process is flexible

+ Unanticipated outcomes may emerge

- Requires a skilled moderator

- Shy individuals may be reluctant to vocalize their thoughts and ideas

- Scheduling group meetings can be difficult


Interviews

An interviewer asks program participants open-ended questions.

+ Gather in-depth data

+ The process is flexible

+ Unanticipated ideas or outcomes may emerge

- Requires a skilled interviewer

- Can be time-consuming to gather, transcribe and analyze data


Observations

While observing a program activity, an observer records what they see and hear. Observations may be done live or recorded.

+ View operations of a program as they occur

+ Can adapt to events as they occur

- Observers need training

- Observer's presence may affect participants' behavior

- Interpreting observed behaviors can be difficult

- Data collection is time-consuming

Embedded Assessments

Students demonstrate their mastery of skills and knowledge by completing targeted course assignments.

+ Data directly demonstrates what students can and cannot do

+ Performance based

+ Students have an incentive to do their best

- Data analysis is time-consuming

- Rating rubrics need validation

- Raters require training and practice to reduce inter-rater variability

Learning Portfolios

Program participants collect, organize and reflect on samples of their work covering the breadth of the program.

+ Data provides a continuous picture of progress

+ Performance based

- Data is time-consuming to analyze

- The process can be logistically difficult

- Rating rubrics need validation

Reflective Essays

Participants write brief essays explaining what they gained from a program or activity.

+ In-depth measure of perceived learning

+ Data may reveal unanticipated outcomes

- Analyzing large numbers of essays is time-consuming

Methodologically Mixed Techniques:

Follow-up Contacts

Participants are periodically contacted following their participation in a program.

+ Participants can appraise a program's value in light of subsequent experience

+ Longer-term outcomes can be documented

- Locating past program participants can be difficult

- Contacting large numbers of participants is time-consuming

Longitudinal Tracking

A diverse range of data about program participants is tracked and stored in a database.

+ Tracks long-term behavioral and performance outcomes

+ Integrates newly-collected and existing data

+ Facilitates comparisons across populations and programs

- Databases require time and thought to establish and maintain

- Locating past program participants can be a challenge


DLC Assessment Staff