
Category Archives: Student Behavior

Feedback: “Nice job. I enjoyed reading this. A-“

Posted in Classroom, Course Redesign, General Education, Student Behavior.

I took a cohort program for my master’s and had the same instructor for 4 or 5 courses. Each assignment was an essay. On every essay I got exactly the same feedback – absolutely no comments on grammar or specific ideas, just the generic “Nice job. I enjoyed reading this. A-.” To this day, I have no idea why I got “Nice job. I enjoyed reading this. A-” rather than “Nice job. I enjoyed reading this. A+” (which the student who sat next to me always got). (This feedback was especially sad considering that this was a master’s in adult learning. But that’s another story.)

Feedback can guide students in different ways. Here I would like to focus on three types of feedback: feed back, feed forward, and feed up (not to be confused with “fed up” – which is what I was in my master’s program).

Feed back, up and forward

  • Feed forward (FF) – feedback that explains how to improve future assignments

  • Feedback (FB) – ipsative feedback comparing current performance with past performance

  • Feed up (FU) – feedback that explains why this (the assignment or assignment details) is important

If we identify our purpose(s) when we provide feedback, we can support students in learning from and applying both the assignment and the feedback!

Essay examples:

FF – “Organizing your essay will help your readers. Following the sequence of what is asked in the assignment will help you both cover all the required elements and better organize your thoughts.”

FB – “On your last assignment I noted that you changed ‘voice’ often. Here you are consistent and your essay is much easier to read because of it!”

FU – “You do not seem to have a firm grasp on the differences between the behaviorist and constructivist theories. Understanding this is important because workplaces will want you to develop training based on these.”

Multiple choice exam examples:

FF – “In order to improve your performance on the upcoming [assignment/exam/group project], please review the [notes and materials/resources] posted in Blackboard.” (Purdue ITaP, 2013)

FB – “You are doing a better job studying. Your improvement is great!”

FU – “Understanding the basics of Excel which we cover here will be critical to your success in your accounting class.”

Here’s the whole model:

[Figure: the full feed back / feed forward / feed up model]

(Somewhat based on Hughes, 2012)

Is feedback important?

I remember the feedback I got 15 years ago in my master’s program because it was so bad. It did not inspire me or help me improve.

Good feedback may not be as memorable long term, but research has shown that it can help students improve not only what they know, but how they study and how they apply their learning.

Passing note on Passnote:

By the way, writing appropriate feedback can be hard. At Purdue, we created Passnote to help. It is an easy-to-use tool with a selection of feedback comments that you can select and edit to individualize your feedback for each student. And you don’t have to download anything or sign in to use it. Take a look: http://www.purdue.edu/passnote/


Hughes, G. (2012). Ipsative assessment: comparison with past performance. Higher Education Academy Workshop and Seminar Series 2012. Retrieved June 15, 2014, from http://www.ucl.ac.uk/~ucgbarg/OU_workshop_files/TWO37-GH.pdf

Purdue ITaP. (2013). PassNote. Retrieved June 15, 2014, from http://www.purdue.edu/passnote/

Never Overlook the Value of Communication When Teaching Online

Posted in Distance Education, Musings on Technology, Student Behavior.

I’ve been teaching online at an institution other than Purdue for about 7 years now.  During the Fall 2013 semester, a student commented to me that they really appreciated the amount of communication I had with them during the semester.  Another student mentioned that I was much more engaged compared to his previous online course instructors.

For some reason these comments really haunted me after that term.  Yes, it felt great to get that kind of feedback from students because it was positive.  However, I have since been curious about why these students praised my involvement.  Why is it odd to students that online instructors are engaged in their courses?  And if it is, shouldn’t that be somewhat alarming?

Engagement is a two-way street.  We can’t expect students to be highly engaged in their classes while we as faculty appear to be simply observing the class…or at worst, completely unengaged and uncaring about what is going on.

One aspect where student performance can be impacted positively by communication from faculty is through feedback.  Chickering and Gamson (1991), in their Seven Principles for Good Practice in Undergraduate Education, list two principles that work hand-in-hand when it comes to communication:  Giving prompt feedback, and communicating high expectations.

If I simply state in my syllabus that I expect strong performance from my students on an assignment, but I provide little to no feedback to students, I am not being effective in providing guidance to high-performing students who may simply need reinforcement that they’re on the right track.  I am also not being effective with lower-performing students by not providing them with the feedback and information they need to improve their work and rise to the expectations I have for the class.  If I don’t tell a student what I expect and clearly communicate to them what they need to do to improve, how can I expect them to do better?

So what’s so important about prompt feedback?   Prompt feedback plus communication about what the student needs to continue doing (or improve upon) can make a difference in the student’s performance.  Not providing prompt feedback can put a student in a position where they don’t know what to improve upon until after the submission of additional assignments or assessments.

There are other ways to communicate that can keep you engaged with the course.  Consider using Announcements within Blackboard to provide updates and information that can help students, such as tips on how to complete assigned tasks or reminders about due dates.  If you do use Announcements, change your course entry page from Course Content to Announcements so they are the first thing a student sees when they log in.  In addition, critical announcements can also be emailed to students.

Furthermore, if you’re teaching online or a blended course where synchronous activity with your students is limited, you may wish to add online office hours using web conference tools provided by Purdue.  This can allow you to host real-time discussions with students wherever you are.

Communicating feedback and expectations is important for student success.  However, simply communicating with your students to let them know that you’re engaged and available can also demonstrate that you care about your students and their involvement in your class.

To discuss ways to increase communication with your students, please contact us at tlt-consulting@purdue.edu.

Brett Creech
Educational Technologist

Chickering, A. W., & Gamson, Z. F. (1991). Appendix A: Seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 47, 63-69.

What is Blackboard Learn’s Retention Center?

Posted in Blackboard Learn, Student Behavior, Tools.

Envision this. You are an instructor who wants real-time tracking in the course management system you’re already established in. You enjoy simplicity in picking your criteria for monitoring student performance and fancy something that provides an easy mechanism for informing students who are falling short of the course’s expectations.

The Retention Center can be that tool for you. It replaces the Early Warning System and has been available since the start of the Fall 2013 semester.

And now, here is a Frequently Asked Questions roundup:

Give me a quick definition: The Retention Center is defined by Blackboard as a tool that can determine whether students are at risk according to the criteria you choose to set up and monitor. Once the criteria are in place, the instructor is notified which students are currently at risk.

Doesn’t Course Signals already do something similar?: First, it is based on the same basic philosophy as Course Signals: it is a tool that enables you to take action to improve student performance in your course(s). Second, while Course Signals requires reports to be generated, the Retention Center provides automatic monitoring. It is important to note that Course Signals works to predict where a student will finish, performance-wise, in a course, given their current grades and interactions with content in Blackboard Learn. The Retention Center is designed to give the instructor an up-to-the-minute picture of how students are performing, but does NOT predict performance. Lastly, there are benefits to using either or both tools, and an article in the near future will provide a comparison.

So, what allows the Retention Center to work?: Retention Center is built on the idea of using different types of monitoring guidelines, called Rules. Currently, it employs four types of rules, and here is a breakdown of each type:

  • Course Activity: This monitors the overall activity of students using your course, such as viewing pages, clicking links to items, taking online assessments, and writing in the collaborative tools (blogs, discussion board, journals, wikis).
    • Criteria for measuring: student’s activity in the last # of days/weeks/months, above/below # percent of the course’s average.

  • Course Access: Tracks the number of days since a student was last recorded accessing the course.
    • Criteria for measuring: # of days since last course access.

  • Grade: Determines if a student is above/below a specific or average point/percentage value in what they have earned as a final grade or from other grade items (assignments, tests, etc.).
    • Criteria for measuring: Choosing to monitor final grade or specific item. Set Grade Value above/below # of point/percentage value. Or, grade is above/below the average grade by a percentage of #.

  • Missed Deadline: Tracks whether a student has missed many deadlines or a specific deadline for an assignment, test, or survey.
    • Criteria for measuring: Choose to monitor all deadlines or specific deadline(s); flag if # of deadlines have been missed by more than/less than # of days.

Fair enough, so do I need to set them up from scratch or are there already some in place?: Each course is given four default risk rules, one for each rule type. They are…

  • For Course Activity: Activity in the last 1 week(s) is 20% below average
  • For Course Access: Last access more than 5 days ago
  • For Grade: External Grade is 25% below class average
  • For Missed Deadline: 1 deadline(s) have been missed by more than 0 days

As an added note, you can edit these default rules to change their criteria.
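At heart, each default rule is a simple threshold check against a student’s course data. As a rough illustration only (the data structures and function here are hypothetical sketches, not Blackboard’s actual implementation), the four defaults might be evaluated like this:

```python
# A minimal sketch of the four default risk rules as threshold checks.
# The dict fields and the at_risk() function are hypothetical, for
# illustration only -- not Blackboard's actual implementation.
from datetime import date

def at_risk(student, course_avg_activity, class_avg_grade, today):
    """Return the list of default rules this student currently triggers."""
    flags = []
    # Course Activity: activity in the last 1 week is 20% below average
    if student["weekly_activity"] < 0.80 * course_avg_activity:
        flags.append("Course Activity")
    # Course Access: last access more than 5 days ago
    if (today - student["last_access"]).days > 5:
        flags.append("Course Access")
    # Grade: external grade is 25% below class average
    if student["grade"] < 0.75 * class_avg_grade:
        flags.append("Grade")
    # Missed Deadline: 1 deadline(s) have been missed by more than 0 days
    if student["missed_deadlines"] >= 1:
        flags.append("Missed Deadline")
    return flags

student = {"weekly_activity": 12, "last_access": date(2013, 10, 1),
           "grade": 55, "missed_deadlines": 2}
print(at_risk(student, course_avg_activity=20, class_avg_grade=80,
              today=date(2013, 10, 10)))
# → ['Course Activity', 'Course Access', 'Grade', 'Missed Deadline']
```

Editing a default rule, in this sketch, just means changing a threshold (for example, 0.80 to some other fraction of the course average).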

Can I make as many rules as I want?: Yes, you can. While you will not see more than four columns in the Retention Center risk table, each new rule belongs to one of the four rule types. Thus, if you use the rule for 1 deadline missed in 0 days, and create a new rule alerting if a student missed 2 deadlines in the last 30 days, both will show when you click on the red dot indicating an active risk. Here is an example below:

*In another blog article in this series, we will cover the advanced features in the matching risk factors dropdown.

On another note, do I always have to monitor students at risk? Can I monitor students who are doing well?: Students will appreciate your constructive criticism when it comes to issues in their performance, but they may enjoy your insight even more if you decide to let them know they are doing well and should keep up their on-time, excellent work.

Can I pick out certain students to monitor for risks?: Definitely. Click on any of the red dots that appear to the right of the student’s name, and then click the Monitor button on the Matching Risk Factors dropdown box. The students you are monitoring will appear on the right side of the Retention Center page. Here is an example of the data on a student being monitored:


Can I track my own activity?: Certainly, and it’s encouraged. Just as you expect your students to be involved and to turn in high-quality work, you too should be involved in how you contribute to your course. The types of course activity tracked for instructors are assessment grading, interaction & collaboration with the collaborative tools (discussion board, blogs, journals, and groups), announcement creation, and content created/uploaded. Here is an example of the activity interface:

I am already a month and a half into teaching my course; is it too late to get started?: The beauty of the Retention Center is that you can get started at any point in the semester, and the data generated is instantaneous and relevant. The midterm period of the semester is an important time to inform your students of their performance in your course. Think of it this way: while they may have made mistakes and earned average scores so far, there is always a chance that your intervention will influence them to make more of an effort in the second half of the semester.

How do I email multiple students who have the same risk?: By clicking on the red bar showing the current number of at-risk students, you can view which types register a student as at risk (the first image below shows an example). Once you click one of the risks, a dropdown option box will appear, and by hovering over the Notify button, you can send out a message.


Is there any documentation available for me to use to get started?: Purdue does not currently have documentation for the Retention Center; however, Blackboard Inc. has materials that can walk you through the features. As with other features of Learn, we plan to release how-to documentation and best-practices resources in the near future.

Please check back at the IDC blog for an upcoming blog article in November on Retention Center. In the meantime, feel free to contact us at tlt-consulting@purdue.edu should you have any questions and/or issues.

Developing A Questioning Strategy

Posted in Classroom, General Education, Getting Started, Morning Musings, Student Behavior.

Many of the seminars I attend focus on the strategies instructors can employ to engage students in their own learning and enhance their learning outcomes. The appropriate use of questioning strategies by instructors is a method that can facilitate this process. Research highlights the importance of instructors being able to ask questions that engage students and allow them to expand, clarify, and justify their answers. Nevertheless, instructors often do not receive any training in the use of questioning strategies.

Below are some of the questioning strategies instructors use to engage students:

Instructors Ask Closed vs. Open-ended Questions

Closed-ended questions require a single answer, such as “yes”, “no”, or a brief phrase, and do not invite an elaborated response from students. For example: Was Purdue University founded in 1869? The answer is yes. In addition, closed-ended questions can be used to wrap up discussions, obtain more information from students, or help groups reach consensus. Examples of these kinds of questions include: Have we covered everything?, Does everyone agree this is the best choice?, or Is the class ready to move on?

  • Pros: May require little time to develop and grade. There is one correct answer.
  • Cons: Not ideal if the goal is to stimulate in-depth thinking by students. Questions may not provide students with the opportunity to explain that they do not understand the content or to share an opinion about a topic. These questions may also discourage students from thinking on their own or expressing their real feelings. In addition, students can answer without knowing anything about the topic.

Open-ended questions do not have a single correct answer and leave the formulation of the answer up to the individual. When open-ended questions are posed, students have the opportunity to be creative, structure their response in a manner that best suits them, and develop critical thinking skills. These questions usually begin with “What”, “How”, or “Why.” Some examples of open-ended questions include: What kind of information were you looking for?, How does this information relate to our goal of…?, and What suggestions do you have for…?

  • Pros: These questions encourage students to share their ideas, concerns, and feelings; facilitate the development of enhanced levels of cooperation and understanding among students; and help faculty support diverse ways of student learning.
  • Cons: It is sometimes difficult for faculty to formulate an open-ended question in such a way that students understand the type of response that is expected of them.

Instructors Utilize Bloom’s Taxonomy to Guide the Development of Questions

Bloom’s Taxonomy is a framework that describes three domains or types of learning: cognitive, affective, and psychomotor (Bloom & Krathwohl, 1956). The cognitive domain, pertinent for this discussion, focuses on the development of a hierarchy of thinking skills important in the learning process. The levels of learning found in the cognitive domain can be used by instructors to develop questions that enhance the development of critical thinking skills in students. The grid below provides a glimpse of the types of questions that can be posed to students during the learning process.

Instructors Integrate A Four-Question Technique into Their Discussions

I recently read an interesting Faculty Focus blog post authored by Dr. Maryellen Weimer which described the use of a four-question set that could be used to engage students with course content and promote deeper ways of learning. The strategy was developed and used by Dietz-Uhler and Lanter (2009) in an introductory psychology course. Students were asked to analyze, reflect, apply, and question the content they read. The following question prompts were used:

  • [Analyze]: “Identify one important concept, research finding, theory, or idea…they learned while completing this activity.”
  • [Reflect]: “Why do you believe that this concept, research finding, theory, or idea…is important?”
  • [Apply]: “Apply what you have learned from this activity to some aspect of your life.”
  • [Question]: “What question(s) has the activity raised for you? What are you still wondering about?” (Dietz-Uhler & Lanter, 2009, p. 39)

These researchers found that students performed significantly better on a quiz when they answered the four-question set before, rather than after, taking it. A benefit of using this strategy is that it can be applied to lecture-based learning environments as well as those that promote active learning.

Some Tips on Developing A Questioning Strategy

  • Ask a mix of questions (from all levels of Bloom’s Taxonomy)
  • Create a classroom climate that invites student questions
  • Plan your questions in advance (noting when you will pause to ask and answer questions)
  • Create questions that help students link important concepts
  • Frame questions in language students understand
  • During class discussions, ask one question at a time
  • Rephrase the question if it seems unclear to students



Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (Complete ed.). New York: Longman.

Bloom, B., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans, Green.

Dietz-Uhler, B., & Lanter, J. R. (2009). Using the four-questions technique to enhance learning. Teaching of Psychology, 36(1), 38-41.

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212-218.