
Category Archives: Student Behavior

Developing A Questioning Strategy

Posted in Classroom, General Education, Getting Started, Morning Musings, Student Behavior.


Many of the seminars I attend focus on the strategies instructors can employ to engage students in their own learning and enhance their learning outcomes. The appropriate use of questioning strategies is one method that can facilitate this process. Research highlights the importance of instructors being able to ask questions that engage students and allow them to expand, clarify, and justify their answers. Nevertheless, instructors often do not receive any training in the use of questioning strategies.

Below are some of the questioning strategies instructors use to engage students:

Instructors Ask Closed vs. Open-ended Questions

Closed-ended questions require a single answer, such as “yes”, “no”, or a brief phrase, and do not invite an elaborated response from students. For example: Was Purdue University founded in 1869? The answer is yes. Closed-ended questions can also be used to wrap up discussions, obtain more information from students, or help groups reach consensus. Examples of these kinds of questions include: Have we covered everything?, Does everyone agree this is the best choice?, or Is the class ready to move on?

  • Pros: These questions may require little time to develop and grade, and there is a single correct answer.
  • Cons: Not ideal if the goal is to stimulate in-depth thinking by students. These questions may not give students the opportunity to explain that they do not understand the content or to share an opinion about a topic, may discourage students from thinking on their own or expressing their real feelings, and can be answered without knowing anything about the topic.

Open-ended questions do not have a single correct answer and leave the formulation of the answer up to the individual. When open-ended questions are posed, students have the opportunity to be creative, structure their response in a manner that best suits them, and develop critical thinking skills. These questions usually begin with “What”, “How”, or “Why.” Some examples of open-ended questions include: What kind of information were you looking for?, How does this information relate to our goal of…?, and What suggestions do you have for…?

  • Pros: These questions encourage students to share their ideas, concerns, and feelings; facilitate the development of enhanced levels of cooperation and understanding among students; and help faculty support diverse ways of student learning.
  • Cons: It is sometimes difficult for faculty to formulate an open-ended question in such a way that students understand the type of response that is expected of them.

Instructors Utilize Bloom’s Taxonomy to Guide the Development of Questions

Bloom’s Taxonomy is a framework that describes three domains or types of learning: cognitive, affective, and psychomotor (Bloom & Krathwohl, 1956). The cognitive domain, pertinent for this discussion, focuses on the development of a hierarchy of thinking skills important in the learning process. The levels of learning found in the cognitive domain can be used by instructors to develop questions that enhance the development of critical thinking skills in students. The grid below provides a glimpse of the types of questions that can be posed to students during the learning process.

Instructors Integrate A Four-Question Technique into Their Discussions

I recently read an interesting Faculty Focus blog post authored by Dr. Maryellen Weimer which described the use of a four-question set that could be used to engage students with course content and promote deeper ways of learning. The strategy was developed and used by Dietz-Uhler and Lanter (2009) in an introductory psychology course. Students were asked to analyze, reflect, apply, and question the content they read. The following question prompts were used:

  • [Analyze]: “Identify one important concept, research finding, theory, or idea…they learned while completing this activity.”
  • [Reflect]: “Why do you believe that this concept, research finding, theory or idea…is important?”
  • [Apply]: “Apply what you have learned from this activity to some aspect of your life.”
  • [Question]: “What question(s) has the activity raised for you? What are you still wondering about?” (Dietz-Uhler & Lanter, 2009, p. 39)

These researchers found that students performed significantly better on a quiz when they answered the four-question set before taking the quiz rather than after. A benefit of using this strategy is that it can be applied to learning environments that tend to be lecture-based as well as those that promote active learning.

Some Tips on Developing A Questioning Strategy

  • Ask a mix of questions (from all levels of Bloom’s Taxonomy)
  • Create a classroom climate that invites student questions
  • Plan your questions in advance (noting when you will pause to ask and answer questions)
  • Create questions that help students link important concepts
  • Frame questions in language students understand
  • During class discussions, ask one question at a time
  • Rephrase the question if it seems unclear to students


References

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (Complete ed.). New York: Longman.

Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans, Green.

Dietz-Uhler, B., & Lanter, J. R. (2009). Using the four-questions technique to enhance learning. Teaching of Psychology, 36(1), 38-41.

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212-218.

 

 

Evaluating Your Students: Blackboard Learn and Its Underused Unique Feature, Part 2


Posted in Blackboard Learn, Student Behavior.

We are at the end of the spring semester, which means summer courses will begin soon and planning for the fall semester is already underway. Now is a great time to consider how to use student activity reports to address participation issues, or to find out which content and tools in your course(s) are getting attention and which ones might not be receiving as much.

As promised in the previous article, this entry will cover the features of other reports you can run in Learn. If you have not yet read the previous article, please click this link to review it (that way, what you see below makes sense): http://blogs.itap.purdue.edu/learning/2013/03/12/evaluating-your-students-blackboard-learn-and-its-underused-unique-feature-part-1/ .

Last time, we discussed the aspects of the All User Activity inside Content Areas report, and how it displays a summary of all user activity inside Content Areas for the course.  The reports that will be covered in this entry are: Course Activity Overview, Overall Summary of User Activity, and Student Overview for Single Course. The information displayed comes from a large enrollment course, and all student names/usernames are blacked out for confidentiality reasons.

First up is the Course Activity Overview report, which displays the totals and averages of students’ time (in hours) spent in the course on Blackboard Learn.

The first part of the report shows an overview of the total hours of student activity for each day of the week in the course. In addition, the number of students in the course is displayed at the top, along with the date range of the report’s data. Below that, we can see the total time (in hours) spent in the course and the average time spent per user.

Course Activity Overview part 1

The second part of the report, which can extend across several pages depending on the number of students, shows the total number of hours each student spent in the course (the blue bars) and the average number of hours the class spent (the orange line) over the report’s date range. We can see that many students spent either far more or far fewer hours in the course than the class average.

Note: Again, this data comes from an actual course, and the names of students included have been blacked out for confidentiality purposes.

Course Activity Overview part 2
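If you prefer to dig into the numbers directly, the same totals can be reproduced from an exported activity log. Below is a minimal sketch in Python using pandas; the file name and column names (student, date, hours) are illustrative assumptions for a hypothetical export, not Blackboard Learn's actual format.

    # Minimal sketch: summarize per-student activity hours from a hypothetical
    # course activity export. The file name and columns ("student", "date", "hours")
    # are assumptions for illustration, not Blackboard Learn's actual export format.
    import pandas as pd

    activity = pd.read_csv("course_activity_export.csv")  # assumed columns: student, date, hours

    # Total hours each student spent in the course over the report's date range
    hours_per_student = activity.groupby("student")["hours"].sum()

    # Class-wide total and the average time per user, mirroring part 1 of the report
    total_hours = hours_per_student.sum()
    average_per_user = hours_per_student.mean()
    print(f"Total time in course: {total_hours:.1f} hours")
    print(f"Average time per user: {average_per_user:.1f} hours")

    # Flag students well below the class average, the same pattern you would look
    # for by scanning the bar chart in part 2 of the report
    below_average = hours_per_student[hours_per_student < 0.5 * average_per_user]
    print(below_average.sort_values())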

Now, let’s take a look at the Overall Summary of User Activity report. This report is similar to the All User Activity inside Content Areas report in the last entry, but the biggest difference is that this one keeps track of course tool/mashup usage.

The first part of the report shows the total number of hits each tool received during the date range set for the report. The report covers the tools and mashups that currently exist in Learn.

Overall Summary of User Activity part 1

The second part of the report shows the user activity totals per user per tool/mashup.

Overall Summary of User Activity part 2

The third part of the report shows overall user activity for each day covered by the report. The graph plots each day of the date range on the x-axis and the total hours of activity across all users on the y-axis.

Overall Summary of User Activity part 3

The fourth part of the report shows which hours of the day users accessed the course the most. The table on the left lists each hour of the day, the total hits for that hour, and the corresponding percentage of all hits. The total hits for each hour of the day are also displayed in the graph on the right.

Overall Summary of User Activity part 4

The fifth and last part of the report is an overview of which days of the week users accessed the course the most. The table on the left lists each day of the week with its total hits, and the graph on the right displays those totals.

Overall Summary of User Activity part 5
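For those comfortable with a little scripting, the hour-of-day and day-of-week breakdowns in parts 4 and 5 can be rebuilt from a raw hit log. The sketch below assumes a hypothetical export with one row per hit and a timestamp column; the file and column names are illustrative only.

    # Minimal sketch: count hits by hour of day and by day of week from a
    # hypothetical hit-level export with a "timestamp" column (illustrative only).
    import pandas as pd

    hits = pd.read_csv("course_hits_export.csv", parse_dates=["timestamp"])

    # Hits per hour of the day, plus each hour's share of all hits (mirrors part 4)
    hits_by_hour = hits["timestamp"].dt.hour.value_counts().sort_index()
    hour_percentages = (hits_by_hour / hits_by_hour.sum() * 100).round(1)
    print(pd.DataFrame({"hits": hits_by_hour, "percent": hour_percentages}))

    # Hits per day of the week (mirrors part 5)
    hits_by_weekday = hits["timestamp"].dt.day_name().value_counts()
    print(hits_by_weekday)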

Now, let’s take a look at the last report type that will be covered, the Student Overview for Single Course report. This report is similar to the Course Activity Overview report, but focuses on one student’s level of activity and the total hours and number of times each item in the course was accessed.

The first part is similar to the first page of the Course Activity Overview report, and focuses on the total hours of activity the student logged on each day of the week for a given date range.

Student Overview for Single Course part 1

The second part of the report covers the hours the student has spent looking at items in the course, including the number of times each item was accessed and the date/time the student first accessed it.

Student Overview for Single Course part 2
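The same kind of per-item summary can be assembled for a single student from an item-level export. The sketch below is a hypothetical example; the file name, column names, and the username jdoe are all assumptions for illustration.

    # Minimal sketch: one student's per-item activity (total hours, access count,
    # first access time) from a hypothetical item-level export; all names are
    # illustrative, not Blackboard Learn's actual export format.
    import pandas as pd

    items = pd.read_csv("item_activity_export.csv", parse_dates=["access_time"])
    # assumed columns: student, item, hours, access_time

    one_student = items[items["student"] == "jdoe"]  # hypothetical username
    summary = one_student.groupby("item").agg(
        total_hours=("hours", "sum"),
        times_accessed=("access_time", "count"),
        first_access=("access_time", "min"),
    )
    print(summary.sort_values("total_hours", ascending=False))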

As you can see, there are several more report types to choose from, and they can be useful for seeing trends in overall student activity or pinpointing students with certain activity levels. Again, I hope this article has been useful to you and inspires you to use these reports in your course(s). I will have one more follow-up blog post on this topic that will cover the remaining reports that Blackboard provides. If you have any questions, please contact ITaP’s Consulting & Training group at tlt-consulting@purdue.edu.

Good Principles for a Successful Semester


Posted in Classroom, General Education, Student Behavior.

As we embark on another academic year, I think it’s important to consider how we can create the best environment for learning. In 1987, Chickering and Gamson put forth a brief article titled Seven Principles for Good Practice in Undergraduate Education. This document has become a touchstone for educators and instructional designers over the past two and a half decades, and still remains very relevant today.

Chickering and Gamson (1987) pulled the seven principles together in an effort to better understand over 50 years of research on not only how students learn, but also how instructors teach. What resulted was a set of guidelines that, if implemented in part or in whole, have the potential to greatly impact student success in the classroom. As you reflect on your teaching for the semester, consider these principles – and how you might incorporate them – as you prepare future class sessions or courses.

1. Good practice encourages student-instructor contact.

Student interaction with faculty members has been shown to increase student performance and overall retention at the university. This can be done through emails to students, in- or out-of-class activities, or simply learning your students’ names.

2. Good practice encourages cooperation among students.

When students interact with others, particularly those with different backgrounds, ethnicities, experiences, or ideologies, they have an opportunity to learn more about the world around them and develop critical thinking and analysis skills. Group projects, study groups, or case/team/problem based learning are all great ways to have students cooperatively learn a concept.

3. Good practice encourages active learning.

The more active a student is in class, the more likely they are to learn the materials being presented. Encourage your students to ask questions and to answer other students’ questions. Further, consider employing one or more tools supported by ITaP designed to increase active and involved learning in your classroom.

4. Good practice involves prompt feedback.

The more students know how they’re doing and how they can improve their performance, the more likely it is that they’ll be successful in the course. Consider employing Course Signals or the Early Warning System (in Learn) as a means of providing feedback with tips for success on a regular basis. Early intervention is key – the earlier and more often you provide feedback, the better for the students.

5. Good practice encourages time on task.

The more good time a student spends on a task, the better they’ll understand the concept and be able to perform the same task the next time. “Good” time is purposeful time – not time spent multi-tasking or working on multiple things at once. It is time that is devoted to one thing with a strong concerted effort. Encouraging students to enhance their learning and studying skills is a great way to help them increase their overall effectiveness.

6. Good practice communicates high expectations.

Most students will work to reach the bar you set for them. If a high bar is set, they’ll work to reach it – provided you also offer support along the way. Telling students where the bar is set and how they can reach it with your support or the assistance of other offices on campus (resource rooms, help labs, etc.) will go a long way in helping your students succeed.

7. Good practice respects diverse talents and ways of learning.

How you learn is not necessarily the same way your students learn, and that’s OK. Understanding where these differences lie, and using varying methods of assessment (oral projects, written papers, team work, multi-media, etc.), will allow students with different styles and skill sets to flourish. Purdue’s Center for Instructional Excellence has some information on learning styles, and can work with you to better understand how these can be incorporated into your classroom.

 

Reference: Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3-7.

Technology: What students know vs. what we want them to know


Posted in Classroom, Course Redesign, General Education, Musings on Technology, Student Behavior, Student Technology Kit.

One of the common technology disconnects we see is faculty expectations compared to student abilities. One possible reason for this is the types of technologies students most commonly use compared with the types of technologies we want them to use.

Student Tech Use

First off, some students have significantly more computer experience than others. Some will have had home computers before they started to talk and others will have had limited access in schools. This gives a spectrum from no experience to constant experience…

 

Range from No Technology Use to Constant Use


In addition, however, we need to think about what students are doing when they use technology. The types of technologies students are frequently using are social networking, gaming, and ‘productivity’ tools (such as Word and email). And each student will have a different level of experience with each. So while one student may have focused on productivity and gaming, another might have focused on social networking.

 

Differing student expertise


So, graphing a class of students, you might end up with something like this:

Students will have various areas of expertise as well as different levels of use

Instructor Expectations

The types of technologies we want them to use could be grouped into productivity tools (perhaps expanded to include presentation and spreadsheet tools), subject-specific technologies (such as electronic medication administration), and instructional technologies (such as research databases, DoubleTake, and Blackboard).

Students’ experiences in subject-area and instructional technologies are often pretty limited. So a typical student might look like this…

Student experience with subject and instructional technologies


and a class might look more like this…

Class experience with subject and instructional technologies


So what?

As instructors increase the amounts and types of technologies used for teaching, the students may need additional support. Programs we think of as intuitive may only be so because of our experience and background. For example, I don’t care what my kids say, I struggle with Facebook constantly. They don’t.

It might help us anticipate students’ technology learning needs if we think through their probable experiences and compare these with the technologies we are asking them to use.

This, of course, puts another burden on the instructor – as the main person associated with the technology, the instructor is probably students’ first contact.

If you are planning on using instructional technologies in class or in assignments, you might want to check your students’ readiness first. Attached is a simple and quick survey that might help you with this.

By thinking through what types of support students may need, when they might need it, and who is the most appropriate contact for the students, you can help them get support more quickly.

  • Many technologies have quick-start guides that you can provide students before they need them.
  • We also have student trainers who can provide basic instructions on many technologies your students might need to complete your assignments (http://www.itap.purdue.edu/learning/trainingnew/st/ or email itaptrainers@purdue.edu).
  • And if you are not sure who the contact should be, you can always start with the ITC Help Desk (x44000).

If you are interested in learning more specifically about instructional technologies, our team in IDC is ready to help. You can contact us by emailing itap@purdue.edu.

Pat Reid, Ed.D., Manager, Teaching and Learning Initiatives