I took a deep breath when I heard the announcement: Our school wanted to emphasize open-ended questions in all classes. Teachers would spend less time on closed-ended questions, which can be answered with a single term or a "true/false," and would instead write more questions asking students to describe a process or apply knowledge in a longer, often subjective written response.
That seemed like a scary prospect for my English language learners at first, but then I realized what a great opportunity it was for my students to practice speaking and writing, if they could overcome their fear of making mistakes. To make that happen, I had to demystify the two types of questions.
First, I made it clear that my students would not be exempted from the open-ended question assignments. Many teachers hesitate to give these to language learners; that hesitancy may be well intentioned, but avoiding such questions can leave students unaccustomed to the kinds of abstract-reasoning questions their native-speaking peers routinely receive (Himmele & Himmele, 2009). As long as the subject teachers didn’t expect impeccable grammar and flawless spelling, I was confident even my lower-intermediate students could get their points across.
The next step was to make sure students knew about the change. I made the difference between closed-ended and open-ended questions a theme for my next few assignments. Their reading assignments included clearly marked closed-ended questions, where they had to find a specific fact or detail, alongside open-ended questions at the beginning to activate prior knowledge and at the end to assess understanding. In effect, I tried to show students that the goal of the open-ended questions was to explore ideas and make connections, without actually telling them that (Black, Harrison, Lee, Marshall, & Wiliam, 2003).
Once students knew they couldn’t always find the answer to open-ended questions directly in the texts, I focused on how to answer them. This came down to reading the question closely to see what was being asked and how much support the teacher wanted. I started with short rubrics for opinion-based questions that put emphasis on the need to support an idea—I didn’t care what the idea or opinion was, as long as it was relevant to the question and supported. After that, we moved into more analytical questions, where students had to find facts in a passage to support what they thought and why.
When possible, I placed the rubrics right next to the question (usually just below it) so my students could see them easily and know what I wanted. The rubrics specified how many points of support I expected, which grammatical forms I wanted used correctly, and the expected length of that particular answer. While grading, I went through each rubric point and noted what the student had done right or wrong.
After a few weeks, my students’ ability to think beyond the blank space on the answer sheet improved noticeably. They started giving me more thorough answers that met my expectations and took some chances expressing different ideas with supported reasons. I like to think that will do more for their language development in the long term than circling letters on multiple-choice questions ever could.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. New York, NY: Open University Press.
Himmele, P., & Himmele, W. (2009). The language-rich classroom: A research-based framework for teaching English language learners (1st ed.). Florence, KY: Heinle ELT.