Designing open-book exams

Last updated: May 6, 2022, 4:26 p.m.


Guidelines for creating open-book exams

Open-book and take-home exams are assessments, usually completed in an unsupervised environment, in which students may use textbooks, class notes, memory aids and other reference material. These exams can assess a range of competencies, but they are especially useful for evaluating a student’s higher-order thinking rather than their ability to recall factual information.

Many essay and short-answer questions already target higher-order thinking skills and can be used in an open-book or take-home exam with minor revisions. Questions that test recall are typically better suited to multiple-choice exams and will need to be revised to become effective open-book exam questions.

Target higher-order thinking skills

Higher-order thinking refers to the upper levels of knowledge in Bloom’s Taxonomy. Depending on your learning outcomes and where your course sits within your program, your open-book exam questions will target different levels of discipline-specific knowledge and concepts. Below is a summary of lower- and higher-order thinking skills and their associated question words.

Important note: Students should be well accustomed to higher-order question types prior to taking the exam.

Tips for writing open-book exam questions

  1. Clearly define the learning outcome you want to assess. In other words, your questions should measure the skills or knowledge that students acquired as part of your course.
  2. Keep the number of questions realistic, in line with your exam’s time limit and the grading required (with open-book exams, it can be tempting to ask too many questions).
  3. Situate your question within a particular context, problem or situation. Ensure the context, problem or situation is clearly outlined for students.
  4. Formulate a question that requires students to demonstrate their thinking about the discipline knowledge or course concepts (e.g., by applying a theory to the context, problem or situation). This ensures they use higher-order thinking skills rather than simply locating and summarizing the relevant information to answer the question.
  5. Construct the question so that the task is clearly defined for students. Avoid passive sentence structure and ambiguous phrasing.
  6. Use directive verbs to clarify the type of thinking and content you require in the response. Refer to the directive verbs associated with higher-order thinking above.
  7. Validate your questions by having a TA or colleague review them for clarity and alignment with the intended assessment outcomes.

Note: Avoid vague directive verbs like “discuss,” which can elicit responses that are too broad.

Preventing plagiarism

In a systematic review published in November 2019, Bengtsson compiled remedies from multiple research studies for reducing plagiarism on non-proctored open-book and take-home exams. A summary of these remedies follows.

References


  • Bengtsson, L. Take-Home Exams in Higher Education: A Systematic Review. Educ. Sci. 2019, 9, 267.
  • Bredon, G. Take-home tests in economics. Econ. Anal. Policy 2003, 33, 52–60.
  • Freedman, A. The take-home examination. Peabody J. Educ. 1968, 45, 343–347.
  • Frein, S. Comparing In-class and Out-of-Class Computer-based Tests to Traditional Paper-and-Pencil Tests in Introductory Psychology Courses. Teach. Psychol. 2011, 38, 282–287.
  • Fernald, P.; Webster, S. The merits of the take-home, closed-book exam. J. Hum. Educ. Dev. 1991, 29, 130–142.
  • Lancaster, T.; Clarke, R. Rethinking assessment by examination in the age of contract cheating. In Proceedings of Plagiarism Across Europe and Beyond, Brno, Czech Republic, 24–26 May 2017; ENAI: Brno, Czech Republic; pp. 215–228.
  • López, D.; Cruz, J.-L.; Sánchez, F.; Fernández, A. A take-home exam to assess professional skills. In Proceedings of the 41st ASEE/IEEE Frontiers in Education Conference, Rapid City, SD, USA, 12–15 October 2011.
  • Svoboda, W. A case for out-of-class exams. Clear. House J. Educ. Strateg. Issues Ideas 1971, 46, 231–233.
  • Tao, J.; Li, Z. A Case Study on Computerized Take-Home Testing: Benefits and Pitfalls. Int. J. Tech. Teach. Learn. 2012, 8, 33–43.
  • Williams, B.J.; Wong, A. The efficacy of final examination: A comparative study of closed-book, invigilated exams and open-book, open-web exams. Br. J. Educ. Technol. 2009, 40, 227–236.


© Concordia University