
In this section you will find information to help you design comparison-based activities that enable students to generate productive feedback. The information is divided into a number of sections.

SECTION 1: Provides the design template, which essentially comprises a "Do-Compare-Make Outputs Explicit" structure. It also includes a set of design principles and a proposed list of comparison artefacts, and identifies scenarios where comparison-based processes are implicit in teaching and learning but would benefit from being made explicit, along with the learning outputs from them.



Key Principles of Comparison-Based Feedback Design

  1. Different kinds of comparison information lead to different kinds of internal feedback - hence the need to go beyond comments as the only comparator we plan for as teachers. There are two broad categories of comparison, which I refer to as similar-entity comparisons and dissimilar-entity comparisons. Each category has a range of possible variations, and both types should be used in designing for feedback. The first is usually used to help students improve the quality of their work. However, the combination of both turbo-charges feedback processes: it helps students anchor their understanding more deeply in the topic domain, develops their ability to see their work from different perspectives and to see different ways of doing things, and helps them acquire the discourse of the discipline. I believe that by combining similar-entity and dissimilar-entity comparisons, we can make feedback serve many different learning purposes. My current research, for example, involves directly using feedback comparisons to improve students' critical thinking - their ability to analyse, synthesise, evaluate, create, apply and so on.
  2. Similar-entity comparison refers to students comparing their work, or the strategies they used to produce it, against similar works or similar strategies. Similar-entity comparisons might involve items that are similar in content and form; for example, a student produces an essay or report and compares it with other essays or reports on the same topic. Alternatively, the items might be similar in form but differ in content: for example, a student writes an essay or report and compares it with essays or reports in a different topic domain. The latter focuses students on the writing of the report, as the content is not relevant and does not distract. Another possibility sits somewhere between the two. For example, a student might compare an essay they have written on Piaget with a few well-written essays on Vygotsky. In this case the topic of the essay is different but the content is related. While students will compare how their essay is written with the Vygotsky essays, and generate feedback about the structure and argumentation in their own essay, the Vygotsky essays might actually extend their understanding of Piaget's writings.
  3. Dissimilar-entity comparison refers to situations where students produce one kind of item and compare it with a different kind of item. For example, the student produces a report and compares it against the content of a video discussion; produces a written explanation of a model and compares it against diagrams of models in different textbooks; produces a solution to a complex problem and compares it against a flow chart of the problem-solving process; or produces a research report and compares it against the assessment rubric. This type of comparison is quite different from similar-entity comparison, yet it is equally natural in everyday life and valuable when the outputs of the comparison are made explicit by students. It also follows the same principle: students might generate different kinds of feedback depending on the comparator.
  4. A mix of similar-entity and/or dissimilar-entity comparisons can increase the power of inner feedback. For example, it is better if teachers provide comments and examples rather than comments alone (e.g. "the argument here needs to be stronger - try to make it more like the one you produced earlier in your essay" or "the argument here needs to be stronger - here are some examples of arguments - identify the pattern and try to reframe what you are saying in one of these ways"). Another reason for mixing similar-entity and dissimilar-entity comparisons, for example similar works and comments, is that this helps students acquire the language to explain similarities and differences across different works.
  5. Multiple sequential comparisons (one after the other) amplify and elaborate the feedback students generate. In Nicol and McCallum (2021), students made three comparisons of their essay against three other essays. Sixty-five percent wrote self-feedback that matched the feedback the teacher wrote on the essay. After a fourth comparison, 90% matched the teacher's comments. Moreover, all students generated feedback that surpassed the feedback the teacher provided. For example, they identified areas for improvement and alternative ways of approaching their work that the teacher did not identify, and they provided feedback of a type that the teacher would have found difficult to provide.
  6. Dialogue is one of the most powerful opportunities for feedback comparisons, as students compare their thinking with that of others and generate feedback out of that comparison. Hence, dialogue comparisons can be wrapped around all other comparison activities so as to enhance students' self-generation of feedback.
  7. Perhaps the most important principle is that, to leverage the power of internal feedback, the comparison process must be deliberate and the results of the learning from that comparison must be made explicit in writing, discussion and action. Writing is especially powerful, as students generate self-feedback and have to think about the feedback they are generating. This activates a metacognitive process - thinking about your own thinking - which supports the development of self-regulation and the transfer of learning from (internal) feedback to new situations.
  8. Note: the terminology of similar-entity and dissimilar-entity comparisons used here differs from the terminology of analogical and analytical comparisons used in Nicol (2021). I have now revised my thinking in this area and realise that this aspect is underconceptualised, so I intend to provide an update on this very soon (26 November 2021).


SECTION 2: Provides examples of how these ideas might be implemented. These range from simple ideas that can be implemented in a single classroom session (e.g. a tutorial or workshop) through to examples of comparison-based activities spanning a whole course.

Approaches to Implementation

One approach is to turn some of the comparisons that students are already making informally - for example, against rubrics, learning outcomes, exemplars, the work of peers, textbook explanations, or information in journal articles - into formal and explicit comparisons.

Another approach is for teachers to design or select new types of information for comparison. Technology affords great potential here and the possibilities are largely untapped. For example, the teacher could ask students to solve a problem and write down their step-by-step thinking. The teacher could then create a video in which she solves the problem on a whiteboard and talks through her step-by-step thinking; she might ask a colleague to do the same. After students have solved the problem, the teacher could give them her video and ask them to compare their problem-solving strategy with hers and write down where the strategies are similar and different. Students could then discuss their findings with peers and write down an agreed set of steps in their own words, then share that with other peer pairs.

An area in need of further investigation is the exact form of the instructions for comparison. More to be added here.

Practice Examples

Lecture-based comparison: In online contexts it has become common to present a brief lecture input (e.g. 10 minutes), then have students do an activity, and then follow this with another brief input. This input-activity-input sequence is easy to turn into an explicit feedback opportunity by asking students to compare the work they have produced with the second lecture input. That input will very likely build on the activity anyway. However, if the lecturer deliberately plans for this, and deliberately asks students to make the comparison and write down what they learn from it, this turns a natural process into a powerful pedagogical one. Of course, the lecturer could then ask students to report back in class what they learned from the comparison. I have produced a whole article on this methodology which I will make available soon.

Clickers and peer instruction: Mazur (1997; 2000) has pioneered the use of clickers in the classroom. The normal sequence is that students prepare some work before class, listen to a lecture presentation and answer some multiple-choice questions; if the majority of the class do not get the right answer, they are asked to convince their neighbour of their answer (i.e. discuss in groups) and are then retested. This method could be considerably improved in two ways. First, instead of just selecting an answer to the multiple-choice question, students could be asked to also give a reason for their selection. This draws on the comparison principle that the deeper the processing before the comparison, the more students learn from it. Second, after the discussion the teacher could give students a further lecture input - for example, an alternative representation of the answer (conceptual, mathematical, diagrammatic) - and ask students to compare their answers with it and identify any new insights. Note that peer instruction embodies many comparison opportunities, both with the material and with the thinking of others.

One-minute paper: The OMP has a long history in education. The basic method is to take five minutes at the end of a lecture class in which students respond to a question posed by the instructor: e.g. What was the most important concept we discussed today and why? Can you draw a flow chart of this process? What was the muddiest point for you? Students write a response and hand it in; at the next session the lecturer gives them some good examples from the class responses and explains where others have misunderstood. Ideally, from a comparison perspective, students should make a direct comparison of their prior work with the better responses and identify how they differ, for maximum learning, rather than passively listen to the lecturer. An alternative process that does not require waiting until the next session would be to have students make a response at the end of the lecture class, then for the lecturer to give them two or more good-quality responses (e.g. from a textbook or created by the lecturer) and ask students in pairs to compare their work with each other's and with the good-quality examples, to identify what they are learning from that comparison, and then have some report back to the class verbally.

Group presentations:

Creating powerful questions:

Addressing a client brief:

Writing a good literature review:


More to come in this section....

Other examples: anyone reading this who has other examples, please send them to me. Try to keep them brief. Almost anything that is currently being done can be turned into a deliberate feedback opportunity.

SECTION 3: I am reserving this section for ideas for implementation submitted by practitioners, who will be fully recognised on this site. As the number of examples grows, I will start categorising them into disciplines.

Where are these ideas being implemented?

University of Glasgow [accountancy and finance, economics, management, psychology, chemistry, higher education teaching]

University of Padova, Italy [education]

University of Edinburgh [medicine]

Trinity College Dublin [business, computer science, nursing and midwifery, pharmacy, health sciences, higher education]

University of Utrecht [medicine]

City University of New York [education]

Origins of Comparison Idea

My thinking about inner feedback and comparison can be traced through earlier publications, especially an article on how formative assessment could be redesigned to support the self-regulation of learning (Nicol and Macfarlane-Dick, 2006). However, it was while interviewing students in 2014, when researching peer review, that the role of comparison fully crystallised (Nicol, Thomson and Breslin, 2014). In focus groups I asked students "How did you learn during peer review?" and "What mental processes did you engage in?" Most students said that as they were reviewing the work of peers, they compared that work with their own and generated new ideas about, new perspectives on, or new approaches to their own work. Many used the word comparison or a phrase with a similar meaning. It was not until after I had written my own framing of comparison as the pivot for all feedback processes that I found there was a literature on comparison in the cognitive sciences (Gentner, 2014) which, although it differed from my framing, provided some insights for its advancement.

Given this origin, my early work on comparison was linked to peer review, a multifaceted methodology with considerable power (see Nicol, 2014; Nicol, 2019). However, peer review is not the only locus for the making of comparisons. Comparison is a pervasive and natural process that is 'built in' to our brains, so my current research has started to fan out from peer review and investigate the comparisons students make of their work against information in other resources - for example, in documents, videos, lecture presentations, exemplars and rubrics. The key is to integrate many different types of comparison and stage them across the timeline of a course.

The feedback problem

This website has relevance across all education. It addresses the problem of improving feedback on students' learning. It sees this problem as existing at two levels: a practical level and a deeper conceptual level. Practically, the problem is that:

  • students are often dissatisfied with the quality, quantity and frequency of feedback available to them

  • teachers find providing detailed and tailored feedback burdensome, especially for large student numbers

  • feedback processes are not always effective, especially in helping students develop higher-order thinking and life skills

  • students, especially weaker students, become more dependent on teachers the more feedback they receive.

Conceptually, the deeper problem is how we frame feedback. In everyday language, feedback constitutes comments teachers or others provide on students' work. This framing suggests that comments are the main or primary source of feedback information. It frames teachers as responsible for feedback provision and this, in turn, disempowers students as the architects of their own feedback processes. As a result, any increase in feedback quality, quantity or frequency increases teacher workload, especially if teachers are conscientious and student numbers large. For decades, researchers have tried to solve these problems using technology-supported feedback, audio feedback, whole-class feedback, feedback requesting, peer feedback and so on. However, none of these approaches has been successful because of the conception of feedback underpinning them. What follows is based on the re-conceptualisation of feedback proposed in "The power of internal feedback: Exploiting natural comparison processes" (Nicol, 2020).

Students are generating feedback all the time

This solution starts from the premise that students are always generating feedback. They do this by comparing current knowledge against external information. Musicians learn to be better musicians by comparing their compositions and performances against those of other musicians. Scientists learn by comparing their models, methods and findings against those of others in peer-reviewed publications. Students use information from multiple sources for feedback generation (e.g. from textbooks, online resources, videos). Teacher comments only represent a drop in the ocean of information students use for feedback generation.

The key to unlocking the power of internal feedback is for teachers to have students turn some of the natural comparisons they are already making anyway into formal and explicit comparisons, and also to design new resources for comparison. Having students explicate the feedback they generate in writing, discussion and action is crucial to this approach and to the development of students' self-feedback capacity. This not only builds students' metacognition and self-regulatory capacity; teachers also learn what comments students really need and can adapt their input accordingly.

This approach addresses the problems raised when feedback is conceptualised as comments. Staging different comparisons across a task or course does not increase teachers' long-term workload, as comparison resources are reusable. Also, teachers can select comparators that help students generate feedback in known areas of conceptual difficulty or that promote their higher-order reasoning. The potential is vast, and this approach might equally be applied to graduate attribute development, which also rests on processes of comparison.

How does this differ from other feedback approaches?

There is almost no literature positing feedback as underpinned by comparison processes (Nicol, 2020). This is surprising given the extensive research showing that comparison underpins all major cognitive functions (e.g. memory, categorisation, evaluation), including higher-order reasoning. There are, as far as I know, no direct studies of feedback based on this comparison conception, although some researchers have carried out studies that, when reinterpreted, offer powerful support for it and provide some clear ideas for its implementation (e.g. Lipnevich et al., 2014; Anders et al., 2007).

Identifying comparison as its core mechanism transforms the way we think about feedback. It suggests we build on the natural feedback processes that all learners (and indeed all humans) are engaging in, and that we enhance them using a simple intervention methodology that applies across all disciplines.

Another merit of this conceptual re-framing is that it re-connects informal and formal feedback processes in mutually reinforcing and authentic ways. Until now these have been separated in the literature and in practice. In educational settings, it is easy to identify situations where comparisons might happen implicitly in almost any teaching situation (e.g. when student groups present their findings after doing the same project). All teachers need to do to harness their pedagogical power is to have students consciously make these comparisons and articulate and apply the learning that results. Students should also be invited to identify comparisons themselves. This builds students' metacognitive knowledge and their capacity for self-regulation of learning.


Research Papers

Dawson, P., Carless, D. & Lee, P.P.W. (2020). Authentic feedback: Supporting learners to engage in disciplinary feedback practices. Assessment & Evaluation in Higher Education.

Gravett, K. (2020). Feedback literacies as sociomaterial practice. Critical Studies in Education.

Joughin, G., Boud, D., Dawson, P., & Tai, J. (2020). What can higher education learn from feedback seeking behaviour in organisations? Implications for feedback literacy. Assessment & Evaluation in Higher Education.

Malecka, B., Boud, D. & Carless, D. (2020). Eliciting, processing and enacting feedback: Mechanisms for embedding feedback literacy within the curriculum. Teaching in Higher Education.

Winstone, N. & Boud, D. (2020). The need to disentangle assessment and feedback in higher education. Studies in Higher Education. DOI: 10.1080/03075079.2020.1779687

Winstone, N., Balloo, K. & Carless, D. (2020). Discipline-specific feedback literacies: A framework for curriculum design. Higher Education.


The Power of Inner Feedback

Nicol, D. (2021). "Guiding learning by activating students' inner feedback." In THE [Times Higher Education] Campus, available HERE
Nicol, D. & McCallum, S. (2021). "Making internal feedback explicit: Exploiting the multiple comparisons that occur during peer review." Assessment & Evaluation in Higher Education.
Nicol, D. & Selvaretnam, G. (2021). "Making internal feedback explicit: Harnessing the comparisons students make during two-stage exams." Assessment & Evaluation in Higher Education.
Nicol, D. (2020). "The power of internal feedback: Exploiting natural comparison processes." Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2020.1823314
Nicol, D. (2019). "Reconceptualising feedback as an internal not an external process." Italian Journal of Educational Research, Special Issue: 71-83.


Nicol, D., Serbati, A. & Tracchi, M. (2019). "Competence development and portfolios: Promoting reflection through peer review." All Ireland Journal of Higher Education, 11 (2): 1-13.

Nicol, D., Thomson, A. & Breslin, C. (2014). "Rethinking feedback practices in higher education: A peer review perspective." Assessment & Evaluation in Higher Education, 39 (1): 102-122. DOI: 10.1080/02602938.2013.795518
Nicol, D. (2014). "Guiding principles of peer review: Unlocking learners' evaluative skills." In Advances and Innovations in University Assessment and Feedback, edited by C. Kreber, C. Anderson, N. Entwistle and J. McArthur. Edinburgh: Edinburgh University Press.
Nicol, D. (2013). "Resituating feedback from the reactive to the proactive." In Feedback in Higher and Professional Education: Understanding It and Doing It Well, edited by D. Boud and E. Molloy, 34-49. Oxon: Routledge.

Nicol, D. (2010). "From monologue to dialogue: Improving written feedback processes in mass higher education." Assessment & Evaluation in Higher Education, 35 (5): 501-517. DOI: 10.1080/02602931003786559

Nicol, D. & Macfarlane-Dick, D. (2006). "Formative assessment and self-regulated learning: A model and seven principles of good feedback practice." Studies in Higher Education, 31 (2): 199-218. DOI: 10.1080/03075070600572090