From Evaluation to Sensemaking: Emergent Development of a Masters Distance Learning Research Methods Module

Palitha Edirisingha, Phil Wood, University of Leicester, United Kingdom


Evaluation of student learning is ever more important in higher education today, both for managerial reasons and because of a genuine interest among teachers in improving students’ learning. Such module evaluation is generally carried out using end-of-module/semester questionnaires. While these evaluation tools may capture some reflections of student learning, they are often poorly focused, and rely wholly on summative perspectives captured at a point remote from the learning process itself. This paper reports on an initial investigation centring on developing a formative framework for evaluating distance learning modules. It is distinguished from typical summative questionnaire evaluations by the collection of live feedback from students as they undertake a module, allowing for insight and feedforward to develop materials while the module is still in progress. This is achieved by using a modified version of an approach called “Lesson Study”, a collaborative planning and evaluation tool which originated in Japan (Lewis et al., 2009). The evaluation framework that we develop here is centred on the notion of sensemaking (Weick, 1995).

Keywords: Distance Learning, Emergent Evaluation, Sensemaking, Lesson Study, Module Evaluation, Research Methods.


The evaluation of modules in higher education is an important driver for change and curriculum development. Whilst a number of different approaches have been developed to evaluate modules, the standard tool used in evaluating student experience is the summative questionnaire which students are asked to complete at the conclusion of a module. These questionnaires often cover a spectrum of issues, but they tend to focus on issues around learning, rather than on the experience and nature of the learning itself. Questions are often general in nature, e.g. “what was the quality of learning resources?”, which can lead to over-simplified responses that lose the nuances of personal experience.

There are a number of issues with the accuracy and utility of summative questionnaires. Firstly, many students, especially on distance learning courses, do not bother to fill in the questionnaires, either because they are busy professionals, or because they are happy with their experiences and so do not value the opportunity to share their views. Secondly, questionnaires are inherently retrospective, which can lead to over-simplification of views, as a nominal overall impression is given, with little differentiation concerning a range of experiences. Thirdly, because the evaluation is undertaken after the module has finished, any potential improvements can only be made after the students have finished their learning; future cohorts might gain from any given evaluation, but not those completing it. Finally, the analysis of questionnaire data is often reductive, leading to numeric summaries, with little explanatory or discursive insight into the complexities of the activities undertaken. This means that general patterns become important rather than individual experiences, losing the idiosyncratic nature of student experience, which might be especially important for distance learners.

The restrictions above are already recognised within the literature (Wachtel, 1998) and have led to the development of alternative approaches in an attempt to gain a better evaluation of learning, whilst also acting as the starting point for cyclical development of the curriculum. For example, Ellery (2006) developed a multidimensional evaluation framework for use on a campus-based course on data analysis in social science research methods. The approach not only gathered information from students, but also captured lecturer perspectives in an attempt to create a more complete picture of student experience and learning. A number of methods were used to capture evaluative data that were then used to inform curriculum and pedagogic development. Benson et al. (2009) extended the idea of formative evaluation further by developing a participatory evaluation model, which was again multi-modal, based on the work of Jackson and Kassam (1998), which is

“a process of self-assessment, collective knowledge production, and cooperative action in which the stakeholders in a development intervention participate substantively in the identification of the evaluation issues, the design of the evaluation, the collection and analysis of the data, and the action taken as a result of the evaluation findings” (Jackson & Kassam, 1998; p.3).

Here, students were involved in identifying the terms of evaluation before being involved in data capture and interpretation. This made them and lecturers joint investigators into their own work, and gave a sense of joint responsibility for improving modules and learning. However, in both cases, these alternative approaches were developed within campus-based contexts. In addition, some of the same restrictions can occur, with evaluation leading to improvements for future learners rather than those who have completed the evaluation. In this investigation, we attempted to develop an alternative model which could be used in distance-learning contexts, and which could lead to new activity in-action, i.e. which would allow those involved in the consultation to gain from their involvement as they follow the module under analysis.

In developing a different approach to evaluation, we have chosen to develop a process which is closer to sensemaking (Weick, 1995). The underlying basis for sensemaking is that groups or organisations attempt, collectively, to understand and give meaning to their experiences. As with evaluation, sensemaking often occurs at the end of a process, retrospectively gaining insight into experiences to help develop better processes for the future. However, Klein et al. (2006; p.71) state that, “Sensemaking is a motivated, continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively”. This leaves the possibility of sensemaking as an interactive tool for understanding and developing practice as it emerges. If we can begin to understand the trajectories being followed by students as they undertake a module, we can begin to act effectively as the module unfolds. We can also begin to gain insights into the idiosyncrasies of student engagement and learning within a module. This approach also sees tutors and students as partners in the development of their learning, which itself can lead to a cyclical process of communal change (Figure 1).

Figure 1

Figure 1. The process of sensemaking and sensegiving between students and tutors (based on Iveroth & Hallencreutz, 2016; p.51)

Iveroth and Hallencreutz (2016) see sensemaking as offering a cyclic process where change leaders and employees are able to use action and dialogue to move practice forward. Figure 1 shows a process where tutors are able to build new elements in a distance learning module, thereby sensegiving to the students. Iveroth and Hallencreutz (2016; pp.49-50) describe sensegiving as

“the process whereby an individual tries to influence others’ sensemaking processes…A sensegiver provides open communication and cues in such a way that it affects the construction of meaning of others.”

Students in turn make sense of these materials and, through their work and any dialogue about it, sensegive to the tutors. By trying to understand the student work and perspectives, the tutors in turn make sense of the experiences, adding and changing subsequent materials and approaches to take account of the ideas they receive. The cycle then begins again. It is this cyclical sensemaking process that we chose to develop to help us change, and hopefully improve, the learning experiences of the students. The process which we used to aid this cyclical change was an approach called lesson study, which is outlined below.

Aims of the pilot study

This investigation was undertaken on a distance-learning MA in International Education course. The programme includes a 30-credit module on research methods, the second module of four which make up the first 120 credits of the masters course. We decided to focus on this particular module as it has been identified as one which students struggle with and which often leaves them with poor and incomplete understanding. Other researchers have also identified the issues in teaching research methods at Master’s level (e.g., Barraket, 2005) and at a distance (e.g., Schulze, 2009). In developing an evaluative process, we wanted to create an approach which allowed for:

  • diagnostic and formative module information;
  • a clear link to curriculum development;
  • a framework for distance learning change and development which relies on more than a performative activity;
  • putting pedagogy (interpenetration of teaching, learning, curriculum and assessment and their interaction with teachers and students) at the centre of the process;
  • emergence and trialling of new approaches as a standard element of our work.

The cohorts we work with are small, typically 8 to 10 students per intake (with two intakes per year), which needs to be kept in mind when considering the process and results we have gained. We are presenting this work within a particular context, and the degree to which it might translate, especially to very large distance learning courses, is not known.

Outlining the Evaluative Framework and Pilot Data Collection

In developing a cyclical model to allow a sensemaking approach, we decided to use a variant of lesson study (Lewis et al., 2009), a framework which attempts to help teachers improve their pedagogy by working collaboratively to improve student learning. Lesson study has been a core feature of educational development in Japan for over 100 years. Since the end of the 20th century it has moved beyond Japan, and is now a well-established method for pedagogic development, increasingly embedded in countries across all continents. It is a collaborative form of action research (Wood et al., 2017) which relies on dialogue and joint endeavour, meaning that it cannot be undertaken by individual tutors reflecting on their own practice.

A basic cycle of lesson study is given in Figure 2, and begins with a group (as few as two will work) of teachers coming together to identify a learning challenge. The learning challenge is a specific element of learning that students struggle with and often fail to understand well. Having identified such a challenge, the group then work together to plan a lesson which engages with the elements of that challenge to create a pedagogic experience which will help students gain a greater level of understanding. This requires the teacher group to spend time considering not only the teaching element of the lesson, but also the learning of the students. How will they engage with and make sense of the subject matter? How will particular activities be understood and completed?

Once the lesson is planned and resourced, one member of the group teaches the lesson, whilst the other members observe a number of students. The observations focus on trying to note how students react and make sense of the lesson material. Once the lesson has finished, the teacher group reconvenes to consider the evidence for student learning and the degree to which the lesson has helped them move forward in their understanding. If there is the opportunity, a second lesson can be taught again to a parallel group, making amendments to the original lesson where necessary, to maximise the level of pedagogic insight gained from the process.

Figure 2

Figure 2. A basic lesson study cycle

To date, lesson study has only been applied in face-to-face contexts, in large part due to the school-based context within which the vast majority of lesson study takes place, together with the central role of observation. We decided that we would develop a modified version of this approach as the basis for developing and evaluating student learning as it occurred within a distance learning module. Our research was undertaken with three students from a cohort of five who were undertaking the research methods module, using an amended version of the cycle in Figure 2. A central element of the lesson study approach is the observation of the research lesson, an activity which obviously does not translate directly to a distance learning context. However, where appropriate, discussion board dialogue might stand in place of this element of the cycle. We decided to use individual semi-structured stimulated recall interviews (Lyle, 2003) as the main source of evidence for approaches to, and levels of, learning (see Appendix 1 for questions used) in place of observations, and hence the amended lesson study cycle looked like that given in Figure 3.

Figure 3

Figure 3. Modified lesson study cycle for distance learning contexts

This cycle was used to investigate two areas of the research methods module which often cause problems in student learning:

  • developing research questions;
  • organisation, analysis and interpretation of data.

Having completed these two cycles of investigation, we decided to include an extra step in the process, based on research in lesson study at higher education level (Wood & Cajkler, 2016). In a final, third, cycle, which focused on the development of critical writing in student work, we included a step before the first planning meeting which consisted of individual interviews with the three students to investigate their understanding of some of the key concepts of critical writing, and to ascertain the pedagogic approaches they preferred when learning online. This amended cycle, shown in Figure 4, engaged students both before and after the learning to be undertaken, meaning that they became part of the process of sensegiving early in the cycle. It allowed us to get a much clearer and deeper understanding of the issues they faced before we began to plan an area of work.

The module lasted for 16 weeks, with the three cycles of modified lesson study occurring at weeks 2, 7 and 12 (see Figure 5). In addition, general individual interviews were undertaken with the three students at the beginning, middle and end of the module. The intention of using this model was to help us to develop a deeper understanding of what students believed they were learning, but also how they were making sense of the module materials. As such, this gave us an opportunity to evaluate, amend and develop approaches as the module unfolded (hence the idea of an emergent sensemaking approach). The approach also allowed us to gain ideas and insights from each other as tutors as we developed the module together.

Figure 4

Figure 4. Enhanced modified lesson study cycle for distance learning contexts

Figure 5

Figure 5. Overall schematic of data collection for emergent evaluation approach


This pilot study allowed us to gain a number of insights. Here, we focus on the reflections and advantages we gained as curriculum developers, and also consider some of the possible issues and challenges of scaling up the approach to larger groups.

Initial starting points

At the beginning of the module, the initial interviews we completed gave us very useful insights into relevant prior learning of students. One student had completed a social research methods course at undergraduate level, and therefore felt that she had a relatively clear foundation on which to build her studies,

“The fourth year [of my degree] was a dissertation research. So I’d got some experience with it. I mean it’s been a few years but with the readings, it’s starting to all kick back in and I’m starting to remember the things that I was doing back then.” (student C)

Another student had come from a non-social-sciences discipline, but had been involved professionally in small-scale research projects, and so had some practical experience of research involvement, but little theoretical perspective.

“I’ve done little bits and bobs subsequently. Things like in my work when I have been generating outcomes of projects. Carrying out things like student focus groups etc., things like that. So not hugely detailed or in-depth in terms of my actual research in the past, but as I say, some exposure to it.” (Student A)

This helped us understand the very different starting points from which students enter the module on research methods, and the extent to which we could assume a basis in research methods or not.

The interviews also gave us the opportunity to understand in some detail the learning practices students had developed in their first module, prior to studying the research methods module. Again, even though we only conducted interviews with three students, all had very different approaches to their studies, often based on prior learning approaches, but also on practical, work-based constraints. These included different preferences in terms of working with others, and sources of information. Student A had a preference for collaborating with others in his work, whilst Student B preferred to read and work individually, partly as a result of a hectic work schedule which meant she was regularly away from home on work-related visits. There was also wide variation in the use of digital media. Student A made use of a number of apps and worked predominantly on an iPad, whilst Student C preferred to use a laptop unless she was away from home.

By using the information from the initial interviews we were able to gain some critical and rich insights into prior learning and students’ approaches to their learning. These provided useful starting points for our collaborative planning sessions once the lesson study cycles started.

Cycles 1 and 2

The first two modified lesson study cycles (see Figure 3), which focused on research questions and on data organisation, analysis and interpretation, proved to be very positive experiences for the two of us as researchers. The opportunity to discuss and build a week-long work package of study activities allowed us to develop a more critical and in-depth consideration of the content to be covered. During the second cycle, we were also able to use student stimulated recall data from the first cycle to inform our discussions. In the planning meetings we built out from some basic principles, developing possible narratives and activities to create a coherent package for students. An example of notes written on a white board from the planning meeting for modified lesson study Cycle 2 is shown in Figure 6.

Figure 6

Figure 6. An example of board notes from the planning meeting for lesson study Cycle 2

In these meetings we considered how we thought the students would engage with the materials and how this would help them in understanding the issues and concepts covered during that week. These predictions were then considered again when we evaluated the week’s work package in the evaluation meeting, having interviewed students to understand their perception of their learning. As such, this approach gave us a lot of insight and further questions concerning the development of curriculum materials. As we evaluated one element of the work, it helped us consider the development of the next element, often in ways we had not envisaged, leading to the notion of an emergent process.

Cycle 1 of lesson study focused on helping students understand and develop research questions. Past experience had suggested to us that students often struggle moving from a broad research topic to specific questions. The research question work package therefore focused on the process of filtering from an initial idea to a set of research questions. This was done by asking students to begin by watching a short video on developing research questions before writing a 2-3 paragraph outline of their potential research project. Once this was complete, they were then asked to construct 2 or 3 statements central to this initial description. These statements were then to be converted into questions. Finally, both the initial description and the questions were posted onto a communal wiki for peer discussion.

The interviews with students at the end of the first cycle of lesson study helped us to understand what students had focused on, and how they had made sense of their experiences of their work on research questions. Student A reflected on how the materials had helped him to narrow in on a set of research questions from a broad starting point, and from this to how the research questions might be linked to the use of particular data collection methods,

“What I’d got from the learning I suppose is taking it down to something quite tight and quite focused.” (Student A).

Student C took a very different approach to the work package. She decided to use a thinking framework she used in science teaching with her students as a medium for focusing in on research questions,

“you start off with a big circle on the outside, a smaller one inside, and getting to a smaller idea. So you start off with a broader concept on the outside of the circle, and with research and everything, slowly getting it smaller and smaller” (Student C).

Whilst the focus on starting with a broad idea and narrowing from there worked well, the collaborative nature of the wiki had less universal success. Whilst two of the students found the opportunity to share ideas useful, the third student was less sure,

“..everyone’s just too busy with their own areas and people are selfish with their time. They don’t want to dedicate, rightly or wrongly, they don’t have the time to input into somebody else’s work or reflect. If it was marked I’m sure you would get greater input.” (Student B).

These insights helped us understand that the students had started positively in relation to the learning challenge, i.e. that the materials which had been developed had helped them create some good research questions. This meant that we could be confident going forward into the module that they had a sound basis for further work, and did not require further support in this area. However, it also gave us insights into the partial success of the wiki as a collaborative space.

Cycle 2 of lesson study focused on another area which often challenges students, the development of data analyses. Often, the challenge here is in understanding how analysis methods work, and how data should be treated after collection. In many research methods texts, data analysis is seen as a singular process, leading to ‘mechanical’ understandings of process rather than a more holistic engagement with data. With this challenge in mind we started by breaking down the process of data analysis (Figure 7).

Figure 7

Figure 7. Developing a finer-textured approach to data analysis

Therefore, the work package started with an explanation of how data should be organised, with consideration of transcribing, and of organising quantitative data. This led to a section on data analysis which asked students to analyse data they had collected themselves for a pilot study they were developing across the module. This section offered worked examples on which to base their work, but the concrete opportunity to work on their own data was deemed important in supporting their understanding. The work package then concluded with some reflections on simple frameworks for interpreting and writing about data.

Interviews with the students after they had completed the work package suggested that they had found the resources easy to interact with. They tended to focus on those resources which aligned with their own collected data, and did not necessarily work with those activities which outlined analysis methods which they did not need,

“Yeah. So I basically tried to focus on the things that I was working on. So I focused on the interview and observation thinking that if it would come to it, I would come back to the website to have a little look at the questionnaire part.” (Student C).

The students did voice a general confidence in the work they had completed, being able to talk at length about the analysis methods they had used. They also commented on the focus on interpretation, finding it useful, but also, to an extent, difficult. This appeared in part to reflect the difficulty of creating a well-considered and weighted interpretation. It was this reflection which led us to complete a third cycle focused on helping students understand the structure and process of writing a research methods assignment.

Cycle 3

In the final cycle of modified lesson study we attempted to consider how a participative model (Wood & Cajkler, 2016) would work (see Figure 4). In Cycles 1 and 2, we had started to develop a sensemaking approach which allowed us to consider and modify materials as the students were completing the module, but the student involvement was still, overall, retrospective. By using a participative approach, we could gain insights from the students before planning the session. We started by asking the students to explain particular pertinent issues, such as ‘critical writing’, to gauge their understanding of core concepts for the week-long work package on assignment writing. We then went on to ask them what they believed they would gain most from in the week, given the focus on starting their assignments. These reflections were extremely useful in helping us understand which activities would have most potential impact in taking their learning forward, and some of their ideas and reflections were incorporated into our discussions and curriculum development. For example,

“Well I suppose I look at criticality in two ways. Criticality I suppose around questioning…It’s going into it in a bit more detail and interpreting that. Then at the same time, balancing those interpretations against other findings I suppose.” (Student A).

Other issues which came across as important, and in which some of the students had less confidence, were the development of a coherent argument, the discussion of data (reinforcing comments from Cycle 2), and how to develop a reflective voice regarding critical writing about the research process. Interestingly, in each of these cases, students often offered clear but brief definitions of key terms which suggested some understanding, but little depth.

On the basis of the student interviews, we planned a week-long work package built around an exemplar assignment from a previous cohort. The exemplar was introduced in general terms, and the package then offered consideration of, and advice on, the issues of:

  • Coherence, showing how various aspects of the writing needed to create threads through the work. An example of this was the importance of the research questions as a thread through the assignment (see Figure 8).
  • Discussing data, particularly in terms of how patterns and exceptions might be the focus of quantitative data writing, and how quotes might be used in qualitative analyses of data.
  • Finding a reflective voice, with suggestions for further reading, as well as consideration of the difference between analytical and reflective writing.

Figure 8

Figure 8. Diagram from the section on coherence in writing showing how research questions could be a thread throughout the assignment.

After the work package had been completed by the students, there was strong support for the utility of the resources provided. For example,

“Well it helped in terms of helping me getting mine structured and giving an idea between the tone or voice…the focus you need to do within the sections.” (Student A).

We believe that by engaging the students before the planning, we were able to target our work, and how we planned for their learning, much more clearly. Indeed, the participatory approach gave the students the opportunity for sensegiving, from which we made sense before creating a work package. In this way, whereas in Cycles 1 and 2 we had been the primary sensegivers in the process (i.e. creating the work packages as we believed they needed to be), in Cycle 3 the students became the primary sensegivers whilst we made sense of their perceptions. We then used this sensemaking, together with our own expertise, to develop a framework for helping them overcome the challenges they had identified for us.

Understanding patterns of engagement over the module

The emergent sensemaking approach allowed us to develop elements of the curriculum in real time, driven by student response and reflection, thereby helping us shape the content and approach of the module. This meant that we gained a deeper understanding of the complexities of pedagogies, with the chance to respond to need. In this way the process became both diagnostic and formative, rather than summative as often happens in module evaluation. Whilst the lesson study cycles allowed us to engage with specific parts of the process of student learning, they also allowed us to begin to understand the very different ways in which students make sense of their studies, and the approaches they used over the course of the module. Table 1 contrasts Students A and B in terms of their approaches to learning over the course of the module.

Table 1:   Comparison of two students’ approaches to learning during the module

Table 1

The two contrasted quite markedly in the way they approached their learning. Student A worked quite collaboratively, enjoying the opportunity to discuss his ideas with others and posting regularly on a communal wiki, as he found peer feedforward helpful in his work. His iPad was also central to his work, including regular use of video content. Student B, by contrast, engaged with content mainly through reading, completed through a mixture of iPad and laptop use. Here, engagement with others, through either informal discussion or the use of the wiki, was almost absent. This student very much preferred to work alone on a specific work-based project.

These insights helped us to understand the need for creating rich resource environments which would allow students to navigate their way through the course in ways which would suit them. There needed to be a flexibility in the way we presented materials so that all participants could construct a positive and well-structured approach to their learning.


Sensemaking, in this context pursued through the use of a modified version of lesson study, makes the processes students engage in through their studies more explicit than would be the case in dominant summative evaluation systems. As the outlines in Table 1 above demonstrate, the students in our sample have approached their work very differently, whilst arriving at similar endpoints by the conclusion of the module. These processes might remain totally hidden in traditional end of unit evaluations, or partially hidden if using learning analytics. The use of lesson study does not itself give a complete picture, but it does make room for dialogue, and in the case of Cycle 3, for joint development of learning approaches and materials. As such, it is possible to begin to construct rich narratives which highlight the complexity of student work, and which may help build more robust and well targeted online environments for students to navigate and engage with.

An important part of opening up dialogue is understanding, albeit somewhat incompletely, the learning ecologies the students build in making a success of their studies. The students in this study are predominantly educational professionals from a range of contexts. Their work can sometimes require them to travel abroad during their studies, or to work in short bursts between periods of high professional workload. Beginning to understand how these factors impact on their studies, how students react, and how they mediate the different stresses and constraints on their time can all be useful in designing successful online resources.

By developing dialogue with students over the course of the module, together with just-in-time materials development, we were able to focus on elements of the module we knew from experience students struggled with. This led to course development and reflection in real time, focusing on the students and their emerging work. In Cycle 3, conducting interviews before materials development allowed us to create focused and well-targeted materials which could then be tested and assessed. As such, there was a rapid cycling of sensegiving and sensemaking which the lesson study approach allowed to emerge. This is a time-intensive process, but for elements of work where students really struggle, it appears to offer a viable route to creating better learning experiences.

From our own perspective, the use of lesson study also allowed us to make space for deep work (Newport, 2016). All too often, development work on distance learning materials can fall into an accelerated process in a busy academic department where academics are responsible for a number of different programmes and other activities. This can lead to quick fixes and default approaches. Using lesson study led us to make time to focus on important pedagogic issues. Newport (2016) characterises deep work as a process of acting in a focused and undistracted way on cognitively demanding issues. By focusing on elements of a module, with collaborative planning and evaluation meetings, we were able to engage in this way.

These insights demonstrate that we gained a deeper, though still partial, understanding of student learning, but they also make clear the difficulties which exist when attempting to capture the nature and detail of that learning. We were only able to identify and work with some aspects of the module, and only from the perceptual accounts of three students. However, whilst we accept that we were not gaining a whole picture of the student experience, we were nevertheless engaging with, and making sense of, student learning at a much deeper level than standard evaluative processes allow.

Final thoughts

Using a modified approach to lesson study as the basis for sensemaking and change has proved to be a very positive experience. It allows for rolling renewal and development of materials, and moves away from overly-general summative evaluations which are often too vague to help develop new pedagogic approaches. We believe that a sensemaking process, in this case through the use of a modified lesson study approach, can offer useful insights and allow for curriculum development which is both well-grounded in an understanding of student needs and which helps programme tutors gain a shared perspective on the course they are responsible for. We also believe that lesson study, in a modified form, translates well from a face-to-face setting to one which supports the development of distance learning pedagogies.

There were inevitable challenges, the most important being time. The three cycles of lesson study led to intensive work, with interviews leading to planning and weekly study package design and development within one working week before student use. This means that time was not only being made to complete the lesson study cycle, but also to complete a work package within a five-day window. This was intensive work, but did rely on a foundation of pre-existing module material, so that package development was in some cases a process of editing and reorienting rather than starting from scratch. Because of the intensive work required to develop a work package, and the multiple cycles used across the module, we envisage this approach being used in a targeted manner, at most across two modules per year. In this way, it could be used as an integral approach to renewing and innovating on distance learning courses. To attempt to use it on a larger number of modules over one academic year would, in our opinion, be unsustainable. The core aspect of this approach is the dialogic and formative nature of the process. This takes time, and whilst the rewards, we believe, are potentially great, the required input is intense over a relatively short period of time.

It also has to be stressed that the cohort involved in this pilot study was small, with only three students being involved in the interviews, and five overall in the cohort (enrolment in this cohort was lower than the usual 8-12). There is a question mark over how well this model would scale up, but we see no reason why it should not work with larger cohorts. A small group of students would need to be identified and invited to participate. Particular characteristics would also need to be selected to ensure that a representative, or recognised, set of issues could be considered. This might include students with different first languages, if an international group, or participants located in different technological contexts.

Finally, there is a wider question mark over the degree to which sensemaking would fit within wider, increasingly performative, evaluation frameworks used by universities. We see this approach as a replacement for summative evaluations, for the simple reason that it appears to give more nuanced, more critical and in-depth insight into the learning and needs of students. However, it is therefore working with the complexities of pedagogy and students; the use of summative statistics would be drastically over-reductive in this context, but is often the type of data that university quality assurance systems appear to require within some national contexts, such as the UK.

We see sensemaking as focusing on developing the quality and focus of curriculum approaches through diagnostic and formative debate with students and other colleagues. This pilot has demonstrated that a great deal can be gained by working collaboratively through a modified lesson study approach, supported by more general periodic interviewing. The constant, iterative approach allows for immediate incorporation of lessons learned and suggests that we can gain a much more in-depth understanding of student learning patterns and needs.


  1. Barraket, J. (2005). Teaching Research Methods Using a Student-Centred Approach? Critical Reflections on Practice. Journal of University Teaching and Learning Practice, 2(2), 62-74.
  2. Benson, R., Samarawickrema, G., & O’Connell, M. (2009). Participatory evaluation: implications for improving electronic learning and teaching approaches. Assessment & Evaluation in Higher Education, 34(6), 709-720.
  3. Ellery, K. (2006). Multi-dimensional evaluation for module improvement: a mathematics-based case study. Assessment & Evaluation in Higher Education, 31(1), 135-149.
  4. Iveroth, E., & Hallencreutz, J. (2016). Effective Organisational Change: Leading through sensemaking. Abingdon: Routledge.
  5. Jackson, E. T., & Kassam, Y. (1998). Knowledge shared: Participatory evaluation in development cooperation. West Hartford, CT: Kumarian Press.
  6. Klein, G., Moon, B., & Hoffman, R. F. (2006). Making sense of sensemaking I: alternative perspectives. IEEE Intelligent Systems, 21(4), 70-73.
  7. Lewis, C. C., Perry, R. R., & Hurd, J. (2009). Improving mathematics instruction through lesson study: A theoretical model and North American case. Journal of Mathematics Teacher Education, 12, 285-304.
  8. Lyle, J. (2003). Stimulated recall: a report on its use in naturalistic research. British Educational Research Journal, 29(6), 861-878.
  9. Newport, C. (2016). Deep Work: Rules for focused success in a distracted world. London: Little, Brown.
  10. Schulze, S. (2009). Teaching Research Methods in a distance education context: Concerns and challenges. South African Journal of Higher Education, 23(5), 992-1008.
  11. Wachtel, H. K. (1998). Student Evaluation of College Teaching Effectiveness: a brief review. Assessment & Evaluation in Higher Education, 23(2), 191-212.
  12. Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
  13. Wood, P., & Cajkler, W. (2016). A participatory approach to Lesson Study in higher education. International Journal for Lesson and Learning Studies, 5(1), 4-18.
  14. Wood, P., Fox, A., Norton, J., & Tas, M. (2017). The Experience of Lesson Study in the UK. In L. L. Rowell, C. Bruce, J. M. Shosh, & M. M. Riel (Eds.), The Palgrave International Handbook of Action Research. Palgrave Macmillan.
