
Quality versus time: a rationale for blended learning?

Karen Fill

Learning and Teaching Enhancement Unit
University of Southampton, SO17 1BJ, UK

Paper presented at the British Educational Research Association Annual Conference, University of Warwick, 6-9 September 2006


Abstract

An externally funded project, DialogPlus, has facilitated the creation, use and evaluation of a number of computer-based activities to enhance learning on geography degree courses. Student-focused evaluation of some of these innovations has been conducted on six separate courses of study in two British universities. The research methods included observation of students’ use of the learning activities, one-to-one interviews, focus group discussions and analysis of questionnaire data.

The questionnaire design drew on a methodology developed and piloted during a previous two-year collaborative project (Riddy and Fill, 2003). This Methodology for the Analysis of Quality of Open and Distance Learning (MECA-ODL) proposes generic quality criteria for three groups of participants: developers, tutors and learners. In the research described here, ten standard quality criteria for learners were used in the questionnaires completed by some three hundred students, facilitating comparative analysis across courses. The results of the quantitative analysis for each individual unit, together with student comments and the researcher’s observations of use, were discussed with the teachers, and their reflections informed the final evaluation reports.

While the results of the quantitative analysis are interesting and can be used by the teachers and materials developers to modify and improve the online learning activities, students’ comments and contributions to discussions facilitated or observed by the researcher have proved illuminating and occasionally surprising. Overall, the MECA-ODL approach and criteria have proved robust. The online activities have largely engaged students and contributed effectively to their learning. However, both teachers and students have found high-quality, computer-enabled learning to be satisfying but time-consuming, and would not want all the units they either deliver or take to have too large an online component. This would appear to endorse the current promotion and take-up of blended learning in British universities.

This paper presents and discusses the context of the DialogPlus project, the MECA-ODL methodology, and the findings of the quantitative and qualitative analyses.


Introduction

The Digital Libraries in Support of Teaching and Learning in Geography at Penn State, Leeds, UCSB and Southampton (DialogPlus) project was funded by the JISC(1) in the UK and the NSF(2) in the US to undertake development and evaluation work during 2003-2006, and to embed the lessons learned in the partner universities and wider academic communities during 2006-2008. A number of papers have already been published about the tools and resources developed during the project. These are listed on the DialogPlus website(3).

The project team included academics in the Schools of Geography at the four partner universities, supported by learning technologists, computer scientists, and educationalists. A minority of the developed resources were used in completely online courses. The majority were implemented in a blended learning mode, involving both students and teachers in online and offline learning activities. One of the roles the current author fulfilled on the project was that of internal evaluator. Primarily, this entailed helping the academic staff to understand the effect their innovations were having on the student learning experience. Towards the end of the development phase it also involved analysis of staff views on the project processes and outcomes, including their assessment of the impact on their colleagues and institutions.

Student-focused evaluation

Early findings from student-focused evaluation were presented at the European Conference on Educational Research (Fill, 2005). This paper builds on that research, using the same methodology, and reports results from 308 student surveys, supplemented by observation and discussion groups, over two academic years.

The student surveys used the ten generic quality criteria (Fill, 2005, p. 4) derived from the MECA-ODL methodology (Riddy and Fill, 2003). Table 1 below shows the criteria and range of responses permitted. Students were also encouraged to write comments.

For the Online Learning Activity, {title}, which you have just completed, please score each statement as

0 – No 1 – Somewhat 2 – Yes N/A – not applicable


1. There was a full description of the learning activity, including learning objectives.

2. The interface was easy to use.

3. Required tools were included (e.g. database, spreadsheet, note making, bulletin board).

4. The content met the needs of my preferred learning style.

5. The content was relevant, appropriate and clear.

6. All embedded materials were easily accessible.

7. Mechanisms were provided for information and support.

8. Maximum response times to learner queries were defined.

9. The assessed elements of the activity were appropriate for the learning objectives.

10. The activity improved my {subject} knowledge and skills.

Table 1: Generic quality criteria

These MECA-ODL quality criteria were supplemented by a further ten questions derived in conjunction with the course tutor and, when possible, by observation of the students doing the online activities and/or discussion with them individually or in groups. An example of additional questions is given in Table 2. For one of the course units the survey was mounted online; in the other five units the surveys were paper based.

For all the {unit title} online activities, please score each statement as

0 – No 1 – Somewhat 2 – Yes N/A – not applicable


1. I have enjoyed doing all the online activities.

2. I learnt a lot from doing the online activities.

3. I found the time suggested for completing the online activities to be accurate.

4. I found the formative questions and quizzes useful.

5. I enjoyed contributing to the discussion boards.

6. I found the contributions that other students made online helped my learning.

7. The tutor’s contributions to the discussion boards were helpful to my learning.

8. I found the {title} quiz useful.

9. The blend of online activities with traditional lectures is a good way to learn.

10. I would like more online activities on other units.

Table 2: Example of supplementary questions

As noted previously (Fill, 2005, p. 11), use of the same ten generic criteria across courses facilitates comparison of the mean scores derived from student questionnaires. Careful analysis of the comparative statistics can be useful, especially when augmented by student comments, in trying to identify any aspects of an online learning activity that might need to be improved. It can also facilitate year-on-year comparison, or investigation of gender differences, within a course unit. Examples of comparative charts are shown in Figures 1 and 2. It is extremely important that such comparisons are not treated in a simplistic way, for example to suggest that one online activity or unit is better than another. However, they can indicate where there are issues that would bear further investigation. On the DialogPlus project such indications were discussed with the tutors, and any further analysis was undertaken only when they considered it might prove useful and with their full support.
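The comparative charts rest on a simple calculation: the mean score for each criterion, computed over the 0/1/2 responses with N/A answers excluded. As a minimal sketch only (the data and function name below are invented for illustration, not the project's actual analysis code):

```python
# Sketch: mean score per quality criterion, excluding N/A responses.
# Survey scale (from the paper): 0 = No, 1 = Somewhat, 2 = Yes, N/A = not applicable.
# Here None stands in for an N/A answer. All data below are hypothetical.

def criterion_means(responses):
    """responses: one list per student, one score per criterion;
    None marks an N/A answer and is excluded from that criterion's mean."""
    n_criteria = len(responses[0])
    means = []
    for i in range(n_criteria):
        scores = [r[i] for r in responses if r[i] is not None]
        means.append(sum(scores) / len(scores) if scores else None)
    return means

# Three hypothetical students answering four criteria on one unit
unit_a = [
    [2, 1, 2, None],
    [2, 2, 1, 0],
    [1, 2, 2, 2],
]
print(criterion_means(unit_a))  # one mean per criterion, N/A excluded
```

Plotting such per-criterion means for two units, or for the same unit in two years, side by side gives charts of the kind shown in Figures 1 and 2.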

In addition to the statistical analysis, great attention was paid to student comments made on the survey forms, in online and offline discussion forums, and during individual or group interviews. It became apparent that many of the students found the online elements of these units more time-consuming than offline, or conventional, learning activities. Typical comments about this aspect were:

It took longer than the 3-4 hrs stated.

They took me a little longer than suggested.

It took a lot longer than was suggested.

Time suggested definitely a minimum.

Not sure what the suggested time was but it took a long time to work out how to do some things.

Figure 1: Example of a comparison across units

Note: In this instance, it would be important to investigate with the tutors possible reasons for the lower ratings of (for example) response times on units 1, 4 and 5, and to identify from student comments or further interviews the factors that contributed to (for example) satisfaction with unit 4’s interface, tools, accessibility and assessment.

Figure 2: Example of a year-on-year comparison for one unit

Note: In this instance, it would be important to investigate with the tutors possible reasons for the fall-off in ratings for criteria 4 (learning style), 7 (support mechanisms), 8 (response times), 9 (assessments) and 10 (skills).

Other comments related this common perception of elearning as time consuming to the embedded formative or summative assessments, particularly the maximum mark available for completing any online elements:

As it didn't count for any marks, it seemed a waste of time when we had more important things to do.

Each practical takes a lot of work but is only worth 10% & this seems quite small.

Perhaps 3 elearnings (sic) and an essay is too much for one module (along with an exam) - especially bearing in mind that we have other modules.

A nominal focus group session with 51 students on one of the courses revealed that the two worst aspects of the online learning activities were felt to be ‘lack of face to face support’ (nominated by 41% of participants) and that they were ‘really time consuming’ (18%).

However, many students had been impressed with the quality of both the computer based learning activities and the online support from tutors. Some of the thoughtful comments about these aspects were:

e-geomorphology is a perfectly viable means of learning about places which cannot be feasibly accessed.

They were a new form of learning and involve a lot more personal input. I believe I had to think more about what I was doing and why.

I found it very interesting, and while challenging, an enjoyable way to learn and understand.

Reading some of the queries other students had helped me to understand the material better.

(The tutor)’s responses were always very fast and very helpful.

Obviously a lot of time has been put in to constructing the weekly OLAs(4) - and this shows. I benefited quite considerably from this style of teaching, supplemented with personal assistance from the staff when necessary.

This last comment reflects some students’ perception that designing and implementing the online learning activities is time-consuming for staff too, something that emerged quite clearly from the staff-focused evaluation.

Staff-focused evaluation

Early findings from staff-focused evaluation were presented at the International Conference on e-Learning (Fill, 2006). Since then, the transcripts of one-to-one interviews with sixteen key project staff have been analysed in detail and reported to the team. For the purposes of the current paper, the reflections of academic staff that pertain specifically to quality and time are now considered.

As the project progressed, the teachers experienced at first hand the challenges posed by both the simplicity and complexity of blended learning (Garrison and Kanuka, 2004), as illustrated by the quotations below.

At its simplest, blended learning is the thoughtful integration of classroom face-to-face learning experiences with online learning experiences. (…) At the same time, there is considerable complexity in its implementation with the challenge of virtually limitless design possibilities and applicability to so many contexts. (Garrison and Kanuka, 2004, p.96).

I have never been involved, with my fellow geographers, or me myself doing it, in anything else where we have thought so long and hard about the constructive alignment of the learning we are creating, the learning objectives, the student experience, the assessment, the rationale, how it fits into the course, how much student contact time is involved, how it would transfer into other programmes. (From transcript of interview with a Professor, 2006)

This process (…) has led me to creating materials that are much deeper and much more active than I had before, that is they're doing real projects with real data, and because I'm so cloistered here in my academic job, I've gone out to people whom I know to be expert. (…) But I would never have got to the point of needing expert assistance had I not, through our methodology, greatly deepened and enriched the activities that I was engineering for students. (From transcript of interview with a Senior Lecturer, 2006)

The teachers had the continual support of learning technologists and educationalists, and some ‘buy-out’ of their teaching time, but several of them reported an overall drop in what they termed their ‘productivity’.

To do a proper job on this is incredibly time consuming. I will henceforth always work this way. The quality of the things I make is much better – the students might not always recognise this – but my productivity has gone way down because it takes so much longer to do something right. (From transcript of interview with a Senior Lecturer, 2006)

I did spend quite a lot of time on it, much more so than the support that was supposed to release my time. So it’s been a net sink of research output time. (From transcript of interview with a Senior Lecturer, 2006)

The aspects that they appear to have found particularly time consuming include the additional thought that goes into the overall pedagogic design when there are online learning components; getting programme approval for blended learning units; and monitoring and/or contributing to online activities such as discussion boards. Finding or creating appropriate online resources and mounting them in the various virtual learning environments was delegated to some extent to other project members, but the teachers were involved in testing and approving them, inevitably an iterative and time-consuming process. One team member who combined both roles, creating online learning activities from scratch and teaching on a unit that adopted the blended learning approach, commented:

eLearning is not only about saving you time, it’s about enhancing the student experience. (...) My perception is that there is an expectation of time saving and not about how much effort is involved in creation and just how difficult it is to do some things that you think would be quite straightforward, but then not an appreciation of how easy it is to do other things. (From transcript of interview with a Teaching Fellow, 2005)

One of the relatively ‘easy things’ to do via the online infrastructure is to provide timely advice and feedback which remains available to the cohort, something that is valued by the students and contributes to their perception of quality.

Given that there were 70 people on the course this time around, that lot couldn’t be taught in less than probably three runs of the practical class, they wouldn't all be doing it at the same time and they would not get the archive of response nor the frequency of response that we've been able to deliver online. When you look at what they say, one of the good points was getting feedback from the course tutor the next day, in some cases immediately, which would never be available in the conventional framework. (From transcript of interview with a Professor, 2006)

Timeliness is crucial, however, and the teachers involved in this project learnt how important it is to set student expectations carefully in this area and then ensure that they are met. Mazzolini and Maddison (2006) offer some interesting advice for online instructors in this regard.

With regard to the impact on student learning, the teachers and evaluator are aware that year-on-year grade comparisons are potentially invidious, compromised by many factors, including baseline attributes, prior experience and the tendency to normalise the range of marks awarded. However, some teachers noticed and remarked on qualitative improvements.

I would say I’m very pleased with student reaction and impact on them in general. I’ve noticed that students who have taken that 2nd year course have been quite good about recovering information from that unit and actually using it and helping to think and bridge topics which is something that they weren’t necessarily always very good at in the past. (From transcript of interview with a Senior Lecturer, 2006)

There aren't any mechanisms in place that would reveal to outsiders what is better about what we're doing. The students don't recognise it either. I recognise it. The only proof I have is that the quality of the discussions that we have feels better to me. (From transcript of interview with a Senior Lecturer, 2006)

However, many of the teachers had noted the comments in student feedback about the online components being overly time consuming and their reactions included puzzled concern, rationalisation of the study hours required, and reflections on the differences between adult and young learners.

I am a bit puzzled still by where in the communications it goes wrong, that we can get to the end of the course which is fundamentally about using online resources, where you look at the student evaluations and a lot of them say the best thing about this was using the real resources and yet quite a few of them say the downside was using all these online resources. (From transcript of interview with a Professor, 2006)

We now know that some students view the issue of time-consuming as being an issue but, when you deconstruct what that means, when you say to them 'well 15 credit point units, you’re supposed to have 150 hours of study time, isn’t that about right?' … one of the interesting things is because they are communicating about this via discussion boards and engaging in face to face sessions you actually find about these things which you never would except maybe when you got right to the end of the course and did the unit evaluation. So you can jump in and say 'OK, fair point, but get real'. (From transcript of interview with a Senior Lecturer, 2006)

The expectations are still there that a web course is easy and I don't think the adult students labour under that misconception. I think this comes from the following fact that while young students grew up with technology they use it for play, entertainment, they're not used to and not accepting of the concept of using it for work. Whereas the adult students know that technology is all about work. (From transcript of interview with a Senior Lecturer, 2006)

These reflections suggest that the tension between quality and time, for both students and teachers, contributes to a current rationale for blended learning.

A rationale for blended learning

On all six evaluated UK units, the students were campus based and their learning activities involved a mix of online and offline tasks. The amount of face-to-face contact they had with teachers and teaching assistants varied: one unit provided all its material online, with computer practicals offering the only face-to-face contact, while the others had a mix of large lectures, seminars, labs, fieldwork and online activities. Of these five units, four truly attempted the "real test of blended learning" (Garrison and Kanuka, 2004, p. 97), that is "the effective integration of the two main components (face-to-face and Internet technology)". Staff and students appeared to value the results of this integration for the enhanced learning experiences it provides: variety, access to real-world data for simulation of real-world tasks to explore theories and concepts, and the immediacy of online communications.

Variety, that is the experience of "variation in the critical aspects of the topic being learnt", has been suggested by Oliver and Trigwell (2005, pp. 22-24) as the one possible redeeming feature of blended learning as a pedagogic approach. To their example involving engineering students, we can now add several from the fields of Human and Physical Geography (see, for example, Priest and Fill, 2006). I would take issue with Oliver and Trigwell’s suggestion that the enhanced learning opportunities offered are "still largely serendipitous, because the space of learning is not currently a focus in blended learning design" (p. 24). It is consideration of exactly these issues, and how best to address them, that has made the design process so time-consuming for our teachers and the learning activities more demanding for our students.

The design sets them all a unique piece of work to do and even if they sit at adjacent computers in a workstation room they have to think it all through for themselves because they’re all working with data for a unique neighbourhood, the interpretation of the data is unique (…) Problems crop up at different points so they’re actually doing something which is really quite challenging but much closer to the real world (…) so I'm unrepentant about pushing them through that process because I'm really quite convinced that they're actually learning. (From transcript of interview with a Professor, 2006)

Concomitant with this is the ability of the online communications infrastructure to facilitate "communities of enquiry" (Garrison and Kanuka, 2004, pp. 98-99). Confronted with these real problems, the students engage in both face-to-face and online discussions. In evaluation interviews they are often reticent about their own online contributions, but they value the contributions of other students and the teacher(s), particularly where these remain accessible over the period of study.

Returning to the time issue: from the teachers’ perspective, effective pedagogic design and passing quality-assurance procedures are more time-consuming when units include online components. When a unit is running, online communications with students can take up more of their time than face-to-face contact. Students also find the online components more time-consuming than offline activities. This bears further investigation, especially with reference to the teachers’ views on credit equivalence, but is likely to stem from a mix of access difficulties (unavailability of computers, lower speed of access than teachers enjoy), online distractions (email, chat, surfing), and the deep learning involved. Staff and students appear to acknowledge that, done well, blended learning enhances learning experiences and outcomes, but they do not, as yet, want all units to include online components. This is surely a rationale for a continuing judicious approach to blending face-to-face with online teaching and learning.

I don’t foresee everybody in the school, or even the majority of people in the school, rushing out and having lots of online learning in their units, I think it’ll continue to be maybe 20-25%, a minority. As we’ve discussed before, maybe that’s actually the right amount because it’s enough to enjoy all the good things and to provide a little bit of mix. There’s definitely room for a range of types of teaching and learning within the programme. (From transcript of interview with a Senior Lecturer, 2006)


Conclusions

A three-year development project has resulted in the creation, delivery and evaluation of online learning activities within a blended pedagogic approach. The use of the MECA-ODL user criteria as a basis for student-focused evaluation of the online components has proved robust, particularly as a guide to quality issues that bear further investigation and improvement. Well-designed online tasks and communications have engaged campus-based students and contributed effectively to their learning, particularly their application of theories to real-world problems and their exploration of these within communities of enquiry. Although both staff and students have found high-quality, computer-enabled teaching and learning to be satisfying, it is also very time-consuming, for a number of reasons. Both groups resist the suggestion that all the units they either offer or take should have significant online components. Proponents of this sort of blended learning in Higher Education must address the tensions between time and quality at specific stages of the design and delivery cycle, notably pedagogic specification, programme approval, realisation and testing of the design, online access and task completion, online communications and response times, and appropriate credit allocation.


Notes

1. Joint Information Systems Committee

2. National Science Foundation

3. DialogPlus website

4. Online Learning Activities (OLAs)


References

Fill, K. (2005) Student-focused Evaluation of eLearning Activities. Short paper presented at the European Conference on Educational Research, University College Dublin, Ireland, September 2005.

Fill, K. (2006) Refreshed and Reflective: the impact of an elearning project on university teachers. In Proceedings of the International Conference on e-Learning, University of Quebec at Montreal, Montreal, Canada, June 2006, 139-146. ISBN: 1-905305-23-0

Garrison, R. and Kanuka, H. (2004) Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7,2, 95-105.

Mazzolini, M. and Maddison, S. (2006) The role of the online instructor as a guide on the side. In O'Donoghue, J. (ed.) (2006). Technology Supported Learning and Teaching: A Staff Perspective. Hershey, PA: Information Science Publishing.

Oliver, M. and Trigwell, K. (2005) Can ‘Blended Learning’ Be Redeemed? E-Learning, 2, 1, 17-26.

Priest, S. and Fill, K. (2006) Online learning activities in second year Environmental Geography. In O'Donoghue, J. (ed.) (2006). Technology Supported Learning and Teaching: A Staff Perspective. Hershey, PA: Information Science Publishing.

Riddy, P. and Fill, K. (2003) Evaluating the quality of elearning resources. Paper presented at the British Educational Research Association Annual Conference (BERA 2003), Edinburgh, UK, September 2003.

This document was added to the Education-Line database on 07 September 2006