
Enhancing Small Group Teaching in Plant Sciences: A Research and Development Project in Higher Education


 Carmichael, P., Irvine, N., Jordan, K., Johnstone, K., Tracy, F. and Truscott, H.(1)

University of Cambridge

Paper presented at the British Educational Research Association Annual Conference, University of Warwick, 6-9 September 2006


The Department of Plant Sciences at the University of Cambridge uses a range of learning and teaching environments including lectures, practical laboratories and small group tutorials. Under the auspices of the Cambridge-MIT Institute's Pedagogy Programme, a two-year research and development project concerned with the development of small-group teaching is being undertaken. The research element of this project endeavours to illuminate current practice and identify areas in which evidence-based development might take place. The development element will include professional development activities and the production of curriculum resources, including appropriate online material. This is a multi-method study comprising a series of student questionnaires; focus groups of students; semi-structured interviews with staff members; and the collection of video of small group teaching. In this paper we report selected findings from the 'student data' of the first year of this project.

The questionnaire, administered to two cohorts of students (2nd and 3rd year undergraduates), used a double-scale design in which students were asked to report both on the prevalence of a range of teaching and learning practices and on how valuable these were in supporting their learning. This type of questionnaire instrument is particularly appropriate because the data it generates is suggestive of areas for changes in practice. The gaps between 'practices' and 'values' (across both cohorts) suggested that students valued activities which improved their understanding of how elements of the course were interrelated; which related course content to 'authentic' examples; and those in which teachers made explicit the characteristics of 'high quality' student work. Small group teaching, in the view of most students, was best used to extend and explore concepts introduced in lectures rather than simply reinforcing them or assessing student understanding.

Data gathered through focus group activities illuminated the questionnaire data, providing detailed accounts of how students managed their own learning, and the roles played in this by lectures, small group teaching and other resources. Students identified the processes of planning and writing essays as key learning activities during which they integrated diverse course content and reflected on problematic knowledge. Questionnaire and focus group data suggested that students had less clear views regarding the value of collaborative learning, peer-assessment or activities such as making presentations to other students. When students talked in positive terms about these activities, they often referred to the learning benefits of preparation for the tasks rather than of the collaborative activities themselves. These views may provide indications of potential barriers to changes in learning and teaching environments, and suggest that any such changes may have to be carefully justified to students in terms of benefits to their own learning.  Many of our findings are broadly in accord with other work on teaching and learning in Higher Education settings (such as the 'Oxford Learning Context Project' and the 'Enhancing Teaching-Learning Environments in Undergraduate Courses' Project) in that 'deep learning' and 'authenticity' in learning activities are valued by students, and that the introduction of specific formative practices (such as sharing notions of 'quality') would be welcomed. At the same time, amongst the students in our sample, a view of learning as an individual process of 'learning-as-acquisition' predominates over a view that it is a social process of 'learning-as-participation', and this will inform the planning of the 'development' aspect of the project.  We conclude with a discussion of how the approach we have used might be more widely applied both within and beyond the Cambridge-MIT partnership. 
We also identify potential affordances of, and barriers to, the development of research-informed teaching in Higher Education.


Since 2004, a series of projects has taken place at the University of Cambridge involving the Departments of Engineering and Plant Sciences in collaboration with the Centre for Applied Research in Educational Technologies (CARET), funded by the Cambridge-MIT Institute (CMI); a parallel (but not identical) set of projects has also taken place at the Massachusetts Institute of Technology (MIT). These projects were designed to develop an evidence base which would enhance teaching and learning within undergraduate courses as part of an iterative process of development and research. This paper reports the results of a series of research activities intended to illuminate and explore student perspectives on teaching and learning in the context of a specific undergraduate course: namely, the 2nd Year Plant and Microbial Science (PMS) course. Potential outcomes of this process included the identification of characteristic elements of teaching and learning environments at Cambridge, particularly within small-group settings; the identification of teaching practices with the potential to improve student experience and learning outcomes (and as a result to inform decision-making by teachers as to how, when and with whom to employ specific practices); and the identification of potential opportunities for the support, enhancement or extension of learning through the application of new technologies.

It was fortunate that links already existed (at the project and individual levels) with researchers working within the Teaching and Learning Research Programme (TLRP), a coordinated research initiative funded by the UK government through the Economic and Social Research Council (Pollard, 2005).  Now comprising over fifty research projects ranging from pre-school to professional education, the TLRP also seeks to encourage cross-sector collaboration and the development of overarching educational ‘themes’ and integrative theory.  Several projects, in particular the ‘Enhancing Teaching and Learning in Undergraduate Education’ (ETL) project (Entwistle, Hounsell and McCune, 2002) and the ‘Social and Organisational Mediation of University Learning’ (SOMUL) project (SOMUL, 2005) provided good methodological and analytical frameworks for our work.  Some of the research approaches and user engagement strategies employed were derived from the work of the TLRP 'Learning how to Learn' project;  although this project was concerned with learning (and specifically with assessment) in schools, it had an integrated design involving exploration of student, teacher and organisational learning (James et al., 2006) and it provided a useful model for the type of expansive development and research processes envisaged for project work in Plant Sciences at Cambridge. 

A further driver for the project was the fact that while lecture content within undergraduate courses at Cambridge is well-established, responsibility for the activities involved in small group teaching is largely devolved to the constituent colleges of the University, and these activities are not prescribed. While there is guidance as to the curriculum content which they should address (in the Department of Engineering, for example, 'problem sets' are used as a common focus; Plant Sciences is in fact atypical in that small group teaching is administered through the department), supervisors on the whole have considerable autonomy as to how they select and manage learning activities. The resulting variation in student experience was raised as a point of concern in a recent student survey (CUSU, 2004), and was one of the reasons behind the recommendation of the QAA Institutional Audit (2003) that training be provided for new staff. The project was therefore seeking to address a set of broad concerns, applicable to higher education systems in general, but also had a focus on small-group teaching and its relationship to broader student learning in the specific context of Cambridge. Methodologically, the project can be conceived of as 'curriculum action research' of the kind characterised by Stenhouse (1978), although with the added element that many aspects of the teaching and learning environments to be studied were poorly documented and very variable in character: while anecdotes and idealisations stressing the potential of small group teaching are common, it was unclear to what extent these were accurate or typical. This meant that it was also important to develop a substantial evidence base for use as the basis for discussions with teaching staff and to aid in the development of interventions and changes in practice.
To this end, a wide range of data was collected in the course of the project with the intention of documenting teacher and student perspectives on current practice and potential areas for development of that practice.  One immediate concern was to explore the extent to which institutional aspects shaped not only practices but also any notions of agency and constraint on the part of both teachers and students.   A range of research instruments and approaches were developed including:

1.      A questionnaire designed to document reported teaching and learning practices on the PMS course from a student perspective, together with their attitudes towards those practices.

2.      A second questionnaire designed to document students' assessments of their self-efficacy, both in relation to generic skills and behaviours as well as those specifically developed in the PMS course.

3.      A series of student focus groups in which students discussed learning, teaching and assessment.

4.      Interviews with teaching staff about the teaching practices they reported using, their perceptions of students as learners, and the factors which support and constrain effective small-group teaching.

5.      Video data collected during small group teaching activities.

6.      Documentary analysis of course materials: course outlines, lecture notes, practical documentation, essay titles, past examinations, examples of student written work and resources produced by teaching staff to support student learning.

This paper will concentrate on the first three of these – the 'student' data sets and research activities – although it will also present an emerging analytical framework which draws on common elements of all of these data sources and those from other projects funded by the Cambridge-MIT Institute.

Teaching and learning questionnaire

The questionnaire was constructed to a relatively novel design using thirty items, each of which had two five-point Likert scales. This differs from traditional single-sided Likert scales (Likert, 1932) and double-sided Likert scales (Taylor, Curtice and Heath, 1995), as each item or stimulus is assessed by the participant in terms of two distinct and orthogonal variables.  Research instruments of this nature have been previously used in health and social care; for example, the two scales on a questionnaire dealing with the context of chronic pain might be 'Frequency' and 'Severity'. In relation to educational research, the questionnaire here draws on a design used in the Improving School Effectiveness Project (ISEP) (Robertson et al., 2001; MacBeath and Mortimore, 2001) and subsequently in the Teaching and Learning Research Programme's 'Learning how to Learn' project (James and Pedder, 2006). In light of these models, the two scales were concerned with 'frequency of practice' and 'value'. The items were derived from the list of practices identified by the ETL project (Entwistle, Hounsell and McCune, 2002), and from inventories developed by Trigwell and Ashwin (2003) in their work on small group teaching at the University of Oxford. A typical item is presented as follows:

Figure 1: Typical double-scale questionnaire item 

In the context of our study, it was not known with any confidence how frequently particular activities took place, so the values on the scales are nominal rather than referring to specific time intervals ('weekly' or 'monthly'), and this was made clear to participants. In any case, absolute values were of less interest than the comparative differences between the practices reported and the values expressed. If values exceeded practices, this was taken to indicate teaching and learning activities which, while rare, were highly valued by students and in which levels of practice might be increased; if practices exceeded values, this was taken to indicate areas of practice which, while common, could be used less often. One special case is a low rating on both scales, which was of interest as it might correspond to teaching and learning approaches so unfamiliar to students that they were unclear as to what affordances these might offer. One use of these 'value-practice gaps' is to indicate potential areas for intervention and change in practice - or at the very least to initiate discussions with teaching staff as to whether intervention is desirable and feasible. Changes in 'gaps' can then also be used, via repeated administrations of the questionnaire, to assess the extent of change in practice and the associated values, or to assess the efficacy of specific changes in practice.
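The gap statistics described above are straightforward to compute. The following is a minimal sketch (not the project's own code), assuming responses are held as two arrays of shape (students x items), one per Likert scale; the function name and data layout are illustrative.

```python
# Illustrative computation of value-practice gaps and the per-item
# summary statistics reported in Table 1. Array layout is assumed:
# rows are students, columns are questionnaire items, ratings 1-5.
import numpy as np

def value_practice_gaps(values, practices):
    """Return per-item mean gap (value minus practice) together with
    the mean and standard deviation of each scale."""
    values = np.asarray(values, dtype=float)
    practices = np.asarray(practices, dtype=float)
    gaps = values - practices  # positive gap: valued more than practised
    return {
        "mean_gap": gaps.mean(axis=0),
        "value_mean": values.mean(axis=0),
        "value_sd": values.std(axis=0, ddof=1),
        "practice_mean": practices.mean(axis=0),
        "practice_sd": practices.std(axis=0, ddof=1),
    }
```

Items with a large positive mean gap are candidates for increased use; items where both means are low may simply be unfamiliar to students, as discussed above.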

The questionnaire was first trialled and subsequently piloted with groups of 3rd Year undergraduate students who had taken the PMS course in the previous year, before being administered twice to a group of 50 PMS students. Of these, 37 completed and submitted questionnaires in Autumn term (a 74% return rate) and 28 went on to also submit questionnaires completed in Summer term (76%); hence, 56% of the total class returned both questionnaires. Although these are fairly high response rates given the voluntary nature of the questionnaire, the sample size is small overall (n<30 for the matched sample). Mean values and mean value-practice gaps were calculated based on the two administrations of the questionnaire. The inventories from which items were drawn (Entwistle, Hounsell and McCune, 2002; Trigwell and Ashwin, 2003) had been used in questionnaire instruments whose data had then been subjected to factor analysis; it was therefore of interest to see whether the findings from this questionnaire confirmed these earlier analyses, albeit with appropriate levels of caution given the small size of our sample. Trigwell and Ashwin's (2003, 68-69) factor structure covers a range of issues beyond the scope of our questionnaire, but we did find elements of their factor structure useful in accounting for variance in our results when we carried out confirmatory factor analysis. Preliminary exploratory factor analysis (PCA using Varimax rotation) suggests that the items load onto three components which account for about 40% of variance in relation to student responses as to the value of activities. These three components correspond to conceptual categories concerned with 'explicit learning and feedback', 'authentic and relevant learning' and 'engagement and motivation'.
(It proved much more difficult to develop a compelling factor structure for practices, possibly a reflection of the fact that students were reporting on their experiences with a number of different tutors who employed different teaching practices).  The process of developing this factor structure and its relation to the development of the broader conceptual structure  across projects at both Cambridge and MIT will be reported in a future paper.
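For readers unfamiliar with the procedure, the exploratory analysis described above can be sketched as follows: principal components are extracted from the item correlation matrix and the retained loadings are then rotated with the Varimax criterion to aid interpretation. This is a textbook reconstruction under stated assumptions, not the project's analysis code; function names are illustrative.

```python
# Sketch of PCA followed by Varimax rotation, as used in the
# exploratory factor analysis described above (illustrative only).
import numpy as np

def pca_loadings(X, n_components):
    """Principal-component loadings from data X (observations x items):
    eigenvectors of the item correlation matrix scaled by sqrt(eigenvalue)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = np.corrcoef(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, order] * np.sqrt(eigvals[order])

def varimax(L, tol=1e-6, max_iter=100):
    """Orthogonal Varimax rotation of a loadings matrix L (items x factors),
    using the standard SVD-based iteration."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / p)
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return L @ R
```

Because the rotation is orthogonal, each item's communality (the sum of its squared loadings) is unchanged; only the distribution of loading across factors is simplified.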

Responses were largely consistent across the two administrations, confirming the 'baseline' role of the questionnaire ahead of any interventions planned for the following (2006-2007) academic year, although there were a number of differences which are described below.  By far the largest value-practice gap (on both administrations, and 0.8 greater than the next largest gap) was 'assessment criteria or model answers are used to help me understand how well I am doing in my studies'.  Other items with large gaps where values exceeded practices on the first administration are shown in Table 1. 

It is notable that while the largest gap relates to sharing criteria and model answers, that item's absolute value rating is lower than those of several other items. This may be a result of students being less clear about the value of practices which are very rarely used, having little experience of them or of any benefits accruing from them. Another area where both practices and values were comparatively low was in responses to items concerned with peer learning and peer assessment, both relatively rare occurrences within the context of lectures and small group teaching at Cambridge. As Table 1 illustrates, it was clear that for the students in our sample, priorities for the development of teaching and learning environments and activities included: making the purposes of learning activities and success criteria explicit; linking specific activities to other course elements and to overall course objectives; and providing feedback on specific tasks and overall progress. These were also to be common themes when we conducted focus groups with a subset of the students who had participated in the questionnaire.

Table 1: High Value-Practice Gaps from Student Questionnaire Data

Item                                                           Value (V)      Practice (P)
                                                               mean (stdev)   mean (stdev)

Assessment criteria or model answers are used to help me
understand how well I am doing in my studies                   3.5 (1.0)      1.6 (1.0)

Teachers provide helpful feedback on my progress               4.4 (0.8)      3.3 (1.1)

Tutorials help me to have a clear idea of how the course
is structured                                                  3.9 (0.9)      2.8 (0.9)

Tutorials help me to focus on the importance of integrating
concepts rather than learning rules and laws                   4.8 (0.5)      3.7 (0.9)

I receive useful comments (orally and/or in writing) on
my work                                                        4.7 (0.6)      3.7 (1.0)

Tutorials help me to develop new ways of thinking rather
than just extending specific knowledge                         4.3 (0.9)      3.3 (0.9)

Tutorials help to develop problem-solving skills               4.2 (0.7)      3.2 (1.1)

Teachers help me to link together the different parts of
the course                                                     4.6 (0.7)      3.6 (0.8)

Tutorials help me to develop analytical skills                 4.2 (0.7)      3.4 (0.6)

Teachers make it clear right from the start what they
expect from students                                           3.9 (0.9)      3.1 (1.1)

Tutorials help me understand what is required on this
course for me to do well                                       4.3 (0.9)      3.4 (0.6)

A small number of items produced negative value-practice gaps, representing student dissatisfaction with the practices involved. These related to activities oriented towards 'surface learning' (Entwistle, 1981; Ramsden, 1992): 'tutorials are more about me showing how much I have learned rather than developing my understanding' (mean v-p = -0.5, stdev = 1.1); 'I am mainly asked questions which require recall of facts' (mean v-p = -0.2, stdev = 1.1); and 'I am assessed on what I have memorised rather than what I have understood' (mean v-p = -0.5, stdev = 1.4). Practice ratings for these items were low compared with other items (all with means below 3), but it seems reasonable to interpret this as meaning that despite these being less common activities, they are still more prevalent than the student sample would prefer.

The second administration of the questionnaire returned very similar values to the first (albeit with a smaller sample size). There were, however, some items showing small (but generally statistically non-significant) differences, perhaps reflecting the changing teaching and learning environment over time (particularly in the light of the imminent and important second year exams). Practices such as 'concentrat[ing] on learning just those bits of information I have to know to pass examinations' were apparently slightly more common, while the value attached to 'teaching ... techniques and procedures rather than arguments and reasoning' had shown a statistically significant increase (p < 0.05), possibly in response to the demands of an assessed practical course and upcoming practical examination. Two further instances of value-practice gaps reducing as a result of reported increases in practice concerned team work skills and oral communication skills; this may be attributable to the activities of some tutors who described how they changed their teaching practices to involve second year students in more varied activities such as case-based learning, debates and giving presentations ('teacher' data and comparisons of teacher and student data from the project will be reported in full elsewhere).
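The comparison between administrations reported above involves testing matched ratings for each item. The specific test used in the study is not stated; as a sketch, a Wilcoxon signed-rank test on the matched respondents is one defensible choice for paired ordinal (Likert) data, and might look as follows (function name illustrative).

```python
# Hedged sketch: testing whether an item's ratings changed between the
# two administrations of the questionnaire. The paper does not state
# which test was used; the Wilcoxon signed-rank test is one defensible
# choice for paired ordinal data.
from scipy.stats import wilcoxon

def changed_significantly(first, second, alpha=0.05):
    """first, second: matched ratings for one item from the same students
    on the two administrations. Returns (significant?, p-value)."""
    stat, p = wilcoxon(first, second)
    return p < alpha, p
```

With the small matched sample here (n = 28), an exact or rank-based test of this kind is preferable to assuming normality of the rating differences.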

Student focus groups

Between the administrations of the questionnaire, a series of focus groups was held involving a subgroup of the students who had completed the questionnaire. These had multiple purposes; in addition to providing a better insight into the perspectives of students on teaching and learning, they also served as opportunities for the research team to collect qualitative data to triangulate with questionnaire data, to illuminate further some of the patterns in the questionnaire data, and to validate some of the emerging hypotheses and constructs. The focus groups were organised to form a coherent series, as we wished to ensure that participating students would find these activities engaging and worthwhile. Each of the four sessions with each group had a distinct theme: "Learning in different settings", "What can teachers do to help you learn?", "Feedback and assessment" and "Different views on teaching and learning". In each case activities were arranged so as to generate information both about students' experiences of teaching and learning (corresponding with the 'practices' of the questionnaire) and their values, expectations and models of learning. Focus groups were held at the Centre for Applied Research in Educational Technologies (CARET), which we hoped would be perceived as a neutral location free from associations with teaching, and it was made clear to students that the purpose was not to audit departmental activities or their own progress, but rather to promote discussion as part of a broader programme of development and research activities.

A pilot series of four focus groups was run with third-year students as preparation for those with the second-years. The third-year focus groups were intended primarily to verify the appropriateness of topics for discussion, and to ensure that the practical aspects ran smoothly (although some issues arose from these groups which will be discussed further).  The pilot also provided the opportunity to test the best way of recording the content of the discussion.  Sound quality using video and microphones proved poor, and given the need for informality and for division of the group into twos and threes, it was decided not to attempt to use multiple microphones or microphones attached to individual participants.  Instead, detailed notes were taken by one of the researchers, and participants were encouraged to note down any issues prior to 'feeding back' to the larger group and these notes were also collected.  Researchers met immediately after the focus groups to review notes, to make additional observations and to offer immediate reflections. 

Analysis of the focus group data showed that one of the main concerns for students (and one which they couched both in terms of their own learning and the things which teachers could do in order to support them) was making the role and purpose of specific course content and learning activities more explicit. While it was recognised that course documentation included learning objectives and that criteria for assessment were made available, these were seen as being too general and were not operationalised in terms that students fully understood and (more critically) in ways that could inform their management of their own learning. Similar criticisms were made with respect to the advice and feedback provided by some teachers; the formal tutorial reports which were used to report progress were described as being 'too generic', and several students reported only 'knowing how they were doing' once they had received end of year examination results. While expertise and enthusiasm for the subject were regarded as important, the most effective tutors were characterised as those who took an interest in student learning, responded to individual and collective difficulties, and offered specific advice and targets. Students said that ideally the feedback they received would combine written comments and some kind of predictive grade 'to show where they should be aiming'; this applied as much to work which was well received as to that which was deficient in some way – with one student reporting that one of the most depressing occurrences was to have written work returned with the comment 'that's fine'.

Discussion of these specific issues led to several broader questions (which were not originally part of the planned schedule of the focus groups) being raised by students. The first was the extent to which norm- and criterion-referencing were used in assessment processes, and how appropriate and 'fair' these were; students were particularly concerned about the comparability of 'firsts' across subjects and across institutions, and drew comparisons with the national criteria used in the grading of 'A' and 'AS' level qualifications. The second was the extent to which there was any system of moderation (either internal or external) of the student experience and the quality of teaching and learning, and whether arbitration and appeal mechanisms existed - for example, to address issues of supervisor reliability, or the quality and quantity of feedback. While some students reported that they had made use of college or departmental procedures, some expressed unease about the role and influence of social networks and personal associations which existed between staff and of which they, as students, might be unaware.

While the focus groups were originally intended to explore attitudes to teaching and learning amongst second year students, and those involving third-year students were intended as a pilot, there were a number of differences between the attitudes of the two groups of students, especially in relation to their participation in 'authentic' learning activities, the different methods deployed to engage with and synthesise course material and their role as potential participants in broader academic activity.   These raised some additional questions about the differences between student populations – specifically, those who had chosen to specialise (or were likely to do so) in Plant and Microbial Sciences (and might also be considering postgraduate study in that area) and those who were studying Plant and Microbial Sciences as a course element subsidiary to another subject.

A good example of this pattern concerned attitudes to learning activities – both those that were organised by tutors and those that students undertook independently. Second year students described a range of activities which helped them synthesise the diverse course content; examples included: essay writing; watching natural history documentaries on DVD and extending the information provided by the presenter with concepts and insights gained from the course; rewriting lecture notes (in a limited number of cases as 'mind maps'); and annotating images and diagrams. What was evident was a tendency amongst the second year students to associate 'learning' with 'knowing' and 'memory', particularly in relation to the course content addressed in lectures. When researchers enquired about learning strategies, some students consistently responded with accounts of how they remembered information. The role of teachers was generally defined in terms of 'explaining' those areas which the students found difficult, and there was an expectation that they would have both subject knowledge and the ability to help students to understand it – one third year student stating that 'I want my tutor to know everything – or at least to pretend they do'(2). Teachers who were perceived as not knowing course content, or who withheld explanations or advice in order to stimulate independent student learning, were viewed in a negative light.

While some second year students made reference to essay writing as a learning activity, third year students argued that this represented the main learning opportunity within the course – rather than just being a form of preparation for examinations, or a means of assessment or monitoring.  They also stressed how project work, which as final year students they were involved in as a major course element, represented a significant opportunity to synthesise information, develop new techniques and extend their knowledge and understanding beyond the scope of the taught course.  Their emphasis on these activities may represent their additional experience and opportunity to reflect on this; but may also be a result of their increased orientation towards the 'authentic' and participatory activities that the final year course involved.  In this context, essay writing was perceived by some students as a preparation for academic and professional activity and was accompanied by departmental activities such as seminar presentations and preparation of work for academic publication. 

These orientations towards 'learning as acquisition' and 'learning as participation' mirror the two 'metaphors for learning' identified by Sfard (1998), but it proved important to explore with students not only their attitudes towards different learning activities but also the rationales which informed them. Second year students responded more positively to some 'participatory' activities such as group work and projects, peer assessment and student-led seminars (with some appreciating the absence of a 'big academic'). This may be linked to their concern with 'covering' course content: activities which contributed to a good grasp of core concepts, involved peers 'explaining' areas of difficulty, and in doing so equipped them for examinations, were viewed in a positive light. It should be noted that such activities were quite rare on the whole, and students from both groups had to be prompted before mentioning these as examples of learning activities in which they had participated (a pattern confirmed by the low 'practice' ratings in the questionnaire for peer learning activities).

In the final focus group of the series the question of acquisition and participation models of learning was revisited, and Sfard's (1998) distinction used as a starting point for discussion. Echoing Sfard's own warning as to the danger of choosing 'only one' metaphor for learning, students were keen to point to a 'cross-over' or combination of the two metaphors in all learning environments. That said, they associated learning as acquisition more with their experience of learning at school and in the early parts of their undergraduate courses (particularly lecture courses); third year students, with their greater experience of project work, identified this as representing a shift towards learning as participation. Practical classes earlier in the course were seen in terms of acquisition, albeit of skills and techniques, although the presence of postgraduate 'demonstrators' did provide the opportunity for learning from the experience and 'craft knowledge' of others.

Third year students, who had more experience of working directly with academic staff on research activities, were interested in Sfard's assertion that under the participation metaphor (represented, amongst others, by Wenger's (1998) model of 'Communities of Practice'), a prime concern of teachers is the preservation of domain-specific practices and a professional community. They characterised their year group in terms of those who did not intend to continue to postgraduate study, and for whom the acquisition metaphor provided a better description, and those who were already considering an academic career (and who were recognised as such by staff), for whom the participation metaphor provided a better match.

A final issue that emerged from the focus groups was the fact that the environment which we had established, in which groups of 4-8 individuals took part in activities centred on enquiry, was a novel one for students, who for the most part characterised their teaching and learning environments as comprising lectures, small group tutorials, practicals and individual study.  Furthermore, students reported that the focus groups represented their first opportunity to explicitly discuss teaching and learning, to share ideas and to voice concerns about teaching and learning issues.  At the end of the series of focus groups, all groups were asked to make suggestions for improvements to teaching and learning, and in all cases there was a consensus that 'sessions like these' should take place - possibly in the first term of undergraduate courses.

Student self-efficacy and its relation to other project data

An important aspect of the project is assessment of the impact of evidence-informed development of small-group teaching practices. Evidence (including the questionnaire and focus group data described here) has been collected to indicate changes with the potential to have a positive impact on student learning, and repeated administrations of the practice-value questionnaire will provide evidence of how practices and attitudes change as a result.  At the same time it is also important for the project to have a means of analysing the impact of those changes on student achievement; clearly one measure of this would be student performance in examinations, and the project team intend to use raw examination scores as one measure of change - both within and between cohorts of students.  Another approach is to measure self-efficacy, as part of a strategy which matches the broader aim of academic courses: to increase students' confidence in their own abilities.  The Plant and Microbial Science course which is the subject of this research is part of the Natural Sciences degree at the University of Cambridge, the overall aim of which is to produce graduates with the skills and knowledge sought by industry, the professions, and public service, as well as providing academic teachers and researchers for the future. This requires that students complete their degree with a high level of confidence in their academic abilities and in the transferable skills which they have developed as part of their studies.

Self-efficacy refers to beliefs about one's own capability to learn or perform behaviours at designated levels and in specific domains (Bandura, 1977, 1986, 1997).  The nature of self-efficacy has been researched across a range of contexts and has been found to be related to clinical problems; to management of stress and pain and activities such as smoking; and to athletic performance (Pajares, 1997). Self-efficacy has also received increasing attention in education, where it is associated with (and can be used as a proxy for) academic motivation, learning, and achievement (Pajares, 1996; Schunk, 1995). As part of this project, a self-efficacy questionnaire based on a model developed by William Lucas of MIT (Lucas and Cooper, 2005) was designed and administered as an online questionnaire using the Ostrakon student feedback server hosted at CARET.

Self-efficacy measures are varied: they range from those designed to describe generalised self-efficacy (which is correlated with a broad notion of self-confidence); through the self-efficacy to perform a set of tasks involved in pursuing a domain-specific career or activity such as setting up a business; and to more narrowly defined confidence to perform specific tasks (Lucas and Cooper, 2005).

With this in mind the questionnaire included a range of items covering self-efficacy in relation to general academic and scientific ability as well as domain-specific items derived from the course objectives (including both those that were generally perceived by teachers to cause students difficulty and those that caused few problems for them).  Other items were selected to investigate levels of student self-efficacy in relation to potential threshold concepts identified within the course[3]. General items included 'Carrying out scientific experimental data analysis' and 'Communicating with peers and academics in a scientific environment', while more specific items included 'Understanding how the mechanisms of photosynthesis are affected by environmental conditions' and 'Understanding strategies adopted by bacteria and fungi to cope with major environmental challenges'.  One item was included which related to a topic that had not been addressed in the course, allowing a check for the validity of individual student responses (or alternatively their disinclination to acknowledge a low level of self-efficacy on any item).
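
The validity check described above can be sketched in code. This is a hypothetical illustration, not the project's actual analysis: the item identifier, data layout and flagging threshold are all assumptions.

```python
# Hypothetical sketch of the 'catch item' validity check described above.
# A respondent's answers are a dict of item name -> score on the 1-9 scale;
# one item covers a topic not taught in the course. The item name and the
# margin parameter are illustrative assumptions, not from the paper.

CATCH_ITEM = "catch_topic_not_taught"  # assumed identifier

def flag_suspect_response(responses: dict[str, int], margin: float = 1.0) -> bool:
    """Return True if the score on the catch item is not noticeably lower
    than the respondent's mean score on the genuine items."""
    genuine = [score for item, score in responses.items() if item != CATCH_ITEM]
    mean_genuine = sum(genuine) / len(genuine)
    return responses[CATCH_ITEM] >= mean_genuine - margin
```

A respondent reporting high confidence on the untaught topic would be flagged; one reporting markedly lower confidence on it would not, suggesting their other responses are more likely to be considered.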

The Ostrakon server generated an automatic email to all 53 second year Plant and Microbial Science students with links to the online questionnaire which they were invited to complete. A response rate of 62% was achieved (31 out of 50), of whom 62% were female (the proportion of female students on the course also being 62%) and of whom 39% intended to take Plant and Microbial Sciences as their third year course.

The most striking aspect of the results gained from this questionnaire was that students were generally very confident about their scientific abilities, both general and domain-specific. On a nine-point Likert scale (with 1 being 'not confident' and 9 'completely confident'), the mean score across all items was 6.7 (stdev = 1.5, n = 31).  Even on the item which referred to the topic not covered in the course, the mean was 5.8 (stdev = 1.7, n = 31), and while this is significantly lower than the other items (t = -2.4, p < 0.05), it does suggest that only a very small number of students were willing to acknowledge a level of self-efficacy below the mid-point of the scale on any item.  There were two areas of significant difference in levels of self-efficacy.  Firstly, female students reported significantly higher self-efficacy on plant science specific items (t = 6, p < 0.05), but there was no significant gender difference for the general scientific items. Secondly, those students who had chosen to continue with plant sciences reported significantly higher levels of self-efficacy on the plant science specific items (t = 2.06, p < 0.05), but there was no significant difference with general scientific skills.
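
The comparison of item means against the scale mid-point can be illustrated with a one-sample t statistic, here computed from first principles using only the Python standard library. The scores below are illustrative values on the 1-9 scale, not the project's data.

```python
import math
from statistics import mean, stdev

def one_sample_t(scores, midpoint=5.0):
    """t statistic for H0: the population mean equals the scale mid-point.
    Uses the sample standard deviation (n - 1 denominator)."""
    n = len(scores)
    m, s = mean(scores), stdev(scores)
    return (m - midpoint) / (s / math.sqrt(n))

# Illustrative self-efficacy scores on the 1-9 scale (not the project's data)
scores = [7, 6, 8, 7, 5, 9, 6, 7, 8, 6]
t = one_sample_t(scores)  # roughly 5.0, well above the df = 9 critical value
```

Comparing the resulting t value against the critical value for n - 1 degrees of freedom gives the kind of significance judgement reported above; a statistics package such as SciPy would normally be used for this in practice.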

These data on self-efficacy are of interest in their own right, and it is clear that repeated administrations of this research instrument could provide useful information about the progress both of individual students and of cohorts as a whole, particularly if it were extended across other courses in the Natural Sciences; it would be useful to discover whether those students not intending to continue with Plant and Microbial Sciences report higher domain-specific levels of self-efficacy in the subjects in which they do intend to specialise. However, these data also raise some important issues when compared with the other data described here.

Students appear to be confident about their 'understanding' of specific course content (as evidenced by self-efficacy scores), and their view of teachers' roles (gathered in focus groups) suggests that they value highly teaching and learning activities designed to help them develop this understanding.  At the same time, they expressed considerable concerns about their ability to demonstrate this understanding, particularly by writing essays under examination conditions.  Amongst the clearest themes to emerge from the practice-value questionnaire were the needs for greater clarity as to teacher and institutional expectations; for better connections between learning activities; and for explicit demonstrations of success criteria and models of 'quality'.  This disjunction between 'understanding' and 'being able to show your understanding' may lie at the root of students' expressed concerns about the assessment system and their demand for greater transparency and accountability.

It is worth noting, however, that there was a belief on the part of the students who took part in focus groups that there was some rationality behind the assessment system, even if they were not fully party to it.  Despite the lack of formal opportunities for peer learning and the assertion that teaching and learning were not discussed, it was clear that a culture of advice, craft knowledge and student folklore has emerged in this conceptual space, and what focus group participants provided us with was a range of examples not only of how to learn per se but of how to show, under examination conditions, that you had done so.   Even students who had struggled to explain what strategies they used to learn (or remember) course content were able to describe sophisticated approaches to demonstrating what they knew in end of year examinations.

Responding to the data – and an emerging analytical framework

These student data have been used in two ways.  The first is the identification, primarily through the practice-value questionnaire, of a number of specific 'high-leverage' strategies.  Some of these have been used to inform interventions to be put in place in the Plant and Microbial Sciences course in 2006-2007, including:

·        Making lecture notes, presentations, practical handouts and notes, essay lists and previous examination questions available in a standard format in a 'wiki' within the Camtools Virtual Collaboration Environment hosted by CARET.   This is 'searchable', so that students can identify areas of commonality across the diverse aspects of the course.

·        Identifying and more explicitly 'signposting' key concepts and areas of problematic knowledge (including 'threshold concepts'), together with their scope, relevance and explanatory power.

·        Making available annotated essays in each substantive area of the course, including exemplars of work at different grades and including examiner comments.  Tutors will be supported in using these as part of formative assessment activities rather than simply as 'model answers'.

·        Linking course materials to external resources including websites, publications (from access points such as 'Web of Science') and case studies.

The second role for these data is in informing the development of a broader, overarching explanatory framework; this has emerged from existing literature (Entwistle et al, 2002; Trigwell and Ashwin, 2003); from the exploratory factor analysis of the practice-value gap questionnaire; and from qualitative analysis of the focus groups described here and of interviews with teachers involved in the course.  The framework involves six main elements, which have framed discussion within the research team and have also proved to be a useful point of departure for discussions with subject teaching staff (although they are not intended to be a 'plan of action' but rather a means of identifying areas for further study and potential action).  The first four elements are concerned with teaching and learning practices, and primarily involve reflection and possible changes in practice on the part of teachers and students:

·          Making Learning Explicit: this includes sharing and revisiting learning objectives, discussion of success criteria and relating specific learning activities to broader course objectives and vocational priorities.

·          Motivation, Engagement and Contingency: this includes teachers not simply being expert in their field but encouraging and engaging with student learning and the problems they may encounter.  It includes the notion of teaching 'contingently' by using diagnostic activities and questions to explore student understanding and misconceptions and responding accordingly.

·          Authentic Learning: this includes teaching with 'real-life' examples and through case studies, but also 'modelling' the authentic strategies and activities of experts in the field.

·          Student Self-Regulation and Independence: this concerns how students can be encouraged to take greater responsibility for their own learning and to reduce their dependence on tutors as arbiters and 'explainers'.  This may involve providing them with more guidance and information (as part of 'Making Learning Explicit' activities), or it may involve placing more emphasis on peer learning and peer assessment.

Two further themes are more concerned with course team, department and institutional activities.

·          Constructive Alignment, Synthesis and Throughlines: this theme borrows from the work of the ETL project (Entwistle et al., 2002) and concerns the alignment of learning activities, assessment processes and student priorities.  It represents an underpinning to the themes listed above: for example, the job of teachers in 'Making Learning Explicit' would be easier if the different elements of a course were closely integrated.  This is the level at which the provision of the online resources mentioned above is conceptualised - as a set of resources for teachers and students to frame specific teaching and learning activities.

·          Transparency and Accountability: This further extends the organisational level of the framework, and introduces issues such as the application of standards, inter-institutional differences and social aspects of teacher-student relations.

The data presented here, then, are perhaps best seen as a 'baseline' evidence base, useful not only for comparisons and the evaluation of any intervention, but also as the basis for informed decision making and managed change.  If the data help identify what changes might be made, the analytical framework points up some of the tensions which might be felt and factors which might support or hinder those changes.  While there are some specific changes in practice which would seem to have broad support across staff and students, others may be harder to implement (even if all parties believe them to be desirable).  Both staff and students expressed an unwillingness to reduce the examination system to what one student described as 'hoop-jumping', so the introduction of 'Making Learning Explicit' strategies will need to be carefully managed so as not to have detrimental effects on what we have termed 'Student Self-Regulation and Independence'.  In the same way, any introduction of activities involving students in peer learning may (on the evidence of the practice-value questionnaire and the focus groups) not be successful if it is seen solely as a social exercise to develop collegiality within the student cohort, and would have to be reframed in terms of its learning benefits to individuals or as an 'authentic' and transferable activity in which it is important for students to gain experience and expertise.

The 2006-2007 academic year, then, represents an opportunity for an expansive programme of research and development into teaching and learning at individual, classroom and organisational level, with the progressive elaboration and extension of an evidence base and the development of explanatory frameworks informed by both teacher and learner perspectives.  It also represents an opportunity to develop a replicable process of evidence-informed development of practice with potential applications across the University of Cambridge, the Cambridge-MIT partnership and the higher education sector more widely.


The project team would like to thank the staff and students of the Department of Plant Sciences and staff at CARET for their participation and support; and the Cambridge MIT Institute for its funding.




References

Bandura, A. (1977) Self-efficacy: Toward a Unifying Theory of Behavioural Change. Psychological Review, 84, 191-215

Bandura, A. (1986) Social Foundations of Thought and Action: A Social Cognitive Theory (Englewood Cliffs, NJ: Prentice-Hall)

Bandura, A. (1997) Self-efficacy: The Exercise of Control (New York: Freeman)

Carmichael, P., Burchmore, H., Tracy, F. and Johnstone, K. (2006 forthcoming) 'Threshold Concepts: an Emerging Interdisciplinary Research Agenda in Higher Education Research' in: D. Good, S. Greenwald, R. Cox, and M. Goldman (eds.) University Cooperation for Innovation: Lessons from the Cambridge-MIT Institute (Rotterdam: Sense Publishers)

Cambridge University Students’ Union (2004) Quality of Supervisions Report Cambridge: Cambridge University Students’ Union.

Entwistle, N. (1981) Styles of Learning and Teaching; an integrated outline of educational psychology for students, teachers and lecturers (Chichester: John Wiley)

Entwistle, N., McCune, V. and Hounsell, J. (2002) Approaches to Studying and Perceptions of University Teaching-Learning Environments: Concepts, Measures and Preliminary Findings. ETL Occasional Paper 1.

James, M., Black, P., McCormick, R., Pedder, D. and Wiliam, D. (2006) Learning how to learn, in classrooms, schools and networks: aims, design and analysis. Research Papers in Education, 21(2), 101-118.

James, M. and Pedder, D. (2006). Beyond Method: assessment and learning practices and values. Curriculum Journal. 17(2), 109-138.

Likert, R. (1932) A Technique for the Measurement of Attitudes (New York: McGraw-Hill)

Lucas, W. and Cooper, S. (2005) Measuring Entrepreneurial Self-Efficacy: Paper presented at the EDGE conference 'Bridging the Gap: Entrepreneurship in Theory and Practice' Singapore.

MacBeath, J. & Mortimore, P. (Eds.). (2001) Improving School Effectiveness. (Buckingham, Open University Press)

Meyer, J.H.F. and Land, R. (2005) Threshold concepts and troublesome knowledge: epistemological considerations and a conceptual framework for teaching and learning. Higher Education 48(3), 373-388

Pajares, F. (1996) Self-efficacy Beliefs in Academic Settings. Review of Educational Research, 66, 543-578

Pajares F. (1997) Current Directions in Self-efficacy Research. In: M. Maehr & P. R. Pintrich (Eds.). Advances in Motivation and Achievement: Volume 10 (Greenwich, CT: JAI Press);  1-49

Pollard, A. (2005) 'Taking the Initiative: TLRP and Education Research' Annual Educational Review Guest Lecture, School of Education, University of Birmingham, 12th October 2005.

Quality Assurance Agency (2003) Institutional Audit: University of Cambridge (QAA: Quality Assurance Agency)

Ramsden, P. (1992) Learning to Teach in Higher Education (London: Routledge)

Robertson, P., Sammons, P., Thomas, S., & Mortimore, P. (2001) The Research Design and Methods. In MacBeath, J. & Mortimore, P. (Eds.). Improving School Effectiveness. (Buckingham: Open University Press)

Schunk, D. H. (1994)  Self-regulation of Self-efficacy and Attributions in Academic Settings. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulation of Learning and Performance: Issues and Educational Implications (Hillsdale, NJ: Erlbaum); 75-99.

Sfard, A. (1998) On Two Metaphors for Learning and the Dangers of Choosing Only One. Educational Researcher, 27, 4-13.

Taylor, B., Curtice, J. and Heath, A. (1995) Balancing Scales: Experiments in Question Form and Direction CREST Working Paper 37 (Oxford: Centre for Research into Elections and Social Trends)

Trigwell, K. and Ashwin, P. (2003) Undergraduate students' Experience of Learning at the University of Oxford (Oxford: University of Oxford)

Wenger, E. (1998) Communities of Practice: Learning, Meaning and Identity. (Cambridge: Cambridge University Press).   


[1]               Author names are presented alphabetically following the corresponding author: Patrick Carmichael, CARET, 16 Mill Lane, Cambridge, CB2 1SB.  Email:

[2]               Another student in this discussion expressed a view that it was reassuring to imagine that one's teachers were 'omnipotent' – on reflection researchers decided that they probably meant 'omniscient', although it is impossible to be sure.

[3]               Threshold concepts are described by Meyer and Land (2005) as concepts the understanding of which is transformative, integrative, probably irreversible, context-specific and often problematic.  They are associated with 'conceptual change' models of teaching and learning and their study forms another element of the work of the project and of an associated ESRC/EPSRC project funded under the Technology-Enhanced Learning Programme ('Transforming Perspectives: technology to support the teaching and learning of threshold concepts' RES-139-25-0361).  See Carmichael et al (2006 forthcoming) for further details.

This document was added to the Education-Line database on 20 December 2006