Thesis Boot Camp: Drop and give me 20,000 words!

Over the past few years CAPOD has developed and integrated various ‘Writing Boot Camps’ for postgraduate students.  The impetus for such events arose from a PhD student who had previously studied at another institution where an assortment of student-led writing groups had been instrumental in providing a supportive and productive environment for ‘getting words down on the page’.  The student, Dawn Hollis, was keen to initiate the same sort of cross-disciplinary, grass-roots groups at the University of St Andrews, and approached CAPOD for some support.

Although CAPOD already offered a number of workshops on the mechanics of writing and improving writing, less support was offered in terms of the process of generating writing, a common problem faced by Masters’ and PhD students alike when confronted with producing a longer piece of written work.  The initial outcome of Dawn approaching CAPOD was a ‘home-grown’ Writers’ Boot Camp that was open to both cohorts, which ran over three days and received positive feedback.  Dawn designed the Boot Camp based on research about such events elsewhere, including the well-known Thesis Boot Camp model, which had also featured on the popular research blog The Thesis Whisperer.

Based on the initial success of this event, in 2016 CAPOD paid to bring in an external consultant, Dr Peta Freestone, to run the award-winning Thesis Boot Camp developed at the University of Melbourne.  The first St Andrews Boot Camp had 32 attendees and ran over three days: Day 1, 16:00–20:30; Day 2, 09:30–20:30; and Day 3, 09:30–20:30 (though participants could leave any time from 16:00 onwards).

The core tenet of the model is a focus on generative writing: participants must arrive with a plan and, at the event itself, concentrate simply on getting words down on the page.  A number of techniques were deployed to help students reach the target of 20,000 words (the equivalent of a thesis chapter) over the 21 hours of intensive writing across the weekend, including goal setting and motivational tricks, peer support, and the introduction of the Pomodoro Technique as a good way to structure writing time and remain focused. (The Pomodoro Technique is a time management method developed by Francesco Cirillo in the late 1980s; it uses a timer to break down work into intervals, traditionally 25 minutes long, separated by short breaks.)
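For readers unfamiliar with the structure, the interval pattern the technique prescribes can be sketched in a few lines of Python. This is purely illustrative: the function name and parameters are our own, not part of any library, and the "long break after four intervals" rule reflects common practice rather than anything specific to the Boot Camp.

```python
def pomodoro_schedule(cycles, work=25, short_break=5,
                      long_break=30, cycles_per_set=4):
    """Return a list of (activity, minutes) tuples: work intervals
    separated by short breaks, with a long break after each full set."""
    schedule = []
    for i in range(1, cycles + 1):
        schedule.append(("work", work))
        if i == cycles:
            break  # no break needed after the final interval
        if i % cycles_per_set == 0:
            schedule.append(("long break", long_break))
        else:
            schedule.append(("short break", short_break))
    return schedule

# Four pomodoros: 25-minute work blocks with 5-minute breaks between them
plan = pomodoro_schedule(4)
```

A writing session at the Boot Camp would then simply work through the schedule, setting a timer for each entry in turn.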

Thesis Boot Camp 2016 was highly successful, with participants rating the quality of materials, presenter and structure of the event very highly: 

Thesis Boot Camp Level 1 Evaluations 2016

Evaluation question | No. of responses | Average result (max score 5)
How well the event met its stated objectives | 26 | 4.8
Quality of materials | 23 | 4.6
Ability of presenter(s) | 25 | 4.8
Structure of event | 21 | 4.6

They also rated the event very highly in terms of its relevance to their development, their likelihood of changing behaviour as a result of attending, and meeting their personal objectives.  Qualitative feedback was also overwhelmingly positive, with participants commenting on what a thesis-changing experience it had been and how much more manageable writing had become.  The supportive and intensive atmosphere was also highlighted as a strength, and there were many positive comments on the break-out activities (a yoga session and a walk were offered on the two weekend days respectively).  Comments on what could be improved included:

  • Larger desk space.
  • Elements of the catering—including having fruit available at all times.
  • Heating.
  • Word target feeling exclusionary for Science students who do not write as long theses.
  • Further advice about how to edit what you produce at Thesis Boot Camp.

While running Thesis Boot Camp, Peta also ran a ‘train the trainer’ session for CAPOD so that the event could continue to run in-house.  We were keen to run the event for Masters’ students and, although Peta was sceptical about how the model would transfer, she was happy for it to be adapted; we invited Dawn Hollis, who had attended Thesis Boot Camp as a participant, back to help run and develop a Dissertation Boot Camp, which also ran successfully in July 2016.

After the success of Dissertation Boot Camp, Dawn was invited to facilitate Thesis Boot Camp in 2017 and 2018; although largely based on the same structure and principles as the ‘original’ Thesis Boot Camp, the model was adapted based on feedback from participants and our in-house experience of running Dissertation Boot Camp.

The timetable for 2017 and 2018 remained largely the same, although the content of the facilitated sessions changed slightly, most notably to incorporate a session on editing on the final day.  Some participants had commented that it was demotivating to have people leaving any time from 16:00 onwards on the final day, so in subsequent years we have encouraged people to stay until 19:00 and have provided a light dinner.  A bigger change made this year was to create a separate word target for Science students; previously they had been left to work out what their equivalent target would look like (e.g. how many words a figure might count for, or what the equivalent of 20,000 words would be in their discipline).  Although we encouraged students to work to their own personal targets, we felt that providing more structure here would help Science students feel less excluded from the model.  As such, a distinct word target of 10,000 was introduced, and students picked the target that most closely matched their discipline.

Feedback on these subsequent iterations has again been overwhelmingly positive, with the highest levels of satisfaction across the board so far reported this year:

Thesis Boot Camp Level 1 Evaluations 2018

Evaluation question | No. of responses | Average result (max score 5)
Quality of materials | 26 | 4.7
Ability of presenter(s) | 25 | 4.9
Structure of event | 25 | 5.0

Evaluation question | No. of responses | Average result (max score 5)
How relevant was the event for your professional/personal development? | 23 | 4.8
How likely are you to make a change (to a process or behaviour) as a result of attending this event? | 24 | 4.8
How well did the event meet your personal objectives? | 24 | 4.8

At Thesis Boot Camp 2018, students collectively wrote a whopping 278,489 words.

This year Level 3 evaluations were also distributed a few months after the event.  Although only 12 participants responded, their comments suggest the event had a lasting effect on those who did:

“I am more confident I can easily draft a chapter. Working with the 25′-method [pomodoro technique] regularly, I write and work more efficiently than before I attended the workshop.”

“I feel more capable of writing without fear and getting stuff down on paper – which was exactly what I needed!”

“The session has given me the extra motivation to accomplish my writing.  This workshop helped me very much in finding a way to deal with my writer’s block that I was going through at the time. I’m able to write more using the strategies suggested in the Boot Camp. Since I’m able to write, the level of my confidence also improving.”

Significantly, some participants have subsequently been involved in similar ‘boot camp’-like events run on a smaller level:

“… Additionally, other aspects that I changed after attending boot camp is [sic] looking for this “writing atmosphere” in other places. I have changed the main place I work, which is much more silent (the library).  I have also participated in another CAPOD session, as well as a thesis boot camp weekend with some colleagues.”

“Something particularly helpful was the fact that together with other Boot Camp attendees, we hold 2 weekly sessions of thesis writing, which prove to be a big success and motivation in day-to-day struggle with thesis writing. I find this the most helpful part of the consequences of the Thesis Boot Camp.”

One Thesis Boot Camp participant from this year explained in a short follow up interview how the Boot Camp had a longer term impact on her and other PhD students in her department:

“Since my success at Thesis Boot Camp, I became very interested in how I could use the techniques I learnt to make my writing time more effective. I was particularly interested in collective writing and the Pomodoro technique. As the rest of my office (4 other IR PhD students) were also interested in this too, we set up a regular writing group.

We normally write every Monday, for around 3 – 4 hours. We use the Pomodoro technique and everyone has to physically get up from their desks in the 5 minute breaks. We all set a target before we start writing, which we share with the group, and at the end of the session we discuss if we made the target, and if not why we think this was.

We have all found the sessions extremely helpful and even if we didn’t all reach our targets each week we all achieved a lot. We also found it useful to have them on a Monday as it set up good practice for the rest of the week, and was motivating knowing that we had already achieved a big goal.

After hearing about our writing sessions other offices began to replicate similar sessions. I know another IR office (none of whom were at the boot camp) aim to have a writing afternoon once a week using the Pomodoro technique. They even have a bag were [sic] everyone has to deposit their phone during the session.”

Rebecca Wilson, PhD Student, International Relations

Thesis Boot Camp has been exceptionally successful since its introduction, and the (albeit small) increase in satisfaction from year to year suggests that the developments made to the model have been effective and suit the St Andrews audience.  Follow-up feedback this year highlighting subsequent spontaneous writing sessions is also exceptionally encouraging.

Now, drop and give me 20,000 words!

Dr Eilidh Harris, Student Developer, CAPOD

Posted in Student development, Uncategorized

Passport to Research Futures programme evaluation

1. Background

The Passport to Research Futures (PRF) is a structured development programme for early career researchers designed to focus thinking about career planning, professional development and employability.  It is recognised by the Institute for Leadership and Management (ILM). The programme includes a range of development activities, including:

  • Workshops
  • Question and Answer panel sessions
  • Networking events
  • Epigeum online Professional Skills for Research Leaders courses
  • Kintish networking online resources
  • Access to the Teaching, Research and Academic Mentoring scheme
  • A personal Vitae Researcher Development Framework Planner account.

The PRF programme has been running for 3 years and an evaluation of the programme was carried out in autumn 2017. This report presents the findings of the evaluation.

2. Evaluation method

Two questionnaires were developed, one for graduates of the programme and one for current participants; the questions were broadly the same, with some minor variations. The questionnaires were built as an online survey and tested by a graduate of the programme and a current participant, and these test completions were timed so that an estimate of how long the survey would take could be provided. Following feedback and minor revisions, the survey link was emailed to the PRF graduates and current participants via the PRF management platform SUMAC. A three-week timescale was set for completion, with an estimated completion time of 15 minutes. A reminder email was sent out at the end of the three-week period with an extended closing date.

3. Response rate

3.1     PRF graduates

15 graduates completed the evaluation out of a total of 24 graduates, a response rate of 62.5%. However, seven graduates had left the University since completing the programme and had no forwarding email addresses, so the survey was only sent to 17 graduates, making the effective response rate 88%.

3.2    Current PRF participants

14 current participants completed the evaluation out of the 28 people registered as current participants, giving a response rate of 50%. However, two were just due to start and therefore could not complete the evaluation, a further four had left the University, and one had dropped out of the programme although they remain at the University. This leaves 21 people who could have completed the evaluation, giving an effective completion rate of 66.7%.
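As a quick sanity check on the arithmetic above, a few lines of Python reproduce the reported rates (the helper function is purely illustrative):

```python
def response_rate(responses, eligible):
    """Percentage of eligible people who responded, to one decimal place."""
    return round(100 * responses / eligible, 1)

grad_raw = response_rate(15, 24)   # graduates, all 24 counted
grad_adj = response_rate(15, 17)   # graduates, 17 reachable (approx. 88%)
curr_raw = response_rate(14, 28)   # current participants, all registered
curr_adj = response_rate(14, 21)   # current participants, 21 able to respond
```

The adjusted graduate figure comes out at 88.2%, which the report rounds to 88%.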

4. Summary of Results

4.1    Participants

The majority of participants are Research Fellows and they have come from 12 out of the University’s 21 schools. They take an average of approximately one year to complete the programme. The most common reason for joining the programme was to provide structure for career development, followed by having interesting courses on offer.

Area for development: Connect directly with individual schools to identify if there are any barriers to participation.

4.2    Courses

a) Face to face workshops

The PRF is divided into nine themes:

  1. Programme orientation
  2. Equality and diversity
  3. Career futures for research staff
  4. Raising your research profile
  5. Public engagement
  6. Entrepreneurship and enterprise
  7. Funding and financing research
  8. Leading the team
  9. Get the job.

The most frequently mentioned theme when participants were asked about usefulness of workshops was “Leading the Team” followed by “Career futures for research staff” and “Get the job”. The most mentioned individual workshop was “Psychometric Masterclass”.

The most frequently mentioned theme when participants were asked about the least useful workshops was “Career futures for research staff” and the most frequently mentioned workshop was “Managing research information: Pure hands on training”.

The most mentioned idea for improving the courses was to reduce the time allocated for some of the workshops.

Areas for development: Ensure workshop time is appropriate for the content and look at content and suitability of the least helpful workshops.

b) Online courses (excluding Epigeum)

PRF includes a small number of online courses, and the most frequently mentioned helpful course was “Recruitment and selection”. Criticisms of some of the online content included having too many web pages to flick through and it not being clear where to click.

Areas for development: Ensure efficient layout of online content. Use feedback from evaluations to promote PRF.

c) Epigeum on-line courses

One of the components of the PRF is access to the six Epigeum Professional Skills for Research Leaders online courses. Uptake was low, with six graduates and three current participants having completed some of the courses. The most common reason for not taking up this development opportunity was lack of time, followed by a preference for face-to-face workshops.

Area for development: Consider setting up an Epigeum workshop where participants can register for the courses and complete the first course in a workshop setting.

4.3 Vitae Researcher Development Framework planner

Five graduates and five of the current participants had used the Vitae Researcher Development Framework planner. The most common uses had been to help structure grant applications and to identify gaps in competencies.

Area for development: Consider setting up a Vitae workshop to help participants to register for the planner and to provide guidance on how to benefit from the Framework.

4.4 Other supporting services

At the orientation meeting for PRF, participants are signposted to the Careers Centre, the Research Business and Development Contracts Team and to the Teaching, Research and Academic Mentoring Scheme. Nine participants had used the Careers Centre and they mentioned CV help, identifying career options and help with job applications as being the main reasons for seeking support. Six participants had used the Research Business and Development Contracts Team to seek advice on funding applications. Ten people had joined the Teaching, Research and Academic Mentoring Scheme and a number of them commented on how their mentor had helped them.

4.5 Main benefits of completing the PRF programme

The participants commented on how the programme had helped with career development, helped them to meet new people and expanded their personal growth and development.

The PRF has a number of key objectives and participants were asked to what extent they felt the programme contributed towards these objectives:

Objective | Rank
Clarifying my career path | =3
Formulating a career development plan | 5
Assessing my level of competence against my development goals | =3
Filling gaps in my confidence | 2
Becoming more confident in my chosen career path | 6
Becoming more self-aware about my development needs | 1

4.6 Improving the programme

Participants were asked how the PRF programme could be improved.

Ideas included:

  • Offer advanced classes in some subjects
  • Better advertising
  • Provide a better way to keep track of progress
  • Provide support for teaching
  • Encourage more social engagement
  • Set requirements of PRF to correspond with the ILM certificate.

Barriers to making progress with the PRF programme included lack of time and courses being full.

Areas for development: Look at all suggestions for improvement and implement where feasible.

5. Overall conclusions

Participants were asked about some aspects of the organisation of the programme and the results are given below.

Aspect | Average score (out of 5)
Programme information on Passport to Research Futures’ website | 3.9
Programme information received at the orientation meeting | 4.3
Overall support from CAPOD during the programme | 4.6
Overall quality of the Passport to Research Futures programme | 4.4


Participants were also asked to compare the programme with similar programmes they had experienced at other institutions.

Rating | Number of respondents
Better | 10
Much better | 6


Participants were also asked what they would say to early career researchers thinking about signing up to PRF.

Advice included:

  • It helps focus on a career plan
  • It was a good way to fill the gaps in training
  • There are some great courses on offer
  • Recommend it to all early career researchers.

Overall, the Passport to Research Futures programme evaluated positively and provides a valuable way to structure professional development for early career researchers. Useful areas for development have been identified and will be analysed further by the Research Staff Developers to inform changes to the programme in 2018/19.

6. Recommendations

  1. Connect directly with individual schools to identify if there are any barriers to participation.
  2. Ensure workshop time is appropriate for the content.
  3. Look at content and suitability of the “least helpful” workshops.
  4. Ensure efficient layout of online content.
  5. Use feedback from all forms of evaluation to promote Passport to Research Futures.
  6. Consider setting up an Epigeum workshop where participants can register for the courses and complete the first course in a workshop setting.
  7. Look at how to increase the use of the Vitae planner e.g. provide a workshop where participants can register and start to use the planner.
  8. Continue to promote the Careers Centre, Research Business and Development Contracts Team and the Teaching, Research and Academic Mentoring Scheme throughout the PRF programme.
  9. Develop a course for Fellowship applications with attention to finances.
  10. Provide advanced classes in some subjects e.g. research funding, presenting your research, public engagement.
  11. Build in flexibility for new and more experienced post-docs.
  12. Survey Post-Docs to ascertain what further courses they wish to see in the programme.
  13. Look at advertising process and ensure the target audience is reached.
  14. Assess which courses would benefit from being divided into social vs natural sciences.
  15. Look at developing a better way to keep track of progress e.g. online or physical passport.
  16. Look at increasing the amount of targeted and specific advice available.
  17. Provide support for teaching and applications for lectureships.
  18. Support social engagement.
  19. Identify most popular courses and run them more frequently.
  20. Improve programme information on the website.
  21. Look at matching PRF and ILM requirements.
  22. Deliver a higher level project management course.
  23. Hold an event for experienced researchers to share their experiences with new researchers.
  24. Look at teaching-focused route to Higher Education Academy fellowship.
  25. Look at setting up a Passport to Teaching Futures programme.
  26. Provide a course on funding management delivered by Finance Advice and Support team.


Authors: Marie Paterson and Diane Munday, Staff Developers (Research Staff)

Jos Finer, Head of Organisational and Staff Development.

February 2018.

Posted in Passport programmes, Uncategorized

CAPOD – Supporting Professional Skills Development for Entrepreneurial Researchers


Back in November 2015, CAPOD provided two workshops for researchers as part of the Passport to Research Futures programme.  These events were delivered by experts in the field: Dr Ewan Chirnside from the Knowledge Transfer Centre at the University of St Andrews, and Brian Butchart, CEO of the Aberdeen-based pharma company SIRAKOSS.

Understanding what it is like to start your own business, and getting investment and budgeting right, is key to survival in the first few months of a new venture.

One researcher who has taken his first step into the start-up world is David Harris-Birtill (pictured).  David has attended a number of CAPOD workshops since he came to St Andrews.  When I met with him in early March this year, I asked what he thought of CAPOD.  His response was “I love CAPOD and the courses – I’ve been on loads of them”.  We talked about his business, his product, how he is commercialising it, and how the CAPOD courses have helped him along the way.  He explained how useful the Business Budgeting and Let’s Make a Business events have been, and the golden nuggets of information they provided; he was particularly grateful for the advice on how important cash flow is when budgeting.  He also appreciated that Brian Butchart, who ran the Business Budgeting event, is the CEO of a start-up company in Aberdeen, and that the first-hand experience Brian offered showed David what ‘it is like to run out of cash!’

David registered his company ‘Beyond Medics Limited’ in September 2015. David’s company creates imaging and sensing platforms for patient benefit.  This is how David describes the product:

“A person’s heart rate and blood oxygenation levels are vital signs which provide clinicians an indication of how well a person is. Conventionally measured using a pulse oximeter, a finger probe, this requires contact with the skin and a clinician to place the device on the person. Beyond Medics has created a camera-based system to automatically remotely measure these vital signs from a distance and can do so for up to six people at once, no longer needing these clips, enabling better triage of patients in A&E and a more seamless workflow through a hospital, care home or security setting.”

CAPOD offers a number of events to support the professional development of researchers as ‘entrepreneurs’.  Ewan Chirnside from the Knowledge Transfer Centre offers the ‘Let’s Make a Business’ event on the Passport to Research Futures programme to those who want to explore the development of an idea, using the Blank Canvas technique to turn ideas into business propositions.  Working with Dawn Shand, Senior Business & Innovation Advisor from the Scottish Institute for Enterprise, the three-hour session enabled attendees to explore their business ideas and create an action plan for further exploration.

Brian Butchart, CEO of the Aberdeen-based company SIRAKOSS, offered his time and hands-on expertise to deliver a two-hour session on business budgeting, exploring the complexities of managing cash flow while creating a product, getting it to market and attracting investors along the way.  Brian led the spin-out process of SIRAKOSS with his fellow founding Directors, and brings extensive UK and international expertise in medical devices to the session.  He spent 22 years with Johnson & Johnson, developing his sales, marketing, operations and business unit management skills, and in 2002 joined Isotron (Synergy Health) as UK General Manager responsible for seven operating sites and 200+ employees across Operations, Sales, Marketing, Finance and Quality, before serving as Global Sales & Marketing Director (2007–2009).

Things are moving ahead at a rapid pace for David in the entrepreneurial world.  Along the way, Ewan Chirnside has supported him with intellectual property rights and signposted him to funding opportunities such as ‘Pitch at the Palace’ and ‘2 minute pitch’.

In January 2016, Universities Scotland’s Research Training Sub-Committee (including Heads of Researcher Development and Heads of Graduate Schools), supported by the Innovation Scotland Forum, organised a one-day event, ‘Creating an entrepreneurial research culture in Scottish HEIs’.  The objective was to discuss and identify actions at institutional and national level to create an entrepreneurial culture for all researchers, from PhD students to Professors, in Scottish HEIs.

Our professional development events continue to build this culture here at St Andrews.  If you are a researcher and want specialist support with developing your business, please contact the Knowledge Transfer Centre.  If you wish to develop your professional skills and knowledge, CAPOD offers a wide range of courses and events: ‘Working with the Media’, ‘Leadership Development’, ‘Business Budgeting’, ‘Let’s Make a Business’, ‘Effective Communication’, ‘Microsoft Office’ and more all provide a foundation for the professional entrepreneur.  We also provide coaching and mentoring.  For an informal chat about any of the training we offer for research staff, please contact Michelle Paterson, Staff Developer.

Finally, here’s what people think about the Let’s Make a Business and Business Budgeting events….

  • “Excellent intro to starting a business”
  • “It is helpful and interesting.  Easy to understand”
  • “This is absolutely necessary if you want to try your own business”
  • “Fantastic knowledgable speaker”
  • “The breadth of material covered was significant, but absolutely relevant.  I wouldn’t have thought of half of it otherwise!”
  • “Fantastic event.  Excellent breakdown of what needs to be budgeted for.  Also excellent for business planning and creativity.”
  • “Good overview of Basic Structure of business planning.  Very useful as foundation to build my own business.  Presenter very approachable”
  • “If you are at all interested in entrepreneurship then do attend”



Posted in Researchers

Awakening students to the value of feedback


Student feedback, particularly in relation to assessed work, is a hot topic. Various initiatives over the years have focused on ways to improve levels of student engagement and satisfaction with feedback but it continues to be both a challenging and key area for the institution and individual Schools.

In support of this, a student intervention called Making feedback work for you was piloted in two schools in Academic Year 2013-14. Designed by Edinburgh Napier University, the intervention did not directly address the widely reported topic of what constitutes well designed and delivered feedback, but instead considered the attitudinal perspectives of students who are receiving academic feedback, how it affects their associated actions and its impact upon their subsequent performance.

Coordinated by Ros Campbell (CAPOD) and Erwin Lai (CAPOD), the pilot aimed to assess the viability, suitability and sustainability of the intervention for implementation across the wider University.

The intervention

The intervention draws on two key concepts:

  1. The Conscious Competence Matrix: Adapted from Howell (1982), the Conscious Competence matrix describes a four step journey to becoming accomplished at any skill. The key concepts highlighted to students are that: awareness of what ‘competence’ consists of is critical, i.e. a clear understanding of learning outcomes and marking criteria; it is normal for students to be at different points on the journey for different skills; effort that is efficiently targeted through effective feedback is key to success on the journey.
  2. Growth (incremental) versus Fixed (entity) Mindset: Mindset is situational and is your view of your ability to do a particular thing. A Fixed mindset believes that ability is ‘set’ (whether high, medium or low) and unlikely to change over time. This attitude promotes the setting of personal goals that are performance (rather than learning) oriented, which in turn discourages attention to feedback and effort, and over time promotes helplessness. Adopting a growth mindset requires a belief that ability is malleable, i.e. can improve or worsen over time.  This attitude promotes the setting of learning-based personal goals, which in turn promotes attention to feedback and targeted effort to develop and improve skills, promoting perseverance over time.

Delivery mode

A blended learning approach was adopted for the pilot:

  1. Online course: An adapted online Moodle course offered potential for large numbers of students to be exposed to the material in a consistent and cost-effective way, reduced the importance of the skill set of the staff member delivering the workshop, and provided an opportunity to trial a relatively new online course. Students were asked to complete the course in their own time. In addition to the two key concepts, students were encouraged to reflect on sources of feedback; consider personal barriers to learning/using feedback; and complete a feedback action sheet drawing on ‘real’ feedback from their last assignment.
  2. Face-to-face workshop: Students were then required to attend an interactive follow-up workshop during class time. Drawing on key concepts in the online course, students worked individually and in groups to: Consider skills being assessed within the assignment they had just completed; identify personal areas of strengths and weakness regarding these skills, share strategies and tips; reflect on feedback from their latest assignment; and begin producing a personal action plan.

This pilot project produced useful results and important lessons. Overall, the intervention appeared to benefit a majority of students in both pilot schools. Both the online course and the face-to-face workshops had some positive impact on the student cohorts involved in the pilot, with the face-to-face workshops reviewed far more positively than the online course.

Key results

  1. 61% of students felt the online course should be made available to all students in the University
  2. 88% of students felt the workshop should be made available to all students in the University
  3. A significant number of students (60% of Computer Science students and 69% of Classics students) felt the workshop will help them in their studies.

Sample of student comments gathered at various stages of the pilot

“It changed the way I view feedback. I pay much closer attention to it now”.

“I gained new ideas on how to improve and gain confidence in the methods I was already using”.

“I plan to put more time into reviewing feedback, particularly negative feedback, which until now I was often very dismissive of”.

“I will really think about feedback, view it positively and think about how I can improve in weaker areas”.

“I will start taking feedback more seriously”.

“I will analyse feedback more carefully and use some tactics discussed in groups to improve my skills”.

“I was pleasantly surprised by how thought-provoking it was and would like others to have this experience”.

“I’m more likely to consult with peers for extra ideas and feedback”.

Computer Science students perceived an improvement in the level and quality of feedback received (although the school confirmed the feedback was in line with their standard provision):

“The feedback we received for the feedback session was really fantastic, as there was so much more than usual…and everyone received broadly the same amount of feedback. People are really hoping this carries on”. (Minutes, SSCC meeting, 10-3-14)


There were indications that the intervention should be considered for wider roll-out across the University, although it must be noted that support for this was less apparent for the online course (61%) than for the face-to-face workshop (88%).

Next steps

An overview and outcomes of the pilot were presented at a Learning and Teaching Open Forum event in February 2015. On the basis of the pilot results, challenges, and effort required, all five discussion groups agreed that a University-wide roll-out of this initiative would be worthwhile. When asked to consider the most effective way of doing this, participants suggested a further pilot. Ros and Erwin are in the process of recruiting pilot schools and designing an enhanced version of the intervention tailored to St Andrews students.

Posted in Evaluation & Feedback | Leave a comment

The Academic Skills Project


1. Background:

Following a successful Enhancement Themes Funding Application, three PhD students in the School of History introduced a series of academic skills workshops for History undergraduates in 2012. In the summer of 2014 the scheme was expanded, via CAPOD, to all Arts Schools. The aim of this Academic Skills Project was to create a framework for subject-specific academic skills to be delivered to a large number of UG students via high quality workshops delivered by PGR students, creating benefits for both cohorts.

In the summer months of 2014, Directors of Teaching were briefed about the scheme, PGR School Coordinators were recruited and trained, and the project infrastructure was established. The School Coordinators in turn recruited PGR workshop leaders to design and deliver the sessions, and advertised the programmes to UG students at the start of the AY2014/15 session.

The Schools taking part are:

  • Art History
  • Classics
  • Divinity
  • Film Studies
  • Geography and Sustainable Development
  • History
  • IR
  • Management
  • Modern Languages
  • Philosophy

2. Structure:

The project ran in each participating School through the following structure:


CAPOD’s role:

  • To take an overview of the project and provide continuity between Schools.
  • To provide financial backing.
  • To provide opportunities for ideas dissemination between participating Schools.
  • To evaluate the success of the project.
  • To provide quality guidance via our Academic Skills tutors.

The structure of having PGRs deliver subject-specific academic skills was welcomed by University of St Andrews psychologist Dr Kenneth Mavor, whose research focuses on personal and social self-categories and identity. He welcomed the “discipline-based social-identity”, which relates to the use of deep-learning approaches:

 “We argue that this is also modified by the normative effect – that is, it matters what they think is normative for students in their discipline. We speculate that again this could work either way: if you identify as a student in a discipline and see that deep learning is normative, you are more likely to engage in deep learning; reciprocally, if you already engage in deep learning and see that as normative, then your identification [as a student of the subject] will increase. If we are right, then the current strategy being used for the academic skills project is quite close to optimal, and either making the courses discipline-free, or losing the interactive element, would reduce the effectiveness.” [email correspondence, 15/10/14]

3. Programmes:

The Schools’ academic skills programmes ran across broad themes (comprehension, analysis, rhetoric), but were often presented as subject-specific topics. Many School Coordinators arrived at their programmes through extensive discussions with module coordinators, PGR tutors and Directors of Teaching to identify skills gaps within their Schools. Examples of this specificity include:

  • Group work (Management)
  • Critical Engagement and Research Skills (Modern Languages)
  • Thinking critically – thinking Geographically (G&SD)
  • Visual Analysis Trip (Art History)
  • How to argue like a Philosopher (Philosophy)
  • How do I watch film? (Film Studies)
  • Analysis and use of different sources (Classics)
  • Researching in IR (IR)
  • Speaking Skills (Modern Languages)

4. Engagement:

Exact numbers of students engaging with the Academic Skills Project are not known, as record keeping by School Coordinators and Workshop Leaders was not fully accurate.

However, a strong indication can be taken from the number of workshop evaluation forms returned (the true number of students attending will be higher). In total, 787 feedback forms were returned, which helps show the engagement of participants by School. The School with the largest engagement was International Relations, where 32% of sub-honours students (194 students) attended at least one workshop.

[Pie chart: workshop evaluation forms returned, by School]

5. Impact:

Ideally, the project would show a correlation between workshop attendance and academic performance. In reality, it is hard to demonstrate a causal relationship because of the large number of other variables involved in academic performance. Discussions have taken place between the Head of Student Development, Catriona Wilson, and Dr Kenneth Mavor (School of Psychology and Neuroscience) about capturing more robust pre- and post-intervention data should the project be rolled out to Science Schools.


In the absence of additional hard data, an indication of impact is provided in the observations of participants, Workshop Leaders, School Coordinators and Directors of Teaching.

6. Evaluation:

6a. Participant evaluation:

“What is expected of me is much clearer now. I feel more comfortable and confident about beginning my academic studies here.” [Student participant]

Participants rated the workshops highly in terms of objectives, material, presenters, structure and timing. They were also very positive about the workshops’ relevance and the likelihood of making a change to their behaviour as a result. Amalgamated data from the 787 returned evaluation forms is below:


Selected quotes:

“I will be more selective in how I read sources.”

“The prospect of coursework is terrifying. This was so helpful and reassuring. A++”

“I will look at sources differently and be able to choose them better.”

“I’ll spend more time than I previously have structuring my essay research.”

“I’m much clearer about what’s expected now.”

“I’ll change the way I take notes.”


6b. Director of Teaching evaluation:

 “This should be ideally permanent in the Arts. It has proven to be very positively reviewed both by PGRs delivering workshops and students’ take up and responses.” [Director of Teaching]

Seven Directors of Teaching completed the online evaluation form.

Directors of Teaching reported that they were mostly involved with advising on workshop content (six were partly involved) and deciding on the workshop programme (one very involved, four partly involved, two not involved). Some, but not all, played a role in recruiting PGR students and publicising the workshops, and one stated that they were partly involved in workshop delivery.

The Directors of Teaching had a range of objectives from the programme:

  • Provision of academic skills to JH students
  • Supporting PGRs in developing their teaching skills
  • Helping students recognise that different subjects require specific skills
  • Focusing on International MSc students
  • Helping first years understand what tutorials are for
  • Increasing the approachability of staff (PGRs) by students

All the Directors of Teaching felt the project had been beneficial to the undergraduates and the PGRs who had taken part.

 6c. School Coordinator evaluation:

“I think it’s helped me understand our students better, improve my communication with students, and to manage expectations – of myself, students and tutors” [School Coordinator]

Ten School Coordinators took part in the project; nine completed the online survey about their experiences.

School Coordinators put varying amounts of time into the project, ranging from 5 hours to more than 30, with an average of 13-20 hours. Some reported less engagement from fellow PGRs than they would have liked. Frustration with participant drop-out rates was also reported, a common feature of any undergraduate development programme.

CAPOD support was valued by all School Coordinators. In order of ranked importance, this comprised: supplying funding; photocopying materials; evaluating the workshops; consulting with CAPOD’s academic skills tutor; providing a project Moodle space; and offering networking meetings.

School Coordinators each reported personal benefits from taking part in the project. The skill most often mentioned as having been developed was curriculum design, followed by teaching and leadership (joint 2nd), then recruitment and selection and team working (joint 4th). Budget management, event management and marketing were also mentioned.

Selected quotes:

“I think we helped clarify what the School expects of its first and second year student, and we helped explain key concepts and techniques to help the students perform better in their modules.”

“It provided training that lecturers and tutors have said students need.”

“The students were very keen and seemed to be grateful for the existence of the project. The tutors were enthusiastic and committed to doing a good job, and recognised the value of improving the academic skills of our undergraduates.”


6d. Workshop leader evaluation:

“I think it has taught me about teaching to different levels of students” [Workshop Leader]

Fifty-seven workshop leaders took part in the project; 21 completed the online survey.

The main motivator for taking part in the project was a wish to enhance the undergraduate learning experience, cited as being a factor by 80% of respondents, followed by having a different kind of teaching experience (60%), teaching at an earlier stage of their PhD (35%) and financial reward (25%).

Approximately 50% of workshop leaders were only involved with one workshop. 30% delivered 2 workshops, 15% delivered 3 workshops and 5% more than 3.

In terms of time invested into the scheme, 38% of workshop leaders invested 3-5 hours, with an additional 33% investing 6-10 hours. 25% invested more than 11 hours and 5% between 1-2 hours. Some workshop leaders commented that they invested more hours than there was budget to pay them for.

91% of workshop leaders thought that the project had a positive impact on the students who took part, and a large majority felt it had also been beneficial to them as PGR students, with the skills most frequently cited as having been developed being teaching delivery, teaching design, team working and leadership. 91% of workshop leaders indicated that they would be willing to continue in the project should it run again.

Selected quotes:

“I heard from a number of tutors that mistakes in referencing were less compared to previous years”

“Students expressed frustration and confusion at the beginning of the class; understanding and relief at the end. Feedback attested to students’ confidence going forward.”

“PhD students who design and teach these courses should be remunerated properly for the time it takes to prepare such essential workshops.”

“A fantastic programme. Very happy to have taken part, and would love to see this continue to grow.”



The Academic Skills Project has had a successful initial roll out, with benefits reported by all parties. Over the summer of 2015 the project will be further expanded to Science Schools, and hopefully become embedded as a valued co-curricular programme.

Posted in Student development | Leave a comment

Polishing Up – the impact of professionalising training for Estates Cleaning Staff


CAPOD runs several extremely successful Passport to Excellence development programmes for different groups of professional staff in the University. One such group is the Estates Cleaning Team. The Cleaning Manager approached CAPOD in 2011 to see if we could develop a passport scheme for the team. There was existing training in place, but the new programme would have the following aims:

• Ensure that everyone was trained to the same standard
• Ensure that all training was properly documented
• Give staff recognition for the training they had done and the skills they had acquired

The programme was designed to consist of short training sessions covering both technical and interpersonal skills.

The programme saw its first graduates in 2012, when all current staff participated. Sessions were thereafter run twice a year so that all new staff could attend. The total number of graduates at the end of 2014 was 117. All staff who graduate are presented with a certificate at a recognition event on completion.

The breakdown of these by year is as follows:

2012: 86 staff
2013: 21 staff
2014: 10 staff


To measure the impact of the programme, two sets of data were considered:

• Set 1 relates to sickness absence
• Set 2 relates to performance issues concerning work standards

In addition, a questionnaire was issued to all cleaning staff in January 2015 (see ‘What Cleaning Staff think’, below).

Set 1 – Cleaning staff sickness absence

Since the launch of the programme in June 2011, there has been a 27% drop in sickness absence.

Number of absence instances – averaged per year over the period

[Sickness absence table]

Set 2 – Performance related issues (work standards)

Although the numbers here are small, they represent a 35% decrease in performance issues. It should be noted that the number of staff employed has increased by 33%, due to new buildings being serviced by the team; nevertheless, the number of performance issues has continued to drop overall.

Number of issues – averaged per year over the period

[Performance issues table]

What Cleaning Staff think

In January 2015, Estates Cleaners were asked to complete a short questionnaire about the Passport to Cleaning Excellence. 84 questionnaires were returned from a total staff of 120, an excellent 70% response rate. Some non-returns are accounted for by new staff who had not yet completed their training.
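The response rate quoted above follows directly from the counts given; a minimal sketch of the calculation:

```python
def response_rate(returned, issued):
    """Return the questionnaire response rate as a percentage."""
    return returned / issued * 100

# 84 questionnaires returned from a total staff of 120
print(f"{response_rate(84, 120):.0f}%")  # prints "70%"
```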

The staff were asked to choose Strongly Agree, Agree, Disagree or Strongly Disagree in response to the following four statements:

1. Participating in the Passport programme was a positive experience.
2. The training I received was helpful to me.
3. The training I received helped me feel more confident in the workplace.
4. Going to the training sessions enabled me to get to know more of the Estates Team.

The results were overwhelmingly positive, with over 94% choosing either Strongly Agree or Agree for each statement. In the case of Statement 1, this figure was 99%, with only one respondent choosing a negative response.


The questionnaire had space for staff to make additional comments. Only a few took this opportunity, and the comments received were as follows:

Love the training
Some of it was fun!
We tend to stay in our groups
Being trained 3 years after I started was useless
I had already learned it on my own
Great Experience

[Column chart: questionnaire responses, by statement]


The Estates Cleaning team was already an excellent one with high standards. However, since the introduction of the Passport to Cleaning Excellence there has been a clear decrease in both absence instances and performance-related work-standard issues, with the trends moving in an increasingly positive direction.

Perhaps more significant has been the way in which the staff have responded to the training opportunities given. The hugely positive response illustrated in the column chart above shows that the training has been extremely well received and appreciated by the staff concerned. Further anecdotal evidence of this positive attitude towards the programme is that many of the certificates awarded are now proudly displayed on noticeboards and walls around the University.

Moving forward, CAPOD plans to review and update the content of the interpersonal training sessions, and is committed to continuing to deliver the programme in the future.


With grateful thanks to Gillian Jordan, Cleaning Manager and to the Cleaning Staff for their feedback and for being excellent participants at the training sessions.

Cert Pres January 2014 2

Some graduates from 2014.

Posted in Passport programmes | Leave a comment

Winning Research Funding

Posted on behalf of Emma Compton-Daw


Winning Research Funding is a full-day workshop for postdoctoral research assistants, fellows and early career academics who are just starting out on their independent research careers. Participants attend to learn more about the UK/EU research funding landscape, to learn how to tailor funding applications effectively, and to discuss how to manage rejection. Alongside the external trainer and an experienced academic who facilitate the day, participants also hear from a range of academics from St Andrews: both those early in their careers who have recently been successful in securing funding, and senior academics who are experienced in assessing funding applications.

Two workshops, one for the Sciences, and one for the Arts, Humanities and Social Sciences (AHSS), have been held during each of the 2014/2015, 2013/2014 and 2012/2013 academic years, with a total of 57 people attending.


All 57 participants were asked to complete a survey about what they felt the long-term impact of the course had been, if any; approximately 40% (23) of participants responded. They were asked to rate the effect the workshop had had on the following areas, on a scale of 1 to 6 (1 = not at all and 6 = to an extremely large extent):

  1. increased understanding of the EU/UK funding landscape – almost all the responses were 4, 5 or 6, with an average of 4.7
  2. increased confidence in writing funding applications – almost all the responses were 4, 5 or 6, with an average of 4.5
  3. increased competence at writing funding applications – almost all the responses were 4, 5 or 6, with an average of 4.5
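The headline figures above reduce to a simple percentage and a mean; a minimal sketch of the arithmetic (the rating values below are hypothetical, used only to illustrate the calculation, not the actual survey data):

```python
def pct(part, whole):
    """Express part as a percentage of whole, rounded to the nearest integer."""
    return round(part / whole * 100)

# 23 of the 57 invited participants responded
print(f"response rate: {pct(23, 57)}%")  # prints "response rate: 40%"

# Hypothetical ratings on the survey's 1-6 scale (illustrative only)
ratings = [4, 5, 5, 4, 6, 4, 5]
print(f"average rating: {sum(ratings) / len(ratings):.1f}")  # prints "average rating: 4.7"
```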

Four of the respondents had gone on to secure research funding and in some part attributed their success to attending the course:

“It contributed somewhat in that:
– it focused my mind on tailoring my application to what the funding body wants
– and the most important bit I got from the workshop is that you should apply to everything possible, and seek out opportunities”
Early Career Industrial Fellowship from the Scottish Funding Council, via SICSA

…”making me think of what the experience of reviewers/panel members is like and how I can tailor my application to make it easy and convincing for them” BBSRC New Investigator Award

“Better understanding of details to provide in the application.” Successful application to Carnegie Trust’s Small Research Grants scheme.

Even some of those who have not been successful in securing research funding since attending this course have found it useful in the longer term with respondents reporting that it has helped them in shaping proposals, clarifying the differences between funding bodies, understanding expectations of interview committee panellists and giving them the confidence to apply at all:

“Following this workshop I did submit an application for funding under the AHRC’s Digital Transformations scheme. Although the application was unsuccessful, I wouldn’t have had the confidence to attempt it at all if I hadn’t attended this session.”

Posted in Researchers | Leave a comment

Introduction to University Teaching modules


These optional, 10-credit Masters-level modules were introduced in academic year 2009-10 (for ID5101) and 2010-11 (for ID5102).

  • Introduction to University Teaching 1: Supporting Student Learning (ID5101)
  • Introduction to University Teaching 2: Curriculum Design and Assessment (ID5102)

They were specifically designed to support the professional development of postgraduate tutors and demonstrators who wished to pursue a career in academia after their PhDs.  (See Long term impact, below.) However, the modules are open to all staff in the University who support learning and teaching, and over the years an increasing number of research staff and early career academics have enrolled as well.  Both modules are accredited by the Higher Education Academy (HEA) at Descriptor 1 of the UK Professional Standards Framework, which means that successful completion of either module confers Associate Fellowship of the HEA.  HEA Fellowship or some other form of teaching qualification is increasingly becoming an essential requirement for academic posts.


ID5101 runs every year in semester 1, and ID5102 runs every year in semester 2.

Enrolment on each module is capped at 16 to ensure a highly interactive and engaging learning experience for all participants. Uptake has increased steadily since the modules were first introduced. There were 8 students in the first ID5101 cohort in AY2009-10, 13 in AY2011-12 and AY2012-13, and 16 this year. For ID5102, the first cohort in AY2010-11 was 13, and numbers have remained fairly stable since.


Participant feedback recorded on the standard University module evaluation forms, and anonymous surveys via Moodle, is highly positive. For standard questions relating to module design (eg The module was well organised) and delivery (eg The lecturer was good at explaining things, The teaching style was engaging), the average rating has always been between 1 and 2 (out of 5, where 1 is strong agreement and therefore the best response).

Participants are able to identify specific improvements to their teaching as a result of having completed the modules, eg:

“It has allowed me to improve the quality of my feedback tremendously, in addition to giving me skills in module design.”

“Although I was initially cynical about the relevance of the CAPOD modules to my teaching, I have adopted several of the practices suggested within the course. In particular, I have developed formative exercises to support students in their summative assessments, often incorporating peer-feedback. I also found the development of a reflective journal surprisingly useful to the development of my teaching in the medium to long term.”

“It emphasized for me that not only what is covered in a module but how it is taught determines what students learn, and needs to be addressed intentionally in designing the course. It was helpful to see my own practice and the practice of very good teachers in my department in this light.”

“I think the reflective aspect was most helpful, it forced me to consider what I did (and most importantly didn’t) do well and consider how I can improve on this next semester.”

Participants found the modules very rewarding and universally agreed that they would recommend the modules to others, eg:

“It’s eye opening, enhances one’s teaching and attitude, teaches how to think outside the box, and in reality practice of reflective technique is a transferable skill. It really should be mandatory.”

“I think it’s helpful to be formally taught how to put a module together–everything from choosing topics to weighting them to making sure the assessment reinforces what you want your students to learn–and the philosophy behind it. Also, I loved being in an interdisciplinary group and learning about my colleagues’ perspectives through the discussion and their module topics.”

“It’s an invaluable opportunity to engage with pedagogical theory and practical techniques. Chance for open and supportive discussion is excellent. Has certainly helped me develop my teaching practice, and would imagine this would be case for any PGR.”

However, in addition to self-improvement, it was clear from feedback that at least some participants also had a career advancement motivation for doing the modules:

“Mainly because the classroom discussion is enjoyable and it looks good on your CV.”

“Because it’s not a lot of work to get something that can make a nice difference on a CV.”

“Thinking instrumentally, it’s a great CV enhancement.”

“I would recommend it to anyone who wants to be in academia, especially post-docs who will be doing some kind of teaching and supervising.”

Long term impact

One of the aims of these modules is to support and develop postgraduate tutors and demonstrators who wish to pursue a career in academia after their PhD. Feedback from past participants (some of whom have now graduated and found academic posts) testify to the positive and lasting impact that the modules have had on their professional development and career progression as academics:

“I’m moving to London for 1 July as I got a (research) post-doc at King’s! But I just wanted to say thanks for your help and for running these teaching modules. I think it’s actually more important than anyone is really stressing at the moment to get that first foot on the HEA ladder – more or less all of the teaching jobs I’ve applied for over the last few months have asked specifically whether the candidate has any HEA accreditation. So, at least from a historian’s perspective, maybe you can pass that on to try and ‘sell’ the courses (and I’ll certainly continue telling students I know). I think in an era with large numbers of PhDs and lots of competition, HEA looks like it’s becoming a way to set yourself apart from other candidates. I said that all the teaching jobs I applied for asked for it, but thinking about it so did the research ones.” [Edward Roberts, Mediaeval History, completed ID5101 in 2013-14, graduated in 2014]

“I’d like to let it be known that taking Supporting Student Learning (ID5101) and Curriculum Design and Assessment (ID5102) was one of the best things about my entire PhD process. I learned that teaching is very much a craft. Because of your course, I am developing and sharing my pedagogical philosophies and learning and teaching practices with senior faculty members in my department and others. As a postdoctoral fellow at the University of Pittsburgh–who is designing and teaching several of my own courses–the modules I took have really helped me to deliver effective courses for my students. The lessons I learned from the modules (like creating clear learning objectives, and then linking them together in a cumulative and coherent way) have enabled me to receive extremely high marks in my student course evaluations. Not just this, but several assistant professors now come to me with their classroom problems (e.g. getting students engaged, what to do with later papers, failure to grasp threshold concepts, etc.)!” [Philip Kao, Social Anthropology, completed ID5102 in 2011-12 and ID5101 in 2012-13, graduated in 2014]

“I put your teaching to good use after all: that module on film criticism I designed never saw the light (sadly), but I got a position within the Foundational Programme at the ELC, and had to design my own introductory module to Film Studies. It was great fun, and I tried to make it as constructively aligned as I could. So, well, thanks again – ID5012 was one of the most rewarding and energising experiences I have had in my time in St A, and – retrospectively – one of the most useful so far in terms of my professional life.” [Pasquale Cicchetti, Film Studies, completed ID5102 in 2012-13]

“I wanted to let you know how useful both the ID5101/ AFHEA qualification, and the IRLT experience, has been for my CV. They’ve led to some very positive conversations about employment – sadly not to actual employment, but certainly playing an important part in getting noticed as something more than just another not-yet-published/ soon-to-submit PhD student. I was at a conference and mentioning the AFHEA qualification was clearly an important step in being invited to apply for a lectureship at Leicester.” [Management student, completed ID5101 in 2013-14]

For more information on these modules, visit the Research postgraduates who teach page (and scroll to the bottom), and you may wish to read an article about the modules published in the journal Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education Vol 8, No 2 (2013): Postgraduates who teach: a forgotten tribe? Not here!

Posted in Researchers | Leave a comment

Microsoft Certifications: Performance IT

The Microsoft Office Specialist Certification program (or MOS for short) was introduced 2 years ago to give staff and students the opportunity to accredit their IT skills in using the Microsoft Office suite of programs. To gain a certification in a given Office program, the applicant must pass a task-based practical exam. The aim and objectives in developing this program were:

Aim: To give staff and students the opportunity to earn certification credentials to validate their desktop computer skills.

  • Improve the staff IT skills profile through accreditation
  • Improve staff and student IT usage and thus efficiency through engaging with the exam preparation and training resources
  • Enhance student employability through attaining a marketable credential
  • Increase staff motivation for career and personal development by offering the program at no cost to the participants

The Program
A MOS certification is obtainable in each of the MS Office applications at graduated levels, the culmination of which is the Master-level certification, denoting fluency across a range of applications. Our statistics show not just a healthy uptake, but a strong success rate as well:

  • Number currently registered: 170 (total registrations from the program launch 285)
  • Number of exams delivered: 330 (111 at Expert level)
  • Number of Master Certifications achieved: 30
  • Exam pass rate: 81% (compares very favourably with the national average of 65%)

The much vaunted benefits of MOS certification in relation to the workplace as promoted by the vendors are:

  • Increased productivity
  • Increased effectiveness and initiative
  • Increased employability

But how do these attributes play out in practice with our MOS program?

Evaluation & Results
As part of our evaluation process, we survey those who complete their Master-level certification for feedback on the program and for an assessment of the impact it has had on their computer use. The survey is completed and submitted as an online form 3-6 weeks after program completion.
In addition to questions relating to their personal objectives for participating in the program, they are asked to reflect on the extent to which their MOS certification skills have impacted both their confidence and competence in their role.
Response rate: 45%

  • 83% reported improved confidence in their role
  • 75% reported improved competence in their role

The responses from those who have completed the MOS program survey indicate that the vast majority have experienced increased confidence and competence in using IT in their role, and that, given the extent to which IT underpins work output, this will feed through to increased productivity. These results compare favourably with the published findings of a MOS Productivity Study conducted by the University of Utah for Microsoft (David Eccles Business School MBA Field Study, University of Utah, 2012), which reported:

  • 82% of employees becoming MOS certified felt more confidence in their abilities as a worker
  • 88% of surveyed employees felt MOS made them more effective in their work

This correlation with productivity and effectiveness is further underscored by responses in the formative feedback sections of our MOS Master survey, where respondents were asked to comment on the most significant impact. To give some examples:

  • “I am now able to […] report on data significantly quicker (and more stylishly).”
  • “Having the confidence to use packages at an advanced level.”
  • “Improved my understanding greatly in areas I don’t use every day.”

The alleged benefits do thus appear to be borne out in practice. However, the focus of the discussion up to this point has been on the benefits derived from the acquisition and transfer of skills as an individual phenomenon. Gallivan et al. (2005, p. 179), in their study on co-workers’ influence on IT usage in the workplace, concluded that “having co-workers who are knowledgeable and confident IT users (and who hold positive attitudes toward the training they received) does positively influence an employee’s IT usage”; a passive ‘leakage’ of benefit, as it were. They even go so far as to assert that, with regard to work-groups, this influence “shapes users’ beliefs, skill levels and motivation to use IT within an organisation more effectively than does user training.” Thus the positive experience and upskilling documented in our MOS Masters can have a significant wider impact on their work environment.
This is supported both from our survey feedback (“Able to use some features of the training in everyday work and to train others”) and from other unsolicited reports received from those who had been sought out by co-workers as a direct result of their improved skills related to their Master certification. There is evidence in the literature that suggests benefits are derived not only from the direct intervention by these ‘resident experts’ but that the degree of a co-worker’s self-efficacy also creates an environment that encourages IT usage (Gallivan, et al., 2005, p. 163).
That’s productivity and effectiveness addressed; how does MOS relate to an increase in student employability? According to a study by the technology industry body CompTIA, 86% of hiring managers indicated that IT certifications are a high or medium priority during the candidate evaluation process (CompTIA, 2012). It’s not surprising, then, that most students seek the certifications to validate their IT skills and to differentiate themselves in the job market. Again, formative feedback gives some degree of corroboration: “Everyone assumes these days that our generation has advanced computer skills, being able to prove that easily is a huge asset.”
There is more to this statement than just affirmation of the value of MOS. It also acknowledges the presumption that students, the ‘digital natives’, having grown up with ever more pervasive and sophisticated technology, should somehow have innate computer literacy and competency. This is certainly true when it comes to social media. However, studies indicate that this is not the case with business-related software, where a significant gap was found between students’ perceived ability and their actual efficacy (Grant et al., 2009). MOS certifications are thus an opportunity not only for accreditation but also a means by which these shortfalls can be addressed on an individual basis.
MOS certifications do appear to deliver on all fronts: productivity, effectiveness and employability, with potentially broader implications for IT usage in workplace environments. It could be argued that this would be the likely outcome of engagement with any concerted training program; the difference here is the increased motivation to engage through the achievement and recognition of the accreditation.

Like all technology, MOS certifications are evolving along with the software on which they are based. The new MOS program currently being developed will deliver exams even better designed to gauge and assess the efficient use of technology, and we will continue to monitor its impact.
To find out more information on the MOS program, visit the comprehensive MOS website.
CompTIA, 2012. State of the IT Skills Gap. [Online] Available at: [Accessed 15 Oct 2014].
David Eccles Business School – MBA Field Study – University of Utah, 2012. MOS Productivity Study. [Online] Available at: [Accessed 15 Oct 2014].
Gallivan, M. J., Spitler, V. K. & Koufaris, M., 2005. Does information technology training really matter? A social information processing analysis of coworkers’ influence on IT usage in the workplace. Journal of Management Information Systems, 22(1), pp. 153-192.
Grant, D., Malloy, A. & Murphy, M., 2009. A comparison of student perceptions of their computer skills to their actual abilities. Journal of Information Technology Education, Volume 8, pp. 141-160.

Posted in IT training

Performance management training: Line managers’ experience of performance management, their attitudes towards it and the impact of training.

1. Introduction

As part of a larger project to explore performance management in the Institution, CAPOD (the University’s Centre for Academic, Professional and Organisational Development) recently ran a survey of line managers to find out more about their attitudes towards performance management, their perceptions of how this is handled in the organisation, and their own experience of dealing with performance issues in their teams.

Within the data collected it is possible to compare the responses of participants who have attended relevant training in the last 2 years (CAPOD runs a number of workshops related to performance management) with those who have not, and to identify where there are significant differences in responses. Such differences may indicate that attendance on performance management training has an impact not only on perceptions and attitudes around managing performance, but also on line managers’ experience of managing performance issues in their teams.

2. Survey methodology and participation

The survey was carried out using an online questionnaire. A covering letter, including a hyperlink to the questionnaire, was sent out to all line managers in the institution. The survey, covering letter and survey methodology were approved by the University Teaching and Research Ethics Committee (UTREC) and complied with the required standards for participation and protection of personal data.

The anonymised and aggregated survey results are therefore not attributable to individual survey participants.

The survey invitation was sent to a total of 456 people. Of this total, 107 had attended relevant training within the last 2 years and 349 were line managers who had not attended relevant training in the last 2 years.

It should be noted that the population of people who had attended training may have included some former members of staff no longer employed at the University, and also included some workshop participants who do not currently have line management responsibilities and who (as their line management status was unknown at the time of conducting the survey) were asked not to complete the questionnaire.

The total number of completed questionnaires was 127, of whom 36 had attended training and 91 had not.

This represents an overall response rate of 28%, which breaks down to 34% for people who had attended training and 26% for those who had not. As the ‘trained’ population included an unknown number of people who may have left the University or may not have been current line managers, the actual response rate for the ‘trained’ population, and for the population overall, is likely to be slightly higher than the reported rate.
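For transparency, the quoted response rates follow directly from the respondent counts above. A minimal sketch of the arithmetic, using only the figures reported in this section:

```python
# Survey response-rate arithmetic, using the counts reported above.
invited_trained, invited_not_trained = 107, 349
completed_trained, completed_not_trained = 36, 91

invited_total = invited_trained + invited_not_trained        # 456 invitations sent
completed_total = completed_trained + completed_not_trained  # 127 questionnaires completed

# Percentages rounded to the nearest whole number, as quoted in the text.
overall = round(100 * completed_total / invited_total)                  # 28%
trained = round(100 * completed_trained / invited_trained)              # 34%
not_trained = round(100 * completed_not_trained / invited_not_trained)  # 26%

print(f"overall {overall}%, trained {trained}%, not-trained {not_trained}%")
```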

3. Interpretation of the results

A number of different question types were used, and this analysis only draws on question types with responses that are quantifiable and can be aggregated and averaged (i.e. excluding free-text responses). These include questions scored on a Likert scale, questions where respondents could choose one or more options from a list, and simple Yes/No questions.

As not all questions were ‘required’, the sample size varies between questions. In every case the percentages presented represent the percentage of the sample group for that question who gave the specified response. Raw numbers of sample size and respondents choosing the specified response are also included.

A ‘positive response’ is defined as one where respondents selected either 4 or 5 on a 5 point scale, with 5 being the highest.
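To make this scoring rule concrete, the following sketch shows how a positive-response percentage is computed from a set of Likert-scale answers. The sample scores are purely illustrative, not actual survey data:

```python
def positive_response_rate(scores):
    """Percentage of respondents scoring 4 or 5 on a 5-point Likert scale."""
    positive = sum(1 for score in scores if score >= 4)
    return round(100 * positive / len(scores))

# Hypothetical answers from ten respondents to a single question.
sample = [5, 4, 3, 4, 2, 5, 4, 1, 3, 5]
print(positive_response_rate(sample))  # 60 (6 of the 10 scores are 4 or 5)
```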

4. Overall results of the survey

The overall results for all respondents are as follows:

  • 72% of respondents (91 out of 127) reported that they have experienced underperformance issues in their team in the last 2 years
  • 52% of respondents (56 out of 108) stated that they currently have unresolved performance issues in their team
  • Only 44% of respondents (40 out of 90) felt they have been successful in dealing with poor performance in the past
  • Respondents rated the most common types of underperformance as ‘negative attitudes’ (64% of respondents, or 61 out of 96) and ‘poor quality of work’ (58%, or 56 out of 96)
  • Respondents rated the most common causes of underperformance as ‘failure to deal with underperformance in the past’ (55%, or 53 out of 96) and ‘failure to recruit the right people in the first place’ (53%, or 51 out of 96)
  • 49% of all respondents (62 out of 126) feel well-equipped to deal with underperformance
  • 59% of all respondents (73 out of 124) feel their own line manager supports them in dealing with underperformance issues
  • 35% of all respondents (44 out of 126) feel the University provides effective support for managing underperformance
  • 46% of respondents (58 out of 126) say they are familiar with written procedures for performance management
  • 47% of all respondents (58 out of 124) are confident that they will receive effective support for taking action under formal written procedures
  • 35% of all respondents (44 out of 127) are confident that the University will deal effectively with poor performance when formal action is taken

Three further questions asked respondents to rate a list of eight items in terms of how important each of them is to effective performance management, how confident they feel with each of them, and how well they feel the University supports them in each. The eight items are:

  • Recruiting the right people in the first place
  • Providing new staff with a well-planned induction programme
  • Setting out clear standards and expectations relating to conduct and performance
  • Setting clear and measurable targets/objectives
  • Supporting staff to develop their confidence and competence
  • Monitoring performance and providing feedback
  • Identifying and addressing underperformance at an early stage
  • Providing recognition for good performance

Respondents were most positive when rating the importance of these items to successful performance management, with an average positive response ranging from 76% to 97% for different items.

Respondents were less confident about their own effectiveness in all listed aspects of performance management, with an average positive response ranging from 48% to 78% for different items.

Respondents were even less positive about the effectiveness of the University in supporting them in different aspects of performance management, with an average positive response ranging from 19% to 50% for different items.

5. Differences between trained/not-trained respondents

While the overall results are of interest, with no internal or external benchmarks against which to compare them, only general conclusions can be drawn from the responses to each question or by comparing responses between questions. For example, it is interesting to note that almost three quarters of respondents have experienced performance issues in their team in the last two years, and that more than half of them currently have unresolved performance issues in their team; or that almost half of respondents (49%) feel well-equipped to deal with underperformance issues, yet only 44% felt that they have been successful in dealing with performance issues. However, the data can be subjected to further analysis which enables us to explore any differences in the perceptions, attitudes and experience of respondents who have attended training and those who have not.

Based on a comparison of responses between the two groups, line managers who had attended relevant training within the last 2 years are more positive about:

  • The importance of the range of eight listed issues in relation to successful performance management.

The overall aggregated average positive response across the range of issues was 95% for trained managers and 87% for not-trained managers. Notable differences in the positive response rates for specific issues include:

o   Importance of well-planned induction (89% [32 out of 36] for trained managers against 71% [65 out of 91] for not-trained)

o   Importance of setting clear and measurable targets/objectives (94% [34 out of 36] trained: 78% [70 out of 90] not-trained)

o   Importance of monitoring performance and providing feedback (100% [36 out of 36] trained: 79% [71 out of 90] not-trained)

This indicates that line managers who have been trained are more likely than those who have not to recognise that a wide range of factors is important to successful performance management, and to appreciate the importance of those factors. Managers who have been trained are therefore more likely to address these issues (or to address them more diligently) than managers who have not: to deliver well-planned inductions, set clear targets, monitor performance and provide feedback.

  •  Their own confidence in addressing these issues.

The overall aggregated average positive response across the range of issues was 71% for trained managers and 66% for not-trained managers. Most notably this includes a difference in:

o   Confidence in identifying and addressing underperformance at an early stage (61% [22 out of 36] trained: 43% [38 out of 89] not-trained)

This shows that while confidence levels with the range of performance management issues are overall lower than the levels of recognition of the importance of those issues to effective performance management, those that have been trained are more confident than those that have not.

Confidence is an important component of performance and this must therefore be considered in relation to the effectiveness of trained managers in managing performance issues.

  • The effectiveness of the university in supporting them to address these issues.

The overall aggregated average positive response across the range of issues was 46% for trained managers and 30% for not-trained managers. Notable differences in positive response rates for specific issues include:

o   Recruiting the right people in the first place (56% [20 out of 36] trained: 46% [41 out of 90] not-trained)

o   Setting clear standards and expectations (53% [19 out of 36] trained: 34% [30 out of 88] not-trained)

o   Setting clear and measurable targets (44% [16 out of 36] trained: 27% [24 out of 89] not-trained)

o   Monitoring performance and providing feedback (43% [15 out of 36] trained: 23% [20 out of 87] not-trained)

o   Identifying and addressing underperformance at an early stage (33% [12 out of 36 trained]: 13% [11 out of 86] not-trained)

o   Providing recognition for good performance (28% [10 out of 36] trained: 15% [13 out of 66] not-trained)

Again, this shows that while respondents rate the support they receive in addressing the listed aspects of performance management lower than either the importance of those issues or their confidence in dealing with them, those that have been trained rate the effectiveness of the University in supporting them as higher than those that have not.

  • Feeling that they have dealt effectively with performance issues in the past (57% [13 out of 23] trained: 40% [27 out of 67] not-trained)
  • Feeling well-equipped to address performance issues in their team (61% [22 out of 36] trained: 44% [40 out of 90] not-trained)
  • Being supported by their line managers in addressing performance management issues (64% [23 out of 36] trained: 57% [50 out of 88] not-trained)
  • Receiving effective support from the University in managing underperformance (53% [19 out of 36] trained: 28% [25 out of 90] not-trained)
  • Familiarity with formal written procedures for dealing with poor performance (58% [21 out of 36] trained: 41% [37 out of 90] not-trained)
  • Confidence in receiving effective support when taking action under formal written procedures (56% [20 out of 36] trained: 43% [38 out of 88] not-trained)
  • Confidence that the University will deal effectively with poor performance when formal action is taken (53% [19 out of 36] trained: 27% [25 out of 91] not-trained)

The differences indicated above show that trained line managers are significantly more likely than their colleagues who have not been trained to feel that they have dealt effectively with performance management issues and to feel well-equipped to deal with performance issues.

They are also more likely to feel well-supported by their own line manager and by the University. Unsurprisingly, as this is covered during training, they are likely to rate themselves as more familiar with written procedures, but they also have higher levels of confidence that they will be supported in using those procedures and that the University will deal effectively with formal action.

Very significantly, the analysis of the data reveals one more key difference between those who have attended training and those who have not:

Line managers who have not been trained in the last 2 years are more likely to:

  • Have unresolved performance issues in their team (58% [42 out of 72] not-trained: 39% [14 out of 36] trained)

This suggests that line managers who have attended relevant training are more effective at resolving (or preventing) performance management issues, and are therefore more likely to have higher-performing teams.

6. Conclusions

The results show, across the whole range of questions around perceptions, attitudes and experiences of performance management, that those who have been trained are more likely to appreciate how performance is affected by a variety of issues; they are more positive about their own level of knowledge and confidence in dealing with performance issues; they feel better supported in managing performance; and they feel more confident that the University will support them and will deal with performance issues effectively.

Ultimately those who have been trained feel that they are more effective at managing performance and are less likely to have unresolved performance issues in their team.

This strongly suggests that attendance on relevant training has a positive impact on management skills, knowledge and attitudes and on the effectiveness of managers in dealing with performance management issues.

Posted in Evaluation & Feedback