CAPOD – Supporting Professional Skills Development for Entrepreneurial Researchers

Back in November 2015, CAPOD provided two workshops for researchers as part of the Passport to Research Futures programme. These events were delivered by experts in the field: Dr Ewan Chirnside from the Knowledge Transfer Centre at the University of St Andrews, and Brian Butchart, CEO of the Aberdeen-based pharmaceutical company SIRAKOSS.

Understanding what it is like to start your own business, and getting investment and budgeting right, is key to survival in the first few months of running a business.

One researcher who has taken his first step into the start-up world is David Harris-Birtill. David (pictured) has attended a number of workshops that CAPOD has provided since he came to St Andrews. When I met with David in early March this year, I asked him what he thought about CAPOD. His response was: “I love CAPOD and the courses – I’ve been on loads of them”. We talked about his business, his product, how he is commercialising it, and how the CAPOD courses have helped him along the way. He explained how useful the Business Budgeting and Let’s Make a Business events have been, and described the golden nuggets of information these events provided. He was particularly grateful for the information on how important cash flow is when budgeting. He also appreciated that Brian Butchart, who ran the Business Budgeting event, is the CEO of a start-up company in Aberdeen, and that the first-hand experience Brian offered showed David what ‘it is like to run out of cash!’

David registered his company ‘Beyond Medics Limited’ in September 2015. David’s company creates imaging and sensing platforms for patient benefit.  This is how David describes the product:

“A person’s heart rate and blood oxygenation levels are vital signs which provide clinicians an indication of how well a person is. Conventionally measured using a pulse oximeter, a finger probe, this requires contact with the skin and a clinician to place the device on the person. Beyond Medics has created a camera-based system to automatically remotely measure these vital signs from a distance and can do so for up to six people at once, no longer needing these clips, enabling better triage of patients in A&E and a more seamless workflow through a hospital, care home or security setting.”

CAPOD offers a number of events to support the professional development of researchers as ‘entrepreneurs’. Ewan Chirnside from the Knowledge Transfer Centre offers the ‘Let’s Make a Business’ event on the Passport to Research Futures programme to those who want to explore the development of an idea, using the Blank Canvas technique to develop their ideas into business propositions. Working with Dawn Shand, Senior Business & Innovation Advisor from the Scottish Institute for Enterprise, the three-hour session enabled attendees to explore their business ideas and create an action plan for further exploration.

Brian Butchart, CEO of the Aberdeen-based company SIRAKOSS, offered his time and hands-on expertise to deliver a two-hour session on business budgeting, exploring the complexities of managing cash flow while trying to create a product, get it to market and attract investors along the way. Brian led the spin-out process of SIRAKOSS with his fellow founding Directors, and brings extensive UK and international expertise in medical devices to the session. He spent 22 years with Johnson & Johnson, where he honed his Sales, Marketing, Operations and Business Unit Management skills. In 2002 he joined Isotron (Synergy Health) as UK General Manager, responsible for seven operating sites and 200+ employees in Operations, Sales, Marketing, Finance and Quality, before serving as Global Sales & Marketing Director (2007-2009).

Things are moving ahead at a rapid pace for David in the ‘entrepreneurial world’. Ewan Chirnside has supported him along the way, helping with Intellectual Property Rights and signposting him to funding opportunities such as ‘Pitch at the Palace’ and ‘2 minute pitch’.

In January 2016, Universities Scotland’s Research Training Sub-Committee (including Heads of Researcher Development and Heads of Graduate Schools), supported by the Innovation Scotland Forum, organised a one-day event, ‘Creating an entrepreneurial research culture in Scottish HEIs’. The objective was to discuss and identify actions at an institutional and national level to create an entrepreneurial culture for all researchers (PhD students to Professors) in Scottish HEIs.

Our professional development events continue to build this culture here at St Andrews. If you are a researcher and want specialist support with developing your business, please contact the Knowledge Transfer Centre. If you wish to develop your professional skills and knowledge, CAPOD offers a wide range of courses and events. Events such as ‘Working with the Media’, ‘Leadership Development’, ‘Business Budgeting’, ‘Let’s Make a Business’, ‘Effective Communication’ and ‘Microsoft Office’ all provide the foundation for the professional entrepreneur. We also provide coaching and mentoring. For an informal chat about any of the training we offer for Research Staff, please contact Michelle Paterson, Staff Developer.

Finally, here’s what people think about the Let’s Make a Business and Business Budgeting events….

  • “Excellent intro to starting a business”
  • “It is helpful and interesting.  Easy to understand”
  • “This is absolutely necessary if you want to try your own business”
  • “Fantastic knowledgable speaker”
  • “The breadth of material covered was significant, but absolutely relevant.  I wouldn’t have thought of half of it otherwise!”
  • “Fantastic event.  Excellent breakdown of what needs to be budgeted for.  Also excellent for business planning and creativity.”
  • “Good overview of Basic Structure of business planning.  Very useful as foundation to build my own business.  Presenter very approachable”
  • “If you are at all interested in entrepreneurship then do attend”

 

 

Awakening students to the value of feedback

Introduction

Student feedback, particularly in relation to assessed work, is a hot topic. Various initiatives over the years have focused on ways to improve levels of student engagement and satisfaction with feedback, but it continues to be a challenging and key area for the institution and for individual Schools.

In support of this, a student intervention called ‘Making feedback work for you’ was piloted in two schools in Academic Year 2013-14. Designed by Edinburgh Napier University, the intervention did not directly address the widely reported topic of what constitutes well-designed and well-delivered feedback, but instead considered the attitudinal perspectives of students receiving academic feedback, how those attitudes affect their associated actions, and the impact upon their subsequent performance.

Coordinated by Ros Campbell (CAPOD) and Erwin Lai (CAPOD), the pilot aimed to assess the viability, suitability and sustainability of the intervention for implementation across the wider University.

The intervention

The intervention draws on two key concepts:

  1. The Conscious Competence Matrix: Adapted from Howell (1982), the Conscious Competence Matrix describes a four-step journey to becoming accomplished at any skill. The key concepts highlighted to students are that: awareness of what ‘competence’ consists of is critical, i.e. a clear understanding of learning outcomes and marking criteria; it is normal for students to be at different points on the journey for different skills; and effort that is efficiently targeted through effective feedback is key to success on the journey.
  2. Growth (incremental) versus Fixed (entity) Mindset: Mindset is situational and is your view of your ability to do a particular thing. A Fixed mindset holds that ability is ‘set’ (whether high, medium or low) and unlikely to change over time. This attitude promotes the setting of personal goals that are performance (rather than learning) oriented, which in turn discourages attention to feedback and effort, and over time promotes helplessness. Adopting a Growth mindset requires a belief that ability is malleable, i.e. it can improve or worsen over time. This attitude promotes the setting of learning-based personal goals, which in turn promotes attention to feedback and targeted effort to develop and improve skills, promoting perseverance over time.

Delivery mode

A blended learning approach was adopted for the pilot:

  1. Online course: An adapted online Moodle course offered potential for large numbers of students to be exposed to the material in a consistent and cost-effective way, reduced the importance of the skill set of the staff member delivering the workshop, and provided an opportunity to trial a relatively new online course. Students were asked to complete the course in their own time. In addition to the two key concepts, students were encouraged to reflect on sources of feedback; consider personal barriers to learning/using feedback; and complete a feedback action sheet drawing on ‘real’ feedback from their last assignment.
  2. Face-to-face workshop: Students were then required to attend an interactive follow-up workshop during class time. Drawing on key concepts in the online course, students worked individually and in groups to: Consider skills being assessed within the assignment they had just completed; identify personal areas of strengths and weakness regarding these skills, share strategies and tips; reflect on feedback from their latest assignment; and begin producing a personal action plan.

This pilot project produced useful results and important lessons. Overall, the intervention appeared to benefit a majority of students in both pilot schools. Both the online course and the face-to-face workshops had some positive impact on the student cohorts involved in the pilot, with the face-to-face workshops reviewed far more positively than the online course.

Key results

  1. 61% of students felt the online course should be made available to all students in the University
  2. 88% of students felt the workshop should be made available to all students in the University
  3. A significant number of students (60% of Computer Science students and 69% of Classics students) felt the workshop would help them in their studies.

Sample of student comments gathered at various stages of the pilot

“It changed the way I view feedback. I pay much closer attention to it now”.

“I gained new ideas on how to improve and gain confidence in the methods I was already using”.

“I plan to put more time into reviewing feedback, particularly negative feedback, which until now I was often very dismissive of”.

“I will really think about feedback, view it positively and think about how I can improve in weaker areas”.

“I will start taking feedback more seriously”.

“I will analyse feedback more carefully and use some tactics discussed in groups to improve my skills”.

“I was pleasantly surprised by how thought-provoking it was and would like others to have this experience”.

“I’m more likely to consult with peers for extra ideas and feedback”.

Computer Science students perceived an improvement in the level and quality of feedback received (although the school confirmed the feedback was in line with their standard provision):

“The feedback we received for the feedback session was really fantastic, as there was so much more than usual…and everyone received broadly the same amount of feedback. People are really hoping this carries on”. (Minutes, SSCC meeting, 10-3-14)

Conclusion

There were indications that the intervention should be considered for wider roll-out across the University, although it must be noted that support for this was less apparent for the online course (61%) than for the face-to-face workshop (88%).

Next steps

An overview and outcomes of the pilot were presented at a Learning and Teaching Open Forum event in February 2015. On the basis of the pilot results, challenges, and effort required, all five discussion groups agreed that a University-wide roll-out of this initiative would be worthwhile. When asked to consider the most effective way of doing this, participants suggested a further pilot. Ros and Erwin are in the process of recruiting pilot schools and designing an enhanced version of the intervention tailored to St Andrews students.

The Academic Skills Project

1. Background:

Following a successful Enhancement Themes Funding Application, three PhD students in the School of History introduced a series of academic skills workshops for History undergraduates in 2012. In the summer of 2014 the scheme was expanded, via CAPOD, to all Arts Schools. The aim of this Academic Skills Project was to create a framework for delivering subject-specific academic skills to a large number of undergraduate (UG) students via high-quality workshops led by PGR students, creating benefits for both cohorts.

In the summer months of 2014, Directors of Teaching were briefed about the scheme, PGR School Coordinators were recruited and trained, and the project infrastructure was established. The School Coordinators in turn recruited PGR workshop leaders to design and deliver the sessions, and advertised the programmes to UG students at the start of the AY2014/15 session.

The Schools taking part are:

  • Art History
  • Classics
  • Divinity
  • Film Studies
  • Geography and Sustainable Development
  • History
  • IR
  • Management
  • Modern Languages
  • Philosophy

2. Structure:

The project ran in each participating School through the following structure:

[Diagram: project structure]

CAPOD’s role:

  • To take an overview of the project and provide continuity between Schools.
  • To provide financial backing.
  • To provide opportunities for ideas dissemination between participating Schools.
  • To evaluate the success of the project.
  • To provide quality guidance via our Academic Skills tutors.

The structure of having PGRs deliver subject-specific academic skills was welcomed by University of St Andrews psychologist Dr Kenneth Mavor, who researches in the field of personal and social self-categories and identity. He welcomed the “discipline-based social-identity”, which relates to the use of deep-learning approaches:

“We argue that this is also modified by the normative effect – that is, it matters what they think is normative for students in their discipline. We speculate that again this could work either way: if you identify as a student in a discipline and see deep learning is normative, you are more likely to engage in deep learning; reciprocally, if you already engage in deep learning and see that as normative, then your identification [as a student of the subject] will increase. IF we are right, then the current strategy being used for the academic skill project is quite close to optimal, and either making the courses discipline free, or losing the interactive element, would reduce the effectiveness.” [email correspondence, 15/10/14]

3. Programmes:

The Schools’ academic skills programmes ran across broad themes (comprehension, analysis, rhetoric), but these were often presented as subject-specific topics. Many of the School Coordinators arrived at their programmes through extensive discussions with module coordinators, PGR tutors and Directors of Teaching to identify skills gaps within their Schools. Examples of this specificity include:

  • Group work (Management)
  • Critical Engagement and Research Skills (Modern Languages)
  • Thinking critically – thinking Geographically (G&SD)
  • Visual Analysis Trip (Art History)
  • How to argue like a Philosopher (Philosophy)
  • How do I watch film? (Film Studies)
  • Analysis and use of different sources (Classics)
  • Researching in IR (IR)
  • Speaking Skills (Modern Languages)

4. Engagement:

Exact numbers of students engaging with the Academic Skills Project are not known, as record keeping by School Coordinators and Workshop Leaders was not 100% accurate.

However, a strong indication can be taken from the number of workshop evaluation forms returned (the true number of students attending will be higher). In total, 787 feedback forms were returned. These help show the engagement of participants by School. The School with the largest engagement was International Relations, where 32% of sub-honours students (194 students) attended at least one workshop.

[Pie chart: evaluation forms returned, by School]

5. Impact:

Ideally, the project should show a correlation between workshop attendance and academic performance. In reality it is hard to demonstrate a causal relationship due to the large number of other variables involved in academic performance. Discussions have taken place between Head of Student Development, Catriona Wilson, and Dr Kenneth Mavor (School of Psychology and Neuroscience), about capturing more robust pre and post intervention data should the project be rolled out to Science schools.

 

In the absence of additional hard data, an indication of impact is provided in the observations of participants, Workshop Leaders, School Coordinators and Directors of Teaching.

6. Evaluation:

6a. Participant evaluation:

“What is expected of me is much clearer now. I feel more comfortable and confident about beginning my academic studies here.” [Student participant]

Participants rated the workshops highly in terms of objectives, material, presenters, structure and timing. They were also very positive about the workshops’ relevance and the likelihood of making a change to their behaviour as a result. Amalgamated data from the 787 returned evaluation forms is below:

[Chart: amalgamated evaluation data from the 787 returned forms]

Selected quotes:

“I will be more selective in how I read sources.”

“The prospect of coursework is terrifying. This was so helpful and reassuring. A++”

“I will look at sources differently and be able to choose them better.”

“I’ll spend more time than I previously have structuring my essay research.”

“I’m much clearer about what’s expected now.”

“I’ll change the way I take notes.”

 

6b. Director of Teaching evaluation:

 “This should be ideally permanent in the Arts. It has proven to be very positively reviewed both by PGRs delivering workshops and students’ take up and responses.” [Director of Teaching]

7 Directors of Teaching completed the online evaluation form.

Directors of Teaching reported that they were mostly involved with advising on workshop content (6 said they were partly involved) and deciding on the workshop programme (1 very involved, 4 partly involved, 2 not involved). Some, but not all, Directors of Teaching played a role in recruiting PGR students and publicising the workshops, and one stated that they were partly involved in workshop delivery.

The Directors of Teaching had a range of objectives from the programme:

  • Provision of academic skills to JH students
  • Supporting PGRs in developing their teaching skills
  • Helping students recognise that different subjects require specific skills
  • Focusing on International MSc students
  • Helping first years understand what tutorials are for
  • Increasing the approachability of staff (PGRs) by students

All the Directors of Teaching felt the project had been beneficial to the undergraduates and the PGRs who had taken part.

 6c. School Coordinator evaluation:

“I think it’s helped me understand our students better, improve my communication with students, and to manage expectations – of myself, students and tutors” [School Coordinator]

10 School coordinators took part in the project. 9 completed the online survey about their experiences.

School Coordinators put varying amounts of time into the project, from 5 hours to 30+ hours; the average fell in the ‘13-20 hours’ band. Some School Coordinators reported less engagement from fellow PGRs than they would have liked. Frustration with participant drop-out rates was also reported, a common feature of any undergraduate development programme.

CAPOD support was valued by all School Coordinators. In order of ranked importance, this comprised: supplying funding; photocopying materials; evaluating the workshops; consulting with CAPOD’s academic skills tutor; providing a project Moodle space; and offering networking meetings.

School Coordinators each reported personal benefits from taking part in the project. The skill most often mentioned as having been developed was curriculum design, followed by teaching and leadership (joint 2nd), then recruitment and selection and team working (joint 4th). Budget management, event management and marketing were also mentioned.

Selected quotes:

“I think we helped clarify what the School expects of its first and second year student, and we helped explain key concepts and techniques to help the students perform better in their modules.”

“It provided training that lecturers and tutors have said students need.”

“The students were very keen and seemed to be grateful for the existence of the project. The tutors were enthusiastic and committed to doing a good job, and recognised the value of improving the academic skills of our undergraduates.”

 

6d. Workshop leader evaluation:

“I think it has taught me about teaching to different levels of students” [Workshop Leader]

57 workshop leaders took part in the project. 21 completed the online survey.

The main motivator for taking part in the project was a wish to enhance the undergraduate learning experience, cited as being a factor by 80% of respondents, followed by having a different kind of teaching experience (60%), teaching at an earlier stage of their PhD (35%) and financial reward (25%).

Approximately 50% of workshop leaders were only involved with one workshop. 30% delivered 2 workshops, 15% delivered 3 workshops and 5% more than 3.

In terms of time invested into the scheme, 38% of workshop leaders invested 3-5 hours, with an additional 33% investing 6-10 hours. 25% invested more than 11 hours and 5% between 1-2 hours. Some workshop leaders commented that they invested more hours than there was budget to pay them for.

91% of workshop leaders thought that the project had a positive impact on the students who took part, and a large majority felt it had also been beneficial to them as PGR students, with the skills most frequently cited as having been developed being teaching delivery, teaching design, team working and leadership. 91% of workshop leaders indicated that they would be willing to continue in the project should it run again.

Selected quotes:

“I heard from a number of tutors that mistakes in referencing were less compared to previous years”

“Students expressed frustration and confusion at the beginning of the class; understanding and relief at the end. Feedback attested to students’ confidence going forward.”

“PhD students who design and teach these courses should be remunerated properly for the time it takes to prepare such essential workshops.”

“A fantastic programme. Very happy to have taken part, and would love to see this continue to grow.”

 

 Conclusion:

The Academic Skills Project has had a successful initial roll-out, with benefits reported by all parties. Over the summer of 2015 the project will be further expanded to Science Schools, and will hopefully become embedded as a valued co-curricular programme.

Polishing Up – the impact of professionalising training for Estates Cleaning Staff

Background

CAPOD runs several extremely successful Passport to Excellence development programmes for different groups of professional staff in the University. One such group is the Estates Cleaning Team. The Cleaning Manager approached CAPOD in 2011 to see if we could develop a passport scheme for the team. There was existing training in place, but the new programme would have the following aims:

• Ensure that everyone was trained to the same standard
• Ensure that all training was properly documented
• Give staff recognition for the training they have done and the skills they have acquired

The programme was designed to consist of short training sessions which covered both technical and interpersonal skills:

[Table: programme training sessions]

The programme saw its first graduates in 2012, when all current staff participated. Sessions were thereafter run twice a year so that all new staff could attend. The total number of graduates at the end of 2014 was 117. All staff who graduate are presented with a certificate at a recognition event on completion.

The breakdown of these is as follows:

• 2012: 86 staff
• 2013: 21 staff
• 2014: 10 staff

Impact

To measure the impact of the programme, two sets of data were considered:

• Set 1 – cleaning staff sickness absence
• Set 2 – performance-related issues relating to work standards

Staff views were also gathered via a questionnaire issued to all cleaning staff in January 2015 (see ‘What Cleaning Staff think’, below).

Set 1 – Cleaning staff sickness absence

Since the launch of the programme in June 2011, there has been a 27% drop in sickness absence.

Number of absence instances – averaged per year over the period

[Sickness absence table]

Set 2 – Performance related issues (work standards)

Although the numbers here are small, they represent a 35% decrease in performance issues. It should be noted that the number of staff employed has increased by 33%, due to new buildings being serviced by the team; however, the number of performance issues has continued to drop overall.

Number of issues – averaged per year over the period

[Performance issues table]

What Cleaning Staff think

In January 2015, Estates Cleaners were asked to complete a short questionnaire about the Passport to Cleaning Excellence. 84 questionnaires were returned from a total staff of 120, which represents an excellent 70% response rate. Some non-returners were accounted for by new staff who had not yet completed their training.

The staff were asked to choose Strongly Agree, Agree, Disagree or Strongly Disagree in response to the following four statements:

1. Participating in the Passport programme was a positive experience.
2. The training I received was helpful to me.
3. The training I received helped me feel more confident in the workplace.
4. Going to the training sessions enabled me to get to know more of the Estates Team.

The results were overwhelmingly positive, with over 94% choosing either Strongly Agree or Agree for each statement. In the case of Statement 1, this figure was 99%, with only one respondent choosing a negative response.

The questionnaire had space for staff to make additional comments. Only a few took this opportunity, and the comments received were as follows:

• “Enjoyed!”
• “Love the training”
• “Some of it was fun!”
• “We tend to stay in our groups”
• “Being trained 3 years after I started was useless”
• “I had already learned it on my own”
• “Great Experience”

[Column chart: questionnaire responses by statement]

Summary

The Estates Cleaning team was already an excellent one with high standards. However, since the introduction of the Passport to Cleaning Excellence there has been a decrease in both absence instances (27%) and performance-related work standard issues (35%), with the trends going in an increasingly positive direction.

Perhaps more significant has been the way in which the staff have responded to the training opportunities given. The hugely positive response illustrated in the column chart above shows that the training has been extremely well received and appreciated by the staff concerned. Further anecdotal evidence of this positive attitude towards the programme is that many of the certificates awarded are now proudly displayed on noticeboards and walls around the University.

Moving forward, CAPOD plans to review and update the content of the interpersonal training sessions which are delivered, and is committed to continuing to deliver the programme in the future.

Acknowledgements

With grateful thanks to Gillian Jordan, Cleaning Manager and to the Cleaning Staff for their feedback and for being excellent participants at the training sessions.

[Photo: some graduates from 2014]

Winning Research Funding

Posted on behalf of Emma Compton-Daw

Background

Winning Research Funding is a full day workshop for postdoctoral research assistants, fellows and early career academics who are just starting out on their independent research careers. Participants attend the courses to learn more about the UK/EU research funding landscape, how to tailor funding applications effectively and to discuss how to manage rejection. Alongside the external trainer and an experienced academic who facilitate the day, participants also hear from a range of academics from St Andrews, both those early in their careers who have recently been successful in securing funding and also senior academics who are experienced in assessing funding applications.

Two workshops, one for the Sciences, and one for the Arts, Humanities and Social Sciences (AHSS), have been held during each of the 2014/2015, 2013/2014 and 2012/2013 academic years, with a total of 57 people attending.

Results

All 57 participants were asked to complete a survey about what they felt the long-term impact of this course was, if any; approximately 40% (23) of participants responded. They were asked to rate the extent to which they felt the workshop had an impact on the following areas, on a scale of 1 to 6 (1 = not at all and 6 = to an extremely large extent):

  1. increased understanding of the EU/UK funding landscape – almost all the responses were 4, 5 or 6, with an average of 4.7
  2. increased confidence in writing funding applications – almost all the responses were 4, 5 or 6, with an average of 4.5
  3. increased competence at writing funding applications – almost all the responses were 4, 5 or 6, with an average of 4.5

Four of the respondents had gone on to secure research funding and in some part attributed their success to attending the course:

“It contributed somewhat in that:
– it focused my mind on tailoring my application to what the funding body wants
– and the most important bit I got from the workshop is that you should apply to everything possible, and seek out opportunities”
Early Career Industrial Fellowship from the Scottish Funding Council, via SICSA

…”making me think of what the experience of reviewers/panel members is like and how I can tailor my application to make it easy and convincing for them” BBSRC New Investigator Award

“Better understanding of details to provide in the application.” Successful application to Carnegie Trust’s Small Research Grants scheme.

Even some of those who have not been successful in securing research funding since attending the course have found it useful in the longer term, with respondents reporting that it has helped them in shaping proposals, clarifying the differences between funding bodies, understanding the expectations of interview panellists, and giving them the confidence to apply at all:

“Following this workshop I did submit an application for funding under the AHRC’s Digital Transformations scheme. Although the application was unsuccessful, I wouldn’t have had the confidence to attempt it at all if I hadn’t attended this session.”

Introduction to University Teaching modules

Background

These optional, 10-credit Masters-level modules were introduced in academic year 2009-10 (for ID5101) and 2010-11 (for ID5102).

  • Introduction to University Teaching 1: Supporting Student Learning (ID5101)
  • Introduction to University Teaching 2: Curriculum Design and Assessment (ID5102)

They were specifically designed to support the professional development of postgraduate tutors and demonstrators who wished to pursue a career in academia after their PhDs.  (See Long term impact, below.) However, the modules are open to all staff in the University who support learning and teaching, and over the years an increasing number of research staff and early career academics have enrolled as well.  Both modules are accredited by the Higher Education Academy (HEA) at Descriptor 1 of the UK Professional Standards Framework, which means that successful completion of either module confers Associate Fellowship of the HEA.  HEA Fellowship or some other form of teaching qualification is increasingly becoming an essential requirement for academic posts.

Uptake

ID5101 runs every year in semester 1, and ID5102 runs every year in semester 2.

Enrolment on each module is capped at 16 to ensure a highly interactive and engaging learning experience for all participants. Uptake has increased steadily since the modules were first introduced. There were 8 students in the first ID5101 cohort in AY2009-10, 13 in AY2011-12 (and 12-13) and 16 this year. For ID5102, the first cohort in AY2010-11 was 13, and this has remained fairly stable since.

Evaluation

Participant feedback recorded on the standard University module evaluation forms, and anonymous surveys via Moodle, is highly positive. For standard questions relating to module design (eg ‘The module was well organised’) and delivery (eg ‘The lecturer was good at explaining things’, ‘The teaching style was engaging’), the average rating has always been between 1 and 2 (out of 5, where 1 is strong agreement and therefore the best response).

Participants are able to identify specific improvements to their teaching as a result of having completed the modules, eg:

“It has allowed me to improve the quality of my feedback tremendously, in addition to giving me skills in module design.”

“Although I was initially cynical about the relevance of the CAPOD modules to my teaching, I have adopted several of the practices suggested within the course. In particular, I have developed formative exercises to support students in their summative assessments, often incorporating peer-feedback. I also found the development of a reflective journal surprisingly useful to the development of my teaching in the medium to long term.”

“It emphasized for me that not only what is covered in a module but how it is taught determines what students learn, and needs to be addressed intentionally in designing the course. It was helpful to see my own practice and the practice of very good teachers in my department in this light.”

“I think the reflective aspect was most helpful, it forced me to consider what I did (and most importantly didn’t) do well and consider how I can improve on this next semester.”

Participants found the modules very rewarding and universally agreed that they would recommend the modules to others, eg:

“It’s eye opening, enhances one’s teaching and attitude, teaches how to think outside the box, and in reality practice of reflective technique is a transferable skill. It really should be mandatory.”

“I think it’s helpful to be formally taught how to put a module together–everything from choosing topics to weighting them to making sure the assessment reinforces what you want your students to learn–and the philosophy behind it. Also, I loved being in an interdisciplinary group and learning about my colleagues’ perspectives through the discussion and their module topics.”

“It’s an invaluable opportunity to engage with pedagogical theory and practical techniques. Chance for open and supportive discussion is excellent. Has certainly helped me develop my teaching practice, and would imagine this would be case for any PGR.”

However, in addition to self-improvement, it was clear from feedback that at least some participants also had a career advancement motivation for doing the modules:

“Mainly because the classroom discussion is enjoyable and it looks good on your CV.”

“Because it’s not a lot of work to get something that can make a nice difference on a CV.”

“Thinking instrumentally, it’s a great CV enhancement.”

“I would recommend it to anyone who wants to be in academia, especially post-docs who will be doing some kind of teaching and supervising.”

Long term impact

One of the aims of these modules is to support and develop postgraduate tutors and demonstrators who wish to pursue a career in academia after their PhD. Feedback from past participants (some of whom have now graduated and found academic posts) testifies to the positive and lasting impact that the modules have had on their professional development and career progression as academics:

“I’m moving to London for 1 July as I got a (research) post-doc at King’s! But I just wanted to say thanks for your help and for running these teaching modules. I think it’s actually more important than anyone is really stressing at the moment to get that first foot on the HEA ladder – more or less all of the teaching jobs I’ve applied for over the last few months have asked specifically whether the candidate has any HEA accreditation. So, at least from a historian’s perspective, maybe you can pass that on to try and ‘sell’ the courses (and I’ll certainly continue telling students I know). I think in an era with large numbers of PhDs and lots of competition, HEA looks like it’s becoming a way to set yourself apart from other candidates. I said that all the teaching jobs I applied for asked for it, but thinking about it so did the research ones.” [Edward Roberts, Mediaeval History, completed ID5101 in 2013-14, graduated in 2014]


“I’d like to let it be known that taking Supporting Student Learning (ID5101) and Curriculum Design and Assessment (ID5102) was one of the best things about my entire PhD process. I learned that teaching is very much a craft. Because of your course, I am developing and sharing my pedagogical philosophies and learning and teaching practices with senior faculty members in my department and others. As a postdoctoral fellow at the University of Pittsburgh–who is designing and teaching several of my own courses–the modules I took have really helped me to deliver effective courses for my students. The lessons I learned from the modules (like creating clear learning objectives, and then linking them together in a cumulative and coherent way) have enabled me to receive extremely high marks in my student course evaluations. Not just this, but several assistant professors now come to me with their classroom problems (e.g. getting students engaged, what to do with later papers, failure to grasp threshold concepts, etc.)!” [Philip Kao, Social Anthropology, completed ID5102 in 2011-12 and ID5101 in 2012-13, graduated in 2014]


“I put your teaching to good use after all: that module on film criticism I designed never saw the light (sadly), but I got a position within the Foundational Programme at the ELC, and had to design my own introductory module to Film Studies. It was great fun, and I tried to make it as constructively aligned as I could. So, well, thanks again – ID5012 was one of the most rewarding and energising experiences I have had in my time in St A, and – retrospectively – one of the most useful so far in terms of my professional life.” [Pasquale Cicchetti, Film Studies, completed ID5102 in 2012-13]


“I wanted to let you know how useful both the ID5101/ AFHEA qualification, and the IRLT experience, has been for my CV. They’ve led to some very positive conversations about employment – sadly not to actual employment, but certainly playing an important part in getting noticed as something more than just another not-yet-published/ soon-to-submit PhD student. I was at a conference and mentioning the AFHEA qualification was clearly an important step in being invited to apply for a lectureship at Leicester.” [Management student, completed ID5101 in 2013-14]

For more information on these modules, visit the Research postgraduates who teach page (and scroll to the bottom). You may also wish to read an article about the modules published in the journal Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education, Vol 8, No 2 (2013): ‘Postgraduates who teach: a forgotten tribe? Not here!’

Microsoft Certifications: Performance IT

Background
The Microsoft Office Specialist Certification program (or MOS for short) was introduced 2 years ago to give staff and students the opportunity to accredit their IT skills in using the Microsoft Office suite of programs. To gain a certification in a given Office program, the applicant must pass a task-based practical exam. The aim and objectives in developing this program were:

Aim: To give staff and students the opportunity to earn certification credentials to validate their desktop computer skills.
Objectives:

  • Improve the staff IT skills profile through accreditation
  • Improve staff and student IT usage and thus efficiency through engaging with the exam preparation and training resources
  • Enhance student employability through attaining a marketable credential
  • Increase staff motivation for career and personal development by offering the program at no cost to the participants

The Program
A MOS certification is obtainable in each of the MS Office applications at graduated levels, the culmination of which is the Master-level certification, which denotes fluency across a range of applications. Our statistics show not just a healthy uptake, but a healthy success rate as well:

  • Number currently registered: 170 (total registrations since the program launch: 285)
  • Number of exams delivered: 330 (111 at Expert level)
  • Number of Master Certifications achieved: 30
  • Exam pass rate: 81% (compares very favourably with the national average of 65%)

The much vaunted benefits of MOS certification in relation to the workplace as promoted by the vendors are:

  • Increased productivity
  • Increased effectiveness and initiative
  • Increased employability

But how do these attributes play out in practice with our MOS program?

Evaluation & Results
As part of our evaluation process, we survey those who complete their Master-level certification for feedback on the program and for an assessment of the impact it has had on their computer use. The survey is administered as an online form 3-6 weeks after program completion.
In addition to questions relating to their personal objectives for participating in the program, they are asked to reflect on the extent to which their MOS certification skills have impacted both their confidence and competence in their role.
Response rate: 45%

  • 83% reported improved confidence in their role
  • 75% reported improved competence in their role

Discussion
The responses from those who have completed the MOS program survey indicate that the vast majority have experienced increased confidence and competence in using IT in their role, which, given the extent to which IT underpins work output, should feed through to increased productivity. These results compare favourably with the published findings of a MOS Productivity Study conducted by the University of Utah for Microsoft (David Eccles Business School – MBA Field Study, University of Utah, 2012), which reported:

  • 82% of employees becoming MOS certified felt more confidence in their abilities as a worker
  • 88% of surveyed employees felt MOS made them more effective in their work

This correlation with productivity and effectiveness is further underscored by responses in the formative feedback sections of our MOS Master survey, where respondents were asked to comment on the most significant impact. To give some examples:

  • “I am now able to […] report on data significantly quicker (and more stylishly).”
  • “Having the confidence to use packages at an advanced level.”
  • “Improved my understanding greatly in areas I don’t use every day.”

The alleged benefits do thus appear to be borne out in practice. However, the focus of the discussion up to this point has been on the benefits derived from the acquisition and transfer of skills as an individual phenomenon. Gallivan et al. (2005, p. 179), in their study on co-workers’ influence on IT usage in the workplace, concluded that “having co-workers who are knowledgeable and confident IT users (and who hold positive attitudes toward the training they received) does positively influence an employee’s IT usage”; a passive ‘leakage’ of benefit, as it were. They even go so far as to assert that, with regard to work-groups, this influence “shapes users’ beliefs, skill levels and motivation to use IT within an organisation more effectively than does user training.” Thus the positive experience and upskilling documented in our MOS Masters can have a significant wider impact on their work environment.
This is supported both by our survey feedback (“Able to use some features of the training in everyday work and to train others”) and by other unsolicited reports received from those who had been sought out by co-workers as a direct result of the improved skills related to their Master certification. There is evidence in the literature suggesting that benefits are derived not only from the direct intervention of these ‘resident experts’, but that the degree of a co-worker’s self-efficacy also creates an environment that encourages IT usage (Gallivan et al., 2005, p. 163).
That’s productivity and effectiveness addressed; how does MOS relate to an increase in student employability? According to a study by the technology industry body, CompTIA, 86% of hiring managers indicated IT certifications are a high or medium priority during the candidate evaluation process (CompTIA, 2012). It’s not surprising then that most students seek the certifications to validate their IT skills and to differentiate themselves in the job market. Again, formative feedback gives some degree of corroboration: “Everyone assumes these days that our generation has advanced computer skills, being able to prove that easily is a huge asset.”
There is more to this statement than just affirmation of the value of MOS. It also acknowledges the presumption that students, the ‘digital natives’, having grown up with ever more pervasive and sophisticated technology, should somehow have some innate computer literacy and competency. This is certainly true when it comes to social media. However, studies indicate that this is not the case with business-related software, where there is a significant gap between students’ perceived ability and their actual efficacy (Grant et al., 2009). MOS certifications are thus an opportunity not only for accreditation but also a means whereby these shortfalls can be addressed on an individual basis.
Conclusion
MOS certifications do appear to deliver on all fronts: productivity, effectiveness and employability, with potential broader implications for IT usage in workplace environments. It could be argued that this would be the likely outcome of engagement with any concerted training program; the difference here is the increased motivation to engage created by the achievement and recognition of the accreditation.

Like all technology, MOS certifications are evolving along with the software on which they are based. The new MOS program currently being developed will deliver an exam even better designed to gauge and assess the efficient use of technology, and we will continue to monitor its development.
To find out more about the MOS program, visit the comprehensive MOS website.
References
CompTIA, 2012. State of the IT Skills Gap. [Online] Available at: http://www.comptia.org/resources/state-of-the-it-skills-gap?cid=download [Accessed 15 Oct 2014].
David Eccles Business School (MBA Field Study), University of Utah, 2012. MOS Productivity Study. [Online] Available at: ftp://ftp.certiport.com/marketing/MOS/doc/MOS-Productivity-Study.pdf [Accessed 15 Oct 2014].
Gallivan, M. J., Spitler, V. K. & Koufaris, M., 2005. Does information technology training really matter? A social information processing analysis of coworkers’ influence on IT usage in the workplace. Journal of Management Information Systems, 22(1), pp. 153-192.
Grant, D., Malloy, A. & Murphy, M., 2009. A comparison of student perceptions of their computer skills to their actual abilities. Journal of Information Technology Education, Volume 8, pp. 141-160.

Performance management training: Line managers’ experience of performance management, their attitudes towards it and the impact of training.

1. Introduction

As part of a larger project to explore performance management in the institution, CAPOD (the University’s Centre for Academic, Professional and Organisational Development) recently ran a survey of line managers to find out more about their attitudes towards performance management, their perceptions of how this is handled in the organisation, and their own experience of dealing with performance issues in their teams.

Within the data collected it is possible to compare the responses of participants who have attended relevant training in the last 2 years (CAPOD runs a number of workshops related to performance management) with those who have not. It is therefore possible to identify where there are significant differences in responses, which may indicate that attendance at performance management training has an impact not only on perceptions and attitudes around managing performance, but also on line managers’ experience of managing performance issues in their teams.

2. Survey methodology and participation

The survey was carried out using an online questionnaire. A covering letter, including a hyperlink to the questionnaire was sent out to all line managers in the institution. The survey, covering letter and survey methodology were approved by the University Teaching and Research Ethics Committee (UTREC) and complied with the required standards in terms of participation and protection of personal data.

The anonymised and aggregated survey results are not therefore attributable to individual survey participants.

The survey invitation was sent to a total of 456 people. Of this total, 107 had attended relevant training within the last 2 years and 349 were line managers who had not attended relevant training in the last 2 years.

It should be noted that the population of people who had attended training may have included some former members of staff no longer employed at the University, and also included some workshop participants who do not currently have line management responsibilities and who (as their line management status was unknown at the time of conducting the survey) were asked not to complete the questionnaire.

The total number of completed questionnaires was 127, of whom 36 had attended training and 91 had not.

This represents an overall response rate of 28%. This breaks down to 34% for people who had attended training and 26% for those who had not. As the ‘trained’ population included an unknown number of people who may have left the University or may not have been current line managers, the actual percentage response rate for ‘trained’ population and for the population overall is likely to be slightly higher than the reported rate.
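To make the arithmetic behind these figures easy to check, the short sketch below simply recomputes the quoted response rates from the invitation and completion counts above. It is purely illustrative rather than part of the survey analysis itself, and the adjustment in the final lines uses a hypothetical figure, since the true number of ‘trained’ invitees who had left the University or were not line managers is unknown.

```python
# Illustrative sketch only: recomputing the response rates quoted in the text.
invited = {"overall": 456, "trained": 107, "not_trained": 349}
completed = {"overall": 127, "trained": 36, "not_trained": 91}

def response_rate(done: int, sent: int) -> int:
    """Response rate as a whole-number percentage."""
    return round(100 * done / sent)

for group in invited:
    print(group, f"{response_rate(completed[group], invited[group])}%")
# overall 28%, trained 34%, not_trained 26%

# Hypothetical adjustment: if, say, 10 of the 107 'trained' invitees were in fact
# ineligible (had left the University or were not line managers), the effective
# trained response rate would be slightly higher, as noted above.
print(f"{response_rate(completed['trained'], invited['trained'] - 10)}%")  # 37%
```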

3. Interpretation of the results

A number of different question types were used, and this analysis only draws on those question types with responses that are quantifiable and which can be aggregated and averaged (i.e. excluding free text responses). These include questions scored on a Likert scale, questions where respondents could choose more than one option from a list, questions where they could choose just one option from a list, and simple Yes/No questions.

As not all questions were ‘required’, the sample size varies between questions. In every case the percentages presented represent the percentage of the sample group for that question who gave the specified response. Raw numbers of sample size and respondents choosing the specified response are also included.

A ‘positive response’ is defined as one where respondents selected either 4 or 5 on a 5 point scale, with 5 being the highest.
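As a purely illustrative sketch (the report does not describe the actual analysis tooling), the snippet below shows how a positive-response percentage for a single Likert-scale question could be derived under the definition above, with unanswered questions excluded because not all questions were ‘required’. The function name and the example responses are hypothetical.

```python
from typing import Optional

def positive_rate(responses: list[Optional[int]]) -> tuple[int, int, float]:
    """Return (positives, sample size, % positive) for one 5-point Likert question.

    A 'positive response' is a 4 or 5; None means the respondent skipped the
    question (it was not 'required'), so it is excluded from the sample size.
    """
    answered = [r for r in responses if r is not None]
    positives = sum(1 for r in answered if r >= 4)
    return positives, len(answered), 100 * positives / len(answered)

# Hypothetical responses to a single question from ten survey participants
example = [5, 4, 3, None, 2, 5, 4, 1, None, 4]
print(positive_rate(example))  # (5, 8, 62.5) -> "62.5% positive (5 out of 8)"
```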

4. Overall results of the survey

The overall results for all respondents are as follows:

  • 72% of respondents (91 out of 127) reported that they have experienced underperformance issues in their team in the last 2 years
  •  52% of respondents (56 out of 108) stated that they currently have unresolved performance issues in their team
  •  Only 44% of respondents (40 out of 90) felt they have been successful in dealing with poor performance in the past
  •  Respondents rated the most common types of underperformance as ‘negative attitudes’ (64% of respondents, or 61 out of 96) and ‘poor quality of work’ (58% or 56 out of 96)
  •  Respondents rated the most common causes of underperformance as ‘failure to deal with underperformance in the past’ (55% or 53 out of 96) and ‘failure to recruit the right people in the first place’ (53% or 51 out of 96)
  •  49% of all respondents (62 out of 126) feel well-equipped to deal with underperformance.
  •  59% of all respondents (73 out of 124) feel their own line manager supports them in dealing with underperformance issues
  •  35% of all respondents (44 out of 126) feel the University provides effective support for managing underperformance.
  •  46% of respondents (58 out of 126) say they are familiar with written procedures for performance management.
  •  47% of all respondents (58 out of 124) are confident that they will receive effective support for taking action under formal written procedures.
  •  35% of all respondents (44 out of 127) are confident that the University will deal effectively with poor performance when formal action is taken.

Three further questions asked respondents to rate a list of eight items in terms of how important each of them is to effective performance management, how confident they feel with each of these, and how well they feel the University supports them in each of these. The eight items are:

  • Recruiting the right people in the first place
  • Providing new staff with a well-planned induction programme
  • Setting out clear standards and expectations relating to conduct and performance
  • Setting clear and measurable targets/objectives
  • Supporting staff to develop their confidence and competence
  • Monitoring performance and providing feedback
  • Identifying and addressing underperformance at an early stage
  • Providing recognition for good performance

Respondents were most positive when rating the importance of these items to successful performance management, with positive response rates for the different items ranging between 76% and 97%.

Respondents were less confident about their own effectiveness in all listed aspects of performance management, with positive response rates for the different items ranging between 48% and 78%.

Respondents were even less positive about the effectiveness of the University in supporting them in different aspects of performance management, with positive response rates for the different items ranging between 19% and 50%.

5. Differences between trained/not-trained respondents

While the overall results are of interest, with no internal or external benchmarks against which to compare them, only general conclusions can be drawn from the responses to each question or by comparing responses between questions. For example, it is interesting to note that almost three quarters of respondents have experienced performance issues in their team in the last two years, and that more than half of them currently have unresolved performance issues in their team. Similarly, almost half of respondents (49%) feel well-equipped to deal with underperformance issues, but only 44% felt that they have been successful in dealing with performance issues. However, the data can be subjected to further analysis, which enables us to explore differences in the perceptions, attitudes and experience of respondents who have attended training and those who have not.

Based on a comparison of responses between the two groups, line managers who had attended relevant training within the last 2 years are more positive about:

  • The importance of the range of eight listed issues in relation to successful performance management.

The overall aggregated average positive response across the range of issues was 95% for trained managers and 87% for not-trained managers. Notable differences in the positive response rates for specific issues include:

o   Importance of well-planned induction (89% [32 out of 36] for trained managers against 71% [65 out of 91] for not-trained)

o   Importance of setting clear and measurable targets/objectives (94% [34 out of 36] trained: 78% [70 out of 90] not-trained)

o   Importance of monitoring performance and providing feedback (100% [36 out of 36] trained: 79% [71 out of 90] not-trained)

This indicates that line managers who have been trained are more likely than those who have not to recognise that a wide range of factors matters to successful performance management, and to appreciate the importance of each of those factors. Trained managers are therefore more likely to address these issues, or to address them more diligently, than managers who have not been trained: they are more likely to deliver well-planned inductions, set clear targets, monitor performance and provide feedback.

  • Their own confidence in addressing these issues.

The overall aggregated average positive response across the range of issues was 71% for trained managers and 66% for not-trained managers. Most notably this includes a difference in:

o   Confidence in identifying and addressing underperformance at an early stage (61% [22 out of 36] trained: 43% [38 out of 89] not-trained)

This shows that, although overall confidence levels with the range of performance management issues are lower than the levels of recognition of their importance to effective performance management, those who have been trained are more confident than those who have not.

Confidence is an important component of performance and this must therefore be considered in relation to the effectiveness of trained managers in managing performance issues.

  • The effectiveness of the University in supporting them to address these issues.

The overall aggregated average positive response across the range of issues was 46% for trained managers and 30% for not-trained managers. Notable differences in positive response rates for specific issues include:

o   Recruiting the right people in the first place (56% [20 out of 36] trained: 46% [41 out of 90] not-trained)

o   Setting clear standards and expectations (53% [19 out of 36] trained: 34% [30 out of 88] not-trained)

o   Setting clear and measurable targets (44% [16 out of 36] trained: 27% [24 out of 89] not-trained)

o   Monitoring performance and providing feedback (43% [15 out of 36] trained: 23% [20 out of 87] not-trained)

o   Identifying and addressing underperformance at an early stage (33% [12 out of 36] trained: 13% [11 out of 86] not-trained)

o   Providing recognition for good performance (28% [10 out of 36] trained: 15% [13 out of 66] not-trained)

Again, this shows that while respondents rate the support they receive in these aspects of performance management lower than either the importance of those aspects or their own confidence in dealing with them, those who have been trained rate the effectiveness of the University’s support more highly than those who have not.

  • Feeling that they have dealt effectively with performance issues in the past (57% [13 out of 23] trained: 40% [27 out of 67] not-trained)
  • Feeling well-equipped to address performance issues in their team (61% [22 out of 36] trained: 44% [40 out of 90] not-trained)
  • Being supported by their line managers in addressing performance management issues (64% [23 out of 36] trained: 57% [50 out of 88] not-trained)
  • Receiving effective support from the University in managing underperformance (53% [19 out of 36] trained: 28% [25 out of 90] not-trained)
  • Familiarity with formal written procedures for dealing with poor performance (58% [21 out of 36] trained: 41% [37 out of 90] not-trained)
  • Confidence in receiving effective support when taking action under formal written procedures (56% [20 out of 36] trained: 43% [38 out of 88] not-trained)
  • Confidence that the University will deal effectively with poor performance when formal action is taken (53% [19 out of 36] trained: 27% [25 out of 91] not-trained)

The differences indicated above show that trained line managers are markedly more likely than colleagues who have not been trained to feel that they have dealt effectively with past performance management issues and to feel well-equipped to deal with performance issues in their team.

They are also more likely to feel well supported by their own line manager and by the University. Unsurprisingly, since written procedures are covered during training, they rate themselves as more familiar with those procedures, but they also have higher levels of confidence that they will be supported in using them and that the University will deal effectively with formal action.

Most strikingly, analysis of the data reveals one further key difference between those who have attended training and those who have not:

Line managers who have not been trained in the last 2 years are more likely to:

  • Have unresolved performance issues in their team (58% [42 out of 72] not-trained: 39% [14 out of 36] trained)

This suggests that line managers who have attended relevant training are more effective at resolving (or preventing) performance management issues and are therefore more likely to have higher-performing teams.

6. Conclusions

The results show, across the whole range of questions on perceptions, attitudes and experiences of performance management, that those who have been trained are more likely to appreciate how performance is affected by a variety of issues; they are more positive about their own knowledge and confidence in dealing with performance issues; they feel better supported in managing performance; and they are more confident that the University will support them and will deal with performance issues effectively.

Ultimately those who have been trained feel that they are more effective at managing performance and are less likely to have unresolved performance issues in their team.

This strongly suggests that attendance at relevant training has a positive impact on managers’ skills, knowledge and attitudes, and on their effectiveness in dealing with performance management issues.


Managing the sands of time…how are we doing?

Since the inception of CAPOD in May 2011, our Time Management workshop has been attended by over 180 people across the University. It has proven to be one of our most popular workshops, and anecdotally the feedback has been largely positive. However – what happens after the workshop? Do our workshop participants behave differently? Are they able to apply what they have learnt? In August 2014, we took the opportunity to find out. We contacted everyone who had attended our Time Management workshop over the previous three academic years and asked them to help us evaluate its impact by completing an online questionnaire. Of the 183 people invited to participate, 47 responded (a response rate of just over 25%).

Here is what they told us…

70% of respondents agreed or strongly agreed that the workshop helped to shape their current system or approach to time management at work, with 13% disagreeing or strongly disagreeing.

72% of respondents agreed or strongly agreed that they are more aware of their psychological preferences, with regard to time management, and are able to use this awareness to their advantage. 9% disagreed or strongly disagreed.

79% of respondents agreed or strongly agreed that they are able to more effectively prioritise and manage their workload and stay focused on high priority work. 11% disagreed or strongly disagreed.

When asked about their competence and confidence in using and managing their task list, diary / calendar, and email inbox – these were the responses:

TASK LIST – 68% felt more competent, 72% felt more confident

DIARY / CALENDAR – 74% felt more competent, 79% felt more confident

EMAIL INBOX – 77% felt more competent, 74% felt more confident

68% of our respondents agreed or strongly agreed that they felt better able to manage personal behavioural aspects related to time management (e.g. reducing procrastination, avoiding multi-tasking), with 15% disagreeing or strongly disagreeing.

49% of our respondents agreed or strongly agreed that they are able to manage stress at work more effectively, with 15% disagreeing or strongly disagreeing.

Overall, the results are encouraging: most questions garnered at least a 70% positive response (agree or strongly agree), and negative responses (disagree or strongly disagree) came in at 15% or lower in all cases. On closer examination of our lowest positive score (being able to manage stress at work more effectively), it is worth noting that this score is at its highest in our most recent cohort (those who attended in AY2013-14): 67% agreed or strongly agreed that they are able to manage stress at work more effectively, with none disagreeing or strongly disagreeing. This exceeds the scores from earlier cohorts, and we hope the upward trend will continue.

To finish, we would like to thank all our respondents for this useful feedback on our Time Management workshop. Here are some selected quotes on specific actions that have been taken, or specific changes that have been made, by our respondents:

“I prioritise my tasks for the day and week daily first thing and use my diary for planning tasks much more efficiently.”

“I am no longer fire fighting. I have a to do list with URGENT along with a daily and weekly list to help with prioritization.”

“Use more of the functions to organise my inbox using Microsoft Outlook. When working on something that requires concentration – turning off my email to prevent distraction.”

“I use my calendar more for time management and designating time for specific tasks.”

“I was happily surprised by all of the practical tips and helpful methods that I’ve since been able to apply to my work schedule.”

“It gave me more structure and increased my confidence in making decisions and dealing with staff.”


A masterful focus

As this blog has illustrated, CAPOD’s impact emanates from a wide range of sources: workshops, coaching, 1:1s, funding opportunities, internal reviews, mentoring, and so on.

Another mechanism that can lead to impactful change is the simple focus group. Back at the start of semester 1, CAPOD was involved in running a series of focus groups to dig deeper into the initial experiences of postgraduate taught students (PGTs). The PGT Pro-Dean, the Postgraduate Support Adviser from Student Services, and the Head of Organisational Development from CAPOD worked together to facilitate the sessions and listen to PGT students.


The focus groups centred on four main topics: arrival and first impressions, the co-curricular and extra-curricular experience, the academic experience, and skills/career development. The students involved recounted a range of experiences, from a desire for more specific Freshers’ Week events (“It would have been great to have a Freshers’ week for postgrads. I would love to have had more events where it wasn’t just 17 and 18 year olds”), to wanting to be part of the College of St Leonard (“we talked about this at the PG Society meeting, and we’d like to be part of St Leonard’s college. It seems that it’s just an administrative division”), to surprise about academic courses (“I was very surprised when I saw we only had 10 hours class a week. I expected more”).

The impact of the focus groups was that the PGT Pro-Dean, CAPOD and Student Services were able to collaborate, and to involve other colleagues, in working to improve the student experience for PGTs. Tangible improvements that happened subsequently included:

  • working with Registry, Accommodation Services and IT Services to improve the experience for arriving Masters students (Student Services will be providing a specific Post-grad programme in Orientation Week next year, and it will be made clearer that Masters students are welcome to join all student societies)
  • making taught postgraduate students part of The College of St Leonard
  • improving student expectations about Masters study and the amount of contact hours
  • improving and concentrating the MSkills development programme
  • removal of the ‘13.5 rule’ with immediate effect
  • the creation of additional resources to help Masters students transition from UG study and on to PhD study

A second series of focus groups for PGTs will be held next week to take a snapshot of their experiences at the end of the taught element of their degree. Again, their feedback will be used to review and further improve the PGT experience for the future.
