system itself is located on a secure site at <www.elogbook.org>. As well as giving considerable benefits by automating tedious aspects of recording PBA, the online version offers the opportunity to gather information in real time and capture it, so that trainees will submit a realistic record of their progression rather than simply retaining those PBAs they deem ‘their best’ – which is counter to the core values of the system. Naturally, electronic data permit one to contrast and compare data from different training programmes and differing contexts of training so that, hopefully, an evaluation may be made of learning in surgical training.
International Compatibility
Considerable interest from overseas in the orthopaedic curriculum, and in particular in the PBA tools, has led to a number of proposed international pilot projects. International compatibility of surgical training systems is a key issue in making it possible for trainees to complete part of their training overseas but, at a wider level, may have considerable consequences for the mobility of surgical labour. The PBA tool may offer a way of ensuring that widely differing training systems are producing compatible surgical skill sets.
NOTSS
It is hoped that we will, in the near future, have the opportunity to combine the progress made in both the Non-Technical Skills for Surgeons (NOTSS) project (see Chapter 2 in this volume) and in PBAs, either by producing a new assessment tool based on the PBA or by integrating behavioural markers from NOTSS into the existing PBA.
PowerPoint Guidance
For PBA, as for all elements of the orthopaedic curriculum, we have produced PowerPoint guides, available through the website. The use of this technology, in preference to a user manual, enables a trainer and trainee to sit together and review the guidance, and also allows a programme director to present the guide in a group setting. These guides have been developed for all PBA applications to date and will be added to as work continues.
Conclusion
PBAs have been an attempt to maintain and improve the high quality of surgical training in the UK. Their development is still in its early stages compared to other, more established and practised assessment methods. We will have to monitor their progress for some time before we will be able to see whether, in the midst of many other changes, they have been successful.
References

Boardman, D., Pitts, D. and Edge, J. (2008) The Orthopaedic Curriculum and Assessment Project: A National Survey of SpR Views Two Years after Introduction. Poster presented at the British Orthopaedic Association Annual Congress, Liverpool, September 2008.

Department of Health (1993) Hospital Doctors: Training for the Future. The Report of the Working Group on Specialist Medical Training (the Calman Report). London: Department of Health.

Department of Health (2007) Trust, Assurance and Safety – The Regulation of Health Professionals in the 21st Century. CM 7013. London: Department of Health.

Donaldson, L. (2002) Unfinished Business – Proposals for Reform of the Senior House Officer Grade. A Paper for Consultation. London: Department of Health.

Eraut, M. (1994) Developing Professional Knowledge and Competence. London: Falmer Press.

ISCP (2008) Intercollegiate Surgical Curriculum Programme. Available at: <http://www.iscp.ac.uk> [accessed June 2008].

Kennedy, I. (2001) Bristol Royal Infirmary Inquiry. Available at: <http://www.bristol-inquiry.org.uk/final_report/index.htm> [last accessed October 2008].

Langrish, J., Gibbons, M., Evans, W.G. and Jevons, F.R. (1972) Linear models of innovation, in J. Langrish (ed.) Wealth from Knowledge: Studies of Innovation in Industry. London: Macmillan.

Machiavelli, N. (1515) The Prince, trans. 1908 by W.K. Marriott. Available at: <http://www.constitution.org/mac/prince.txt> [last accessed October 2008].

OCAP Online (2008) Orthopaedic Curriculum. Available at: <http://www.ocap.org.uk/orthocurriculum/Content/01_Intro_160707.pdf>.

Oliver, C.W., Ross, E.R.S., Hollis, S. and Pitts, D. (1997) Impact of distance learning material on trauma surgeons. Injury 28(3), 245.

Pitts, D. and Rowley, D.I. (2005) Establishing Consensus on PBA: Workshops for SAC Chairs. Unpublished internal report for the OCAP steering group.

Pitts, D., Rowley, D.I. and Sher, J.L. (2005) Assessment of performance in orthopaedic training. Journal of Bone and Joint Surgery (British) 87-B(9), 1187–91.

Pitts, D. and Ross, E.R.S. (2002) A competence assessment tool for the Dynamic Hip Screw, in D.I. Rowley, D. Pitts and C. Galasko, Competence Working Party Report to the JCHST. London: Joint Committee on Higher Surgical Training.

Pitts, D., Rowley, D.I., Marx, C., Sher, J.L., Banks, A.J. and Murray, A. (2007) Specialist Training in Trauma and Orthopaedics – A Competency Based Curriculum 2007. Available at: <http://www.ocap.org.uk/curriculum> [last accessed October 2008].

Richards, R. (1997) Clinical Academic Careers: Report of an Independent Task Force Chaired by Sir Rex Richards. Available at: <http://www.rcgp.org.uk/docs/ISS_SUMM97_14.DOC> [accessed November 2008].

Rowley, D.I., Pitts, D. and Galasko, C. (2002) Competence Working Party Report to the JCHST. London: Joint Committee on Higher Surgical Training.

Smith, J. (2005) The Shipman Inquiry. Available at: <http://www.the-shipman-inquiry.org.uk> [accessed June 2008].

Thornton, M., Donlon, M. and Beard, J.D. (2003) The operative skills of higher surgical trainees: Measuring competence achieved rather than experience undertaken. Royal College of Surgeons of England Bulletin, 85, 190–3.
Implementing the Assessment of Surgical Skills and Non-Technical Behaviours in the Operating Room

Joy Marriott, Helen Purdie, Jim Crossley and Jonathan Beard
Introduction to the Study
The Sheffield Surgical Skills Study is currently evaluating the validity, reliability, feasibility and acceptability of three different workplace-based assessment tools for rating surgeons’ technical and non-technical skills in the operating room. This chapter describes the design, methodology and implementation of the study. It focuses on the problem-solving approach taken by the research team to address the practical issues of implementing this broad study of behaviours, drawing upon some of the successes and barriers we encountered to illustrate this. It is intended to provide valuable lessons for researchers in the field of surgical skills assessment, and for those involved in implementing workplace-based assessment into surgical training.
Background to Surgical Skills Assessment
Traditionally, surgical training in the UK has been based upon an apprenticeship and examination model without formal assessment of technical or non-technical skills. Trainees undertook a set number of years of training and passed the Intercollegiate Examination of the Royal Colleges of Surgeons (FRCS) to achieve their Certificate of Completion of Specialist Training (CCST) for consultant practice. Progress in surgical competence was historically achieved through many years and long hours spent in the operating room. Although log books formed a useful record of surgical experience (Galasko and Mackay 1997), they did not provide evidence of competence (Thornton et al. 2003). However, opportunities to gain experience in the operating room have decreased due to shorter training time following the Calman Report (Calman 1999) and the changes in working practices following the European Directive on Hours of Work (Department of Health 2003). This has resulted in trainees having reduced access to surgical experience before their CCST (Katory et al. 2001).
Over the last 15 years there has been a move to competency-based surgical curricula in the UK, driven by the introduction of regulations for training by the Postgraduate Medical Education and Training Board (PMETB). The transitions in surgical training have been described previously by Pitts and Rowley in Chapter 3 of this book.
Background to Surgical Skill Assessment Tools
The surgical skill assessment methods developed by the GMC Performance Procedures (Beard et al. 2005b), and by the medical royal colleges and specialty associations responsible for postgraduate surgical training, are based upon the demonstration of surgical competencies and standards of competence. The need for robust methods of assessment for technical and non-technical surgical skills is axiomatic, as they underpin the competency-based assessment strategy and curricula for all UK surgical specialties.
Procedure-Based Assessment (PBA) and Objective Structured Assessment of Technical Skill (OSATS) are two of the tools being considered in this study. They are the current workplace-based assessment tools being used by UK royal colleges and specialty associations for assessing the surgical competence of trainees and for informing objective feedback. The overall assessment strategies and individual assessment tools they have adopted conform to the assessment principles laid down by the Postgraduate Medical Education and Training Board (PMETB 2008), and the assessment tools are also designed to measure all the domains of Good Medical Practice (General Medical Council 1998).
PBAs are embedded within the Orthopaedic Curriculum and Assessment Project (OCAP – <www.ocap.org.uk>) and the Intercollegiate Surgical Curriculum Programme (ISCP – <www.iscp.ac.uk>). The development of the PBA, with examples of the assessment tool, is covered by Pitts and Rowley in Chapter 3. PBAs have been used by OCAP since 2005, and were introduced into the surgical specialty curricula by ISCP in August 2007. Therefore, this study is taking place alongside the implementation of PBAs for trainees who are required to register onto the ISCP curriculum.
Objective Structured Assessment of Technical Skill (OSATS) was introduced by the Royal College of Obstetricians and Gynaecologists (RCOG – <www.rcog.org.uk>) as a requirement of their New Training and Education Programme, launched in parallel with ISCP in August 2007. The OSATS tool was developed by Reznick’s group in Toronto (Winckel et al. 1994, Martin et al. 1997).
Ensuring that our assessment methods are valid, reliable and feasible is the principal consideration of a well-designed and evaluated assessment system (Van der Vleuten 1996). Evidence of validity and reliability are essential characteristics of fair and defensible assessments, particularly in identifying under-performing surgeons who could compromise patient safety (Schuwirth et al. 2002). The observation of real-time surgical performance in the workplace is essential in the authentic assessment of competence. Direct observation of skills and behaviours in the operating theatre has good authenticity for assessing surgical competence, since this method approximates to the ‘real world’ as closely as possible. In addition, the feasibility and acceptability of such assessments will influence the successful implementation of competency-based assessment, which is a key consideration for stakeholders with a responsibility for postgraduate surgical training.
Preliminary validation studies on PBA have been performed by Rowley and Pitts (see Chapter 3). Our study seeks to examine further the validity, and evaluate the reliability, of the PBA tool. OSATS has demonstrated inter-rater reliability and construct validity in assessing general surgeons performing common operations (Winckel et al. 1994). However, validity and reliability studies have not been performed for the ten OSATS of obstetrics and gynaecology procedures used by the RCOG.
The third tool considered in this study is the Non-Technical Skills for Surgeons (NOTSS) tool (Yule et al. 2008), described in Chapter 2. This tool is not currently used in a formal way for training in the UK. However, there is increasing recognition of the need for training and assessment in non-technical skills because of the importance of these skills for patient safety.
Purpose of the Surgical Skills Study
The aim is to evaluate the validity, reliability, feasibility and acceptability of three different methods of rating the technical and non-technical skills of trainee surgeons in the operating room, across a range of different procedures and surgical specialties.
The three tools under evaluation in the study are:
•	PBA: Procedure-Based Assessment;
•	OSATS: Objective Structured Assessment of Technical Skill;
•	NOTSS: Non-Technical Skills for Surgeons.
The PBA forms for index procedures used by each UK surgical specialty can be downloaded from the ISCP (<www.iscp.ac.uk>) and OCAP (<www.ocap.org.uk>) websites. The OSATS forms used by the RCOG can be downloaded from <www.rcog.org.uk/resources/public/pdf/section6_at.pdf>. The NOTSS rating form and booklet are available from <www.abdn.ac.uk/iprc/notss>.
Design and Methodology
Timescale
The study commenced in April 2007 at a large UK teaching hospital NHS foundation trust and is due to be completed in June 2009.
Sample Size
Our intention is to perform between 400 and 500 assessments of surgical procedures. The first case was assessed in June 2007, and to date we have completed 240 cases. Reliability estimates become more dependable as the evaluation includes more cases, assessors and trainees. However, there is no accepted equivalent of a power calculation to guide sample sizes.
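Generalizability theory does, however, allow a ‘decision study’: variance components estimated from the data collected so far can be used to project how dependable scores would be under alternative sampling designs. The sketch below is a minimal illustration in Python, assuming made-up variance components and a simplified crossed trainee-by-assessor-by-case design; the function and values are illustrative assumptions, not figures from this study.

```python
# Minimal decision-study (D-study) sketch with hypothetical variance
# components; a simplified crossed trainee x assessor x case design is assumed.

def projected_g(var_trainee, var_assessor, var_case, var_residual,
                n_assessors, n_cases):
    """Project the G coefficient when each trainee is rated on n_cases
    procedures by n_assessors assessors per case."""
    error = (var_assessor / n_assessors
             + var_case / n_cases
             + var_residual / (n_assessors * n_cases))
    return var_trainee / (var_trainee + error)

# Hypothetical variance components (for illustration only)
components = dict(var_trainee=0.40, var_assessor=0.15,
                  var_case=0.20, var_residual=0.50)

for n_cases in (2, 4, 6, 8):
    g = projected_g(n_assessors=1, n_cases=n_cases, **components)
    print(f"{n_cases} cases, 1 assessor per case: projected G = {g:.2f}")
```

Projections of this kind are only as trustworthy as the variance-component estimates behind them, which is why the estimates become more dependable as more cases, assessors and trainees are included.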
Participants
We are assessing trainee surgeons using the tools for those cases which have the informed consent of the patient. The assessments on individual trainees are performed with as little delay as possible to avoid the confounding effect of training.
Procedures
We are assessing a total of 15 index procedures within six surgical specialties (see Table 4.1). Each case is judged for complexity by the supervising consultant.
Observation and Assessment
Within each specialty, the aim is to assess each trainee performing at least two cases of each relevant index procedure. Assessments of their technical and non-technical skills are undertaken across the cases by as many supervising consultants (one for each case) and independent assessors (up to three in one case) as is practicable.

Table 4.1 Index procedures within the surgical specialties

Upper Gastrointestinal: laparoscopic cholecystectomy; open hernia repair
Orthopaedics: primary hip replacement; primary knee replacement
Obstetrics & Gynaecology: elective Caesarean section; urgent Caesarean section; diagnostic laparoscopy; surgical evacuation of uterus
Vascular: carotid endarterectomy; abdominal aortic aneurysm repair
—: open anterior resection
Cardiac: coronary artery bypass grafting; aortic valve replacement
Methods of Observation
1. Direct observation by assessors in the operating room.
2. Video observation.
We are currently filming approximately 20 per cent of the cases using a picture-in-picture technique which records both the operating field and the operating room. Filming is performed by medical illustration technicians, with audio provided by microphones fitted to the trainee surgeon and supervising consultant. During the consent process, patients have the option to decline videoing and to consent only to the direct observation of their operation. We will be able to compare the fidelity and reliability of video observation with direct observation. The videos will also provide rich data on the non-technical skills of trainee surgeons in the operating room for collaborative work with the NOTSS team.
Process of Study Implementation and Assessments
The implementation of the study within a surgical specialty is illustrated by the flowchart in Figure 4.1.

Figure 4.1 Flowchart of the study implementation
Progress to Date
The original proposal was to recruit 400–500 surgical cases from three teaching hospitals, but it soon became clear that this would be logistically impossible without dedicated research staff at each hospital trust. We have therefore recruited from a single teaching hospital NHS trust, comprising two hospital sites, with an independent assessor based at each hospital. At the time of writing (June 2008), we have completed 240 cases in five surgical specialties, with a further 11 months of study time for recruitment. Provided recruitment continues at the same pace, we will be on target to complete 400 to 500 assessments.
Relating the Study Design to the Research Aim
The study aim encompasses several research questions. We have outlined the main questions below, showing how they have driven the overall study design, and provided examples of how we have addressed them within the study. Our research questions take into account the assessment characteristics proposed by Van der Vleuten (1996) in his model of assessment utility.
Are the Tools Valid?
Validity can be described in a number of ways depending on the context of the assessment. For us, it refers to evidence presented to support or refute the interpretation of assessment scores, i.e., the degree to which the scores of the assessment reflect the intention of the assessment. In the case of the assessment tools included in this study, the intention is for the assessment scores to reflect the technical and non-technical surgical competence of the trainee being assessed. Validity requires multiple sources of evidence to allow a meaningful interpretation of assessment scores (Downing 2003). Our study design provides many sources of validity evidence, and these will all be used to support or refute the validity of the three assessment tools. As one example, if the assessment tools are valid for the assessment of surgical competence, we would expect scores to increase with the trainee’s level of training and experience. We have ensured that the study includes all grades of trainees and that our demographic questionnaires include questions addressing years of surgical experience and the number of index procedures previously performed by the trainee.
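One simple way to examine that expectation, once the scores are collated, is a rank correlation between level of training and assessment score. The sketch below is illustrative only; the file and column names (assessments.csv, training_year, pba_score) are assumptions for the example rather than the study’s actual dataset or variables.

```python
# Illustrative construct-validity check: do assessment scores rise with
# level of training? File and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

scores = pd.read_csv("assessments.csv")
rho, p_value = spearmanr(scores["training_year"], scores["pba_score"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```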
Are the Tools Reliable?
Reliability refers to the reproducibility of assessment scores. Indicators of test score precision (e.g., the Standard Error of Measurement) and indicators of reliability (e.g., the G coefficient) are both based upon estimates of measurement error.

Reliability within this study is a measure of how well an assessor’s score of the surgical competence of a particular trainee would reflect any assessor’s score when the trainee carried out the operation on any patient. To be able to generalize the construct of ‘surgical competence’ to all of its possible measurements requires that all sources of error (termed ‘variability’) are quantified. Its calculation therefore depends on comparing the effect of assessor-to-assessor variability and case-to-case variability in scores with overall trainee-to-trainee variability in scores. The use of generalizability theory for the analysis of assessment scores within the study is fundamental in providing the most elegant estimates of assessor variability and case variability, which represent the greatest threats to the reliability of real-time assessments in the workplace (Downing 2004).
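For readers unfamiliar with these indices, the generic variance-components expressions are sketched below; this is the standard generalizability-theory form for a design with n_a assessors and n_c cases per trainee, not the specific model fitted in this study.

```latex
% Generic generalizability-theory indices (standard form, not study-specific)
\[
G \;=\; \frac{\sigma^2_{\mathrm{trainee}}}{\sigma^2_{\mathrm{trainee}} + \sigma^2_{\mathrm{error}}},
\qquad
\sigma^2_{\mathrm{error}} \;=\;
  \frac{\sigma^2_{\mathrm{assessor}}}{n_a}
+ \frac{\sigma^2_{\mathrm{case}}}{n_c}
+ \frac{\sigma^2_{\mathrm{residual}}}{n_a\,n_c},
\qquad
\mathrm{SEM} \;=\; \sqrt{\sigma^2_{\mathrm{error}}}
\]
```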
Within each surgical specialty, we have aimed to assess each trainee performing two cases of each relevant index procedure, providing four to eight assessments overall for each trainee. Assessing a particular trainee performing several index procedures of varying complexity with different assessors provides a broad sample of observations for assessing surgical skill. Assessment scores from observations on a number of occasions by different assessors provide the most dependable reliability data (Crossley et al. 2002).
Are the Assessment Tools Feasible in Practice?
Feasibility governs the likelihood of implementing an assessment method. There are a number of strands to consider within the scope of assessment feasibility, including the time and resources required for implementation, as well as the cost-effectiveness of the assessment strategy.