The Royal College of Physicians and Surgeons of Canada (RCPSC) has written a series of white papers1 on a diverse group of topics. They were released last year, and the College would now like feedback from its members. It was with interest (not much, though) that I read the white paper on the assessment of competence. This should be the core of what the College does for physicians: assessing training and maintaining competence to an accepted standard.
However, the white paper seems more pertinent to what I feel is currently wrong with surgical training in this country. The RCPSC writes, “In-training assessment should become the dominant method of determining competence, with formative assessment taking priority over summative assessment.” This is a laudable statement. Unfortunately, in practice, evaluators cannot make it a reality. Surgical residents are sometimes evaluated in their specific specialty or subspecialty only after 1–2 years of core training. The residents on our service get fairly immediate feedback and are judged according to the current perception of CanMEDS and the realistic performance criteria needed for surgical practice. This may go against the RCPSC’s desire to stop grading performance, but if a resident cannot pin a hip by their fifth year of training, they really need a couple of remedial training years. If, by our evaluation, a resident is judged to have failed their rotation, they are summoned by the training director and may be placed on probation, depending on their history and any extenuating circumstances. The resident eventually returns to our service for re-evaluation. If they fail again, the process starts over. The resident can then appeal any decision, and they often do so with a lawyer. Furthermore, by this stage, the resident is into his/her senior years. The perception in our program, and apparently in some others, is that you cannot fail a senior resident, and he/she is allowed to go forward. The resident will sometimes have a lawyer write to the university so that they can sit their RCPSC exams, despite failing or receiving borderline evaluations.
You might think the In-Training Evaluation Report (ITER) would stop this from happening. Even the RCPSC writes that “while theoretically sound, the operational deployment of the [ITER] is seriously flawed in that it is rarely populated with reliable or objective data, allows faculty to focus on restricted performance domains, and is often completed long after the training experience has ended.” But the RCPSC misses the point of why the ITER has failed. More than once during my time as an examiner, an ITER stating that the resident should not be writing the exam turned up only after the candidate had already passed the written and oral portions. Apparently, the ITER carries no weight, despite containing the most meaningful information.
The RCPSC writes that “assessment is heavily weighted toward the Medical Expert role, while relatively little assessment is based on well-documented supervision or observation in a real working environment that concentrates on the more global skills that define competence.” Perhaps this statement is a little out of touch with what occurs in surgery. Staff surgeons at our centre are in clinic with the residents and stand side by side with them in the operating room. Residents are evaluated largely through personal contact, so I am not sure we need to reinvent the wheel for surgical evaluations. More to the point, the surgeons in our centre (and not just the dinosaurs) feel that residents are no longer trained or evaluated appropriately. Shouldn’t that be important?
The RCPSC would like to move toward regular assessment examinations. Is it really exams that will make residents better doctors and surgeons? The trend is certainly moving toward small, module-based training and examination, but how far do we need to let the pendulum swing? Residents spend more and more time away from the ward and the operating room as they gravitate to the classroom in an 8-to-4 cycle. If you are not present for the day-to-day evaluation of patients and their complications, the first 5 years of practice may become too much to handle. I think the problem with current evaluation is that the tools in the hands of the staff are too restrictive. There is no way to identify a resident who needs to switch specialties, too little time is spent in surgery and clinic in the current training programs, and there is almost no recourse but to let inadequate residents take their exams. For me, the RCPSC has lost its way. I hope, for the next generation’s sake, that it finds its way back to its real mission: training competent residents.
Footnotes
Competing interests: None declared.