Those of us in anesthesiology teaching programs supervise resident physicians. How well do we do? Dr. Franklin Dexter, Division of Management Consulting, Department of Anesthesia, University of Iowa, Iowa City, Iowa, and colleagues from the Department of Anesthesiology, Northwestern University, Chicago, Illinois, used a supervision scale that evaluates the quality of supervision of anesthesiology residents to survey 1500 residents randomly selected from the American Society of Anesthesiologists’ directory of anesthesia trainees. The results of this study are discussed in the article titled “Reliability and Validity of Assessing Subspecialty Level of Faculty Anesthesiologists’ Supervision of Anesthesiology Residents,” published in this month’s issue of Anesthesia & Analgesia.
This paper is one of several that present the results of a comprehensive survey of residents and fellows in anesthesiology on issues of staff supervision, career plans, scientific preparation, and burnout. In this article, the authors explore several questions regarding staff supervision. The survey used a previously validated (references 1 and 2) 9-item questionnaire asking the residents to rate their supervision by attending staff on a 4-point Likert scale. Residents themselves feel that adequate supervision is achieved with scores of 3 or higher (the attending “frequently” or “always” exhibits the desired behavior). Six hundred and fifty-six residents responded to the survey in whole or in part. The authors found that assessment scores for attendings were internally consistent and did not vary by resident age, gender, hours worked per week, program size, or current subspecialty rotation. They did find a positive correlation between higher scores and a better “safety culture” in the institution, and a negative correlation between supervision scores and self-reported burnout among the residents. The authors have shown that we have a reliable and valid way to evaluate a program’s performance in providing supervision in a clinical context.
Overall, the findings are consistent with what we think residents want from their training program. The domains of attending supervision explored in this study include attending availability for help and consultation, provision of constructive criticism, demonstration of professional behavior, and support of resident autonomy. Further, the authors found that burnout and self-reported medical errors were associated with low supervision scores. In sum, this study provides a useful overview of an important component of resident education, namely supervision during real clinical care. Rigorous assessment of attending supervision performance is an important component of continuous improvement in anesthesia teaching and clinical care.
This issue of Anesthesia & Analgesia has two other articles that use the same assessment scale. The first companion paper shows that a program’s overall score is dragged down by individual faculty members with low scores. The second companion paper shows that the same instrument is reliable and valid when individual anesthesiologists are evaluated by CRNAs. As Getúlio R. de Oliveira Filho, the instrument’s author, notes in his accompanying editorial (Departmental Evaluations by Residents: Widening the Scope of the Faculty Supervision Evaluation), “the metrics … can help prospective applicants avoid programs with poor supervision, and presumably poor teaching, and direct their attention to training programs with the best supervision.”