
The University of Iowa Task Force on Competencies in Ophthalmology

Acknowledgement: This work was supported in part by an unrestricted grant from Research to Prevent Blindness, Inc., New York, N.Y., and an unrestricted educational grant from Alcon Laboratories (Fort Worth, Texas) to the University of Iowa Department of Ophthalmology.

The current active faculty members of the University of Iowa Department of Ophthalmology Task Force for Meeting the ACGME Competencies are:

Michael Abramoff, MD, PhD
H. Culver Boldt, MD
Keith D. Carter, MD
Emily Greenlee, MD
A. Tim Johnson, MD, PhD
Andrew Lee, MD (adjunct)
Thomas Oetting, MD
Richard Olson, MD

The guiding principles used by the Task Force

These principles follow the recommendations from the ACGME website (http://www.acgme.org):

  • Multiple assessments and comparative validity. The assessment process will require multiple assessments and multiple tools. The traditional global "end of rotation" faculty evaluation form was deemed insufficient on its own to measure the new competencies, and asking faculty members to grade all seven competencies at once using a single evaluation form was judged too onerous, too time-consuming, and too unreliable to assess the desired outcomes. The Task Force therefore recommended that each competency be assessed by at least four different tools, so that the tools can be compared directly to one another to determine whether they yield similar qualitative results (comparative validity).
  • Face validity. Each of the tools selected for each competency should have consensus external or face validity. For example, direct observation was believed to be the most reliable tool for measurement of surgical competence.
  • Concurrent validity. Each competency had to be measured by at least two tools and on at least three occasions during the same period of training. In this way, an adequate sample size can be obtained to ensure inter-rater and intra-rater reliability and to ensure reproducibility of results. The outcomes of each tool at each year of residency (e.g., post-graduate year (PGY) 2 compared to PGY 3 and PGY 4) should be assessed and compared for concurrent and discriminative validity. That is, separate tools assessing the same competency at the same time in training should produce comparable results (concurrent validity), and performance on the same tool at higher levels of training should improve over time (discriminative validity; i.e., PGY 4 should perform better than PGY 2 on the same assessment). A sketch of both checks follows this list.
  • Practicality. The tools should be applicable in the clinical or operating room setting (feasible) and easily used (convenient) during existing teaching opportunities (teach and assess in the same encounter).
  • Time commitment. All tools should be easy to use, require only a modest time commitment, and be inexpensive to administer and document. At the University of Iowa there are 5 residents per year (15 total) and 25 full-time faculty members. Assuming 12 assessments per resident per year (2 assessments with each of 6 tools), the program generates 12 x 15 = 180 assessments per year; if each assessment takes an hour or less on average and the work is spread across the 25 faculty members, the load is 180/25 = 7.2 hours per faculty member per year (this arithmetic is reproduced in the workload sketch after this list). The program director would obviously have an additional time commitment for maintaining the documentation, reviewing the results, and providing feedback to the faculty and residents on performance.
  • Budget. The tools should be inexpensive to develop and to implement, and have a specific annual budget. The committee recognizes that the national ACGME mandate is completely unfunded.
  • Outcomes data. The tools should have the ability to produce quantitative data for possible linkage to educational outcome assessment in the future.
  • Fairness. The tools should be directly linked to explicit and openly published learning objectives, the grading scale should be open and clearly defined, and the process should be deemed fair by faculty and residents.
  • Linkage to curriculum objectives. The residency curriculum will be reviewed and the methodology modeled on existing guidelines (http://www.icoph.org) from the International Council of Ophthalmology (ICO) Task Force on Resident Education. The curriculum objectives for each rotation should be aligned with, and their outcomes directly linked to, the competency assessments.
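
As an illustration of the concurrent and discriminative validity checks described above, here is a minimal sketch, assuming assessment scores are recorded as simple (resident, PGY year, tool, score) records; the tool names and all scores are hypothetical sample data, not actual program results.

    # Minimal sketch of the two validity checks described above. The records
    # (resident, PGY year, tool, score) and all scores are hypothetical
    # sample data, not actual program results. Requires Python 3.10+ for
    # statistics.correlation.
    from statistics import correlation, mean

    records = [
        ("r1", 2, "oral_exam", 62), ("r1", 2, "direct_obs", 60),
        ("r2", 3, "oral_exam", 74), ("r2", 3, "direct_obs", 71),
        ("r3", 4, "oral_exam", 85), ("r3", 4, "direct_obs", 88),
    ]

    # Concurrent validity: two different tools assessing the same competency
    # during the same training period should produce comparable results,
    # i.e., a high correlation across residents.
    oral = [s for _, _, tool, s in records if tool == "oral_exam"]
    obs = [s for _, _, tool, s in records if tool == "direct_obs"]
    print("concurrent validity (Pearson r):", round(correlation(oral, obs), 2))

    # Discriminative validity: the same tool applied at higher levels of
    # training should show improvement (PGY 4 mean > PGY 3 mean > PGY 2 mean).
    for pgy in (2, 3, 4):
        scores = [s for _, year, tool, s in records
                  if tool == "oral_exam" and year == pgy]
        print(f"PGY {pgy} oral exam mean: {mean(scores)}")

In practice, these comparisons would be computed over many residents and at least three occasions per training period, as the principle above requires, but the aggregate logic is the same.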
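
The time-commitment arithmetic can likewise be reproduced directly. This is a back-of-the-envelope sketch using only the figures stated in the "Time commitment" principle; the function name and defaults are illustrative, not part of any published tool.

    # Back-of-the-envelope estimate of the per-faculty assessment workload,
    # using the figures stated in the "Time commitment" principle above.
    # The function name and defaults are illustrative.
    def faculty_hours_per_year(residents=15, assessments_per_resident=12,
                               hours_per_assessment=1.0, faculty=25):
        """Average annual assessment hours per faculty member."""
        total_assessments = residents * assessments_per_resident  # 15 * 12 = 180
        total_hours = total_assessments * hours_per_assessment    # 180 hours
        return total_hours / faculty                              # 180 / 25 = 7.2

    print(faculty_hours_per_year())  # 7.2 hours per faculty member per year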

The Task Force’s phased plan for meeting the mandate and recommended matrix matching specific tools to specific competencies (Table 1)

Six pilot tools were selected for implementation: 1) written and oral exams; 2) a 360-degree global evaluation form (an evaluation that uses multiple observers from different perspectives, including nurses, technicians, fellow residents, and patients, to provide a wider assessment); 3) a resident portfolio; 4) direct observation of operative performance and clinical exams; 5) a phone encounter tool; and 6) a journal club tool.

  • Phase 1: Short-term response to ACGME requirements (current)

The Department curriculum, based in part on the ICO curriculum described above, was revised with specific written objectives stratified by year and by subspecialty rotation. The learning objectives should include demonstration of improvement over time in the competencies. The teaching, learning, and assessment of the competencies should be integrated into the didactic and clinical educational experiences as needed to ensure learning opportunities.

The Department will develop an internal operational definition of "substantial compliance" in conjunction with the University of Iowa Graduate Medical Education Committee (GMEC). The ad hoc intradepartmental Task Force on Meeting the Competencies will review the relevant existing literature and formulate a compliance plan based upon the existing evidence and experience.

  • Phase 2: "Sharpening the focus and definition of the competencies and assessment tools" (July 2002 to June 2006)

The Program will be actively using tools to measure all seven competency domains (including surgery). Accurate resident performance data will be collected in order to provide evidence of aggregate resident performance for the program’s internal GMEC review.

  • Phase 3: "Full integration of the competencies and their assessment with learning and clinical care" (July 2006 to June 2011)

The Department will use resident performance data as the basis for improvement and will provide evidence of success for accreditation review. External measures of quality and outcomes, including patient surveys, employer (post-graduation) surveys, and national pass rates on the written and oral qualifying exams, will be used to verify resident and program performance.

  • Phase 4: "Expansion of the competencies and their assessment to develop models of excellence" (July 2011 and on)

The Task Force envisions an ongoing "work in progress" during Phase 4 and beyond with new tools being developed and tested and old tools being revised or discarded.

see Table 1

Managing the competencies in ophthalmology by Andrew G. Lee

The ACGME website lists the Iowa site as a "Good Practice"

Read more about the Competencies in Ophthalmology at "Academic Ophthalmology"

last updated: 05-01-2009