I had always chosen to remain on the sidelines of the specialist Maintenance of Certification (MOC) furor of the 2015 era, content to let my colleagues build a groundswell of protest at the cost, effort, clinical irrelevance, and perceived tone-deaf obstinacy of the governing body. Meanwhile, I had dutifully subjected myself to the rigors of the process every ten years. I had gritted my teeth through the demeaning pat-down in the testing center as if I were a criminal hiding contraband, only for the “privilege” of making my way through endless questions of irrelevant minutiae after having spent months boning up on those same endless minutiae. Despite my spectator status, I’ll admit to complicity in the widespread indignation over such unnecessarily burdensome mandates from an organization that seemed like a black hole of disconnected and arbitrary decisions and decision-makers.
The offer of an alternative assessment option, the Longitudinal Knowledge Assessment (LKA), was finally a welcome response from the American Board of Internal Medicine (ABIM), validating how protest among the rank-and-file can effect change. In brief, the LKA is an ongoing test conducted over five years, consisting of thirty questions per quarter, with five minutes allotted to each question. All resources (except “Phone-A-Friend”) are available, and MOC points are awarded. I remained skeptical, however, about whether the LKA was truly an improvement, especially since, when it launched in Sleep Medicine and other disciplines in 2022, it had yet to be experienced by physicians.
I was surprised, as a devout cynic of anything administrative, to receive an invitation to participate in the ABIM Sleep Medicine LKA Standard Setting meeting. Such a glimpse behind the curtain was revealing, prompting me to share with others how current LKA content and processes are actually generated and perhaps dispel some of the misconceptions I had previously held.
Our stated goal was to provide recommendations for a minimum passing score on the LKA from the perspective of a hypothetical “minimally qualified candidate.” The actual numerical score that comes out of this consensus matters less, in my mind, than the process and details intended to create a fairer, more relevant, and less taxing examination while still preserving the status of specialty certification. Essentially, we were tasked with recommending the lowest score on the Sleep Medicine LKA at which the minimally qualified physician candidate would still pass the test.
The entire process was a personal revelation on many levels. With my next MOC still years in the future, I hadn’t given much thought to the LKA except as an abstract, somewhat foreign concept. For those sleep specialists similarly hiding their heads in the proverbial sand: physicians due for an MOC assessment can choose the LKA pathway over the traditional ten-year MOC exam. One takes the LKA over five years and is permitted to use any available hardcopy and online resources to answer the questions; everything, that is, except consulting another person.
The sample questions I viewed were, on the whole, reassuringly relevant to clinical sleep disorders practice. Each of us was to generate a recommended passing standard by putting ourselves in the academic shoes of the least qualified candidate who could still pass the LKA; our perspective was to reflect that of a borderline test-taking physician still capable of passing an “open-book” examination with time allotted over months rather than a soul-crushing several hours. The process dispelled my perception of specialty certification as an exercise by the elite, for the elite.
The meeting pulled back the curtain for me on what I had studiously avoided over my many years of dutiful compliance with licensing and certification. I had abdicated exploration of the rhyme and reason behind MOC in favor of simply accepting its necessity. Administration had never been my strong point, with my lack of enthusiasm extending from local hospital administration up to the national organizations governing my very clinical standing. I am gratified I finally pulled my head out of the sand (as well as out of certain anatomic orifices) and dipped my toe in the waters of national organization administration. Experiencing how “the sausage was made” reassured me instead of confirming any previous bias and suspicion. The process itself demonstrated what I perceived as an ongoing responsibility to collaborate and support as opposed to mandate and impose.
It is possible I am at one end of the spectrum in terms of ignoring anything remotely organizational and administrative, which would render the information herein superfluous rather than revealing. Nonetheless, it is my hope that this attempt at transparency from a rank-and-file physician devoted to the trenches rather than higher office will lay to rest some previous concerns. The intent of the ABIM seemed responsive to the concerns raised previously about the standard every-ten-year single-test pathway. The format throughout the meeting was supportive; it was neither adversarial nor an attempt to force a Frankenstein’s monster of a test from a group of disconnected ivory tower physicians.
Certainly, unknowns and concerns will persist. The cost of MOC, whether by the long-form exam pathway or the LKA, will likely remain an issue. The clinical relevance of MOC within a specialty will likely remain somewhat subjective and a perennial source of concern, despite what I viewed during my time behind the curtain. This communication will hopefully serve as an initial vote of confidence in the direction this certifying organization has taken: a small token of transparency into the process upon which to build further trust. On a more personal level, the experience opened up cracks in the wall of suspicion and distrust I had built toward any governing body overseeing my professional life.
Dr. Eveloff has no conflicts of interest to report.