Anesthesia: Essays and Researches
ORIGINAL ARTICLE
Year : 2021  |  Volume : 15  |  Issue : 1  |  Page : 87-100  

US residents' perspectives on the introduction, conduct, and value of the American Board of Anesthesiology's objective structured clinical examination-results of the 1st nationwide questionnaire survey


1 Department of Anesthesiology and Critical Care Medicine Perelman School of Medicine, Hospital of the University of Pennsylvania, Philadelphia, PA, USA
2 Department of Molecular Biology, Princeton University, Princeton, NJ, USA

Date of Submission: 02-Jun-2021
Date of Acceptance: 04-Jul-2021
Date of Web Publication: 30-Aug-2021

Correspondence Address:
Prof. Basavana Goudra
Perelman School of Medicine, Hospital of the University of Pennsylvania, Philadelphia, PA 19104
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/aer.aer_76_21

   Abstract 

Introduction: Passing the Objective Structured Clinical Examination (OSCE) is currently a requirement for the vast majority (though not all) of candidates seeking initial certification from the American Board of Anesthesiology (ABA). Several publications from the ABA have attempted to justify its introduction, conduct, and value. However, the ABA has never attempted to understand the views of the residents. Methods: A total of 4237 residents at various training levels from 132 programs were invited prospectively, between March 8, 2021, and April 10, 2021, to complete a Google Forms questionnaire. Every potential participant was sent an initial E-mail followed by two reminders. Results: The overall response rate was 17.26% (710 responses to 4112 delivered invitations). On a 5-point Likert scale with 1 as “very inaccurate” and 5 as “very accurate,” the mean ratings of the OSCE's accuracy in assessing communication skills and professionalism were 2.3 and 2.1, respectively. Regarding the usefulness of OSCE training for improving physicians' clinical practice, avoiding lawsuits, teaching effective communication with patients, and teaching effective communication with other providers, the means on a 5-point Likert scale with 1 as “not at all useful” and 5 as “very useful” were 1.86, 1.69, 1.79, and 1.82, respectively. Residents unanimously thought that factors such as culture, race/ethnicity, religion, and language adversely influence the assessment of communication skills: on a 5-point Likert scale with 1 as “not at all affected” and 5 as “very affected,” the corresponding scores were 3.45, 3.19, 3.89, and 3.18, respectively. Interestingly, nationality and political affiliation were also thought to influence this assessment, although to a lesser extent. In addition, residents believed it is inappropriate to test noncardiac anesthesiologists on transesophageal echocardiography (TEE) skills (2.39) but felt it was appropriate to test nonregional anesthesiologists on ultrasound skills (3.29). Lastly, nearly 80% of the residents thought that money was the primary motivating factor behind the ABA's introduction of the OSCE. Over 96% of residents thought that the OSCE should be halted, either permanently scrapped (60.8%) or paused (35.8%). Conclusions: Anesthesiology residents in the United States overwhelmingly indicated that the OSCE does not serve any useful purpose and should be immediately halted.

Keywords: American Board of Anesthesiology, certification, objective structured clinical examination


How to cite this article:
Goudra B, Guthal A. US residents' perspectives on the introduction, conduct, and value of the American Board of Anesthesiology's objective structured clinical examination-results of the 1st nationwide questionnaire survey. Anesth Essays Res 2021;15:87-100.



   Introduction


As the certifying body for anesthesiologists since 1938, the American Board of Anesthesiology (ABA), through its team of dedicated anesthesiologist volunteers and staff, administers and supports initial and subspecialty certification assessments as well as continuing certification programs. The ABA claims to “promote lifelong learning with a commitment to quality clinical outcomes and patient safety,” although there is no evidence to support such an assertion.[1]

In 2018, the ABA became the first and only one of the 24 member boards of the American Board of Medical Specialties (ABMS; Chicago, IL, USA) to introduce the objective structured clinical examination (OSCE) as part of its examination for initial certification. Professionalism and interpersonal and communication skills are two elements of the six competency-based medical education attributes advocated by the Accreditation Council for Graduate Medical Education (ACGME) and tested in the OSCE. The other four tested attributes are patient care, medical knowledge, systems-based practice, and practice-based learning.[2] In part, the OSCE was introduced to satisfy the 2015 ACGME safety and quality improvement requirements for all residency programs, although ACGME itself did not specify that the OSCE should be used to test these requirements. In fact, the ACGME did not mandate any testing at the certification point. However, this does not mean that the ABA should not implement meaningful changes in an effort to serve the public interest by advancing the standards of anesthesiology practice through certification. Although nonprofit, the ABA is a private organization, and the certification is voluntary. Nonetheless, these changes should be evidence based, fair, and practical.

The OSCE implementation may have been a step in the right direction. In a recent publication, Warner et al. observed that the OSCE tested certain elements of the ACGME's core competencies that were not tested by the remaining components of the ABA examination.[3] However, an accompanying editorial and a letter to the editor that followed the publication were not as enthusiastic.[4],[5]

Our study aims to understand the perspectives of residents on the introduction, conduct, and value of the OSCE as part of the ABA applied examination. It is hoped that the ABA will take these residents' views into consideration in its future plans regarding the continuation or modification of this component of its certification examination. In addition, other member boards of the ABMS might use the results of this survey in their own planning. In this regard, it should be noted that the Federation of State Medical Boards and the National Board of Medical Examiners, the co-sponsors of the United States Medical Licensing Examination®, announced the discontinuation of work to relaunch a modified Step 2 Clinical Skills assessment examination (an OSCE equivalent).[6]


   Methods


The institutional review board of the University of Pennsylvania approved this study on March 8, 2021.

The potential participants were residents and fellows currently enrolled in various anesthesiology programs, including combined programs (such as pediatrics–anesthesiology). Considering that the entire study was performed between March 8, 2021, and April 10, 2021, the participants were residents graduating in 2021, 2022, 2023, and 2024, as well as fellows. Here, 2021 refers to the academic year 2020–2021; the graduating class of 2021 would have started residency in 2017 (academic year 2017–2018).

The ABA has performed various questionnaire surveys of anesthesiology residents covering several aspects of their training. The aim of this study was to examine aspects that directly affect residents but have not been explored by the ABA. Although the OSCE was introduced in 2018 and has played a significant role in the lives of residents, the ABA had not performed any survey to gather their opinion in this regard. The areas studied were assessment in the OSCE, the format of the OSCE, the value of the OSCE, factors affecting the communication section of the OSCE, the possible motive for the OSCE's introduction, and the future of the OSCE. The questions asked under the various headings are summarized in [Appendix 1].



E-mails with a link to a Google Form containing the questions listed in [Appendix 1], together with the answer options, were sent to 4237 anesthesiology residents enrolled in 132 programs across the USA. The calculated sample size for a population of 4237, with a margin of error of 4% and a confidence level of 95%, was 526. The response rate in online E-mail surveys is generally 5%–10%, and we could not be certain how many E-mails would be blocked, undelivered, etc. As a result, we decided to invite all the residents whose E-mail addresses could be obtained.
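For reference, the quoted sample size of 526 follows from the standard formula for estimating a proportion with a finite population correction. The sketch below (in Python) reproduces the calculation under the usual assumptions of a 50% expected proportion and a z-value of 1.96 for 95% confidence; it is illustrative only and not part of the original study workflow.

    import math

    def required_sample_size(population: int, margin_of_error: float,
                             z: float = 1.96, p: float = 0.5) -> int:
        """Sample size for estimating a proportion, with finite population correction."""
        n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population estimate
        n = n0 / (1 + (n0 - 1) / population)                  # finite population correction
        return math.ceil(n)

    # Population of 4237 residents, 4% margin of error, 95% confidence level
    print(required_sample_size(4237, 0.04))   # -> 526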

The name and training year of each resident were obtained from the institutional program website. The majority of the E-mail addresses were obtained from the program websites and the remainder from the members' section of the American Society of Anesthesiologists website. Not all E-mail addresses were listed; as a result, some residents were left out of the survey.

The first batch of E-mails was dispatched between March 8, 2021, and March 17, 2021. A second request to all residents on the mailing list was sent between March 17, 2021, and March 27, 2021. Two residents requested that their names be removed from the list, and we obliged. A third and final reminder was sent between March 28, 2021, and April 4, 2021. The form stopped accepting responses on April 10, 2021, at 10:00 p.m. The last E-mail mentioned the response rate and total responses at that time; research has shown that providing this information increases response rates.[7] The respondents' E-mail addresses were not recorded.

The questions were formulated by the corresponding author, vetted by two other faculty members of the Perelman School of Medicine, and approved by the vice chair for research. Different answer options were used for different questions or groups of questions; however, most responses were on a Likert scale (which assumes that the strength of association is linear). Participation was entirely voluntary, and by completing the questionnaire, the residents gave consent to participate in the study. All questions were compulsory: respondents could not proceed to the next page without answering all the questions on each page and could not submit the form without answering every question. The Google Form was set to prevent multiple responses from a single E-mail address after the 1st batch of E-mails was dispatched. E-mails were sent individually to each resident (instead of to residency program directors or program coordinators) to increase the response rate. Residents at Penn Medicine (the institution of the corresponding author) were excluded to eliminate possible bias.

The responses were analyzed on April 11, 2021. Means and standard deviations, along with median values, were calculated from the 5-point Likert scale equivalents and reported.
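As an illustration of how these summary statistics can be obtained from Likert-scale data, the following minimal sketch uses a hypothetical list of 1–5 responses; it is not the authors' actual analysis script.

    import statistics

    # Hypothetical 5-point Likert responses (1 = "very inaccurate", 5 = "very accurate")
    responses = [1, 2, 2, 3, 2, 4, 1, 2, 5, 2]

    mean = statistics.mean(responses)       # arithmetic mean of the scale values
    sd = statistics.stdev(responses)        # sample standard deviation
    median = statistics.median(responses)   # middle value of the sorted responses

    print(f"mean = {mean:.2f}, SD = {sd:.2f}, median = {median}")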

[Appendix 2] provides the list of residency programs in the United States to which the questionnaire was E-mailed.




   Results


A total of 4237 E-mails were dispatched to anesthesiology residents across the country. Of these, 4112 were delivered, whereas 125 bounced back for a variety of reasons (IronPort E-mail security appliance filtering, remote server returns, and undeliverable E-mails). In addition, a total of three residents requested that their names be removed from the mailing list. Three attempts were made to obtain responses, with E-mails sent during the following time periods: March 8, 2021–March 17, 2021 (1st attempt), March 17, 2021–March 27, 2021 (2nd attempt), and March 28, 2021–April 4, 2021 (3rd and final attempt). The form stopped accepting responses on April 10, 2021, at 10:00 p.m., with a grand total of 710 responses recorded (17.26%). The total included 284, 167, and 259 responses during the respective time periods. [Table 1] summarizes the responses by the timing of the E-mails sent, along with the average response percentage for each.
Table 1: Responses by the attempts



Of the 710 anesthesiology residents who took part in the study, the majority were in clinical anesthesia (CA) years 2 and 3. The average response percentage was highest among fellows (36.51%); however, only 63 E-mails were sent to this group. Respectively, 583, 1153, 1234, 1204, and 63 E-mails were sent to trainees in postgraduate year 1, CA-1, CA-2, CA-3, and fellowship. The distribution of responses was 47 (8.06%), 146 (12.66%), 212 (17.18%), 282 (23.42%), and 23 (36.51%) among these cohorts, respectively [Table 2].
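The cohort-wise percentages above are simply responses received divided by E-mails sent for each training year; a brief sketch using the figures reported in this paragraph:

    emails_sent = {"PGY-1": 583, "CA-1": 1153, "CA-2": 1234, "CA-3": 1204, "Fellows": 63}
    responses   = {"PGY-1": 47,  "CA-1": 146,  "CA-2": 212,  "CA-3": 282,  "Fellows": 23}

    for cohort, sent in emails_sent.items():
        rate = 100 * responses[cohort] / sent
        print(f"{cohort}: {responses[cohort]}/{sent} = {rate:.2f}%")   # e.g. Fellows: 23/63 = 36.51%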
Table 2: Percentage response for each training year



In addition, the responses varied greatly by the state of program location. Minnesota, Puerto Rico, and Connecticut had the best response rates at 62.5% (15/24), 54.55% (6/11), and 50.91% (28/55), respectively. The lowest three were New Jersey, Iowa, and Oregon at 0% (0/28), 5.35% (3/56), and 5.71% (2/35), respectively. [Table 3] summarizes the response percentages for each state/territory to which E-mails were sent.
Table 3: Responses by state - percentage and total



[Table 4] summarizes the questions asked in the survey as well as the response percentage for each category of the 5-point Likert scale (1 as “very inaccurate” and 5 as “very accurate”). Pie charts of these responses may be found at http://dx.doi.org/10.13140/RG.2.2.25924.55680. Means and standard deviations, along with median values, are also reported for each question. The OSCE's accuracy in assessing communication skills was rated as largely inaccurate: 24.5% and 30.8% of respondents answered “very inaccurate” and “inaccurate,” respectively, and the mean and median values (2.33 and 2, respectively) support this finding. The residents were also asked whether they felt the OSCE assessed professionalism accurately. Responses for this aspect of the examination were even less favorable, with a mean (standard deviation) of 2.14 (1.00).
Table 4: Questions asked in the survey, and the response percentage for each category of the 5-point Likert scale with their mean and median



The next portion of the survey asked anesthesiology residents to rate the appropriateness of different portions of the OSCE. The first question asked whether residents believed it is appropriate for the OSCE to assess noncardiac anesthesiologists in transesophageal echocardiography (TEE). Many respondents indicated that such an examination topic was inappropriate: 24.1% and 33.5% responded “very inappropriate” and “inappropriate,” respectively, and the calculated mean (2.39) and median (2.0) values support this finding. Interestingly, the resident anesthesiologists considered the OSCE's assessment of nonregional anesthesiologists in ultrasound (US) somewhat appropriate, with a relatively better mean of 3.29 (1.21).

The residents were then asked a series of questions meant to determine the perceived value of OSCE training and assessment. A 5-point Likert scale was also used for this section, with responses ranging from 1 (“not at all useful”) to 5 (“very useful”). Regarding the OSCE's usefulness for improving physicians' clinical practice, 47.5% and 30.7% of the residents indicated that the OSCE was “not at all useful” and “of very little use,” respectively; the mean value for this question was correspondingly low at 1.86 (1.04). The perceived usefulness of the OSCE for avoiding lawsuits was rated even lower, with a mean of 1.69 (0.88). The residents were also asked how useful they believed OSCE training was for effectively communicating with patients and with other providers. Both were regarded as of little use, with mean values of 1.79 (1.02) and 1.82 (1.07), respectively.

Communication skills are tested widely in the OSCE, whether with other physicians and providers or with patients. The residents were asked to rate on a 5-point Likert scale (1 = “not at all affected” to 5 = “very affected”) whether categories such as culture, race/ethnicity, religion, language, presumed nationality, and presumed political affiliation affected their ability to perform in the communication section of the OSCE. The mean values for these categories were, respectively, 3.45 (1.2), 3.19 (1.26), 3.89 (1.30), 3.18 (1.12), 2.40 (1.26), and 2.28 (1.26), indicating that the residents agree that the OSCE's assessment of communication skills is affected by external factors. Presumed nationality and political affiliation were believed to have the least effect, although their means (2.40 and 2.28, respectively) still indicate some perceived influence.

In general, the purpose and conduct of the OSCE component of the ABA applied examination were viewed very poorly by the majority of respondents. A total of 560 of the 710 respondents (78.9%) agreed with the statement that the main purpose of introducing the OSCE was to increase ABA revenue. [Table 4] provides the full responses, in percentages, to the various statements posed to the respondents. In addition, the respondents were asked to choose one of four possible options for what they think should be the future of the OSCE. Of the 710 respondents, 432 (60.8%) believed that the OSCE should be permanently scrapped. Furthermore, 32.7% and 3.1% believed that the OSCE should be paused pending demonstration that it improves patient outcomes or that it decreases disciplinary proceedings against anesthesiologists, respectively. Only 24 respondents (3.4%) felt that the OSCE should continue as is.

Of the 4112 E-mails delivered, a total of 710 residents responded after three attempts to obtain responses (17.26%). [Figure 1], [Figure 2], [Figure 3], [Figure 4], [Figure 5], [Figure 6], [Figure 7], and [Figure 8] summarize the response trends for selected questions from the survey questionnaire, demonstrating that the distribution of responses remained largely consistent throughout the study. This finding adds credibility to the extrapolation of our findings to the larger anesthesiology resident community. [Figure 9] presents, as a bar graph, the residents' responses to the questions on factors that could introduce bias into the conduct of the OSCE.
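The trend figures below report the cumulative share of each answer option after every 100th response received. A minimal sketch of that tabulation, using hypothetical answer data rather than the survey's actual responses, is as follows.

    from collections import Counter

    # Hypothetical stream of answer choices in the order responses were received
    answers = ["very inaccurate", "inaccurate", "neutral", "accurate",
               "very inaccurate", "inaccurate"] * 120  # 720 simulated responses

    # Cumulative percentage of each option at every 100th response
    for checkpoint in range(100, len(answers) + 1, 100):
        counts = Counter(answers[:checkpoint])
        shares = {option: round(100 * n / checkpoint, 1) for option, n in counts.items()}
        print(checkpoint, shares)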
Figure 1: Percentage response at the end of every 100th response to display the trend – answer to the question “How accurate do you believe the objective structured clinical examination is in assessing communication skills?”

Figure 2: Percentage response at the end of every 100th response to display the trend – answer to the question “How accurate do you believe the objective structured clinical examination is in assessing professionalism?”

Figure 3: Percentage response at the end of every 100th response to display the trend – answer to the question “How useful do you believe objective structured clinical examination training is for improving physicians' clinical practice?”

Figure 4: Percentage response at the end of every 100th response to display the trend – answer to the question “How useful do you believe objective structured clinical examination training is for avoiding lawsuits?”

Figure 5: Percentage response at the end of every 100th response to display the trend – answer to the question “How useful do you believe the objective structured clinical examination training is for teaching physicians how to effectively communicate with patients?”

Figure 6: Percentage response at the end of every 100th response to display the trend – answer to the question “How useful do you believe objective structured clinical examination training is for teaching physicians how to effectively communicate with other providers?”

Figure 7: Percentage response at the end of every 100th response to display the trend – answer to the question “What do you think is the main motive of the ABA to start objective structured clinical examination?”

Figure 8: Percentage response at the end of every 100th response to display the trend – answer to the question “What do you think should be the future of the objective structured clinical examination?”

Figure 9: Extent of the effects of culture, race/ethnicity, language, nationality, religion, and political affiliation on the objective structured clinical examination assessment of communication skills



[Table 5] contains the list of abbreviations used in this paper.
Table 5: Abbreviations used in the text




   Discussion


We made significant efforts to increase response rates. “Respondent fatigue” is a major impediment to achieving high response rates in online surveys.[8] We might have made the situation worse by compelling the respondents to answer all the questions. Even in surveys conducted by the ABA (which could be considered internal surveys), the overall response rate was 36%.[9],[10] External surveys with no participation incentive are known to produce even lower response rates, approximately 10%.[11],[12] Several meta-analyses have revealed that web surveys generally achieve a 6%–15% lower response rate than other survey modes, with many studies conducted among students reporting response rates below 20%.[7] As a result, our overall response rate of 17.26% may be considered good, especially given the lack of incentive and the relatively long questionnaire. In addition, our survey could realistically be completed only by those who had been exposed to mock OSCE sessions; indeed, the corresponding author received E-mails from individual residents to that effect. With the ongoing pandemic, residents have had limited opportunity to attend courses, and departments have faced similar constraints in organizing OSCE training sessions. This is reflected in the increasing response rates among residents in higher training years, who might have had some exposure to mock OSCEs before the pandemic.

Our survey explores an aspect of the OSCE that has not been studied by the ABA. Our results might indicate that the ABA has failed to communicate to and convince the residents that the skills tested in the OSCE make a positive impact on their practice. It is also possible that such evidence does not exist. It seems that the major impetus for the ABA to begin the OSCE was monetary gain, as opined by nearly 80% of the respondents. We hesitate to state that a cursory look at the ABA's tax returns gives credence to that belief. Because the ABA is a nonprofit organization, its tax returns are available to the public (https://projects.propublica.org/nonprofits/organizations/60646523, [Figure 10]). The ABA's program services revenue (in millions of US dollars) increased from 3.3 in 2001 to 15.6 in 2018. We do not yet know the impact of the OSCE on this revenue, as it was only introduced in 2018. More research is underway to understand the perceptions of board-certified anesthesiologists participating in the Maintenance of Certification in Anesthesiology (MOCA) Minute program. If the feeling is widespread, it might have implications for the nonprofit status of the ABA. Nonprofit organizations are allowed to make a profit; however, they cannot design and implement policies for that purpose.
Figure 10: The American Board of Anesthesiology's revenue and expenses from 2001 to 2018 (source: https://projects.propublica.org/nonprofits/organizations/60646523)



The communication section is a major component of the OSCE. A majority of the residents think that the OSCE's assessment of communication skills is affected by factors such as the candidate's culture, race/ethnicity, language, and nationality. Among these factors, race/ethnicity, culture, religion, and language are thought to have the greatest sway. Unlike the TEE and US components, this section is bound to be subjective and liable to be affected by the implicit biases of the examiners and the standardized patients (SPs). Racial profiling is well known among physicians and dentists. In an interesting study, Patel et al. found that dentists are much more likely to recommend tooth extraction for a Black patient and root canal treatment for a White patient with similar conditions, based on race alone.[13] Black patients are seen by doctors as opioid seeking.[14] South Asian doctors are often presumed to have been trained in a foreign medical school.[15] Writing in the journal Advances in Health Sciences Education, Sharma and Kuper discussed racism in our classrooms, educational and research institutions, and communities.[16] Graf et al. compared students' self-perception and SPs' external perception of communication skills using uniform questionnaires in the context of an OSCE. They found a significant gender difference, with female students performing better in the dimensions of empathy, structure, verbal expression, and nonverbal expression; in fact, male students deteriorated across all dimensions in the external perception between 2011 and 2014.[17] Recent ABA publications that attempted to validate the introduction of the OSCE have ignored any discussion of bias.[18],[19] Our results indicate that the unconscious bias training provided by the ABA may not remove the enormous suspicion held by over 90% of the residents. The futility of unconscious bias training in eliminating bias, racism, and discrimination has been discussed by multiple authors.[20],[21],[22] Considering that five OSCE stations test communication elements, this is a major challenge that is difficult to overcome; nonetheless, it has a major bearing on the results. It is heartening to note that the ABA is concerned with racism in public health and has recently issued a statement in this regard.[23]

A large number of residents think that it is inappropriate to assess noncardiac anesthesiologists in TEE skills. This is entirely understandable. These skills typically help intensivists and cardiac anesthesiologists in patient management. As the common adage goes, “use it or lose it”: a practitioner is bound to lose these skills without regular use. Memorizing videos and spending a few days learning these skills just to pass the examination is probably not the best use of residents' time and resources. Residents were more divided regarding testing of US skills, with a small majority agreeing that it is appropriate to assess nonregional anesthesiologists in US.

More concerning is the strong perception among the residents that OSCE training does not teach them to communicate effectively with either patients or other health-care providers. It is common knowledge that we individually tailor our approach to patients and modify our body language, demeanor, tone, and the degree and depth of explanation based on their anxiety levels, knowledge, and understanding. It takes many years, perhaps even decades, to perfect this art, and not all of us achieve the same level of competency. In fact, it is counterproductive to patient care to approach all patients in a standardized fashion.

The majority of the residents strongly disagree that OSCE training improves their clinical practice or helps in avoiding lawsuits. There is no concrete evidence to support the idea that board certification itself will either increase patient safety or decrease disciplinary actions from licensing bodies.[24],[25] The ABA has used license disciplinary actions as an indicator of the effectiveness of its written and oral specialty certification examinations.[24] Although negligence and incompetence are among the most common causes, factors such as alcohol and substance abuse, inappropriate prescribing practices, inappropriate contact with patients, and fraud are responsible for the majority of such actions, and these cannot be foreseen in the OSCE or the structured oral examination.[25] In a study that examined the effectiveness of board certification as a surrogate indicator of provider competence, Silber et al. analyzed Medicare claims records of 144,883 patients in Pennsylvania who underwent general surgical or orthopedic procedures between 1991 and 1994.[26] The authors looked only at midcareer anesthesiologists and found a slightly higher mortality rate within 30 days of admission and a higher failure-to-rescue rate among non-board-certified anesthesiologists. However, they importantly noted that non-board-certified anesthesiologists tended to work in less well-equipped hospitals. The mortality numbers were very low, and death can be ascribed to many factors; as a result, it is impossible to render a verdict one way or the other. What is true, however, is that an inability to obtain certification has enormous psychological, social, and financial consequences.[27] Failing initial board certification examinations has been reported as contributing to physician suicide.[28] Considering that the ABA itself found a high prevalence of burnout, distress, and depression among residents, it is important to take measures to reduce them.[10] In a system where institutional support, work–life balance, strength of social support, workload, and student debt are impacting physician well-being, it is vital that the ABA removes untested layers from the multilayered anesthesiology initial board examination instead of adding to them. ABA certification has not been demonstrated to save any lives; however, failure to obtain certification can cost the lives of anesthesiologists.

Our research is likely to provide much-needed feedback to the ABA about the residents' perception of the OSCE. Based on these results, the ABA might consider performing a larger study to obtain the views of the majority of residents, although it is unlikely to yield a different result. We followed the trends in the answers throughout the study and did not notice any difference as the study progressed. It is essential for the ABA to hire an independent agency to administer such questionnaire surveys. Administering a biased examination (in the opinion of the majority of residents) to grant certification might compromise public health (instead of improving it) by eliminating good clinicians. The necessary hurdles for certification should be evidence based and must serve the public interest.

The residents are most united with regard to the future of the OSCE. Nearly 97% of the residents would like to see the OSCE halted: either permanently scrapped (61.3%), paused pending demonstration that it improves patient outcomes (32.4%), or paused pending demonstration that it decreases disciplinary proceedings against anesthesiologists by licensing bodies (3.2%). Clearly, the ABA intended to make a positive impact on both training and practice by introducing the OSCE. Nevertheless, it appears to have failed.

Limitations of the study

Web-based surveys suffer from poor response rates, and our survey is no exception. Yet, a low margin of error at a 95% confidence level augurs well. Unopened E-mails, reluctance on the part of the recipient, forgetfulness, and indefinite postponement are some of the potential contributing factors. The corresponding author (who personally dispatched all the E-mails after work and during weekends) noticed that responses typically arrived while the E-mails were being sent. However, as pointed out by Cook et al., “response representativeness is more important than response rate in survey research.”[29] Our survey represents 42 states, 80 programs, and residents of all training years. The responses are also overwhelmingly swung in one direction, which increases the strength of the results. As a result, we believe that the results are a true representation of the mood and opinion of residents across the nation.


   Conclusions


Residents overwhelmingly view the introduction, conduct, and value of the OSCE component of the ABA certifying examination negatively. They agree that the ABA should abolish the OSCE. We are certain that, as an organization that values residents' opinions very highly, the ABA will permanently scrap the OSCE with immediate effect.

We understand the sensitive nature and possible implications of the findings of our study. To maintain transparency, we will allow anyone, including the ABA, to scrutinize our data after giving us adequate notice.

Acknowledgments

We thank all the residents who participated in this survey by answering all the questions.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
   References

1. Our History. The American Board of Anesthesiology. Available from: https://theaba.org/our%20history.html. [Last accessed on 2021 Jan 16].
2. Guidance Statement on Competency-Based Medical Education during COVID-19 Residency and Fellowship Disruptions. ACGME. Available from: https://acgme.org/Newsroom/Newsroom-Details/ArticleID/10639/Guidance-Statement-on-Competency-Based-Medical-Education-during-COVID-19-Residency-and-Fellowship-Disruptions. [Last accessed on 2021 Mar 19].
3. Warner DO, Isaak RS, Peterson-Layne C, Lien CA, Sun H, Menzies AO, et al. Development of an objective structured clinical examination as a component of assessment for initial board certification in anesthesiology. Anesth Analg 2020;130:258-64.
4. Saddawi-Konefka D, Baker KH. The American Board of Anesthesiology gets a passing grade on its new objective structured clinical examination. Anesth Analg 2020;131:1409-11.
5. Goudra B. Objective structured clinical examination - Are they truly objective? Anesth Analg 2021;133:e3-5.
6. United States Medical Licensing Examination | Announcements. Available from: https://www.usmle.org/announcements/. [Last accessed on 2021 Jan 28].
7. Mol CV. Improving web survey efficiency: The impact of an extra reminder and reminder content on web survey response. Int J Soc Res Methodol 2017;20:317-27.
8. Respondent Fatigue. In: Encyclopedia of Survey Research Methods. Thousand Oaks, CA, USA: Sage Publications, Inc.; 2008. Available from: http://methods.sagepub.com/reference/encyclopedia-of-survey-research-methods/n480.xml. [Last accessed on 2021 Mar 25].
9. Sun H, Chen D, Warner DO, Zhou Y, Nemergut EC, Macario A, et al. Anesthesiology residents' experiences and perspectives of residency training. Anesth Analg 2021;132:1120-8.
10. Sun H, Warner DO, Macario A, Zhou Y, Culley DJ, Keegan MT. Repeated cross-sectional surveys of burnout, distress, and depression among anesthesiology residents and first-year graduates. Anesthesiology 2019;131:668-77.
11. Survey Response Rates. PeoplePulse - Online Survey Software | Australian Survey Software. Available from: https://peoplepulse.com/resources/useful-articles/survey-response-rates/. [Last accessed on 2021 Mar 25].
12. Survey Response Rate | Good Survey Response Rate. QuestionPro; 2018. Available from: https://www.questionpro.com/blog/good-survey-response-rate/. [Last accessed on 2021 Mar 25].
13. Patel N, Patel S, Cotti E, Bardini G, Mannocci F. Unconscious racial bias may affect dentists' clinical decisions on tooth restorability: A randomized clinical trial. JDR Clin Trans Res 2019;4:19-28.
14. Tello M. Racism and discrimination in health care: Providers and patients. Harvard Health Blog; 2017. Available from: https://www.health.harvard.edu/blog/racism-discrimination-health-care-providers-patients-2017011611015. [Last accessed on 2021 Mar 25].
15. Saadi A. Muslim-American doctor on the racism in our hospitals. KevinMD; 2016. Available from: https://www.kevinmd.com/blog/2016/02/muslim-american-doctor-racism-hospitals.html. [Last accessed on 2021 Mar 25].
16. Sharma M, Kuper A. The elephant in the room: Talking race in medical education. Adv Health Sci Educ Theory Pract 2017;22:761-4.
17. Graf J, Smolka R, Simoes E, Zipfel S, Junne F, Holderried F, et al. Communication skills of medical students during the OSCE: Gender-specific differences in a longitudinal trend study. BMC Med Educ 2017;17:75.
18. Warner DO, Lien CA, Wang T, Zhou Y, Isaak RS, Peterson-Layne C, et al. First-year results of the American Board of Anesthesiology's objective structured clinical examination for initial certification. Anesth Analg 2020;131:1412-8.
19. Wang T, Sun H, Zhou Y, Chen D, Harman AE, Isaak RS, et al. Construct validation of the American Board of Anesthesiology's APPLIED examination for initial certification. Anesth Analg 2021.
20. Noon M. Pointless diversity training: Unconscious bias, new racism and agency. Work Employ Soc 2018;32:198-209.
21. FitzGerald C, Martin A, Berner D, Hurst S. Interventions designed to reduce implicit prejudices and implicit stereotypes in real world contexts: A systematic review. BMC Psychol 2019;7:29.
22. Dobbin F, Kalev A. Why doesn't diversity training work? The challenge for industry and academia. Anthropology Now 2018;10:48-55.
23. ABA Statement on Racism and Public Health. The Latest; 2020. Available from: http://aba-thelatest.org/2020/06/aba-statement-on-racism-and-public-health/. [Last accessed on 2021 Mar 26].
24. Zhou Y, Sun H, Culley DJ, Young A, Harman AE, Warner DO. Effectiveness of written and oral specialty certification examinations to predict actions against the medical licenses of anesthesiologists. Anesthesiology 2017;126:1171-9.
25. Morrison J. Physicians disciplined by a state medical board. JAMA 1998;279:1889.
26. Silber JH, Kennedy SK, Even-Shoshan O, Chen W, Mosher RE, Showan AM, et al. Anesthesiologist board certification and patient outcomes. Anesthesiology 2002;96:1044-52.
27. Dr. Wes: What happens to doctors who fail their maintenance of certification examination? Available from: http://drwes.blogspot.com/2015/05/what-happens-to-doctors-who-fail-their.html. [Last accessed on 2021 Mar 25].
28. Greenky D, Reddy P, George P. Rethinking the initial board certification exam. Med Sci Educ 2021:1-3.
29. Cook C, Heath F, Thompson RL. A meta-analysis of response rates in web- or internet-based surveys. Educ Psychol Meas 2000;60:821-36.

