Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_12 | Pages 28 - 28
1 Dec 2022
Bornes T, Khoshbin A, Backstein D, Katz J, Wolfstadt J

Total hip arthroplasty (THA) is performed under general anesthesia (GA) or spinal anesthesia (SA). The first objective of this study was to determine which patient factors are associated with receiving SA versus GA. The second objective was to discern the effect of anesthesia type on short-term postoperative complications and readmission. The third objective was to elucidate factors that impact the effect of anesthesia type on outcome following arthroplasty.

This retrospective cohort study included 108,905 patients (median age, 66 years; IQR 60-73 years; 56.0% females) who underwent primary THA for treatment of primary osteoarthritis in the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database during the period of 2013-2018. Multivariable logistic regression analysis was performed to evaluate variables associated with anesthesia type and outcomes following arthroplasty.

Anesthesia type administered during THA was significantly associated with race. Specifically, Black and Hispanic patients were less likely to receive SA compared to White patients (White: OR 1.00; Black: OR 0.73; 95% confidence interval [CI] 0.71-0.75; Hispanic: OR 0.81; CI, 0.75-0.88), while Asian patients were more likely to receive SA (OR 1.44, CI 1.31-1.59). Spinal anesthesia was associated with increased age (OR 1.01; CI 1.00-1.01). Patients with less frailty and lower comorbidity were more likely to receive SA based on the modified frailty index ([mFI-5]=0: OR 1.00; mFI-5=1: OR 0.90, CI 0.88-0.93; mFI-5=2 or greater: OR 0.86, CI 0.83-0.90) and American Society of Anesthesiologists (ASA) class (ASA=1: OR 1.00; ASA=2: OR 0.85, CI 0.79-0.91; ASA=3: OR 0.64, CI 0.59-0.69; ASA=4-5: OR 0.47; CI 0.41-0.53). With increased BMI, patients were less likely to be treated with SA (OR 0.99; CI 0.98-0.99).

Patients treated with SA had fewer postoperative complications (OR 0.74; CI 0.67-0.81) and a lower risk of readmission (OR 0.88; CI 0.82-0.95) following THA than those treated with GA. Race, age, BMI, and ASA class were found to modify the impact of anesthesia type on postoperative complications. Stratified analysis demonstrated that the reduced risk of complications following arthroplasty noted in patients treated with SA compared to GA was more pronounced in Black, Asian, and Hispanic patients than in White patients. Furthermore, the benefit of SA over GA was stronger in younger patients and in those with elevated BMI and lower ASA class.

Among patients undergoing THA for management of primary osteoarthritis, factors including race, BMI, and frailty appear to have impacted the type of anesthesia received. Patients treated with SA had a significantly lower risk of readmission to hospital and adverse events within 30 days of surgery compared to those treated with GA. Furthermore, the positive effect on outcome afforded by SA was different between patients depending on race, age, BMI, and ASA class. These findings may help to guide selection of anesthesia type in subpopulations of patients undergoing primary THA.


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_13 | Pages 86 - 86
1 Dec 2022
Lex J, Abbas A, Oitment C, Wolfstadt J, Wong PKC, Abouali J, Yee AJM, Kreder H, Larouche J, Toor J

It has been established that a dedicated orthopaedic trauma room (DOTR) provides significant clinical and organizational benefits in the management of trauma patients. After-hours care is associated with surgeon fatigue, a higher risk of patient complications, and increased costs related to staffing. However, hesitation at the hospital leadership level, due to concerns about the associated opportunity cost, is a major barrier to widespread adoption. The primary aim of this study was to determine the impact of DOTR implementation on operating room efficiency. Secondly, we sought to evaluate the associated financial impact of the DOTR with respect to both after-hours care costs and the opportunity cost of displaced elective cases.

This was a retrospective cost-analysis study performed at a single academic-affiliated community hospital in Toronto, Canada. All patients who underwent the most frequently performed orthopaedic trauma procedures (hip hemiarthroplasty; open reduction internal fixation of the ankle, femur, elbow, and distal radius) over the four-year period 2016-2019 were included. Patient data acquired for the two years before and the two years after implementation of a DOTR were compared, adjusting for the number of cases performed. Surgical duration and the numbers of daytime and after-hours cases were recorded pre- and post-implementation. The cost savings of performing trauma cases during the daytime and the opportunity cost of displacing elective cases were calculated. A sensitivity analysis accounting for varying overtime costs and hospital elective case profit was also performed.

A total of 1,960 orthopaedic cases were examined pre- and post-DOTR. All procedures had reduced total operative time post-DOTR. After accounting for the total number of each procedure performed, the mean weighted reduction was 31.4% and the mean time saved was 29.6 minutes per surgery. The number of daytime surgical hours increased by 21%, while nighttime hours decreased by 37.8%. Overtime staffing costs were reduced by $24,976, alongside an increase in opportunity cost of $22,500. This resulted in a net profit of $2,476.
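The net-impact arithmetic reported above (overtime savings minus the opportunity cost of displaced elective cases) can be sketched as follows; the function and its name are illustrative, not the authors' analysis:

```python
def net_impact(overtime_savings: float, opportunity_cost: float) -> float:
    """Net financial impact of moving trauma cases into daytime hours:
    overtime staffing costs avoided, minus the elective-case profit
    displaced from daytime OR slots."""
    return overtime_savings - opportunity_cost

# Figures reported in the abstract
print(net_impact(24_976.0, 22_500.0))  # 2476.0 -> the reported net profit of $2,476
```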

Our results support the premise that DOTRs improve operating room efficiency and can be cost-efficient. Through the regular scheduling of a DOTR at a single hospital in Canada, the number of surgeries occurring during daytime hours increased while the number of after-hours cases decreased. The same surgeries were also completed nearly one-third faster (30 minutes per case) on average. Our study also specifically addresses the hesitation regarding potential loss of profit from elective surgeries. Notably, the savings stem partly from decreased OR time as well as decreased nurse overtime. Widespread implementation can improve patient care while remaining financially favourable.


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_12 | Pages 95 - 95
1 Dec 2022
Gleicher Y, Wolfstadt J, Entezari B

Ankle fractures are common orthopedic injuries, often requiring operative intervention to restore joint stability, improve alignment, and reduce the risk of post-traumatic ankle arthritis. However, ankle fracture surgeries (AFSs) are associated with significant postoperative pain, typically requiring postoperative opioid analgesics. In addition to putting patients at risk of opioid dependence, the adverse effects of opioids include nausea, vomiting, and altered mental status which may delay recovery. Peripheral nerve blocks (PNBs) offer notable benefits to the postoperative pain profile when compared to general or spinal anaesthesia alone and may help improve recovery. The primary objective of this quality improvement (QI) study was to increase PNB administration for AFS at our institution to above 50% by January 2021.

A root cause analysis was performed by a multidisciplinary team to identify barriers to PNB administration. Four interventions were chosen and implemented: recruitment and training of anesthesiologists with expertise in regional anesthesia techniques, procurement of additional ultrasound machines, implementation of a dedicated block room with training to create an enhanced learning environment, and development of an educational pamphlet for patients outlining strategies to manage rebound pain, instructions around the use of oral multimodal analgesia, and the potential for transient motor block of the leg.

The primary outcome was the percentage of patients who received PNB for AFS. Secondary outcome measures included total hospitalization length of stay (LOS), post-anesthesia care unit (PACU) and 24-hour postoperative opioid consumption (mean oral morphine equivalent [OME]), proportion of patients requiring opioid analgesic in PACU, and proportion of patients experiencing post-operative nausea and/or vomiting (PONV) requiring antiemetic in PACU. Thirty-day post-operative emergency department (ED) visits were collected as a balance measure.
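Opioid consumption above is reported in oral morphine equivalents (OME), i.e. each opioid dose scaled to its equianalgesic oral morphine dose before summing. The abstract does not give the study's conversion table, so the factors and names below are illustrative assumptions, not the authors' protocol:

```python
# Illustrative equianalgesic conversion factors (mg oral morphine per mg drug);
# assumed for this sketch -- the study's own conversion table is not reported.
OME_FACTORS = {
    "morphine_po": 1.0,
    "oxycodone_po": 1.5,
    "hydromorphone_po": 4.0,
}

def total_ome(doses_mg: dict) -> float:
    """Convert a set of opioid doses (mg) to total oral morphine equivalents."""
    return sum(OME_FACTORS[drug] * mg for drug, mg in doses_mg.items())

# e.g. 10 mg oral oxycodone + 2 mg oral hydromorphone
print(total_ome({"oxycodone_po": 10, "hydromorphone_po": 2}))  # 23.0
```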

The groups receiving and not receiving PNB included 78 and 157 patients, respectively, with no significant differences in age, gender, or ASA class between groups. PNB administration increased from less than 10% to 53% following implementation of the improvement bundle. Mean total hospital LOS did not vary significantly between the PNB and no-PNB groups (1.04 days vs. 1.42 days, P = 0.410). Both mean PACU and mean 24-hour postoperative opioid analgesic consumption were significantly lower in the PNB group compared to the no-PNB group (OME in PACU 38.96 mg vs. 55.42 mg [P = 0.001]; 24-hour OME 37.71 mg vs. 44.74 mg [P = 0.008]). A greater proportion of patients in the PNB group did not require any PACU opioid analgesics compared to those in the no-PNB group (62.8% vs. 27.4%, P < 0.001). The proportion of patients experiencing PONV requiring antiemetic in the PACU did not vary significantly between groups, nor did 30-day postoperative ED visits.

By performing a root cause analysis and implementing a multidisciplinary, patient-centered QI bundle, we achieved significant increases in PNB administration for AFS. As a result, there were significant improvements in the recovery of patients following AFS, specifically reduced use of postoperative opioid analgesia. This multi-faceted approach provides a framework for an individualized QI approach to increase PNB administration and achieve improved patient outcomes following AFS.


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_12 | Pages 100 - 100
1 Dec 2022
Du JT, Toor J, Abbas A, Shah A, Koyle M, Bassi G, Wolfstadt J

In the current healthcare environment, cost containment has become more important than ever. Perioperative services are often scrutinized, as they consume more than 30% of North American hospitals’ budgets. The procurement, processing, and use of sterile surgical inventory is a major component of the perioperative care budget and has been recognized as an area of operational inefficiency. Although a recent systematic review supported the optimization of surgical inventory reprocessing as a means to increase efficiency and eliminate waste, there is a paucity of data on how to actually implement this change. A well-studied and established approach to implementing organizational change is Kotter's Change Model (KCM), a dynamic eight-step approach that has been increasingly applied in healthcare settings to facilitate quality improvement (QI) interventions. The purpose of this QI project was to implement an inventory optimization (IO) using the KCM framework, overcome organizational barriers to change, and measure key outcome metrics related to surgical inventory and corresponding clinician satisfaction, with the goal of improving inventory and instrument reprocessing efficiency for cost containment. We hypothesized that the KCM would be an effective method of implementing the IO.

This study was conducted at a tertiary academic hospital across the four highest-volume surgical services: Orthopedics, Otolaryngology, General Surgery, and Gynecology. The IO was implemented using the steps outlined by the KCM (Figure 1): 1) create a coalition, 2) create a vision for change, 3) establish urgency, 4) communicate the vision, 5) empower broad-based action, 6) generate short-term wins, 7) consolidate gains, and 8) anchor change. This process was evaluated using inventory metrics (total inventory reduction and depreciation cost savings), operational efficiency metrics (reprocessing labor efficiency and case cancellation rate), and clinician satisfaction.

The implementation of the KCM is described in Table 1. Total inventory was reduced by 37.7%, with an average tray size reduction of 18.0%. This led to total reprocessing time savings of 1,333 hours per annum and labour cost savings of $39,995 per annum. Depreciation cost savings were $64,320 per annum. The case cancellation rate due to instrument-related errors decreased from 3.9% to 0.2%. The proportion of staff completely satisfied with the inventory was 1.7% pre-IO and 80% post-IO.

This was the first study to show the success of applying the KCM to facilitate change in the perioperative setting with respect to surgical inventory. We have outlined the important organizational obstacles faced when making changes to surgical inventory. The same KCM protocol can be followed for other optimization processes, such as disposable versus reusable surgical device purchasing or perioperative scheduling. As increasing efforts are dedicated to quality improvement and efficiency, institutions will need an organized and systematic approach such as the KCM to successfully enact change.

For any figures or tables, please contact the authors directly.


Orthopaedic Proceedings
Vol. 102-B, Issue SUPP_8 | Pages 12 - 12
1 Aug 2020
Melo L, White S, Chaudhry H, Stavrakis A, Wolfstadt J, Ward S, Atrey A, Khoshbin A, Nowak L

Over 300,000 total hip arthroplasties (THA) are performed annually in the USA. Surgical site infections (SSI) are one of the most common complications and are associated with increased morbidity, mortality, and cost. Risk factors for SSI include obesity, diabetes, and smoking, but few studies have reported on the predictive value of pre-operative blood markers for SSI. The purpose of this study was to create a clinical prediction model for acute SSI (classified as superficial, deep, or overall) within 30 days of THA, based on commonly ordered pre-operative lab markers, using data from the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database.

All adult patients undergoing an elective unilateral THA for osteoarthritis from 2011–2016 were identified from the NSQIP database using Current Procedural Terminology (CPT) codes. Patients with active or chronic, local or systemic infection/sepsis or disseminated cancer were excluded. Multivariable logistic regression with manual stepwise reduction was used to determine coefficients, and receiver operating characteristic (ROC) curves were graphed. The SSI prediction model included the following covariates: body mass index (BMI) and sex; comorbidities such as congestive heart failure (CHF), chronic obstructive pulmonary disease (COPD), smoking, and current/previous steroid use; and the pre-operative blood markers albumin, alkaline phosphatase, blood urea nitrogen (BUN), creatinine, hematocrit, international normalized ratio (INR), platelets, prothrombin time (PT), sodium, and white blood cell (WBC) levels. Since the data met logistic regression assumptions, bootstrap estimation was used to assess internal validity. The area under the ROC curve for the final derivations, along with McFadden's R-squared, was used to compare prediction models.
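The modelling pipeline described above (multivariable logistic regression on pre-operative markers, evaluated by area under the ROC curve) can be sketched as follows. The data here are synthetic, the covariates a small subset, and the simulated effect sizes merely borrow the directions reported in this abstract; this is illustrative only, not the authors' code:

```python
# Sketch of a logistic-regression SSI prediction model with AUC evaluation.
# Synthetic data; covariate names follow the abstract, effect sizes are
# loosely based on its reported ORs (albumin ~0.44, sodium ~0.95 per unit).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
albumin = rng.normal(4.0, 0.5, n)    # g/dL
sodium = rng.normal(140.0, 3.0, n)   # mmol/L
bmi = rng.normal(30.0, 6.0, n)       # kg/m^2
X = np.column_stack([albumin, sodium, bmi])

# Simulate a rare 30-day SSI outcome with protective albumin and sodium
logit = 6.5 + np.log(0.44) * albumin + np.log(0.95) * sodium + 0.02 * bmi
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
odds_ratios = np.exp(model.coef_[0])  # per-unit ORs recovered by the model
print(f"AUC: {auc:.2f}")
```

Reading per-unit odds ratios off the fitted coefficients via `np.exp` mirrors how the abstract reports its covariate effects.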

A total of 130,619 patients were included; the median age at the time of THA was 67 years (mean = 66.6 ± 11.6 years) and 44.8% (n=58,757) were male. A total of 1,561 (1.20%) patients had a superficial or deep SSI (overall SSI); of these, 45.1% (n=704) had a deep SSI and 55.4% (n=865) had a superficial SSI. The annual incidence of SSI decreased from 1.44% in 2011 to 1.16% in 2016. The area under the ROC curve for the SSI prediction model was 0.79 and 0.78 for deep and superficial SSI, respectively, and 0.71 for overall SSI. CHF had the largest effect size (odds ratio (OR) = 2.88, 95% confidence interval (95% CI): 1.56 – 5.32) for overall SSI risk. Albumin (OR = 0.44, 95% CI: 0.37 – 0.52; OR = 0.31, 95% CI: 0.25 – 0.39; OR = 0.48, 95% CI: 0.41 – 0.58) and sodium (OR = 0.95, 95% CI: 0.93 – 0.97; OR = 0.94, 95% CI: 0.91 – 0.97; OR = 0.95, 95% CI: 0.93 – 0.98) levels were consistently significant in the clinical prediction models for superficial, deep, and overall SSI, respectively. Among pre-operative blood markers, hypoalbuminemia and hyponatremia were both significant risk factors for superficial, deep, and overall SSI.

In this large NSQIP database study, we were able to create an SSI prediction model and identify risk factors for predicting acute superficial, deep, and overall SSI after THA. To our knowledge, this is the first clinical model in which pre-operative hyponatremia (in addition to hypoalbuminemia) has been predictive of SSI after THA. Although the model lacks external validation, it is a vital starting point for developing a risk prediction model for SSI and can help physicians mitigate risk factors for acute SSI after THA.


Orthopaedic Proceedings
Vol. 102-B, Issue SUPP_6 | Pages 131 - 131
1 Jul 2020
Wolfstadt J, Pincus D, Kreder H, Wasserstein D

Socially deprived patients face significant barriers that reduce their access to care, presenting unique challenges for orthopaedic surgeons. Few studies have investigated the outcomes of surgical fracture care among the socially deprived, despite the increased incidence of fractures and the inequality of care received in this group. The purpose of this study was to evaluate whether social deprivation impacted the complications and subsequent management of marginalized/homeless patients following ankle fracture surgery.

In this retrospective, population-based cohort study involving 202 hospitals in Ontario, Canada, we evaluated 45,444 patients who underwent open reduction internal fixation for an ankle fracture performed by 710 different surgeons between January 1, 1994, and December 31, 2011. Socioeconomic deprivation was measured for each patient according to their residential location using the “deprivation” component of the Ontario Marginalization Index (ON-MARG). Multivariable logistic regression models were used to assess the relationship between deprivation and shorter-term outcomes within one year (implant removal, repeat ORIF, irrigation and debridement due to infection, and amputation). Multivariable Cox proportional hazards (CPH) models were used to assess longer-term outcomes up to 20 years (ankle fusion and ankle arthroplasty).

A higher level of deprivation was associated with an increased risk of I&D (quintile 5 vs. quintile 1: odds ratio (OR) 2.14, 95% confidence interval (CI), 1.25–3.67, p = 0.0054) and amputation (quintile 4 vs. quintile 1: OR 3.56, 95% CI 1.01–12.4, p = 0.0466). It was more common for less deprived patients to have their hardware removed compared to more deprived patients (quintile 5 vs. quintile 1: OR 0.822, 95% CI 0.76–0.888, p < 0.0001). There was no correlation between marginalization and subsequent revision ORIF, ankle fusion, or ankle arthroplasty.

Marginalized patients are at a significantly increased risk of infection and amputation following operatively treated ankle fractures. However, these complications remain extremely rare in this group. Thus, socioeconomic deprivation should not preclude operative management of unstable ankle fractures in marginalized patients.


The Bone & Joint Journal
Vol. 101-B, Issue 9 | Pages 1087 - 1092
1 Sep 2019
Garceau S, Warschawski Y, Dahduli O, Alshaygy I, Wolfstadt J, Backstein D

Aims

The aim of this study was to assess the effects of transferring patients to a specialized arthroplasty centre between the first and second stages (interstage) of prosthetic joint infection (PJI) of the knee.

Patients and Methods

A search of our institutional database was performed to identify patients having undergone two-stage revision total knee arthroplasty (TKA) for PJI. Two cohorts were created: continuous care (CC) and transferred care (TC). Baseline characteristics and outcomes were collected and compared between cohorts.


Orthopaedic Proceedings
Vol. 100-B, Issue SUPP_11 | Pages 33 - 33
1 Aug 2018
Waddell J, Atrey A, Wolfstadt J, Khoshbin A, Ward S

A randomized trial was designed to compare the outcomes of ceramic-on-ceramic (CoC) bearings with those of ceramic-on-conventional-polyethylene (CoP) bearings in total hip arthroplasty. These patients have been followed for 15 years.

Fifty-eight hips in 57 patients under 60 years of age were randomized into one of two groups. Patients were blinded to the type of hip they received. Both groups were treated routinely with prophylactic peri-operative antibiotics and low-molecular-weight heparin. All patients were seen at six weeks, three months, and annually after surgery. Clinical and radiologic assessment was carried out at each visit.

Fifty-eight hips were available for analysis: 28 in the CoP group and 29 in the CoC group. The mean age of both groups was less than 45 years.

There were seven revisions (12%) among the 58 hips enrolled in the study. In the CoP group, four patients underwent revision with head and liner exchange for eccentric polyethylene wear 16 years post-implantation. In the CoC group, one patient had a cup revision at 15 years for acute aseptic instability of the acetabulum; two additional patients had femoral head exchange, one for fracture and one for trunnion corrosion, both occurring 14 years after the index surgery.

Functional outcome scores showed no difference between the two groups at 15 years. Radiographically, there was a statistically significant difference in wear between the two groups.

This study demonstrates that both ceramic-on-ceramic and ceramic-on-polyethylene produce satisfactory functional results with low revision rates in young patients.