
RELIABILITY AND VALIDITY OF THE REVISION HIP COMPLEXITY CLASSIFICATION

The British Hip Society (BHS) Meeting 2024, Belfast, Northern Ireland, 28 February – 1 March 2024.



Abstract

The Revision Hip Complexity Classification (RHCC) was developed using a modified Delphi process in 2022 to provide a comprehensive, reproducible framework for the multidisciplinary discussion of complex revision hip surgery. The aim of this study was to assess the validity, intra-rater reliability and inter-rater reliability of the RHCC.

Radiographs and clinical vignettes of 20 consecutive patients who had undergone revision of Total Hip Arthroplasty (THA) at our unit during the previous 12-month period were provided to observers. Five observers, comprising three revision hip consultants, one hip fellow and one ST3-8 registrar, were familiarised with the RHCC. Each revision THA case was classified on two separate occasions by each observer, with a mean interval between assessments of 42.6 days (range 24–57). Inter-observer reliability was assessed using Fleiss' kappa statistic and percentage agreement. Intra-observer reliability was assessed using Cohen's kappa statistic. Validity was assessed using percentage agreement and Cohen's kappa, comparing each observer's classification to the result of the RHCC web-based application.

All observers were blinded to patient notes, operation notes and post-operative radiographs throughout the process.
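Both reliability statistics used here correct raw percentage agreement for the agreement expected by chance. As a minimal illustration of the two-rater case, the following sketch computes Cohen's kappa from first principles; the ratings shown are hypothetical H1–H3 grades, not the study's data.

```python
# Minimal sketch of Cohen's kappa, the chance-corrected agreement
# statistic used for intra-observer reliability and validity.
# Ratings below are hypothetical, not the study's data.
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters grading the same cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of cases where the two ratings match.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# One observer's two hypothetical classification rounds:
round_1 = ["H1", "H2", "H2", "H3", "H1", "H3", "H2", "H1"]
round_2 = ["H1", "H2", "H3", "H3", "H1", "H2", "H2", "H1"]
print(round(cohen_kappa(round_1, round_2), 3))  # → 0.619
```

Note that 6 of 8 ratings agree (75% agreement), yet kappa is only 0.619 once chance agreement is removed; this is why the kappa values reported below sit well beneath the corresponding percentage-agreement figures.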

Inter-observer reliability showed fair agreement in both rounds 1 and 2 of the survey (0.296 and 0.353 respectively), with a percentage agreement of 69% and 75%.

Inter-observer reliability was highest in H3-type revisions with kappa values of 0.577 and 0.441.

Mean intra-observer reliability showed moderate agreement with a kappa value of 0.446 (0.369 to 0.773).

Validity percentage agreement was 44% and 39% in rounds 1 and 2 respectively, with mean kappa values of 0.125 and 0.046, representing only slight agreement.

This study demonstrates that classification using the RHCC without utilisation of the web-based application is unsatisfactory, showing low validity and reliability. Reliability was higher for more complex H3-type cases. The use of the RHCC web app is recommended to ensure the accurate and reliable classification of revision THA cases.
