Abstract Session
ARP
Sydney Liles, MS, PT, DPT (she/her/hers)
University of Delaware
Newark, Delaware, United States
Disclosure(s): No financial relationships with ineligible companies to disclose
Methods: We used cross-sectional data from the Observational Arthritis Foundation Internet Study. Participants were recruited online through advertisements on Meta. Individuals were included if they were ≥ 18 years old, lived in the US, and reported a patient-recalled rheumatic condition. Demographic information was collected by questionnaire. Participants were asked to indicate the “Type(s) of Arthritis you have been diagnosed with”; response options included Osteoarthritis (OA), Fibromyalgia, Rheumatoid Arthritis (RA), Psoriatic Arthritis (PsA), Gout, Lupus, and/or Other. We compared patient-recalled diagnoses with health record/provider-reported diagnoses, which were collected either through documentation uploaded from an electronic medical record or through a form specifying rheumatic diagnoses completed by a physician. We assessed agreement by calculating percent agreement, kappa coefficients (κ) with 95% confidence intervals (CI), sensitivity, and specificity for each diagnosis. Kappa coefficients were interpreted as: 0, no agreement; 0.01–0.20, slight; 0.21–0.40, fair; 0.41–0.60, moderate; 0.61–0.80, substantial; 0.81–0.99, near perfect; and 1.0, perfect agreement.
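The agreement metrics described above can be sketched in code. This is an illustrative example only (not the study's analysis script, and the data below are hypothetical): for one diagnosis, it computes percent agreement, Cohen's kappa, sensitivity, and specificity from paired binary labels, treating the health record/provider report as the reference standard.

```python
# Illustrative sketch: agreement metrics for one diagnosis from
# paired binary labels (1 = diagnosis present, 0 = absent).
# Patient-recalled diagnosis is compared against the record/provider report.

def agreement_metrics(patient, record):
    """Return (percent agreement, Cohen's kappa, sensitivity, specificity)."""
    n = len(patient)
    tp = sum(p == 1 and r == 1 for p, r in zip(patient, record))
    tn = sum(p == 0 and r == 0 for p, r in zip(patient, record))
    fp = sum(p == 1 and r == 0 for p, r in zip(patient, record))
    fn = sum(p == 0 and r == 1 for p, r in zip(patient, record))

    observed = (tp + tn) / n  # percent (observed) agreement
    # Chance-expected agreement from the marginal proportions
    expected = ((tp + fp) / n) * ((tp + fn) / n) + ((tn + fn) / n) * ((tn + fp) / n)
    kappa = (observed - expected) / (1 - expected)

    sensitivity = tp / (tp + fn)  # recall of record-confirmed diagnoses
    specificity = tn / (tn + fp)  # correct denial when record is negative
    return observed, kappa, sensitivity, specificity

# Hypothetical example data (not study data)
patient = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
record  = [1, 1, 0, 0, 0, 0, 0, 1, 1, 0]
pa, k, sens, spec = agreement_metrics(patient, record)
```

In practice the study's CIs for κ would come from a statistical package; the sketch shows only the point estimates.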
Results: 108 participants (mean age = 57.3 years, 85.2% female, mean BMI = 34.1 kg/m², mean Charlson Comorbidity Index score = 3.4, 18.5% non-White, 78.7% with at least an associate’s degree) were included in the analysis. Percent agreement between patient-recalled and physician diagnoses ranged from 88% to 99%, with corresponding kappa coefficients ranging from 0.25 to 0.93 (Table 1). Lupus showed the highest agreement (99.1%), with a kappa coefficient of 0.93 (95% CI: 0.79, 1.00). Osteoarthritis and the “Other” category each had 88.0% agreement, with kappa values of 0.69 (95% CI: 0.53, 0.85) and 0.25 (95% CI: -0.03, 0.54), reflecting substantial and fair agreement, respectively. Specificity ranged from 0.81 to 1.00, and sensitivity ranged from 0.80 to 0.90 for all conditions except “Other” (sensitivity 0.27). Examples of reported “Other” diagnoses included ankylosing spondylitis, Sjogren’s arthritis, and scleroderma.
Conclusion: In this preliminary study, we found that patient-recalled diagnoses of specific rheumatic conditions have substantial to near-perfect agreement with physician diagnoses. However, patient-recalled “Other” diagnoses have only fair agreement with physician report. Patient-recalled diagnoses may therefore serve as a reasonable proxy for provider-confirmed diagnoses of specific conditions. These findings need to be replicated in larger and more diverse samples.