
Understanding the acceptability of artificial intelligence as a support for healthcare providers in the diagnosis of prostate cancer: the patient at the heart of his care
Healthcare professionals face increasingly complex challenges, including evolving health issues and rising workloads. At the same time, the integration of artificial intelligence (AI) in healthcare presents significant opportunities to enhance medical practice (1). In the context of prostate cancer (PCa) diagnosis, AI could extract richer information from medical images and support the creation of predictive models. This would be a significant advance in optimizing the diagnosis of PCa and predicting its aggressiveness, with the ultimate goal of personalizing treatment. It could also reduce the need for prostate biopsies, thereby improving individuals' compliance and encouraging them to get tested before symptoms appear; early detection of PCa dramatically improves the treatment success rate (2).
While many studies have focused on AI's technical performance, its successful implementation also depends on its acceptance by patients and healthcare professionals (3).
The acceptability of clinical AI to patients and healthcare professionals has been studied in various contexts, but most studies address AI in healthcare in general, and the majority of systematic reviews report good acceptability at this general level (4). Studies in specific health contexts remain rare (5), and acceptability in these contexts cannot be inferred simply by extrapolating from the broad acceptability of AI in healthcare. Indeed, the acceptability of AI varies with the context (the disease being diagnosed, its severity, the consequences of the AI's decision, the complexity of decision-making) and with the tasks the AI performs (diagnosis, choice of treatment, prognosis) (6–8). In addition, acceptability factors differ across populations (patients, healthcare professionals, researchers and healthcare managers) owing to differences in needs, preferences and contexts of use (9). It is therefore important to study the acceptability of AI in specific contexts, and in particular in the context of prostate cancer.
The availability of large health datasets is another challenge for the implementation of AI in healthcare. The use of such data is unavoidable, as it enables these tools to progress and evolve (10). As explained above, clinical AI must be accepted by patients and caregivers, but it must also be fed with patient data (6). Patients must therefore agree to the use of their health data.
Although patients' willingness to provide personal health data has been widely studied for secondary uses in general (data sharing for clinical research, public health, epidemiology, etc.), it has not been well studied in the particular context of AI. It is therefore essential to study this willingness to provide data for AI development purposes (11).
One of the aims of the FLUTE project is to study prostate cancer patients' acceptance of AI as a support to healthcare providers in the diagnosis of prostate cancer, and to explore the conditions for its implementation. We also study patients' willingness to provide their health data for the development of clinical AI and explore the conditions for this willingness. To this end, we have developed an online questionnaire targeting prostate cancer patients. It can be accessed via this link: https://redcap.link/FLUTE.
Aurélie MATAGNE, Nicolas GILLAIN and Patrick DUFLOT
References:
1 Fazakarley CA, Breen M, Thompson B, Leeson P, Williamson V. Beliefs, experiences and concerns of using artificial intelligence in healthcare: A qualitative synthesis. Digit Health. 2024 Feb 11;10:20552076241230075.
2 Morote J, Borque-Fernando A, Triquell M, Celma A, Regis L, Escobar M, et al. The Barcelona Predictive Model of Clinically Significant Prostate Cancer. Cancers. 2022 Mar 21;14(6):1589.
3 Hua D, Petrina N, Young N, Cho JG, Poon SK. Understanding the factors influencing acceptability of AI in medical imaging domains among healthcare professionals: A scoping review. Artif Intell Med. 2024 Jan 1;147:102698.
4 Young AT, Amara D, Bhattacharya A, Wei ML. Patient and general public attitudes towards clinical artificial intelligence: a mixed methods systematic review. Lancet Digit Health. 2021 Sep 1;3(9):e599-611.
5 Frost EK, Bosward R, Aquino YSJ, Braunack-Mayer A, Carter SM. Facilitating public involvement in research about healthcare AI: A scoping review of empirical methods. Int J Med Inform. 2024 Jun 1;186:105417.
6 Antes AL, Burrous S, Sisk BA, Schuelke MJ, Keune JD, DuBois JM. Exploring perceptions of healthcare technologies enabled by artificial intelligence: an online, scenario-based survey. BMC Med Inform Decis Mak. 2021 Jul 20;21:221.
7 Lennartz S, Dratsch T, Zopfs D, Persigehl T, Maintz D, Große Hokamp N, et al. Use and Control of Artificial Intelligence in Patients Across the Medical Workflow: Single-Center Questionnaire Study of Patient Perspectives. J Med Internet Res. 2021 Feb 17;23(2):e24221.
8 Moy S, Irannejad M, Manning SJ, Farahani M, Ahmed Y, Gao E, et al. Patient Perspectives on the Use of Artificial Intelligence in Health Care: A Scoping Review. J Patient-Centered Res Rev. 2024 Apr 2;11(1):51-62.
9 Esmaeilzadeh P. Use of AI-based tools for healthcare purposes: a survey study from consumers' perspectives. BMC Med Inform Decis Mak. 2020 Jul 22;20:170.
10 Mikkelsen JG, Sørensen NL, Merrild CH, Jensen MB, Thomsen JL. Patient perspectives on data sharing regarding implementing and using artificial intelligence in general practice – a qualitative study. BMC Health Serv Res. 2023 Apr 4;23:335.
11 Jutzi TB, Krieghoff-Henning EI, Holland-Letz T, Utikal JS, Hauschild A, Schadendorf D, et al. Artificial Intelligence in Skin Cancer Diagnostics: The Patients’ Perspective. Front Med. 2020;7:233.