
Take five? A coherentist argument why medical AI does not require a new ethical principle

Seppe Segers (UGent) and Michiel De Proost (UGent)
Abstract
With the growing application of machine learning models in medicine, principlist bioethics has been put forward as needing revision. This paper reflects on the dominant trope in AI ethics to include a new 'principle of explicability' alongside the traditional four principles of bioethics that make up the theory of principlism. It specifically suggests that these four principles are sufficient and challenges the relevance of explicability as a separate ethical principle by emphasizing the coherentist affinity of principlism. We argue that, through specification, the properties of explicability are already covered by the four bioethical principles. The paper finishes by anticipating an objection that coherent principles could not facilitate technology induced change and are not well-suited to tackle moral differences.
Keywords
Principlism, Medical ethics, AI ethics, Explicability, Coherentism, Moral change, COMMON, PRAGMATISM

Downloads

  • (...).pdf: full text (Published version) | UGent only | PDF | 610.71 KB

Citation

Please use this URL to cite or link to this publication:

MLA
Segers, Seppe, and Michiel De Proost. “Take Five? A Coherentist Argument Why Medical AI Does Not Require a New Ethical Principle.” THEORETICAL MEDICINE AND BIOETHICS, vol. 45, no. 5, 2024, pp. 387–400, doi:10.1007/s11017-024-09676-0.
APA
Segers, S., & De Proost, M. (2024). Take five? A coherentist argument why medical AI does not require a new ethical principle. THEORETICAL MEDICINE AND BIOETHICS, 45(5), 387–400. https://doi.org/10.1007/s11017-024-09676-0
Chicago author-date
Segers, Seppe, and Michiel De Proost. 2024. “Take Five? A Coherentist Argument Why Medical AI Does Not Require a New Ethical Principle.” THEORETICAL MEDICINE AND BIOETHICS 45 (5): 387–400. https://doi.org/10.1007/s11017-024-09676-0.
Chicago author-date (all authors)
Segers, Seppe, and Michiel De Proost. 2024. “Take Five? A Coherentist Argument Why Medical AI Does Not Require a New Ethical Principle.” THEORETICAL MEDICINE AND BIOETHICS 45 (5): 387–400. doi:10.1007/s11017-024-09676-0.
Vancouver
1. Segers S, De Proost M. Take five? A coherentist argument why medical AI does not require a new ethical principle. THEORETICAL MEDICINE AND BIOETHICS. 2024;45(5):387–400.
IEEE
[1] S. Segers and M. De Proost, “Take five? A coherentist argument why medical AI does not require a new ethical principle,” THEORETICAL MEDICINE AND BIOETHICS, vol. 45, no. 5, pp. 387–400, 2024.
@article{01J6H3Q8R6YBQ8KKVJDBMH454B,
  abstract     = {{With the growing application of machine learning models in medicine, principlist bioethics has been put forward as needing revision. This paper reflects on the dominant trope in AI ethics to include a new 'principle of explicability' alongside the traditional four principles of bioethics that make up the theory of principlism. It specifically suggests that these four principles are sufficient and challenges the relevance of explicability as a separate ethical principle by emphasizing the coherentist affinity of principlism. We argue that, through specification, the properties of explicability are already covered by the four bioethical principles. The paper finishes by anticipating an objection that coherent principles could not facilitate technology induced change and are not well-suited to tackle moral differences.}},
  author       = {{Segers, Seppe and De Proost, Michiel}},
  issn         = {{1386-7415}},
  journal      = {{THEORETICAL MEDICINE AND BIOETHICS}},
  keywords     = {{Principlism,Medical ethics,AI ethics,Explicability,Coherentism,Moral change,COMMON,PRAGMATISM}},
  language     = {{eng}},
  number       = {{5}},
  pages        = {{387--400}},
  title        = {{Take five? A coherentist argument why medical AI does not require a new ethical principle}},
  url          = {{http://doi.org/10.1007/s11017-024-09676-0}},
  volume       = {{45}},
  year         = {{2024}},
}
