Abstract
Objective: Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. Methodology: The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria was defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Results: Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. Conclusions: The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.
Keywords
ELECTRONIC HEALTH RECORDS, REQUIREMENTS, STANDARDS, electronic health record, semantic interoperability, clinical, information modeling tools, archetypes, detailed clinical models, qualitative research, delphi study


Citation


MLA
Moreno-Conde, Alberto et al. “Evaluation of Clinical Information Modeling Tools.” Journal of the American Medical Informatics Association 23.6 (2016): 1127–1135. Print.
APA
Moreno-Conde, A., Austin, T., Moreno-Conde, J., Parra-Calderón, C. L., & Kalra, D. (2016). Evaluation of clinical information modeling tools. Journal of the American Medical Informatics Association, 23(6), 1127–1135.
Chicago author-date
Moreno-Conde, Alberto, Tony Austin, Jesús Moreno-Conde, Carlos L. Parra-Calderón, and Dipak Kalra. 2016. “Evaluation of Clinical Information Modeling Tools.” Journal of the American Medical Informatics Association 23 (6): 1127–1135.
Vancouver
1. Moreno-Conde A, Austin T, Moreno-Conde J, Parra-Calderón CL, Kalra D. Evaluation of clinical information modeling tools. Journal of the American Medical Informatics Association. 2016;23(6):1127–35.
IEEE
[1] A. Moreno-Conde, T. Austin, J. Moreno-Conde, C. L. Parra-Calderón, and D. Kalra, “Evaluation of clinical information modeling tools,” Journal of the American Medical Informatics Association, vol. 23, no. 6, pp. 1127–1135, 2016.
@article{8511655,
  abstract     = {Objective: Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. 
Methodology: The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria was defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. 
Results: Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. 
Conclusions: The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.},
  author       = {Moreno-Conde, Alberto and Austin, Tony and Moreno-Conde, Jesús and Parra-Calderón, Carlos L. and Kalra, Dipak},
  issn         = {1067-5027},
  journal      = {Journal of the American Medical Informatics Association},
  keywords     = {ELECTRONIC HEALTH RECORDS,REQUIREMENTS,STANDARDS,electronic health record,semantic interoperability,clinical,information modeling tools,archetypes,detailed clinical models,qualitative research,delphi study},
  language     = {eng},
  number       = {6},
  pages        = {1127--1135},
  title        = {Evaluation of clinical information modeling tools},
  url          = {http://dx.doi.org/10.1093/jamia/ocw018},
  volume       = {23},
  year         = {2016},
}
