Background
Objective and research question
Methods
Search process
Term | | Term | | Term | Hits
---|---|---|---|---|---
Framework | AND | Evaluation | AND | Technol* | 1381
Framework | AND | Evaluation | AND | ICT | 24
Framework | AND | Evaluation | AND | Robot | 48
Framework | AND | Evaluation | AND | Sensor | 135
Framework | AND | Evaluation | AND | Telecare | 10
Framework | AND | Evaluation | AND | Telemed | 97
Framework | AND | Evaluation | AND | Digital | 260
 | | | | Total | 1955
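As a plausibility check on the tallies above (not part of the original search protocol), the hits of the seven term combinations add up to the reported overall total of 1955. The minimal sketch below simply reproduces that sum.

```python
# Minimal sketch (illustration only): sums the hits of the seven search-term
# combinations listed in the table above and compares them with the reported total.
hits_per_combination = {
    "Framework AND Evaluation AND Technol*": 1381,
    "Framework AND Evaluation AND ICT": 24,
    "Framework AND Evaluation AND Robot": 48,
    "Framework AND Evaluation AND Sensor": 135,
    "Framework AND Evaluation AND Telecare": 10,
    "Framework AND Evaluation AND Telemed": 97,
    "Framework AND Evaluation AND Digital": 260,
}

total_hits = sum(hits_per_combination.values())
print(total_hits)          # 1955
assert total_hits == 1955  # matches the reported overall number of hits
```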
Eligibility criteria for systematic search
Identifying relevant frameworks
Data extraction
Purpose, perspective, and success definitions
Strengths and weaknesses
Guiding Category | Content
---|---
Focus of the framework | Description of the specific focus of the framework. This can include the purpose (and the addressed question), the application setting, and the technology (area)
Illustration | Clarity/complexity of illustration; visualization of connections and relationships within the framework
Terminology | Transparent definitions of terms and key concepts
Instructions for use | Concrete application strategy and instructions for use; instruction on how the results can be interpreted
Scientific quality | Transparency of development process; reflection of the limitations of the framework; transferability of the framework (settings, technologies, questions)
Areas of evaluation
Charting the data
Area of evaluation | Description
---|---
Focus | This area includes what the technology focuses on in terms of its objectives and purpose, and the problems and needs it aims to solve for a specific target group in a specific setting.
Product/Technology | This area includes all aspects of the technology itself, ranging from visual appearance to functionality and specific technological aspects such as interoperability. (There is also an interface to the area “Individual”, because certain individually perceived aspects, such as usability and access, are covered here.)
Objective Value/Effect | This area includes the relevant information on evidence and aspired values, as well as intended and unintended effects of the technology.
Individual | This area includes reactions and perceived impressions, as well as the behaviour and the relationship of individuals towards the technology.
Organization | This area includes aspects that are relevant in the relationship between the technology and an organization.
Societal | This area includes relevant aspects of the technology in a societal context (e.g. political, juridical, regulatory, or socio-cultural aspects; overlaps with the area of ethics are possible).
Ethics | This area includes relevant ethical standards and ethical implications to be considered in relation to the technology.
Economics | This area includes relevant economic aspects of the technology (e.g. business model, price, economic evaluation).
Strategic | This area includes strategic aspects that may be relevant for the introduction and dissemination of the technology.
Results
Search results
Analysis results
Technology field | Framework
---|---
Information and Communication technologies | Infoway benefits evaluation Framework [25]
Information and Communication technologies | Health Information Technology Evaluation Framework (HITREF) [26]
Information and Communication technologies | Hospital Information System Success Framework [27]
Information and Communication technologies | Development of an Evaluation Framework for Health Information Systems (DIPSA Framework) [28]
Information and Communication technologies | Human, Organization, Process and Technology-fit (HOPT-FIT) [29]
Information and Communication technologies | Clinical Information Systems Success Model (CISSM) [21]
Information and Communication technologies | Adapted nursing care performance framework [30]
Telemedicine/Telecare | Model for Assessment of Telemedicine (MAST Manual) [31]
Telemedicine/Telecare | Comprehensive evaluation framework for telemedicine implementation [24]
Telemedicine/Telecare | The layered telemedicine implementation model [23]
Sensor Technologies | Evaluation Framework for Fit-For-Purpose Connected Sensor Technologies [19]
Digital Health | Design and Evaluation of DHI Framework [14]
Digital Health | Health technology assessment framework for digital healthcare services (Digi HTA) [20]
Digital Health | Digital Health Score Card [32]
Health (and care) technologies | Health Technology Adoption Framework [33]
Health (and care) technologies | Nonadoption, Abandonment, Scale-up, Spread, and Sustainability Framework (NASSS Framework) [34]
E-health programs | Khoja–Durrani–Scott Framework for e-Health Evaluation [22]
Clinical informatics interventions | RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) (expanded to clinical informatics) [35]
Purpose and perspectives of the frameworks
Technology Field | Framework | Authors/Year | Perspective | Stated Purpose | Success Definition/Description
---|---|---|---|---|---
Information and Communication technologies (Health Information Systems (HIS)) | Infoway benefits evaluation Framework | Lau et al. 2007 [25] | Investment programs for digital technologies (to guide evaluations) | 1. Provide a high-level evidence-based model to guide subsequent field evaluation | Success measured by analysing the results of the evaluation (factors based on the van der Meijden et al. model [36]) |
Information and Communication technologies (Health Information Technologies (EHR)) | Health Information Technology Evaluation Framework (HITREF) | Sockolow et al. 2012 [26] | Universal perspective (mainly influenced by health services research and informatics) | 1. Conceptual tool for framing evaluation studies in assessing EHR-based implementations in organizational, systematic, and environmental contexts 2. Displaying evaluation criteria | No success definition (measuring the success by analysing the results of the evaluation) |
Information and Communication technologies (Hospital Information Systems) | Hospital Information System (HIS) Success Framework | Sadoughi et al. 2013 [27] | Universal perspective | 1. Identification of Hospital Information System success and failure factors and the evaluation methods of these factors | Success as a dynamic concept: success is when the technology achieves its intended purpose (plus time, budget, and user satisfaction) |
Information and Communication technologies (Integrated Health Information Systems (IHIS)) | Development of an Evaluation Framework for Health Information Systems (DIPSA Framework) | Stylianides et al. 2018 [28] | Healthcare Organization | 1. Evaluation framework for hospitals utilizing IHIS to help identify any existing deficiencies in the system | No success definition (measuring the success by analysing the results of the evaluation) |
Information and Communication technologies (Health Information Systems) | Human, Organization, Process and Technology-fit (HOPT-FIT) | Yusof 2019 [29] | Healthcare Organization (focus on technology-induced errors) | 1. Evaluate HIS performance and efficiency 2. Systematically guide error evaluation 3. Describe the Human-Organization-Process-Technology fit | No success definition (measuring success with the included dimensions of HIS success) |
Information and Communication technologies (Clinical Information Systems (CIS)) | Clinical Information Systems Success Model (CISSM) | Garcia-Smith & Effken 2013 [21] | Nurse’s perspective | 1. Framework for evaluating CIS success from the nurse’s perspective | Success = net benefit (“degree to which a nurse believes that using a particular system enhances job performance”) |
Information and Communication technologies (Information and Communication technologies for nurses) | Adapted nursing care performance framework | Rouleau et al. 2017 [30] | Nurse’s perspective | 1. Illustrate how ICT interventions influence nursing care and impact health outcomes | No success definition (measuring the success by analysing the results of the evaluation) |
Telemedicine/Telecare | Model for Assessment of Telemedicine (MAST Manual) | Kidholm et al. 2010 [31] | Universal perspective (user-based decision making, research) | 1. Describe the effectiveness of telemedicine applications 2. Describe their contribution to quality of care 3. Produce a basis for decision making | No success definition (measuring the success by analysing the results of the evaluation) |
Telemedicine/Telecare | Comprehensive evaluation framework for telemedicine implementation | Chang 2015 [24] | Universal perspective (decision making for individuals, organizations, and communities) | 1. Summarising important themes for the evaluation of telemedicine systems 2. Support related stakeholders’ decision-making by promoting general understanding, and resolving arguments and controversies | Long-term implementation |
Telemedicine/Telecare | The layered telemedicine implementation model | Broens et al. 2007 [23] | Universal perspective (the focus on individual determinants/perspectives changes throughout the development life cycle) | 1. Detailed classification of the determinants of the success of future telemedicine implementations | Successful implementation (“putting an idea or a concept into actual practice”) |
Sensor Technologies (Connected Sensor Technologies, including wearables and biosensors) | Evaluation Framework for Fit-For-Purpose Connected Sensor Technologies | Coravos et al. 2020 [19] | Healthcare System Perspective (users and other stakeholders) | 1. Working evaluation framework that reflects different types of risks 2. The framework is intended to better manage these risks 3. Make information on sensor technologies more comparable and understandable | No success definition (success could be measured by analysing the results of the evaluation and comparing them with the standards for connected sensors) |
Digital Health (Digital Health Interventions (DHI)) | Design and Evaluation of DHI Framework | Kowatsch et al. 2019 [14] | Universal perspective (researchers and practitioners) | 1. Framework for the design and evaluation of DHI 2. Showing evaluation criteria and implementation barriers to be considered during the life cycle phases of DHI 3. Support researchers and practitioners from conception to large-scale implementations | A successful DHI needs to consider “the selection of suitable evaluation criteria and the overcoming of implementation barriers” |
Digital Health (Digital Healthcare Services: mHealth, AI, and robotics) | Health technology assessment framework for digital healthcare services (Digi HTA) | Haverinen et al. 2019 [20] | Healthcare System Perspective (decision making) | 1. Inform decision-makers in order to better support the introduction of new health technologies | No success definition (success could be measured by analysing the results of the evaluation) |
Digital Health (Digital Health Technologies) | Digital Health Score Card | Mathews et al. 2019 [32] | Universal perspective (multi-stakeholder approach) | 1. Multi-stakeholder approach to objectively evaluate digital health solutions | Measuring the success by analysing the results of the evaluation (Success as the successful delivery of validated digital health solutions) |
Health (and care) technologies | Health Technology Adoption Framework | Poulin et al. 2013 [33] | Healthcare Organization | 1. Framework with clear, user-validated criteria for evaluating new health technologies for adoption at the local level | No success definition (success could be measured by analysing the results of the evaluation) |
Health (and care) technologies | Nonadoption, Abandonment, Scale-up, Spread, and Sustainability Framework (NASSS Framework) | Greenhalgh et al. 2017 [34] | Universal perspective | 1. Framework to help predict and evaluate the success of a technology-supported health or social care program 2. Help to design, develop, implement, scale up, spread, and sustain technology-supported health or social care programs by identifying key challenges in different domains and the interactions between them | Adoption, scale-up, spread, and sustainability of a technology |
E-health programs | Khoja–Durrani–Scott Framework for e-Health Evaluation | Khoja et al. 2013 [22] | Universal perspective (included tools usable for managers, healthcare providers, and clients) | 1. Comprehensive Framework to show relevant themes for e-health evaluation | No success definition (measuring the success by analysing the results of the evaluation) |
Clinical informatics interventions | RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) (expanded to clinical informatics) | Bakken & Ruland 2009 [35] | Healthcare Organization (implementation in organizational practice) | 1. Used for the design, implementation, evaluation, and reporting of clinical informatics interventions, with the goal of translating research into practice | No success definition (measuring the success by analysing the results of the evaluation) |
Success definitions/descriptions
Strengths and weaknesses of the frameworks
Legend: + = strength, – = weakness, +/− = both strength and weakness

Frameworks | Description of the purpose (and the addressed question(s)) | Description of the application setting | Description of the technology (area) | Clarity/complexity of illustration | Visualization of connections and relationships within the framework | Transparent definitions of terms and key concepts | Concrete application strategy and instructions for use | Instruction on how the results can be interpreted | Transparency of development process | Reflection of the limitations of the framework | Transferability of the framework (settings, technologies, questions) | + | – | +/−
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Health Technology Adoption Framework [33] | + | + | + | + | – | + | + | + | + | + | +/− | 9 | 1 | 1 |
Clinical Information Systems Success Model (CISSM) [21] | + | + | + | + | + | + | + | + | + | – | +/− | 9 | 1 | 1 |
Nonadoption, Abandonment, Scale-up, Spread, and Sustainability Framework (NASSS Framework) [34] | + | +/− | +/− | + | + | + | +/− | + | + | + | + | 8 | 0 | 3 |
Health Information Technology Evaluation Framework (HITREF) [26] | + | +/− | + | + | + | + | + | – | + | – | + | 8 | 2 | 1 |
Evaluation Framework for Fit-For-Purpose Connected Sensor Technologies [19] | + | +/− | + | + | – | + | + | + | + | – | + | 8 | 2 | 1 |
Hospital Information System Success Framework [27] | + | + | + | – | – | + | + | – | + | + | + | 8 | 3 | 0 |
Adapted nursing care performance framework [30] | + | +/− | + | + | + | + | – | +/− | – | + | + | 7 | 2 | 2 |
The layered telemedicine implementation model [23] | + | +/− | + | + | + | + | +/− | – | + | + | – | 7 | 2 | 2 |
Model for Assessment of Telemedicine (MAST Manual) [31] | + | +/− | + | – | – | + | + | – | + | + | + | 7 | 3 | 1 |
Health technology assessment framework for digital healthcare services (Digi HTA) [20] | + | +/− | + | – | – | + | + | + | + | – | + | 7 | 3 | 1 |
Infoway benefits evaluation Framework [25] | + | +/− | +/− | + | + | + | – | – | + | – | + | 6 | 3 | 2 |
Design and Evaluation of DHI Framework [14] | + | +/− | +/− | – | – | + | + | – | + | + | + | 6 | 3 | 2 |
RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) (expanded to clinical informatics) [35] | + | +/− | +/− | – | – | + | + | + | + | – | + | 6 | 3 | 2
Development of an Evaluation Framework for Health Information Systems (DIPSA Framework) [28] | + | + | + | – | – | + | – | – | + | – | + | 6 | 5 | 0 |
Khoja–Durrani–Scott Framework for e-Health Evaluation [22] | + | +/− | +/− | – | + | + | – | – | + | – | + | 5 | 4 | 2 |
Digital Health Score Card [32] | + | +/− | +/− | + | + | + | – | – | – | – | – | 4 | 5 | 2 |
Human, Organization, Process and Technology-fit (HOPT-FIT) [29] | + | +/− | + | + | + | – | – | – | – | – | – | 4 | 6 | 1 |
Comprehensive evaluation framework for telemedicine implementation [24] | + | +/− | + | + | – | – | – | – | – | – | – | 3 | 7 | 1 |
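The three summary columns on the right count, for each framework, how many of the eleven criteria were rated +, –, or +/− (the three counts in each row therefore add up to 11). As a minimal, purely illustrative sketch, the tally below reproduces the summary values for the first data row (Health Technology Adoption Framework), with the ratings transcribed in ASCII.

```python
from collections import Counter

# Ratings of the Health Technology Adoption Framework [33] across the eleven
# criteria, transcribed from the first data row of the table above (ASCII form).
ratings = ["+", "+", "+", "+", "-", "+", "+", "+", "+", "+", "+/-"]

counts = Counter(ratings)
# Prints 9 1 1, matching the framework's summary columns (+ / – / +/−).
print(counts["+"], counts["-"], counts["+/-"])
```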
Areas of evaluation in relation to the assigned perspectives
