
1      Accountability and Key Performance Indicators 

The strategies and related actions needed to transform the US laboratory system must be monitored for accountability.  Accountability requires the joint action of all stakeholders.  This section provides a framework to track milestones and metrics, to analyze case studies, and to provide critical feedback to the program to ensure that progress is being made toward laboratory interoperability.  A set of key performance indicators has been created for this effort, and an approach to implementing them is offered.

...

The framework for this evaluation effort is layered, participatory, and based on a logic model.  A layered evaluation directs attention to a variety of metrics and milestones, from global goals (including improving patient safety), to the efforts of partners (i.e., federal agencies and private sector partners), to the programmatic efforts of SHIELD.  The evaluation is participatory, meaning that it is ongoing and will help redirect implementation of the plan.  To learn from the early stages of implementation and to accommodate changes in the environment, the evaluation effort will provide important information to adjust and correct the future course of work.  Traditional evaluation has happened at the end of a program; participatory evaluation is integrated into program management.  A logic model is also used to organize the evaluation around the transformation of the laboratory process, beginning with the manufacturer, through clinical facilities, to the end users of the data.

...

Further, the evaluation is linked to specific components of the strategic plan so that accountability is specific to the groups responsible for each activity. 

...

SHIELD Accountability planning is based on the CDC Introduction to Program Evaluation for Public Health Programs, a “how to” guide for planning and implementing evaluation activities (Figure 6).  The first two steps have been accomplished as part of the early development of SHIELD and through the strategic planning process.  Steps 3 and 4 are developed in this section.  Steps 5 and 6 are referred to here to guide future activities.

1.1       Steps of SHIELD Evaluation

  1. Engage Stakeholders

This section of the report pulls together stakeholder engagement efforts that have been done previously and provides the evaluation framework with an understanding of the value of SHIELD to stakeholders.  An evaluation should provide evidence to stakeholders that value has been realized by building and using SHIELD.  A description of stakeholders and the value they derive from SHIELD appears in section x of this Strategic Plan.

...

Ensure use and share lessons learned with these steps: design, preparation, feedback, follow-up, and dissemination

1.2       Evaluation Design: Layers, Logic Model and Methods

The evaluation design recommended for SHIELD is multi-layered and mixed-method.  The layers of the evaluation are set out in Figure x.  Within the base layer is the “logic model” called for by the CDC evaluation guide, which for SHIELD is the process by which harmonized coding is established as the expanded LIVD file and implemented at each step in the process by which laboratories generate data.  The activities of other ecosystem partners must also be evaluated.  

 

1.2.1        Layers

The layers of the evaluation framework are presented in Figure x above.

...

Program outcomes are tied to specific activities in the strategic plan and support its broader goals.

 

Figure 7. KPIs: layered metrics and milestones, each keyed to the KPI table below

  • Mission impacts: high-level outcomes linked directly to the strategic goals (5-6 years)
  • External impact metrics/milestones: impacts for individual stakeholders (3-5 years)
  • Internal output metrics/milestones: SHIELD projects with partners (2-4 years)
  • Activity-focused metrics/milestones (program outcomes): specific to SHIELD (0-2 years)

1.2.2        Methods

The proposed mixed-method evaluation will track KPIs (metrics and milestones), use case studies, and conduct interviews with those involved in implementing this Strategic Plan.

...

| KPI              | Metric | Goal | Data Source |
| Mission Impact   |        |      |             |
| External Impact  |        |      |             |
| Internal Impact  |        |      |             |
| Program Outcomes |        |      |             |

Figure 8. KPI by layers
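The KPI table above is a template to be filled in by the evaluation team. As an illustrative sketch only, the table's structure (layer, metric, goal, data source) could be tracked as simple records grouped by layer; the example KPI entry below is hypothetical, not part of the plan:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """One row of the KPI table in Figure 8."""
    layer: str        # Mission Impact, External Impact, Internal Impact, or Program Outcomes
    metric: str
    goal: str
    data_source: str

# Hypothetical entry for illustration; actual KPIs are to be defined by the SHIELD evaluation team.
registry = [
    KPI(layer="Program Outcomes",
        metric="Share of in-scope tests mapped to the expanded LIVD file",
        goal="Defined by evaluation team",
        data_source="SHIELD project reports"),
]

# Group KPIs by layer to mirror the layered reporting in Figure 7.
by_layer: dict[str, list[KPI]] = {}
for kpi in registry:
    by_layer.setdefault(kpi.layer, []).append(kpi)
```

Grouping by layer makes it straightforward to report annually against each layer's time horizon.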

1.2.3        Logic Model

The logic model for the evaluation follows the clinical laboratory process and the flow of data to users.  To achieve interoperability, the SHIELD strategic plan recommends action at each step.  The Evaluation Framework provides a way to create accountability around each of these activities.  The logic model for the proposed evaluation is the same model that describes the flow of laboratory data adopted in Section 1 of this document.  

The logic model captures the roles of the various stakeholders, sequences of activities (some of which are dependent on others), activities to build (and refine) infrastructure, and pilots in the laboratory enterprise that bring together the various pieces of the project for testing.

...

Figure 9. Logic Model (adaptation of the End-to-End Diagram)

 

1.3       Conducting the Evaluation

The development of this system of KPIs may best be done by a contract firm with experience in conducting evaluations.  Although this participatory evaluation differs from a third-party evaluation, it should nonetheless be conducted by a group with the appropriate skills and some distance from the programs to ensure objectivity.  Annual written reports and quarterly meetings with each aspect of the SHIELD initiative are proposed.

The estimated cost of the evaluation over the 5-year period is $xxx.

...



...

[i] CDC Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. https://www.cdc.gov/eval/framework/index.htm

...