IVD Data Hub Landscape Assessment


Background:

The IVD and Clinical Information System (CIS) industries strive to meet the needs of customers and promote public health through the provision of safe and effective products. One barrier to laboratory interoperability is the lack of standardization. Addressing this through semantic harmonization and standardization will have a significant positive impact on clinical laboratory operations, as it will substantially reduce the time and cost involved in deploying, connecting, and updating instruments in the laboratory. Such standardization will reduce the need for vendor-customized connectivity implementations and serve as a catalyst and common incentive for stakeholders across the ecosystem to promote interoperability of laboratory data. In this context, SHIELD aims to promote the use of high-quality data for secondary use purposes. A practical application of this goal is the establishment of a tool or data system that serves as an IVD Data Hub to facilitate the use of RWD. An IVD Data Hub provides an incentive for all stakeholders to support regulatory decision making, improve public health reporting, and develop and effectively implement future innovation.

 

Vision

 

Landscape outside of SHIELD

A review of other initiatives outside of SHIELD that seek to establish a publicly available data platform empowering the use of quality real-world data for secondary use purposes (e.g., public health reporting, regulatory reviews):

https://www.iderha.org/about/work-packages/infrastructure-and-tools-integration-data-wp1

https://seer.cancer.gov/data-software/crwdi/

Baseline Requirements & Use Cases:

Use case 1: Real-World IVD Clinical Evidence

Use case overview

This use case is described in both:

  • The MDIC publication, where an IVD manufacturer intends to perform an observational study for a given test (i.e., IVD system & reagent) in order to gain clinical performance insight.

  • The draft FDA guidance, Appendix B, presenting examples where RWE is used.

This draft guidance contains explicit data requirements described in sections A-1, Data Availability (lines 385-417), and B-1, Reliability (lines 467-513).

 

Actors

IVD manufacturers

May also be a regulatory body or a medical laboratory representative.

Preconditions

Lab data

  • from one or multiple medical laboratories are made available with sufficient granularity to enable targeted analytics of the test and IVD system of interest.

  • are harmonized and standardized across the data providers

Relevant patient data 

 

 

This use case is detailed in the following situations:

Use Case 1-A

 

Use case overview

As illustrated in Figure 2 of the MDIC publication, either of two situations may be considered:

  • patients' leftover samples are tested with a candidate test followed by a comparator test (e.g., gold standard) according to routine clinical practice.

  • patients' leftover samples are tested with a candidate test and compared to clinical status.

 

 

Actors

IVD Manufacturer

Medical Laboratory

Data Hosting Service(s) - Identified RW Data Source

 

Preconditions

Lab performs test as part of routine clinical practice 

Patients have targeted condition/disease for the IVD 

Comparator test or clinical status must be available

 

Assumptions

  1. Candidate and Comparator Tests are run for each leftover patient sample, OR the Candidate Test is run for each leftover patient sample and the patient's clinical status is known

  2. Populations are characterized and comparable

  3. Data Sources are equivalent (i.e., the data is consistently represented and available)

  4. Test Target is the same 

 

Flow

  1. Candidate Test is run for each patient sample 

  2. Comparator Test is run for each patient sample

  3. Candidate and Comparator observations & results are sent to the Lab IS

  4. Patient & lab data are transferred to a repository suitable for data analytics

  5. Data analytics is performed across one or several repositories
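The flow above can be sketched end to end as a minimal pipeline. Every name and record shape here (ResultRecord, LabIS, the concordance metric) is an illustrative assumption, not part of any SHIELD or MDIC specification.

```python
from dataclasses import dataclass, field

@dataclass
class ResultRecord:
    """One test result as it moves through the flow (illustrative schema)."""
    patient_id: str
    test_role: str   # "candidate" or "comparator"
    result: str      # qualitative result, e.g. "pos"/"neg"
    udi_di: str      # DI part of the UDI of the IVD system/reagent

@dataclass
class LabIS:
    """Stand-in for the laboratory information system (step 3)."""
    records: list = field(default_factory=list)

    def receive(self, record: ResultRecord):
        self.records.append(record)

def transfer_to_repository(lis: LabIS) -> list:
    """Step 4: hand the accumulated records to an analytics repository."""
    return list(lis.records)

def concordance(repo: list) -> float:
    """Step 5: fraction of patients where candidate and comparator agree."""
    by_patient = {}
    for r in repo:
        by_patient.setdefault(r.patient_id, {})[r.test_role] = r.result
    pairs = [p for p in by_patient.values()
             if {"candidate", "comparator"} <= p.keys()]
    agree = sum(1 for p in pairs if p["candidate"] == p["comparator"])
    return agree / len(pairs)

# Steps 1-3: both tests are run on each leftover sample and sent to the LIS.
lis = LabIS()
for pid, cand, comp in [("p1", "pos", "pos"), ("p2", "neg", "pos"),
                        ("p3", "neg", "neg")]:
    lis.receive(ResultRecord(pid, "candidate", cand, "00810000000001"))
    lis.receive(ResultRecord(pid, "comparator", comp, "00810000000002"))

repo = transfer_to_repository(lis)
print(round(concordance(repo), 2))  # 2 of 3 patients agree: 0.67
```

The key structural point of the sketch is step 4: analytics never runs against the LIS directly, only against the repository handed over for that purpose.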

 

 

Use Case 1-B - Assess Health Technologies

 

Use case overview

This use case is described in the MDIC publication, where a regulatory body intends to perform an observational study for a given test (i.e., IVD system & reagent).

 

Actors

Regulatory body

 

Preconditions

Lab data

  • from multiple medical laboratories are made available with sufficient granularity to enable targeted analytics of the test and IVD system of interest.

  • are harmonized and standardized across the data providers

Relevant patient data are gathered from XXX

 

Flow

TBD

 

Use case 2: Aggregate data from numerous data sources

Use case overview

UC1 and UC2 require that laboratory data and patient data be aggregated in a data repository while preserving data ownership and data privacy according to XXX

 

Actors

Medical Laboratory

Data Hosting Service(s) - Identified RW Data Source

 

Preconditions

  • Standardized and harmonized data are produced at the data source and conveyed without losses across the entire health care chain: IVD system, LIS, EHR, etc.

 

Alternatives

  • Data are made available and aggregated in a centralized repository

  • Data are made available through a network of federated databases, leaving the data under the control & ownership of the production site (i.e. hospital or laboratory)
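The two alternatives can be contrasted in a small sketch: in the centralized case record-level data are pooled into one repository, while in the federated case each site answers the query locally and only aggregate counts leave the site. All data and function names are illustrative assumptions.

```python
# Illustrative sketch of the two aggregation alternatives; site data and
# function names are assumptions, not a prescribed architecture.

site_a = [{"patient": "a1", "result": "pos"}, {"patient": "a2", "result": "neg"}]
site_b = [{"patient": "b1", "result": "pos"}]

# Alternative 1: centralized -- record-level data are copied into one repository.
def centralize(*sites):
    pooled = []
    for records in sites:
        pooled.extend(records)
    return pooled

# Alternative 2: federated -- each site runs the query locally and only an
# aggregate count leaves the site; records stay under the producer's control.
def local_count(records, value):
    return sum(1 for r in records if r["result"] == value)

def federated_count(sites, value):
    return sum(local_count(records, value) for records in sites)

central = centralize(site_a, site_b)
# Both alternatives answer the analytic question identically; they differ
# only in where the record-level data physically reside.
assert local_count(central, "pos") == federated_count([site_a, site_b], "pos")
print(federated_count([site_a, site_b], "pos"))
```

The design trade-off is exactly the one named in the alternatives above: the centralized path simplifies analytics at the cost of moving records out of the production site; the federated path keeps ownership local at the cost of a query protocol.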

 

Flow

TBD

 

Baseline high-level requirements

  1. Lab data granularity: Lab data shall associate, from the producing IVD system,

    • ordered test

    • performed test(s)

    • test result(s): qualitative, quantitative, or semi-quantitative (depending on the test)

      • when applicable, quantitative results plus their transformation into qualitative results and a reference to the transformation standard

    • system & reagent UDIs, at least the DI part

    • ? to be completed ?

    (UC 1 & 2; priority 1)

  2. Patient data: Patient covariates that may impact the outcome of interest shall be recorded, such as (non-limiting list)

    • signs, symptoms, treatments, procedures, diagnoses, patient and family history

    • pre-existing conditions, labs, demographics, and results which may be used to construct covariates that are relevant to the study question

    (UC 1 & 2; priority 1)

  3. Longitudinality: Data longitudinality shall cover a sufficient time span to support RWE generation. (UC 1 & 2; priority 1)

  4. Data reliability: NOT SURE IF WE SHOULD REFER TO THE DRAFT GUIDANCE? SEE B-1, Reliability, lines 467-513.

  5. Standardization of lab data: The standards presented in CLSI AUTO 17 shall be used to describe the lab data elements from requirements #1 & #2. (UC 1 & 2; priority 1)

  6. IVD data source: IVD data systems shall produce standardized and harmonized lab data as proposed in CLSI AUTO 17. (UC 1 & 2)

  7. IVD to LIS: IVD data shall be sent to the Lab IS

    • without data losses

    • at all levels of granularity; in case of data generalization, the original data shall also be kept

    (UC 1 & 2)

  8. LIS to EHR: Lab data shall be passed from the LIS to the EHR

    • without data losses

    • at all levels of granularity; in case of data generalization, the original data shall also be kept

    (UC 1 & 2)

  9. IVD / LIS / EHR to data aggregation: IVD and patient data shall be received and passed

    • without data losses

    • at all levels of granularity; in case of data generalization, the original data shall also be kept

    (UC 1 & 2)
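The lab-data-granularity requirement can be made concrete with a sketch of a single record and a granularity check. Every field name and value below is an illustrative assumption, not a prescribed schema.

```python
# Hypothetical shape of a single lab data record meeting the granularity
# requirement; every field name and value here is an illustrative assumption.
record = {
    "ordered_test": "HBsAg screen",
    "performed_tests": ["HBsAg immunoassay"],
    "results": [{
        "quantitative": 0.45,           # e.g. signal/cutoff ratio
        "qualitative": "non-reactive",  # derived from the quantitative value
        "transformation_standard": "manufacturer IFU cutoff, S/CO < 1.0",
    }],
    "udi": {
        "instrument_di": "00810000000001",  # DI part of the instrument UDI
        "reagent_di": "00810000000002",     # DI part of the reagent UDI
    },
}

def has_required_granularity(rec: dict) -> bool:
    """Check the record carries every element named in the requirement."""
    results_ok = all(
        {"quantitative", "qualitative", "transformation_standard"} <= r.keys()
        for r in rec.get("results", [])
    )
    return (
        bool(rec.get("ordered_test"))
        and bool(rec.get("performed_tests"))
        and results_ok
        and {"instrument_di", "reagent_di"} <= rec.get("udi", {}).keys()
    )

print(has_required_granularity(record))  # True
```

Keeping the quantitative value alongside its qualitative transformation (rather than only the final interpretation) is what preserves granularity for later analytics.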

 

 

 

 



*Jurisdictional requirements must be followed. These are not captured in this set of requirements.

Actors

 

Systems -  RW Data Sources

  • Electronic health records (EHRs)

  • Laboratory information systems (LISs)

  • Administrative claims databases

  • Hospital discharge databases

  • Disease registries

  • Medical device registries

  • Image databases and repositories

  • Patient-reported outcomes (PROs)

  • Death indexes

  • Public health databases (e.g., Centers for Disease Control and Prevention)

 

Other

  • AI - Where do we put this

  • Synthetic Data 

 

 

 

RWD/RWE - 

 

Use Cases in Regulatory Decision Making

 

Support premarket authorization

 

  • In settings where clinical studies are impractical to conduct (e.g., rare diseases, long follow-up needed to achieve endpoint(s), cost)

  • For filling important evidentiary gaps not typically addressed with traditional clinical studies

  • When allowing sponsors to generate evidence in support of an effectiveness claim that is potentially useful to multiple stakeholders, regulators, patients, and the public

  • For potentially reducing time and cost of evidence development for regulatory decisions

 

Premarket Use Cases

  • Evidence to identify, demonstrate, or support the clinical validity of an IVD device

  • Generating performance characteristics such as clinical sensitivity and specificity

  • Evidence to support an Investigational Device Exemption (IDE) submission for a significant risk IVD

 

 

Study Design Types

  • Registry embedded

  • Feasibility planning

  • External Control of clinical study

  • Pragmatic trial

 

  • Clinical Performance studies - RWD is suitable for Observational Studies

  • Examples

    • Virtual clinical performance study.  Patients are tested with a candidate test followed by a comparator test (e.g., gold standard) according to routine clinical practice.

    • Scheme for RWD aid in diagnosis and follow-up.  RWD are intended to demonstrate the claim extension of an “aid in diagnosis” IVD for risk assessment/prognosis of a given disease or target condition.

    • RWD for an AI system as a binary test. RWD are from routine clinical practice in which positive results will be referred for additional testing.

    • Virtual clinical study for an AI system as a test with three categories. In this system, the positive category is divided into two categories (medium positive and high positive).

    • New IVD test using AI/Machine Learning. RWD are “multiple individual tests,” and the virtual study is a “new IVD device using AI/machine learning.”
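The virtual clinical performance study in the first example reduces to a 2x2 comparison of candidate results against the comparator. A minimal sketch, assuming binary results and treating the comparator as the reference standard; all data are invented for illustration.

```python
# Minimal sketch of the 2x2 analysis behind a virtual clinical performance
# study: candidate results scored against the comparator (reference standard).
# The result pairs below are invented for illustration.

def performance(pairs):
    """pairs: (candidate_result, comparator_result) tuples, values "pos"/"neg"."""
    tp = sum(1 for c, r in pairs if c == "pos" and r == "pos")
    fn = sum(1 for c, r in pairs if c == "neg" and r == "pos")
    tn = sum(1 for c, r in pairs if c == "neg" and r == "neg")
    fp = sum(1 for c, r in pairs if c == "pos" and r == "neg")
    return {
        "sensitivity": tp / (tp + fn),  # candidate-positive among reference-positive
        "specificity": tn / (tn + fp),  # candidate-negative among reference-negative
    }

pairs = ([("pos", "pos")] * 90 + [("neg", "pos")] * 10
         + [("neg", "neg")] * 95 + [("pos", "neg")] * 5)
perf = performance(pairs)
print(perf["sensitivity"], perf["specificity"])  # 0.9 0.95
```

The three-category AI example above would extend this to a 3x2 table (medium positive and high positive scored separately against the reference), but the scoring logic is the same.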

 

R&D Uses

  • Site or patient identification – for enrollment in clinical studies

  • Feasibility studies – to test protocols for new products or a potential new intended use

  • Selection of historical control, comparator, or predicate device(s) for clinical study

  • Quality improvements – surveillance of specific postmarket IVD performance factors

  • Meeting design control requirements – for example, RWD could be used as design input for new product development

  • New uses and expanded indications for IVD devices – safety and effectiveness of an assay outside of its intended use (i.e., “off-label” use), extending indications to other patient populations, tissues, or sample types; RWE can provide evidence for identifying and supporting new claims or otherwise expanding the labeling of an IVD

  • Methods selection – use of RWD to simulate clinical trial results or outcomes of testing prior to implementing a costly clinical trial

  • Review and comparison of safety and effectiveness endpoints – Objective Performance Criteria and Performance Goals derived from clinical studies and/or registries may be used

  • Redefining current use models, for example, to define new scoring methods in histology or to define new cutpoints for existing assays

 

 

IVD Data Characteristics

 

  • Inclusion of data from heterogeneous patients with a wide range of characteristics, experiences, comorbidities, and treatment protocols.

  • The target patient population identified in the IFU should be included in the analysis, along with any additional patient populations identified as relevant to the IVD's performance.

  • RWD may also include pre-analytical variables (e.g., specimen collection containers, sample storage, and handling).

  • Not all RWD are of appropriate quality.

  • RWE must be fit for use in regulatory submissions.

  • The level of granularity needs to be captured/documented in the study design.

 

Can we define which IVD data in the RWD Sources will be acceptable for use in various use cases?

  • Data Quality

  • Validity

  • Completeness

  • Fit for Purpose
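Of the four criteria above, completeness is the most mechanical to make concrete. The sketch below computes per-field completeness across a set of records; the required fields and the 0.9 threshold are assumptions for illustration, not acceptance criteria from any guidance.

```python
# Sketch of a per-field completeness check, one concrete slice of the data
# quality question above. Field names and the 0.9 threshold are assumptions.

REQUIRED_FIELDS = ["result", "udi_di", "collection_date"]

def completeness(records, fields=REQUIRED_FIELDS):
    """Fraction of records with a non-empty value, per required field."""
    n = len(records)
    return {f: sum(1 for r in records if r.get(f)) / n for f in fields}

records = [
    {"result": "pos", "udi_di": "00810000000001", "collection_date": "2024-01-02"},
    {"result": "neg", "udi_di": "00810000000001", "collection_date": ""},
    {"result": "neg", "udi_di": "00810000000001", "collection_date": "2024-01-05"},
    {"result": "pos", "udi_di": "", "collection_date": "2024-01-06"},
]

scores = completeness(records)
# Hypothetical fitness rule: every required field at least 90% complete.
fit = all(v >= 0.9 for v in scores.values())
print(scores["result"], fit)  # 1.0 False
```

Validity and fit-for-purpose checks would need reference data (e.g., permitted code sets) rather than a purely structural rule like this one.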

 

Consider data granularity

  • Data lost in downstream workflow (e.g., LOINC codes)

  • Data not exchanged from the originating system 

 

Regulatory requirements - 

 

Dependencies:

  1. Define the role and responsibilities of each party in building the IVD Data Hub:

a.     Data creators/collectors

                                          i.    Laboratories

                                         ii.    IVD vendors

                                        iii.    Laboratory information and EHR system vendors, middleware vendors

                                        iv.    Standards development organizations

b.     Data sources (e.g., hospitals, laboratories, registries, …)

c.     Data Hub users (e.g., researchers, clinicians, patients, public health reporting, IVD vendors, regulatory agencies, …)

d.     Data collection/sharing responsibilities and restrictions

  2. Successful completion and implementation of the LIDR standard:

a.     Build-out of the LIDR infrastructure.

b.     Build-out of LIDR content.

c.     Finalization and definition of some of the other data elements being proposed: harmonization indicator, UDI (instrument vs. kit vs. reagent systems).

                                          i.    Create a master catalog mapping each of the test/assay/reagent names, units, harmonization indicators, and UDIs to LOINC/SNOMED codes.

d.     Define the process for additions/expansions to the LIDR.

e.     Determine whether LIDR is a new SDO and who oversees/curates it.

f.      Eliminate the opportunities for labs to create variability in the LIDR elements they utilize.

g.     Funding for LIDR: staffing required to manage and curate it, ensure its entries are correct and, if not, communicate back to IVD vendors.

h.     Determine whether existing LOINC/SNOMED codes are sufficient or can be used interchangeably.

i.      Interface developed between LIS and LIDR.

j.      Interface developed between IVD vendors and LIDR.

  3. Implementation of LIDR content in the IVD vendor platforms:

a.     IVD vendors must be diligent in pursuing correct codes, which may mean hiring staff dedicated to determining them based on methods. A strong knowledge of lab methods will be required.

  4. Implementation of LIDR content in the LIS vendor platforms:

a.     Development of interfaces and logic/rules that can digest the encoded lab results and utilize them appropriately within the LIS/EMR, so that all required elements are evaluated and compared to existing test records to determine whether they are the same tests and should be trended together or utilized in best-practice advisories.

  5. Education and training on LIDR, interoperability, and standards for labs, LIS teams, and IVD vendors:

a.     Including general marketing and communication about the importance of this endeavor and the goals to be achieved.

b.     Publications in ASCP, AMIA, etc.

  6. Availability of required data elements from the sending source.

  7. Data use/sharing agreements in place with the sending source.

  8. Transmission interface from the lab to the Data Hub in a structured and standardized way:

a.     Completion of any work that may need to be done by HL7, FHIR, etc.

  9. Determination of an entity to create, manage, and maintain the Data Hub:

a.     Including the necessary staff, expertise, and budget.

  10. Determination of patient identity.

  11. Will this be de-identified patient data?

a.     If not, the necessary HIPAA measures and considerations will need to be taken.

  12. Security and access.
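The master catalog mapping test/assay/reagent names, units, harmonization indicators, and UDIs to LOINC/SNOMED codes can be pictured as a lookup keyed on the vendor's assay identity. A toy sketch; every entry and code below is an invented placeholder, not real LIDR, LOINC, or SNOMED content.

```python
# Toy sketch of the master catalog idea: vendor assay identity (name, unit,
# UDI-DI) mapped to standard codes. Every entry below is an invented
# placeholder, not real LIDR/LOINC/SNOMED content.

CATALOG = {
    # (assay_name, unit, udi_di) -> standardized coding
    ("Glucose serum", "mg/dL", "00810000000001"): {
        "loinc": "0000-0",                # placeholder code
        "snomed": "000000000",            # placeholder code
        "harmonization_indicator": True,  # result traceable to a reference
    },
}

def lookup(assay_name, unit, udi_di):
    """Resolve a vendor assay to its standard codes, or None if uncataloged."""
    return CATALOG.get((assay_name, unit, udi_di))

entry = lookup("Glucose serum", "mg/dL", "00810000000001")
print(entry is not None and entry["harmonization_indicator"])  # True
print(lookup("Glucose serum", "mmol/L", "00810000000001"))     # None: unit differs
```

Keying on the full identity tuple (including unit and UDI-DI, not name alone) is what closes the door on the lab-level variability named in the dependencies above: two assays that differ in any element of the tuple resolve to different catalog entries or to none.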

 

 

Other ongoing efforts around IVD Databases: