This Validated Pattern is based on a demo implementation of an automated data pipeline for chest X-ray analysis previously developed by Red Hat for the US Department of Veterans Affairs. The original demo can be found here.
This validated pattern includes the same functionality as the original demonstration. The difference is that we use the GitOps framework to deploy the pattern, including operators, namespace creation, and cluster configuration. Using GitOps provides a much more efficient means of continuous deployment.
What does this pattern do?
- Ingests chest X-rays from a simulated X-ray machine and puts them into an object store based on Ceph.
- The object store sends a notification to a Kafka topic.
- A Knative Eventing listener on the topic triggers a Knative Serving function.
- An ML-trained model running in a container makes a pneumonia risk assessment for each incoming image.
- A Grafana dashboard displays the pipeline in real time, along with incoming, processed, and anonymized images, as well as full metrics collected from Prometheus.
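The notification-driven steps above can be sketched in Python. This is an illustrative sketch, not code from the pattern: it assumes the bucket notification arrives in the AWS S3 event-record format that Ceph RGW emits, and `assess_pneumonia_risk` is a hypothetical stand-in for the call into the containerized model.

```python
import json

# Illustrative sketch of the Knative Serving function's first step:
# parsing a Ceph RGW bucket notification (AWS S3 event-record format)
# to find the uploaded X-ray objects.
def extract_image_refs(event_json):
    """Return (bucket, key) pairs for each object named in the event."""
    event = json.loads(event_json)
    refs = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            refs.append((bucket, key))
    return refs

# A real handler would fetch each object from the S3 endpoint and pass
# it to the containerized model; assess_pneumonia_risk is hypothetical.
def handle_notification(event_json, assess_pneumonia_risk):
    for bucket, key in extract_image_refs(event_json):
        assess_pneumonia_risk(bucket, key)
```

In the deployed pattern this logic runs inside a Knative Serving container, so it only consumes resources while notifications are flowing, which is the point of using serverless eventing here.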
This pipeline is showcased in this video.
- How to use a GitOps approach to stay in control of configuration and operations.
- How to deploy AI/ML technologies for medical diagnosis using GitOps.
Red Hat Technologies
- Red Hat OpenShift Container Platform (Kubernetes)
- Red Hat OpenShift GitOps (ArgoCD)
- Red Hat AMQ Streams (Apache Kafka Event Broker)
- Red Hat OpenShift Serverless (Knative Eventing, Knative Serving)
- Red Hat OpenShift Data Foundation (cloud-native storage)
- Grafana dashboard (OpenShift Grafana Operator)
- Open Data Hub
- S3 storage
In this iteration of the pattern there is no edge component. Edge deployment capabilities are planned as part of the pattern architecture in future releases.
Components run on OpenShift either in the data center, at the medical facility, or in a public cloud running OpenShift.
The diagram below shows the components that are deployed with the various networks that connect them.
The diagram below shows the components that are deployed with the data flows and API calls between them.