Updated: Jul 8, 2020
The healthy functioning of the human body is affected by a myriad of diseases. Some diseases are monitored and treated by attaching external medical devices, including pacemakers, chest leads, sternal sutures, and various tubes such as nasogastric and endotracheal tubes. These foreign objects are easy to identify in a chest X-ray and do not require any computer-aided methods for detection, but their presence can weaken the performance of automatic diagnostic methods. For example, pulmonary disease detection models produce more false positives when they learn from training data containing images with foreign objects. At DeepTek, we developed a technique to detect these foreign objects to help us strengthen our other models.
We selected chest leads, pacemakers, and sternal sutures as the foreign objects to detect. We initially built a three-class classification model — chest leads, pacemakers, and sternal sutures being the three classes — using deep neural networks. Its recall (the model's ability to identify cases where chest leads, pacemakers, or sternal sutures are present) was below 0.6 for all three objects, much lower than the ideal recall of 1. This led us to build an object detection model, which could zero in on the right region of interest (ROI), as an alternative to the classification model. The three-class object detection model turned out to be superior to the classification model, with an average recall of 0.85 and an average Dice score (a measure of how accurately the object is localized) of 0.72.
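The two metrics above can be made concrete with a small sketch. This is not DeepTek's evaluation code; the (x1, y1, x2, y2) box format and the helper names are assumptions for illustration.

```python
# Illustrative implementations of the metrics discussed above:
# recall over ground-truth objects, and the Dice score measuring
# localization overlap between a predicted and a true box.
# Box format (x1, y1, x2, y2) is an assumption, not from the post.

def box_intersection(a, b):
    """Area of overlap between two (x1, y1, x2, y2) boxes."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def box_area(a):
    return max(a[2] - a[0], 0) * max(a[3] - a[1], 0)

def dice_score(pred, truth):
    """Dice = 2 * |A ∩ B| / (|A| + |B|); 1.0 means perfect localization."""
    inter = box_intersection(pred, truth)
    denom = box_area(pred) + box_area(truth)
    return 2.0 * inter / denom if denom else 0.0

def recall(num_detected, num_actual):
    """Fraction of ground-truth objects the model found."""
    return num_detected / num_actual if num_actual else 0.0
```

For instance, a predicted box shifted halfway off its target yields a Dice score of 0.5, while an exact match yields 1.0.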
Our model can be used both as a classifier and as an ROI extraction tool for the three foreign objects. It detects and classifies 95% of the occurrences of each of these foreign objects, making it a useful filter for other disease classification and object detection models. In the next section, we present a case study in which a foreign object detection model helped reduce false positives.
Use Case: Reducing False Positives Given by Granuloma/End-On Vessel/Nodule Detection Models in Chest X-Rays
Findings in medical images are often misdiagnosed as one another due to similar visual characteristics. Lung granulomas, end-on vessels, and nodules are examples: each appears as a tiny spot in a chest X-ray. Many computer-aided approaches exist to differentiate between them using chest X-rays, but one major problem is that such models confuse granulomas, end-on vessels, and nodules with foreign bodies.
Figure: (A) original X-ray image; (B) red bounding boxes indicating predictions given by the granuloma/end-on/nodule detection model — the larger of the two boxes is a chest lead that the model confused with a granuloma; (C) yellow masks depicting locations of foreign objects given by the foreign objects model; (D) foreign objects filter applied to the granuloma/end-on/nodule model. The foreign objects filter corrected the original model's predictions.
While developing a model to detect granulomas, end-on vessels, and nodules, we found that almost 50% of its false positives were due to chest leads attached to the patients during scanning. Eliminating the false positives due to chest leads would therefore cut the total number of false positives roughly in half. To achieve this, we used a foreign object detection model that detected chest leads in an image, applying it as a filter on top of the granuloma/end-on vessel/nodule detection model. During evaluation, we observed a 48% reduction in the number of false positives just by applying the chest leads filter.
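One simple way to realize such a filter is to suppress any disease detection whose box is mostly covered by a detected foreign object. This is a hedged sketch of that idea, not DeepTek's implementation; the (x1, y1, x2, y2) box format, the function names, and the 0.5 coverage threshold are assumptions.

```python
# Sketch of a "foreign objects filter": drop disease detections whose
# bounding box is mostly covered by a detected foreign object (e.g. a
# chest lead). Threshold and box format are illustrative assumptions.

def _intersection_area(a, b):
    """Area of overlap between two (x1, y1, x2, y2) boxes."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def _area(a):
    return max(a[2] - a[0], 0) * max(a[3] - a[1], 0)

def apply_foreign_object_filter(disease_boxes, foreign_boxes, threshold=0.5):
    """Keep only disease boxes not covered by any foreign-object box.

    A disease box is suppressed when more than `threshold` of its area
    lies inside some detected foreign object.
    """
    kept = []
    for box in disease_boxes:
        area = _area(box)
        covered = any(
            area and _intersection_area(box, fo) / area > threshold
            for fo in foreign_boxes
        )
        if not covered:
            kept.append(box)
    return kept
```

With this design, a spurious "granuloma" box sitting on a chest lead is removed, while detections elsewhere in the lung field pass through unchanged.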
After applying the chest leads filter, our model detects granulomas, end-on vessels, and nodules rather well, with fewer false positives, and gives their positions enclosed in bounding boxes. We are currently investigating ways to further reduce the number of false positives given by our models, because each false positive causes issues — for example, every false positive case requires extra radiologist attention. By reducing false positives, we are reducing radiologist burden and increasing confidence in the model.