Most current medical CBIR systems perform retrieval based only on "imaging signatures" generated by extracting pixel-level quantitative features, and only rarely incorporate a feedback mechanism to improve retrieval performance. In addition, current medical CBIR approaches do not routinely incorporate semantic terms that model the user's high-level expectations, which can limit CBIR performance.

We propose a retrieval framework that exploits a hybrid feature space (HFS), built by integrating low-level image features and high-level semantic terms through rounds of relevance feedback (RF), and performs similarity-based retrieval to support semi-automatic image interpretation. The novelty of the proposed system is that it can impute the semantic features of the query image by reformulating the query vector representation in the HFS via user feedback. We implemented our framework as a prototype that performs retrieval over a database of 811 radiographic images containing 69 unique types of bone tumors.

We evaluated system performance by conducting independent reading sessions with two subspecialist musculoskeletal radiologists. On the test set, the proposed retrieval system at the fourth RF iteration of the sessions conducted with both radiologists achieved a mean average precision (MAP) of ~0.90, whereas the initial MAP with baseline CBIR was 0.20. In addition, we achieved high prediction accuracy (>0.8) for the majority of the semantic features automatically predicted by the system.

Our proposed framework addresses some limitations of existing CBIR systems by incorporating user feedback and simultaneously predicting the semantic features of the query image. This obviates the need for the user to provide those terms and makes CBIR search more efficient for inexperienced users and trainees.
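The abstract does not detail the reformulation algorithm, but the described loop (query vector updated in the HFS from user-marked relevant and non-relevant results, evaluated by MAP) can be sketched with a classic Rocchio-style update. All function names, feature layouts, and weights below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def reformulate_query(query, relevant, nonrelevant,
                      alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio-style RF update: move the HFS query vector toward the
    centroid of user-marked relevant images and away from the centroid
    of non-relevant ones. Weights are conventional defaults, not the
    paper's tuned parameters."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return q

def retrieve(query, database, k=5):
    """Rank database HFS vectors by Euclidean distance to the query
    and return the indices of the k nearest images."""
    dists = np.linalg.norm(np.asarray(database) - query, axis=1)
    return np.argsort(dists)[:k]

def average_precision(ranked_relevance):
    """Average precision for one query, given a ranked list of 0/1
    relevance labels; MAP is the mean of this over all queries."""
    hits, ap = 0, 0.0
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            ap += hits / rank
    return ap / max(hits, 1)
```

Each RF iteration would re-run `retrieve` with the reformulated query, so vectors marked relevant pull the query (including its imputed semantic-term components) toward the target tumor type.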
The encouraging results of the current study highlight possible new directions in radiological image interpretation employing semantic CBIR combined with relevance feedback on visual similarity.
PubMedID: 29981490