Motion estimation-based image enhancement in ultrasound imaging

Renaud Morin (Lead / Corresponding author), Adrian Basarab, Stéphanie Bidon, Denis Kouamé

    Research output: Contribution to journal › Article › peer-review

    17 Citations (Scopus)


    High resolution medical ultrasound (US) imaging is an ongoing challenge in many diagnostic applications and can be achieved either by instrumentation or by post-processing. Though many works have considered the issue of resolution enhancement in optical imaging, very few have investigated it in US imaging. In optics, several algorithms have been proposed to achieve super-resolution (SR) image reconstruction, which consists of merging several low resolution images to create a higher resolution image. However, the straightforward application of such techniques to US imaging is unsuccessful, due to the interaction of ultrasound with tissue and to speckle. We show how to overcome this limitation by refining the registration step of common multiframe techniques. For this purpose, we investigate motion estimation methods adapted to US imaging. Performance of the proposed technique is evaluated on both realistic simulated US images (providing an estimated best-case performance) and real US sequences of phantom and in-vivo thyroid images. Compared to classical SR methods, our technique brings both quantitative and qualitative improvements. The resolution gain was found to be 1.41 for the phantom sequence and 1.12 for the thyroid sequence, and a quantitative study using the phantom further confirmed the spatial resolution enhancement. Furthermore, the contrast-to-noise ratio was increased by 27% and 13% for simulated and experimental US images, respectively.
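    The multiframe SR pipeline the abstract refers to has two stages: estimate the sub-pixel motion between low-resolution frames, then fuse the registered frames onto a finer grid. The sketch below illustrates only the fusion stage with a basic shift-and-add scheme, assuming the sub-pixel shifts have already been estimated; it is an illustrative toy example in NumPy, not the authors' method, and the function name and parameters are hypothetical.

    ```python
    import numpy as np

    def shift_and_add_sr(frames, shifts, factor):
        """Fuse registered low-resolution frames into one high-resolution image.

        frames : list of 2-D arrays of identical shape (the LR frames)
        shifts : list of (dy, dx) sub-pixel offsets per frame, in LR pixel units,
                 assumed known (e.g. from a prior motion-estimation step)
        factor : integer upsampling factor of the HR grid
        """
        h, w = frames[0].shape
        acc = np.zeros((h * factor, w * factor))
        cnt = np.zeros_like(acc)
        ys, xs = np.mgrid[0:h, 0:w]
        for frame, (dy, dx) in zip(frames, shifts):
            # Map each LR sample to the nearest position on the HR grid.
            hy = np.clip(np.round((ys + dy) * factor).astype(int), 0, h * factor - 1)
            hx = np.clip(np.round((xs + dx) * factor).astype(int), 0, w * factor - 1)
            np.add.at(acc, (hy, hx), frame)   # accumulate intensities
            np.add.at(cnt, (hy, hx), 1.0)     # count contributing samples
        cnt[cnt == 0] = 1.0  # unobserved HR pixels remain zero
        return acc / cnt
    ```

    In practice the fusion is followed by an interpolation/deconvolution step to fill unobserved HR pixels and undo the system blur; the key point made in the abstract is that the quality of the estimated `shifts` dominates the result, which is why the registration step must be adapted to US speckle statistics.
    
    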

    Original language: English
    Pages (from-to): 19-26
    Number of pages: 8
    Early online date: 21 Feb 2015
    Publication status: Published - Jul 2015


