Remote heart-rate estimation with self-adaptive matrix completion @ CVPR’16 (Las Vegas)


Sergey Tulyakov, Xavier Alameda-Pineda, Elisa Ricci, Lijun Yin, Jeffrey Cohn and Nicu Sebe


Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured in face videos and, surprisingly, be used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g. in the case of spontaneous movements, facial expressions, or illumination changes). In contrast to previous approaches, which estimate the HR by processing all the skin pixels inside a fixed region of interest, we introduce a strategy to dynamically select face regions useful for robust HR estimation. Our approach, inspired by recent advances in matrix completion theory, allows us to predict the HR while simultaneously discovering the best regions of the face to use for estimation. A thorough experimental evaluation conducted on public benchmarks shows that the proposed approach significantly outperforms state-of-the-art HR estimation methods in naturalistic conditions.
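
For readers curious about the general idea, here is a minimal, hypothetical sketch of remote HR estimation from several face regions. It is not the paper's self-adaptive matrix completion; instead it uses a simple rank-1 decomposition as a crude stand-in for downweighting corrupted regions, then reads the HR off the dominant frequency in the pulse band. The function `estimate_hr`, its parameters, and the toy data are illustrative assumptions only.

```python
import numpy as np

def estimate_hr(region_signals, fs, low_bpm=40.0, high_bpm=240.0):
    """Estimate heart rate (BPM) from per-region face color traces.

    region_signals: (n_regions, n_frames) array, e.g. the mean green-channel
    value of each face region per frame. fs: video frame rate in Hz.
    """
    X = np.asarray(region_signals, dtype=float)
    # Remove each region's mean so slow illumination drift does not dominate.
    X = X - X.mean(axis=1, keepdims=True)

    # Crude stand-in for the paper's self-adaptive region selection: keep only
    # the rank-1 temporal component shared across regions, which attenuates
    # region-specific corruption (motion, expressions) unrelated to the pulse.
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    pulse = s[0] * Vt[0]

    # Pick the strongest frequency inside the plausible heart-rate band.
    freqs = np.fft.rfftfreq(pulse.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(pulse)) ** 2
    band = (freqs >= low_bpm / 60.0) & (freqs <= high_bpm / 60.0)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Toy usage: 10 regions, 20 s at 30 fps, a 72 BPM pulse buried in noise.
fs = 30.0
t = np.arange(0, 20, 1.0 / fs)
clean = np.sin(2 * np.pi * (72.0 / 60.0) * t)
signals = clean + 0.5 * np.random.randn(10, t.size)
print(round(estimate_hr(signals, fs)))  # ~72
```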

Reference

S. Tulyakov, X. Alameda-Pineda, E. Ricci, L. Yin, J. F. Cohn and N. Sebe, "Self-Adaptive Matrix Completion for Heart Rate Estimation from Face Videos under Realistic Conditions", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016.

Comments

  1. Dear Xavier Alameda-Pineda,
    After reading your paper "Self-Adaptive Matrix Completion for Heart Rate Estimation from Face Videos under Realistic Conditions", I found it amazing and I would like to do some work building on it. I would greatly appreciate it if you could share the code for this paper. Thank you.

    1. Dear Rui,
      Unfortunately the code is not publicly available yet. We are working on it, and we will release it as soon as we can.
      Cheers,
      Xavi
