Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

10-2020

Abstract

As GAN-based face image and video generation techniques, widely known as DeepFakes, have become increasingly mature and realistic, there is a pressing demand for effective DeepFake detectors. Motivated by the fact that remote visual photoplethysmography (PPG) is made possible by monitoring the minuscule periodic changes in skin color caused by blood pumping through the face, we conjecture that the normal heartbeat rhythms found in real face videos are disrupted or even entirely broken in a DeepFake video, making them a potentially powerful indicator for DeepFake detection. In this work, we propose DeepRhythm, a DeepFake detection technique that exposes DeepFakes by monitoring heartbeat rhythms. DeepRhythm uses dual-spatial-temporal attention to adapt to dynamically changing faces and fake types. Extensive experiments on the FaceForensics++ and DFDC-preview datasets confirm our conjecture and demonstrate not only the effectiveness of DeepRhythm, but also its generalization capability across datasets produced by various DeepFake generation techniques and under multifarious challenging degradations.
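The core cue described in the abstract, that genuine face video carries a faint periodic skin-color signal at the heart rate, can be sketched in a few lines. The snippet below is only an illustrative reading of the remote-PPG idea, not the authors' DeepRhythm pipeline: it assumes the face region has already been cropped and tracked, averages the green channel per frame, and checks for a dominant frequency in the normal heart-rate band. Face detection, motion-robust rPPG estimation, and the dual-spatial-temporal attention network are all omitted, and the function and parameter names are hypothetical.

import numpy as np

def heartbeat_band_energy(face_frames, fps, low_hz=0.7, high_hz=4.0):
    """face_frames: array of shape (T, H, W, 3), RGB, already cropped to the face.
    Returns (dominant_hz, band_ratio): the strongest frequency in the
    heart-rate band and the fraction of total spectral energy it carries."""
    # 1. Spatially average the green channel, which is most sensitive to blood volume changes.
    signal = face_frames[:, :, :, 1].mean(axis=(1, 2))
    # 2. Remove the mean and apply a Hann window to reduce drift and spectral leakage.
    signal = signal - signal.mean()
    signal = signal * np.hanning(len(signal))
    # 3. Real FFT and the corresponding frequency bins.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # 4. Restrict to a plausible heart-rate band (~42-240 bpm) and find the peak.
    band = (freqs >= low_hz) & (freqs <= high_hz)
    band_spectrum = spectrum[band]
    dominant_hz = freqs[band][band_spectrum.argmax()]
    band_ratio = band_spectrum.max() / (spectrum.sum() + 1e-8)
    return dominant_hz, band_ratio

# Toy usage: 10 s of synthetic 30-fps "video" with a faint 1.2 Hz (72 bpm) pulse.
fps, T = 30, 300
t = np.arange(T) / fps
frames = np.full((T, 64, 64, 3), 128.0)
frames[:, :, :, 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(heartbeat_band_energy(frames, fps))  # dominant frequency near 1.2 Hz

In a real video the periodic component is much weaker and is confounded by motion and compression, which is why the paper relies on a learned dual-spatial-temporal attention model rather than a single spectral peak like this sketch.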

Keywords

DeepFake detection, heartbeat rhythm, remote photoplethysmography (PPG), dual-spatial-temporal attention, face forensics

Discipline

Graphics and Human Computer Interfaces | Software Engineering

Research Areas

Software and Cyber-Physical Systems

Publication

Proceedings of the 28th ACM International Conference on Multimedia, MM 2020, Seattle, October 12–16

First Page

4318

Last Page

4327

ISBN

9781450379885

Identifier

10.1145/3394171.3413707

Publisher

Association for Computing Machinery

City or Country

Virtual Conference
