Author: Julian Geheeb
Supervisor: Prof. Gudrun Klinker
Advisor: Nassim Eghtebas (@ga53xoy)
Submission Date: [created]

Abstract

Interpersonal synchrony makes up a large part of human communication and interaction. Tools exist that can analyze videos of people conversing to identify potential synchrony between them. However, existing solutions are usually not applicable in a real-time scenario because they are not fast or accurate enough. In this research, we investigate a quantifiable measure of synchrony known as OASIS, which is fast and reliable but operates on Facial Action Units (AUs) extracted from pre-recorded video data. We create an interface for OpenFace to extract AUs in real time and refactor a sandbox for online social experiments, which we call Synchrony Hub, to implement the interface as a filter for live video streams. The filter can extract AUs from multiple real-time video streams simultaneously at an average frame rate of about 15 frames per second. Furthermore, the Synchrony Hub’s new structure makes it possible to add further filters and additional functionality. The AU extraction filter and the structural changes to the hub enable various social experiments to be conducted through an online video conferencing tool, including the integration of a synchrony score such as OASIS, which opens new doors for researchers in the fields of computer science, human–computer interaction, and psychology alike.
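
To make the filter idea more concrete, below is a minimal Python sketch of a per-stream AU extraction filter. It is purely illustrative: the AUExtractionFilter class, the extract_aus callback (standing in for a wrapper around OpenFace), and all other names are assumptions for this sketch, not the actual Synchrony Hub or OpenFace API.

    # Illustrative sketch only: a per-stream filter that applies an AU extractor
    # to each incoming video frame and tracks throughput. The extract_aus
    # callable is a hypothetical stand-in for an OpenFace-based wrapper.
    import time
    from typing import Callable, Dict, List


    class AUExtractionFilter:
        """Applies an AU extractor to every frame of a single live video stream."""

        def __init__(self, extract_aus: Callable[[bytes], Dict[str, float]]):
            self.extract_aus = extract_aus        # e.g. a wrapper around OpenFace
            self.results: List[Dict[str, float]] = []
            self._frames = 0
            self._started = time.monotonic()

        def process_frame(self, frame: bytes) -> Dict[str, float]:
            """Extract AU intensities for one frame and record it for later analysis."""
            aus = self.extract_aus(frame)
            self.results.append(aus)
            self._frames += 1
            return aus

        @property
        def fps(self) -> float:
            """Average processing frame rate since the filter was created."""
            elapsed = time.monotonic() - self._started
            return self._frames / elapsed if elapsed > 0 else 0.0

In such a design, one filter instance would be attached to each participant's live stream, and its fps property yields the kind of throughput figure reported above (about 15 frames per second per stream).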

Results/Implementation/Project Description

Conclusion

[ PDF (optional) ] 

[ Slides Kickoff/Final (optional) ]