Emotion Recognition from still images in PyTorch

I have a few questions. Please feel free to answer only some of them; I would appreciate any feedback you can offer.

  1. Is there a good PyTorch baseline for emotion recognition from still images, say on the FER2013 dataset or another dataset with a discrete set of emotion classes?
  2. Should I detect emotion from Facial Action Units or from facial landmarks? Which one is the better indicator?
  3. If I find a good baseline that detects emotion from still images, how can I use transfer learning to detect emotions on a dataset consisting of 10-30 second video segments? In other words, I now want to detect emotion in a different modality: video.
  4. Are you aware of a good video dataset of facial expressions/emotions suitable for deep learning? For example, one where each 10-30 second video segment has an emotion label from 6-8 emotion classes.
  5. Is there a good baseline you are aware of that detects emotion from video segments (assuming a dataset as in 4 exists)?

Additionally, please feel free to link papers relevant to 3 and 5.
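Regarding question 3, a common transfer-learning recipe is to reuse a pretrained image model as a frozen per-frame feature extractor and train a new head on temporally pooled features. Below is a minimal sketch of that idea; the tiny `ImageEmotionNet` backbone is purely hypothetical and just stands in for whatever pretrained FER2013 model you end up with (e.g. a ResNet).

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a network pretrained on still images (e.g. FER2013).
# In practice you would load your trained image model here instead.
class ImageEmotionNet(nn.Module):
    def __init__(self, num_classes=7, feat_dim=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, feat_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x))

class VideoEmotionNet(nn.Module):
    """Transfer learning: run the frozen image backbone on every frame,
    average the frame features over time, then classify the clip."""
    def __init__(self, image_model, num_classes=7, feat_dim=32):
        super().__init__()
        self.backbone = image_model.features
        for p in self.backbone.parameters():
            p.requires_grad = False              # freeze pretrained weights

        self.classifier = nn.Linear(feat_dim, num_classes)  # trained on video

    def forward(self, clips):                    # clips: (B, T, C, H, W)
        b, t, c, h, w = clips.shape
        feats = self.backbone(clips.reshape(b * t, c, h, w))
        feats = feats.reshape(b, t, -1).mean(dim=1)   # temporal mean pooling
        return self.classifier(feats)

pretrained = ImageEmotionNet()                   # stand-in for a trained model
video_model = VideoEmotionNet(pretrained)
clip = torch.randn(2, 16, 1, 48, 48)             # 2 clips, 16 frames, 48x48 gray
logits = video_model(clip)                       # -> shape (2, 7)
```

Mean pooling over time is the simplest temporal aggregation; a common next step is to replace it with an LSTM/GRU or temporal convolution over the per-frame features once the frozen-backbone baseline works.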

I can’t answer your questions directly, but I think you might get some ideas from this thread.
It deals with facial expressions, and it includes some experiments with different input features.
