AI Emotion Recognition for Mental Health Monitoring
Problem statement
Mental health issues are often difficult to detect early, especially for individuals who do not regularly consult a psychologist. Tracking emotional patterns throughout the day could reveal trends associated with stress, depression, or anxiety before they become severe.
Abstract
This project uses a webcam or smartphone camera to capture facial expressions and classify them into emotional states using a convolutional neural network (CNN). Emotions are logged over time, and analytics charts visualize weekly or monthly mood patterns. The system can optionally alert guardians or therapists when prolonged negative emotions are detected.
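As a rough illustration of the analytics and alert step described above, the Python sketch below aggregates logged (timestamp, emotion) records per day and flags a prolonged negative trend. The set of "negative" labels and the 5-day / 60% thresholds are illustrative assumptions, not fixed requirements of the project.

# Sketch of the "prolonged negative emotion" alert rule. The record format,
# negative-label set, and thresholds below are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timezone

NEGATIVE = {"sad", "angry", "fear", "disgust"}

def should_alert(records, days=5, ratio=0.6):
    """Return True if negative emotions exceeded `ratio` of readings
    on each of the last `days` calendar days that have any data."""
    per_day = defaultdict(lambda: [0, 0])          # day -> [negative, total]
    for ts, emotion in records:
        day = datetime.fromtimestamp(ts, tz=timezone.utc).date()
        per_day[day][1] += 1
        if emotion in NEGATIVE:
            per_day[day][0] += 1
    recent = sorted(per_day)[-days:]
    if len(recent) < days:
        return False                               # not enough history yet
    return all(per_day[d][0] / per_day[d][1] > ratio for d in recent)

The same per-day aggregation can feed the weekly or monthly mood charts on the dashboard.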
Components required
- Camera (Webcam/Mobile)
- OpenCV
- TensorFlow/Keras
- Facial Expression Dataset (e.g., FER2013 or RAF-DB)
- Local or Cloud Database
- Dashboard UI
Block diagram
Camera → Face Detection (OpenCV) → CNN Emotion Classifier → Timestamped Emotion Log (Database) → Dashboard / Alerts
Working
The camera continuously captures video frames. OpenCV detects the face region, which is then passed to the CNN classifier. The predicted emotion is time-stamped and stored in a database. The dashboard displays the emotional history through graphs and charts, making long-term stress patterns easier to identify.
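A minimal Python sketch of this loop is shown below. It assumes an already trained Keras model saved as emotion_cnn.h5 that takes 48x48 grayscale face crops (FER2013-style) and outputs seven emotion classes; the file names, label order, and SQLite schema are illustrative choices rather than fixed parts of the design.

# Capture frames, detect the face, classify the emotion, and log it with a
# timestamp. Model file, label order, and database schema are assumptions.
import sqlite3
import time

import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

model = load_model("emotion_cnn.h5")        # pre-trained CNN (assumed)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

db = sqlite3.connect("emotions.db")
db.execute("CREATE TABLE IF NOT EXISTS log (ts REAL, emotion TEXT)")

cap = cv2.VideoCapture(0)                   # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        roi = roi.astype("float32") / 255.0                    # normalize pixels
        probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
        emotion = EMOTIONS[int(np.argmax(probs))]
        db.execute("INSERT INTO log VALUES (?, ?)", (time.time(), emotion))
        db.commit()
        cv2.putText(frame, emotion, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Emotion monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to stop
        break

cap.release()
cv2.destroyAllWindows()
db.close()

In practice the insert rate would be throttled (for example, one record every few seconds) so the log reflects sustained moods rather than every detected frame.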
Applications
- Therapists and psychologists
- Schools and colleges
- Workplace stress monitoring
- Personal well-being apps