## Objective

To create a joyful, emotion-aware learning environment that detects a child's emotions from facial expressions and dynamically adjusts game difficulty, UI design, and feedback to enhance focus, confidence, and memory.

## Core Features

- **Emotion Detection System (AI Model):** Built using a CNN or Transformer architecture, trained on a dataset with 7 emotions: happy, sad, angry, fear, disgust, surprise, and neutral. Uses the webcam feed to detect emotions in real time.
- **Adaptive Game Environment:** Includes memory-boosting games and shape-joining activities for dyslexic children. The game's UI, colors, and sounds change based on the detected emotion. For example: if the child is sad, the game plays soothing music and shows encouraging messages; if happy, it raises the difficulty and offers rewards.
- **Automatic Emotion Tracking:** Emotion detection starts when the game begins and stops when it ends; no manual intervention is required.
- **Data Logging and Analysis:** Emotion data and gameplay stats are stored in MongoDB, helping track emotional and learning progress over time.
- **Child-Friendly UI/UX:** Simple navigation, large buttons, and vibrant visuals, with an accessibility-focused design for dyslexic learners.
- **Real-Time Adaptation:** The system continuously monitors emotions and adjusts gameplay dynamically.

## Tech Stack

| Layer       | Technologies Used                                       |
| ----------- | ------------------------------------------------------- |
| Frontend    | HTML, CSS, JavaScript, React                            |
| AI/Model    | Python, TensorFlow / PyTorch (Transformer-based model)  |
| Backend     | Node.js / Flask                                         |
| Database    | MongoDB                                                 |
| Integration | WebRTC or MediaPipe for webcam access                   |

## System Workflow

1. The child starts the JoyVerse learning game.
2. The webcam captures live facial expressions.
3. The EmotionTransformer model detects emotions in real time.
4. The UI/game logic adapts based on the detected emotion.
5. Emotion data and performance stats are stored in MongoDB.
6. Parents and teachers can use the resulting reports to understand emotional patterns.
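The adaptive behavior described above (sad → soothing music and encouraging messages; happy → higher difficulty and rewards) can be sketched as a rules table mapping each of the model's 7 emotion classes to game settings. This is a minimal illustration: the specific difficulty deltas, music names, messages, and the 1–5 difficulty range are assumptions, not the project's actual tuning.

```python
# Maps each of the model's 7 emotion classes to adaptive game settings.
# The concrete values (difficulty_delta, music, message) are illustrative
# placeholders, not JoyVerse's actual tuning.
ADAPTATION_RULES = {
    "happy":    {"difficulty_delta": +1, "music": "upbeat",   "message": "Great job! Ready for a tougher round?"},
    "sad":      {"difficulty_delta": -1, "music": "soothing", "message": "You're doing wonderfully. Take your time!"},
    "angry":    {"difficulty_delta": -1, "music": "calm",     "message": "Let's take a deep breath together."},
    "fear":     {"difficulty_delta": -1, "music": "gentle",   "message": "It's okay, we'll go step by step."},
    "disgust":  {"difficulty_delta":  0, "music": "neutral",  "message": "Let's try something different!"},
    "surprise": {"difficulty_delta":  0, "music": "playful",  "message": "Ooh, what a twist!"},
    "neutral":  {"difficulty_delta":  0, "music": "neutral",  "message": "Keep going, you're doing great!"},
}

def adapt_game(emotion: str, current_difficulty: int) -> dict:
    """Return the next game settings for a detected emotion.

    Difficulty is clamped to an assumed child-friendly 1..5 range.
    Unknown labels fall back to the neutral rule.
    """
    rule = ADAPTATION_RULES.get(emotion, ADAPTATION_RULES["neutral"])
    new_difficulty = max(1, min(5, current_difficulty + rule["difficulty_delta"]))
    return {"difficulty": new_difficulty, "music": rule["music"], "message": rule["message"]}
```

For example, `adapt_game("sad", 3)` lowers the difficulty to 2 and switches to soothing music, while `adapt_game("happy", 3)` raises it to 4.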
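Because the system monitors emotions continuously, per-frame classifier output tends to flicker, and reacting to every single frame would make the game feel erratic. One common stabilization technique (an assumption here, not something the project description specifies) is a rolling majority vote over recent predictions:

```python
from collections import Counter, deque

class EmotionSmoother:
    """Majority vote over the last `window` per-frame emotion predictions.

    Smoothing keeps the game from reacting to single misclassified frames.
    The default window of 15 frames (~0.5 s at 30 fps) is an illustrative
    choice, not a project-specified value.
    """

    def __init__(self, window: int = 15):
        self.history = deque(maxlen=window)

    def update(self, emotion: str) -> str:
        """Record the latest per-frame prediction and return the
        most frequent emotion within the current window."""
        self.history.append(emotion)
        return Counter(self.history).most_common(1)[0][0]
```

The game loop would call `update()` once per analyzed frame and adapt the UI only when the smoothed emotion changes.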
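The data-logging step can be sketched as building one document per game session before storing it in MongoDB. The field names below (`child_id`, `emotion_timeline`, `dominant_emotion`, and so on) are an assumed schema for illustration; the actual `insert_one` call (standard PyMongo API) is shown in a comment so the sketch runs without a live database.

```python
from datetime import datetime, timezone

def build_session_doc(child_id: str, game: str, emotions: list, score: int) -> dict:
    """Build a MongoDB document summarizing one game session.

    The schema here is an assumption for illustration, not the
    project's actual collection layout.
    """
    counts = {}
    for e in emotions:
        counts[e] = counts.get(e, 0) + 1
    dominant = max(counts, key=counts.get) if counts else "neutral"
    return {
        "child_id": child_id,
        "game": game,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "emotion_timeline": emotions,   # per-interval smoothed emotions
        "emotion_counts": counts,       # aggregate for reports
        "dominant_emotion": dominant,
        "score": score,
    }

# With a live MongoDB instance, the document would be stored via PyMongo:
#   from pymongo import MongoClient
#   MongoClient()["joyverse"]["sessions"].insert_one(doc)
```

Aggregating `dominant_emotion` and `score` across sessions is what enables the progress reports for parents and teachers.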
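Between webcam capture and emotion detection, each frame typically has to be converted into the input format the classifier was trained on. A minimal NumPy-only preprocessing sketch, assuming a 48×48 grayscale input (the size used by common facial-expression datasets such as FER-2013; the project's actual input size is not specified):

```python
import numpy as np

def preprocess_frame(frame, size: int = 48):
    """Convert an HxWx3 RGB webcam frame to a model-ready batch.

    Steps: grayscale via luminance weights, nearest-neighbor resize to
    size x size, scale to [0, 1], add batch and channel dimensions.
    The 48x48 target is an assumption; it must match the trained
    model's expected input shape.
    """
    gray = frame[..., :3] @ np.array([0.299, 0.587, 0.114])  # (H, W)
    h, w = gray.shape
    rows = np.arange(size) * h // size          # nearest-neighbor row picks
    cols = np.arange(size) * w // size          # nearest-neighbor column picks
    resized = gray[rows][:, cols]               # (size, size)
    scaled = (resized / 255.0).astype(np.float32)
    return scaled[np.newaxis, ..., np.newaxis]  # (1, size, size, 1)
```

The resulting `(1, 48, 48, 1)` array can be fed directly to a Keras- or PyTorch-style classifier whose softmax output covers the 7 emotion classes (after transposing to channels-first for PyTorch).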