AI-Driven Exam Prep Website

Client: AI | Published: 07.03.2026

I plan to launch a preparation hub for UPPSC, UPSC and similar competitive exams. At its core, the site must let registered students write mains-style answers online, have them evaluated instantly by an AI model, and display a clear score with actionable feedback. In a later phase I will plug in optional human evaluation, so the codebase and database schema need to be modular enough to accommodate manual scoring workflows without rewriting the foundations.

Beyond answer checking, the platform has to host a full prelims test-series engine, let me upload study materials in PDF/HTML, push daily news summaries, and keep an interactive discussion forum alive. Every day the system should automatically publish a set of free descriptive questions that any visitor can view; enrolled users can then submit their written answers through a rich-text editor.

Three user roles are required:
• Students see their dashboards, tests, AI feedback and forum threads.
• Instructors can post questions, attach solutions, moderate discussions and, later, override or add to AI scores.
• Admins control everything from user rights to payment toggles and analytics.

The front end can be React, Vue or an equally responsive stack; the back end should pair comfortably with the AI model (OpenAI, Cohere or a custom transformer served via Python/FastAPI; happy to discuss). I will need secure authentication, role-based access control, a content-management panel for news uploads, a cron or queue service for scheduled question drops, and clean database tables for responses, scores and comments. Please include unit tests and deployment scripts so I can deploy easily to AWS or DigitalOcean.
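To make the "modular enough for later manual scoring" requirement concrete, here is a minimal sketch of what a score record could look like. All names here are hypothetical illustrations, not a prescribed schema: the point is that a manual score slot sits alongside the AI score from day one, so the instructor-override phase needs no schema rewrite.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnswerScore:
    """Evaluation record for one submitted answer (illustrative schema only)."""
    answer_id: int
    ai_score: float                        # e.g. 0-100, produced by the AI module
    ai_feedback: str                       # feedback paragraph shown to the student
    manual_score: Optional[float] = None   # filled in later by an instructor
    manual_feedback: Optional[str] = None

    @property
    def final_score(self) -> float:
        # Once an instructor supplies a manual score it overrides the AI score;
        # until then the AI score stands on its own.
        return self.manual_score if self.manual_score is not None else self.ai_score

# Usage: the AI scores first; an instructor may override later.
s = AnswerScore(answer_id=1, ai_score=62.5, ai_feedback="Add more examples.")
assert s.final_score == 62.5
s.manual_score = 70.0
assert s.final_score == 70.0
```

The same idea carries over to whichever ORM or SQL schema the chosen stack uses: keep AI and manual fields side by side and resolve the displayed score at read time.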
Deliverables
• Fully functional website with student, instructor and admin portals
• AI evaluation module producing scores plus feedback paragraphs
• Prelims test-series module with question banks, timers and analytics
• CMS sections for study materials, daily news and free descriptive questions
• Discussion forum integrated with the user system
• Documentation and a hand-off session covering the future manual-evaluation expansion

If you have built ed-tech or NLP scoring tools before, let me know; seeing a brief demo of similar work will speed up selection.
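As a rough illustration of the three-portal split requested above, a role-based access check can be as small as a permission matrix plus one lookup function. The permission names below are assumptions for illustration, not a final design:

```python
# Hypothetical permission matrix for the three roles described in the brief.
PERMISSIONS = {
    "student":    {"view_dashboard", "take_test", "submit_answer", "post_forum"},
    "instructor": {"view_dashboard", "post_question", "attach_solution",
                   "moderate_forum", "override_score"},
    "admin":      {"*"},  # admins can perform any action
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    allowed = PERMISSIONS.get(role, set())
    return "*" in allowed or action in allowed

assert can("student", "submit_answer")
assert not can("student", "override_score")
assert can("instructor", "override_score")
assert can("admin", "toggle_payments")
```

In a real build this check would live behind the authentication layer (for example as a FastAPI dependency or a middleware guard), so every portal route declares the action it requires.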