Pet Behaviour Decoder App

Client: AI | Published: 04.10.2025

I need a cross-platform utility for iOS and Android that lets a user shoot or upload a short video, sends the clip to an AI endpoint, and returns the findings as an engaging, interactive visualization the user can explore. The analysis must deliver instant insight into the pet's emotional state, stress level, and behavior patterns. If you already have a reliable model for those animal cues, highlight it; otherwise, be ready to integrate a third-party service that does.

Typical user flow
• Record or select a video (≈15 s)
• Securely upload it to the chosen AI API (OpenAI Vision, Google Video Intelligence, AWS Rekognition, or your own model if it meets the brief)
• Parse the JSON response and render the results as a timeline or overlay the user can scrub, tap, and zoom, rather than a plain text dump
• Give the user a way to delete the video and analysis data on demand to stay privacy-compliant

Deliverables
• A fully functional iOS + Android app (Flutter, React Native, or native code; justify your pick)
• An end-to-end workflow that completes in under 10 s on a typical LTE connection
• Well-commented source code, unit tests, and a concise README with build steps and API setup

Acceptance will be based on compiling the app, running it, and hitting the performance target above on current devices. Experience with TensorFlow Lite, Core ML, VisionKit, or similar on-device libraries is a big plus. Let me know how you would architect both the pipeline and the interactive UI.
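
To make the upload-and-parse step concrete, here is a minimal TypeScript sketch, assuming a React Native stack and a generic HTTPS analysis endpoint. The URL, auth header, and JSON field names are placeholders for illustration, not a real vendor contract; the actual shape will depend on whichever AI service is chosen.

```typescript
// Hypothetical sketch: upload a ~15 s clip and parse the JSON response into
// timeline events the interactive UI can scrub through. Endpoint, auth, and
// response fields are assumptions, not a specific provider's API.

type BehaviorEvent = {
  startMs: number;     // offset into the clip where the cue begins
  endMs: number;       // offset where the cue ends
  label: string;       // e.g. "tail wagging", "ears pinned back"
  emotion: string;     // coarse emotional read, e.g. "relaxed", "anxious"
  stressScore: number; // 0..1, higher means more stressed
};

type AnalysisResult = {
  events: BehaviorEvent[];
  summary: string;
};

async function analyzeClip(fileUri: string, apiKey: string): Promise<AnalysisResult> {
  const form = new FormData();
  // React Native's FormData accepts a { uri, name, type } file descriptor.
  form.append('video', { uri: fileUri, name: 'clip.mp4', type: 'video/mp4' } as any);

  const response = await fetch('https://api.example.com/v1/pet-behavior/analyze', {
    method: 'POST',
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  });

  if (!response.ok) {
    throw new Error(`Analysis failed: HTTP ${response.status}`);
  }

  // Illustrative mapping from the (assumed) provider JSON to app-side types.
  const json = await response.json();
  return {
    events: (json.events ?? []).map((e: any) => ({
      startMs: e.start_ms,
      endMs: e.end_ms,
      label: e.label,
      emotion: e.emotion,
      stressScore: e.stress_score,
    })),
    summary: json.summary ?? '',
  };
}
```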
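
For the interactive UI, a small and equally hypothetical helper could bucket the returned events into fixed-width segments so a scrubbable stress heat-strip can be drawn along the timeline. The 500 ms bucket width and the max-stress aggregation rule below are assumptions made for the sketch, not requirements of the brief.

```typescript
// Hypothetical timeline helper: group analysis events into fixed-width buckets
// for a scrubbable overlay. Event shape mirrors the previous sketch.

type BehaviorEvent = { startMs: number; endMs: number; label: string; stressScore: number };

type TimelineSegment = { startMs: number; endMs: number; stress: number; labels: string[] };

function bucketEvents(events: BehaviorEvent[], clipMs: number, bucketMs = 500): TimelineSegment[] {
  const segments: TimelineSegment[] = [];
  for (let start = 0; start < clipMs; start += bucketMs) {
    const end = Math.min(start + bucketMs, clipMs);
    // Collect events that overlap this bucket.
    const overlapping = events.filter(e => e.startMs < end && e.endMs > start);
    segments.push({
      startMs: start,
      endMs: end,
      // Take the worst stress score seen in the bucket (an assumed aggregation choice).
      stress: overlapping.reduce((max, e) => Math.max(max, e.stressScore), 0),
      labels: Array.from(new Set(overlapping.map(e => e.label))),
    });
  }
  return segments;
}
```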