I want to build a lightweight iPhone app that can recognise scratching episodes purely from the handset's own motion sensors, primarily the accelerometer and gyroscope. My vision is for everything to be processed on-device so the user's data stays private, though I'm open to exporting raw events to a small backend later if that proves cleaner for post-processing or model updates. For detection I'm leaning toward a hybrid approach: a simple rule layer to filter obvious noise, plus an embedded Core ML model to learn the subtler patterns. Notifications should warn the user whenever their scratching crosses a configurable daily threshold, and a single screen will show the time-stamped events plus rolling daily and weekly summaries. A basic settings view is enough for v1.

Here's what I'd like from you:

• A brief proposal on the sensor sampling strategy, choice of algorithms, and any frameworks you would use (Core Motion, Core ML, Create ML, etc.).
• A clear estimate of time and cost covering coding, a minimalist SwiftUI interface, and unit/UI testing on recent iPhone models.
• Notes on past projects or research that show relevant experience; anything involving motion classification, health monitoring, or activity recognition will help me gauge fit.

If you can outline milestone deliverables and expected acceptance criteria (for example, detection accuracy thresholds and battery impact), that will speed up our kickoff conversation. I'm eager to get started and iterate quickly, so please share your thoughts and any clarifying questions.
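
To make the kickoff conversation concrete, here is a rough sketch of the sampling and rule-layer pipeline I'm imagining. Treat every number in it as an assumption to validate in your proposal: the 50 Hz sampling rate, the 2-second sliding window with 50% overlap, and the crude acceleration-energy gate are all placeholders, and the Core ML classifier it hands off to is hypothetical.

```swift
import CoreMotion
import Foundation

/// Minimal sketch of the proposed sampling + rule-filter pipeline.
/// All parameters (rate, window size, gate) are assumptions to be tuned.
final class ScratchDetector {
    private let motionManager = CMMotionManager()
    private var window: [Double] = []        // user-acceleration magnitudes
    private let windowSize = 100             // 2 s at the assumed 50 Hz
    private let energyGate = 0.05            // rule-layer threshold, to be tuned

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0   // assumed 50 Hz
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let m = motion else { return }
            // userAcceleration already has gravity removed by Core Motion.
            let a = m.userAcceleration
            self.window.append(sqrt(a.x * a.x + a.y * a.y + a.z * a.z))
            if self.window.count >= self.windowSize {
                self.evaluate()
                self.window.removeFirst(self.windowSize / 2)   // 50% overlap
            }
        }
    }

    private func evaluate() {
        // Rule layer: discard clearly static windows so the model only
        // sees candidate activity, which should also help battery impact.
        let meanEnergy = window.reduce(0) { $0 + $1 * $1 } / Double(window.count)
        guard meanEnergy > energyGate else { return }
        // A Core ML model (e.g. a Create ML activity classifier) would
        // classify this window as scratching vs. other motion here.
    }
}
```

Feel free to propose an entirely different windowing or feature-extraction scheme if your experience suggests one; this is just the shape of pipeline I have in mind.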
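
Similarly, a rough sketch of the threshold alert. I'm assuming local notifications are sufficient for v1, since everything runs on-device with no backend (so no APNs); the function and parameter names are illustrative only.

```swift
import UserNotifications

/// Sketch of the configurable daily-threshold alert. Assumes notification
/// permission was already granted via requestAuthorization(options:).
func notifyIfThresholdCrossed(todayCount: Int, dailyThreshold: Int) {
    guard todayCount == dailyThreshold else { return }  // fire once, at the crossing
    let content = UNMutableNotificationContent()
    content.title = "Scratching threshold reached"
    content.body = "You've logged \(todayCount) scratching episodes today."
    // A nil trigger delivers the notification immediately.
    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: nil)
    UNUserNotificationCenter.current().add(request)
}
```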