I need a skilled Web AR engineer to transform our ready-made GLB ring and bracelet models into an in-browser try-on experience that runs smoothly on both mobile and desktop. Using MediaPipe Hands or TensorFlow.js for tracking and Three.js/WebGL for rendering, you’ll anchor each model to the correct finger joints, apply real-time smoothing, scaling, and rotation, and push realism with full occlusion and segmentation.

Because the feature sits inside an open product page, no user authentication is required. What matters is visual fidelity and performance: 60 fps on modern iOS/Android devices, graceful fallback on desktop webcams, and fast initial load from our Cloudflare CDN. You’ll wire up a simple interface (try-on toggle, size adjuster, screenshot capture, and “add to cart” action) exactly as supplied in our UI files. We provide all jewellery models, Figma mocks, CDN endpoints, and detailed product specs so you can focus purely on the AR layer.

Core deliverables
• Web AR module (ES6/TypeScript) integrating MediaPipe Hands or TF.js with Three.js
• Accurate ring/bracelet attachment with scaling logic per joint
• Occlusion + segmentation shader for high realism
• Lightweight UI interactions (toggle, resize, screenshot, cart)
• Optimised bundle ≤ 1 MB gzipped, sustaining 60 fps on flagship mobiles
• Implementation notes and clean, documented source code

Timeline
MVP in 1–2 weeks; full production polish and edge-case handling in 6–8 weeks.

If you also bring Next.js know-how or experience fine-tuning PBR materials, let me know: those extras will help us move even faster.
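To give a feel for the "real-time smoothing" requirement, here is a minimal TypeScript sketch of per-frame exponential smoothing applied to a tracked hand landmark before a model is anchored to it. All names here (`Vec3`, `LandmarkSmoother`) are illustrative assumptions, not part of any supplied codebase; in the real module you would feed in the normalized landmark MediaPipe reports for the chosen finger joint and hand the smoothed position to the Three.js object.

```typescript
// Hypothetical helper: exponential moving average over 3D landmarks,
// used to damp per-frame jitter from the hand tracker.
interface Vec3 { x: number; y: number; z: number; }

class LandmarkSmoother {
  private prev: Vec3 | null = null;

  // alpha in (0, 1]: higher = more responsive, lower = more stable.
  constructor(private readonly alpha: number = 0.35) {}

  next(raw: Vec3): Vec3 {
    if (this.prev === null) {
      // First frame: nothing to blend with, pass the sample through.
      this.prev = { ...raw };
      return this.prev;
    }
    this.prev = {
      x: this.alpha * raw.x + (1 - this.alpha) * this.prev.x,
      y: this.alpha * raw.y + (1 - this.alpha) * this.prev.y,
      z: this.alpha * raw.z + (1 - this.alpha) * this.prev.z,
    };
    return this.prev;
  }
}
```

A fixed-alpha filter like this trades latency for stability; a velocity-adaptive filter (e.g. a One Euro filter) is a common refinement when fast hand motion must stay responsive.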