I need an iOS application feature that leverages the LiDAR sensor to capture an entire room or an entire home, display the scan in real time, and then process the scan data into a textured/colorized .obj. The application itself is already built; I am only looking for the scanning functionality for this project.

Accuracy matters more than flashy graphics. Measurements should stay within roughly 1-2% of reality across typical living-room-sized spaces. A tap-to-measure tool inside the live view would be ideal for quick dimension checks.

Once the scan is finished, the user must be able to:
• preview the mesh on device, even if offline
• upload the scan data for postprocessing in the cloud
• export a textured/colorized, scaled, and oriented OBJ file

I'm happy to lean on Apple's ARKit LiDAR APIs, SceneKit, Metal, or whatever pipeline you prefer, as long as the end result is smooth on an iPhone 12 Pro and above. If you've already tackled meshing, hole-filling, or real-time occlusion on device, tell me how you'd slot that code in.

Deliverables
1. Xcode project with clearly commented Swift (or Objective-C) code.
2. Integration into our existing application.
3. Brief README covering setup, third-party libraries, and export instructions.

I'll consider the job complete when I can scan multiple rooms, verify the dimensions inside the app, upload the exported model to our Google Cloud account, and see fidelity that matches the physical space.
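To make the scope concrete, here is the kind of starting point I have in mind for the live capture, as a sketch rather than a spec. It uses ARKit's scene reconstruction on LiDAR devices with a RealityKit ARView for the real-time mesh overlay; the class name and the anchor bookkeeping are hypothetical placeholders for however it slots into our app.

```swift
import ARKit
import RealityKit
import UIKit

// Sketch: LiDAR scene reconstruction with a live mesh overlay.
// ScanViewController is a placeholder name; wire it into the host app as needed.
final class ScanViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)
    // Mesh anchors collected for later preview/export.
    private(set) var meshAnchors: [UUID: ARMeshAnchor] = [:]

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // LiDAR devices only (iPhone 12 Pro and later qualify).
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh        // LiDAR-driven triangle mesh
        config.environmentTexturing = .automatic  // helps later colorization
        config.frameSemantics = .sceneDepth       // raw depth, useful for accuracy checks

        // Show the reconstructed mesh as it grows; occlusion keeps the view readable.
        arView.debugOptions.insert(.showSceneUnderstanding)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)

        arView.session.delegate = self
        arView.session.run(config)
    }

    // Track mesh anchors so the finished scan can be previewed and exported.
    // A production version would also handle didAdd/didRemove.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            meshAnchors[mesh.identifier] = mesh
        }
    }
}
```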
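For the tap-to-measure tool, something along these lines would satisfy the quick dimension checks: raycast from two taps, then take the straight-line distance. This is a sketch assuming ARKit's raycasting against estimated planes; for the 1-2% accuracy target you may prefer to raycast against the reconstructed mesh itself, which is a judgment call I'd leave to you.

```swift
import ARKit
import RealityKit
import UIKit
import simd

// Hypothetical helper: call handleTap from a UITapGestureRecognizer on the ARView.
final class TapMeasure {
    private var firstPoint: SIMD3<Float>?

    // Returns the distance in meters once two taps have landed on geometry.
    func handleTap(at screenPoint: CGPoint, in arView: ARView) -> Float? {
        // .estimatedPlane is the simplest target; on LiDAR devices the
        // estimate is depth-informed. Raycasting the scene mesh is an option too.
        guard let result = arView.raycast(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .any).first else { return nil }

        let position = SIMD3<Float>(result.worldTransform.columns.3.x,
                                    result.worldTransform.columns.3.y,
                                    result.worldTransform.columns.3.z)

        if let start = firstPoint {
            firstPoint = nil
            return simd_distance(start, position)   // meters, ARKit world scale
        } else {
            firstPoint = position
            return nil
        }
    }
}
```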
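For the OBJ export, here is roughly the on-device path I'd expect, sketched with Model I/O: copy each ARMeshAnchor's vertex and index buffers into MDLMesh objects and let MDLAsset write the .obj. This produces untextured, metrically scaled geometry; texturing/colorization would come from the cloud postprocessing stage. The function name is hypothetical, and whether the OBJ writer bakes per-object transforms should be verified (if not, transform vertices on the CPU before export).

```swift
import ARKit
import MetalKit
import ModelIO

// Sketch: export collected ARMeshAnchors to a .obj via Model I/O.
func exportOBJ(meshAnchors: [ARMeshAnchor], to url: URL, device: MTLDevice) throws {
    let allocator = MTKMeshBufferAllocator(device: device)
    let asset = MDLAsset()

    for anchor in meshAnchors {
        let geometry = anchor.geometry

        // Copy vertex positions (float3, anchor-local space) into a Model I/O buffer.
        let vertices = geometry.vertices
        let vertexData = Data(bytes: vertices.buffer.contents().advanced(by: vertices.offset),
                              count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: vertexData, type: .vertex)

        // Copy the triangle index buffer (ARKit mesh indices are 32-bit).
        let faces = geometry.faces
        let indexCount = faces.count * faces.indexCountPerPrimitive
        let indexData = Data(bytes: faces.buffer.contents(),
                             count: indexCount * faces.bytesPerIndex)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: indexCount,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        let descriptor = MDLVertexDescriptor()
        descriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                      format: .float3,
                                                      offset: 0,
                                                      bufferIndex: 0)
        descriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                           vertexCount: vertices.count,
                           descriptor: descriptor,
                           submeshes: [submesh])

        // Vertices are anchor-local; attach the anchor transform so rooms
        // line up in one world-scaled model (verify the exporter bakes this).
        mesh.transform = MDLTransform(matrix: anchor.transform)
        asset.add(mesh)
    }

    // Model I/O writes .obj natively; ARKit units are meters, +Y up.
    try asset.export(to: url)
}
```

The same exported file also covers the offline preview requirement: loading it back with MDLAsset and handing it to SceneKit via SCNScene(mdlAsset:) works without a network connection.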
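Finally, for the cloud hand-off, I'd expect something like the sketch below: our backend issues a pre-signed Google Cloud Storage upload URL and the app PUTs the scan file to it. The function and parameter names are hypothetical; URLSession.shared is fine for a sketch, but a background session configuration would be the right call for large multi-room scans.

```swift
import Foundation

// Sketch: upload the exported scan to a pre-signed Cloud Storage URL.
func uploadScan(fileURL: URL, signedURL: URL,
                completion: @escaping (Error?) -> Void) {
    var request = URLRequest(url: signedURL)
    request.httpMethod = "PUT"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")

    let task = URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, response, error in
        if let error = error {
            completion(error)
            return
        }
        guard let http = response as? HTTPURLResponse,
              (200..<300).contains(http.statusCode) else {
            completion(URLError(.badServerResponse))
            return
        }
        completion(nil)   // scan is now available for cloud postprocessing
    }
    task.resume()
}
```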