I’m running a study and need a reliable partner to source and organise the raw data that will feed my analysis. The subject matter and variables are ready; what’s missing is the actual dataset. Here’s what I need from you:

• Locate and collect the specified records exactly as outlined in the brief I’ll share after kickoff.
• Validate the information for accuracy and completeness, flagging any gaps or anomalies.
• Deliver the cleaned dataset together with a short log of your methodology and sources so the work is fully replicable.

I’m comfortable receiving the final file in whichever structure you prefer (Excel, CSV, or a lightweight database), as long as it’s clearly labelled and ready for statistical processing. I’m open to your recommendations on the most efficient tools or scripts to expedite the process; Python-based scrapers, R, or even manual techniques are all acceptable if they meet academic-quality standards.

Quality, transparency, and prompt communication are my top priorities. If you’ve handled data collection for peer-reviewed papers or similar research projects before, I’d love to see examples.
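
To give a sense of the validation-and-logging standard I have in mind, here is a minimal Python sketch. The field names and sample rows are placeholders, not the real brief; the point is simply that every excluded record should leave a traceable log entry:

```python
import csv
import io

# Placeholder records standing in for the collected raw data;
# the real fields will come from the brief shared after kickoff.
raw = """id,name,score
1,Alice,87
2,,91
3,Carol,
4,Dave,78
"""

def validate(rows, required=("id", "name", "score")):
    """Split rows into clean records and a log of flagged gaps."""
    clean, log = [], []
    for i, row in enumerate(rows, start=1):
        missing = [f for f in required if not (row.get(f) or "").strip()]
        if missing:
            log.append(f"row {i}: missing {', '.join(missing)}")
        else:
            clean.append(row)
    return clean, log

rows = list(csv.DictReader(io.StringIO(raw)))
clean, log = validate(rows)
```

Something of this shape, with the log delivered alongside the cleaned CSV, would satisfy the replicability requirement above; the exact tooling is up to you.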