AWS ETL Pipeline Development

Client: AI | Posted: 09.03.2026
Budget: $1,500

I need an experienced AWS data engineer to design and build production-ready ETL pipelines in AWS Glue. The work centres on moving and transforming data from several source systems—relational databases (e.g., PostgreSQL, MySQL), NoSQL stores, real-time streams coming through Kafka/Kinesis, and a handful of internal/external REST APIs—into a clean, query-friendly layout in S3 and, ultimately, Redshift.

Requirements:
- Strong AWS data engineering experience across a broad range of AWS services
- Experience building end-to-end data pipelines (schema discovery, ingestion, transformation, orchestration, monitoring)
- Experience with relational databases such as Oracle, MySQL, and SQL Server
- Experience with data ingestion from on-prem systems to the cloud
- Experience with streaming platforms such as Kafka or AWS Kinesis
- Strong skills in Python, PySpark, SQL, and Terraform
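To make the expected Glue + Terraform setup concrete, here is a minimal sketch of how one pipeline job and its schedule might be declared. All names, the bucket, and the script path are placeholders I've invented for illustration, not part of the actual brief:

```hcl
# Hypothetical names throughout -- a sketch of one Glue ETL job
# plus a nightly trigger, assuming an IAM role "glue_role" is
# defined elsewhere in the same Terraform configuration.
resource "aws_glue_job" "etl_job" {
  name     = "orders-etl"                  # placeholder job name
  role_arn = aws_iam_role.glue_role.arn

  command {
    name            = "glueetl"
    script_location = "s3://my-etl-bucket/scripts/orders_etl.py"  # placeholder path
    python_version  = "3"
  }

  glue_version      = "4.0"
  worker_type       = "G.1X"
  number_of_workers = 2

  default_arguments = {
    "--job-language" = "python"
    "--TempDir"      = "s3://my-etl-bucket/tmp/"
  }
}

resource "aws_glue_trigger" "nightly" {
  name     = "orders-etl-nightly"
  type     = "SCHEDULED"
  schedule = "cron(0 2 * * ? *)"           # daily at 02:00 UTC

  actions {
    job_name = aws_glue_job.etl_job.name
  }
}
```

Streaming sources (Kafka/Kinesis) and the Redshift load would be separate jobs or connections; this fragment only shows the batch-job shape the posting implies.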