Overview
The role involves designing and maintaining scalable data pipelines using AWS technologies.
The ideal candidate has 7+ years of experience and deep expertise in the AWS ecosystem.
Tags: remote, senior, permanent, full-time, English, AWS, Python, SQL, Apache Airflow, ETL
Locations
Remote
Requirements
7+ years of experience required
Deep expertise in the AWS ecosystem
Proficient in Python and SQL
Hands-on experience with orchestration tools
Familiarity with CI/CD pipelines
Responsibilities
Design and maintain data pipelines
Transform raw data into clean datasets
Implement data lake and warehouse architectures
Collaborate with teams on data requirements
Optimize performance of data systems
Uphold data governance standards
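To give candidates a concrete feel for the "transform raw data into clean datasets" responsibility, here is a minimal, self-contained Python sketch of a typical cleaning step: normalizing fields, dropping incomplete records, and deduplicating. The column names, sample data, and cleaning rules are hypothetical, chosen only for illustration; real pipelines for this role would run inside an orchestrator such as Apache Airflow against AWS-hosted data.

```python
import csv
import io

# Hypothetical raw extract: inconsistent casing, stray whitespace,
# a missing email, and a duplicate record.
RAW_CSV = """user_id,email,signup_date
1, Alice@Example.COM ,2023-01-05
2,,2023-02-10
1,alice@example.com,2023-01-05
"""

def clean_rows(raw_text):
    """Normalize emails, drop rows missing an email, and deduplicate."""
    seen = set()
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_text)):
        email = row["email"].strip().lower()
        if not email:
            continue  # drop records with no email
        key = (row["user_id"], email)
        if key in seen:
            continue  # drop exact duplicates after normalization
        seen.add(key)
        cleaned.append({
            "user_id": row["user_id"],
            "email": email,
            "signup_date": row["signup_date"].strip(),
        })
    return cleaned

print(clean_rows(RAW_CSV))
# → [{'user_id': '1', 'email': 'alice@example.com', 'signup_date': '2023-01-05'}]
```

In production this logic would typically be a task in an Airflow DAG, reading from and writing to AWS storage (e.g. S3) rather than in-memory strings.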