Overview
The role involves designing and maintaining data pipelines and data architecture on AWS.
The ideal candidate has 4 years of data engineering experience with strong AWS and programming skills.
Remote · Mid-level · Permanent · Full-time
Tech: S3 · DynamoDB · Terraform · Spark
Locations
Remote
Requirements
- Bachelor's degree required
- Proficiency in AWS data services
- Proficiency in Python or Go
Responsibilities
- Design and maintain ETL/ELT pipelines
- Develop data warehouse and data lake solutions
- Implement data modeling best practices
- Optimize data pipeline performance
- Manage data quality checks
- Collaborate with data scientists