Overview
Senior role focused on developing and maintaining data pipelines and models using AWS technologies.
The ideal candidate has 3+ years of experience with PySpark and SQL and strong problem-solving skills.
Tags: remote, senior, full-time, English, PySpark, SQL, Snowflake, Airflow
Locations
Seattle, Washington, United States
Requirements
- Bachelor's degree
- 3+ years of experience with PySpark and SQL (a representative sketch follows this list)
- 2+ years of experience with Amazon EMR or Glue
- 2+ years of experience with Amazon Redshift or Snowflake
- 1+ years of experience with Airflow
- Strong problem-solving skills
- Excellent communication skills
- Ability to work independently and in a team
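For context on the core skills above, here is a minimal, hypothetical sketch of a PySpark job that runs a SQL aggregation; the app name, S3 paths, table name, and columns are illustrative assumptions, not details from this posting.

```python
# Minimal PySpark sketch: read raw events, run a SQL aggregation, write results.
# All paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Read raw event data (e.g., landed on S3 by an upstream ingestion job)
events = spark.read.parquet("s3://example-bucket/raw/events/")
events.createOrReplaceTempView("events")

# Aggregate with Spark SQL; output could feed a Redshift or Snowflake table
daily_counts = spark.sql("""
    SELECT event_date, event_type, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, event_type
""")

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")
spark.stop()
```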
Responsibilities
- Develop and maintain data pipelines
- Create data models and support end-user querying
- Build and maintain orchestration of data pipelines (see the Airflow sketch after this list)
- Collaborate with teams to understand data needs
- Troubleshoot and optimize data pipelines
- Write and maintain PySpark and SQL scripts
- Document and communicate technical solutions
- Stay current with AWS data technologies
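Similarly, the orchestration responsibility might look like this minimal Airflow sketch, which schedules the PySpark job daily and gates a downstream warehouse load on its success; the DAG id, script paths, and shell commands are hypothetical placeholders, not part of the posting.

```python
# Minimal Airflow sketch: run the daily PySpark rollup, then load the warehouse.
# DAG id, paths, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_event_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the PySpark aggregation (e.g., via spark-submit on an EMR cluster)
    run_rollup = BashOperator(
        task_id="run_rollup",
        bash_command="spark-submit /opt/jobs/daily_event_rollup.py",
    )

    # Load curated output into the warehouse (Redshift/Snowflake COPY, for example)
    load_warehouse = BashOperator(
        task_id="load_warehouse",
        bash_command="python /opt/jobs/load_to_warehouse.py",
    )

    # The load runs only after the rollup succeeds
    run_rollup >> load_warehouse
```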