Overview
Develop and sustain innovative data engineering solutions that enhance the customer data experience.
The ideal candidate has 5+ years of data engineering experience, with strong skills in Python and AWS.
120k USD / year · hybrid · mid-level · permanent · full-time
Skills: English, Python, SQL, PySpark, Airflow, AWS, Terraform, Databricks, Java + 2 more
Locations
Los Angeles, California, United States
Irvine, California, United States
Seattle, Washington, United States
Requirements
US citizen or green card holder
5 years of experience with Python, SQL, and PySpark
5 years of experience with Airflow
5 years of experience building ETL pipelines
5 years of experience in an AWS production environment
Responsibilities
Implement new data engineering features
Drive design and implementation of ETL workflows
Collaborate with business partners
Own operations of ETL workflows
Share technical solutions and product ideas
Design large-scale data processing systems
Initiate projects and improve processes
Work with disparate vendor data sets