Overview
The role involves designing and developing data platforms that support data-informed decisions.
The ideal candidate has 3-5 years of experience with data architectures and strong Python and SQL skills.
Tags: remote, mid-level, full-time, English, Python, SQL, Snowflake, Airflow, Spark, AWS
Locations
Remote
Requirements
3-5 years of experience with data warehouses/data lakes
1-2+ years of experience with Airflow, Iceberg, or Spark/Flink
Experience with Python and SQL
Hands-on experience with Snowflake, dbt, and Python
Responsibilities
Own the data warehouse and pipelines
Design and support data pipelines
Implement a data strategy for governance and quality
Communicate data processes to cross-functional groups
Identify and implement process improvements
Benefits
Flexible work arrangements
Competitive pay and benefits
Team events and off-sites