Overview
The role involves maintaining and operating the data warehouse and connecting new data sources to it.
The ideal candidate has 5+ years of experience in data engineering, with strong data modeling and ETL skills.
Remote · Senior · Permanent · Full-time · English · Python · Airflow · Hadoop · Spark · Kafka
Locations
Remote
Requirements
- 5+ years of experience in data engineering
- Experience in data modeling and ETL pipelines
- Bachelor's degree in a quantitative field
Responsibilities
- Develop and maintain data pipelines
- Implement automated monitoring and alerting
- Ensure data quality and accuracy
- Define data models and populate the data warehouse
- Collaborate with business units on data strategy