Overview
The role involves leading data engineering projects and orchestrating complex data workflows. The ideal candidate has 5-10 years of data engineering experience and strong Snowflake expertise.
Mid-level | Permanent | Full-time
Tags: Snowflake, Python, Hadoop, Spark, Hive

Locations
Remote
Requirements
- 5-10 years of experience in data engineering
- Strong experience with Snowflake
- Strong programming skills in Python and Linux Bash
- Hands-on experience with Luigi
- Expertise in Hadoop ecosystem tools
- In-depth knowledge of AWS services
- Familiarity with monitoring solutions
- Proven ability to lead projects
Responsibilities
- Lead data engineering projects
- Orchestrate complex data workflows
- Deploy and manage data solutions
- Track and troubleshoot data pipelines
- Guide junior team members
- Collaborate with non-technical teams
- Design and optimize data lake solutions
- Ensure data security and compliance